What is our primary use case?
Initially we were automating the regression suite for SAP ECC.
From there we moved into a web application called HVAC Partners, a customer portal we developed. That application also connects to SAP, but it does some other things that don't necessarily touch SAP. It is a front end for quotes and sales orders that go into SAP, but it also reports the status of orders, warranty claims, and the like to the customers.
From there we moved into the Middle East SAP ECC instance and automated their regression suite, and from there we rolled out S/4HANA for our service business. With S/4, SAP releases updates every quarter. Because the S/4HANA instance is in the public cloud, we have essentially two weeks to regression test and test any new functionality. We started with the last release and did about 70 percent of the testing with Worksoft. We also used the S/4 automation tool, which is geared more toward unit testing, so it's not as valuable as Worksoft. We're wrapping up that automation in about the next month, and then we'll be moving on to a European rollout of S/4, working our way across Europe and those implementations.
We have it on a virtual server and we're using remote desktop access for the offshore automation engineers to access it.
How has it helped my organization?
We're using it with Fiori and it's working fine. We have integration in and out of S/4 to Salesforce.com, so we automated those as well. The test cases were end-to-end. We start in Salesforce, which is a web application, with, for example, a quote, and then it goes into S/4 and gets reviewed and approved. It then goes back to Salesforce with the approval, and a sales order is entered that ends up going back to S/4. And then there's fulfillment, back and forth, and eventually billing and collections. We were able to do that whole automation with Worksoft, plugging into Salesforce as well as integrating to S/4 and doing the S/4 automation, back and forth. It's been incredibly useful. Using this tool, we saved something like 80 percent of the time it would have taken to test manually.
In terms of using the Capture feature without knowledge of testing tools, we brought on some new support people. One of them is our web support person, and she had no background in Worksoft. She's been using it to do all the initial captures for our HVAC Partners application, and she's been able to use it very easily. Our more experienced automation engineers follow up after she's done the Capture piece and troubleshoot some of the things she might not understand yet. They're working with her so that she does learn it. But she's been able to use it very easily.
Worksoft's ability to build tests and reuse them is very good. With the other tool we used, we ended up obsoleting the tests and not using them, whereas now we rerun these, at a minimum, every month. We do that for a few reasons. One reason is to keep the health of the tests up. Suppose a material is obsoleted. The test that has that material in it is going to fail because it's going to say, "Material not found." Or suppose a customer is no longer a customer and has been blocked or archived. We run the tests to make sure that the scripts don't need any changes. We also use them in case a process has changed. We're releasing changes to SAP about every two weeks: support tickets, enhancements, maintenance, etc. If a business process changes, then the automated test needs to change to reflect that change. Running them every month, at a minimum, helps make sure that everything is healthy.
The other reason is to identify anything in our quality system that could unintentionally impact other things that the programmers didn't realize. We've caught a couple of those in QA, and the programmer said, "Okay, I didn't mean to do that. I only meant to change this one thing," but it changed all kinds of things, and we were able to catch that before it went into production. So the reusability is fabulous if you create the tests properly: no hard-coding, using data tables to hold any of your field selections, and following good automation standards so that each test creates and consumes its own data. If a test creates the data it consumes, then when you rerun it, it does the whole thing again; you don't have to worry about finding a sales order that works, for example. You really have to create a logical test design to make it reusable, but as long as you do that, it's very reusable.
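The create-and-consume idea is tool-agnostic, so it can be sketched outside of any particular product. In this rough illustration, every function and field name is invented; none of it is Worksoft Certify's actual interface. The point is only the pattern: field selections live in a data table rather than in the steps, and later steps consume data the test itself created.

```python
# Hypothetical sketch of the create-and-consume, data-table pattern
# described above. All names here are invented for illustration; they
# stand in for whatever your automation tool actually provides.

# A data table holds field selections instead of hard-coding them into
# the test steps, so the same script can run with many inputs.
DATA_TABLE = [
    {"customer": "CUST-1001", "material": "MAT-500", "qty": 2},
    {"customer": "CUST-1002", "material": "MAT-501", "qty": 5},
]

def create_sales_order(row):
    """Stand-in for the 'create' step: returns the new order number."""
    return f"SO-{row['customer']}-{row['material']}"

def check_order_status(order_number):
    """Stand-in for the 'consume' step: later steps use the order the
    test itself just created, never a pre-existing one that might have
    been blocked, billed, or archived since the last run."""
    return "POSTED"

def run_end_to_end(row):
    order = create_sales_order(row)     # create the data...
    status = check_order_status(order)  # ...then consume it
    assert status == "POSTED", f"{order} failed with status {status}"
    return order

orders = [run_end_to_end(row) for row in DATA_TABLE]
```

Because nothing depends on data that already exists in the system, a rerun a month later behaves exactly like the first run, which is what keeps the tests healthy.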
It dramatically reduces the time we spend on testing. Before we started using this tool, everyone was pretty much doing testing manually and test events were taking from two to six weeks. What they did in two to six weeks, depending on the scope of what they were doing and how many people they had involved, we can usually do in one to two days.
The most dramatic was when we finished the Middle East automation. They were bringing up another company code and they wanted us to run regression testing on all of their current company codes, about seven of them. We completed it in about four days. The IT director came to us and said that it reduced their labor by 93%. “Quite frankly,” he said, "we would never have been able to do all of that testing. We would have had to engage a minimum of 28 people, and it would've taken them a minimum of eight weeks, and we still would not have been able to do all of the tests. We wouldn't have gotten them done." We were able to do it in a fraction of the time and with a broader scope than they would've been able to do. They would've done as much as they could and then they would have gone live and hoped for the best.
And we've also been able to use it for other things, like certain recurring tasks that had been done manually. We had people manually monitoring Tidal jobs, which are batch jobs scheduled to run. If a Tidal job fails, somebody has to go in, figure out why it failed, and either restart it or fix it and rerun it. These are jobs like billing jobs, and we automated the monitoring. Somebody had been spending about 12 hours a week on this, and that person would have to send out an email to whomever the relevant person was, saying, "Hey, check your batch job. This isn't running." They now spend about 15 minutes a week running it. The automation sends those emails to the users, documents the results in a spreadsheet, and puts it out to a SharePoint where the auditors can pull them any time they want. It was the same thing with monitoring the claims jobs. We've done a few things like that which have added to the value.
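The shape of that monitoring bot can be sketched in a few lines. This is a hypothetical illustration, not the actual automation: the job records, statuses, and `notify` hook are invented, and a real bot would query the Tidal scheduler and send real emails rather than work from an in-memory list.

```python
# Hypothetical sketch of the batch-job monitoring bot described above.
# Job names, statuses, and the notification hook are invented for
# illustration; a real bot would query the Tidal scheduler instead.
import csv
import io

jobs = [
    {"name": "BILLING_DAILY", "status": "COMPLETED", "owner": "ar_team"},
    {"name": "CLAIMS_WEEKLY", "status": "FAILED", "owner": "warranty_team"},
]

def notify(owner, job_name):
    """Stand-in for the email the person used to send by hand."""
    return f"To {owner}: check your batch job, {job_name} is not running."

def monitor(jobs):
    """Flag failures, notify owners, and build an audit log that could
    be written out as a spreadsheet to SharePoint for the auditors."""
    messages = []
    log = io.StringIO()
    writer = csv.writer(log)
    writer.writerow(["job", "status", "action"])
    for job in jobs:
        if job["status"] == "FAILED":
            messages.append(notify(job["owner"], job["name"]))
            writer.writerow([job["name"], job["status"], "notified"])
        else:
            writer.writerow([job["name"], job["status"], "none"])
    return messages, log.getvalue()

messages, audit_csv = monitor(jobs)
```

Running something like this on a schedule replaces the manual check-and-email routine while leaving an audit trail behind.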
Automation using Certify has also saved testing time, big-time. As I said, the Middle East saw 93 percent. For the S/4HANA project, what we did in three or four days had been taking them two weeks, and they still weren't getting through it all. With a release, you don't get to say to SAP, "Hey, testing is running behind, we need another week," because it's in the public cloud. Like it or not, they're going live. The drill is supposed to be: you test during week one, you remediate in week two, and you go live that weekend. We got our stuff done, 70 percent of the work, in about three days, and that was our first time out of the gate, so it'll go easier with the next release. The rest of the team took the entire two weeks to do their 30 percent, and a lot of their tests were smaller ones. We were doing end-to-end tests that go through Salesforce and S/4, etc.
In terms of defects, the value is finding the defects prior to moving something into production. There are two I'm thinking of that we found in Mexico. One of them would've brought shipping to a halt and the other one would have brought receiving to a halt. If you shut down factories, even for a short period of time, there is this domino effect. The value of those finds is huge. And this wasn't even something that the guys making this change were testing. They were testing the piece that they changed, which was working. What they didn't realize is that they changed all items instead of just that subset. It was a minor goof in the programming. It was just too broad of a statement.
I started in IT about nine years ago and we did total manual testing. We would have defects in the high hundreds to 1,000 during the implementation testing. Now, we're probably under 100, so it's much lower. It could be that we're just getting better at implementations.
What is most valuable?
It's pretty seamless with SAP and Salesforce because they've built in the field definitions and all the things that you need. You literally turn it on and execute your script and it records it. It's very simple. Then you can go back and put in some of the other functions. For example, instead of hard-coding field selections, you put in a data table so you can run it multiple times or with multiple data. It was actually written to work very well with SAP.
Salesforce came a little later. Obviously, big companies like ours don't just use SAP. We have integrations with Salesforce and CRM and CPQ and all these other programs that integrate with SAP. Worksoft started looking at its customer base and saying, "Okay, what are the popular ones you guys use?" Salesforce was one that came up. So one of their releases, about a year or so ago, included the ability to record in Salesforce like it has for SAP, so it's super-easy.
We've used Capture not only to train people on how to do things, but also to provide the output to our users so they can validate that what we tested was proper. Capture is very good. It is lengthy, though, because it documents every keystroke that you do. At the beginning it lists all the field selections that we use, and then it gives you each step: what it is, pass or fail. If we put in a screenshot, that shows up; it's up to you whether you include screenshots or not. A lot of the time the documentation we provide as a PDF is lengthy, but it's also very thorough, which is good.
Certify provides codeless, end-to-end process automation across packaged applications. It works well with SAP and Salesforce, for example. Another one they have done all the definitions for is Oracle. We haven't started on our Oracle ERPs yet, but it's good to know that we can. We just don't have automaters who know Oracle, so we're sticking to what we know right now. The process automation makes for relatively fast automation compared to the other tool we tried to use. It's so much easier because you don't need any technical programming knowledge. A lot of the other tools are Java-based or built on other programming languages, and that's a skill set the average tester or support person does not have. Certify makes it very easy for those people, and the learning is quicker too, because the troubleshooting is easier. You look at the script and you can read what it's doing. You understand the business process and you say, "Okay, that's failing because we failed to set this flag, or fill out this field." It's pretty simple.
What needs improvement?
I would like the ability to more easily modify the report from the Capture feature. One of the things I don't like is that it keeps repeating all the field selections throughout. To me, if we put them up front, we shouldn't have to repeat them at the different steps. It should just be Pass/Fail and show the screenshot. I've talked to them about this in the past.
There's another part of the Worksoft suite that probably does a better job at documentation for training purposes and providing an understanding of business process. It's the Certify BPP which we're not using right now because we're really focusing on automating all these different ERP systems. Whereas the testing is very detailed, which is great for the auditors and it's great for the users because they see everything we're doing, it makes for some big PDFs. It's a double-edged sword.
Also, with the codeless process automation across packaged applications, once in a while, if we get a weird application that's not widely used, it gets a little stickier. First, the software has to learn the fields, so you have to identify all the fields. Once you do that, as long as there isn’t any non-standard code in the application, then it works fine. But there's that one step that you have to do, a step you don't have to do with SAP and Salesforce, for example.
In addition, Worksoft definitely needs to continue the march toward bringing in more and more of the software that people commonly use. They're doing that, but they can only march so fast.
I know Worksoft is doing some stuff with RPA. There are other tools that strictly do RPA, but aren't automated testing so I'm not sure if they will be able to compete with those. I know that we did do some automation, what we call "bots," with Worksoft, and it was clunkier than some of the RPA tools that are currently on the market. I suspect that they'll come up with a very competitive offering.
I would also like to see some better reporting of testing status, reporting that we can easily generate to say "Okay, we're 50 percent done and we've got 10 fails and 800 passes." That's what test management software is for and Certify integrates with that. Bang-for-buck, it's probably not a great place for Worksoft to invest. They're probably better off with RPA and bringing on the ability to more easily test software, like Salesforce and CPQ. I'd love to be able to do that as easily as I can with SAP. I would like that same ability to use Capture in CPQ, instead of using Silverlight.
For how long have I used the solution?
We started using Worksoft Certify in 2016, so it's been about four years.
What do I think about the stability of the solution?
Stability is very good. They do a lot of releases. They are probably using an Agile methodology, so every time I turn around, they have three more releases out there. It would be helpful if they could release once or twice a year, but I understand why they are doing it. They are adding new features because they want to get them out as quickly as they can. I just don't have time to stop, do an upgrade, and move on.
We haven't had a problem yet with the solution. It's been very good, and you don't have to upgrade every time they do a release. We do it probably once a year.
What do I think about the scalability of the solution?
Scalability is fabulous.
We've certainly taken on more projects. When I first started about nine years ago, there was one major implementation at a time. At the moment we have about six major projects going on and, with the unwinding due to the spinoff, there are probably about 50, but those are not being tested with automated software. We're focusing on just the two SAP ERPs, S/4, and the ancillary web apps. It does allow them to implement faster. Since we did the Middle East, they've brought up two new companies in six months, which is amazing for them. It probably would have been one at a time over a year and a half or two years, otherwise.
We don't use Certify to create RPA at this point. We have so many ERPs to automate that we're sticking to that right now. We're trying to get to where we can pick up more licenses and build up the team so we can start doing some of these other things. Right now, with the spin-off from our parent, everybody is hyper-focused on unwinding. When you're part of a big organization like we were — we're still pretty big but we were huge, Fortune 50 — and you start unwinding things, there are so many shared services and servers that are on their domain, etc. It's going to take us two to three years to unwind all that. So we're marching ahead on our ERPs and I'm keeping my head down. I have my seven licenses, although I want to get about 10 more, but I'm not going to raise my hand until we get unwound.
How are customer service and technical support?
Most of our issues have been our own internal infrastructure issues. We have a very tightly controlled infrastructure, so I'm always banging up against that. Worksoft has been able to help us solve these problems, and they're not even their problems.
Which solution did I use previously and why did I switch?
It's far easier than other solutions. We previously had HP Quality Center and we could not maintain it. Prior to my taking over testing, they had implemented that tool. They brought in some outside contractors who did the initial automation and handed it off to the support team to maintain. But updating it when there was an error, or doing the general maintenance that was needed, was so complex that they found it easier to just test manually. They quit using the tool. It was a complete waste.
With Worksoft, in stark contrast, there was a little bit of a learning curve up front. For about 70 percent of your effort you can use its record function, which just records your keystrokes. But then you have to go in and harden the script: put in data tables, screenshots, validations, that type of stuff. Still, compared to the other tool, no real programming skills are needed. You learn how to use the functions, and when you look at the script or the test, it's not like looking at code. You can actually read it and say, "Oh okay, that's inputting the month and the year," or "That's validating that the sales order posted." It's in English and it's very clear to follow. There's drag-and-drop, delete, and all the things you're used to from other applications, like Word and Excel, which makes it very simple to use. Initially we had a little bit of training involved, but since then it has been incredibly easy compared to the old tool. The old tool didn't make it past a couple of years. It's been four years with Worksoft, and we've got interest, globally, from other parts of the company asking, "When are you going to automate our regression suite?" So it has been very well received.
How was the initial setup?
Setting it up was pretty straightforward. My biggest frustration was with our infrastructure. We set it up as a remote desktop but our company has all these firewalls and restrictions around access, and my team is mostly offshore contractors.
The offshore contractors have different access than I do. I spent a lot of time whitelisting different websites to give them access to the software we are testing.
Deployment took about a month and a half, mostly due to the infrastructure problems. However, now, when we need to upgrade the system, we can pull it down and run the installation. Then, we always get on a call with Worksoft, because if we miss one step and it doesn't work, we can't afford to have the team down. So we get on a call and spend about an hour running through the update.
What about the implementation team?
Worksoft was fabulous help with the setup. They would get on a call anytime. They would help us walk through issues and help us figure them out; even how to navigate our systems. Their assistance during the setup was phenomenal.
What was our ROI?
We have seen ROI but it's very hard to capture because a lot of the benefits are hard to monetize. We have seen a huge reduction in the time to test and a huge reduction in the number of people needed to test. Rather than lay off a bunch of people, we've chosen to do more projects, so our rate of implementations has gone up.
The 93 percent reduction in labor that the Middle East calculated was pretty impressive. I would say that, on average, it would be more like a 70 percent reduction in test time, because you still have to have people review the tests to make sure they're comfortable. Even though we say everything passed, they're going to want to review them. And then there's the retesting of any remediation that needs to be done.
What's my experience with pricing, setup cost, and licensing?
The initial investment is probably a little high. It was a little hard for me to sell, but it was a one-shot deal and that's why it's so high. All we are doing now is paying annual maintenance, which we don't have to do if we don't want upgrades, but we do.
It is based on the number of licenses. If we had bought a larger number of licenses, our costs would have come down significantly, which is fair. I did struggle a little bit trying to sell it because our company had already had one failure with a testing solution, and here I was asking for money to try again. However, since we got it in, we have had great success.
We have seven licenses today. The people using it are three automation engineers/quality assurance testers who do SAP ECC. We also have three who do web application testing. They are the ones creating the automation for our portals, e.g., customer portal. I have one test lead who oversees this team and bounces between both SAP and web testing. We haven't bought a whole lot of licenses and haven't rolled it out to a massive number of users. We're doing all the work ourselves.
Since these are concurrent licenses, we could double the number of users with our current licenses because six out of the seven are offshore. While we are sleeping, they're using them.
Which other solutions did I evaluate?
The solutions we evaluated were all Java-based, and they all took skills that we didn't have; we would have had to hire people to use them or train people. The people those solutions might be good for are developers, but I'm not going to get a budget for a bunch of developers on a test team. And developers don't want to test; developers want to develop. I wouldn't even be able to hang onto those people. That's what failed with our initial attempt. We brought in programmers, they came up with tests, and nobody could maintain them afterwards. It was an investment that we threw out.
What other advice do I have?
There are a number of lessons I have learned from using Certify.
- When you get started with it, you need to make sure that you have an executive sponsor so that you get the cooperation you need.
- Pick up some mentoring services from Worksoft to help you get started.
- You need to document your test cases well. Don't just start without good documentation, because then you make mistakes and then you have to rework that particular test script.
- Be very organized in the naming conventions and the standards you're using to do the automation. For example, don't shortcut. Fill out the fields that explain what the test objective is. That way, when somebody else comes in a year later and they ask, "What does this test do?" it's right there. Be organized.
- Try not to do too much with a single test. We wrote some that were crazy long, 500 to 600 steps, because our process was very complicated. Step back and think in terms of logical chunks, because a script that big is difficult to maintain. You fix one thing, you get 20 percent of the way through, and something fails. So you fix that, you get another 20 percent, and something else fails. It will take somebody half a day to fix one script. You can't afford that delay when you have 500 scripts to maintain.
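The logical-chunks advice can be sketched in a tool-neutral way. In this hypothetical illustration (the step names and IDs are invented, not anything from Certify), one long quote-to-invoice script is split into three chunks that each hand their output to the next, so a failure costs you one small piece rather than a 500-step rerun.

```python
# Hypothetical sketch of splitting one very long end-to-end script
# into logical chunks. Each chunk is independently runnable and passes
# the data it created to the next chunk.

def quote_to_order(quote_id):
    """Chunk 1: quote review, approval, and sales-order creation."""
    return f"SO-for-{quote_id}"

def order_to_delivery(order_id):
    """Chunk 2: fulfillment and shipping."""
    return f"DLV-for-{order_id}"

def delivery_to_invoice(delivery_id):
    """Chunk 3: billing and collections."""
    return f"INV-for-{delivery_id}"

def run_process(quote_id):
    # Chain the chunks for the full end-to-end run. When one step
    # fails, only that chunk needs fixing and rerunning, not the
    # whole 500-step script.
    order = quote_to_order(quote_id)
    delivery = order_to_delivery(order)
    return delivery_to_invoice(delivery)

invoice = run_process("Q-100")
```

Each chunk stays small enough to fix in minutes, while the chain still gives you the end-to-end coverage.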
I would put Worksoft Certify right up there at a 10 out of 10. It's been the easiest package that we've done. The S/4HANA tool that comes pre-written, where we just go in and change our data to make it applicable to us, is pretty simple but it's not flexible enough. You can only test S/4HANA within those four walls and almost nobody uses just S/4HANA. There are always integrations. So Certify, as a tool that works across integrations from one package to another, documents the results, is easy to maintain, and easy to use, is a 10. I have not seen a package that is this easy and we did look at other ones. This one was just head-and-shoulders above them. It's really a fabulous product, I'm so impressed with it.