Tricentis Tosca Review

Provides us with a central repository which allows us to share and reuse test cases globally


What is our primary use case?

We use it for test automation and manual testing, and as our central repository for all test cases.

How has it helped my organization?

We're currently going through an in-house system upgrade, and Tosca has allowed us to do more testing than we were previously able to do; in the past, it would simply have taken too much time for people to do manually. The additional testing we're doing, which provides extra test coverage for the upgrade, would take a couple of individuals three weeks, and we're able to do it within 24 hours with Tosca.

Also, the solution enables us to run the entire regression test suite, regardless of where a change has been made. We're able to run all our regression tests at one time. We do not have CI/CD set up yet, so executions still have to be kicked off manually. For example, for the in-house upgrade I mentioned, most of what the teams are building are regression test cases, because we want to make our upgrades faster and do them more frequently. As the users build out the test cases, they also create the regression execution list. They can build on that and run it whenever they need to, and then all they have to do is analyze the execution results.

Tosca is also helping us slowly remove redundant test cases. We're working with the users to build better test cases and eliminate those redundancies as well. This really helps because the central repository allows us to share everybody's test cases and reuse them globally. We have delivery teams in Edinburgh and the Netherlands as well as here in the US, so it really allows for a single point of collaboration and for the reduction and reuse of test cases. We're still analyzing how this has affected our testing efficiency; that's one of the reasons we're upgrading to 12.3. We want Tricentis Analytics to help paint that picture a little more clearly for us.

We are starting to see test speed increase a little. Looking again at the tests for that in-house upgrade, if we had manual testers doing them it would take a couple of weeks, and now we can execute the test cases in 24 hours. Another team just completed some automated test cases that take two minutes to run. When they first ran them, they hit errors that turned out to be legitimate issues. The developers fixed them, the team ran the tests again and found a few more errors, and they went back to the developers. The whole cycle took about 20 minutes to resolve. That is a huge improvement over how things worked before, when it would have taken hours to test and verify everything manually.

Finally, we have BAs who have been trained on Tosca; people with little background in development or coding have picked it up and gotten up and running fairly quickly. We would like to get the business to help control testing as well, but we're not there yet. We are using exploratory testing, but we're not using it 100 percent quite yet.

What is most valuable?

The most valuable features are 

  • Tosca BI 
  • Tosca Commander.

Tosca BI is important for making sure our data integrity is in check and validated; for making sure our data is good. Data is the number-one driver for our company, so if it isn't good, we have some big problems.

Tosca Commander lets us validate any test cases that are UI-driven, testing the UI against expected results.

Also, the nice thing about the model-based approach is that we're able to build out modules within Tosca, which makes maintaining test cases much easier. It allows us to make an update in one spot, and that flows through to all the test cases that need to be updated.
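For anyone coming from a code-based framework, that benefit is roughly analogous to keeping UI locators in a shared page-object class instead of repeating them in every script. The sketch below is only an analogy in Python with Selenium, not how Tosca itself works (Tosca modules are built in its model-based, largely codeless interface); the LoginPage class, its locators, and the URL are hypothetical.

# Illustrative analogy only: centralizing UI knowledge the way Tosca modules do.
# The LoginPage class, its locators, and the URL below are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    """The one place that knows how the login screen is built.

    If the UI changes (say, the username field gets a new id),
    only these locators need updating; every test that uses this
    class picks up the change automatically.
    """
    USERNAME = (By.ID, "username")
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()

def test_valid_login():
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.test/login")  # hypothetical URL
        LoginPage(driver).log_in("demo_user", "demo_pass")
        assert "Dashboard" in driver.title  # expected-result check
    finally:
        driver.quit()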

What needs improvement?

The main area where there is room for improvement is how they handle upgrades. Going through this current upgrade, we were delayed a month because we use a third-party tool, Tosca Connect by Tasktop. When the latest upgrade broke the integration between the two, it took Tricentis a month to come back with a workable solution. To me, that was a critical customer impact, and it took far too long for them to resolve. Their whole upgrade process needs to be better and cleaner from an end-user standpoint.

For how long have I used the solution?

We've been using Tosca for three years. We're using version 11.2 and we're currently going through an upgrade to 12.3.

What do I think about the stability of the solution?

The stability seems pretty robust. We've come across some minor issues, but you'll have that with any software. Those issues were the result of a combination of our environment and our not yet having the knowledge that would have helped resolve them.

What do I think about the scalability of the solution?

In terms of scalability, one of the things I do is challenge people to prove that it doesn't work. No one has yet proven to me that there is something it doesn't work with; we've always found a way to do test automation with it.

We have developers and BAs using the tool. We have 16 delivery teams, with an average of seven people on each team. We require two people for maintenance of the solution; they are test automation engineers. If you're talking about maintaining test cases, everybody on the delivery team is involved. It's in their goals to maintain the test cases for what they deliver, and that includes developers and BAs.

Every day, Tosca is being used more and more. As part of our digital transformation, we keep growing the tool's user base and the set of test cases it covers.

How are customer service and technical support?

Tosca's technical support is mediocre; I've worked with better support. The reason I say that is that, here in the US, if we put in a support ticket, it usually goes to the European support team, so there's a time lag. Something that could be responded to within an hour or two can sit for 24 hours before we get any response.

Also, sometimes we get support members who are technical, and other times we get ones who aren't very technical and just give a canned answer. That's not very helpful. There's definitely room for improvement in their support.

Which solution did I use previously and why did I switch?

We used Selenium before Tosca. One of the reasons we decided to look at other tools was that, with Selenium, the maintenance of the scripts kept growing. The more test cases we built, the higher the maintenance cost became, and we started seeing diminishing returns as a result.

Also, you really can't do database testing with Selenium, for the most part. There are ways around that limitation, typically by querying the database directly from the test code (see the sketch at the end of this section), but it just doesn't work very well.

From a reporting standpoint, you really can't produce robust reports like you can within Tosca. 

Finally, the infrastructure with Selenium is a bit more challenging, and there is no built-in test management either.
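On the database-testing point above, the usual workaround is to step outside Selenium and query the database directly from the test code. The sketch below is only illustrative, not our actual implementation; it uses Python's sqlite3 purely to keep the example self-contained (in practice you would use the driver for your own database), and the table, column, and order reference are hypothetical.

# Illustrative only: checking the data layer directly, since Selenium itself
# has no database-testing support. Table, column, and order reference are hypothetical.
import sqlite3

def test_order_row_written_after_ui_submission(db_path="orders.db"):
    # ...Selenium UI steps to submit an order would go here...
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT status FROM orders WHERE order_ref = ?",
            ("DEMO-1234",),  # hypothetical order reference
        ).fetchone()
        # The data-layer check the UI test alone can't give you:
        assert row is not None and row[0] == "CONFIRMED"
    finally:
        conn.close()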

How was the initial setup?

On a scale of one to ten, with one being straightforward and ten being complex, the setup was around six or seven. It wasn't completely straightforward but it wasn't totally complex either. It just needed some insight from Tricentis to make sure things were being done correctly.

The first deployment took about a week. We then went through an upgrade where we used their consulting service as well, because that was when they changed how the licensing and the license server worked. That took about two weeks. With this latest upgrade, we're going on our fifth week now; it has taken a little longer because of the issue we uncovered.

Our implementation strategy was a global implementation, where we empowered all delivery teams to be able to use Tosca. We set up the application servers and the repository in a global data center and then we pushed out Tosca Commander to all the delivery teams so they could do test automation and manual testing as well.

What about the implementation team?

We used a Tricentis consultant for the first setup and deployment. They were fantastic. They're very knowledgeable, they're great at what they do, they know the product inside and out, and it really helped speed things up. 

What was our ROI?

ROI is one of the reasons we're trying to get Tricentis Analytics set up, so we can provide factual data on the test cases and the return. We have KPIs set up to help us with that.

What's my experience with pricing, setup cost, and licensing?

We paid upfront for the licenses and maintenance and we pay the maintenance fees as we move forward, yearly. There are no additional costs that I am aware of.

Which other solutions did I evaluate?

We evaluated SmartBear, HPE QTP (now UFT), QuerySurge, and Alteryx.

It was very apparent that Tosca was an all-in-one solution: we could do database, UI, and API testing. The others each focus on one area. HPE and SmartBear focus on UI/UX interface-type testing, while Alteryx and QuerySurge focus on back-end database testing. Tricentis encompasses all of that, and that's why we selected them.

What other advice do I have?

If you're going to have somebody come in and help with setup, don't go with a third-party vendor; have Tricentis consultants come in, do the training, and help set up. It's worth the extra money to have them come in and do it right, versus a third party that might not know everything. That was our experience. We tried the third-party route, and we immediately saw that it was not the right way to go. That's when we brought in Tricentis themselves to help us.

The biggest lesson I've learned is more connected to the tool's adoption. If people have been doing the same thing for over ten years, it tends to be a little bit hard for them to switch over because they want to do things the way they've always been doing them. But the tool itself is fairly simple. It's a pretty solid tool.

There are a few ways to overcome the resistance to new technology. But it's really about creating urgency around why the tool is important to the company and why people need to adopt it. We've done lunch-and-learns to help people understand it. We also champion any success stories with the tool, through newsletters that go out. Sometimes, it's just about bringing in new people with the right mindset.

We've slowly been increasing our rate of test automation using Tosca. We went through a digital transformation with SAFe Agile and then with adopting test automation, so the tool-adoption piece has taken a little longer. We're not quite where we want to be yet; we're still building on it. Tosca covers about 50 percent of our test cases at this point. The solution hasn't reduced our cost of testing yet; we're still a little immature in this process.

In terms of delivering more features per release, we are watching that, but we're not there yet. We're just not mature enough to have the hard facts to state that this is helping us complete more features. From a defects standpoint, we're trying to get a better handle on how defects are captured here, because previously there wasn't a concise, approved process for defects. Everybody did their own thing, and I'm pretty sure things fell through the cracks; people weren't identifying defects as defects. That's another one of the things we're trying to get better at.

Tricentis keeps building on BI and Tosca itself, and they're just getting better and better every time. A lot of the things they're focusing on are the right things.

Which deployment model are you using for this solution?

On-premises
Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.