Tricentis qTest Review

Very much a QA-centric application; using it is pretty seamless if you're a QA engineer


What is our primary use case?

It's our primary tool for managing testing across the Guardian enterprise.

How has it helped my organization?

It has helped us by providing a web interface and by being intuitive about how testing is done. If you're a tester, it makes a lot of sense. Instead of an application that we have to modify to make useful for QA, this is very much a QA-centric application. How to use it and what it's referring to are pretty seamless if you're a QA engineer. To that end, it has really increased the productivity of my team. In an agile world, being able to create suites of test cases, and copy them from one project to another, is really important.

The on-demand reporting has also helped. To be able to just look at defect counts, and how much progress was made for the day, or where we stand overall with the project, is really important. All of that has really simplified things for us quite a bit.

For weekly reporting, it has definitely saved at least 50 percent of our time, if not more.

In terms of it helping to resolve issues when they occur, being able to log a defect, go right into JIRA, and add that defect to the user story, right there at that point, means we connect all of it. That is functionality we haven't had in the past. As a communication hub, it works really well. It's pretty much a closed loop; it's all contained right there. There's no delay in getting from the defect, to the system, to JIRA, to the developer.

It very much provides our team with clear demarcations for which steps live in JIRA and which steps live in qTest. That takes away the discussions like, "What are we going to use? How are we going to communicate? Where will the data be?" Any of that preparation time is behind us. It's now the default for how the teams function and what they do. That's a really powerful process. The fact that it's a reusable, repeatable process makes everybody much more comfortable and trusting of the data that they're getting. They can then focus on the issues at hand. qTest really becomes a tool, and the best thing about a tool is not knowing you're using it. To that end, it's doing a really great job.

I can't say that we've seen a decrease in critical defects in releases since we started using qTest, but we have more visibility into our test coverage, blocked test cases, daily activities, etc. I can't say that it has necessarily done anything to improve the quality of the code itself.

Overall, it has helped to increase testing efficiency by around 30 percent. A lot of that, again, is due to being able to reuse things and being able to get to the metrics quickly. I can't overemphasize how easy it makes things.

What is most valuable?

We are using qTest Manager and qTest Insights, primarily.

We have a global testing team, so we needed a centralized, easy, web-based interface for accessing all of our testing and being able to manage test cases. We ship over 400 projects a year, so we needed something that was going to scale to that. That's what the Manager piece is for.

The Insights piece is for seeing where we are in terms of status and for metrics, etc. It provides insight into the status of my testing.

Also, we are fully integrated with JIRA, back and forth. As an agile shop, we use JIRA for all of our user stories, etc., and it is the main source for defects. qTest Manager is the testing hub. The integration between the two has been great, pretty much seamless. We did run into one defect with volume, but the 9.7.1 release fixed that.

What needs improvement?

They're coming out with a new feature now, an analytics module. I, personally, lean more toward the metrics/analytics side of things, so anything they can do to come up with a reliable template where I can look at all of my metrics at a project, quarter, or enterprise level would be fantastic. That's my "nirvana" goal. I don't want to have to go to Tableau. I have a lot of hope in their analytics module.

And I would really love to find a way to get the results of Jenkins executing my Selenium scripts into qTest Manager, so that when I look at everything I can look at the whole rather than the parts. Right now, I can only see what happens manually. Automation-wise, we track it in bulk, as opposed to the discrete test cases that are performed. So that connection point would be really interesting for me.
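
As a sketch of what that connection point might look like, the Jenkins job could push each Selenium result to qTest Manager's REST API once a run finishes. The endpoint path, payload fields, status name, and project/test-run IDs below are illustrative assumptions based on qTest's public v3 API, not something we have wired up today; anyone trying this should verify them against their own qTest instance's API documentation.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QTestResultPusher {
    // Assumed qTest Manager v3 endpoint for posting a test log against a test run;
    // check your instance's API docs for the exact path and payload shape.
    private static final String QTEST_URL =
        "https://yourcompany.qtestnet.com/api/v3/projects/%d/test-runs/%d/test-logs";

    public static void main(String[] args) throws Exception {
        long projectId = 12345L;   // hypothetical project ID
        long testRunId = 67890L;   // hypothetical test run ID
        String token = System.getenv("QTEST_TOKEN"); // API token issued by qTest

        // Minimal test-log payload: execution window plus a pass/fail status.
        // Status values are site-specific in qTest; "PASSED" is a placeholder.
        String payload = """
            {
              "exe_start_date": "2019-11-01T10:00:00+00:00",
              "exe_end_date":   "2019-11-01T10:05:00+00:00",
              "status": { "name": "PASSED" },
              "note": "Posted from Jenkins after a Selenium run"
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(String.format(QTEST_URL, projectId, testRunId)))
            .header("Authorization", "Bearer " + token)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(payload))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("qTest responded: " + response.statusCode());
    }
}

Hooking something like this into a JUnit or TestNG listener would let each discrete test case report itself, rather than tracking automation in bulk.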

We have between 150 and 200 users who are all QA. Project managers might cycle in sometimes for metrics, but we publish our metrics. You can embed scripts that come out of Insights, which is a really great feature. It's a feature I would really like to see them work on more, to make sure their APIs are bi-directional and refresh in a timely way. It's a little unclear whether they refresh at a certain point in time or when I click. That is one area that is a little murky.

For how long have I used the solution?

We're just about to start our third year using qTest.

What do I think about the stability of the solution?

We hit a wall before the 9.7.1 upgrade — we waited too long to upgrade. We hit a volume where we started seeing that JIRA and qTest were out of sync a little bit. It seemed to be a timing thing. But once we upgraded, that all went away. 

In terms of stability, I don't think it's ever crashed. Sometimes the Insights module is slow to load up but I think that is a timeout issue.

What do I think about the scalability of the solution?

I have a better feeling about scalability with 9.7.1 than I did prior to that. We should be okay. There will become a time, though, where we're going to have to consider archiving data, and how we want to do that. That would be a great feature for them to have over time, to be able to go back and archive.

We continue to bring a lot of projects into the department, so the volume of projects that qTest will manage for us continues to increase. We're starting to integrate it a lot with other products. An example would be SmartBear — we do a lot of API testing there. Anything that Tricentis would build, API-wise, along those lines would be really helpful.

We use NeoLoad for all our performance testing and that integrates with AppDynamics, so I don't know that we would need to integrate them, but it would be nice if it were an option. We definitely continue to use JIRA. We'll continue to expand on that platform. There's a lot of potential.

How are customer service and technical support?

I speak very highly of the company, especially the QASymphony folks who were merged into Tricentis. There was some merger pain in terms of availability. We found that our calls were cycling. But they recognized that pretty quickly and definitely helped us get on the right path. 

When we were doing the upgrade, we were able to get slots scheduled fairly easily. 

So tech support is as I expect it to be, at this point. 

I have names of people whom I can call. That's always nice. It's not just "1-800-qTest." As a vendor they're attentive. They've been up here a few times and we definitely have a view into their roadmap. I find that as much as you're willing to give, you'll get.

Which solution did I use previously and why did I switch?

We were using the HP suite. We switched because of price point and ease of use. We went into agile quickly, as an enterprise, and HP wasn't at an agile point at that time. We needed to make a switch.

qTest is much more intuitive and straightforward. There's not a lot of complexity to it. HP opened up the world, so there were far more features than we needed. That became a burden over time. HP's integration with JIRA was difficult. It was a thick client, and it was very difficult to use the web interface and get good response times. I could go on and on, but you get the gist of it.

How was the initial setup?

We did a prototype two years ago and demo'ed it. It definitely played strong. The price point was right and then we started road-mapping it in 2018. We started implementing in October of 2018. We have a lot to do here so it took us until June of 2019 to get us all to steady-state. But it went without hitches, and that's probably due to a combination of how much planning we put into it and its ease of use.

You need to plan it. You need to know what your JIRA templates look like. You need to know what your JIRA workflow is, and then you need to understand what you want qTest Manager to look like. If you're integrating with JIRA, that will be the defining piece in how all of that structure will look. Once you understand that — and fortunately, I have control over both in my department, so we are really intimate with what our JIRA template looks like — it really maximizes how it integrates and how efficiently you get to where you want to go. It sounds like it took a long time, but it was really a lot of planning time and then we did the cutover. We also put a lot of training into it. Having done this before, qTest was, by far, one of the easiest implementations I've done.

We do internal audits on our own. We look back quarterly and say, "Are we meeting our own processes? Do we have reliable, reputable standards with our projects and the metrics, the way we count things? Are we consistent?" I do think you have to measure yourself, in addition to measuring your projects. That has really helped us significantly.

As for adoption of the tool, it's been really easy. It has simplified a lot of things. Things are right there. You can quickly drill through and it's pretty intuitive to pick up. There's not a lot of complexity around it. There are not a lot of unnecessary fields. The training on it and the adoption of it have been a lot easier than with HP.

What about the implementation team?

The deployment was all my department. We have a third-party vendor, Cognizant, that we work with. We have an 80/20 split: 80 percent of the department is Cognizant, 20 percent is Guardian. This touched everybody in the department, and we're somewhere between 150 and 200 people. But we had a core team of about a dozen people who mapped and planned it all out, and then they touched the rest of the department as their projects migrated over.

I have two to two-and-a-half people maintaining it.

What was our ROI?

I definitely see ROI in that I have testers who are focused more on doing really complex testing, rather than writing test cases. The reusable regression suite is always a good thing; being able to copy and move test cases from one project to another. I don't want my testers to be rewriting things.

What's my experience with pricing, setup cost, and licensing?

I have not looked at it recently, but our license price point is somewhere between $1,000 and $2,000 a year. It's pretty low when you think about what we used to have. We haven't had any additional costs from Tricentis. We do the hosting on Amazon, so that's our cost.

Which other solutions did I evaluate?

We actually did a bake-off between Tricentis and QASymphony. And then we got the best of both worlds when Tricentis acquired QASymphony.

We looked at Zephyr and Xray, but they were really too small-scale for the enterprise that we have. They probably would have saved us a lot of money, but our efficiency would have really fallen off.

What other advice do I have?

What I've learned from using the solution is "don't be afraid of change." HP was the blockbuster of our industry. There are a lot of great options out there. Do your due diligence and be brave.

Also, have a plan. It's not something that you want to go into and figure out as you go. You need to really sit down and consider where you are, where you want to go, and what variables are going to help you figure out how to implement this. It's just like any other software package. You need to have a plan. You need to have a training plan. You need to make sure your team understands the opportunity and what they're going to get out of it. It can be scary, so you have to manage change as much as you have to manage implementation.

In terms of using qTest to look into failures, we haven't really enabled that part of it yet. We use Selenium for all of our open-source test automation, and that's a little different than using the Tricentis application. We're a Java and .NET shop, and we wanted to go with an open-source tool so we could hire Java developers for automation. We know there are some exploratory options in qTest, and we will start setting the roadmap for 2020 in that direction. We definitely want to expand what we're using within the product now.

Getting our upgrade to 9.7.1 was really significant for us. This past year has been a migration year. We got to steady-state around June. I wanted to get everyone to steady-state, spend some time with it, get our upgrade behind us, and then start to expand out to use some other pieces of functionality in 2020.

We use qTest to provide results to executives, but it's usually sanitized through my team a little bit, just because we're still getting used to the cycles and execution. We can do multiple runs and we have to get down to what the actual results are, not the overall multiple runs' results. I use qTest for that and that information gets cleaned up before it goes out to executives. The information it provides them is accurate.

qTest is an eight out of ten at this point. For me, the gap has been the metrics. Every company counts things differently. Understanding their out-of-the-box reports and aligning the solution to where I want it to be — do I massage it to maintain the metric that I have, do I wait for a breaking point, or do I redefine my calculation — is where those two points come off. That has taken a little more effort than I would've expected. All the data is there. It's just a matter of how you layer it out for your company.

Knowing what our calcs are and knowing what the qTest calcs are, and where they diverge, would've been really great. We were a little more naive than we had planned for.

I hope Tricentis keeps it alive and well. It's a great little product that is going to quickly grow to be something that gets out there with the big boys.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Amazon Web Services (AWS)
Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.