Tricentis qTest Review

Helps us resolve issues faster because everyone is working off of the same information in one location


What is our primary use case?

We are using qTest to store test scripts and test designs. We're using the Test Execution module to run them, and the Defects module to track defects.

How has it helped my organization?

This is an SAP implementation and until we brought in qTest the team had no tool. They were doing everything manually in Excel, all their tests and execution. I came on board to help lead. We've done multiple test cycles. We're in UAT right now. They did one integration test cycle without the tool and we've done two with the tool. It's helped with productivity when you compare it to doing it manually in Excel.

The solution's reporting has enabled test team members to research errors from the run results, for the most part. In terms of its effect on their productivity, they went from a complete manual Excel testing solution to this tool. We're able to execute 550 test scripts within a six-week period. Some of these are simple tests but some of them are robust integration scenarios.

Our company had an assessment done by an outside organization before the tool was in use. When they did their assessment they said they were not seeing good metrics. When they did the second assessment, they asked, "What's changed since our last assessment? We're seeing numbers, we're seeing reports, we're seeing data. Everything looks stable and not all over the place." They definitely noticed the change, and that's an outside organization.

qTest helps us compile issues and have one place to look for them. We're not chasing down emails and other sources. So in the grand scheme of things, it does help to resolve issues faster because everyone's working off of the same information in one location.

Overall, the solution has increased testing efficiency by about 60 percent, compared to what they were doing before.

What is most valuable?

We get the most benefit from the:

  • test library
  • test execution.

There is also a high level of reporting provided in the tool and you can extract reporting out of the tool. That has been very beneficial.

Ease of use is another helpful aspect. Right now, during the UAT, we have a lot of business users using the tool. They test the UAT scripts. The team quickly adapted to the tool. I was able to roll it out and I was actually commended by several internal people, as well as external consultants who were doing assessments, as to how things had changed. They were shocked that I was able to implement the tool so quickly. I had never used the tool before either. I received a trial copy of the tool and was able to quickly implement it and have it up and running for the team.

What needs improvement?

The Insights reporting engine is a little challenging to use in terms of setting up a report and getting the data. It took me a while to understand how to use the tool. 

I'm mainly extracting the data out of the tool; I'm not necessarily using any of the dashboards. There are some fields that I did not make site-specific because I had to get things up and running quickly. The fields are in both the Test Run area and Defects. If a field is project-specific rather than site-specific, you can't get it out of Insights. That's a limitation they need to figure out; the tool shouldn't have it.

In addition, I really can't stand the Defects module. It's not easy to use. Micro Focus ALM, which used to be called QC, has a really robust Defects module. For example, say you have a defect and you have a query. You can actually walk through each defect by just clicking an arrow. You go through that defect, add your updates, click the "next" arrow, and walk down through them. But with the qTest Defects module you can't do that. You have to run a query. You're pretty much just querying a database. It's not really a module, or at least not a robust one. Everything is very manual. By contrast, qTest's test design and test execution modules are very robust. They just missed the boat on the Defects module. From what I've heard and can understand, other people are using JIRA or something else for their defect tracking, but we're not. I needed one tool to do everything. That's their weakest link.

For how long have I used the solution?

We have been using the product for a few months.

What do I think about the stability of the solution?

Overall, qTest is stable. Sometimes we see some performance slowdowns, a hiccup or glitch-type pause, but for the most part it has been operating smoothly. We haven't felt any pain yet.

What do I think about the scalability of the solution?

Right now, it's handling everything we're throwing at it. Since we're in UAT, this will be the highest number of people in the tool and probably the most activity in the tool, and it's been supporting things without any issues.

We went from 30 licenses to 60 licenses during this four-month period. I don't think that number will increase. Once this project is over, the number of consultants and the number of people involved will both be reduced.

How are customer service and technical support?

Technical support has been fine, acceptable. Their responses have come in an appropriate amount of time, for the most part.

There are just those two limitations that I've uncovered, as compared to other tools that I've used. So a lot of my interactions are like, "Hey, I want to do this," and they say, "Oh, you can't do that," or "the tool doesn't support that." That's the thing I have run into the most. It's not a support issue, it's just a tool issue. Functionality.

Which solution did I use previously and why did I switch?

Cost and time were the main reasons I went with qTest. If I were to have my choice, I probably would have implemented the Micro Focus product because I am familiar with it and know it can do everything I wanted to do. But that would likely have been overkill: way more than this project needed, and much more costly.

I was also looking at another tool, SmartBear QAComplete, which I had used on a previous project. I didn't necessarily like that tool, but it cost less than either qTest or HP QC/ALM. Once I got my hands on qTest, though, I definitely liked it better than the QAComplete product.

The ease of use and the interface helped push me toward qTest. I had also called a friend and he said, "You have to look at QASymphony or Tricentis. This qTest is good." I said, "Are you sure?" He said, "Yes, it's good. Trust me." That helped push me over the top.

How was the initial setup?

The initial setup was challenging. There are certain areas where it's very strict in how you have to set up your project. There are some strict guidelines that you have to follow. You have to have a release and a test plan. You can't do certain things within the test design module or test execution module. There are only certain ways that you can set up a folder structure, whether it's related to a cycle or a test suite or a module. I would prefer fewer restrictions. The restrictions are what made it complicated.

Deployment itself was done over a weekend. This is not on-prem, it's in the cloud. I set up the structure and then had to understand how to load the test scripts. It was very fast.

Our implementation strategy was to get it done as soon as possible. It was very off-the-cuff. There was no time to plan. I landed here right before this test cycle was supposed to start and I knew that if we left it as a manual execution we would fail miserably. For me, the plan was to identify, learn, and implement a tool, all within less than a week. It took me two weeks, including training myself. There was no plan other than "we need a tool."

What about the implementation team?

I used Justin, I used one of the support people, and I had one meeting with one of their people. I had no more than four hours of support and a couple of emails. 

Overall, my experience with their support during deployment was good. I was asking some questions and needed to take an approach that they either didn't agree with or didn't understand. The one person I remember talking to was so tied up in the Agile methodology that she couldn't see outside the Agile box, and that's what I needed her to do. We weren't coming from an Agile methodology.

What was our ROI?

Over the four months there has been ROI. If I crunched the numbers I would probably find it has paid for itself already.

What's my experience with pricing, setup cost, and licensing?

The price I was quoted is just under $60,000 for 30 licenses, annually, and that's with a 26.5 percent discount.
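A quick sketch of the arithmetic behind that quote (an illustration only, treating "just under $60,000" as an even $60,000): the effective per-seat cost and the list price implied by the 26.5 percent discount.

```python
# Illustrative math from the quoted figures (assumption: $60,000 even).
quoted_total = 60_000        # annual cost for 30 licenses, after discount
licenses = 30
discount = 0.265             # 26.5 percent discount off list price

per_seat = quoted_total / licenses           # effective annual cost per license
list_total = quoted_total / (1 - discount)   # implied pre-discount list price

print(f"effective cost per license per year: ${per_seat:,.0f}")
print(f"implied list price for 30 licenses:  ${list_total:,.0f}")
```

That works out to roughly $2,000 per license per year, against an implied list price of a little over $81,000 for the 30 seats.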

Which other solutions did I evaluate?

I've used QAComplete from SmartBear. I've used HP QC, or ALM, from Micro Focus. I also used an old IBM Rational test manager, which I think was called SQA.

I think qTest was really built to support Agile, whereas the other tools were built to support traditional Waterfall but were easily adaptable to Agile. qTest is probably going to struggle a bit before it can truly support non-Agile implementations.

What other advice do I have?

The biggest lesson I have learned from using qTest is that every tool has limitations and you need to be able to adapt and overcome and not be stuck with one way of doing things. You have to find out where the tool shines and where it doesn't and make sure that the team feels the least amount of pain when you do implement it.

This solution has been implemented for one particular project. We have 60 concurrent licenses available and we have about 120 users who have been given access. Their roles in the project are either business analysts or quality testers. But these people also have their roles within the business. Some are managers within finance, some are directors, some are AP specialists, some are AR specialists. The project is a financial system implementation so we have a sampling of users from all departments executing scripts.

Since implementing the tool, we've seen a decrease in critical defects but I don't know if I can attribute it to the tool. I don't know if that's possible. It might be a stretch. But we definitely have seen a drop in critical defects. Over the last four months we have seen a 40 to 60 percent drop.

For deployment and maintenance of the solution, it's just me. I'm the one who picked the tool, I'm the one who implemented the tool, I'm the main administrator of the tool, and I am leading all of the testing efforts.

Setting up the users is pretty simple. I would recommend it. If you're looking for something quick, easy to use, and robust, it's definitely a very good tool. If I could get them to upgrade the Defects module, I would be very happy.

I do love it. I'm giving it a nine out of ten just because I don't think any tool out there is a ten, but Tricentis is close.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.