Tricentis qTest Review

Integration with JIRA makes all test cases available to anybody in the company with JIRA access

What is our primary use case?

qTest is our test case management tool.

How has it helped my organization?

Our company's workflow starts in JIRA. We create epics, stories, bugs, etc. All of those things are integrated with qTest. There was a disconnect before, with the testers working in Quality Center while developers and business analysts were working in JIRA. qTest has eliminated that disconnect because there is a specific JIRA integration. All the test cases are available in the links section within JIRA, so they're visible to anybody in the company who has access to JIRA. They can pick up an item, whatever the issue type, look at a story or bug, and see what level of QA testing has been done and whether its status is pass or fail. All of the test statuses are available in the story itself, so there is one place to view everything.
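
To give a sense of how open that visibility is, here is a minimal sketch of pulling the test cases linked to a story through JIRA's standard REST API. This isn't something the integration requires, just an illustration; the host, credentials, and issue key are placeholders, not our real setup:

    import requests

    JIRA_BASE = "https://jira.example.com"   # hypothetical JIRA host
    ISSUE_KEY = "PROJ-123"                   # hypothetical story/bug key

    # Remote links on an issue include objects created by integrations,
    # such as qTest test cases linked to this story.
    resp = requests.get(
        f"{JIRA_BASE}/rest/api/2/issue/{ISSUE_KEY}/remotelink",
        auth=("user", "api-token"),          # replace with real credentials
        timeout=30,
    )
    resp.raise_for_status()

    for link in resp.json():
        obj = link["object"]
        print(obj["title"], "->", obj["url"])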

We also use that information for release management. Every release has an associated JIRA tag for release to production. It's easier for the change-management people to look at JIRA itself and see what level of testing has been done and whether it passed or failed.
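
As a sketch of what that looks like in practice, a single query against JIRA's search endpoint returns every issue tagged for a release along with its current status; the JQL and release tag below are illustrative, not our actual naming:

    import requests

    JIRA_BASE = "https://jira.example.com"
    JQL = 'fixVersion = "release-to-prod" ORDER BY status'  # hypothetical tag

    resp = requests.get(
        f"{JIRA_BASE}/rest/api/2/search",
        params={"jql": JQL, "fields": "summary,status"},
        auth=("user", "api-token"),
        timeout=30,
    )
    resp.raise_for_status()

    # One row per issue in the release: key, current status, summary.
    for issue in resp.json()["issues"]:
        fields = issue["fields"]
        print(issue["key"], fields["status"]["name"], "-", fields["summary"])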

We use Selenium WebDriver for test automation. Our Python automation scripts are stored in Bitbucket, the central location where we keep all our automation code. We execute those scripts with Jenkins and then use a qTest plugin to push the results from Jenkins into qTest's test results once the executions are over. We can also run the same automation scripts through the qTest Automation Host feature: through the Launch feature we can kick off the automation scripts that live in Bitbucket. So we can use either Jenkins or qTest to run the automation scripts. Because of that reporting mechanism, test results flow directly into the execution tab, so senior staff can see how many scripts we ran, how many passed, and how many failed in a detailed Insights report. Jenkins has the ability to talk to qTest, whereas Quality Center, which we used previously, had no capability to talk with JIRA.
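
For context, the scripts themselves are ordinary Selenium WebDriver tests. Here is a minimal sketch of the kind of thing we keep in Bitbucket, assuming pytest and headless Chrome; the URL and assertion are placeholders:

    import pytest
    from selenium import webdriver


    @pytest.fixture
    def driver():
        # Headless Chrome so the same script runs unattended on a
        # Jenkins agent or the qTest Automation Host.
        options = webdriver.ChromeOptions()
        options.add_argument("--headless=new")
        drv = webdriver.Chrome(options=options)
        yield drv
        drv.quit()


    def test_login_page_loads(driver):
        driver.get("https://app.example.com/login")  # hypothetical app URL
        assert "Login" in driver.title

Running a suite like this with pytest --junitxml=results.xml produces the JUnit report that Jenkins parses, and those pass/fail results are what get pushed on to qTest.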

qTest also provides our team with a clear demarcation between which steps live in JIRA and which steps live in qTest. It's a positive feature. It improves our understanding of expectations: which requirements are to be filled in through JIRA, and which test-case management and controls are available in qTest. It also separates roles and responsibilities and allows people to work within their boundaries.

What is most valuable?

The solution's real-time integration with JIRA is seamless. With ALM/Quality Center, we had an additional plugin that provided a schedule-based integration between the tool and JIRA. Sometimes things errored out and there were too many integration issues, so we didn't have an up-to-date sync between JIRA and Quality Center. Quality Center was used predominantly by testers, while JIRA was being used by other users in our company, including PMs, DSAs, and devs. qTest solved one of those challenges for us.

The reports qTest provides to executives are pretty good because we mimic what we used to do with Quality Center: defect reports, throughput reports, aging reports, and an execution summary report showing how many test cases have been executed. The executives review them and find no difference between what we had in Quality Center and what we are doing in qTest.

What needs improvement?

We are starting to use qTest Insights a little bit. Right now, on a scale of one to five, I would say the Insights reporting engine is a three because we are facing some performance issues.

For example, qTest offers a baseline feature, but it lets you base the sort order for a specific story or requirement on only two fields. Our company has so many criteria and so many verticals that this baseline feature is not sufficient; we want another field to be available in the sort order. When tickets come over from JIRA, it would be helpful to be able to sort by sprint to begin with. Within a sprint there are labels or subcategories. Currently, it only allows us to sort on sprint and then subcategory. We would like to see things bucketed, or placed in a folder with a status, within the subcategory. We need three fields instead of two. When we raised this item, Tricentis said that it's a feature request.

Also, the features that are customizable or specific to a team are still not available in Insights reporting. We have submitted approximately 15 to 20 tickets to Tricentis so far to address those features, enhancements, and bugs. That's all in the works.

Another important issue involves attachments. When we export from JIRA, any attachments in JIRA are part of the export. But qTest only plugs into the test cases; it doesn't let us export the attachments from JIRA. Everybody else in the company who operates in JIRA would like to see the attachments that come through the integration links, meaning with the test cases. That is actually something that has to be addressed by Tricentis and not JIRA. Again, that required a new feature request to be submitted.
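
In the meantime, a possible interim workaround is to pull the attachments straight from JIRA's REST API rather than waiting on the integration. A rough sketch, with placeholder host, credentials, and issue key:

    import requests

    JIRA_BASE = "https://jira.example.com"
    ISSUE_KEY = "PROJ-123"
    AUTH = ("user", "api-token")  # replace with real credentials

    resp = requests.get(
        f"{JIRA_BASE}/rest/api/2/issue/{ISSUE_KEY}",
        params={"fields": "attachment"},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()

    # Each attachment entry carries a filename and a direct content URL.
    for att in resp.json()["fields"]["attachment"]:
        data = requests.get(att["content"], auth=AUTH, timeout=60)
        data.raise_for_status()
        with open(att["filename"], "wb") as fh:
            fh.write(data.content)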

For how long have I used the solution?

We have been using qTest for around six months.

What do I think about the scalability of the solution?

So far the scalability looks pretty good. I cannot say for sure because we have only been using it for six months, during which 26 projects have gone live and functional. So far, so good, but I cannot really speak to scalability yet.

From what I have heard from Tricentis, there is no restriction on data storage. In terms of latency, because the application itself is in the cloud, we shouldn't see any performance issues accessing qTest.

We have about 55 people, contract testers, who have access to the edit, add, and execute features. We have three admins. And we have about 100 people who are view-only users of qTest items. We don't require anybody to maintain the solution since it's hosted in the cloud.

We definitely anticipate increasing the number of projects in qTest.

How are customer service and technical support?

Technical support is friendly and quick. Most of the time we get a response the same day. They're located in Vietnam. There is a ticketing process. If we have an issue we open a ticket with them. If we need to, they will schedule a meeting with us to complete the request. They respond on time.

Representatives come over or Skype us to tell us about the next version date and the like, and we get communications from Tricentis indicating the rollout dates of new versions.

Which solution did I use previously and why did I switch?

We used to have Micro Focus ALM Quality Center as our test management tool, and we were nearing our licensing limit at the time. We evaluated a couple of tools in the market and picked qTest because it had a better reporting mechanism and dashboard features, along with a clean integration with JIRA.

How was the initial setup?

The initial setup was straightforward. There were clear project templates and clear user templates available. We were able to add and update roles as needed. The user list was already available; all we had to do was checkmark and save. Setting up users within the tool was really seamless.

Likewise, we could model it on a waterfall or agile template, and it then created the workstreams in the folder-structure mechanism within qTest. These are all good features that allowed us to quickly set things up and keep things moving.

It's hard to say how long it takes to set up qTest because it's handled by Tricentis. All they told us was that they had finished their deployment.

We were given a sandbox and some sample projects to evaluate and test. We had a month or so during which all our testers had access to those sample projects. We tested them and said we were good to go. The production environment was then made available for us to roll out our projects.

Our organization’s adoption of the solution has been pretty positive. Users were looking forward to it. They embraced it pretty quickly and pretty well.

What was our ROI?

It's too early to tell about a return on investment.

What's my experience with pricing, setup cost, and licensing?

I believe we have an annual subscription.

Which other solutions did I evaluate?

We evaluated QASymphony and QMetry. To begin with, we had a list of about ten tools that we researched on the internet and via some phone calls. We narrowed it down to these two and Tricentis.

The main differentiators were the dashboard and reporting mechanism, the artifact reporting mechanism, and the JIRA integration. Those were the reasons we chose Tricentis.

What other advice do I have?

It's a simple tool. The usability is pretty straightforward. For a QA tester who is an active user, the UI is pretty simple, the linkage of requirements to test cases is simple, and test cases are searchable across the project. Overall, it's better than Quality Center in the ways that I have explained.

My suggestion would be to always put your use cases up-front with vendors whose tools you're looking at. Ask for a demo and make sure that your use cases match up to the demo that's being performed. We had some challenges with the JIRA and Quality Center integration: real-time interfaces, the time lag, and visibility into the tool for everyone. These were challenges in Quality Center that qTest has overcome.

At this time, the reports section in qTest is used by QA managers and above. Our QA testers are not looking directly into the reports to gather stats. It's the QA managers, directors, and VP, as well as people in IT, who are starting to look at the metrics within the reports and trying to form a consensus on the reported bugs. Our testers just log bugs, and Insights reports are gathered by a QA lead to create a defect summary. Then the QA lead talks to the PM, dev, etc. So the reporting itself is not affecting the productivity of the testers. Overall, qTest is not affecting our testers' productivity.

In terms of executives or business users reviewing results provided by qTest, we are just starting that process. They are reviewing the results but there is no active, standardized communication, back and forth, on the artifacts that they review.

I can't say we have seen a decrease in critical defects in releases. qTest is not what enables people to put out the right requirements. Defect reduction comes from the upstream work: how requirements are written, how developers code, and how we test. qTest is just the tool that allows us to record a pass or fail. The tool itself is not an enabler of defect reduction.

It's a flexible tool but, overall, I cannot say that it has increased testing efficiency.

Which deployment model are you using for this solution?

Private Cloud
Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.