If you were talking to someone whose organization is considering Tricentis qTest, what would you say?
How would you rate it and why? Any other tips or advice?
I would recommend planning how you're going to organize the tool and having everybody use it the same way. You see this a lot in software: vendors build in flexibility thinking they're doing you a favor, because you can use the tool however you want. But if you have ten users, those ten users will use it ten different ways. If there's no flexibility at all, the ten users use it the same way. To me, that's almost better. Even if it's not exactly how we want it, at least it's consistent. Uniformity, over being able to choose exactly how I use it, would be my preference.

The biggest lesson I've learned from using qTest is that we need dedicated QA people. What happens is something like the following. I have a developer, Amos, who, thinking he's doing the right thing, goes in and loads up 20 tests and then gives them to the business to test. And they think, "Hey, the expectation is that I do exactly what this thing says." The problem is that we then only test from the developer's perspective. We're not actually getting the business to think about what they should look at or, better yet, developing a dedicated QA team that knows how to look for defects. It's a myopic perspective on testing, and because of it, we don't find as many defects as we otherwise would. That is not a qTest issue, though. If we had a dedicated testing team using qTest, that would be ideal.

We have not seen a decrease in critical defects in releases since we started using it, but I wouldn't blame qTest for that. It's more that we do not have a dedicated QA team. My management team seems to think that qTest is a substitute for a dedicated QA team, so we have the developers and the business desk use it to test. But developers and business users are not as good at finding defects as a dedicated QA team is.

In terms of maintenance and administration of the solution, we don't have anybody dedicated to those tasks.
People do whatever maintenance is needed to get their work done. It's mostly me who creates projects, adds users, etc. We have 56 users, primarily developers and people on the business side. Overall, it gets the job done, but it can be a struggle; it's not as intuitive to use as it could be.
Go for it; take a shot at it. Try it out with the 30-day free trial. If you find it to be a good fit for your company in terms of productivity and cost, go ahead and choose it. It's definitely a good tool.

The biggest thing we've learned from this tool is its ease of use. It's possible to manage the entire application lifecycle by moving between different tabs and options, and with everything on one screen it is easy for a QA person to get into it.

We have not used Insights that much. We have used it to some extent, but we haven't gone into the details of the graphics and the reporting. Because our own product is changing so often (the versions, the management, and the configuration of the product keep changing), we do not have a stable release for our product, so we are not set up completely with Insights. We are in the process of doing so.

About 40 percent of what we do is still manual testing; only 60 percent is automated. The basic aim is for at least 80 percent automation.

Our team working on qTest Manager is located in Ukraine, so a team leader there could provide more elaborate answers than I can. I'm leading it from our head office; the team in Ukraine uses it on a day-to-day basis.

I would rate qTest at seven out of ten. To make it a ten, there are a few things here and there that could be easier for the user, like giving popups between operations. When I want to delete something, it asks me, "Are you sure you want to delete?" But it does not do that everywhere. So there are some small things, here and there, that could really improve the tool.

It is supported in Chrome, Firefox, Safari, and IE11. I would like to see more browser compatibility options, like using it in Edge. And when I move between browsers, the formatting of the tool is not consistent.
qTest is something the whole software development team can utilize; it's not just for testers. You would probably get your main licenses for the testers but, for the rest of the team, who are in and out of the tool throughout the day, you can get a set of concurrent licenses.

The biggest thing I've learned from using this solution is that we should have done it sooner.

I've used Insights a little bit to help me with managing my people and it looks pretty cool, but that's about as far as I've gotten with it. And in terms of the solution's reporting enabling test team members to research errors from the run results, we haven't gotten quite that far yet. As we get more information in there, that will be the next step for us, so that they can start researching errors. We're really working on quality so that, hopefully, we're releasing good-quality software with no production issues.

We've got the UFT automation set up on the server. We just haven't finished putting in the scripts, seeing how to use the test execution part of qTest to run them, and seeing how the results are put into qTest. That's our next step.

Right now, I would say it's an eight out of ten, and that's just because we haven't made it through all the features of the product yet. But we are very happy with what we have seen so far.
Do a cost-benefit analysis. qTest is more costly than other tools. If you have multiple teams, it's going to be essential and worth buying. Apart from that, if cost is not a factor, there are more benefits from qTest and it's definitely a tool you can go for. All the features we have used are pretty impressive. The JIRA integration is the one thing you need to plan for carefully if it is critical to you.

It's a good investment for implementing a QA process. It creates more accountability in the team and makes a lot of things easier for managers as well. It simplifies a lot of QA processes. These are the things we've learned from using the solution. As other teams start using the tool, they should also be able to see and take advantage of these things.

Not many business users are using qTest. We share reports with them and they use them for management and other purposes. Primarily, qTest is used by the QA team only, but people take the reports as a starting point for discussion about things like product improvements. The business users rarely go into the tool to get the various details they need. Mostly, the reports are PDFs that we generate; those become their source instead of logging in and getting the information themselves.

The IT team maintains it, along with all the software we have installed on-premises. But we hardly have any maintenance requests for qTest. There have been a couple of outages but, apart from that, we have hardly had any maintenance requests.

We haven't seen any change in the number of defects. It mainly creates transparency, and accountability has increased. It's easily understandable, including the reports, and it's pretty comprehensive, providing all the essential details that we need to publish from any of the teams.

I would rate qTest at nine out of ten. It's a perfectly good tool.
It definitely serves its purpose and I can definitely recommend it.
It's a simple tool and the usability is pretty straightforward. For a QA tester who is an active user, the UI is simple, linking requirements to test cases is simple, and test cases are searchable across the project. Overall, it's better than Quality Center in the ways I have explained.

My suggestion would be to always put your use cases up-front with vendors whose tools you're looking at. Ask for a demo and make sure that your use cases match up to the demo being performed. We had some challenges with the JIRA and Quality Center integration, real-time interfaces, the time lag, and visibility into the tool for all people. These were challenges in Quality Center that qTest has overcome.

At this time, the reports section in qTest is used by QA managers and above. Our QA testers are not looking directly at the reports to gather stats. It's the QA managers, directors, and VP, as well as people in IT, who are starting to look at the metrics within the reports and trying to form a consensus on reporting bugs. Our testers just log bugs, and Insights reports are gathered by a QA lead to create a defect summary. Then the QA lead talks to the PM, dev, etc. So the reporting itself is not affecting the productivity of the testers. Overall, qTest is not affecting our testers' productivity.

In terms of executives or business users reviewing results provided by qTest, we are just starting that process. They are reviewing the results, but there is no active, standardized back-and-forth communication on the artifacts they review.

I can't say we have seen a decrease in critical defects in releases. qTest is not what enables people to put out the right requirements. Defect reduction comes from the upstream work: how they write the requirements, how they code, how we test. qTest is just the tool that allows us to record a pass or fail. The tool itself is not an enabler of defect reduction.
It's a flexible tool but, overall, I cannot say that it has increased testing efficiency.
What I've learned from using the solution is "don't be afraid of change." HP was the Blockbuster of our industry. There are a lot of great options out there. Do your due diligence and be brave.

Also, have a plan. It's not something you want to go into and figure out as you go. You need to really sit down and consider where you are, where you want to go, and what variables are going to help you figure out how to implement it. It's just like any other software package: you need to have a plan. You need to have a training plan. You need to make sure your team understands the opportunity and what they're going to get out of it. It can be scary, so you have to manage change as much as you have to manage the implementation.

In terms of using qTest to look into failures, we haven't really enabled that part of it yet. We use Selenium for all of our open-source test automation, and that's a little different than using the Tricentis application. We're a Java and .NET shop, so we wanted to go with an open-source tool so we could hire Java developers for automation. We know there are some exploratory options in qTest, and we will start setting the roadmap for 2020 in that direction. We definitely want to expand what we're using within the product now.

Getting our upgrade to 9.7.1 was really significant for us. This past year has been a migration year. We got to steady state around June. I wanted to get everyone to steady state, spend some time with it, get our upgrade behind us, and then start to expand out to other pieces of functionality in 2020.

We use qTest to provide results to executives, but the data is usually sanitized through my team a little bit, just because we're still getting used to the cycles and execution. We can do multiple runs, and we have to get down to the actual results, not the results across the overall multiple runs.
I use qTest for that, and that information gets cleaned up before it goes out to the executives. The information it provides them is accurate.

qTest is an eight out of ten at this point. For me, it's been the metrics. Every company counts things differently. Understanding their out-of-the-box reports and aligning the solution to where I want it to be (do I massage it to maintain the metric I have, do I wait for a breaking point, or do I redefine my calculation?) is where those two points go. That has taken a little more effort than I would've expected. All the data is there; it's just a matter of how you're layering it out for your company. Knowing what our calculations are, what the qTest calculations are, and where they diverge would've been really great. We were a little more naive than we had planned for.

I hope Tricentis keeps it alive and well. It's a great little product that is going to quickly grow into something that competes with the big boys.
The biggest lesson I have learned from using qTest is that every tool has limitations and you need to be able to adapt and overcome, and not be stuck with one way of doing things. You have to find out where the tool shines and where it doesn't, and make sure the team feels the least amount of pain when you do implement it.

This solution has been implemented for one particular project. We have 60 concurrent licenses available and about 120 users who have been given access. Their roles in the project are either business analyst or quality tester, but these people also have their roles within the business. Some are managers within finance, some are directors, some are AP specialists, some are AR specialists. The project is a financial system implementation, so we have a sampling of users from all departments executing scripts.

Since implementing the tool, we've seen a decrease in critical defects, but I don't know if I can attribute it to the tool. I don't know if that's possible; it might be a stretch. But we definitely have seen a drop in critical defects. Over the last four months we have seen a 40 to 60 percent drop.

For deployment and maintenance of the solution, it's just me. I'm the one who picked the tool, I'm the one who implemented it, I'm the main administrator, and I am leading all of the testing efforts. Setting up users is pretty simple.

I would recommend it. If you're looking for something quick, easy to use, and robust, it's definitely a very good tool. If I could get them to upgrade the Defects module, I would be very happy. I do love it. I'm giving it a nine out of ten just because I don't think any tool out there is a ten, but Tricentis is close.
Make sure you set it up the way you do your business. The process is essential, not just the tool you're using to manage it. The biggest lesson I have learned from using qTest is that I should have used it years ago. We should have had this a long time ago, not just five years ago.

I send out periodic reports of all the metrics that we track, usually twice a year.

We use other tools for keeping track of the tasks that have to be done on each of the projects. We use Microsoft Planner. It makes it easier for people to actually do their assignments and then let us know that the tasks are completed. If we had the JIRA tool or something of that nature, that would help the process but, at this time, we don't use that functionality.
The biggest lesson I have learned is that the transition process is only difficult if you drag it out. Transitioning to a new product needs to happen quickly. It needs to be a top-down decision, and the information needs to be disseminated to everybody in a quick and efficient manner. We saw that happen easily with qTest, and that sold me on the lesson I learned about implementing new, global-enterprise software.

qTest is a great solution. It should definitely be at the top of your list when you're looking at test case management solutions. It's really the service and support from Tricentis that set it apart. In addition, its integrations with the systems we are already using are the force multiplier that makes qTest even more efficient than just another tool that people have to use on a regular basis. It has become the only tool they have to use on a regular basis.

In our company, executives and business users don't review results provided by qTest because that would be more of an Insights thing, and we don't really use that. They see the testing status in their tool of choice because we have links to JIRA, so they don't have to review the testing status within qTest. They don't log into qTest at all. They see the information they want through our links with the ticketing system.

The solution doesn't really help us to quickly solve issues when they occur, but I don't feel that's its job. Its job isn't to help me solve issues. Its job is to make sure that I'm aware there are issues that need solving, and that the information is distributed to all the people who need it, right when it happens. There are some things in there that help me figure out what's going on and what I need to do to fix a problem; it depends, of course, on the problem. But I don't feel that qTest's job is to solve problems.
qTest's job is to make sure that I'm aware of the status of problems, that there are problems, and whether or not they're being worked on.
The biggest lesson I've learned from using the solution, because of the Insights challenge, is that I would probably do more of a formal trial next time. They are aware there are issues with it, and they are going to work on it. Absolutely use qTest for its test management capabilities, without a doubt, but have an alternative solution for your reporting metrics.

Testing with the tool is not going to change the result of the testing; it's just that the means are more efficient. Our testing scope has been the same and our processes have all been the same, but we're implementing a tool that's a little more organized. We're not going to become better testers just because we're tracking things a little bit differently. It gives us more efficiency and an overall improvement in the transparency and visibility of testing progress and status.

qTest has been pretty rock-solid.