We compared OpenText ALM / Quality Center and OpenText Silk Test based on real PeerSpot user reviews.
Find out what your peers are saying about Microsoft, Atlassian, Nutanix and others in Application Lifecycle Management (ALM) Suites.

"I love linking/associating the requirements to a test case. That's where I get to know my requirement coverage, which helps a lot at a practical level. So, we use the traceability and visibility features a lot. This helps us to understand if there are any requirements not linked to any test case, and thus not getting tested at all. That missing link is always very visible, which helps us to create our requirement traceability matrix and maintain it in a dynamic way. Even with changing requirements, we can keep changing or updating the tool."
"I found the ease of use most valuable in Micro Focus ALM Quality Center. Creating test cases is easier because the solution allows writing them in Excel."
"Within Quality Center, you have the dashboard where you can monitor your progress over different entities. You can build your own SQL query segments, and all that data is there in the system, then you can make a dashboard report."
"You can do your development from start to finish: starting with the requirements, ending with defects, and testing in-between."
"ALM Quality Center is a reliable, consolidated product."
"Business process management is the most valuable feature of the solution."
"Easily integrates with Oracle e-Business Suite."
"Cross-project customization through templates really helps maintain standards for fields and workflows across all available projects."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"The scalability of the solution is quite good. You can easily expand the product if you need to."
"The major thing it has helped with is to reduce the workload on testing activities."
"Scripting is the most valuable. We are able to record and then go in and modify the script that it creates. It has a lot of generative scripts."
"The statistics that are available are very good."
"A good automation tool that supports SAP functional testing."
"The feature I like most is the ease of reporting."
"There are always new features and more support for new and legacy technology architectures with each release. But the bad news is a growing list of long-standing issues with the product rarely gets addressed."
"It is nice, but it does have some weaknesses. It's a bit hard to go back and change the requirement tool after setup."
"It's not intuitive in that way, which has always been a problem, especially with business users."
"We operate in Sweden, and there are not so many Swedish people that know the product."
"We have had a poor experience with customer service and support."
"As soon as it's available on-premises we want to move to ALM Octane as it's mainly web based, has the capability to work with major tests, and integrates with Jenkins for continuous integration."
"We would like to have support for agile development."
"They should specify every protocol or process with labels or names."
"Everything is very manual. It's up to us to find out exactly what the issues are."
"The solution has a lack of compatibility with newer technologies."
"The support for automation with iOS applications can be better."
"The pricing is an issue, the program is very expensive. That is something that can improve."
"They should extend some of the functions that are a bit clunky and improve the integration."
"We moved to Ranorex because the solution did not easily scale, and we could not find good, short-term third-party help. We needed a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area.

It would be good if there were more community support. I don't know if Silk runs a user conference once a year or how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that.

The generated automated script was highly dependent upon screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex, and the other key information about a specific data point, to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to another. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it typically is faster, which is important."
"Could be more user-friendly on the installation and configuration side."
OpenText ALM / Quality Center is ranked 6th in Application Lifecycle Management (ALM) Suites with 197 reviews, while OpenText Silk Test is ranked 25th in Functional Testing Tools. OpenText ALM / Quality Center is rated 8.0, while OpenText Silk Test is rated 7.6. The top reviewer of OpenText ALM / Quality Center writes "Offers features for higher-end traceability and integration with different tools but lacks in scalability". On the other hand, the top reviewer of OpenText Silk Test writes "Stable, with good statistics and detailed reporting available". OpenText ALM / Quality Center is most compared with Microsoft Azure DevOps, OpenText ALM Octane, Jira, Tricentis qTest and Zephyr Enterprise, whereas OpenText Silk Test is most compared with Selenium HQ, OpenText UFT One, OpenText UFT Developer, Apache JMeter and froglogic Squish.
We monitor all Application Lifecycle Management (ALM) Suites reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.