We performed a comparison between OpenText Silk Test and OpenText UFT One based on real PeerSpot user reviews.
Find out in this report how the two Functional Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
"The statistics that are available are very good."
"Scripting is the most valuable. We are able to record and then go in and modify the script that it creates. It has a lot of generative scripts."
"A good automation tool that supports SAP functional testing."
"The major thing it has helped with is to reduce the workload on testing activities."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"The scalability of the solution is quite good. You can easily expand the product if you need to."
"The feature I like most is the ease of reporting."
"It's not only web-based but also for backend applications; you can also do the integration of the applications."
"I like the Help feature in UFT One. For example, if you are navigating a particular window where there are different options, you may not know the purpose of every option, but there is no need to search, because that window contains a Help button. If you click on that Help button, it navigates directly to the relevant help. VBScript is very easy to understand, and scripts are easy to prepare with a minimal learning curve."
"We have used it for the web and Windows-based applications. It is very productive in terms of execution."
"The scalability of Micro Focus UFT One is good."
"The most valuable features for us are the GUI, the easy identification of objects, and folder structure creation."
"The ease of record and playback as well as descriptive programming are the most valuable features of UFT (QTP)."
"It is a stable solution."
"With frequent releases, using automation to perform regression testing can save us a huge amount of time and resources."
"The solution has a lack of compatibility with newer technologies."
"We moved to Ranorex because the solution did not easily scale, and we could not find good, short-term third-party help. We needed a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area. It would be good if there were more community support. I don't know if Silk runs a user conference once a year or how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that.

The generated automated script was highly dependent upon screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex, and the other key information about a specific data point, to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to another. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it typically is faster, which is important."
"Everything is very manual. It's up to us to find out exactly what the issues are."
"The support for automation with iOS applications can be better."
"The pricing is an issue, the program is very expensive. That is something that can improve."
"Could be more user-friendly on the installation and configuration side."
"They should extend some of the functions that are a bit clunky and improve the integration."
"Object identification has room for improvement, to make it more efficient."
"Technical support could be improved."
"The product wasn't easy for developers to learn and pick up in the area revolving around scripting for automation, and there was a lot of resistance from developers, causing my company to rely on specialist resources."
"One area for improvement is its occasional slowness."
"The price is very high. They should work to lower the costs for their clients."
"The solution is expensive."
"There is a lot of room for improvement when it comes to friction-free continuous testing across the software life cycle, as a local installation is required to run UFT."
"I'd like to see test case-related reports included in the solution."
OpenText Silk Test is ranked 26th in Functional Testing Tools while OpenText UFT One is ranked 2nd in Functional Testing Tools with 89 reviews. OpenText Silk Test is rated 7.6, while OpenText UFT One is rated 8.0. The top reviewer of OpenText Silk Test writes "Stable, with good statistics and detailed reporting available". On the other hand, the top reviewer of OpenText UFT One writes "With regularly occurring releases, a QA team member can schedule tests, let the tests run unattended, and then examine the results". OpenText Silk Test is most compared with Selenium HQ, OpenText UFT Developer, Apache JMeter, froglogic Squish and Katalon Studio, whereas OpenText UFT One is most compared with Tricentis Tosca, OpenText UFT Developer, Katalon Studio, SmartBear TestComplete and Postman. See our OpenText Silk Test vs. OpenText UFT One report.
See our list of best Functional Testing Tools vendors, best Regression Testing Tools vendors, and best Test Automation Tools vendors.
We monitor all Functional Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.