We performed a comparison between OpenText Silk Test and Sauce Labs based on real PeerSpot user reviews.
Find out in this report how the two Functional Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
"The statistics that are available are very good."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"A good automation tool that supports SAP functional testing."
"The major thing it has helped with is to reduce the workload on testing activities."
"The feature I like most is the ease of reporting."
"The scalability of the solution is quite good. You can easily expand the product if you need to."
"Scripting is the most valuable. We are able to record and then go in and modify the script that it creates. It has a lot of generative scripts."
"It offers the single best solution for integrating deep automated browser testing in a CI/CD pipeline."
"Live device testing. As we all know, it's really hard and challenging to find or purchase many real devices for testing, because it is costly and not every team can afford all of the devices out there. We used to have a lot of real devices in our labs. However, it is really time-consuming to maintain those devices and keep them up to date with the testing requirements."
"The most critical thing is that this software aligns with our Agile and DevOps way of doing things. It integrates with kickoff scripts through DevOps."
"The most valuable feature is the ability to run concurrent automated tests up to a specified value, depending on what we are currently paying for."
"From an infrastructure support perspective, the number of VMs, browsers installations and versions that we would be maintaining without Sauce Labs would be a lot. This includes not only the infrastructure costs, but also the maintenance costs and people's time. The labor cost associated with maintaining all of that would be considerably high. In terms of efficiency, having concurrent VMs with various browser combinations available has allowed us to run multiple executions by all our teams."
"It has significantly enhanced our testing accuracy by approximately 50%."
"It provides zero-maintenance browser instances."
"I like the dashboard and seeing the test results. As a manager, I like to see the insights of the people using it, understanding the total path and run. I can see all of that as a manager. I also know team members love seeing the dashboard and seeing the test results in real-time."
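Several reviewers above describe running concurrent Selenium tests against Sauce Labs' hosted browsers from a CI/CD pipeline. As a rough illustration of what that setup involves, the sketch below composes the remote WebDriver endpoint and a capabilities payload in the shape Sauce Labs documents; the exact region names, environment variables, and `sauce:options` keys should be verified against current Sauce Labs documentation before use.

```python
import os

def sauce_remote_url(username: str, access_key: str,
                     region: str = "us-west-1") -> str:
    """Compose a remote WebDriver endpoint in the Sauce Labs pattern."""
    return (f"https://{username}:{access_key}"
            f"@ondemand.{region}.saucelabs.com/wd/hub")

def sauce_capabilities(build: str, name: str) -> dict:
    """Browser capabilities plus Sauce-specific options for one test run.

    The 'sauce:options' block carries metadata (build/test names) that
    surfaces in the dashboard reviewers mention above.
    """
    return {
        "browserName": "chrome",
        "browserVersion": "latest",
        "platformName": "Windows 11",
        "sauce:options": {"build": build, "name": name},
    }

# Credentials typically come from CI secrets, not source control.
url = sauce_remote_url(os.environ.get("SAUCE_USERNAME", "user"),
                       os.environ.get("SAUCE_ACCESS_KEY", "key"))
caps = sauce_capabilities("nightly-123", "login-smoke-test")
# A real session would then be opened with something like
# selenium.webdriver.Remote(command_executor=url, options=...),
# which is what lets many such sessions run concurrently in the cloud.
```

Because the browsers live in Sauce Labs' cloud, the CI runner only needs network access and credentials, which is the infrastructure saving one reviewer quantifies above.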
"The solution has a lack of compatibility with newer technologies."
"The pricing is an issue, the program is very expensive. That is something that can improve."
"They should extend some of the functions that are a bit clunky and improve the integration."
"Everything is very manual. It's up to us to find out exactly what the issues are."
"Could be more user-friendly on the installation and configuration side."
"The support for automation with iOS applications can be better."
"We moved to Ranorex because the solution did not easily scale, and we could not find good, short-term third-party help. We needed a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area. It would be good if there were more community support. I don't know if Silk runs a user conference once a year or how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that.

The generated automated script was highly dependent upon screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex, and the other key information about a specific data point, to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to another. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it typically is faster, which is important."
"Sauce Labs can include new technologies like generative AI capabilities."
"They should provide a JIRA integration plugin so that we can easily log issues."
"On a rare occasion, I will come into a ticket where a customer will have reached out to me after reaching out to Sauce Labs, saying, "Sauce Labs doesn't understand what I am going through. They are not being very helpful." So, I try to do clean up there. Outside of those extremely rare occasions, I have only had one or two of those support issues."
"Lacks the ability to start multiple tests simultaneously."
"Unable to segregate reports for tests that are still in development and might not be returning useful results."
"Running tests in the SauceCloud can take longer than running in a local environment."
"I can't remove team members that have left the organization. I can only set them as inactive. It would be really nice to clean up my data and delete them from the team management."
"Better and programmatic controls on request/response recordings and sharing with developers."
OpenText Silk Test is ranked 26th in Functional Testing Tools while Sauce Labs is ranked 11th in Functional Testing Tools with 113 reviews. OpenText Silk Test is rated 7.6, while Sauce Labs is rated 8.8. The top reviewer of OpenText Silk Test writes "Stable, with good statistics and detailed reporting available". On the other hand, the top reviewer of Sauce Labs writes "Robust documentation, helpful support representative, good licensing model". OpenText Silk Test is most compared with OpenText UFT One, Selenium HQ, OpenText UFT Developer, Apache JMeter and froglogic Squish, whereas Sauce Labs is most compared with BrowserStack, Perfecto, LambdaTest, Bitbar and Tricentis Tosca. See our OpenText Silk Test vs. Sauce Labs report.
See our list of best Functional Testing Tools vendors, best Test Automation Tools vendors, and best Regression Testing Tools vendors.
We monitor all Functional Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.