We performed a comparison between OpenText Silk Test and SmartBear TestComplete based on real PeerSpot user reviews.
Find out in this report how the two Test Automation Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.

"The major thing it has helped with is to reduce the workload on testing activities."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"A good automation tool that supports SAP functional testing."
"The scalability of the solution is quite good. You can easily expand the product if you need to."
"The feature I like most is the ease of reporting."
"Scripting is the most valuable. We are able to record and then go in and modify the script that it creates. It has a lot of generative scripts."
"The statistics that are available are very good."
"This company offers end-to-end capabilities for test suite creation and execution. One feature that I particularly appreciate is the tagging system. Tags are highly valuable, as they allow you to assign tags to your test cases. When there's an impact in a specific area, you can search for and run all test cases associated with that tag. I find this functionality very useful."
"It allows us to test both desktop and web applications."
"The solution helps improve the stability of our product. It also decreases the work of our manual quality assurance engineers."
"The most valuable features are the desktop and mobile modules."
"The solution is great as a record and playback tool. It also has valuable regression testing."
"Customer service and technical support responsiveness are high. Everyone is very professional."
"TestComplete fits almost perfectly with a large amount of stacks, such as Delphi, C#, Java and web applications."
"Its cross-platform automation capabilities, especially ranging across web, UNIX (via PuTTY), and other systems."
"The solution has a lack of compatibility with newer technologies."
"Could be more user-friendly on the installation and configuration side."
"We moved to Ranorex because the solution did not easily scale, and we could not find good, short-term third-party help. We needed a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area. It would be good if there were more community support. I don't know if Silk runs a user conference once a year or how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that. The generated automated script was highly dependent upon screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex, and the other key information about a specific data point, to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to another. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it typically is faster, which is important."
"The pricing is an issue, the program is very expensive. That is something that can improve."
"Everything is very manual. It's up to us to find out exactly what the issues are."
"They should extend some of the functions that are a bit clunky and improve the integration."
"The support for automation with iOS applications can be better."
"Right now, when you buy the solution, you need to pay for one solution. You receive one set up and you install it and it's just in that one machine. It would be ideal if they could offer one subscription where you can connect to different machines with a group subscription."
"SmartBear products generally have a weak link when it comes to integration with other test management tools like Inflectra."
"Increased performance with less memory and CPU usage."
"The test object repository needs to be improved. The hierarchy and the way we identify the objects in different applications, irrespective of technology, need adjustments. The locators and test objects are not as flexible compared to other commercial tools."
"The Name Mapping feature should be clearer. Whenever I use it, I do not really know what will work and what will not work."
"It is very hard to read the test log generated by TestComplete Executor if the log file is very big. TestComplete Executor is a small tool for just running the TestComplete test framework (not for developing)."
"The licensing costs are a little bit high and should be reduced."
"The integration tools could be better."
OpenText Silk Test is ranked 24th in Test Automation Tools, while SmartBear TestComplete is ranked 7th in Test Automation Tools with 71 reviews. OpenText Silk Test is rated 7.6, and SmartBear TestComplete is also rated 7.6. The top reviewer of OpenText Silk Test writes "Stable, with good statistics and detailed reporting available". On the other hand, the top reviewer of SmartBear TestComplete writes "A stable product that needs to improve its integration capabilities with other test management tools". OpenText Silk Test is most compared with Selenium HQ, OpenText UFT One, OpenText UFT Developer, and Apache JMeter, whereas SmartBear TestComplete is most compared with Tricentis Tosca, Katalon Studio, Ranorex Studio, OpenText UFT One, and froglogic Squish. See our OpenText Silk Test vs. SmartBear TestComplete report.
See our list of best Test Automation Tools vendors, best Functional Testing Tools vendors, and best Regression Testing Tools vendors.
We monitor all Test Automation Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.