We performed a comparison between HeadSpin and OpenText Silk Test based on real PeerSpot user reviews.
Find out what your peers are saying about Tricentis, OpenText, Perforce and others in Functional Testing Tools.
"The most valuable features of the product are the performance parameters it gives us."
"The initial setup of HeadSpin was very easy and user-friendly. It was easy to configure and write a script."
"It has an interesting feature called AV box testing. A lot of companies that are in the OTT segment don't really understand what their streaming is like. They can't test for streaming quality. There are restrictions where you cannot simulate live streaming. For example, on Netflix, you can't simulate how a movie is being streamed on a remote device. That's why HeadSpin has got this AV box testing feature. It is a patented feature. They send an AV box to your location, and you can test live streaming, which is something that no other company does."
"The most valuable feature is that this is the first connected intelligence all-in-one platform."
"The most valuable feature of HeadSpin is its integration with other solutions. It is great. I can search for an element or do a quick debugging on the application right on HeadSpin. It's very useful."
"The technical support is really helpful because we can set up direct calls with them if we want to. We can use Zoom or Google Meet to interact with them directly, and if there is an issue in our system, they will help us by reproducing the issue in their machines and trying to figure out a solution. The support is really smooth, and we like that they're very supportive."
"The feature I like most is the ease of reporting."
"Scripting is the most valuable feature. We are able to record and then go in and modify the script that it creates. It has a lot of generated scripts."
"The major thing it has helped with is to reduce the workload on testing activities."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"The statistics that are available are very good."
"The scalability of the solution is quite good. You can easily expand the product if you need to."
"A good automation tool that supports SAP functional testing."
"Sometimes, devices go offline and some features are not functioning on some devices, specifically on iOS."
"Support and pricing could be improved."
"If you want to do some testing or check the devices manually or check the application in a particular device manually, it is really laggy. That's a disappointment because sometimes we would like to do manual testing when our local devices are not available."
"They should automate their onboarding. A lot of things are still manual. They can create a video assistant or something like that to completely automate the entire process."
"HeadSpin needs to improve the hardware. On mobile devices, the battery life degrades and the devices must be continuously charged."
"HeadSpin could improve on the user interface because it is very poor. The checks that are done on the iOS devices are very difficult, but for Android, it runs great. For all iOS devices, the user interface and how it interacts with the device are very poor."
"The support for automation with iOS applications can be better."
"We moved to Ranorex because the solution did not easily scale, and we could not find good, short-term third-party help. We needed a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area. It would be good if there were more community support. I don't know if Silk runs a user conference once a year or how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that. The generated automated script was highly dependent upon screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex and the other key information about a specific data point to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to another. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it typically is faster, which is important."
"Could be more user-friendly on the installation and configuration side."
"They should extend some of the functions that are a bit clunky and improve the integration."
"The solution has a lack of compatibility with newer technologies."
"The pricing is an issue; the program is very expensive. That is something that could improve."
"Everything is very manual. It's up to us to find out exactly what the issues are."
HeadSpin is ranked 19th in Functional Testing Tools with 6 reviews while OpenText Silk Test is ranked 25th in Functional Testing Tools. HeadSpin is rated 8.0, while OpenText Silk Test is rated 7.6. The top reviewer of HeadSpin writes "It fulfills everything from automation to manual performance". On the other hand, the top reviewer of OpenText Silk Test writes "Stable, with good statistics and detailed reporting available". HeadSpin is most compared with Perfecto, Sauce Labs, BrowserStack, pCloudy and Tricentis Tosca, whereas OpenText Silk Test is most compared with Selenium HQ, OpenText UFT One, OpenText UFT Developer, Apache JMeter and froglogic Squish.
We monitor all Functional Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.