We performed a comparison between HeadSpin and OpenText UFT Digital Lab based on real PeerSpot user reviews.
Find out in this report how the two Functional Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
"The most valuable feature is that this is the first connected intelligence all-in-one platform."
"The most valuable feature of HeadSpin is its integration with other solutions. It is great. I can search for an element or do a quick debugging on the application right on HeadSpin. It's very useful."
"It has an interesting feature called AV box testing. A lot of companies that are in the OTT segment don't really understand what their streaming is like. They can't test for streaming quality. There are restrictions where you cannot simulate live streaming. For example, on Netflix, you can't simulate how a movie is being streamed on a remote device. That's why HeadSpin has got this AV box testing feature. It is a patented feature. They send an AV box to your location, and you can test live streaming, which is something that no other company does."
"The technical support is really helpful because we can set up direct calls with them if we want to. We can use Zoom or Google Meet to interact with them directly, and if there is an issue in our system, they will help us by reproducing the issue in their machines and trying to figure out a solution. The support is really smooth, and we like that they're very supportive."
"The most valuable features of the product are the performance parameters it gives us."
"The initial setup of HeadSpin was very easy and user-friendly. It was easy to configure and write a script."
"For automation testing, the tool provides a record and playback option, which makes object detection easy."
"It is a complete solution for mobile application testing."
"The most valuable feature of this solution is virtualization."
"The product is easy to use."
"The fact that it allows users to test on real mobile devices instead of emulators is something that projects have told us is beyond compare."
"There are numerous valuable features, such as automation and the ones that facilitate importing and synchronization between our platform, Jira, and Azure DevOps."
"The solution is easy to use. There are features to orchestrate mobile testing, including mobile testing automation. You can test different devices at the same time."
"HeadSpin needs to improve the hardware. On mobile devices, battery life degrades and the devices must be continuously charged."
"Sometimes, devices go offline and some features are not functioning on some devices, specifically on iOS."
"They should automate their onboarding. A lot of things are still manual. They can create a video assistant or something like that to completely automate the entire process."
"Support and pricing could be improved."
"HeadSpin could improve on the user interface because it is very poor. The checks that are done on the iOS devices are very difficult, but for Android, it runs great. For all iOS devices, the user interface and how it interacts with the device are very poor."
"If you want to do some testing or check the devices manually or check the application in a particular device manually, it is really laggy. That's a disappointment because sometimes we would like to do manual testing when our local devices are not available."
"The product's object detection method needs to be improved, since better detection would help testers run more reliable tests."
"The documentation and user interface both need improvement."
"I would like to see more integration with automation tools."
"We like to host the tools centrally. We would need them to be multi-tenant, so different projects could log on and have their own set of devices and their own set of apps, and they wouldn't see data from other projects that are using it."
"We need to scale devices easily. Some customers would like to loop in AWS or other cloud providers to check whether their devices support the cloud model. OpenText UFT Digital Lab needs to improve on this."
"For the most part, the key challenge is ensuring that customers fully utilize the product as intended and adopt the appropriate frameworks to implement the solutions effectively."
"They should introduce a pay-per-use subscription model."
HeadSpin is ranked 19th in Functional Testing Tools with 6 reviews while OpenText UFT Digital Lab is ranked 20th in Functional Testing Tools with 16 reviews. HeadSpin is rated 8.0, while OpenText UFT Digital Lab is rated 7.4. The top reviewer of HeadSpin writes "It fulfills everything from automation to manual performance". On the other hand, the top reviewer of OpenText UFT Digital Lab writes "Robust solution for application lifecycle management with numerous valuable features". HeadSpin is most compared with Perfecto, Sauce Labs, BrowserStack, pCloudy and Tricentis Tosca, whereas OpenText UFT Digital Lab is most compared with OpenText UFT One, Appium, Perfecto, AWS Device Farm and Sauce Labs. See our HeadSpin vs. OpenText UFT Digital Lab report.
See our list of best Functional Testing Tools vendors and best Mobile App Testing Tools vendors.
We monitor all Functional Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.