We performed a comparison between HeadSpin and Parasoft SOAtest based on real PeerSpot user reviews.
Find out in this report how the two Functional Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.

"The technical support is really helpful because we can set up direct calls with them if we want to. We can use Zoom or Google Meet to interact with them directly, and if there is an issue in our system, they will help us by reproducing the issue on their machines and trying to figure out a solution. The support is really smooth, and we like that they're very supportive."
"It has an interesting feature called AV box testing. A lot of companies that are in the OTT segment don't really understand what their streaming is like. They can't test for streaming quality. There are restrictions where you cannot simulate live streaming. For example, on Netflix, you can't simulate how a movie is being streamed on a remote device. That's why HeadSpin has got this AV box testing feature. It is a patented feature. They send an AV box to your location, and you can test live streaming, which is something that no other company does."
"The most valuable features of the product are the performance parameters it gives us."
"The most valuable feature of HeadSpin is its integration with other solutions. It is great. I can search for an element or do a quick debugging on the application right on HeadSpin. It's very useful."
"The most valuable feature is that this is the first connected intelligence all-in-one platform."
"The initial setup of HeadSpin was very easy and user-friendly. It was easy to configure and write a script."
"Good file read and write capabilities, which save execution inputs and outputs and can be stored locally."
"We have seen a return on investment."
"If you want something that’s not provided out of the box, then you can write it yourself and integrate it with SOAtest."
"The testing time is shortened because we generate test data automatically with SOAtest."
"Parasoft SOAtest has improved the quality of our automated web services, which can be easily implemented through service chaining and service virtualization."
"Automatic testing is the most valuable feature."
"We do a lot of web services testing and REST services testing. That is the focus of this product."
"Since the solution has both command line and automation options, it generates good reports."
"Support and pricing could be improved."
"If you want to do some testing or check the devices manually or check the application in a particular device manually, it is really laggy. That's a disappointment because sometimes we would like to do manual testing when our local devices are not available."
"Sometimes, devices go offline and some features are not functioning on some devices, specifically on iOS."
"HeadSpin needs to improve the hardware. On mobile devices, battery life degrades and the devices must be charged continuously."
"HeadSpin could improve the user interface because it is very poor. Checks run on iOS devices are very difficult, but on Android it runs great. For all iOS devices, the user interface and how it interacts with the device are very poor."
"They should automate their onboarding. A lot of things are still manual. They can create a video assistant or something like that to completely automate the entire process."
"The product is very slow to start up, and that is a bit of a problem, actually."
"During the process of working with SOAtest and building test cases, the .TST files will grow. A negative side effect is that saving your changes takes more time."
"Reports could be customized and more descriptive according to the user's or company's requirements."
"Enabling/disabling an optional element of an XML request is only possible if a data source (e.g., Excel sheet) is connected to the test. Otherwise, the option is not available at all in the drop-down menu."
"Reporting facilities can be better."
"The feedback that we received from the DevOps of our organization was that the tool was a little heavy from the transformation perspective."
"Tuning the tool takes time because it gives quite a long list of warnings."
"UI testing should be more in-depth."
HeadSpin is ranked 19th in Functional Testing Tools with 6 reviews while Parasoft SOAtest is ranked 23rd in Functional Testing Tools with 30 reviews. HeadSpin is rated 8.0, while Parasoft SOAtest is rated 8.2. The top reviewer of HeadSpin writes "It fulfills everything from automation to manual performance". On the other hand, the top reviewer of Parasoft SOAtest writes "Reliable with a good interface but uses too much memory". HeadSpin is most compared with Perfecto, Sauce Labs, BrowserStack, pCloudy and Tricentis Tosca, whereas Parasoft SOAtest is most compared with Postman, SonarQube, Coverity, Polyspace Code Prover and Klocwork. See our HeadSpin vs. Parasoft SOAtest report.
See our list of best Functional Testing Tools vendors.
We monitor all Functional Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.