We performed a comparison between OpenText LoadRunner Enterprise and ReadyAPI Performance based on real PeerSpot user reviews.
Find out in this report how the two Load Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.

"The initial setup was straightforward. I was able to download everything myself without any IT support."
"The solution offers helpful guidelines and has good documentation."
"It's a very powerful tool."
"The most beneficial features of the solution are its flexibility and versatility in performance."
"LoadRunner Enterprise's most valuable features are load simulation and creating correlation for parameters."
"For me, LoadRunner stands out, especially with its reporting capabilities, the graphs that can be generated, and the unique feature of measuring our application's response alongside our infrastructure metrics, such as CPU, memory, or disk usage, all presented in graph form. This is something other applications struggle to match."
"It is also good for reporting purposes, which would be most familiar for QC and UFT users."
"For me, the test coverage and the performance and load testing aspects are valuable."
"ReadyAPI automation can help us validate the functionality of most web services, allowing us to find out the exact number of defects before deployment to the user interface."
"We find the product to be scalable."
"The initial deployment process is easy."
"It's like a centralized interface that allows us to increase the quality of our APIs."
"It provides good reports, improved compared with SoapUI. It also has built-in security testing; you just need to switch it on and check the security tests. My team has never used it, but I know ReadyAPI provides those facilities as well."
"The performance and reporting of this solution have been its most valuable features."
"We can scale."
"The solution can be improved by making it more user-friendly, and by including autocorrelation capability."
"LoadRunner Enterprise's reporting should be quicker, easier, and more flexible."
"It's not that popular on the cloud."
"I'd rate the scalability a six out of ten. The main reason is that it's a very expensive application. Other companies might not be able to afford it. For example, if we need to test an application with 10,000 concurrent users, the license can cost a lot of money. That's where OpenText tools shoot themselves in the foot compared to other tools. Because of the price, many companies, like one I used to work for, decided not to renew their licenses and switched to open-source testing tools."
"A room for improvement in Micro Focus LoadRunner Enterprise is that it should track multiple executions of a particular scenario and provide automatic trending for them. That would be a very useful feature, letting users see how many executions a scenario has had and how each performed, with the data visible in the Performance Center dashboard. For example, there's one scenario I run multiple times in a month; if I run it five times, there's no way for me to see the trend across those five executions. It would be great if Performance Center offered a view of all five executions, transaction by transaction. If Micro Focus LoadRunner Enterprise showed time trends and compared how each execution performed against the others, that would be an immense feature, and it should be visible to every user. Reporting should also be simpler: if I execute a scenario now and run or schedule it again later, there should be an option for a single view showing all the transactions, how performance was, the trend graph over time, and so on."
"Third-party product integrations could be a little more slickly handled."
"We are expecting more flexibility in using Jenkins for continuous integration going forward."
"The cost of the solution is high and can be improved."
"The solution’s interface could be improved."
"I'm not sure if they have the same level of documentation for performance and security testing."
"We need some time to understand and configure the solution."
"This solution could be improved by offering AI-based testing in addition to API testing. For example, we would like to have machine-learning testing because, when testing applications, manual work could be completed automatically using this functionality."
"This is an area for improvement with the tool. We unnecessarily use JMeter for some website testing; we would like to avoid that by using this tool for both API and load testing, since it provides load-testing features."
"I want the solution to be able to monitor Apache Kafka activity as well."
"It is very slow sometimes."
OpenText LoadRunner Enterprise is ranked 5th in Load Testing Tools with 81 reviews while ReadyAPI Performance is ranked 8th in Load Testing Tools with 7 reviews. OpenText LoadRunner Enterprise is rated 8.4, while ReadyAPI Performance is rated 8.2. The top reviewer of OpenText LoadRunner Enterprise writes "Saves time and effort, and makes it easy to set up scenarios and execute tests". On the other hand, the top reviewer of ReadyAPI Performance writes "Straightforward to install with the ability to add multiple assertions but the price is too high". OpenText LoadRunner Enterprise is most compared with OpenText LoadRunner Cloud, OpenText LoadRunner Professional, OpenText Silk Performer, Tricentis NeoLoad and Apache JMeter, whereas ReadyAPI Performance is most compared with SmartBear LoadNinja and Apache JMeter. See our OpenText LoadRunner Enterprise vs. ReadyAPI Performance report.
See our list of best Load Testing Tools vendors and best Performance Testing Tools vendors.
We monitor all Load Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.