We performed a comparison between OpenText LoadRunner Cloud and OpenText LoadRunner Enterprise based on real PeerSpot user reviews.
Find out in this report how the two Performance Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.

"Keeping up with DevOps, the best feature of StormRunner is that we don't have to build and maintain infrastructure anymore."
"The TruClient feature is the most valuable for us. One of our applications can only be scripted using TruClient; it's partly web-based, but it also has its own proprietary protocol combined with HTTP and HTML. Many other tools do not recognize this specific protocol. Using TruClient, we can still create scripts that cover everything we need to cover."
"The product supports a wide variety of technology compared to any other tool."
"The record and playback feature is the most valuable feature. It's all driven by the script, so it's a script-based tool where background tracing starts. Java's background process does a lot of tracing and sees what peaks of volume the process can handle. It's easy to use because it's script-based, with record and playback."
"This solution is SaaS-based, so we can utilize cloud technology, which is less time-consuming and saves a lot of money."
"It's fast, easy to use, has a user-friendly UI, and you can split users."
"The reports are very relevant to the customers’ expectations."
"The most valuable feature is having load generators in countries where we don’t have access to them."
"What we call the LoadRunner analysis is the most useful aspect of the solution."
"With LoadRunner Enterprise, doing various types of performance testing, load testing, and automation testing has been very helpful for some of the teams."
"We haven't had an outage since we started using the solution."
"Micro Focus LoadRunner Enterprise is very user-friendly."
"It offers easy integration with third-party tools like Dynatrace, Splunk, etc."
"Support is nice, quick, and responsive."
"This is a product that has a lot of capabilities and is the most mature tool of its kind in the market."
"The host performance testing of any application using a host/controller is the most valuable feature."
"There are three modules in the system that are different products packaged into one, and they can sometimes be difficult to figure out, so they should be better integrated with each other."
"An area for improvement is analytics on why response times are slow from certain countries."
"We did have some challenges with the initial implementation."
"One area of improvement in the software's support is the replaying of captured data within the development environment. It would be beneficial if the replay feature could accurately mimic what the actual application is doing for better analysis and testing."
"CI/CD integration could be a little bit better. When there's a test and if you see that there are high response times in the test itself, it would be great to be able to send an alert. It would give a heads-up to the architect community or ops community."
"In terms of new features, they can natively integrate with Chaos engineering tools such as Chaos Monkey and AWS FIS. With LoadRunner, we can generate load, and if Chaos tools are also supported natively, it will help to get everything together."
"The product price could be more affordable."
"The product must provide agents to monitor servers."
"We'd like the product to include protocol identifiers whenever a tester wants to test a new application."
"Micro Focus LoadRunner Enterprise needs to add more features for testing Citrix performance-based applications. This was one of the challenges we observed. Additionally, we experienced some API challenges."
"It is tough to maintain from the infrastructure side."
"The solution is a very expensive tool when compared with other tools."
"The solution is expensive."
"New features have been added in the latest version, and the DevOps integration needs to be improved."
"After they get over the acquisition, the first improvement is going to be tailoring it for their existing stack of other products. How would LoadRunner work for Documentum? How would it work for Business Network? How would it work for other apps? They can have a pre-package or a guide because they are all in the same family as opposed to being outside."
"I'd rate the scalability a six out of ten. The main reason is that it's a very expensive application. Other companies might not be able to afford it. For example, if we need to test an application with 10,000 concurrent users, the license can cost a lot of money. That's where OpenText tools shoot themselves in the foot compared to other tools. Because of the price, many companies, like one I used to work for, decided not to renew their licenses and switched to open-source testing tools."
OpenText LoadRunner Cloud is ranked 6th in Performance Testing Tools with 39 reviews while OpenText LoadRunner Enterprise is ranked 5th in Performance Testing Tools with 81 reviews. OpenText LoadRunner Cloud is rated 8.2, while OpenText LoadRunner Enterprise is rated 8.4. The top reviewer of OpenText LoadRunner Cloud writes "Enterprise modeling, server maintenance, and competitive pricing". On the other hand, the top reviewer of OpenText LoadRunner Enterprise writes "Saves time and effort, and makes it easy to set up scenarios and execute tests". OpenText LoadRunner Cloud is most compared with Tricentis NeoLoad, OpenText LoadRunner Professional, BlazeMeter, Apache JMeter and OpenText UFT One, whereas OpenText LoadRunner Enterprise is most compared with OpenText LoadRunner Professional, OpenText Silk Performer, Tricentis NeoLoad, Apache JMeter and OpenText ALM / Quality Center. See our OpenText LoadRunner Cloud vs. OpenText LoadRunner Enterprise report.
We monitor all Performance Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.