We compared Micro Focus LoadRunner Enterprise and Micro Focus LoadRunner Professional across five categories, based on our users’ reviews. Our conclusion, drawn from all of the collected data, is presented below.
Comparison Results: Micro Focus LoadRunner Professional comes out on top in this comparison. Micro Focus LoadRunner Professional is a mature and feature-rich solution with a proven ROI, whereas Enterprise users report being dissatisfied with the product’s ROI.
"The user interface is fine."
"Provides reliable performance for load testing applications, along with good reporting."
"We have Performance Center as a platform to share with others who don't do performance testing full-time, so that they can, in an agile fashion, get real issue-finding testing done on demand."
"The solution is a very user-friendly tool, especially when you compare it to a competitor like BlazeMeter."
"The most valuable part of the product is the way you can scale the basic testing easily."
"Now that LoadRunner integrates with Dynatrace and other monitoring tools, it simplifies the process of integration into a company, taking merely five minutes to set up. This ease of integration allows for quick comparison of monitoring and performance results, a feature I highly appreciate."
"We can measure metrics like hits per second and detect deviations or issues through graphs. We can filter out response times based on timings and identify spikes in the database or AWS reports."
"Creating the script is very easy and user friendly."
"Stability-wise, I rate the solution a nine out of ten...Scalability-wise, I rate the solution a nine out of ten."
"The Analysis feature makes it easy to analyze cross-data and we can pin to the focus period."
"The number of protocols it supports is valuable, especially for SAP GUI-based performance testing."
"It has good protocol coverage."
"Enables us to test most of the products and projects that we have across all the different technologies, without having to look at other tools."
"The initial setup and installation of the software were very easy and straightforward."
"The reporting mechanism is a valuable feature that generates good reports."
"LoadRunner is a very systematic tool for anyone to use. Even someone who is actually a first time user of LoadRunner can actually get a lot of benefit out of the tool."
"New features have been added in the latest version, but the DevOps integration needs to be improved."
"More real-time monitoring should be available for the system under test."
"The product's scalability must be improved."
"A room for improvement in Micro Focus LoadRunner Enterprise is that it should track multiple executions of a particular scenario and provide automatic trending for them. This would be a very useful feature, letting users see how many executions of a scenario have happened and how each performed, with the data visible within the Performance Center dashboard. For example, there's one scenario I run multiple times in a month; if I run it five times, there's no way for me to see the trend across those five executions. It would be great if Performance Center had a view of all five executions, transaction by transaction. If Micro Focus LoadRunner Enterprise showed the time trends and how each execution compared to the others, it would be an immense feature, and it should be visible to every user. Reporting should also be simpler in Micro Focus LoadRunner Enterprise. If I run a scenario once and then run it again, I should be able to schedule that scenario, and if it is executed multiple times, there should be an option for a single view showing all the transactions, how the performance was, the trend graph over a particular time period, etc."
"They need to focus on minimizing the cost."
"Micro Focus's technical support could be more responsive."
"They wanted to change the GUI to improve the look and feel. However, since that change, we have seen a lot of hanging issues."
"I think better support for cloud-based load generators would help. For example, integrate with Amazon AWS so you can quickly spin up a load generator in the cloud, use it, spin it down."
"If they can make LoadRunner more comprehensive, it would really help."
"I recently just got to see LoadRunner Developer, but it is still not fully developed to use."
"The price of this solution should be cheaper."
"We'd like the solution to be a bit more user-friendly."
"The solution needs to reduce its pricing. Right now, it's quite expensive."
"Licensing costs could be reduced."
"There should be more integration with more open-source platforms."
"I would like to see better-licensing costs."
OpenText LoadRunner Enterprise is ranked 5th in Performance Testing Tools with 81 reviews while OpenText LoadRunner Professional is ranked 2nd in Performance Testing Tools with 77 reviews. OpenText LoadRunner Enterprise is rated 8.4, while OpenText LoadRunner Professional is rated 8.4. The top reviewer of OpenText LoadRunner Enterprise writes "Saves time and effort, and makes it easy to set up scenarios and execute tests". On the other hand, the top reviewer of OpenText LoadRunner Professional writes "A sophisticated tool that supports many languages and works with all kinds of applications". OpenText LoadRunner Enterprise is most compared with OpenText LoadRunner Cloud, OpenText Silk Performer, Tricentis NeoLoad, Apache JMeter and OpenText ALM / Quality Center, whereas OpenText LoadRunner Professional is most compared with Tricentis NeoLoad, OpenText LoadRunner Cloud, Apache JMeter, IBM Rational Performance Tester and Tricentis Tosca. See our OpenText LoadRunner Enterprise vs. OpenText LoadRunner Professional report.
We monitor all Performance Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.