
Micro Focus LoadRunner Enterprise Overview

Micro Focus LoadRunner Enterprise is the #3 ranked solution in our list of top Performance Testing Tools. It is most often compared to Micro Focus LoadRunner Professional.

What is Micro Focus LoadRunner Enterprise?

Micro Focus Performance Center is a global cross-enterprise performance testing tool which enables you to manage multiple, concurrent performance testing projects across different geographic locations without any need to travel between the locations. Performance Center administers all your internal performance testing needs. With Performance Center, you manage all aspects of large-scale performance testing projects, including resource allocation and scheduling, from a centralized location accessible through the Web. Performance Center helps streamline the testing process, reduce resource costs, and increase operating efficiency.

Micro Focus LoadRunner Enterprise is also known as Performance Center, Micro Focus Performance Center, HPE Performance Center.

Micro Focus LoadRunner Enterprise Customers

Hexaware, British Sky Broadcasting, JetBlue

Pricing Advice

What users are saying about Micro Focus LoadRunner Enterprise pricing:
  • "I have not been directly involved in price negotiations but my understanding is that while the cost is a little bit high, it provides good value for the money."
  • "It is a bit expensive, especially for smaller organizations, but over-all it can save you money."
  • "They have a much more practical pricing model now."
  • "The price is okay. You're able to buy it, as opposed to paying for a full year."

Micro Focus Performance Center Software Reviews

NIKHIL_JAIN
Performance Test Lead at a financial services firm with 10,001+ employees
Real User
Top 20
Full geographical coverage, integrates well with monitoring tools, granular project inspection capabilities

Pros and Cons

  • "One of the most valuable features of this solution is recording and replaying, and the fact that there are multiple options available to do this."
  • "Micro Focus needs to improve its support. We have the same support plan as before, but when the product was owned by HP, support was more responsive and better coordinated."

What is our primary use case?

We use this solution for performance and load testing of different types of web-based applications and APIs. We want to make sure that before any application, or any upgrade to an existing application, is made available to an actual user, it is sufficiently tested within the organization.

We want to ensure that if there is a high volume of users, they have a seamless experience. We don't want them to experience slowness or an interruption in service, as a result of an increase in the number of users on the web service or website. Essentially, we test to guarantee that all of our users have a good experience.

How has it helped my organization?

When it comes to delivering enterprise-level testing capabilities, this solution is really good.

Using this tool, we are able to test an application end-to-end from any area. Specifically, we are able to test our applications that are used across geographies. This includes worldwide locations starting from one end of Asia to the other end of the Americas. Geographically, we have full testing coverage for virtually all of our enterprise applications.

In terms of application coverage, there have been very few or no applications at the enterprise level that we have not been able to test using this tool. I think there is only one, but that was a unique case. Apart from that, at an enterprise level, in terms of coverage and geographically as well as technically, we have been able to test everything using this solution.

Micro Focus has a platform where I can share what is good and what further improvements I can make. There is also a community where we can leave feedback.

As an admin, I have the ability to copy all of the details from one project to another. However, I don't recall any functionality for cross-project reporting. If two projects are available, I cannot run a load test in one and report metrics from the other.

LoadRunner Enterprise offers multiple features to perform a deep dive into a project. For example, we can see how many load tests of a particular application were run over a certain period of time. We can also see what scripts and tests were built over a time period. There is lots of information that it provides.

It is very important that we are able to drill down into an individual project because we sometimes have to look into what set of tests was executed for a particular project, as well as how frequently the tests were run. This helps us to determine whether the results were similar across different executions, or not. For us, this is an important aspect of the functionality that this tool provides.

One of the major benefits, which is something that we have gained a lot of experience with, is the internal analytics capability. It has multiple graphical and analytical representations that we can use, and it has helped us a lot of times in pinpointing issues that could have caused SEV1 or SEV2 defects in production.

We found that when we ran the load test, those issues were identified by using the analytic graphs that LoadRunner provides. Based on this knowledge, we have been able to make the required corrections to our applications. After retesting them, we were able to release them to production. This process is something that we find very useful.

In terms of time, I find it pretty reasonable for test management. There are not too many things that we have to do before starting a load test. Once one becomes good at scripting, it does not take long. Of course, the length of time to run depends on how big and how complex the script is. Some load tests have five scripts, whereas some have between 25 and 30 scripts. On average, for a test with 10 scripts, the upper limit to set it up and run is a couple of hours.

Overall, we don't spend too much time setting up our tests.

What is most valuable?

One of the most valuable features of this solution is recording and replaying, and the fact that there are multiple options available to do this. For example, a normal web application can be recorded and replayed again on many platforms. Moreover, it can be recorded in different ways.

An application can be recorded based on the user experience, on just the backend communication, or using a technology-specific protocol, such as a Java-specific or Siebel-specific recording. All of these different options and recording modes are available.
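
To make the recording-and-replay model described above concrete, here is a minimal sketch of what a simple Web (HTTP/HTML) step can look like after recording in VuGen's C-based scripting; the URL, transaction name, and think time are illustrative placeholders rather than anything taken from this review.

    /* A recorded-and-cleaned-up Web (HTTP/HTML) step; all values are illustrative. */
    Action()
    {
        lr_start_transaction("home_page");

        /* Replay the recorded request for the landing page. */
        web_url("home",
            "URL=https://example.com/",
            "Resource=0",
            "Mode=HTML",
            LAST);

        lr_end_transaction("home_page", LR_AUTO);

        /* Simulate a user pausing before the next step. */
        lr_think_time(5);

        return 0;
    }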

The scheduling feature is very helpful because it shows me time slots in calendar format where I can view all of the tests that are currently scheduled. It also displays what infrastructure is available to me to schedule a load test if I need to.

What needs improvement?

Something that is missing is a platform where I can share practices with my team. I would like to be able to inform my team members of specific best practices, but at this point, I can only share scripts and stuff like that with them. Having a private community for my own team, where I can share information about best practices and skills, would be helpful.

Micro Focus needs to improve in terms of support. We have the same support plan as before, but when the product was owned by HP, support was more responsive and better coordinated.

The monitoring and related analytical capabilities for load tests should be brought up to industry standards. This product integrates well with tools like Dynatrace and AppDynamics but having the built-in functionality improved would be a nice thing to have.

For how long have I used the solution?

I have been using Micro Focus LoadRunner Enterprise for approximately 15 years. It was previously known as Performance Center and before that, it was simply LoadRunner. In terms of continuous, uninterrupted usage, it has been for approximately nine years.

I am a long-time user of Micro Focus products and have worked on them across multiple organizations.

What do I think about the stability of the solution?

Our tool is hosted on-premises and we have not faced stability issues as such. One of the problems that we sometimes experience is that suddenly, multiple machines become unresponsive and cannot be contacted. We call these the load generators in LoadRunner nomenclature. When this happens, we have to restart the central server machine and then, everything goes back to normal. That sort of issue happens approximately once in six months.

Apart from that, we have not observed any stability issues. There are some defects within the tool which from time to time, we have raised with Micro Focus. If they have a fix available, they do provide it. Importantly, it does not make the product unusable until that is fixed.

What do I think about the scalability of the solution?

This product is easy to scale and as a user, we have not encountered any such issues. Over time, if I have to add more machines to monitor, or if I have to add more machines to use during a load test, it's pretty straightforward.

If I compare it with other tools, I would say that it does not scale as well. However, as a user, it is okay and I've never faced any issues with adding more machines.

How are customer service and technical support?

Whenever we have any support required from Micro Focus, the process begins with us submitting a ticket and they normally try to solve it by email. But if required, they are okay with having a video conference or an audio conference. They use Cisco technology for conferencing and they are responsive to collaboration.

Unfortunately, technical support is not as good as it used to be. From an end-user perspective, coming from both me and several of my team members, we have found that over the last year and a half, the quality of support has gone down a couple of notches. This has been the case since the transition from HP to Micro Focus; the support is simply no longer at the same level.

The level of support changes based on the plan that you have but our plan has not changed, whereas the responsiveness and coordination have. Generally speaking, interacting with HP was better than it is with Micro Focus, which is something that should be improved.

Which solution did I use previously and why did I switch?

I have not used other similar tools.

How was the initial setup?

I have not set up other tools, so I don't have a basis for comparison. That said, I find that setting up LoadRunner Enterprise is not very straightforward.

Whether it's an initial setup or an upgrade to our existing setup, it's very time-consuming. There are lots of things that we have to look into and understand throughout the process. It takes a lot of time and resources, and that is one of the reasons we are considering moving to the cloud version. Ideally, making that transition will reduce our effort in upgrading to newer versions. The last couple of upgrades have consumed a great deal of time and effort that could have been spent on more productive work.

To be clear, I was not involved in setting it up initially. Each time we deploy this product, we set it up as a new installation but use our older version as a base. Prior to configuration, we have to update it; however, the older version does not upgrade in place, so we have to install it as a new version. I do not see a significant difference in time between installing afresh and upgrading an existing installation.

If I am able to identify the needs and what is required, from that point, it takes almost the same amount of time whether it is a clean install or an upgrade. The biggest challenge with LoadRunner Enterprise is to identify the database that we're using and then upgrade it. As soon as the database is upgraded successfully, 70% to 75% of the work is complete. It is the biggest component, takes the longest, and is the most effort-consuming as well.

What about the implementation team?

I am involved in the installation and maintenance, including upgrades.

What's my experience with pricing, setup cost, and licensing?

I have not been directly involved in price negotiations but my understanding is that while the cost is a little bit high, it provides good value for the money.

Which other solutions did I evaluate?

I did not evaluate other tools before implementing this one.

What other advice do I have?

At this time, we do not make use of LoadRunner Developer Integration. We are thinking of migrating to the latest version of LoadRunner, which probably has the LoadRunner Developer functionality. Once we upgrade to the new version, we plan to use it.

We are not currently using any of the cloud functionality offered by Micro Focus. In our organization, we do have multiple applications that are hosted on the cloud, and we do test them using LoadRunner Enterprise, but we do not use any component of LoadRunner Enterprise that is hosted on the cloud.

I am an active member in several online communities, including LinkedIn, that are specific to performance testing. As such, I have seen different experts using different tools, and the overall impression that I get from LoadRunner Enterprise is that it offers good value for the price. The level of coverage in terms of scripting and analysis helped to solidify its position as a market leader, at least a decade ago.

Nowadays, while others have closed the gap, it is still far ahead of other tools in the space. My advice is that if LoadRunner Enterprise can be made to fit within the budget, it is the best tool for performance testing and load testing.

I would rate this solution an eight out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
RM
Senior Consultant at a computer software company with 5,001-10,000 employees
Consultant
Top 5
Tests the performance of our applications and has the ability to share the screen while you are running a test

Pros and Cons

  • "This product is better oriented to large, enterprise-oriented organizations."
  • "While the stability is generally good, there are a few strange issues that crop up unexpectedly which affect consistent use of the product."

What is our primary use case?

Our primary use case for Performance Center is testing the performance of all of our applications.

What needs improvement?

One thing that always fails at our company is that after you have checked in an application, it usually crashes in some way and you get a strange error message. We found that if you close the application test you have set up and open it again, it usually works without the error the second time. That is quite confusing if you are new to the product, although after using the tool for a while you barely notice the inconvenience. Still, it does not seem very professional, and it is buggy behavior that should be fixed.

One feature I would like to see included in the next release of Performance Center is the ability to run more fluidly with TruClient, so that you could put more virtual users in Performance Center. That would help. I'm not sure how easy something like that is to implement, but it would be valuable.

For how long have I used the solution?

We've been using Performance Center for about a year.

What do I think about the stability of the solution?

We have had some problems with instability. At one point Performance Center suddenly went down for two days, but usually, it works. It works okay now and has not been a problem, but it was worse in the beginning. They have changed something, so it is better now than it was, I think.

What do I think about the scalability of the solution?

The scalability is good enough. Sometimes we get a message from the generators that they are at 80% or more capacity. That is an error we get quite commonly. We only have eight gigabytes on the generators and it is recommended to use 16 gigabytes. I guess that is likely the reason why we have this problem. This happens a lot more often when we are running TruClient; the 80% capacity error comes up very fast in that case. We cannot run many users with TruClient at all.
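
As a rough back-of-envelope illustration of the 8 GB versus 16 GB point above, the sketch below estimates how many virtual users fit on a generator before the capacity warning fires. The per-vuser memory figures and OS overhead are assumptions for illustration only; actual TruClient footprints depend on the application and browser profile.

    /* Build with: gcc generator_sizing.c -o generator_sizing */
    #include <stdio.h>

    int main(void)
    {
        const double generator_ram_gb  = 8.0;   /* current generators           */
        const double alert_threshold   = 0.80;  /* the 80% capacity warning     */
        const double os_and_agent_gb   = 2.0;   /* assumed OS + agent overhead  */
        const double truclient_mb_each = 500.0; /* assumed per TruClient vuser  */
        const double web_http_mb_each  = 10.0;  /* assumed per Web HTTP vuser   */

        /* Memory budget, in MB, before the 80% alert fires. */
        double budget_mb = (generator_ram_gb * alert_threshold - os_and_agent_gb) * 1024.0;

        printf("Approx. TruClient vusers per generator: %.0f\n", budget_mb / truclient_mb_each);
        printf("Approx. Web HTTP vusers per generator:  %.0f\n", budget_mb / web_http_mb_each);
        return 0;
    }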

How are customer service and technical support?

It is not usually me who calls tech support, but I got the impression that the team is quite pleased with it. Usually, it is good. On the other hand, we have some problems now that are not resolved. For example, one of my applications is not running at all because we are on version 12.53. There was a problem with the REST (Representational State Transfer) services, specifically with the encoding used by our REST services. We were using a very old encoding version that we stopped using a long time ago, but it was still supposed to be compatible in 12.53, which is what we are running. I know the problem was fixed from version 12.56 and up, but we have not been able to complete the upgrade.

I'm able to run the tests on the application locally, but not in Performance Center. So we are waiting for this upgrade at the moment to resolve these issues.

Which solution did I use previously and why did I switch?

We are currently using 12.53 and we are trying to upgrade it to 12.63 but it looks like there's a problem with the upgrade. We would like to switch to take better advantage of some features that are currently difficult to work with. We used LoadRunner concurrently for a while, and while it was a good product there were things about Performance Center that we prefer.

How was the initial setup?

I was not included in the process when they installed the solution, but it took quite a lot more time than I would have expected. I guess, based partly on the length of time it took, that it was not very straightforward to set up and must have been a bit difficult. The other reason it does not seem easy is that the team has now tried to upgrade two times, and both times they had to roll back to the previous version. We'll see whether the issue is solved when a fix is issued and they try to upgrade again. It looks like there are problems with connecting properly. The team has a ticket open with Micro Focus about the problem, but we are not sure what the problem stems from, and a resolution has not been provided.

What's my experience with pricing, setup cost, and licensing?

I'm not quite sure about the exact pricing because I do not handle that part of the business, but I think Performance Center is quite expensive. It is more expensive than LoadRunner, although I am not sure how many controllers you can run for the same price. They said Performance Center was costing us around 40 million kroner, which is about 4 million dollars. But I think that was with ALM (Application Lifecycle Management) as well, and not only for Performance Center.

Which other solutions did I evaluate?

Before we used Performance Center at all, we used LoadRunner (Corporate version, 50 licenses). But now we have changed over almost entirely to Performance Center and we are phasing LoadRunner out. For a while, we were running both at the same time to compare them. The nice thing is that we do not need to have many controllers connected with Performance Center. The bad thing is that more than one person may want to use the same generator, so sometimes we have problems. I guess we had the same problem before when we used LoadRunner, because not everyone can run a test at the same time.

There are some good things and some bad things about Performance Center in comparison to LoadRunner. The good thing is that you are able to share the screen while you are running a test. On the other hand, you do not get all the same information you get with LoadRunner when you run the tests. After you have done the tests, you can just copy the completed file and you get the same test results as if you had run on LoadRunner. So that is not really a problem. But when first running the Performance Center application for testing, I missed some of the information I got from LoadRunner. It is just a different presentation.

What other advice do I have?

The advice I would give to someone considering this product is that they should try LoadRunner first before they start using Performance Center, especially if it is a small company. They need to know and be able to compare LoadRunner to Performance Center in the right way; after you have used LoadRunner, then compare Performance Center. If they are part of a small company and expect to expand, they will know the difference. If they are already a very big company, they can save some money by using Performance Center directly. We are quite a big company, so Performance Center makes sense for us.

On a scale from one to ten where one is the worst and ten is the best, I would rate Performance Center as an eight. It is only this low because we have had so many problems installing and upgrading it. Sometimes it runs very slowly just setting up tests, or it simply crashes. For example, when setting up a spike test, you start using the spike test process and it suddenly crashes after you have almost finished everything. Executing the tests was a lot easier and more stable in LoadRunner.

You can manage to make Performance Center work, but you have to be patient.

Which deployment model are you using for this solution?

On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner.
NK
Senior IT Process Analyst at an energy/utilities company with 10,001+ employees
Real User
Top 20
Helps us to identify performance bottlenecks and increase testing efficiency

Pros and Cons

  • "We have a centralized delivery team and we are able to meet enterprise requirements, which include different types of protocols that are involved, including scripting. The technology supports that and enables us to have a wider range of testing. Enterprise-level testing is something that we are satisfied with."
  • "It would be good if we could look forward at the future technology needs we have. I would like to see Micro Focus provide more customer awareness around how LoadRunner can fulfill requirements with Big Data use cases, for example, where you do performance testing at the scale of data lakes... when it comes to technologies our company has yet to adopt, I would like to see an indication from Micro Focus of how one does performance testing and what kinds of challenges can we foresee. Those kinds of studies would really help us."

What is our primary use case?

Performance testing is an integral part of the testing life cycle. It determines whether the application being rolled out for end-users is in line with our expectations. It contributes quite well.

Initially, we had a completely on-premises implementation of LoadRunner. In 2018, we moved to the cloud. The load generators are still internal, but the rest of the components sit in the Micro Focus cloud environment as a SaaS enterprise solution.

How has it helped my organization?

It's quite versatile. As a company, we have applications that span across different platforms and technologies, including legacy. We've been using it for applications on mainframes and with the latest technologies as well. We are able to attain our requirements from a performance testing standpoint. It helps us to be confident and to be aware of where issues are before we release a product to a wider audience.

When you have that scalability, it helps in performing end-to-end testing seamlessly. Our organization's business processes span multiple applications and technologies to complete a single end-to-end flow. That type of scalability helps us to achieve our performance testing objectives.

It has definitely helped us to identify the performance bottlenecks. Whenever we get into the procurement of other applications, we consider the historical performance KPIs. That really helps us to define those optimum KPIs with respect to other vendors.

In terms of efficiency, certain features have been introduced that were quite complementary and have really helped us with our delivery.

What is most valuable?

We have a centralized delivery team and we are able to meet enterprise requirements, which include different types of protocols that are involved, including scripting. The technology supports that and enables us to have a wider range of testing. Enterprise-level testing is something that we are satisfied with.

LoadRunner helps to facilitate sharing of best practices and skills. That's the way we expect any enterprise tool to work. It helps us to follow best practices and share them with other teams as well. It's quite important to have that consistency in terms of the quality of deliverables. It plays a key role. It enables us to have that benchmarking in terms of quality and is one of the crucial requirements for us.

The cross-project reporting and business views are among the valuable features because a huge platform can have multiple projects that are being executed in parallel. In that scenario, the reporting provides a holistic view for the stakeholders.

What needs improvement?

It would be good if we could look forward at the future technology needs we have. I would like to see Micro Focus provide more customer awareness around how LoadRunner can fulfill requirements with Big Data use cases, for example, where you do performance testing at the scale of data lakes. That also applies to when we need to deal with applications that are adopting the latest technologies, where our company doesn't have a footprint. It would help us to have a better view and be prepared to address those requirements efficiently.

The Micro Focus team has done a good job of introducing us to product owners and product managers, and in talking about the upcoming roadmap and features of the tool. That's been quite good. But when it comes to technologies our company has yet to adopt, I would like to see an indication from Micro Focus of how one does performance testing and what kinds of challenges can we foresee. Those kinds of studies would really help us.

For how long have I used the solution?

We've been using LoadRunner since 2012.

What do I think about the scalability of the solution?

The current licensing model is something that offers us flexibility, compared to what we had earlier. That's something which is really beneficial.

Any plans to increase our usage of LoadRunner depend on the business demand. Our company depends on a number of IT applications whose implementation is planned and which are in scope for performance testing. We will carve out a plan for introducing performance testing for them.

Penetration and performance testing have increased over time and we are growing well. For applications that are already in the maintenance phase, depending on the volume of change that is introduced into them and how critical they are, we introduce performance testing. However, the number of custom applications is quite limited within our company.

How are customer service and technical support?

So far, Micro Focus technical support has been smooth.

The solution supports multiple protocols such as open source, VuGen, TruWeb, TruClient, and SAP. A few years ago, we also wanted support for IoT. That did not exist. That's something we requested and the product team added it to the roadmap.

Which solution did I use previously and why did I switch?

Since the early days, we were with HP Performance Center, and then with Micro Focus LoadRunner. We have stuck with the same supplier and product.

How was the initial setup?

We have become very accustomed to the product, using it as long as we have. We have never come across any kind of difficulty and we have received support from the vendor whenever we have required help.

Our migration to LoadRunner Cloud happened in 2018, and took approximately six months. Our company was being cautious because we wanted to ensure business continuity, so we went for a phased project migration approach.

We went with that approach because there were multiple aspects that needed to be taken care of, from a security standpoint. We had to get required clearances because we needed to open certain ports and firewalls. That took some time. Once that was cleared, we did a proof of concept and quickly started moving projects in a phased manner, and we haven't seen many difficulties since then.

What's my experience with pricing, setup cost, and licensing?

The contract that we had with Micro Focus was a bit complex, but now it's much simpler. As a customer, I have clarity about it. That is something that helps us to serve the business better.

What other advice do I have?

It's a tool that really helps you when you have a very varied landscape and you have technologies and platforms and infrastructure which include legacy and new ones, with a mix of SaaS. LoadRunner has the ability to support different protocols and that serves the purpose. It's a one-stop solution.

We wanted to integrate LoadRunner reports into a time-series database and visualize them with an open-source tool like Grafana. We learned a lot from that integration. The integration of the solution into a CI pipeline is something that we haven't explored widely, but it's an area we are looking forward to investing in soon. We are exploring more of the integration capabilities of LoadRunner with other tools.
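
For readers unfamiliar with that kind of setup, the sketch below shows one hypothetical way to push a single transaction summary into an InfluxDB 1.x instance that Grafana could then chart. It is not the product's built-in integration; the host, database name, measurement, tags, and field values are all assumptions for illustration.

    /* Build with: gcc push_metric.c -lcurl */
    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL *curl = curl_easy_init();
        if (!curl) {
            fprintf(stderr, "failed to initialize curl\n");
            return 1;
        }

        /* InfluxDB 1.x line protocol: measurement,tags fields */
        const char *line =
            "lre_transactions,project=demo,transaction=home_page "
            "avg_response_time=1.42,passed=480,failed=3";

        curl_easy_setopt(curl, CURLOPT_URL,
                         "http://influxdb.example.local:8086/write?db=perf");
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, line);

        CURLcode rc = curl_easy_perform(curl);
        if (rc != CURLE_OK)
            fprintf(stderr, "push failed: %s\n", curl_easy_strerror(rc));

        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return 0;
    }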

Performance testing is a specialized skill and we don't have too many using the solution, but we do have a couple of professionals who have been doing performance testing for more than 15 years. The rest have been into performance testing for the last seven to eight years, with exposure to different protocols and technologies. We are aiming to scale up and cross-train them in multiple protocols so that we can reach some of our goals without any hindrance this year. We would like to have less dependency, in terms of expertise, on specific technologies and protocols.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
SA
Laboratory Director at a consultancy with 51-200 employees
Real User
Top 20 Leaderboard
A trustworthy solution for enterprise-wide testing and collaboration

Pros and Cons

  • "I think the number one feature everybody likes is the capability to easily generate virtual users as well as the reporting."
  • "It's not that popular on the cloud."

What is our primary use case?

Initially, I used it for small use cases, just to test scenarios of fewer than 1,000 users. I think generally it's been very good. My team has even deployed it for clients within banking. It's still a go-to tool, although, as far as SaaS goes, recently we have had more suggestions to go with Neosyde.

What is most valuable?

I think the number one feature everybody likes is the capability to easily generate virtual users as well as the reporting. Recently, we are starting to look at things more from the diagnostic perspective as well as from the troubleshooting perspective. It gives us many more options for troubleshooting and presenting reports. The other reason why LoadRunner is quite popular for us is that it has a long track record. We know if we need to look for a solution we can still search and find a use case or a solution quite easily.

I like the new pricing model. It helps us to ramp up much better, especially when we were trying to use this for SaaS applications. They have a much more practical pricing model now. It allows us to break the licensing down into smaller pieces and also build up towards a price model that works for the client. I think that was a big bottleneck in the past; now it looks much better.

From a technical perspective, LoadRunner has always been good. You can trust that it can deliver. The big bottleneck in the past has always been the pricing model. Now, with the new approach, with the use of SaaS, we are currently in proposals to recommend LoadRunner as a solution for one of our government clients. We are doing an implementation there. 

What needs improvement?

I think I'm still getting to grips with LoadRunner; maybe I've not used it that much. It's not that popular on the cloud. Also, we have not tried this on mobile platforms with mobile virtual users.

For how long have I used the solution?

I have used LoadRunner for quite some time — roughly 10 years.

What do I think about the scalability of the solution?

Both the scalability and stability are strong points for LoadRunner. We have no complaints so far. Of course, there's always a concern about whether we have sufficient hardware to create the required scale for the number of users, but I think that's easy to work around. This is what enterprise users do; we don't really have much of a complaint there.

How are customer service and technical support?

We work quite closely with the local team in Malaysia — they do their job.

How was the initial setup?

Generally, the initial setup has not been much of a problem. If you have some level of intermediate knowledge on networks as well as some quick training on LoadRunner, you should be able to set it up within a week or two.

What other advice do I have?

Proper training is important. If you have teams that want to use the product, you need to ensure that they go through the right training. Get your guys to sit through the LoadRunner training or get someone experienced to train them.

Make sure that your team trains before they go and apply the system, because LoadRunner is not something that you just plug and play. You do need a little bit of configuration, and it's not for beginners. It is meant for people with at least an intermediate understanding of networks and an intermediate understanding of application performance; you need to have that. I would say it's always important to ensure that you work very closely with the development team. To get the best out of the tool, you need solid collaboration. When you want to troubleshoot, review, or uncover performance issues, you need to make sure that you work quite closely with the development teams as well.

On a scale from one to ten, I would give LoadRunner a rating of eight. We have not used it for global distributed testing, and we also don't know its full capability from a mobile perspective. That's an area that I cannot comment on yet, so I'm reserving my judgment on that. That's the reason why I am giving it an eight.

From my perspective, there's still a gap in terms of the area that LoadRunner is being marketed to. Its biggest strength, in my opinion, is the reporting. If they could keep the reporting, but give it a lighter engine to generate virtual users, that would be perfect. 

Which deployment model are you using for this solution?

On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner
David Collier
Co-Founder at Nobius IT
Reseller
Top 5 Leaderboard
Performance testing that should be part of your everyday application development lifecycle

Pros and Cons

  • "The most valuable part of the product is the way you can scale the basic testing easily."
  • "Third-party product integrations could be a little more slickly handled."

What is our primary use case?

The primary use case for clients is that they often have large application development teams and application development projects that they need to scale. So, for instance, if they were developing a new banking website and needed to check that the application behind that website was scalable from a few hundred concurrent users to many thousands of users, they could test the load response using LoadRunner.

That is what LoadRunner does, it does the performance testing and measures load-bearing response.  

What is most valuable?

I think, for me, the most valuable part of the product is the way you can do the basic testing. You can create the test script and then simulate thousands of users very, very easily. Instead of having to have lots and lots of systems that would emulate users, you just need a couple to emulate tens of thousands of users. So the scalability of LoadRunner itself, while it is testing your application's scalability, is really valuable.
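
As a small planning illustration of the point above, the sketch below estimates how many load generator machines a target user count would need. The per-generator capacity is an assumption for illustration only; in practice it varies widely by protocol, script complexity, and hardware.

    #include <stdio.h>

    int main(void)
    {
        const int target_vusers        = 20000; /* "tens of thousands of users"      */
        const int vusers_per_generator = 10000; /* assumed capacity for light vusers */

        /* Round up so a partial generator still counts as a whole machine. */
        int generators = (target_vusers + vusers_per_generator - 1) / vusers_per_generator;

        printf("Estimated load generators needed: %d\n", generators);
        return 0;
    }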

What needs improvement?

To improve the product, I think the integrations could be a little bit more slick. It does handle a lot of great integrations, but then some of them can be a little bit clunky to implement. The integration with third-party tools needs to be stepped up a little bit.  

As far as other things that need to be added, it has changed quite a lot recently, and I have not had vast amounts of experience with the latest version. So I am afraid it would not be fair for me to go further in expounding on that question. Things that I talk about may already have been included.  

For how long have I used the solution?

I was a presales consultant and so I was kind of a technical consultant as well. I was working with the solution end-to-end for about seven years. My main focus was not LoadRunner for the entire time, but I gained knowledge of LoadRunner and I gave presentations about it. It has been about eight to twelve months since I last did anything serious with it. However, I am still familiar with the product.  

What do I think about the scalability of the solution?

The scalability is really very good. In some ways, it is the purpose of the product: testing by use of scaling loads.  

How was the initial setup?

The initial setup is pretty straightforward. But I have got to say, having worked on and off with LoadRunner over a period of time, I knew kind of instinctively how to set it up after a while. In other words, in my case, I would say it was simple. On the other hand, I think the first time I tried to set it up it was a nightmare. After that, it was easy because I learned a lot about it. If I had to score it out of ten for initial setup, with ten being the best, it would probably be seven out of ten. It is not really going to be super easy for first-timers to deploy.

What about the implementation team?

The deployment could take quite a while, even when I was used to doing it. Getting the software installed and running is pretty quick and that is not a problem. But creating the projects and creating the test scripts can take a little while. To get up and running and doing stuff within it, it is probably just around a week. Doing it professionally with the integrations and with all the correct testing scripts, it can take a month and more. It really all depends on the purpose and how you want to use it.  

What other advice do I have?

My advice to people considering LoadRunner is that if you are going to use the product, use it as part of your everyday application development lifecycle. Do not just use it right at the end, because it gives you some great insights during the development phase as well as at the end. You will end up writing cleaner application code with it. So bake the use of LoadRunner into your full application life cycle.  

On a scale from one to ten where one is the worst and ten is the best, I would rate this solution as between an eight and a nine out of ten. I could be slightly biased, having worked for the company that sells it. But it is a very good, professional solution. With the latest updates, it is very comprehensive and one of the best products of its sort. Let's say nine out of ten, because there is always room for improvement.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
Lloyd Witt
Managed Services Architect at a computer software company with 201-500 employees
MSP
Top 5 Leaderboard
A stable solution for enterprise-wide testing and collaboration

Pros and Cons

  • "Micro Focus LoadRunner Enterprise is very user-friendly."
  • "The reporting has room for improvement."

What is our primary use case?

I am a managed service provider, a reseller, and a consultant. In other words, I am a total geek. 

I suggested a whole bunch of features and changes three or four years ago. I don't know if they followed all of my recommendations, but they did implement some of the changes that I suggested.

There's an onsite version and there's a cloud version. We typically don't want an enterprise type version because the clients that we work with are fairly large. The last place we used this solution employed 150,000 people.

We have clients that have as few as 10 employees, and other clients that have thousands of employees. I would say the mid-sized businesses that we work with are between 250 and 700 people.

It's all Citrix. We do load balancing, and we do load testing for Citrix deployments to determine whether or not we're going to get what we expected.

What we really use it for is the ability to run long test packages for extended periods of time and actually mimic end users.

We use it for validation. When you put together a system that has two to three thousand people on it, you need to be able to test it. To do that, you need a product that allows you to put two to three thousand simulated users on the system.

What is most valuable?

Micro Focus LoadRunner Enterprise is very user-friendly.

What needs improvement?

The reporting has room for improvement.

For how long have I used the solution?

I have been using this solution, on and off, for roughly six to seven years. 

In the last 12 months, I don't think I've actually loaded it up, but I have had my PS team load it up several times.

What do I think about the stability of the solution?

It's a stable solution. I'd give Micro Focus LoadRunner Enterprise a 4.5 out of 5 on stability.

We never experienced any bugs or glitches; those are typically in the actual loads that you're running, but that's not their fault, that's your fault.

What do I think about the scalability of the solution?

Scalability-wise, I have not had any problems. It's gone as high as I needed it to go. There are issues when supporting two to three thousand users. I don't ever go any higher than that.

A typical test is between roughly 150 and 250 users, and the most I've ever gotten is 3000. The scalability has been there for what I needed it to do. I really can't speak outside of that realm.

How are customer service and technical support?

I have never called their technical support, but their online documentation is pretty good.

Which solution did I use previously and why did I switch?

We deployed three different solutions. One of them was a free one from VMware and the other one was Login VSI. We didn't really switch; it's just that there are different feature sets we're looking for or methodologies we want to use, and whether or not the client wants to spend a hundred grand upfront.

How was the initial setup?

For me, the initial setup is straightforward — I've done it a few times now.

What's my experience with pricing, setup cost, and licensing?

The price is okay. You're able to buy it, as opposed to paying for a full year. You can just on-demand purchase it for your users for a day or two, which is nice in an MSP business like mine. If I need to use it for separate clients, I don't have to have a huge layout of capital upfront.

What other advice do I have?

Make sure you know what your use case is before you buy it.

On a scale from one to ten, I would give this solution a rating of nine. It's very good at doing what it needs to do. I think that the reporting needs a little bit of work, but that's pretty much it. I think every reporting system needs a little bit of work, so take that with a grain of salt.

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: Reseller
RajaRao
Associate at a computer software company with 10,001+ employees
Real User
Top 10
User friendly with good reporting and many useful features

Pros and Cons

  • "The solution is a very user-friendly tool, especially when you compare it to a competitor like BlazeMeter."
  • "The solution is a very expensive tool when compared with other tools."

What is most valuable?

The solution is a very user-friendly tool, especially when you compare it to a competitor like BlazeMeter.

The custom meter is nice. It has a lot of features. 

When compared with BlazeMeter, I use the plain data. In the cloud after one year it has been very good. 

With reporting, we will see the door unlock on the main portal very quickly, because LoadRunner has very good analysis tools. You can analyze data and get the error data as well. You can merge them together and drill down into specific points in time. It's great for correlating graphs with the number of users, between accounts, and with support. These functionalities are not there in BlazeMeter.

What needs improvement?

The solution is a very expensive tool when compared with other tools. 

The stability in some of the latest versions has not been ideal. They need to work to fix it so that it becomes reliably stable again.

The cloud solution of LoadRunner is not user-friendly when compared to BlazeMeter. They need to improve their cloud offering in order to compete. It also shouldn't be a standalone tool.

For how long have I used the solution?

I've been using the solution for about one year now.

What do I think about the stability of the solution?

In terms of stability, it depends on what you are using. Sometimes version 5.3 and the newer versions are not stable. The latest versions we are finding are not so stable when compared with the previous versions we've used, so some glitches are there. They need to rectify that. It was stable for two years, and now it's not.

What do I think about the scalability of the solution?

The solution is scalable, and it's based on the number of licenses you have. In comparison, with BlazeMeter, I ran thousands of users, because it's very cheap and we could scale up the number of users easily with very little overhead.

In my experience, I've used BlazeMeter to scale up to 5,000 users. With Micro Focus, I have not gone beyond 2,000 users.

How are customer service and technical support?

In Micro Focus, I have worked with various types of clients. Some clients have platinum customer status and some have gold, and the level of support follows those tiers. At the platinum level, technical support is very responsive and the support is good.

Which solution did I use previously and why did I switch?

I also use BlazeMeter.

With LoadRunner, I am using a paid tool, and since I am following the protocol, I need it to be easy to use. With BlazeMeter, we use it with JMeter; we sometimes need it if we want support, and we need to configure some properties or custom settings before we can use it.

What's my experience with pricing, setup cost, and licensing?

The solution needs to reduce licensing costs. Its main competition, for example, is free to use, so I'm sure it's rather difficult to compete with it on a cost level.

What other advice do I have?

We're partners with Micro Focus.

I haven't found many products in this particular niche that compare to the JMeter and BlazeMeter tools.

I'd rate the solution eight out of ten.

I suggest other potential users review Micro Focus. If the client has the budget for the solution, I'd recommend it. If they don't have the budget, I'd suggest they instead look at a freeware solution and evaluate JMeter or BlazeMeter.

Which deployment model are you using for this solution?

On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: partner
MA
DevOps Engineer at a tech services company with 501-1,000 employees
Real User
Top 20
A mature tool with lots of capabilities, but it is resource-intensive and the technical support is frustrating

Pros and Cons

  • "This is a product that has a lot of capabilities and is the most mature tool of its kind in the market."
  • "The TruClient protocol works well but it takes a lot of memory to run those tests, which is something that can be improved."

What is our primary use case?

We use LoadRunner for performance testing.

What is most valuable?

This is a product that has a lot of capabilities and is the most mature tool of its kind in the market.

What needs improvement?

In the DevOps model, performance testing has become a bottleneck. This is because, after the completion of a sprint, people are in a hurry to send it to production but it first needs performance testing. Whenever there is a code change, it takes a lot of time to rescript and debug the script.
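
To illustrate the rescripting effort described above: in protocol-level scripts, dynamic values returned by the server have to be correlated, and the capture boundaries break whenever the application's responses change, so the script must be reworked. The sketch below is hypothetical; the parameter name, boundaries, and URLs are illustrative only.

    Action()
    {
        /* Capture a server-generated token from the next response. */
        web_reg_save_param("sessionToken",
            "LB=name=\"token\" value=\"",
            "RB=\"",
            LAST);

        web_url("login_page",
            "URL=https://example.com/login",
            "Resource=0",
            "Mode=HTML",
            LAST);

        /* Replay the captured token on the follow-up request; if the page's
           markup changes, the boundaries above must be re-correlated. */
        web_submit_data("do_login",
            "Action=https://example.com/login",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=token", "Value={sessionToken}", ENDITEM,
            "Name=user",  "Value=perf_user",      ENDITEM,
            LAST);

        return 0;
    }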

The TruClient protocol works well but it takes a lot of memory to run those tests, which is something that can be improved. Basically, it is too resource-intensive.

Performance testing needs to be better integrated into an agile framework.

There should be a way of automatically increasing the load generators on the cloud, without specifically having to spin up the agent and configure it. There is a third-party tool that does this. 

For how long have I used the solution?

I have been using LoadRunner Enterprise for nine years.

What do I think about the stability of the solution?

Although it is stable, the whole performance testing process becomes very slow because of its complex scripting and having to rework the scripts. Ultimately, this will become obsolete if it is not improved.

What do I think about the scalability of the solution?

LoadRunner is a scalable product. We have a team of four or five people who use it.

How are customer service and technical support?

The technical support is very bad. It seems that people ask a lot of unnecessary questions just to buy time. We are very frustrated with the support team.

Which solution did I use previously and why did I switch?

This is the only tool that I have used for performance testing. It is the tool that I use most of the time in my role.

How was the initial setup?

The initial setup is fine. Deploying LoadRunner is quick but to set up the performance testing itself takes a lot of time.

What other advice do I have?

When it comes to organization, people compare automation testing with performance testing. Automation testing is something that is very easily integrated within an agile and faster delivery framework. The scripting in automation testing is robust because it is GUI-based. When it comes to performance testing, it is request-response-based and the scripts are not very robust in some of the application platforms. Because of that, people feel that performance testing is a bottleneck and it takes a lot of time.

I would rate this solution a seven out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.