Eggplant Performance Review

Integrates well with other solutions, offers good reporting, and is scalable

What is our primary use case?

Our organization's primary use case for this product is omnichannel functional and performance testing. We also incorporate robotics for the functional testing of wellness devices.

How has it helped my organization?

The solution helps us by shifting our performance testing earlier into the test cycle. We've got three levels of performance testing that we do today: L1, L2, and L3. L1 happens at the developer level, L2 happens during our initial integration testing, and L3 is the broader performance testing for scalability.

What is most valuable?

The best thing about Eggplant Performance is its integration with user-experience testing. Under load, we're able to get visibility into what the user experience would be on a physical device itself. It's Eggplant's ability to combine performance testing and functional testing that allows us to understand the impact on users themselves when the system is under load.

The integration into the pipeline, with support for the technologies we're testing, has been very good.

The reporting that comes out of Eggplant Performance is good, and its ability to integrate with Dynatrace quite easily ensures that we get deep insights into the application under load.

The interface and monitoring are very good.

We find the solution stable and scalable.

Technical support is helpful.

What needs improvement?

On the Eggplant Performance side, it does everything we wish and need it to do at this stage. It integrates very well into the pipeline. Overall, it's helped us, and I don't recall any features that are lacking for our use case.

I'd like to see the ability to integrate the user experience through device farms like AWS Device Farm or Sauce Labs.

For how long have I used the solution?

I've been dealing with the solution for close to two and a half years at this point.

What do I think about the stability of the solution?

We haven't had any stability issues from a performance or functional point of view at this stage.

What do I think about the scalability of the solution?

For our requirements, it's highly scalable. We don't run 200,000-user load tests, so I can't speak to scalability at those kinds of volumes; however, for our requirements, it suffices in terms of scalability.

We have three performance testers that are using the solution currently.

We don't plan to increase usage on the performance side. Performance testing is a very specialized field, and we have specialists in that area using it. The big expansion is on the Eggplant Functional side.

How are customer service and technical support?

The technical support is excellent. The representatives in South Africa are well versed in the solution, and they're able to provide both training and onsite support as required. Support, both for maintenance and technical issues as well as onsite consulting services, has been excellent.

Which solution did I use previously and why did I switch?

The solutions we used previously were primarily open-source. These included NACE APM and Gatling for performance testing. We've used Hexawise for exploratory testing and K1 testing. Those were the solutions we primarily targeted. For case management, we utilized a solution called Telstra.

How was the initial setup?

The initial setup is very straightforward from the Eggplant Performance point of view. What was nice is that we were also able to use our functional scripts to drive the performance tests.

The deployment took us about a week to get it up and running initially. We were using another performance testing tool, Gatling, at the time, and creating a performance test in Eggplant was pretty straightforward. It's as easy as, or easier than, any of the other performance testing tools we've used in the past.

What's my experience with pricing, setup cost, and licensing?

In terms of pricing, we're happy with the pricing model. It offers a combination of on-premise and term licensing. We have the ability to say, "For this performance test, I need to rent 10,000 users versus our standard 1,000, and to rent just that difference for a short period of time." That was very attractive to us. They are quite flexible in their cost structure.

What other advice do I have?

We've been asked to expand testing into non-traditional test automation areas. We're looking at the automation of wellness devices, such as physical watches or robotics, which Eggplant supports. We utilize a combination of everything Eggplant offers. That includes Eggplant Functional, Eggplant AI, and Eggplant Performance.

Overall, I would rate the solution nine out of ten.

I would strongly recommend that organizations use the AI capability that comes with Eggplant now. I'm talking primarily from the Eggplant Functional point of view. It's been a revelation to us in terms of its ability to assist us in exploratory testing and to integrate with our current model-driven architecture. We use a solution from Sparx Systems called Enterprise Architect, and we're able to directly integrate the model from Enterprise Architect with Eggplant through its AI engine.

Which deployment model are you using for this solution?

Hybrid Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: partner.