Anil Kulkarni - PeerSpot reviewer
Senior Director/Practice Leader at Cirruslabs
Real User
Top 20
Great traceability feature with good reporting
Pros and Cons
  • "Produces good reports and has a great traceability feature."
  • "Lacks sufficient plug-ins."

What is our primary use case?

We are customers of Micro Focus and I'm a senior director of our company. 

What is most valuable?

The overall licensing and reporting have definitely improved. As a leader, I was able to get the reports I needed, and the same applies to developers. Traceability really helps me and is a great feature; when I was a test manager, it was very useful. ALM is user-friendly. 

What needs improvement?

I'd like to see some readily available plugins for integrating other tools, because we're in an open-source world now and there are a lot of tools I need to integrate. It takes a lot of effort to build the API connections to ALM and run the scripts. The solution also lacks Agile features. 
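To give a sense of the API work involved, here is a minimal Python sketch of building an ALM REST request URL. The server, domain, and project names are placeholders, and the exact endpoint paths and filter syntax vary by ALM version, so treat this as an assumption-laden sketch rather than a definitive integration recipe.

```python
from urllib.parse import quote

def alm_entity_url(server, domain, project, entity, query_filter=None):
    """Build an ALM REST endpoint URL, optionally with a filter query.

    ALM's REST filter syntax wraps field conditions in braces,
    e.g. {status[Open]}; the exact syntax varies by ALM version.
    """
    url = f"{server}/qcbin/rest/domains/{domain}/projects/{project}/{entity}"
    if query_filter:
        url += "?query=" + quote("{" + query_filter + "}")
    return url

# Hypothetical server/domain/project, for illustration only.
url = alm_entity_url("https://alm.example.com", "DEFAULT", "Banking",
                     "defects", "status[Open]")
```

Authentication is a separate step (the session endpoints also differ between ALM versions), which is part of why reviewers describe this integration effort as significant.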

For how long have I used the solution?

I've been using this solution for 15 years. 

Buyer's Guide
OpenText ALM / Quality Center
April 2024
Learn what your peers think about OpenText ALM / Quality Center. Get advice and tips from experienced pros sharing their opinions. Updated: April 2024.
769,630 professionals have used our research since 2012.

What do I think about the stability of the solution?

The solution is stable. 

What do I think about the scalability of the solution?

The solution is scalable. 

How was the initial setup?

The initial setup is relatively easy but we hired a third-party organization to assist. 

What other advice do I have?

I rate this solution 10 out of 10. 

Which deployment model are you using for this solution?

On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Tools Architect at S2 Integrators
Real User
Top 20
Good defect management and test planning but needs better technical support
Pros and Cons
  • "The product can scale."
  • "I'm looking for something more from a DevOps perspective; for example, pulling the DevOps ecosystem into Micro Focus ALM."

What is our primary use case?

We primarily use the solution for test management.

What is most valuable?

The dashboard reporting is great.

It offers very good defect management, test planning, and execution. 

It's been stable so far. 

The product can scale. 

What needs improvement?

I'm looking for something more from a DevOps perspective; for example, pulling the DevOps ecosystem into Micro Focus ALM. Other tools in the market offer more DevOps capabilities, like integration with pipelines, et cetera. I need more of that within Micro Focus ALM.
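As a sketch of what pipeline integration could look like, the snippet below maps JUnit results from a CI build to ALM-style run statuses. The status names and the idea of pushing them into ALM afterwards are illustrative assumptions, not a description of an existing ALM feature.

```python
import xml.etree.ElementTree as ET

def junit_to_alm_statuses(junit_xml):
    """Map JUnit <testcase> results to ALM-style run statuses
    ("Passed"/"Failed"), keyed by test name."""
    root = ET.fromstring(junit_xml)
    statuses = {}
    for case in root.iter("testcase"):
        failed = (case.find("failure") is not None
                  or case.find("error") is not None)
        statuses[case.get("name")] = "Failed" if failed else "Passed"
    return statuses

# Minimal JUnit-style report, for illustration.
sample = """<testsuite>
  <testcase name="login_ok"/>
  <testcase name="transfer_limit"><failure message="limit off by one"/></testcase>
</testsuite>"""
result = junit_to_alm_statuses(sample)
# result == {'login_ok': 'Passed', 'transfer_limit': 'Failed'}
```

A pipeline step like this would then post the statuses to the test management tool, which is the kind of built-in glue the reviewer is asking for.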

We could have higher quality technical support.

For how long have I used the solution?

I've been using the solution for 15 years. 

What do I think about the stability of the solution?

We haven't had any issues with stability. It's reliable. The performance is good. There aren't any bugs or glitches, and it doesn't crash or freeze. 

What do I think about the scalability of the solution?

We have found the product to be scalable. 

We have between 7,000 and 8,000 users right now. 

How are customer service and support?

The technical support is average. We aren't unhappy with them; however, we don't find them exceptional. The quality of the response could be better, and it's the first level of response that's the issue. They don't dig deep. They might read from a script or ask questions unrelated to the underlying issue. They need people with good product knowledge even at level-one support.

Which solution did I use previously and why did I switch?

We've only ever used this solution. We did not use anything else previously. 

How was the initial setup?

We have four admins that can maintain the product.

The initial setup is complex in terms of understanding everything and building up the infrastructure required to deploy it. Setting up all the infrastructure, from the servers to the database to the load balancers, can be difficult. Many pieces are involved, and it takes a good amount of knowledge to deploy it correctly.

What about the implementation team?

The initial implementation was done by Micro Focus many years ago. Since then, the four admins have taken care of upgrades and everything else.

What's my experience with pricing, setup cost, and licensing?

I don't handle any aspect of the pricing. 

What other advice do I have?

I'm both a consultant and a user. I'm a Micro Focus partner.

It's all about what you need. If you really want to deploy a good test management tool that delivers benefits and helps you manage everything, and you're really serious about test management and application management, then go for it. If you just want a tool that covers one small piece of testing or ALM, you're not as serious and likely don't need this. 

I'd rate the solution seven out of ten. 

Which deployment model are you using for this solution?

On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner
PeerSpot user
Test Advisory, Management & Implementation at an energy/utilities company with 51-200 employees
Real User
A good stand-alone test management tool, but its pricing could be improved
Pros and Cons
  • "As a stand-alone test management tool, it's a good tool."
  • "The product is great, but when compared to other products with the latest methodologies, or when rating it as a software development tool, I have to give it a lower score because there are a lot of other great tools that you can interconnect, use, scale, and leverage. It all depends on the cost."

What is our primary use case?

It's a business process requirement and is being used for test cases, test executions, defect logs, metrics, dashboards, etc.

Implementation projects work in the waterfall methodology, so it's the best tool for collecting all the requirements in one place and tying them to test cases and test executions. The solution is used extensively in the company for implementation projects, particularly for test management activities. 

What is most valuable?

I like all the features this solution provides. It is a good stand-alone test management tool.

What needs improvement?

Pricing could be improved, as it's high. I don't know the exact price point, but I know it was previously high enough that fewer people were able to use it for their projects. That's the only disadvantage I can think of.

One other thing: I'm not sure whether Micro Focus ALM Quality Center has this feature, or whether other people are already using it, but I would like it to connect to an automation tool so that passing automation test scripts are automatically reflected on the linked requirement.

If that feature isn't available, it can at least be done manually today: you can say that the test cases linked to a requirement have passed.

If, on the other hand, the solution can be connected to an automation tool that updates us automatically about the test scripts, and there's a link between the scripts and the requirement, then we can say: "Okay, this requirement's automation test scripts ran and passed, so coverage is good."

I don't know whether this feature is currently available. If it's there, good. If it isn't, it's the one remaining item I'd like to see integrated into the test management tool.
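A minimal sketch of this kind of rollup, with hypothetical requirement and test IDs: a requirement passes only if every linked automated test passed, and is flagged as uncovered if none of its tests have run.

```python
def requirement_status(req_to_tests, test_results):
    """Roll automated test results up to requirement level: a
    requirement passes only if every linked test passed, and is
    'Not Covered' if no linked test has a recorded result."""
    status = {}
    for req, tests in req_to_tests.items():
        results = [test_results[t] for t in tests if t in test_results]
        if not results:
            status[req] = "Not Covered"
        elif all(r == "Passed" for r in results):
            status[req] = "Passed"
        else:
            status[req] = "Failed"
    return status

# Hypothetical requirement/test links and run results.
coverage = requirement_status(
    {"REQ-1": ["TC-1", "TC-2"], "REQ-2": ["TC-3"]},
    {"TC-1": "Passed", "TC-2": "Passed", "TC-3": "Failed"},
)
# coverage == {"REQ-1": "Passed", "REQ-2": "Failed"}
```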

For how long have I used the solution?

I'm currently using the Micro Focus ALM Quality Center.

What do I think about the stability of the solution?

Regarding stability, I noticed one glitch. Sometimes when I go into a test case, it shows as if a field has nothing in it, but if you click the box, it shows the box's content, e.g. the information, steps, or expected results in those test cases. Apart from that, I didn't see any other glitches.

What do I think about the scalability of the solution?

I have no issues with scalability, because if you want more projects, you can add more projects, and if you want more tests, phases, or cycles, you can add them. I find it good.

How are customer service and support?

Currently we don't have any technical concerns on the ALM side, so no improvement is needed support-wise.

How was the initial setup?

The setup was a one-time thing and I didn't find it difficult.

Which other solutions did I evaluate?

I was able to evaluate Jira, Confluence and Xray.

What other advice do I have?

We don't have any technical concerns about Micro Focus ALM Quality Center. Connecting ALM to DevOps is handled by a different Micro Focus product called MF Connect, so that's a separate piece.

My advice to others looking to implement Micro Focus ALM Quality Center is that using it successfully depends on the person and the project. It may not be the same for other people; installing and using it is relatively hassle-free, but I won't suggest it for everybody, because analysis needs to be done before using this solution on a particular project. Users need to think about their requirements; if their requirements are not being met, this tool may not be the right fit. But as a stand-alone test management tool, it's a good tool.

I've been using this solution full-fledged, and I don't see any improvements I'd require for this project. I started using this product when it was Mercury; Mercury then went to HP, and then to Micro Focus, so I'm a longtime fan of this HP QC/ALM line. But these days, things work differently in Agile. Agile works on stories and so forth, but there is no repository of requirements or history of things; a project comes in and is worked in an Agile fashion. I don't know how good this tool is from an Agile perspective, but I'm sure it is a good test management tool.

I'm rating ALM on two points. One rating is for the product: the product is great, but when compared to other products with the latest methodologies, or when rating it as a software development tool, I'll rate it a five out of ten, because there are a lot of other great tools that you can interconnect, use, scale, and leverage. It all depends on the cost.

As a stand-alone test management tool, I'm giving it a nine out of ten.

If I'm trying to scale and spending more money, my rating goes down. If it can scale with less money, like Jira, Confluence, or some other tool like Xray, then scaling can be done faster and at less cost to the user.

Wherever I said five out of ten, I would upgrade that to seven out of ten.

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner
PeerSpot user
YingLei - PeerSpot reviewer
YingLei, Product Marketing Manager at a tech vendor with 10,001+ employees
Vendor

Hi Jose, thanks for the detailed review and for being a long-term user of ALM/QC. You mentioned wanting ALM to connect to an automation tool so that passing automation test scripts are reflected on the linked requirements. Yes, that feature is there, through our Jenkins plugin; see Jenkins integrations (microfocus.com).


Other resources:

https://community.microfocus.c...
https://www.microfocus.com/pnx...

Test Management Architect at an insurance company with 1,001-5,000 employees
Real User
Enables management of all the important assets and metrics

What is most valuable?

The overall task management. Managing all the assets and metrics.

What needs improvement?

I'm not familiar with all the changes, but they definitely have to be more DevOps-friendly, and they certainly have to be more open-source friendly. That's the world we live in, where we can cut costs away from large-scale vendor contracts and service contracts. The ability to seamlessly integrate, and to provide more capability for those managing those infrastructures and solutions, is going to be critical going forward.

A lot of the vendor products - not just HPE or, in this case, Micro Focus, or whomever I've dealt with over the years - were much more proprietary, much more exclusive. What we're finding now is that the world doesn't work like that. Particularly as you shift left and move towards DevOps, application teams no longer consume from a central resource; they consume based on decisions made internally by the application team.

Ultimately, what they need is flexibility. So any vendor product needs to have that intrinsic in its fiber, to be able to adopt open source, and integrate basically into almost anything, to expand out the choices available to an application; to make the decisions that need to be made independently at the time that they need to make them.

Not having looked at the latest, ALM Octane, I'm coming from the old world. At the time it was necessary to implement a test management system to gather more information and metrics across different teams and platforms, it served the purpose.

Things change constantly these days. There's a lot more going on, and a lot more integrations are available. Looking at the legacy product, I think it's kind of come and gone as far as its ability to do what you need in a DevOps world. Any solution in the future - and I know ALM Octane is the heir apparent to the old infrastructure - is going to have to be more DevOps-friendly. It will need to enable the consumers, the application users who ultimately become the developers, to see the value in a more organized test management practice, versus a kind of hidden, under-the-sheets unit testing.

It's actually a whole trajectory of different solutions, different tests, that need to follow the pipeline for those folks. Anything that's not DevOps friendly, that's not DevOps easily consumable, to make the case for a more formal test management practice, is really going to end up by the wayside at the end of the day.

For how long have I used the solution?

11 years.

What do I think about the stability of the solution?

My experience with the solution is that it has been fairly stable. What lies underneath is what creates the instability at the end of the day, the architecture that you are providing the solution on top of. I think once you figure out a viable, scalable approach to it, then the software itself, at least in my experience, has been very stable in running a test management operation.

What do I think about the scalability of the solution?

It has met our needs. Just as long as you have the right architecture from the old days of physical server hardware to more of the newer stuff, which is VMware within datacenters - more virtualized.

And certainly the next rage for everybody is moving into Cloud infrastructure. So things are becoming much more self-service. You're getting model scaling. You're getting the things that are making the system more maintainable. But from a scalability standpoint, you want to be able to scale to the needs at the time that you need them. The Cloud certainly provides that capability.

How is customer service and technical support?

I think like every company, they're changing the landscape. Support, in my experience, has been pretty good. There are always challenges based upon the routing/tier structure of who gets the issue first, how it gets routed, how it gets filtered down to the specific expertise that you need. That depends on your acumen as far as knowing your tooling, knowing your approach, what that's going to be.

Somebody who is very savvy, will obviously have frustrations coming into a tier-one support desk. Who they really need to go talk to at the end of the day may be somebody, and it will vary by company, like a tier-three, real low-level, very experienced resource support tech who fixes those issues. So it's going to vary based upon the customer's competency versus how they are routed through a support desk.

What other advice do I have?

Testing is going to be testing. The same challenges you have in any other industry are going to be the challenges you have in ours, the insurance and financial industry, as well.

From DevOps to Agile, to shift left, to cloud, to managing your test assets efficiently and effectively, the industry really doesn't make a difference.

I've been in a number of different sectors over the years. I've been in QA for about 25 years, having been in the natural gas industry, financials, insurance, and HR systems. They all face pretty much the same challenges around testing, so I don't see a discrepancy based on the application you're testing. The challenges are almost agnostic to the application; they're innate to trying to test within any type of development environment. Now it just happens to be a more self-service DevOps model, where application teams make those decisions. But there are still always going to be those QA challenges.


Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
PeerSpot user
Business Systems Consultant at Wells Fargo
Real User
It enables our testers to work in a single application and provides traceability among testing and defects
Pros and Cons
  • "I like the traceability, especially between requirements, testing, and defects."
  • "I would rate it a 10 if it had the template functionality on the web side and better interfaces with other applications, so that we didn't have dual data entry or have to set up our own migrations."

What is most valuable?

I like the traceability, especially between requirements, testing, and defects. Being able to build up a traceability matrix, being able to go through and show what's been covered, where your defects are, etc.
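The traceability matrix the reviewer describes can be sketched as a simple join over requirement-to-test and test-to-defect links. The IDs below are hypothetical, and a real matrix pulled from ALM would come from its entity links rather than plain dictionaries.

```python
def traceability_matrix(req_tests, test_defects):
    """Join requirement-to-test and test-to-defect links into one
    matrix row per requirement: its tests and any linked defects."""
    matrix = {}
    for req, tests in req_tests.items():
        defects = sorted({d for t in tests for d in test_defects.get(t, [])})
        matrix[req] = {"tests": tests, "defects": defects}
    return matrix

# Hypothetical links, for illustration.
m = traceability_matrix(
    {"REQ-10": ["TC-7", "TC-8"]},
    {"TC-8": ["DEF-42"]},
)
# m["REQ-10"] == {"tests": ["TC-7", "TC-8"], "defects": ["DEF-42"]}
```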

How has it helped my organization?

It's allowed us to be a little more consistent across the board. We have probably 80% of our QA teams using Quality Center. It is a system of record.

It really does allow our testers to work in a single application. It's not as good if you don't set things up in advance to work with other applications. But we're working on that part.

What needs improvement?

I'd like to see an easier way to upgrade and install. I'd like to see it less required to have a client. I know that Octane doesn't require a client, but Octane is not mature enough for our organization. I'd like to see some of the good points from that integrated into it.

I would rate it a 10 if it had the template functionality on the web side and better interfaces with other applications, so that we didn't have dual data entry or have to set up our own migrations.
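A custom migration or sync of the kind mentioned here usually starts with a field mapping. The sketch below is purely illustrative; the field names are hypothetical, since real ALM field names depend on each project's customization.

```python
# Hypothetical mapping from another tracker's fields to ALM-style
# field names; real names vary by project customization.
ALM_FIELD_MAP = {"title": "name", "priority": "severity", "reporter": "detected-by"}

def to_alm_defect(record, field_map=ALM_FIELD_MAP):
    """Translate a defect from another tracker's schema into ALM-style
    field names, dropping fields that have no mapping."""
    return {alm: record[src] for src, alm in field_map.items() if src in record}

d = to_alm_defect({"title": "Login fails", "priority": "2-High", "labels": ["ui"]})
# d == {"name": "Login fails", "severity": "2-High"}
```

Maintaining such mappings in-house is exactly the dual-entry overhead the reviewer wants the product to eliminate.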

What do I think about the stability of the solution?

It's been around a really long time. It is very stable. It does require a little more work to upgrade, add patches, because you have to take it down. But then again, while it's running, we've had very little down time, very few issues from a system perspective.

When we do have to take it down, we usually take a full weekend, because we're a very large instance. But usually the install and upgrade goes through and takes three or four hours, and then it's just going through and running repair/realign or upgrade on the existing projects.

What do I think about the scalability of the solution?

Quality Center is very scalable. We have over 700 active projects on our instance. That's projects, not users.

How is customer service and technical support?

I've seen a lot of improvement over the years, from tech support. We are premier customers, or whatever the newest term is. We do meet biweekly with them and when we have an issue, we can escalate it and we get very fast response times.

How was the initial setup?

We're a company that has gone through a lot of mergers and consolidations, and we've gone through and actually consolidated a lot of instances into ALM and, with that, the complexity is more with the users than it is with the application.

Getting it installed, getting it set up, that's the easy part. Getting people trained to use it, that's a little bit harder. But once people start using it, they find that they're not sure how they did their job before.

What other advice do I have?

The most important criteria when selecting a vendor to work with are:

  • They need to be stable.
  • They need to be financially sound.
  • They need to have a good technology and support base.
  • They also need to be responsive to the company, because it's a big company, so we expect people to respond.

I would advise a colleague considering this solution to start with a plan. Make sure you know what it is that you want to accomplish with Quality Center, and only add fields that will meet that. Use your current documentation, your current processes, to help design the fields and the projects for it, rather than just adding things one at a time. Don't allow a "wild west," which is where anybody can add fields, add workflow. You want to manage that from the top down.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Sr. Test Automation Engineer with 201-500 employees
Real User
You get the most value using all modules from Management to Defects.

What is most valuable?

ALM: You cannot say just one feature is most important. You get the most value using all modules, from Management to Defects. When you use the tool end-to-end, you can pull efficient project reports (especially scorecards) from the Dashboard. Everything is integrated, and only then can you evaluate the tool fairly. ALM is very flexible and each module can be used independently, but when you do that, you are only using the tool as storage, not as a test management tool.

UFT: It has become a much more stable tool in terms of object recognition over the years. It is easy to use as long as the user has basic software development knowledge and understands that software automation is not just record/playback.

How has it helped my organization?

ALM: We currently successfully manage all testing projects due to ALM’s invaluable capabilities, which are listed below:

  • Built on best practices with a flexible structure, organization, and documentation for all phases of the application testing process.
  • Serves as a central repository for all testing assets and provides a clear foundation for the entire testing process.
  • Establishes seamless integration and smooth information flow from one stage of the testing process to the next.
  • Supports the analysis of test data and coverage statistics, to provide a clear picture of an application’s accuracy and quality at each point in its life-cycle.
  • Supports communication and collaboration among distributed testing teams.
  • Reduces time needed to create test execution summary reports.
  • Reduces the time needed to write and execute manual tests with the HPE Sprinter tool.
  • Users can capture their actions automatically as steps in a formal test.

UFT: We save time executing smoke and regression tests. We also use UFT to create test data.

What needs improvement?

I would like to see better reporting functionality, especially more sophisticated graphs, for example Actual vs. Planned, or high-level progress graphs using indicators like traffic lights. I would also like more sophisticated and flexible Dashboard views, with editing and resizing. I use scorecards and pull them into the Project Reports using customized templates. Scorecards can only be refreshed from the Site Admin, so test leads have to depend on the ALM admin to refresh the reports if they are needed after the scheduled auto run. There should be the ability to refresh scorecards (execute KPIs) from the project itself, or at least a more frequent auto-refresh option, even every 5 minutes. This is a real burden on the team.

I would like to see requirements mapped to test steps, so we can combine the validation of multiple requirements into one test case but map the verification steps to the associated requirements; then, if a step fails, it fails only one requirement, not all of them. Operating in an Agile world, we don't have time to write test cases for one-to-one coverage. I know ALM allows many-to-many mapping, but we cannot get a true requirement pass/fail status if we use the many-to-many option. The test configuration option is kind of on the right path, but it can only be used for data-driven test cases; I cannot add design steps. If we could add design steps to a subset of a main test using the Test Configuration option, we might be able to identify the individual requirement that failed without failing all the requirements mapped to the main test case.
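The step-to-requirement mapping proposed here can be sketched as follows, with hypothetical step and requirement IDs: a failed step fails only the requirement mapped to that step, rather than every requirement linked to the test case.

```python
def step_level_requirement_status(step_map, step_results):
    """Evaluate requirements from individual step results, so a
    failed step fails only the requirement mapped to that step."""
    status = {}
    for step, req in step_map.items():
        result = step_results.get(step, "No Run")
        if result == "Failed":
            status[req] = "Failed"       # any failed step fails its requirement
        else:
            status.setdefault(req, result)  # keep an earlier Failed verdict
    return status

# One test case verifying two requirements via three steps (hypothetical IDs).
s = step_level_requirement_status(
    {"step1": "REQ-A", "step2": "REQ-B", "step3": "REQ-A"},
    {"step1": "Passed", "step2": "Failed", "step3": "Passed"},
)
# s == {"REQ-A": "Passed", "REQ-B": "Failed"}
```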

For how long have I used the solution?

We have used this solution for 17 years.

What do I think about the stability of the solution?

I did not encounter any issues with stability.

What do I think about the scalability of the solution?

I did not encounter any issues with scalability.

How is customer service and technical support?

In terms of technical support, I usually get solutions to my issues. I haven't had an issue that required calling technical support in a long time.

How was the initial setup?

If you follow the instructions, the setup is straightforward. It definitely requires an experienced user to do the installation and setup, especially for upgrades.

What other advice do I have?

I have always used ALM and UFT. However, I had training on, and evaluated, the IBM Jazz tools.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user671379 - PeerSpot reviewer
Manager at a retailer with 10,001+ employees
Vendor
The most valuable features are overview, primary requirements, and test cases.

What is most valuable?

In ALM, the most valuable features are the overview, the primary requirements, test cases, defects, and traceability. Many of our applications are subject to regulations, so we must have the tracking capabilities. Some of the core systems are not even allowed to go down. It's very important that we know what we have tested, what is working, and what is not working. That we can find out from ALM.

What do I think about the stability of the solution?

Stability is no problem.

How was the initial setup?

The first time we installed it was a long, long time ago. We bought a small, five-license version of Test Director from Mercury in 2007, and it has grown continuously since then. Today we have 600 users and 130 active projects. The environment gets bigger and bigger all the time.

It's complicated to upgrade. In ALM, we have roughly 600 users and roughly 130 active projects, so an upgrade takes a long time. Some of the big projects hold 5 GB of data; migrating that to a new version takes maybe two or three hours, even though we have huge hardware. 

It's very complicated. We would gladly upgrade to newer versions. We plan to use Octane, but we don't want to end up in a situation where we have two tools. We must find a smarter way to do some kind of migration. Several of the applications have regulations that we follow, and we must be able to track back 10 years. We can't just throw away the data we have in there. 

If we can't upgrade ALM, we will probably search for and find something else. They really need to find a smart way to migrate at least part of it. Of course, it's a totally different tool.

Which other solutions did I evaluate?

We have looked at many alternatives. We have compared ALM to almost everything. We even have JIRA for smaller projects now. ALM and JIRA are two totally different products that are for two totally different needs. 

For example, we have an on-premises solution of ALM. You have to log into the Active Directory, so it's not so easy to give access to someone outside the company. It also struggles with different browsers. It doesn't work very well on a Mac, for example; the Mac developers and the Mac teams don't like ALM. It now works much better in Chrome, but we're struggling there as well. They haven't kept up with browser support, and it's problematic to use ALM in Edge, for example.

But with JIRA, on the other hand, you don't have any requirements. It's easy to set up, easy to start with, and you have your backlog there. But after a while, you figure out what is going on: for maintenance and for testing, you need a plugin for this, a plugin for that, and a plugin for something else. It's not so easy to get the overview, the helicopter view, if you compare it with ALM. But I understand why some like it, and it meets a certain need. I hope we can leverage that capital when we upgrade to Octane.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user487383 - PeerSpot reviewer
Senior Systems Engineer at a financial services firm with 1,001-5,000 employees
Real User
The advantage is that we can test applications before they go to production.

What is most valuable?

ALM is a giant library, and Performance Center and LoadRunner require it to run.

How has it helped my organization?

We use it to support Performance Center and it runs underneath it as one big system. The advantage is that we can test applications before they go to production, and as long as we're testing in a production-sized environment, we have a pretty good idea how an application will perform in production.

What needs improvement?

It's like the overall software framework, and Performance Center is just leveraging that framework for storing things such as tests, scripts and test results. ALM works together with LoadRunner and Performance Center as one big system. As newer protocols are developed and newer technologies come along, it's nice to see HPE be ahead of that as much as possible so that by the time that it's really needed, they're already ahead of the curve and they've got most of their performance issues resolved as far as how the software's going to run.

What do I think about the stability of the solution?

The stability on the old versions is good. On the newer versions, the bleeding edge is still being worked on.

What do I think about the scalability of the solution?

It's very scalable. No issues with scalability.

How are customer service and technical support?

Premium support is great, but before that when we just had general support, it was not all that great. We had issues with trying to get support to call us back on tickets and turnaround time on resolution.

Which solution did I use previously and why did I switch?

We previously used IBM Rational.

How was the initial setup?

It's not exactly straightforward. Their instructions were not all they could have been, but we still got it installed.

What other advice do I have?

As far as we know, it's the best tool on the market right now; it's considered the Cadillac of testing tools. Don't necessarily go with their most recent version or code release right now. It depends on what your needs are and the size of the computer shop you've got.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user