- Traceability from requirements to executed tests and fixed defects
- Visibility and reporting on test execution
It gave us control over the development of requirements and tests needed for the bank's transition from bespoke back-end systems to an Oracle banking system.
The user interface is still dated. Writing test scripts in HPE ALM is generally avoided as the interface is too awkward to use. At the software development company where I worked, the test scripts were routinely created in Excel and uploaded into Quality Center. This process was seen as much more productive than using the HPE QC interface.
At the bank where I worked, I was responsible for training and supporting the end-user testers, and I constantly found myself defending the way HPE ALM worked. When executing tests, users would get lost, would expect an action to have taken place when it had not, and had trouble logging defects, attaching screenshots to a defect, and understanding and using favourites.
The interface they have developed is quite good at the top level, i.e., grouping into Requirements, Test Cases, Test Execution, and Defect Management. However, once you get into each area, the complexity of the application kicks in. There is no 'flow' of the basic functionality. For example, in Requirements, the basic function is to create requirements and link them. There should be a wizard that guides the user through that process, which includes suggestions about grouping and structuring, etc. Instead, you are just left dangling.
Another example: Test cases are to be written or modified, linked, selected for execution, and executed. Test execution in particular is a prime candidate for 'wizard-like' guidance for the tester. A much clearer indication of where the test execution is up to, by test set and test case as well as by test step, would be most helpful.
What do I mean by 'flow'? Based on my own application development experience, the basic function(s) of the application should be obvious to the user and the easiest to perform. Extra functionality should be presented and executed as digressions from that standard flow. I realize this is not easy in a product as complex as HPE ALM, which has been hacked together by several companies and many developers over the years; I guess that is the reason it is the way it is.
We imported a large number of tests and requirements and found a few 'gotchas' along the way, but generally it met our needs at the bank where I was working at the time.
I used HPE ALM at the bank for 9-10 months.
We had a number of issues with importing the requirements and test cases we had created in Excel. Mind you, we were using a very complicated setup in Excel.
We only encountered stability issues when importing from Excel. Things would not go quite as smoothly as expected, requiring quite a few emails back and forth with the local supplier.
I did not encounter any scalability issues.
The local supplier, Assurity, was good.

Technical Support:
The local suppliers were very good, but if it was something that had to go back to HPE, forget it.
We were using Excel spreadsheets to try and test a major system change in a bank. Need I say more?
At the bank, the setup took the local supplier over two days to complete. The issues were mostly around meshing into the bank's security and local IT systems.
The vendor was okay. The person sent in to do the installation was very knowledgeable about the product.
It is totally over-priced.
I did not evaluate other options. The system was picked by the QA manager at the time, based on his experience with the product.
I think it is over-priced and would recommend looking carefully at other options.