What is most valuable?
I am most familiar with the Test and Requirements Management functionality, especially Test. I think the data model has stood up well. The UI is easy to train a new tester on, and the dashboard has been a good tool for others wanting "real time" status of testing progress. The traceability tracking is very good, though we mostly export to Excel and do our analysis there.
Support has been excellent - an advantage of working with a somewhat smaller vendor interested in making users successful.
How has it helped my organization?
We migrated from TestLog, Excel, and Word documents, and are still partially in transition for requirements. With a significant portion of the QA team offshore, a web-based solution was needed. We are able to use the tool very collaboratively across 3 or more teams performing testing and managing requirements. Combined with the dashboard, which increased the visibility of test results in real time across the organization, and a 2-3X improvement in traceability, we have much better coverage and can prove it. It has a capable REST API and an integration hub. It plays well with other tools, and doesn't seem to assume that it is the center of your tool universe.
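As a rough illustration of the kind of scripting a REST API like this enables: the sketch below pages through items for a project. The base URL, the `/rest/latest/items` endpoint path, and the parameter names are assumptions for illustration, not taken from this review; check your instance's API documentation for the actual paths and auth scheme.

```python
# Sketch of building and fetching a paged "items" query against a
# REST API such as Jama's. The host, endpoint path, and parameter
# names below are placeholders -- verify against your instance's docs.
import json
from urllib.parse import urlencode

BASE_URL = "https://example.jamacloud.com/rest/latest"  # placeholder host

def items_url(project_id, start=0, max_results=20):
    """Build the paged items query URL for one project."""
    query = urlencode({"project": project_id,
                       "startAt": start,
                       "maxResults": max_results})
    return f"{BASE_URL}/items?{query}"

def fetch_page(url, opener=None):
    """Fetch one page of results; 'opener' is injectable for testing."""
    import urllib.request
    opener = opener or urllib.request.urlopen
    with opener(url) as resp:
        return json.load(resp)

print(items_url(42))
```

A script like this is how we imagine feeding exported data into other tools without going through the UI; pagination parameters keep large projects from overwhelming a single request.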
What needs improvement?
There are some areas where the UI is a little weak, and some areas where it's very good. The ability to track, version, and display changes to test cases over time is excellent. While the data model for traceability has been solid, I often find that most analysis requires export to Excel. Happily, the export works well, and there is a decent (re-)import capability. Like any tool, it has some quirks here and there, and a set of work-arounds for certain things has become part of our usage.
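To give a flavor of the kind of analysis done after exporting, here is a minimal sketch that computes requirement coverage from a CSV export. The column names "Requirement ID" and "Linked Test Cases" are made-up stand-ins, not Jama's actual export headers; adjust to match whatever your export contains.

```python
# Sketch: compute requirement coverage from an exported CSV.
# The column headers here are hypothetical examples -- substitute
# the headers your own export actually produces.
import csv
import io

EXPORT = """Requirement ID,Linked Test Cases
REQ-1,"TC-10, TC-11"
REQ-2,
REQ-3,TC-12
"""

def coverage(csv_text):
    """Fraction of requirements with at least one linked test case."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    covered = sum(1 for r in rows if r["Linked Test Cases"].strip())
    return covered / len(rows)

print(f"{coverage(EXPORT):.0%}")  # 2 of the 3 sample requirements are covered
```

Being able to re-import after this kind of offline pass is what makes the Excel round-trip workable in practice.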
For how long have I used the solution?
What was my experience with deployment of the solution?
We used the SaaS model, which has been easy and allows simple access for offshore contract help to be added as needed without compromising internal systems. The rate of adoption was much higher than expected, or an onsite installation might have been considered. Although we don't use it, there have been updates to the onsite deployment and support models recently which seem to be getting decent reviews on the user forums.
What do I think about the stability of the solution?
There were some glitches early on, perhaps because our usage outgrew whatever type of server we were on. This is the core tool for the QA team, so we are sensitive to outages. We have had very few issues with availability of late. Patches sometimes cause minor issues / adjustments... perhaps plugging a hole that we were taking advantage of... It has been prudent to alert users when an update is coming and ask them to be attentive to any changes.
What do I think about the scalability of the solution?
We did have some scalability issues early on. I believe we pushed the envelope on the number of test cases, and suffered some performance issues and, in effect, outages. I believe some improvements were made to address our experience. It seems like things have been tuned, and our organization now has 2 instances (for a number of reasons). I'm not aware of significant issues with scalability anymore.
How are customer service and technical support?
Customer Service:
It has been OK. Last time I looked into it, I thought the license management tools were a little weak: OK for floating licenses, but I'd have liked the ability to export records for all license usage to Excel (as the tool does well elsewhere) so I could do my own analysis.
Technical Support:
It's been one of the best aspects of Jama in my experience. The support people will dig for a work-around when I've run into a problem. The creation of an online community at the start of 2015 has been very helpful. There are any number of capable users willing to share their experience.
Which solution did I use previously and why did I switch?
TestLog. It was "out of gas" for our needs, and probably had been for some time. In theory it had a web interface, but it was definitely not scalable to our needs.
How was the initial setup?
Using SaaS was pretty easy, of course, but configuration was necessary. In the beginning we had just *one person* set up the framework for accounts, groups, and permissions, and it has stood up well as a result. We got a little consulting help early, but still made a few configuration mistakes (in retrospect). Within a couple of weeks the first team had test cases ported and was using Jama. Rarely did we need to revisit the prior tool.
What was our ROI?
I don't have any hard numbers to share. I know the test coverage is significantly improved, and now we can check it easily and continuously. The cost of licensing has gone up noticeably since we started with Jama, but its usage has nevertheless far outstripped our original expectations.
What's my experience with pricing, setup cost, and licensing?
Ask for help pulling data about your usage. Floating licenses seem proportionally more expensive than in some other tools, but floating usage is tracked reasonably well. You might start out with more floats and see how it goes, unless you know you have folks (like QA) who will be using the tool 8 hours/day. The Enterprise licensing model seemed to work out.
Which other solutions did I evaluate?
We evaluated a number of other tools. There were more comprehensive and configurable options, but they would have required more dedicated in-house resources to configure and manage. Simpler, less expensive options seemed like they might be outgrown in a few years. Jama was a good compromise for the anticipated usage.
What other advice do I have?
We set up several projects to tinker in early on, and that was quite helpful. We made some tweaks after porting, but got it mostly right at the start. We were wary of getting caught up in analysis paralysis, but I believe within 6-8 months all teams had moved to Jama and never really looked back.