Rally Software Review

The BQ score lets us evaluate whether a team has fully understood the requirements.


How has it helped my organization?

Prior to using CA Agile Central, our velocity was not quantified, i.e., we were not able to measure how many user story points we had delivered or how much work we were doing.

Using the CA Agile Central tool organization-wide gives us a clear picture of how many user stories we can pick up and the capacity of the entire team. It is a very good tool; we can see the capacity and the overall velocity of the team, and this is how it has improved our deliverables. Earlier, we used to deliver in roughly eight-week sprints. Our company has now been using it for about two to three years, the team has matured in the process, and we have really improved our deliverables, reducing the cycle to the five-week sprints that we now give to our clients.

What is most valuable?

One of the features that I like is the discussion thread. If someone wants to discuss something, the discussion happens within that particular feature or user story itself, rather than over email. You can subscribe to a thread to follow whatever discussion is taking place and receive an email whenever something new is added to it, which is a great feature.

Another valuable feature, having used JIRA as well, is the BQ score that we assign after grooming sessions. It gives us an important way to evaluate whether the team has fully understood the requirements, so this scoring of how well the team is doing is really a good feature.

The rating gives me assurance. Initially, I got a three-star rating, as my teammates were not clear on most of the things. However, once I gave them some clarification and they were comfortable with it, they changed the rating to around four or five. This gives me the confidence to do a release plan, since my team is confident and has given me a higher rating, which is a great thing.

What needs improvement?

The product is really good and there is very little that needs to improve.

The only thing that I can think of is the acceptance criteria section, which is located far below the user story description. Sometimes, people looking at a user story miss the acceptance criteria because they have to scroll down to find them. It would be better to display them at the top, alongside the user story, without having to scroll down; perhaps on another screen where you can see the acceptance criteria field.

For example, what I have often seen is that, instead of going to the acceptance criteria field after the user story, our developers come to me and ask where it is. That means they haven't read through the entire user story to the bottom of the page. The practice I have asked them to follow is to read the notes right at the bottom of the user story in order to reach the acceptance criteria field. So, when looking at a laptop screen, there should be a button aligned somewhere at the top, perhaps within the description box itself, that immediately points to where the acceptance criteria field is located, so that no one has to scroll up or down to look for it. This is the only thing that I found needs improvement; the rest is great.

What do I think about the stability of the solution?

We have experienced some crashing instances.

Sometimes, I enter my password, then something gets downloaded on my desktop and it asks me to enter my password for the CA Agile Central tool again. Over a two-year timeframe, this has forced me to log in and change my CA Agile Central password around three to four times. It doesn't give me any notification that the product is down or undergoing maintenance. Caution notes are sometimes displayed, such as "This is scheduled maintenance time and CA Agile won't be available currently." However, there have been various instances where people did not get any such notification and were logged out of their accounts.

There are sometimes minor stability issues, such as entering your password and getting a message that you have to enter it again; we have had some instances where users were logged out of their accounts.

We have two workspaces, and while switching from workspace 1 to workspace 2 recently, we have seen some records getting lost, i.e., they ended up either in the recycle bin or orphaned. We noticed this for the first time while migrating workspaces.

Most of the time, to get a report, we go to the user stories, grab the columns into one particular view, and export it to Excel. From there, we build a pivot table to extract the exact data that we need. For example, sometimes we need to perform a metrics analysis to know how many defects were encountered in a particular release. In this case, we cannot simply make a comparison chart; we can search by a particular criterion, but if we want a graphical representation/chart of the data, the tool will not show that.

For example, I wrote user stories before my PA planning; after the planning, two got deleted and three got added. We want to see, compared with the previous user stories, how many got deleted or were newly added, in a graphical chart. This is something on the metrics side.

Thus, metrics is an area CA can improve, so that we can get this information easily without having to export data to Excel.
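In the meantime, a possible stopgap is to pull such counts programmatically through Rally's Web Services API (WSAPI) instead of hand-exporting views to Excel. The following is a minimal sketch, not an official recipe: it uses the public v2.0 WSAPI endpoint to count the defects associated with a release, and the API key and release name are placeholders.

```python
# Minimal sketch: count the defects tagged to a release via Rally's
# Web Services API (WSAPI) v2.0. The API key and release name are
# placeholders; a real key is generated under your Rally profile settings.
import requests

RALLY = "https://rally1.rallydev.com/slm/webservice/v2.0"
HEADERS = {"ZSESSIONID": "_your_rally_api_key"}  # WSAPI accepts an API key here

def defect_count(release_name: str) -> int:
    """Return the total number of defects associated with a release."""
    params = {
        "query": f'(Release.Name = "{release_name}")',
        "fetch": "FormattedID,Name,State",
        "pagesize": 1,  # only TotalResultCount is needed for a count
    }
    resp = requests.get(f"{RALLY}/defect", headers=HEADERS, params=params)
    resp.raise_for_status()
    return resp.json()["QueryResult"]["TotalResultCount"]

if __name__ == "__main__":
    print("Defects in Release 1:", defect_count("Release 1"))
```

From there, the same counts could be fed into any charting library rather than an Excel pivot table.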

What do I think about the scalability of the solution?

Although it is a manual task, we can get the data in a table format; being manual, it is not straightforward. However, we have a large number of users doing this kind of metrics analysis. We have around 52 to 55 product owners, each managing up to three scrum teams, so you can take an average of around two scrum teams each. At the end of every release/sprint, we pull up this data to know exactly what happened during that particular release, i.e., where we did well or not. There are charts where we can do burnup/burndown and see all the variation from the time user stories are accepted until they are completed.

We need similar metrics for other criteria as well, such as how many defects there are, or the BQ scores that have been given to the user stories. For example, if my release plan comprises 30 user stories, how can I identify how many of them were not created well, due to issues such as the requirements not being clear enough, the stories being too big, or whether or not they had architectural significance? If we get this type of metric, then we have the justification for it. However, if we have to pull the data out manually and we haven't grabbed the correct parameters, we can miss some of the criteria for those 30 user stories, and we won't be able to pinpoint exactly why the release plan was not carried out correctly.
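If the BQ score is stored as a custom field, the same API approach sketched earlier could, in principle, produce this breakdown automatically. The sketch below assumes a custom field named c_BQScore, which is a placeholder: custom fields in WSAPI v2.0 carry a "c_" prefix, but the actual field name depends on how the workspace is configured.

```python
# Sketch: tally the user stories in a release by a hypothetical custom field.
# "c_BQScore" is a placeholder; real custom fields get a "c_" prefix in
# WSAPI v2.0, but the actual field name depends on your workspace setup.
from collections import Counter
import requests

RALLY = "https://rally1.rallydev.com/slm/webservice/v2.0"
HEADERS = {"ZSESSIONID": "_your_rally_api_key"}  # placeholder key

def bq_breakdown(release_name: str) -> Counter:
    """Count the stories in a release, grouped by their BQ score."""
    params = {
        "query": f'(Release.Name = "{release_name}")',
        "fetch": "FormattedID,c_BQScore",
        "pagesize": 200,  # first page only; loop with "start" for more results
    }
    resp = requests.get(f"{RALLY}/hierarchicalrequirement",
                        headers=HEADERS, params=params)
    resp.raise_for_status()
    stories = resp.json()["QueryResult"]["Results"]
    return Counter(s.get("c_BQScore") or "unscored" for s in stories)

if __name__ == "__main__":
    for score, count in sorted(bq_breakdown("Release 1").items(),
                               key=lambda kv: str(kv[0])):
        print(f"BQ score {score}: {count} stories")
```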

How are customer service and technical support?

I would rate the technical support at around nine out of ten. They were very proactive in looking at the issue and providing the correct guidance, so I appreciate them.

Which solution did I use previously and why did I switch?

My current company has been using the CA Agile Central solution from the very beginning.

At my previous company, I used Microsoft Team Foundation Server (TFS) for similar purposes. It was not very user-friendly. For example, in its UI there is a big chance that you will get lost somewhere and not know how to come back to the point where you started. That wasn't very good; the product probably isn't mature yet.

Since I started using CA Agile Central, I have found it to be very good in those terms. We have feature descriptions, detailed user stories, attachments, and discussions, and we can even see the revisions. Having those features in different tabs gives us the flexibility to see what is going on and who changed what. One can even add a tag to indicate whether a user story is of architectural or UX significance.

This tool is most productive in my day-to-day job.

How was the initial setup?

The setup is not very complex. As soon as you log in, the dashboard is really cool; you immediately want to look at the graphs and charts, which is very nice.

What other advice do I have?

There are many tools out there in the market. I have worked with both JIRA and Microsoft TFS, so I can see clearly that the CA Agile Central tool is fully developed. It also has a timebox- and sprint-based UI, and it is very easy to use.

When I used TFS, it was very clumsy and you couldn't find your way back to your starting point. Here, you have everything; it is very flexible, simple, and decent. You can start anywhere and return to the very same point at the end.

I would suggest to most product development companies that, if they want to track their user stories, they use a very simple tool like the CA Agile Central solution. It gives you a number of functionalities along with a very decent UI. The UI is not very fancy, but it gives you a very clear picture of the status of your features and user stories. So, my recommendation would be to go with the CA Agile Central solution; it is a very good tool.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.