We use it for enterprise-level solutions.
While we are testing, when data is not accessible or we need to generate data quickly, TDM comes in handy. We can create batch files as well, and we can write scripts that automatically create data and integrate them with our automated dev scripts. This feature is very good. We have used these kinds of features for smaller solutions, although not at a very large scale, because of the complexities involved in enterprise-level data.
The entire tool is good, and I particularly like the synthetic data generation. It's valuable because, when you don't have Prod data, you can create multiple copies of similar data instead. You can write rules and create permutations and combinations according to your needs, or you can take a snippet of the Prod data and replicate it. All of that is really helpful.
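The rule-based generation described above can be sketched in generic terms. To be clear, this is not CA TDM's actual API; it is a minimal Python illustration, with hypothetical field names and rules, of how declared value rules can be expanded into permutations and combinations of synthetic rows.

```python
import itertools
import random

# Hypothetical rule set: each field maps to its allowed values.
# In a real TDM setup these rules would come from the tool's repository.
rules = {
    "country": ["US", "UK", "IN"],
    "account_type": ["savings", "checking"],
    "status": ["active", "closed"],
}

def all_combinations(rules):
    """Yield every permutation of the rule values (exhaustive coverage)."""
    keys = list(rules)
    for combo in itertools.product(*(rules[k] for k in keys)):
        yield dict(zip(keys, combo))

def random_rows(rules, n, seed=None):
    """Sample n random rows when exhaustive coverage would be too large."""
    rng = random.Random(seed)
    return [{k: rng.choice(v) for k, v in rules.items()} for _ in range(n)]

rows = list(all_combinations(rules))
print(len(rows))  # 3 countries * 2 account types * 2 statuses = 12 rows
```

The same idea scales down gracefully: when the full cross-product of rules is too large, a seeded random sample keeps the generated data reproducible across test runs.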
The UI could be improved, and I see they are moving to a web-based version. That's still in progress, but I really hope it all happens pretty soon and the entire UI gets migrated from the desktop to the web.
The integration with various utilities is also really important. That still has to happen. That's a major area for improvement.
It has become pretty stable over the past couple of years. When it started it had issues but right now, I don't think there are any major issues.
It's a tool, so scalability depends on how you use it. Scalability is pretty relative: it provides a lot of features and it's up to you how you utilize them. It's pretty scalable. It has automated features, and I don't think there is any other tool on the market that provides such a level of automated solutions. The demand in the industry, with respect to enterprise solutions, is pretty complex, and CA TDM is pretty good. It is scalable, but not to the extent that a foolproof enterprise solution can be built with this tool alone.
Support is pretty good. We get answers to problems most of the time and, if we don't, they get in touch with the tech team and we get on a call with them and we figure it out together.
The setup is of medium complexity. It's been a long time since I set it up; I have had it on my laptop for a long time, but this is what I remember. The configuration is not a matter of clicking a button and then starting to use it. It has its own steps: you register the repository, etc., to get into the tool. The installation itself is fine, but configuring it and getting it ready to use could be better.
The time it takes depends. At times I have installed it in a couple of hours, but I have also gotten stuck. I don't remember all the issues I faced, it's been a while, but I do remember that I had issues.
Every project and every implementation has to have a strategy. There are a few basic things that we look for, and we follow a checklist to see whether the project is feasible for TDM, model-based testing, or some other solution. As far as implementation strategies are concerned, they are very specific to the client and the kind of ecosystem the client has. The basic strategy would be to not go "big-bang," to start with the basic and medium-complexity tests to show the ROI, and then roll it out one by one across the enterprise. But there can be a lot of nuances in the strategy document.
In terms of the number of staff needed for deployment, to start with we would not need more than two people to perform the PoC and do due diligence on the requirements. We would need two to three people in a bigger organization and one person for a smaller solution. It depends on the requirements and on how much work is involved. To maintain it, one person should be enough.
Nothing happens quickly. It requires a minimum of six to eight months to show a return on investment. You are going to invest in the tool, then do training, then roll it out. And organizations have different project teams; they have to change their mindset, and that process takes time. It's good when it happens. Once you have the system in place, after something like a year and a half you'll see a good enough return on investment. That's the strategy we have, but we have to convince the client so that they understand this approach.
The problem is that the cost of this tool is pretty high. Even if an organization likes the tool, at times it becomes difficult for us to sell the license. CA provides licenses for different utilities like masking but even if you break it up, the pricing is still high.
IBM Optim is one competitor, as is Informatica. IBM has come up with a synthetic data feature in recent years, although I don't recall the name of the tool it acquired. Informatica, unlike IBM Optim, does not provide synthetic data yet.
Normal TDM features, like masking, are provided by both IBM and Informatica. People usually go for Informatica because it is easier for them to adopt the tool. Informatica is a very popular tool on the market for basic TDM-related activities, and it's not as costly as CA TDM.
I have been acquainted with this tool for three-and-a-half years and, since it was acquired by CA, we have worked pretty closely with CA to give feedback on what is expected out of the tool. We have worked very closely with the developers, as well, to enhance the tool.
We have two or three clients using it.