CA Test Data Manager Review

With synthetic data generation, we can test applications with three or four times the production load. We would like to see it generate synthetic data for non-relational DBs.


What is most valuable?

One of the most valuable features to us is synthetic data generation. We generate a lot of synthetic data for our performance testing, bulking up our performance environment to see how much load it can sustain. So far, we've been doing this for relational data structures.

At a recent conference, I was talking to the product management team. We have a big use case for synthetic data generation for non-relational data structures. They have it on their roadmap, but we would love to see it coming out very soon. With modernization, relational databases are going away and non-relational databases are coming up. That's a big use case for us, especially with our graph database. We have a huge graph database, and we need to generate a lot of synthetic data for it.

How has it helped my organization?

It has really changed the culture in the company, because nobody could ever imagine generating millions of records. Even production systems have just a couple of million records. When you want to test your applications with three or four times the production load, you can never actually achieve it, because there is no other way besides synthetic data generation. You can't have that volume of data in your DBs. Even if you subset your entire production, you would get just 1x of it. To get 3x or 4x of it, you have to go to either data cloning or synthetic data generation.
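To make the idea concrete, here is a minimal, generic Python sketch of generating synthetic records at a multiple of production volume. This is not CA Test Data Manager's actual API; every name and field here is an illustrative assumption:

```python
import csv
import random
import string

def synthetic_record(record_id):
    """Build one fake row; the fields are purely illustrative."""
    name = "".join(random.choices(string.ascii_uppercase, k=8))
    balance = round(random.uniform(0, 10000), 2)
    return {"id": record_id, "name": name, "balance": balance}

def generate(production_count, multiplier, path):
    """Write multiplier x production_count synthetic rows to a CSV file."""
    total = production_count * multiplier
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "name", "balance"])
        writer.writeheader()
        for i in range(total):
            writer.writerow(synthetic_record(i))
    return total

# e.g. 3x the load of a 2-million-row production table:
# generate(2_000_000, 3, "load_test.csv")
```

The point is simply that synthetic generation is bounded only by how many rows you ask for, whereas subsetting production can never exceed 1x.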

What needs improvement?

The solution could really improve on non-relational data structures, because that's a big industry use case we are foreseeing. I'm talking about databases, and I'm also talking about request-response pairs: the services data generation. We use it so much for virtualization. If we could create web services request-response pairs non-relationally, supporting GET, POST, and so on, that would be a big win for us.

For how long have I used the solution?

I've been using CA Test Data Manager since it was first released as Datamaker about 2.5 years ago. I've been using it pretty regularly since then. It has undergone a big, big transformation. There is a lot of good stuff coming up.

What do I think about the stability of the solution?

We still use the old thick-client version of it, but we have seen the demos as it moves to the web interface. I think it's going to be very stable going down the line.

What do I think about the scalability of the solution?

It is not very scalable, because even generating a couple of million records takes six to seven hours. If cloud muscle power could be included with it, that would be great for us. Since it's all synthetic data, there is no PII in it, so the generation could be done on a cloud instance with many gigabytes of memory.

How is customer service and technical support?

Technical support is getting better, but also slower, at the same time. When I started my interaction with Grid Tools, they used to work on the bleeding edge of technology. For whatever enhancement requests we submitted, the turnaround time was a couple of weeks, and we would get whatever new features we needed. The processes were really ad hoc: rather than writing support tickets, you would literally reach out to somebody you knew who worked on the product, and they would pass your ticket or enhancement request from person to person. Now the process is much more streamlined, but we have lost that turnaround time.

What other advice do I have?

When selecting a vendor, my personal requirements would be that the tool should be stable and that there should be a knowledge repository for it. A PowerPoint presentation just gives you an introduction to the tool and its capabilities. To really get your hands dirty, you need in-depth videos or documentation to work from.

I think the more webinars you do, the better. If you can record and archive the webinars, that would be great. It would also be great if you could try to solve some more complex use cases in your demos. Most companies give you a demo of new features with zero complexity. Then, when you are looking at the demo and trying to solve your own use cases, you get stuck; you can't proceed any further because your use cases are really more complex than what was shown in the demo. If they can come up with more in-depth videos that show really complex use cases, that's going to be great.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.