Broadcom Test Data Manager Overview

Broadcom Test Data Manager is the #3-ranked solution among top Test Data Management tools and top Data Masking tools. IT Central Station users give Broadcom Test Data Manager an average rating of 8 out of 10. It is most commonly compared to Informatica Test Data Management (TDM). The top industry researching this solution is computer software; professionals from computer software companies account for 40% of all views.
What is Broadcom Test Data Manager?

CA Test Data Manager offers an automated solution to one of the most time-consuming and resource-intensive problems in Continuous Delivery - creating, maintaining and provisioning of the test data needed to rigorously test evolving applications. CA Test Data Manager uniquely combines elements of data subsetting, masking and synthetic, on-demand data generation to enable testing teams to meet the agile needs of the organization.

Broadcom Test Data Manager is also known as CA Test Data Manager, DataMaker, DataFinder, Fast Data Masker, CA TDM.

Broadcom Test Data Manager Customers

Manheim, Arag

Archived Broadcom Test Data Manager Reviews (more than two years old)

ITCS user
IT Specialist at a financial services firm with 1,001-5,000 employees
Real User
Masks and generates data while obeying the relationships in our relational databases

Pros and Cons

  • "The data generation is one of the most valuable features because we are able to write a lot of rules. We have some specific rules here in Turkey, for example, Turkish ID IBAN codes for banks."
  • "There are different modules for masking. There is a portal and there is a standalone application as well. The standalone application is more old-fashioned. When you write rules on this old-fashioned interface, because it has more complex functions available for use, you can't migrate them to the portal."

What is our primary use case?

We use it for data generation, for performance testing, and other test cases. We also use data masking and data profiling for functional testing. Data masking is one of the important aims in our procurement of this tool because we have some sensitive data in production. We have to mask it to use it in a testing environment. Our real concern is masking and we are learning about this subject.

How has it helped my organization?

CA TDM is valuable for us because we use relational databases where it's problematic to sustain the relationships, foreign keys, and indexes. TDM obeys all the relationships and does the masking and data generation according to those relationships. 

Also, the testing team is using TDM to write the rules. Using this tool, our data discovery skills have increased. That is an advantage for our company.

In terms of performance testing, before TDM, preparing the data and data generation took a week for 20,000 sets of data. Now, with TDM, it takes just one day, which is great. We haven't had much experience with masking yet, as we are still in the adaptation phase, but data generation has increased our performance by about 60 percent.

What is most valuable?

The tool has strong data generation functions. When we needed a special function that was not in the list, the support team generated it and added it with a patch in a limited time frame.
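
To make the kind of custom rule concrete, here is a minimal Python sketch of a generator for valid Turkish IBANs using the standard ISO 13616 mod-97 check digits. It only illustrates the logic such a generation rule encodes - it is not CA TDM's actual function syntax - and the default bank code is a made-up placeholder.

import random

def iban_check_digits(country: str, bban: str) -> str:
    # ISO 13616: append country code + "00", convert letters to numbers (A=10 ... Z=35),
    # then the check digits are 98 minus the remainder of that number modulo 97.
    converted = "".join(str(int(ch, 36)) for ch in bban + country + "00")
    return f"{98 - int(converted) % 97:02d}"

def generate_turkish_iban(bank_code: str = "00001") -> str:  # hypothetical default bank code
    # Turkish BBAN layout: 5-digit bank code + 1 reserved digit + 16-digit account number.
    bban = bank_code + "0" + "".join(random.choices("0123456789", k=16))
    return "TR" + iban_check_digits("TR", bban) + bban

print(generate_turkish_iban())  # prints a 26-character IBAN starting with "TR"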

For performance testing, we needed large amounts of data, and the effort required to generate it has decreased significantly.

Due to the security policies and regulations we have to obey, we needed masked production data for testing. With the help of this tool, we can mask the data in a variety of ways (shuffling, seed lists, functions, etc.) while preserving data integrity.
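
For illustration, the seed-list and shuffle techniques mentioned above boil down to logic like the following Python sketch (the seed list is an invented example). A deterministic hash is used for the seed-list lookup so that the same original value always maps to the same fake value, which is what keeps related tables consistent.

import hashlib
import random

SEED_NAMES = ["Ayse Yilmaz", "Mehmet Demir", "Fatma Kaya"]  # hypothetical seed list

def mask_from_seed_list(value: str, seed_list=SEED_NAMES) -> str:
    # Deterministic pick from the seed list: identical inputs get identical fakes,
    # so foreign-key columns masked this way still join correctly.
    index = int(hashlib.sha256(value.encode()).hexdigest(), 16) % len(seed_list)
    return seed_list[index]

def shuffle_column(values: list) -> list:
    # Shuffling keeps the real value distribution but detaches values from their rows.
    shuffled = list(values)
    random.Random(42).shuffle(shuffled)
    return shuffled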

What needs improvement?

There are different modules for masking. There is a portal and there is a standalone application as well. The standalone application is more old-fashioned. When you write rules on this old-fashioned interface, because it has more complex functions available for use, you can't migrate them to the portal. 

We also have some security policies in our company that needed adaptation. For example, the people writing the rules would see all the production data, which is a large problem for us. It would be helpful if there was an increase in the ability to apply security policies.

For how long have I used the solution?

One to three years.

What do I think about the stability of the solution?

The tool is stable. This was one of the reasons that we chose it. We haven't had an issue with any unknown problems or issues, so it has paid off.

What do I think about the scalability of the solution?

Scalability is a matter of how you use your systems. Our requirements called for using it with Microsoft SQL Server, Db2, and Db2 LUW. We scaled the tool across all the databases we have, so it's scalable.

How are customer service and technical support?

Technical support is okay. We haven't had many issues lately, but we had a bug at the proof of concept stage and they solved it.

Which solution did I use previously and why did I switch?

We did not have a previous solution.

How was the initial setup?

The initial setup was straightforward. One of CA's consultants came to our company and did the installation in about two days. We use mainframes here, and mainframes are very complex. Still, the consultant did it in two days.

What about the implementation team?

We worked with a CA consultant to do all the adaptation over the course of about two months. We were happy with him.

What's my experience with pricing, setup cost, and licensing?

Part of the licensing is dependent on whether you want to use the portal. It's based on floating users. The other part is dependent on what type of system you are using. We are using mainframe, so we paid good money for a mainframe license. It's okay because, for us, the main work of this tool is on those systems. The mainframe is a critical system, so the cost is okay.

Which other solutions did I evaluate?

We looked at IBM Optim and Informatica TDM.

What other advice do I have?

It's important to know the requirements of your system, for example, the security policies you have to observe. The requirements may include a concern about relational or other database systems. You have to know your systems. Depending on your system, consider using one or more consultants, because we had a problem just using one. Also, compare all the tools by doing proofs of concept. That's important.

We have been using it for three months, but before that we also did a proof of concept in stages for about a year.

Regarding future use, we plan to use it in automation testing with continuous integration tools. Before running the automated tests, we will prepare our generated data with TDM. We also have a future plan for storage virtualization and use of Docker applications. It is possible that for Docker we would also use the TDM rule set. I want to believe it's scalable.

We have five testers using it to write rules. We also have 20 business analysts using and running these rules. In terms of maintenance, two developers would be enough. Our consultant coached our developers regarding our requirements. A testing engineer would also be okay for maintenance.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PB
Front Line Manager at an energy/utilities company with 1,001-5,000 employees
Real User
Leaderboard
We use most of the test matching features in our testing processes

Pros and Cons

  • "I like the integration a lot. We use the test matching feature and the ability to make and find data."
  • "When I run my App Test, I can directly connect with TDM. I can publish all my data into one table in TDM, then run my App Test directly."
  • "When we publish a lot data into the target, sometimes it is not able to handle it."
  • "The relationship between cables needs to be added."
  • "A lot of research, data analysis, and work needs to be done on the source system in the tool which requires a data expert. This tool definitely requires a senior person to work on this issue, as it might be a challenge for a tester."

What is our primary use case?

We do a lot of data model creation and use the test matching features. We also established the subsetting and cloning process, based on the client's requirements. Creating data models and using the test matching features are what we have found fits our purpose. We use most of the test matching features in our testing processes, and the integration with App Test is also something we use heavily.

What is most valuable?

We can always get different sets of data and obtain the most recent data. They have production refreshes, from which we can take subsets of the data to do our testing. We can also lock data down for a particular user's test, or generate data. For example, if the data is not there, there are multiple forms and variables through which we can generate it.

When I run my App Test, I can directly connect with TDM. I can publish all my data into one table in TDM, then run my App Test directly. Essentially, one test runs with 100 sets of data: I click one test and it runs against all 100 sets.
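
The pattern being described - one test definition driven by a table of published rows - looks roughly like the pytest sketch below. The CSV file, column names, and create_account stand-in are hypothetical; the reviewer's setup uses CA App Test rather than pytest, so this is only an analogy for the data-driven idea.

import csv
import pytest

def load_rows(path: str = "tdm_export.csv"):
    # Hypothetical CSV export of the table that the generated rows were published into.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def create_account(customer_id: str, iban: str) -> bool:
    # Stand-in for the real system under test.
    return bool(customer_id) and iban.startswith("TR")

@pytest.mark.parametrize("row", load_rows())
def test_create_account(row):
    # One test definition, executed once per data row: 100 rows means 100 test cases.
    assert create_account(row["customer_id"], row["iban"])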

I like the integration a lot. We also use the test matching feature and the ability to make and find data.

What needs improvement?

The relationship between tables needs to be added.

A lot of research, data analysis, and work needs to be done on the source system in the tool which requires a data expert. This tool definitely requires a senior person to work on this issue, as it might be a challenge for a tester.

For how long have I used the solution?

One to three years.

What do I think about the stability of the solution?

There have been major upgrades in the last couple of years. There was a stability issue, when I started to work on it. After we upgraded to 4.0, a lot of these problems were solved and there were advanced features, as well. With 4.0, the product is now really stable, and working fine.

The stability issue was that we used to log in and see errors. We had to work around them. For example, we would log through the admin, then run some queries, and log back in again. After the upgrade was done, everything now appears to be fine. 

What do I think about the scalability of the solution?

When we publish a lot of data into the target, sometimes it is not able to handle it. What we normally do is create the data models of the source and try to publish short sets of data into the target. Many times the publish will fail, and I think the reason is the huge sets of data that we publish. I am not sure if it is a tool issue, but I have seen it many times: it is not able to publish a huge set of data (thousands and thousands of rows), while with a few sets it works. Though I have not tried it lately, we have seen this before.
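
A common workaround for this kind of bulk-publish failure is to split the generated rows into smaller publishes, along the lines of the sketch below. The publish_batch callable is a hypothetical stand-in for whatever performs the actual publish (an insert, or a call to the publish API).

def publish_in_batches(rows, publish_batch, batch_size=500):
    # Publish the data set in fixed-size chunks so one oversized publish
    # cannot fail the whole run and the target is never overwhelmed.
    for start in range(0, len(rows), batch_size):
        publish_batch(rows[start:start + batch_size])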

How are customer service and technical support?

Tech support for TDM is fine. I have logged maybe one or two issues for TDM. Most of our issues are for App Test. 

Which solution did I use previously and why did I switch?

We did not use another solution before TDM.

How was the initial setup?

The setup was pretty straightforward.

What's my experience with pricing, setup cost, and licensing?

Know all the data requirements before buying this product. There are a lot of features for this tool, which may or may not be useful to a particular company. Make sure of what your requirements for your data are. 

What other advice do I have?

I would certainly recommend this product because of the vast variety of data it provides for testing and its different features, like subsetting and cloning. I have heard of products which can do cloning and similar things, but there are many additional features in this tool which are very useful for testing and finding defects.

We found we mostly use one or two features, so we need to be very clear on what we need before choosing a product. Test Data Manager is good, and there are a lot of advantages that you get from using the tool, especially for testing and integrating with App Test, which is something that we use a lot.

Disclosure: My company has a business relationship with this vendor other than being a customer: Partner.
Jyotirmay Mishra
Senior Project Manager /Senior Solution Architect at Cognizant
Consultant
Data privatization, provisioning, and generation for DevOps and CI/CD pipeline

What is our primary use case?

It is the test data management solution for our DevOps model, which is very useful. Data privatization (GDPR enablement), synthetic test data generation, and test data provisioning are its main interesting features.

How has it helped my organization?

On-time production data and real-time data for the DevOps testing environment and CI/CD pipeline.

What is most valuable?

Data privatization, provisioning, and generation for DevOps and CI/CD pipeline. 

What needs improvement?

  • More features for data privatization in Big Data environments.
  • Domain-specific synthetic data generation.
  • SAP test data generation for SAP testing.

For how long have I used the solution?

Still implementing.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
it_user797949
Domain Manager at KeyBank National Association
Video Review
Real User
Enables us to incorporate automation and self-service to eliminate all of our manual efforts

Pros and Cons

  • "It removes manual intervention. A lot of the time that we've spent previously was always manually, as an individual running SQL scripts against databases, or manually going through a UI to create data. These solutions allow us to incorporate automation and self-service to eliminate all of our manual efforts."
  • "Core features that we needed were synthetic data creation, and to be able to do complex data mining and profiling across multiple databases with referential integrity intact across them. CA's product actually came through with the highest score and met the most of our needs."
  • "All financial institutions are based on mainframes, so they're never going to go away. There are ppportunities to increase functionality and efficiencies within the mainframe solution, within this TDM product."

How has it helped my organization?

The benefit is that it removes manual intervention. A lot of the time that we've spent previously was always manually, as an individual running SQL scripts against databases, or manually going through a UI to create data. These solutions allow us to incorporate automation and self-service to eliminate all of our manual efforts.

What is most valuable?

Currently, it's the complex data mining that we do. Any financial institution runs into the same challenges that we face: maintaining referential integrity across all databases, and finding that one unique customer record that meets all the criteria we're looking for. All the other functions, such as subsetting and data creation, are fabulous as well.
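
As an illustration of that data-mining problem, finding one customer who satisfies criteria spread across related tables amounts to a join like the one below; the schema and criteria are invented for the example.

import sqlite3

# Hypothetical schema: customer, account, and card tables linked by customer_id.
FIND_CANDIDATE = """
    SELECT c.customer_id
    FROM customer c
    JOIN account a ON a.customer_id = c.customer_id
    JOIN card    k ON k.customer_id = c.customer_id
    WHERE c.state = 'OH' AND a.type = 'CHECKING' AND k.status = 'ACTIVE'
    LIMIT 1
"""

def find_test_customer(conn: sqlite3.Connection):
    # Returns one customer whose related account and card rows already satisfy
    # every criterion, so the referential links the test needs are guaranteed to exist.
    row = conn.execute(FIND_CANDIDATE).fetchone()
    return row[0] if row else None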

What needs improvement?

I think the biggest one is that all financial institutions are based on mainframes, and they're never going to go away, so there are opportunities to increase functionality and efficiency within the mainframe solution in this TDM product. Certainly, it does what we need, but there are always opportunities for greatly improving it.

What do I think about the stability of the solution?

Stability for the past year and a half has been very good. We have not had an outage that has prevented us from doing anything. It has allowed us to connect to the critical databases that we need, so no challenges.

What do I think about the scalability of the solution?

We haven't run into any issues at this point. So far we think that we're going to be able to get where we need to. In the future, as we expand, we may have a need to increase the hardware associated with it and optimize some query language, but I think we'll be in good shape.

Which solution did I use previously and why did I switch?

We were not using a previous solution. It was all home-grown items that we did out there - a lot of automated scripting and some performance scripting - in addition to manual efforts.

As we looked at what the solutions were, some of the core features that we needed were synthetic data creation, and the ability to do complex data mining and profiling across multiple databases with referential integrity intact across them. CA's product actually came through with the highest score and met most of our needs.

What other advice do I have?

I'd rate it about an eight. It provides the functionality that we're needing. There are always opportunities for improvement and I don't ever give anyone a 10, so it's good for our needs.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
it_user778602
Systems Architect at Global Bank Corporation
Real User
The initial setup was straightforward. Just install, next, and start using it.

Pros and Cons

  • "​The initial setup was straightforward. Basically, just install, next, and start using it​."

    What is our primary use case?

    We were having serious trouble delivering and provisioning the development and test environments for the different applications that we use at the bank. What we are trying to do is reduce the time for that delivery, and that is why we are exploring the use of CA tools. We already have the Service Virtualization tool, and now we are entering into the test data management phase.

    How has it helped my organization?

    We realized early on what we were looking for. Now, we are trying to get everybody to work with the DevOps mindset. That is the main advantage and benefit that we are seeing here.

    What is most valuable?

    They are able to provide some technical data, then provide that data to multiple teams for testing purposes. That is the most value for us.

    What needs improvement?

    We just started using it. I can't tell you right now.

    For how long have I used the solution?

    Still implementing.

    What do I think about the stability of the solution?

    I can't tell you, because we have not had it for a long time. We have had it for two months. We are starting to really implement it, but so far, so good.

    What do I think about the scalability of the solution?

    I can't really tell you much. I can tell you about the integration, because we are integrating it with the Service Virtualization tool. Here, we are having a lot of benefits.

    For example, we are starting to provide the data necessary to prove little pieces of code or small integration packages, and we have seen a lot of value there.

    How are customer service and technical support?

    We have a provider based in Costa Rica. They are the ones that are giving us all the implementation and support. 

    Specifically for problems, we have not had any issues yet.

    Which solution did I use previously and why did I switch?

    We were not previously using a different solution.

    We were already using some CA products before, like Service Desk and ITCM; those are related to infrastructure and the computers themselves. The provider talked to us about whether we wanted to get into DevOps. CA had the tools for implementing this strategy, and that is how we found out about it. We saw the demos, and then we decided the solution was made for us.

    How was the initial setup?

    The initial setup was straightforward. Basically, just install, next, and start using it. I figured it out right away.

    Which other solutions did I evaluate?

    We did look at SiSoft in Costa Rica, but I do not even know the names of the others, because I was not directly involved with those. I know that we selected SiSoft as an option because of their experience.

    We chose CA because of their experience.

    What other advice do I have?

    The first thing that you should start with is the requirements and automation. That is the first step. We started with Service Virtualization, then we found that we should have started with the requirements first. The natural processes are defined in the requirements; from there you go to the services that you are going to virtualize, and then to the data. But we started with Service Virtualization, and now we realize that we needed to start with the requirements first.

    It would be working better if the requirements were designed right.

    Most important criteria when selecting a vendor: 

    • Experience. The vendor already has to have big clients. 
    • The implementation process has to be very easy. It was.
    Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    it_user797916
    Senior Dev manager at T-Mobile
    Video Review
    Real User
    Gives us quality testing solutions that we can use repeatedly

    Pros and Cons

    • "When you have hundreds and hundreds of databases everywhere, how do you do that without looking for keys? If you do it manually, you're going to need a lot of people. With the tool, I work with five people on my team. Right now, we're modeling 700 different databases, and we can do it very quickly because of the tool."
    • "The additional feature probably would be a data reservation that is more robust, where you can actually use it consecutively."
    • "They could make it easier to model databases, because right now it is really technical, so to train someone on it takes a while."

    How has it helped my organization?

    The benefit is that it's something you can repeat over and over again, once a solution is in place. You can benefit from it as often as you want to, without having to constantly look for support.

    What is most valuable?

    The most valuable feature of this solution is data quality, that ability to provide data for testers and developers that is actually usable.

    What needs improvement?

    The additional feature would probably be a data reservation capability that is more robust, where you can actually use it consecutively. Right now it is a difficult way to do things, so data reservation, for us, is very important.
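
Conceptually, the data reservation being asked for is an atomic check-and-claim on a row, something like the simplified sketch below; the test_data table and reserved_by column are hypothetical.

import sqlite3

def reserve_row(conn: sqlite3.Connection, row_id: int, tester: str) -> bool:
    # Claim the row only if nobody else holds it; the single UPDATE is atomic,
    # so two testers cannot reserve the same data at the same time.
    cur = conn.execute(
        "UPDATE test_data SET reserved_by = ? WHERE id = ? AND reserved_by IS NULL",
        (tester, row_id),
    )
    conn.commit()
    return cur.rowcount == 1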

    Also, maybe they could make it easier to model databases, because right now it is really technical, so to train someone on it takes a while.

    What do I think about the stability of the solution?

    I would say very stable. At this point, the solution is very well-vetted and used and maintained, so we are able to get good results out of the tools that we are using.

    What do I think about the scalability of the solution?

    The solution - once it's in place - you really don't have to give a tool to the users. All you have to do is make it visible to them, so they can use it as a self-service solution, which is very helpful. You don't have to grow a team to have thousands of people. You could have a very small nimble team that does development, and then just propagate to everybody else to use with a click.

    Which solution did I use previously and why did I switch?

    There's no other way to get good data. You can't do it manually. It has to have some solution. You need a tool because federation of the data is the most important thing that you need to be able to test with.

    Our applications are very complex. When you have hundreds and hundreds of databases everywhere, how do you do that without looking for keys? If you do it manually, you're going to need a lot of people. With the tool, I work with five people on my team. Right now, we're modeling 700 different databases, and we can do it very quickly because of the tool.

    What other advice do I have?

    I give it an eight out of 10, only because it is complex. There are some things that still need to be worked out, but overall, it's a very good solid solution.

    Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    it_user778575
    QA Director at Sogeti UK
    Real User
    We are able to create test data for specific business case scenarios; it's user-friendly

    Pros and Cons

    • "The most valuable feature is the Portal that comes with the tool. That helps make it look much more user-friendly for the users. Also its ease of use - even for developers it's not that complicated."
    • "They should make the Portal a little more user-friendly, make it even easier to configure things directly from the Portal."
    • "There were some issues with initial setup. It wasn't as smooth as we had thought. We ran into a network issue, a firewall issue, things like that. It wasn't something we could not fix. We worked with CA support and with the client's team to fix it. But there were issues, it took a lot of time to install and configure."

    What is our primary use case?

    We are using it to implement test data management.

    It is a new implementation so there were some challenges. But so far, it has been good.

    What is most valuable?

    The most valuable feature is the Portal that comes with the tool. That helps make it look much more user-friendly for the users. 

    Also its ease of use - even for developers it's not that complicated.

    It gives us the ability to 

    • mask the data
    • sub-set the data
    • synthetically generate test data
    • create test data for specific business case scenarios

    and more.

    What needs improvement?

    • Addition of more data sources.
    • Make the Portal a little more user-friendly, make it even easier to configure things directly from the Portal.

    For how long have I used the solution?

    Less than one year.

    What do I think about the stability of the solution?

    It is stable, but it is not where even CA wants it to be. There have been numerous releases going on and there are still some we are waiting for. But, overall it's good.

    What do I think about the scalability of the solution?

    It is scalable. This particular tool is used by certain types of engineers, TDM engineers. But the recipient of the tool can be anybody, so it can be scaled for as many licenses as the customer is willing to pay for. It's kind of expensive.

    How are customer service and technical support?

    Tech support has been very helpful.

    They have been responsive as best they can. I'm assuming that they're very busy, and they are. They usually respond within the same day. And usually the requests that go to the technical support side are not that simple either, so I can understand that.

    Which solution did I use previously and why did I switch?

    We are partners with CA, so this was one of the strategic directions my company also wanted to take. And CA had the near-perfect solution, which we thought we should invest in, together.

    How was the initial setup?

    It was good. There were some issues. It wasn't as smooth as we had thought.

    We ran into a network issue, a firewall issue, things like that. It wasn't something we could not fix. We worked with the CA support and with the client's team to fix it. But there were issues, it took a lot of time to install and configure.

    Which other solutions did I evaluate?

    We are a consulting company, so when we go to a client we do an evaluation. Often we have to tell them about the different products we evaluated. So in this case, CA TDM has competition: Informatica has a similar product called Informatica TDM, and IBM has a similar product called IBM InfoSphere Optim. These are the main competitors of CA.

    What other advice do I have?

    When selecting a vendor the important criteria are 

    • ease of use
    • responsiveness of the technical support
    • forward-looking products. By that I mean, do they have a plan for the next three months, six months, a year - not just making the product and then forgetting about it.

    For this particular area, test data management, because I am involved in evaluating other companies' products as well, CA so far is the leader. I personally compare each feature for all the companies we evaluate. So far CA is the number one. There is still some improvement to be done, which CA is aware of. But I think I would advise a colleague that we can start with CA.

    Disclosure: My company has a business relationship with this vendor other than being a customer: Partner.
    Adam Topper
    Senior Test Data Management Specialist at a transportation company with 10,001+ employees
    Video Review
    Real User
    What I find to be most valuable is its ability to do synthetic data creation

    Pros and Cons

    • "There is other stuff that they are working on right now, and that includes things like the vTDM or Virtual Test Data Management. That is where they have the ability to do test data clones. It is really neat because after doing some of that creation, or if you are going to do some subsetting, then you have a great looking database."
    • "What I find to be most valuable is its ability to do synthetic data creation. I love that because it has a lot of flexibility and you do not have to worry about one specific database or how you are going to manage all the data points."
    • "I have always gotten a call back within an hour from CA's technical support solutions and they are a wonderful team to work with."
    • "As the solution continues to evolve, the one thing I like about it is the API-friendly layers that they have added into the realm. So, I just would like to see more support around that and more usability."

    How has it helped my organization?

    The benefits of doing something like the data creation is that you are going to be able to totally have control of your data from the get-go. You are not worrying about "what you see is what you get" kind of results from a production set. Instead you are just using all sorts of built-in functions, seed lists, or data calls which are live at that moment to be able to really manipulate your data and create exactly the data set that your testers need. That is very powerful. 

    Then on the vTDM side, which is the database cloning, the ability for them to have full control of their environment is the most important aspect of testing sometimes. Now, you do not have to worry about what is on your left, what is on your right, and who you are going to be hurting by trying to do the best testing that you can. Instead you have just your own set that you can work with. You can spin it up and burn it down when you are done.

    What is most valuable?

    TDM has tons of great solutions involved in one package. For me personally, what I find to be most valuable is its ability to do synthetic data creation. I love that because it has a lot of flexibility and you do not have to worry about one specific database or how you are going to manage all the data points. Instead of taking everything from production and wondering what you are going to get from there, you can just create it all from the get-go yourself. That is a beautiful thing to be able to do.

    There is other stuff that they are working on right now, and that includes things like the vTDM or Virtual Test Data Management. That is where they have the ability to do test data clones. It is really neat because after doing some of that creation, or if you are going to do some subsetting, then you have a great looking database. What you can do now is take a really small copy - actually a full-size copy, but really small in size - and send that out to any of your testers to use personally for their testing. That is cool because it gives them an expanded way to do their testing. They can do tons of unit tests, functional tests, and destructive tests, and they do not have to worry about the environment around them, because this is just their copy.

    What needs improvement?

    As the solution continues to evolve, the one thing I like about it is the API-friendly layers that they have added into the realm. So, I just would like to see more support around that and more usability. If we can continue to expand upon that, then it is going to open a lot of doors.

    What do I think about the stability of the solution?

    The stability of TDM is probably what I love about it the most. We have not had any major issues at any point since we have started utilizing the tool. Anytime we have had either a small bug or something that we would like to see as an enhancement, it has always been a quick call to the support team to work on it with me.

    What do I think about the scalability of the solution?

    TDM scalability has actually come a long way. I am really impressed with what they have done over the last couple of years for it. Specifically, what I really like is that, as they moved away from a thick client that was a little bit clumsy and cluttered, what we see now is a web portal that is built on top of an entire API framework. Utilizing those APIs gives us a chance to do whatever we want with the tool - not just its specific built-in functions through the UI layer, but anything I want to do, integrated with the open source tools out there as well.

    How are customer service and technical support?

    TDM's technical support, specifically CA's technical support as a whole, has been wonderful for us. I have close relationships at this point with any of the guys there. If ever I see a potential bug, if ever I have a question that I just can't figure out myself in a very short amount of time, or if there is a new enhancement that I want to have worked on, I am a quick phone call away or a support ticket away. I can honestly say that if I had gone the route of creating a ticket, which we all know can sometimes go into a black hole with some companies, I have always gotten a call back within an hour from CA's technical support solutions and they are a wonderful team to work with. 

    Which solution did I use previously and why did I switch?

    My company decided to invest in Test Data Manager because we were going through a large transformation. We were doing both a legacy system transformation and an agile transformation for our teams. Out of that spawned tons of productivity and tons of great morale boosting initiatives at the office. 

    It also created tons of havoc for someone in the test data realm, because now we were working with, instead of just a couple databases on a legacy system, every database under the sun that those teams decided to work with. 

    We decided to look at a bunch of different systems solutions that were out there, and we decided that TDM had the best full set of solutions available for us to be able work with the largest amount of our teams.

    It is neat because as we have continued (we made that decision a couple of years ago), the industry has continued one of its biggest pushes, which has been this idea of shifting left and doing continuous testing as part of a continuous delivery system. TDM has actually positioned itself very nicely to allow us to do that, because by doing things like the data creation, I can create a fully isolated database for people to work with at the earliest stages for unit testing and functional testing. Then, I can move them directly with the same solutions into a more integrated environment, and use the same tool set to build out all the data they are going to need there, and so on, as it moves down the line all the way into production.

    What other advice do I have?

    It is not an easy thing to give a 10 on a scale of one to 10 for ratings, but I have to say, I think the TDM solution is a 10.

    Most important criteria when selecting a vendor: I think when you are trying to choose a vendor to work with on some of these solutions, you really have to decide if they are going to work with you as well. CA has been a great partner for us, because they are always there to work with us. It is not them having a solution that they are trying to cram down your throat, instead it is them trying to figure out what your needs are and building solutions around that. 

    Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    it_user796329
    IT Manager at The Williams Companies, Inc.
    Video Review
    Real User
    Allows us to find the right test data and to get required inputs into our API test

    Pros and Cons

    • "TDM allows us to find the right test data for the test that we need, and then it also allows us to get the required data inputs into our API test, so that we can do a full test."
    • "​The scalability is outstanding. We're able to scale it to any size of data that we want. We can do small data sets, we can do large data sets."

      What is our primary use case?

      One thing that we're using Test Data Manager for is to build data marts so that we can test APIs of our application using users from every company within our application.

      How has it helped my organization?

      The benefits are that TDM allows us to find the right test data for the test that we need, and then it also allows us to get the required data inputs into our API test, so that we can do a full test.

      What needs improvement?

      One of the features that I wanted, which I think is going to be released, is to be able to create virtualized data sets, or virtualized databases. That's a feature we're going to take advantage of. All of our developers will be able to have their own virtual copy of a golden copy of our database, and be able to do transactions against their virtual copy, and then restore back to a known good checkpoint.

      What do I think about the stability of the solution?

      This solution has been very stable for us. We've gone through multiple upgrades of versioning, and each one of them gets progressively better. 

      What do I think about the scalability of the solution?

      The scalability is outstanding. We're able to scale it to any size of data that we want. We can do small data sets, we can do large data sets.

      How are customer service and technical support?

      On many occasions, we have sought CA's technical team to help us solve problems, and they've always been very responsive. A good relationship.

      Which solution did I use previously and why did I switch?

      Our team made the decision that we were going to get into DevOps and do test automation. As a way of providing our API tests with adequate data, we knew we needed a better solution than manually collecting data from databases. So we brought in Test Data Manager to work in conjunction with our App Test.

      What other advice do I have?

      If I were talking to my peer managers, I would recommend Test Data Manager - and I have, on multiple occasions - because it does allow the developer to have quick access to data that, normally, would take them hours or sometimes days to gather. 

      I would say TDM, on a scale of one to 10, is probably in the eight category. It's a very solid solution. I think it can do more for us, and we're always trying to find new ways of using Test Data Manager.

      Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
      it_user778785
      Lead Test Data Engineer at a healthcare company with 1,001-5,000 employees
      Real User
      Allows us to have quicker releases as well as speeds up our testing, but needs to get everything centralized into the portal

      Pros and Cons

      • "​The masking of data has really been key for us to be able to replace PII with fake values."
      • "Needs a better fit and finish."
      • "Get everything centralized into the portal."

      What is our primary use case?

      As part of the DevOps function, what we are trying to do is create subsetted, masked environments that are the basis for our development and testing. In the health care industry, we have a lot of concern with PII and PHI, so we are making sure that it is not exposed in non-production environments. We are also lessening the amount of disk space used for these environments, making things faster and speeding time to market.

      So far, it is performing well. We are still in our initial build out phase, but it is all coming together. 

      How has it helped my organization?

      We are still in our DevOps build-out phase. However, it will allow us to have quicker, more frequent releases with smaller content, as well as speed up our testing.

      What is most valuable?

      The masking of data has really been key for us, to be able to replace PII with fake values. Also valuable is being able to model the data and then subset it based upon a set of drivers that I use.
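
In outline, driver-based subsetting starts from a chosen set of driver rows and pulls only the related rows that reference them, as in the simplified sketch below; the member/claim schema is invented for the example.

import sqlite3

def subset_by_drivers(conn: sqlite3.Connection, member_ids: list):
    # Extract only the selected "driver" members and the claims that reference them,
    # so the subset stays small while keeping referential integrity intact.
    marks = ",".join("?" for _ in member_ids)
    members = conn.execute(
        f"SELECT * FROM member WHERE member_id IN ({marks})", member_ids
    ).fetchall()
    claims = conn.execute(
        f"SELECT * FROM claim WHERE member_id IN ({marks})", member_ids
    ).fetchall()
    return members, claims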

      What needs improvement?

      They are currently working on a lot of things that we have requested: simplifying the UI and making it more consistent, a better fit and finish, and getting everything centralized into the portal.

      For how long have I used the solution?

      Still implementing.

      What do I think about the stability of the solution?

      It is improving. We have become part of the design partner program for several of the tools and we really see them heading in the right direction. We look forward to some of the improvements that they are making.

      How are customer service and technical support?

      We have used technical support throughout: logging some tickets and some issues that we have had. As well as going into the communities page to give some recommendations. Technical support is really quick to respond and usually very helpful.

      Things have been relatively easy to either work around or they have been able to be identified pretty quickly and get flagged for future fixes.

      How was the initial setup?

      The initial setup was complex, because it is new to the company as well as new to me. It is a new role for me, so building it from the ground up certainly makes it complex.

      What other advice do I have?

      Feel free to ask questions. There are really helpful people that are willing to take the time to work with you and provide a solution. Think about it holistically. Do not think about it as a siloed approach to just solving one problem. Think about how it is going to fit into the larger DevOps organization. 

      CA's willingness to listen to customers has surprised me. Being part of this design partner program, I really feel like our concerns are being taken seriously and it is not just sales pitch stuff.

      What I personally like to see in a vendor: that we are being taken seriously, that our concerns are treated as valid, and that we see actual movement on them. This is the first vendor that I have worked with extensively, which has been good.

      Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
      it_user779256
      Solutions Architect at American Express
      Real User
      Allows me to generate and manage synthetic data, but the interface could be better

      Pros and Cons

      • "It allows us to create a testing environment that is repeatable. And we can manage the data so that our testing becomes automated, everything from actually performing the testing to also evaluating the results."

        What is our primary use case?

        Generate synthetic test data.

        It has performed fine. It provides us the capabilities that we were anticipating.

        How has it helped my organization?

        It allows us to create a testing environment that is repeatable. And we can manage the data so that our testing becomes automated, everything from actually performing the testing to also evaluating the results. We can automate that process. Plus, we're no longer using production data.

        What is most valuable?

        1. I am able to maintain metadata information based off of the structures and 
        2. I am able to generate and manage synthetic data from those.

        What needs improvement?

        The interface, based on our unique test case - because we are an extremely unique platform - could be better. We have to do multiple steps just to create a single output. We understand that, because we are a niche architecture, it's not high on their list, but eventually we're hoping it becomes integrated and seamless.

        As noted in my answer on "initial setup", I would like to see that I don't have to do three steps, but rather that it's all integrated into one. Plus, I'd like to know more about their API, because I want to be able to actually call it directly, and pass in specific information so that I can tune the results to my specific needs for that test case. And I'd like to be able to do it for multiple messages in one call.

        What do I think about the stability of the solution?

        Stability is fine. It's stable. It's not like it crashes or anything like that, because it's just a utility that we use to generate data. Once we generate the data, we capture it and maintain it. We don't use the tool to continually generate data, we only generate it for the specific test case, and then don't generate it again. But it gives us the ability to handle all the various combinations of variables, that's the big part.

        What do I think about the scalability of the solution?

        For our platform, scalability probably isn't really an issue. We're not planning on using it the way it was intended because we're not going to use it for continually generating more data. We want to only generate specific output that we will then maintain separately and reuse. So, the only time we will generate anything is anytime there is a different test case needed, a different condition that we need to be able to create. So, scalability is not an issue.

        How are customer service and technical support?

        Tech support is great. We've had a couple of in-house training sessions. It's coming along fine. We're at a point now where were trying to leverage some other tools, like Agile Designer, to start managing the knowledge we're starting to capture, so that we can then begin automating the construction of this component with Agile Designer as well.

        Which solution did I use previously and why did I switch?

        We didn't have a previous solution.

        How was the initial setup?

        The truth is that I was involved in setup but they didn't listen to me. "They" are other people in the company I work for. It wasn't CA that did anything right or wrong, it was that the people that decided how to set it up didn't understand.  So we're struggling with that, and we will probably transition over. Right now we have it installed on laptops, and it shouldn't be. It should be server based. We should have a central point where we can maintain everything.

        So, the setup is fairly straightforward, except for the fact that there are three steps that we have to go through. We have to do a pre-setup, a pre-process, then we can do the generation of our information, and then there's a post-process that we have to perform, only because of the unique characteristics of our platform.
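
Until the three stages are integrated into one, they can at least be chained behind a single command with a thin wrapper like the sketch below; the three script names are placeholders for the real pre-process, generation, and post-process steps.

import subprocess

def run_generation_pipeline():
    # Run the pre-process, generation, and post-process stages as one step,
    # stopping immediately if any stage fails.
    for step in ("pre_process.py", "generate_data.py", "post_process.py"):
        subprocess.run(["python", step], check=True)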

        Which other solutions did I evaluate?

        In addition to CA Test Data Manager, we evaluated IBM InfoSphere Optim. Those were the two products that were available to our company at the time when I proposed the idea of using it in this way.

        We chose CA because they had the capability of doing relationship mapping between data variables.

        What other advice do I have?

        The most important criterion when selecting a vendor is support. And obviously it comes down to: Do they offer the capabilities I'm interested in at a reasonable price, with good support.

        I rate it at seven out of 10 because of those three steps I have to go through. If they get rid of those, make it one step, and do these other things, I'd give it a solid nine. Nothing's perfect.

        For my use, based on the products out there that I have researched, this is the best one.

        Disclosure: I am a real user, and this review is based on my own experience and opinions.
        it_user778692
        Software Developer Engineer at a financial services firm with 5,001-10,000 employees
        Vendor
        It saves us time from generating the same amount of data in real-time

        What is our primary use case?

        What we do is generate data, such as Social Security numbers. We can also virtualize the bureaus that the software calls out to, so we can simulate a good response coming back.

        How has it helped my organization?

        It is just easy to manage from what I have seen so far.

        What is most valuable?

        It saves us time from generating the same amount of data in real-time.

        What needs improvement?

        There are some known bugs that I have found, but I think those are known issues. I think those issues just need to be worked out.

        For how long have I used the solution?

        Less than one year.

        What do I think about the stability of the solution?

        So far, so good.

        What do I think about the scalability of the solution?

        So far it has worked for our enterprise services, and we are pretty large. So, I would say it is fairly scalable at the moment.

        How was the initial setup?

        Initial setup was pretty straightforward. We had one user who was primarily working on it. She probably spent a few months initially setting it up, but that is just because we did not know the product at first and were working out all the kinks to make it work with our environment.

        What other advice do I have?

        I would say trial it out.

        Disclosure: I am a real user, and this review is based on my own experience and opinions.
        MS
        Network Engineer at a financial services firm with 1,001-5,000 employees
        Real User
        It scales very well to our network and we have a very large network

        What is our primary use case?

        Monitoring network devices using SNMP. It works very well. 

        How has it helped my organization?

        • Scalability
        • The ability to have multiple pieces of information on the same screen. 

        What is most valuable?

        • The flexibility
        • The ability to view the data the way we want it. 

        What needs improvement?

        More data visualization, the way that we are looking at data, we want to be able to see it in different ways. So, we are looking to expand the visualization of that data.

        What do I think about the stability of the solution?

        It is very stable. We have had issues, but we have worked through those issues with CA, and they have been successfully resolved. 

        What do I think about the scalability of the solution?

        It scales very well to our network. We have a very large network. Finding a solution that can actually monitor all the devices and interfaces, this product has been able to do that.

        How are customer service and technical support?

        Technical support is very good. They have performed to our expectations.

        Which solution did I use previously and why did I switch?

        We were previously using a different solution, however CA purchased that solution.

        How was the initial setup?

        Due to our environment, it was complex. The product itself is simple. 

        Which other solutions did I evaluate?

        SevOne.

        What other advice do I have?

        I would recommend this solution.

        Most important criteria when selecting a vendor: 

        • Stability
        • The size of the company
        • The ability to respond to our needs and meet our needs. 
        • The breadth of software that they have available for what we are looking to do.
        Disclosure: I am a real user, and this review is based on my own experience and opinions.
        it_user558576
        Engagement Manager at a tech services company with 5,001-10,000 employees
        MSP
        Synthetic data generation is outstanding

        What is most valuable?

        The synthetic data generation. By default you get the data, but then you have to modify it, and I find that it does that amazingly well. I have not seen that feature as capable in other tools.

        How has it helped my organization?

        Test data is a very, very important thing because there are a lot of challenges around it right now and it's very complex. Creating test data from scratch is going to be complex. But with DataFinder's synthetic data generation, the whole copy is created and then all the data manipulation is done on top of it.

        What needs improvement?

        I can't think of anything at this point in time.

        What do I think about the stability of the solution?

        Stability is very good. In my role as an Engagement Manager, I don't, on a day to day, use the tool. My team does that. But I have not heard any complaints.

        What do I think about the scalability of the solution?

        Scalability. It's good as well.

        How are customer service and technical support?

        I haven't used tech support personally, but my team does. I have actually been to the CA office in Scottsdale many times. The support is very good because we are, in many ways, a partner with CA.

        Which solution did I use previously and why did I switch?

We were using our own homegrown tool, DDC2, as well as IBM Optim in some areas.

        How was the initial setup?

        I was not involved in the initial setup.

        What other advice do I have?

In terms of advice, it depends on what you need. Based on our experience, this is a very good tool, especially when you need to take bulk data and make changes to it on the fly for testing. In that case, this is the tool to use.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user752190
        Senior System Engineer at a comms service provider with 10,001+ employees
        Vendor
        Can mask data according to your needs and statistical distribution

        Pros and Cons

        • "The whole process is done by functions which are compiled on the source environment itself. Normally, you take the data from the source, you manage them - for example, mask them - and then you load this masked data into the destination. With this solution, it's completely different. On the source environment, there are functions compiled inside the environment, which means they are amazingly fast and, on the source environment, data are masked already. So when you take them, you already take masked data from the source. So you can copy them, even with an unencrypted pipe."
        • "We are using a specific database. We are not using Oracle or SQL, Microsoft. We are using Teradata. There are some things that they don't have in their software. For example, when delivering data, they are not delivering them in the fastest possible way. There are some things which are faster."

        What is our primary use case?

Data masking, which is exactly what this tool was created for. We are going to use it to incorporate masked data into test and development environments.

We manage a lot of customer data, and the idea is not to approve or grant broad permissions to read all of this data. We need to mask it, but we still need to work with it, which means that developers need access to a lot of data.

We needed a tool where the data provided to developers is both easy to produce and anonymized. This is probably the only tool with so many sophisticated features. We need those features for masking/anonymizing data according to a statistical distribution and for preparing large amounts of test/dev data.

        How has it helped my organization?

        This tool is super fast and it has solved many of our issues. It is also much better than many other solutions which are on the market. We've already tested different ones, but this one looks the best currently.

        We can deliver, first, securely; second, safely; and third, without extra permissions. We don't need to go through a whole procedure so that developers have permission to access production data. It's not needed anymore. And it will work with production data because it's almost the same data but, of course, not real. The structure of the data is the same and the context of the data is the same but the values are different.

        The features are very technical and are definitely what we need. We've got some rules, especially from security, from compliance, but we need to take care of our customer data, very securely, and subtly. There is no other product that gives you these opportunities.

        What is most valuable?

        • Masking of data. 
        • There are lots of filters, templates, vocabularies, and functions (which are very fast) to mask data according to your needs and statistical distribution, too.

The functionality of this tool has changed the way we work. We need to manage the data and have developers work on realistic data. On the other hand, you don't want to give this data to the developers, because it is customer data that developers shouldn't see. This tool can deliver an environment which is safe for developers. Developers can work on a large amount of proper, realistic data, but although it is realistic, it is not real, because it is masked. For the developer it is perfectly usable because, instead of a customer's real date of birth, he gets a different date of birth; it is realistic data but not the exact data, since it is already masked.

        The whole process is done by functions which are compiled on the source environment itself. Normally, you take the data from the source, you manage them - for example, mask them - and then you load this masked data into the destination. With this solution, it's completely different.

        On the source environment, there are functions compiled inside the environment, which means they are amazingly fast and, on the source environment, data are masked already. So when you take them, you already take masked data from the source. So you can copy them, even with an unencrypted pipe.

        These are two pros you cannot find anywhere. Most tools - for example, Informatica - are taking data as they are, in the original, not masked form, then on the Informatica server you need to mask them, and then you're sending them to the destination. Here, in TDM, you already take masked data.
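To make this pattern more concrete, here is a minimal sketch of masking at the source, assuming a generic customer table and a hypothetical mask_email() rule; it uses Python's built-in sqlite3 as a stand-in database, not Teradata or the actual TDM/Fast Data Masker functions. The point it illustrates is that the masking expression runs inside the source engine, so rows are already masked by the time they are selected for copying.

import hashlib
import sqlite3

# Hypothetical masking rule: deterministically replace an email address
# with a realistic-looking but fake one.
def mask_email(email: str) -> str:
    token = hashlib.sha256(email.encode()).hexdigest()[:10]
    return f"user_{token}@example.com"

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE customer (id INTEGER, email TEXT)")
source.executemany("INSERT INTO customer VALUES (?, ?)",
                   [(1, "alice@bank.example"), (2, "bob@bank.example")])

# Register the rule inside the source engine itself, so the database does the masking.
source.create_function("mask_email", 1, mask_email)

# The extract query returns masked values only; these rows can be copied to a
# test environment without the real values ever leaving the source unmasked.
for row in source.execute("SELECT id, mask_email(email) FROM customer"):
    print(row)

Contrast this with the extract-then-mask pattern mentioned later in this review, where raw rows are first pulled to an intermediate server, masked there, and only then loaded into the destination.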

        What needs improvement?

If you want to automate something, you need to figure it out yourself; there is no easy way (the software is Windows-only). I am missing command-line tools or an API for the software.

        The software is working on Windows and, from some perspectives, that might be a problem. From our perspective, it is a problem because we need to have a different team to deploy for our Windows machines. This is a con from our perspective. Not a big one, but still.

        They have already improved this product since our testing of it, so it may be that the following no longer applies.

The interface is definitely one you need to get used to. It is not like a modern interface that is clear and easy to scan; it is an older-style interface that you have to get to know.

Also, we are using a specific database; we are not using Oracle or Microsoft SQL Server, we are using Teradata. There are some things that they don't have in their software. For example, when delivering data, they are not delivering it in the fastest possible way. There are some methods which are faster.

        We asked CA if there would be any possibility to implement our suggestions and they promised us they would but I haven't seen this product for some time. Maybe they are already implemented. The requests were very specifically related to the product we have, Teradata. This was one of the real issues. 

        Overall, there was not much, in fact, to improve.

        For how long have I used the solution?

        Less than one year.

        What do I think about the stability of the solution?

        We didn't face any issues with stability.

The only problems we had, and we asked CA to solve, were some very deep things related to our products. They were not core issues. It was, "We would like to have this because it's faster, or that because it's more robust or valuable."

        What do I think about the scalability of the solution?

I cannot answer because we only did a PoC, so I have no idea how it will work if there are a couple of designers working with the tool.

        Still, I don't see any kind of issues because there will be only a few people working with the design of masking and the rest will be done on the scripting level, so it's possible we won't see it at all. 

        How are customer service and technical support?

        During the PoC we had a support person from CA assigned to us who helped in any way we needed.

        Which solution did I use previously and why did I switch?

We didn't use any other solution; we simply needed to have something implemented and tried to figure it out. We looked at the market for what we could use. TDM was our very first choice.

        How was the initial setup?

        I didn't do the setup by myself, it was done by a person from CA. It didn't look hard. It looked pretty straightforward, even with configuration of the back-end database.

        Which other solutions did I evaluate?

After doing our PoC, we tried to figure out whether there was any other solution that might fit. From my perspective, as the person responsible for the whole project, there was no solution we could use in the same or even a similar way. This product fits our compliance and security requirements very tightly, which is important.

There aren't any real competitors on the market. I think they simply found a niche and started to develop it. We really tried; there are many options out there, but some features are specific to this product, features you might need if, for example, you work for a big organization. And those features aren't in any other product.

There are many solutions for masking data; there are even very basic Python modules you can use, but with those you need to take data from the source, mask it, and deliver it to the destination. If you have a big organization like ours and you have to copy one terabyte of data, that will take hours. With this solution, that terabyte is done in a couple of minutes.

        What other advice do I have?

        We did a proof of concept with TDM to see if the solution fits our needs. We did it for a couple of months, did some testing, did some analysis, and tried to determine if it fit our way of working. Now we are going to implement it in production.

        If there is a big amount of data to mask and you need to deliver it conveniently, pretty easily, there is no other solution. Configuration is easy. It's built slightly differently, the design is slightly different than any other tool, but the delivery of the masked data is much smoother than in any other solution. You don't need to use something like a stepping stone. You don't need to copy data to some place, then mask it, and then send it, because you copy data which is already masked. Data is masked on the fly, before they are copied to the destination. You don't need anything like a server in the middle. In my opinion, this is the biggest feature this software has.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user572823
        AVP Quality Assurance at GM Financial
        Video Review
        Vendor
        Gives you confidence in data that you're creating and keeps you out of the SOX arena, because there's no production data within that environment.

        What is most valuable?

        Test Data Manager allows you to do synthetic data generation. It gives you a high level of confidence in your data that you're creating. It also keeps you out of the SOX arena, because there's no production data within that environment. The more that you can put in controls and keep your data clean, the better off you are. There are some laws coming into effect in the next year or so that are going to really scrutinize production data being in the lower environments.

        How has it helped my organization?

We have certain aspects of our data that we have to self-generate. The VIN is one that we have to generate, and we have to be able to generate it on the fly. TDM allows us to generate that VIN based upon whether it's a truck, car, etc. We're in the car and auto loan business.
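As a purely illustrative sketch of what "generating a VIN on the fly" can involve (this is not the reviewer's or TDM's actual rule), the following snippet builds a random, structurally plausible 17-character VIN and fills in the position-9 check digit using the publicly documented transliteration and weighting scheme; the WMI prefix is an arbitrary assumption.

import random

# Transliteration table and positional weights from the public VIN
# check-digit scheme; position 9 carries the check digit (weight 0).
TRANSLIT = {**{str(d): d for d in range(10)},
            **dict(zip("ABCDEFGH", range(1, 9))),
            **dict(zip("JKLMN", range(1, 6))), "P": 7, "R": 9,
            **dict(zip("STUVWXYZ", range(2, 10)))}
WEIGHTS = [8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2]
ALPHABET = "ABCDEFGHJKLMNPRSTUVWXYZ0123456789"  # I, O, Q never appear in a VIN

def check_digit(vin: str) -> str:
    # Weighted sum modulo 11; a remainder of 10 is written as 'X'.
    remainder = sum(TRANSLIT[c] * w for c, w in zip(vin, WEIGHTS)) % 11
    return "X" if remainder == 10 else str(remainder)

def synthetic_vin(wmi: str = "1GT") -> str:
    # wmi is a hypothetical 3-character world manufacturer identifier.
    body = "".join(random.choice(ALPHABET) for _ in range(5))   # positions 4-8
    tail = "".join(random.choice(ALPHABET) for _ in range(8))   # positions 10-17
    draft = wmi + body + "0" + tail                             # "0" is a placeholder at position 9
    return draft[:8] + check_digit(draft) + draft[9:]

print(synthetic_vin())

A rules-based generator of the kind described here would additionally constrain the vehicle-type, model-year, and plant positions (truck vs. car, and so on) rather than choosing them at random.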

        What needs improvement?

        I would probably like to see improvement in the ease of the rule use. I think sometimes it gets a little cumbersome setting up some of the rules. I'd like to be able to see a rule inside of a rule inside of a rule; kind of an iterative process.

        What do I think about the stability of the solution?

        TDM has been around for a couple of years. I used it at my previous company, as well. It's been really stable. It's a tool that probably doesn't get utilized fully. We intend on taking that, partnering it with the SV solution and being able to generate the data for the service virtualization aspect.

        What do I think about the scalability of the solution?

        Scalability is similar along the SV lines; it's relatively easy to scale. It's a matter of how you want to set up your data distribution.

        How are customer service and technical support?

        We were very pleased with the technical support.

        Which solution did I use previously and why did I switch?

When you have to generate the loan volume that we need – 50 states, various tax laws, etc. – I needed a solution with which I can produce quality data that fits the targeted testing we need, any extra test cases, and so on. We're concentrated on being very succinct in the delivery and in the time frame we need to get the testing done.

I used CA at my previous company. I have a prior working relationship with them.

        How was the initial setup?

        The initial setup was done internally. Obviously, the instructions that were online when we downloaded it, we were able to follow those and get the installation done. We did have a couple of calls into the technical solution support area and they were able to resolve it fairly quick.

        What other advice do I have?

A lot of times, generating synthetic data can be cumbersome. With TDM and its rules, you can generate the data with your rules in place so that you know your data is going to be very consistent. When we want a particular loan to come through with a particular credit score, we can generate the data. We can select and generate data out of TDM that creates a data file for my in-front script, using DevTest.

        I also push the service virtualization record to respond to the request of the loan, hitting the credit bureau, returning a certain credit score, which then gets us within that target zone for that loan we're looking for, to trigger a rule.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user572907
        Senior Specialist at Cox Automotive
        Video Review
        Vendor
        The data masking is a powerful aspect of the tool and I have found the best success in the data generation features.

        What is most valuable?

        A lot of people, when they first started looking at the tool, started immediately jumping in and looking at the data masking, the data subsetting that it can do, and it works fantastically to help with the compliance issues for masking their data. That's a very powerful aspect of the tool.

        But the part I found the best success in is actually the data generation features. In really investing into that concept of generating data from the get-go, we can get rid of any of those concerns right off the bat, since we know it's all made-up data in the first place.

We can fulfill any team's request to very succinct and specific requirements each time. When I look at it as a whole, it's that data generation aspect that really is the big win for me.

        How has it helped my organization?

        When I look at the return on investment, there are not only huge financial gains on it. In fact, when I recently ran the numbers, we had about $1.1 million in savings on just the financials from 2016 alone. What it came down to is, when we started creating our data using Test Data Manager, we reduced our hours used by about 11,800 in 2016. That's real time. That's a significant, tangible benefit to the company.

        When you think about it, that's somewhere around six employees that you've now saved; let alone, you have the chance to focus on all the different testing features, instead of having them worrying about where they're going to get their test data from.

        What needs improvement?

        It's cool that right now with this tool, they're doing a lot of things to continuously improve it. I think Test Data Management as a strategy across the whole organization, has really picked up a lot of momentum, and CA’s been intelligent to say, "We have a really great product here, and we can continue to evolve it."

Right now, they're taking everything from a desktop client and moving it into a web portal. I think there's going to be a lot of flexibility in that. The one thing I am hoping they improve on – it is a great database tool – is its programmatic abilities. Specifically, it's great in terms of referential integrity across multiple systems and multiple tables, but I do find a couple of limitations every now and then, because in trying to maintain that referential integrity I have to go in and manually handle the cases where I actually want to break things.

        For how long have I used the solution?

        I've been using it for about two-and-a-half years at my current position, and I've actually been familiar with the tool for about the last five or six years.

        What do I think about the stability of the solution?

        The stability is wonderful on it. I don't think that, at any point, have I had a showstopper issue with the application. It's never caused any major issues with our systems, and I will give credit where credit's due. Even right now, as they continue to enhance the tool, it has still stayed wonderfully stable through that process, and everyone on CA’s side has been there to support on any kind of small bug or enhancement that might come up along the way.

        What do I think about the scalability of the solution?

        It has scaled tremendously. Especially, again, I don't want to harp back too much on it, but when you start looking at data generation, your options are endless in the way you want to incorporate that into your environment.

        I have my manual testers utilizing this to create data on the fly at any moment. I have my automation users who are going through a little bit more of it, getting daily builds sent to them. I have more performance guys sending requests in for hundreds of thousands of records at any given time, that might have taken them two weeks to build out before, that I can now do in a couple hours. It ties in with our pipelines out to production.

        It's a wonderful tool when it comes to the scalability.

        How are customer service and technical support?

        Any time that I've had something that I question and said, "Could this potentially be a bug," or even better, "I would love this possible enhancement", it's been a quick phone call away or an email. They respond immediately, every single time, and they communicate with me, look at what our use case is on the solutions, and then come up with an answer for me, typically on the spot. It's great.

        Which solution did I use previously and why did I switch?

        We knew we needed to invest in a new solution because our company was dealing with a lot of transformations. Not only do we still have a large root in our legacy systems, that are the iSeries, DB2-type of systems, but we have tons and tons of applications that have been built on a much larger scale in the past 40 years, since the original solutions were rolled out. Not only did we have a legacy transition occurring within our own company, but we also changed the way that our teams were built out. We went from teams that were a waterfall, iterative, top-down approach, to a much more agile shop.

        When you look at the two things together, any data solution that we were using before, maybe manual hands on keyboards, or automated scripts for it, just weren't going to cut it anymore. They weren't fast enough, and able to react enough. We started looking at it and realized that Test Data Manager by CA was the tool that could actually help to evolve that process for us.

        When selecting a vendor, I wanted someone that I'm going to have actually some kind of personal relationship with. I realized that we can't always have that with everyone that we're working with, but CA has done a wonderful job of continuously reaching out and saying, “How are you doing? How are you using our product? How do you plan on using our product? Here's what we’re considering doing. Would that work for you?" They've been a wonderful partner, in terms of communication of the road map of where this is all going.

        How was the initial setup?

        It's a great package that they have out there. It's a plug-and-play kind of system, so it executes well on its own to get up and running in the first place. When they do send releases in, it's as simple as loading the new release.

        What's kind of neat about it is, if they do have something that needs to be upgraded on an extension of the system, some of the repositories and things like that, it's smart enough to actually let you know that needs to happen. It's going to shut it down, take care of it itself, and then rebuild everything.

        Which other solutions did I evaluate?

        We evaluated other options when we first brought it in. We looked at a couple of the others. The reason that we ended up choosing Test Data Manager was that it was stronger, at the time at least, in its AS/400 abilities, which is what all of our legacy systems are built on. It was much more advanced than anything else that we were seeing on the market.

        What other advice do I have?

        It’s not something that I would often give, but I do give this a perfect rating. We've been able to solve any of the data issues that we were having initially when we first brought it in, and it's expanded everything that we can do as we looked into the future right now of where we want to go with this. That includes its tie-ins for service virtualization; that includes the way that we can build out our environments in a way that we'd never considered before. It's just always a much more dynamic world that we can react a lot faster to, and attribute most all of that to Test Data Manager.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user572886
        Client Partner at a financial services firm with 11-50 employees
        Video Review
        Vendor
        Provides a centralized view of the test data and how efficiently you can use it across business units.

        What is most valuable?

The most important feature I see is having a centralized view of the test data and how efficiently you can use it across different business units: from generating the data you need, to reusing it repetitively, to growing on top of the base data you create. TDM is very, very efficient.

        How has it helped my organization?

My company is going through a process to become more agile, which was basically the theme of a recent CA conference I attended. While we are on that agile journey, there are some building blocks that need to be in place. Test data is very important throughout the whole product lifecycle: how good the data is, and how efficiently you can run those test cases again and again. The features that TDM gives fit right into what we are looking for on that journey.

        What needs improvement?

One thing we would like to see is how quickly it can be used like a SaaS product: being able to just plug in the incoming data we have from different sources, and seeing how quickly that can be integrated and the test data generated. That quickness is something that can be improved.

        If plugins can be developed very quickly, that will help companies like us, because we have hundreds of data sources.

        What do I think about the stability of the solution?

Looking at the use cases we have seen and some of the customer testimonials, it shows the journey the product has gone through and the quality it has. From some of the proofs of concept we have done, working with the technology people, we can see the stability of the product and how useful it can be to our company.

        What do I think about the scalability of the solution?

One of the main features of TDM is how it can scale from a small organization up to a very big organization. In our company, that is everything. It is because of that very feature, scalability, that we are considering TDM.

        How are customer service and technical support?

        We have not actually used technical support, because we had gone through some of the proof of concepts, as I’ve mentioned. We were already working with some of the Test Data Management group. We are in the process of finalizing the last couple of products that we are looking at and TDM definitely is at the top.

        Which solution did I use previously and why did I switch?

We were previously using several products, including some in-house products. From my previous experience working with CA, I knew some of the products they offer. During the RFP process, and as old contracts were being renewed and new ones signed, it came to mind, and that's how we turned to CA and started this engagement.

        How was the initial setup?

        We are here to buy it, as I’ve mentioned, but there was an initial setup to do the proof of concepts. I was involved in it with some of their technology people.

        It was easy I think because of the experts who were there. They know how to interact with people who don't know the product. When we evolved through that product knowledge, I think they also took us through the journey. It was very easy to interact with them.

        Which other solutions did I evaluate?

There were other vendors on our shortlist and, as I've mentioned, scalability was one of the main reasons, because we are growing faster than ever; the data is growing by terabytes and terabytes. Secondly, if you look at the market, the rating for TDM is very good. It was a unanimous decision that TDM should be one of the final products we go for.

        What other advice do I have?

Run a proof of concept of this product for your organization. The product is very agile; it can fit into small and big organizations, so don't be afraid of that. The product has a lot of features, and the way the product works is through those features. It can work for you as a smaller organization, and it can work for a very large organization; the scalability is there. Just use it in a proof of concept and you'll see the power of TDM.

        If we are able to get those – maybe it is there, and we just have to see it in my company – plugging in from all of those different sources, I think that will definitely make it 100% the product that we're looking for.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user558156
        Quality Assurance at a logistics company with 1,001-5,000 employees
        Vendor
        With synthetic data generation, we can test applications with three or four times the production load. We would like to see it generate synthetic data for non-relational DBs.

        What is most valuable?

One of the most valuable features to us is synthetic data generation. We generate a lot of synthetic data for our performance testing, bulking up our performance environments to see how much load they can sustain. We've been doing it for relational data structures.

At a recent conference, I was talking to the product management team. We have a big use case for synthetic data generation for non-relational data structures. They have it on their roadmap, but we would love to see that coming out very soon. With modernization, relational databases are going away and non-relational databases are coming up. That's a big use case for us, especially with our graph database. We have a big, huge graph database, and we need to generate a lot of synthetic data for it.

        How has it helped my organization?

        It has really changed the culture in the company because nobody could ever imagine generating millions of records. Even production systems have just a couple of million records. When you want to test your applications with three or four times the production load, you can never actually achieve it because there is no other way besides synthetic data generation. You can’t have that volume of data in your DBs. Even if you subset your entire production, you would get just one X of it. To get three or four X of it, you have to go to either data cloning or to synthetic data generation.

        What needs improvement?

The solution can really improve on non-relational data structures, because that's a big industry use case we are foreseeing. I'm talking about databases, and also about request-response pairs, i.e., generating data for services. We use it so much for virtualization. If we could create web-service request-response pairs non-relationally, supporting GET, POST, and so on, that would be a big win for us.
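To illustrate what a synthetically generated request-response pair for service virtualization might look like, here is a small, hypothetical sketch; the endpoint, field names, and JSON shape are assumptions for illustration and not TDM's or DevTest's recording format.

import json
import random
import uuid

def synthetic_pair(method: str = "GET") -> dict:
    # Generate one fake request/response pair for a hypothetical customer-lookup service.
    customer_id = random.randint(100000, 999999)
    request = {"method": method,
               "path": f"/customers/{customer_id}",
               "headers": {"Accept": "application/json"}}
    if method == "POST":
        request["body"] = {"name": f"Customer {customer_id}",
                           "tier": random.choice(["GOLD", "SILVER"])}
    response = {"status": 200,
                "body": {"customerId": customer_id,
                         "accountId": str(uuid.uuid4()),
                         "balance": round(random.uniform(0, 10000), 2)}}
    return {"request": request, "response": response}

print(json.dumps(synthetic_pair("POST"), indent=2))

Pairs like these, generated in bulk with consistent keys across them, are what would let a virtual service answer GET and POST calls without any relational backing store.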

        For how long have I used the solution?

        I've been using CA Test Data Manager since it was first released as Datamaker about 2.5 years ago. I've been using it pretty regularly since then. It has undergone a big, big transformation. There is a lot of good stuff coming up.

        What do I think about the stability of the solution?

We still use the old desktop-client version of it, but we have seen the demos as it moves to the web interface. I think it's going to be very stable down the line.

        What do I think about the scalability of the solution?

It is not very scalable, because generating even a couple of million records takes six to seven hours. If cloud computing power could be included – if the synthetic data generation could be done on a cloud instance (it's all synthetic data, so nothing in it is PII), with many GB of RAM in memory – that would be great for us.

How are customer service and technical support?

        Technical support is getting better. It's getting better and slower at the same time. That is because when I started my interaction with Grid Tools, it used to work on the bleeding edge of technology. Whatever enhancements we used to submit, the turnaround time was a couple of weeks and we would get whatever we need, whatever new features we needed. The processes were really ad-hoc. Rather than writing support tickets, you would literally reach out to somebody who you know who really works on the product. You reach out to them and they keep passing your ticket or enhancement request from person to person. Now the process is very much streamlined, but we have lost that turnaround time capability.

        What other advice do I have?

        When selecting a vendor, my personal requirements would be: the tools should be stable and there should be a knowledge repository for it. When you see the PPT presentation, it just gives you an introduction about the tool and it gives you the capabilities of the tool. To really get your hands dirty, you need an intense video or documentation to work on it.

I think the more webinars you do, the better. If you can record the webinars and archive them, that would be great. If you could try to solve some more complex use cases in your demos, that would also be great. Most companies give you a demo of new features with zero complexity. Then, when you are looking at the demo and trying to solve your own use cases, you just get stuck; you can't proceed any further because your use cases are more complex than what was shown in the demo. If they could come up with more in-depth videos that show real, complex use cases, that would be great.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user558477
        IT Manager at a financial services firm with 1,001-5,000 employees
        Vendor
        Data masking, subsetting, and synthetic data generation cover our needs. They should have more than just desktop-based interfaces.

        What is most valuable?

        We started the TDM journey due to our data masking needs.

        How has it helped my organization?

        It doesn’t only cover our data masking needs, but also our data subsetting and synthetic data generation needs. Finally, it gave us the idea of making test data as a service within our organization.

        What needs improvement?

The current version of the product is composed of several sub-applications, and the UI screens of those apps are not user-friendly enough. We expect the existing product to be merged into a single web-based application supporting HTML5.

        For how long have I used the solution?

        I have about three months of experience. We haven’t installed the product yet. We are at the beginning of the implementation phase.

        What do I think about the stability of the solution?

        We are not using it right now. This is the first time we are using the TDM solution. We are just about to install the product. We have not seen any stability issues yet.

        What do I think about the scalability of the solution?

        We will see about scalability when we implement the product. We are just beginning the implementation phase. We are just about to install the product right now in the production environment.

How are customer service and technical support?

        For technical support, we are fine.

        Which other solutions did I evaluate?

        We reviewed different products, like Informatica TDM or IBM Optim. Finally, we decided on the CA TDM solution. This looks like a very Agile product that continues to be enhanced.

Inside TDM there are different kinds of sub-applications, such as Datamaker. Now they are moving to a single web application interface for TDM.

        Disclosure: I am a real user, and this review is based on my own experience and opinions.
        it_user466854
        Practice Leader - DevOps at CIBER
        Consultant
        We use it to assist our clients with data privacy and the regulatory recommendations.

        What is most valuable?

        The most valuable features for us are masking, data profiling, and creating data subsets. More specifically, we are able to assist our clients with data privacy and the regulatory recommendations that come from the government. We help them to comply with PI, IP, HI and PCI regulations.

        How has it helped my organization?

        CA Test Data Manager is enormously helpful to us. We assist our customers by speeding up the application development process using real-time test data and synthetic test data, which mimics the real test data.

        What needs improvement?

        Integration

        What do I think about the stability of the solution?

        CA Test Data Manager is pretty stable, but integration is where we are looking for some improvements.

        What do I think about the scalability of the solution?

        It is fairly scalable for the implementations I've participated in. We haven't yet utilized the current available capacity.

        How are customer service and technical support?

        I would give technical support 8/10. Generally, we get a solution to an issue, but we have to go through multiple iterations before we get a complete resolution.

        Which solution did I use previously and why did I switch?

        Previous to implementing Test Data Manager all our work was done manually. We used custom SQL scripts, but because of ICD regulatory recommendations, we switched to Test Data Manager.

        How was the initial setup?

        Initial setup was complex in comparison to other solutions for which we did proof-of-concept. There are a lot of contact points with the TDM suite, which I personally felt increased the complexity.

        Which other solutions did I evaluate?

        We evaluated Delphix and IBM, as well as CA Test Data Manager. One of the reasons we chose CA, aside from the fact that we are CA partners, is due to support for PCI and PHI in terms of faster test data generation. The biggest differentiation was in generating test cases from the data. CA implemented this for test matching and then integrated it with Agile Requirements Designer. That tipped the scales in favor of CA TDM.

        When choosing a vendor, we look for continuous innovation and continued support. Continuous innovation can release features into the market ahead of other vendors. So that's something we always look for.

        What other advice do I have?

        My recommendation is to perform a detailed evaluation. If only simple, straightforward, and small-scale test data management is needed, I don’t think a large solution such as CA TDM is necessary. To justify the cost of CA TDM, you need to have need for large-scale test data management.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user558504
        Consultant at a tech consulting company with 51-200 employees
        Consultant
        Provides stable automation tests. Generates and scrubs test data.

        What is most valuable?

        The most valuable feature of the product is its ability to generate and scrub test data. You can use it to pull data down from production and remove the customers' private data. There is reliable automation so that your automated tests can run over and over again without failing due to bad data. It has a fairly intuitive UI and there is support when you need it.

        How has it helped my organization?

        Automation tests are more stable so they can be run more continuously.

        What needs improvement?

I would like to see more support for generating data for load and performance testing.

        What do I think about the stability of the solution?

        This product is stable enough.

        What do I think about the scalability of the solution?

        We haven't had any issues so far, so I guess I'd say I'm comfortable with the scalability right now.

        Which other solutions did I evaluate?

        We didn't have a solution in place before we selected this software. Automation testing was new and as soon as the automation tests started being developed, the need for data came very quickly. I think being able to locate help and people that know how to do implementation and setup when you get stuck brought us to this solution. At some point, you need to decide to either build or buy.

        What other advice do I have?

        When you do your implementation, find an experienced implementer to get it set up and get it done right the first time.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user542772
        COE Consultant Test at a financial services firm with 10,001+ employees
        Consultant
        It has removed the dependency of making production data available for development and testing activities.

        What is most valuable?

        Data masking and synthetic data generation.

        How has it helped my organization?

By using this product we are able to provide test data for the development and testing teams. It has removed the dependency on making production data available for development and testing activities. Using data masking techniques, we can comply with the rule of non-disclosure of personally identifiable information.

        What needs improvement?

        Automating repetitive tasks.

        For how long have I used the solution?

        More than one year.

        What was my experience with deployment of the solution?

        By using different functionalities of CA Test Data Manager, we were able to mask and deploy the data very easily to various environments.

        What do I think about the stability of the solution?

        Product is quite stable but it has some functional bugs which are fixed as soon as they are reported to the support team.

        What do I think about the scalability of the solution?

Yes, the product is quite capable of satisfying our data masking and synthetic data generation requirements.

        How are customer service and technical support?

        Customer Service:

        8 out of 10

        Technical Support:

        9 out of 10, all our queries and functional defects were resolved within very little time. CA technical support people are proactive and we get the fixes in very little time.

        Which solution did I use previously and why did I switch?

        No, we didn’t use any other tool beforehand.

        How was the initial setup?

Initial setup was pretty straightforward. We just require a license and a native server; after installation, the product is available to all users on that server.

        What about the implementation team?

        We implemented this in-house.

        Which other solutions did I evaluate?

        No, we didn’t evaluate other options. We had researched this tool and then chose it for our requirements.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        Adam Topper
        Senior Test Data Management Specialist at a transportation company with 10,001+ employees
        Real User
        We have moved data creation from manual or limited and costly automated processes to a set of weekly data builds and an On-Demand offering capable of delivering versatile data.

        What is most valuable?

        Synthetic Data Creation with use of flexible built in data functions.

        How has it helped my organization?

        We have moved data creation from manual or limited and costly automated processes to a set of weekly data builds and an On-Demand offering capable of delivering versatile data that meets the needs of our many teams.

        What needs improvement?

An increase in the programmatic capabilities could make the tool more powerful. For instance, data inserts in one table are often contingent upon entries or flags in another. In these situations, there is no way to choose to include/exclude a row based on the primary table.
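For readers unfamiliar with the gap being described, here is a minimal sketch of the kind of parent-flag-dependent insert the reviewer wants, expressed outside the tool in plain SQL via Python's sqlite3; the table and column names are hypothetical.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, needs_invoice INTEGER)")
db.execute("CREATE TABLE invoices (invoice_id INTEGER PRIMARY KEY AUTOINCREMENT, order_id INTEGER)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1), (2, 0), (3, 1)])

# Include or exclude the child row based on a flag in the primary (parent) table.
db.execute("""
    INSERT INTO invoices (order_id)
    SELECT order_id FROM orders WHERE needs_invoice = 1
""")

print(db.execute("SELECT * FROM invoices").fetchall())  # child rows only for orders 1 and 3

The ask, in effect, is for the data-generation rules to support this sort of conditional include/exclude natively rather than requiring a manual post-processing step.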

        For how long have I used the solution?

        2.5 years

        What was my experience with deployment of the solution?

        The tool installs in a snap and includes test repositories that allow for new users to start working with the application immediately.

        What do I think about the stability of the solution?

        The stability of the tool has never been an issue. Any time a possible defect has surfaced, the support team was quick to respond. Beyond that, there have been constant new versions created that provide optimizations.

        What do I think about the scalability of the solution?

        The many databases supported and data delivery formats available provide a seemingly endless supply of options to meet the ever growing demand of our testing teams.

        How are customer service and technical support?

        Customer Service:

        Above and beyond that of any company that I’ve worked with before. I’ve never been more than an hour or two without a response to a standard ticket creation.

        Technical Support:

Also above the standard. Those who support TDM have an intimate knowledge of the product and its many available use cases.

        Which solution did I use previously and why did I switch?

        All previous solutions were homegrown and they missed the complete solution we found in CA’s TDM.

        What about the implementation team?

Our process had ups and downs as we attempted to get TDM off the ground. The winning combination for us was TDM experts from a vendor partner, Orasi Software, Inc., working hand in hand with employees who had intimate knowledge of our systems.

        Which other solutions did I evaluate?

This tool had been purchased by another group in our company, but its potential was not realized.

        What other advice do I have?

Our biggest win in implementing this tool was to start out by working with a single team and finding some data delivery wins. After that internal proof of concept was realized, expanding to other teams became much simpler.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        it_user396528
        Sr Test Manager at a transportation company with 501-1,000 employees
        Video Review
        Vendor
        In the past, creating test data was a manual process. We now use CA TDM to capture our data model, and then create data on demand.

        What is most valuable?

        There are a lot of great features with Test Data Manager. It's allowed us to really revolutionize the way that we are doing our testing. We use Test Data Manager to create synthetic data for our testing purposes. We have a legacy application that has been around for a very long time, and a lot of the people that originally developed it are no longer with the company, so we've had to do a lot what we call, "Data Archaeology," to go in and recreate our data model. Test Data Manager has allowed us to do that, capture that model, and then create that data on demand, saving us a huge amount of time and money in the process.

        How has it helped my organization?

It's really allowed us to focus on the craft of testing, and not on creating data, spending time setting up the scenarios we need, or trying to find the right data. It's allowed us to be more focused in our efforts and to be faster. We run an agile shop, so we've been able to use that data in our pipeline as we deliver in a CI/CD world. It's been great.

        What do I think about the stability of the solution?

        It's been very stable. We haven't had any issues with the technology at all.

        What do I think about the scalability of the solution?

        It has scaled for us very well. We're able to do small amounts of data and large amounts for performance testing.

        How are customer service and technical support?

        It's been great. They're very knowledgeable in the tool, they've helped us overcome some hurdles as we've been evolving in the practice ourselves and they've been super.

        Which solution did I use previously and why did I switch?

        Test data creation, for us, was a very manual process. The options were either creating it with hands-on keyboards with people actually having to spend time creating it, and we would also do some things where we would just grab data out of production. That didn't always work for us, because you didn't know what you were going to get when you captured that data.

Not being able to generate your own data meant it was very time consuming to create data by hand. And when you did things like grab data out of production, you didn't always get all the data combinations you might need; it was just whatever happened to be there at the time. You certainly weren't getting some of the negative scenarios that we need to test with, either.

        How was the initial setup?

CA and one of our vendor partners came in and helped us do a proof of concept. They helped set up the infrastructure, walked us through the application, and actually proved out that we could do it in our environment, because it was, for us, an interesting environment. I wouldn't say it was easy, but we were able to get it done.

        What was our ROI?

The return on investment for us has been great. In the last year, we captured some metrics around our ROI, and we've saved over eleven thousand hours of manual time spent trying to create test data. That same eleven thousand hours has translated to about eleven million dollars in cost savings, just from that. There are additional benefits around shortening our test cycle time, going deeper and further into our testing, and freeing people up to do the work they're paid to do, rather than creating data just so that they can get to that work.

        Which other solutions did I evaluate?

It was a green field for us; we really didn't know what was out there, so we looked at everything available in the space and chose CA because they had the best offering supporting the most technologies. We have everything from legacy RPG/iSeries systems all the way up to applications in the cloud, and we needed something that would scale across all of those.

        What other advice do I have?

Important criteria when selecting a vendor: we want to work with somebody who is going to help us solve the problem we're trying to solve, and do it in a way that works for our enterprise. We don't want something that only works for one team or one solution; we want something that's scalable. We have a lot of different technologies in our organization, so we need something that will work with all of them.

        We need a partner that will help us. This was something new to us, and so we needed somebody to help us dive in, figure out how to do it in our world, and translate that out and help us be successful with the implementation.

Rating: I would say it is a 10/10. It's been a great experience for us and we really have benefited a lot from it. It's been a great story as we've learned how to sell it and embed it into our process, and it's been really beneficial for us. We've had a lot of conversations with people while we've been here about recommending the application. Implementing it has been transformative for us.

        It's an interesting process to go through. Organizations need to change their model and stop thinking about how they've always done things, and so the challenge is really getting people to think in a new way, but once they do, it's been a terrific experience. It's really allowed us to focus on the craft of what we're trying to do, rather than lots of other activities that we don't need to be spending time on.


        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
        ITCS user
        Practice Manager (Testing Services) at a financial services firm with 1,001-5,000 employees
        Video Review
        Vendor
        Includes basic services which allow you to mask data and create synthetic data. It also includes test matching which accelerates test cycles and allows automation to happen.

        What is most valuable?

You've got the basic services of the TDM tool, which allow you to mask data and create synthetic data, but what really sets TDM apart from the other competitors is the added extras you get for doing true test data management: things like the cubing concepts that Grid Tools' Datamaker really brings to bear within test data management teams. You've also got test matching, which massively accelerates test cycles, gives stability, and allows automation to happen.

        How has it helped my organization?

We've got a centralized CoE for test data management within our organization, and the benefits are really threefold: cost, quality, and time to market. In terms of quality, data is the glue that holds systems together, and therefore, if I understand my test data, I understand what I'm testing. Through the tooling and its maturity, we're bringing an added quality aspect to what we test, how we test, and the risk-based testing we might take on.

In terms of speed to market, because we don't manually produce data anymore and we use intelligent profiling techniques and test data matching, we massively reduce the time we spend finding data, and we can also produce data on the fly, which speeds up test data cycles. In terms of cost, because we're doing it a lot quicker, it's a lot cheaper.

We have a centralized test data management team that caters to all development within my organization. We've created an organization that is much more effective and optimized in terms of the time it takes to identify data and get into test execution in the right way.

        What needs improvement?

The big area for exploitation for us is a feature that already exists within the tool. The TCO element is massive; I talked earlier about the maturity and structure it gives you in testing. I think it is a game changer in terms of articulating the impact of change. No project goes swimmingly the first time, and therefore the ability to assess the impact on testing by making simple process changes is a massive benefit.

        What do I think about the stability of the solution?

The stability of the solution is fine. The really big question is the stability of the underlying system that it's trying to manipulate; the tool is the tool, and it does what it needs to do.

        What do I think about the scalability of the solution?

Within our organization we have many platforms and many different technologies. One of the interesting challenges we always have, especially when we're doing performance testing, is whether we can get the required volumes of data in sufficient time. We use things like data explosion quite often, and it does what it needs to do, very quickly.

        How are customer service and technical support?

We work in an organization where we use many tools from many different suppliers. The relationship my organization has with CA is a much richer one; it's not just tool support.

        Which solution did I use previously and why did I switch?

Originally, we used to spend hours and hours of spreadsheet time manually creating and keying data, which was massively inefficient and massively error prone; and clearly, as part of a financial institution, we need to conform to regulations. Therefore we needed an enterprise solution to make sure we could deliver regulation-compliant test data to suit our projects.

The initial driver for buying any tooling is the problem statement: what's the driver to get these things in? Once you realize there is so much more than just the regulatory bit – as I say, the time, cost, and quality benefits it can give to testing – that's really the bigger benefit than just the regulatory one.

        How was the initial setup?

We've had the tool for about four or five years now within the organization. As you might expect, we started out not knowing anything about the tool and not really knowing how to deploy it, so we called on the CA guys to come in and show us how the tool works and how to apply it within our organization. We had a problem case we wanted to address, we used that as the proving item, and that's really where we started our journey toward a dedicated test data management function.

        Which other solutions did I evaluate?

Important evaluation criteria: to be honest, it's got to be about what the tool does. A lot of the tools on the market do the same thing, so it comes down to what differentiates them and what problem statement the organization is really trying to fulfill. Once you've got the tool, that's great, but you need the people and process; without those, and without the relationship you have with the CA guys, you've just got shelfware and a tool. We went through a proper RFP selection process where we set our criteria, invited a few of the vendors in to demonstrate what they could do for us, and picked the one best suited to us.

        What other advice do I have?

Rating: no one's perfect, but it has to go in the top quartile, so probably eight or above. In terms of test data management solutions, I think it's the best out there. The way the tool is going, moving into other areas such as TCO and integration with SV, is a massive thing for us.

My recommendation is that this is absolutely best in breed. As well as buying the tool, it would be a mistake not to also invest in understanding how the tool integrates into the organization and how to bring it to the tools team, the testing teams, and the environment teams that you need to work with.

        Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.