What is most valuable?
- Ability to access heterogeneous databases: it can work with virtually any database on the market, and it can be applied to NoSQL databases as well.
- SQL override lets users write simple to complex queries inside a mapping to performance-tune the load process, while pre- and post-SQL help control the load process to get the desired output.
- Error-handling techniques: Informatica provides a suite of options for handling errors. From logging errors to database tables, to e-mail notifications with log files, to decision/assignment tasks for data control, it can handle errors at every stage of a workflow, and it offers restart options for batched workflows.
- Debugging is another feature I found extremely useful. Before running a full load, you can run a subset of data from source to target and check the values generated by the logic at every transformation.
- While in debugger mode, target tables do not load any physical data. The best part of debugging in Informatica is that you can alter the logic in an Expression transformation and check the value it generates, which lets you adjust the logic later if needed.
- Transformations: the range of transformations (SQL logic) provided is unmatched by any other ETL tool out there. Even a beginner can understand, with minimal effort, how to use a particular transformation and what output to expect from it. Most are self-explanatory and produce the desired output, especially the Expression transformation, which is the heart of any mapping.
- Transformations can be used to build any type of logic, with the flexibility afforded by the number of logical functions (date/time, string, etc.). The function syntax is very easy to understand.
- Informatica indexes all the functions in a proper fashion in the expression editor, along with their syntax, allowing users to build logic with ease.
- Version control: check-in and check-out are easy and allow space for adding comments. These comments can come in handy during deployment when writing queries.
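
As a concrete illustration of the port logic described above (all port and column names here are hypothetical, not from an actual project), a typical Expression transformation might contain:

```
-- Default a NULL first name before concatenating (|| is string concatenation)
IIF(ISNULL(FIRST_NAME), LAST_NAME, FIRST_NAME || ' ' || LAST_NAME)

-- Convert a source date string into a DATE port
TO_DATE(ORDER_DT_STR, 'MM/DD/YYYY')

-- Flag orders older than 30 days
IIF(DATE_DIFF(SYSDATE, ORDER_DATE, 'DD') > 30, 'Y', 'N')
```

Each line is a separate output-port expression; the indexed function list in the expression editor makes building logic like this straightforward.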
How has it helped my organization?
Our organization had to import data from AS400, implement a sales cost analysis, and load the data into a data mart, which was further consumed by QlikView. This was done with data from five different countries in different time zones. From developing the mappings and replicating them for the other countries against their respective databases, to loading the data, everything was automated and scheduled, with notifications about the loads. It changed the way we did business: customer orders, modifications, and tracking were all smooth, and every morning it all started with a single e-mail containing consolidated load statistics.
What needs improvement?
There are three areas where they can make a significant improvement:
- Live feeds: if a database object changes, the object definition should automatically refresh as well, avoiding the need to re-import objects. Auto-refresh would affect all the shortcut objects, but ultimately, if the object in the database has changed, the mapping will fail or produce incorrect data because a column's position has shifted or its name no longer exists.
- The GUI: instead of having to open separate windows for the Designer, Workflow Manager, and Workflow Monitor, it would be nice if these three were merged into tabs, or built into the hierarchy of sub-tasks, so that, for example, a workflow opens a session and the session opens a mapping, rather than only the mapping properties as it does now. SAP BODS has that structure, and I would like to see something along those lines, where I don't have to refresh the mapping and session every time something changes.
- Version rollback: while version control is a good feature, it sometimes becomes a huge burden on the database. The repository should have a way to roll back that keeps the most current object and purges all other versions.
For how long have I used the solution?
I have been using Informatica for more than seven years, starting with version 7.1.
What was my experience with deployment of the solution?
Deployment can get complicated depending on the queries involved. Adding proper labels and comments during version control makes deployment very smooth. I did not come across any technical issues during deployment. The rollback feature adds a lot of value: with a single click, you can roll back all the objects if you notice any discrepancy between environments.
What do I think about the stability of the solution?
Informatica is a very stable tool. Only occasionally, where the Informatica server is remotely based, connectivity can be slow. I have had a few instances where the expression editor grayed out when using RDP or docking a laptop, but a registry edit ("regedit") resolved the issue. Otherwise, this is a very stable, powerful, and robust tool.
What do I think about the scalability of the solution?
Informatica can handle extremely large volumes of data very well. With features like CDC, incremental loads, mapping parameters and variables, dynamic lookups, and pre- and post-SQL, it provides the flexibility to handle huge volumes of data with ease. Certainly a lot depends on optimized mappings and workflows (batching and performance tuning).
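
To sketch the incremental-load pattern mentioned above (the table, column, and variable names are hypothetical), a Source Qualifier SQL override can filter on a mapping variable that the mapping advances on each run:

```
-- SQL override: pull only rows changed since the last successful run
SELECT order_id, customer_id, amount, last_updated_dt
FROM   orders
WHERE  last_updated_dt > TO_DATE('$$LAST_EXTRACT_TS', 'MM/DD/YYYY HH24:MI:SS')
```

Inside the mapping, an Expression transformation can call `SETMAXVARIABLE($$LAST_EXTRACT_TS, LAST_UPDATED_DT)` so the variable holds the newest timestamp for the next run, while session-level pre-SQL (e.g., truncating a staging table) and post-SQL (e.g., rebuilding indexes) bracket the load.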
How are customer service and technical support?
Customer Service:
We've only had to contact customer service twice in 6+ years; they were very good with their responses and very professional.
Technical Support:
I would say 9/10. For what I needed, technical support was able to resolve issues in a timely fashion. I also appreciate their follow-ups.
Which solution did I use previously and why did I switch?
We were using a conventional RPG programming tool for analysis, but every time we added more tables for data analysis, profiling, quality, or manipulation, it turned into pages and pages of code. A user-friendly GUI like Informatica provided the right kind of solution, and it was easy to migrate from hand-written programs to Informatica.
How was the initial setup?
The initial setup was a conventional data-warehouse pattern. Later on, when we started implementing CRM, SCM, and ERP, it started getting a bit complex. However, breaking projects down into multiple data models and organizing them accordingly kept things manageable.
What about the implementation team?
We had boot-camp training and an Informatica expert onsite for a few months. Later, we picked up a fair amount of the technology and started implementing it in-house.
What was our ROI?
Informatica is 100% value for money. The kind of flexibility and stability it offers in dealing with heterogeneous data is amazing.
What's my experience with pricing, setup cost, and licensing?
It is not economical software, but if you are planning for a long-term, robust, end-to-end enterprise-level tool that can handle any kind of data and any type of BI or data-warehousing task, it does not require a lot of thinking: you can bank on Informatica for your solutions.
Which other solutions did I evaluate?
DataStage
Ab Initio
MicroStrategy
What other advice do I have?
Informatica is a great product. If you can spend a good amount of time researching what you want, have a proper SDLC in place, and work with the technicalities of Informatica, I am sure most projects can roll out into production in a timely fashion and produce results. In my experience, without a proper road map in place and without auditing of change requests, business analysts will struggle with their requirements, which causes more bottlenecks and rework than the actual development. Having said that, no project is a walk in the park, but Informatica can be the icing on the cake if the foundation is good.
Which version of this solution are you currently using?
9.x