Informatica PowerCenter Review
It can work with any kind of database, including NoSQL ones, but when a database object changes, its object definition does not get refreshed automatically.


Valuable Features

  • Able to access heterogeneous databases: It can work with any kind of database available in the market, including NoSQL databases.
  • SQL override: allows users to write simple to complex queries inside a mapping to performance-tune the load process, while pre- and post-SQL help control the load process to get the desired output (see the first sketch after this list).
  • Error handling techniques: Informatica provides a suite of options for handling errors, from logging errors in database tables, to e-mail notifications with log files, to decision/assignment tasks for data control; it can handle errors at every stage of a workflow, and offers restart options for batched workflows (see the second sketch after this list).
  • Debugging is another feature I found extremely useful. Before running a full load, you can run a subset of data from source to target and check the values generated by the logic at every transformation.
  • While in debugger mode, target tables are not loaded with any physical data. The best part of debugging in Informatica is that you can alter the logic in an Expression transformation right in the debugger and check the value it generates, which lets you change the logic later if needed.
  • Transformations: the types of transformations (SQL logic) provided are unmatched by any other ETL tool out there. Even a beginner can understand, with minimal effort, how to use them and what the expected output of a particular transformation will be. Most of them are self-explanatory and produce the desired output, especially the Expression transformation, which is the heart of any mapping.
  • Can be used to build any type of logic with flexibility, given the number of logical functions (date/time/string, etc.). The functions have a syntax that is very easy to understand.
  • Informatica indexes all the functions in the Expression Editor in a proper fashion, with their syntax shown, to let users build logic with ease.
  • Version control: check-in and check-out are easy and allow space for adding comments, which can come in handy during deployment when writing queries.
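
To give a flavor of the SQL override and pre/post-SQL mentioned above, here is a minimal, Oracle-flavored sketch; the table and column names (ORDERS, STG_ORDERS, and so on) are hypothetical:

    -- Hypothetical source-qualifier override: push a join and a filter down
    -- to the source database instead of reading the entire tables.
    SELECT o.ORDER_ID,
           o.CUSTOMER_ID,
           o.ORDER_DATE,
           c.COUNTRY_CODE
    FROM   ORDERS o
    JOIN   CUSTOMERS c ON c.CUSTOMER_ID = o.CUSTOMER_ID
    WHERE  o.ORDER_DATE >= TRUNC(SYSDATE) - 1;  -- only rows since yesterday

    -- Hypothetical pre-SQL: empty the staging table before the session loads it.
    TRUNCATE TABLE STG_ORDERS;

    -- Hypothetical post-SQL: refresh optimizer statistics after the load.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(USER, 'STG_ORDERS');
    END;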
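
And a minimal sketch of the custom error-logging idea; the table and its columns are hypothetical (PowerCenter also has its own relational error-logging tables):

    -- Hypothetical custom error-log table that error-handling logic in a
    -- workflow could write rejected rows and their messages into.
    CREATE TABLE ETL_ERROR_LOG (
        WORKFLOW_NAME VARCHAR2(100),
        SESSION_NAME  VARCHAR2(100),
        ERROR_TS      TIMESTAMP DEFAULT SYSTIMESTAMP,
        ERROR_MSG     VARCHAR2(4000),
        REJECTED_ROW  VARCHAR2(4000)
    );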

Improvements to My Organization

Our organization had to import data from AS400, implement a sales cost analysis, and load the data into a data mart, which was further consumed by QlikView. This was done with data from five different countries in different time zones. From developing the mappings and replicating them for the other countries against their respective databases, to loading the data, everything was automated and scheduled, with notifications about the loads. It changed the way we did business. Customer orders, modifications, and tracking were all smooth. Every morning we would receive one e-mail with consolidated load statistics, and the whole day started from that one e-mail.

Room for Improvement

There are three areas where they could make significant improvements:

  1. Live feeds: if a database object changes, its object definition should automatically be refreshed as well, avoiding re-imports of objects. An auto-refresh would affect all the shortcut objects, but ultimately, if the object in the database has changed, the mapping will fail or produce incorrect data because a column's position has shifted or its name no longer exists (a detection query along these lines is sketched after this list).
  2. The GUI: instead of having to open a separate window for each of the Designer, Workflow Manager, and Workflow Monitor, it would be nice if these three windows were merged into three tabs, or built into the hierarchy of sub-tasks, where, for example, a workflow opens a session and the session opens a mapping, rather than only opening the mapping properties as it does now. SAP BODS has that structure, and I would like to see something along those lines, where I don't have to refresh the mapping and session every time something changes.
  3. Version rollback: version control is both a blessing and a burden. While it is a good feature, it can become a huge load on the database; the repository should have a way to roll back that keeps the most current object and purges all other versions.
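
Until such a live refresh exists, the practical stopgap is a periodic catalog check. Below is a minimal, Oracle-flavored sketch of such a check; the snapshot table COL_SNAPSHOT (holding the column layout the mappings were built against) and all other names are hypothetical:

    -- Hypothetical catalog diff: compare the current dictionary against a
    -- saved snapshot (COL_SNAPSHOT) to flag added, dropped, or moved columns.
    SELECT NVL(c.TABLE_NAME, s.TABLE_NAME)   AS TABLE_NAME,
           NVL(c.COLUMN_NAME, s.COLUMN_NAME) AS COLUMN_NAME,
           CASE
             WHEN s.COLUMN_NAME IS NULL      THEN 'ADDED'
             WHEN c.COLUMN_NAME IS NULL      THEN 'DROPPED'
             WHEN c.COLUMN_ID <> s.COLUMN_ID THEN 'MOVED'
           END AS CHANGE_TYPE
    FROM   USER_TAB_COLUMNS c
    FULL OUTER JOIN COL_SNAPSHOT s
           ON  s.TABLE_NAME  = c.TABLE_NAME
           AND s.COLUMN_NAME = c.COLUMN_NAME
    WHERE  s.COLUMN_NAME IS NULL
       OR  c.COLUMN_NAME IS NULL
       OR  c.COLUMN_ID <> s.COLUMN_ID;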

Use of Solution

Starting with version 7.1, I have been using Informatica for 7+ years.

Deployment Issues

Deployment can get complicated depending on the queries involved. Adding proper labels and comments during version control can make deployment very smooth. I did not come across any technical issues during deployment. The rollback feature adds a lot of value: with a single click, you can roll back all the objects if you notice any discrepancy between environments.

Stability Issues

Informatica is a very stable tool. Only occasionally, where the Informatica server is remotely based, connectivity can be slow. I have had a few instances where the Expression Editor gets grayed out when using RDP or docking a laptop, but a registry edit (via regedit) resolved the issue. Otherwise, this is a very stable, powerful, and robust tool.

Scalability Issues

Informatica handles extremely large volumes of data very well. With features like CDC, incremental loads, mapping parameters and variables, dynamic lookups, and pre- and post-SQL, Informatica provides the flexibility to handle huge volumes of data with ease. Certainly a lot depends on optimized mappings and workflows (batching and performance tuning).
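
As a rough illustration of the incremental-load pattern (the watermark idea only, not Informatica's internal mechanics), with a hypothetical ETL_WATERMARK control table:

    -- Hypothetical incremental extract: pull only rows changed since the
    -- last successful load for this job.
    SELECT o.*
    FROM   ORDERS o
    WHERE  o.LAST_UPDATED > (SELECT LAST_LOAD_TS
                             FROM   ETL_WATERMARK
                             WHERE  JOB_NAME = 'ORDERS_LOAD');

    -- After a successful load, advance the watermark for the next run
    -- (simplified; a real job would use the max timestamp it extracted).
    UPDATE ETL_WATERMARK
    SET    LAST_LOAD_TS = SYSTIMESTAMP
    WHERE  JOB_NAME = 'ORDERS_LOAD';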

Customer Service and Technical Support

Customer Service:

We've only had to contact customer service twice in 6+ years; they were very good with their responses and very professional.

Technical Support:

I would say 9/10. For what I needed, technical support was able to resolve it in a timely fashion. I also appreciate their follow-ups.

Previous Solutions

We were using conventional RPG programming to do analysis, but every time we added more tables for data analysis, profiling, quality, or manipulation, it turned into pages and pages of code. A user-friendly GUI like Informatica's provided the right kind of solution, and it was easy to migrate from hand-coded programs to Informatica.

Initial Setup

The initial setup was a conventional data warehouse pattern. Later on, when we started implementing CRM, SCM, and ERP, it started getting a bit complex. However, breaking projects down into multiple data models and organizing them accordingly kept it manageable.

Implementation Team

We had boot-camp training and an Informatica expert onsite for a few months. Later, we picked up a fair amount of the technology and started implementing it in-house.

ROI

Informatica is 100% value for money. The kind of flexibility and stability it offers in dealing with heterogeneous data is amazing.

Pricing, Setup Cost and Licensing

It is not inexpensive software, but if you are planning for a long-term, robust, end-to-end, enterprise-level tool that can handle any kind of data and any type of task relating to BI or data warehousing, it does not require a lot of thinking. You can bank on Informatica for your solutions.

Other Solutions Considered

  • Data Stage
  • Ab Initio
  • MicroStrategy

Other Advice

Informatica is a great product. If you can spend a good amount of time researching what you want, have a proper SDLC in place, and work with the technicality of Informatica, I am sure most projects can roll out into production in a timely fashion and produce results. In my experience, without a proper road-map in place and without auditing of change requests, business analysts will struggle with their requirements, and this causes more bottlenecks and rework than the actual development. Having said that, no project is a walk in the park, but Informatica can be the icing on the cake if the foundation is good.

Disclosure: I am a real user, and this review is based on my own experience and opinions.

5 Comments

it_user333507, Consultant

A very informative and quite detailed review. It helps to know more about this awesome tool from an efficient developer's experience. Thanks, Azhar, for sharing :)

27 October 15
Joop Moerkens, Real User

Hello Azhar
I am very interested in what you have to say about Informatica. It is used widely in the energy company I work for in Holland, Essent (although not in the team I work with). I hear many different comments about Informatica. It is rumoured that it doesn't pick up deletes correctly in the source system (a SAP solution based on Oracle), and that for this reason intensive checks have to be done between the source system and the ETL environment every Saturday. Could you please comment on this and share your experience? Any information you can give will be highly appreciated.
Kind regards
Joop Moerkens

14 November 15
Azhar Ahamad, Real User

Hello Joop,

Thank you for your comment. Unfortunately, the Informatica Integration Service does not interact with the DB to pick up live feeds. So, any time a column insert/update/delete occurs on a DB object, its corresponding definition in Informatica needs to be refreshed manually (rename/replace). SAP Data Services does some of that. Running custom scripts to check for changes and validate them against the Informatica repository is a good approach, but it has a lot of overhead. That said, as long as the existing columns, their positions, and their metadata do not change, the mapping is not invalidated, unless you need the newly added column.

One other way is to create views on top of these DB tables and import the views as source definitions. That way, unless a new column in the DB is needed in your mapping, it will not impact your existing mappings. Please make sure the objects used in the mappings are created as global shortcuts; this way you will not miss any mappings when you perform dependency checks from the global shortcut objects. Hope this helps. Please feel free to provide your feedback.
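
A minimal sketch of that view approach, with hypothetical table and column names:

    -- Hypothetical insulating view: expose only the columns the mappings
    -- need, so columns newly added to ORDERS never reach Informatica.
    CREATE OR REPLACE VIEW V_ORDERS_SRC AS
    SELECT ORDER_ID,
           CUSTOMER_ID,
           ORDER_DATE,
           ORDER_STATUS
    FROM   ORDERS;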

15 November 15
GaryM, Real User

Azhar, SSIS has the same issue. SSIS reports a "metadata needs refresh" syntax error, which is easy to fix but of course requires redeploying the ETL objects. It raises the question, however: do we really want it automatically refreshing and potentially causing data mapping/transformation problems? Maybe our expectation that it should be automatic is unreasonable?

17 December 15
Azhar Ahamad, Real User

GaryM, I understand that automatic refresh can cause data mapping errors, but identifying the changes, notifying the users, and letting the user decide how to apply them (update, ignore, create a copy of the mapping and edit, ...) would be a better approach. When an object in the DB changes, the chances that the mapping will fail are high, depending on the type of change, so why not know about it and prepare for it beforehand? Either you run a SQL query periodically to identify DB changes and apply them in Informatica manually, or Informatica identifies them and lets the users decide (when the repository reconnects).

07 March 16