Data Integration Tools Forum

Content Specialist
IT Central Station
May 7, 2018
One of the most popular comparisons on IT Central Station is Informatica PowerCenter vs SSIS. In your opinion, which is better and why? Thanks! --Rhea
etlsolut192060: Hands down, Informatica. I've used both. Informatica is system- and DB-agnostic and can run on any platform. If set up correctly, as in my architecture, moving from one platform (Windows) to Linux or AIX would take less than a day. Informatica is also highly parameterizable and easier to develop in. The ability to share objects is a huge time-saving advantage, and Informatica has built-in version control.
Mark Bennett: Informatica is the more capable tool on paper, whilst SSIS is more widely used, less expensive, and has an abundance of expertise available. However, it all comes down to which tool better aligns with a customer's requirements.
Michael Lurye: As with any other technology, there is no universal answer to which tool is better; it all depends on your requirements, budget, skill set, and other considerations. Having said that, if your DW/BI environment is based on the SQL Server stack, I would go with SSIS unless there is a good reason to bring in another ETL tool.
Can someone help me determine the quality, pros, and cons of the Denodo data integration solution, and the best-value options (other suppliers) in the data integration tool licensing space? Thank you.
Business Analyst
I have a component's input record format with a field set to NULL, like: record string(3) INPUT = NULL; end; Comp A -----FlowAtoB---------> Comp B. The record format above is applied to FlowAtoB. I understand that this sets the value of INPUT to NULL. But if a value for this field arrives from the previous component A, how will it behave? (Ab Initio) I'm not able to clearly understand the use of this from the help documentation.
Business Analyst
Using Python on very small projects makes me appreciate its dynamically typed nature (no need for declaration code to keep track of types), which often makes for a quicker and less painful development process along the way. However, I feel that in much larger projects this may actually be a hindrance, as the code would run slower than, say, its equivalent in C++. But then again, using NumPy and/or SciPy with Python may get your code to run just as fast as a native C++ program (where the C++ code would sometimes take longer to develop).

I post this question after reading Justin Peel's comment on the thread "Is Python faster and lighter than C++?", where he states: "Also, people who speak of Python being slow for serious number crunching haven't used the NumPy and SciPy modules. Python is really taking off in scientific computing these days. Of course, the speed comes from using modules written in C or libraries written in Fortran, but that's the beauty of a scripting language in my opinion." Or as S. Lott writes on the same thread regarding Python: "...since it manages memory for me, I don't have to do any memory management, saving hours of chasing down core leaks." I also looked at a Python/NumPy/C++ performance question, "Benchmarking (python vs. c++ using BLAS) and (numpy)", where J.F. Sebastian writes: "...there is no difference between C++ and numpy on my machine."

Both of these threads got me wondering whether there is any real advantage conferred by knowing C++ for a Python programmer who uses NumPy/SciPy to produce software for analyzing 'big data', where performance is obviously of great importance (but code readability and development speed are also a must)?
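The claim above, that NumPy's speed comes from dispatching to compiled C/Fortran (BLAS) routines rather than from Python itself, is easy to check with a small timing experiment. This is a minimal sketch; the array size and the helper name `dot_pure` are arbitrary choices for illustration, and exact timings will vary by machine.

```python
import time
import numpy as np

def dot_pure(a, b):
    """Dot product with a pure-Python loop: the interpreter pays
    per-element bytecode and boxing overhead on every iteration."""
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

n = 1_000_000
rng = np.random.default_rng(0)
a = rng.random(n)
b = rng.random(n)

t0 = time.perf_counter()
slow = dot_pure(a, b)
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
fast = np.dot(a, b)  # dispatches to a compiled BLAS routine
t_blas = time.perf_counter() - t0

# Both computations agree to floating-point tolerance, but the
# BLAS-backed call is typically orders of magnitude faster.
assert abs(slow - fast) < 1e-4
print(f"loop: {t_loop:.4f}s  numpy: {t_blas:.4f}s")
```

On a typical machine the vectorized `np.dot` beats the Python loop by a large factor, which is the crux of the "Python can be as fast as C++ for number crunching" argument: the inner loop runs in compiled code, and Python only orchestrates it.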
