IBM Cognos Review

Jobstreams make grouping steps fairly easy, but lookup stability needs to be improved.

Valuable Features:

  • The jobs, which make it easy to place the right SQL statements in the required order and make monitoring them a lot easier
  • The jobstreams, which make it possible to group the job steps effectively and in the right order
  • The scheduler, which makes it easy to run the jobs and jobstreams at any convenient moment
  • The lookups, which, when working correctly (they can be unstable), provide a good way to ensure referential integrity between fact tables and dimension tables
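To illustrate what such a dimension lookup does conceptually (the table and column names here are hypothetical; Data Manager implements this internally), a minimal sketch in Python resolves each fact row's business key to a surrogate key and rejects rows that find no match:

```python
# Hypothetical sketch of a dimension lookup: each fact row carries a business
# key (customer_code) that must resolve to a surrogate key in the dimension.
# Rows that fail the lookup are rejected rather than loaded, which preserves
# referential integrity between the fact table and the dimension table.

customer_dim = {"C001": 1, "C002": 2, "C003": 3}  # business key -> surrogate key

fact_rows = [
    {"customer_code": "C001", "amount": 120.0},
    {"customer_code": "C999", "amount": 35.5},   # no matching dimension row
    {"customer_code": "C002", "amount": 80.0},
]

loaded, rejected = [], []
for row in fact_rows:
    surrogate = customer_dim.get(row["customer_code"])
    if surrogate is None:
        rejected.append(row)  # would otherwise become an orphan fact row
    else:
        loaded.append({"customer_sk": surrogate, "amount": row["amount"]})

print(len(loaded), len(rejected))  # 2 1
```

When the lookup is stable this keeps orphan rows out of the fact table; when it is not, as the review notes, the guarantee silently disappears.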

Improvements to My Organization:

It gave the organisation the ability to look at its data more closely and in a more organized way. Had the organisation had the stability to see the project through, it probably would have gained insights into its (in)efficiencies that would have greatly improved the effectiveness and efficiency of its core activities (it was a call center).

Room for Improvement:

The stability of the lookups is not very good and really needs improvement. The tool forces you in the direction of a data mart/Kimball architecture; if that is not the architecture you want, it takes a lot of tricks to bend it to your needs. When you use hierarchies and dimensions, you are more or less forced to build these first, before building the fact tables. If these dimensions later need adjustments or modifications, implementing them is a complicated, delicate, and fault-ridden operation.

Other Advice:

The most effective way to get a sturdy result with Cognos Data Manager is to leave out the fancy stuff. Use only the jobs and jobstreams (and, of course, the scheduler to run them). Guard referential integrity by making the correct joins in the queries inside the jobs (and, of course, putting these in the right order). Used this way, Cognos Data Manager can be very useful, sturdy, and flexible.

I understand from a fellow forum user that Cognos Data Manager will no longer receive technical support from IBM as of 30th September 2015. If you haven't yet chosen a data warehouse solution, choosing Cognos Data Manager could give you serious support issues in the not-so-far future.

In a more general sense, ask why you want a data warehouse. If you want to resolve issues with data quality, a data warehouse can make these visible, but there are far cheaper ways to achieve this. And, of course, knowing about a problem but taking no steps to tackle it won't solve anything. If you want sturdy, reliable, comparable, standardized, and well-organized data, a data warehouse is a good option, though it takes time and effort. Quite a lot of organisations use Cognos Data Manager, and migrating to another environment is a costly and potentially risky undertaking, so I expect many Cognos Data Manager applications will be in use for many years to come. Some useful dinosaurs, such as COBOL, still survive; I think Cognos Data Manager can easily go that way too.

Disclosure: I am a real user, and this review is based on my own experience and opinions.

Bob Edis (Consultant):

I think it's important to note that this forum is for Business Intelligence tools and solutions. "Cognos" is an IBM brand that embraces many tool sets, which may or may not relate to BI.

Cognos Data Manager is NOT a BI tool but rather an ETL tool. It was developed by Cognos specifically to support Kimball star-schema models. I understand that IBM is phasing it out in favour of InfoSphere DataStage. I think Data Manager has a better GUI than DataStage for tasks around creating and maintaining dimensions and facts, but DataStage has a LOT more functionality.

The Cognos brand has a BI suite named Cognos BI. It also has analytics tools such as TM1, and targeted tools such as ICM (formerly Varicent).

it_user267465 (BI Engineer at a media company with 1,001-5,000 employees):

My main point is: if used correctly, the Cognos Datamanager can be a very effective tool for ETL, and in the end BI.
My opinion is that BI and ETL cannot be separated that easily. If data are not presented in a way the end user (or the business, or any other relevant actor) can read and interpret, those data are useless. If the form the data are put in is readable but the data are incorrect, the presentation is a disguise for errors, or even deceit. ETL can be very useful, but it is not the only way to make data useful; the same applies to a data warehouse. ETL can, however, be a very useful way to standardize data and achieve reliability.
I am very interested in hearing the informed opinion of SnrConsultant710 on Datastage, TM1, Cognos BI, ICM and Varicent.

Bob Edis (Consultant):

Joop, I agree with you on the interdependence of BI and ETL. I would also propose that the data model is an equal partner in this mix.
Business Intelligence has come a long way since it was called Decision Support Systems (DSS) last century. It was once thought of as the domain of large, cashed-up companies with mature IT environments, but today it is much harder to pin down. Even the name is changing: BI is out, Analytics is in!
BI solutions, and the ETL to support them, can now be focused on business needs from the corner shop to the multi-national and the various business units and departments in them. The mature toolsets such as Cognos BI and Business Objects still have an edge for large and complex deployments with sophisticated IT organisations. They can also work very well, albeit at a relatively high sticker price, for medium and small organisations with or without IT departments. But in this space the big boy tools are finding strong contenders with the likes of Tableau and QlikView, et al.
ETL and physical data models (data warehouse, data marts) are critical to BI success for large and medium-sized deployments, less so for small ones. Many small businesses have data in isolated repositories such as spreadsheets, and it is quicker, cheaper, and more convenient to run BI directly against these.

Bob Edis (Consultant):

Re DataStage, TM1, and ICM:
DataStage is a flexible, fast, and large-scale ETL tool. The way it is designed, using modules for specific ETL tasks and linking these modules together, makes it more future-proof than Data Manager. DataStage can be used for virtually any data movement and conversion need; Data Manager is constrained to building and maintaining data marts (it can do traditional ETL too, but is limited). I prefer the UI of Data Manager to DataStage, but IBM is retiring the former and banking on the latter. DataStage handles traditional RDBMS and flat-file ETL but also adds CDC, big data, plug-ins, web services, etc., and shares a common metadata repository with the other Information Server tools, including QualityStage (data cleansing), Information Analyzer (metadata analysis and business-rule compliance), and Information Governance Center (business glossary, data lineage, impact analysis). All the Information Server tools can share data and results via the repository. The Metadata Access Manager tool provides very good import and export of metadata with a plethora of other tools, including data modeling tools.
The only thing I think it lacks is a good, stable, web based UI like Cognos BI has.

TM1: I have only used TM1 as a data source for BI projects, so I do not feel I have the authority to speak to it.

ICM (née Varicent) is an interesting Cognos-branded tool. It is very different from any other tool I've used, with the exception of a Borland tool I tried last century. ICM has a very specific scope: generating data for, and then reporting on, how much a salesperson should get paid (commission, bonus) based on their contribution to sales, their ranking in the business, and the compensation plan they're on. It breaks down business rules into small code objects designed to be "black boxes" with clearly defined input and output connectors. These are built up in a bottom-up approach until the final output the report needs is provided. These objects act like graphical Java classes and can be reused as needed for different reports.
The reporting engine is adequate but could be replaced with CBI to be more flexible.
The development process is rapid once all the business rules are understood well enough to be coded. However, it is a resource hog: to be effective, it relies on many pre-calculated data sets (object output), and these suck up RAM like crazy. On disk, data is stored in a database or flat files, but to get an answer TODAY it must use in-memory data sets.
It is a small but growing market, and there are several competitors to ICM.



it_user494622 (CSO, Digital Transformation | Vinci | Bosch at a tech services company with 201-500 employees), Real User:

ColdLight is a leader in advanced analytics, predictive technology and machine learning science. At the center is Neuron, a secure and flexible automated learning engine that learns from massive amounts of data, discovers patterns and then delivers mathematically validated recommendations directly to the business teams that can take action. Neuron can be deployed as a SaaS cloud service, minimizing any infrastructure impact for our customers.