Equalum Valuable Features

JB
Director of Enterprise Architecture at a pharma/biotech company with 10,001+ employees

The ability to stream data out of Oracle and SQL Server databases onto Kafka topics. When using alternative technologies, there is usually a lot of software development. With Equalum, it is just configuration.
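To give a sense of what sits downstream of that configuration, here is a minimal sketch of consuming such a CDC topic with the kafka-python library. The topic name, broker address, and event field names are illustrative assumptions, not Equalum specifics.

```python
# Minimal sketch of a downstream consumer; assumes Equalum publishes CDC
# events as JSON to a topic named "oracle.orders.cdc" (both assumptions).
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "oracle.orders.cdc",                # assumed topic name
    bootstrap_servers=["broker:9092"],  # assumed broker address
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # CDC events typically carry an operation type plus the row image;
    # these field names are placeholders, not Equalum's actual schema.
    print(event.get("op"), event.get("row"))
```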

Equalum provides us with a single platform for the following core architectural use cases:

  • CDC replication
  • Data streaming
  • Batch ETL (though we use it much less for this)

CDC replication and data streaming are very important to us. Batch ETL is not that important because we have other solutions for that.

RM
Managing Director at a consultancy with 11-50 employees

It's a really powerful platform in terms of the combination of technologies they've developed and integrated together, out-of-the-box. The combination of Kafka and Spark, together with CDC capabilities, is, we believe, quite unique. And then, of course, there are the performance aspects. As an overall package, it's a very powerful data integration, migration, and replication tool. We've looked at a number of other products, but Equalum, from a technology perspective, comes out head and shoulders above the rest. We tend to focus mainly on moving the legacy businesses here in Japan and in Korea, which are very slow to change, from batch and micro-batch to real-time. The combination of the technologies I just mentioned is really powerful.

It also stands out, very much so, in terms of its ease of use. It's super-simple to use. It has its own Excel-type language, so as long as you know how to use Excel for data transformation, you can use this tool. And we're talking about being able to do massive data integration and transformation. That's not even referring to the drag-and-drop capabilities, which are for people who have zero skills; even for them, it's that easy. But if somebody does want to do some customization, it has a CLI based on the solution's own code transformations, which are as easy as Excel-type commands. And they've got all the documentation for that.
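Equalum's actual expression syntax isn't reproduced in this review, so purely as a rough illustration, this is the kind of Excel-style transform being described (something like CONCAT(UPPER(first_name), " ", last_name)), sketched here in Python:

```python
# Illustrative only: a toy Python equivalent of an Excel-style transform
# such as CONCAT(UPPER(first_name), " ", last_name). Equalum's real
# expression language may differ; nothing here is its actual API.
def transform(row: dict) -> dict:
    row["full_name"] = row["first_name"].upper() + " " + row["last_name"]
    return row

print(transform({"first_name": "ada", "last_name": "Lovelace"}))
# {'first_name': 'ada', 'last_name': 'Lovelace', 'full_name': 'ADA Lovelace'}
```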

For consultants, it's a dream tool. A large consultancy practice providing services to large enterprises can make a boatload of money, because instead of consultants spending hours and hours developing workflows, they can implement them right away. And they can then copy those workflows across organizations, or within the same organization. So you create a drag-and-drop scenario once, or a CLI script once, and you can use it in multiple situations. It's a powerful tool from a number of angles, but ease of use is definitely one of them.

In addition, it's a single platform for core architectural use cases: CDC replication, streaming ETL, and batch ETL. It also has micro-batch. It's got it all, from end to end. It's the glue. There are a lot of other good products out there, but there's always a little bit of something missing from them. Equalum did its research well and understood the requirements of large enterprises and governments: one tool to rule them all, from a data migration and integration perspective.

The speed of data delivery is super-fast. In some cases, when you look at the timestamp of data in the target versus the source, they match down to the hundredth of a second. That's at the highest level, but overall it's lightning fast in terms of its ability to handle data.

HO
Partner at Gulf Consulting

For us, one of the most valuable features is the zero-coding part, which makes it a lot easier to accomplish the building of data pipelines. The interface goes hand-in-hand with the zero coding and provides an immediate response to anything that you build. You have a real-time view of the input and output. That is extremely important because this is the type of work where you can work for a week and then find out you have to go back to the source. The zero coding and the user interface are the elements that sell the product.

The second one has more to do with how we work with Equalum. It's not necessarily a product feature but more a company-related one: the flexibility of the company and how they respond to us. That is quite important for a new product, and they are definitely very supportive. Whatever needs to be done, they do it, 24 hours a day if necessary. That's what separates it and why we think it's a successful solution.

Equalum provides a single platform for core architectural use cases, including CDC replication, streaming ETL, and batch ETL. That is important to our clients because there is no other single-focus product that covers these areas in that much detail, and with this many features on the platform. The fact that they are single-minded and focused on CDC and ETL makes this such a rich solution. Other solutions cover these things a little bit in their multi-function products, but they don't go as deep. At the end of the day, that's why a client pays for the license.

SK
Database Administrator at an energy/utilities company with 1,001-5,000 employees

It has good features. The replication feature is wonderful because the data streams live and we can change the polling rates. Initially, replication took 50 seconds; now, whatever changes happen in SQL Server are reflected in Oracle within 30 seconds when pulled via Equalum.

The Equalum tool is a good development tool and user-friendly as well. The front-end is user-friendly because it has a nice, easy methodology. It takes hardly a day to teach someone, who can then create the workflow. Once the workflow is set, you don't have to do anything; the data constantly flows from SQL Server to Oracle, i.e., from the source to the target.

It has a strong command-line feature. With a purely front-end tool, like SSIS, I have to create each flow manually. In Equalum, however, we can write a command-line program and deploy 50 to 100 flows together at once.
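The exact commands aren't shown in the review, so the following is only a hypothetical sketch of that kind of bulk deployment; the equalum command, its deploy subcommand, and the flow-definition files are all invented placeholders.

```python
# Hypothetical bulk-deployment script. The "equalum" command, the
# "deploy" subcommand, and the flow files are invented placeholders
# illustrating the idea, not the product's real CLI.
import pathlib
import subprocess

for flow_file in sorted(pathlib.Path("flows").glob("*.json")):
    result = subprocess.run(
        ["equalum", "deploy", str(flow_file)],  # placeholder CLI call
        capture_output=True, text=True,
    )
    status = "ok" if result.returncode == 0 else "FAILED"
    print(f"{flow_file.name}: {status}")
```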

Equalum provides a single platform for the following core architectural use cases: CDC replication, streaming ETL, and batch ETL. The CDC is important for me as an SQL Server DBA. Without CDC, all my data would have to be pulled directly from my tables, which are already serving the application, so there would be a performance hit. With CDC, the captured changes go into a CDC table and Equalum pulls from that table, so there is no user impact on my DB servers.
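For context, SQL Server's built-in CDC exposes captured changes through generated table-valued functions, which is what a tool can poll instead of the user tables. A minimal sketch of reading those changes directly with pyodbc follows; the connection string and the dbo_Orders capture instance are assumptions.

```python
# Minimal sketch of reading SQL Server CDC changes directly with pyodbc.
# The connection string and the "dbo_Orders" capture instance are
# assumptions; a tool like Equalum does this polling behind the scenes.
import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myhost;"
    "DATABASE=mydb;UID=cdc_reader;PWD=secret;TrustServerCertificate=yes"
)
cursor = conn.cursor()

# Changes are read from the CDC change table, not the user table, so the
# application-facing tables take no extra load.
cursor.execute("""
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT __$operation, __$start_lsn, *
    FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, 'all');
""")
for row in cursor.fetchall():
    print(row)
```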

They have something called binary logs for Oracle. If you have these logs in place, then you can pull the data through the logs. That is convenient because you can pull big data in through batch processing, although I have not used that personally. I have seen people in my organization using batches because they can schedule them. My data is live streaming and keeps streaming every three minutes, but some data doesn't require live streaming. So, every morning they pull that data from source to target using batch processing, which is good.

It is important to me that the solution provides a no-code UI, with Kafka and Spark fully managed in the platform engine, because then I don't have to take care of anything. There are no backup problems: for the flows that I create, I don't have to back them up, restore them, or maintain them. I just need to create the workflow on my end, with a user in the source and a user in the target from the database perspective. Everything from the front-end through to Kafka is then taken care of by Equalum, which makes it very user-friendly.

When we take the data from source to target, we can add fields, like a timestamp. Data accuracy is prompt and 100 percent: whatever data you have in the source is exactly the data reflected in the target. In the many months that I have been using it for all my projects, I haven't found any data discrepancies. There has not been a time when the source data differed from the target data, which is very good.
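As a rough way to spot-check that kind of accuracy yourself, you could compare row counts (or checksums) between source and target. A small sketch with pyodbc, where the DSNs and table names are placeholders:

```python
# Rough source-vs-target spot check: compare row counts on both sides.
# The DSNs and table names are placeholders, not a real environment.
import pyodbc

def count_rows(conn_str: str, table: str) -> int:
    conn = pyodbc.connect(conn_str)
    try:
        return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    finally:
        conn.close()

source = count_rows("DSN=sqlserver_src", "dbo.Orders")  # placeholder DSN
target = count_rows("DSN=oracle_tgt", "ORDERS")         # placeholder DSN
print("match" if source == target else f"mismatch: {source} vs {target}")
```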

SK
Software Engineer Specialist at an energy/utilities company with 1,001-5,000 employees

Their performance monitoring (how things are flowing) is very visual. If something is failing, you can see it there at a high level, e.g., your sources are down, your agent is down, or your flow is not running. All those kinds of things are very visual. You just log into the tool and can see what is happening.

The alerting system is getting better with every release.

It takes me an hour to transfer knowledge of the solution to somebody else. It is really efficient that way. I haven't seen any complications.

All our architectural use cases are on a single platform, not multiple platforms. You don't have to jump into different modules because it is the same module everywhere.

It is a self-managed, self-healing system. For example, I have been getting alerts on CPU usage saying, "CPU usage is high." It sends me a warning or critical alert, and within seconds I can see that it has been resolved. Likewise, if the database goes down, it stops at that point, keeps on trying until the database comes up, and begins to heal itself. So, it is self-recovering and self-healing. It is the same for the target: if the target goes down, it sends you an alert saying, "The target has gone down." I don't worry about it. I can ignore the alert, because when the target comes back, it starts all over again. I really like the self-recovering and self-healing because I don't need to take many actions.
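Equalum's internals aren't public, so purely as an illustration of the retry-until-recovered behavior described above, a generic loop might look like this; the check_database_up() helper and all timings are invented.

```python
# Generic sketch of retry-and-resume ("self-healing") behavior. The
# check_database_up() helper and all timings are invented placeholders;
# this is not Equalum's actual implementation.
import time

def check_database_up() -> bool:
    # Placeholder: e.g., attempt a lightweight connection or ping here.
    return True

def run_with_recovery(process_batch) -> None:
    while True:
        if not check_database_up():
            print("alert: source database down; retrying in 30s")
            time.sleep(30)       # keep trying until the source is back
            continue
        process_batch()          # resume moving data where it left off
        time.sleep(3 * 60)       # this reviewer's flow streams every 3 minutes
```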

Initially, I wanted incremental data loads as well as batch, and they put those together very quickly for us. It was very basic at first; then they enhanced it to the level that we wanted. So, they came back with a solution quickly. Alerting is another feature that they put together, though we did not ask for that.

In a multi-tenant architecture, if I am in finance and another department is on the operations side, we don't have to go into each other's areas. We can have our own separation of products, which is pretty cool.

TF
Senior Software Engineer at a retailer with 201-500 employees

Its most valuable feature is the change data capture (CDC). This is usually a little bit more of a pain with open-source or other tools, but I find their change data capture and data query pretty intuitive.

Equalum provides a single platform for the following core architectural use cases: CDC replication, streaming ETL, and batch ETL. This is core to our company; I would score it a nine out of 10. We move pretty much all of our data through their system.

The no-code part is useful; it is about a seven out of 10 for us. We are all software engineers, so it mainly helps speed up a lot of the data mapping. For example, I just did five documents, and it probably saved me 50 percent of my time.

IA
Ingestion Solutions Head of Data at a financial services firm with 11-50 employees

I found two features in Equalum that I consider the most valuable.

One is that Equalum is a no-code tool. You can do your activities in its graphical interface, which doesn't require complex knowledge of extracting, transforming, or loading data.

Another feature of Equalum that I like the most is that it monitors the data transfers and tells you if there's any issue so that you can quickly check and correct it. Equalum also tells you where the problem lies, for example, if it's a hardware or communication issue.
