2018-08-14T07:42:00Z

What is your primary use case for Apache Hadoop?


How do you or your organization use this solution?

Please share with us so that your peers can learn from your experiences.

Thank you!

ITCS user (Guest), 99 Answers

Top 20 Real User

We mainly use Apache Hadoop for real-time streaming and integration, using Spark Streaming and the ecosystem of Spark technologies inside Hadoop.

2020-12-08T22:10:56Z
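Spark Streaming processes data as a series of micro-batches rather than record by record. As a rough illustration of that model (a pure-Python sketch, not actual Spark APIs; the event names are hypothetical), each small batch is aggregated and merged into a running state:

```python
from collections import Counter

def aggregate_batch(events):
    """Count events per key within one micro-batch."""
    return Counter(key for key, _value in events)

def update_state(state, batch_counts):
    """Merge a micro-batch aggregate into the running totals,
    as a stateful streaming aggregation would."""
    state.update(batch_counts)
    return state

# Two micro-batches of (key, value) events, e.g. click or sensor data.
batches = [
    [("clicks", 1), ("views", 1), ("clicks", 1)],
    [("views", 1), ("views", 1)],
]

state = Counter()
for batch in batches:
    state = update_state(state, aggregate_batch(batch))

print(dict(state))  # running totals across all micro-batches
```

In real Spark Streaming the batches would arrive from a source such as Kafka and the state would be managed by the framework; the sketch only shows the batch-then-merge shape of the computation.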
Top 20 Real User

As an example of a use case: when I was a contractor for Cisco, we were processing mobile network data and the volume was too big for an RDBMS to handle. We started using the Hadoop framework to improve the process and get results faster.

2020-07-14T08:15:56Z
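Moving that kind of workload from an RDBMS to Hadoop typically means expressing it as MapReduce: a map step emits key/value pairs from each record, and a reduce step aggregates per key across the cluster. A minimal single-process sketch of the pattern (the record fields are hypothetical, not from the answer above):

```python
from collections import defaultdict

# Toy mobile-network records: (cell_id, bytes_transferred).
records = [
    ("cell-a", 120),
    ("cell-b", 300),
    ("cell-a", 80),
    ("cell-b", 50),
]

def map_phase(records):
    """Emit one (key, value) pair per input record."""
    for cell_id, nbytes in records:
        yield cell_id, nbytes

def reduce_phase(pairs):
    """Group by key and aggregate, as a Hadoop reducer would
    after the shuffle step."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

totals = reduce_phase(map_phase(records))
print(totals)  # total bytes per cell tower
```

Hadoop's value is that the map and reduce phases run in parallel over data already distributed on HDFS, which is what lets it scale past a single-node RDBMS.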
Top 5 Real User

The primary use is as a data lake.

2020-02-07T02:52:00Z
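A data lake on Hadoop is commonly just raw files on HDFS organized into zones and date partitions. As a sketch of that layout, using the local filesystem in place of HDFS (the zone, dataset, and file names are hypothetical):

```python
import os
import tempfile

# Stand-in for an HDFS data-lake root; a real deployment would use
# hdfs:// paths and write via Spark, Hive, or distcp.
lake_root = tempfile.mkdtemp()

def land_file(zone, dataset, dt, name, payload):
    """Write a raw file under zone/dataset/dt=YYYY-MM-DD/,
    the usual date-partitioned data-lake convention."""
    part_dir = os.path.join(lake_root, zone, dataset, f"dt={dt}")
    os.makedirs(part_dir, exist_ok=True)
    path = os.path.join(part_dir, name)
    with open(path, "w") as f:
        f.write(payload)
    return path

p = land_file("raw", "payments", "2021-09-01", "batch-001.csv",
              "id,amount\n1,9.99\n")
print(p)
```

The `dt=...` partition directories let query engines such as Hive or Spark prune by date instead of scanning the whole lake.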
Top 20 Real User

We are primarily dumping all the prior payment transaction data into the Hadoop system, and then we use some plug-and-play analytics tools to analyze it.

2019-12-16T08:14:00Z
Top 20 Real User

We primarily use the solution for the enterprise data hub and big data warehouse extension.

2019-11-27T05:42:00Z
Real User

The primary use case of this solution is data engineering and data file storage. We use a private, on-premises deployment.

2019-09-29T07:27:00Z
Real User

We primarily use this product to integrate legacy systems.

2019-07-28T07:35:00Z
Real User

We use this solution for our Enterprise Data Lake.

2019-07-16T01:59:00Z
Real User

We use it as a data lake for streaming analytical dashboards.

2018-08-14T07:42:00Z