2015-10-25 12:49:50 UTC

When evaluating Hadoop, what aspect do you think is the most important to look for?



Guest
Real User (Top 10)

It depends... what is your endgame?
These days Hadoop mostly serves as a distributed, clustered file system that specializes in storing very large files. If you are merely interested in writing software for distributed processing, Apache Spark or NVIDIA CUDA is a much better choice. If you are interested in distributed processing of large amounts of data, the common practice is to use Apache Spark to write the code that processes the data and Hadoop for persistent file system storage, as sketched below.
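A minimal sketch of that division of labor, assuming a PySpark environment; the HDFS namenode address and file paths are hypothetical placeholders, not part of the original answer:

```python
# Spark does the distributed processing; Hadoop (HDFS) provides persistent storage.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hdfs-wordcount").getOrCreate()

# Read a large text file stored on HDFS (path is an example only).
lines = spark.read.text("hdfs://namenode:8020/data/logs/access.log")

# Distributed processing in Spark: split lines into words and count them.
counts = (lines.rdd
          .flatMap(lambda row: row.value.split())
          .map(lambda word: (word, 1))
          .reduceByKey(lambda a, b: a + b))

# Write the result back to HDFS for persistent storage.
counts.toDF(["word", "count"]).write.mode("overwrite").parquet(
    "hdfs://namenode:8020/output/word_counts")

spark.stop()
```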

2016-12-31 03:25:54 UTC
Vendor

The enterprise readiness of the distribution.

2015-11-24 23:26:48 UTC