AWS Compute Optimizer vs Apache Spark comparison

Apache Spark: 3,093 views | 2,345 comparisons | 89% willing to recommend
AWS Compute Optimizer: 151 views | 61 comparisons | 100% willing to recommend
Comparison Buyer's Guide
Executive Summary

We performed a comparison between Apache Spark and AWS Compute Optimizer based on real PeerSpot user reviews.

Find out what your peers are saying about Amazon Web Services (AWS), Apache, Zadara and others in Compute Service.
To learn more, read our detailed Compute Service Report (Updated: April 2024).
767,995 professionals have used our research since 2012.
Featured Review
Quotes From Members
We asked business professionals to review the solutions they use.
Here are some excerpts of what they said:
Pros
"It provides a scalable machine learning library.""The memory processing engine is the solution's most valuable aspect. It processes everything extremely fast, and it's in the cluster itself. It acts as a memory engine and is very effective in processing data correctly.""The most valuable feature of this solution is its capacity for processing large amounts of data.""The product’s most valuable features are lazy evaluation and workload distribution.""The most valuable feature of Apache Spark is its memory processing because it processes data over RAM rather than disk, which is much more efficient and fast.""The data processing framework is good.""It is highly scalable, allowing you to efficiently work with extensive datasets that might be problematic to handle using traditional tools that are memory-constrained.""We use it for ETL purposes as well as for implementing the full transformation pipelines."

More Apache Spark Pros →
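Several of these pros describe the same underlying pattern: transformations are lazy, data is kept in memory, and workloads are expressed as transformation pipelines. The following is a minimal PySpark sketch of such an ETL pipeline; the file paths and column names are hypothetical.

```python
# Minimal ETL sketch: transformations are lazy until an action runs, and
# cache() keeps the computed result in memory so later actions avoid recomputation.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

orders = (
    spark.read.option("header", True).csv("/data/orders.csv")  # hypothetical path
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)  # nothing has executed yet: Spark has only built a plan

daily_totals = orders.groupBy("order_date").agg(F.sum("amount").alias("total"))
daily_totals.cache()  # keep the computed result in memory

daily_totals.write.mode("overwrite").parquet("/data/daily_totals")  # action 1
print(daily_totals.count())                                         # action 2, served from cache

spark.stop()
```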

"I find the solution's scaling capability to be an important benefit. You can scale it vertically or horizontally, i.e., you can upgrade the hardware or clone the machine. The solution is also easy to manage and flexible. Additionally, you get some layers of security without paying for it."

More AWS Compute Optimizer Pros →

Cons
"The initial setup was not easy.""This solution currently cannot support or distribute neural network related models, or deep learning related algorithms. We would like this functionality to be developed.""The management tools could use improvement. Some of the debugging tools need some work as well. They need to be more descriptive.""We use big data manager but we cannot use it as conditional data so whenever we're trying to fetch the data, it takes a bit of time.""At times during the deployment process, the tool goes down, making it look less robust. To take care of the issues in the deployment process, users need to do manual interventions occasionally.""We've had problems using a Python process to try to access something in a large volume of data. It crashes if somebody gives me the wrong code because it cannot handle a large volume of data.""At the initial stage, the product provides no container logs to check the activity.""It would be beneficial to enhance Spark's capabilities by incorporating models that utilize features not traditionally present in its framework."

More Apache Spark Cons →

"I have two areas of improvement to comment on. Most of the product names in AWS are not indicative of what they are doing. Moreover, AWS is not organized and you do not have the full platform with you. It is hard to know some AWS services."

More AWS Compute Optimizer Cons →

Pricing and Cost Advice
  • "Since we are using the Apache Spark version, not the data bricks version, it is an Apache license version, the support and resolution of the bug are actually late or delayed. The Apache license is free."
  • "Apache Spark is open-source. You have to pay only when you use any bundled product, such as Cloudera."
  • "We are using the free version of the solution."
  • "Apache Spark is not too cheap. You have to pay for hardware and Cloudera licenses. Of course, there is a solution with open source without Cloudera."
  • "Apache Spark is an expensive solution."
  • "Spark is an open-source solution, so there are no licensing costs."
  • "On the cloud model can be expensive as it requires substantial resources for implementation, covering on-premises hardware, memory, and licensing."
  • "It is an open-source solution, it is free of charge."
  • More Apache Spark Pricing and Cost Advice →

  • "I find the solution's pricing reasonable. You need to pay extra for IP and other miscellaneous costs."
  • More AWS Compute Optimizer Pricing and Cost Advice →

    Questions from the Community
Top Answer: We use Spark to process data from different data sources.
Top Answer: In data analysis, you need to take real-time data from different data sources. You need to process this in a subsecond and do the transformation in a subsecond.
Top Answer: I find the solution's scaling capability to be an important benefit. You can scale it vertically or horizontally, i.e., you can upgrade the hardware or clone the machine. The solution is also easy to… more »
Top Answer: I find the solution's pricing reasonable. You need to pay extra for IP and other miscellaneous costs.
Top Answer: I have two areas of improvement to comment on. Most of the product names in AWS are not indicative of what they are doing. Moreover, AWS is not organized and you do not have the full platform with… more »
    Ranking
Apache Spark: 5th out of 16 in Compute Service | Views: 3,093 | Comparisons: 2,345 | Reviews: 25 | Average Words per Review: 432 | Rating: 8.7
AWS Compute Optimizer: 10th out of 16 in Compute Service | Views: 151 | Comparisons: 61 | Reviews: 1 | Average Words per Review: 463 | Rating: 8.0
    Overview

Spark provides programmers with an application programming interface centered on a data structure called the resilient distributed dataset (RDD), a read-only multiset of data items distributed over a cluster of machines that is maintained in a fault-tolerant way. It was developed in response to limitations in the MapReduce cluster-computing paradigm, which forces a particular linear dataflow structure on distributed programs: MapReduce programs read input data from disk, map a function across the data, reduce the results of the map, and store the reduction results on disk. Spark's RDDs function as a working set for distributed programs that offers a (deliberately) restricted form of distributed shared memory.
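As a small illustration of the RDD model just described, the PySpark sketch below builds a word-count RDD lazily, persists it as an in-memory working set, and reuses it across two actions without the intermediate disk writes MapReduce would require. The input path is hypothetical.

```python
# Sketch of the RDD working-set idea: transformations are lazy, and persist()
# keeps the computed result in cluster memory for reuse by later actions.
from pyspark import SparkContext

sc = SparkContext(appName="rdd-sketch")

lines = sc.textFile("/data/logs.txt")             # hypothetical input path
words = lines.flatMap(lambda line: line.split())  # lazy transformation
counts = words.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)

counts.persist()  # fault-tolerant, in-memory working set

top10 = counts.takeOrdered(10, key=lambda kv: -kv[1])  # action 1 triggers computation
distinct_words = counts.count()                        # action 2 reuses the persisted RDD

print(top10, distinct_words)
sc.stop()
```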

    AWS Compute Optimizer recommends optimal AWS Compute resources for your workloads to reduce costs and improve performance by using machine learning to analyze historical utilization metrics. Over-provisioning compute can lead to unnecessary infrastructure cost and under-provisioning compute can lead to poor application performance. Compute Optimizer helps you choose the optimal Amazon EC2 instance types, including those that are part of an Amazon EC2 Auto Scaling group, based on your utilization data.

By applying knowledge drawn from Amazon’s own experience running diverse workloads in the cloud, Compute Optimizer identifies workload patterns and recommends optimal compute resources. Compute Optimizer analyzes the configuration and resource utilization of your workload to identify dozens of defining characteristics, for example, whether a workload is CPU-intensive, exhibits a daily pattern, or accesses local storage frequently. The service processes these characteristics and identifies the hardware resource headroom required by the workload. Compute Optimizer then infers how the workload would have performed on various hardware platforms (e.g., Amazon EC2 instance types) and offers recommendations.
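For teams that want to pull these recommendations programmatically, the sketch below uses the boto3 "compute-optimizer" client and its get_ec2_instance_recommendations call. Treat it as a minimal sketch: the region is an assumption, and the response field names should be verified against your SDK version.

```python
# Minimal sketch: list Compute Optimizer's EC2 rightsizing recommendations.
# Assumes AWS credentials are configured and the service has recommendations
# available; the region and printed fields are illustrative.
import boto3

client = boto3.client("compute-optimizer", region_name="us-east-1")  # region is an assumption

resp = client.get_ec2_instance_recommendations(maxResults=10)
for rec in resp.get("instanceRecommendations", []):
    current = rec.get("currentInstanceType")
    finding = rec.get("finding")  # e.g. UNDER_PROVISIONED, OVER_PROVISIONED, OPTIMIZED
    options = rec.get("recommendationOptions", [])
    suggested = options[0].get("instanceType") if options else current
    print(f"{rec.get('instanceArn')}: {current} -> {suggested} ({finding})")
```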

    Sample Customers
Apache Spark: NASA JPL, UC Berkeley AMPLab, Amazon, eBay, Yahoo!, UC Santa Cruz, TripAdvisor, Taboola, Agile Lab, Art.com, Baidu, Alibaba Taobao, EURECOM, Hitachi Solutions
AWS Compute Optimizer: Expedia, Intuit, Royal Dutch Shell, Brooks Brothers
    Top Industries
Apache Spark (reviewers): Computer Software Company 30%, Financial Services Firm 15%, University 9%, Marketing Services Firm 6%
Apache Spark (visitors reading reviews): Financial Services Firm 24%, Computer Software Company 13%, Manufacturing Company 7%, Comms Service Provider 6%
AWS Compute Optimizer: No data available
    Company Size
Apache Spark (reviewers): Small Business 40%, Midsize Enterprise 19%, Large Enterprise 40%
Apache Spark (visitors reading reviews): Small Business 17%, Midsize Enterprise 12%, Large Enterprise 71%
AWS Compute Optimizer: No data available

Apache Spark is ranked 5th in Compute Service with 60 reviews, while AWS Compute Optimizer is ranked 10th in Compute Service with 1 review. Apache Spark is rated 8.4, while AWS Compute Optimizer is rated 8.0. The top reviewer of Apache Spark writes "Reliable, able to expand, and handle large amounts of data well". On the other hand, the top reviewer of AWS Compute Optimizer writes "Easy to manage, flexible, and has good scaling options". Apache Spark is most compared with Spring Boot, AWS Batch, Spark SQL, SAP HANA and Cloudera Distribution for Hadoop, whereas no products are currently listed as most compared with AWS Compute Optimizer.

    See our list of best Compute Service vendors.

    We monitor all Compute Service reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.