Spark SQL Room for Improvement

SurjitChoudhury
Data engineer at Cocos pt

In terms of improvement, the main thing that could be enhanced is the stability of Spark SQL. There may be additional features I haven't explored, but the current solution works effectively with databases. I haven't worked extensively with all of its components, so there may be untapped features that could add to the solution's value.

Sahil Taneja
Principal Consultant/Manager at Tenzing

Spark SQL could improve its documentation, which can be unclear at times; clearer documentation would make the product easier to understand.

They could also improve the Spark UI to offer more advanced views of query performance.

Lucas Dreyer
Data Engineer at BBD

It takes some time to get used to this solution compared to pandas, as it has a steep learning curve. You need quite a high level of SQL skill to use it; if SQL is not someone's primary language, they may find it difficult to pick up.

This solution could be improved with a bridge between pandas and Spark SQL, such as translating pandas operations into SQL and then working with the queries that are generated.
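
As a hedged illustration of the kind of bridge the reviewer describes: newer Spark releases ship pyspark.pandas (the pandas API on Spark, formerly Koalas), which lets pandas-style code run as Spark jobs and interoperate with Spark SQL. A minimal sketch, with an illustrative file path and column names:

```python
import pyspark.pandas as ps
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# pandas-style syntax, executed lazily as Spark jobs
psdf = ps.read_csv("/data/sales.csv")  # hypothetical path
monthly = psdf.groupby("month")["revenue"].sum()

# Drop down to plain Spark SQL on the same data
psdf.to_spark().createOrReplaceTempView("sales")
sdf = spark.sql("SELECT month, SUM(revenue) AS total FROM sales GROUP BY month")
```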

In a future release, it would be useful to have a real-time dashboard rather than batch updates to Power BI.

Aria Amini
Data Engineer at Behsazan Mellat

It would be useful if Spark SQL integrated with data visualization tools; for example, we could integrate it with Tableau.

SB
CTO at Dokument IT d.o.o.

I'm using DBeaver to connect Spark with external tools, and I've experienced some incompatibilities when using the Delta Lake format. It is compatible when you're using Databricks in the cloud, but when I'm using Spark on-premises, there are some incompatibility issues. We also expected interactive queries, as with Dremio, to provide better results: we issue a query but see that it runs as a batch process in the background. The documentation is limited as well, especially around setting up the Thrift server.

Mahdi Sharifmousavi
Lecturer at Amirkabir University of Technology

There are many inconsistencies in the syntax for different querying tasks, such as selecting columns and joining two tables, so I'd like to see a more consistent syntax. Notation should be unified across all tasks within Spark SQL.
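
To make the complaint concrete, here is a minimal PySpark sketch (the DataFrames and column names are illustrative) of the interchangeable notations that exist for the same column selection and join:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, expr

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
other = spark.createDataFrame([(1, 10.0)], ["id", "score"])

# Four equivalent ways to select the same column:
df.select("id")
df.select(df["id"])
df.select(col("id"))
df.select(expr("id"))

# Joins show similar variation: a column name, a join expression, or raw SQL.
df.join(other, "id")
df.join(other, df["id"] == other["id"], "inner")
df.createOrReplaceTempView("t1")
other.createOrReplaceTempView("t2")
spark.sql("SELECT * FROM t1 JOIN t2 ON t1.id = t2.id")
```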

KM
Senior Analyst/ Customer Business and Insights Specialist at a tech services company with 501-1,000 employees

It would be beneficial for aggregate functions to include a code block or toolbox that explains their calculations and the conditional statements they support. Multiple functions come within an aggregate, so it is important to understand them. When you are trying something new, it would be easier to get that information within the solution rather than having to search the web.

For example, once you select an aggregate, the solution could tell you what types of functions it can perform and include a code block explaining its calculations; or a conditional statement could offer a second option and explain the other types of statements the solution supports as part of a rule-level function.
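
Some of this help does already exist in the engine itself; a minimal sketch, assuming a running SparkSession, using Spark SQL's built-in DESCRIBE FUNCTION:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Built-in help for an aggregate; EXTENDED adds usage examples
# where the engine provides them.
spark.sql("DESCRIBE FUNCTION EXTENDED avg").show(truncate=False)
```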

SS
Analytics and Reporting Manager at a financial services firm with 1,001-5,000 employees

Anything to improve the GUI would be helpful.

We have experienced a lot of issues, but nothing in the production environment.

DM
Data Analytics Practice head at bse

The service is complex because it combines a lot of different technologies.

The solution needs to include graphing capabilities; adding financial charts would improve it overall.

PK
Cloud Team Leader at TCL

I would like the ability to process data without unnecessary overhead: the same API should handle terabytes of data and a single gigabyte equally well.
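
The DataFrame API is already size-agnostic; most of the overhead comes from cluster scheduling. A minimal sketch (the file path is illustrative) of running the same code in local mode for small workloads:

```python
from pyspark.sql import SparkSession

# local[*] runs the same API on a single machine, avoiding
# cluster scheduling overhead for small datasets.
spark = SparkSession.builder.master("local[*]").getOrCreate()

df = spark.read.parquet("/data/small_extract")  # hypothetical ~1 GB dataset
df.groupBy("country").count().show()
```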

AG
Engineering Manager/Solution architect at a computer software company with 201-500 employees

This solution could be improved by adding monitoring and integration for EMR.

KG
Associate Manager at a consultancy with 501-1,000 employees

There should be better integration with other solutions.

QG
Corporate Sales at a financial services firm with 10,001+ employees

Being a new user, I have not been able to work out how to partition data correctly; I probably need more information or training. In other database solutions you can easily optimize all partitions, but I haven't found a quick way to do that in Spark SQL. It would be good if you didn't need to partition manually and the system automatically partitioned in the best way. They could also provide more educational resources for new users.
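
For what it's worth, Spark 3.x's Adaptive Query Execution goes part of the way toward automatic partitioning by coalescing shuffle partitions at runtime. A minimal sketch (paths and column names are illustrative) of the automatic and explicit options:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    # Adaptive Query Execution coalesces shuffle partitions automatically.
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

df = spark.read.parquet("/data/events")  # hypothetical path

# Explicit control: repartition in memory, or partition on disk by a column.
df = df.repartition(200, "event_date")
df.write.partitionBy("event_date").mode("overwrite").parquet("/data/events_by_date")
```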

it_user986637
Project Manager - Senior Software Engineer at a tech services company with 11-50 employees

In the next release, it would help to add visualization for some of the command-line features.
