Apache Airflow Primary Use Case

Sudhir Kumar Rathlavath - PeerSpot reviewer
Student at University of South Florida

Apache Airflow is like a freeway. Just as a freeway allows cars to travel quickly and efficiently from one point to another, Apache Airflow allows data engineers to orchestrate their workflows in a similarly efficient way.

There are a lot of scheduling tools on the market, but Apache Airflow has taken over everything. With the help of Airflow operators, any task required for day-to-day data engineering work becomes possible. It manages the entire lifecycle of data engineering workflows.
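For illustration, here is a minimal sketch of the kind of DAG this review describes, chaining two stock operators on a daily schedule. The DAG ID, task names, and shell command are hypothetical, and it assumes a recent Airflow 2.x install.

```python
# Minimal illustrative DAG: two built-in operators chained on a daily schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def report_status():
    # Placeholder for a day-to-day Python step.
    print("pipeline step completed")


with DAG(
    dag_id="daily_engineering_tasks",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow handles scheduling and retries
    catchup=False,
) as dag:
    pull_files = BashOperator(
        task_id="pull_files",
        bash_command="echo 'pulling source files'",  # stand-in for a real command
    )
    notify = PythonOperator(task_id="notify", python_callable=report_status)

    pull_files >> notify  # operators define the order of the workflow
```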

FB
Product Owner at La Poste S.A.
Our use cases are a bit complex, but they are primarily data extraction, transformation, and loading (ETL) tasks.
Damian Bukowski - PeerSpot reviewer
Python Programmer at Santander Bank Polska

In my company, we use Apache Airflow as an orchestrator because we have a lot of business use cases that involve automating people's jobs. For example, someone might take a file and move it from one folder to another, and we have a lot of scripts that do this in PL/SQL or bash pipelines. We decided to move all of this to be orchestrated through one hub application. Instead of having some things running on the Oracle database and others running on local machines, we wanted everything orchestrated through one tool, which is why we chose Apache Airflow.
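As a rough sketch of that pattern, the shell logic that used to live in ad hoc scripts can be wrapped in a scheduled, retried Airflow task; the folder paths, schedule, and DAG name below are hypothetical placeholders.

```python
# Sketch: an hourly DAG that moves incoming files, replacing a hand-run script.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="move_incoming_files",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    move_files = BashOperator(
        task_id="move_files",
        # The same shell step previously run by cron or by hand, now
        # scheduled, logged, and retried from one central hub.
        bash_command="mv /data/incoming/*.csv /data/processed/ || true",
        retries=2,
    )
```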

Ravan Nannapaneni - PeerSpot reviewer
Senior Lead Engineer at Oliver Wyman

Apache Airflow is utilized for automating data engineering tasks. When creating a sequence of tasks, Airflow can assist in automating them.

Punit_Shah - PeerSpot reviewer
Director at Smart Analytica

We utilize Apache Airflow for two primary purposes. Firstly, it serves as the tool for ingesting data from the source system application into our data warehouse. Secondly, it plays a crucial role in our ETL pipeline. After extracting data, it facilitates the transformation process and subsequently loads the transformed data into the designated target tables.
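A minimal sketch of that extract-transform-load flow using Airflow's TaskFlow API is shown below; the DAG name, sample rows, and target table are hypothetical.

```python
# Illustrative ETL DAG: extract from the source system, transform, load to a target table.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def warehouse_etl():
    @task
    def extract():
        # Pull rows from the source application (placeholder data).
        return [{"id": 1, "amount": 100}]

    @task
    def transform(rows):
        # Apply the business transformations.
        return [{**r, "amount_usd": r["amount"] / 100} for r in rows]

    @task
    def load(rows):
        # Load into the designated target table (placeholder name).
        print(f"loading {len(rows)} rows into analytics.fact_payments")

    load(transform(extract()))


warehouse_etl()
```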

UjjwalGupta - PeerSpot reviewer
Module Lead at Mphasis

The main use case is orchestration. We use it to schedule our jobs.

SabinaZeynalova - PeerSpot reviewer
Data Engineer Team Lead at Unibank

We use Apache Airflow for the automation and orchestration of model deployment, training, and feature engineering steps. It is a model lifecycle management tool.
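A minimal sketch of such a lifecycle DAG, with the three stages wired in order, might look like the following; the callables are placeholders for the team's actual feature engineering, training, and deployment code.

```python
# Illustrative model-lifecycle DAG: feature engineering >> training >> deployment.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_features():
    print("building feature tables")            # placeholder


def train_model():
    print("training the model")                 # placeholder


def deploy_model():
    print("publishing the model for serving")   # placeholder


with DAG(
    dag_id="model_lifecycle",                   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@weekly",
    catchup=False,
) as dag:
    features = PythonOperator(task_id="feature_engineering", python_callable=build_features)
    train = PythonOperator(task_id="train", python_callable=train_model)
    deploy = PythonOperator(task_id="deploy", python_callable=deploy_model)

    features >> train >> deploy
```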

PA
Senior Data Engineer at a tech services company with 1,001-5,000 employees

Apache Airflow is useful for workflow automation; it can automate data pipelines and data warehouse processes. I don't have a strong need for Apache Airflow, though, because I do everything with dbt (data build tool), since it has its own integrated workflow process.

I use Fivetran to synchronize my data. I don't need to do any automation on that and don't have any need for workflow automation. I have everything I need.

Miodrag Milojevic - PeerSpot reviewer
Senior Data Architect at Yettel

It serves as a versatile tool for data ingestion and preparation, supporting tasks such as transforming data from one type or format to another, format conversion, and other related processing operations.
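As one concrete example of such a conversion, a single task can rewrite a CSV file as Parquet; the file paths are hypothetical and the sketch assumes pandas and pyarrow are installed on the workers.

```python
# Sketch of a format-conversion task: CSV in, Parquet out.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def csv_to_parquet():
    import pandas as pd  # imported inside the task to keep DAG parsing light

    df = pd.read_csv("/data/raw/events.csv")        # hypothetical input path
    df.to_parquet("/data/curated/events.parquet")   # hypothetical output path


with DAG(
    dag_id="convert_event_files",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="csv_to_parquet", python_callable=csv_to_parquet)
```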

MW
Analytics Solution Manager at Telekom Malaysia

There are a few use cases we have for Apache Airflow, one being government projects where we perform data operations on a monthly basis. For example, we collect data from various agencies, harmonize the data, and then produce a dashboard. In general, it's a BI use case, but focused on socio-economic data.

We concentrate mainly on BI, and because my team members have strong technical backgrounds, we often fall back on open source tools like Airflow and our own coded solutions.

For a single project, we will typically have three of us working on Airflow at a time. This includes two data engineers and a system administrator. Our infrastructure model is hybrid, based both in the cloud and on-premises. 
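A minimal sketch of the monthly pattern described in this review, collecting from several agencies, harmonizing, then refreshing the dashboard, could look like this; the agency names and callables are hypothetical.

```python
# Illustrative monthly BI DAG: collect per agency, harmonize, refresh the dashboard.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def collect(agency):
    print(f"collecting monthly extract from {agency}")   # placeholder


def harmonize():
    print("harmonizing agency datasets into one model")  # placeholder


def refresh_dashboard():
    print("refreshing the BI dashboard")                 # placeholder


with DAG(
    dag_id="monthly_socioeconomic_report",               # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@monthly",
    catchup=False,
) as dag:
    harmonize_task = PythonOperator(task_id="harmonize", python_callable=harmonize)
    dashboard = PythonOperator(task_id="refresh_dashboard", python_callable=refresh_dashboard)

    for agency in ["agency_a", "agency_b", "agency_c"]:   # hypothetical sources
        collect_task = PythonOperator(
            task_id=f"collect_{agency}",
            python_callable=collect,
            op_kwargs={"agency": agency},
        )
        collect_task >> harmonize_task

    harmonize_task >> dashboard
```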

Mikalai Surta - PeerSpot reviewer
Head of Big Data Department at IBA Group

We use Apache Airflow for the orchestration of data pipelines.

AS
Associate Data Engineer at an outsourcing company with 201-500 employees

We were using Apache Airflow for our orchestration needs. We used it for all the jobs that we had created in Databricks, Fivetran, or dbt, which were our three primary tools (there were a few others). Apache Airflow handled job orchestration and connected those jobs to each other to build our entire data pipeline. We were also using Apache Airflow for dbt CI/CD purposes.
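A rough sketch of that kind of multi-tool orchestration is below, assuming the apache-airflow-providers-databricks package is installed, a Databricks connection is configured, and dbt is available on the worker; the connection ID, job ID, and project path are hypothetical (a Fivetran sync could be wired in similarly via its provider operator, if installed).

```python
# Sketch: one DAG orchestrating a Databricks job followed by dbt run/test.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="platform_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    databricks_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",  # hypothetical connection ID
        job_id=12345,                             # hypothetical Databricks job ID
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run",   # hypothetical project path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test",
    )

    databricks_job >> dbt_run >> dbt_test
```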

Nomena NY HOAVY - PeerSpot reviewer
Lead Data Scientist at MVola

Currently, I am a lead data scientist. Our primary use cases for Apache Airflow cover all orchestration, from the basic big data lake to machine learning predictions; it is used for all the ML processes. It is also used for some ELT work: transforming, loading, and exporting big data across restricted, unrestricted, and all phases of processing.

AT
Lead of Monitoring Tech at an educational organization with 1,001-5,000 employees

We use Apache Airflow to send our data to a third-party system.
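One common way to do this kind of hand-off is an HTTP push; the sketch below assumes the apache-airflow-providers-http package and an Airflow connection named "partner_api", and the endpoint and payload are hypothetical.

```python
# Sketch: push a daily extract to a third-party API over HTTP.
from datetime import datetime

from airflow import DAG
from airflow.providers.http.operators.http import SimpleHttpOperator

with DAG(
    dag_id="export_to_partner",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    SimpleHttpOperator(
        task_id="post_daily_extract",
        http_conn_id="partner_api",                      # hypothetical connection
        endpoint="v1/ingest",                            # hypothetical endpoint
        method="POST",
        data='{"date": "{{ ds }}"}',                     # templated logical date
        headers={"Content-Type": "application/json"},
    )
```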

Fadi Bathish - PeerSpot reviewer
Project Manager at Siren Analytics

We use this solution to monitor BD tasks.

JR
Senior Software Engineer at a pharma/biotech company with 1,001-5,000 employees

I'm a data engineer. In the past, I used Airflow for building data pipelines and populating data warehouses. At my current company, it's a data product, or datasets, that we sell to biopharma companies.

We are using those pipelines to generate those datasets.

Luiz Cesar Gosi - PeerSpot reviewer
Senior Analytics Engineer at TalkDesk

We use Apache Airflow for data orchestration.

YS
Software engineer at Naver Corp

My team works on commerce services. We use Airflow to synchronize user information or product information from other services, and we use the tool for automating data pipelines. We store user history about API calls and show it on a statistics page, with daily or real-time statistics. We use the solution to aggregate API users' data.
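A minimal sketch of the daily roll-up described here, with a hypothetical stats table, could be a single TaskFlow task keyed on the run's logical date.

```python
# Sketch: aggregate API-call history for each day into a statistics table.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_usage_stats():
    @task
    def aggregate_daily_calls():
        from airflow.operators.python import get_current_context

        ds = get_current_context()["ds"]  # logical date of this run
        # Placeholder: roll up the day's API-call history into a stats table.
        print(f"aggregating API call history for {ds} into stats.daily_usage")

    aggregate_daily_calls()


api_usage_stats()
```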

Mahendra Prajapati - PeerSpot reviewer
Senior Data Analytics at a media company with 1,001-5,000 employees

Our primary use case for this solution is scheduling tasks. We capture the data from the SQL Server location and migrate it to the central data warehouse.
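A rough sketch of that capture-and-migrate step, assuming the apache-airflow-providers-microsoft-mssql package and a connection named "mssql_source" are configured, could look like this; the query, table, and load step are placeholders.

```python
# Sketch: pull rows from SQL Server, then hand them to the warehouse load step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def copy_to_warehouse():
    from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook

    source = MsSqlHook(mssql_conn_id="mssql_source")          # hypothetical connection
    rows = source.get_records("SELECT id, amount, created_at FROM dbo.orders")
    # In a real pipeline these rows would be bulk-loaded into the central
    # data warehouse; here we only report the count.
    print(f"fetched {len(rows)} rows for the central warehouse")


with DAG(
    dag_id="mssql_to_warehouse",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="copy_orders", python_callable=copy_to_warehouse)
```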

Anandhavelu Arumugam - PeerSpot reviewer
Technical Lead at a media company with 5,001-10,000 employees

I use this solution for scheduling purposes. We have our own Python framework to run jobs, do the extractions, and handle transformation and loading.
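Wrapping an in-house framework like that in scheduled tasks is typically just a pair of PythonOperators; the sketch below uses hypothetical stand-ins for the framework's functions.

```python
# Sketch: schedule an in-house Python framework's extract and transform/load steps.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_extraction():
    # from my_framework import extract          # hypothetical in-house library
    print("running extraction via the in-house framework")


def run_transform_load():
    # from my_framework import transform_and_load
    print("running transformation and loading via the in-house framework")


with DAG(
    dag_id="framework_jobs",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=run_extraction)
    transform_load = PythonOperator(task_id="transform_load", python_callable=run_transform_load)

    extract >> transform_load
```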

We have 20 people who are using Airflow. It's being used on a daily basis. We don't have any plans to increase usage because we have low data sets.

The solution is deployed on cloud. The cloud provider is Azure.

Joaquin Marques - PeerSpot reviewer
CEO - Founder / Principal Data Scientist / Principal AI Architect at Kanayma LLC

Our primary use case for the solution is setting up workflows and processes, and it applies everywhere because most industries are based on workflows and processes. We've deployed it for all kinds of workflows within the organization.

SG
Engineering Manager - OTT Platform at Amagi

We are a media and entertainment technology company. We are using Apache Airflow for architecting our media workflows. We are using it for two major workflows.

We have had it set up for some time on our own cloud. Recently, we migrated the setup to AWS.

JP
Senior Solutions Architect/ Software Architect at a comms service provider with 51-200 employees

We normally use the solution to create specific flows for data transformation. We have several pipelines, and because they're pretty well-defined, we use Airflow in conjunction with other tools that handle the mediation portion. With Airflow, we do the processing of that data.

AN
Solution Architect at EPAM Systems

The primary use case for this solution is to automate ETL processes for the data warehouse.

AJ
Associate Director - Technologies at a tech services company with 51-200 employees

Our primary use case is to integrate with SLAs.

CP
Business Consultant - Digitalization, Business Development, BPM, Technology & Innovation at a consultancy with 51-200 employees

We mainly used the solution in banking, finance, and insurance. We are looking for some opportunities in production companies, but this is only at the very early stages.
