Managed Airflow
Configure Amazon Managed Workflows for Apache Airflow (Amazon MWAA) by following the steps below.

Step 1: Create an S3 Bucket

Create an S3 bucket by following the steps here.
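The bucket can also be created programmatically. Below is a minimal sketch using boto3 (the bucket name and region are placeholders); note that MWAA requires the bucket to block all public access, and versioning is needed later for plugins.zip and requirements.txt:

```python
import boto3

BUCKET = "my-mwaa-dags-bucket"  # hypothetical name; use your own
REGION = "us-west-2"

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket (omit CreateBucketConfiguration when REGION is us-east-1).
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# MWAA requires the bucket to block all public access.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Enable versioning, needed in Step 2 for plugins.zip and requirements.txt.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)
```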

Step 2: Upload DAGs to the S3 Bucket

Package and upload your DAG (Directed Acyclic Graph) code to Amazon S3. Amazon MWAA loads this code into Airflow.
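A DAG is ordinary Python code. For reference, here is a minimal sketch of a file you might place in the DAGs folder (the dag_id and task are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello from Amazon MWAA!")


# A minimal DAG with a single Python task, scheduled once per day.
with DAG(
    dag_id="hello_mwaa",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    hello = PythonOperator(task_id="say_hello", python_callable=say_hello)
```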
(Screenshot: S3 objects for Airflow configuration)
Make sure versioning is enabled on your Amazon S3 bucket, since MWAA requires the S3 object versions of the custom plugins in plugins.zip and the Python dependencies in requirements.txt.
Refer to the Amazon documentation on DAGs for more details.
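As one way to do the upload, here is a hedged boto3 sketch that uploads a DAG file, plugins.zip, and requirements.txt, and prints the S3 object versions you will need in Step 3 (the bucket name and local paths are placeholders):

```python
import boto3

BUCKET = "my-mwaa-dags-bucket"  # hypothetical; the bucket from Step 1

s3 = boto3.client("s3")


def upload(local_path: str, key: str) -> str:
    """Upload a file and return its S3 object version ID."""
    with open(local_path, "rb") as f:
        response = s3.put_object(Bucket=BUCKET, Key=key, Body=f)
    # VersionId is present because versioning is enabled on the bucket.
    return response["VersionId"]


# DAG code goes under the DAGs folder you will reference in Step 3.
upload("dags/hello_mwaa.py", "dags/hello_mwaa.py")

# Record these versions; MWAA asks for them when the files are specified.
print("plugins.zip version:", upload("plugins.zip", "plugins.zip"))
print("requirements.txt version:", upload("requirements.txt", "requirements.txt"))
```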

Step 3: Configure Airflow Environment

Navigate to DevOps > Analytics > Airflow.
Provide the required information, such as the Airflow version, name, S3 bucket, and DAGs folder location.
If plugins.zip and requirements.txt were specified while adding the Airflow environment, you must also provide the S3 versions of these files.
You can also enable logging while creating the Airflow environment.
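DuploCloud drives this setup from the portal. For reference, the underlying AWS MWAA API call looks roughly like the following boto3 sketch; the names, ARNs, versions, and network settings are placeholders, and DuploCloud normally supplies the role and networking for you:

```python
import boto3

mwaa = boto3.client("mwaa", region_name="us-west-2")

mwaa.create_environment(
    Name="my-airflow-env",                       # hypothetical environment name
    AirflowVersion="2.7.2",                      # the Airflow version you chose
    SourceBucketArn="arn:aws:s3:::my-mwaa-dags-bucket",
    DagS3Path="dags",                            # DAGs folder location in the bucket
    PluginsS3Path="plugins.zip",
    PluginsS3ObjectVersion="<plugins-version>",      # S3 versions recorded in Step 2
    RequirementsS3Path="requirements.txt",
    RequirementsS3ObjectVersion="<requirements-version>",
    ExecutionRoleArn="arn:aws:iam::123456789012:role/mwaa-execution-role",
    NetworkConfiguration={
        "SubnetIds": ["subnet-aaaa", "subnet-bbbb"],
        "SecurityGroupIds": ["sg-cccc"],
    },
    # Optional: enable task logs, mirroring the portal's logging toggle.
    LoggingConfiguration={
        "TaskLogs": {"Enabled": True, "LogLevel": "INFO"},
    },
)
```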

View Airflow Environment

You can view Airflow environment details from the DuploCloud Portal, and open the Airflow environment's web UI by clicking the Webserver URL.
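If you prefer the API, a short boto3 sketch can fetch the environment's status and Webserver URL (the environment name is a placeholder):

```python
import boto3

mwaa = boto3.client("mwaa", region_name="us-west-2")

env = mwaa.get_environment(Name="my-airflow-env")["Environment"]
print("Status:       ", env["Status"])        # e.g. CREATING, AVAILABLE
print("Webserver URL:", env["WebserverUrl"])  # opens the Airflow web UI
```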