> Apache Airflow DAGs
Orchestrate data pipelines with Airflow. TaskFlow API, sensors, XCom, and scheduling.
# Airflow DAGs
## TaskFlow API
```python
from airflow.decorators import dag, task
from datetime import datetime

@dag(schedule='@daily', start_date=datetime(2024, 1, 1), catchup=False)
def etl():
    @task()
    def extract() -> dict:
        return fetch_data()

    @task()
    def transform(data: dict) -> dict:
        return clean(data)

    @task()
    def load(data: dict):
        write_to_db(data)

    load(transform(extract()))

etl()
```
## Classic DAG
```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator

with DAG('pipeline', schedule='0 6 * * *',
         start_date=datetime(2024, 1, 1), catchup=False,
         default_args={'retries': 2}) as dag:
    t1 = PythonOperator(task_id='extract', python_callable=extract)
    t2 = BashOperator(task_id='transform', bash_command='python transform.py')
    t1 >> t2
```
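The `t1 >> t2` chaining works because operators overload `__rshift__` to register downstream dependencies. A toy sketch of the idea (this `Task` class is a hypothetical illustration, not Airflow's actual implementation):

```python
class Task:
    """Toy stand-in for an Airflow operator, to illustrate >> chaining."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []

    def __rshift__(self, other):
        self.downstream.append(other)  # t1 >> t2: run t2 after t1
        return other                   # returning `other` lets t1 >> t2 >> t3 chain

t1, t2, t3 = Task('extract'), Task('transform'), Task('load')
t1 >> t2 >> t3
print([t.task_id for t in t1.downstream])  # ['transform']
print([t.task_id for t in t2.downstream])  # ['load']
```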
- **Sensors** (`FileSensor`, `HttpSensor`): block downstream tasks until an external condition is met
- **XCom** (cross-task communication): `ti.xcom_push(key='count', value=42)` / `ti.xcom_pull(task_ids='extract')`
- **CLI**: `airflow dags trigger <dag_id>` to queue a run; `airflow tasks test <dag_id> <task_id> [<date>]` to run a single task without the scheduler
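A sensor is essentially a retry loop around a boolean "poke" check. A minimal pure-Python sketch of that loop, simulating what a `FileSensor` does (no Airflow required; `poke_until` is a hypothetical helper, not an Airflow API):

```python
import os
import tempfile
import time

def poke_until(condition, poke_interval=0.01, timeout=1.0):
    """Re-check `condition` every `poke_interval` seconds until it returns
    True or `timeout` elapses -- the core loop a sensor runs in poke mode."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poke_interval)
    return False  # in Airflow this would raise a sensor timeout

# Simulate a FileSensor: create the file, then poke for its existence.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, 'data.csv')
    open(path, 'w').close()
    print(poke_until(lambda: os.path.exists(path)))  # True
```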
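The XCom calls above run against a `TaskInstance` (`ti`) inside a task's execution context. To show just the key/value semantics, here is a hypothetical in-memory stand-in (real XComs are persisted in Airflow's metadata database, keyed by DAG, task, and run):

```python
# Hypothetical stand-in for the TaskInstance XCom API -- illustration only.
class FakeTaskInstance:
    def __init__(self):
        self._xcoms = {}  # (task_id, key) -> value

    def xcom_push(self, key, value, task_id='extract'):
        self._xcoms[(task_id, key)] = value

    def xcom_pull(self, task_ids, key='return_value'):
        # 'return_value' is the key Airflow uses for a task's return value
        return self._xcoms.get((task_ids, key))

ti = FakeTaskInstance()
ti.xcom_push(key='count', value=42)                   # inside the 'extract' task
print(ti.xcom_pull(task_ids='extract', key='count'))  # 42
```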