Apache Airflow

Streamline ML Workflows with Apache Airflow Orchestration in ZenML


Seamlessly integrate the robustness of Apache Airflow with the ML-centric capabilities of ZenML pipelines. This powerful combination simplifies the orchestration of complex machine learning workflows, enabling data scientists and engineers to focus on building high-quality models while leveraging Airflow's proven production-grade features.

Features with ZenML

  • Native execution of ZenML pipelines as Airflow DAGs
  • Simplified management of complex ML workflows
  • Enhanced efficiency and scalability for MLOps pipelines
  • Compatibility with both local and remote Airflow deployments
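Before running a pipeline on Airflow, the orchestrator has to be registered as part of a ZenML stack. A minimal sketch of that setup with the ZenML CLI is shown below; the component names (`airflow_orchestrator`, `airflow_stack`) are placeholders, and the commands assume ZenML is already installed and initialized in your project.

```shell
# Install the Airflow integration (pulls in the required Airflow dependencies)
zenml integration install airflow

# Register an orchestrator with the Airflow flavor
# ("airflow_orchestrator" is an arbitrary placeholder name)
zenml orchestrator register airflow_orchestrator --flavor=airflow

# Assemble a stack using that orchestrator and the default artifact store,
# then activate it so subsequent pipeline runs use Airflow
zenml stack register airflow_stack -o airflow_orchestrator -a default --set
```

With the stack active, running the pipeline script below hands execution over to Airflow instead of the local orchestrator.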

Apache Airflow integration screenshot

Main Features

  • Robust workflow orchestration for data pipelines
  • Extensive library of pre-built operators and sensors
  • Intuitive web-based user interface for monitoring and managing workflows
  • Scalable architecture for running workflows on distributed systems
  • Strong focus on extensibility, allowing custom plugins and operators

How to use ZenML with Apache Airflow


from zenml import step, pipeline
from zenml.integrations.airflow.flavors.airflow_orchestrator_flavor import AirflowOrchestratorSettings

@step
def my_step():
    print("Running in Airflow!")

# Configure how the orchestrator runs each step; here, inside a Docker container.
airflow_settings = AirflowOrchestratorSettings(
    operator="airflow.providers.docker.operators.docker.DockerOperator",
    operator_args={},
)

@pipeline(settings={"orchestrator.airflow": airflow_settings})
def my_airflow_pipeline():
    my_step()

if __name__ == "__main__":
    my_airflow_pipeline()

Additional Resources

Connect Your ML Pipelines to a World of Tools

Expand your ML pipelines with more than 50 ZenML Integrations

  • Amazon S3
  • Argilla
  • AutoGen
  • AWS
  • AWS Strands
  • Azure Blob Storage
  • Azure Container Registry
  • AzureML Pipelines
  • BentoML
  • Comet
  • CrewAI