Docker

Effortlessly Run ZenML Pipelines in Isolated Docker Containers

Integrate ZenML with Docker to execute your ML pipelines in isolated environments locally. This integration simplifies debugging and ensures consistent execution across different systems.

Features with ZenML

  • Isolated Pipeline Execution: Run each step of your ZenML pipeline in a separate Docker container, ensuring isolation and reproducibility.
  • Local Debugging: Debug issues that occur when running pipelines in Docker containers without the need for remote infrastructure.
  • Consistent Environments: Maintain consistent execution environments across different systems by leveraging Docker containers.
  • Easy Setup: Seamlessly integrate Docker with ZenML using the built-in local Docker orchestrator.
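Setup comes down to registering the built-in local Docker orchestrator and activating a stack that uses it. A minimal CLI sketch (the component and stack names `docker_orchestrator` and `docker_stack` are placeholders; the artifact store is assumed to be the default local one):

```shell
# Register the built-in local Docker orchestrator flavor
zenml orchestrator register docker_orchestrator --flavor=local_docker

# Create a stack that pairs it with the default artifact store
zenml stack register docker_stack -o docker_orchestrator -a default

# Activate the stack so subsequent pipeline runs execute in Docker
zenml stack set docker_stack
```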

Main Features

  • Containerization of applications
  • Isolation of processes and dependencies
  • Portability across different systems
  • Efficient resource utilization
  • Reproducibility of environments

How to use ZenML with Docker


from zenml import step, pipeline
from zenml.orchestrators.local_docker.local_docker_orchestrator import (
    LocalDockerOrchestratorSettings,
)

@step
def preprocess_data() -> dict:
    # Placeholder preprocessing logic: load and clean your data here
    return {"features": [[0.0, 1.0], [1.0, 0.0]], "labels": [0, 1]}

@step
def train_model(data: dict) -> None:
    # Placeholder training logic: fit a model on the preprocessed data
    pass

# run_args are forwarded to the Docker client when step containers are started
settings = {
    "orchestrator.local_docker": LocalDockerOrchestratorSettings(
        run_args={"cpu_count": 2}
    )
}

@pipeline(settings=settings)
def ml_pipeline():
    data = preprocess_data()
    train_model(data)

if __name__ == "__main__":
    ml_pipeline()
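With a Docker-backed stack active, the script runs like any other Python program, and the containers it spawns can be inspected with standard Docker tooling. A sketch (the filename `ml_pipeline.py` is assumed):

```shell
# Run the pipeline; each step executes in its own container
python ml_pipeline.py

# List recently exited containers to confirm the steps ran in Docker
docker ps -a --filter "status=exited" --format "{{.Image}}\t{{.Names}}"
```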
