Deploy Scalable, Production-Ready ML Models with Databricks and ZenML

Integrate Databricks Model Serving with ZenML to deploy and serve AI models effortlessly. This integration provides a unified interface to deploy, govern, and query models, leveraging Databricks' managed infrastructure for scalability and enterprise security.

Features with ZenML

  • Seamless model deployment to Databricks Inference Endpoints directly from ZenML pipelines
  • Switch between MLflow and Databricks Model Deployers without changing pipeline code
  • Secure model deployment into VPC-accessible endpoints for enterprise security
  • Scale model serving with dedicated, autoscaling infrastructure managed by Databricks
  • Turn models into production-ready APIs with minimal infrastructure or MLOps overhead

Main Features

  • Unified interface to deploy, govern, and query models
  • Dedicated and autoscaling infrastructure for model serving
  • Secure model deployment into VPC-accessible endpoints
  • Support for a variety of workload sizes and types (CPU, GPU)
  • Integration with Databricks Model Registry for model versioning and management
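
Once an endpoint is live, Databricks Model Serving accepts JSON request bodies in formats such as `dataframe_records` and `dataframe_split`. As a rough sketch (the exact input schema depends on how your model was logged), a small helper that builds a `dataframe_split` payload from a list of records might look like this:

```python
import json


def build_dataframe_split_payload(records: list[dict]) -> str:
    """Convert a list of {column: value} records into the JSON body
    expected by a serving endpoint's `dataframe_split` input format."""
    if not records:
        raise ValueError("records must be non-empty")
    columns = list(records[0].keys())
    data = [[record[col] for col in columns] for record in records]
    return json.dumps({"dataframe_split": {"columns": columns, "data": data}})


# Example: two feature rows with hypothetical column names.
payload = build_dataframe_split_payload(
    [{"age": 42, "income": 55000.0}, {"age": 35, "income": 61000.0}]
)
```

Column names and feature values here are purely illustrative; substitute the schema your model was trained on.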

How to use ZenML with Databricks

from typing import Optional

from typing_extensions import Annotated

from zenml import ArtifactConfig, get_step_context, pipeline, step
from zenml.client import Client
from zenml.integrations.databricks.services.databricks_deployment import (
    DatabricksDeploymentConfig,
    DatabricksDeploymentService,
)
from zenml.logger import get_logger

logger = get_logger(__name__)


@step(enable_cache=False)
def deployment_deploy() -> (
    Annotated[
        Optional[DatabricksDeploymentService],
        ArtifactConfig(
            name="databricks_deployment", is_deployment_artifact=True
        ),
    ]
):
    # Deploy the model version attached to the current ZenML model context
    # as a Databricks Model Serving endpoint.
    model = get_step_context().model
    zenml_client = Client()
    model_deployer = zenml_client.active_stack.model_deployer
    databricks_deployment_config = DatabricksDeploymentConfig(
        model_name=model.name,
        model_version=model.run_metadata["model_registry_version"].value,
        workload_size="Small",
        workload_type="CPU",
        scale_to_zero_enabled=True,
        endpoint_secret_name="databricks_token",
    )
    deployment_service = model_deployer.deploy_model(
        config=databricks_deployment_config,
        service_type=DatabricksDeploymentService.SERVICE_TYPE,
        timeout=1200,
    )
    logger.info(
        f"The deployed service info: {model_deployer.get_model_server_info(deployment_service)}"
    )
    return deployment_service


# notify_on_failure and notify_on_success are user-defined notification steps.
@pipeline(on_failure=notify_on_failure)
def databricks_deploy_pipeline():
    deployment_deploy()
    notify_on_success(after=["deployment_deploy"])


databricks_deploy_pipeline()

This code example demonstrates deploying a model to a Databricks Model Serving endpoint from a ZenML pipeline. The deployment_deploy step reads the model attached to the current pipeline context, fetches the model deployer from the active stack, and builds a DatabricksDeploymentConfig specifying the model name, registry version, workload size and type, and scale-to-zero behavior. It then calls deploy_model to provision the endpoint and returns the resulting service as a deployment artifact. The pipeline runs the deployment step and sends success or failure notifications via user-defined notifier steps.
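
After deployment, the endpoint can be queried directly over HTTPS using a Databricks access token, following the Model Serving REST convention `/serving-endpoints/<name>/invocations`. The workspace URL, endpoint name, and token below are placeholders for illustration; this sketch builds the authenticated request without sending it:

```python
import json
import urllib.request

# Hypothetical workspace URL and endpoint name, for illustration only.
WORKSPACE_URL = "https://my-workspace.cloud.databricks.com"
ENDPOINT_NAME = "databricks_deployment"


def build_invocation_request(token: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) an authenticated POST request against the
    serving endpoint's invocations URL."""
    url = f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_invocation_request(
    "dapi-example-token",
    {"dataframe_records": [{"feature": 1.0}]},
)
```

Sending the request with `urllib.request.urlopen(req)` would return the model's predictions as JSON.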

Additional Resources
GitHub: ZenML Databricks Integration Example
Read the Databricks Model Deployer integration documentation
Databricks Model Serving documentation

