
Google Cloud Storage (GCS)

Seamlessly store your pipeline step outputs with Google Cloud Storage (GCS)

Integrate Google Cloud Storage (GCS) with ZenML to leverage a scalable and reliable artifact store for your ML workflows. This integration enables you to store and share pipeline artifacts, making it ideal for collaboration, remote execution, and production-grade MLOps.

Features with ZenML

  • Seamlessly store and retrieve ZenML pipeline artifacts using GCS (see the sketch after this list).
  • Share artifacts across teams and enable remote pipeline execution.
  • Scale storage effortlessly to handle production-grade ML workflows.
  • Ensure secure access to artifacts with GCP authentication methods.
  • Easily integrate GCS with other ZenML stack components.
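
For a quick check that artifacts really flow through GCS, the sketch below reads the active stack's artifact store path and writes a file through ZenML's storage-agnostic fileio API. This is a minimal sketch: it assumes the gcp integration is installed and a stack with a GCS artifact store is already active, and the file name hello.txt is purely illustrative.

from zenml.client import Client
from zenml.io import fileio

# Path of the active stack's artifact store, e.g. "gs://my-bucket"
store_path = Client().active_stack.artifact_store.path

# fileio resolves gs:// URIs once the gcp integration is installed
with fileio.open(f"{store_path}/hello.txt", "w") as f:
    f.write("stored in GCS")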

Main Features

  • Scalable and durable object storage for any amount of data
  • Highly available and performant storage infrastructure
  • Secure access control and encryption for data protection
  • Seamless integration with other GCP services and tools (see the sketch after this list)
  • Cost-effective storage with flexible pricing options
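
Because the artifact store is a plain GCS bucket, the same objects are also reachable with the official google-cloud-storage client, independently of ZenML. A minimal sketch, assuming a bucket named my-bucket and default GCP credentials:

from google.cloud import storage

# Assumes GOOGLE_APPLICATION_CREDENTIALS or gcloud auth is configured
client = storage.Client()
bucket = client.bucket("my-bucket")  # hypothetical bucket name

# Upload a small object and read it back
blob = bucket.blob("artifacts/example.txt")
blob.upload_from_string("hello from GCS")
print(blob.download_as_text())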

How to use ZenML with Google Cloud Storage (GCS)
zenml integration install gcp
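# Register a GCS artifact store and a stack that uses it
# (store, stack, and bucket names below are illustrative)
zenml artifact-store register gcs_store --flavor=gcp --path=gs://your-bucket
zenml stack register gcs_stack -a gcs_store -o default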
zenml stack set ...
from typing_extensions import Annotated

from zenml import pipeline, step
from zenml.client import Client

@step
def my_step(input_dict: dict) -> Annotated[dict, "dict_from_gcs_cloud_storage"]:
    """Return a copy of the input with an extra message.

    The annotated output is stored as a named artifact in the
    GCS artifact store of the active stack.
    """
    output_dict = input_dict.copy()
    output_dict["message"] = "Store this in cloud storage"
    return output_dict

@pipeline
def my_pipeline(input_dict: dict):
    my_step(input_dict)

if __name__ == "__main__":
    input_data = {"key": "value"}
    my_pipeline(input_data)

    # access the remote artifact from local code
    data = Client().get_artifact_version(
        name_id_or_prefix="dict_from_gcs_cloud_storage"
    ).load()

    print(
        "The artifact value you saved in the `my_pipeline` run is:\n "
        f"{data}"
    )

The example assumes a registered ZenML stack that contains a GCS Artifact Store, created either through the dashboard or with the CLI commands above. Once that stack is set as the active stack, the Python code implicitly uses the configured Cloud Storage bucket as its storage backend: the step output is written to GCS, and Client().get_artifact_version(...) loads it back from there.
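
To confirm where an artifact landed, you can inspect its URI; for a GCS-backed store it should start with gs://. A minimal sketch, reusing the artifact name from the example above:

from zenml.client import Client

# Look up the artifact version produced by the pipeline run above
artifact = Client().get_artifact_version(
    name_id_or_prefix="dict_from_gcs_cloud_storage"
)
print(artifact.uri)  # expected to start with "gs://" for a GCS store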

Additional Resources
ZenML GCS Artifact Store Documentation
Using Artifact Registry with Google Cloud


Start Your Free Trial Now

No new paradigms - Bring your own tools and infrastructure
No data leaves your servers, we only track metadata
Free trial included - no strings attached, cancel anytime

Connect Your ML Pipelines to a World of Tools

Expand your ML pipelines with Apache Airflow and 50+ other ZenML integrations
Seldon
XGBoost
Argilla
GitHub Container Registry
PyTorch
Facets
Azure Container Registry
Databricks
Neptune
HyperAI
TensorBoard