
Storage S3

Persists task output to an AWS S3 bucket, making it a good fit for serverless and cloud-native workflows.

Note

Requires pip install dotflow[aws]
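When installing from a shell, the extra may need quoting (zsh, for example, treats square brackets as a glob pattern):

```shell
# Install dotflow together with its AWS dependencies.
# Quotes prevent zsh from expanding the square brackets.
pip install "dotflow[aws]"
```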

Example

from dotflow import Config, DotFlow, action
from dotflow.providers import StorageS3


@action
def step_one():
    return {"message": "hello from S3"}


@action
def step_two(previous_context):
    print(previous_context.storage)
    return "ok"


config = Config(
    storage=StorageS3(
        bucket="dotflow-io-bucket",
        prefix="workflows/",
        region="us-east-1",
    )
)


def main():
    workflow = DotFlow(config=config)

    workflow.task.add(step=step_one)
    workflow.task.add(step=step_two)
    workflow.start()

    return workflow


if __name__ == "__main__":
    main()
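The exact object layout StorageS3 uses inside the bucket is an internal detail, but the prefix parameter namespaces everything the provider writes. As a rough illustration only (the key format and the .json suffix below are assumptions for the sketch, not dotflow's actual scheme):

```python
def object_key(prefix: str, task_id: str) -> str:
    """Illustrative only: join a configured prefix with a per-task key,
    guaranteeing exactly one "/" between the two parts."""
    return f"{prefix.rstrip('/')}/{task_id}.json"

# With the config above, task output would land somewhere under:
#   s3://dotflow-io-bucket/workflows/...
print(object_key("workflows/", "task-001"))  # workflows/task-001.json
```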

Authentication

StorageS3 uses the default boto3 credential chain:

  1. Environment variables: AWS_ACCESS_KEY_ID + AWS_SECRET_ACCESS_KEY
  2. Shared credentials file: ~/.aws/credentials
  3. IAM Role: automatic on Lambda, EC2, ECS

No credentials are needed in code; boto3 resolves them automatically through this chain.
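For option 1, exporting the two variables before starting the workflow is enough; boto3 picks them up when it creates a session. A minimal sketch with placeholder values (never hard-code real keys in source):

```python
import os

# Placeholder values for illustration; in practice these come from your
# environment, a secrets manager, or CI configuration.
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAEXAMPLEKEY"
os.environ["AWS_SECRET_ACCESS_KEY"] = "example-secret"

# From here on, boto3 (and therefore StorageS3) resolves credentials
# automatically, e.g.:
#   import boto3
#   creds = boto3.Session().get_credentials()
```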
