Auto-generated API documentation from source code. Generated using: OpenAI/gpt-5-nano-2025-08-07

Table of Contents

latch.init

The Latch SDK is a command-line toolchain to define and register serverless workflows with the Latch platform. This module re-exports a collection of utilities and resources from several submodules.

latch.resources.tasks

The following documentation reflects the public API and code content found in the module src/latch/resources/tasks.py. It documents actual definitions and their behavior as implemented, using the provided docstrings where present and concise, factual descriptions otherwise.

Module description

Latch tasks are decorators to turn functions into workflow ‘nodes’. Each task is containerized, versioned, and registered with Flyte when a workflow is uploaded to Latch. Containerized tasks are then executed on arbitrary instances as Kubernetes Pods, scheduled using flytepropeller. The type of instance that the task executes on (e.g. number of available resources, presence of GPU) can be controlled by invoking one of the set of exported decorators:

from latch import medium_task

@medium_task
def my_task(a: int) -> str:
    ...

Links:
  • Kubernetes Pods: https://kubernetes.io/docs/concepts/workloads/pods/
  • flytepropeller: https://github.com/flyteorg/flytepropeller
  • Flyte: https://docs.flyte.org/en/latest/

Functions

get_v100_x1_pod()

Description: Pod configuration for p3.2xlarge on-demand with 1x V100 GPU. Parameters: none Returns: Pod
def get_v100_x1_pod() -> Pod:
Example:
pod = get_v100_x1_pod()

get_v100_x4_pod()

Description: Pod configuration for p3.8xlarge on-demand with 4x V100 GPUs. Parameters: none Returns: Pod
def get_v100_x4_pod() -> Pod:
Example:
pod = get_v100_x4_pod()

get_v100_x8_pod()

Description: Pod configuration for p3.16xlarge on-demand with 8x V100 GPUs. Parameters: none Returns: Pod
def get_v100_x8_pod() -> Pod:
Example:
pod = get_v100_x8_pod()

_get_large_gpu_pod()

Description: Pod configuration for g5.16xlarge on-demand. Parameters: none Returns: Pod
def _get_large_gpu_pod() -> Pod:
Example:
pod = _get_large_gpu_pod()

_get_small_gpu_pod()

Description: Pod configuration for g4dn.2xlarge on-demand. Parameters: none Returns: Pod
def _get_small_gpu_pod() -> Pod:
Example:
pod = _get_small_gpu_pod()

_get_large_pod()

Description: Pod configuration for large on-demand instance types: c6i.24xlarge, c5.24xlarge, c5.metal, c5d.24xlarge, c5d.metal Parameters: none Returns: Pod
def _get_large_pod() -> Pod:
Example:
pod = _get_large_pod()

_get_medium_pod()

Description: Pod configuration for medium on-demand instance types: m5.8xlarge, m5ad.8xlarge, m5d.8xlarge, m5n.8xlarge, m5dn.8xlarge, m5a.8xlarge Parameters: none Returns: Pod
def _get_medium_pod() -> Pod:
Example:
pod = _get_medium_pod()

_get_small_pod()

Description: Pod configuration for any available instance. Parameters: none Returns: Pod
def _get_small_pod() -> Pod:
Example:
pod = _get_small_pod()

_get_l40s_pod(instance_type: str, cpu: int, memory_gib: int, gpus: int) -> Pod

Description: Helper function to create L40s GPU pod configurations. Parameters:
  • instance_type: str – taint key/value for the pod
  • cpu: int – CPUs requested
  • memory_gib: int – memory in GiB
  • gpus: int – number of GPUs
Returns: Pod
def _get_l40s_pod(instance_type: str, cpu: int, memory_gib: int, gpus: int) -> Pod:
Example:
pod = _get_l40s_pod("g6e-xlarge", cpu=4, memory_gib=32, gpus=1)

custom_memory_optimized_task(cpu: int, memory: int)

Description: Deprecated helper returning a custom task configuration with specified CPU and RAM allocations. Notes:
  • This function is deprecated and will be removed in a future release.
  • It raises a deprecation warning.
Parameters:
  • cpu: int – number of CPU cores to request
  • memory: int – memory in GiB to request
Returns: a partial Flyte task configured with the specified Pod settings.
def custom_memory_optimized_task(cpu: int, memory: int):
Example:
task = custom_memory_optimized_task(16, 128)

custom_task(cpu: Union[Callable, int], memory: Union[Callable, int], *, storage_gib: Union[Callable, int] = 500, timeout: Union[datetime.timedelta, int] = 0, **kwargs)

Description: Returns a custom task configuration requesting the specified CPU/RAM allocations. If cpu, memory, or storage_gib are callables, returns a dynamic task configuration using DynamicTaskConfig with a small pod config; otherwise, constructs a static _custom_task_config Pod. Parameters:
  • cpu: Union[Callable, int] – CPU cores to request (integer or callable for dynamic)
  • memory: Union[Callable, int] – memory in GiB to request (integer or callable)
  • storage_gib: Union[Callable, int] (default 500) – storage in GiB (integer or callable)
  • timeout: Union[datetime.timedelta, int] (default 0) – timeout for the task
  • **kwargs – additional keyword arguments
Returns: a partial Flyte task configured for either a dynamic or a static custom task.
def custom_task(
    cpu: Union[Callable, int],
    memory: Union[Callable, int],
    *,
    storage_gib: Union[Callable, int] = 500,
    timeout: Union[datetime.timedelta, int] = 0,
    **kwargs,
):
Example:
task = custom_task(8, 32, storage_gib=200)
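The callable-versus-integer dispatch described above can be sketched in plain Python. This is an illustrative stand-in, not the real implementation: `pick_task_config` and its string return values are hypothetical, standing in for `DynamicTaskConfig` and the static `_custom_task_config` Pod.

```python
from typing import Callable, Union


def pick_task_config(
    cpu: Union[Callable, int],
    memory: Union[Callable, int],
    storage_gib: Union[Callable, int] = 500,
) -> str:
    # If any resource request is a callable, resources must be resolved at
    # run time, so a dynamic task configuration is used; otherwise a static
    # pod configuration suffices.
    if any(callable(x) for x in (cpu, memory, storage_gib)):
        return "dynamic"  # would build a DynamicTaskConfig with a small pod
    return "static"       # would build a static custom-task Pod


pick_task_config(8, 32)                    # static allocation
pick_task_config(lambda wf_inputs: 8, 32)  # dynamic allocation
```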

lustre_setup_task()

Description: Returns a partial Flyte task configured for Lustre setup with a Nextflow work directory PVC. Parameters: none Returns: a partial Flyte task
def lustre_setup_task():
Example:
task = lustre_setup_task()

nextflow_runtime_task(cpu: int, memory: int, storage_gib: int = 50)

Description: Returns a partial Flyte task configured for Nextflow runtime with a shared work directory volume mounted at /nf-workdir. Parameters:
  • cpu: int – CPU cores
  • memory: int – memory in GiB
  • storage_gib: int – storage in GiB (default 50)
Returns: a partial Flyte task
def nextflow_runtime_task(cpu: int, memory: int, storage_gib: int = 50):
Example:
task = nextflow_runtime_task(4, 16, 64)

g6e_xlarge_task, g6e_2xlarge_task, g6e_4xlarge_task, g6e_8xlarge_task, g6e_12xlarge_task, g6e_16xlarge_task, g6e_24xlarge_task, g6e_48xlarge_task

Descriptions: Partial Flyte tasks configured for specific L40s GPU pod configurations. Parameters: none Returns: a partial Flyte task for each respective instance type and resources.
g6e_xlarge_task = functools.partial(
    task,
    task_config=_get_l40s_pod("g6e-xlarge", cpu=4, memory_gib=32, gpus=1)
)
Example:
@g6e_xlarge_task
def my_gpu_task(a: int) -> str:
    ...

latch.resources.workflow

This module defines decorators and helpers to convert Python callables into Flyte PythonFunctionWorkflow objects. It includes internal utilities for metadata generation and docstring injection, a workflow decorator that supports usage with or without arguments, and a nextflow_workflow helper for Nextflow-style workflows.

Functions

_generate_metadata()

Generates metadata for a function based on its signature.
def _generate_metadata(f: Callable) -> LatchMetadata:
  • Parameters
    • f: Callable — Function to generate metadata for.
  • Returns
    • LatchMetadata — Generated metadata for the function.

_inject_metadata()

Injects metadata into the function’s docstring, mutating it in place.
def _inject_metadata(f: Callable, metadata: LatchMetadata) -> None:
  • Parameters
    • f: Callable — Function to mutate the docstring of.
    • metadata: LatchMetadata — Metadata to inject.
  • Returns
    • None

workflow()

Decorator to expose a Python function as a Flyte PythonFunctionWorkflow. Can be used as @workflow without arguments or with a LatchMetadata argument.
def workflow(
    metadata: Union[LatchMetadata, Callable],
) -> Union[PythonFunctionWorkflow, Callable]:
  • Parameters
    • metadata: Union[LatchMetadata, Callable] — Either a LatchMetadata instance or the function to decorate (when used without parentheses).
  • Returns
    • Union[PythonFunctionWorkflow, Callable] — A PythonFunctionWorkflow if used directly or a decorator if used with arguments.
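The dual usage described above (bare `@workflow`, or `@workflow(metadata)`) follows the standard Python pattern of a decorator that inspects its argument. A minimal self-contained sketch, with a plain `Metadata` placeholder class standing in for LatchMetadata:

```python
from typing import Callable, Union


class Metadata:  # stand-in for LatchMetadata
    def __init__(self, display_name: str):
        self.display_name = display_name


def workflow(metadata: Union[Metadata, Callable]) -> Callable:
    if callable(metadata):
        # Used as @workflow with no arguments: `metadata` is the function.
        f = metadata
        f.metadata = Metadata(f.__name__)
        return f

    # Used as @workflow(metadata): return a decorator attaching the metadata.
    def decorator(f: Callable) -> Callable:
        f.metadata = metadata
        return f

    return decorator


@workflow
def add(a: int, b: int) -> int:
    return a + b


@workflow(Metadata("multiply"))
def mul(x: int, y: int) -> int:
    return x * y
```

The real decorator additionally builds a PythonFunctionWorkflow and injects metadata into the docstring; only the argument-inspection branching is shown here.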

nextflow_workflow()

Sets unpack_records in NextflowMetadata and delegates to workflow.
def nextflow_workflow(
    metadata: NextflowMetadata,
) -> Callable[[Callable], PythonFunctionWorkflow]:
  • Parameters
    • metadata: NextflowMetadata — Metadata for the Nextflow-style workflow.
  • Returns
    • Callable[[Callable], PythonFunctionWorkflow] — A decorator compatible with workflow.

Examples

from latch.resources.workflow import workflow

@workflow
def add(a: int, b: int) -> int:
    return a + b

from latch.resources.workflow import workflow
from latch.types.metadata import LatchMetadata

md = LatchMetadata("my_op", None)
md.parameters = {}

@workflow(md)
def multiply(x: int, y: int) -> int:
    return x * y

from latch.resources.workflow import nextflow_workflow
from latch.types.metadata import NextflowMetadata

nfm = NextflowMetadata("nf_op", None)

@nextflow_workflow(nfm)
def process(records: list[dict[str, int]]) -> None:
    pass

latch.resources.conditional

Module: conditional

This module exposes a factory function to create a ConditionalSection for conditional execution in workflows. It delegates to conditional from flytekit.core.condition to create the ConditionalSection.

Functions

create_conditional_section()

Creates a new conditional section in a workflow, allowing a user to conditionally execute a task based on the value of a task result.
  • The conditional sections can be n-ary with as many elif clauses as desired.
  • Outputs from conditional nodes can be consumed, and outputs from other tasks can be passed to conditional nodes.
  • Boolean expressions in the condition use & (and) and | (or) operators.
  • Unary expressions are not allowed. If a task returns a boolean, use built-in truth checks like result.is_true() or result.is_false().
Args:
  • name (str): The name of the conditional section, to be shown in Latch Console
Returns:
  • ConditionalSection
def create_conditional_section(name: str) -> ConditionalSection:
    ...
Example:
section = create_conditional_section("fractions")
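The chained if_/elif_/else_ shape of a conditional section can be mimicked with a small eager builder. This is a toy illustration of the API shape only, not the flytekit implementation; `classify` and its string results are hypothetical.

```python
class ConditionalSection:
    """Toy eager evaluator mimicking if_/elif_/else_ chaining."""

    def __init__(self, name: str):
        self.name = name
        self._decided = False
        self._pending = False
        self._result = None

    def if_(self, cond: bool) -> "ConditionalSection":
        self._pending = bool(cond)
        return self

    elif_ = if_  # same semantics in this sketch

    def else_(self) -> "ConditionalSection":
        self._pending = True
        return self

    def then(self, value) -> "ConditionalSection":
        # Only the first branch whose condition held captures the result.
        if not self._decided and self._pending:
            self._decided = True
            self._result = value
        return self

    def result(self):
        return self._result


def classify(fraction: float) -> str:
    # Note the & operator joining boolean expressions, as in the real API.
    return (
        ConditionalSection("fractions")
        .if_((fraction > 0.1) & (fraction < 1.0)).then("valid fraction")
        .elif_(fraction >= 1.0).then("too large")
        .else_().then("too small")
        .result()
    )
```

In a real workflow the branches would wrap task invocations and the section would be evaluated lazily by the Flyte engine rather than eagerly as here.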

latch.resources.map_tasks

map_tasks

A map task lets you run a pod task or a regular task over a list of inputs within a single workflow node. This means you can run thousands of instances of the task without creating a node for every instance, providing valuable performance gains! Some use cases of map tasks include:
  • Several inputs must run through the same code logic
  • Multiple data batches need to be processed in parallel
  • Hyperparameter optimization
Args:
  • task_function: The task to be mapped
Returns:
  • A mapped task that can be invoked with list-typed inputs
Intended Use:
@task
def a_mappable_task(a: int) -> str:
    inc = a + 2
    stringified = str(inc)
    return stringified

@task
def coalesce(b: typing.List[str]) -> str:
    coalesced = "".join(b)
    return coalesced

@workflow
def my_map_workflow(a: typing.List[int]) -> str:
    mapped_out = map_task(a_mappable_task)(a=a).with_overrides(
        requests=Resources(mem="300Mi"),
        limits=Resources(mem="500Mi"),
        retries=1,
    )
    coalesced = coalesce(b=mapped_out)
    return coalesced
This module re-exports map_task from flytekit.core.map_task.

latch.resources.reference_workflow

Module: reference_workflow

This module defines workflow_reference to create a Flyte Launch Plan reference using the current workspace as the project and the domain set to 'development'. It imports reference_launch_plan from flytekit.core.launch_plan and current_workspace from latch.utils.

Functions

workflow_reference()

Returns a Flyte Launch Plan reference for the given name and version in the current workspace under domain 'development'.
  • Parameters
    • name (str): The name of the launch plan to reference.
    • version (str): The version of the launch plan to reference.
  • Returns: The value returned by reference_launch_plan configured with:
    • project=current_workspace()
    • domain="development"
    • name=name
    • version=version
def workflow_reference(
    name: str,
    version: str,
):
    return reference_launch_plan(
        project=current_workspace(),
        domain="development",
        name=name,
        version=version,
    )

Examples

from latch.resources.reference_workflow import workflow_reference

lp = workflow_reference(name="my_workflow", version="v1")

latch.types.init

Module: latch.types (src/latch/types/init.py)

Latch types package initializer. This module re-exports several type definitions and a utility function from submodules under latch.types. The available exports are:
  • LatchDir
  • LatchOutputDir
  • LatchFile
  • LatchOutputFile
  • file_glob
  • DockerMetadata
  • Fork
  • ForkBranch
  • LatchAppearanceType
  • LatchAuthor
  • LatchMetadata
  • LatchParameter
  • LatchRule
  • Params
  • Section
  • Spoiler
  • Text

Functions

file_glob()

Re-exported from latch.types.glob. No docstring is present in this module for this entry; see the original definition for details.
# Signature defined in latch.types.glob

Classes

LatchDir

Class for directory type representations.

LatchOutputDir

Class for output directory representations.

LatchFile

Class for file path representations.

LatchOutputFile

Class for output file representations.

DockerMetadata

Class representing Docker metadata.

Fork

Class representing a fork of a project.

ForkBranch

Class representing a fork branch.

LatchAppearanceType

Enum-like type for appearance information.

LatchAuthor

Class representing an author.

LatchMetadata

Class representing latch metadata.

LatchParameter

Class representing a parameter.

LatchRule

Class representing a rule.

Params

Class representing a collection of parameters.

Section

Class representing a section.

Spoiler

Class representing a spoiler.

Text

Class representing text content.

Notes:
  • This module solely re-exports names from their respective submodules; no additional behavior is defined here.
  • For detailed class definitions and methods, refer to the originating modules where these names are implemented.

latch.types.file

Module: latch.types.file

Latch types for file handling within Flyte tasks. This module defines a LatchFile class to represent a file object with both a local path and an optional remote path, a type alias for an output file, and a transformer that converts between LatchFile instances and Flyte Literals. Notes:
  • LatchOutputFile is a type alias for LatchFile annotated as an output in Flyte. It is defined as: Annotated[LatchFile, FlyteAnnotation({"output": True})]

Classes

LatchFile

Represents a file object in the context of a task execution. The local path identifies the file object’s location on local disk in the context of a task execution. LatchFile inherits the implementation of __fspath__ from FlyteFile, so methods like open can retrieve a string representation of self:

@task
def task(file: LatchFile):
    with open(file, "r") as f:
        print(f.read())

    mypath = Path(file).resolve()

The remote path identifies a remote location. The remote location, either a latch or s3 URL, can be inspected from an object passed to the task to reveal its remote source. It can also be used to deposit the file to a latch path when the object is returned from a task:

@task
def task(file: LatchFile):
    path = file.remote_path  # inspect remote location

    # Returning a different file to LatchData
    return LatchFile("./foobar.txt", "latch:///foobar.txt")

Constructor and behavior overview
  • Initializes with a local path and an optional remote path.
  • Normalizes path input depending on its URI scheme (file, latch, or plain string).
  • If a remote_path is provided, it is stored; otherwise a remote path may be inferred from the input.
  • If a downloader is supplied via kwargs, the FlyteFile base class is initialized with it; otherwise a downloader function is defined that interacts with Flyte’s context to fetch data lazily.
Constructor signature
def __init__(
    self,
    path: Union[str, PathLike],
    remote_path: Optional[Union[str, PathLike]] = None,
    **kwargs,
) -> None:
    ...
Other notable methods and properties
  • size(self) -> int – Returns the size of the remote data via LPath(self.remote_path).size().
  • _idempotent_set_path(self, hint: Optional[str] = None) -> None – Sets the local path using the Flyte context’s file access if not already set.
  • _create_imposters(self) -> None – Creates local filesystem placeholders (parent directories and an empty file) at the local path.
  • local_path (property) -> str – Local file path for the environment executing the task.
  • remote_path (property) -> Optional[str] – Remote URL referencing the object (LatchData or S3).
  • __repr__(self) -> str – String representation, differing based on whether remote_path is set.
  • __str__(self) -> str – Simple string representation of the LatchFile.
Code example (basic usage)
# Basic usage: create a LatchFile with a local path
lf = LatchFile("./my_file.txt")

# With a remote path
lf_remote = LatchFile("./my_file.txt", "latch:///remote_path.txt")

LatchFilePathTransformer

Inferred purpose: Transformer that converts between LatchFile instances and Flyte Literal values for file paths.
Constructor
def __init__(self) -> None:
    TypeTransformer.__init__(self, name="LatchFilePath", t=LatchFile)
Methods
  • to_literal(...) -> Literal – Convert a LatchFile to a Flyte Literal containing a Blob with the file URI.
def to_literal(
    self,
    ctx: FlyteContext,
    python_val: LatchFile,
    python_type: type[LatchFile],
    expected: LiteralType,
) -> Literal:
    ...
  • to_python_value(...) -> LatchFile – Convert a Flyte Literal back to a LatchFile. Behavior depends on the expected Python type:
    • If expected_python_type is PathLike, raise TypeError.
    • If expected_python_type is not a subclass of LatchFile, raise TypeError.
    • If the URI is not remote, return a LatchFile-like instance initialized with the URI.
    • If the URI is remote, create a LatchFile with a local path and a downloader that fetches the data.
def to_python_value(
    self,
    ctx: FlyteContext,
    lv: Literal,
    expected_python_type: Union[type[LatchFile], PathLike],
) -> LatchFile:
    ...
Registration The transformer is registered with the TypeEngine:
TypeEngine.register(LatchFilePathTransformer())
Usage note
  • This transformer relies on Flyte context utilities such as ctx.file_access to handle remote/local data paths and downloads.

latch.types.directory

Module: latch.types.directory

This module defines the LatchDir class, several TypedDicts used for GraphQL data structures, and a transformer for integrating LatchDir with Flyte literals. It also defines a type alias for marking a LatchDir as an output.
  • Exports data structures used for describing directory-related GraphQL responses
  • Provides LatchDir, a FlyteDirectory-based representation of a directory with local and remote paths
  • Exposes LatchOutputDir as a tagged output type
  • Provides LatchDirPathTransformer to convert between Python LatchDir objects and Flyte literals

Classes

LatchDir

A class that represents a directory in the context of a task execution. It encapsulates a local path on disk and an optional remote path. The local path is used during task execution, while the remote path refers to the remote object (LatchData or S3). The class integrates with Flyte’s context and provides directory listing, remote data resolution, and helpers to materialize remote contents locally.
  • Description: Represents a directory in the context of a task execution. The local path identifies the directory location on local disk; the remote path identifies a remote location (LatchData or S3). It can be used as a task input or as an output returned from a task.
  • Methods and attributes:

__init__(self, path: Union[str, PathLike], remote_path: Optional[PathLike] = None, **kwargs)

Initializes a LatchDir with a local path and optional remote path. If the provided path is file://, the local path is adjusted accordingly; if it uses the latch scheme, the path is normalized. Remote path handling is set based on whether a remote path is provided or the path is a valid URL. If a downloader is provided, it is used; otherwise a default downloader is defined that interacts with the Flyte context.
def __init__(
    self,
    path: Union[str, PathLike],
    remote_path: Optional[PathLike] = None,
    **kwargs,
)

_idempotent_set_path(self)

Ensures that the local path is populated/generated once by obtaining a random local directory from the Flyte context, if available.
def _idempotent_set_path(self):

iterdir(self) -> list[Union[LatchFile, "LatchDir"]]

Returns a list of the directory’s children. If remote_path is not set, it enumerates the local directory contents and returns corresponding LatchDir or LatchFile objects. If remote_path is set, it queries remote metadata (via GraphQL) to construct child objects.
def iterdir(self) -> list[Union[LatchFile, "LatchDir"]]:
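For the local branch (no remote_path set), the enumeration reduces to listing the directory on disk. A self-contained sketch with pathlib, where plain `(kind, name)` tuples stand in for the LatchDir/LatchFile objects the real method returns:

```python
import tempfile
from pathlib import Path


def iterdir_local(path: str):
    # Mirror the local branch of LatchDir.iterdir: classify each child as a
    # directory or file entry.
    out = []
    for child in sorted(Path(path).iterdir()):
        kind = "dir" if child.is_dir() else "file"
        out.append((kind, child.name))
    return out


with tempfile.TemporaryDirectory() as d:
    (Path(d) / "sub").mkdir()
    (Path(d) / "a.txt").write_text("hello")
    children = iterdir_local(d)
```

The remote branch instead queries GraphQL metadata for the directory's children and wraps each entry accordingly.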

size_recursive(self)

Returns the size of the directory contents recursively by delegating to the remote path via LPath(self.remote_path).size_recursive().
def size_recursive(self):

_create_imposters(self)

Contacts the remote service to retrieve descendant information and creates corresponding local files/directories as placeholders, ensuring the local view matches the remote structure.
def _create_imposters(self):

local_path (property)

Returns the local filesystem path for this directory as seen by the executing environment.
@property
def local_path(self) -> str:

remote_path (property)

Returns the remote URL for the directory, if any.
@property
def remote_path(self) -> Optional[str]:

__repr__(self)

Returns a string representation of the LatchDir, showing the local path or the remote path depending on whether a remote path is set.
def __repr__(self):

__str__(self)

Returns a human-readable string for the directory, indicating whether it is local or remote.
def __str__(self):
  • Notes:
    • The class depends on Flyte context management, remote data accessors, and a remote directory concept used by LatchData.
    • It relies on helper utilities such as format_path and normalize_path and integrates with LatchFile for non-directory children.
  • Related aliases and transformers:
    • LatchOutputDir is an alias for a LatchDir annotated as an output in Flyte.
    • LatchDirPathTransformer provides conversion between LatchDir instances and Flyte literals.

LatchOutputDir

A type alias for marking a LatchDir as the output of a Flyte workflow.
LatchOutputDir = Annotated[LatchDir, FlyteAnnotation({"output": True})]
  • Description: Marks a LatchDir as an output so the console can optimize existence checks for remote paths when needed.

LatchDirPathTransformer

A transformer for converting between LatchDir instances and Flyte literals, enabling serialization/deserialization within Flyte.
class LatchDirPathTransformer(FlyteDirToMultipartBlobTransformer):
    def __init__(self):
        TypeTransformer.__init__(self, name="LatchDirPath", t=LatchDir)
  • Methods:
to_literal(self, ctx: FlyteContext, python_val: LatchDir, python_type: type[LatchDir], expected: LiteralType)
Converts a LatchDir to a Flyte Literal representing its remote path as a multipart blob.
def to_literal(
    self,
    ctx: FlyteContext,
    python_val: LatchDir,
    python_type: type[LatchDir],
    expected: LiteralType,
):
to_python_value(self, ctx: FlyteContext, lv: Literal, expected_python_type: Union[type[LatchDir], PathLike]) -> FlyteDirectory
Converts a Flyte Literal back to a Python LatchDir (or a subclass of LatchDir). If the expected type is a PathLike, a TypeError is raised since casting to PathLike is not supported. If the URI is not remote, the local URI is returned as the path. If remote, a LatchDir is created with a downloader to fetch data on demand.
def to_python_value(
    self,
    ctx: FlyteContext,
    lv: Literal,
    expected_python_type: Union[type[LatchDir], PathLike],
) -> FlyteDirectory:
  • Notes:
    • This transformer is registered with the TypeEngine to enable automatic handling of LatchDir types in Flyte workflows.

Examples

Basic usage pattern visible from signatures:
from latch.types.directory import LatchDir

# Basic local directory
d = LatchDir("./foo")

# With a remote path
d_remote = LatchDir("./foo", "latch:///foo")

# Simple usage inside a task (illustrative; actual task decorators/imports not shown here)
# @task
# def my_task(dir: LatchDir):
#     ...
#     return LatchDir("./foo", "latch:///foo")
  • Note: The examples above illustrate basic creation and usage based on the constructor signatures and class behavior described in the code.

latch.types.metadata

Module: latch.types.metadata

This module defines data models and helpers for representing workflow and parameter metadata used by Latch CLI integrations. It includes dataclasses for parameter metadata, UI flow elements, Snakemake/Nextflow integration metadata, and helper functions for samplesheet creation and representation.

Functions

_samplesheet_repr(v: Any) -> str

Convert a value to a string representation suitable for a samplesheet.
  • Description: If the value is None, returns an empty string; if it is a LatchFile or LatchDir, returns its remote_path; if it is an Enum, returns its value; otherwise returns str(v).
  • Signature:
def _samplesheet_repr(v: Any) -> str:
    ...
Example
# Basic usage (conceptual)
_samplesheet_repr(None)           # -> ""
_samplesheet_repr(some_file)      # -> some_file.remote_path
_samplesheet_repr(SomeEnum.VALUE) # -> SomeEnum.VALUE.value

default_samplesheet_constructor(samples: List[DC], t: DC, delim: str = ",") -> Path

Create a CSV samplesheet from a list of dataclass instances.
  • Description: Writes a CSV named samplesheet.csv using the field names of the dataclass type t as headers. Values are converted using _samplesheet_repr. Returns the path to the created file.
  • Signature:
def default_samplesheet_constructor(samples: List[DC], t: DC, delim: str = ",") -> Path:
    ...
Example
# Basic usage
# Given a list of dataclass instances 'samples' of type 'SomeDataclass'
default_samplesheet_constructor(samples, t=SomeDataclass, delim=",")
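A self-contained sketch consistent with the behavior described above: dataclass field names become the header row, values are stringified, and the file is written as samplesheet.csv. `make_samplesheet` and its `out_dir` parameter are simplifications, and the real helper also special-cases LatchFile/LatchDir and Enum values.

```python
import tempfile
from dataclasses import dataclass, fields
from pathlib import Path
from typing import Any, List


def samplesheet_repr(v: Any) -> str:
    # Simplified: None becomes "", everything else is str()'d.
    return "" if v is None else str(v)


def make_samplesheet(samples: List[Any], t: type, delim: str = ",",
                     out_dir: str = ".") -> Path:
    names = [f.name for f in fields(t)]
    path = Path(out_dir) / "samplesheet.csv"
    with path.open("w") as fh:
        fh.write(delim.join(names) + "\n")
        for s in samples:
            fh.write(delim.join(samplesheet_repr(getattr(s, n))
                                for n in names) + "\n")
    return path


@dataclass
class Sample:
    name: str
    reads: int


with tempfile.TemporaryDirectory() as d:
    sheet = make_samplesheet([Sample("s1", 10), Sample("s2", 20)], Sample,
                             out_dir=d)
    content = sheet.read_text()
```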

Classes

LatchRule

Description: Class describing a rule that a parameter input must follow
  • Attributes:
    • regex: str — A string regular expression which inputs must match
    • message: str — The message to render when an input does not match the regex
  • Methods:
    • dict (property) — Returns a dictionary representation of the instance
    • __post_init__ — Validates that regex is a valid regular expression

LatchAppearanceEnum

Description: Enum for appearance types with options “line” and “paragraph”
  • Members:
    • line — Value: “line”
    • paragraph — Value: “paragraph”

MultiselectOption

Description: Dataclass representing a selectable option in a multiselect widget
  • Attributes:
    • name: str — Display name of the option
    • value: object — Associated value for the option

Multiselect

Description: Dataclass representing a multiselect input with options and allow_custom flag
  • Attributes:
    • options: List[MultiselectOption] — Available options
    • allow_custom: bool — Whether custom entries are allowed

LatchAuthor

Description: Dataclass for author metadata
  • Attributes:
    • name: Optional[str] — The name of the author
    • email: Optional[str] — The email of the author
    • github: Optional[str] — A link to the GitHub profile of the author

FlowBase

Description: Base class for all flow elements
  • This is a frozen dataclass serving as a common base for UI flow elements like Section, Text, Params, etc.

Section

Description: Flow element that displays a child flow in a card with a given title
  • Attributes:
    • section: str — Title of the section
    • flow: List[FlowBase] — Flow displayed in the section card
  • Methods:
    • __init__(section: str, *flow: FlowBase) — Initializes a section with a title and child flow elements

Text

Description: Flow element that displays a markdown string
  • Attributes:
    • text: str — Markdown body text

Title

Description: Flow element that displays a markdown title
  • Attributes:
    • title: str — Markdown title text

Params

Description: Flow element that displays parameter widgets
  • Attributes:
    • params: List[str] — Names of parameters whose widgets will be displayed
  • Methods:
    • __init__(*args: str) — Initializes with a list of parameter names

Spoiler

Description: Flow element that displays a collapsible card with a given title
  • Attributes:
    • spoiler: str — Title of the spoiler
    • flow: List[FlowBase] — Flow displayed in the spoiler card
  • Methods:
    • __init__(spoiler: str, *flow: FlowBase) — Initializes a spoiler with a title and child flow

ForkBranch

Description: Definition of a Fork branch
  • Attributes:
    • display_name: str — String displayed in the fork’s multibutton
    • flow: List[FlowBase] — Child flow displayed when the branch is active
  • Methods:
    • __init__(display_name: str, *flow: FlowBase) — Initializes a fork branch

Fork

Description: Flow element that displays a set of mutually exclusive alternatives
  • Attributes:
    • fork: str — Name of a str-typed parameter to store the active branch’s key
    • display_name: str — Title shown above the fork selector
    • flows: Dict[str, ForkBranch] — Mapping from branch keys to branch definitions
  • Methods:
    • __init__(fork: str, display_name: str, **flows: ForkBranch) — Initializes fork with branches
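The variadic constructors above (Section, Spoiler, ForkBranch) all collect their trailing arguments into a flow list. A minimal, hypothetical re-implementation illustrates the pattern; these stand-in classes are for illustration only and are not the SDK's actual definitions:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical stand-ins for the documented flow elements, showing how
# Section(title, *flow) collects variadic children into a list.
@dataclass(frozen=True)
class FlowBase:
    pass

@dataclass(frozen=True)
class Text(FlowBase):
    text: str

@dataclass(frozen=True, init=False)
class Section(FlowBase):
    section: str
    flow: List[FlowBase]

    def __init__(self, section: str, *flow: FlowBase):
        # frozen dataclass: bypass the blocked __setattr__
        object.__setattr__(self, "section", section)
        object.__setattr__(self, "flow", list(flow))

s = Section("Inputs", Text("## Choose your samples"))
```

Nested elements compose the same way: a Section's flow can itself contain Spoiler or Fork elements, producing an arbitrarily deep UI tree.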

LatchParameter

Description: Class for organizing parameter metadata
  • Attributes:
    • display_name: Optional[str] — The name used to display the parameter
    • description: Optional[str] — The parameter’s description
    • hidden: bool — Whether the parameter should be hidden by default
    • section_title: Optional[str] — Section header for grouping
    • placeholder: Optional[str] — Placeholder text for input
    • comment: Optional[str] — Comment about the parameter
    • output: bool — Whether the parameter is an output
    • batch_table_column: bool — Show in batch table in UI
    • allow_dir: bool — Allow directories in UI
    • allow_file: bool — Allow files in UI
    • appearance_type: LatchAppearance — How the parameter should be rendered (line/paragraph or multiselect)
    • rules: List[LatchRule] — Validation rules for inputs
    • detail: Optional[str] — Additional detail
    • samplesheet: Optional[bool] — Use samplesheet input UI
    • allowed_tables: Optional[List[int]] — Registry Tables allowed for the samplesheet
    • _custom_ingestion: Optional[str] — Custom ingestion hook
  • Methods:
    • __str__() — YAML-like string representation including metadata
    • dict (property) — Returns a dictionary with an __metadata__ key containing metadata, including nested appearance details and rules
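The shape of the dict property can be sketched with a hypothetical mini-parameter. The field names mirror the attributes above, but the filtering logic is purely illustrative, not the SDK's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical mini-version of LatchParameter illustrating the documented
# `dict` property: metadata is nested under an "__metadata__" key.
@dataclass
class ParamSketch:
    display_name: Optional[str] = None
    description: Optional[str] = None
    hidden: bool = False

    @property
    def dict(self):
        # Keep only fields that were actually set (illustrative filtering)
        meta = {k: v for k, v in vars(self).items() if v not in (None, False)}
        return {"__metadata__": meta}

p = ParamSketch(display_name="Sample", description="Input sample file")
```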

SnakemakeParameter(Generic[T], LatchParameter)

Description: Dataclass for Snakemake parameter metadata
  • Attributes:
    • type: Optional[Type[T]] — The python type of the parameter
    • default: Optional[T] — Default value
  • Inherits all fields from LatchParameter

SnakemakeFileParameter(SnakemakeParameter[Union[LatchFile, LatchDir]])

Description: Deprecated: use file_metadata in SnakemakeMetadata instead
  • Attributes:
    • type: Optional[Union[Type[LatchFile], Type[LatchDir]]] — The python type of the parameter
    • path: Optional[Path] — Destination path for file
    • config: bool — Expose path in Snakemake config
    • download: bool — Download in JIT step

SnakemakeFileMetadata

Description: Dataclass describing file metadata for Snakemake
  • Attributes:
    • path: Path — Local path where the file will be copied
    • config: bool — Expose in Snakemake config
    • download: bool — Download in JIT step

NextflowParameter(Generic[T], LatchParameter)

Description: Dataclass for Nextflow parameter metadata
  • Attributes:
    • type: Optional[Type[T]] — The python type of the parameter
    • default: Optional[T] — Default value
    • samplesheet_type: Literal["csv", "tsv", None] — Samplesheet type (CSV/TSV)
    • samplesheet_constructor: Optional[Callable[[T], Path]] — Custom samplesheet constructor
    • results_paths: Optional[List[Path]] — Output sub-paths exposed in UI under “Results”
  • Methods:
    • __post_init__() — Validates samplesheet constraints and requires a constructor if needed or raises errors via Click

DC (TypeVar)

Description: Type variable bound to any dataclass
  • Used for typing in default_samplesheet_constructor with generic dataclass types

_samplesheet_repr(v: Any) -> str (see Functions)

Description: Helper function (see Functions section)

default_samplesheet_constructor (see Functions)

Description: Helper to construct a samplesheet from dataclass instances (see Functions)
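A samplesheet constructor for dataclass rows plausibly serializes one CSV column per field. The sketch below is a hedged illustration of that idea under those assumptions, not the SDK's actual default_samplesheet_constructor:

```python
import csv
import io
from dataclasses import astuple, dataclass, fields

# Hypothetical sketch: serialize a list of dataclass rows into CSV text,
# one column per dataclass field.
@dataclass
class Row:
    sample: str
    fastq: str

def to_samplesheet(rows):
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([f.name for f in fields(Row)])  # header from field names
    for row in rows:
        writer.writerow(astuple(row))               # one CSV row per instance
    return buf.getvalue()

sheet = to_samplesheet([Row("s1", "s1.fastq.gz")])
```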

NextflowRuntimeResources

Description: Resources for Nextflow runtime tasks
  • Attributes:
    • cpus: Optional[int] — Number of CPUs required for the task
    • memory: Optional[int] — Memory required (GiB)
    • storage_gib: Optional[int] — Storage required (GiB)
    • storage_expiration_hours: int — Hours to retain workdir after failure

LatchMetadata

Description: Class for organizing workflow metadata
  • Attributes:
    • display_name: str — The human-readable name of the workflow
    • author: LatchAuthor — Metadata about the workflow author
    • documentation: Optional[str] — Link to workflow documentation
    • repository: Optional[str] — Link to repository hosting the workflow
    • license: str — SPDX identifier
    • parameters: Dict[str, LatchParameter] — Parameter metadata map
    • wiki_url: Optional[str] — Wiki URL
    • video_tutorial: Optional[str] — Video tutorial URL
    • tags: List[str] — Tags for categorization
    • flow: List[FlowBase] — Flow elements describing UI flow
    • no_standard_bulk_execution: bool — Disable standard bulk execution
    • _non_standard: Dict[str, object] — Non-standard metadata
    • about_page_path: Optional[Path] — Path to markdown about page
  • Methods:
    • validate() — Validates fields (e.g., about page path type)
    • dict (property) — Returns a dictionary with metadata; excludes parameters
    • __str__() — YAML-like string with parameters expanded

DockerMetadata

Description: Class describing credentials for private docker repositories
  • Attributes:
    • username: str — Account username
    • secret_name: str — Secret name containing the password

EnvironmentConfig

Description: Class describing environment for spawning Snakemake tasks
  • Attributes:
    • use_conda: bool — Use Snakemake conda directive
    • use_container: bool — Use Snakemake container directive
    • container_args: List[str] — Additional container arguments

FileMetadata (TypeAlias)

Description: Type alias for file metadata mappings
  • Type: Dict[str, Union[SnakemakeFileMetadata, "FileMetadata"]]

SnakemakeMetadata(LatchMetadata)

Description: Class for organizing Snakemake workflow metadata
  • Attributes:
    • output_dir: Optional[LatchDir] — Directory for Snakemake outputs
    • name: Optional[str] — Name of the workflow
    • docker_metadata: Optional[DockerMetadata] — Docker credentials
    • env_config: EnvironmentConfig — Environment configuration
    • parameters: Dict[str, SnakemakeParameter] — Snakemake parameter metadata
    • file_metadata: FileMetadata — File metadata mappings
    • cores: int — Number of cores (Snakemake)
    • about_page_content: Optional[Path] — Path to About page content
  • Methods:
    • validate() — Validates fields (e.g., about_page_content type)
    • dict (property) — Returns a dictionary including metadata; omits about_page_content
    • __post_init__() — Calls validate(), assigns default workflow name if needed, and registers global metadata

NextflowMetadata(LatchMetadata)

Description: Class for organizing Nextflow workflow metadata
  • Attributes:
    • name: Optional[str] — Name of the workflow
    • parameters: Dict[str, NextflowParameter] — Nextflow parameter metadata
    • runtime_resources: NextflowRuntimeResources — Runtime resources
    • execution_profiles: List[str] — Execution config profiles
    • log_dir: Optional[LatchDir] — Directory to dump Nextflow logs
    • upload_command_logs: bool — Upload .command.* logs after task execution
  • Methods:
    • dict (property) — Returns a dictionary including metadata, omitting about_page_path
    • __post_init__() — Calls validate(), assigns name if missing, and registers global metadata

Global Registries

  • _snakemake_metadata: Optional[SnakemakeMetadata] — Registry for Snakemake metadata
  • _nextflow_metadata: Optional[NextflowMetadata] — Registry for Nextflow metadata
Notes:
  • Many classes include docstrings directly in the code describing their purpose and usage.
  • Some fields and behaviors are implemented with validation logic in __post_init__ or validate() methods; these validations are reflected in the class descriptions above where relevant.

latch.types.glob

Module description

This module defines the function file_glob, a utility to construct a list of LatchFile objects from a glob pattern. It validates the provided remote_directory URL using is_valid_url and returns an empty list if the URL is not valid. The function resolves file paths relative to the caller’s working directory (or an optional target_dir if provided) and constructs LatchFile instances whose remote paths are formed by appending the matched file name to the remote_directory.

Functions

file_glob()

Constructs a list of LatchFile objects from a glob pattern.
  • Description Constructs a list of LatchFile objects from a glob pattern. This utility helps pass collections of files between tasks. The remote location of each constructed LatchFile is created by appending the file name matched by the pattern to the directory represented by remote_directory.
  • Parameters
    • pattern: str
      A glob pattern to match a set of files, e.g., '*.py'. The pattern is resolved relative to the working directory of the caller unless a target_dir is provided.
    • remote_directory: str
      A valid latch URL pointing to a directory, e.g., 'latch:///foo'. This must be a directory and not a file.
    • target_dir: Optional[Path]
      An optional Path object to define an alternate working directory for path resolution.
  • Returns
    • List[LatchFile]
      A list of instantiated LatchFile objects.
def file_glob(
    pattern: str, remote_directory: str, target_dir: Optional[Path] = None
) -> List[LatchFile]:
  • Examples
# Basic usage
files = file_glob("*.fastq.gz", "latch:///fastqc_outputs")

latch.types.json

Module: latch.types.json

This module defines type aliases for JSON-compatible data structures:
  • JsonArray
  • JsonObject
  • JsonValue

JsonArray

JsonArray: TypeAlias = List["JsonValue"]
"""JSON-compatible list"""
  • Description: JSON-compatible list
  • Type alias: List["JsonValue"]

JsonObject

JsonObject: TypeAlias = Dict[str, "JsonValue"]
"""JSON-compatible dictionary"""
  • Description: JSON-compatible dictionary
  • Type alias: Dict[str, "JsonValue"]

JsonValue

JsonValue: TypeAlias = Union[JsonObject, JsonArray, str, int, float, bool, None]
"""JSON-compatible value

Can be a dictionary, an array, or a primitive value
"""
  • Description: JSON-compatible value. Can be a dictionary, an array, or a primitive value
  • Type alias: Union[JsonObject, JsonArray, str, int, float, bool, None]
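The recursion in these aliases works through forward references. A short function walking an arbitrary JsonValue shows the alias in use (count_leaves is illustrative, not part of the module):

```python
from typing import Dict, List, Union

# Mirrors the module's recursive alias; the string forward reference
# makes the self-reference legal at definition time.
JsonValue = Union[Dict[str, "JsonValue"], List["JsonValue"], str, int, float, bool, None]

def count_leaves(v: "JsonValue") -> int:
    # Recurse into containers; every primitive (including None) is one leaf
    if isinstance(v, dict):
        return sum(count_leaves(x) for x in v.values())
    if isinstance(v, list):
        return sum(count_leaves(x) for x in v)
    return 1

n = count_leaves({"a": [1, 2], "b": {"c": None}})
```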

latch.functions.messages

This module provides utilities to display task execution messages on the Latch console and to post those messages to the Latch backend when running inside the platform.
  • Exports:
    • NUCLEUS_URL (str): URL derived from environment variable LATCH_CLI_NUCLEUS_URL or a default.
    • ADD_MESSAGE_ENDPOINT (str): Endpoint to post task execution messages.
    • message(typ: str, data: Dict[str, Any]) -> None: Post or print a task execution message.

Functions

message(typ: str, data: Dict[str, Any])

Display a message prominently on the Latch console during and after a task execution. The Latch platform first processes this message internally, then displays it under your task’s execution page.
  • Parameters
    • typ: str
      • A message type that determines how your message is displayed. Currently one of 'info', 'warning', or 'error'.
    • data: Dict[str, Any]
      • The data displayed on the Latch console, formatted as follows: {'title': ..., 'body': ...}.
  • Returns
    • None
  • Raises
    • RuntimeError: If an internal error occurs while processing the message.
  • Example usage:
@small_task
def task():

    ...

    try:
        ...
    except ValueError:
        title = 'Invalid sample ID column selected'
        body = 'Your file indicates that sample columns a, b are valid'
        message(typ='error', data={'title': title, 'body': body})

    ...
  • Signature (for reference)
def message(typ: str, data: Dict[str, Any]) -> None:

latch.functions.operators

operators.py

DEPRECATED. Mimics channel operators from Nextflow, using the correspondence Channel -> Python dictionary. This module provides a set of utilities for performing dictionary-based joins and Cartesian products, as well as utilities to group tuple-like data.

Functions

_combine(item1: Any, item2: Any)

def _combine(item1: Any, item2: Any):
Combines two items for use in *_join functions. The rules followed are:
  • If both items are lists, the lists are concatenated
  • If one of the items is a list, the other is appended to that list
  • Otherwise, the output is a new list containing both items
This is so that composition of joins works as expected. We also use list addition so as to not modify the input items and instead return a new copy. Parameters
  • item1: Any
  • item2: Any
Returns
  • List[Any] A new list containing the combined items following the rules above.
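The rules above can be written out directly. This is an illustrative re-implementation (named combine_items to avoid clashing with the module's combine function):

```python
# Illustrative re-implementation of the documented _combine rules.
def combine_items(item1, item2):
    if isinstance(item1, list) and isinstance(item2, list):
        return item1 + item2           # both lists: concatenate
    if isinstance(item1, list):
        return item1 + [item2]         # left is a list: append the other item
    if isinstance(item2, list):
        return [item1] + item2         # right is a list: prepend the other item
    return [item1, item2]              # neither: new list containing both
```

Using list addition rather than `append` returns a fresh list each time, so the inputs are never mutated, which is what makes composed joins behave predictably.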

left_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]

def left_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]:
A standard left join of two dictionaries, joining on their keys Parameters
  • left: Dict[str, Any]
  • right: Dict[str, Any]
Returns
  • Dict[str, Any] Dictionary containing joined values or original left values when no match.
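Assuming scalar values and the _combine rules above, the join's behavior can be sketched as follows (illustrative only, not the SDK's code):

```python
# Illustrative left join over dictionary keys: every key of `left` survives;
# keys also present in `right` get their values combined into a list
# (assuming scalar values, per the _combine rules).
def left_join_sketch(left, right):
    out = {}
    for key, value in left.items():
        if key in right:
            out[key] = [value, right[key]]
        else:
            out[key] = value
    return out

joined = left_join_sketch({"a": 1, "b": 2}, {"a": 10, "c": 30})
```

right_join, inner_join, and outer_join differ only in which keys survive: right's keys, the intersection, and the union, respectively.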

right_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]

def right_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]:
A standard right join of two dictionaries, joining on their keys Parameters
  • left: Dict[str, Any]
  • right: Dict[str, Any]
Returns
  • Dict[str, Any]

inner_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]

def inner_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]:
A standard inner join of two dictionaries, joining on their keys Parameters
  • left: Dict[str, Any]
  • right: Dict[str, Any]
Returns
  • Dict[str, Any] Dictionary containing values for keys present in both input dictionaries.

outer_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]

def outer_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]:
A standard outer join of two dictionaries, joining on their keys Parameters
  • left: Dict[str, Any]
  • right: Dict[str, Any]
Returns
  • Dict[str, Any] Dictionary containing all keys from both inputs, combining where present in both.

group_tuple(channel: List[Tuple], key_index: Optional[int] = None) -> List[Tuple]

def group_tuple(channel: List[Tuple], key_index: Optional[int] = None) -> List[Tuple]:
Operator to mimic the groupTuple construct from Nextflow: The groupTuple operator collects tuples (or lists) of values emitted by the source channel grouping together the elements that share the same key. Finally it emits a new tuple object for each distinct key collected. Args
  • channel: List[Tuple] A list of tuples to be grouped by key_index
  • key_index: Optional[int] Which index of the tuple to match against - if not provided, defaults to 0
Example
channel = [(1,'A'), (1,'B'), (2,'C'), (3, 'B'), (1,'C'), (2,'A'), (3, 'D')]
group_tuple(channel) # key_index defaults to grouping by the first element (index 0)
Returns
  • List[Tuple] List of grouped tuples, one per distinct key.
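The example's grouping can be reproduced with a short sketch of the groupTuple semantics (an illustrative re-implementation, not the SDK's code):

```python
# Illustrative re-implementation of the documented groupTuple behavior:
# tuples sharing a key are merged; non-key positions collect into lists.
def group_tuple_sketch(channel, key_index=None):
    idx = 0 if key_index is None else key_index
    groups = {}
    for tup in channel:
        groups.setdefault(tup[idx], []).append(tup)
    result = []
    for key, tups in groups.items():
        merged = tuple(
            key if i == idx else [t[i] for t in tups]
            for i in range(len(tups[0]))
        )
        result.append(merged)
    return result

channel = [(1, "A"), (1, "B"), (2, "C"), (3, "B"), (1, "C"), (2, "A"), (3, "D")]
grouped = group_tuple_sketch(channel)
```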

latch_filter(channel: List[Any], predicate: Union[Callable, re.Pattern, type, None]) -> List[Any]

def latch_filter(
    channel: List[Any],
    predicate: Union[Callable, re.Pattern, type, None],
) -> List[Any]:
Filters a given list with either a predicate, a regex, or a type Parameters
  • channel: List[Any]
  • predicate: Union[Callable, re.Pattern, type, None]
Returns
  • List[Any] Filtered list according to the predicate, pattern, or type. If no predicate is provided or unrecognized, returns the original channel.
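The documented dispatch (callable predicate, compiled regex, or type) can be sketched as below; the ordering and the regex-matching details are assumptions for illustration:

```python
import re

# Illustrative dispatch mirroring the documented behavior: filter by a
# compiled regex, by a type, by a callable predicate, else pass through.
def latch_filter_sketch(channel, predicate):
    if isinstance(predicate, re.Pattern):
        return [x for x in channel if isinstance(x, str) and predicate.match(x)]
    if isinstance(predicate, type):  # check before callable: types are callable
        return [x for x in channel if isinstance(x, predicate)]
    if callable(predicate):
        return [x for x in channel if predicate(x)]
    return channel  # no recognized predicate: original channel
```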

combine(channel_0: List[Any], channel_1: List[Any], by: Optional[int] = None) -> Union[List, Dict[str, List[Any]]]

def combine(
    channel_0: List[Any],
    channel_1: List[Any],
    by: Optional[int] = None,
) -> Union[List, Dict[str, List[Any]]]:
Creates a Cartesian product of the two provided channels, with the option to first group the elements of the channels by a certain index and then take individual products on the groups. Args
  • channel_0: List[Any] If by is provided, all elements must be tuples of the same length
  • channel_1: List[Any] If by is provided, all elements must be tuples of the same length as channel_0
  • by: Optional[int] If provided, which index to group by first.
Returns
  • Union[List, Dict[str, List[Any]]] The Cartesian product as a list of tuples, or a dictionary of grouped products if by is used.
Example
c0 = ['hello', 'ciao']
c1 = [1, 2, 3]
combine(c0, c1)
If by is provided, the function first groups the elements of both channels by the specified index and then takes the product within each group.
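For the no-`by` case, the product over the example channels can be sketched with the standard library (illustrative only):

```python
from itertools import product

# Illustrative Cartesian product for the simple (no `by`) case.
def combine_sketch(channel_0, channel_1):
    return list(product(channel_0, channel_1))

pairs = combine_sketch(["hello", "ciao"], [1, 2, 3])
```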

latch.functions.secrets

The module provides a utility to retrieve secrets stored in a Latch workspace via the Nucleus API.

Functions

get_secret()

A utility to allow users to reference secrets stored in their workspace on Latch. Important: When running an execution locally, whether on your own computer or using latch develop, the only secrets you will be able to access are the ones in your personal workspace. To use secrets from a shared workspace, register your workflow and run it on Latch. Examples:
get_secret("test-secret")
Returns the secret value from the response. Parameters
  • secret_name: str — Name of the secret to retrieve.
Returns
  • The secret value from the response (likely a string).
Raises
  • ValueError — If the HTTP response status code is not 200; the error message is taken from the response JSON as {"error": ...}.
Signature
def get_secret(secret_name: str):

latch.registry.project

Module description

This module defines the registry project model and related mutation helpers used to manage tables within a registry project. It includes a Project class that can lazily load project data, list contained tables, and perform batched mutations through a ProjectUpdate transaction. It also defines internal data structures used to represent mutations (upserts and deletes) and a dedicated exception for missing projects.

Classes

ProjectNotFoundError

Exception raised when a registry project cannot be found or access is denied.
  • Inherits: NotFoundError
Code examples are not provided for exceptions.

Project

Registry project (folder containing Tables). This class provides lazy loading of project data, accessors for the display name and contained tables, and a transaction mechanism to update the project. Docstring: Registry project (folder containing tables; see latch.registry.table.Table). latch.account.Account.list_registry_projects is the typical way to get a Project. Key attributes:
  • id: str Unique identifier.
Key methods:
  • load(self) -> None (Re-)populate this project instance’s cache. Future calls to most getters will return immediately without making a network request. Always makes a network request.
  • get_display_name(self, *, load_if_missing: bool = True) -> Optional[str] Get the display name of this project. This is an opaque string that can contain any valid Unicode data. Display names are not unique and must not be used as identifiers. If load_if_missing is True and the display name is not cached, the project is loaded first.
  • list_tables(self, *, load_if_missing: bool = True) -> Optional[List[Table]] List Registry tables contained in this project. If load_if_missing is True and the table list is not cached, the project is loaded first.
  • update(self, *, reload_on_commit: bool = True) -> Iterator["ProjectUpdate"] Start an update transaction. The transaction will commit when the context manager closes unless an error occurs. No changes occur until the transaction commits. If reload_on_commit is True, the project will be reloaded after commit.
  • __repr__(self) -> str Return a developer-friendly string representation.
  • __str__(self) -> str Alias for __repr__.
Code signatures (examples):
def load(self) -> None:
    ...
def get_display_name(self, *, load_if_missing: bool = True) -> Optional[str]:
    ...
def list_tables(self, *, load_if_missing: bool = True) -> Optional[List[Table]]:
    ...
def update(self, *, reload_on_commit: bool = True) -> Iterator["ProjectUpdate"]:
    ...
def __repr__(self) -> str:
    ...
def __str__(self) -> str:
    ...

ProjectUpdate

Represents a batch update transaction for a Project. Accumulates mutations (upserts and deletes) and commits them in a single network request. Docstring: Dataclass for project update operations. Key methods:
  • upsert_table(self, display_name: str) -> None Creates an upsert mutation for a table with the given display name. Not idempotent; multiple calls with the same args create multiple tables.
  • _add_table_upserts_selection(self, upserts: List[_ProjectTablesUpsertData], mutations: List[l.SelectionNode]) -> None Internal helper to build GraphQL selection for upsert mutations.
  • delete_table(self, id: str) -> None Creates a delete mutation for the table with the given id.
  • _add_table_deletes_selection(self, deletes: List[_ProjectTablesDeleteData], mutations: List[l.SelectionNode]) -> None Internal helper to build GraphQL selection for delete mutations.
  • commit(self) -> None Commit this project update transaction. All pending updates are committed in one network request; the mutation is atomic. May be called multiple times.
  • clear(self) -> None Remove pending updates. Cancels any uncommitted mutations.
Code signatures (examples):
def upsert_table(self, display_name: str) -> None:
    ...
def _add_table_upserts_selection(
    self, upserts: List[_ProjectTablesUpsertData], mutations: List[l.SelectionNode]
) -> None:
    ...
def delete_table(self, id: str) -> None:
    ...
def _add_table_deletes_selection(
    self, deletes: List[_ProjectTablesDeleteData], mutations: List[l.SelectionNode]
) -> None:
    ...
def commit(self) -> None:
    ...
def clear(self) -> None:
    ...
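The transaction pattern (mutations accumulate, then commit atomically when the context manager closes) can be sketched without the SDK. The method names mirror the documented API, but the class below is purely illustrative:

```python
from contextlib import contextmanager

# Illustrative sketch of the documented transaction pattern: mutations
# accumulate in memory and are applied only when the context closes.
class UpdateSketch:
    def __init__(self):
        self.pending = []
        self.committed = []

    def upsert_table(self, display_name):
        self.pending.append(("upsert", display_name))

    def delete_table(self, id):
        self.pending.append(("delete", id))

    def clear(self):
        # Cancel any uncommitted mutations
        self.pending = []

    def commit(self):
        # In the real SDK this is one atomic network request
        self.committed.extend(self.pending)
        self.pending = []

@contextmanager
def update():
    upd = UpdateSketch()
    yield upd
    upd.commit()  # runs only if no exception escaped the block

with update() as upd:
    upd.upsert_table("Samples")
```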

latch.account

Module description

Public API for account management in the latch SDK. Exposes Account and AccountUpdate classes to fetch and mutate registry projects within a workspace. It uses GraphQL via the execute function to fetch project data and perform mutations. Exposed exception: AccountNotFoundError.

Classes

Account

Class representing a user or team workspace. Can be used to fetch related resources.
  • Docstring: User or team workspace. Can be used to fetch related resources. Account.current is the typical way of getting an Account. If the current request signer (CLI user or execution context) lacks permissions to fetch some information, the corresponding operations will act as if the information does not exist. Update operations will usually produce errors.
  • Public attribute
    • id: str — Unique identifier.
    Description based on name and usage in code.
  • Methods
    • current() Get current account. Python signature:
      @classmethod
      @cache
      def current(cls) -> Self:
      
      Description: In an execution context, this is the workspace in which the execution was run. In the CLI context (when running latch commands) this is the current setting of latch workspace, which defaults to the user’s default workspace. Returns: Current account.
    • load() (Re-)populate this account instance’s cache. Python signature:
      def load(self) -> None:
      
      Description: Future calls to most getters will return immediately without making a network request. Always makes a network request.
    • list_registry_projects(self, *, load_if_missing: bool = True) -> Optional[List[Project]] Python signature:
      def list_registry_projects(
          self, *, load_if_missing: bool = True
      ) -> Optional[List[Project]]:
      
      Description: List Registry projects owned by this workspace. Parameters:
      • load_if_missing: bool — If true, load the project list if not in cache. If false, return None if not in cache.
      Returns: Projects owned by this workspace.
    • update(self, *, reload_on_commit: bool = True) -> Iterator["AccountUpdate"] Python signature:
      @contextmanager
      def update(self, *, reload_on_commit: bool = True) -> Iterator["AccountUpdate"]:
      
      Description: Start an update transaction. The transaction will commit when the context manager closes unless an error occurs. No changes will occur until the transaction commits. The transaction can be cancelled by running AccountUpdate.clear before closing the context manager.
    • __repr__(self) Python signature:
      def __repr__(self):
      
      Description: Returns a string representation of the account.
    • __str__(self) Python signature:
      def __str__(self):
      
      Description: String representation of the account (usually same as repr).

AccountUpdate

Class for account update transactions.
  • Public class description (inferred/purpose): Class for account update transactions.
  • Methods
    • upsert_registry_project(self, display_name: str) Python signature:
      def upsert_registry_project(self, display_name: str):
      
      Description: Upsert a registry project. Not idempotent. Two calls with the same args will create two projects. Parameters:
      • display_name: str — Display name of the new project.
    • _add_registry_projects_upsert_selection(self, upserts: List[_AccountRegistryProjectsUpsertData], mutations: List[l.SelectionNode]) Python signature:
      def _add_registry_projects_upsert_selection(
          self,
          upserts: List[_AccountRegistryProjectsUpsertData],
          mutations: List[l.SelectionNode],
      ):
      
      Description: Prepare GraphQL selection for upsert mutations based on pending upserts.
    • delete_registry_project(self, id: str) Python signature:
      def delete_registry_project(self, id: str):
      
      Description: Delete a registry project by ID.
    • _add_registry_projects_delete_selection(self, deletes: List[_AccountRegistryProjectsDeleteData], mutations: List[l.SelectionNode]) Python signature:
      def _add_registry_projects_delete_selection(
          self,
          deletes: List[_AccountRegistryProjectsDeleteData],
          mutations: List[l.SelectionNode],
      ):
      
      Description: Prepare GraphQL selection for delete mutations based on pending deletes.
    • commit(self) -> None Python signature:
      def commit(self) -> None:
      
      Description: Commit this account update transaction. All pending updates are committed with one network request. Atomic: The entire transaction either commits or fails with an exception.
    • clear(self) Python signature:
      def clear(self):
      
      Description: Remove pending updates. May be called to cancel any pending updates that have not been committed.

Exceptions

AccountNotFoundError

Subclass of NotFoundError used when an account does not exist or you lack permissions.
  • Python signature:
class AccountNotFoundError(NotFoundError): ...
Description:
  • Raised when the account identified by id does not exist or permissions are insufficient to access it.

Examples

Basic usage patterns based on the signatures above:
from latch.account import Account

# Get the current account (workspace)
acc = Account.current()

# List registry projects in the current workspace
projects = acc.list_registry_projects()

# Begin an update transaction
with acc.update() as upd:
    upd.upsert_registry_project("New Project")

# Cancel pending updates before commit with clear()
with acc.update() as upd:
    upd.upsert_registry_project("Scratch Project")
    upd.clear()  # nothing is committed