Table of Contents

Core SDK
- latch.init - Main SDK initialization and imports

Workflow Resources
- latch.resources.tasks - Task decorators and execution
- latch.resources.workflow - Workflow definition and execution
- latch.resources.conditional - Conditional workflow logic
- latch.resources.map_tasks - Parallel task execution
- latch.resources.reference_workflow - Workflow references

Data Types
- latch.types.init - Type system overview
- latch.types.file - File handling (LatchFile, LatchOutputFile)
- latch.types.directory - Directory handling (LatchDir, LatchOutputDir)
- latch.types.metadata - Workflow metadata and UI configuration
- latch.types.glob - File pattern matching (file_glob)

Utility Functions
- latch.functions.messages - Console messaging (message)
- latch.functions.operators - Data manipulation operators (deprecated)
- latch.functions.secrets - Secret management (get_secret)

Data Management
- latch.ldata - Latch Data cloud storage (LPath)
latch.init
The Latch SDK is a command-line toolchain for defining and registering serverless workflows with the Latch platform. This module re-exports a collection of utilities and resources from several submodules and exposes them at the package level.
latch.resources.tasks
Latch tasks are decorators that turn Python functions into workflow 'nodes'. Each task is containerized, versioned, and registered with Flyte when a workflow is uploaded to Latch. Containerized tasks are then executed on arbitrary instances as Kubernetes Pods, scheduled using flytepropeller under the hood.
The type of instance that the task executes on (e.g. number of available resources, presence of a GPU) can be controlled by invoking one of the exported decorators.
Functions
Below are the functions available in the tasks module.
custom_memory_optimized_task(cpu: int, memory: int)
Description: Deprecated helper returning a custom task configuration with specified CPU and RAM allocations.
Notes:
- This function is deprecated and will be removed in a future release.
- It raises a deprecation warning.
Parameters:
- cpu (int): Number of CPU cores to request
- memory (int): Memory in GiB to request
Returns:
- A partial Flyte task configured with the specified Pod settings
custom_task(cpu: Union[Callable, int], memory: Union[Callable, int], *, storage_gib: Union[Callable, int] = 500, timeout: Union[datetime.timedelta, int] = 0, **kwargs)
Description: Returns a custom task configuration requesting the specified CPU/RAM allocations. If cpu, memory, or storage_gib are callables, returns a dynamic task configuration using DynamicTaskConfig with a small pod config; otherwise, constructs a static _custom_task_config Pod.
Parameters:
- cpu (Union[Callable, int]): CPU cores to request (integer, or callable for a dynamic configuration)
- memory (Union[Callable, int]): Memory in GiB to request (integer or callable)
- storage_gib (Union[Callable, int], default 500): Storage in GiB (integer or callable)
- timeout (Union[datetime.timedelta, int], default 0): Timeout for the task
- **kwargs: Additional keyword arguments
Returns:
- A partial Flyte task configured for either a dynamic or a static custom task
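The callable-vs-integer dispatch described above can be sketched in plain Python. This is a simplified stand-in for illustration only, not the latch source; the real function builds Flyte task configurations rather than returning a string.

```python
# Simplified stand-in for the dispatch described above: custom_task builds a
# dynamic configuration when any resource argument is callable, and a static
# Pod configuration otherwise. (Illustrative only; not the latch source.)
def resolve_config_kind(cpu, memory, storage_gib=500):
    if any(callable(v) for v in (cpu, memory, storage_gib)):
        return "dynamic"  # resources computed from workflow inputs at runtime
    return "static"       # fixed Pod resource requests
```

Passing a lambda for any resource argument, e.g. `resolve_config_kind(lambda wf: wf.n_samples, 16)`, selects the dynamic path.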
lustre_setup_task()
Description: Returns a partial Flyte task configured for Lustre setup with a Nextflow work directory PVC.
Parameters:
- None
Returns:
- A partial Flyte task
nextflow_runtime_task(cpu: int, memory: int, storage_gib: int = 50)
Description: Returns a partial Flyte task configured for Nextflow runtime with a shared work directory volume mounted at /nf-workdir
.
Parameters:
- cpu (int): CPU cores
- memory (int): Memory in GiB
- storage_gib (int, default 50): Storage in GiB
Returns:
- A partial Flyte task
g6e_xlarge_task, g6e_2xlarge_task, g6e_4xlarge_task, g6e_8xlarge_task, g6e_12xlarge_task, g6e_16xlarge_task, g6e_24xlarge_task, g6e_48xlarge_task
Partial Flyte tasks configured for specific L40S GPU pod configurations.
Parameters:
- None
Returns:
- A partial Flyte task for each respective instance type and its resources
v100_x1_task, v100_x4_task, v100_x8_task
Partial Flyte tasks configured for specific V100 GPU pod configurations.
Parameters:
- None
Returns:
- A partial Flyte task for each respective instance type and its resources
latch.resources.workflow
This module defines decorators and helpers to convert Python callables into Flyte PythonFunctionWorkflow objects. It includes internal utilities for metadata generation and docstring injection, a workflow decorator that supports usage with or without arguments, and a nextflow_workflow helper for Nextflow-style workflows.
Functions
workflow()
Decorator to expose a Python function as a Flyte PythonFunctionWorkflow. Can be used as @workflow without arguments or with a LatchMetadata argument.
Parameters:
- metadata (Union[LatchMetadata, Callable]): Either a LatchMetadata instance or the function to decorate (when used without parentheses)
Returns:
- Union[PythonFunctionWorkflow, Callable]: A PythonFunctionWorkflow if used directly, or a decorator if used with arguments
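The with-or-without-arguments pattern can be sketched as follows. This is a minimal stand-in, not the latch implementation: the real decorator wraps the function in a Flyte PythonFunctionWorkflow, while here we only attach the metadata so the dispatch logic is visible.

```python
# Minimal sketch of a decorator usable both as @workflow and @workflow(metadata).
def workflow(metadata=None):
    if callable(metadata):
        # Used directly as @workflow: `metadata` is actually the function.
        func = metadata
        func.metadata = None
        return func

    # Used as @workflow(metadata): return a decorator that records the metadata.
    def decorator(func):
        func.metadata = metadata
        return func

    return decorator

@workflow
def wf_a():
    return "a"

@workflow({"display_name": "Example"})
def wf_b():
    return "b"
```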
nextflow_workflow()
Decorator to expose a Python function as a Nextflow-style workflow.
Parameters:
- metadata (NextflowMetadata): Metadata for the Nextflow-style workflow
Returns:
- Callable[[Callable], PythonFunctionWorkflow]: A decorator compatible with workflow
latch.resources.conditional
This module exposes a factory function to create a ConditionalSection for conditional execution in workflows. It delegates to conditional from flytekit.core.condition to create the ConditionalSection.
Functions
create_conditional_section()
Creates a new conditional section in a workflow, allowing a user to conditionally execute a task based on the value of a task result.
Notes:
- Conditional sections can be n-ary, with as many elif clauses as desired.
- Outputs from conditional nodes can be consumed, and outputs from other tasks can be passed to conditional nodes.
- Boolean expressions in the condition use the & (and) and | (or) operators.
- Unary expressions are not allowed. If a task returns a boolean, use built-in truth checks such as result.is_true() or result.is_false().
Parameters:
- name (str): The name of the conditional section, to be shown in Latch Console
Returns:
- ConditionalSection
latch.resources.map_tasks
map_tasks
A map task lets you run a pod task or a regular task over a list of inputs within a single workflow node. This means you can run thousands of instances of the task without creating a node for every instance, providing valuable performance gains. Some use cases of map tasks include:
- Several inputs must run through the same code logic
- Multiple data batches need to be processed in parallel
- Hyperparameter optimization
Parameters:
- task_function: The task to be mapped
Returns:
- A mapped task that runs over a list of inputs
latch.resources.reference_workflow
This module defines workflow_reference to create a Flyte Launch Plan reference using the current workspace as the project and the domain set to 'development'. It imports reference_launch_plan from flytekit.core.launch_plan and current_workspace from latch.utils.
Functions
workflow_reference()
Returns a Flyte Launch Plan reference for the given name and version in the current workspace under domain 'development'.
Parameters:
- name (str): The name of the launch plan to reference
- version (str): The version of the launch plan to reference
Returns:
- The value returned by reference_launch_plan, configured with project=current_workspace(), domain="development", name=name, and version=version
Examples
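The argument forwarding described above can be sketched with hypothetical stand-ins. The stand-in bodies below are invented for illustration; the real reference_launch_plan comes from flytekit.core.launch_plan and current_workspace from latch.utils, and the launch plan name is a placeholder.

```python
# Hypothetical stand-ins for illustration only.
def current_workspace():
    return "12345"  # placeholder workspace id

def reference_launch_plan(project, domain, name, version):
    return {"project": project, "domain": domain, "name": name, "version": version}

def workflow_reference(name, version):
    # Mirrors the forwarding described above: current workspace as the
    # project, domain fixed to "development".
    return reference_launch_plan(
        project=current_workspace(),
        domain="development",
        name=name,
        version=version,
    )

ref = workflow_reference("wf.example_assembly", "0.0.1")
```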
latch.types.init
Latch types package initializer. This module re-exports several type definitions and a utility function from submodules under latch.types. The available exports are:
LatchDir
LatchOutputDir
LatchFile
LatchOutputFile
file_glob
DockerMetadata
Fork
ForkBranch
LatchAppearanceType
LatchAuthor
LatchMetadata
LatchParameter
LatchRule
Params
Section
Spoiler
Text
Functions
file_glob()
Constructs a list of LatchFiles from a glob pattern.
A convenient utility for passing collections of files between tasks. See Nextflow's channels or Snakemake's wildcards for similar functionality in other orchestration tools.
The remote location of each constructed LatchFile is determined by appending the matched file name to the directory represented by remote_directory.
Parameters:
- pattern (str): A glob pattern to match a set of files, e.g. '*.py'. Paths are resolved with respect to the working directory of the caller.
- remote_directory (str): A valid latch URL pointing to a directory, e.g. latch:///foo. This must be a directory and not a file.
- target_dir (Optional[Path]): An optional Path object to define an alternate working directory for path resolution.
Returns:
- List[LatchFile]: A list of instantiated LatchFile objects.
Classes
This module re-exports the following classes from their respective submodules. For detailed documentation, see the individual module sections:
- LatchDir - See latch.types.directory
- LatchOutputDir - See latch.types.directory
- LatchFile - See latch.types.file
- LatchOutputFile - See latch.types.file
- DockerMetadata - See latch.types.metadata
- Fork - See latch.types.metadata
- ForkBranch - See latch.types.metadata
- LatchAppearanceType - See latch.types.metadata
- LatchAuthor - See latch.types.metadata
- LatchMetadata - See latch.types.metadata
- LatchParameter - See latch.types.metadata
- LatchRule - See latch.types.metadata
- Params - See latch.types.metadata
- Section - See latch.types.metadata
- Spoiler - See latch.types.metadata
- Text - See latch.types.metadata
latch.types.file
Latch types for file handling within Flyte tasks. This module defines a LatchFile class to represent a file object with both a local path and an optional remote path, a type alias for an output file, and a transformer that converts between LatchFile instances and Flyte Literals.
Notes:
- LatchOutputFile is a type alias for LatchFile annotated as an output in Flyte. It is defined as: Annotated[LatchFile, FlyteAnnotation({"output": True})]
Classes
Other notable methods and properties:
- size(self) -> int: Returns the size of the remote data via LPath(self.remote_path).size()
- local_path (property) -> str: Local file path for the environment executing the task
- remote_path (property) -> Optional[str]: Remote URL referencing the object (Latch Data or S3)
latch.types.directory
Module for directory handling in Latch workflows and standalone Python environments. Provides LatchDir for working with directories that can be stored locally or remotely (on Latch Data or S3).
Classes
LatchDir
Represents a directory with both local and remote path management.
Constructor:
- path (Union[str, PathLike]): The local path to the directory
- remote_path (Optional[PathLike]): The remote path (latch:// or s3:// URL) where the directory is stored
Properties:
- local_path (str): Local directory path for the environment executing the task
- remote_path (Optional[str]): Remote URL referencing the directory (Latch Data or S3)
Methods:
- iterdir() -> List[Union[LatchFile, LatchDir]]: Returns a list of the directory's children (files and subdirectories)
- size_recursive() -> int: Returns the total size of the directory and all its contents recursively
LatchOutputDir
A LatchDir tagged as the output of some workflow; it may point to objects that don't exist yet, since outputs are created during execution.
latch.types.metadata
Module for defining workflow metadata, parameter configurations, and UI flow elements. It provides the building blocks for creating rich, interactive workflow interfaces in the Latch Console, including parameter validation, custom UI layouts, and integration with Snakemake/Nextflow workflows.
Functions
default_samplesheet_constructor(samples: List[DC], t: DC, delim: str = ",") -> Path
Creates a CSV samplesheet from a list of dataclass instances.
Parameters:
- samples (List[DC]): List of dataclass instances to convert to CSV
- t (DC): The dataclass type to use for column headers
- delim (str): CSV delimiter, defaults to ","
Returns:
- Path: Path to the created samplesheet.csv file
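The header-from-fields construction described above can be sketched in plain Python. This is an illustrative stand-in, not the latch implementation; the stand-in name build_samplesheet and the Sample dataclass are invented for the example.

```python
import csv
import tempfile
from dataclasses import astuple, dataclass, fields
from pathlib import Path

# Illustrative sketch of the samplesheet construction described above:
# one header row from the dataclass fields, then one row per instance.
def build_samplesheet(samples, t, delim=","):
    path = Path(tempfile.mkdtemp()) / "samplesheet.csv"
    with path.open("w", newline="") as f:
        writer = csv.writer(f, delimiter=delim)
        writer.writerow([field.name for field in fields(t)])
        for sample in samples:
            writer.writerow(astuple(sample))
    return path

@dataclass
class Sample:
    name: str
    fastq: str

sheet = build_samplesheet([Sample("s1", "s1.fastq"), Sample("s2", "s2.fastq")], Sample)
```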
Classes
LatchRule
Defines validation rules for parameter inputs using regular expressions.
Attributes:
- regex (str): Regular expression pattern that inputs must match
- message (str): Error message displayed when validation fails
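A rule of this shape can be sketched with the standard re module. This stand-in only illustrates the regex/message contract; the helper name apply_rule and the example pattern are invented, and the real validation runs in the Latch Console.

```python
import re

# Sketch of how a LatchRule-style check behaves: the input is valid when it
# matches the regex; otherwise the rule's message is surfaced to the user.
def apply_rule(value, regex, message):
    if re.match(regex, value) is None:
        return message  # validation failed; show this error in the UI
    return None         # validation passed

error = apply_rule("reads.txt", r".*\.fastq(\.gz)?$", "Input must be a .fastq file")
```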
LatchAppearanceEnum
Controls how text input fields are rendered in the UI.
Values:
- line: Single-line text input
- paragraph: Multi-line text area
MultiselectOption
Represents a single option in a multiselect widget.
Attributes:
- name (str): Display name shown in the UI
- value (object): Value associated with this option
Multiselect
Creates a multiselect input widget with predefined options.
Attributes:
- options (List[MultiselectOption]): List of available options
- allow_custom (bool): Whether users can enter custom values
LatchAuthor
Contains metadata about the workflow author.
Attributes:
- name (Optional[str]): Author's name
- email (Optional[str]): Author's email address
- github (Optional[str]): Link to author's GitHub profile
UI Flow Elements
These classes define the layout and organization of workflow parameters in the Latch Console UI.
FlowBase
Base class for all UI flow elements. This is a frozen dataclass that serves as the foundation for organizing workflow interfaces.
Section
Creates a card with a title containing child flow elements.
Constructor:
- section (str): Title displayed on the section card
- *flow (FlowBase): Variable number of flow elements to display in the section
Text
Displays markdown-formatted text in the UI.
Attributes:
- text (str): Markdown content to display
Title
Displays a markdown title in the UI.
Attributes:
- title (str): Markdown title text
Params
Displays parameter input widgets for specified parameters.
Constructor:
- *args (str): Names of parameters to display
Spoiler
Creates a collapsible section with a title and child flow elements.
Constructor:
- spoiler (str): Title of the collapsible section
- *flow (FlowBase): Flow elements to display when expanded
ForkBranch
Defines a single branch within a Fork element.
Constructor:
- display_name (str): Text displayed on the branch button
- *flow (FlowBase): Flow elements to display when this branch is active
Fork
Creates a conditional UI flow where users can select between mutually exclusive options.
Constructor:
- fork (str): Name of the string parameter that stores the selected branch key
- display_name (str): Title shown above the fork selector
- **flows (ForkBranch): Named branches, where keys become the parameter values
Core Metadata Classes
LatchParameter
Defines metadata and behavior for workflow parameters in the Latch Console UI.
Key Attributes:
- display_name (Optional[str]): Human-readable name for the parameter
- description (Optional[str]): Help text describing the parameter
- hidden (bool): Whether to hide the parameter by default
- placeholder (Optional[str]): Placeholder text in input fields
- output (bool): Whether this parameter represents a workflow output
- rules (List[LatchRule]): Validation rules for the parameter
- appearance_type (LatchAppearance): How to render the input (line/paragraph/multiselect)
- samplesheet (Optional[bool]): Enable samplesheet input UI
- allowed_tables (Optional[List[int]]): Registry table IDs allowed for samplesheet
LatchMetadata
The main class for defining workflow metadata and UI configuration.
Core Attributes:
- display_name (str): Human-readable workflow name
- author (LatchAuthor): Workflow author information
- documentation (Optional[str]): Link to workflow documentation
- repository (Optional[str]): Link to source code repository
- license (str): SPDX license identifier
- parameters (Dict[str, LatchParameter]): Parameter definitions
- flow (List[FlowBase]): UI layout configuration
- tags (List[str]): Categorization tags
- wiki_url (Optional[str]): Link to wiki documentation
- video_tutorial (Optional[str]): Link to tutorial video
- about_page_path (Optional[Path]): Path to markdown about page
Integration Metadata Classes
DockerMetadata
Configuration for private Docker repositories.
Attributes:
- username (str): Docker registry username
- secret_name (str): Name of the secret containing the password
SnakemakeMetadata(LatchMetadata)
Extended metadata class for Snakemake workflows with additional Snakemake-specific configuration.
Additional Attributes:
- output_dir (Optional[LatchDir]): Directory for Snakemake outputs
- docker_metadata (Optional[DockerMetadata]): Docker registry credentials
- cores (int): Number of cores for Snakemake execution
- parameters (Dict[str, SnakemakeParameter]): Snakemake-specific parameter metadata
NextflowMetadata(LatchMetadata)
Extended metadata class for Nextflow workflows with additional Nextflow-specific configuration.
Additional Attributes:
- runtime_resources (NextflowRuntimeResources): Computational resources
- execution_profiles (List[str]): Nextflow execution profiles
- log_dir (Optional[LatchDir]): Directory for Nextflow logs
- parameters (Dict[str, NextflowParameter]): Nextflow-specific parameter metadata
Helper Classes
SnakemakeParameter(Generic[T], LatchParameter)
Snakemake-specific parameter with type information and default values.
NextflowParameter(Generic[T], LatchParameter)
Nextflow-specific parameter with samplesheet integration and results path configuration.
NextflowRuntimeResources
Defines computational resources for Nextflow tasks.
Attributes:
- cpus (Optional[int]): Number of CPUs
- memory (Optional[int]): Memory in GiB
- storage_gib (Optional[int]): Storage in GiB
- storage_expiration_hours (int): Workdir retention time in hours
latch.types.glob
Module for creating collections of LatchFile objects from glob patterns. This is useful for batch processing files and passing file collections between workflow tasks.
Functions
file_glob()
Creates a list of LatchFile objects by matching files with a glob pattern and mapping them to a remote directory.
Signature:
file_glob(pattern: str, remote_directory: str, target_dir: Optional[Path] = None) -> List[LatchFile]
Parameters:
- pattern (str): Glob pattern to match files (e.g., "*.fastq.gz", "data/*.csv")
- remote_directory (str): Latch URL pointing to a directory (e.g., "latch:///my_data")
- target_dir (Optional[Path]): Alternative working directory for pattern resolution
Returns:
- List[LatchFile]: List of LatchFile objects with local paths matching the pattern and remote paths in the specified directory
Notes:
- The remote_directory must be a valid Latch URL pointing to a directory (not a file)
- If the remote directory URL is invalid, an empty list is returned
- Each matched file gets a remote path constructed by appending the filename to the remote directory
- This is particularly useful for workflows that need to process multiple files of the same type
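The remote-path construction in the notes above can be sketched in plain Python. This stand-in is illustrative only: the matched list stands in for local glob results, and the real file_glob globs the caller's working directory itself and returns LatchFile objects.

```python
from pathlib import PurePosixPath

# Illustrative sketch of the remote-path construction described above:
# each matched file name is appended to the remote directory.
def remote_paths(matched, remote_directory):
    base = remote_directory.rstrip("/")
    return [f"{base}/{PurePosixPath(p).name}" for p in matched]

paths = remote_paths(["a.fastq.gz", "data/b.fastq.gz"], "latch:///my_data")
```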
latch.functions.messages
Module for displaying messages to users during workflow execution. These messages appear prominently in the Latch Console and help communicate task status, warnings, and errors to users.
Functions
message(typ: str, data: Dict[str, Any]) -> None
Displays a message in the Latch Console during task execution. Messages are shown on the task execution page and help users understand what’s happening or if there are issues.
Parameters:
- typ (str): Message type determining display style. Options:
  - "info": Informational messages (blue styling)
  - "warning": Warning messages (yellow/orange styling)
  - "error": Error messages (red styling)
- data (Dict[str, Any]): Message content with required keys:
  - "title" (str): Brief message title
  - "body" (str): Detailed message content
Returns:
- None
Raises:
- RuntimeError: If message processing fails
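The expected call shape can be sketched with a stand-in that only validates the payload. The function check_message below is invented for illustration; the real message() sends the payload to the Latch Console during task execution.

```python
# Illustrative stand-in showing the call shape of message().
VALID_TYPES = {"info", "warning", "error"}

def check_message(typ, data):
    if typ not in VALID_TYPES:
        raise RuntimeError(f"unknown message type: {typ}")
    missing = {"title", "body"} - set(data)
    if missing:
        raise RuntimeError(f"missing required keys: {sorted(missing)}")

check_message("warning", {"title": "Low coverage", "body": "Sample s1 has <10x coverage"})
```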
latch.functions.operators
⚠️ DEPRECATED - This module is deprecated and may be removed in future versions. It provides utilities for data manipulation operations inspired by Nextflow channel operators, including functions for dictionary joins, tuple grouping, filtering, and Cartesian products.
Functions
left_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]
Performs a left join on two dictionaries, keeping all keys from the left dictionary.
Parameters:
- left (Dict[str, Any]): Left dictionary (all keys preserved)
- right (Dict[str, Any]): Right dictionary (matched keys only)
Returns:
- Dict[str, Any]: Dictionary with all left keys, combined with right values where keys match
right_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]
Performs a right join on two dictionaries, keeping all keys from the right dictionary.
Parameters:
- left (Dict[str, Any]): Left dictionary (matched keys only)
- right (Dict[str, Any]): Right dictionary (all keys preserved)
Returns:
- Dict[str, Any]: Dictionary with all right keys, combined with left values where keys match
inner_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]
Performs an inner join on two dictionaries, keeping only keys present in both dictionaries.
Parameters:
- left (Dict[str, Any]): Left dictionary
- right (Dict[str, Any]): Right dictionary
Returns:
- Dict[str, Any]: Dictionary containing only keys present in both inputs, with combined values
outer_join(left: Dict[str, Any], right: Dict[str, Any]) -> Dict[str, Any]
Performs an outer join on two dictionaries, keeping all keys from both dictionaries.
Parameters:
- left (Dict[str, Any]): Left dictionary
- right (Dict[str, Any]): Right dictionary
Returns:
- Dict[str, Any]: Dictionary containing all keys from both inputs, with combined values where keys match
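The four join semantics can be sketched with plain dictionaries. This is illustrative only; the deprecated latch implementation may combine matched values differently — here a matched key maps to a (left_value, right_value) pair.

```python
# Plain-dict sketch of the join semantics described above (not the latch source).
def left_join(left, right):
    return {k: (v, right[k]) if k in right else v for k, v in left.items()}

def right_join(left, right):
    return {k: (left[k], v) if k in left else v for k, v in right.items()}

def inner_join(left, right):
    return {k: (v, right[k]) for k, v in left.items() if k in right}

def outer_join(left, right):
    joined = left_join(left, right)
    joined.update({k: v for k, v in right.items() if k not in left})
    return joined

a = {"s1": "a.bam", "s2": "b.bam"}
b = {"s1": "a.vcf", "s3": "c.vcf"}
```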
group_tuple(channel: List[Tuple], key_index: Optional[int] = None) -> List[Tuple]
Groups tuples by a specified key index, mimicking Nextflow’s groupTuple
operator.
Parameters:
- channel (List[Tuple]): List of tuples to group
- key_index (Optional[int]): Index to group by (defaults to 0)
Returns:
- List[Tuple]: List of grouped tuples, one per distinct key
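The grouping behavior can be sketched as follows. This stand-in is illustrative; the deprecated latch helper mimics Nextflow's groupTuple, and its exact output shape may differ from the (key, grouped tuples) pairs used here.

```python
# Illustrative sketch of grouping tuples by a key index (not the latch source).
def group_tuple(channel, key_index=None):
    idx = 0 if key_index is None else key_index
    groups = {}
    for tup in channel:
        groups.setdefault(tup[idx], []).append(tup)
    return list(groups.items())  # one (key, grouped tuples) entry per distinct key

grouped = group_tuple([("s1", 1), ("s2", 2), ("s1", 3)])
```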
latch_filter(channel: List[Any], predicate: Union[Callable, re.Pattern, type, None]) -> List[Any]
Filters a list using a predicate function, regex pattern, or type check.
Parameters:
- channel (List[Any]): List to filter
- predicate (Union[Callable, re.Pattern, type, None]): Filter criteria:
  - Callable: Function that returns True/False for each item
  - re.Pattern: Regex pattern to match against strings
  - type: Type to filter by (e.g., str, int)
  - None: Returns original list unchanged
Returns:
- List[Any]: Filtered list based on the predicate
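The predicate dispatch can be sketched as follows (illustrative only, not the latch source). One design point worth noting: the type check must precede the callable check, since types are themselves callable.

```python
import re

# Illustrative sketch of the predicate dispatch described above.
def latch_filter(channel, predicate):
    if predicate is None:
        return channel
    if isinstance(predicate, re.Pattern):
        return [x for x in channel if isinstance(x, str) and predicate.match(x)]
    if isinstance(predicate, type):
        return [x for x in channel if isinstance(x, predicate)]
    if callable(predicate):
        return [x for x in channel if predicate(x)]
    return channel
```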
combine(channel_0: List[Any], channel_1: List[Any], by: Optional[int] = None) -> Union[List, Dict[str, List[Any]]]
Creates a Cartesian product of two lists, with optional grouping by a tuple index.
Parameters:
- channel_0 (List[Any]): First list to combine
- channel_1 (List[Any]): Second list to combine
- by (Optional[int]): If provided, group tuples by this index before combining
Returns:
- Union[List, Dict[str, List[Any]]]:
  - If by is None: List of tuples representing the Cartesian product
  - If by is provided: Dictionary with grouped products
Notes:
- If by is provided, all elements in both lists must be tuples of the same length.
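The Cartesian-product behavior can be sketched as follows (illustrative only, not the latch source). With by, tuples are grouped by that index first and products are formed within each group.

```python
from itertools import product

# Illustrative sketch of the combine behavior described above.
def combine(channel_0, channel_1, by=None):
    if by is None:
        return list(product(channel_0, channel_1))
    groups = {}
    for tup in channel_0:
        groups.setdefault(tup[by], ([], []))[0].append(tup)
    for tup in channel_1:
        groups.setdefault(tup[by], ([], []))[1].append(tup)
    return {key: list(product(lhs, rhs)) for key, (lhs, rhs) in groups.items()}
```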
latch.functions.secrets
Module for securely retrieving secrets stored in Latch workspaces. Secrets are encrypted values that can be used to store sensitive information like API keys, database passwords, or authentication tokens.
Functions
get_secret(secret_name: str) -> str
Retrieves a secret value from the Latch workspace.
Parameters:
- secret_name (str): Name of the secret to retrieve
Returns:
- str: The decrypted secret value
latch.ldata
Module for working with Latch Data (LData), Latch's cloud storage system. This module provides the LPath class for interacting with files and directories stored in Latch Data, including operations like uploading, downloading, copying, and metadata retrieval.
Classes
LPath
Represents a remote file or directory path hosted on Latch Data. Provides a pathlib-like interface for working with cloud storage.
Constructor:
- path (str): The Latch path; must start with "latch://"
- Lazy Loading: Metadata is fetched on-demand to minimize network requests
- Caching: Metadata is cached after first fetch for improved performance
- Path Operations: Supports path joining with the / operator
- File/Directory Operations: Upload, download, copy, delete, and directory listing
- Metadata Access: Get file size, content type, version ID, and node information
Metadata methods:
- fetch_metadata(): Force refresh of all cached metadata
- node_id(load_if_missing=True): Get the unique node ID
- name(load_if_missing=True): Get the file/directory name
- type(load_if_missing=True): Get the node type (file, directory, etc.)
- size(load_if_missing=True): Get file size in bytes
- size_recursive(load_if_missing=True): Get total size including subdirectories
- content_type(load_if_missing=True): Get MIME content type
- version_id(load_if_missing=True): Get version identifier
- is_dir(load_if_missing=True): Check if path is a directory
Directory methods:
- iterdir(): List contents of directory (non-recursive)
- mkdirp(): Create directory and all parent directories
- rmr(): Recursively delete files and directories
Transfer and path methods:
- upload_from(src: Path, show_progress_bar=False): Upload local file/directory
- download(dst=None, show_progress_bar=False, cache=False): Download to local path
- copy_to(dst: LPath): Copy to another LPath
- __truediv__(other): Join paths using the / operator
Best practices:
- Use load_if_missing=False when you know metadata is already cached
- Call fetch_metadata() to refresh a stale cache when needed
- Use cache=True in download() for repeated downloads of the same file
- Handle LatchPathError for robust error handling
- Use path joining with the / operator for cleaner code
- Check is_dir() before calling directory-specific methods
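The /-operator path joining can be sketched with a minimal stand-in class. This is not the latch LPath: the real class also performs network operations, caching, and validation; the class name SketchPath and the example paths are invented.

```python
# Minimal stand-in illustrating LPath-style path joining with the / operator.
class SketchPath:
    def __init__(self, path):
        if not path.startswith("latch://"):
            raise ValueError("paths must start with latch://")
        self.path = path

    def __truediv__(self, other):
        # Joining returns a new path object, as with pathlib.
        return SketchPath(f"{self.path.rstrip('/')}/{other}")

sample_dir = SketchPath("latch:///samples")
fastq = sample_dir / "s1" / "reads.fastq.gz"
```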
LatchPathError
Exception raised when LPath operations fail.
Attributes:
- message (str): Error description
- remote_path (Optional[str]): The Latch path that caused the error
- acc_id (Optional[str]): Account ID associated with the error
LDataNodeType
Enum representing different types of Latch Data nodes.
Values:
- account_root: Root directory of an account
- dir: Regular directory
- obj: File object
- mount: Mounted storage
- link: Symbolic link
- mount_gcp: Google Cloud Platform mount
- mount_azure: Azure mount