In this guide, we will walk through how to upload a simple Snakemake workflow to Latch.
First, generate Latch metadata from your workflow's existing `config.yaml` by typing `latch generate-metadata config.yaml`. This creates a `latch_metadata` folder in your workflow directory.
The `latch_metadata/__init__.py` file instantiates a `SnakemakeMetadata` object, which contains the Latch-specific metadata displayed on the Latch Console when executing a workflow. Feel free to update the `output_dir`, `display_name`, or `author` fields.
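The generated file looks roughly like the following. This is a minimal sketch, not the verbatim generated file: the import paths, the `latch:///snakemake_outputs` location, and the name/author values are assumptions, so check your own `latch_metadata/__init__.py` for the exact contents.

```python
# latch_metadata/__init__.py -- illustrative sketch; import paths and
# values may differ across Latch SDK versions.
from latch.types.directory import LatchDir
from latch.types.metadata import LatchAuthor, SnakemakeMetadata

SnakemakeMetadata(
    # Where workflow outputs are deposited on Latch Data (assumed path)
    output_dir=LatchDir("latch:///snakemake_outputs"),
    display_name="snakemake_tutorial_workflow",
    author=LatchAuthor(name="Your Name"),
    # `parameters` and `file_metadata` are discussed below
)
```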
The `SnakemakeMetadata` object also contains `parameters` and `file_metadata` fields specifying the workflow's input parameters. In this example, the parameters are of type `LatchDir` (a pointer to a directory hosted on Latch Data). When we register this workflow, these parameters will be exposed to the user on the Latch UI. Upon execution, the workflow orchestrator will download these directories to the local machine before executing the task.
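Conceptually, this staging step behaves like the sketch below. This is a hypothetical, stdlib-only simulation for illustration only, not Latch's actual implementation; the download is faked with a placeholder write.

```python
from pathlib import Path

def stage_inputs(file_metadata: dict[str, dict[str, str]], root: Path) -> list[Path]:
    """Materialize each remote input at its declared local path."""
    staged = []
    for param, meta in file_metadata.items():
        local = root / meta["path"]
        local.parent.mkdir(parents=True, exist_ok=True)
        # A real orchestrator would download meta["remote"] here; we fake it.
        local.write_text(f"downloaded from {meta['remote']}")
        staged.append(local)
    return staged
```

After staging, the Snakemake job sees ordinary local paths and never has to know about remote URIs.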
How does the orchestrator know which local path to download the remote files to? For each `SnakemakeParameter` of type `LatchFile` or `LatchDir`, we use a `SnakemakeFileMetadata` object to specify the local path to copy files to before the Snakemake job runs.
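Concretely, the two fields pair each parameter name with its display type and its local staging path. In this sketch the `samples` name, the `data/samples` path, and the `config=True` flag are illustrative assumptions; mirror whatever your generated file contains.

```python
from pathlib import Path

from latch.types.directory import LatchDir
from latch.types.metadata import SnakemakeFileMetadata, SnakemakeParameter

# Passed as the `parameters` field of SnakemakeMetadata: what the Latch UI shows.
parameters = {
    "samples": SnakemakeParameter(
        display_name="Sample Directory",
        type=LatchDir,
    ),
}

# Passed as the `file_metadata` field: before the Snakemake job runs, the
# remote directory chosen for `samples` is copied to data/samples locally.
file_metadata = {
    "samples": SnakemakeFileMetadata(
        path=Path("data/samples"),
        config=True,  # expose this path in the Snakemake config (assumed flag)
    ),
}
```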
Next, type `latch dockerfile .` to generate a `Dockerfile` in your root directory. Let's analyze each relevant section of the generated `Dockerfile`:
First, the `Dockerfile` installs the dependencies (`bwa`, `samtools`, etc.) required for the workflow to execute. The `latch dockerfile` command will detect the existence of an `environment.yaml` file in the root directory and create a conda environment from that file. If your workflow doesn't have an `environment.yaml` file, you must manually install packages in the `Dockerfile`.
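The conda section of the generated `Dockerfile` is typically along these lines. Treat this as an illustrative sketch rather than the verbatim generated output; the container paths and the `workflow` environment name are assumptions.

```dockerfile
# Build a conda environment from the workflow's environment.yaml
COPY environment.yaml /opt/latch/environment.yaml
RUN mamba env create \
    --file /opt/latch/environment.yaml \
    --name workflow

# Put the environment's tools (bwa, samtools, ...) on PATH
ENV PATH=/opt/conda/envs/workflow/bin:$PATH
```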
Add entries to your `.dockerignore` to avoid copying any large data files that you do not want in your container.
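For example, a `.dockerignore` excluding the test data and Snakemake's bookkeeping directory might look like this (hypothetical entries; adjust them to your repository layout):

```
data/
results/
.snakemake/
*.fastq.gz
```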
When you register the workflow, a `snakemake_jit_entrypoint.py` file is generated. Once the registration finishes, `stdout` provides a link to your workflow on Latch.
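Registration is run from the workflow root with the `latch register` command; a sketch assuming the `Snakefile` sits at the repository root (flags may differ across SDK versions):

```shell
# Build the container, generate the JIT entrypoint, and upload the workflow
latch register . --snakefile Snakefile
```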
The workflow directory contains a `data/` directory, which you can use for testing; upload it to Latch Data so that it can be selected as an input.
Once you have uploaded the data and selected the appropriate input parameters, click Launch Workflow
. You should now see the workflow task executing.
Once the run completes, the workflow outputs will be deposited in the `output_dir` folder, as defined in your Latch Metadata.