Releases: latchbio/latch
Latch 2.11.2
Enhancements/Bug Fixes
- Fixed upload failure when a `LatchDir` is initialized with a local path given as a `Path` object
Latch 2.11.0
New Features
- Use best practices in `latch init` templates
  - `LatchOutputDir` used to indicate output location on Latch
  - Regex rules used to validate files
  - Split tasks into files
  - Include an empty template
  - Remove YAML metadata from docstring
  - Use messages in examples
  - Error handling
  - Add LICENSE file
  - Add README file
- Allow user to select template in GUI or pass flag
- Allow user to run `latch init .`
Enhancements/Bug Fixes
- `LatchDir` type transformer bug with `Annotated` types
- `LatchOutputDir` is fixed
Latch 2.10.0
New Features
- The `latch cp` command now displays an "x of n files" progress indicator and shows which stage of the download is in progress (network requests to fetch presigned URLs vs. downloading blob data).
- A new error is thrown when there is an inconsistency between a `LatchMetadata` object and its associated workflow's parameters.
- The function `get_secret`, which allows users to reference secrets they've uploaded to the Latch platform from within a workflow.
Removals
- Removed broken SDK tests (launching CRISPResso2)
Enhancements/Bug Fixes
- `requests` library given a higher version lower bound to fix a warning with one of its dependencies
- `lytekit` version updated
- Pin `numpy` to version `1.22` (breaking changes in later versions of this library)
- Better behavior when downloading directories during local development
- Force retry on connection closed on file uploads (POST requests more generally)
- `latch get-params` will escape class attribute names (representation of Enum type) if they are Python keywords
- `latch preview` now requires a directory argument instead of a workflow name argument, and now behaves consistently with regular parameter interface generation
- The crash reporter now prints stack traces of caught exceptions in the correct order
- `latch develop` now throws an error when run on a workflow that hasn't been registered yet
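The keyword escaping mentioned above can be pictured with a plain-Python sketch. This is a hypothetical helper for illustration only, not the SDK's actual implementation; appending a trailing underscore is one common convention (per PEP 8), but the exact scheme `latch get-params` uses may differ:

```python
import keyword

def escape_attr_name(name: str) -> str:
    """Append an underscore to names that collide with Python keywords,
    so generated Enum attribute names remain valid identifiers."""
    return name + "_" if keyword.iskeyword(name) else name

print(escape_attr_name("lambda"))   # lambda_
print(escape_attr_name("control"))  # control
```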
Internal Changes
- Reworked how internal configs are stored, eschewing a flat dictionary of API endpoints in favor of a nested dataclass. This removes a small class of potential mistakes arising from misspelling, and gives the benefit of IDE intellisense.
- Made both configs singletons.
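The nested-dataclass-plus-singleton pattern described here can be sketched as follows. Field and endpoint names below are invented for illustration; the SDK's real config differs:

```python
from dataclasses import dataclass, field

# Hypothetical endpoint paths for illustration; not the SDK's real config.
@dataclass(frozen=True)
class APIEndpoints:
    upload: str = "/file/upload"
    workflows: str = "/workflow/list"

@dataclass(frozen=True)
class Config:
    api_base: str = "https://example.latch.bio"
    endpoints: APIEndpoints = field(default_factory=APIEndpoints)

# Module-level singleton: every caller imports this one instance instead of
# constructing its own, so there is a single source of truth.
config = Config()

print(config.endpoints.upload)
```

Attribute access (`config.endpoints.upload`) is checked by the IDE, whereas a misspelled key in a flat dictionary (`config["uplaod"]`) would only fail at runtime.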
Latch 2.9.3
New Features
Local Development with latch develop
Version 2.9.3 introduces the `latch develop` command, which allows for quick iteration and easy debugging while developing a workflow.
Previously, to debug a workflow, a user would have to first register a new version of their workflow, and then run it on the platform to see if it worked.
This practice is inefficient and expensive, both in compute resources and in time invested. The goal of `latch develop` is to mitigate these costs and create a simple, seamless environment within which a user can iterate on and debug a workflow effectively.
To use this, navigate to a workflow directory and run `latch develop .`. This will spawn a REPL in which there are utilities for running tasks individually, running scripts, and spawning interactive shells to debug environment issues.
Important: to use this feature, you must have registered the specific workflow at least once.
Read more here.
Deprecations
The following are now deprecated, and will no longer be maintained. They have been moved to a dedicated `deprecated` folder and will be removed in a future release of `latch`.
- The commands `latch rm`, `latch mkdir`, and `latch touch`
- The operators `left_join`, `right_join`, `inner_join`, `outer_join`, `group_tuple`, `latch_filter`, and `combine`
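As an illustration of what these operators did, here is a plain-Python sketch of a `left_join`, assuming lists of tuples keyed by their first element (in the style of Nextflow channel joins). This is a hypothetical reimplementation, not the SDK's deprecated code:

```python
def left_join(left, right):
    """Join two lists of (key, value) tuples, keeping every left row and
    attaching the matching right-hand value, or None when no match exists."""
    right_index = {k: v for k, v in right}
    return [(k, v, right_index.get(k)) for k, v in left]

a = [("sample1", 1), ("sample2", 2)]
b = [("sample1", "x")]
print(left_join(a, b))  # [('sample1', 1, 'x'), ('sample2', 2, None)]
```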
v2.4.1
- Multi-threaded file upload
- Tasks (both preset tasks, e.g. `small_task`, and `custom_task`) can accept the full range of kwargs that Flyte tasks ingest
- Caching is enabled by default
- Retries are enabled by default; additionally, most program failures from Python exceptions are considered "retryable" and automatically retry
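The retry-by-default behavior can be pictured with a plain-Python sketch. The decorator below is hypothetical and only illustrates the idea of treating most exceptions as retryable; it is not the SDK's actual retry mechanism:

```python
import functools
import time

def retryable(max_attempts: int = 3, delay_s: float = 0.0):
    """Re-run the wrapped function when it raises, treating most exceptions
    as retryable, up to max_attempts tries before surfacing the failure."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise  # out of attempts: re-raise the last error
                    time.sleep(delay_s)
        return wrapper
    return decorator

calls = []

@retryable(max_attempts=3)
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(flaky())  # ok (succeeds on the third attempt)
```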
v2.0.0
v1.17.0
Templates and Tags
This release brings the `--template` flag to `latch init`, as well as a `tags` field in `LatchMetadata`.
Templates
The Latch SDK now supports creating two new template workflows automatically, namely one already with a Dockerfile that installs `R`, and one already with a Dockerfile that installs `conda`. To generate these, simply use `latch init` with the `--template` option like so:
>>> latch init [package_root] --template=[...]
Valid values for the template option as of now are `r`, `conda`, and `default`. In particular, the `default` option (as well as just not providing the flag itself) creates the default `assemble_and_sort` workflow.
Tags
You can now give workflows tags: simply provide a list of strings to the `tags` parameter in `LatchMetadata`. These tags will be rendered in the console, and can be used to describe the biological domains your workflow falls under.
What's Changed
- add custom connections to login by @maximsmol in #152
- Better tutorials and examples by @hannahle in #153
- Docs: clarify how to use latch preview by @hannahle in #150
- templates for R and conda workflows by @ayushkamat in #154
- fix: exec type error by @r614 in #155
Full Changelog: v1.16.1...v1.17.0
v1.16.0
Curated workflow references
Custom workflow references have been deprecated in favor of curated references to the latest versions of pipelines in the Latch Verified project.
You can import and use a reference as follows.
from latch.verified import deseq2_wf

@workflow
def example(...):
    deseq2_wf(
        report_name=run_name,
        count_table_source="single",
        raw_count_table=count_matrix_file,
        raw_count_tables=[],
        count_table_gene_id_column="gene_id",
        output_location_type="default",
        output_location=custom_output_dir,
        conditions_source=conditions_source,
        manual_conditions=manual_conditions,
        conditions_table=conditions_table,
        design_matrix_sample_id_column=design_matrix_sample_id_column,
        design_formula=design_formula,
        number_of_genes_to_plot=30,
    )
Existing references
- deseq2_wf
- gene_ontology_pathway_analysis
- rnaseq
v1.15.0
This update introduces subworkflows and workflow references. These constructs allow for arbitrary composition of workflows within each other, enabling great organizational flexibility as well as reducing code duplication. Additionally, @mrland99 wrote docker build logs into the crash report if a workflow fails to register/build.
Subworkflow
To create a subworkflow, simply create two functions with the `@workflow` decorator and call one inside the other, as below:
@small_task
def this_is_a_sub_wf_task(a: int) -> int:
    print("This is a sub-workflow")
    return a + 1

@workflow
def this_is_a_sub_wf(b: int) -> int:
    return this_is_a_sub_wf_task(a=b)

@workflow
def this_is_a_top_level_wf(c: int) -> int:
    return this_is_a_sub_wf(b=c)
Workflow Reference
A reference workflow is distinct from a subworkflow in that it points to an existing, already-registered workflow, so workflows can be reused in other workflows without duplicating code. To create a workflow reference, simply annotate an empty function with the `@workflow_reference` decorator as below.
@workflow_reference(
    name="wf.__init__.assemble_and_sort",
    version="0.0.1",
)
def assemble_and_sort(read1: LatchFile, read2: LatchFile):
    ...
A few notes: the interface of the function must be the same as the workflow it is referencing. Moreover, the body of the function should be empty, and lastly, the version of the workflow being referenced must be available to the user in order to be referenced (i.e. you can’t reference a workflow you don’t have access to).
Docker Build Logs
See #119: logs will automatically be found in `$(pwd)/.logs` and in the crash report tar. Thanks @mrland99.
Full Changelog: v1.14.1...v1.15.0
v1.14.1
- `medium_task` updated to large spot fleet