Simplify and clarify instructions for using dsl.importer component. (#3838)

Signed-off-by: Diego Lovison <diegolovison@gmail.com>
diegolovison authored Aug 26, 2024
1 parent 6eaa2e8 commit 9c29299
Showing 1 changed file with 2 additions and 2 deletions.
@@ -10,7 +10,7 @@ Unlike the other three authoring approaches, an importer component is not a gene

As described in [Pipeline Basics][pipeline-basics], inputs to a task are typically outputs of an upstream task. When this is the case, artifacts are easily accessed on the upstream task using `my_task.outputs['<output-key>']`. The artifact is also registered in ML Metadata when it is created by the upstream task.

-If you wish to use an existing artifact that was not generated by a task in the current pipeline or wish to use as an artifact an external file that was not generated by a pipeline at all, you can use a [`dsl.importer`][dsl-importer] component to load the artifact from its URI.
+If you wish to use an existing artifact that was not generated by a task in the current pipeline, you can use a [`dsl.importer`][dsl-importer] component to load the artifact from its URI.

You do not need to write an importer component; it can be imported from the `dsl` module and used directly:
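
For context, here is a minimal sketch of the documented usage (not part of this commit's diff): the `gs://` URI and the `print_dataset_path` consumer component are placeholders, while `artifact_uri`, `artifact_class`, `reimport`, and `metadata` are the `dsl.importer` parameters this page describes.

```python
from kfp import dsl


@dsl.component
def print_dataset_path(dataset: dsl.Input[dsl.Dataset]):
    # Hypothetical downstream component: just prints where the imported artifact lives.
    print(dataset.path)


@dsl.pipeline(name='importer-pipeline')
def importer_pipeline():
    # Load an existing artifact from its URI (placeholder bucket path).
    importer_task = dsl.importer(
        artifact_uri='gs://example-bucket/datasets/sample.txt',
        artifact_class=dsl.Dataset,
        reimport=False,
        metadata={'source': 'external'},  # optional metadata recorded on the artifact
    )
    # The imported artifact is consumed like any other upstream task output.
    print_dataset_path(dataset=importer_task.outputs['artifact'])
```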

@@ -37,4 +37,4 @@ You may also specify a boolean `reimport` argument. If `reimport` is `False`, KF
[pipeline-basics]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines
[dsl-importer]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.importer
[artifacts]: /docs/components/pipelines/user-guides/data-handling/artifacts
-[ml-metadata]: https://github.com/google/ml-metadata
+[ml-metadata]: https://github.com/google/ml-metadata
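
For context, a hedged sketch of the `reimport` flag mentioned above, under the assumption that `reimport=False` (the default) lets KFP reuse an artifact already registered in ML Metadata for the same URI, while `reimport=True` registers the URI again as a new artifact; the model URI below is a placeholder.

```python
from kfp import dsl


@dsl.pipeline(name='reimport-demo')
def reimport_demo():
    # Assumed behavior: with reimport=False, KFP may reuse an artifact
    # already registered in ML Metadata for this URI.
    reused = dsl.importer(
        artifact_uri='gs://example-bucket/models/model.joblib',  # placeholder URI
        artifact_class=dsl.Model,
        reimport=False,
    )

    # Assumed behavior: with reimport=True, the URI is registered as a new
    # artifact entry even if it was imported before.
    fresh = dsl.importer(
        artifact_uri='gs://example-bucket/models/model.joblib',  # placeholder URI
        artifact_class=dsl.Model,
        reimport=True,
    )
```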
