Pin Colab notebook link to specific version
krasserm committed Dec 4, 2022
1 parent 64874d3 commit b9bd83f
Showing 1 changed file with 2 additions and 2 deletions.
README.md: 4 changes (2 additions, 2 deletions)
@@ -69,7 +69,7 @@ See [Docker image](docs/docker-image.md) for details.
 - [Model construction](docs/model-construction.md)
 - [Pretrained models](docs/pretrained-models.md)
 - [Training examples](docs/training-examples.md)
-- [Inference examples](examples/inference.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/perceiver-io/blob/main/examples/inference.ipynb)
+- [Inference examples](examples/inference.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/perceiver-io/blob/0.7.0/examples/inference.ipynb)
 - [Building blocks](docs/building-blocks.md)

 ## Getting started
@@ -227,7 +227,7 @@ torch.save(model.state_dict(), "/path/to/model.pt")

 For generating text from a prompt via top-k sampling, `CausalLanguageModel` provides a `generate()` method. The following
 example first loads a trained model from a checkpoint and then generates text from a short sample prompt. An interactive
-demo is also available in the [Colab notebook](https://colab.research.google.com/github/krasserm/perceiver-io/blob/main/examples/inference.ipynb).
+demo is also available in the [Colab notebook](https://colab.research.google.com/github/krasserm/perceiver-io/blob/0.7.0/examples/inference.ipynb).

 ```python
 from perceiver.data.text import TextPreprocessor
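The hunk ends here because the diff view collapses the rest of the README's code example; only the first import line falls inside the shown context. As background on the sampling strategy mentioned in the paragraph above, the following is a minimal, hypothetical sketch of top-k sampling in plain PyTorch. It is not the perceiver-io `generate()` implementation, and the function name `sample_top_k` as well as the toy vocabulary size are illustrative assumptions.

```python
# Illustrative only: plain-PyTorch top-k sampling, not perceiver-io's generate().
import torch


def sample_top_k(logits: torch.Tensor, k: int = 10) -> torch.Tensor:
    """Sample one token id per sequence from the k highest-scoring logits.

    logits: tensor of shape (batch_size, vocab_size) for the next-token position.
    """
    top_values, top_indices = torch.topk(logits, k, dim=-1)   # (batch, k)
    probs = torch.softmax(top_values, dim=-1)                 # renormalize over the top k
    sampled = torch.multinomial(probs, num_samples=1)         # (batch, 1) index into the top k
    return top_indices.gather(-1, sampled).squeeze(-1)        # map back to vocabulary ids


# Toy usage with random logits standing in for a language model's output.
torch.manual_seed(0)
fake_logits = torch.randn(2, 262)  # hypothetical byte-level vocabulary of 262 tokens
next_tokens = sample_top_k(fake_logits, k=10)
print(next_tokens)  # tensor of shape (2,) with sampled token ids
```

Restricting sampling to the k highest-scoring tokens and renormalizing their probabilities keeps generation from drifting into very unlikely tokens while still adding variety compared to greedy decoding.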
