Releases: monarch-initiative/ontogpt

v1.0.4

27 Aug 16:12

What's Changed

Full Changelog: v1.0.3...v1.0.4

v1.0.3

12 Aug 15:59
53b64ae

What's Changed

Full Changelog: v1.0.2...v1.0.3

v1.0.2

06 Aug 16:32
e56d884

What's Changed

New Contributors

Full Changelog: v1.0.1...v1.0.2

v1.0.1

02 Aug 20:30
a2e67c0

What's Changed

Full Changelog: v1.0.0...v1.0.1

v1.0.0

30 Jul 17:04
b3f347d

🎉 Highlights 🎉

  • LLMs are now accessed through litellm, so OntoGPT can be used with a wide range of API endpoints and with local or alternative model providers. See the full list of supported models with the ontogpt list-models command.
  • Local, open models may be downloaded and used through the ollama package.
  • Numerous bugfixes
  • Documentation updates
  • Updates for the webapp
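
The new litellm-based model selection can be sketched as follows (the ollama model name here is an illustrative assumption, not from the release notes; the list-models subcommand and the --model flag appear above and elsewhere in these notes):

```shell
# List every model name OntoGPT recognizes through litellm
ontogpt list-models

# Extraction with a hypothetical local ollama model; pull it first
# with the ollama CLI (e.g. ollama pull llama3), then pass its name
ontogpt extract -t drug -i input.txt --model ollama/llama3
```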

If something seems broken, please let us know! Open an issue here: https://github.com/monarch-initiative/ontogpt/issues

What's Changed

New Contributors

Full Changelog: v0.3.15...v1.0.0

v1.0.0rc2

25 Jul 18:54
60e1c44
Pre-release

What's Changed

  • Add option to truncate input_text in outputs by @caufieldjh in #413
  • eliminate hardcoded gpt4-turbo from multilingual function by @leokim-l in #415
  • Misc changes for 1.0.0rc2 by @caufieldjh in #416
  • Enabled using alternative source for textract (#412)
  • A small fix for pubmed-annotate

New Contributors

Full Changelog: v1.0.0rc1...v1.0.0rc2

v1.0.0rc1

19 Jul 21:05
1ac2f36
Pre-release

This is a pre-release. See below for a preview of changes.

What's Changed

Full Changelog: v0.3.15...v1.0.0rc1

v0.3.15

12 Jun 14:59
ea2ee13

Highlights

  • Four new templates: alz_treat, gene_extraction, storms, and onto_usage - see below for more details.
  • The pubmed-extract and pubmed-annotate commands now allow setting the text segment size (in number of characters) with the max-text-length option. Many LLMs now allow for large contexts, so set this to 100000 or so to parse the full text of a scientific article (along with the --get-pmc option).
    • Example: ontogpt -vvv pubmed-annotate -t storms --get-pmc --model gpt-4o --limit 1 --max-text-length 100000 -o storms_test.yaml "36598999"

What's Changed

Full Changelog: v0.3.14...v0.3.15

v0.3.14

30 May 18:13
c25f581

What's Changed

  • Add custom template loader by @caufieldjh in #388
    • Custom templates in yaml may now be passed directly to ontogpt commands - no make build step needed.
  • Updates for docs by @caufieldjh in #389
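
Since OntoGPT templates are LinkML schemas, a custom template passed by path might look like this minimal sketch (all identifiers below are hypothetical and for illustration only; consult the OntoGPT template docs for the actual schema conventions):

```yaml
# my_template.yaml - a hypothetical minimal custom template.
id: http://example.org/my_template
name: my_template
prefixes:
  linkml: https://w3id.org/linkml/
default_range: string
classes:
  GeneSet:
    tree_root: true
    attributes:
      genes:
        description: semicolon-separated list of gene names mentioned in the text
        multivalued: true
```

It could then be passed directly by file path, e.g. ontogpt extract -t my_template.yaml -i input.txt (flag usage assumed from the examples elsewhere in these notes).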

Full Changelog: v0.3.13...v0.3.14

v0.3.13

29 May 14:05
4b42a58

What's Changed

  • Include only input-relevant named entities when producing output by @caufieldjh in #382
  • More webapp updates - more models and all templates by @caufieldjh in #385
  • Add notebook with two quick CLI and webapp examples by @caufieldjh in #386

Full Changelog: v0.3.12...v0.3.13