
Releases: iansinnott/prompta

Prompta v4.0.2

05 Jan 02:29

This fixes a cors issue related to using 3rd party APIs (custom providers). See prior release notes for more details: https://github.com/iansinnott/prompta/releases/tag/v4.0.0

See the assets to download and install this version.

Prompta v4.0.0

05 Jan 01:48

More LLMs

This release brings two main features:

  • Improved support for 3rd party LLM providers (Fireworks, Mistral, LiteLLM, etc)
  • Prompta-provided LLMs to help users get started

What's new

screenshot_20240105013118.mp4

Using Mistral as an LLM provider

screenshot_20240105015316.mp4

3rd party LLMs

There are now many LLM providers offering OpenAI-compatible APIs, and Prompta should be able to use any of them easily. Previously the experience was not great: you could use a 3rd party LLM API, but you had to fiddle with settings to configure a custom provider and then undo those settings to get back to OpenAI. Now LLM providers can all coexist, and you can add as many as you like.
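To illustrate why OpenAI-compatible providers are easy to support side by side: they all expose the same chat completions route, so switching providers is essentially a matter of swapping the base URL and API key. A minimal sketch (the provider names and URLs below are illustrative assumptions, not Prompta's actual configuration):

```python
# Sketch: OpenAI-compatible providers differ mainly in base URL and API key.
# The provider list here is illustrative, not Prompta's built-in config.

def chat_endpoint(base_url: str) -> str:
    """Join a provider base URL with the standard chat completions path."""
    return base_url.rstrip("/") + "/chat/completions"

# Each entry uses the same request shape; only the URL (and key) changes.
providers = {
    "openai": "https://api.openai.com/v1",
    "mistral": "https://api.mistral.ai/v1",
    "fireworks": "https://api.fireworks.ai/inference/v1",
}

for name, base in providers.items():
    print(f"{name}: {chat_endpoint(base)}")
```

Because the request and response shapes match, the same client code can talk to any of these endpoints once the base URL is configurable.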

Prompta LLMs

For some time I've wanted to improve the first-time user experience of Prompta by making it possible to chat with an LLM immediately, without having to bring your own key. For existing users this is no big deal; we have OpenAI accounts and we know how to generate API keys. For some people, though, that's quite a lot of hassle. I want to be able to introduce friends and family to AI by sending them to Prompta, and if they have to sign up for OpenAI at the same time it's a non-starter.

There are lots of competing LLM providers now, so I've set up an endpoint that provides LLM access for free, with no sign-up required. Presumably almost no one will use the app, so it won't be cost prohibitive on my end, but I may have to revisit this if the API starts getting hammered.

For now this just means that all you need to do to use Prompta is open it in a browser tab.

I recognize that existing users probably don't care about this, so it's also possible to disable the Prompta LLMs in the settings. This provides the same experience as before: access to OpenAI via your own key.

What's Changed

  • Allow arbitrary LLM providers, Prompta-provided LLMs by @iansinnott in #29

Full Changelog: v3.3.0...v4.0.0

Prompta v3.3.0

20 Dec 06:43

See the assets to download and install this version.

Prompta v3.2.1

20 Dec 06:44

See the assets to download and install this version.

Prompta v3.2.0

17 Dec 09:59

Sync v2

Sync has been revamped so it actually works. The initial implementation used the vlcn p2p module, but it had issues whenever all clients weren't online at the same time, which is most of the time. Instead, there's now a sync server that stays up and handles connections from all clients. You can run your own sync server or use the default one.
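The design change above can be illustrated with a toy store-and-forward sketch: the server keeps an ordered change log, so a client that was offline when a change was pushed can still catch up later. This is a deliberate simplification for illustration, not vlcn's actual sync protocol:

```python
# Toy sketch of the store-and-forward idea behind the new sync server.
# Clients push changes whenever they come online; the server keeps every
# change so other clients can catch up later. Not vlcn's real protocol.

class SyncServer:
    def __init__(self):
        self.log = []  # ordered list of changes received so far

    def push(self, change: dict) -> int:
        """Accept a change from a client; return the new server version."""
        self.log.append(change)
        return len(self.log)

    def pull(self, since: int):
        """Return every change after `since`, plus the current version."""
        return self.log[since:], len(self.log)

server = SyncServer()
server.push({"id": 1, "text": "hello"})  # client A pushes while online
changes, version = server.pull(0)        # client B connects later, catches up
```

With a p2p design, client B would miss client A's change unless both were online together; the persistent server removes that requirement.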

What's Changed

  • Sync v2 by @iansinnott in #18
  • Some quality of life improvements when working with non-openai base urls by @nikvdp in #20

Full Changelog: v3.1.1...v3.2.0

Prompta v3.1.1

09 Dec 07:31

See the assets to download and install this version.

Prompta v3.1.0

07 Dec 09:41
Pre-release

See the assets to download and install this version.

Prompta v3.0.3

01 Dec 07:10

See the assets to download and install this version.

Full Changelog: v3.0.2...v3.0.3

Prompta v3.0.2

01 Dec 07:09
Pre-release

See the assets to download and install this version.

Prompta v2.1.0

24 Nov 05:15

See the assets to download and install this version.

What's Changed

  • Add support for custom base urls and model names by @nikvdp in #13

Full Changelog: v2.0.1...v2.1.0