docs: update README
olimorris committed Mar 31, 2024
1 parent 6f68775 commit 07fb727
Showing 2 changed files with 22 additions and 3 deletions.
23 changes: 21 additions & 2 deletions README.md
@@ -28,6 +28,7 @@ Currently supports: Anthropic, Ollama and OpenAI adapters

- :speech_balloon: A Copilot Chat experience from within Neovim
- :electric_plug: Adapter support for many generative AI services
- :robot: Agentic workflows to improve LLM output
- :rocket: Inline code creation and modification
- :sparkles: Built-in actions for specific language prompts, LSP error fixes and code advice
- :building_construction: Create your own custom actions for Neovim
@@ -341,8 +342,22 @@ Both of these actions utilise the `chat` strategy. The `Chat` action opens up a

This action enables users to easily navigate between their open chat buffers. A chat buffer can be deleted (and removed from memory) by pressing `<C-c>`.

#### Agentic Workflows

> "Instead of having an LLM generate its final output directly, an agentic workflow prompts the LLM multiple times, giving it opportunities to build step by step to higher-quality output"
As outlined in Andrew Ng's [tweet](https://twitter.com/AndrewYNg/status/1773393357022298617), agentic workflows have the ability to dramatically improve the output from an LLM. The plugin supports this via the use of workflows. Currently, the plugin only supports "reflection" (multiple prompts within the same application) and comes in built with some for:

- Adding a new feature
- Fixing a bug

Of course, you can add new workflows by following the [RECIPES](RECIPES.md) guide.
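
As a rough sketch only (the `prompts` field and its shape are assumptions; `name`, `strategy` and `description` match the action definitions in `lua/codecompanion/actions.lua`, and [RECIPES](RECIPES.md) documents the real schema), a custom reflection workflow might look something like:

```lua
-- Hypothetical two-step "reflection" workflow: the first prompt asks for a
-- change, the second asks the LLM to critique and refine its own answer.
-- The `prompts` table is an assumed shape; see RECIPES.md for the actual format.
local refactor_with_review = {
  name = "Refactor with self-review...",
  strategy = "chat",
  description = "Refactor the selection, then have the LLM review its own output",
  prompts = {
    -- Step 1: the initial request
    {
      { role = "user", content = "Refactor the selected code for readability." },
    },
    -- Step 2: the reflection pass, sent once the first response has arrived
    {
      { role = "user", content = "Review your previous answer and improve it where you can." },
    },
  },
}

return refactor_with_review
```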

#### Inline code

> [!NOTE]
> The options available to the user in the Action Palette will depend on the Vim mode.

These actions utilise the `inline` strategy. They can be useful for writing inline code in a buffer or even refactoring a visual selection, all based on a user's prompt. The actions are designed to write code for the filetype of the buffer they are initiated in or, if run from a terminal prompt, to write commands.

The strategy comes with a number of helpers which the user can type in the prompt, similar to [GitHub Copilot Chat](https://github.blog/changelog/2024-01-30-code-faster-and-better-with-github-copilots-new-features-in-visual-studio/):
@@ -351,15 +366,19 @@ The strategy comes with a number of helpers which the user can type in the promp
- `/optimize` to analyze and improve the running time of the selected code
- `/tests` to create unit tests for the selected code
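
For example, with some code visually selected, a prompt along the lines of the following asks the LLM to write unit tests for the selection (the extra wording after the helper is purely illustrative; the helper itself is all that is required):

```
/tests and include edge cases for empty input
```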

> [!NOTE]
> The options available to the user in the Action Palette will depend on the Vim mode.

#### Code advisor

> [!NOTE]
> This option is only available in visual mode

As the name suggests, this action provides advice on a visual selection of code and utilises the `chat` strategy. The response from the API is streamed into a chat buffer which follows the `display.chat` settings in your configuration.

#### LSP assistant

> [!NOTE]
> This option is only available in visual mode

Taken from the fantastic [Wtf.nvim](https://github.com/piersolenski/wtf.nvim) plugin, this action provides advice on how to correct any LSP diagnostics which are present on the visually selected lines. Again, the `send_code = false` value can be set in your config to prevent the code itself from being sent to the generative AI service.
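
For reference, the two configuration options mentioned above can be sketched as follows; the exact nesting of these keys may differ between versions, so check the configuration section of this README for the authoritative layout:

```lua
-- Minimal sketch only: `send_code` and `display.chat` are the options referred
-- to above, but treat their exact position in the setup table as an assumption.
require("codecompanion").setup({
  send_code = false, -- never send the selected code itself to the generative AI service
  display = {
    chat = {
      -- settings that control how streamed responses are shown in the chat buffer
    },
  },
})
```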

## :rainbow: Helpers
2 changes: 1 addition & 1 deletion lua/codecompanion/actions.lua
@@ -290,7 +290,7 @@ M.static.actions = {
},
},
{
name = "Workflows...",
name = "Agentic Workflows...",
strategy = "chat",
description = "Workflows to improve the performance of your LLM",
picker = {
