llm.nvim
A large language model (LLM) plugin for Neovim that provides commands to interact with LLMs (such as ChatGPT, Copilot, ChatGLM, Kimi, DeepSeek, OpenRouter, and local LLMs). GitHub Models are also supported.
[!IMPORTANT] A large language model (LLM) plugin that allows you to interact with LLMs in Neovim.
- Supports any LLM, such as GPT, GLM, Kimi, DeepSeek, Gemini, Qwen, or local LLMs (such as Ollama).
- Allows you to define your own AI tools, with different tools able to use different models.
- Most importantly, you can use free models provided by any platform (such as Copilot, GitHub Models, SiliconFlow, OpenRouter, Cloudflare, or other platforms).
[!NOTE] The configurations of different LLMs (such as Ollama and DeepSeek), UI configurations, and AI tools (including code completion) should be checked in the examples first. There you will find most of the information you want. Additionally, before using the plugin, ensure that your `LLM_KEY` is valid and that the environment variable is in effect.
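As a concrete sketch, the API key is typically exported as an environment variable in your shell profile. The variable name `LLM_KEY` comes from this README; the key value below is a placeholder:

```shell
# In ~/.bashrc or ~/.zshrc; replace the placeholder with your real API key.
export LLM_KEY="your-api-key-here"
# Cloudflare users additionally need their account (see Preconditions).
```

Restart Neovim (or `source` your profile) afterward so the variable is in effect.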
Contents
<!-- mtoc-start --> <!-- mtoc-end -->
Screenshots
Chat
Press `?` to display the shortcut key help window.
- Float-UI
- Split-UI
Quick Translation
`enable_cword_context = true`: translate the text under the cursor in normal mode.
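A minimal sketch of where this option might live in your setup. Only `enable_cword_context` appears in this README; the surrounding table structure, including `app_handler` and the `WordTranslate` tool name, is an assumption — check the plugin's examples directory for the real layout:

```lua
-- Hypothetical placement; verify against the plugin's examples.
require("llm").setup({
  app_handler = {
    WordTranslate = {
      opts = {
        enable_cword_context = true, -- translate the word under the cursor in normal mode
      },
    },
  },
})
```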
Explain Code
Streaming output | Non-streaming output
<p align= "center"> <img src="https://github.com/Kurama622/screenshot/blob/master/llm/llm-explain-code-compress.png" alt="llm-explain-code" width="800"> </p>Ask
One-time, no history retained.
You can configure `inline_assistant` to decide whether to display diffs (default: shown by pressing `d`).
<p align= "center"> <img src="https://github.com/user-attachments/assets/e3300e1f-dbd2-4978-bd60-ddf9106257cb" alt="llm-ask" width="800"> </p>Attach To Chat
You can configure `inline_assistant` to decide whether to display diffs (default: shown by pressing `d`).
<p align= "center"> <img src="https://github.com/user-attachments/assets/33ba7517-6cf1-4e52-b6b4-27e6a4fb1148" alt="llm-attach" width="800"> </p>Optimize Code
<p align= "center"> <img src="https://github.com/Kurama622/screenshot/blob/master/llm/llm-optimize-code-compress.png" alt="llm-optimize-code" width="800"> </p> <p align= "center"> <img src="https://github.com/user-attachments/assets/35c105b3-a2a9-4a6c-887c-cb20b77b3264" alt="llm-optimize-compare-action" width="800"> </p>Generate Test Cases
<p align= "center"> <img src="https://github.com/user-attachments/assets/b288e3c9-7d25-40cb-8645-14dacb571529" alt="test-case" width="800"> </p>AI Translation
<p align= "center"> <img src="https://github.com/user-attachments/assets/ff90b1b4-3c2c-40e6-9321-4bab134710ec" alt="llm-trans" width="800"> </p>Image Recognition
<p align= "center"> <img src="https://github.com/user-attachments/assets/95edeacf-feca-4dfe-bb75-02538a62c83e" alt="llm-images" width="800"> </p>Generate Git Commit Message
<p align= "center"> <img src="https://github.com/user-attachments/assets/261b21c5-0df0-48c2-916b-07f5ce0c981d" alt="llm-git-commit-msg" width="800"> </p>Generate Doc String
<p align= "center"> <img src="https://github.com/user-attachments/assets/a1ae0ba7-d914-4bcd-a136-b88d79f7eb91" alt="llm-docstring" width="800"> </p>Web Search
`/buffer` | `/file` | `@web_search`
Diagnostic
All of `disposable_ask_handler`, `attach_to_chat_handler`, `side_by_side_handler`, and `action_handler` can enable diagnostic features:
```lua
diagnostic = { min = vim.diagnostic.severity.HINT },
-- or
-- diagnostic = { vim.diagnostic.severity.WARN, vim.diagnostic.severity.ERROR },
-- see `:h diagnostic-severity`
```
<p align= "center">
<img src="https://github.com/user-attachments/assets/81973f0d-73d6-43f3-94cc-d4482233b503" alt="diagnostic" width="800">
</p>
Lsp
[!NOTE] These are new features and still under active iteration.
All of `disposable_ask_handler`, `attach_to_chat_handler`, and `action_handler` can enable LSP features:
```lua
lsp = {
  cpp = { methods = { "definition", "declaration" } },
  python = { methods = { "definition" } },
  lua = { methods = { "definition", "declaration" } },
  root_dir = { { "pyproject.toml", "setup.py" }, ".git" },
},
```
<p align= "center">
<img src="https://github.com/user-attachments/assets/2cadfaad-b201-4167-baa6-ec9ed5d7177b" alt="lsp" width="800">
</p>
<!-- ### [Code Completions](./examples/ai-tools/Code-Completions/) -->
<!-- - **virtual text** -->
<!-- <p align= "center"> -->
<!-- <img src="https://github.com/user-attachments/assets/9215ba1c-df62-4ca8-badb-cf4b62262c57" alt="completion-virtual-text" width="800"> -->
<!-- </p> -->
<!---->
<!-- - **blink.cmp or nvim-cmp** -->
<!-- <p align= "center"> -->
<!-- <img src="https://github.com/user-attachments/assets/93ef3c02-799d-435e-81fa-c4bf7df936d9" alt="completion-blink-cmp" width="800"> -->
<!-- </p> -->
Installation
Dependencies
- curl
- fzf >= 0.37.0: optional. The split-style preview of session history and image selection in the image recognition tool depend on fzf (the author's development environment uses 0.39.0).
- render-markdown.nvim: optional. Better Markdown preview depends on this plugin.
```lua
{
  "MeanderingProgrammer/render-markdown.nvim",
  dependencies = {
    {
      "nvim-treesitter/nvim-treesitter",
      branch = "main",
      config = function()
        vim.api.nvim_create_autocmd("FileType", {
          pattern = { "llm", "markdown" },
          callback = function()
            vim.treesitter.start(0, "markdown")
          end,
        })
      end,
    },
    "nvim-mini/mini.icons", -- if you use standalone mini plugins
  },
  ft = { "markdown", "llm" },
  config = function()
    require("render-markdown").setup({
      restart_highlighter = true,
      heading = {
        enabled = true,
        sign = false,
        position = "overlay", -- inline | overlay
        icons = { " ", " ", " ", " ", " ", " " },
        signs = { " " },
        width = "block",
        left_margin = 0,
        left_pad = 0,
        right_pad = 0,
        min_width = 0,
        border = false,
        border_virtual = false,
        border_prefix = false,
        above = "▄",
        below = "▀",
        backgrounds = {},
        foregrounds = {
          "RenderMarkdownH1",
          "RenderMarkdownH2",
          "RenderMarkdownH3",
          "RenderMarkdownH4",
          "RenderMarkdownH5",
          "RenderMarkdownH6",
        },
      },
      dash = {
        enabled = true,
        icon = "─",
        width = 0.5,
        left_margin = 0.5,
        highlight = "RenderMarkdownDash",
      },
      code = { style = "normal" },
    })
  end,
}
```
Preconditions
1. Register on the official website and obtain your API key (Cloudflare additionally requires obtaining an account).
2. Set the `LLM_KEY` (Cloudflare needs to set an additional `ACCOUNT`
