
dumbpilot

Get inline completions using llama.cpp as a server backend

Usage

  1. Start llama.cpp/server in the background or on a remote machine
  2. Configure the host in the extension settings
  3. Press ctrl+shift+l to request a code prediction
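The first step above might look like this, a minimal sketch assuming a local build of llama.cpp (the model path and port are examples, not defaults required by the extension):

```shell
# Build llama.cpp, then launch its HTTP server with a GGUF model.
# --host 0.0.0.0 makes it reachable from other machines; use 127.0.0.1 for local-only.
./server -m models/your-model.gguf --host 0.0.0.0 --port 8080 &
```

Point the extension's host setting at the machine and port the server is listening on (e.g. `localhost:8080` when running locally).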