dumbpilot

Get inline completions using llama.cpp as a server backend

Usage

  1. Start llama.cpp/server in the background or on a remote machine
  2. Configure the host in the extension settings
  3. Press Ctrl+Shift+L to request a code prediction (see the request sketch below)
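
Under the hood, a prediction amounts to sending the text before the cursor to the llama.cpp server's `/completion` endpoint. The following is a minimal sketch of such a request, not the extension's actual code: it assumes the server is reachable at `127.0.0.1:8080`, and the function name and parameter values are illustrative. The endpoint and request fields follow the llama.cpp server HTTP API.

```typescript
// Hypothetical sketch of a completion request against a llama.cpp server.
// Host, port, and generation parameters are assumptions for illustration.
async function requestCompletion(prefix: string): Promise<string> {
	const response = await fetch("http://127.0.0.1:8080/completion", {
		method: "POST",
		headers: { "Content-Type": "application/json" },
		body: JSON.stringify({
			prompt: prefix,      // code before the cursor
			n_predict: 128,      // maximum number of tokens to generate
			temperature: 0.2,    // low temperature for more deterministic completions
			stop: ["\n\n"],      // stop generating at a blank line
		}),
	});
	const data = await response.json();
	return data.content;         // generated text to insert at the cursor
}
```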