# dumbpilot

Get inline completions using llama.cpp as a server backend.
## Usage

- start llama.cpp's server in the background or on a remote machine
- configure the host
- press `ctrl+shift+l` to trigger code prediction
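The first two steps above might look like the following sketch. The model path and the host/port values are placeholders you would substitute for your own setup; the `/completion` endpoint and its `prompt`/`n_predict` fields are part of llama.cpp's server API, which is a quick way to check the server is reachable before pointing the extension at it.

```shell
# start llama.cpp's HTTP server with a local GGUF model
# (placeholder model path; add --host 0.0.0.0 to expose it on a network)
./llama-server -m models/your-model.gguf --host 127.0.0.1 --port 8080 &

# sanity-check the server with a raw completion request
curl http://127.0.0.1:8080/completion \
  -d '{"prompt": "int main(", "n_predict": 16}'
```

If the curl call returns a JSON body with generated text, the extension can use the same host and port.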