
dumbpilot

Get inline completions using llama.cpp as a server backend

Usage

  1. Start llama.cpp/server in the background or on a remote machine
  2. Configure the host of the llama.cpp server
  3. Press ctrl+shift+l to trigger a code prediction (see the request sketch below)
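
The extension talks to the llama.cpp server over HTTP. As a rough illustration, a completion request might look like the TypeScript sketch below. The /completion endpoint and its n_predict / content fields follow the llama.cpp server example; the host, port, and parameter values are assumptions and should be adjusted to match your own setup and the extension's settings.

```typescript
// Minimal sketch of a completion request against a llama.cpp server.
// Host, port, and parameter values are illustrative defaults, not the
// extension's actual configuration.

interface CompletionRequest {
  prompt: string;      // text before the cursor
  n_predict: number;   // maximum number of tokens to generate
  temperature?: number;
  stop?: string[];     // strings that cut the completion short
}

interface CompletionResponse {
  content: string;     // generated completion text
}

async function fetchCompletion(
  prompt: string,
  host = "http://127.0.0.1:8080" // assumed default llama.cpp server address
): Promise<string> {
  const body: CompletionRequest = {
    prompt,
    n_predict: 64,
    temperature: 0.2,
    stop: ["\n\n"],
  };

  const res = await fetch(`${host}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });

  if (!res.ok) {
    throw new Error(`llama.cpp server returned ${res.status}`);
  }

  const data = (await res.json()) as CompletionResponse;
  return data.content;
}

// Example: request a completion for a code fragment.
// fetchCompletion("function add(a: number, b: number) {").then(console.log);
```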