
dumbpilot

Get inline completions using llama.cpp as a server backend

Usage

  1. Start llama.cpp/server in the background or on a remote machine
  2. Configure the host the extension should connect to
  3. Press ctrl+shift+l to trigger a code prediction
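
The request the extension sends can be sketched roughly as follows. This is a minimal illustration, not the extension's actual code: the endpoint and the `prompt`/`n_predict`/`content` fields come from llama.cpp's built-in HTTP server, while the host, sampling parameters, and stop sequences here are placeholder values.

```typescript
// Shape of a completion request to llama.cpp's /completion endpoint.
interface CompletionRequest {
  prompt: string;      // code before the cursor, sent as the prompt
  n_predict: number;   // maximum number of tokens to generate
  temperature: number; // low temperature keeps completions deterministic
  stop: string[];      // stop sequences that end the completion early
}

// Build the JSON body for a completion request (values are illustrative).
function buildCompletionRequest(prefix: string): CompletionRequest {
  return {
    prompt: prefix,
    n_predict: 64,
    temperature: 0.2,
    stop: ["\n\n"],
  };
}

// POST the request to the configured host and return the generated text.
// llama.cpp's server returns the completion in the `content` field.
async function fetchCompletion(host: string, prefix: string): Promise<string> {
  const res = await fetch(`${host}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCompletionRequest(prefix)),
  });
  const data = await res.json();
  return data.content;
}
```

With a server running locally, `fetchCompletion("http://localhost:8080", textBeforeCursor)` would return the suggested continuation to show as an inline completion.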