9e25a86916 | comment | 2023-12-24 23:34:53 +01:00
42367c8b5d | removed some unused variables | 2023-12-24 23:33:37 +01:00
90477d164d | various fixes, better config, better fetch handling | 2023-12-24 23:23:35 +01:00
12f3c82d3e | display tokens as they arrive | 2023-12-20 19:37:28 +01:00
b75dce72cd | added streaming support | 2023-12-20 15:42:49 +01:00
fe68daac38 | formatting json files | 2023-12-16 19:48:27 +01:00
aaa78c03bc | added some necessary info | 2023-12-16 19:36:25 +01:00
63f38f77ba | still some todos but this shit works boiii | 2023-12-16 19:19:05 +01:00
960b2190bf | separate llama.cpp server api into a different source file | 2023-12-16 16:56:44 +01:00
0a493294cf | partial move to file | 2023-12-14 21:45:01 +01:00
efb85c2cb4 | last chance before changing to OpenAI API | 2023-12-14 20:12:43 +01:00
04f8db150d | more config | 2023-11-30 00:00:11 +01:00
534e82ffaf | git rid of uncaught promise error | 2023-11-21 19:39:23 +01:00
a599d44c10 | idk | 2023-11-21 14:26:43 +01:00
6bb3add1be | shorter filename | 2023-11-20 23:47:06 +01:00
3b8a74a2d8 | some interactivity | 2023-11-20 23:17:29 +01:00
00e22e8358 | added enable and disable commands | 2023-11-20 00:20:20 +01:00
713a9cde6a | works tm | 2023-11-19 23:37:24 +01:00
745444b564 | initial commit | 2023-11-19 19:37:57 +01:00