Preview (#1114)
* ✨ shared indexing
* 🎨 indexing
* 🧑‍💻 npm i --no-save in prepackage.js
* fix issue with /edit not acknowledging highlighted code after retrying
* 🚚 rename addLogs to addPromptCompletionPair
* Bedrock chat completion, Anthropic models
* 🩹 add filter for midline imports/top-level keywords and encoding header
* 🩹 add .t. to stop words
* 🔥 Improved Ctrl/Cmd+I (#1023)
* ⚡️ improved diff streaming algo
* 🎨 better messaging/formatting for further cmd/ctrl+I instructions
* ⚡️ more reliably filter out unwanted explanations
* 🚸 better follow-up edits
* 💄 accept/reject diffs block-by-block
* ✨ cmd/ctrl+z to reject diff
* 🚚 rename variables
* 💄 allow switching files when inline diff still visible
* 🚸 don't show quick pick if no ctx providers exist
* 🚧 (sort of) allow switching editors while streaming diff
* 💄 show model being used for cmd/ctrl+I
* 💄 don't add undo stops when generating diff
* 🐛 fix shortcuts for accept/reject diff blocks
* ✨ improved GPT edit prompt, taking prefix/suffix into account
* ✨ improved prompting for empty selection ctrl/cmd+I
* ⚡️ immediately refresh codelens
* 🐛 use first model if default undefined
* ⚡️ refresh codelens after diff cleared
* 💄 update keyboard shortcuts
* ⚡️ Improved edit prompts for OS models (#1029)
* 💄 refresh codelens more frequently
* ⚡️ improved codellama edit prompt
* ⚡️ better codellama prompt
* ⚡️ use same improved prompt for most OS models
* 🎨 refactor chat templates
* 🎨 refactor llama2 prompt to allow ending assistant message
* ⚡️ separate OS models prompt when no prefix/suffix
* 🎨 refactor to allow putting words in the model's mouth
* ⚡️ prune code around cmd/ctrl+I
* 🚚 rename to cmd/ctrl+I
* 🎨 make raw a base completion option
* 🩹 small improvements
* 🩹 use different prompt when completions not supported
* Keep the same statusBar item when updating it to prevent flickering of the status bar (#1022)
* 🎨 add getRepoName to IDE, use for indexing
* 🎨 implement server client interface
* 📌 pin to vectordb=0.4.12
* 🧑‍💻 mark xhr-sync-worker.js as external in esbuild
* 🎨 break out ignore defaults into core
* 🎨 update getRepoName
* 🐛 fix import error
* 🩹 fix chat.jsonl logging
* ⚡️ improved OpenAI autocomplete support
* 🐛 fix bug causing part of completions to be skipped
* 🔥 remove URLContextProvider
* ✨ Add Groq as an official provider
* 🩹 make sure autocomplete works with claude
* 💄 update positioning of code block toolbar to not cover code
* ✨ Run in terminal button
* ✨ insert at cursor button
* ✨ Regenerate and copy buttons
* ✨ Button to force re-indexing
* 🐛 make sure tooltip IDs are unique
* ✨ Button to continue truncated response
* 🚧 WIP on inline edit browser embedding
* 🚧 inline TipTapEditor
* 🚧 WIP on inline TipTapEditor
* 🔥 remove unused test component
* 🚧 native inline edit
* 💄 nicer looking input box
* ✨ Diff Streaming in JetBrains
* 💄 line highlighting
* 💄 arial font
* ✨ Retry with further instructions
* 🚧 drop shadow
* ✨ accept/reject diffs
* ✨ accept/reject diffs
* 🐛 fix off-by-one errors
* 🚧 swap out button on enter
* 💄 styling and auto-resize
* 💄 box shadow
* 🚧 fix keyboard shortcuts to accept/reject diff
* 💄 improve small interactions
* 💄 loading icon, cancellation logic
* 🐛 handle next.value being undefined
* ✨ latex support
* Bug Fix: Add ternary operator to prevent nonexistent value error (#1052)
* add ternary operator
* Removing logging
* remove comment
---------
Co-authored-by: Justin Milner <jmilner@jmilner-lt2.deka.local>
Co-authored-by: Nate Sesti <33237525+sestinj@users.noreply.github.com>
* 🎨 small formatting change
* 🩹 tweak /edit solution
* ✨ Dropdown to select model
* 🔊 print when SSL verification disabled
* 📌 pin esbuild version to match our hosted binary
* 🔥 remove unused package folder
* 👷 add note about pinning esbuild
* 🚚 rename pkg to binary
* ⚡️ update an important stop word for starcoder2, improve dev data
* 🐛 fix autocomplete bug
* Update completionProvider.ts: add \r\n\r\n stop to tab completion
* 📌 update package-locks
* 🐛 fix bug in edit prompt
* 🔊 log extension version
* 🐛 handle repo undefined in vscode
* ⏪ revert back to esbuild ^0.17.19 to solve "no backend found" error with onnxruntime
* 🩹 set default autocomplete temp to 0.01 to be strictly positive
* make the useCopyBuffer option effective (#1062)
* Con-1037: Toggle full screen bug (#1065)
* webview reset
* add warning
---------
Co-authored-by: Justin Milner <jmilner@jmilner-lt2.deka.local>
* Update completionProvider.ts as @rootedbox suggested
* Resolve conflict, accept branch being merged in (#1076)
* Resolve conflict, accept branch being merged in
* remove accidental .gitignore add
* whoops, put gitignore back
* fix
---------
Co-authored-by: Justin Milner <jmilner@jmilner-lt2.deka.local>
* #1073: update outdated documentation (#1074)
* 🩹 small tweaks to stop words
* Add abstraction for fetch to easily allow using request options (#1059)
* add fetch helper function with request options
* add support for request options for Jira context provider
* Add a new slash command to review code. (#1071)
* Add a new slash command to review code.
* clean code
* 🩹 add new starcoder artifact as stopword
* 💄 slight improvements to inline edit UI
* 🔖 update default models, bump gradle version
* 📝 recommend starcoder2
* 🐛 fix jetbrains encoding issue
* 🩹 don't index site-packages
* 🩹 error handling in JetBrains
* 🐛 fix copy to clipboard in jetbrains
* fix: cursor focus issue causing unwanted return to text area (#1086)
* 📝 mention autocomplete in jetbrains
* 📝 Tab-autocomplete README
* 🔥 remove note about custom ctx providers only being on VS Code
* 📝 docs about http context provider
* 👥 pull request template
* Update from Claude 2 to Claude 3 (#1078)
* 📝 add FAQ about single-line completions
* 📝 update autocomplete docs
* fix cursor focus issue causing unwanted return to text area
---------
Co-authored-by: Nate Sesti <sestinj@gmail.com>
Co-authored-by: Ty Dunn <ty@tydunn.com>
Co-authored-by: Nate Sesti <33237525+sestinj@users.noreply.github.com>
* 🔧 option to disable autocomplete from config.json
* ✨ option to disable streaming with anthropic
* ✅ Test to verify that files are packaged
* Add FIM template for CodeGemma (#1097); also pass stop tokens to llama.cpp
* ✨ customizable rerankers (#1088)
* ✨ customizable rerankers
* 💄 fix early truncation button
* ⚡️ improvements to full text search + reranking
* ⚡️ only use starcoder2 stop words for starcoder2
* ⚡️ crawl code graph for call expressions
* 🚧 starcoder2-7b free trial
* 🚧 free trial client for embeddings and re-ranking
* 🚧 embeddings provider
* ✅ test for presence of files in CI
* 🐛 fixes to reranking
* ✨ new onboarding experience
* ✨ new onboarding experience
* 💄 small tweaks to onboarding
* 🩹 add stopAtLines filter to /edit
* 🐛 clean up vite build errors
* 👷 make vscode external in binary build
* 💄 improved models onboarding for existing users
* 💄 default indexing progress to 0.0
* 🐛 small fixes to reranking
* 👷 clear folders before prepackage
* 👷 say where .vsix is output
* 👷 also download arm packages outside of gh actions
* 🎨 add AbortSignal to indexing
* 🔧 starcoder, not 2 in config_schema
* 🚚 again, starcoder, not 2
* 🐛 fix bug when reranker undefined
* 🩹 fix binary tsc error
* ✨ configure context menu prompts
* 🐛 acknowledge useLegacyCompletionsEndpoint
* 🚑 fix keep existing config option
* 🔊 learn about selection
* ⚡️ improvements to indexing reporting when not in git repo
* 🥅 handle situation where git doesn't exist in workspace
* ✨ support for gemini 1.5 pro
---------
Co-authored-by: Nithish <nithishmohandass06@gmail.com>
Co-authored-by: EC2 Default User <ec2-user@ip-172-31-2-149.ec2.internal>
Co-authored-by: LapinMalin <111463204+TheLapinMalin@users.noreply.github.com>
Co-authored-by: Justin Milner <42585006+justinmilner1@users.noreply.github.com>
Co-authored-by: Justin Milner <jmilner@jmilner-lt2.deka.local>
Co-authored-by: lmaosweqf1 <138042737+lmaosweqf1@users.noreply.github.com>
Co-authored-by: ading2210 <71154407+ading2210@users.noreply.github.com>
Co-authored-by: Martin Mois <martin.mois@googlemail.com>
Co-authored-by: Tobias Jung <102594442+tobiajung@users.noreply.github.com>
Co-authored-by: Jason Jacobs <nerfnerd@gmail.com>
Co-authored-by: Nithish <83941930+Nithishvb@users.noreply.github.com>
Co-authored-by: Ty Dunn <ty@tydunn.com>
Co-authored-by: postmasters <namnguyen@google.com>
This commit is contained in:
parent 301263930f
commit 6c99a34532
@@ -1,3 +1,4 @@
 **/*.run.xml
 archive/**/*
-extensions/vscode/models/**/*
+extensions/vscode/models/**/*
+docs/docs/languages

@@ -152,4 +152,4 @@ continue_server.build
 continue_server.dist
 
-Icon
+Icon?
 Icon?
@@ -10,9 +10,9 @@
 "request": "launch",
 "name": "Core Binary",
 "skipFiles": ["<node_internals>/**"],
-"program": "${workspaceFolder}/pkg/out/index.js",
-"preLaunchTask": "pkg:esbuild",
-"outFiles": ["${workspaceFolder}/pkg/out/**/*.js"],
+"program": "${workspaceFolder}/binary/out/index.js",
+"preLaunchTask": "binary:esbuild",
+"outFiles": ["${workspaceFolder}/binary/out/**/*.js"],
 "sourceMaps": true,
 "smartStep": true,
 "internalConsoleOptions": "openOnSessionStart"
@@ -142,17 +142,17 @@
 }
 }
 ]
 },
 },
-//
+// esbuild for the core binary
 {
-"label": "pkg:esbuild",
+"label": "binary:esbuild",
 "type": "shell",
 "command": "npm",
 "args": ["run", "esbuild"],
 "problemMatcher": [],
 "options": {
-"cwd": "pkg"
+"cwd": "binary"
 }
 }
 ]
@@ -93,7 +93,7 @@ Pre-requisite: You should use the Intellij IDE, which can be downloaded [here](h
 
 > For changes to Typescript code, the binary/gui will currently need to be rebuilt. Changes to Kotlin code can often be hot-reloaded with "Run -> Debugging Actions -> Reload Changed Classes"
 
-Continue's JetBrains extension shares much of the code with the VS Code extension by utilizing shared code in the `core` directory and packaging it in a binary in the `pkg` directory. The Intellij extension (written in Kotlin) is then able to communicate over stdin/stdout in the [CoreMessenger.kt](./extensions/intellij/src/main/kotlin/com/github/continuedev/continueintellijextension/continue/CoreMessenger.kt) file.
+Continue's JetBrains extension shares much of the code with the VS Code extension by utilizing shared code in the `core` directory and packaging it in a binary in the `binary` directory. The Intellij extension (written in Kotlin) is then able to communicate over stdin/stdout in the [CoreMessenger.kt](./extensions/intellij/src/main/kotlin/com/github/continuedev/continueintellijextension/continue/CoreMessenger.kt) file.
 
 ### Formatting
 
@@ -1 +1,4 @@
-bin
+node_modules
+bin
+data
+out
@@ -90,7 +90,7 @@ const targetToLanceDb = {
 entryPoints: ["src/index.ts"],
 bundle: true,
 outfile: esbuildOutputFile,
-external: ["esbuild", ...DYNAMIC_IMPORTS],
+external: ["esbuild", ...DYNAMIC_IMPORTS, "./xhr-sync-worker.js", "vscode"],
 format: "cjs",
 platform: "node",
 sourcemap: true,

@@ -118,6 +118,7 @@ const targetToLanceDb = {
 console.log("[info] Building binaries with pkg...");
 for (const target of targets) {
 const targetDir = `bin/${target}`;
+fs.mkdirSync(targetDir, { recursive: true });
 console.log(`[info] Building ${target}...`);
 execSync(
 `npx pkg --no-bytecode --public-packages "*" --public pkgJson/${target} --out-path ${targetDir}`,

@@ -141,6 +142,7 @@ const targetToLanceDb = {
 
 // Download and unzip prebuilt esbuild binary for the target
 console.log(`[info] Downloading esbuild for ${target}...`);
+// Version is pinned to 0.19.11 in package.json to make sure that they match
 execSync(
 `curl -o ${targetDir}/esbuild.tgz https://registry.npmjs.org/@esbuild/${target}/-/${target}-0.19.11.tgz`,
 );
@@ -1,11 +1,11 @@
 {
-"name": "pkg",
+"name": "binary",
 "version": "1.0.0",
 "lockfileVersion": 3,
 "requires": true,
 "packages": {
 "": {
-"name": "pkg",
+"name": "binary",
 "version": "1.0.0",
 "license": "Apache-2.0",
 "dependencies": {

@@ -19,20 +19,20 @@
 "vectordb": "^0.4.10"
 },
 "bin": {
-"pkg": "out/index.js"
+"binary": "out/index.js"
 },
 "devDependencies": {
 "@types/follow-redirects": "^1.14.4",
 "@types/uuid": "^9.0.8",
 "@vercel/ncc": "^0.38.1",
-"esbuild": "^0.19.11",
+"esbuild": "0.19.11",
 "pkg": "^5.8.1",
 "typescript": "^5.3.3"
 }
 },
 "../core": {
 "name": "@continuedev/core",
-"version": "1.0.4",
+"version": "1.0.6",
 "license": "Apache-2.0",
 "dependencies": {
 "@mozilla/readability": "^0.5.0",

@@ -52,6 +52,7 @@
 "handlebars": "^4.7.8",
 "http-proxy-agent": "^7.0.1",
 "https-proxy-agent": "^7.0.3",
+"ignore": "^5.3.1",
 "js-tiktoken": "^1.0.8",
 "jsdom": "^24.0.0",
 "llama-tokenizer-js": "^1.1.3",

@@ -69,7 +70,7 @@
 "sqlite3": "^5.1.7",
 "tree-sitter-wasms": "^0.1.6",
 "uuid": "^9.0.1",
-"vectordb": "^0.4.12",
+"vectordb": "0.4.12",
 "web-tree-sitter": "^0.21.0"
 },
 "devDependencies": {
@@ -142,9 +143,9 @@
 }
 },
 "node_modules/@babel/helper-string-parser": {
-"version": "7.23.4",
-"resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.23.4.tgz",
-"integrity": "sha512-803gmbQdqwdf4olxrX4AJyFBV/RTr3rSmOj0rKwesmzlfhYNDEs+/iOcznzpNWlJlIlTJC2QfPFcHB6DlzdVLQ==",
+"version": "7.24.1",
+"resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.24.1.tgz",
+"integrity": "sha512-2ofRCjnnA9y+wk8b9IAREroeUP02KHp431N2mhKniy2yKIDKpbrHv9eXwm8cBeWQYcJmzv5qKCu65P47eCF7CQ==",
 "dev": true,
 "engines": {
 "node": ">=6.9.0"
@@ -186,9 +187,9 @@
 }
 },
 "node_modules/@esbuild/aix-ppc64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.19.12.tgz",
-"integrity": "sha512-bmoCYyWdEL3wDQIVbcyzRyeKLgk2WtWLTWz1ZIAZF/EGbNOwSA6ew3PftJ1PqMiOOGu0OyFMzG53L0zqIpPeNA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.19.11.tgz",
+"integrity": "sha512-FnzU0LyE3ySQk7UntJO4+qIiQgI7KoODnZg5xzXIrFJlKd2P2gwHsHY4927xj9y5PJmJSzULiUCWmv7iWnNa7g==",
 "cpu": [
 "ppc64"
 ],

@@ -202,9 +203,9 @@
 }
 },
 "node_modules/@esbuild/android-arm": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.19.12.tgz",
-"integrity": "sha512-qg/Lj1mu3CdQlDEEiWrlC4eaPZ1KztwGJ9B6J+/6G+/4ewxJg7gqj8eVYWvao1bXrqGiW2rsBZFSX3q2lcW05w==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.19.11.tgz",
+"integrity": "sha512-5OVapq0ClabvKvQ58Bws8+wkLCV+Rxg7tUVbo9xu034Nm536QTII4YzhaFriQ7rMrorfnFKUsArD2lqKbFY4vw==",
 "cpu": [
 "arm"
 ],

@@ -218,9 +219,9 @@
 }
 },
 "node_modules/@esbuild/android-arm64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.19.12.tgz",
-"integrity": "sha512-P0UVNGIienjZv3f5zq0DP3Nt2IE/3plFzuaS96vihvD0Hd6H/q4WXUGpCxD/E8YrSXfNyRPbpTq+T8ZQioSuPA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.19.11.tgz",
+"integrity": "sha512-aiu7K/5JnLj//KOnOfEZ0D90obUkRzDMyqd/wNAUQ34m4YUPVhRZpnqKV9uqDGxT7cToSDnIHsGooyIczu9T+Q==",
 "cpu": [
 "arm64"
 ],

@@ -234,9 +235,9 @@
 }
 },
 "node_modules/@esbuild/android-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.19.12.tgz",
-"integrity": "sha512-3k7ZoUW6Q6YqhdhIaq/WZ7HwBpnFBlW905Fa4s4qWJyiNOgT1dOqDiVAQFwBH7gBRZr17gLrlFCRzF6jFh7Kew==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.19.11.tgz",
+"integrity": "sha512-eccxjlfGw43WYoY9QgB82SgGgDbibcqyDTlk3l3C0jOVHKxrjdc9CTwDUQd0vkvYg5um0OH+GpxYvp39r+IPOg==",
 "cpu": [
 "x64"
 ],

@@ -250,9 +251,9 @@
 }
 },
 "node_modules/@esbuild/darwin-arm64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.19.12.tgz",
-"integrity": "sha512-B6IeSgZgtEzGC42jsI+YYu9Z3HKRxp8ZT3cqhvliEHovq8HSX2YX8lNocDn79gCKJXOSaEot9MVYky7AKjCs8g==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.19.11.tgz",
+"integrity": "sha512-ETp87DRWuSt9KdDVkqSoKoLFHYTrkyz2+65fj9nfXsaV3bMhTCjtQfw3y+um88vGRKRiF7erPrh/ZuIdLUIVxQ==",
 "cpu": [
 "arm64"
 ],

@@ -266,9 +267,9 @@
 }
 },
 "node_modules/@esbuild/darwin-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.19.12.tgz",
-"integrity": "sha512-hKoVkKzFiToTgn+41qGhsUJXFlIjxI/jSYeZf3ugemDYZldIXIxhvwN6erJGlX4t5h417iFuheZ7l+YVn05N3A==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.19.11.tgz",
+"integrity": "sha512-fkFUiS6IUK9WYUO/+22omwetaSNl5/A8giXvQlcinLIjVkxwTLSktbF5f/kJMftM2MJp9+fXqZ5ezS7+SALp4g==",
 "cpu": [
 "x64"
 ],

@@ -282,9 +283,9 @@
 }
 },
 "node_modules/@esbuild/freebsd-arm64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.19.12.tgz",
-"integrity": "sha512-4aRvFIXmwAcDBw9AueDQ2YnGmz5L6obe5kmPT8Vd+/+x/JMVKCgdcRwH6APrbpNXsPz+K653Qg8HB/oXvXVukA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.19.11.tgz",
+"integrity": "sha512-lhoSp5K6bxKRNdXUtHoNc5HhbXVCS8V0iZmDvyWvYq9S5WSfTIHU2UGjcGt7UeS6iEYp9eeymIl5mJBn0yiuxA==",
 "cpu": [
 "arm64"
 ],

@@ -298,9 +299,9 @@
 }
 },
 "node_modules/@esbuild/freebsd-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.19.12.tgz",
-"integrity": "sha512-EYoXZ4d8xtBoVN7CEwWY2IN4ho76xjYXqSXMNccFSx2lgqOG/1TBPW0yPx1bJZk94qu3tX0fycJeeQsKovA8gg==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.19.11.tgz",
+"integrity": "sha512-JkUqn44AffGXitVI6/AbQdoYAq0TEullFdqcMY/PCUZ36xJ9ZJRtQabzMA+Vi7r78+25ZIBosLTOKnUXBSi1Kw==",
 "cpu": [
 "x64"
 ],

@@ -314,9 +315,9 @@
 }
 },
 "node_modules/@esbuild/linux-arm": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.19.12.tgz",
-"integrity": "sha512-J5jPms//KhSNv+LO1S1TX1UWp1ucM6N6XuL6ITdKWElCu8wXP72l9MM0zDTzzeikVyqFE6U8YAV9/tFyj0ti+w==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.19.11.tgz",
+"integrity": "sha512-3CRkr9+vCV2XJbjwgzjPtO8T0SZUmRZla+UL1jw+XqHZPkPgZiyWvbDvl9rqAN8Zl7qJF0O/9ycMtjU67HN9/Q==",
 "cpu": [
 "arm"
 ],

@@ -330,9 +331,9 @@
 }
 },
 "node_modules/@esbuild/linux-arm64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.19.12.tgz",
-"integrity": "sha512-EoTjyYyLuVPfdPLsGVVVC8a0p1BFFvtpQDB/YLEhaXyf/5bczaGeN15QkR+O4S5LeJ92Tqotve7i1jn35qwvdA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.19.11.tgz",
+"integrity": "sha512-LneLg3ypEeveBSMuoa0kwMpCGmpu8XQUh+mL8XXwoYZ6Be2qBnVtcDI5azSvh7vioMDhoJFZzp9GWp9IWpYoUg==",
 "cpu": [
 "arm64"
 ],

@@ -346,9 +347,9 @@
 }
 },
 "node_modules/@esbuild/linux-ia32": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.19.12.tgz",
-"integrity": "sha512-Thsa42rrP1+UIGaWz47uydHSBOgTUnwBwNq59khgIwktK6x60Hivfbux9iNR0eHCHzOLjLMLfUMLCypBkZXMHA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.19.11.tgz",
+"integrity": "sha512-caHy++CsD8Bgq2V5CodbJjFPEiDPq8JJmBdeyZ8GWVQMjRD0sU548nNdwPNvKjVpamYYVL40AORekgfIubwHoA==",
 "cpu": [
 "ia32"
 ],

@@ -362,9 +363,9 @@
 }
 },
 "node_modules/@esbuild/linux-loong64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.19.12.tgz",
-"integrity": "sha512-LiXdXA0s3IqRRjm6rV6XaWATScKAXjI4R4LoDlvO7+yQqFdlr1Bax62sRwkVvRIrwXxvtYEHHI4dm50jAXkuAA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.19.11.tgz",
+"integrity": "sha512-ppZSSLVpPrwHccvC6nQVZaSHlFsvCQyjnvirnVjbKSHuE5N24Yl8F3UwYUUR1UEPaFObGD2tSvVKbvR+uT1Nrg==",
 "cpu": [
 "loong64"
 ],

@@ -378,9 +379,9 @@
 }
 },
 "node_modules/@esbuild/linux-mips64el": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.19.12.tgz",
-"integrity": "sha512-fEnAuj5VGTanfJ07ff0gOA6IPsvrVHLVb6Lyd1g2/ed67oU1eFzL0r9WL7ZzscD+/N6i3dWumGE1Un4f7Amf+w==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.19.11.tgz",
+"integrity": "sha512-B5x9j0OgjG+v1dF2DkH34lr+7Gmv0kzX6/V0afF41FkPMMqaQ77pH7CrhWeR22aEeHKaeZVtZ6yFwlxOKPVFyg==",
 "cpu": [
 "mips64el"
 ],

@@ -394,9 +395,9 @@
 }
 },
 "node_modules/@esbuild/linux-ppc64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.19.12.tgz",
-"integrity": "sha512-nYJA2/QPimDQOh1rKWedNOe3Gfc8PabU7HT3iXWtNUbRzXS9+vgB0Fjaqr//XNbd82mCxHzik2qotuI89cfixg==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.19.11.tgz",
+"integrity": "sha512-MHrZYLeCG8vXblMetWyttkdVRjQlQUb/oMgBNurVEnhj4YWOr4G5lmBfZjHYQHHN0g6yDmCAQRR8MUHldvvRDA==",
 "cpu": [
 "ppc64"
 ],

@@ -410,9 +411,9 @@
 }
 },
 "node_modules/@esbuild/linux-riscv64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.19.12.tgz",
-"integrity": "sha512-2MueBrlPQCw5dVJJpQdUYgeqIzDQgw3QtiAHUC4RBz9FXPrskyyU3VI1hw7C0BSKB9OduwSJ79FTCqtGMWqJHg==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.19.11.tgz",
+"integrity": "sha512-f3DY++t94uVg141dozDu4CCUkYW+09rWtaWfnb3bqe4w5NqmZd6nPVBm+qbz7WaHZCoqXqHz5p6CM6qv3qnSSQ==",
 "cpu": [
 "riscv64"
 ],

@@ -426,9 +427,9 @@
 }
 },
 "node_modules/@esbuild/linux-s390x": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.19.12.tgz",
-"integrity": "sha512-+Pil1Nv3Umes4m3AZKqA2anfhJiVmNCYkPchwFJNEJN5QxmTs1uzyy4TvmDrCRNT2ApwSari7ZIgrPeUx4UZDg==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.19.11.tgz",
+"integrity": "sha512-A5xdUoyWJHMMlcSMcPGVLzYzpcY8QP1RtYzX5/bS4dvjBGVxdhuiYyFwp7z74ocV7WDc0n1harxmpq2ePOjI0Q==",
 "cpu": [
 "s390x"
 ],

@@ -442,9 +443,9 @@
 }
 },
 "node_modules/@esbuild/linux-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.19.12.tgz",
-"integrity": "sha512-B71g1QpxfwBvNrfyJdVDexenDIt1CiDN1TIXLbhOw0KhJzE78KIFGX6OJ9MrtC0oOqMWf+0xop4qEU8JrJTwCg==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.19.11.tgz",
+"integrity": "sha512-grbyMlVCvJSfxFQUndw5mCtWs5LO1gUlwP4CDi4iJBbVpZcqLVT29FxgGuBJGSzyOxotFG4LoO5X+M1350zmPA==",
 "cpu": [
 "x64"
 ],

@@ -458,9 +459,9 @@
 }
 },
 "node_modules/@esbuild/netbsd-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.19.12.tgz",
-"integrity": "sha512-3ltjQ7n1owJgFbuC61Oj++XhtzmymoCihNFgT84UAmJnxJfm4sYCiSLTXZtE00VWYpPMYc+ZQmB6xbSdVh0JWA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.19.11.tgz",
+"integrity": "sha512-13jvrQZJc3P230OhU8xgwUnDeuC/9egsjTkXN49b3GcS5BKvJqZn86aGM8W9pd14Kd+u7HuFBMVtrNGhh6fHEQ==",
 "cpu": [
 "x64"
 ],

@@ -474,9 +475,9 @@
 }
 },
 "node_modules/@esbuild/openbsd-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.19.12.tgz",
-"integrity": "sha512-RbrfTB9SWsr0kWmb9srfF+L933uMDdu9BIzdA7os2t0TXhCRjrQyCeOt6wVxr79CKD4c+p+YhCj31HBkYcXebw==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.19.11.tgz",
+"integrity": "sha512-ysyOGZuTp6SNKPE11INDUeFVVQFrhcNDVUgSQVDzqsqX38DjhPEPATpid04LCoUr2WXhQTEZ8ct/EgJCUDpyNw==",
 "cpu": [
 "x64"
 ],

@@ -490,9 +491,9 @@
 }
 },
 "node_modules/@esbuild/sunos-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.19.12.tgz",
-"integrity": "sha512-HKjJwRrW8uWtCQnQOz9qcU3mUZhTUQvi56Q8DPTLLB+DawoiQdjsYq+j+D3s9I8VFtDr+F9CjgXKKC4ss89IeA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.19.11.tgz",
+"integrity": "sha512-Hf+Sad9nVwvtxy4DXCZQqLpgmRTQqyFyhT3bZ4F2XlJCjxGmRFF0Shwn9rzhOYRB61w9VMXUkxlBy56dk9JJiQ==",
 "cpu": [
 "x64"
 ],

@@ -506,9 +507,9 @@
 }
 },
 "node_modules/@esbuild/win32-arm64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.19.12.tgz",
-"integrity": "sha512-URgtR1dJnmGvX864pn1B2YUYNzjmXkuJOIqG2HdU62MVS4EHpU2946OZoTMnRUHklGtJdJZ33QfzdjGACXhn1A==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.19.11.tgz",
+"integrity": "sha512-0P58Sbi0LctOMOQbpEOvOL44Ne0sqbS0XWHMvvrg6NE5jQ1xguCSSw9jQeUk2lfrXYsKDdOe6K+oZiwKPilYPQ==",
 "cpu": [
 "arm64"
 ],

@@ -522,9 +523,9 @@
 }
 },
 "node_modules/@esbuild/win32-ia32": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.19.12.tgz",
-"integrity": "sha512-+ZOE6pUkMOJfmxmBZElNOx72NKpIa/HFOMGzu8fqzQJ5kgf6aTGrcJaFsNiVMH4JKpMipyK+7k0n2UXN7a8YKQ==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.19.11.tgz",
+"integrity": "sha512-6YOrWS+sDJDmshdBIQU+Uoyh7pQKrdykdefC1avn76ss5c+RN6gut3LZA4E2cH5xUEp5/cA0+YxRaVtRAb0xBg==",
 "cpu": [
 "ia32"
 ],

@@ -538,9 +539,9 @@
 }
 },
 "node_modules/@esbuild/win32-x64": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.19.12.tgz",
-"integrity": "sha512-T1QyPSDCyMXaO3pzBkF96E8xMkiRYbUEZADd29SyPGabqxMViNoii+NcK7eWJAEoU6RZyEm5lVSIjTmcdoB9HA==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.19.11.tgz",
+"integrity": "sha512-vfkhltrjCAb603XaFhqhAF4LGDi2M4OrCRrFusyQ+iTLQ/o60QQXxc9cZC/FFpihBI9N1Grn6SMKVJ4KP7Fuiw==",
 "cpu": [
 "x64"
 ],
@@ -602,9 +603,9 @@
 }
 },
 "node_modules/@lancedb/vectordb-darwin-arm64": {
-"version": "0.4.12",
-"resolved": "https://registry.npmjs.org/@lancedb/vectordb-darwin-arm64/-/vectordb-darwin-arm64-0.4.12.tgz",
-"integrity": "sha512-38/rkJRlWXkPWXuj9onzvbrhnIWcIUQjgEp5G9v5ixPosBowm7A4j8e2Q8CJMsVSNcVX2JLqwWVldiWegZFuYw==",
+"version": "0.4.15",
+"resolved": "https://registry.npmjs.org/@lancedb/vectordb-darwin-arm64/-/vectordb-darwin-arm64-0.4.15.tgz",
+"integrity": "sha512-asNVd0ojbExbj/iRCFu/+vpnnGtAHbrn7AjrL1PFeSl1JvsvzeZBBxq+WlM4UfGKfiJhkyBvQwh609OiCP3Snw==",
 "cpu": [
 "arm64"
 ],

@@ -614,9 +615,9 @@
 ]
 },
 "node_modules/@lancedb/vectordb-darwin-x64": {
-"version": "0.4.12",
-"resolved": "https://registry.npmjs.org/@lancedb/vectordb-darwin-x64/-/vectordb-darwin-x64-0.4.12.tgz",
-"integrity": "sha512-psE48dztyO450hXWdv9Rl9aayM2HQ1uF9wErfC0gKmDUh1N0NdVq2viDuFpZxnmCis/nvGwKlYiYT9OnYNCJ9g==",
+"version": "0.4.15",
+"resolved": "https://registry.npmjs.org/@lancedb/vectordb-darwin-x64/-/vectordb-darwin-x64-0.4.15.tgz",
+"integrity": "sha512-tbMGb1P9KXdnoP6dqFnjhUUjGIVzzo2V/Ewc8iktMU1scCVQ7/rEPPvTh9jHuM1r1i+wVTNtKYijIVaHZZPJLA==",
 "cpu": [
 "x64"
 ],

@@ -626,9 +627,9 @@
 ]
 },
 "node_modules/@lancedb/vectordb-linux-arm64-gnu": {
-"version": "0.4.12",
-"resolved": "https://registry.npmjs.org/@lancedb/vectordb-linux-arm64-gnu/-/vectordb-linux-arm64-gnu-0.4.12.tgz",
-"integrity": "sha512-xwkgF6MiF5aAdG9JG8v4ke652YxUJrhs9z4OrsEfrENnvsIQd2C5UyKMepVLdvij4BI/XPFRFWXdjPvP7S9rTA==",
+"version": "0.4.15",
+"resolved": "https://registry.npmjs.org/@lancedb/vectordb-linux-arm64-gnu/-/vectordb-linux-arm64-gnu-0.4.15.tgz",
+"integrity": "sha512-5oiWvS9Y5mRc2PhaVzu9zoM1UFf77gDgX8IM95U87CZZdPAoREDvnbVhyIZa2SEZccEjLG7tShe+PJsOFWOT/w==",
 "cpu": [
 "arm64"
 ],

@@ -638,9 +639,9 @@
 ]
 },
 "node_modules/@lancedb/vectordb-linux-x64-gnu": {
-"version": "0.4.12",
-"resolved": "https://registry.npmjs.org/@lancedb/vectordb-linux-x64-gnu/-/vectordb-linux-x64-gnu-0.4.12.tgz",
-"integrity": "sha512-gJqYR0aymrS+C60xc4EQPzmQ5/69XfeFv2ofBvAj7qW+c6BcnoAcfVl+7s1IrcWeGz251sm5cD5Lx4AzJd89dA==",
+"version": "0.4.15",
+"resolved": "https://registry.npmjs.org/@lancedb/vectordb-linux-x64-gnu/-/vectordb-linux-x64-gnu-0.4.15.tgz",
+"integrity": "sha512-FjlrV2h97t5aoammRpPSBjws2cWSKYQpzALHm8Af8QjTuvo9J/MD63Fr/D6Sb9Ie2ER3pCiWD1o8UCFndCFtRg==",
 "cpu": [
 "x64"
 ],

@@ -650,9 +651,9 @@
 ]
 },
 "node_modules/@lancedb/vectordb-win32-x64-msvc": {
-"version": "0.4.12",
-"resolved": "https://registry.npmjs.org/@lancedb/vectordb-win32-x64-msvc/-/vectordb-win32-x64-msvc-0.4.12.tgz",
-"integrity": "sha512-LhCzpyEeBUyO6L2fuVqeP3mW8kYDryyU9PNqcM01m88sZB1Do6AlwiM+GjPRQ0SpzD0LK9oxQqSmJrdcNGqjbw==",
+"version": "0.4.15",
+"resolved": "https://registry.npmjs.org/@lancedb/vectordb-win32-x64-msvc/-/vectordb-win32-x64-msvc-0.4.15.tgz",
+"integrity": "sha512-o3zlY/FnY426kp2Y7xn4UbD6/Oeg5xqtezYNhhmt44lCmhSlpydx+2m9Fq1OGSDNUCWtrhMhgdHVnGTu2VTa5A==",
 "cpu": [
 "x64"
 ],
@@ -980,11 +981,11 @@
 }
 },
 "node_modules/axios": {
-"version": "1.6.7",
-"resolved": "https://registry.npmjs.org/axios/-/axios-1.6.7.tgz",
-"integrity": "sha512-/hDJGff6/c7u0hDkvkGxR/oy6CbCs8ziCsC7SqmhjfozqiJGc8Z11wrv9z9lYfY4K8l+H9TpjcMDX0xOZmx+RA==",
+"version": "1.6.8",
+"resolved": "https://registry.npmjs.org/axios/-/axios-1.6.8.tgz",
+"integrity": "sha512-v/ZHtJDU39mDpyBoFVkETcd/uNdxrWRrg3bKpOKzXFA6Bvqopts6ALSMU3y6ijYxbw2B+wPrIv46egTzJXCLGQ==",
 "dependencies": {
-"follow-redirects": "^1.15.4",
+"follow-redirects": "^1.15.6",
 "form-data": "^4.0.0",
 "proxy-from-env": "^1.1.0"
 }
@@ -1270,9 +1271,9 @@
 "integrity": "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="
 },
 "node_modules/detect-libc": {
-"version": "2.0.2",
-"resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.0.2.tgz",
-"integrity": "sha512-UX6sGumvvqSaXgdKGUsgZWqcUyIXZ/vZTrlRT/iobiKhGL0zL4d3osHj3uqllWJK+i+sixDS/3COVEOFbupFyw==",
+"version": "2.0.3",
+"resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.0.3.tgz",
+"integrity": "sha512-bwy0MGW55bG41VqxxypOsdSdGqLwXPI/focwgTYCFMbdUiBAxLg9CFzG08sz2aqzknwiX7Hkl0bQENjg8iLByw==",
 "dev": true,
 "engines": {
 "node": ">=8"
@@ -1306,9 +1307,9 @@
 }
 },
 "node_modules/esbuild": {
-"version": "0.19.12",
-"resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.19.12.tgz",
-"integrity": "sha512-aARqgq8roFBj054KvQr5f1sFu0D65G+miZRCuJyJ0G13Zwx7vRar5Zhn2tkQNzIXcBrNVsv/8stehpj+GAjgbg==",
+"version": "0.19.11",
+"resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.19.11.tgz",
+"integrity": "sha512-HJ96Hev2hX/6i5cDVwcqiJBBtuo9+FeIJOtZ9W1kA5M6AMJRHUZlpYZ1/SbEwtO0ioNAW8rUooVpC/WehY2SfA==",
 "dev": true,
 "hasInstallScript": true,
 "bin": {

@@ -1318,29 +1319,29 @@
 "node": ">=12"
 },
 "optionalDependencies": {
-"@esbuild/aix-ppc64": "0.19.12",
-"@esbuild/android-arm": "0.19.12",
-"@esbuild/android-arm64": "0.19.12",
-"@esbuild/android-x64": "0.19.12",
-"@esbuild/darwin-arm64": "0.19.12",
-"@esbuild/darwin-x64": "0.19.12",
-"@esbuild/freebsd-arm64": "0.19.12",
-"@esbuild/freebsd-x64": "0.19.12",
-"@esbuild/linux-arm": "0.19.12",
-"@esbuild/linux-arm64": "0.19.12",
-"@esbuild/linux-ia32": "0.19.12",
-"@esbuild/linux-loong64": "0.19.12",
-"@esbuild/linux-mips64el": "0.19.12",
-"@esbuild/linux-ppc64": "0.19.12",
-"@esbuild/linux-riscv64": "0.19.12",
-"@esbuild/linux-s390x": "0.19.12",
-"@esbuild/linux-x64": "0.19.12",
-"@esbuild/netbsd-x64": "0.19.12",
-"@esbuild/openbsd-x64": "0.19.12",
-"@esbuild/sunos-x64": "0.19.12",
-"@esbuild/win32-arm64": "0.19.12",
-"@esbuild/win32-ia32": "0.19.12",
-"@esbuild/win32-x64": "0.19.12"
+"@esbuild/aix-ppc64": "0.19.11",
+"@esbuild/android-arm": "0.19.11",
+"@esbuild/android-arm64": "0.19.11",
+"@esbuild/android-x64": "0.19.11",
+"@esbuild/darwin-arm64": "0.19.11",
+"@esbuild/darwin-x64": "0.19.11",
+"@esbuild/freebsd-arm64": "0.19.11",
+"@esbuild/freebsd-x64": "0.19.11",
+"@esbuild/linux-arm": "0.19.11",
+"@esbuild/linux-arm64": "0.19.11",
+"@esbuild/linux-ia32": "0.19.11",
+"@esbuild/linux-loong64": "0.19.11",
+"@esbuild/linux-mips64el": "0.19.11",
+"@esbuild/linux-ppc64": "0.19.11",
+"@esbuild/linux-riscv64": "0.19.11",
+"@esbuild/linux-s390x": "0.19.11",
+"@esbuild/linux-x64": "0.19.11",
+"@esbuild/netbsd-x64": "0.19.11",
+"@esbuild/openbsd-x64": "0.19.11",
+"@esbuild/sunos-x64": "0.19.11",
+"@esbuild/win32-arm64": "0.19.11",
+"@esbuild/win32-ia32": "0.19.11",
+"@esbuild/win32-x64": "0.19.11"
 }
 },
 "node_modules/escalade": {
@@ -1881,9 +1882,9 @@
 }
 },
 "node_modules/node-abi": {
-"version": "3.56.0",
-"resolved": "https://registry.npmjs.org/node-abi/-/node-abi-3.56.0.tgz",
-"integrity": "sha512-fZjdhDOeRcaS+rcpve7XuwHBmktS1nS1gzgghwKUQQ8nTy2FdSDr6ZT8k6YhvlJeHmmQMYiT/IH9hfco5zeW2Q==",
+"version": "3.57.0",
+"resolved": "https://registry.npmjs.org/node-abi/-/node-abi-3.57.0.tgz",
+"integrity": "sha512-Dp+A9JWxRaKuHP35H77I4kCKesDy5HUDEmScia2FyncMTOXASMyg251F5PhFoDA5uqBrDDffiLpbqnrZmNXW+g==",
 "dev": true,
 "dependencies": {
 "semver": "^7.3.5"

@@ -2510,9 +2511,9 @@
 }
 },
 "node_modules/typescript": {
-"version": "5.4.2",
-"resolved": "https://registry.npmjs.org/typescript/-/typescript-5.4.2.tgz",
-"integrity": "sha512-+2/g0Fds1ERlP6JsakQQDXjZdZMM+rqpamFZJEKh4kwTIn3iDkgKtby0CeNd5ATNZ4Ry1ax15TMx0W2V+miizQ==",
+"version": "5.4.3",
+"resolved": "https://registry.npmjs.org/typescript/-/typescript-5.4.3.tgz",
+"integrity": "sha512-KrPd3PKaCLr78MalgiwJnA25Nm8HAmdwN3mYUYZgG/wizIo9EainNVQI9/yDavtVFRN2h3k8uf3GLHuhDMgEHg==",
 "dev": true,
 "bin": {
 "tsc": "bin/tsc",
@@ -2564,9 +2565,9 @@
 }
 },
 "node_modules/vectordb": {
-"version": "0.4.12",
-"resolved": "https://registry.npmjs.org/vectordb/-/vectordb-0.4.12.tgz",
-"integrity": "sha512-H2mFwZ87d3BcuX4yGx5RoW06amRZQ3W5u/FRTNc86rwVINuEEa+Ivsqng1i8hURHXoTDbF/0Y9qwCshySkeznQ==",
+"version": "0.4.15",
+"resolved": "https://registry.npmjs.org/vectordb/-/vectordb-0.4.15.tgz",
+"integrity": "sha512-iROQ92CUGHj+kDUFVCdIk1Wvqg/pGE6WHhBlOorgRXHZwiFjP2SwmcDZeW7c7lH2hXC5Sks8MAGlglPstCpELQ==",
 "cpu": [
 "x64",
 "arm64"

@@ -2581,11 +2582,11 @@
 "axios": "^1.4.0"
 },
 "optionalDependencies": {
-"@lancedb/vectordb-darwin-arm64": "0.4.12",
-"@lancedb/vectordb-darwin-x64": "0.4.12",
-"@lancedb/vectordb-linux-arm64-gnu": "0.4.12",
-"@lancedb/vectordb-linux-x64-gnu": "0.4.12",
-"@lancedb/vectordb-win32-x64-msvc": "0.4.12"
+"@lancedb/vectordb-darwin-arm64": "0.4.15",
+"@lancedb/vectordb-darwin-x64": "0.4.15",
+"@lancedb/vectordb-linux-arm64-gnu": "0.4.15",
+"@lancedb/vectordb-linux-x64-gnu": "0.4.15",
+"@lancedb/vectordb-win32-x64-msvc": "0.4.15"
 },
 "peerDependencies": {
 "@apache-arrow/ts": "^14.0.2",
@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "binary",
 "version": "1.0.0",
 "description": "",
 "main": "out/index.js",

@@ -32,7 +32,7 @@
 "@types/follow-redirects": "^1.14.4",
 "@types/uuid": "^9.0.8",
 "@vercel/ncc": "^0.38.1",
-"esbuild": "^0.19.11",
+"esbuild": "0.19.11",
 "pkg": "^5.8.1",
 "typescript": "^5.3.3"
 },
@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "continue-binary",
 "version": "1.0.0",
 "description": "",
 "bin": "../../out/index.js",

@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "continue-binary",
 "version": "1.0.0",
 "description": "",
 "bin": "../../out/index.js",

@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "continue-binary",
 "version": "1.0.0",
 "description": "",
 "bin": "../../out/index.js",

@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "continue-binary",
 "version": "1.0.0",
 "description": "",
 "bin": "../../out/index.js",

@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "continue-binary",
 "version": "1.0.0",
 "description": "",
 "bin": "../../out/index.js",

@@ -1,5 +1,5 @@
 {
-"name": "pkg",
+"name": "continue-binary",
 "version": "1.0.0",
 "description": "",
 "bin": "../../out/index.js",
@@ -9,6 +9,7 @@ import { logDevData } from "core/util/devdata";
 import historyManager from "core/util/history";
 import { Message } from "core/util/messenger";
 import { Telemetry } from "core/util/posthog";
+import { streamDiffLines } from "core/util/verticalEdit";
 import { v4 as uuidv4 } from "uuid";
 import { IpcMessenger } from "./messenger";
 import { Protocol } from "./protocol";

@@ -47,6 +48,8 @@ export class Core {
 this.configHandler,
 this.ide,
 new PauseToken(false),
+undefined, // TODO
+Promise.resolve(undefined), // TODO
 );
 
 const getLlm = async () => {

@@ -63,6 +66,11 @@ export class Core {
 
 const on = this.messenger.on.bind(this.messenger);
 
+this.messenger.onError((err) => {
+console.error(err);
+this.messenger.request("errorPopup", { message: err.message });
+});
+
 // New
 on("update/modelChange", (msg) => {
 this.selectedModelTitle = msg.data;

@@ -151,6 +159,7 @@ export class Core {
 fullInput: msg.data.fullInput,
 ide,
 selectedCode: msg.data.selectedCode,
+reranker: config.reranker,
 });
 
 Telemetry.capture("useContextProvider", {

@@ -293,6 +302,36 @@ export class Core {
 on("autocomplete/cancel", async (msg) => {
 this.completionProvider.cancel();
 });
+
+async function* streamDiffLinesGenerator(
+configHandler: ConfigHandler,
+abortedMessageIds: Set<string>,
+msg: Message<Protocol["streamDiffLines"][0]>,
+) {
+const data = msg.data;
+const llm = await configHandler.llmFromTitle(msg.data.modelTitle);
+for await (const diffLine of streamDiffLines(
+data.prefix,
+data.highlighted,
+data.suffix,
+llm,
+data.input,
+data.language,
+)) {
+if (abortedMessageIds.has(msg.messageId)) {
+abortedMessageIds.delete(msg.messageId);
+break;
+}
+console.log(diffLine);
+yield { content: diffLine };
+}
+
+return { done: true };
+}
+
+on("streamDiffLines", (msg) =>
+streamDiffLinesGenerator(this.configHandler, this.abortedMessageIds, msg),
+);
 }
 
 public invoke<T extends keyof Protocol>(
@@ -35,6 +35,12 @@ export class IpcMessenger {
 });
 }
 
+private _onErrorHandlers: ((error: Error) => void)[] = [];
+
+onError(handler: (error: Error) => void) {
+this._onErrorHandlers.push(handler);
+}
+
 mock(data: any) {
 const d = JSON.stringify(data);
 this._handleData(Buffer.from(d));

@@ -63,8 +69,11 @@ export class IpcMessenger {
 } else {
 this.send(msg.messageType, response || {}, msg.messageId);
 }
-} catch (e) {
+} catch (e: any) {
 console.warn(`Error running handler for "${msg.messageType}": `, e);
+this._onErrorHandlers.forEach((handler) => {
+handler(e);
+});
 }
 });
 

@@ -78,7 +87,7 @@ export class IpcMessenger {
 
 private _handleData(data: Buffer) {
 const d = data.toString();
-const lines = d.split("\n").filter((line) => line.trim() !== "");
+const lines = d.split(/\r\n|\r|\n/).filter((line) => line.trim() !== "");
 lines.forEach((line) => this._handleLine(line));
 }
 
@@ -7,6 +7,7 @@ import { streamLines } from "../diff/util";
 import OpenAI from "../llm/llms/OpenAI";
 import { getBasename } from "../util";
 import { logDevData } from "../util/devdata";
+import { DEFAULT_AUTOCOMPLETE_OPTS } from "../util/parameters";
 import { Telemetry } from "../util/posthog";
 import { getRangeInString } from "../util/ranges";
 import AutocompleteLruCache from "./cache";

@@ -18,12 +19,12 @@ import {
 import { AutocompleteLanguageInfo } from "./languages";
 import {
 avoidPathLine,
+noTopLevelKeywordsMidline,
 stopAtLines,
 stopAtRepeatingLines,
 stopAtSimilarLine,
 streamWithNewLines,
 } from "./lineStream";
-import { DEFAULT_AUTOCOMPLETE_OPTS } from "./parameters";
 import { AutocompleteSnippet } from "./ranking";
 import { getTemplateForModel } from "./templates";
 import { GeneratorReuseManager } from "./util";

@@ -37,7 +38,7 @@ export interface AutocompleteInput {
 clipboardText: string;
 }
 
-export interface AutocompleteOutcome {
+export interface AutocompleteOutcome extends TabAutocompleteOptions {
 accepted?: boolean;
 time: number;
 prompt: string;

@@ -50,10 +51,21 @@ export interface AutocompleteOutcome {
 
 const autocompleteCache = AutocompleteLruCache.get();
 
+const DOUBLE_NEWLINE = "\n\n";
+const WINDOWS_DOUBLE_NEWLINE = "\r\n\r\n";
+const SRC_DIRECTORY = "/src/";
+// Starcoder2 tends to output artifacts starting with the letter "t"
+const STARCODER2_T_ARTIFACTS = ["t.", "\nt"];
+const PYTHON_ENCODING = "#- coding: utf-8";
+const CODE_BLOCK_END = "```";
+
+const multilineStops = [DOUBLE_NEWLINE, WINDOWS_DOUBLE_NEWLINE];
+const commonStops = [SRC_DIRECTORY, PYTHON_ENCODING, CODE_BLOCK_END];
+
 function formatExternalSnippet(
 filepath: string,
 snippet: string,
-language: AutocompleteLanguageInfo
+language: AutocompleteLanguageInfo,
 ) {
 const comment = language.comment;
 const lines = [
@@ -78,8 +90,8 @@ export async function getTabCompletion(
 filepath: string,
 contents: string,
 cursorIndex: number,
-ide: IDE
-) => Promise<AutocompleteSnippet[]>
+ide: IDE,
+) => Promise<AutocompleteSnippet[]>,
 ): Promise<AutocompleteOutcome | undefined> {
 const startTime = Date.now();
 

@@ -105,9 +117,12 @@ export async function getTabCompletion(
 // Model
 if (llm instanceof OpenAI) {
 llm.useLegacyCompletionsEndpoint = true;
-} else if (llm.providerName === "free-trial") {
+} else if (
+llm.providerName === "free-trial" &&
+llm.model !== "starcoder-7b"
+) {
 throw new Error(
-"Free trial is not supported for tab-autocomplete. We recommend using starcoder with Ollama, LM Studio, or another provider.",
+"The only free trial model supported for tab-autocomplete is starcoder-7b.",
 );
 }
 if (!llm) return;

@@ -129,7 +144,7 @@ export async function getTabCompletion(
 filepath,
 fullPrefix + fullSuffix,
 fullPrefix.length,
-ide
+ide,
 ),
 new Promise((resolve) => {
 setTimeout(() => resolve([]), 100);

@@ -155,7 +170,7 @@ export async function getTabCompletion(
 recentlyEditedRanges,
 recentlyEditedFiles,
 llm.model,
-extrasSnippets
+extrasSnippets,
 );
 
 // Template prompt

@@ -172,7 +187,7 @@ export async function getTabCompletion(
 // Format snippets as comments and prepend to prefix
 const formattedSnippets = snippets
 .map((snippet) =>
-formatExternalSnippet(snippet.filepath, snippet.contents, lang)
+formatExternalSnippet(snippet.filepath, snippet.contents, lang),
 )
 .join("\n");
 if (formattedSnippets.length > 0) {

@@ -203,12 +218,13 @@ export async function getTabCompletion(
 cacheHit = true;
 completion = cachedCompletion;
 } else {
-// Try to reuse pending requests if what the user typed matches start of completion
 let stop = [
 ...(completionOptions?.stop || []),
-"\n\n",
-"/src/",
-"```",
+...multilineStops,
+...commonStops,
+...(llm.model.toLowerCase().includes("starcoder2")
+? STARCODER2_T_ARTIFACTS
+: []),
 ...lang.stopWords,
 ];
 

@@ -216,6 +232,7 @@ export async function getTabCompletion(
 options.multilineCompletions !== "never" &&
 (options.multilineCompletions === "always" || completeMultiline);
 
+// Try to reuse pending requests if what the user typed matches start of completion
 let generator = generatorReuseManager.getGenerator(
 prefix,
 () =>

@@ -224,7 +241,7 @@ export async function getTabCompletion(
 raw: true,
 stop,
 }),
-multiline
+multiline,
 );
 
 // LLM
@@ -233,22 +250,22 @@ export async function getTabCompletion(
 for await (const update of generator) {
 if (token.aborted) {
 cancelled = true;
-return undefined;
+return;
 }
 yield update;
 }
 };
-let chars = generatorWithCancellation();
-const gen2 = onlyWhitespaceAfterEndOfLine(
-noFirstCharNewline(chars),
-lang.endOfLine
-);
-const lineGenerator = streamWithNewLines(
-avoidPathLine(
-stopAtRepeatingLines(stopAtLines(streamLines(gen2))),
-lang.comment
-)
-);
+let charGenerator = generatorWithCancellation();
+charGenerator = noFirstCharNewline(charGenerator);
+charGenerator = onlyWhitespaceAfterEndOfLine(charGenerator, lang.endOfLine);
+
+let lineGenerator = streamLines(charGenerator);
+lineGenerator = stopAtLines(lineGenerator);
+lineGenerator = stopAtRepeatingLines(lineGenerator);
+lineGenerator = avoidPathLine(lineGenerator, lang.comment);
+lineGenerator = noTopLevelKeywordsMidline(lineGenerator, lang.stopWords);
+lineGenerator = streamWithNewLines(lineGenerator);
+
 const finalGenerator = stopAtSimilarLine(lineGenerator, lineBelowCursor);
 for await (const update of finalGenerator) {
 completion += update;

@@ -276,6 +293,7 @@ export async function getTabCompletion(
 modelName: llm.model,
 completionOptions,
 cacheHit,
+...options,
 };
 }
 

@@ -293,11 +311,11 @@ export class CompletionProvider {
 filepath: string,
 contents: string,
 cursorIndex: number,
-ide: IDE
-) => Promise<AutocompleteSnippet[]>
+ide: IDE,
+) => Promise<AutocompleteSnippet[]>,
 ) {
 this.generatorReuseManager = new GeneratorReuseManager(
-this.onError.bind(this)
+this.onError.bind(this),
 );
 }
 

@@ -347,7 +365,7 @@ export class CompletionProvider {
 
 public async provideInlineCompletionItems(
 input: AutocompleteInput,
-token: AbortSignal | undefined
+token: AbortSignal | undefined,
 ): Promise<AutocompleteOutcome | undefined> {
 // Create abort signal if not given
 if (!token) {

@@ -367,12 +385,17 @@ export class CompletionProvider {
 ...config.tabAutocompleteOptions,
 };
 
+// Allow disabling autocomplete from config.json
+if (options.disable) {
+return undefined;
+}
+
 if (CompletionProvider.debouncing) {
 CompletionProvider.debounceTimeout?.refresh();
 const lastUUID = await new Promise((resolve) =>
 setTimeout(() => {
 resolve(CompletionProvider.lastUUID);
-}, options.debounceDelay)
+}, options.debounceDelay),
 );
 if (uuid !== lastUUID) {
 return undefined;

@@ -392,7 +415,7 @@ export class CompletionProvider {
 
 // Set temperature (but don't overrride)
 if (llm.completionOptions.temperature === undefined) {
-llm.completionOptions.temperature = 0.0;
+llm.completionOptions.temperature = 0.01;
 }
 
 const outcome = await getTabCompletion(

@@ -402,7 +425,7 @@ export class CompletionProvider {
 this.ide,
 this.generatorReuseManager,
 input,
-this.getDefinitionsFromLsp
+this.getDefinitionsFromLsp,
 );
 const completion = outcome?.completion;
 

@@ -421,12 +444,9 @@ export class CompletionProvider {
 const logRejectionTimeout = setTimeout(() => {
 // Wait 10 seconds, then assume it wasn't accepted
 logDevData("autocomplete", outcome);
+const { prompt, completion, ...restOfOutcome } = outcome;
 Telemetry.capture("autocomplete", {
-accepted: outcome.accepted,
-modelName: outcome.modelName,
-modelProvider: outcome.modelProvider,
-time: outcome.time,
-cacheHit: outcome.cacheHit,
+...restOfOutcome,
 });
 this._logRejectionTimeouts.delete(input.completionId);
 }, 10_000);
@@ -6,7 +6,7 @@ export interface AutocompleteLanguageInfo {
 
 // TypeScript
 export const Typescript = {
-stopWords: ["function", "class", "module", "export"],
+stopWords: ["function", "class", "module", "export", "import"],
 comment: "//",
 endOfLine: [";"],
 };
@@ -2,6 +2,22 @@ import { distance } from "fastest-levenshtein";
 import { DiffLine } from "..";
 import { LineStream } from "../diff/util";
 
+export async function* noTopLevelKeywordsMidline(
+lines: LineStream,
+topLevelKeywords: string[],
+): LineStream {
+for await (const line of lines) {
+for (const keyword of topLevelKeywords) {
+const indexOf = line.indexOf(keyword + " ");
+if (indexOf >= 0 && line.slice(indexOf - 1, indexOf).trim() !== "") {
+yield line.slice(0, indexOf);
+break;
+}
+}
+yield line;
+}
+}
+
 export async function* avoidPathLine(
 stream: LineStream,
 comment: string,

@@ -54,11 +70,11 @@ export async function* stopAtSimilarLine(
 }
 }
 
-const LINES_TO_STOP_AT = ["# End of file."];
+const LINES_TO_STOP_AT = ["# End of file.", "<STOP EDITING HERE"];
 
 export async function* stopAtLines(stream: LineStream): LineStream {
 for await (const line of stream) {
-if (LINES_TO_STOP_AT.includes(line)) {
+if (LINES_TO_STOP_AT.some((stopAt) => line.trim().includes(stopAt))) {
 break;
 }
 yield line;

@@ -69,7 +85,8 @@ function shouldRemoveLineBeforeStart(line: string): boolean {
 return (
 line.trimStart().startsWith("```") ||
 line.trim() === "[CODE]" ||
-line.trim() === ""
+line.trim() === "" ||
+line.trim() === "<START EDITING HERE>"
 );
 }
 

@@ -142,6 +159,9 @@ export async function* filterEnglishLinesAtStart(lines: LineStream) {
 let i = 0;
 let wasEnglishFirstLine = false;
 for await (let line of lines) {
+if (i === 0 && line.trim() === "") {
+continue;
+}
 if (i === 0) {
 if (isEnglishFirstLine(line)) {
 wasEnglishFirstLine = true;
@@ -1,10 +1,10 @@
-import { Chunk, IndexTag } from "..";
+import { BranchAndDir, Chunk } from "..";
 import { FullTextSearchCodebaseIndex } from "../indexing/FullTextSearch";
 
 export async function fullTextRetrieve(
 prefix: string,
 suffix: string,
-indexTag: IndexTag,
+indexTag: BranchAndDir,
 ): Promise<Chunk[]> {
 const index = new FullTextSearchCodebaseIndex();
 const searchStrings = prefix.split("\n").slice(-3);
@@ -24,6 +24,13 @@ const stableCodeFimTemplate: AutocompleteTemplate = {
 },
 };
 
+const codegemmaFimTemplate: AutocompleteTemplate = {
+template: "<|fim_prefix|>{{{prefix}}}<|fim_suffix|>{{{suffix}}}<|fim_middle|>",
+completionOptions: {
+stop: ["<|fim_prefix|>", "<|fim_suffix|>", "<|fim_middle|>", "<|file_separator|>", "<end_of_turn>", "<eos>"],
+},
+};
+
 // https://arxiv.org/pdf/2402.19173.pdf section 5.1
 const starcoder2FimTemplate: AutocompleteTemplate = {
 template: (

@@ -79,12 +86,11 @@ const deepseekFimTemplateWrongPipeChar: AutocompleteTemplate = {
 };
 
 const gptAutocompleteTemplate: AutocompleteTemplate = {
-template: `Your task is to complete the line at the end of this code block:
-\`\`\`
-{{{prefix}}}
+template: `\`\`\`
+{{{prefix}}}[BLANK]{{{suffix}}}
 \`\`\`
 
-The last line is incomplete, and you should provide the rest of that line. If the line is already complete, just return a new line. Otherwise, DO NOT provide explanation, a code block, or extra whitespace, just the code that should be added to the last line to complete it:`,
+Fill in the blank to complete the code block. Your response should include only the code to replace [BLANK], without surrounding backticks.`,
 completionOptions: { stop: ["\n"] },
 };
 

@@ -105,6 +111,10 @@ export function getTemplateForModel(model: string): AutocompleteTemplate {
 return stableCodeFimTemplate;
 }
 
+if (lowerCaseModel.includes("codegemma")) {
+return codegemmaFimTemplate;
+}
+
 if (lowerCaseModel.includes("codellama")) {
 return codeLlamaFimTemplate;
 }

@@ -113,7 +123,11 @@ export function getTemplateForModel(model: string): AutocompleteTemplate {
 return deepseekFimTemplate;
 }
 
-if (lowerCaseModel.includes("gpt") || lowerCaseModel.includes("davinci-002")) {
+if (
+lowerCaseModel.includes("gpt") ||
+lowerCaseModel.includes("davinci-002") ||
+lowerCaseModel.includes("claude")
+) {
 return gptAutocompleteTemplate;
 }
 
|
@@ -49,8 +49,9 @@ export class ListenableGenerator<T> {

   async *tee(): AsyncGenerator<T> {
     try {
-      for (const value of this._buffer) {
-        yield value;
+      let i = 0;
+      while (i < this._buffer.length) {
+        yield this._buffer[i++];
       }
       while (!this._isEnded) {
         let resolve: (value: any) => void;

@@ -58,10 +59,14 @@ export class ListenableGenerator<T> {
           resolve = res;
           this._listeners.add(resolve!);
         });
-        const value = await promise;
+        await promise;
         this._listeners.delete(resolve!);

-        yield value;
+        // Possible timing caused something to slip in between
+        // timers so we iterate over the buffer
+        while (i < this._buffer.length) {
+          yield this._buffer[i++];
+        }
       }
     } finally {
       // this._listeners.delete(resolve!);
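The change above switches `tee()` from iterating the buffer once to an index-based loop, so values appended to `_buffer` while the consumer is suspended on an await are not skipped. A simplified sketch of the same buffered-replay pattern (field names follow the class above; the `wait` callback stands in for the listener/promise machinery):

```ts
// Simplified sketch: an index into the shared buffer survives across awaits,
// so anything appended while we were suspended is drained before waiting again.
async function* replay<T>(
  buffer: T[],
  isEnded: () => boolean,
  wait: () => Promise<void>,
): AsyncGenerator<T> {
  let i = 0;
  while (i < buffer.length) {
    yield buffer[i++]; // catch up on everything buffered so far
  }
  while (!isEnded()) {
    await wait(); // a new value (or the end of the stream) was signaled
    while (i < buffer.length) {
      yield buffer[i++]; // drain anything that slipped in meanwhile
    }
  }
}
```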
@@ -4,11 +4,12 @@ import {
   filterEnglishLinesAtEnd,
   filterEnglishLinesAtStart,
   fixCodeLlamaFirstLineIndentation,
+  stopAtLines,
   streamWithNewLines,
 } from "../../autocomplete/lineStream";
 import { streamLines } from "../../diff/util";
 import { stripImages } from "../../llm/countTokens";
-import { dedentAndGetCommonWhitespace, renderPromptTemplate } from "../../util";
+import { dedentAndGetCommonWhitespace } from "../../util";
 import {
   RangeInFileWithContents,
   contextItemToRangeInFileWithContents,

@@ -446,7 +447,7 @@ const EditSlashCommand: SlashCommand = {
     const template = llm.promptTemplates?.edit;
     let generator: AsyncGenerator<string>;
     if (template) {
-      let rendered = renderPromptTemplate(
+      let rendered = llm.renderPromptTemplate(
         template,
         // typeof template === 'string' ? template : template.prompt,
         messages.slice(0, messages.length - 1),

@@ -479,6 +480,7 @@ const EditSlashCommand: SlashCommand = {
       lineStream = filterEnglishLinesAtStart(lineStream);

       lineStream = filterEnglishLinesAtEnd(filterCodeBlockLines(lineStream));
+      lineStream = stopAtLines(lineStream);

       generator = streamWithNewLines(
         fixCodeLlamaFirstLineIndentation(lineStream),

@@ -6,6 +6,7 @@ import EditSlashCommand from "./edit";
 import HttpSlashCommand from "./http";
 import ShareSlashCommand from "./share";
 import StackOverflowSlashCommand from "./stackOverflow";
+import ReviewMessageCommand from "./review";

 export default [
   DraftIssueCommand,

@@ -16,4 +17,5 @@ export default [
   CommentSlashCommand,
   HttpSlashCommand,
   CommitMessageCommand,
+  ReviewMessageCommand,
 ];
@@ -0,0 +1,55 @@
+import { SlashCommand } from "../..";
+import { stripImages } from "../../llm/countTokens";
+import { ChatMessage } from "../..";
+
+const prompt = `
+Review the following code, focusing on Readability, Maintainability, Code Smells, Speed, and Memory Performance. Provide feedback with these guidelines:
+
+Tone: Friendly casual tone of a fellow engineer, ensure the feedback is clear and focused on practical improvements.
+Orderly Analysis: Address the code sequentially, from top to bottom, to ensure a thorough review without skipping any parts.
+Descriptive Feedback: Avoid referencing line numbers directly, as they may vary. Instead, describe the code sections or specific constructs that need attention, explaining the reasons clearly.
+Provide Examples: For each issue identified, offer an example of how the code could be improved or rewritten for better clarity, performance, or maintainability.
+Your response should be structured to first identify the issue, then explain why it’s a problem, and finally, offer a solution with example code.`;
+
+
+function getLastUserHistory(history: ChatMessage[]): string {
+  const lastUserHistory = history
+    .reverse()
+    .find((message) => message.role === "user");
+
+  if (!lastUserHistory) return "";
+
+  if (lastUserHistory.content instanceof Array) {
+    return lastUserHistory.content.reduce(
+      (acc: string, current: { type: string; text?: string }) => {
+        return current.type === "text" && current.text
+          ? acc + current.text
+          : acc;
+      },
+      ""
+    );
+  }
+
+  return typeof lastUserHistory.content === "string"
+    ? lastUserHistory.content
+    : "";
+}
+
+const ReviewMessageCommand: SlashCommand = {
+  name: "review",
+  description: "Review code and give feedback",
+  run: async function* ({ llm, history }) {
+
+    let reviewText = getLastUserHistory(history).replace("\\review", "");
+
+    const content = `${prompt} \r\n ${reviewText}`;
+
+    for await (const chunk of llm.streamChat([
+      { role: "user", content: content },
+    ])) {
+      yield stripImages(chunk.content);
+    }
+  },
+};
+
+export default ReviewMessageCommand;
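One subtlety in the new command: `Array.prototype.reverse()` mutates the history array in place, so the sketch below copies it first. An illustrative run of `getLastUserHistory` with a hypothetical history (the message contents are stand-ins):

```ts
// Hypothetical input, showing how the last user message is found and how
// array-style content parts are flattened to text. Note the defensive copy:
// .reverse() mutates, so [...history] avoids reordering the caller's array.
const history: ChatMessage[] = [
  { role: "user", content: "/review" },
  { role: "assistant", content: "Sure, paste the code." },
  {
    role: "user",
    content: [{ type: "text", text: "function f() { return 1 }" }],
  },
];

const lastUser = [...history].reverse().find((m) => m.role === "user");
// => the array-content message; its text parts concatenate to
//    "function f() { return 1 }"
```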
@@ -2,6 +2,11 @@ import { SerializedContinueConfig } from "..";

 export const defaultConfig: SerializedContinueConfig = {
   models: [
+    {
+      title: "Claude 3 Sonnet (Free Trial)",
+      provider: "free-trial",
+      model: "claude-3-sonnet-20240229",
+    },
     {
       title: "GPT-4 Vision (Free Trial)",
       provider: "free-trial",

@@ -17,21 +22,11 @@ export const defaultConfig: SerializedContinueConfig = {
       provider: "free-trial",
       model: "gemini-pro",
     },
-    {
-      title: "Codellama 70b (Free Trial)",
-      provider: "free-trial",
-      model: "codellama-70b",
-    },
     {
       title: "Mixtral (Free Trial)",
       provider: "free-trial",
       model: "mistral-8x7b",
     },
-    {
-      title: "Claude 3 Sonnet (Free Trial)",
-      provider: "free-trial",
-      model: "claude-3-sonnet-20240229",
-    },
   ],
   slashCommands: [
     {

@@ -70,14 +65,19 @@ export const defaultConfig: SerializedContinueConfig = {
     { name: "codebase", params: {} },
   ],
   tabAutocompleteModel: {
-    title: "Starcoder 3b",
+    title: "Starcoder2 3b",
     provider: "ollama",
-    model: "starcoder-3b",
+    model: "starcoder2:3b",
   },
 };

 export const defaultConfigJetBrains: SerializedContinueConfig = {
   models: [
+    {
+      title: "Claude 3 Sonnet (Free Trial)",
+      provider: "free-trial",
+      model: "claude-3-sonnet-20240229",
+    },
     {
       title: "GPT-4 Vision (Free Trial)",
       provider: "free-trial",

@@ -93,21 +93,11 @@ export const defaultConfigJetBrains: SerializedContinueConfig = {
       provider: "free-trial",
       model: "gemini-pro",
     },
-    {
-      title: "Codellama 70b (Free Trial)",
-      provider: "free-trial",
-      model: "codellama-70b",
-    },
     {
       title: "Mixtral (Free Trial)",
       provider: "free-trial",
       model: "mistral-8x7b",
     },
-    {
-      title: "Claude 3 Sonnet (Free Trial)",
-      provider: "free-trial",
-      model: "claude-3-sonnet-20240229",
-    },
   ],
   slashCommands: [
     {

@@ -137,7 +127,9 @@ export const defaultConfigJetBrains: SerializedContinueConfig = {
       params: {},
     },
   ],
+  ui: {
+    codeBlockToolbarPosition: "top",
+  },
   tabAutocompleteModel: {
     title: "Starcoder2 3b",
     provider: "ollama",
     model: "starcoder2:3b",
   },
 };
@@ -1,8 +1,3 @@
-import { http, https } from "follow-redirects";
-import * as fs from "fs";
-import { HttpProxyAgent } from "http-proxy-agent";
-import { HttpsProxyAgent } from "https-proxy-agent";
-import fetch from "node-fetch";
 import { ContinueConfig, ContinueRcJson, IDE, ILLM } from "..";
 import { IdeSettings } from "../protocol";
 import { Telemetry } from "../util/posthog";

@@ -11,7 +6,8 @@ import {
   finalToBrowserConfig,
   loadFullConfigNode,
 } from "./load";
-const tls = require("tls");
+import { fetchwithRequestOptions } from "../util/fetchWithOptions";
+
 export class ConfigHandler {
   private savedConfig: ContinueConfig | undefined;
   private savedBrowserConfig?: BrowserSerializedContinueConfig;

@@ -20,7 +16,7 @@ export class ConfigHandler {
     private readonly ide: IDE,
     private ideSettingsPromise: Promise<IdeSettings>,
     private readonly writeLog: (text: string) => void,
-    private readonly onConfigUpdate: () => void,
+    private readonly onConfigUpdate: () => void
   ) {
     this.ide = ide;
     this.ideSettingsPromise = ideSettingsPromise;

@@ -80,7 +76,7 @@ export class ConfigHandler {
       this.ide.readFile,
       workspaceConfigs,
       remoteConfigServerUrl,
-      ideInfo.ideType,
+      ideInfo.ideType
     );
     this.savedConfig.allowAnonymousTelemetry =
       this.savedConfig.allowAnonymousTelemetry &&
@@ -90,81 +86,19 @@ export class ConfigHandler {
     await Telemetry.setup(
       this.savedConfig.allowAnonymousTelemetry ?? true,
       await this.ide.getUniqueId(),
       ideInfo.extensionVersion
     );

     return this.savedConfig;
   }

   setupLlm(llm: ILLM): ILLM {
-    const TIMEOUT = 7200; // 7200 seconds = 2 hours
-    // Since we know this is happening in Node.js, we can add requestOptions through a custom agent
-    const ca = [...tls.rootCertificates];
-    const customCerts =
-      typeof llm.requestOptions?.caBundlePath === "string"
-        ? [llm.requestOptions?.caBundlePath]
-        : llm.requestOptions?.caBundlePath;
-    if (customCerts) {
-      ca.push(
-        ...customCerts.map((customCert) => fs.readFileSync(customCert, "utf8")),
-      );
-    }
-
-    let timeout = (llm.requestOptions?.timeout ?? TIMEOUT) * 1000; // measured in ms
-
-    const agentOptions = {
-      ca,
-      rejectUnauthorized: llm.requestOptions?.verifySsl,
-      timeout,
-      sessionTimeout: timeout,
-      keepAlive: true,
-      keepAliveMsecs: timeout,
-    };
-
-    const proxy = llm.requestOptions?.proxy;
-
     llm._fetch = async (input, init) => {
-      // Create agent
-      const protocol = new URL(input).protocol === "https:" ? https : http;
-      const agent = proxy
-        ? new URL(input).protocol === "https:"
-          ? new HttpsProxyAgent(proxy, agentOptions)
-          : new HttpProxyAgent(proxy, agentOptions)
-        : new protocol.Agent(agentOptions);
-
-      const headers: { [key: string]: string } =
-        llm!.requestOptions?.headers || {};
-      for (const [key, value] of Object.entries(init?.headers || {})) {
-        headers[key] = value as string;
-      }
-
-      // Replace localhost with 127.0.0.1
-      input = new URL(input);
-      if (input.hostname === "localhost") {
-        input.hostname = "127.0.0.1";
-      }
-
-      let updatedBody: string | undefined = undefined;
-      try {
-        if (
-          llm.requestOptions?.extraBodyProperties &&
-          typeof init.body === "string"
-        ) {
-          const parsedBody = JSON.parse(init.body);
-          updatedBody = JSON.stringify({
-            ...parsedBody,
-            ...llm.requestOptions.extraBodyProperties,
-          });
-        }
-      } catch (e) {
-        console.log("Unable to parse HTTP request body: ", e);
-      }
-
-      const resp = await fetch(input, {
-        ...init,
-        body: updatedBody ?? init.body,
-        headers,
-        agent,
-      });
+      const resp = await fetchwithRequestOptions(
+        new URL(input),
+        { ...init },
+        llm.requestOptions
+      );

       if (!resp.ok) {
         let text = await resp.text();

@@ -181,7 +115,7 @@ export class ConfigHandler {
         }
       }
       throw new Error(
-        `HTTP ${resp.status} ${resp.statusText} from ${resp.url}\n\n${text}`,
+        `HTTP ${resp.status} ${resp.statusText} from ${resp.url}\n\n${text}`
       );
     }
@@ -12,6 +12,8 @@ import {
   IContextProvider,
   IdeType,
   ModelDescription,
+  Reranker,
+  RerankerDescription,
   SerializedContinueConfig,
   SlashCommand,
 } from "..";

@@ -22,6 +24,8 @@ import {
 import { contextProviderClassFromName } from "../context/providers";
 import CustomContextProviderClass from "../context/providers/CustomContextProvider";
 import FileContextProvider from "../context/providers/FileContextProvider";
+import { AllRerankers } from "../context/rerankers";
+import { LLMReranker } from "../context/rerankers/llm";
 import { AllEmbeddingsProviders } from "../indexing/embeddings";
 import TransformersJsEmbeddingsProvider from "../indexing/embeddings/TransformersJsEmbeddingsProvider";
 import { BaseLLM } from "../llm";

@@ -267,6 +271,7 @@ async function intermediateToFinalConfig(
     }
   }

+  // Embeddings Provider
   if (
     (config.embeddingsProvider as EmbeddingsProviderDescription | undefined)
       ?.provider

@@ -280,12 +285,30 @@ async function intermediateToFinalConfig(
     config.embeddingsProvider = new TransformersJsEmbeddingsProvider();
   }

+  // Reranker
+  if (config.reranker && !(config.reranker as Reranker | undefined)?.rerank) {
+    const { name, params } = config.reranker as RerankerDescription;
+    const rerankerClass = AllRerankers[name];
+
+    if (name === "llm") {
+      const llm = models.find((model) => model.title === params?.modelTitle);
+      if (!llm) {
+        console.warn(`Unknown model ${params?.modelTitle}`);
+      } else {
+        config.reranker = new LLMReranker(llm);
+      }
+    } else if (rerankerClass) {
+      config.reranker = new rerankerClass(params);
+    }
+  }
+
   return {
     ...config,
     contextProviders,
     models,
     embeddingsProvider: config.embeddingsProvider as any,
     tabAutocompleteModel: autocompleteLlm,
+    reranker: config.reranker as any,
   };
 }
@@ -0,0 +1,90 @@
+import { SerializedContinueConfig } from "..";
+
+export function setupOptimizedMode(
+  config: SerializedContinueConfig,
+): SerializedContinueConfig {
+  return {
+    ...config,
+    models: [
+      {
+        title: "Claude 3 Sonnet (Free Trial)",
+        provider: "free-trial",
+        model: "claude-3-sonnet-20240229",
+      },
+      {
+        title: "GPT-4 Vision (Free Trial)",
+        provider: "free-trial",
+        model: "gpt-4-vision-preview",
+      },
+      {
+        title: "GPT-3.5-Turbo (Free Trial)",
+        provider: "free-trial",
+        model: "gpt-3.5-turbo",
+      },
+      {
+        title: "Gemini Pro (Free Trial)",
+        provider: "free-trial",
+        model: "gemini-pro",
+      },
+      {
+        title: "Mixtral (Free Trial)",
+        provider: "free-trial",
+        model: "mistral-8x7b",
+      },
+    ],
+    tabAutocompleteModel: {
+      title: "Tab Autocomplete",
+      provider: "free-trial",
+      model: "starcoder-7b",
+    },
+    embeddingsProvider: {
+      provider: "free-trial",
+    },
+    reranker: {
+      name: "free-trial",
+    },
+  };
+}
+
+export function setupOptimizedExistingUserMode(
+  config: SerializedContinueConfig,
+): SerializedContinueConfig {
+  return {
+    ...config,
+    tabAutocompleteModel: {
+      title: "Tab Autocomplete",
+      provider: "free-trial",
+      model: "starcoder-7b",
+    },
+    embeddingsProvider: {
+      provider: "free-trial",
+    },
+    reranker: {
+      name: "free-trial",
+    },
+  };
+}
+
+export function setupLocalMode(
+  config: SerializedContinueConfig,
+): SerializedContinueConfig {
+  return {
+    ...config,
+    models: [
+      {
+        title: "Ollama",
+        provider: "ollama",
+        model: "AUTODETECT",
+      },
+    ],
+    tabAutocompleteModel: {
+      title: "Starcoder 3b",
+      provider: "ollama",
+      model: "starcoder2:3b",
+    },
+    embeddingsProvider: {
+      provider: "transformers.js",
+    },
+    reranker: undefined,
+  };
+}
@@ -408,7 +408,6 @@ declare global {
     | "open"
     | "google"
     | "search"
-    | "url"
     | "tree"
     | "http"
     | "codebase"

@@ -486,6 +485,7 @@ declare global {
     | "chat-bison-001"
     // Gemini
     | "gemini-pro"
+    | "gemini-1.5-pro-latest"
     // Mistral
     | "mistral-tiny"
     | "mistral-small"
@@ -4,7 +4,7 @@ import {
   ContextProviderDescription,
   ContextProviderExtras,
 } from "../..";
-import { retrieveContextItemsFromEmbeddings } from "../retrieval";
+import { retrieveContextItemsFromEmbeddings } from "../retrieval/retrieval";

 class CodebaseContextProvider extends BaseContextProvider {
   static description: ContextProviderDescription = {

@@ -20,7 +20,9 @@ class FolderContextProvider extends BaseContextProvider {
     query: string,
     extras: ContextProviderExtras,
   ): Promise<ContextItem[]> {
-    const { retrieveContextItemsFromEmbeddings } = await import("../retrieval");
+    const { retrieveContextItemsFromEmbeddings } = await import(
+      "../retrieval/retrieval"
+    );
     return retrieveContextItemsFromEmbeddings(extras, this.options, query);
   }
   async loadSubmenuItems(

@@ -1,5 +1,5 @@
-
-import type { AxiosInstance } from "axios";
+import { RequestOptions } from "../../..";
+import { fetchwithRequestOptions } from "../../../util/fetchWithOptions";
 const { convert: adf2md } = require("adf-to-md");

 interface JiraClientOptions {

@@ -8,6 +8,7 @@ interface JiraClientOptions {
   password: string;
   issueQuery?: string;
   apiVersion?: string;
+  requestOptions?: RequestOptions;
 }

 interface JiraComment {
@@ -63,95 +64,110 @@ export interface Issue {

 export class JiraClient {
   private readonly options: Required<JiraClientOptions>;
-  private _api: AxiosInstance | null = null;
+  private baseUrl: string;
+  private authHeader;
   constructor(options: JiraClientOptions) {
     this.options = {
       issueQuery: `assignee = currentUser() AND resolution = Unresolved order by updated DESC`,
-      apiVersion: '3',
-      ...options
+      apiVersion: "3",
+      requestOptions: {},
+      ...options,
     };
-  }
-
-  private async getApi() {
-    if (!this._api) {
-      const { default: Axios } = await import("axios");
-      this._api = Axios.create({
-        baseURL: `https://${this.options.domain}/rest/api/${this.options.apiVersion}/`,
-        ...(this.options.username
-          ? {
-              auth: {
-                username: this.options.username,
-                password: this.options.password,
-              },
-            }
-          : {
-              headers: {
-                Authorization: `Bearer ${this.options.password}`,
-              },
-            }),
-      });
-    }
-
-    return this._api;
+    this.baseUrl = `https://${this.options.domain}/rest/api/${this.options.apiVersion}`;
+    this.authHeader = this.options.username
+      ? {
+          Authorization:
+            "Basic " +
+            btoa(this.options.username + ":" + this.options.password),
+        }
+      : {
+          Authorization: `Bearer ${this.options.password}`,
+        };
   }

   async issue(issueId: string): Promise<Issue> {
-    const api = await this.getApi();
     const result = {} as Issue;

-    const issue = await api
-      .get<JiraIssue>(`/issue/${issueId}`, {
-        params: {
-          fields: "description,comment,summary",
-        },
-      })
-      .then((result) => result.data);
+    const response = await fetchwithRequestOptions(
+      new URL(
+        this.baseUrl + `/issue/${issueId}?fields=description,comment,summary`
+      ),
+      {
+        method: "GET",
+        headers: {
+          "Content-Type": "application/json",
+          ...this.authHeader,
+        },
+      },
+      this.options.requestOptions
+    );
+
+    const issue = (await response.json()) as any;

     result.key = issue.key;
     result.summary = issue.fields.summary;

-    if (typeof issue.fields.description === 'string') {
+    if (typeof issue.fields.description === "string") {
       result.description = issue.fields.description;
-    } else if(issue.fields.description) {
+    } else if (issue.fields.description) {
       result.description = adf2md(issue.fields.description).result;
     } else {
       result.description = "";
     }

-    result.comments = issue.fields.comment?.comments?.map((comment) => {
-      const body = typeof comment.body === 'string' ? comment.body : adf2md(comment.body).result;
-
-      return {
-        body,
-        author: comment.author,
-        created: comment.created,
-        updated: comment.updated,
-      };
-    }) ?? [];
+    result.comments =
+      issue.fields.comment?.comments?.map((comment: any) => {
+        const body =
+          typeof comment.body === "string"
+            ? comment.body
+            : adf2md(comment.body).result;
+
+        return {
+          body,
+          author: comment.author,
+          created: comment.created,
+          updated: comment.updated,
+        };
+      }) ?? [];

     return result;
   }

   async listIssues(): Promise<Array<QueryResult>> {
-    const api = await this.getApi();
-
-    const results = await api.get<QueryResults>("/search", {
-      params: {
-        jql:
-          this.options.issueQuery ??
-          `assignee = currentUser() AND resolution = Unresolved order by updated DESC`,
-        fields: "summary",
-      },
-    });
+    const response = await fetchwithRequestOptions(
+      new URL(
+        this.baseUrl +
+          `/search?fields=summary&jql=${
+            this.options.issueQuery ??
+            `assignee = currentUser() AND resolution = Unresolved order by updated DESC`
+          }`
+      ),
+      {
+        method: "GET",
+        headers: {
+          "Content-Type": "application/json",
+          ...this.authHeader,
+        },
+      },
+      this.options.requestOptions
+    );

-    return results.data?.issues?.map((issue) => ({
-      id: issue.id,
-      key: issue.key,
-      summary: issue.fields.summary,
-    })) ?? [];
+    if (response.status != 200) {
+      console.warn(
+        "Unable to get jira tickets. Response code from API is",
+        response.status
+      );
+      return Promise.resolve([]);
+    }
+
+    const data = (await response.json()) as any;
+
+    return (
+      data.issues?.map((issue: any) => ({
+        id: issue.id,
+        key: issue.key,
+        summary: issue.fields.summary,
+      })) ?? []
+    );
  }
 }
@@ -23,6 +23,7 @@ class JiraIssuesContextProvider extends BaseContextProvider {
       password: this.options.token,
       issueQuery: this.options.issueQuery,
       apiVersion: this.options.apiVersion,
+      requestOptions: this.options.requestOptions,
     });
   }
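With this change, the same `requestOptions` shape used elsewhere in the config flows through to every Jira call via `fetchwithRequestOptions`. A hypothetical provider configuration, shown as a TypeScript object (field names follow the diffs above; which options are honored depends on `fetchwithRequestOptions`):

```ts
// Hypothetical Jira context-provider options; requestOptions is passed
// straight through JiraIssuesContextProvider into JiraClient.
const jiraProviderOptions = {
  domain: "mycompany.atlassian.net",
  token: "<api token>",
  apiVersion: "3",
  requestOptions: {
    headers: { "X-Custom-Header": "value" }, // extra headers per request
    verifySsl: false, // e.g. for self-signed certificates
  },
};
```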
@@ -1,43 +0,0 @@
-import { BaseContextProvider } from "..";
-import {
-  ContextItem,
-  ContextProviderDescription,
-  ContextProviderExtras,
-} from "../..";
-
-class URLContextProvider extends BaseContextProvider {
-  static description: ContextProviderDescription = {
-    title: "url",
-    displayTitle: "URL",
-    description: "Attach the contents of a web page",
-    type: "query",
-  };
-
-  async getContextItems(
-    query: string,
-    extras: ContextProviderExtras,
-  ): Promise<ContextItem[]> {
-    let url = query.trim();
-    if (!url.startsWith("http")) {
-      url = "https://" + url;
-    }
-
-    const response = await fetch(url);
-    const html = await response.text();
-    const content = html;
-    const title = url
-      .replace("https://", "")
-      .replace("http://", "")
-      .replace("www.", "");
-
-    return [
-      {
-        content,
-        name: title,
-        description: title,
-      },
-    ];
-  }
-}
-
-export default URLContextProvider;
@@ -8,18 +8,17 @@ import DocsContextProvider from "./DocsContextProvider";
 import FileTreeContextProvider from "./FileTreeContextProvider";
 import FolderContextProvider from "./FolderContextProvider";
 import GitHubIssuesContextProvider from "./GitHubIssuesContextProvider";
+import GitLabMergeRequestContextProvider from "./GitLabMergeRequestContextProvider";
 import GoogleContextProvider from "./GoogleContextProvider";
 import HttpContextProvider from "./HttpContextProvider";
 import JiraIssuesContextProvider from "./JiraIssuesContextProvider";
+import LocalsProvider from "./LocalsProvider";
+import OSContextProvider from "./OSContextProvider";
 import OpenFilesContextProvider from "./OpenFilesContextProvider";
 import PostgresContextProvider from "./PostgresContextProvider";
 import ProblemsContextProvider from "./ProblemsContextProvider";
 import SearchContextProvider from "./SearchContextProvider";
-import OSContextProvider from "./OSContextProvider";
 import TerminalContextProvider from "./TerminalContextProvider";
-import LocalsProvider from "./LocalsProvider";
-import URLContextProvider from "./URLContextProvider";
-import GitLabMergeRequestContextProvider from "./GitLabMergeRequestContextProvider";

 const Providers: (typeof BaseContextProvider)[] = [
   DiffContextProvider,

@@ -28,7 +27,6 @@ const Providers: (typeof BaseContextProvider)[] = [
   GoogleContextProvider,
   TerminalContextProvider,
   LocalsProvider,
-  URLContextProvider,
   OpenFilesContextProvider,
   HttpContextProvider,
   SearchContextProvider,
@@ -0,0 +1,30 @@
+import fetch from "node-fetch";
+import { Chunk, Reranker } from "../..";
+import { getHeaders } from "../../continueServer/stubs/headers";
+import { SERVER_URL } from "../../util/parameters";
+
+export class FreeTrialReranker implements Reranker {
+  name = "free-trial";
+
+  async rerank(query: string, chunks: Chunk[]): Promise<number[]> {
+    const resp = await fetch(new URL("rerank", SERVER_URL), {
+      method: "POST",
+      headers: {
+        "Content-Type": "application/json",
+        ...getHeaders(),
+      },
+      body: JSON.stringify({
+        query,
+        documents: chunks.map((chunk) => chunk.content),
+      }),
+    });
+
+    if (!resp.ok) {
+      throw new Error(await resp.text());
+    }
+
+    const data = (await resp.json()) as any;
+    const results = data.sort((a: any, b: any) => a.index - b.index);
+    return results.map((result: any) => result.relevance_score);
+  }
+}

@@ -0,0 +1,10 @@
+import { RerankerName } from "../..";
+import { FreeTrialReranker } from "./freeTrial";
+import { LLMReranker } from "./llm";
+import { VoyageReranker } from "./voyage";
+
+export const AllRerankers: { [key in RerankerName]: any } = {
+  llm: LLMReranker,
+  voyage: VoyageReranker,
+  "free-trial": FreeTrialReranker,
+};
@@ -0,0 +1,91 @@
+import { Chunk, ILLM, Reranker } from "../..";
+import { getBasename } from "../../util";
+
+const RERANK_PROMPT = (
+  query: string,
+  documentId: string,
+  document: string,
+) => `You are an expert software developer responsible for helping detect whether the retrieved snippet of code is relevant to the query. For a given input, you need to output a single word: "Yes" or "No" indicating the retrieved snippet is relevant to the query.
+
+Query: Where is the FastAPI server?
+Snippet:
+\`\`\`/Users/andrew/Desktop/server/main.py
+from fastapi import FastAPI
+app = FastAPI()
+@app.get("/")
+def read_root():
+    return {{"Hello": "World"}}
+\`\`\`
+Relevant: Yes
+
+Query: Where in the documentation does it talk about the UI?
+Snippet:
+\`\`\`/Users/andrew/Projects/bubble_sort/src/lib.rs
+fn bubble_sort<T: Ord>(arr: &mut [T]) {{
+  for i in 0..arr.len() {{
+    for j in 1..arr.len() - i {{
+      if arr[j - 1] > arr[j] {{
+        arr.swap(j - 1, j);
+      }}
+    }}
+  }}
+}}
+\`\`\`
+Relevant: No
+
+Query: ${query}
+Snippet:
+\`\`\`${documentId}
+${document}
+\`\`\`
+Relevant:
+`;
+
+export class LLMReranker implements Reranker {
+  name = "llmReranker";
+
+  constructor(private readonly llm: ILLM) {}
+
+  async scoreChunk(chunk: Chunk, query: string): Promise<number> {
+    const completion = await this.llm.complete(
+      RERANK_PROMPT(query, getBasename(chunk.filepath), chunk.content),
+      {
+        maxTokens: 1,
+        model:
+          this.llm.providerName.startsWith("openai") &&
+          this.llm.model.startsWith("gpt-4")
+            ? "gpt-3.5-turbo"
+            : this.llm.model,
+      },
+    );
+
+    if (!completion) {
+      // TODO: Why is this happening?
+      return 0.0;
+    }
+
+    let answer = completion
+      .trim()
+      .toLowerCase()
+      .replace(/"/g, "")
+      .replace(/'/g, "");
+
+    if (answer === "yes") {
+      return 1.0;
+    } else if (answer === "no") {
+      return 0.0;
+    } else {
+      console.warn(
+        `Unexpected response from single token reranker: "${answer}". Expected "yes" or "no".`,
+      );
+      return 0.0;
+    }
+  }
+
+  async rerank(query: string, chunks: Chunk[]): Promise<number[]> {
+    const scores = await Promise.all(
+      chunks.map((chunk) => this.scoreChunk(chunk, query)),
+    );
+    return scores;
+  }
+}
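In effect this reranker treats the LLM as a binary relevance classifier: one `maxTokens: 1` completion per chunk, mapped to 1.0 for "yes" and 0.0 for "no". A worked example of what `rerank` produces (`llm`, `chunkA`, and `chunkB` are hypothetical stand-ins):

```ts
// Hypothetical: if the model answers "Yes" for chunkA and "No" for chunkB,
// the scores come back positionally, matching the input order — not sorted.
async function demo(llm: ILLM, chunkA: Chunk, chunkB: Chunk) {
  const scores = await new LLMReranker(llm).rerank(
    "Where is the FastAPI server?",
    [chunkA, chunkB],
  );
  // scores: [1.0, 0.0]
}
```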
@@ -0,0 +1,30 @@
+import { Chunk, Reranker } from "../..";
+
+export class VoyageReranker implements Reranker {
+  name = "voyage";
+
+  constructor(
+    private readonly params: {
+      apiKey: string;
+      model?: string;
+    },
+  ) {}
+
+  async rerank(query: string, chunks: Chunk[]): Promise<number[]> {
+    const resp = await fetch("https://api.voyageai.com/v1/rerank", {
+      method: "POST",
+      headers: {
+        "Content-Type": "application/json",
+        Authorization: `Bearer ${this.params.apiKey}`,
+      },
+      body: JSON.stringify({
+        query,
+        documents: chunks.map((chunk) => chunk.content),
+        model: this.params.model ?? "rerank-lite-1",
+      }),
+    });
+    const data = await resp.json();
+    const results = data.data.sort((a: any, b: any) => a.index - b.index);
+    return results.map((result: any) => result.relevance_score);
+  }
+}
@@ -1,216 +0,0 @@
-import { Chunk, ContextItem, ContextProviderExtras, ILLM, IndexTag } from "..";
-import { FullTextSearchCodebaseIndex } from "../indexing/FullTextSearch";
-import { LanceDbIndex } from "../indexing/LanceDbIndex";
-
-import { llmCanGenerateInParallel } from "../llm/autodetect";
-import { getBasename } from "../util";
-
-const RERANK_PROMPT = (
-  query: string,
-  documentId: string,
-  document: string,
-) => `You are an expert software developer responsible for helping detect whether the retrieved snippet of code is relevant to the query. For a given input, you need to output a single word: "Yes" or "No" indicating the retrieved snippet is relevant to the query.
-
-Query: Where is the FastAPI server?
-Snippet:
-\`\`\`/Users/andrew/Desktop/server/main.py
-from fastapi import FastAPI
-app = FastAPI()
-@app.get("/")
-def read_root():
-    return {{"Hello": "World"}}
-\`\`\`
-Relevant: Yes
-
-Query: Where in the documentation does it talk about the UI?
-Snippet:
-\`\`\`/Users/andrew/Projects/bubble_sort/src/lib.rs
-fn bubble_sort<T: Ord>(arr: &mut [T]) {{
-  for i in 0..arr.len() {{
-    for j in 1..arr.len() - i {{
-      if arr[j - 1] > arr[j] {{
-        arr.swap(j - 1, j);
-      }}
-    }}
-  }}
-}}
-\`\`\`
-Relevant: No
-
-Query: ${query}
-Snippet:
-\`\`\`${documentId}
-${document}
-\`\`\`
-Relevant:
-`;
-
-async function scoreChunk(
-  chunk: Chunk,
-  llm: ILLM,
-  query: string,
-): Promise<number> {
-  const completion = await llm.complete(
-    RERANK_PROMPT(query, getBasename(chunk.filepath), chunk.content),
-    {
-      maxTokens: 1,
-      model:
-        llm.providerName.startsWith("openai") && llm.model.startsWith("gpt-4")
-          ? "gpt-3.5-turbo"
-          : llm.model,
-    },
-  );
-
-  if (!completion) {
-    // TODO: Why is this happening?
-    return 0.0;
-  }
-
-  let answer = completion
-    .trim()
-    .toLowerCase()
-    .replace(/"/g, "")
-    .replace(/'/g, "");
-
-  if (answer === "yes") {
-    return 1.0;
-  } else if (answer === "no") {
-    return 0.0;
-  } else {
-    console.warn(
-      `Unexpected response from single token reranker: "${answer}". Expected "yes" or "no".`,
-    );
-    return 0.0;
-  }
-}
-
-async function rerank(
-  chunks: Chunk[],
-  llm: ILLM,
-  query: string,
-  nFinal: number,
-): Promise<Chunk[]> {
-  const scores = await Promise.all(
-    chunks.map((chunk) => scoreChunk(chunk, llm, query)),
-  );
-  const sorted = chunks
-    .map((chunk, i) => ({ chunk, score: scores[i] }))
-    .sort((a, b) => b.score - a.score);
-
-  return sorted.map((s) => s.chunk).slice(0, nFinal);
-}
-
-export async function retrieveContextItemsFromEmbeddings(
-  extras: ContextProviderExtras,
-  options: any | undefined,
-  filterDirectory: string | undefined,
-): Promise<ContextItem[]> {
-  if (!extras.embeddingsProvider) {
-    return [];
-  }
-
-  const nFinal = options?.nFinal || 10;
-  const useReranking =
-    llmCanGenerateInParallel(extras.llm.providerName, extras.llm.model) &&
-    (options?.useReranking === undefined ? false : options?.useReranking);
-  const nRetrieve = useReranking === false ? nFinal : options?.nRetrieve || 20;
-
-  const ftsIndex = new FullTextSearchCodebaseIndex();
-  const workspaceDirs = await extras.ide.getWorkspaceDirs();
-
-  if (workspaceDirs.length === 0) {
-    throw new Error("No workspace directories found");
-  }
-
-  const branches = (await Promise.race([
-    Promise.all(workspaceDirs.map((dir) => extras.ide.getBranch(dir))),
-    new Promise((resolve) => {
-      setTimeout(() => {
-        resolve(["NONE"]);
-      }, 500);
-    }),
-  ])) as string[];
-  const tags: (artifactId: string) => IndexTag[] = (artifactId: string) =>
-    workspaceDirs.map((directory, i) => ({
-      directory,
-      branch: branches[i],
-      artifactId,
-    }));
-
-  let ftsResults: Chunk[] = [];
-
-  try {
-    if (extras.fullInput.trim() !== "") {
-      ftsResults = await ftsIndex.retrieve(
-        tags(ftsIndex.artifactId),
-        extras.fullInput
-          .trim()
-          .split(" ")
-          .map((element) => `"${element}"`)
-          .join(" OR "),
-        nRetrieve / 2,
-        filterDirectory,
-        undefined,
-      );
-    }
-  } catch (e) {
-    console.warn("Error retrieving from FTS:", e);
-  }
-
-  const lanceDbIndex = new LanceDbIndex(extras.embeddingsProvider, (path) =>
-    extras.ide.readFile(path),
-  );
-  let vecResults = await lanceDbIndex.retrieve(
-    tags(lanceDbIndex.artifactId),
-    extras.fullInput,
-    nRetrieve,
-    filterDirectory,
-  );
-
-  // Now combine these (de-duplicate) and re-rank
-  let results = [...ftsResults];
-  for (const vecResult of vecResults) {
-    if (results.length >= nRetrieve) {
-      break;
-    }
-    if (
-      !ftsResults.find(
-        (r) =>
-          r.filepath === vecResult.filepath &&
-          r.startLine === vecResult.startLine &&
-          r.endLine === vecResult.endLine,
-      )
-    ) {
-      results.push(vecResult);
-    }
-  }
-
-  // Re-ranking
-  if (useReranking) {
-    results = await rerank(results, extras.llm, extras.fullInput, nFinal);
-  }
-
-  if (results.length === 0) {
-    throw new Error(
-      "Warning: No results found for @codebase context provider.",
-    );
-  }
-
-  return [
-    ...results.map((r) => {
-      const name = `${getBasename(r.filepath)} (${r.startLine}-${r.endLine})`;
-      const description = `${r.filepath} (${r.startLine}-${r.endLine})`;
-      return {
-        name,
-        description,
-        content: `\`\`\`${name}\n${r.content}\n\`\`\``,
-      };
-    }),
-    {
-      name: "Instructions",
-      description: "Instructions",
-      content:
-        "Use the above code to answer the following question. You should not reference any files outside of what is shown, unless they are commonly known files, like a .gitignore or package.json. Reference the filenames whenever possible. If there isn't enough information to answer the question, suggest where the user might look to learn more.",
-    },
-  ];
-}
@@ -0,0 +1,31 @@
+import { BranchAndDir, Chunk } from "../..";
+import { FullTextSearchCodebaseIndex } from "../../indexing/FullTextSearch";
+export async function retrieveFts(
+  query: string,
+  n: number,
+  tags: BranchAndDir[],
+  filterDirectory: string | undefined,
+): Promise<Chunk[]> {
+  const ftsIndex = new FullTextSearchCodebaseIndex();
+
+  let ftsResults: Chunk[] = [];
+  try {
+    if (query.trim() !== "") {
+      ftsResults = await ftsIndex.retrieve(
+        tags,
+        query
+          .trim()
+          .split(" ")
+          .map((element) => `"${element}"`)
+          .join(" OR "),
+        n,
+        filterDirectory,
+        undefined,
+      );
+    }
+    return ftsResults;
+  } catch (e) {
+    console.warn("Error retrieving from FTS:", e);
+    return [];
+  }
+}
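Note how the user's input becomes a full-text-search expression: each whitespace-separated token is quoted and the tokens are OR-joined. A standalone worked example of that transformation (pure string manipulation, copied from the function above):

```ts
// The same transformation retrieveFts applies before querying the FTS index:
const query = "config handler reranker";
const ftsQuery = query
  .trim()
  .split(" ")
  .map((element) => `"${element}"`)
  .join(" OR ");
// => '"config" OR "handler" OR "reranker"'
```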
@@ -0,0 +1,150 @@
+import { BranchAndDir, Chunk, ContextItem, ContextProviderExtras } from "../..";
+import { LanceDbIndex } from "../../indexing/LanceDbIndex";
+
+import { deduplicateArray, getBasename } from "../../util";
+import { RETRIEVAL_PARAMS } from "../../util/parameters";
+import { retrieveFts } from "./fullTextSearch";
+
+function deduplicateChunks(chunks: Chunk[]): Chunk[] {
+  return deduplicateArray(chunks, (a, b) => {
+    return (
+      a.filepath === b.filepath &&
+      a.startLine === b.startLine &&
+      a.endLine === b.endLine
+    );
+  });
+}
+
+export async function retrieveContextItemsFromEmbeddings(
+  extras: ContextProviderExtras,
+  options: any | undefined,
+  filterDirectory: string | undefined,
+): Promise<ContextItem[]> {
+  if (!extras.embeddingsProvider) {
+    return [];
+  }
+
+  const nFinal = options?.nFinal || RETRIEVAL_PARAMS.nFinal;
+  const useReranking = extras.reranker !== undefined;
+  const nRetrieve =
+    useReranking === false
+      ? nFinal
+      : options?.nRetrieve || RETRIEVAL_PARAMS.nRetrieve;
+
+  // Get tags to retrieve for
+  const workspaceDirs = await extras.ide.getWorkspaceDirs();
+
+  if (workspaceDirs.length === 0) {
+    throw new Error("No workspace directories found");
+  }
+
+  const branches = (await Promise.race([
+    Promise.all(workspaceDirs.map((dir) => extras.ide.getBranch(dir))),
+    new Promise((resolve) => {
+      setTimeout(() => {
+        resolve(["NONE"]);
+      }, 500);
+    }),
+  ])) as string[];
+  const tags: BranchAndDir[] = workspaceDirs.map((directory, i) => ({
+    directory,
+    branch: branches[i],
+  }));
+
+  // Get all retrieval results
+  const retrievalResults: Chunk[] = [];
+
+  // Source: Full-text search
+  let ftsResults = await retrieveFts(
+    extras.fullInput,
+    nRetrieve / 2,
+    tags,
+    filterDirectory,
+  );
+  retrievalResults.push(...ftsResults);
+
+  // Source: expansion with code graph
+  // consider doing this after reranking? Or just having a lower reranking threshold
+  // This is VS Code only until we use PSI for JetBrains or build our own general solution
+  if ((await extras.ide.getIdeInfo()).ideType === "vscode") {
+    const { expandSnippet } = await import(
+      "../../../extensions/vscode/src/util/expandSnippet"
+    );
+    let expansionResults = (
+      await Promise.all(
+        extras.selectedCode.map(async (rif) => {
+          return expandSnippet(
+            rif.filepath,
+            rif.range.start.line,
+            rif.range.end.line,
+            extras.ide,
+          );
+        }),
+      )
+    ).flat() as Chunk[];
+    retrievalResults.push(...expansionResults);
+  }
+
+  // Source: Open file exact match
+  // Source: Class/function name exact match
+
+  // Source: Embeddings
+  const lanceDbIndex = new LanceDbIndex(extras.embeddingsProvider, (path) =>
+    extras.ide.readFile(path),
+  );
+  let vecResults = await lanceDbIndex.retrieve(
+    extras.fullInput,
+    nRetrieve,
+    tags,
+    filterDirectory,
+  );
+  retrievalResults.push(...vecResults);
+
+  // De-duplicate
+  let results: Chunk[] = deduplicateChunks(retrievalResults);
+
+  // Re-rank
+  if (useReranking && extras.reranker) {
+    let scores: number[] = await extras.reranker.rerank(
+      extras.fullInput,
+      results,
+    );
+
+    // Filter out low-scoring results
+    results = results.filter(
+      (_, i) => scores[i] >= RETRIEVAL_PARAMS.rerankThreshold,
+    );
+    scores = scores.filter(
+      (score) => score >= RETRIEVAL_PARAMS.rerankThreshold,
+    );
+
+    results.sort(
+      (a, b) => scores[results.indexOf(b)] - scores[results.indexOf(a)],
+    );
+    results = results.slice(0, nFinal);
+  }
+
+  if (results.length === 0) {
+    throw new Error(
+      "Warning: No results found for @codebase context provider.",
+    );
+  }
+
+  return [
+    ...results.map((r) => {
+      const name = `${getBasename(r.filepath)} (${r.startLine}-${r.endLine})`;
+      const description = `${r.filepath} (${r.startLine}-${r.endLine})`;
+      return {
+        name,
+        description,
+        content: `\`\`\`${name}\n${r.content}\n\`\`\``,
+      };
+    }),
+    {
+      name: "Instructions",
+      description: "Instructions",
+      content:
+        "Use the above code to answer the following question. You should not reference any files outside of what is shown, unless they are commonly known files, like a .gitignore or package.json. Reference the filenames whenever possible. If there isn't enough information to answer the question, suggest where the user might look to learn more.",
+    },
+  ];
+}
@@ -0,0 +1,28 @@
+import { Chunk } from "..";
+
+export interface EmbeddingsCacheChunk {
+  vector: number[];
+  startLine: number;
+  endLine: number;
+  contents: string;
+}
+
+interface ArtifactReturnTypes {
+  chunks: Chunk[];
+  embeddings: EmbeddingsCacheChunk[];
+}
+
+export type ArtifactType = keyof ArtifactReturnTypes;
+
+export interface EmbeddingsCacheResponse<T extends ArtifactType> {
+  files: { [cacheKey: string]: ArtifactReturnTypes[T] };
+}
+
+export interface IContinueServerClient {
+  getConfig(): Promise<{ configJson: string; configJs: string }>;
+  getFromIndexCache<T extends ArtifactType>(
+    keys: string[],
+    artifactId: T,
+    repoName: string | undefined,
+  ): Promise<EmbeddingsCacheResponse<T>>;
+}
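The `ArtifactReturnTypes[T]` lookup ties the `artifactId` argument to the shape of the cached payload, so callers get a precisely typed response without casts. A small sketch of how the inference plays out:

```ts
// Sketch: the generic parameter is inferred from the artifactId literal,
// so the payload type narrows automatically.
async function demo(client: IContinueServerClient) {
  const chunksResp = await client.getFromIndexCache(["key1"], "chunks", undefined);
  // chunksResp.files["key1"]: Chunk[]

  const embResp = await client.getFromIndexCache(["key1"], "embeddings", undefined);
  // embResp.files["key1"]: EmbeddingsCacheChunk[]
}
```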
@@ -0,0 +1,24 @@
+import {
+  ArtifactType,
+  EmbeddingsCacheResponse,
+  IContinueServerClient,
+} from "../interface";
+
+export class ContinueServerClient implements IContinueServerClient {
+  constructor(
+    private readonly serverUrl: string,
+    private readonly userToken: Promise<string | undefined>,
+  ) {}
+
+  public async getConfig(): Promise<{ configJson: string; configJs: string }> {
+    throw new Error("Not Implemented");
+  }
+
+  public async getFromIndexCache<T extends ArtifactType>(
+    keys: string[],
+    artifactId: T,
+    repoName: string | undefined,
+  ): Promise<EmbeddingsCacheResponse<T>> {
+    return { files: {} };
+  }
+}

@@ -0,0 +1,3 @@
+export function getHeaders() {
+  return {};
+}
@@ -14,6 +14,8 @@ export async function* streamDiff(
   newLines: LineStream,
 ): AsyncGenerator<DiffLine> {
   const mutatedOldLines = [...oldLines]; // be careful

+  // If one indentation mistake is made, others are likely. So we are more permissive about matching
+  let seenIndentationMistake = false;
+
   let newLineResult = await newLines.next();

@@ -8,16 +8,23 @@ function linesMatchPerfectly(lineA: string, lineB: string): boolean {
   return lineA === lineB && lineA !== "";
 }

-function linesMatch(lineA: string, lineB: string): boolean {
+const END_BRACKETS = ["}", "});", "})"];
+
+function linesMatch(
+  lineA: string,
+  lineB: string,
+  linesBetween: number = 0,
+): boolean {
   // Require a perfect (without padding) match for these lines
   // Otherwise they are edit distance 1 from empty lines and other single char lines (e.g. each other)
-  if (["}", "*"].includes(lineA.trim())) {
+  if (["}", "*", "});", "})"].includes(lineA.trim())) {
     return lineA.trim() === lineB.trim();
   }

   const d = distance(lineA, lineB);
   return (
-    (d / Math.max(lineA.length, lineB.length) < 0.5 ||
+    // Should be more unlikely for lines to fuzzy match if they are further away
+    (d / Math.max(lineA.length, lineB.length) < 0.5 - linesBetween * 0.05 ||
       lineA.trim() === lineB.trim()) &&
    lineA.trim() !== ""
   );
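The new `linesBetween` term tightens the fuzzy-match threshold as candidate lines get farther apart: the allowed normalized edit distance starts at 0.5 and drops by 0.05 per intervening line. A quick worked example of the arithmetic:

```ts
// For two 40-character lines with edit distance 12 (12 / 40 = 0.3):
//   linesBetween = 0 -> threshold 0.50 -> 0.3 < 0.50, fuzzy match
//   linesBetween = 3 -> threshold 0.35 -> 0.3 < 0.35, still a match
//   linesBetween = 5 -> threshold 0.25 -> 0.3 >= 0.25, no fuzzy match
const threshold = (linesBetween: number) => 0.5 - linesBetween * 0.05;
```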
@@ -37,10 +44,16 @@ export function matchLine(
     return [0, true, newLine.trim()];
   }

+  const isEndBracket = END_BRACKETS.includes(newLine.trim());
   for (let i = 0; i < oldLines.length; i++) {
+    // Don't match end bracket lines if too far away
+    if (i > 4 && isEndBracket) {
+      return [-1, false, newLine];
+    }
+
     if (linesMatchPerfectly(newLine, oldLines[i])) {
       return [i, true, newLine];
-    } else if (linesMatch(newLine, oldLines[i])) {
+    } else if (linesMatch(newLine, oldLines[i], i)) {
       // This is a way to fix indentation, but only for sufficiently long lines to avoid matching whitespace or short lines
       if (
         newLine.trimStart() === oldLines[i].trimStart() &&
@@ -43,6 +43,14 @@ export interface LLMReturnValue {
   prompt: string;
   completion: string;
 }
+
+export type PromptTemplate =
+  | string
+  | ((
+      history: ChatMessage[],
+      otherData: Record<string, string>,
+    ) => string | ChatMessage[]);
+
 export interface ILLM extends LLMOptions {
   get providerName(): ModelProvider;
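`PromptTemplate` now covers both plain-string templates and functions that build a prompt (or a full chat transcript) from history plus extra data. A minimal example of the function form (the template body and the `codeToEdit` key are illustrative, not necessarily what the project passes in `otherData`):

```ts
// Illustrative function-form template: receives the chat history and extra
// data, and may return either a string or a ChatMessage[] transcript.
const editTemplate: PromptTemplate = (history, otherData) => [
  ...history,
  { role: "user", content: `Rewrite this code:\n${otherData.codeToEdit}` },
];
```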
@@ -90,7 +98,18 @@ export interface ILLM extends LLMOptions {

   supportsImages(): boolean;

+  supportsCompletions(): boolean;
+
+  supportsPrefill(): boolean;
+
   listModels(): Promise<string[]>;
+
+  renderPromptTemplate(
+    template: PromptTemplate,
+    history: ChatMessage[],
+    otherData: Record<string, string>,
+    canPutWordsInModelsMouth?: boolean,
+  ): string | ChatMessage[];
 }

 export type ContextProviderType = "normal" | "query" | "submenu";

@@ -106,6 +125,7 @@ export interface ContextProviderDescription {
 export interface ContextProviderExtras {
   fullInput: string;
   embeddingsProvider: EmbeddingsProvider;
+  reranker: Reranker | undefined;
   llm: ILLM;
   ide: IDE;
   selectedCode: RangeInFile[];

@@ -231,9 +251,14 @@ export interface ContextItemWithId {
   editable?: boolean;
 }

+export interface InputModifiers {
+  useCodebase: boolean;
+}
+
 export interface ChatHistoryItem {
   message: ChatMessage;
   editorState?: any;
+  modifiers?: InputModifiers;
   contextItems: ContextItemWithId[];
   promptLogs?: [string, string][]; // [prompt, completion]
 }

@@ -243,7 +268,6 @@ export type ChatHistory = ChatHistoryItem[];
 // LLM

 export interface LLMFullCompletionOptions extends BaseCompletionOptions {
   raw?: boolean;
   log?: boolean;
-
   model?: string;

@@ -265,6 +289,8 @@ export interface LLMOptions {
   apiKey?: string;
   apiBase?: string;

+  useLegacyCompletionsEndpoint?: boolean;
+
   // Azure options
   engine?: string;
   apiVersion?: string;
@@ -331,11 +357,15 @@ export interface IdeInfo {
   name: string;
   version: string;
   remoteName: string;
+  extensionVersion: string;
 }

-export interface IndexTag {
-  directory: string;
+export interface BranchAndDir {
   branch: string;
+  directory: string;
+}
+
+export interface IndexTag extends BranchAndDir {
   artifactId: string;
 }
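`BranchAndDir` now carries just the checkout identity, while `IndexTag` layers the artifact dimension on top, so retrieval paths that don't care which index artifact produced a chunk can take the narrower type. A sketch of the relationship:

```ts
// A retrieval call only needs the branch + directory pair...
const where: BranchAndDir = { branch: "main", directory: "/repo" };

// ...while indexing still tags rows with the artifact that produced them
// ("chunks" per the ChunkCodebaseIndex linking noted in the diffs below).
const tag: IndexTag = { ...where, artifactId: "chunks" };
```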
@@ -381,6 +411,7 @@ export interface IDE {
   getBranch(dir: string): Promise<string>;
   getStats(directory: string): Promise<{ [path: string]: number }>;
   getTags(artifactId: string): Promise<IndexTag[]>;
+  getRepoName(dir: string): Promise<string | undefined>;
 }

 // Slash Commands

@@ -430,7 +461,6 @@ type ContextProviderName =
   | "open"
   | "google"
   | "search"
-  | "url"
   | "tree"
   | "http"
   | "codebase"

@@ -479,7 +509,8 @@ type ModelProvider =
   | "mistral"
   | "bedrock"
   | "deepinfra"
-  | "flowise";
+  | "flowise"
+  | "groq";

 export type ModelName =
   | "AUTODETECT"

@@ -496,6 +527,7 @@ export type ModelName =
   | "mistral-8x7b"
   | "llama2-7b"
   | "llama2-13b"
+  | "llama2-70b"
   | "codellama-7b"
   | "codellama-13b"
   | "codellama-34b"

@@ -520,6 +552,7 @@ export type ModelName =
   | "chat-bison-001"
   // Gemini
   | "gemini-pro"
+  | "gemini-1.5-pro-latest"
   // Mistral
   | "mistral-tiny"
   | "mistral-small"

@@ -528,6 +561,7 @@ export type ModelName =
   | "deepseek-1b"
   | "starcoder-1b"
   | "starcoder-3b"
+  | "starcoder2-3b"
   | "stable-code-3b";

 export interface RequestOptions {

@@ -573,6 +607,8 @@ interface BaseCompletionOptions {
   maxTokens?: number;
   numThreads?: number;
   keepAlive?: number;
+  raw?: boolean;
+  stream?: boolean;
 }

 export interface ModelDescription {
@@ -589,7 +625,11 @@ export interface ModelDescription {
   promptTemplates?: { [key: string]: string };
 }

-export type EmbeddingsProviderName = "transformers.js" | "ollama" | "openai";
+export type EmbeddingsProviderName =
+  | "transformers.js"
+  | "ollama"
+  | "openai"
+  | "free-trial";

 export interface EmbedOptions {
   apiBase?: string;

@@ -606,7 +646,20 @@ export interface EmbeddingsProvider {
   embed(chunks: string[]): Promise<number[][]>;
 }

+export type RerankerName = "voyage" | "llm" | "free-trial";
+
+export interface RerankerDescription {
+  name: RerankerName;
+  params?: { [key: string]: any };
+}
+
+export interface Reranker {
+  name: string;
+  rerank(query: string, chunks: Chunk[]): Promise<number[]>;
+}
+
 export interface TabAutocompleteOptions {
   disable: boolean;
   useCopyBuffer: boolean;
   useSuffix: boolean;
   maxPromptTokens: number;
@@ -628,6 +681,17 @@ export interface ContinueUIConfig {
   codeBlockToolbarPosition?: "top" | "bottom";
 }

+interface ContextMenuConfig {
+  comment?: string;
+  docstring?: string;
+  fix?: string;
+  optimize?: string;
+  fixGrammar?: string;
+}
+
+interface ExperimantalConfig {
+  contextMenuPrompts?: ContextMenuConfig;
+}
+
 export interface SerializedContinueConfig {
   env?: string[];
   allowAnonymousTelemetry?: boolean;

@@ -644,6 +708,8 @@ export interface SerializedContinueConfig {
   tabAutocompleteModel?: ModelDescription;
   tabAutocompleteOptions?: Partial<TabAutocompleteOptions>;
   ui?: ContinueUIConfig;
+  reranker?: RerankerDescription;
+  experimental?: ExperimantalConfig;
 }

 export type ConfigMergeType = "merge" | "overwrite";

@@ -684,6 +750,10 @@ export interface Config {
   tabAutocompleteOptions?: Partial<TabAutocompleteOptions>;
   /** UI styles customization */
   ui?: ContinueUIConfig;
+  /** Options for the reranker */
+  reranker?: RerankerDescription | Reranker;
+  /** Experimental configuration */
+  experimental?: ExperimantalConfig;
 }

 export interface ContinueConfig {

@@ -700,6 +770,8 @@ export interface ContinueConfig {
   tabAutocompleteModel?: ILLM;
   tabAutocompleteOptions?: Partial<TabAutocompleteOptions>;
   ui?: ContinueUIConfig;
+  reranker?: Reranker;
+  experimental?: ExperimantalConfig;
 }

 export interface BrowserSerializedContinueConfig {

@@ -714,4 +786,6 @@ export interface BrowserSerializedContinueConfig {
   userToken?: string;
   embeddingsProvider?: string;
   ui?: ContinueUIConfig;
+  reranker?: RerankerDescription;
+  experimental?: ExperimantalConfig;
 }
@@ -96,6 +96,7 @@ export class CodeSnippetsCodebaseIndex implements CodebaseIndex {
     tag: IndexTag,
     results: RefreshIndexResults,
     markComplete: MarkCompleteCallback,
+    repoName: string | undefined,
   ): AsyncGenerator<IndexingProgressUpdate, any, unknown> {
     const db = await SqliteDb.get();
     await CodeSnippetsCodebaseIndex._createTables(db);

@@ -1,4 +1,6 @@
-import { Chunk, IndexTag, IndexingProgressUpdate } from "..";
+import { BranchAndDir, Chunk, IndexTag, IndexingProgressUpdate } from "..";
+import { RETRIEVAL_PARAMS } from "../util/parameters";
+import { ChunkCodebaseIndex } from "./chunk/ChunkCodebaseIndex";
 import { DatabaseConnection, SqliteDb, tagToString } from "./refreshIndex";
 import {
   CodebaseIndex,

@@ -31,6 +33,7 @@ export class FullTextSearchCodebaseIndex implements CodebaseIndex {
     tag: IndexTag,
     results: RefreshIndexResults,
     markComplete: MarkCompleteCallback,
+    repoName: string | undefined,
   ): AsyncGenerator<IndexingProgressUpdate, any, unknown> {
     const db = await SqliteDb.get();
     await this._createTables(db);
@ -85,14 +88,19 @@ export class FullTextSearchCodebaseIndex implements CodebaseIndex {
|
|||
}
|
||||
|
||||
async retrieve(
|
||||
tags: IndexTag[],
|
||||
tags: BranchAndDir[],
|
||||
text: string,
|
||||
n: number,
|
||||
directory: string | undefined,
|
||||
filterPaths: string[] | undefined,
|
||||
bm25Threshold: number = RETRIEVAL_PARAMS.bm25Threshold,
|
||||
): Promise<Chunk[]> {
|
||||
const db = await SqliteDb.get();
|
||||
const tagStrings = tags.map(tagToString);
|
||||
|
||||
// Notice that the "chunks" artifactId is used because of linking between tables
|
||||
const tagStrings = tags.map((tag) => {
|
||||
return tagToString({ ...tag, artifactId: ChunkCodebaseIndex.artifactId });
|
||||
});
|
||||
|
||||
const query = `SELECT fts_metadata.chunkId, fts_metadata.path, fts.content, rank
|
||||
FROM fts
|
||||
|
@ -110,12 +118,14 @@ export class FullTextSearchCodebaseIndex implements CodebaseIndex {
|
|||
ORDER BY rank
|
||||
LIMIT ?`;
|
||||
|
||||
const results = await db.all(query, [
|
||||
let results = await db.all(query, [
|
||||
...tagStrings,
|
||||
...(filterPaths || []),
|
||||
n,
|
||||
]);
|
||||
|
||||
results = results.filter((result) => result.rank <= bm25Threshold);
|
||||
|
||||
const chunks = await db.all(
|
||||
`SELECT * FROM chunks WHERE id IN (${results.map(() => "?").join(",")})`,
|
||||
results.map((result) => result.chunkId),
|
||||
|
|
|
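For readers skimming the retrieval change: SQLite FTS5 exposes BM25 relevance through the `rank` column, where more-negative values are better matches, so filtering on `rank <= bm25Threshold` keeps only sufficiently strong hits. A minimal standalone sketch, assuming an FTS5 table named `fts` and the `sqlite`/`sqlite3` npm packages rather than the extension's own wrapper:

```ts
import { open } from "sqlite";
import sqlite3 from "sqlite3";

async function searchWithThreshold(text: string, n: number, bm25Threshold = -2.5) {
  const db = await open({ filename: "index.sqlite", driver: sqlite3.Database });
  // In FTS5, rank is a negative BM25 score: lower (more negative) is better.
  const rows = await db.all(
    `SELECT rowid, content, rank FROM fts WHERE fts MATCH ? ORDER BY rank LIMIT ?`,
    [text, n],
  );
  // Keep only matches at least as good as the threshold, mirroring
  // results.filter((r) => r.rank <= bm25Threshold) above.
  return rows.filter((row) => row.rank <= bm25Threshold);
}
```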
@@ -1,11 +1,14 @@
// NOTE: vectordb requirement must be listed in extensions/vscode to avoid error
import { v4 as uuidv4 } from "uuid";
import { Table } from "vectordb";
import {
  BranchAndDir,
  Chunk,
  EmbeddingsProvider,
  IndexTag,
  IndexingProgressUpdate,
} from "..";
import { ContinueServerClient } from "../continueServer/stubs/client";
import { MAX_CHUNK_SIZE } from "../llm/constants";
import { getBasename } from "../util";
import { getLanceDbPath } from "../util/paths";

@@ -34,16 +37,11 @@ export class LanceDbIndex implements CodebaseIndex {

  static MAX_CHUNK_SIZE = MAX_CHUNK_SIZE;

  embeddingsProvider: EmbeddingsProvider;
  readFile: (filepath: string) => Promise<string>;

  constructor(
    embeddingsProvider: EmbeddingsProvider,
    readFile: (filepath: string) => Promise<string>,
  ) {
    this.embeddingsProvider = embeddingsProvider;
    this.readFile = readFile;
  }
    private readonly embeddingsProvider: EmbeddingsProvider,
    private readonly readFile: (filepath: string) => Promise<string>,
    private readonly continueServerClient?: ContinueServerClient,
  ) {}

  private tableNameForTag(tag: IndexTag) {
    return tagToString(tag)

@@ -136,6 +134,7 @@ export class LanceDbIndex implements CodebaseIndex {
      items: PathAndCacheKey[],
      resultType: IndexResultType,
    ) => void,
    repoName: string | undefined,
  ): AsyncGenerator<IndexingProgressUpdate> {
    const lancedb = await import("vectordb");
    const tableName = this.tableNameForTag(tag);

@@ -145,11 +144,92 @@ export class LanceDbIndex implements CodebaseIndex {
    await this.createSqliteCacheTable(sqlite);

    // Compute
    let table = undefined;
    let table: Table<number[]> | undefined = undefined;
    let needToCreateTable = true;
    const existingTables = await db.tableNames();
    let computedRows: LanceDbRow[] = [];

    const addComputedLanceDbRows = async (
      pathAndCacheKey: PathAndCacheKey,
      computedRows: LanceDbRow[],
    ) => {
      // Create table if needed, add computed rows
      if (table) {
        if (computedRows.length > 0) {
          await table.add(computedRows);
        }
      } else if (existingTables.includes(tableName)) {
        table = await db.openTable(tableName);
        needToCreateTable = false;
        if (computedRows.length > 0) {
          await table.add(computedRows);
        }
      } else if (computedRows.length > 0) {
        table = await db.createTable(tableName, computedRows);
        needToCreateTable = false;
      }

      // Mark item complete
      markComplete([pathAndCacheKey], IndexResultType.Compute);
    };

    // Check remote cache
    if (this.continueServerClient !== undefined) {
      try {
        const keys = results.compute.map(({ cacheKey }) => cacheKey);
        const resp = await this.continueServerClient.getFromIndexCache(
          keys,
          "embeddings",
          repoName,
        );
        for (const [cacheKey, chunks] of Object.entries(resp.files)) {
          // Get path for cacheKey
          const path = results.compute.find(
            (item) => item.cacheKey === cacheKey,
          )?.path;
          if (!path) {
            console.warn(
              "Continue server sent a cacheKey that wasn't requested",
              cacheKey,
            );
            continue;
          }

          // Build LanceDbRow objects
          const rows: LanceDbRow[] = [];
          for (const chunk of chunks) {
            const row = {
              path,
              cachekey: cacheKey,
              uuid: uuidv4(),
              vector: chunk.vector,
            };
            rows.push(row);

            await sqlite.run(
              "INSERT INTO lance_db_cache (uuid, cacheKey, path, vector, startLine, endLine, contents) VALUES (?, ?, ?, ?, ?, ?, ?)",
              row.uuid,
              row.cachekey,
              row.path,
              JSON.stringify(row.vector),
              chunk.startLine,
              chunk.endLine,
              chunk.contents,
            );
          }

          await addComputedLanceDbRows({ cacheKey, path }, rows);
        }

        // Remove items that don't need to be recomputed
        results.compute = results.compute.filter(
          (item) => !resp.files[item.cacheKey],
        );
      } catch (e) {
        console.log("Error checking remote cache: ", e);
      }
    }

    let computedRows: LanceDbRow[] = [];
    for await (const update of this.computeChunks(results.compute)) {
      if (Array.isArray(update)) {
        const [progress, row, data, desc] = update;

@@ -169,26 +249,8 @@ export class LanceDbIndex implements CodebaseIndex {

        yield { progress, desc };
      } else {
        // Create table if needed, add computed rows
        if (table) {
          if (computedRows.length > 0) {
            await table.add(computedRows);
          }
        } else if (existingTables.includes(tableName)) {
          table = await db.openTable(tableName);
          needToCreateTable = false;
          if (computedRows.length > 0) {
            await table.add(computedRows);
          }
        } else if (computedRows.length > 0) {
          table = await db.createTable(tableName, computedRows);
          needToCreateTable = false;
        }

        await addComputedLanceDbRows(update, computedRows);
        computedRows = [];

        // Mark item complete
        markComplete([update], IndexResultType.Compute);
      }
    }

@@ -268,21 +330,27 @@ export class LanceDbIndex implements CodebaseIndex {
  }

  async retrieve(
    tags: IndexTag[],
    text: string,
    query: string,
    n: number,
    directory: string | undefined,
    tags: BranchAndDir[],
    filterDirectory: string | undefined,
  ): Promise<Chunk[]> {
    const lancedb = await import("vectordb");
    if (!lancedb.connect) {
      throw new Error("LanceDB failed to load a native module");
    }
    const [vector] = await this.embeddingsProvider.embed([text]);
    const [vector] = await this.embeddingsProvider.embed([query]);
    const db = await lancedb.connect(getLanceDbPath());

    let allResults = [];
    for (const tag of tags) {
      const results = await this._retrieveForTag(tag, n, directory, vector, db);
      const results = await this._retrieveForTag(
        { ...tag, artifactId: this.artifactId },
        n,
        filterDirectory,
        vector,
        db,
      );
      allResults.push(...results);
    }
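The remote-cache handshake above follows a simple pattern: send all cache keys, ingest whatever the server already has, then drop those items from the local compute list. A sketch with a simplified, hypothetical client interface:

```ts
// The CacheClient shape here is a stand-in for ContinueServerClient, reduced
// to the one call this pattern needs.
interface CacheClient {
  getFromIndexCache(
    keys: string[],
    artifact: string,
    repoName: string | undefined,
  ): Promise<{ files: { [cacheKey: string]: unknown[] } }>;
}

async function resolveFromRemoteCache(
  client: CacheClient | undefined,
  compute: { path: string; cacheKey: string }[],
  repoName: string | undefined,
): Promise<{ path: string; cacheKey: string }[]> {
  if (!client) return compute;
  try {
    const resp = await client.getFromIndexCache(
      compute.map(({ cacheKey }) => cacheKey),
      "embeddings",
      repoName,
    );
    // Anything the server returned no longer needs local embedding.
    return compute.filter((item) => !resp.files[item.cacheKey]);
  } catch (e) {
    // Cache failures are non-fatal; fall back to computing everything.
    console.log("Error checking remote cache: ", e);
    return compute;
  }
}
```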
@@ -1,4 +1,5 @@
import { IndexTag, IndexingProgressUpdate } from "../..";
import { Chunk, IndexTag, IndexingProgressUpdate } from "../..";
import { ContinueServerClient } from "../../continueServer/stubs/client";
import { MAX_CHUNK_SIZE } from "../../llm/constants";
import { getBasename } from "../../util";
import { DatabaseConnection, SqliteDb, tagToString } from "../refreshIndex";

@@ -14,8 +15,10 @@ export class ChunkCodebaseIndex implements CodebaseIndex {
  static artifactId: string = "chunks";
  artifactId: string = ChunkCodebaseIndex.artifactId;

  readFile: (filepath: string) => Promise<string>;
  constructor(readFile: (filepath: string) => Promise<string>) {
  constructor(
    private readonly readFile: (filepath: string) => Promise<string>,
    private readonly continueServerClient?: ContinueServerClient,
  ) {
    this.readFile = readFile;
  }

@@ -42,11 +45,54 @@ export class ChunkCodebaseIndex implements CodebaseIndex {
    tag: IndexTag,
    results: RefreshIndexResults,
    markComplete: MarkCompleteCallback,
    repoName: string | undefined,
  ): AsyncGenerator<IndexingProgressUpdate, any, unknown> {
    const db = await SqliteDb.get();
    await this._createTables(db);
    const tagString = tagToString(tag);

    async function handleChunk(chunk: Chunk) {
      const { lastID } = await db.run(
        `INSERT INTO chunks (cacheKey, path, idx, startLine, endLine, content) VALUES (?, ?, ?, ?, ?, ?)`,
        [
          chunk.digest,
          chunk.filepath,
          chunk.index,
          chunk.startLine,
          chunk.endLine,
          chunk.content,
        ],
      );

      await db.run(`INSERT INTO chunk_tags (chunkId, tag) VALUES (?, ?)`, [
        lastID,
        tagString,
      ]);
    }

    // Check the remote cache
    if (this.continueServerClient !== undefined) {
      try {
        const keys = results.compute.map(({ cacheKey }) => cacheKey);
        const resp = await this.continueServerClient.getFromIndexCache(
          keys,
          "chunks",
          repoName,
        );

        for (const [cacheKey, chunks] of Object.entries(resp.files)) {
          for (const chunk of chunks) {
            await handleChunk(chunk);
          }
        }
        results.compute = results.compute.filter(
          (item) => !resp.files[item.cacheKey],
        );
      } catch (e) {
        console.error("Failed to fetch from remote cache: ", e);
      }
    }

    // Compute chunks for new files
    const contents = await Promise.all(
      results.compute.map(({ path }) => this.readFile(path)),

@@ -61,22 +107,7 @@ export class ChunkCodebaseIndex implements CodebaseIndex {
      MAX_CHUNK_SIZE,
      item.cacheKey,
    )) {
      const { lastID } = await db.run(
        `INSERT INTO chunks (cacheKey, path, idx, startLine, endLine, content) VALUES (?, ?, ?, ?, ?, ?)`,
        [
          chunk.digest,
          chunk.filepath,
          chunk.index,
          chunk.startLine,
          chunk.endLine,
          chunk.content,
        ],
      );

      await db.run(`INSERT INTO chunk_tags (chunkId, tag) VALUES (?, ?)`, [
        lastID,
        tagString,
      ]);
      handleChunk(chunk);
    }

    yield {
@@ -0,0 +1,55 @@
import fetch, { Response } from "node-fetch";
import { EmbedOptions } from "../..";
import { getHeaders } from "../../continueServer/stubs/headers";
import { SERVER_URL } from "../../util/parameters";
import { withExponentialBackoff } from "../../util/withExponentialBackoff";
import BaseEmbeddingsProvider from "./BaseEmbeddingsProvider";

class FreeTrialEmbeddingsProvider extends BaseEmbeddingsProvider {
  static maxBatchSize = 128;
  static defaultOptions: Partial<EmbedOptions> | undefined = {
    model: "voyage-code-2",
  };

  get id(): string {
    return FreeTrialEmbeddingsProvider.defaultOptions!.model!;
  }

  async embed(chunks: string[]) {
    const batchedChunks = [];
    for (
      let i = 0;
      i < chunks.length;
      i += FreeTrialEmbeddingsProvider.maxBatchSize
    ) {
      batchedChunks.push(
        chunks.slice(i, i + FreeTrialEmbeddingsProvider.maxBatchSize),
      );
    }
    return (
      await Promise.all(
        batchedChunks.map(async (batch) => {
          const fetchWithBackoff = () =>
            withExponentialBackoff<Response>(() =>
              fetch(new URL("embeddings", SERVER_URL), {
                method: "POST",
                body: JSON.stringify({
                  input: batch,
                  model: this.options.model,
                }),
                headers: {
                  "Content-Type": "application/json",
                  ...getHeaders(),
                },
              }),
            );
          const resp = await fetchWithBackoff();
          const data = (await resp.json()) as any;
          return data.embeddings;
        }),
      )
    ).flat();
  }
}

export default FreeTrialEmbeddingsProvider;
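The batching loop above appears in both embeddings providers; factored out, it is just a fixed-size chunking helper. A sketch (not part of the codebase):

```ts
// Split items into batches of at most maxBatchSize, preserving order.
function toBatches<T>(items: T[], maxBatchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += maxBatchSize) {
    batches.push(items.slice(i, i + maxBatchSize));
  }
  return batches;
}

// toBatches(["a", "b", "c"], 2) -> [["a", "b"], ["c"]]
```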
@@ -1,8 +1,13 @@
import fetch, { Response } from "node-fetch";
import { EmbedOptions } from "../..";
import { withExponentialBackoff } from "../../util/withExponentialBackoff";
import BaseEmbeddingsProvider from "./BaseEmbeddingsProvider";

class OpenAIEmbeddingsProvider extends BaseEmbeddingsProvider {
  // https://platform.openai.com/docs/api-reference/embeddings/create is 2048
  // but Voyage is 128
  static maxBatchSize = 128;

  static defaultOptions: Partial<EmbedOptions> | undefined = {
    apiBase: "https://api.openai.com/v1/",
    model: "text-embedding-3-small",

@@ -13,27 +18,50 @@ class OpenAIEmbeddingsProvider extends BaseEmbeddingsProvider {
  }

  async embed(chunks: string[]) {
    return await Promise.all(
      chunks.map(async (chunk) => {
        const fetchWithBackoff = () =>
          withExponentialBackoff<Response>(() =>
            fetch(new URL("embeddings", this.options.apiBase).toString(), {
              method: "POST",
              body: JSON.stringify({
                input: chunk,
                model: this.options.model,
    if (!this.options.apiBase?.endsWith("/")) {
      this.options.apiBase += "/";
    }

    const batchedChunks = [];
    for (
      let i = 0;
      i < chunks.length;
      i += OpenAIEmbeddingsProvider.maxBatchSize
    ) {
      batchedChunks.push(
        chunks.slice(i, i + OpenAIEmbeddingsProvider.maxBatchSize),
      );
    }
    return (
      await Promise.all(
        batchedChunks.map(async (batch) => {
          const fetchWithBackoff = () =>
            withExponentialBackoff<Response>(() =>
              fetch(new URL("embeddings", this.options.apiBase), {
                method: "POST",
                body: JSON.stringify({
                  input: batch,
                  model: this.options.model,
                }),
                headers: {
                  Authorization: `Bearer ${this.options.apiKey}`,
                  "Content-Type": "application/json",
                },
              }),
              headers: {
                Authorization: `Bearer ${this.options.apiKey}`,
                "Content-Type": "application/json",
              },
            }),
          );
          const resp = await fetchWithBackoff();

          if (!resp.ok) {
            throw new Error(await resp.text());
          }

          const data = (await resp.json()) as any;
          return data.data.map(
            (result: { embedding: number[] }) => result.embedding,
          );
          const resp = await fetchWithBackoff();
          const data = await resp.json();
          return data.data[0].embedding;
        }),
      );
      }),
    )
    ).flat();
  }
}
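`withExponentialBackoff` is only referenced here, not defined, so the following is an assumed shape: retry the wrapped operation with doubling delays until it succeeds or the attempt budget runs out.

```ts
// A plausible sketch of withExponentialBackoff; parameter names and defaults
// are assumptions, not the utility's actual signature.
async function withExponentialBackoff<T>(
  operation: () => Promise<T>,
  maxAttempts = 5,
  initialDelayMs = 500,
): Promise<T> {
  let delay = initialDelayMs;
  for (let attempt = 1; ; attempt++) {
    try {
      return await operation();
    } catch (err) {
      if (attempt >= maxAttempts) throw err;
      await new Promise((resolve) => setTimeout(resolve, delay));
      delay *= 2; // double the wait before the next try
    }
  }
}
```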
@@ -1,4 +1,5 @@
import { EmbeddingsProviderName } from "../..";
import FreeTrialEmbeddingsProvider from "./FreeTrialEmbeddingsProvider";
import OllamaEmbeddingsProvider from "./OllamaEmbeddingsProvider";
import OpenAIEmbeddingsProvider from "./OpenAIEmbeddingsProvider";
import TransformersJsEmbeddingsProvider from "./TransformersJsEmbeddingsProvider";

@@ -9,4 +10,5 @@ export const AllEmbeddingsProviders: {
  ollama: OllamaEmbeddingsProvider,
  "transformers.js": TransformersJsEmbeddingsProvider,
  openai: OpenAIEmbeddingsProvider,
  "free-trial": FreeTrialEmbeddingsProvider,
};
@@ -0,0 +1,80 @@
import ignore from "ignore";

const DEFAULT_IGNORE_FILETYPES = [
  "*.DS_Store",
  "*-lock.json",
  "*.lock",
  "*.log",
  "*.ttf",
  "*.png",
  "*.jpg",
  "*.jpeg",
  "*.gif",
  "*.mp4",
  "*.svg",
  "*.ico",
  "*.pdf",
  "*.zip",
  "*.gz",
  "*.tar",
  "*.dmg",
  "*.tgz",
  "*.rar",
  "*.7z",
  "*.exe",
  "*.dll",
  "*.obj",
  "*.o",
  "*.a",
  "*.lib",
  "*.so",
  "*.dylib",
  "*.ncb",
  "*.sdf",
  "*.woff",
  "*.woff2",
  "*.eot",
  "*.cur",
  "*.avi",
  "*.mpg",
  "*.mpeg",
  "*.mov",
  "*.mp3",
  "*.mp4",
  "*.mkv",
  "*.mkv",
  "*.webm",
  "*.jar",
  "*.onnx",
  "*.parquet",
  "*.pqt",
  "*.wav",
  "*.webp",
  "*.db",
  "*.sqlite",
  "*.wasm",
  "*.plist",
];
export const defaultIgnoreFile = ignore().add(DEFAULT_IGNORE_FILETYPES);
export const DEFAULT_IGNORE_DIRS = [
  ".git",
  ".vscode",
  ".idea",
  ".vs",
  "venv",
  ".venv",
  "env",
  ".env",
  "node_modules",
  "dist",
  "build",
  "target",
  "out",
  "bin",
  ".pytest_cache",
  ".vscode-test",
  ".continue",
  "__pycache__",
  "site-packages",
];
export const defaultIgnoreDir = ignore().add(DEFAULT_IGNORE_DIRS);
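These defaults feed the `ignore` package, which matches gitignore-style patterns; a short usage sketch with made-up paths:

```ts
import ignore from "ignore";

const ig = ignore().add(["*.lock", "node_modules"]);

// ignores() expects paths relative to the root being indexed.
console.log(ig.ignores("yarn.lock")); // true
console.log(ig.ignores("node_modules/left-pad/index.js")); // true
console.log(ig.ignores("src/index.ts")); // false
```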
@@ -1,5 +1,6 @@
import { IDE, IndexTag, IndexingProgressUpdate } from "..";
import { ConfigHandler } from "../config/handler";
import { ContinueServerClient } from "../continueServer/stubs/client";
import { CodeSnippetsCodebaseIndex } from "./CodeSnippetsIndex";
import { FullTextSearchCodebaseIndex } from "./FullTextSearch";
import { LanceDbIndex } from "./LanceDbIndex";

@@ -20,24 +21,34 @@ export class PauseToken {
}

export class CodebaseIndexer {
  configHandler: ConfigHandler;
  ide: IDE;
  pauseToken: PauseToken;

  constructor(configHandler: ConfigHandler, ide: IDE, pauseToken: PauseToken) {
    this.configHandler = configHandler;
    this.ide = ide;
    this.pauseToken = pauseToken;
  private continueServerClient?: ContinueServerClient;
  constructor(
    private readonly configHandler: ConfigHandler,
    private readonly ide: IDE,
    private readonly pauseToken: PauseToken,
    private readonly continueServerUrl: string | undefined,
    private readonly userToken: Promise<string | undefined>,
  ) {
    if (continueServerUrl) {
      this.continueServerClient = new ContinueServerClient(
        continueServerUrl,
        userToken,
      );
    }
  }

  private async getIndexesToBuild(): Promise<CodebaseIndex[]> {
    const config = await this.configHandler.loadConfig();

    const indexes = [
      new ChunkCodebaseIndex(this.ide.readFile.bind(this.ide)), // Chunking must come first
      new ChunkCodebaseIndex(
        this.ide.readFile.bind(this.ide),
        this.continueServerClient,
      ), // Chunking must come first
      new LanceDbIndex(
        config.embeddingsProvider,
        this.ide.readFile.bind(this.ide),
        this.continueServerClient,
      ),
      new FullTextSearchCodebaseIndex(),
      new CodeSnippetsCodebaseIndex(this.ide),

@@ -48,6 +59,7 @@ export class CodebaseIndexer {

  async *refresh(
    workspaceDirs: string[],
    abortSignal: AbortSignal,
  ): AsyncGenerator<IndexingProgressUpdate> {
    const config = await this.configHandler.loadConfig();
    if (config.disableIndexing) {

@@ -58,18 +70,23 @@ export class CodebaseIndexer {

    let completedDirs = 0;

    // Wait until Git Extension has loaded to report progress
    // so we don't appear stuck at 0% while waiting
    await this.ide.getRepoName(workspaceDirs[0]);
    yield {
      progress: 0.01,
      progress: 0,
      desc: "Starting indexing...",
    };

    for (let directory of workspaceDirs) {
      const stats = await this.ide.getStats(directory);
      const branch = await this.ide.getBranch(directory);
      const repoName = await this.ide.getRepoName(directory);
      let completedIndexes = 0;

      try {
        for (let codebaseIndex of indexesToBuild) {
          // TODO: IndexTag type should use repoName rather than directory
          const tag: IndexTag = {
            directory,
            branch,

@@ -79,14 +96,23 @@ export class CodebaseIndexer {
            tag,
            { ...stats },
            (filepath) => this.ide.readFile(filepath),
            repoName,
          );

          for await (let { progress, desc } of codebaseIndex.update(
            tag,
            results,
            markComplete,
            repoName,
          )) {
            // Handle pausing in this loop because it's the only one really taking time
            if (abortSignal.aborted) {
              yield {
                progress: 1,
                desc: "Indexing cancelled",
              };
              return;
            }
            while (this.pauseToken.paused) {
              await new Promise((resolve) => setTimeout(resolve, 100));
            }
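The refresh loop above checks the abort signal and pause token on every progress tick. Reduced to its core, the pattern looks like this sketch, with a minimal `PauseToken` stand-in:

```ts
class PauseToken {
  paused = false;
}

async function* doWork(
  steps: number,
  abortSignal: AbortSignal,
  pauseToken: PauseToken,
): AsyncGenerator<{ progress: number; desc: string }> {
  for (let i = 0; i < steps; i++) {
    // Cancellation wins: report completion-with-cancel and stop.
    if (abortSignal.aborted) {
      yield { progress: 1, desc: "Indexing cancelled" };
      return;
    }
    // Politely spin while paused, re-checking every 100ms as above.
    while (pauseToken.paused) {
      await new Promise((resolve) => setTimeout(resolve, 100));
    }
    yield { progress: (i + 1) / steps, desc: `Step ${i + 1}/${steps}` };
  }
}
```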
@@ -259,6 +259,7 @@ export async function getComputeDeleteAddRemove(
  tag: IndexTag,
  currentFiles: LastModifiedMap,
  readFile: (path: string) => Promise<string>,
  repoName: string | undefined,
): Promise<[RefreshIndexResults, MarkCompleteCallback]> {
  const [add, remove, markComplete] = await getAddRemoveForTag(
    tag,

@@ -316,7 +317,12 @@ export async function getComputeDeleteAddRemove(
        removeTag: [],
      };
      results[resultType] = items;
      for await (let _ of globalCacheIndex.update(tag, results, () => {})) {
      for await (let _ of globalCacheIndex.update(
        tag,
        results,
        () => {},
        repoName,
      )) {
      }
    },
  ];

@@ -338,6 +344,7 @@ export class GlobalCacheCodeBaseIndex implements CodebaseIndex {
    tag: IndexTag,
    results: RefreshIndexResults,
    _: MarkCompleteCallback,
    repoName: string | undefined,
  ): AsyncGenerator<IndexingProgressUpdate> {
    const add = [...results.compute, ...results.addTag];
    const remove = [...results.del, ...results.removeTag];
@@ -18,6 +18,7 @@ export interface CodebaseIndex {
    tag: IndexTag,
    results: RefreshIndexResults,
    markComplete: MarkCompleteCallback,
    repoName: string | undefined,
  ): AsyncGenerator<IndexingProgressUpdate>;
}
@@ -19,13 +19,13 @@ import {
  alpacaEditPrompt,
  claudeEditPrompt,
  codeLlama70bEditPrompt,
  codellamaEditPrompt,
  deepseekEditPrompt,
  gemmaEditPrompt,
  gptEditPrompt,
  mistralEditPrompt,
  neuralChatEditPrompt,
  openchatEditPrompt,
  osModelsEditPrompt,
  phindEditPrompt,
  simplifiedEditPrompt,
  xWinCoderEditPrompt,

@@ -38,6 +38,7 @@ const PROVIDER_HANDLES_TEMPLATING: ModelProvider[] = [
  "ollama",
  "together",
  "anthropic",
  "bedrock",
];

const PROVIDER_SUPPORTS_IMAGES: ModelProvider[] = [

@@ -46,6 +47,7 @@ const PROVIDER_SUPPORTS_IMAGES: ModelProvider[] = [
  "google-palm",
  "free-trial",
  "anthropic",
  "bedrock",
];

function modelSupportsImages(provider: ModelProvider, model: string): boolean {

@@ -160,7 +162,7 @@ function autodetectTemplateType(model: string): TemplateType | undefined {
    return "alpaca";
  }

  if (lower.includes("mistral")) {
  if (lower.includes("mistral") || lower.includes("mixtral")) {
    return "llama2";
  }

@@ -218,6 +220,22 @@ function autodetectTemplateFunction(
  return null;
}

const USES_OS_MODELS_EDIT_PROMPT: TemplateType[] = [
  "alpaca",
  "chatml",
  // "codellama-70b", Doesn't respond well to this prompt
  "deepseek",
  "gemma",
  "llama2",
  "llava",
  "neural-chat",
  "openchat",
  "phi2",
  "phind",
  "xwin-coder",
  "zephyr",
];

function autodetectPromptTemplates(
  model: string,
  explicitTemplate: TemplateType | undefined = undefined,

@@ -227,7 +245,11 @@ function autodetectPromptTemplates(

  let editTemplate = null;

  if (templateType === "phind") {
  if (templateType && USES_OS_MODELS_EDIT_PROMPT.includes(templateType)) {
    // This is overriding basically everything else
    // Will probably delete the rest later, but for now it's easy to revert
    editTemplate = osModelsEditPrompt;
  } else if (templateType === "phind") {
    editTemplate = phindEditPrompt;
  } else if (templateType === "phi2") {
    editTemplate = simplifiedEditPrompt;

@@ -237,7 +259,7 @@ function autodetectPromptTemplates(
    if (model.includes("mistral")) {
      editTemplate = mistralEditPrompt;
    } else {
      editTemplate = codellamaEditPrompt;
      editTemplate = osModelsEditPrompt;
    }
  } else if (templateType === "alpaca") {
    editTemplate = alpacaEditPrompt;
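The detection flow, for orientation: a model name is lowercased and pattern-matched to a `TemplateType`, which then selects chat and edit templates. A simplified stand-in (the real function handles many more families):

```ts
type TemplateType = "llama2" | "phind" | "alpaca" | undefined;

function detectTemplateType(model: string): TemplateType {
  const lower = model.toLowerCase();
  // "mixtral" now routes to the llama2 template, as in the change above.
  if (lower.includes("mistral") || lower.includes("mixtral")) return "llama2";
  if (lower.includes("phind")) return "phind";
  if (lower.includes("alpaca")) return "alpaca";
  return undefined;
}

console.log(detectTemplateType("mixtral-8x7b-instruct")); // "llama2"
```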
@@ -8,6 +8,7 @@ import {
  LLMReturnValue,
  ModelName,
  ModelProvider,
  PromptTemplate,
  RequestOptions,
  TemplateType,
} from "..";

@@ -46,6 +47,23 @@ export abstract class BaseLLM implements ILLM {
    return modelSupportsImages(this.providerName, this.model);
  }

  supportsCompletions(): boolean {
    if (this.providerName === "openai") {
      if (
        this.apiBase?.includes("api.groq.com") ||
        this.apiBase?.includes(":1337")
      ) {
        // Jan + Groq don't support completions : (
        return false;
      }
    }
    return true;
  }

  supportsPrefill(): boolean {
    return ["ollama", "anthropic"].includes(this.providerName);
  }

  uniqueId: string;
  model: string;

@@ -237,7 +255,6 @@ ${prompt}`;
    const log = options.log ?? true;
    const raw = options.raw ?? false;
    delete options.log;
    delete options.raw;

    const completionOptions: CompletionOptions = mergeJson(
      this.completionOptions,

@@ -443,10 +460,45 @@ ${prompt}`;
    };
  }

  private _shouldRequestDirectly() {
    if (typeof window === "undefined") {
      return true;
  public renderPromptTemplate(
    template: PromptTemplate,
    history: ChatMessage[],
    otherData: Record<string, string>,
    canPutWordsInModelsMouth: boolean = false,
  ): string | ChatMessage[] {
    if (typeof template === "string") {
      let data: any = {
        history: history,
        ...otherData,
      };
      if (history.length > 0 && history[0].role == "system") {
        data["system_message"] = history.shift()!.content;
      }

      const compiledTemplate = Handlebars.compile(template);
      return compiledTemplate(data);
    } else {
      const rendered = template(history, {
        ...otherData,
        supportsCompletions: this.supportsCompletions() ? "true" : "false",
        supportsPrefill: this.supportsPrefill() ? "true" : "false",
      });
      if (
        typeof rendered !== "string" &&
        rendered[rendered.length - 1]?.role === "assistant" &&
        !canPutWordsInModelsMouth
      ) {
        // Some providers don't allow you to put words in the model's mouth
        // So we have to manually compile the prompt template and use
        // raw /completions, not /chat/completions
        const templateMessages = autodetectTemplateFunction(
          this.model,
          this.providerName,
          autodetectTemplateType(this.model),
        );
        return templateMessages(rendered);
      }
      return rendered;
    }
    return window?.ide !== "vscode";
  }
}
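String prompt templates go through Handlebars, as `renderPromptTemplate` above shows; triple-stache (`{{{ }}}`) keeps code from being HTML-escaped. A standalone sketch of that path:

```ts
import Handlebars from "handlebars";

const template = "{{{system_message}}}\n\nUser: {{{input}}}";
const compiled = Handlebars.compile(template);

console.log(
  compiled({ system_message: "You are a helpful assistant.", input: "Hi!" }),
);
// You are a helpful assistant.
//
// User: Hi!
```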
@@ -31,7 +31,8 @@ class Anthropic extends BaseLLM {
      temperature: options.temperature,
      max_tokens: options.maxTokens ?? 2048,
      model: options.model === "claude-2" ? "claude-2.1" : options.model,
      stop_sequences: options.stop,
      stop_sequences: options.stop?.filter((x) => x.trim() !== ""),
      stream: options.stream ?? true,
    };

    return finalOptions;

@@ -92,10 +93,15 @@ class Anthropic extends BaseLLM {
        ...this._convertArgs(options),
        messages: this._convertMessages(messages),
        system: this.systemMessage,
        stream: true,
      }),
    });

    if (options.stream === false) {
      const data = await response.json();
      yield { role: "assistant", content: data.content[0].text };
      return;
    }

    for await (const value of streamSse(response)) {
      if (value.delta?.text) {
        yield { role: "assistant", content: value.delta.text };
@@ -1,35 +1,209 @@
import * as fs from "fs";
import { join as joinPath } from "path";
import { promisify } from "util";
import { BaseLLM } from "..";
import { CompletionOptions, LLMOptions, ModelProvider } from "../..";
import { streamSse } from "../stream";
import {
  ChatMessage,
  CompletionOptions,
  LLMOptions,
  ModelProvider,
} from "../..";
import { stripImages } from "../countTokens";

const aws4 = require("aws4");
const readFile = promisify(fs.readFile);

namespace BedrockCommon {
  export enum Method {
    Chat = "invoke",
    Completion = "invoke-with-response-stream",
  }
  export const Service: string = "bedrock";
  export const AuthAlgo: string = "AWS4-HMAC-SHA256";
  export const HashAlgo: string = "sha256";
}

class Bedrock extends BaseLLM {
  static providerName: ModelProvider = "bedrock";
  static defaultOptions: Partial<LLMOptions> = {};
  static defaultOptions: Partial<LLMOptions> = {
    region: "us-east-1",
    model: "claude-3-sonnet-20240229",
    contextLength: 200_000,
  };

  constructor(options: LLMOptions) {
    super(options);
    this.apiBase = `https://bedrock-runtime.${options.region}.amazonaws.com`;
  }

  private _convertModelName(model: string): string {
    return (
      {
        "claude-3-sonnet-20240229": "anthropic.claude-3-sonnet-20240229-v1:0",
        "claude-3-haiku-20240307": "anthropic.claude-3-haiku-20240307-v1:0",
        "claude-2": "anthropic.claude-v2:1",
      }[model] ?? model
    );
  }

  private _convertArgs(options: CompletionOptions) {
    const finalOptions = {
      top_k: options.topK,
      top_p: options.topP,
      temperature: options.temperature,
      max_tokens: options.maxTokens ?? 4096,
      stop_sequences: options.stop,
    };

    return finalOptions;
  }

  private _convertMessages(msgs: ChatMessage[]): any[] {
    const messages = msgs
      .filter((m) => m.role !== "system")
      .map((message) => {
        if (typeof message.content === "string") {
          return message;
        } else {
          return {
            ...message,
            content: message.content.map((part) => {
              if (part.type === "text") {
                return part;
              } else {
                return {
                  type: "image",
                  source: {
                    type: "base64",
                    media_type: "image/jpeg",
                    data: part.imageUrl?.url.split(",")[1],
                  },
                };
              }
            }),
          };
        }
      });
    return messages;
  }

  private _parseCredentialsFile(fileContents: string) {
    const profiles: { [key: string]: any } = {};
    const lines = fileContents.trim().split("\n");

    let currentProfile: string | null = null;

    for (const line of lines) {
      const trimmedLine = line.trim();

      if (trimmedLine.startsWith("[") && trimmedLine.endsWith("]")) {
        currentProfile = trimmedLine.slice(1, -1);
        profiles[currentProfile] = {};
      } else if (currentProfile !== null && trimmedLine.includes("=")) {
        const [key, value] = trimmedLine.split("=");
        const trimmedKey = key.trim();
        const trimmedValue = value.trim();

        if (trimmedKey === "aws_access_key_id") {
          profiles[currentProfile].accessKeyId = trimmedValue;
        } else if (trimmedKey === "aws_secret_access_key") {
          profiles[currentProfile].secretAccessKey = trimmedValue;
        } else if (trimmedKey === "aws_session_token") {
          profiles[currentProfile].sessionToken = trimmedValue;
        }
      }
    }

    return profiles;
  }

  private async _fetchWithAwsAuthSigV4(
    apiMethod: BedrockCommon.Method,
    body: string,
    model: string,
  ): Promise<Response> {
    const path = `/model/${model}/${apiMethod}`;
    const opts = {
      headers: {
        accept:
          apiMethod === BedrockCommon.Method.Chat
            ? "application/json"
            : "application/vnd.amazon.eventstream",
        "content-type": "application/json",
        "x-amzn-bedrock-accept": "*/*",
      },
      path: path,
      body: body,
      service: "bedrock",
      host: new URL(this.apiBase!).host,
      region: this.region,
    };

    let accessKeyId: string, secretAccessKey: string, sessionToken: string;

    try {
      const data = await readFile(
        joinPath(process.env.HOME!, ".aws", "credentials"),
        "utf8",
      );
      const credentials = this._parseCredentialsFile(data);
      accessKeyId = credentials.bedrock.accessKeyId;
      secretAccessKey = credentials.bedrock.secretAccessKey;
      sessionToken = credentials.bedrock.sessionToken || "";
    } catch (err) {
      console.error("Error reading AWS credentials", err);
      return new Response("403");
    }
    return await this.fetch(new URL(`${this.apiBase}${path}`), {
      method: "POST",
      headers: aws4.sign(opts, { accessKeyId, secretAccessKey, sessionToken })[
        "headers"
      ],
      body: body,
    });
  }

  protected async *_streamComplete(
    prompt: string,
    options: CompletionOptions,
  ): AsyncGenerator<string> {
    const response = await this.fetch(
      `${this.apiBase}/model/${options.model}/invoke-with-response-stream`,
      {
        method: "POST",
        headers: {
          accept: "application/vnd.amazon.eventstream",
          "content-type": "application/json",
          "x-amzn-bedrock-accept": "*/*",
        },
        body: JSON.stringify({
          inputText: prompt,
        }),
      },
    );

    for await (const value of streamSse(response)) {
      if (value.chunk) {
        yield value.chunk;
      }
    const messages = [{ role: "user" as const, content: prompt }];
    for await (const update of this._streamChat(messages, options)) {
      yield stripImages(update.content);
    }
    // TODO: Couldn't seem to get this stream API working yet. Deferring to _streamChat.
    // import { streamSse } from "../stream";
    // const response = await this._fetchWithAwsAuthSigV4(BedrockCommon.Method.Completion, JSON.stringify({
    //     ...this._convertArgs(options),
    //     max_tokens: undefined, // Delete this key in favor of the correct one for the Completions API.
    //     max_tokens_to_sample: options.maxTokens,
    //     prompt: `\n\nHuman: ${prompt}\n\nAssistant:`,
    //   })
    // );
    // for await (const value of streamSse(response)) {
    //   if (value.completion) {
    //     yield value.completion
    //   }
    // }
  }

  protected async *_streamChat(
    messages: ChatMessage[],
    options: CompletionOptions,
  ): AsyncGenerator<ChatMessage> {
    const response = await this._fetchWithAwsAuthSigV4(
      BedrockCommon.Method.Chat,
      JSON.stringify({
        ...this._convertArgs(options),
        messages: this._convertMessages(messages),
        anthropic_version: "bedrock-2023-05-31", // Fixed, required parameter for Chat API.
      }),
      this._convertModelName(options.model),
    );
    yield {
      role: "assistant",
      content: (await response.json()).content[0].text,
    };
  }
}
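The SigV4 flow above leans on the `aws4` package, whose `sign()` mutates a request options object and returns it with `Authorization` and `X-Amz-Date` headers attached. A minimal sketch with placeholder credentials:

```ts
const aws4 = require("aws4");

const opts = {
  host: "bedrock-runtime.us-east-1.amazonaws.com",
  path: "/model/anthropic.claude-3-sonnet-20240229-v1:0/invoke",
  service: "bedrock",
  region: "us-east-1",
  body: JSON.stringify({ messages: [] }),
  headers: { "content-type": "application/json" },
};

// aws4.sign adds Authorization and X-Amz-Date headers in place.
const signed = aws4.sign(opts, {
  accessKeyId: "AKIA...", // placeholder
  secretAccessKey: "...", // placeholder
});

console.log(signed.headers["Authorization"]);
```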
@@ -1,10 +1,9 @@
import { BaseLLM } from "..";
import { ChatMessage, CompletionOptions, ModelProvider } from "../..";
import { getHeaders } from "../../continueServer/stubs/headers";
import { SERVER_URL } from "../../util/parameters";
import { streamResponse } from "../stream";

// const SERVER_URL = "http://localhost:3000";
const SERVER_URL = "https://node-proxy-server-blue-l6vsfbzhba-uw.a.run.app";

class FreeTrial extends BaseLLM {
  static providerName: ModelProvider = "free-trial";

@@ -12,6 +11,7 @@ class FreeTrial extends BaseLLM {
    return {
      uniqueId: this.uniqueId || "None",
      "Content-Type": "application/json",
      ...getHeaders(),
    };
  }

@@ -22,7 +22,7 @@ class FreeTrial extends BaseLLM {
      presence_penalty: options.presencePenalty,
      max_tokens: options.maxTokens,
      stop:
        options.model === "starcoder2-15b"
        options.model === "starcoder-7b"
          ? options.stop
          : options.stop?.slice(0, 2),
      temperature: options.temperature,
@@ -14,7 +14,7 @@ class GooglePalm extends BaseLLM {

  static defaultOptions: Partial<LLMOptions> = {
    model: "gemini-pro",
    apiBase: "https://generativelanguage.googleapis.com/",
    apiBase: "https://generativelanguage.googleapis.com/v1beta/",
  };

  protected async *_streamComplete(

@@ -83,7 +83,7 @@ class GooglePalm extends BaseLLM {
    options: CompletionOptions,
  ): AsyncGenerator<ChatMessage> {
    const apiURL = new URL(
      `v1/models/${options.model}:streamGenerateContent?key=${this.apiKey}`,
      `models/${options.model}:streamGenerateContent?key=${this.apiKey}`,
      this.apiBase,
    );
    const body = {

@@ -161,7 +161,7 @@ class GooglePalm extends BaseLLM {
    }

    const apiURL = new URL(
      `v1beta2/models/${options.model}:generateMessage?key=${this.apiKey}`,
      `models/${options.model}:generateMessage?key=${this.apiKey}`,
      this.apiBase,
    );
    const body = { prompt: { messages: msgList } };
@@ -0,0 +1,20 @@
import { LLMOptions, ModelProvider } from "../..";
import OpenAI from "./OpenAI";

class Groq extends OpenAI {
  static providerName: ModelProvider = "groq";
  static defaultOptions: Partial<LLMOptions> = {
    apiBase: "https://api.groq.com/openai/v1/",
  };

  private static modelConversion: { [key: string]: string } = {
    "llama2-70b": "llama2-70b-4096",
    "mistral-8x7b": "mixtral-8x7b-32768",
    gemma: "gemma-7b-it",
  };
  protected _convertModelName(model: string): string {
    return Groq.modelConversion[model] ?? model;
  }
}

export default Groq;
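The alias table maps Continue's short model names onto the IDs Groq actually serves, falling back to pass-through for unknown names; the same lookup pattern in isolation:

```ts
const modelConversion: { [key: string]: string } = {
  "llama2-70b": "llama2-70b-4096",
  "mistral-8x7b": "mixtral-8x7b-32768",
  gemma: "gemma-7b-it",
};

// Unknown names pass through unchanged.
const convertModelName = (model: string) => modelConversion[model] ?? model;

console.log(convertModelName("mistral-8x7b")); // "mixtral-8x7b-32768"
console.log(convertModelName("some-other-model")); // "some-other-model"
```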
@@ -15,6 +15,7 @@ class LlamaCpp extends BaseLLM {
      presence_penalty: options.presencePenalty,
      min_p: options.minP,
      mirostat: options.mirostat,
      stop: options.stop,
    };

    return finalOptions;
@@ -89,6 +89,7 @@ class Ollama extends BaseLLM {
        "neural-chat-7b": "neural-chat:7b-v3.3",
        "starcoder-1b": "starcoder:1b",
        "starcoder-3b": "starcoder:3b",
        "starcoder2-3b": "starcoder2:3b",
        "stable-code-3b": "stable-code:3b",
      }[this.model] ?? this.model
    );
@@ -39,6 +39,12 @@ const CHAT_ONLY_MODELS = [
class OpenAI extends BaseLLM {
  public useLegacyCompletionsEndpoint = false;

  constructor(options: LLMOptions) {
    super(options);
    this.useLegacyCompletionsEndpoint =
      options.useLegacyCompletionsEndpoint ?? false;
  }

  static providerName: ModelProvider = "openai";
  static defaultOptions: Partial<LLMOptions> = {
    apiBase: "https://api.openai.com/v1/",

@@ -65,10 +71,15 @@ class OpenAI extends BaseLLM {
    };
  }

  protected _convertModelName(model: string): string {
    return model;
  }

  protected _convertArgs(options: any, messages: ChatMessage[]) {
    const url = new URL(this.apiBase!);
    const finalOptions = {
      messages: messages.map(this._convertMessage),
      model: options.model,
      model: this._convertModelName(options.model),
      max_tokens: options.maxTokens,
      temperature: options.temperature,
      top_p: options.topP,

@@ -76,7 +87,9 @@ class OpenAI extends BaseLLM {
      presence_penalty: options.presencePenalty,
      stop:
        // Jan + Azure OpenAI don't truncate and will throw an error
        this.apiBase?.includes(":1337") || this.apiType === "azure"
        url.port === "1337" ||
        url.host === "api.openai.com" ||
        this.apiType === "azure"
          ? options.stop?.slice(0, 4)
          : options.stop,
    };

@@ -164,8 +177,10 @@ class OpenAI extends BaseLLM {
  ): AsyncGenerator<ChatMessage> {
    if (
      !CHAT_ONLY_MODELS.includes(options.model) &&
      this.supportsCompletions() &&
      (NON_CHAT_MODELS.includes(options.model) ||
        this.useLegacyCompletionsEndpoint)
        this.useLegacyCompletionsEndpoint ||
        options.raw)
    ) {
      for await (const content of this._legacystreamComplete(
        stripImages(messages[messages.length - 1]?.content || ""),
@@ -14,6 +14,7 @@ import Flowise from "./Flowise";
import FreeTrial from "./FreeTrial";
import Gemini from "./Gemini";
import GooglePalm from "./GooglePalm";
import Groq from "./Groq";
import HuggingFaceInferenceAPI from "./HuggingFaceInferenceAPI";
import HuggingFaceTGI from "./HuggingFaceTGI";
import LMStudio from "./LMStudio";

@@ -94,6 +95,7 @@ const LLMs = [
  DeepInfra,
  OpenAIFreeTrial,
  Flowise,
  Groq,
];

export async function llmFromDescription(

@@ -123,13 +125,13 @@ export async function llmFromDescription(
    completionOptions: {
      ...finalCompletionOptions,
      model: (desc.model || cls.defaultOptions?.model) ?? "codellama-7b",
      maxTokens: (
      maxTokens:
        finalCompletionOptions.maxTokens ??
        cls.defaultOptions?.completionOptions?.maxTokens
      ) ?? DEFAULT_MAX_TOKENS,
        cls.defaultOptions?.completionOptions?.maxTokens ??
        DEFAULT_MAX_TOKENS,
    },
    systemMessage,
  };
  };

  return new cls(options);
}
@@ -1,6 +1,46 @@
import { ChatMessage } from "../..";
import { stripImages } from "../countTokens";

function templateFactory(
  systemMessage: (msg: ChatMessage) => string,
  userPrompt: string,
  assistantPrompt: string,
  separator: string,
  prefix?: string,
  emptySystemMessage?: string,
): (msgs: ChatMessage[]) => string {
  return (msgs: ChatMessage[]) => {
    let prompt = prefix ?? "";

    // Skip assistant messages at the beginning
    while (msgs.length > 0 && msgs[0].role === "assistant") {
      msgs.shift();
    }

    if (msgs.length > 0 && msgs[0].role === "system") {
      prompt += systemMessage(msgs.shift()!);
    } else if (emptySystemMessage) {
      prompt += emptySystemMessage;
    }

    for (let i = 0; i < msgs.length; i++) {
      const msg = msgs[i];
      prompt += msg.role === "user" ? userPrompt : assistantPrompt;
      prompt += msg.content;
      if (i < msgs.length - 1) {
        prompt += separator;
      }
    }

    if (msgs.length > 0 && msgs[msgs.length - 1].role === "user") {
      prompt += separator;
      prompt += assistantPrompt;
    }

    return prompt;
  };
}

/**
 * @description Template for LLAMA2 messages:
 *

@@ -43,7 +83,10 @@ function llama2TemplateMessages(msgs: ChatMessage[]): string {
    if (msgs[i].role === "user") {
      prompt += `[INST] ${msgs[i].content} [/INST]`;
    } else {
      prompt += msgs[i].content + "</s>\n<s>";
      prompt += msgs[i].content;
      if (i < msgs.length - 1) {
        prompt += "</s>\n<s>";
      }
    }
  }

@@ -73,77 +116,41 @@ function anthropicTemplateMessages(messages: ChatMessage[]): string {
  return prompt;
}

function llavaTemplateMessages(msgs: ChatMessage[]): string {
  `A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.
USER: <image>{prompt}
ASSISTANT:`;
const llavaTemplateMessages = templateFactory(
  () => "",
  "USER: <image>",
  "ASSISTANT: ",
  "\n",
  "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.",
);

  let prompt =
    "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.";
const zephyrTemplateMessages = templateFactory(
  (msg) => `<|system|>${msg.content}</s>\n`,
  "<|user|>\n",
  "<|assistant|>\n",
  "</s>\n",
  undefined,
  "<|system|> </s>\n",
);

  for (const msg of msgs) {
    prompt += msg.role === "user" ? "USER: <image>" : "ASSISTANT: ";
    prompt += msg.content;
const chatmlTemplateMessages = templateFactory(
  (msg) => `<|im_start|>${msg.role}\n${msg.content}<|im_end|>\n`,
  "<|im_start|>user\n",
  "<|im_start|>assistant\n",
  "<|im_end|>\n",
);

    prompt += "\n";
  }

  prompt += "ASSISTANT: ";

  return prompt;
}

function zephyrTemplateMessages(msgs: ChatMessage[]): string {
  let prompt = "";

  if (msgs[0].role === "system") {
    prompt += `<|system|>${msgs[0].content}</s>\n`;
    msgs.shift();
  } else {
    prompt += "<|system|> </s>\n";
  }

  for (const msg of msgs) {
    prompt += msg.role === "user" ? "<|user|>\n" : "<|assistant|>\n";
    prompt += `${msg.content}</s>\n`;
  }

  prompt += "<|assistant|>\n";

  return prompt;
}

function chatmlTemplateMessages(messages: ChatMessage[]): string {
  let prompt = "";

  for (const msg of messages) {
    prompt += `<|im_start|>${msg.role}\n${msg.content}<|im_end|>\n`;
  }

  prompt += "<|im_start|>assistant\n";
  return prompt;
}

function templateAlpacaMessages(msgs: ChatMessage[]): string {
  let prompt = "";

  if (msgs[0].role === "system") {
    prompt += `${msgs[0].content}\n\n`;
    msgs.shift();
  } else {
    prompt +=
      "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n";
  }

  for (const msg of msgs) {
    prompt += msg.role === "user" ? "### Instruction:\n" : "### Response:\n";
    prompt += `${msg.content}\n\n`;
  }

  prompt += "### Response:\n";

  return prompt;
}
const templateAlpacaMessages = templateFactory(
  (msg) => `${msg.content}\n\n`,
  "### Instruction:\n",
  "### Response:\n",
  "\n\n",
  undefined,
  "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n",
);

function deepseekTemplateMessages(msgs: ChatMessage[]): string {
  let prompt = "";

@@ -163,63 +170,45 @@ function deepseekTemplateMessages(msgs: ChatMessage[]): string {
    }

    prompt += `${msg.content}`;
    prompt += msg.role === "user" ? "\n" : "<|EOT|>\n";

    if (i < msgs.length - 1) {
      prompt += msg.role === "user" ? "\n" : "<|EOT|>\n";
    }
  }

  return prompt + "### Response:\n";
  if (msgs.length > 0 && msgs[msgs.length - 1].role === "user") {
    prompt += "\n";
    prompt += "### Response:\n";
  }

  return prompt;
}

// See https://huggingface.co/microsoft/phi-2#qa-format
function phi2TemplateMessages(msgs: ChatMessage[]): string {
  const HUMAN_PROMPT = "\n\nInstruct:";
  const AI_PROMPT = "\n\nOutput:";
  let prompt = "";
const phi2TemplateMessages = templateFactory(
  (msg) => `\n\nInstruct: ${msg.content} `,
  "\n\nInstruct: ",
  "\n\nOutput: ",
  " ",
);

  for (const msg of msgs) {
    prompt += `${
      msg.role === "user" || msg.role === "system" ? HUMAN_PROMPT : AI_PROMPT
    } ${msg.content} `;
  }

  prompt += AI_PROMPT;
  return prompt;
}

function phindTemplateMessages(msgs: ChatMessage[]): string {
  let prompt = "";

  if (msgs[0].role === "system") {
    prompt += `### System Prompt\n${msgs[0].content}\n\n`;
    msgs.shift();
  }

  for (const msg of msgs) {
    prompt += msg.role === "user" ? "### User Message\n" : "### Assistant\n";
    prompt += `${msg.content}\n`;
  }

  prompt += "### Assistant\n";

  return prompt;
}
const phindTemplateMessages = templateFactory(
  (msg) => `### System Prompt\n${msg.content}\n\n`,
  "### User Message\n",
  "### Assistant\n",
  "\n",
);

/**
 * OpenChat Template, used by CodeNinja
 * GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant: Hi<|end_of_turn|>GPT4 Correct User: How are you today?<|end_of_turn|>GPT4 Correct Assistant:
 */
function openchatTemplateMessages(msgs: ChatMessage[]): string {
  let prompt = "";

  for (const msg of msgs) {
    prompt +=
      msg.role === "user" ? "GPT4 Correct User: " : "GPT4 Correct Assistant: ";
    prompt += msg.content + "<|end_of_turn|>";
  }

  prompt += "GPT4 Correct Assistant: ";

  return prompt;
}
const openchatTemplateMessages = templateFactory(
  () => "",
  "GPT4 Correct User: ",
  "GPT4 Correct Assistant: ",
  "<|end_of_turn|>",
);

/**
 * Chat template used by https://huggingface.co/TheBloke/XwinCoder-13B-GPTQ

@@ -229,47 +218,25 @@ function openchatTemplateMessages(msgs: ChatMessage[]): string {
<user>: {prompt}
<AI>:
 */
function xWinCoderTemplateMessages(msgs: ChatMessage[]): string {
  let prompt = "<system>: ";
  if (msgs[0].role === "system") {
    prompt += msgs.shift()!.content;
  } else {
    prompt +=
      "You are an AI coding assistant that helps people with programming. Write a response that appropriately completes the user's request.";
  }

  for (let msg of msgs) {
    prompt += "\n";
    prompt += msg.role === "user" ? "<user>" : "<AI>";
    prompt += ": " + msg.content;
  }

  prompt += "<AI>: ";

  return prompt;
}
const xWinCoderTemplateMessages = templateFactory(
  (msg) => `<system>: ${msg.content}`,
  "\n<user>: ",
  "\n<AI>: ",
  "",
  undefined,
  "<system>: You are an AI coding assistant that helps people with programming. Write a response that appropriately completes the user's request.",
);

/**
 * NeuralChat Template
 * ### System:\n{system_input}\n### User:\n{user_input}\n### Assistant:\n
 */
function neuralChatTemplateMessages(msgs: ChatMessage[]): string {
  let prompt = "";

  if (msgs[0].role === "system") {
    prompt += `### System:\n${msgs[0].content}\n`;
    msgs.shift();
  }

  for (const msg of msgs) {
    prompt += msg.role === "user" ? "### User:\n" : "### Assistant:\n";
    prompt += `${msg.content}\n`;
  }

  prompt += "### Assistant:\n";

  return prompt;
}
const neuralChatTemplateMessages = templateFactory(
  (msg) => `### System:\n${msg.content}\n`,
  "### User:\n",
  "### Assistant:\n",
  "\n",
);

/**
 '<s>Source: system\n\n System prompt <step> Source: user\n\n First user query <step> Source: assistant\n\n Model response to first query <step> Source: user\n\n Second user query <step> Source: assistant\nDestination: user\n\n '

@@ -292,18 +259,12 @@ function codeLlama70bTemplateMessages(msgs: ChatMessage[]): string {
What is Cramer's Rule?<end_of_turn>
<start_of_turn>model
 */

function gemmaTemplateMessage(msgs: ChatMessage[]): string {
  let prompt = "";
  for (const msg of msgs) {
    prompt += `<start_of_turn>${
      msg.role === "assistant" ? "model" : "user"
    }\n${stripImages(msg.content).trim()}<end_of_turn>\n`;
  }

  prompt += "<start_of_turn>model\n";
  return prompt;
}
const gemmaTemplateMessage = templateFactory(
  () => "",
  "<start_of_turn>user\n",
  "<start_of_turn>model\n",
  "<end_of_turn>\n",
);

export {
  anthropicTemplateMessages,
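To make the factory concrete, here is a worked example: the zephyr arguments applied to a two-message conversation. The factory body is copied from the diff above so the sketch runs on its own.

```ts
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function templateFactory(
  systemMessage: (msg: ChatMessage) => string,
  userPrompt: string,
  assistantPrompt: string,
  separator: string,
  prefix?: string,
  emptySystemMessage?: string,
): (msgs: ChatMessage[]) => string {
  return (msgs: ChatMessage[]) => {
    let prompt = prefix ?? "";
    // Leading assistant messages are dropped before rendering.
    while (msgs.length > 0 && msgs[0].role === "assistant") msgs.shift();
    if (msgs.length > 0 && msgs[0].role === "system") {
      prompt += systemMessage(msgs.shift()!);
    } else if (emptySystemMessage) {
      prompt += emptySystemMessage;
    }
    for (let i = 0; i < msgs.length; i++) {
      prompt += msgs[i].role === "user" ? userPrompt : assistantPrompt;
      prompt += msgs[i].content;
      if (i < msgs.length - 1) prompt += separator;
    }
    // End on an open assistant turn when the user spoke last.
    if (msgs.length > 0 && msgs[msgs.length - 1].role === "user") {
      prompt += separator + assistantPrompt;
    }
    return prompt;
  };
}

const zephyr = templateFactory(
  (msg) => `<|system|>${msg.content}</s>\n`,
  "<|user|>\n",
  "<|assistant|>\n",
  "</s>\n",
  undefined,
  "<|system|> </s>\n",
);

console.log(
  zephyr([
    { role: "system", content: "Be terse." },
    { role: "user", content: "Hello" },
  ]),
);
// <|system|>Be terse.</s>
// <|user|>
// Hello</s>
// <|assistant|>
```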
@@ -1,5 +1,4 @@
import { ChatMessage } from "../..";
import { PromptTemplate } from "../../util";
import { ChatMessage, PromptTemplate } from "../..";

const simplifiedEditPrompt = `Consider the following code:
\`\`\`{{{language}}}

@@ -19,27 +18,115 @@ Here is the edit requested:

Here is the code after editing:`;

const gptEditPrompt = `\
\`\`\`{{{language}}}
{{{codeToEdit}}}
const gptEditPrompt: PromptTemplate = (_, otherData) => {
  if (otherData?.codeToEdit?.trim().length === 0) {
    return `\
\`\`\`${otherData.language}
${otherData.prefix}[BLANK]${otherData.codeToEdit}${otherData.suffix}
\`\`\`

You are an expert programmer. You will rewrite the above code to do the following:
Given the user's request: "${otherData.userInput}"

{{{userInput}}}
Here is the code that should fill in the [BLANK]:`;
  }

Output only a code block with the rewritten code:`;
  const paragraphs = [
    "The user has requested a section of code in a file to be rewritten.",
  ];
  if (otherData.prefix?.trim().length > 0) {
    paragraphs.push(`This is the prefix of the file:
\`\`\`${otherData.language}
${otherData.prefix}
\`\`\``);
  }

  if (otherData.suffix?.trim().length > 0) {
    paragraphs.push(`This is the suffix of the file:
\`\`\`${otherData.language}
${otherData.suffix}
\`\`\``);
  }

  paragraphs.push(`This is the code to rewrite:
\`\`\`${otherData.language}
${otherData.codeToEdit}
\`\`\`

The user's request is: "${otherData.userInput}"

Here is the rewritten code:`);

  return paragraphs.join("\n\n");
};

const codellamaInfillEditPrompt = "{{filePrefix}}<FILL>{{fileSuffix}}";

const codellamaEditPrompt = `\`\`\`{{{language}}}
{{{codeToEdit}}}
\`\`\`
[INST] You are an expert programmer and personal assistant. Your task is to rewrite the above code with these instructions: "{{{userInput}}}"
const START_TAG = "<START EDITING HERE>";
const osModelsEditPrompt: PromptTemplate = (history, otherData) => {
  // "No suffix" means either there is no suffix OR
  // it's a clean break at the end of a function or similar
  // (what we're trying to avoid is the language model simply completing the closing brackets of a function)
  const firstCharOfFirstLine = otherData.suffix?.split("\n")[0]?.[0]?.trim();
  const isSuffix =
    otherData.suffix?.trim() !== "" &&
    // First character of first line is whitespace
    // Otherwise we assume it's a clean break
    !firstCharOfFirstLine;
  const suffixTag = isSuffix ? "<STOP EDITING HERE>" : "";
  const suffixExplanation = isSuffix
    ? ' When you get to "<STOP EDITING HERE>", end your response.'
    : "";

Your answer should be given inside of a code block. It should use the same kind of indentation as above.
[/INST] Sure! Here's the rewritten code you requested:
\`\`\`{{{language}}}`;
  // If neither prefilling nor /v1/completions are supported, we have to use a chat prompt without putting words in the model's mouth
  if (
    otherData.supportsCompletions !== "true" &&
    otherData.supportsPrefill !== "true"
  ) {
    return gptEditPrompt(history, otherData);
  }

  // Use a different prompt when there's neither prefix nor suffix
  if (otherData.prefix?.trim() === "" && otherData.suffix?.trim() === "") {
    return [
      {
        role: "user",
        content: `\`\`\`${otherData.language}
${otherData.codeToEdit}
${suffixTag}
\`\`\`

Please rewrite the entire code block above in order to satisfy the following request: "${otherData.userInput}".${suffixExplanation}`,
      },
      {
        role: "assistant",
        content: `Sure! Here's the entire rewritten code block:
\`\`\`${otherData.language}
`,
      },
    ];
  }

  return [
    {
      role: "user",
      content: `\`\`\`${otherData.language}
${otherData.prefix}${START_TAG}
${otherData.codeToEdit}
${suffixTag}
\`\`\`

Please rewrite the entire code block above, editing the portion below "${START_TAG}" in order to satisfy the following request: "${otherData.userInput}".${suffixExplanation}
`,
    },
    {
      role: "assistant",
      content: `Sure! Here's the entire code block, including the rewritten portion:
\`\`\`${otherData.language}
${otherData.prefix}${START_TAG}
`,
    },
  ];
};

const mistralEditPrompt = `[INST] You are a helpful code assistant. Your task is to rewrite the following code with these instructions: "{{{userInput}}}"
\`\`\`{{{language}}}

@@ -188,7 +275,6 @@ export {
  alpacaEditPrompt,
  claudeEditPrompt,
  codeLlama70bEditPrompt,
  codellamaEditPrompt,
  codellamaInfillEditPrompt,
  deepseekEditPrompt,
  gemmaEditPrompt,

@@ -196,6 +282,7 @@ export {
  mistralEditPrompt,
  neuralChatEditPrompt,
  openchatEditPrompt,
  osModelsEditPrompt,
  phindEditPrompt,
  simplestEditPrompt,
  simplifiedEditPrompt,
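`gptEditPrompt` and `osModelsEditPrompt` are now function-style `PromptTemplate`s rather than Handlebars strings, so they can branch on the presence of a prefix/suffix and even return chat messages that put words in the model's mouth. A hedged sketch of rendering one (the `otherData` values are illustrative, not real defaults):

```typescript
// Illustrative values; in the extension these come from the editor selection
// and the user's cmd/ctrl+I input.
const rendered = gptEditPrompt([], {
  language: "typescript",
  prefix: "function add(a: number, b: number) {\n",
  codeToEdit: "  return a - b;\n",
  suffix: "}\n",
  userInput: "fix the bug",
});

// A string result goes to a completions endpoint; a ChatMessage[] result
// (as osModelsEditPrompt can return) goes to the chat endpoint instead.
if (typeof rendered === "string") {
  console.log(rendered);
} else {
  for (const msg of rendered) {
    console.log(`${msg.role}: ${msg.content}`);
  }
}
```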
@@ -1,12 +1,12 @@
{
  "name": "@continuedev/core",
  "version": "1.0.4",
  "version": "1.0.6",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "@continuedev/core",
      "version": "1.0.4",
      "version": "1.0.6",
      "license": "Apache-2.0",
      "dependencies": {
        "@mozilla/readability": "^0.5.0",

@@ -26,6 +26,7 @@
        "handlebars": "^4.7.8",
        "http-proxy-agent": "^7.0.1",
        "https-proxy-agent": "^7.0.3",
        "ignore": "^5.3.1",
        "js-tiktoken": "^1.0.8",
        "jsdom": "^24.0.0",
        "llama-tokenizer-js": "^1.1.3",

@@ -43,7 +44,7 @@
        "sqlite3": "^5.1.7",
        "tree-sitter-wasms": "^0.1.6",
        "uuid": "^9.0.1",
        "vectordb": "^0.4.12",
        "vectordb": "0.4.12",
        "web-tree-sitter": "^0.21.0"
      },
      "devDependencies": {

@@ -5336,7 +5337,6 @@
      "version": "5.3.1",
      "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.1.tgz",
      "integrity": "sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==",
      "peer": true,
      "engines": {
        "node": ">= 4"
      }
@@ -1,6 +1,6 @@
{
  "name": "@continuedev/core",
  "version": "1.0.4",
  "version": "1.0.6",
  "description": "The Continue Core contains functionality that can be shared across web, VS Code, or Node.js server. It is written in TypeScript and contains much of the functionality that was previously inside of the legacy Continue Python Server.",
  "scripts": {
    "test": "jest ./test/diff.test.ts",

@@ -37,6 +37,7 @@
    "handlebars": "^4.7.8",
    "http-proxy-agent": "^7.0.1",
    "https-proxy-agent": "^7.0.3",
    "ignore": "^5.3.1",
    "js-tiktoken": "^1.0.8",
    "jsdom": "^24.0.0",
    "llama-tokenizer-js": "^1.1.3",

@@ -54,7 +55,7 @@
    "sqlite3": "^5.1.7",
    "tree-sitter-wasms": "^0.1.6",
    "uuid": "^9.0.1",
    "vectordb": "^0.4.12",
    "vectordb": "0.4.12",
    "web-tree-sitter": "^0.21.0"
  },
  "puppeteer": {
@@ -3,6 +3,7 @@ import {
  ChatMessage,
  ContextItemWithId,
  ContextSubmenuItem,
  DiffLine,
  LLMFullCompletionOptions,
  MessageContent,
  PersistedSessionInfo,

@@ -97,6 +98,17 @@ export type Protocol = {
    },
    ProtocolGeneratorType<MessageContent>,
  ];
  streamDiffLines: [
    {
      prefix: string;
      highlighted: string;
      suffix: string;
      input: string;
      language: string | undefined;
      modelTitle: string | undefined;
    },
    ProtocolGeneratorType<DiffLine>,
  ];
};

export interface IdeSettings {

@@ -107,4 +119,5 @@ export interface IdeSettings {

export type ReverseProtocol = IdeProtocol & {
  getIdeSettings: [undefined, IdeSettings];
  errorPopup: [{ message: string }, void];
};
@@ -0,0 +1,81 @@
import { http, https } from "follow-redirects";
import * as fs from "fs";
import { HttpProxyAgent } from "http-proxy-agent";
import { HttpsProxyAgent } from "https-proxy-agent";
import fetch, { RequestInit, Response } from "node-fetch";
import tls from "tls";
import { RequestOptions } from "..";

export function fetchwithRequestOptions(
  url: URL,
  init: RequestInit,
  requestOptions?: RequestOptions,
): Promise<Response> {
  const TIMEOUT = 7200; // 7200 seconds = 2 hours

  const ca = [...tls.rootCertificates];
  const customCerts =
    typeof requestOptions?.caBundlePath === "string"
      ? [requestOptions?.caBundlePath]
      : requestOptions?.caBundlePath;
  if (customCerts) {
    ca.push(
      ...customCerts.map((customCert) => fs.readFileSync(customCert, "utf8")),
    );
  }

  let timeout = (requestOptions?.timeout ?? TIMEOUT) * 1000; // measured in ms

  const agentOptions = {
    ca,
    rejectUnauthorized: requestOptions?.verifySsl,
    timeout,
    sessionTimeout: timeout,
    keepAlive: true,
    keepAliveMsecs: timeout,
  };

  const proxy = requestOptions?.proxy;

  // Create agent
  const protocol = url.protocol === "https:" ? https : http;
  const agent = proxy
    ? protocol === https
      ? new HttpsProxyAgent(proxy, agentOptions)
      : new HttpProxyAgent(proxy, agentOptions)
    : new protocol.Agent(agentOptions);

  const headers: { [key: string]: string } = requestOptions?.headers || {};
  for (const [key, value] of Object.entries(init.headers || {})) {
    headers[key] = value as string;
  }

  // Replace localhost with 127.0.0.1
  if (url.hostname === "localhost") {
    url.hostname = "127.0.0.1";
  }

  // add extra body properties if provided
  let updatedBody: string | undefined = undefined;
  try {
    if (requestOptions?.extraBodyProperties && typeof init.body === "string") {
      const parsedBody = JSON.parse(init.body);
      updatedBody = JSON.stringify({
        ...parsedBody,
        ...requestOptions.extraBodyProperties,
      });
    }
  } catch (e) {
    console.log("Unable to parse HTTP request body: ", e);
  }

  // fetch the request with the provided options
  let resp = fetch(url, {
    ...init,
    body: updatedBody ?? init.body,
    headers: headers,
    agent: agent,
  });

  return resp;
}
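The new `fetchwithRequestOptions` helper centralizes proxy, CA-bundle, timeout, header, and extra-body handling for outbound requests. A minimal sketch of a call site (the option values and import path are assumptions for illustration):

```typescript
import { fetchwithRequestOptions } from "./fetch";

async function example() {
  // Headers and extraBodyProperties are merged into the underlying node-fetch
  // call; proxy and verifySsl feed the custom agent. Note that "localhost"
  // hostnames are rewritten to 127.0.0.1 by the helper.
  const resp = await fetchwithRequestOptions(
    new URL("https://localhost:8080/v1/completions"),
    {
      method: "POST",
      body: JSON.stringify({ prompt: "Hello" }),
    },
    {
      proxy: "http://proxy.internal:3128", // hypothetical proxy address
      verifySsl: false,
      extraBodyProperties: { user: "continue" },
    },
  );
  console.log(resp.status);
}
```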
@@ -1,32 +1,51 @@
import * as fs from "fs";
import { ContinueRcJson, IDE, IdeInfo, IndexTag, Problem, Range, Thread } from "..";
import {
  ContinueRcJson,
  IDE,
  IdeInfo,
  IndexTag,
  Problem,
  Range,
  Thread,
} from "..";

import { getContinueGlobalPath } from "./paths";

class FileSystemIde implements IDE {
  getRepoName(dir: string): Promise<string | undefined> {
    return Promise.resolve(undefined);
  }

  getTags(artifactId: string): Promise<IndexTag[]> {
    return Promise.resolve([]);
  }

  getIdeInfo(): Promise<IdeInfo> {
    return Promise.resolve({
      ideType: "vscode",
      name: "na",
      version: "0.1",
      remoteName: "na",
      extensionVersion: "na",
    });
  }

  readRangeInFile(filepath: string, range: Range): Promise<string> {
    return Promise.resolve("");
  }

  getStats(directory: string): Promise<{ [path: string]: number }> {
    return Promise.resolve({});
  }

  isTelemetryEnabled(): Promise<boolean> {
    return Promise.resolve(false);
  }

  getUniqueId(): Promise<string> {
    return Promise.resolve("NOT_UNIQUE");
  }

  getWorkspaceConfigs(): Promise<ContinueRcJson[]> {
    return Promise.resolve([]);
  }

@@ -34,28 +53,34 @@ class FileSystemIde implements IDE {
  getDiff(): Promise<string> {
    return Promise.resolve("");
  }

  getTerminalContents(): Promise<string> {
    return Promise.resolve("");
  }

  async getDebugLocals(threadIndex: number): Promise<string> {
    return Promise.resolve("");
  }

  async getTopLevelCallStackSources(
    threadIndex: number,
    stackDepth: number
    stackDepth: number,
  ): Promise<string[]> {
    return Promise.resolve([]);
  }

  async getAvailableThreads(): Promise<Thread[]> {
    return Promise.resolve([]);
  }

  showLines(
    filepath: string,
    startLine: number,
    endLine: number
    endLine: number,
  ): Promise<void> {
    return Promise.resolve();
  }

  listWorkspaceContents(): Promise<string[]> {
    return new Promise((resolve, reject) => {
      fs.readdir("/tmp/continue", (err, files) => {

@@ -66,6 +91,7 @@ class FileSystemIde implements IDE {
      });
    });
  }

  getWorkspaceDirs(): Promise<string[]> {
    return new Promise((resolve, reject) => {
      fs.mkdtemp("/tmp/continue", (err, folder) => {

@@ -76,9 +102,11 @@ class FileSystemIde implements IDE {
      });
    });
  }

  listFolders(): Promise<string[]> {
    return Promise.resolve([]);
  }

  writeFile(path: string, contents: string): Promise<void> {
    return new Promise((resolve, reject) => {
      fs.writeFile(path, contents, (err) => {

@@ -89,21 +117,27 @@ class FileSystemIde implements IDE {
      });
    });
  }

  showVirtualFile(title: string, contents: string): Promise<void> {
    return Promise.resolve();
  }

  getContinueDir(): Promise<string> {
    return Promise.resolve(getContinueGlobalPath());
  }

  openFile(path: string): Promise<void> {
    return Promise.resolve();
  }

  runCommand(command: string): Promise<void> {
    return Promise.resolve();
  }

  saveFile(filepath: string): Promise<void> {
    return Promise.resolve();
  }

  readFile(filepath: string): Promise<string> {
    return new Promise((resolve, reject) => {
      fs.readFile(filepath, "utf8", (err, contents) => {

@@ -114,10 +148,11 @@ class FileSystemIde implements IDE {
      });
    });
  }

  showDiff(
    filepath: string,
    newContents: string,
    stepIndex: number
    stepIndex: number,
  ): Promise<void> {
    return Promise.resolve();
  }
@@ -1,6 +1,3 @@
import Handlebars from "handlebars";
import { ChatMessage } from "..";

export function removeQuotesAndEscapes(output: string): string {
  output = output.trim();

@@ -92,34 +89,6 @@ export function dedentAndGetCommonWhitespace(s: string): [string, string] {
  return [lines.map((x) => x.replace(lcp, "")).join("\n"), lcp];
}

export type PromptTemplate =
  | string
  | ((
      history: ChatMessage[],
      otherData: Record<string, string>,
    ) => string | ChatMessage[]);

export function renderPromptTemplate(
  template: PromptTemplate,
  history: ChatMessage[],
  otherData: Record<string, string>,
): string | ChatMessage[] {
  if (typeof template === "string") {
    let data: any = {
      history: history,
      ...otherData,
    };
    if (history.length > 0 && history[0].role == "system") {
      data["system_message"] = history.shift()!.content;
    }

    const compiledTemplate = Handlebars.compile(template);
    return compiledTemplate(data);
  } else {
    return template(history, otherData);
  }
}

export function getBasename(filepath: string, n: number = 1): string {
  return filepath.split(/[\\/]/).pop() ?? "";
}

@@ -190,3 +159,18 @@ export function copyOf(obj: any): any {
  if (obj === null || obj === undefined) return obj;
  return JSON.parse(JSON.stringify(obj));
}

export function deduplicateArray<T>(
  array: T[],
  equal: (a: T, b: T) => boolean,
): T[] {
  const result: T[] = [];

  for (const item of array) {
    if (!result.some((existingItem) => equal(existingItem, item))) {
      result.push(item);
    }
  }

  return result;
}
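The new `deduplicateArray` keeps the first occurrence of each item under a caller-supplied equality predicate (quadratic in the worst case, which is fine for small inputs). A quick illustrative example:

```typescript
interface Snippet {
  filepath: string;
  startLine: number;
}

const snippets: Snippet[] = [
  { filepath: "a.ts", startLine: 1 },
  { filepath: "a.ts", startLine: 1 },
  { filepath: "b.ts", startLine: 4 },
];

// Keeps the first of the two identical "a.ts" snippets
const unique = deduplicateArray(
  snippets,
  (a, b) => a.filepath === b.filepath && a.startLine === b.startLine,
);
console.log(unique.length); // 2
```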
@@ -1,27 +1,44 @@
import { ContinueRcJson, IDE, IdeInfo, IndexTag, Problem, Range, Thread } from "..";
import {
  ContinueRcJson,
  IDE,
  IdeInfo,
  IndexTag,
  Problem,
  Range,
  Thread,
} from "..";

export class MessageIde implements IDE {
  constructor(
    private readonly request: (messageType: string, data: any) => Promise<any>
    private readonly request: (messageType: string, data: any) => Promise<any>,
  ) {}

  getRepoName(dir: string): Promise<string | undefined> {
    return this.request("getRepoName", { dir });
  }

  getDebugLocals(threadIndex: number): Promise<string> {
    return this.request("getDebugLocals", { threadIndex });
  }

  getTopLevelCallStackSources(
    threadIndex: number,
    stackDepth: number
    stackDepth: number,
  ): Promise<string[]> {
    return this.request("getTopLevelCallStackSources", {
      threadIndex,
      stackDepth,
    });
  }

  getAvailableThreads(): Promise<Thread[]> {
    return this.request("getAvailableThreads", undefined);
  }

  getTags(artifactId: string): Promise<IndexTag[]> {
    return this.request("getTags", artifactId);
  }

  getIdeInfo(): Promise<IdeInfo> {
    return this.request("getIdeInfo", undefined);
  }

@@ -29,9 +46,11 @@ export class MessageIde implements IDE {
  readRangeInFile(filepath: string, range: Range): Promise<string> {
    return this.request("readRangeInFile", { filepath, range });
  }

  getStats(directory: string): Promise<{ [path: string]: number }> {
    throw new Error("Method not implemented.");
  }

  isTelemetryEnabled(): Promise<boolean> {
    return this.request("isTelemetryEnabled", undefined);
  }

@@ -63,7 +82,7 @@ export class MessageIde implements IDE {
  async showLines(
    filepath: string,
    startLine: number,
    endLine: number
    endLine: number,
  ): Promise<void> {
    return await this.request("showLines", { filepath, startLine, endLine });
  }

@@ -108,7 +127,7 @@ export class MessageIde implements IDE {
  async showDiff(
    filepath: string,
    newContents: string,
    stepIndex: number
    stepIndex: number,
  ): Promise<void> {
    await this.request("showDiff", { filepath, newContents, stepIndex });
  }
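`MessageIde` forwards every `IDE` method over a single request function, so each host (webview, JetBrains, tests) only supplies the transport. A minimal sketch of wiring it to an in-process handler (the handler is a stand-in, not the real messenger):

```typescript
// Stand-in transport: a real host would post a message to the IDE process
// and resolve with its reply.
const ide = new MessageIde(async (messageType, data) => {
  switch (messageType) {
    case "getIdeInfo":
      return { ideType: "vscode", name: "test", version: "0.1" };
    case "readRangeInFile":
      return `stub contents of ${data.filepath}`;
    default:
      return undefined;
  }
});

ide.getIdeInfo().then((info) => console.log(info.name)); // "test"
```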
@@ -1,7 +1,8 @@
import { TabAutocompleteOptions } from "..";

export const DEFAULT_AUTOCOMPLETE_OPTS: TabAutocompleteOptions = {
  useCopyBuffer: true,
  disable: false,
  useCopyBuffer: false,
  useSuffix: true,
  maxPromptTokens: 500,
  prefixPercentage: 0.85,

@@ -16,3 +17,13 @@ export const DEFAULT_AUTOCOMPLETE_OPTS: TabAutocompleteOptions = {
  onlyMyCode: true,
  useOtherFiles: false,
};

export const RETRIEVAL_PARAMS = {
  rerankThreshold: 0.3,
  nFinal: 10,
  nRetrieve: 20,
  bm25Threshold: -2.5,
};

// export const SERVER_URL = "http://localhost:3000";
export const SERVER_URL = "https://proxy-server-green-l6vsfbzhba-uw.a.run.app";
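The new `RETRIEVAL_PARAMS` constants describe a retrieve-then-rerank funnel: over-fetch `nRetrieve` candidates, drop anything scoring below `rerankThreshold`, and keep the top `nFinal`. A hedged sketch of that flow, with hypothetical `retrieve` and `rerank` helpers (not functions from this diff):

```typescript
async function retrieveAndRerank(
  query: string,
  retrieve: (q: string, n: number) => Promise<string[]>, // hypothetical
  rerank: (q: string, chunk: string) => Promise<number>, // hypothetical
): Promise<string[]> {
  // Over-fetch, score, filter by threshold, then truncate to nFinal
  const candidates = await retrieve(query, RETRIEVAL_PARAMS.nRetrieve);
  const scored = await Promise.all(
    candidates.map(async (c) => ({ c, score: await rerank(query, c) })),
  );
  return scored
    .filter(({ score }) => score >= RETRIEVAL_PARAMS.rerankThreshold)
    .sort((a, b) => b.score - a.score)
    .slice(0, RETRIEVAL_PARAMS.nFinal)
    .map(({ c }) => c);
}
```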
@@ -1,13 +1,21 @@
import os from "os";

export class Telemetry {
  // Set to undefined whenever telemetry is disabled
  static client: any = undefined;
  static uniqueId: string = "NOT_UNIQUE";
  static os: string | undefined = undefined;
  static extensionVersion: string | undefined = undefined;

  static async capture(event: string, properties: any) {
  static async capture(event: string, properties: { [key: string]: any }) {
    Telemetry.client?.capture({
      distinctId: Telemetry.uniqueId,
      event,
      properties,
      properties: {
        ...properties,
        os: Telemetry.os,
        extensionVersion: Telemetry.extensionVersion,
      },
    });
  }

@@ -15,8 +23,15 @@ export class Telemetry {
    Telemetry.client?.shutdown();
  }

  static async setup(allow: boolean, uniqueId: string) {
  static async setup(
    allow: boolean,
    uniqueId: string,
    extensionVersion: string,
  ) {
    Telemetry.uniqueId = uniqueId;
    Telemetry.os = os.platform();
    Telemetry.extensionVersion = extensionVersion;

    if (!allow) {
      Telemetry.client = undefined;
    } else {
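`setup` now takes the extension version, and `capture` folds the OS and version into every event's properties. A minimal sketch of the call order (the identifiers are illustrative):

```typescript
async function activate(extensionVersion: string) {
  // Called once at startup; allow=false leaves client undefined, making every
  // later capture() a no-op.
  await Telemetry.setup(true, "machine-id-123", extensionVersion);

  // os and extensionVersion are merged into the event properties
  // automatically by capture().
  await Telemetry.capture("inlineEdit", { model: "gpt-4", provider: "openai" });
}
```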
@@ -4,23 +4,27 @@ import {
  filterEnglishLinesAtEnd,
  filterEnglishLinesAtStart,
  filterLeadingAndTrailingNewLineInsertion,
  fixCodeLlamaFirstLineIndentation,
  stopAtLines,
} from "../autocomplete/lineStream";
import { streamDiff } from "../diff/streamDiff";
import { streamLines } from "../diff/util";
import { gptEditPrompt } from "../llm/templates/edit";
import { dedentAndGetCommonWhitespace, renderPromptTemplate } from "../util";
import { Telemetry } from "./posthog";

function constructPrompt(
  codeToEdit: string,
  prefix: string,
  highlighted: string,
  suffix: string,
  llm: ILLM,
  userInput: string,
  language: string | undefined,
): string | ChatMessage[] {
  const template = llm.promptTemplates?.edit ?? gptEditPrompt;
  return renderPromptTemplate(template, [], {
  return llm.renderPromptTemplate(template, [], {
    userInput,
    codeToEdit,
    prefix,
    codeToEdit: highlighted,
    suffix,
    language: language ?? "",
  });
}

@@ -42,37 +46,62 @@ function modelIsInept(model: string): boolean {
}

export async function* streamDiffLines(
  oldCode: string,
  prefix: string,
  highlighted: string,
  suffix: string,
  llm: ILLM,
  input: string,
  language: string | undefined,
): AsyncGenerator<DiffLine> {
  Telemetry.capture("inlineEdit", {
    model: llm.model,
    provider: llm.providerName,
  });

  // Strip common indentation for the LLM, then add back after generation
  const [withoutIndentation, commonIndentation] =
    dedentAndGetCommonWhitespace(oldCode);
  oldCode = withoutIndentation;
  const oldLines = oldCode.split("\n");
  const prompt = constructPrompt(oldCode, llm, input, language);
  let oldLines =
    highlighted.length > 0
      ? highlighted.split("\n")
      : // When highlighted is empty, combine the last line of the prefix with the first line of the suffix to determine the line being edited
        [(prefix + suffix).split("\n")[prefix.split("\n").length - 1]];

  // But if that line is empty, we can assume we are insertion-only
  if (oldLines.length === 1 && oldLines[0].trim() === "") {
    oldLines = [];
  }

  const prompt = constructPrompt(
    prefix,
    highlighted,
    suffix,
    llm,
    input,
    language,
  );
  const inept = modelIsInept(llm.model);

  const completion =
    typeof prompt === "string"
      ? llm.streamComplete(prompt)
      ? llm.streamComplete(prompt, { raw: true })
      : llm.streamChat(prompt);

  let lines = streamLines(completion);

  if (inept) {
    lines = filterEnglishLinesAtStart(lines);
  }
  lines = filterEnglishLinesAtStart(lines);
  lines = filterCodeBlockLines(lines);
  lines = stopAtLines(lines);
  if (inept) {
    lines = filterEnglishLinesAtEnd(fixCodeLlamaFirstLineIndentation(lines));
    // lines = fixCodeLlamaFirstLineIndentation(lines);
    lines = filterEnglishLinesAtEnd(lines);
  }

  let diffLines = streamDiff(oldLines, lines);
  diffLines = addIndentation(diffLines, commonIndentation);
  diffLines = filterLeadingAndTrailingNewLineInsertion(diffLines);
  if (highlighted.length === 0) {
    const line = prefix.split("\n").slice(-1)[0];
    const indentation = line.slice(0, line.length - line.trimStart().length);
    diffLines = addIndentation(diffLines, indentation);
  }

  for await (let diffLine of diffLines) {
    yield diffLine;
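`streamDiffLines` now takes the prefix, highlighted range, and suffix separately and yields `DiffLine`s as the model streams. A hedged consumption sketch, assuming a `DiffLine` carries a change type plus the line text:

```typescript
async function applyInlineEdit(llm: ILLM) {
  // Assumed editor state: the selection split into prefix/highlighted/suffix
  const generator = streamDiffLines(
    "function add(a: number, b: number) {\n",
    "  return a - b;\n",
    "}\n",
    llm,
    "fix the bug",
    "typescript",
  );

  for await (const diffLine of generator) {
    // The IDE renders these as green/red blocks that can be accepted or
    // rejected block-by-block.
    console.log(diffLine.type, diffLine.line);
  }
}
```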
@@ -73,6 +73,7 @@ export type WebviewProtocol = Protocol &

  errorPopup: [{ message: string }, void];
  "index/setPaused": [boolean, void];
  "index/forceReIndex": [undefined, void];
  openUrl: [string, void];
  applyToCurrentFile: [{ text: string }, void];
  showTutorial: [undefined, void];

@@ -85,6 +86,20 @@ export type WebviewProtocol = Protocol &
  toggleFullScreen: [undefined, void];
  "stats/getTokensPerDay": [undefined, { day: string; tokens: number }[]];
  "stats/getTokensPerModel": [undefined, { model: string; tokens: number }[]];
  insertAtCursor: [{ text: string }, void];
  copyText: [{ text: string }, void];
  "jetbrains/editorInsetHeight": [{ height: number }, void];
  completeOnboarding: [
    {
      mode:
        | "local"
        | "optimized"
        | "custom"
        | "localExistingUser"
        | "optimizedExistingUser";
    },
    void,
  ];
};

export type ReverseWebviewProtocol = {

@@ -117,4 +132,5 @@ export type ReverseWebviewProtocol = {
  refreshSubmenuItems: [undefined, void];
  setTheme: [{ theme: any }, void];
  setColors: [{ [key: string]: string }, void];
  "jetbrains/editorInsetRefresh": [undefined, void];
};
@@ -3044,7 +3044,7 @@ ieee754@^1.1.13, ieee754@^1.2.1:
  resolved "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz"
  integrity sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==

ignore@^5.2.0:
ignore@^5.2.0, ignore@^5.3.1:
  version "5.3.1"
  resolved "https://registry.npmjs.org/ignore/-/ignore-5.3.1.tgz"
  integrity sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==

@@ -5730,7 +5730,7 @@ v8-to-istanbul@^9.0.1:
  "@types/istanbul-lib-coverage" "^2.0.1"
  convert-source-map "^2.0.0"

vectordb@^0.4.12:
vectordb@0.4.12:
  version "0.4.12"
  resolved "https://registry.npmjs.org/vectordb/-/vectordb-0.4.12.tgz"
  integrity sha512-H2mFwZ87d3BcuX4yGx5RoW06amRZQ3W5u/FRTNc86rwVINuEEa+Ivsqng1i8hURHXoTDbF/0Y9qwCshySkeznQ==
@@ -167,7 +167,7 @@ export function modifyConfig(config: Config): Config {
      title: "My Custom LLM",
      model: "mistral-7b",
    },
    streamComplete: async function* (prompt, options) {
    streamCompletion: async function* (prompt: string, options: CompletionOptions) {
      // Make the API call here

      // Then yield each part of the completion as it is streamed

@@ -178,5 +178,6 @@ export function modifyConfig(config: Config): Config {
      }
    },
  });
  return config;
}
```
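For context, the renamed `streamCompletion` hook sits inside a custom model definition in `config.ts`. A hedged sketch of the full shape, approximated from the surrounding diff (the endpoint and streaming details are placeholders, not a real API):

```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: { title: "My Custom LLM", model: "mistral-7b" },
    streamCompletion: async function* (
      prompt: string,
      options: CompletionOptions,
    ) {
      // Placeholder endpoint: replace with your own API call, then yield
      // each part of the completion as it arrives.
      const response = await fetch("https://my-llm.example.com/complete", {
        method: "POST",
        body: JSON.stringify({ prompt, temperature: options.temperature }),
      });
      for (const word of (await response.text()).split(" ")) {
        yield word + " ";
      }
    },
  });
  return config;
}
```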
@@ -0,0 +1,26 @@
# AWS Bedrock

To set up Bedrock, add the following to your `config.json` file:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Bedrock: Claude 3 Sonnet",
      "provider": "bedrock",
      "model": "anthropic.claude-3-sonnet-20240229-v1:0",
      "region": "us-east-1"
    }
  ]
}
```

Authentication is handled through temporary or long-term credentials in
`~/.aws/credentials` under a profile called "bedrock".

```title="~/.aws/credentials"
[bedrock]
aws_access_key_id = abcdefg
aws_secret_access_key = hijklmno
aws_session_token = pqrstuvwxyz # Optional: means short-term creds.
```
@@ -78,6 +78,7 @@ This object allows you to customize the behavior of tab-autocomplete. The availa
- `multilineCompletions`: Whether to enable multiline completions ("always", "never", or "auto"). Defaults to "auto".
- `useCache`: Whether to cache and reuse completions when the prompt is the same as a previous one. May be useful to disable for testing purposes.
- `useOtherFiles`: Whether to include context from files outside of the current one. Turning this off should be expected to reduce the accuracy of completions, but might be good for testing.
- `disable`: Disable autocomplete (can also be done from IDE settings)

### Full example
@@ -6,6 +6,12 @@
  "title": "BaseCompletionOptions",
  "type": "object",
  "properties": {
    "stream": {
      "title": "Stream",
      "description": "Whether to stream the LLM response. Currently only respected by the 'anthropic' provider. Otherwise will always stream.",
      "type": "boolean",
      "default": true
    },
    "temperature": {
      "title": "Temperature",
      "description": "The temperature of the completion.",

@@ -128,6 +134,7 @@
      "openai",
      "free-trial",
      "anthropic",
      "bedrock",
      "together",
      "ollama",
      "huggingface-tgi",

@@ -139,12 +146,14 @@
      "llamafile",
      "mistral",
      "deepinfra",
      "flowise"
      "flowise",
      "groq"
    ],
    "markdownEnumDescriptions": [
      "### OpenAI\nUse gpt-4, gpt-3.5-turbo, or any other OpenAI model. See [here](https://openai.com/product#made-for-developers) to obtain an API key.\n\n> [Reference](https://continue.dev/docs/reference/Model%20Providers/openai)",
      "### Free Trial\nNew users can try out Continue for free using a proxy server that securely makes calls to OpenAI using our API key. If you are ready to use your own API key or have used all 250 free uses, you can enter your API key in config.py where it says `apiKey=\"\"` or select another model provider.\n> [Reference](https://continue.dev/docs/reference/Model%20Providers/freetrial)",
      "### Anthropic\nTo get started with Anthropic models, you first need to sign up for the open beta [here](https://claude.ai/login) to obtain an API key.\n> [Reference](https://continue.dev/docs/reference/Model%20Providers/anthropicllm)",
      "### Bedrock\nTo get started with Bedrock, you need to sign up on AWS [here](https://aws.amazon.com/bedrock/claude/)",
      "### Together\nTogether is a hosted service that provides extremely fast streaming of open-source language models. To get started with Together:\n1. Obtain an API key from [here](https://together.ai)\n2. Paste below\n3. Select a model preset\n> [Reference](https://continue.dev/docs/reference/Model%20Providers/togetherllm)",
      "### Ollama\nTo get started with Ollama, follow these steps:\n1. Download from [ollama.ai](https://ollama.ai/) and open the application\n2. Open a terminal and run `ollama run <MODEL_NAME>`. Example model names are `codellama:7b-instruct` or `llama2:7b-text`. You can find the full list [here](https://ollama.ai/library).\n3. Make sure that the model name used in step 2 is the same as the one in config.py (e.g. `model=\"codellama:7b-instruct\"`)\n4. Once the model has finished downloading, you can start asking questions through Continue.\n> [Reference](https://continue.dev/docs/reference/Model%20Providers/ollama)",
      "### Huggingface TGI\n\n> [Reference](https://continue.dev/docs/reference/Model%20Providers/huggingfacetgi)",

@@ -430,6 +439,7 @@
      "claude-3-opus-20240229",
      "claude-3-sonnet-20240229",
      "claude-3-haiku-20240307",
      "starcoder-7b",
      "AUTODETECT"
    ]
  }

@@ -484,6 +494,7 @@
      "stable-code-3b",
      "starcoder-1b",
      "starcoder-3b",
      "starcoder2-3b",
      "mistral-tiny",
      "mistral-small",
      "mistral-medium",

@@ -529,6 +540,34 @@
        }
      }
    },
    {
      "if": {
        "properties": {
          "provider": {
            "enum": ["bedrock"]
          }
        },
        "required": ["provider"]
      },
      "then": {
        "properties": {
          "model": {
            "anyOf": [
              {
                "enum": [
                  "claude-3-sonnet-20240229",
                  "claude-3-haiku-20240307",
                  "claude-2"
                ]
              },
              {
                "type": "string"
              }
            ]
          }
        }
      }
    },
    {
      "if": {
        "properties": {

@@ -639,7 +678,8 @@
              "deepseek-1b",
              "stable-code-3b",
              "starcoder-1b",
              "starcoder-3b"
              "starcoder-3b",
              "starcoder2-3b"
            ]
          },
          {

@@ -686,6 +726,7 @@
              "stable-code-3b",
              "starcoder-1b",
              "starcoder-3b",
              "starcoder2-3b",
              "AUTODETECT"
            ]
          },

@@ -715,6 +756,23 @@
        }
      }
    },
    {
      "if": {
        "properties": {
          "provider": {
            "enum": ["groq"]
          }
        },
        "required": ["provider"]
      },
      "then": {
        "properties": {
          "model": {
            "enum": ["llama2-70b", "mistral-8x7b", "gemma"]
          }
        }
      }
    },
    {
      "if": {
        "properties": {

@@ -876,7 +934,8 @@
              "deepseek-1b",
              "stable-code-3b",
              "starcoder-1b",
              "starcoder-3b"
              "starcoder-3b",
              "starcoder2-3b"
            ]
          },
          {

@@ -932,7 +991,8 @@
            "edit",
            "comment",
            "http",
            "commit"
            "commit",
            "review"
          ],
          "type": "string",
          "markdownEnumDescriptions": [

@@ -943,7 +1003,8 @@
            "Edit the highlighted code with given instructions",
            "Add comments to the highlighted code",
            "Write a custom slash command at your own HTTP endpoint. Set 'url' in the params object for the endpoint you have set up.",
            "Generate a commit message for the current changes"
            "Generate a commit message for the current changes",
            "Review code and give feedback"
          ]
        },
        {

@@ -1314,9 +1375,26 @@
            "apiVersion": {
              "type": "integer",
              "markdownDescription": "This context provider supports both Jira API version 2 and 3. It will use version 3 by default since that's what the cloud version uses, but if you have the datacenter version of Jira, you'll need to set the API Version to 2 using the `apiVersion` property."
            },
            "requestOptions": {
              "title": "Request Options",
              "description": "Options for the HTTPS request to Jira.",
              "default": {
                "timeout": 7200,
                "verifySsl": null,
                "caBundlePath": null,
                "proxy": null,
                "headers": null,
                "extraBodyProperties": null
              },
              "allOf": [
                {
                  "$ref": "#/definitions/RequestOptions"
                }
              ]
            }
          },
          "required": ["domain", "email", "token"]
          "required": ["domain", "token"]
        }
      },
      "required": ["params"]

@@ -1559,7 +1637,7 @@
      "type": "object",
      "properties": {
        "provider": {
          "enum": ["transformers.js", "ollama", "openai"]
          "enum": ["transformers.js", "ollama", "openai", "free-trial"]
        },
        "model": {
          "type": "string"

@@ -1588,9 +1666,74 @@
        }
      ]
    },
    "reranker": {
      "title": "Reranker",
      "markdownDescription": "The reranker is responsible for selecting the final results when retrieving snippets from your codebase.",
      "type": "object",
      "properties": {
        "name": {
          "enum": ["voyage", "llm", "free-trial"]
        },
        "params": {
          "type": "object"
        }
      },
      "required": ["name"],
      "allOf": [
        {
          "if": {
            "properties": {
              "name": {
                "enum": ["llm"]
              }
            },
            "required": ["name"]
          },
          "then": {
            "properties": {
              "params": {
                "type": "object",
                "properties": {
                  "modelTitle": {
                    "type": "string"
                  }
                },
                "required": ["modelTitle"]
              }
            }
          }
        },
        {
          "if": {
            "properties": {
              "name": {
                "enum": ["voyage"]
              }
            },
            "required": ["name"]
          },
          "then": {
            "properties": {
              "params": {
                "type": "object",
                "properties": {
                  "apiKey": {
                    "type": "string"
                  },
                  "model": {
                    "enum": ["rerank-lite-1"]
                  }
                },
                "required": ["apiKey"]
              }
            }
          }
        }
      ]
    },
    "tabAutocompleteModel": {
      "title": "Tab Autocomplete Model",
      "markdownDescription": "The model used for tab autocompletion. If undefined, Continue will default to using starcoder:3b on a local Ollama instance.\n\n*IMPORTANT*:\n\nIf you use a custom model, ensure that it is one trained for fill-in-the-middle completions. An instruct model is typically not well-suited to autocomplete and you may receive unsatisfactory completions.",
      "markdownDescription": "The model used for tab autocompletion. If undefined, Continue will default to using starcoder2:3b on a local Ollama instance.\n\n*IMPORTANT*:\n\nIf you use a custom model, ensure that it is one trained for fill-in-the-middle completions. An instruct model is typically not well-suited to autocomplete and you may receive unsatisfactory completions.",
      "default": {
        "title": "Tab Autocomplete Model",
        "provider": "ollama",

@@ -1603,6 +1746,11 @@
      "type": "object",
      "markdownDescription": "These options let you customize your tab-autocomplete experience. Read about all options in [the docs](https://continue.dev/docs/walkthroughs/tab-autocomplete#configuration-options).",
      "properties": {
        "disable": {
          "type": "boolean",
          "description": "Disable tab autocomplete. This can also be done from the IDE settings.",
          "default": false
        },
        "useCopyBuffer": {
          "type": "boolean",
          "description": "Determines whether the copy buffer will be considered when constructing the prompt."

@@ -1649,6 +1797,48 @@
        }
      },
      "required": []
    },
    "ui": {
      "type": "object",
      "properties": {
        "codeBlockToolbarPosition": {
          "enum": ["top", "bottom"],
          "default": "top",
          "description": "Whether to show the copy and apply code buttons at the top or bottom of code blocks in the sidebar."
        }
      }
    },
    "experimental": {
      "type": "object",
      "title": "Experimental",
      "description": "Experimental properties are subject to change.",
      "properties": {
        "contextMenuPrompts": {
          "type": "object",
          "properties": {
            "comment": {
              "type": "string",
              "default": "Write comments for this code. Do not change anything about the code itself."
            },
            "docstring": {
              "type": "string",
              "default": "Write a docstring for this code. Do not change anything about the code itself."
            },
            "fix": {
              "type": "string",
              "default": "Fix this code"
            },
            "optimize": {
              "type": "string",
              "default": "Optimize this code"
            },
            "fixGrammar": {
              "type": "string",
              "default": "If there are any grammar or spelling mistakes in this writing, fix them. Do not make other large changes to the writing."
            }
          }
        }
      }
    }
  }
}
@@ -80,7 +80,7 @@ koverReport {

tasks {
  prepareSandbox {
    from("../../pkg/bin") {
    from("../../binary/bin") {
      into("${intellij.pluginName.get()}/core/")
    }
  }
@@ -4,7 +4,7 @@ pluginGroup = com.github.continuedev.continueintellijextension
pluginName = continue-intellij-extension
pluginRepositoryUrl = https://github.com/continuedev/continue
# SemVer format -> https://semver.org
pluginVersion = 0.0.39
pluginVersion = 0.0.42

# Supported build number ranges and IntelliJ Platform versions -> https://plugins.jetbrains.com/docs/intellij/build-number-ranges.html
pluginSinceBuild = 223
@@ -1,11 +1,15 @@
package com.github.continuedev.continueintellijextension.actions

import com.github.continuedev.continueintellijextension.editor.DiffStreamService
import com.github.continuedev.continueintellijextension.services.ContinuePluginService
import com.google.gson.Gson
import com.intellij.openapi.actionSystem.AnAction
import com.intellij.openapi.actionSystem.AnActionEvent
import com.intellij.openapi.actionSystem.PlatformDataKeys
import com.intellij.openapi.components.ServiceManager
import com.intellij.openapi.ui.DialogWrapper
import com.intellij.openapi.components.service
import com.intellij.openapi.fileEditor.FileEditorManager
import com.intellij.openapi.ui.DialogWrapper
import com.intellij.openapi.ui.Messages
import com.intellij.openapi.wm.ToolWindowManager
import java.awt.Dimension

@@ -25,16 +29,40 @@ fun pluginServiceFromActionEvent(e: AnActionEvent): ContinuePluginService? {

class AcceptDiffAction : AnAction() {
  override fun actionPerformed(e: AnActionEvent) {
    acceptHorizontalDiff(e)
    acceptVerticalDiff(e)
  }

  private fun acceptHorizontalDiff(e: AnActionEvent) {
    val continuePluginService = pluginServiceFromActionEvent(e) ?: return
    continuePluginService.ideProtocolClient?.diffManager?.acceptDiff(null)
  }

  private fun acceptVerticalDiff(e: AnActionEvent) {
    val project = e.project ?: return
    val editor = e.getData(PlatformDataKeys.EDITOR) ?: FileEditorManager.getInstance(project).selectedTextEditor ?: return
    val diffStreamService = project.service<DiffStreamService>()
    diffStreamService.accept(editor)
  }
}

class RejectDiffAction : AnAction() {
  override fun actionPerformed(e: AnActionEvent) {
    rejectHorizontalDiff(e)
    rejectVerticalDiff(e)
  }

  private fun rejectHorizontalDiff(e: AnActionEvent) {
    val continuePluginService = pluginServiceFromActionEvent(e) ?: return
    continuePluginService.ideProtocolClient?.diffManager?.rejectDiff(null)
  }

  private fun rejectVerticalDiff(e: AnActionEvent) {
    val project = e.project ?: return
    val editor = e.getData(PlatformDataKeys.EDITOR) ?: FileEditorManager.getInstance(project).selectedTextEditor ?: return
    val diffStreamService = project.service<DiffStreamService>()
    diffStreamService.reject(editor)
  }
}
@@ -237,7 +237,7 @@ class ContinuePluginStartupActivity : StartupActivity, Disposable, DumbAware {

  val corePath = Paths.get(pluginPath.toString(), "core").toString()
  val targetPath = Paths.get(corePath, target).toString()
  val continueCorePath = Paths.get(targetPath, "pkg" + (if (os == "win32") ".exe" else "")).toString()
  val continueCorePath = Paths.get(targetPath, "continue-binary" + (if (os == "win32") ".exe" else "")).toString()

  // Copy targetPath / node_sqlite3.node to core / node_sqlite3.node
  val nodeSqlite3Path = Paths.get(targetPath, "node_sqlite3.node")

@@ -253,7 +253,7 @@ class ContinuePluginStartupActivity : StartupActivity, Disposable, DumbAware {
  // esbuild needs permissions
  val esbuildPath = Paths.get(targetPath, "esbuild" + (if (os == "win32") ".exe" else "")).toString()

  val coreMessenger = CoreMessenger(esbuildPath, continueCorePath, ideProtocolClient);
  val coreMessenger = CoreMessenger(project, esbuildPath, continueCorePath, ideProtocolClient);
  continuePluginService.coreMessenger = coreMessenger
}
}