dev (#1542)
* docs: add docs and schema for "OS" provider (#1536)
* ignore .env
* fix(gui): ctx rendering w/ renderInlineAs: "" (#1541)
* ✨ use and cache imports for autocomplete (#1456)
* ✨ use and cache imports for autocomplete
* fix tsc
* fix codeqwen autocomplete leading space
* add voyage rerank-1
* feat: `--noEmit` for tsc checks in CI (#1559)
* docs: update CustomContextProvider docs (#1557)
* add stop tokens to qwen prompt
* update docs to reflect 3.5 sonnet being best
* docs: comment out unused providers (#1561)
* import Handlebars
* feat: toast notification for config updates (#1560)
* feat: toast notification for config updates
* feat: only trigger toast on config.json save
* displayRawMarkdown option
* feat: open pane on install (#1564)
* feat: open pane on activation
* comment out testing code
* update to reflect 16 stop words limit for deepseek
* feat: only trigger config update toast in vscode (#1571)
* docs(prompt-files): fix typos + clarify (#1575)
* doc: prompt file typo + clarifications
* fix: add back correct docs
* chore: add telemetry for pageviews (#1576)
* feat: update onboarding w/ embeddings model (#1570)
* chore(gui): remove unused pages
* feat: add embeddings step
* feat: update styles
* feat: copy button updates
* fix: correct pull command for embed model
* fix: remove commented code
* fix: remove commented code
* feat: simplify copy btn props
* chore: rename onboarding selection event
* feat: add provider config
* fix: undo msg name
* remove dead code
* fix: invalid mode check
* fix: remove testing logic
* docs(telemetry): add pageviews to tracking list (#1581)
* Add reranker configuration options to codebase embedding docs (#1584)
  - Introduce reranker concept
  - List available reranker options
  - Provide configuration instructions
  - Update keywords to include "reranker"
* chore: update pr template with screenshots (#1590)
* Refactor ConfirmationDialog to use SecondaryButton for cancel action (#1586)
* Added instructions for running docs server locally (#1578)
  - Added NPM script method
  - Added VS Code task method
  - Update contributing guidelines
* Update branch policy (#1577)
  - Change PR target to `dev` branch
  - Update `CONTRIBUTING.md` instructions
* Consolidate example configurations into the main configuration guide (#1579)
  - Moved examples to configuration.md
  - Deleted the separate examples.md file
  - Updated sidebar order and links
  - Improved readability and structure in configuration.md
* fix: fullscreen gui retains context when hidden, fixed fullscreen focusing (#1582)
* Update completionProvider.ts (warning tab-autocomplete models) (#1566)
* feat: enhanced IndexingProgressBar with blinking dot feature
  - Integrated BlinkingDot component
  - Added STATUS_COLORS for various states
  - Replaced CircleDiv with BlinkingDot in UI
  - Updated status messages and layout
* small UI tweaks
* feat(gui): enhance ModelCard, ModelProviderTag, and Toggle components (#1595)
  - add styling and adjustments to ModelCard
  - update ModelProviderTag font size
  - remove box shadow from Toggle component
  - tweak icon styles in ModelCard
  - improve alignment and spacing
* media query
* feat: add best experience onboarding
* fix: file rename
* stop movement on button hover by keeping same border thickness
* fix mistake in setting cursor: pointer
* fix when free trial option is shown
* Support Node.js versions below 20 for streaming response handling (#1591)
  - Add fallback for Node < 20
  - Implement toAsyncIterable for streaming
  - Use TextDecoder for manual decoding
  - Maintain existing streaming for Node 20+
* small fixes
* feat: add free trial card to onboarding (#1600)
* feat: add free trial card to onboarding
* add import
* fix hasPassedFTL
* fix /edit cancellation from UI
* feat: add `applyCodeBlock` experimental prop (#1601)
* feat: add new model styling improvements (#1609)
* feat: add new model styling improvements
* better gap size
* feat: update bug_report.yml (#1610)
* chore: update bug_report.yml
* typo fix
* feat: add labels to "Add docs" dialog (#1612)
* feat: add labels to "Add docs" dialog
* remove autofocus
* don't double load config
* small fixes
* speed up directory traversal, and use correct native path module
* option not to show config update toast
* merge air-gapped and recommended setup pages
* chore: add telemetry for full screen toggle (#1618)
* Fix headings in codebase-embeddings.md (#1617)
* mention jetbrains
* docs: update changie (#1619)
* feat: updated changie config
* hide toc and autogenerate
* Update changelog.mdx
* link to deeper explanation of embeddings models
* ensure target="_blank" for all links in sidebar
* fix gif links in intellij README.md
* don't require rust in dependency installation
* chore: fix padding on gh button (#1620)
* chore: adjust button padding
* Update tasks.json
* escape colons in diff path
* smoother lancedb indexing reporting
* smooth progress updates for indexing
* fix tsc err
* rerank-lite-1

---------

Co-authored-by: Patrick Erichsen <patrick.a.erichsen@gmail.com>
Co-authored-by: Priyash <38959321+priyashpatil@users.noreply.github.com>
Co-authored-by: Jonah Wagner <jonahwagner231@gmail.com>
Co-authored-by: YohannZe <99359799+YohannZe@users.noreply.github.com>
Co-authored-by: Dan Dascalescu <ddascalescu+github@gmail.com>
parent 23ba0fadf5
commit 04c08c7159
@@ -1,4 +1,3 @@
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),

@@ -33,7 +33,3 @@ projects:
  - label: extensions/vscode
    key: extensions/vscode
    changelog: extensions/vscode/CHANGELOG.md

  - label: docs
    key: docs
    changelog: docs/CHANGELOG.md # for releasing documents independently
@@ -21,15 +21,23 @@ body:
    attributes:
      label: Relevant environment info
      description: |
        Feel free to omit any info that is not relevant to your issue.

        - **OS**: e.g. macOS 14.2.1
        - **Continue**: e.g. v0.9.4
        - **IDE**: e.g. VSCode 1.85.1
        - Include also your LLM Model if it is relevant to your issue
        - **Model**: e.g. Ollama v0.1.19
        - The contents of your config.json file (e.g. `~/.continue/config.json` on macOS)
      value: |
        - OS:
        - Continue:
        - IDE:
        - Model:
        - config.json:
          ```json

          ```
      render: Markdown
    validations:
      required: false
@@ -6,3 +6,7 @@

- [ ] The base branch of this PR is `dev`, rather than `main`
- [ ] The relevant docs, if any, have been updated or created

## Screenshots

If your updates include visual changes, please share screenshots below.
@@ -72,19 +72,19 @@ jobs:
      - name: tsc core
        run: |
          cd core
          npx tsc
          npx tsc --noEmit

      - name: tsc extensions/vscode
        run: |
          cd extensions/vscode
          npx tsc
          npx tsc --noEmit

      - name: tsc binary
        run: |
          cd binary
          npx tsc
          npx tsc --noEmit

      - name: tsc gui
        run: |
          cd gui
          npx tsc
          npx tsc --noEmit
@@ -12,7 +12,9 @@
        // To bundle the code the same way we do for publishing
        "vscode-extension:esbuild",
        // Start the React app that is used in the extension
        "gui:dev"
        "gui:dev",
        // Start the docs site, without opening the browser
        "docs:start"
      ],
      "group": {
        "kind": "build",

@@ -159,7 +161,7 @@
      "label": "docs:start",
      "type": "shell",
      "command": "npm",
      "args": ["run", "start"],
      "args": ["run", "start", "--", "--no-open"],
      "problemMatcher": [],
      "options": {
        "cwd": "docs"
@@ -2,21 +2,33 @@

## Table of Contents

- [Contributing to Continue](#contributing-to-continue)
- [Table of Contents](#table-of-contents)
- [❤️ Ways to Contribute](#️-ways-to-contribute)
- [👋 Continue Contribution Ideas](#-continue-contribution-ideas)
- [🐛 Report Bugs](#-report-bugs)
- [✨ Suggest Enhancements](#-suggest-enhancements)
- [📖 Updating / Improving Documentation](#-updating--improving-documentation)
- [Running the Documentation Server Locally](#running-the-documentation-server-locally)
- [Method 1: NPM Script](#method-1-npm-script)
- [Method 2: VS Code Task](#method-2-vs-code-task)
- [🧑💻 Contributing Code](#-contributing-code)
- [Environment Setup](#environment-setup)
- [Pre-requisites](#pre-requisites)
- [Fork the Continue Repository with All Branches](#fork-the-continue-repository-with-all-branches)
- [VS Code](#vs-code)
- [Debugging](#debugging)
- [JetBrains](#jetbrains)
- [Debugging](#debugging-1)
- [Formatting](#formatting)
- [Writing Slash Commands](#writing-slash-commands)
- [Writing Context Providers](#writing-context-providers)
- [Adding an LLM Provider](#adding-an-llm-provider)
- [Adding Models](#adding-models)
- [Adding Pre-indexed Documentation](#adding-pre-indexed-documentation)
- [📐 Continue Architecture](#-continue-architecture)
- [Continue VS Code Extension](#continue-vs-code-extension)
- [Continue JetBrains Extension](#continue-jetbrains-extension)
- [📐 Continue Architecture](#-continue-architecture)
- [Continue VS Code Extension](#continue-vs-code-extension)
- [Continue JetBrains Extension](#continue-jetbrains-extension)

# ❤️ Ways to Contribute

@@ -50,9 +62,32 @@ Continue is quickly adding features, and we'd love to hear which are the most im

Continue is continuously improving, but a feature isn't complete until it is reflected in the documentation! If you see something out-of-date or missing, you can help by clicking "Edit this page" at the bottom of any page on [docs.continue.dev](https://docs.continue.dev).

### Running the Documentation Server Locally

You can run the documentation server locally using either of the following methods:

#### Method 1: NPM Script

1. Open your terminal and navigate to the root directory of the project.

2. Run the following command to start the documentation server:

   ```bash
   npm run start --prefix docs
   ```

#### Method 2: VS Code Task

1. Open VS Code in the root directory of the project.

2. Open the VS Code command palette (`cmd/ctrl+shift+p`) and select `Tasks: Run Task`.

3. Look for the `docs:start` task and select it.

This will start a local server and you can see the documentation rendered in your default browser, typically accessible at `http://localhost:3000`.

## 🧑💻 Contributing Code

> Please make PRs to the `preview` branch. We use this to first test changes in a pre-release version of the extension.
> Please make PRs to the `dev` branch. We use this to first test changes in a pre-release version of the extension.

### Environment Setup

@@ -70,9 +105,9 @@ nvm use

2. Clone your forked repository to your local machine. Use: `git clone https://github.com/YOUR_USERNAME/continue.git`

3. Navigate to the cloned directory and switch to the **preview** branch. Execute: `git checkout preview`, then create your feature/fix branch from there, like so: `git checkout -b 123-my-feature-branch`
3. Navigate to the cloned directory and switch to the **dev** branch. Execute: `git checkout dev`, then create your feature/fix branch from there, like so: `git checkout -b 123-my-feature-branch`

4. When you're ready to submit your changes, send your pull request specifically to the **preview** branch.
4. When you're ready to submit your changes, send your pull request specifically to the **dev** branch.

#### VS Code
@@ -3,6 +3,6 @@ import { MessageIde } from "core/util/messageIde";

export class IpcIde extends MessageIde {
  constructor(messenger: TODO) {
    super(messenger.request.bind(messenger));
    super(messenger.request.bind(messenger), messenger.on.bind(messenger));
  }
}
@@ -0,0 +1,81 @@
import { IDE } from "..";
import { RangeInFileWithContents } from "../commands/util";
import { PrecalculatedLruCache } from "../util/LruCache";
import {
  TSQueryType,
  getParserForFile,
  getQueryForFile,
} from "../util/treeSitter";

interface FileInfo {
  imports: { [key: string]: RangeInFileWithContents[] };
}

export class ImportDefinitionsService {
  static N = 10;

  private cache: PrecalculatedLruCache<FileInfo> =
    new PrecalculatedLruCache<FileInfo>(
      this._getFileInfo.bind(this),
      ImportDefinitionsService.N,
    );

  constructor(private readonly ide: IDE) {
    ide.onDidChangeActiveTextEditor((filepath) => {
      this.cache.initKey(filepath);
    });
  }

  get(filepath: string): FileInfo | undefined {
    return this.cache.get(filepath);
  }

  private async _getFileInfo(filepath: string): Promise<FileInfo> {
    const parser = await getParserForFile(filepath);
    if (!parser) {
      return {
        imports: {},
      };
    }
    const ast = parser.parse(await this.ide.readFile(filepath), undefined, {
      includedRanges: [
        {
          startIndex: 0,
          endIndex: 10_000,
          startPosition: { row: 0, column: 0 },
          endPosition: { row: 100, column: 0 },
        },
      ],
    });
    const query = await getQueryForFile(filepath, TSQueryType.Imports);
    if (!query) {
      return {
        imports: {},
      };
    }

    const matches = query?.matches(ast.rootNode);

    const fileInfo: FileInfo = {
      imports: {},
    };
    for (const match of matches) {
      const startPosition = match.captures[0].node.startPosition;
      const defs = await this.ide.gotoDefinition({
        filepath,
        position: {
          line: startPosition.row,
          character: startPosition.column,
        },
      });
      fileInfo.imports[match.captures[0].node.text] = await Promise.all(
        defs.map(async (def) => ({
          ...def,
          contents: await this.ide.readRangeInFile(def.filepath, def.range),
        })),
      );
    }

    return fileInfo;
  }
}
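A minimal usage sketch of the `ImportDefinitionsService` added above, based only on the interface visible in this diff; the `ide` value and the example path are placeholders for whatever `IDE` implementation and files the host supplies:

```ts
import { IDE } from "..";
import { ImportDefinitionsService } from "./ImportDefinitionsService";

declare const ide: IDE; // assumed to be provided by the host extension (VS Code, JetBrains, ...)

const importDefinitions = new ImportDefinitionsService(ide);

// The constructor subscribes to onDidChangeActiveTextEditor, so the LRU cache
// is pre-computed in the background as files become active. Reads are
// synchronous and may return undefined until that precalculation finishes.
const info = importDefinitions.get("/workspace/src/example.ts");
for (const [symbol, definitions] of Object.entries(info?.imports ?? {})) {
  console.log(symbol, definitions.map((d) => d.filepath));
}
```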
@@ -0,0 +1,45 @@
import { IDE, Location } from "..";
import { LANGUAGES } from "./languages";
import { getSymbolsForSnippet } from "./ranking";

interface FileInfo {
  filepath: string;
}

export class NearbyDefinitionsService {
  static N = 10;

  constructor(private readonly ide: IDE) {}

  async getDefinitionsForLine(filepath: string, line: number) {
    const lineContent = await this.ide.readRangeInFile(filepath, {
      start: {
        line,
        character: 0,
      },
      end: {
        line: line + 1,
        character: 0,
      },
    });

    // Remove keywords
    const lang = LANGUAGES[filepath.split(".").slice(-1)[0]];
    const symbols = Array.from(getSymbolsForSnippet(lineContent))
      .filter((s) => s.length > 0)
      .filter((s) => !(lang && lang?.stopWords?.includes(s)));

    return Promise.all(
      symbols.map((s) => {
        const character = lineContent.indexOf(s);
        const pos: Location = {
          filepath,
          position: {
            line,
            character,
          },
        };
      }),
    );
  }
}
@ -1,5 +1,6 @@
|
|||
import Handlebars from "handlebars";
|
||||
import ignore from "ignore";
|
||||
import OpenAI from "openai";
|
||||
import path from "path";
|
||||
import { v4 as uuidv4 } from "uuid";
|
||||
import { RangeInFileWithContents } from "../commands/util.js";
|
||||
|
@ -14,7 +15,6 @@ import {
|
|||
Range,
|
||||
TabAutocompleteOptions,
|
||||
} from "../index.js";
|
||||
import OpenAI from "../llm/llms/OpenAI.js";
|
||||
import { logDevData } from "../util/devdata.js";
|
||||
import { getBasename, getLastNPathParts } from "../util/index.js";
|
||||
import {
|
||||
|
@ -23,6 +23,7 @@ import {
|
|||
} from "../util/parameters.js";
|
||||
import { Telemetry } from "../util/posthog.js";
|
||||
import { getRangeInString } from "../util/ranges.js";
|
||||
import { ImportDefinitionsService } from "./ImportDefinitionsService.js";
|
||||
import { BracketMatchingService } from "./brackets.js";
|
||||
import AutocompleteLruCache from "./cache.js";
|
||||
import {
|
||||
|
@ -138,342 +139,6 @@ export type GetLspDefinitionsFunction = (
|
|||
lang: AutocompleteLanguageInfo,
|
||||
) => Promise<AutocompleteSnippet[]>;
|
||||
|
||||
export async function getTabCompletion(
|
||||
token: AbortSignal,
|
||||
options: TabAutocompleteOptions,
|
||||
llm: ILLM,
|
||||
ide: IDE,
|
||||
generatorReuseManager: GeneratorReuseManager,
|
||||
input: AutocompleteInput,
|
||||
getDefinitionsFromLsp: GetLspDefinitionsFunction,
|
||||
bracketMatchingService: BracketMatchingService,
|
||||
): Promise<AutocompleteOutcome | undefined> {
|
||||
const startTime = Date.now();
|
||||
|
||||
const {
|
||||
filepath,
|
||||
pos,
|
||||
recentlyEditedFiles,
|
||||
recentlyEditedRanges,
|
||||
clipboardText,
|
||||
manuallyPassFileContents,
|
||||
manuallyPassPrefix,
|
||||
} = input;
|
||||
const fileContents =
|
||||
manuallyPassFileContents ?? (await ide.readFile(filepath));
|
||||
const fileLines = fileContents.split("\n");
|
||||
|
||||
// Filter
|
||||
const lang = languageForFilepath(filepath);
|
||||
const line = fileLines[pos.line] ?? "";
|
||||
for (const endOfLine of lang.endOfLine) {
|
||||
if (line.endsWith(endOfLine) && pos.character >= line.length) {
|
||||
return undefined;
|
||||
}
|
||||
}
|
||||
|
||||
// Model
|
||||
if (!llm) {
|
||||
return;
|
||||
}
|
||||
if (llm instanceof OpenAI) {
|
||||
llm.useLegacyCompletionsEndpoint = true;
|
||||
} else if (
|
||||
llm.providerName === "free-trial" &&
|
||||
llm.model !== TRIAL_FIM_MODEL
|
||||
) {
|
||||
llm.model = TRIAL_FIM_MODEL;
|
||||
}
|
||||
|
||||
if (
|
||||
!shownGptClaudeWarning &&
|
||||
nonAutocompleteModels.some((model) => llm.model.includes(model)) &&
|
||||
!llm.model.includes("deepseek")
|
||||
) {
|
||||
shownGptClaudeWarning = true;
|
||||
throw new Error(
|
||||
`Warning: ${llm.model} is not trained for tab-autocomplete, and will result in low-quality suggestions. See the docs to learn more about why: https://docs.continue.dev/walkthroughs/tab-autocomplete#i-want-better-completions-should-i-use-gpt-4`,
|
||||
);
|
||||
}
|
||||
|
||||
// Prompt
|
||||
let fullPrefix =
|
||||
getRangeInString(fileContents, {
|
||||
start: { line: 0, character: 0 },
|
||||
end: input.selectedCompletionInfo?.range.start ?? pos,
|
||||
}) + (input.selectedCompletionInfo?.text ?? "");
|
||||
|
||||
if (input.injectDetails) {
|
||||
const lines = fullPrefix.split("\n");
|
||||
fullPrefix = `${lines.slice(0, -1).join("\n")}\n${
|
||||
lang.singleLineComment
|
||||
} ${input.injectDetails.split("\n").join(`\n${lang.singleLineComment} `)}\n${
|
||||
lines[lines.length - 1]
|
||||
}`;
|
||||
}
|
||||
|
||||
const fullSuffix = getRangeInString(fileContents, {
|
||||
start: pos,
|
||||
end: { line: fileLines.length - 1, character: Number.MAX_SAFE_INTEGER },
|
||||
});
|
||||
|
||||
// First non-whitespace line below the cursor
|
||||
let lineBelowCursor = "";
|
||||
let i = 1;
|
||||
while (
|
||||
lineBelowCursor.trim() === "" &&
|
||||
pos.line + i <= fileLines.length - 1
|
||||
) {
|
||||
lineBelowCursor = fileLines[Math.min(pos.line + i, fileLines.length - 1)];
|
||||
i++;
|
||||
}
|
||||
|
||||
let extrasSnippets = options.useOtherFiles
|
||||
? ((await Promise.race([
|
||||
getDefinitionsFromLsp(
|
||||
filepath,
|
||||
fullPrefix + fullSuffix,
|
||||
fullPrefix.length,
|
||||
ide,
|
||||
lang,
|
||||
),
|
||||
new Promise((resolve) => {
|
||||
setTimeout(() => resolve([]), 100);
|
||||
}),
|
||||
])) as AutocompleteSnippet[])
|
||||
: [];
|
||||
|
||||
const workspaceDirs = await ide.getWorkspaceDirs();
|
||||
if (options.onlyMyCode) {
|
||||
extrasSnippets = extrasSnippets.filter((snippet) => {
|
||||
return workspaceDirs.some((dir) => snippet.filepath.startsWith(dir));
|
||||
});
|
||||
}
|
||||
|
||||
let { prefix, suffix, completeMultiline, snippets } =
|
||||
await constructAutocompletePrompt(
|
||||
filepath,
|
||||
pos.line,
|
||||
fullPrefix,
|
||||
fullSuffix,
|
||||
clipboardText,
|
||||
lang,
|
||||
options,
|
||||
recentlyEditedRanges,
|
||||
recentlyEditedFiles,
|
||||
llm.model,
|
||||
extrasSnippets,
|
||||
);
|
||||
|
||||
// If prefix is manually passed
|
||||
if (manuallyPassPrefix) {
|
||||
prefix = manuallyPassPrefix;
|
||||
suffix = "";
|
||||
}
|
||||
|
||||
// Template prompt
|
||||
const {
|
||||
template,
|
||||
completionOptions,
|
||||
compilePrefixSuffix = undefined,
|
||||
} = options.template
|
||||
? { template: options.template, completionOptions: {} }
|
||||
: getTemplateForModel(llm.model);
|
||||
|
||||
let prompt: string;
|
||||
const filename = getBasename(filepath);
|
||||
const reponame = getBasename(workspaceDirs[0] ?? "myproject");
|
||||
|
||||
// Some models have prompts that need two passes. This lets us pass the compiled prefix/suffix
|
||||
// into either the 2nd template to generate a raw string, or to pass prefix, suffix to a FIM endpoint
|
||||
if (compilePrefixSuffix) {
|
||||
[prefix, suffix] = compilePrefixSuffix(
|
||||
prefix,
|
||||
suffix,
|
||||
filepath,
|
||||
reponame,
|
||||
snippets,
|
||||
);
|
||||
}
|
||||
|
||||
if (typeof template === "string") {
|
||||
const compiledTemplate = Handlebars.compile(template);
|
||||
|
||||
// Format snippets as comments and prepend to prefix
|
||||
const formattedSnippets = snippets
|
||||
.map((snippet) =>
|
||||
formatExternalSnippet(snippet.filepath, snippet.contents, lang),
|
||||
)
|
||||
.join("\n");
|
||||
if (formattedSnippets.length > 0) {
|
||||
prefix = `${formattedSnippets}\n\n${prefix}`;
|
||||
} else if (prefix.trim().length === 0 && suffix.trim().length === 0) {
|
||||
// If it's an empty file, include the file name as a comment
|
||||
prefix = `${lang.singleLineComment} ${getLastNPathParts(filepath, 2)}\n${prefix}`;
|
||||
}
|
||||
|
||||
prompt = compiledTemplate({
|
||||
prefix,
|
||||
suffix,
|
||||
filename,
|
||||
reponame,
|
||||
});
|
||||
} else {
|
||||
// Let the template function format snippets
|
||||
prompt = template(prefix, suffix, filepath, reponame, snippets);
|
||||
}
|
||||
|
||||
// Completion
|
||||
let completion = "";
|
||||
|
||||
const cache = await autocompleteCache;
|
||||
const cachedCompletion = options.useCache
|
||||
? await cache.get(prefix)
|
||||
: undefined;
|
||||
let cacheHit = false;
|
||||
if (cachedCompletion) {
|
||||
// Cache
|
||||
cacheHit = true;
|
||||
completion = cachedCompletion;
|
||||
} else {
|
||||
const stop = [
|
||||
...(completionOptions?.stop || []),
|
||||
...multilineStops,
|
||||
...commonStops,
|
||||
...(llm.model.toLowerCase().includes("starcoder2")
|
||||
? STARCODER2_T_ARTIFACTS
|
||||
: []),
|
||||
...(lang.stopWords ?? []),
|
||||
...lang.topLevelKeywords.map((word) => `\n${word}`),
|
||||
];
|
||||
|
||||
let langMultilineDecision = lang.useMultiline?.({ prefix, suffix });
|
||||
let multiline: boolean = false;
|
||||
if (langMultilineDecision) {
|
||||
multiline = langMultilineDecision;
|
||||
} else {
|
||||
multiline =
|
||||
!input.selectedCompletionInfo && // Only ever single-line if using intellisense selected value
|
||||
options.multilineCompletions !== "never" &&
|
||||
(options.multilineCompletions === "always" || completeMultiline);
|
||||
}
|
||||
|
||||
// Try to reuse pending requests if what the user typed matches start of completion
|
||||
const generator = generatorReuseManager.getGenerator(
|
||||
prefix,
|
||||
() =>
|
||||
llm.supportsFim()
|
||||
? llm.streamFim(prefix, suffix, {
|
||||
...completionOptions,
|
||||
stop,
|
||||
})
|
||||
: llm.streamComplete(prompt, {
|
||||
...completionOptions,
|
||||
raw: true,
|
||||
stop,
|
||||
}),
|
||||
multiline,
|
||||
);
|
||||
|
||||
// Full stop means to stop the LLM's generation, instead of just truncating the displayed completion
|
||||
const fullStop = () => generatorReuseManager.currentGenerator?.cancel();
|
||||
|
||||
// LLM
|
||||
let cancelled = false;
|
||||
const generatorWithCancellation = async function* () {
|
||||
for await (const update of generator) {
|
||||
if (token.aborted) {
|
||||
cancelled = true;
|
||||
return;
|
||||
}
|
||||
yield update;
|
||||
}
|
||||
};
|
||||
let charGenerator = generatorWithCancellation();
|
||||
charGenerator = noFirstCharNewline(charGenerator);
|
||||
charGenerator = onlyWhitespaceAfterEndOfLine(
|
||||
charGenerator,
|
||||
lang.endOfLine,
|
||||
fullStop,
|
||||
);
|
||||
charGenerator = bracketMatchingService.stopOnUnmatchedClosingBracket(
|
||||
charGenerator,
|
||||
prefix,
|
||||
suffix,
|
||||
filepath,
|
||||
multiline,
|
||||
);
|
||||
|
||||
let lineGenerator = streamLines(charGenerator);
|
||||
lineGenerator = stopAtLines(lineGenerator, fullStop);
|
||||
lineGenerator = stopAtRepeatingLines(lineGenerator, fullStop);
|
||||
lineGenerator = avoidPathLine(lineGenerator, lang.singleLineComment);
|
||||
lineGenerator = skipPrefixes(lineGenerator);
|
||||
lineGenerator = noTopLevelKeywordsMidline(
|
||||
lineGenerator,
|
||||
lang.topLevelKeywords,
|
||||
fullStop,
|
||||
);
|
||||
|
||||
for (const lineFilter of lang.lineFilters ?? []) {
|
||||
lineGenerator = lineFilter({ lines: lineGenerator, fullStop });
|
||||
}
|
||||
|
||||
lineGenerator = streamWithNewLines(lineGenerator);
|
||||
|
||||
const finalGenerator = stopAtSimilarLine(
|
||||
lineGenerator,
|
||||
lineBelowCursor,
|
||||
fullStop,
|
||||
);
|
||||
|
||||
try {
|
||||
for await (const update of finalGenerator) {
|
||||
completion += update;
|
||||
}
|
||||
} catch (e: any) {
|
||||
if (ERRORS_TO_IGNORE.some((err) => e.includes(err))) {
|
||||
return undefined;
|
||||
}
|
||||
throw e;
|
||||
}
|
||||
|
||||
if (cancelled) {
|
||||
return undefined;
|
||||
}
|
||||
|
||||
const processedCompletion = postprocessCompletion({
|
||||
completion,
|
||||
prefix,
|
||||
suffix,
|
||||
llm,
|
||||
});
|
||||
|
||||
if (!processedCompletion) {
|
||||
return undefined;
|
||||
}
|
||||
completion = processedCompletion;
|
||||
}
|
||||
|
||||
const time = Date.now() - startTime;
|
||||
return {
|
||||
time,
|
||||
completion,
|
||||
prefix,
|
||||
suffix,
|
||||
prompt,
|
||||
modelProvider: llm.providerName,
|
||||
modelName: llm.model,
|
||||
completionOptions,
|
||||
cacheHit,
|
||||
filepath: input.filepath,
|
||||
completionId: input.completionId,
|
||||
gitRepo: await ide.getRepoName(input.filepath),
|
||||
uniqueId: await ide.getUniqueId(),
|
||||
...options,
|
||||
};
|
||||
}
|
||||
|
||||
export class CompletionProvider {
|
||||
private static debounceTimeout: NodeJS.Timeout | undefined = undefined;
|
||||
private static debouncing = false;
|
||||
|
@ -489,12 +154,15 @@ export class CompletionProvider {
|
|||
this.generatorReuseManager = new GeneratorReuseManager(
|
||||
this.onError.bind(this),
|
||||
);
|
||||
this.importDefinitionsService = new ImportDefinitionsService(this.ide);
|
||||
}
|
||||
|
||||
private importDefinitionsService: ImportDefinitionsService;
|
||||
private generatorReuseManager: GeneratorReuseManager;
|
||||
private autocompleteCache = AutocompleteLruCache.get();
|
||||
public errorsShown: Set<string> = new Set();
|
||||
private bracketMatchingService = new BracketMatchingService();
|
||||
// private nearbyDefinitionsService = new NearbyDefinitionsService();
|
||||
|
||||
private onError(e: any) {
|
||||
console.warn("Error generating autocompletion: ", e);
|
||||
|
@ -654,16 +322,7 @@ export class CompletionProvider {
|
|||
options.maxPromptTokens = 500;
|
||||
}
|
||||
|
||||
const outcome = await getTabCompletion(
|
||||
token,
|
||||
options,
|
||||
llm,
|
||||
this.ide,
|
||||
this.generatorReuseManager,
|
||||
input,
|
||||
this.getDefinitionsFromLsp,
|
||||
this.bracketMatchingService,
|
||||
);
|
||||
const outcome = await this.getTabCompletion(token, options, llm, input);
|
||||
|
||||
if (!outcome?.completion) {
|
||||
return undefined;
|
||||
|
@ -735,4 +394,339 @@ export class CompletionProvider {
|
|||
displayedAt: now,
|
||||
};
|
||||
}
|
||||
|
||||
async getTabCompletion(
|
||||
token: AbortSignal,
|
||||
options: TabAutocompleteOptions,
|
||||
llm: ILLM,
|
||||
input: AutocompleteInput,
|
||||
): Promise<AutocompleteOutcome | undefined> {
|
||||
const startTime = Date.now();
|
||||
|
||||
const {
|
||||
filepath,
|
||||
pos,
|
||||
recentlyEditedFiles,
|
||||
recentlyEditedRanges,
|
||||
clipboardText,
|
||||
manuallyPassFileContents,
|
||||
manuallyPassPrefix,
|
||||
} = input;
|
||||
const fileContents =
|
||||
manuallyPassFileContents ?? (await this.ide.readFile(filepath));
|
||||
const fileLines = fileContents.split("\n");
|
||||
|
||||
// Filter
|
||||
const lang = languageForFilepath(filepath);
|
||||
const line = fileLines[pos.line] ?? "";
|
||||
for (const endOfLine of lang.endOfLine) {
|
||||
if (line.endsWith(endOfLine) && pos.character >= line.length) {
|
||||
return undefined;
|
||||
}
|
||||
}
|
||||
|
||||
// Model
|
||||
if (!llm) {
|
||||
return;
|
||||
}
|
||||
if (llm instanceof OpenAI) {
|
||||
llm.useLegacyCompletionsEndpoint = true;
|
||||
} else if (
|
||||
llm.providerName === "free-trial" &&
|
||||
llm.model !== TRIAL_FIM_MODEL
|
||||
) {
|
||||
llm.model = TRIAL_FIM_MODEL;
|
||||
}
|
||||
|
||||
if (
|
||||
!shownGptClaudeWarning &&
|
||||
nonAutocompleteModels.some((model) => llm.model.includes(model)) &&
|
||||
!llm.model.includes("deepseek") &&
|
||||
!llm.model.includes("codestral")
|
||||
) {
|
||||
shownGptClaudeWarning = true;
|
||||
throw new Error(
|
||||
`Warning: ${llm.model} is not trained for tab-autocomplete, and will result in low-quality suggestions. See the docs to learn more about why: https://docs.continue.dev/walkthroughs/tab-autocomplete#i-want-better-completions-should-i-use-gpt-4`,
|
||||
);
|
||||
}
|
||||
|
||||
// Prompt
|
||||
let fullPrefix =
|
||||
getRangeInString(fileContents, {
|
||||
start: { line: 0, character: 0 },
|
||||
end: input.selectedCompletionInfo?.range.start ?? pos,
|
||||
}) + (input.selectedCompletionInfo?.text ?? "");
|
||||
|
||||
if (input.injectDetails) {
|
||||
const lines = fullPrefix.split("\n");
|
||||
fullPrefix = `${lines.slice(0, -1).join("\n")}\n${
|
||||
lang.singleLineComment
|
||||
} ${input.injectDetails.split("\n").join(`\n${lang.singleLineComment} `)}\n${
|
||||
lines[lines.length - 1]
|
||||
}`;
|
||||
}
|
||||
|
||||
const fullSuffix = getRangeInString(fileContents, {
|
||||
start: pos,
|
||||
end: { line: fileLines.length - 1, character: Number.MAX_SAFE_INTEGER },
|
||||
});
|
||||
|
||||
// First non-whitespace line below the cursor
|
||||
let lineBelowCursor = "";
|
||||
let i = 1;
|
||||
while (
|
||||
lineBelowCursor.trim() === "" &&
|
||||
pos.line + i <= fileLines.length - 1
|
||||
) {
|
||||
lineBelowCursor = fileLines[Math.min(pos.line + i, fileLines.length - 1)];
|
||||
i++;
|
||||
}
|
||||
|
||||
let extrasSnippets = options.useOtherFiles
|
||||
? ((await Promise.race([
|
||||
this.getDefinitionsFromLsp(
|
||||
filepath,
|
||||
fullPrefix + fullSuffix,
|
||||
fullPrefix.length,
|
||||
this.ide,
|
||||
lang,
|
||||
),
|
||||
new Promise((resolve) => {
|
||||
setTimeout(() => resolve([]), 100);
|
||||
}),
|
||||
])) as AutocompleteSnippet[])
|
||||
: [];
|
||||
|
||||
const workspaceDirs = await this.ide.getWorkspaceDirs();
|
||||
if (options.onlyMyCode) {
|
||||
extrasSnippets = extrasSnippets.filter((snippet) => {
|
||||
return workspaceDirs.some((dir) => snippet.filepath.startsWith(dir));
|
||||
});
|
||||
}
|
||||
|
||||
let { prefix, suffix, completeMultiline, snippets } =
|
||||
await constructAutocompletePrompt(
|
||||
filepath,
|
||||
pos.line,
|
||||
fullPrefix,
|
||||
fullSuffix,
|
||||
clipboardText,
|
||||
lang,
|
||||
options,
|
||||
recentlyEditedRanges,
|
||||
recentlyEditedFiles,
|
||||
llm.model,
|
||||
extrasSnippets,
|
||||
this.importDefinitionsService,
|
||||
);
|
||||
|
||||
// If prefix is manually passed
|
||||
if (manuallyPassPrefix) {
|
||||
prefix = manuallyPassPrefix;
|
||||
suffix = "";
|
||||
}
|
||||
|
||||
// Template prompt
|
||||
const {
|
||||
template,
|
||||
completionOptions,
|
||||
compilePrefixSuffix = undefined,
|
||||
} = options.template
|
||||
? { template: options.template, completionOptions: {} }
|
||||
: getTemplateForModel(llm.model);
|
||||
|
||||
let prompt: string;
|
||||
const filename = getBasename(filepath);
|
||||
const reponame = getBasename(workspaceDirs[0] ?? "myproject");
|
||||
|
||||
// Some models have prompts that need two passes. This lets us pass the compiled prefix/suffix
|
||||
// into either the 2nd template to generate a raw string, or to pass prefix, suffix to a FIM endpoint
|
||||
if (compilePrefixSuffix) {
|
||||
[prefix, suffix] = compilePrefixSuffix(
|
||||
prefix,
|
||||
suffix,
|
||||
filepath,
|
||||
reponame,
|
||||
snippets,
|
||||
);
|
||||
}
|
||||
|
||||
if (typeof template === "string") {
|
||||
const compiledTemplate = Handlebars.compile(template);
|
||||
|
||||
// Format snippets as comments and prepend to prefix
|
||||
const formattedSnippets = snippets
|
||||
.map((snippet) =>
|
||||
formatExternalSnippet(snippet.filepath, snippet.contents, lang),
|
||||
)
|
||||
.join("\n");
|
||||
if (formattedSnippets.length > 0) {
|
||||
prefix = `${formattedSnippets}\n\n${prefix}`;
|
||||
} else if (prefix.trim().length === 0 && suffix.trim().length === 0) {
|
||||
// If it's an empty file, include the file name as a comment
|
||||
prefix = `${lang.singleLineComment} ${getLastNPathParts(filepath, 2)}\n${prefix}`;
|
||||
}
|
||||
|
||||
prompt = compiledTemplate({
|
||||
prefix,
|
||||
suffix,
|
||||
filename,
|
||||
reponame,
|
||||
});
|
||||
} else {
|
||||
// Let the template function format snippets
|
||||
prompt = template(prefix, suffix, filepath, reponame, snippets);
|
||||
}
|
||||
|
||||
// Completion
|
||||
let completion = "";
|
||||
|
||||
const cache = await autocompleteCache;
|
||||
const cachedCompletion = options.useCache
|
||||
? await cache.get(prefix)
|
||||
: undefined;
|
||||
let cacheHit = false;
|
||||
if (cachedCompletion) {
|
||||
// Cache
|
||||
cacheHit = true;
|
||||
completion = cachedCompletion;
|
||||
} else {
|
||||
const stop = [
|
||||
...(completionOptions?.stop || []),
|
||||
...multilineStops,
|
||||
...commonStops,
|
||||
...(llm.model.toLowerCase().includes("starcoder2")
|
||||
? STARCODER2_T_ARTIFACTS
|
||||
: []),
|
||||
...(lang.stopWords ?? []),
|
||||
...lang.topLevelKeywords.map((word) => `\n${word}`),
|
||||
];
|
||||
|
||||
let langMultilineDecision = lang.useMultiline?.({ prefix, suffix });
|
||||
let multiline: boolean = false;
|
||||
if (langMultilineDecision) {
|
||||
multiline = langMultilineDecision;
|
||||
} else {
|
||||
multiline =
|
||||
!input.selectedCompletionInfo && // Only ever single-line if using intellisense selected value
|
||||
options.multilineCompletions !== "never" &&
|
||||
(options.multilineCompletions === "always" || completeMultiline);
|
||||
}
|
||||
|
||||
// Try to reuse pending requests if what the user typed matches start of completion
|
||||
const generator = this.generatorReuseManager.getGenerator(
|
||||
prefix,
|
||||
() =>
|
||||
llm.supportsFim()
|
||||
? llm.streamFim(prefix, suffix, {
|
||||
...completionOptions,
|
||||
stop,
|
||||
})
|
||||
: llm.streamComplete(prompt, {
|
||||
...completionOptions,
|
||||
raw: true,
|
||||
stop,
|
||||
}),
|
||||
multiline,
|
||||
);
|
||||
|
||||
// Full stop means to stop the LLM's generation, instead of just truncating the displayed completion
|
||||
const fullStop = () =>
|
||||
this.generatorReuseManager.currentGenerator?.cancel();
|
||||
|
||||
// LLM
|
||||
let cancelled = false;
|
||||
const generatorWithCancellation = async function* () {
|
||||
for await (const update of generator) {
|
||||
if (token.aborted) {
|
||||
cancelled = true;
|
||||
return;
|
||||
}
|
||||
yield update;
|
||||
}
|
||||
};
|
||||
let charGenerator = generatorWithCancellation();
|
||||
charGenerator = noFirstCharNewline(charGenerator);
|
||||
charGenerator = onlyWhitespaceAfterEndOfLine(
|
||||
charGenerator,
|
||||
lang.endOfLine,
|
||||
fullStop,
|
||||
);
|
||||
charGenerator = this.bracketMatchingService.stopOnUnmatchedClosingBracket(
|
||||
charGenerator,
|
||||
prefix,
|
||||
suffix,
|
||||
filepath,
|
||||
multiline,
|
||||
);
|
||||
|
||||
let lineGenerator = streamLines(charGenerator);
|
||||
lineGenerator = stopAtLines(lineGenerator, fullStop);
|
||||
lineGenerator = stopAtRepeatingLines(lineGenerator, fullStop);
|
||||
lineGenerator = avoidPathLine(lineGenerator, lang.singleLineComment);
|
||||
lineGenerator = skipPrefixes(lineGenerator);
|
||||
lineGenerator = noTopLevelKeywordsMidline(
|
||||
lineGenerator,
|
||||
lang.topLevelKeywords,
|
||||
fullStop,
|
||||
);
|
||||
|
||||
for (const lineFilter of lang.lineFilters ?? []) {
|
||||
lineGenerator = lineFilter({ lines: lineGenerator, fullStop });
|
||||
}
|
||||
|
||||
lineGenerator = streamWithNewLines(lineGenerator);
|
||||
|
||||
const finalGenerator = stopAtSimilarLine(
|
||||
lineGenerator,
|
||||
lineBelowCursor,
|
||||
fullStop,
|
||||
);
|
||||
|
||||
try {
|
||||
for await (const update of finalGenerator) {
|
||||
completion += update;
|
||||
}
|
||||
} catch (e: any) {
|
||||
if (ERRORS_TO_IGNORE.some((err) => e.includes(err))) {
|
||||
return undefined;
|
||||
}
|
||||
throw e;
|
||||
}
|
||||
|
||||
if (cancelled) {
|
||||
return undefined;
|
||||
}
|
||||
|
||||
const processedCompletion = postprocessCompletion({
|
||||
completion,
|
||||
prefix,
|
||||
suffix,
|
||||
llm,
|
||||
});
|
||||
|
||||
if (!processedCompletion) {
|
||||
return undefined;
|
||||
}
|
||||
completion = processedCompletion;
|
||||
}
|
||||
|
||||
const time = Date.now() - startTime;
|
||||
return {
|
||||
time,
|
||||
completion,
|
||||
prefix,
|
||||
suffix,
|
||||
prompt,
|
||||
modelProvider: llm.providerName,
|
||||
modelName: llm.model,
|
||||
completionOptions,
|
||||
cacheHit,
|
||||
filepath: input.filepath,
|
||||
completionId: input.completionId,
|
||||
gitRepo: await this.ide.getRepoName(input.filepath),
|
||||
uniqueId: await this.ide.getUniqueId(),
|
||||
...options,
|
||||
};
|
||||
}
|
||||
}
|
||||
|
|
|
@ -7,6 +7,7 @@ import {
|
|||
pruneLinesFromBottom,
|
||||
pruneLinesFromTop,
|
||||
} from "../llm/countTokens.js";
|
||||
import { ImportDefinitionsService } from "./ImportDefinitionsService.js";
|
||||
import { getAst, getTreePathAtCursor } from "./ast.js";
|
||||
import {
|
||||
AutocompleteLanguageInfo,
|
||||
|
@ -15,6 +16,7 @@ import {
|
|||
} from "./languages.js";
|
||||
import {
|
||||
fillPromptWithSnippets,
|
||||
getSymbolsForSnippet,
|
||||
rankSnippets,
|
||||
removeRangeFromSnippets,
|
||||
type AutocompleteSnippet,
|
||||
|
@ -124,6 +126,7 @@ export async function constructAutocompletePrompt(
|
|||
recentlyEditedFiles: RangeInFileWithContents[],
|
||||
modelName: string,
|
||||
extraSnippets: AutocompleteSnippet[],
|
||||
importDefinitionsService: ImportDefinitionsService,
|
||||
): Promise<{
|
||||
prefix: string;
|
||||
suffix: string;
|
||||
|
@ -189,6 +192,29 @@ export async function constructAutocompletePrompt(
|
|||
}
|
||||
}
|
||||
|
||||
// Use imports
|
||||
if (options.useImports) {
|
||||
const importSnippets = [];
|
||||
const fileInfo = importDefinitionsService.get(filepath);
|
||||
if (fileInfo) {
|
||||
const { imports } = fileInfo;
|
||||
// Look for imports of any symbols around the current range
|
||||
const textAroundCursor =
|
||||
fullPrefix.split("\n").slice(-5).join("\n") +
|
||||
fullSuffix.split("\n").slice(0, 3).join("\n");
|
||||
const symbols = Array.from(
|
||||
getSymbolsForSnippet(textAroundCursor),
|
||||
).filter((symbol) => !language.topLevelKeywords.includes(symbol));
|
||||
for (const symbol of symbols) {
|
||||
const rifs = imports[symbol];
|
||||
if (Array.isArray(rifs)) {
|
||||
importSnippets.push(...rifs);
|
||||
}
|
||||
}
|
||||
}
|
||||
snippets.push(...importSnippets);
|
||||
}
|
||||
|
||||
// Filter out empty snippets and ones that are already in the prefix/suffix
|
||||
snippets = snippets
|
||||
.map((snippet) => ({ ...snippet }))
|
||||
|
|
|
@ -97,5 +97,10 @@ export function postprocessCompletion({
|
|||
completion = completion.slice(1);
|
||||
}
|
||||
|
||||
// Qwen often adds an extra space to the start
|
||||
if (llm.model.toLowerCase().includes("qwen") && completion.startsWith(" ")) {
|
||||
completion = completion.slice(1);
|
||||
}
|
||||
|
||||
return completion;
|
||||
}
|
||||
|
|
|
@ -34,6 +34,8 @@ const stableCodeFimTemplate: AutocompleteTemplate = {
|
|||
"<fim_middle>",
|
||||
"<|endoftext|>",
|
||||
"<file_sep>",
|
||||
"</fim_middle>",
|
||||
"</code>",
|
||||
],
|
||||
},
|
||||
};
|
||||
|
|
|
@@ -252,7 +252,7 @@ const EditSlashCommand: SlashCommand = {
    );
    // if the above replace fails to find a match, the code will still be present
    // in the userInput. Replace it with input if available.
    if (userInput.includes("\`\`\`") && (input !== "" || !input)) {
    if (userInput.includes("```") && (input !== "" || !input)) {
      userInput = input;
    }
@ -40,14 +40,16 @@ export class ConfigHandler {
|
|||
this.updateListeners.push(listener);
|
||||
}
|
||||
|
||||
reloadConfig() {
|
||||
async reloadConfig() {
|
||||
this.savedConfig = undefined;
|
||||
this.savedBrowserConfig = undefined;
|
||||
this.loadConfig().then(() => {
|
||||
for (const listener of this.updateListeners) {
|
||||
listener();
|
||||
}
|
||||
});
|
||||
this._pendingConfigPromise = undefined;
|
||||
|
||||
await this.loadConfig();
|
||||
|
||||
for (const listener of this.updateListeners) {
|
||||
listener();
|
||||
}
|
||||
}
|
||||
|
||||
async getSerializedConfig(): Promise<BrowserSerializedContinueConfig> {
|
||||
|
@ -58,45 +60,56 @@ export class ConfigHandler {
|
|||
return this.savedBrowserConfig;
|
||||
}
|
||||
|
||||
private _pendingConfigPromise?: Promise<ContinueConfig>;
|
||||
async loadConfig(): Promise<ContinueConfig> {
|
||||
if (this.savedConfig) {
|
||||
return this.savedConfig;
|
||||
} else if (this._pendingConfigPromise) {
|
||||
return this._pendingConfigPromise;
|
||||
}
|
||||
|
||||
let workspaceConfigs: ContinueRcJson[] = [];
|
||||
try {
|
||||
workspaceConfigs = await this.ide.getWorkspaceConfigs();
|
||||
} catch (e) {
|
||||
console.warn("Failed to load workspace configs");
|
||||
}
|
||||
this._pendingConfigPromise = new Promise(async (resolve, reject) => {
|
||||
let workspaceConfigs: ContinueRcJson[] = [];
|
||||
try {
|
||||
workspaceConfigs = await this.ide.getWorkspaceConfigs();
|
||||
} catch (e) {
|
||||
console.warn("Failed to load workspace configs");
|
||||
}
|
||||
|
||||
const ideInfo = await this.ide.getIdeInfo();
|
||||
const uniqueId = await this.ide.getUniqueId();
|
||||
const ideSettings = await this.ideSettingsPromise;
|
||||
const ideInfo = await this.ide.getIdeInfo();
|
||||
const uniqueId = await this.ide.getUniqueId();
|
||||
const ideSettings = await this.ideSettingsPromise;
|
||||
|
||||
const newConfig = await loadFullConfigNode(
|
||||
this.ide,
|
||||
workspaceConfigs,
|
||||
ideSettings,
|
||||
ideInfo.ideType,
|
||||
uniqueId,
|
||||
this.writeLog,
|
||||
);
|
||||
newConfig.allowAnonymousTelemetry =
|
||||
newConfig.allowAnonymousTelemetry &&
|
||||
(await this.ide.isTelemetryEnabled());
|
||||
const newConfig = await loadFullConfigNode(
|
||||
this.ide,
|
||||
workspaceConfigs,
|
||||
ideSettings,
|
||||
ideInfo.ideType,
|
||||
uniqueId,
|
||||
this.writeLog,
|
||||
);
|
||||
newConfig.allowAnonymousTelemetry =
|
||||
newConfig.allowAnonymousTelemetry &&
|
||||
(await this.ide.isTelemetryEnabled());
|
||||
|
||||
// Setup telemetry only after (and if) we know it is enabled
|
||||
await Telemetry.setup(
|
||||
newConfig.allowAnonymousTelemetry ?? true,
|
||||
await this.ide.getUniqueId(),
|
||||
ideInfo.extensionVersion,
|
||||
);
|
||||
// Setup telemetry only after (and if) we know it is enabled
|
||||
await Telemetry.setup(
|
||||
newConfig.allowAnonymousTelemetry ?? true,
|
||||
await this.ide.getUniqueId(),
|
||||
ideInfo.extensionVersion,
|
||||
);
|
||||
|
||||
(newConfig.contextProviders ?? []).push(...this.additionalContextProviders);
|
||||
(newConfig.contextProviders ?? []).push(
|
||||
...this.additionalContextProviders,
|
||||
);
|
||||
|
||||
this.savedConfig = newConfig;
|
||||
return newConfig;
|
||||
this.savedConfig = newConfig;
|
||||
resolve(newConfig);
|
||||
});
|
||||
|
||||
this.savedConfig = await this._pendingConfigPromise;
|
||||
this._pendingConfigPromise = undefined;
|
||||
return this.savedConfig;
|
||||
}
|
||||
|
||||
async llmFromTitle(title?: string): Promise<ILLM> {
|
||||
|
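The `ConfigHandler` change above memoizes the in-flight `loadConfig()` promise so that concurrent callers await a single load instead of each triggering their own. A stripped-down sketch of that pattern, with an illustrative `fetchConfig` loader rather than the real Continue API:

```ts
class OncePerReloadLoader<T> {
  private saved?: T;
  private pending?: Promise<T>;

  constructor(private readonly fetchConfig: () => Promise<T>) {}

  async load(): Promise<T> {
    if (this.saved !== undefined) {
      return this.saved; // already loaded since the last reload
    }
    if (this.pending !== undefined) {
      return this.pending; // a load is already in flight; share it
    }
    this.pending = this.fetchConfig();
    try {
      this.saved = await this.pending;
      return this.saved;
    } finally {
      this.pending = undefined;
    }
  }

  async reload(): Promise<T> {
    // Drop both the cached value and any in-flight promise, then load fresh,
    // mirroring reloadConfig() above.
    this.saved = undefined;
    this.pending = undefined;
    return this.load();
  }
}
```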
|
|
@ -2,6 +2,7 @@ import { SerializedContinueConfig } from "../index.js";
|
|||
import { FREE_TRIAL_MODELS } from "./default.js";
|
||||
|
||||
export const TRIAL_FIM_MODEL = "codestral-latest";
|
||||
export const ONBOARDING_LOCAL_MODEL_TITLE = "Ollama";
|
||||
|
||||
export function setupApiKeysMode(
|
||||
config: SerializedContinueConfig,
|
||||
|
@ -23,25 +24,6 @@ export function setupApiKeysMode(
|
|||
};
|
||||
}
|
||||
|
||||
export function setupOptimizedExistingUserMode(
|
||||
config: SerializedContinueConfig,
|
||||
): SerializedContinueConfig {
|
||||
return {
|
||||
...config,
|
||||
tabAutocompleteModel: {
|
||||
title: "Tab Autocomplete",
|
||||
provider: "free-trial",
|
||||
model: TRIAL_FIM_MODEL,
|
||||
},
|
||||
embeddingsProvider: {
|
||||
provider: "free-trial",
|
||||
},
|
||||
reranker: {
|
||||
name: "free-trial",
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
export function setupLocalMode(
|
||||
config: SerializedContinueConfig,
|
||||
): SerializedContinueConfig {
|
||||
|
@ -54,7 +36,7 @@ export function setupLocalMode(
|
|||
model: "llama3",
|
||||
},
|
||||
{
|
||||
title: "Ollama",
|
||||
title: ONBOARDING_LOCAL_MODEL_TITLE,
|
||||
provider: "ollama",
|
||||
model: "AUTODETECT",
|
||||
},
|
||||
|
@ -66,7 +48,8 @@ export function setupLocalMode(
|
|||
model: "starcoder2:3b",
|
||||
},
|
||||
embeddingsProvider: {
|
||||
provider: "transformers.js",
|
||||
provider: "ollama",
|
||||
model: "nomic-embed-text",
|
||||
},
|
||||
reranker: undefined,
|
||||
};
|
||||
|
@ -107,7 +90,7 @@ export function setupLocalAfterFreeTrial(
|
|||
model: "llama3",
|
||||
},
|
||||
{
|
||||
title: "Ollama",
|
||||
title: ONBOARDING_LOCAL_MODEL_TITLE,
|
||||
provider: "ollama",
|
||||
model: "AUTODETECT",
|
||||
},
|
||||
|
|
|
@ -1,3 +1,4 @@
|
|||
import Handlebars from "handlebars";
|
||||
import path from "path";
|
||||
import * as YAML from "yaml";
|
||||
import type { IDE, SlashCommand } from "..";
|
||||
|
|
|
@ -6,8 +6,6 @@ import {
|
|||
import { getBasename } from "../../util/index.js";
|
||||
import { BaseContextProvider } from "../index.js";
|
||||
|
||||
// import { getOutlines } from "llm-code-highlighter/dist/index.continue";
|
||||
|
||||
class CodeOutlineContextProvider extends BaseContextProvider {
|
||||
static description: ContextProviderDescription = {
|
||||
title: "outline",
|
||||
|
|
|
@ -1,7 +1,6 @@
|
|||
import { ContextProviderName } from "../../index.js";
|
||||
import { BaseContextProvider } from "../index.js";
|
||||
import CodeContextProvider from "./CodeContextProvider.js";
|
||||
// import CodeHighlightsContextProvider from "./CodeHighlightsContextProvider.js";
|
||||
import CodebaseContextProvider from "./CodebaseContextProvider.js";
|
||||
import CurrentFileContextProvider from "./CurrentFileContextProvider.js";
|
||||
import DatabaseContextProvider from "./DatabaseContextProvider.js";
|
||||
|
@ -23,6 +22,13 @@ import SearchContextProvider from "./SearchContextProvider.js";
|
|||
import TerminalContextProvider from "./TerminalContextProvider.js";
|
||||
import URLContextProvider from "./URLContextProvider.js";
|
||||
|
||||
/**
|
||||
* Note: We are currently omitting the following providers due to bugs:
|
||||
* - `CodeOutlineContextProvider`
|
||||
* - `CodeHighlightsContextProvider`
|
||||
*
|
||||
* See this issue for details: https://github.com/continuedev/continue/issues/1365
|
||||
*/
|
||||
const Providers: (typeof BaseContextProvider)[] = [
|
||||
DiffContextProvider,
|
||||
FileTreeContextProvider,
|
||||
|
@ -39,8 +45,6 @@ const Providers: (typeof BaseContextProvider)[] = [
|
|||
FolderContextProvider,
|
||||
DocsContextProvider,
|
||||
GitLabMergeRequestContextProvider,
|
||||
// CodeHighlightsContextProvider,
|
||||
// CodeOutlineContextProvider,
|
||||
JiraIssuesContextProvider,
|
||||
PostgresContextProvider,
|
||||
DatabaseContextProvider,
|
||||
|
|
core/core.ts
@ -12,9 +12,8 @@ import {
|
|||
setupFreeTrialMode,
|
||||
setupLocalAfterFreeTrial,
|
||||
setupLocalMode,
|
||||
setupOptimizedExistingUserMode,
|
||||
} from "./config/onboarding.js";
|
||||
import { createNewPromptFile } from "./config/promptFile";
|
||||
import { createNewPromptFile } from "./config/promptFile.js";
|
||||
import { addModel, addOpenAIKey, deleteModel } from "./config/util.js";
|
||||
import { ContinueServerClient } from "./continueServer/stubs/client.js";
|
||||
import { indexDocs } from "./indexing/docs/index.js";
|
||||
|
@ -221,7 +220,7 @@ export class Core {
|
|||
new TransformersJsEmbeddingsProvider(),
|
||||
)) {
|
||||
}
|
||||
this.ide.infoPopup(`🎉 Successfully indexed ${msg.data.title}`);
|
||||
this.ide.infoPopup(`Successfully indexed ${msg.data.title}`);
|
||||
this.messenger.send("refreshSubmenuItems", undefined);
|
||||
});
|
||||
on("context/loadSubmenuItems", async (msg) => {
|
||||
|
@ -364,20 +363,20 @@ export class Core {
|
|||
const model =
|
||||
config.models.find((model) => model.title === msg.data.title) ??
|
||||
config.models.find((model) => model.title?.startsWith(msg.data.title));
|
||||
if (model) {
|
||||
return model.listModels();
|
||||
} else {
|
||||
if (msg.data.title === "Ollama") {
|
||||
try {
|
||||
try {
|
||||
if (model) {
|
||||
return model.listModels();
|
||||
} else {
|
||||
if (msg.data.title === "Ollama") {
|
||||
const models = await new Ollama({ model: "" }).listModels();
|
||||
return models;
|
||||
} catch (e) {
|
||||
console.warn(`Error listing Ollama models: ${e}`);
|
||||
} else {
|
||||
return undefined;
|
||||
}
|
||||
} else {
|
||||
return undefined;
|
||||
}
|
||||
} catch (e) {
|
||||
console.warn(`Error listing Ollama models: ${e}`);
|
||||
return undefined;
|
||||
}
|
||||
});
|
||||
|
||||
|
@ -411,6 +410,13 @@ export class Core {
|
|||
name: slashCommandName,
|
||||
});
|
||||
|
||||
const checkActiveInterval = setInterval(() => {
|
||||
if (abortedMessageIds.has(msg.messageId)) {
|
||||
abortedMessageIds.delete(msg.messageId);
|
||||
clearInterval(checkActiveInterval);
|
||||
}
|
||||
}, 100);
|
||||
|
||||
for await (const content of slashCommand.run({
|
||||
input,
|
||||
history,
|
||||
|
@ -429,10 +435,15 @@ export class Core {
|
|||
fetch: (url, init) =>
|
||||
fetchwithRequestOptions(url, init, config.requestOptions),
|
||||
})) {
|
||||
if (abortedMessageIds.has(msg.messageId)) {
|
||||
abortedMessageIds.delete(msg.messageId);
|
||||
break;
|
||||
}
|
||||
if (content) {
|
||||
yield { content };
|
||||
}
|
||||
}
|
||||
clearInterval(checkActiveInterval);
|
||||
yield { done: true, content: "" };
|
||||
}
|
||||
on("command/run", (msg) =>
|
||||
|
@ -490,23 +501,41 @@ export class Core {
|
|||
|
||||
on("completeOnboarding", (msg) => {
|
||||
const mode = msg.data.mode;
|
||||
|
||||
Telemetry.capture("onboardingSelection", {
|
||||
mode,
|
||||
});
|
||||
if (mode === "custom" || mode === "localExistingUser") {
|
||||
|
||||
if (mode === "custom") {
|
||||
return;
|
||||
}
|
||||
editConfigJson(
|
||||
mode === "local"
|
||||
? setupLocalMode
|
||||
: mode === "freeTrial"
|
||||
? setupFreeTrialMode
|
||||
: mode === "localAfterFreeTrial"
|
||||
? setupLocalAfterFreeTrial
|
||||
: mode === "apiKeys"
|
||||
? setupApiKeysMode
|
||||
: setupOptimizedExistingUserMode,
|
||||
);
|
||||
|
||||
let editConfigJsonCallback: Parameters<typeof editConfigJson>[0];
|
||||
|
||||
switch (mode) {
|
||||
case "local":
|
||||
editConfigJsonCallback = setupLocalMode;
|
||||
break;
|
||||
|
||||
case "freeTrial":
|
||||
editConfigJsonCallback = setupFreeTrialMode;
|
||||
break;
|
||||
|
||||
case "localAfterFreeTrial":
|
||||
editConfigJsonCallback = setupLocalAfterFreeTrial;
|
||||
break;
|
||||
|
||||
case "apiKeys":
|
||||
editConfigJsonCallback = setupApiKeysMode;
|
||||
break;
|
||||
|
||||
default:
|
||||
console.error(`Invalid mode: ${mode}`);
|
||||
editConfigJsonCallback = (config) => config;
|
||||
}
|
||||
|
||||
editConfigJson(editConfigJsonCallback);
|
||||
|
||||
this.configHandler.reloadConfig();
|
||||
});
|
||||
|
||||
|
@ -539,7 +568,7 @@ export class Core {
|
|||
on("index/indexingProgressBarInitialized", async (msg) => {
|
||||
// Triggered when progress bar is initialized.
|
||||
// If a non-default state has been stored, update the indexing display to that state
|
||||
if (this.indexingState.status != "loading") {
|
||||
if (this.indexingState.status !== "loading") {
|
||||
this.messenger.request("indexProgress", this.indexingState);
|
||||
}
|
||||
});
|
||||
|
|
|
@ -208,6 +208,11 @@ export interface RangeInFile {
|
|||
range: Range;
|
||||
}
|
||||
|
||||
export interface Location {
|
||||
filepath: string;
|
||||
position: Position;
|
||||
}
|
||||
|
||||
export interface FileWithContents {
|
||||
filepath: string;
|
||||
contents: string;
|
||||
|
@ -470,6 +475,12 @@ export interface IDE {
|
|||
listDir(dir: string): Promise<[string, FileType][]>;
|
||||
getLastModified(files: string[]): Promise<{ [path: string]: number }>;
|
||||
getGitHubAuthToken(): Promise<string | undefined>;
|
||||
|
||||
// LSP
|
||||
gotoDefinition(location: Location): Promise<RangeInFile[]>;
|
||||
|
||||
// Callbacks
|
||||
onDidChangeActiveTextEditor(callback: (filepath: string) => void): void;
|
||||
}
|
||||
|
||||
// Slash Commands
|
||||
|
@ -768,11 +779,13 @@ export interface TabAutocompleteOptions {
|
|||
useRecentlyEdited: boolean;
|
||||
recentLinePrefixMatchMinLength: number;
|
||||
disableInFiles?: string[];
|
||||
useImports?: boolean;
|
||||
}
|
||||
|
||||
export interface ContinueUIConfig {
|
||||
codeBlockToolbarPosition?: "top" | "bottom";
|
||||
fontSize?: number;
|
||||
displayRawMarkdown?: boolean;
|
||||
}
|
||||
|
||||
interface ContextMenuConfig {
|
||||
|
@ -785,6 +798,7 @@ interface ContextMenuConfig {
|
|||
|
||||
interface ModelRoles {
|
||||
inlineEdit?: string;
|
||||
applyCodeBlock?: string;
|
||||
}
|
||||
|
||||
interface ExperimentalConfig {
|
||||
|
|
|
@ -1,5 +1,3 @@
|
|||
import fs from "node:fs";
|
||||
import path from "node:path";
|
||||
import type {
|
||||
ChunkWithoutID,
|
||||
ContextItem,
|
||||
|
@ -10,9 +8,9 @@ import type {
|
|||
} from "../index.js";
|
||||
import { getBasename, getLastNPathParts } from "../util/index.js";
|
||||
import {
|
||||
getLanguageForFile,
|
||||
TSQueryType,
|
||||
getParserForFile,
|
||||
supportedLanguages,
|
||||
getQueryForFile,
|
||||
} from "../util/treeSitter.js";
|
||||
import { DatabaseConnection, SqliteDb, tagToString } from "./refreshIndex.js";
|
||||
import {
|
||||
|
@ -23,6 +21,7 @@ import {
|
|||
} from "./types.js";
|
||||
|
||||
export class CodeSnippetsCodebaseIndex implements CodebaseIndex {
|
||||
relativeExpectedTime: number = 1;
|
||||
artifactId = "codeSnippets";
|
||||
|
||||
constructor(private readonly ide: IDE) {}
|
||||
|
@ -46,35 +45,16 @@ export class CodeSnippetsCodebaseIndex implements CodebaseIndex {
|
|||
)`);
|
||||
}
|
||||
|
||||
private getQuerySource(filepath: string) {
|
||||
const fullLangName = supportedLanguages[filepath.split(".").pop() ?? ""];
|
||||
const sourcePath = path.join(
|
||||
__dirname,
|
||||
"..",
|
||||
"tree-sitter",
|
||||
"code-snippet-queries",
|
||||
`tree-sitter-${fullLangName}-tags.scm`,
|
||||
);
|
||||
if (!fs.existsSync(sourcePath)) {
|
||||
return "";
|
||||
}
|
||||
return fs.readFileSync(sourcePath).toString();
|
||||
}
|
||||
|
||||
async getSnippetsInFile(
|
||||
filepath: string,
|
||||
contents: string,
|
||||
): Promise<(ChunkWithoutID & { title: string })[]> {
|
||||
const lang = await getLanguageForFile(filepath);
|
||||
if (!lang) {
|
||||
return [];
|
||||
}
|
||||
const parser = await getParserForFile(filepath);
|
||||
if (!parser) {
|
||||
return [];
|
||||
}
|
||||
const ast = parser.parse(contents);
|
||||
const query = lang?.query(this.getQuerySource(filepath));
|
||||
const query = await getQueryForFile(filepath, TSQueryType.CodeSnippets);
|
||||
const matches = query?.matches(ast.rootNode);
|
||||
|
||||
return (
|
||||
|
|
|
@ -15,6 +15,7 @@ import {
|
|||
} from "./types.js";
|
||||
|
||||
export class FullTextSearchCodebaseIndex implements CodebaseIndex {
|
||||
relativeExpectedTime: number = 0.2;
|
||||
artifactId = "sqliteFts";
|
||||
|
||||
private async _createTables(db: DatabaseConnection) {
|
||||
|
|
|
@ -31,6 +31,7 @@ interface LanceDbRow {
|
|||
}
|
||||
|
||||
export class LanceDbIndex implements CodebaseIndex {
|
||||
relativeExpectedTime: number = 13;
|
||||
get artifactId(): string {
|
||||
return `vectordb::${this.embeddingsProvider.id}`;
|
||||
}
|
||||
|
@ -243,6 +244,9 @@ export class LanceDbIndex implements CodebaseIndex {
|
|||
}
|
||||
}
|
||||
|
||||
const progressReservedForTagging = 0.1;
|
||||
let accumulatedProgress = 0;
|
||||
|
||||
let computedRows: LanceDbRow[] = [];
|
||||
for await (const update of this.computeChunks(results.compute)) {
|
||||
if (Array.isArray(update)) {
|
||||
|
@ -261,7 +265,12 @@ export class LanceDbIndex implements CodebaseIndex {
|
|||
data.contents,
|
||||
);
|
||||
|
||||
yield { progress, desc, status: "indexing" };
|
||||
accumulatedProgress = progress * (1 - progressReservedForTagging);
|
||||
yield {
|
||||
progress: accumulatedProgress,
|
||||
desc,
|
||||
status: "indexing",
|
||||
};
|
||||
} else {
|
||||
await addComputedLanceDbRows(update, computedRows);
|
||||
computedRows = [];
|
||||
|
@ -300,13 +309,27 @@ export class LanceDbIndex implements CodebaseIndex {
|
|||
}
|
||||
|
||||
markComplete([{ path, cacheKey }], IndexResultType.AddTag);
|
||||
accumulatedProgress += 1 / results.addTag.length / 3;
|
||||
yield {
|
||||
progress: accumulatedProgress,
|
||||
desc: `Indexing ${path}`,
|
||||
status: "indexing",
|
||||
};
|
||||
}
|
||||
|
||||
// Delete or remove tag - remove from lance table
|
||||
if (!needToCreateTable) {
|
||||
for (const { path, cacheKey } of [...results.removeTag, ...results.del]) {
|
||||
const toDel = [...results.removeTag, ...results.del];
|
||||
for (const { path, cacheKey } of toDel) {
|
||||
// This is where the aforementioned lowercase conversion problem shows
|
||||
await table?.delete(`cachekey = '${cacheKey}' AND path = '${path}'`);
|
||||
|
||||
accumulatedProgress += 1 / toDel.length / 3;
|
||||
yield {
|
||||
progress: accumulatedProgress,
|
||||
desc: `Deleting ${path}`,
|
||||
status: "indexing",
|
||||
};
|
||||
}
|
||||
}
|
||||
markComplete(results.removeTag, IndexResultType.RemoveTag);
|
||||
|
@ -318,6 +341,12 @@ export class LanceDbIndex implements CodebaseIndex {
|
|||
cacheKey,
|
||||
path,
|
||||
);
|
||||
accumulatedProgress += 1 / results.del.length / 3;
|
||||
yield {
|
||||
progress: accumulatedProgress,
|
||||
desc: `Deleting ${path}`,
|
||||
status: "indexing",
|
||||
};
|
||||
}
|
||||
|
||||
markComplete(results.del, IndexResultType.Delete);
|
||||
|
|
|
@ -12,6 +12,7 @@ import {
|
|||
import { chunkDocument } from "./chunk.js";
|
||||
|
||||
export class ChunkCodebaseIndex implements CodebaseIndex {
|
||||
relativeExpectedTime: number = 1;
|
||||
static artifactId = "chunks";
|
||||
artifactId: string = ChunkCodebaseIndex.artifactId;
|
||||
|
||||
|
|
|
@ -52,9 +52,11 @@ export class CodebaseIndexer {
|
|||
workspaceDirs: string[],
|
||||
abortSignal: AbortSignal,
|
||||
): AsyncGenerator<IndexingProgressUpdate> {
|
||||
let progress = 0;
|
||||
|
||||
if (workspaceDirs.length === 0) {
|
||||
yield {
|
||||
progress: 0,
|
||||
progress,
|
||||
desc: "Nothing to index",
|
||||
status: "disabled",
|
||||
};
|
||||
|
@ -64,29 +66,32 @@ export class CodebaseIndexer {
|
|||
const config = await this.configHandler.loadConfig();
|
||||
if (config.disableIndexing) {
|
||||
yield {
|
||||
progress: 0,
|
||||
progress,
|
||||
desc: "Indexing is disabled in config.json",
|
||||
status: "disabled",
|
||||
};
|
||||
return;
|
||||
} else {
|
||||
yield {
|
||||
progress: 0,
|
||||
progress,
|
||||
desc: "Starting indexing",
|
||||
status: "loading",
|
||||
};
|
||||
}
|
||||
|
||||
const indexesToBuild = await this.getIndexesToBuild();
|
||||
|
||||
let completedDirs = 0;
|
||||
const totalRelativeExpectedTime = indexesToBuild.reduce(
|
||||
(sum, index) => sum + index.relativeExpectedTime,
|
||||
0,
|
||||
);
|
||||
|
||||
// Wait until Git Extension has loaded to report progress
|
||||
// so we don't appear stuck at 0% while waiting
|
||||
await this.ide.getRepoName(workspaceDirs[0]);
|
||||
|
||||
yield {
|
||||
progress: 0,
|
||||
progress,
|
||||
desc: "Starting indexing...",
|
||||
status: "loading",
|
||||
};
|
||||
|
@ -99,7 +104,7 @@ export class CodebaseIndexer {
|
|||
const stats = await this.ide.getLastModified(files);
|
||||
const branch = await this.ide.getBranch(directory);
|
||||
const repoName = await this.ide.getRepoName(directory);
|
||||
let completedIndexes = 0;
|
||||
let completedRelativeExpectedTime = 0;
|
||||
|
||||
for (const codebaseIndex of indexesToBuild) {
|
||||
// TODO: IndexTag type should use repoName rather than directory
|
||||
|
@ -116,12 +121,10 @@ export class CodebaseIndexer {
|
|||
);
|
||||
|
||||
try {
|
||||
for await (let { progress, desc } of codebaseIndex.update(
|
||||
tag,
|
||||
results,
|
||||
markComplete,
|
||||
repoName,
|
||||
)) {
|
||||
for await (let {
|
||||
progress: indexProgress,
|
||||
desc,
|
||||
} of codebaseIndex.update(tag, results, markComplete, repoName)) {
|
||||
// Handle pausing in this loop because it's the only one really taking time
|
||||
if (abortSignal.aborted) {
|
||||
yield {
|
||||
|
@ -134,7 +137,7 @@ export class CodebaseIndexer {
|
|||
|
||||
if (this.pauseToken.paused) {
|
||||
yield {
|
||||
progress: completedDirs / workspaceDirs.length,
|
||||
progress,
|
||||
desc: "Paused",
|
||||
status: "paused",
|
||||
};
|
||||
|
@ -143,19 +146,24 @@ export class CodebaseIndexer {
|
|||
}
|
||||
}
|
||||
|
||||
progress =
|
||||
(completedDirs +
|
||||
(completedRelativeExpectedTime +
|
||||
indexProgress * codebaseIndex.relativeExpectedTime) /
|
||||
totalRelativeExpectedTime) /
|
||||
workspaceDirs.length;
|
||||
yield {
|
||||
progress:
|
||||
(completedDirs +
|
||||
(completedIndexes + progress) / indexesToBuild.length) /
|
||||
workspaceDirs.length,
|
||||
progress,
|
||||
desc,
|
||||
status: "indexing",
|
||||
};
|
||||
}
|
||||
completedIndexes++;
|
||||
|
||||
completedRelativeExpectedTime += codebaseIndex.relativeExpectedTime;
|
||||
yield {
|
||||
progress:
|
||||
(completedDirs + completedIndexes / indexesToBuild.length) /
|
||||
(completedDirs +
|
||||
completedRelativeExpectedTime / totalRelativeExpectedTime) /
|
||||
workspaceDirs.length,
|
||||
desc: "Completed indexing " + codebaseIndex.artifactId,
|
||||
status: "indexing",
|
||||
|
@ -186,8 +194,9 @@ export class CodebaseIndexer {
|
|||
}
|
||||
|
||||
completedDirs++;
|
||||
progress = completedDirs / workspaceDirs.length;
|
||||
yield {
|
||||
progress: completedDirs / workspaceDirs.length,
|
||||
progress,
|
||||
desc: "Indexing Complete",
|
||||
status: "done",
|
||||
};
|
||||
|
|
|
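To make the new weighting concrete, here is a standalone sketch (illustrative names only, not the extension's actual helpers) of how overall progress is now derived from each index's `relativeExpectedTime` instead of a flat per-index count:

```typescript
// Sketch, not the real implementation: progress weighted by relativeExpectedTime,
// so slow indexes (e.g. vectordb at 13) dominate the bar instead of jumping it.
interface IndexWeight {
  artifactId: string;
  relativeExpectedTime: number;
}

function overallProgress(
  indexes: IndexWeight[],
  completedDirs: number,
  totalDirs: number,
  completedRelativeExpectedTime: number,
  currentIndexProgress: number, // 0..1 within the index currently running
  currentIndex: IndexWeight,
): number {
  const totalRelativeExpectedTime = indexes.reduce(
    (sum, index) => sum + index.relativeExpectedTime,
    0,
  );
  // Fraction of the current workspace directory that is done.
  const dirFraction =
    (completedRelativeExpectedTime +
      currentIndexProgress * currentIndex.relativeExpectedTime) /
    totalRelativeExpectedTime;
  return (completedDirs + dirFraction) / totalDirs;
}

// e.g. with chunks (1) and sqliteFts (0.2) done and vectordb (13) halfway
// through the only workspace dir: (0 + (1.2 + 0.5 * 13) / 14.2) / 1 ≈ 0.54
```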
@ -329,6 +329,7 @@ export async function getComputeDeleteAddRemove(
|
|||
}
|
||||
|
||||
export class GlobalCacheCodeBaseIndex implements CodebaseIndex {
|
||||
relativeExpectedTime: number = 1;
|
||||
private db: DatabaseConnection;
|
||||
|
||||
constructor(db: DatabaseConnection) {
|
||||
|
|
|
@ -14,6 +14,7 @@ export type MarkCompleteCallback = (
|
|||
|
||||
export interface CodebaseIndex {
|
||||
artifactId: string;
|
||||
relativeExpectedTime: number;
|
||||
update(
|
||||
tag: IndexTag,
|
||||
results: RefreshIndexResults,
|
||||
|
|
|
@ -12,7 +12,7 @@ class Deepseek extends OpenAI {
|
|||
},
|
||||
useLegacyCompletionsEndpoint: false,
|
||||
};
|
||||
protected maxStopWords: number | undefined = 4;
|
||||
protected maxStopWords: number | undefined = 16;
|
||||
}
|
||||
|
||||
export default Deepseek;
|
||||
|
|
|
@ -94,13 +94,14 @@ class OpenAI extends BaseLLM {
|
|||
// Jan + Azure OpenAI don't truncate and will throw an error
|
||||
this.maxStopWords !== undefined
|
||||
? options.stop?.slice(0, this.maxStopWords)
|
||||
: url.port === "1337" ||
|
||||
url.host === "api.openai.com" ||
|
||||
url.host === "api.groq.com" ||
|
||||
url.host === "api.deepseek.com" ||
|
||||
this.apiType === "azure"
|
||||
? options.stop?.slice(0, 4)
|
||||
: options.stop,
|
||||
: url.host === "api.deepseek.com"
|
||||
? options.stop?.slice(0, 16)
|
||||
: url.port === "1337" ||
|
||||
url.host === "api.openai.com" ||
|
||||
url.host === "api.groq.com" ||
|
||||
this.apiType === "azure"
|
||||
? options.stop?.slice(0, 4)
|
||||
: options.stop,
|
||||
};
|
||||
|
||||
return finalOptions;
|
||||
|
|
|
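For clarity, a hedged sketch of the stop-token truncation rules the two hunks above imply: a subclass-level `maxStopWords` (now 16 for Deepseek) wins, otherwise the request host decides. This is a simplification, not the provider's real code:

```typescript
// Simplified illustration of how a stop list would be capped.
function truncateStopWords(
  stop: string[] | undefined,
  maxStopWords: number | undefined, // e.g. 16 on the Deepseek subclass
  host: string,
  apiType?: string,
): string[] | undefined {
  if (maxStopWords !== undefined) {
    return stop?.slice(0, maxStopWords);
  }
  if (host === "api.deepseek.com") {
    return stop?.slice(0, 16);
  }
  // (The real check also caps Jan, detected via port 1337, at 4.)
  const fourWordHosts = ["api.openai.com", "api.groq.com"];
  if (fourWordHosts.includes(host) || apiType === "azure") {
    return stop?.slice(0, 4);
  }
  return stop;
}

// truncateStopWords(twentyStops, undefined, "api.deepseek.com") keeps 16 entries,
// while "api.openai.com" keeps only the first 4.
```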
@ -1,6 +1,10 @@
|
|||
export async function* streamResponse(
|
||||
response: Response,
|
||||
): AsyncGenerator<string> {
|
||||
async function* toAsyncIterable(nodeReadable: NodeJS.ReadableStream): AsyncGenerator<Uint8Array> {
|
||||
for await (const chunk of nodeReadable) {
|
||||
yield chunk as Uint8Array;
|
||||
}
|
||||
}
|
||||
|
||||
export async function* streamResponse(response: Response): AsyncGenerator<string> {
|
||||
if (response.status !== 200) {
|
||||
throw new Error(await response.text());
|
||||
}
|
||||
|
@ -9,9 +13,24 @@ export async function* streamResponse(
|
|||
throw new Error("No response body returned.");
|
||||
}
|
||||
|
||||
const stream = (ReadableStream as any).from(response.body);
|
||||
// Get the major version of Node.js
|
||||
const nodeMajorVersion = parseInt(process.versions.node.split(".")[0], 10);
|
||||
|
||||
yield* stream.pipeThrough(new TextDecoderStream("utf-8"));
|
||||
if (nodeMajorVersion >= 20) {
|
||||
// Use the new API for Node 20 and above
|
||||
const stream = (ReadableStream as any).from(response.body);
|
||||
for await (const chunk of stream.pipeThrough(new TextDecoderStream("utf-8"))) {
|
||||
yield chunk;
|
||||
}
|
||||
} else {
|
||||
// Fallback for Node versions below 20
|
||||
// Streaming with this method doesn't work as version 20+ does
|
||||
const decoder = new TextDecoder("utf-8");
|
||||
const nodeStream = response.body as unknown as NodeJS.ReadableStream;
|
||||
for await (const chunk of toAsyncIterable(nodeStream)) {
|
||||
yield decoder.decode(chunk, { stream: true });
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function parseDataLine(line: string): any {
|
||||
|
|
|
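A minimal consumption sketch of the streaming helper above: callers receive plain strings whether the Node 20+ path or the pre-20 fallback ran.

```typescript
// Assumes the streamResponse generator defined above is in scope. Both the
// Node 20+ path (ReadableStream.from + TextDecoderStream) and the pre-20
// toAsyncIterable/TextDecoder fallback yield decoded string chunks.
async function collectStreamedText(response: Response): Promise<string> {
  let text = "";
  for await (const chunk of streamResponse(response)) {
    text += chunk;
  }
  return text;
}
```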
@ -131,9 +131,8 @@ export type ToCoreFromIdeOrWebviewProtocol = {
|
|||
| "apiKeys"
|
||||
| "custom"
|
||||
| "freeTrial"
|
||||
| "localExistingUser"
|
||||
| "optimizedExistingUser"
|
||||
| "localAfterFreeTrial";
|
||||
| "localAfterFreeTrial"
|
||||
| "bestExperience";
|
||||
},
|
||||
void,
|
||||
];
|
||||
|
|
|
@ -5,8 +5,10 @@ import type {
|
|||
IdeInfo,
|
||||
IdeSettings,
|
||||
IndexTag,
|
||||
Location,
|
||||
Problem,
|
||||
Range,
|
||||
RangeInFile,
|
||||
Thread,
|
||||
} from "..";
|
||||
|
||||
|
@ -74,5 +76,7 @@ export type ToIdeFromWebviewOrCoreProtocol = {
|
|||
listDir: [{ dir: string }, [string, FileType][]];
|
||||
getLastModified: [{ files: string[] }, { [path: string]: number }];
|
||||
|
||||
gotoDefinition: [{ location: Location }, RangeInFile[]];
|
||||
|
||||
getGitHubAuthToken: [undefined, string | undefined];
|
||||
};
|
||||
|
|
|
@ -12,7 +12,10 @@ export type IProtocol = Record<string, [any, any]>;
|
|||
|
||||
// IDE
|
||||
export type ToIdeProtocol = ToIdeFromWebviewProtocol & ToIdeFromCoreProtocol;
|
||||
export type FromIdeProtocol = ToWebviewFromIdeProtocol & ToCoreFromIdeProtocol;
|
||||
export type FromIdeProtocol = ToWebviewFromIdeProtocol &
|
||||
ToCoreFromIdeProtocol & {
|
||||
didChangeActiveTextEditor: [{ filepath: string }, void];
|
||||
};
|
||||
|
||||
// Webview
|
||||
export type ToWebviewProtocol = ToWebviewFromIdeProtocol &
|
||||
|
|
|
@ -0,0 +1,30 @@
|
|||
export class PrecalculatedLruCache<V> {
|
||||
private items: [string, V][] = [];
|
||||
constructor(
|
||||
private readonly calculateValue: (key: string) => Promise<V>,
|
||||
private readonly N: number,
|
||||
) {}
|
||||
|
||||
async initKey(key: string) {
|
||||
// Maintain LRU
|
||||
const index = this.items.findIndex((item) => item[0] === key);
|
||||
|
||||
if (index < 0) {
|
||||
// Calculate info for new file
|
||||
const value: V = await this.calculateValue(key);
|
||||
|
||||
this.items.push([key, value]);
|
||||
if (this.items.length > this.N) {
|
||||
this.items.shift();
|
||||
}
|
||||
} else {
|
||||
// Move to end of array, since it was recently used
|
||||
const [item] = this.items.splice(index, 1);
|
||||
this.items.push(item);
|
||||
}
|
||||
}
|
||||
|
||||
get(key: string): V | undefined {
|
||||
return this.items.find((item) => item[0] === key)?.[1];
|
||||
}
|
||||
}
|
|
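A hypothetical usage sketch of `PrecalculatedLruCache`, roughly how an import cache for autocomplete could lean on it; `parseImportsForFile` is a placeholder, not a real helper from the codebase.

```typescript
// Placeholder for whatever expensive computation the cache fronts.
declare function parseImportsForFile(filepath: string): Promise<string[]>;

const importCache = new PrecalculatedLruCache<string[]>(
  (filepath) => parseImportsForFile(filepath),
  10, // keep at most the 10 most recently active files
);

async function onEditorChanged(filepath: string) {
  // Warm the cache in the background when the active editor changes...
  await importCache.initKey(filepath);
}

function importsForAutocomplete(filepath: string): string[] {
  // ...and read it synchronously on the hot autocomplete path.
  return importCache.get(filepath) ?? [];
}
```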
@ -6,8 +6,10 @@ import {
|
|||
IdeInfo,
|
||||
IdeSettings,
|
||||
IndexTag,
|
||||
Location,
|
||||
Problem,
|
||||
Range,
|
||||
RangeInFile,
|
||||
Thread,
|
||||
} from "../index.d.js";
|
||||
|
||||
|
@ -23,6 +25,13 @@ class FileSystemIde implements IDE {
|
|||
return Promise.resolve(fs.existsSync(filepath));
|
||||
}
|
||||
|
||||
gotoDefinition(location: Location): Promise<RangeInFile[]> {
|
||||
throw new Error("Method not implemented.");
|
||||
}
|
||||
onDidChangeActiveTextEditor(callback: (filepath: string) => void): void {
|
||||
throw new Error("Method not implemented.");
|
||||
}
|
||||
|
||||
async getIdeSettings(): Promise<IdeSettings> {
|
||||
return {
|
||||
remoteConfigServerUrl: undefined,
|
||||
|
|
|
@ -5,11 +5,14 @@ import type {
|
|||
IdeInfo,
|
||||
IdeSettings,
|
||||
IndexTag,
|
||||
Location,
|
||||
Problem,
|
||||
Range,
|
||||
RangeInFile,
|
||||
Thread,
|
||||
} from "../index.js";
|
||||
import { ToIdeFromWebviewOrCoreProtocol } from "../protocol/ide.js";
|
||||
import { FromIdeProtocol } from "../protocol/index.js";
|
||||
|
||||
export class MessageIde implements IDE {
|
||||
constructor(
|
||||
|
@ -17,10 +20,21 @@ export class MessageIde implements IDE {
|
|||
messageType: T,
|
||||
data: ToIdeFromWebviewOrCoreProtocol[T][0],
|
||||
) => Promise<ToIdeFromWebviewOrCoreProtocol[T][1]>,
|
||||
private readonly on: <T extends keyof FromIdeProtocol>(
|
||||
messageType: T,
|
||||
callback: (data: FromIdeProtocol[T][0]) => FromIdeProtocol[T][1],
|
||||
) => void,
|
||||
) {}
|
||||
fileExists(filepath: string): Promise<boolean> {
|
||||
return this.request("fileExists", { filepath });
|
||||
}
|
||||
async gotoDefinition(location: Location): Promise<RangeInFile[]> {
|
||||
return this.request("gotoDefinition", { location });
|
||||
}
|
||||
onDidChangeActiveTextEditor(callback: (filepath: string) => void): void {
|
||||
this.on("didChangeActiveTextEditor", (data) => callback(data.filepath));
|
||||
}
|
||||
|
||||
getIdeSettings(): Promise<IdeSettings> {
|
||||
return this.request("getIdeSettings", undefined);
|
||||
}
|
||||
|
|
|
@ -134,8 +134,9 @@ export class InProcessMessenger<
|
|||
externalRequest<T extends keyof ToProtocol>(
|
||||
messageType: T,
|
||||
data: ToProtocol[T][0],
|
||||
_messageId?: string,
|
||||
): Promise<ToProtocol[T][1]> {
|
||||
const messageId = uuidv4();
|
||||
const messageId = _messageId ?? uuidv4();
|
||||
const listener = this.myTypeListeners.get(messageType);
|
||||
if (!listener) {
|
||||
throw new Error(`No handler for message type "${String(messageType)}"`);
|
||||
|
|
|
@ -19,6 +19,7 @@ export const DEFAULT_AUTOCOMPLETE_OPTS: TabAutocompleteOptions = {
|
|||
useRecentlyEdited: true,
|
||||
recentLinePrefixMatchMinLength: 7,
|
||||
disableInFiles: undefined,
|
||||
useImports: true,
|
||||
};
|
||||
|
||||
export const COUNT_COMPLETION_REJECTED_AFTER = 10_000;
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
import * as path from "path";
|
||||
import fs from "node:fs";
|
||||
import path from "node:path";
|
||||
import Parser, { Language } from "web-tree-sitter";
|
||||
|
||||
export const supportedLanguages: { [key: string]: string } = {
|
||||
|
@ -115,3 +116,34 @@ export async function getLanguageForFile(
|
|||
return undefined;
|
||||
}
|
||||
}
|
||||
|
||||
export enum TSQueryType {
|
||||
CodeSnippets = "code-snippet-queries",
|
||||
Imports = "import-queries",
|
||||
}
|
||||
|
||||
export async function getQueryForFile(
|
||||
filepath: string,
|
||||
queryType: TSQueryType,
|
||||
): Promise<Parser.Query | undefined> {
|
||||
const language = await getLanguageForFile(filepath);
|
||||
if (!language) {
|
||||
return undefined;
|
||||
}
|
||||
|
||||
const fullLangName = supportedLanguages[filepath.split(".").pop() ?? ""];
|
||||
const sourcePath = path.join(
|
||||
__dirname,
|
||||
"..",
|
||||
"tree-sitter",
|
||||
queryType,
|
||||
`${fullLangName}.scm`,
|
||||
);
|
||||
if (!fs.existsSync(sourcePath)) {
|
||||
return undefined;
|
||||
}
|
||||
const querySource = fs.readFileSync(sourcePath).toString();
|
||||
|
||||
const query = language.query(querySource);
|
||||
return query;
|
||||
}
|
||||
|
|
|
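A rough sketch of how the new `getQueryForFile` helper replaces the per-index query loading shown earlier; error handling and the surrounding index plumbing are omitted, and `contents` is assumed to be the file's text.

```typescript
import {
  getParserForFile,
  getQueryForFile,
  TSQueryType,
} from "../util/treeSitter.js";

// Parse a file and run the tree-sitter tags query appropriate to its language.
async function matchCodeSnippets(filepath: string, contents: string) {
  const parser = await getParserForFile(filepath);
  const query = await getQueryForFile(filepath, TSQueryType.CodeSnippets);
  if (!parser || !query) {
    return [];
  }
  const ast = parser.parse(contents);
  return query.matches(ast.rootNode);
}
```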
@ -0,0 +1,30 @@
---
title: 📝 Changelog
description: Changelogs for the latest releases of the VSCode and IntelliJ extensions
keywords: [changelog]
hide_table_of_contents: true
---

import VSCodeChangelog from "../../extensions/vscode/CHANGELOG.md";
import IntelliJChangelog from "../../extensions/intellij/CHANGELOG.md";

Below are the changelogs for the latest releases of the VSCode and IntelliJ extensions.

import TOCInline from "@theme/TOCInline";

<TOCInline
  toc={toc.filter(
    (node) =>
      node.value === "VSCode Changelog" || node.value === "IntelliJ Changelog",
  )}
/>

<hr />

## VSCode Changelog

<VSCodeChangelog />

## IntelliJ Changelog

<IntelliJChangelog />
@ -202,7 +202,12 @@ assignee = currentUser() AND resolution = Unresolved order by updated DESC
|
|||
|
||||
You can override this query by setting the `issueQuery` parameter.
|
||||
|
||||
### Code Outline
|
||||
<!--
|
||||
Note: We are currently omitting the following providers due to bugs.
|
||||
See this issue for details: https://github.com/continuedev/continue/issues/1365
|
||||
-->
|
||||
|
||||
<!-- ### Code Outline
|
||||
|
||||
Type '@outline' to reference the outline of all currently open files. The outline of a files consists of only the function and class definitions in the file. Supported file extensions are '.js', '.mjs', '.go', '.c', '.cc', '.cs', '.cpp', '.el', '.ex', '.elm', '.java', '.ml', '.php', '.ql', '.rb', '.rs', '.ts'
|
||||
|
||||
|
@ -216,7 +221,7 @@ Type '@highlights' to reference the 'highlights' from all currently open files.
|
|||
|
||||
```json
|
||||
{ "name": "highlights" }
|
||||
```
|
||||
``` -->
|
||||
|
||||
### PostgreSQL
|
||||
|
||||
|
@ -312,10 +317,15 @@ interface CustomContextProvider {
|
|||
title: string;
|
||||
displayTitle?: string;
|
||||
description?: string;
|
||||
renderInlineAs?: string;
|
||||
type?: ContextProviderType;
|
||||
getContextItems(
|
||||
query: string,
|
||||
extras: ContextProviderExtras,
|
||||
): Promise<ContextItem[]>;
|
||||
loadSubmenuItems?: (
|
||||
args: LoadSubmenuItemsArgs,
|
||||
) => Promise<ContextSubmenuItem[]>;
|
||||
}
|
||||
```
|
||||
|
||||
|
@ -455,6 +465,7 @@ Continue will use [esbuild](https://esbuild.github.io/) to bundle your `config.t
|
|||
- `displayTitle` (optional): The title displayed in the dropdown
|
||||
- `description` (optional): The longer description displayed in the dropdown when hovered
|
||||
- `type` (optional): The type of context provider. Options are "normal", "query", and "submenu". Defaults to "normal".
|
||||
- `renderInlineAs` (optional): The string that will be rendered inline at the top of the prompt. If no value is provided, the `displayTitle` will be used. An empty string can be provided to prevent rendering the default `displayTitle`.
|
||||
- `getContextItems`: A function that returns the documents to include in the prompt. It should return a list of `ContextItem`s, and is given access to the following arguments:
|
||||
- `extras.fullInput`: A string representing the user's full input to the text box. This can be used for example to generate an embedding to compare against a set of other embedded documents
|
||||
- `extras.embeddingsProvider`: The embeddings provider has an `embed` function that will convert text (such as `fullInput`) to an embedding
|
||||
|
|
|
@ -28,6 +28,6 @@ keywords: [quickstart, start, install, vscode, jetbrains]
|
|||
|
||||

|
||||
|
||||
4. If you have any problems, see the [troubleshooting guide](./troubleshooting.md) or ask for help in [our Discord](https://discord.gg/NWtdYexhMs).
|
||||
4. If you have any problems, see the [troubleshooting guide](./troubleshooting.md) or ask for help in [our Discord](https://discord.com/invite/EfJEfdFnDQ).
|
||||
|
||||
> Alternatively, you can install from disk after downloading from the [JetBrains Extension Marketplace](https://plugins.jetbrains.com/plugin/22707-continue-extension).
|
||||
|
|
|
@ -8,7 +8,7 @@ To setup Anthropic, obtain an API key from [here](https://www.anthropic.com/api)
    {
      "title": "Anthropic",
      "provider": "anthropic",
      "model": "claude-3-opus-20240229",
      "model": "claude-3-5-sonnet-20240620",
      "apiKey": "YOUR_API_KEY"
    }
  ]
@ -6,6 +6,85 @@ keywords: [configure, llm, provider]
# Configuration

Want a quick and easy setup for Continue? We've got you covered with some sample `config.json` files for different scenarios. Just copy and paste them into your `config.json` by clicking the gear icon at the bottom right of the Continue sidebar.

## Best Overall Experience

This setup uses Claude 3.5 Sonnet for chatting, Codestral for autocomplete, and Voyage AI for embeddings and reranking.

**What You Need:**

1. Get a Codestral API key from [Mistral AI's La Plateforme](https://console.mistral.ai/codestral)
2. Get an Anthropic API key from [Anthropic Console](https://console.anthropic.com/account/keys)
3. Replace `[CODESTRAL_API_KEY]` and `[ANTHROPIC_API_KEY]` with the keys you got from the above links.

:::note
This example uses a free trial for embeddings and reranking, forwarding requests via ContinueDev proxy. For direct service, get a Voyage AI API key and update the `provider` and `apiKey` fields. See the [config reference for Voyage AI](../walkthroughs//codebase-embeddings.md#voyage-ai) for details on how to set this up.
:::

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20240620",
      "apiKey": "[ANTHROPIC_API_KEY]"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest",
    "apiKey": "[CODESTRAL_API_KEY]"
  },
  "embeddingsProvider": {
    "provider": "free-trial"
  },
  "reranker": {
    "name": "free-trial"
  }
}
```

## Local and Offline Configuration

This configuration leverages Ollama for all functionalities - chat, autocomplete, and embeddings - ensuring that no code is transmitted outside your machine, allowing Continue to be run even on an air-gapped computer.

**What You Need:**

1. Download Ollama from [Ollama's Official Site](https://ollama.ai)
2. Pull the required models:
   - For chat: `ollama pull llama3:8b`
   - For autocomplete: `ollama pull starcoder2:3b`
   - For embeddings: `ollama pull nomic-embed-text`

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Ollama",
      "provider": "ollama",
      "model": "AUTODETECT"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Starcoder 2 3b",
    "provider": "ollama",
    "model": "starcoder2:3b"
  },
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```

If you require a fully air-gapped setup, you may also want to:

1. For VS Code, manually download the latest .vsix file from the [Open VSX Registry](https://open-vsx.org/extension/Continue/continue) rather than the VS Code Marketplace and [install it to VS Code](https://code.visualstudio.com/docs/editor/extension-marketplace#_install-from-a-vsix). For JetBrains, manually download the .zip file from the [JetBrains Plugin Repository](https://plugins.jetbrains.com/plugin/22707-continue) and [install it to your IDE](https://www.jetbrains.com/help/idea/managing-plugins.html#install_plugin_from_disk).
2. Open `~/.continue/config.json` and set `"allowAnonymousTelemetry": false`. This will stop Continue from attempting requests to PostHog for [anonymous telemetry](../telemetry.md).

## Setting up chat models

In `config.json`, you'll find the `models` property, a list of the models that you have saved to use with Continue:
@ -1,77 +0,0 @@
|
|||
# Example configurations
|
||||
|
||||
If you're looking for a quick way to create the perfect Continue setup, we've written a few sample `config.json`s for common situations. You can copy these and paste them into your `config.json` by clicking the gear icon in the bottom right of the Continue sidebar.
|
||||
|
||||
## I just want the best experience possible
|
||||
|
||||
This uses Claude 3 Opus for chat, Codestral for autocomplete, and Voyage AI for embeddings and reranking.
|
||||
|
||||
### Pre-requisites
|
||||
|
||||
1. Obtain a Codestral API key from [Mistral AI's La Plateforme](https://console.mistral.ai/codestral)
|
||||
|
||||
2. Obtain an Anthropic API key from [here](https://console.anthropic.com/account/keys)
|
||||
|
||||
3. Replace `[CODESTRAL_API_KEY]` and `[ANTHROPIC_API_KEY]` with the keys you obtained in the previous steps.
|
||||
|
||||
```json title="~/.continue/config.json"
|
||||
{
|
||||
"models": [
|
||||
{
|
||||
"title": "Claude 3 Opus",
|
||||
"provider": "anthropic",
|
||||
"model": "claude-3-opus-20240229",
|
||||
"apiKey": "[ANTHROPIC_API_KEY]"
|
||||
}
|
||||
],
|
||||
"tabAutocompleteModel": {
|
||||
"title": "Codestral",
|
||||
"provider": "mistral",
|
||||
"model": "codestral-latest",
|
||||
"apiKey": "[CODESTRAL_API_KEY]"
|
||||
},
|
||||
"embeddingsProvider": {
|
||||
"provider": "free-trial"
|
||||
},
|
||||
"reranker": {
|
||||
"name": "free-trial"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## I need to be entirely local + offline
|
||||
|
||||
This uses Ollama for chat, autocomplete, and embeddings, making sure that no code ever leaves your machine.
|
||||
|
||||
### Pre-requisites
|
||||
|
||||
1. [Download Ollama](https://ollama.ai)
|
||||
|
||||
2. Pull the necessary models
|
||||
|
||||
i. For chat: `ollama run llama3:8b`
|
||||
|
||||
ii. For autocomplete: `ollama run starcoder2:3b`
|
||||
|
||||
iii. For embeddings: `ollama run nomic-embed-text`
|
||||
|
||||
```json title="~/.continue/config.json"
|
||||
{
|
||||
"models": [
|
||||
{
|
||||
"title": "Ollama",
|
||||
"provider": "ollama",
|
||||
"model": "AUTODETECT"
|
||||
}
|
||||
],
|
||||
"tabAutocompleteModel": {
|
||||
"title": "Starcoder 2 3b",
|
||||
"provider": "ollama",
|
||||
"model": "starcoder2:3b"
|
||||
},
|
||||
"embeddingsProvider": {
|
||||
"provider": "ollama",
|
||||
"model": "nomic-embed-text"
|
||||
}
|
||||
}
|
||||
```
|
|
@ -8,7 +8,6 @@ You will need to decide which models and providers you use for [chat](select-mod
|
|||
|
||||
Learn more:
|
||||
|
||||
- [Configuration](configuration.md)
|
||||
- [Select providers](select-provider.md)
|
||||
- [Select models](select-model.md)
|
||||
- [Configuration](configuration.md)
|
||||
- [Example `config.json`s](examples.md)
|
||||
|
|
|
@ -32,8 +32,8 @@ _You can also use other open-source chat models by adding them to your `config.j
|
|||
|
||||
#### Claude 3 from Anthropic
|
||||
|
||||
- Unlimited budget: `claude-3-opus-20240229`
|
||||
- Limited budget: `claude-3-sonnet-20240229`
|
||||
- Unlimited budget: `claude-3-5-sonnet-20240620`
|
||||
- Limited budget: `claude-3-5-sonnet-20240620`
|
||||
|
||||
#### GPT-4o from OpenAI
|
||||
|
||||
|
@ -75,11 +75,10 @@ _You can also use other autocomplete models by adding them to your `config.json`
|
|||
|
||||
## Embeddings
|
||||
|
||||
You likely want to use an embeddings model that is made to vectorize code.
|
||||
We recommend the following embeddings models, which are used for codebase retrieval as described [here](../walkthroughs/codebase-embeddings.md#embeddings-providers)
|
||||
|
||||
### Open-source models
|
||||
|
||||
- `transformers.js`
|
||||
- `nomic-embed-text`
|
||||
|
||||
### Commercial models
|
||||
|
|
|
@ -8,18 +8,23 @@ keywords: [telemetry, anonymous, usage info, opt out]
## Overview

Continue collects and reports **anonymous** usage information. This data is essential to understanding how we should improve the product. You can opt out of it at any time. We use [Posthog](https://posthog.com/), an open source platform for product analytics, to collect and store the data. You can review the code [here](https://github.com/continuedev/continue/blob/main/gui/src/hooks/CustomPostHogProvider.tsx), or review our official policy [here](https://continue.dev/privacy).
Continue collects and reports **anonymous** usage information.

This data is essential to understanding how we should improve the product. You can opt out of it at any time.

We use [Posthog](https://posthog.com/), an open source platform for product analytics, to collect and store the data. You can review the code [here](https://github.com/continuedev/continue/blob/main/gui/src/hooks/CustomPostHogProvider.tsx), or review our official policy [here](https://continue.dev/privacy).

## Tracking policy

All data is anonymous and cleaned of PII before being sent to PostHog.

## What we track

We track

- whether you accept or reject suggestions (never including the code or the prompt)
- the name of the model and slash command used
- the number of tokens generated
- the name of your OS and IDE

All data is anonymous and cleaned of PII before being sent to PostHog.
- Whether you accept or reject suggestions (never including the code or the prompt)
- The name of the model and slash command used
- The number of tokens generated
- The name of your OS and IDE
- Pageviews

## How to opt out
@ -1,7 +1,7 @@
|
|||
---
|
||||
title: Codebase Retrieval
|
||||
description: Talk to your codebase
|
||||
keywords: [talk, embeddings, codebase, experimental]
|
||||
keywords: [talk, embeddings, reranker, codebase, experimental]
|
||||
---
|
||||
|
||||
# Codebase retrieval
|
||||
|
@ -113,8 +113,9 @@ Voyage AI offers the best embeddings for code with their voyage-code-2 model. Af
OpenAI's [embeddings](https://platform.openai.com/docs/guides/embeddings) are high dimensional embeddings that give great performance on both text and code.

Configuration for text-embedding-3-small Model. This is default.
The text-embedding-3-small model offers an outstanding balance between performance and efficiency, suitable for a versatile range of applications.
#### Configuration for the `text-embedding-3-small` model

This is default. The `text-embedding-3-small` model offers an outstanding balance between performance and efficiency, suitable for a versatile range of applications.

```json title="~/.continue/config.json"
{
@ -127,8 +128,9 @@ The text-embedding-3-small model offers an outstanding balance between performan
}
```

Configuration for text-embedding-3-large Model
For those requiring the highest level of embedding detail and precision, the text-embedding-3-large model is the better choice.
#### Configuration for the `text-embedding-3-large` model

For those requiring the highest level of embedding detail and precision, the `text-embedding-3-large` model is the better choice.

```json title="~/.continue/config.json"
{
@ -141,8 +143,9 @@ For those requiring the highest level of embedding detail and precision, the tex
}
```

Legacy Model Configuration
For certain scenarios, you may still find the text-embedding-ada-002 model relevant. Below is the configuration example:
#### Legacy Model Configuration

For certain scenarios, you may still find the `text-embedding-ada-002` model relevant. Below is the configuration example:

```json title="~/.continue/config.json"
{
@ -213,9 +216,9 @@ export function modifyConfig(config: Config): Config {
## Reranking providers

A reranking model improves the accuracy of codebase retrieval by breaking it into two stages. Instead of only using similarity search to go from all possible snippets to the top 10-20, we can instead get the top 100+ from similarity search and then use a more expensive model, the "reranker", to more accurately decide the top 10-20.
The reranker plays a crucial role in refining the results retrieved from your codebase. It processes the initial set of results obtained through embeddings-based retrieval, improving their relevance and accuracy for your queries.

Because reranking requires 100+ simultaneous invocations of a model, there are not currently any local rerankers that Continue supports. However if you are able to use any of the below options, it is strongly recommended, as they are cheap but will drastically improve the quality of codebase retrieval.
Continue offers several reranking options: `cohere`, `voyage`, `llm`, and `free-trial`, which can be configured in `config.json`.

### Voyage AI

@ -266,6 +269,18 @@ If you only have access to a single LLM, then you can use it as a reranker. This

The `"modelTitle"` field must match one of the models in your "models" array in config.json.

### Free Trial (Voyage AI)

Continue offers a free trial of Voyage AI's reranking model.

```json title="~/.continue/config.json"
{
  "reranker": {
    "name": "free-trial"
  }
}
```

## Customizing which files are indexed

Continue respects `.gitignore` files in order to determine which files should not be indexed. If you'd like to exclude additional files, you can add them to a `.continueignore` file, which follows the exact same rules as `.gitignore`.
@ -4,11 +4,11 @@ Prompt (`.prompt`) files are an easy way to build and share LLM prompts with oth
## How to create a prompt file

Below is a quick example of setting up a prompt file to write unit tests:s
Below is a quick example of setting up a prompt file to write unit tests using Jest.

1. Create a folder called `.prompts` at the top level of your workspace
2. Add a file called `test.prompt` to this folder
3. Write the following contents to `test.prompt` and save
1. Create a folder called `.prompts/` at the top level of your workspace.
2. Add a file called `test.prompt` to this folder. The name of this file will be the name of the slash command you will use to generate your prompt.
3. Write the following contents to `test.prompt` and save.

```
temperature: 0.5
@ -24,10 +24,9 @@ const sidebars = {
|
|||
collapsed: true,
|
||||
items: [
|
||||
"setup/overview",
|
||||
"setup/configuration",
|
||||
"setup/select-provider",
|
||||
"setup/select-model",
|
||||
"setup/configuration",
|
||||
"setup/examples",
|
||||
],
|
||||
},
|
||||
{
|
||||
|
@ -47,13 +46,10 @@ const sidebars = {
|
|||
collapsible: true,
|
||||
collapsed: true,
|
||||
items: [
|
||||
// "walkthroughs/codellama",
|
||||
"walkthroughs/set-up-codestral",
|
||||
"walkthroughs/running-continue-without-internet",
|
||||
"walkthroughs/codebase-embeddings",
|
||||
"walkthroughs/tab-autocomplete",
|
||||
"walkthroughs/prompt-files",
|
||||
// "walkthroughs/config-file-migration",
|
||||
],
|
||||
},
|
||||
"development-data",
|
||||
|
@ -71,6 +67,7 @@ const sidebars = {
|
|||
},
|
||||
],
|
||||
},
|
||||
"changelog",
|
||||
],
|
||||
};
|
||||
|
||||
|
|
|
@ -1999,7 +1999,7 @@
|
|||
"type": "string"
|
||||
},
|
||||
"model": {
|
||||
"enum": ["rerank-lite-1"]
|
||||
"enum": ["rerank-lite-1", "rerank-1"]
|
||||
}
|
||||
},
|
||||
"required": ["apiKey"]
|
||||
|
@ -2103,6 +2103,11 @@
|
|||
},
|
||||
"fontSize": {
|
||||
"type": "number"
|
||||
},
|
||||
"displayRawMarkdown": {
|
||||
"type": "boolean",
|
||||
"description": "If set to true, we will display the model output as raw markdown.",
|
||||
"default": false
|
||||
}
|
||||
}
|
||||
},
|
||||
|
|
|
@ -1,38 +1,52 @@
|
|||
# Changelog
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
||||
adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html),
|
||||
and is generated by [Changie](https://github.com/miniscruff/changie).
|
||||
|
||||
|
||||
## 0.0.42 - 2024-04-12
|
||||
|
||||
### Added
|
||||
* Inline cmd/ctrl+I in JetBrains
|
||||
|
||||
- Inline cmd/ctrl+I in JetBrains
|
||||
|
||||
### Fixed
|
||||
* Fixed character encoding error causing display issues
|
||||
* Fixed error causing input to constantly demand focus
|
||||
* Fixed automatic reloading of config.json
|
||||
|
||||
- Fixed character encoding error causing display issues
|
||||
- Fixed error causing input to constantly demand focus
|
||||
- Fixed automatic reloading of config.json
|
||||
|
||||
## 0.0.38 - 2024-03-15
|
||||
|
||||
### Added
|
||||
* Remote config server support
|
||||
* Autocomplete support in JetBrains
|
||||
|
||||
- Remote config server support
|
||||
- Autocomplete support in JetBrains
|
||||
|
||||
## 0.0.34 - 2024-03-03
|
||||
|
||||
### Added
|
||||
* diff context provider
|
||||
|
||||
- diff context provider
|
||||
|
||||
### Changed
|
||||
* Allow LLM servers to handle templating
|
||||
|
||||
- Allow LLM servers to handle templating
|
||||
|
||||
### Fixed
|
||||
* Fix a few context providers / slash commands
|
||||
* Fixed issues preventing proper extension startup
|
||||
|
||||
- Fix a few context providers / slash commands
|
||||
- Fixed issues preventing proper extension startup
|
||||
|
||||
## v0.0.26 - 2023-12-28
|
||||
|
||||
### Added
|
||||
* auto-reloading of config on save
|
||||
|
||||
- auto-reloading of config on save
|
||||
|
||||
### Fixed
|
||||
* Fixed /edit bug for versions without Python server
|
||||
|
||||
- Fixed /edit bug for versions without Python server
|
||||
|
||||
## v0.0.25 - 2023-12-25
|
||||
|
||||
|
|
|
@ -1,6 +1,6 @@
|
|||
<!-- Plugin description -->
|
||||
|
||||

|
||||

|
||||
|
||||
<h1 align="center">Continue</h1>
|
||||
|
||||
|
@ -14,25 +14,25 @@
|
|||
|
||||
## Easily understand code sections
|
||||
|
||||

|
||||

|
||||
|
||||
`cmd+J` (MacOS) / `ctrl+J` (Windows)
|
||||
|
||||
## Tab to autocomplete code suggestions
|
||||
|
||||

|
||||

|
||||
|
||||
`tab` (MacOS) / `tab` (Windows)
|
||||
|
||||
## Refactor functions where you are coding
|
||||
|
||||

|
||||

|
||||
|
||||
`cmd+I` (MacOS) / `ctrl+I` (Windows)
|
||||
|
||||
## Quickly use documentation as context
|
||||
|
||||

|
||||

|
||||
|
||||
`@docs` (MacOS) / `@docs` (Windows)
|
||||
|
||||
|
|
|
@ -37,7 +37,7 @@ fun getDiffDirectory(): File {
|
|||
return diffDir
|
||||
}
|
||||
fun escapeFilepath(filepath: String): String {
|
||||
return filepath.replace("/", "_f_").replace("\\", "_b_")
|
||||
return filepath.replace("/", "_f_").replace("\\", "_b_").replace(":", "_c_")
|
||||
}
|
||||
|
||||
interface DiffInfo {
|
||||
|
|
|
@ -1999,7 +1999,7 @@
|
|||
"type": "string"
|
||||
},
|
||||
"model": {
|
||||
"enum": ["rerank-lite-1"]
|
||||
"enum": ["rerank-lite-1", "rerank-1"]
|
||||
}
|
||||
},
|
||||
"required": ["apiKey"]
|
||||
|
@ -2103,6 +2103,11 @@
|
|||
},
|
||||
"fontSize": {
|
||||
"type": "number"
|
||||
},
|
||||
"displayRawMarkdown": {
|
||||
"type": "boolean",
|
||||
"description": "If set to true, we will display the model output as raw markdown.",
|
||||
"default": false
|
||||
}
|
||||
}
|
||||
},
|
||||
|
|
|
@ -1,3 +1,3 @@
|
|||
media
|
||||
**/*.tmLanguage
|
||||
textmate-syntaxes
|
||||
textmate-syntaxes
|
|
@ -1,44 +1,58 @@
|
|||
# Changelog
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
||||
adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html),
|
||||
and is generated by [Changie](https://github.com/miniscruff/changie).
|
||||
|
||||
|
||||
Pre-release Changes
|
||||
|
||||
### Added
|
||||
* Support for Gemini 1.5 Pro
|
||||
|
||||
- Support for Gemini 1.5 Pro
|
||||
|
||||
## 0.8.24 - 2024-04-12
|
||||
|
||||
### Added
|
||||
* Support for improved retrieval models (Voyage embeddings/reranking)
|
||||
* New @code context provider
|
||||
* Personal usage analytics
|
||||
|
||||
- Support for improved retrieval models (Voyage embeddings/reranking)
|
||||
- New @code context provider
|
||||
- Personal usage analytics
|
||||
|
||||
## 0.8.15 - 2024-03-05
|
||||
|
||||
### Added
|
||||
* Tab-autocomplete in beta
|
||||
|
||||
- Tab-autocomplete in beta
|
||||
|
||||
## 0.8.14 - 2024-03-03
|
||||
|
||||
### Added
|
||||
* Image support
|
||||
* Full-text search index for retrieval
|
||||
* Docs context provider
|
||||
* CodeLlama-70b support
|
||||
|
||||
- Image support
|
||||
- Full-text search index for retrieval
|
||||
- Docs context provider
|
||||
- CodeLlama-70b support
|
||||
|
||||
### Changed
|
||||
* config.ts only runs in NodeJS, not browser
|
||||
|
||||
- config.ts only runs in NodeJS, not browser
|
||||
|
||||
### Fixed
|
||||
* Fixed proxy setting in config.json
|
||||
|
||||
- Fixed proxy setting in config.json
|
||||
|
||||
## v0.8.2 - 2024-01-23
|
||||
|
||||
### Added
|
||||
* Add codellama and gemini to free trial, using new server
|
||||
* Local codebase syncing and embeddings using LanceDB
|
||||
* Improved VS Code theme matching
|
||||
|
||||
- Add codellama and gemini to free trial, using new server
|
||||
- Local codebase syncing and embeddings using LanceDB
|
||||
- Improved VS Code theme matching
|
||||
|
||||
### Changed
|
||||
* Updates to packaging to download native modules for current platform (lancedb, sqlite, onnxruntime, tree-sitter wasms)
|
||||
* Context providers now run from the extension side (in Node.js instead of browser javascript)
|
||||
|
||||
- Updates to packaging to download native modules for current platform (lancedb, sqlite, onnxruntime, tree-sitter wasms)
|
||||
- Context providers now run from the extension side (in Node.js instead of browser javascript)
|
||||
|
||||
## v0.8.1 - 2024-01-08
|
||||
|
||||
|
|
|
@ -7,11 +7,11 @@ This is the Continue VS Code Extension. Its primary jobs are
|
|||
|
||||
# How to run the extension
|
||||
|
||||
See [Environment Setup](../CONTRIBUTING.md#environment-setup)
|
||||
See [Environment Setup](../../CONTRIBUTING.md#environment-setup)
|
||||
|
||||
# How to run and debug tests
|
||||
|
||||
After following the setup in [Environment Setup](../CONTRIBUTING.md#environment-setup) you can run the `Extension (VSCode)` launch configuration in VS Code.
|
||||
After following the setup in [Environment Setup](../../CONTRIBUTING.md#environment-setup) you can run the `Extension (VSCode)` launch configuration in VS Code.
|
||||
|
||||
## Notes
|
||||
|
||||
|
|
|
@ -1999,7 +1999,7 @@
|
|||
"type": "string"
|
||||
},
|
||||
"model": {
|
||||
"enum": ["rerank-lite-1"]
|
||||
"enum": ["rerank-lite-1", "rerank-1"]
|
||||
}
|
||||
},
|
||||
"required": ["apiKey"]
|
||||
|
@ -2103,6 +2103,11 @@
|
|||
},
|
||||
"fontSize": {
|
||||
"type": "number"
|
||||
},
|
||||
"displayRawMarkdown": {
|
||||
"type": "boolean",
|
||||
"description": "If set to true, we will display the model output as raw markdown.",
|
||||
"default": false
|
||||
}
|
||||
}
|
||||
},
|
||||
|
|
|
@ -2247,7 +2247,8 @@
|
|||
},
|
||||
"model": {
|
||||
"enum": [
|
||||
"rerank-lite-1"
|
||||
"rerank-lite-1",
|
||||
"rerank-1"
|
||||
]
|
||||
}
|
||||
},
|
||||
|
@ -2361,6 +2362,11 @@
|
|||
},
|
||||
"fontSize": {
|
||||
"type": "number"
|
||||
},
|
||||
"displayRawMarkdown": {
|
||||
"type": "boolean",
|
||||
"description": "If set to true, we will display the model output as raw markdown.",
|
||||
"default": false
|
||||
}
|
||||
}
|
||||
},
|
||||
|
|
|
@ -470,7 +470,7 @@ const exe = os === "win32" ? ".exe" : "";
|
|||
// Validate that all of the necessary files are present
|
||||
validateFilesPresent([
|
||||
// Queries used to create the index for @code context provider
|
||||
"tree-sitter/code-snippet-queries/tree-sitter-c_sharp-tags.scm",
|
||||
"tree-sitter/code-snippet-queries/c_sharp.scm",
|
||||
|
||||
// Queries used for @outline and @highlights context providers
|
||||
"tag-qry/tree-sitter-c_sharp-tags.scm",
|
||||
|
|
|
@ -10,6 +10,7 @@ import { VsCodeContinueApi } from "./api";
|
|||
import { setupInlineTips } from "./inlineTips";
|
||||
|
||||
let resolveVsCodeExtension = (_: VsCodeExtension): void => {};
|
||||
|
||||
export const vscodeExtensionPromise: Promise<VsCodeExtension> = new Promise(
|
||||
(resolve) => (resolveVsCodeExtension = resolve),
|
||||
);
|
||||
|
@ -32,6 +33,8 @@ export async function activateExtension(context: vscode.ExtensionContext) {
|
|||
path.join(getExtensionUri().fsPath, "media", "welcome.md"),
|
||||
),
|
||||
);
|
||||
|
||||
vscode.commands.executeCommand("continue.focusContinueInput");
|
||||
});
|
||||
|
||||
// Load Continue configuration
|
||||
|
|
|
@ -32,7 +32,9 @@ function gotoInputKey(input: GotoInput) {
|
|||
const MAX_CACHE_SIZE = 50;
|
||||
const gotoCache = new Map<string, RangeInFile[]>();
|
||||
|
||||
async function executeGotoProvider(input: GotoInput): Promise<RangeInFile[]> {
|
||||
export async function executeGotoProvider(
|
||||
input: GotoInput,
|
||||
): Promise<RangeInFile[]> {
|
||||
const cacheKey = gotoInputKey(input);
|
||||
const cached = gotoCache.get(cacheKey);
|
||||
if (cached) {
|
||||
|
|
|
@ -24,6 +24,9 @@ import { VerticalPerLineDiffManager } from "./diff/verticalPerLine/manager";
|
|||
import { Battery } from "./util/battery";
|
||||
import { getPlatform } from "./util/util";
|
||||
import type { VsCodeWebviewProtocol } from "./webviewProtocol";
|
||||
import { Telemetry } from "core/util/posthog";
|
||||
|
||||
let fullScreenPanel: vscode.WebviewPanel | undefined;
|
||||
|
||||
function getFullScreenTab() {
|
||||
const tabs = vscode.window.tabGroups.all.flatMap((tabGroup) => tabGroup.tabs);
|
||||
|
@ -222,8 +225,13 @@ const commandsMap: (
|
|||
}
|
||||
},
|
||||
"continue.focusContinueInput": async () => {
|
||||
if (!getFullScreenTab()) {
|
||||
const fullScreenTab = getFullScreenTab();
|
||||
if (!fullScreenTab) {
|
||||
// focus sidebar
|
||||
vscode.commands.executeCommand("continue.continueGUIView.focus");
|
||||
} else {
|
||||
// focus fullscreen
|
||||
fullScreenPanel?.reveal();
|
||||
}
|
||||
sidebar.webviewProtocol?.request("focusContinueInput", undefined);
|
||||
await addHighlightedCodeToContext(false, sidebar.webviewProtocol);
|
||||
|
@ -468,24 +476,14 @@ const commandsMap: (
|
|||
return;
|
||||
}
|
||||
|
||||
if (fullScreenTab) {
|
||||
if (fullScreenTab && fullScreenPanel) {
|
||||
//Full screen open, but not focused - focus it
|
||||
// Focus the tab
|
||||
const openOptions = {
|
||||
preserveFocus: true,
|
||||
preview: fullScreenTab.isPreview,
|
||||
viewColumn: fullScreenTab.group.viewColumn,
|
||||
};
|
||||
|
||||
vscode.commands.executeCommand(
|
||||
"vscode.open",
|
||||
(fullScreenTab.input as any).uri,
|
||||
openOptions,
|
||||
);
|
||||
fullScreenPanel.reveal();
|
||||
return;
|
||||
}
|
||||
|
||||
//Full screen not open - open it
|
||||
Telemetry.capture("openFullScreen", {});
|
||||
|
||||
// Close the sidebar.webviews
|
||||
// vscode.commands.executeCommand("workbench.action.closeSidebar");
|
||||
|
@ -497,7 +495,11 @@ const commandsMap: (
|
|||
"continue.continueGUIView",
|
||||
"Continue",
|
||||
vscode.ViewColumn.One,
|
||||
{
|
||||
retainContextWhenHidden: true,
|
||||
},
|
||||
);
|
||||
fullScreenPanel = panel;
|
||||
|
||||
//Add content to the panel
|
||||
panel.webview.html = sidebar.getSidebarContent(
|
||||
|
|
|
@ -1,7 +1,7 @@
|
|||
import { devDataPath } from "core/util/paths";
|
||||
import * as fs from "node:fs";
|
||||
import * as os from "node:os";
|
||||
import * as path from "node:path";
|
||||
import { devDataPath } from "core/util/paths";
|
||||
import * as vscode from "vscode";
|
||||
import { getMetaKeyLabel, getPlatform } from "../util/util";
|
||||
import { uriFromFilePath } from "../util/vscode";
|
||||
|
@ -64,7 +64,10 @@ export class DiffManager {
|
|||
}
|
||||
|
||||
private escapeFilepath(filepath: string): string {
|
||||
return filepath.replace(/\//g, "_f_").replace(/\\/g, "_b_");
|
||||
return filepath
|
||||
.replace(/\//g, "_f_")
|
||||
.replace(/\\/g, "_b_")
|
||||
.replace(/:/g, "_c_");
|
||||
}
|
||||
|
||||
private remoteTmpDir = "/tmp/continue";
|
||||
|
|
|
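A quick illustration (not part of the codebase) of why the colon now gets escaped: Windows paths carry a drive colon that cannot appear in the generated diff filename. The mapping mirrors the `escapeFilepath` change above.

```typescript
// Standalone copy of the escaping shown in the hunk above, for illustration only.
function escapeFilepathExample(filepath: string): string {
  return filepath
    .replace(/\//g, "_f_")
    .replace(/\\/g, "_b_")
    .replace(/:/g, "_c_");
}

// "C:\\Users\\me\\app.ts" -> "C_c__b_Users_b_me_b_app.ts"
console.log(escapeFilepathExample("C:\\Users\\me\\app.ts"));
```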
@ -18,6 +18,7 @@ import {
|
|||
ToCoreOrIdeFromWebviewProtocol,
|
||||
VsCodeWebviewProtocol,
|
||||
} from "../webviewProtocol";
|
||||
import { ConfigHandler } from "core/config/handler";
|
||||
|
||||
/**
|
||||
* A shared messenger class between Core and Webview
|
||||
|
@ -69,6 +70,7 @@ export class VsCodeMessenger {
|
|||
private readonly webviewProtocol: VsCodeWebviewProtocol,
|
||||
private readonly ide: VsCodeIde,
|
||||
private readonly verticalDiffManagerPromise: Promise<VerticalPerLineDiffManager>,
|
||||
private readonly configHandlerPromise: Promise<ConfigHandler>,
|
||||
) {
|
||||
/** WEBVIEW ONLY LISTENERS **/
|
||||
this.onWebview("showFile", (msg) => {
|
||||
|
@ -116,9 +118,11 @@ export class VsCodeMessenger {
|
|||
msg.data.stepIndex,
|
||||
);
|
||||
});
|
||||
|
||||
this.onWebview("applyToCurrentFile", async (msg) => {
|
||||
// Select the entire current file
|
||||
const editor = vscode.window.activeTextEditor;
|
||||
|
||||
if (!editor) {
|
||||
vscode.window.showErrorMessage("No active editor to apply edits to");
|
||||
return;
|
||||
|
@ -134,11 +138,19 @@ export class VsCodeMessenger {
|
|||
editor.selection = new vscode.Selection(start, end);
|
||||
}
|
||||
|
||||
(await this.verticalDiffManagerPromise).streamEdit(
|
||||
`The following code was suggested as an edit:\n\`\`\`\n${msg.data.text}\n\`\`\`\nPlease apply it to the previous code.`,
|
||||
await this.webviewProtocol.request("getDefaultModelTitle", undefined),
|
||||
);
|
||||
const verticalDiffManager = await this.verticalDiffManagerPromise;
|
||||
const prompt = `The following code was suggested as an edit:\n\`\`\`\n${msg.data.text}\n\`\`\`\nPlease apply it to the previous code.`;
|
||||
|
||||
const configHandler = await configHandlerPromise;
|
||||
const config = await configHandler.loadConfig();
|
||||
|
||||
const modelTitle =
|
||||
config.experimental?.modelRoles?.applyCodeBlock ??
|
||||
(await this.webviewProtocol.request("getDefaultModelTitle", undefined));
|
||||
|
||||
verticalDiffManager.streamEdit(prompt, modelTitle);
|
||||
});
|
||||
|
||||
this.onWebview("showTutorial", async (msg) => {
|
||||
const tutorialPath = path.join(
|
||||
getExtensionUri().fsPath,
|
||||
|
@ -182,6 +194,7 @@ export class VsCodeMessenger {
|
|||
return (await this.inProcessMessenger.externalRequest(
|
||||
messageType,
|
||||
msg.data,
|
||||
msg.messageId,
|
||||
)) as TODO;
|
||||
});
|
||||
});
|
||||
|
|
|
@ -93,12 +93,15 @@ export class VsCodeExtension {
|
|||
ToCoreProtocol,
|
||||
FromCoreProtocol
|
||||
>();
|
||||
const vscodeMessenger = new VsCodeMessenger(
|
||||
|
||||
new VsCodeMessenger(
|
||||
inProcessMessenger,
|
||||
this.sidebar.webviewProtocol,
|
||||
this.ide,
|
||||
verticalDiffManagerPromise,
|
||||
configHandlerPromise,
|
||||
);
|
||||
|
||||
this.core = new Core(inProcessMessenger, this.ide, async (log: string) => {
|
||||
outputChannel.appendLine(
|
||||
"==========================================================================",
|
||||
|
@ -178,9 +181,10 @@ export class VsCodeExtension {
|
|||
|
||||
// Listen for file saving - use global file watcher so that changes
|
||||
// from outside the window are also caught
|
||||
fs.watchFile(getConfigJsonPath(), { interval: 1000 }, (stats) => {
|
||||
this.configHandler.reloadConfig();
|
||||
fs.watchFile(getConfigJsonPath(), { interval: 1000 }, async (stats) => {
|
||||
await this.configHandler.reloadConfig();
|
||||
});
|
||||
|
||||
fs.watchFile(getConfigTsPath(), { interval: 1000 }, (stats) => {
|
||||
this.configHandler.reloadConfig();
|
||||
});
|
||||
|
@ -193,6 +197,24 @@ export class VsCodeExtension {
|
|||
// Listen for file changes in the workspace
|
||||
const filepath = event.uri.fsPath;
|
||||
|
||||
if (filepath === getConfigJsonPath()) {
|
||||
// Trigger a toast notification to provide UI feedback that config
|
||||
// has been updated
|
||||
const showToast = context.globalState.get<boolean>(
|
||||
"showConfigUpdateToast",
|
||||
true,
|
||||
);
|
||||
if (showToast) {
|
||||
vscode.window
|
||||
.showInformationMessage("Config updated", "Don't show again")
|
||||
.then((selection) => {
|
||||
if (selection === "Don't show again") {
|
||||
context.globalState.update("showConfigUpdateToast", false);
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
if (
|
||||
filepath.endsWith(".continuerc.json") ||
|
||||
filepath.endsWith(".prompt")
|
||||
|
@ -260,6 +282,7 @@ export class VsCodeExtension {
|
|||
|
||||
static continueVirtualDocumentScheme = "continue";
|
||||
|
||||
// eslint-disable-next-line @typescript-eslint/naming-convention
|
||||
private PREVIOUS_BRANCH_FOR_WORKSPACE_DIR: { [dir: string]: string } = {};
|
||||
|
||||
registerCustomContextProvider(contextProvider: IContextProvider) {
|
||||
|
|
|
@ -9,7 +9,9 @@ import type {
|
|||
IdeInfo,
|
||||
IdeSettings,
|
||||
IndexTag,
|
||||
Location,
|
||||
Problem,
|
||||
RangeInFile,
|
||||
Thread,
|
||||
} from "core";
|
||||
import { Range } from "core";
|
||||
|
@ -20,6 +22,7 @@ import {
|
|||
getContinueGlobalPath,
|
||||
} from "core/util/paths";
|
||||
import * as vscode from "vscode";
|
||||
import { executeGotoProvider } from "./autocomplete/lsp";
|
||||
import { DiffManager } from "./diff/horizontal";
|
||||
import { Repository } from "./otherExtensions/git";
|
||||
import { VsCodeIdeUtils } from "./util/ideUtils";
|
||||
|
@ -47,6 +50,25 @@ class VsCodeIde implements IDE {
|
|||
);
|
||||
}
|
||||
|
||||
async gotoDefinition(location: Location): Promise<RangeInFile[]> {
|
||||
const result = await executeGotoProvider({
|
||||
uri: location.filepath,
|
||||
line: location.position.line,
|
||||
character: location.position.character,
|
||||
name: "vscode.executeDefinitionProvider",
|
||||
});
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
onDidChangeActiveTextEditor(callback: (filepath: string) => void): void {
|
||||
vscode.window.onDidChangeActiveTextEditor((editor) => {
|
||||
if (editor) {
|
||||
callback(editor.document.uri.fsPath);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
private authToken: string | undefined;
|
||||
private askedForAuth = false;
|
||||
|
||||
|
|
|
@@ -1,4 +1,7 @@
import { getContinueGlobalPath } from "core/util/paths";
import { ExtensionContext } from "vscode";
import fs from "fs";

/**
* Clear all Continue-related artifacts to simulate a brand new user
*/

@@ -259,6 +259,23 @@ export class VsCodeIdeUtils {
});
}

private _cachedPath: path.PlatformPath | undefined;
get path(): path.PlatformPath {
if (this._cachedPath) {
return this._cachedPath;
}

// Return "path" module for either windows or posix depending on sample workspace folder path format
const sampleWorkspaceFolder =
vscode.workspace.workspaceFolders?.[0]?.uri.fsPath;
const isWindows = sampleWorkspaceFolder
? !sampleWorkspaceFolder.startsWith("/")
: false;

this._cachedPath = isWindows ? path.win32 : path.posix;
return this._cachedPath;
}

async getDirectoryContents(
directory: string,
recursive: boolean,

@@ -271,7 +288,31 @@ export class VsCodeIdeUtils {
.filter(([name, type]) => {
type === vscode.FileType.File && !defaultIgnoreFile.ignores(name);
})
.map(([name, type]) => path.join(directory, name));
.map(([name, type]) => this.path.join(directory, name));
}

// If not using gitignore, just read all contents recursively
if (!useGitIgnore) {
const dirQueue = [];
const allFiles: string[] = [];
dirQueue.push(directory);

while (dirQueue.length > 0) {
const currentDir = dirQueue.shift()!;
const files = await vscode.workspace.fs.readDirectory(
uriFromFilePath(currentDir),
);
for (const [name, type] of files) {
const filepath = this.path.join(currentDir, name);
if (type === vscode.FileType.Directory) {
dirQueue.push(filepath);
} else {
allFiles.push(filepath);
}
}
}

return allFiles;
}

try {

@@ -288,7 +329,7 @@ export class VsCodeIdeUtils {
const gitRoot = await this.getGitRoot(directory);
let onlyThisDirectory = undefined;
if (gitRoot) {
onlyThisDirectory = directory.slice(gitRoot.length).split(path.sep);
onlyThisDirectory = directory.slice(gitRoot.length).split(this.path.sep);
if (onlyThisDirectory[0] === "") {
onlyThisDirectory.shift();
}

@@ -307,8 +348,8 @@ export class VsCodeIdeUtils {

getAbsolutePath(filepath: string): string {
const workspaceDirectories = this.getWorkspaceDirectories();
if (!path.isAbsolute(filepath) && workspaceDirectories.length === 1) {
return path.join(workspaceDirectories[0], filepath);
if (!this.path.isAbsolute(filepath) && workspaceDirectories.length === 1) {
return this.path.join(workspaceDirectories[0], filepath);
} else {
return filepath;
}

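The new `path` getter above chooses `path.win32` or `path.posix` based on whether the first workspace folder path starts with `/`. A standalone sketch of that heuristic, using the same logic outside the class:

```ts
import path from "path";

// Standalone sketch of the heuristic used by the `path` getter above:
// a workspace folder that does not start with "/" is treated as Windows.
function pickPlatformPath(sampleWorkspaceFolder?: string): path.PlatformPath {
  const isWindows = sampleWorkspaceFolder
    ? !sampleWorkspaceFolder.startsWith("/")
    : false;
  return isWindows ? path.win32 : path.posix;
}

// pickPlatformPath("C:\\repo")        -> path.win32 (joins with "\\")
// pickPlatformPath("/home/user/repo") -> path.posix (joins with "/")
```
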
@@ -0,0 +1,9 @@
(import_statement
(import_clause
(named_imports
(import_specifier
(identifier) @import))))

(import_statement
(import_clause
(identifier) @import ))

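This is a tree-sitter query that captures named and default import identifiers for the autocomplete import cache. A hedged sketch of how such a query could be run with web-tree-sitter, assuming the TypeScript grammar is already loaded and the query text above has been read from the file (the helper name and wiring are assumptions, not the PR's actual usage):

```ts
import Parser from "web-tree-sitter";

// Assumed helper: collect every identifier captured as @import by the
// query above from a source string. Requires Parser.init() to have run.
function collectImports(
  language: Parser.Language, // TypeScript grammar, already loaded
  querySource: string, // contents of the query shown above
  code: string,
): string[] {
  const parser = new Parser();
  parser.setLanguage(language);
  const tree = parser.parse(code);
  const query = language.query(querySource);
  return query.captures(tree.rootNode).map((c) => c.node.text);
}

// collectImports(lang, query, `import { useState } from "react";`)
//   -> ["useState"]
```
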
@@ -11,13 +11,12 @@ import ErrorPage from "./pages/error";
import GUI from "./pages/gui";
import { default as Help, default as HelpPage } from "./pages/help";
import History from "./pages/history";
import LocalOnboarding from "./pages/localOnboarding";
import MigrationPage from "./pages/migration";
import MonacoPage from "./pages/monaco";
import ApiKeyAutocompleteOnboarding from "./pages/onboarding/apiKeyAutocompleteOnboarding";
import ApiKeyOnboarding from "./pages/onboarding/apiKeyOnboarding";
import ExistingUserOnboarding from "./pages/onboarding/existingUserOnboarding";
import Onboarding from "./pages/onboarding/onboarding";
import ApiKeysOnboarding from "./pages/onboarding/ApiKeysOnboarding";
import LocalOnboarding from "./pages/onboarding/LocalOnboarding";
import Onboarding from "./pages/onboarding/Onboarding";
import SettingsPage from "./pages/settings";
import Stats from "./pages/stats";

@@ -71,10 +70,6 @@ const router = createMemoryRouter([
path: "/onboarding",
element: <Onboarding />,
},
{
path: "/existingUserOnboarding",
element: <ExistingUserOnboarding />,
},
{
path: "/localOnboarding",
element: <LocalOnboarding />,

@@ -84,8 +79,8 @@ const router = createMemoryRouter([
element: <MigrationPage />,
},
{
path: "/apiKeyOnboarding",
element: <ApiKeyOnboarding />,
path: "/apiKeysOnboarding",
element: <ApiKeysOnboarding />,
},
{
path: "/apiKeyAutocompleteOnboarding",

@@ -26,10 +26,11 @@ import { getFontSize, isMetaEquivalentKeyPressed } from "../util";
import { getLocalStorage, setLocalStorage } from "../util/localStorage";
import HeaderButtonWithText from "./HeaderButtonWithText";
import TextDialog from "./dialogs";
import { ftl } from "./dialogs/FTCDialog";
import IndexingProgressBar from "./loaders/IndexingProgressBar";
import ProgressBar from "./loaders/ProgressBar";
import ModelSelect from "./modelSelection/ModelSelect";
import PostHogPageView from "./PosthogPageView";
import { FREE_TRIAL_LIMIT_REQUESTS } from "../util/freeTrial";

// #region Styled Components
const FOOTER_HEIGHT = "1.8em";

@@ -89,8 +90,6 @@ const DropdownPortalDiv = styled.div`
// #endregion

const HIDE_FOOTER_ON_PAGES = [
"/onboarding",
"/existingUserOnboarding",
"/onboarding",
"/localOnboarding",
"/apiKeyOnboarding",

@@ -250,17 +249,20 @@ const Layout = () => {
/>

<GridDiv>
<PostHogPageView />
<Outlet />
<DropdownPortalDiv id="model-select-top-div"></DropdownPortalDiv>
{HIDE_FOOTER_ON_PAGES.includes(location.pathname) || (
<Footer>
<div className="mr-auto flex gap-2 items-center">
<ModelSelect />
<div className="mr-auto flex flex-grow gap-2 items-center overflow-hidden">
<div className="flex-shrink-0">
<ModelSelect />
</div>
{indexingState.status !== "indexing" && // Would take up too much space together with indexing progress
defaultModel?.provider === "free-trial" && (
<ProgressBar
completed={parseInt(localStorage.getItem("ftc") || "0")}
total={ftl()}
total={FREE_TRIAL_LIMIT_REQUESTS}
/>
)}
<IndexingProgressBar indexingState={indexingState} />

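The footer above replaces the old `ftl()` call with the shared `FREE_TRIAL_LIMIT_REQUESTS` constant and reads the request count from the `ftc` key in localStorage. A small sketch of that bookkeeping in isolation (the helper name is an assumption; the constant and storage key come from the hunk above):

```ts
import { FREE_TRIAL_LIMIT_REQUESTS } from "../util/freeTrial";

// Sketch only: mirror the footer's bookkeeping to compute free-trial usage.
function freeTrialUsage() {
  const used = parseInt(localStorage.getItem("ftc") || "0");
  return {
    used,
    total: FREE_TRIAL_LIMIT_REQUESTS,
    remaining: Math.max(FREE_TRIAL_LIMIT_REQUESTS - used, 0),
  };
}
```
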
@@ -0,0 +1,28 @@
import { useEffect } from "react";
import { usePostHog } from "posthog-js/react";
import { useSearchParams, useLocation } from "react-router-dom";

/**
* This is copied from here: https://posthog.com/tutorials/single-page-app-pageviews#tracking-pageviews-in-nextjs-app-router
* They don't have a non-NextJS doc for React apps.
*/
export default function PostHogPageView() {
const { pathname } = useLocation();
const searchParams = useSearchParams();
const posthog = usePostHog();

// Track pageviews
useEffect(() => {
if (pathname && posthog) {
let url = window.origin + pathname;
if (searchParams.toString()) {
url = url + `?${searchParams.toString()}`;
}
posthog.capture("$pageview", {
$current_url: url,
});
}
}, [pathname, searchParams, posthog]);

return null;
}

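For `usePostHog()` to return a client, a `PostHogProvider` has to sit above `<PostHogPageView />` in the tree (the Layout hunk earlier mounts the component inside `<GridDiv>`). A minimal sketch of that provider wiring; the API key and host below are placeholders, not values from this PR:

```tsx
import React from "react";
import { PostHogProvider } from "posthog-js/react";
import PostHogPageView from "./components/PosthogPageView";

// Sketch only: provider wiring assumed by usePostHog(); key/host are placeholders.
export function AppWithAnalytics({ children }: { children: React.ReactNode }) {
  return (
    <PostHogProvider
      apiKey="<YOUR_POSTHOG_KEY>"
      options={{ api_host: "https://app.posthog.com" }}
    >
      <PostHogPageView />
      {children}
    </PostHogProvider>
  );
}
```
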
@@ -1,34 +1,29 @@
import { SiteIndexingConfig } from "core";
import { usePostHog } from "posthog-js/react";
import React, { useContext, useLayoutEffect } from "react";
import React, { useContext, useLayoutEffect, useRef, useState } from "react";
import { useDispatch } from "react-redux";
import styled from "styled-components";
import { Button, Input } from "..";
import { IdeMessengerContext } from "../../context/IdeMessenger";
import { SubmenuContextProvidersContext } from "../../context/SubmenuContextProviders";
import { setShowDialog } from "../../redux/slices/uiStateSlice";

const GridDiv = styled.div`
display: grid;
grid-template-columns: 1fr 1fr;
grid-gap: 8px;
align-items: center;
`;
const DEFAULT_MAX_DEPTH = 3;

function AddDocsDialog() {
const defaultMaxDepth = 3;
const [docsUrl, setDocsUrl] = React.useState("");
const [docsTitle, setDocsTitle] = React.useState("");
const [urlValid, setUrlValid] = React.useState(false);
const [maxDepth, setMaxDepth] = React.useState<number | string>(""); // Change here

const posthog = usePostHog();
const dispatch = useDispatch();

const ref = useRef<HTMLInputElement>(null);

const [docsUrl, setDocsUrl] = useState("");
const [docsTitle, setDocsTitle] = useState("");
const [urlValid, setUrlValid] = useState(false);
const [maxDepth, setMaxDepth] = useState<number | string>("");

const ideMessenger = useContext(IdeMessengerContext);
const { addItem } = useContext(SubmenuContextProvidersContext);

const ref = React.useRef<HTMLInputElement>(null);
const posthog = usePostHog();
const isFormValid = docsUrl && docsTitle && urlValid;

useLayoutEffect(() => {
setTimeout(() => {

@@ -38,72 +33,99 @@ function AddDocsDialog() {
}, 100);
}, [ref]);

function onSubmit(e) {
e.preventDefault();

const siteIndexingConfig: SiteIndexingConfig = {
startUrl: docsUrl,
rootUrl: docsUrl,
title: docsTitle,
maxDepth: typeof maxDepth === "string" ? DEFAULT_MAX_DEPTH : maxDepth,
};

ideMessenger.post("context/addDocs", siteIndexingConfig);

setDocsTitle("");
setDocsUrl("");
setMaxDepth("");

dispatch(setShowDialog(false));

addItem("docs", {
id: docsUrl,
title: docsTitle,
description: new URL(docsUrl).hostname,
});

posthog.capture("add_docs", { url: docsUrl });
}

return (
<div className="p-4">
<h3>Add Docs</h3>
<div className="mb-8">
<h2>Add Docs</h2>

<p>
Continue pre-indexes many common documentation sites, but if there's one
you don't see in the dropdown, enter the URL here. Continue's indexing
engine will crawl the site and generate embeddings so that you can ask
questions.
</p>
<p>
Continue pre-indexes many common documentation sites, but if there's
one you don't see in the dropdown, enter the URL here.
</p>

<Input
type="url"
placeholder="URL"
value={docsUrl}
ref={ref}
onChange={(e) => {
setDocsUrl(e.target.value);
setUrlValid(e.target.validity.valid);
}}
/>
<Input
type="text"
placeholder="Title"
value={docsTitle}
onChange={(e) => setDocsTitle(e.target.value)}
/>
<Input
type="text"
placeholder={`Optional: Max Depth (Default: ${defaultMaxDepth})`}
title="The maximum search tree depth - where your input url is the root node"
value={maxDepth}
onChange={(e) => {
const value = e.target.value;
if (value == "") {
setMaxDepth("");
} else if (!isNaN(+value) && Number(value) > 0) {
setMaxDepth(Number(value));
}
}}
/>
<Button
disabled={!docsUrl || !urlValid}
className="ml-auto"
onClick={() => {
const siteIndexingConfig: SiteIndexingConfig = {
startUrl: docsUrl,
rootUrl: docsUrl,
title: docsTitle,
maxDepth: typeof maxDepth === "string" ? defaultMaxDepth : maxDepth, // Ensure maxDepth is a number
};
ideMessenger.post("context/addDocs", siteIndexingConfig);
setDocsTitle("");
setDocsUrl("");
setMaxDepth("");
dispatch(setShowDialog(false));
addItem("docs", {
id: docsUrl,
title: docsTitle,
description: new URL(docsUrl).hostname,
});
posthog.capture("add_docs", { url: docsUrl });
}}
>
Done
</Button>
<p>
Continue's indexing engine will crawl the site and generate embeddings
so that you can ask questions.
</p>
</div>

<form onSubmit={onSubmit} className="flex flex-col space-y-4">
<label>
URL
<Input
type="url"
placeholder="URL"
value={docsUrl}
ref={ref}
onChange={(e) => {
setDocsUrl(e.target.value);
setUrlValid(e.target.validity.valid);
}}
/>
</label>

<label>
Title
<Input
type="text"
placeholder="Title"
value={docsTitle}
onChange={(e) => setDocsTitle(e.target.value)}
/>
</label>

<label>
Max Depth [Optional]
<Input
type="text"
inputMode="numeric"
placeholder={`Max depth (Default: ${DEFAULT_MAX_DEPTH})`}
title="The maximum search tree depth - where your input url is the root node"
value={maxDepth}
onChange={(e) => {
const value = e.target.value;
if (value == "") {
setMaxDepth("");
} else if (!isNaN(+value) && Number(value) > 0) {
setMaxDepth(Number(value));
}
}}
/>
</label>

<div className="flex justify-end">
<Button disabled={!isFormValid} type="submit">
Submit
</Button>
</div>
</form>
</div>
);
}

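The submit handler above coerces the optional `maxDepth` field: an empty string falls back to `DEFAULT_MAX_DEPTH` (3), otherwise the validated number is kept. A standalone sketch of the payload the dialog posts to "context/addDocs" (helper name and example inputs are assumptions; the fields come from the hunk):

```ts
import { SiteIndexingConfig } from "core";

const DEFAULT_MAX_DEPTH = 3;

// Sketch of the payload posted to "context/addDocs".
function buildSiteIndexingConfig(
  docsUrl: string,
  docsTitle: string,
  maxDepth: number | string,
): SiteIndexingConfig {
  return {
    startUrl: docsUrl,
    rootUrl: docsUrl,
    title: docsTitle,
    maxDepth: typeof maxDepth === "string" ? DEFAULT_MAX_DEPTH : maxDepth,
  };
}

// buildSiteIndexingConfig("https://docs.continue.dev", "Continue", "")
//   -> { ..., maxDepth: 3 }
```
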
@@ -1,6 +1,6 @@
import { useDispatch } from "react-redux";
import styled from "styled-components";
import { Button } from "..";
import { Button, SecondaryButton } from "..";
import {
setDialogMessage,
setShowDialog,

@@ -28,7 +28,7 @@ function ConfirmationDialog(props: ConfirmationDialogProps) {
<p>{props.text}</p>

<GridDiv>
<Button
<SecondaryButton
onClick={() => {
props.onCancel?.();
dispatch(setShowDialog(false));

@@ -36,7 +36,7 @@ function ConfirmationDialog(props: ConfirmationDialogProps) {
}}
>
Cancel
</Button>
</SecondaryButton>
<Button
onClick={() => {
props.onConfirm();

@@ -6,6 +6,7 @@ import { Button, Input } from "..";
import { IdeMessengerContext } from "../../context/IdeMessenger";
import { setDefaultModel } from "../../redux/slices/stateSlice";
import { setShowDialog } from "../../redux/slices/uiStateSlice";
import { FREE_TRIAL_LIMIT_REQUESTS } from "../../util/freeTrial";

const GridDiv = styled.div`
display: grid;

@@ -29,9 +30,9 @@ function FTCDialog() {
<div className="p-4">
<h3>Free Trial Limit Reached</h3>
<p>
You've reached the free trial limit of {ftl()} free inputs. To keep
using Continue, you can either use your own API key, or use a local LLM.
To read more about the options, see our{" "}
You've reached the free trial limit of {FREE_TRIAL_LIMIT_REQUESTS} free
inputs. To keep using Continue, you can either use your own API key, or
use a local LLM. To read more about the options, see our{" "}
<a href="https://docs.continue.dev/setup/overview" target="_blank">
documentation
</a>

@@ -8,7 +8,6 @@ import {
vscBackground,
vscFocusBorder,
vscForeground,
vscInputBackground,
} from "..";

const ScreenCover = styled.div`

@@ -39,25 +38,6 @@ const Dialog = styled.div`
// overflow: hidden;
`;

const TextArea = styled.textarea`
border: 1px solid #ccc;
border-radius: ${defaultBorderRadius};
padding: 8px;
outline: 1px solid black;
resize: none;
background-color: ${vscInputBackground};
color: ${vscForeground};

&:focus {
outline: 1px solid ${vscForeground};
}
`;

const P = styled.p`
color: ${vscForeground};
margin: 8px auto;
`;

const TextDialog = (props: {
showDialog: boolean;
onEnter: () => void;

@@ -16,6 +16,7 @@ import {
vscInputBackground,
} from "..";
import { IdeMessengerContext } from "../../context/IdeMessenger";
import useUIConfig from "../../hooks/useUIConfig";
import { RootState } from "../../redux/store";
import { getFontSize } from "../../util";
import HeaderButtonWithText from "../HeaderButtonWithText";

@@ -87,6 +88,8 @@ function StepContainer(props: StepContainerProps) {

const [truncatedEarly, setTruncatedEarly] = useState(false);

const uiConfig = useUIConfig();

useEffect(() => {
if (!active) {
const content = stripImages(props.item.message.content).trim();

@@ -121,10 +124,19 @@ function StepContainer(props: StepContainerProps) {
isUserInput={isUserInput}
fontSize={getFontSize()}
>
<StyledMarkdownPreview
source={stripImages(props.item.message.content)}
showCodeBorder={true}
/>
{uiConfig?.displayRawMarkdown ? (
<pre
className="whitespace-pre-wrap break-words p-4 max-w-full overflow-x-auto"
style={{ fontSize: getFontSize() - 2 }}
>
{stripImages(props.item.message.content)}
</pre>
) : (
<StyledMarkdownPreview
source={stripImages(props.item.message.content)}
showCodeBorder={true}
/>
)}
</ContentDiv>
<div className="h-2"></div>
{(isHovered || typeof feedback !== "undefined") && !active && (

@@ -0,0 +1,34 @@
import styled, { css, keyframes } from "styled-components";

const DEFAULT_DIAMETER = 6;

const blink = keyframes`
0%, 100% {
transform: scale(1);
opacity: 1;
}
50% {
opacity: 0.25;
}
`;

const blinkAnimation = css`
animation: ${blink} 3s infinite;
`;

const BlinkingDot = styled.div<{
color: string;
diameter?: number;
shouldBlink?: boolean;
}>`
background-color: ${(props) => props.color};
box-shadow: 0px 0px 2px 1px ${(props) => props.color};
width: ${(props) => props.diameter ?? DEFAULT_DIAMETER}px;
height: ${(props) => props.diameter ?? DEFAULT_DIAMETER}px;
border-radius: 50%;
border: 1px solid rgba(255, 255, 255, 0.75);
margin: 0 2px;
${(props) => (props.shouldBlink ?? false) && blinkAnimation};
`;

export default BlinkingDot;

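A short usage sketch for the new component (the wrapper component is hypothetical; the colors mirror the STATUS_COLORS introduced in the next hunk, purple while indexing, green when done):

```tsx
import React from "react";
import BlinkingDot from "./BlinkingDot";

// Sketch only: pulsing purple dot while indexing, static green dot when done.
export function IndexingStatusDot({ indexing }: { indexing: boolean }) {
  if (indexing) {
    return <BlinkingDot color="#6554C0" shouldBlink={true} />;
  }
  return <BlinkingDot color="#36B37E" />;
}
```
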
@@ -7,22 +7,22 @@ import { StyledTooltip, lightGray, vscForeground } from "..";
import { IdeMessengerContext } from "../../context/IdeMessenger";
import { RootState } from "../../redux/store";
import { getFontSize, isJetBrains } from "../../util";
import BlinkingDot from "./BlinkingDot";

const DIAMETER = 6;
const CircleDiv = styled.div<{ color: string }>`
background-color: ${(props) => props.color};
box-shadow: 0px 0px 2px 1px ${(props) => props.color};
width: ${DIAMETER}px;
height: ${DIAMETER}px;
border-radius: ${DIAMETER / 2}px;
`;
const STATUS_COLORS = {
DISABLED: lightGray, // light gray
LOADING: "#00B8D9", // ice blue
INDEXING: "#6554C0", // purple
PAUSED: "#FFAB00", // yellow
DONE: "#36B37E", // green
FAILED: "#FF5630", // red
};

const ProgressBarWrapper = styled.div`
width: 100px;
height: 6px;
border-radius: 6px;
border: 0.5px solid ${lightGray};
margin-top: 6px;
`;

const ProgressBarFill = styled.div<{ completed: number; color?: string }>`

@@ -33,23 +33,32 @@ const ProgressBarFill = styled.div<{ completed: number; color?: string }>`
width: ${(props) => props.completed}%;
`;

const GridDiv = styled.div`
display: grid;
grid-template-rows: 1fr auto;
const FlexDiv = styled.div`
display: flex;
align-items: center;
justify-items: center;
margin-left: 8px;
gap: 6px;
overflow: hidden;
`;

const P = styled.p`
margin: 0;
margin-top: 2px;
font-size: ${getFontSize() - 2.5}px;
const StatusHeading = styled.div`
color: ${lightGray};
text-align: center;
font-size: ${getFontSize() - 2.4}px;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;

@media (max-width: 400px) {
display: none;
}
`;

const StatusInfo = styled.div`
font-size: ${getFontSize() - 3.6}px;
color: ${lightGray};
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
margin-top: 2px;
`;

interface ProgressBarProps {

@@ -120,108 +129,92 @@ const IndexingProgressBar = ({
}}
className="cursor-pointer"
>
{indexingState.status === "failed" ? ( //red 'failed' dot
<>
<CircleDiv
data-tooltip-id="indexingFailed_dot"
color="#ff0000"
></CircleDiv>
{indexingState.status === "failed" ? (
<FlexDiv data-tooltip-id="indexingFailed_dot">
<BlinkingDot color={STATUS_COLORS.FAILED}></BlinkingDot>
<div>
<StatusHeading>Indexing error! Click to retry</StatusHeading>
<StatusInfo>{getIndexingErrMsg(indexingState.desc)}</StatusInfo>
</div>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="indexingFailed_dot" place="top">
Error indexing codebase: {getIndexingErrMsg(indexingState.desc)}
{getIndexingErrMsg(indexingState.desc)}
</StyledTooltip>,
tooltipPortalDiv,
)}
</FlexDiv>
) : indexingState.status === "loading" ? (
<FlexDiv>
<BlinkingDot
color={STATUS_COLORS.LOADING}
shouldBlink={true}
></BlinkingDot>
<StatusHeading>Continue is initializing</StatusHeading>
</FlexDiv>
) : indexingState.status === "done" ? (
<FlexDiv data-tooltip-id="indexingDone_dot">
<BlinkingDot color={STATUS_COLORS.DONE}></BlinkingDot>
<div>
<StatusHeading>Index up to date</StatusHeading>
</div>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="indexingDone_dot" place="top">
Index up to date
<br />
Click to retry
Click to force re-indexing
</StyledTooltip>,
tooltipPortalDiv,
)}
</>
) : indexingState.status === "loading" ? ( // ice-blue 'indexing loading' dot
<>
<CircleDiv
data-tooltip-id="indexingNotLoaded_dot"
color="#72aec2"
></CircleDiv>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="indexingNotLoaded_dot" place="top">
Continue is initializing
</StyledTooltip>,
tooltipPortalDiv,
)}
</>
) : indexingState.status === "done" ? ( //indexing complete green dot
<>
<CircleDiv data-tooltip-id="progress_dot" color="#090"></CircleDiv>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="progress_dot" place="top">
Index up to date. Click to force re-indexing
</StyledTooltip>,
tooltipPortalDiv,
)}
</>
) : indexingState.status === "disabled" ? ( //gray disabled dot
<>
<CircleDiv
data-tooltip-id="progress_dot"
color={lightGray}
></CircleDiv>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="progress_dot" place="top">
{indexingState.desc}
</StyledTooltip>,
tooltipPortalDiv,
)}
</>
</FlexDiv>
) : indexingState.status === "disabled" ? (
<FlexDiv>
<BlinkingDot color={STATUS_COLORS.DISABLED}></BlinkingDot>
<StatusHeading>{indexingState.desc}</StatusHeading>
</FlexDiv>
) : indexingState.status === "paused" ||
(paused && indexingState.status === "indexing") ? (
//yellow 'paused' dot
<>
<CircleDiv
data-tooltip-id="progress_dot"
color="#bb0"
<FlexDiv>
<BlinkingDot
color={STATUS_COLORS.PAUSED}
onClick={(e) => {
ideMessenger.post("index/setPaused", false);
}}
></CircleDiv>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="progress_dot" place="top">
Click to unpause indexing (
{Math.trunc(indexingState.progress * 100)}%)
</StyledTooltip>,
tooltipPortalDiv,
)}
</>
) : indexingState.status === "indexing" ? ( //progress bar
<>
<GridDiv
data-tooltip-id="usage_progress_bar"
onMouseEnter={() => setHovered(true)}
onMouseLeave={() => setHovered(false)}
onClick={(e) => {
ideMessenger.post("index/setPaused", true);
}}
>
<ProgressBarWrapper>
<ProgressBarFill completed={fillPercentage} />
</ProgressBarWrapper>
<P>
{hovered
? "Click to pause"
: `Indexing (${Math.trunc(indexingState.progress * 100)}%)`}
</P>
</GridDiv>
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="usage_progress_bar" place="top">
{indexingState.desc}
</StyledTooltip>,
tooltipPortalDiv,
)}
</>
></BlinkingDot>
<StatusHeading>
Click to resume indexing ({Math.trunc(indexingState.progress * 100)}
%)
</StatusHeading>
</FlexDiv>
) : indexingState.status === "indexing" ? (
<FlexDiv
onMouseEnter={() => setHovered(true)}
onMouseLeave={() => setHovered(false)}
onClick={(e) => {
ideMessenger.post("index/setPaused", true);
}}
>
<BlinkingDot
color={STATUS_COLORS.INDEXING}
shouldBlink={true}
></BlinkingDot>
<div>
<FlexDiv>
<ProgressBarWrapper>
<ProgressBarFill completed={fillPercentage} />
</ProgressBarWrapper>

<StatusHeading
style={{ fontSize: `${getFontSize() - 3}px` }}
>{`${Math.trunc(indexingState.progress * 100)}%`}</StatusHeading>
</FlexDiv>

<StatusInfo>
{hovered ? "Click to pause" : indexingState.desc}
</StatusInfo>
</div>
</FlexDiv>
) : null}
</div>
);

@@ -8,8 +8,8 @@ import {
setShowDialog,
} from "../../redux/slices/uiStateSlice";
import { getFontSize } from "../../util";
import { ftl } from "../dialogs/FTCDialog";
import QuickModelSetup from "../modelSelection/quickSetup/QuickModelSetup";
import { FREE_TRIAL_LIMIT_REQUESTS } from "../../util/freeTrial";

const ProgressBarWrapper = styled.div`
width: 100px;

@@ -96,7 +96,7 @@ const ProgressBar = ({ completed, total }: ProgressBarProps) => {
{tooltipPortalDiv &&
ReactDOM.createPortal(
<StyledTooltip id="usage_progress_bar" place="top">
{`Click to use your own API key or local LLM (required after ${ftl()} inputs)`}
{`Click to use your own API key or local LLM (required after ${FREE_TRIAL_LIMIT_REQUESTS} inputs)`}
</StyledTooltip>,
tooltipPortalDiv,
)}