Anthropic (Claude) provider #22

Merged: 7 commits, Jan 24, 2025
4 changes: 2 additions & 2 deletions README.md
@@ -53,8 +53,8 @@ The process is different for each provider, so you may refer to their documentation

![Screenshot showing how to create an API key](./img/1-api-key.png)

-2. Open the JupyterLab settings and go to the **Ai providers** section to select the provider
-   (`mistral` is only supported one currently) and the API key (required).
+2. Open the JupyterLab settings and go to the **Ai providers** section to select the `MistralAI`
+   provider and the API key (required).

![Screenshot showing how to add the API key to the settings](./img/2-jupyterlab-settings.png)

1 change: 1 addition & 0 deletions package.json
@@ -61,6 +61,7 @@
"@jupyterlab/notebook": "^4.4.0-alpha.0",
"@jupyterlab/rendermime": "^4.4.0-alpha.0",
"@jupyterlab/settingregistry": "^4.4.0-alpha.0",
+"@langchain/anthropic": "^0.3.9",
"@langchain/core": "^0.3.13",
"@langchain/mistralai": "^0.1.1",
"@lumino/coreutils": "^2.1.2",
2 changes: 1 addition & 1 deletion schema/ai-provider.json
@@ -8,7 +8,7 @@
"title": "The AI provider",
"description": "The AI provider to use for chat and completion",
"default": "None",
-"enum": ["None", "MistralAI"]
+"enum": ["None", "Anthropic", "MistralAI"]
}
},
"additionalProperties": true
38 changes: 36 additions & 2 deletions scripts/settings-generator.js
@@ -20,23 +20,57 @@ const schemaBase = tsj
.createGenerator(configBase)
.createSchema(configBase.type);

/**
 * The providers are the list of providers for which settings should be built from their
 * input interface.
 * The keys are the names of the JSON files that will be linked to the selected provider.
 * The values are:
 * - path: the path of the module containing the provider input description, in the @langchain package.
 * - type: the type or interface to convert to JSON settings.
 * - excludedProps: (optional) the properties to exclude from the settings.
 *   "ts-json-schema-generator" does not seem to handle some imported types, so the
 *   workaround is to exclude them for now, so that the other settings can still be built.
 */
const providers = {
mistralAI: {
path: 'node_modules/@langchain/mistralai/dist/chat_models.d.ts',
type: 'ChatMistralAIInput'
},
anthropic: {
path: 'node_modules/@langchain/anthropic/dist/chat_models.d.ts',
type: 'AnthropicInput',
excludedProps: ['clientOptions']
}
};

Object.entries(providers).forEach(([name, desc], index) => {
  // The configuration doesn't include functions, which probably cannot be filled
  // from the settings panel.
const config = {
path: desc.path,
tsconfig: './tsconfig.json',
-    type: desc.type
+    type: desc.type,
+    functions: 'hide'
};

const outputPath = path.join(outputDir, `${name}.json`);

-  const schema = tsj.createGenerator(config).createSchema(config.type);
+  const generator = tsj.createGenerator(config);
+  let schema;

// Workaround to exclude some properties from a type or interface.
if (desc.excludedProps) {
> **Member:** It's probably fine to have exclusion clauses like this for now as we figure things out with several providers 👍
>
> Also found that some props had to be excluded over in #27.

const nodes = generator.getRootNodes(config.type);
const finalMembers = [];
nodes[0].members.forEach(member => {
if (!desc.excludedProps.includes(member.symbol.escapedName)) {
finalMembers.push(member);
}
});
nodes[0].members = finalMembers;
schema = generator.createSchemaFromNodes(nodes);
} else {
schema = generator.createSchema(config.type);
}

if (!schema.definitions) {
return;
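The `excludedProps` workaround above can be sketched in isolation. In this minimal sketch, `SchemaNode` and the sample member names are hand-made stand-ins, not the real `ts-json-schema-generator` AST types:

```typescript
// Stand-ins for the generator's root-node shape (assumption, not the real AST type).
interface SchemaMember {
  symbol: { escapedName: string };
}
interface SchemaNode {
  members: SchemaMember[];
}

function excludeProps(node: SchemaNode, excludedProps: string[]): SchemaNode {
  // Keep only the members whose name is not in the exclusion list.
  node.members = node.members.filter(
    member => !excludedProps.includes(member.symbol.escapedName)
  );
  return node;
}

// Hypothetical root node mimicking what generator.getRootNodes() might yield.
const node: SchemaNode = {
  members: [
    { symbol: { escapedName: 'apiKey' } },
    { symbol: { escapedName: 'clientOptions' } },
    { symbol: { escapedName: 'temperature' } }
  ]
};

excludeProps(node, ['clientOptions']);
console.log(node.members.map(m => m.symbol.escapedName)); // [ 'apiKey', 'temperature' ]
```

The script does the same thing imperatively with `forEach`/`push` before handing the pruned nodes to `createSchemaFromNodes`.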
6 changes: 3 additions & 3 deletions src/completion-provider.ts
@@ -3,10 +3,10 @@ import {
IInlineCompletionContext,
IInlineCompletionProvider
} from '@jupyterlab/completer';
-import { LLM } from '@langchain/core/language_models/llms';
+import { BaseLanguageModel } from '@langchain/core/language_models/base';
+import { ReadonlyPartialJSONObject } from '@lumino/coreutils';

import { getCompleter, IBaseCompleter, BaseCompleter } from './llm-models';
-import { ReadonlyPartialJSONObject } from '@lumino/coreutils';

/**
* The generic completion provider to register to the completion provider manager.
@@ -57,7 +57,7 @@ export class CompletionProvider implements IInlineCompletionProvider {
/**
* Get the LLM completer.
*/
-  get llmCompleter(): LLM | null {
+  get llmCompleter(): BaseLanguageModel | null {
return this._completer?.provider || null;
}

65 changes: 65 additions & 0 deletions src/llm-models/anthropic-completer.ts
@@ -0,0 +1,65 @@
import {
CompletionHandler,
IInlineCompletionContext
} from '@jupyterlab/completer';
import { ChatAnthropic } from '@langchain/anthropic';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { AIMessage, SystemMessage } from '@langchain/core/messages';

import { BaseCompleter, IBaseCompleter } from './base-completer';

export class AnthropicCompleter implements IBaseCompleter {
constructor(options: BaseCompleter.IOptions) {
this._anthropicProvider = new ChatAnthropic({ ...options.settings });
}

get provider(): BaseChatModel {
return this._anthropicProvider;
}

async fetch(
request: CompletionHandler.IRequest,
context: IInlineCompletionContext
) {
const { text, offset: cursorOffset } = request;
const prompt = text.slice(0, cursorOffset);

// Anthropic does not allow whitespace at the end of the AIMessage
const trimmedPrompt = prompt.trim();

const messages = [
> **Member:** Wondering if this list of messages should be recreated each time, or if it should be persisted per provider?

> **Member:** Some other open questions: should the list of messages be shared across providers? Or should it be reset when a user switches to another provider?
>
> Maybe a reset would be fine to keep things tidy. Also, users probably wouldn't change providers often in practice.

> **Member:** Opened #29
new SystemMessage(
'You are a code-completion AI completing the following code from a Jupyter Notebook cell.'
),
new AIMessage(trimmedPrompt)
];

try {
const response = await this._anthropicProvider.invoke(messages);
const items = [];

      // Anthropic can return either a string, or complex content: a list of text/image/other blocks.
if (typeof response.content === 'string') {
items.push({
insertText: response.content
});
} else {
response.content.forEach(content => {
if (content.type !== 'text') {
return;
}
items.push({
insertText: content.text,
filterText: prompt.substring(trimmedPrompt.length)
});
});
}
return { items };
} catch (error) {
console.error('Error fetching completions', error);
return { items: [] };
}
}

private _anthropicProvider: ChatAnthropic;
}
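The string-vs-blocks branching in `fetch` above can be exercised on its own. In this sketch the content shapes are hand-built stand-ins for what `ChatAnthropic.invoke` returns, not the real LangChain types:

```typescript
// Stand-in for Anthropic's response content blocks (assumption, simplified).
type ContentBlock = { type: string; text?: string };

function toItems(content: string | ContentBlock[]): { insertText: string }[] {
  // Anthropic may return a plain string…
  if (typeof content === 'string') {
    return [{ insertText: content }];
  }
  // …or a list of blocks, from which only the text blocks are kept.
  return content
    .filter(block => block.type === 'text' && typeof block.text === 'string')
    .map(block => ({ insertText: block.text as string }));
}

console.log(toItems('print("hello")'));
console.log(
  toItems([
    { type: 'text', text: 'return x + y' },
    { type: 'image' } // non-text blocks are skipped
  ])
);
```

The real completer additionally sets `filterText` from the untrimmed prompt, since the trailing whitespace stripped for Anthropic still matters for client-side filtering.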
4 changes: 2 additions & 2 deletions src/llm-models/base-completer.ts
@@ -2,14 +2,14 @@ import {
CompletionHandler,
IInlineCompletionContext
} from '@jupyterlab/completer';
-import { LLM } from '@langchain/core/language_models/llms';
+import { BaseLanguageModel } from '@langchain/core/language_models/base';
import { ReadonlyPartialJSONObject } from '@lumino/coreutils';

export interface IBaseCompleter {
/**
* The LLM completer.
*/
-  provider: LLM;
+  provider: BaseLanguageModel;

/**
* The function to fetch a new completion.
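The widening from `LLM` to `BaseLanguageModel` matters because chat models such as `ChatAnthropic` do not subclass the text-completion `LLM` class; both only share a common invokable base. A simplified sketch of that design point, with stand-in classes rather than the real LangChain hierarchy:

```typescript
// Stand-ins for the LangChain hierarchy (assumption, heavily simplified):
// the completer interface only needs the shared invokable shape.
interface Invokable {
  invoke(input: string): string;
}

class PlainLLM implements Invokable {
  invoke(prompt: string): string {
    return `${prompt}…`; // text-completion style
  }
}

class ChatModel implements Invokable {
  invoke(prompt: string): string {
    return `[chat] ${prompt}`; // chat-message style
  }
}

// Both kinds of provider now satisfy the completer's `provider` type.
const providers: Invokable[] = [new PlainLLM(), new ChatModel()];
console.log(providers.map(p => p.invoke('hi')));
```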
11 changes: 11 additions & 0 deletions src/llm-models/utils.ts
@@ -1,12 +1,15 @@
+import { ChatAnthropic } from '@langchain/anthropic';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { ChatMistralAI } from '@langchain/mistralai';
import { JSONObject } from '@lumino/coreutils';

import { IBaseCompleter } from './base-completer';
+import { AnthropicCompleter } from './anthropic-completer';
import { CodestralCompleter } from './codestral-completer';
import { ReadonlyPartialJSONObject } from '@lumino/coreutils';

import mistralAI from '../_provider-settings/mistralAI.json';
+import anthropic from '../_provider-settings/anthropic.json';

/**
* Get an LLM completer from the name.
@@ -17,6 +20,8 @@ export function getCompleter(
): IBaseCompleter | null {
if (name === 'MistralAI') {
return new CodestralCompleter({ settings });
+  } else if (name === 'Anthropic') {
+    return new AnthropicCompleter({ settings });
}
return null;
}
@@ -30,6 +35,8 @@ export function getChatModel(
): BaseChatModel | null {
if (name === 'MistralAI') {
return new ChatMistralAI({ ...settings });
+  } else if (name === 'Anthropic') {
+    return new ChatAnthropic({ ...settings });
}
return null;
}
@@ -40,6 +47,8 @@ export function getChatModel(
export function getErrorMessage(name: string, error: any): string {
if (name === 'MistralAI') {
return error.message;
+  } else if (name === 'Anthropic') {
+    return error.error.error.message;
}
return 'Unknown provider';
}
@@ -50,6 +59,8 @@ export function getErrorMessage(name: string, error: any): string {
export function getSettings(name: string): JSONObject | null {
if (name === 'MistralAI') {
return mistralAI.definitions.ChatMistralAIInput.properties;
+  } else if (name === 'Anthropic') {
+    return anthropic.definitions.AnthropicInput.properties;
}
return null;
}
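Each helper in `utils.ts` dispatches on the provider name with an if/else chain; as more providers land, a lookup table is a common alternative. A minimal sketch, with stub factories standing in for the real LangChain constructors:

```typescript
// Lookup-table alternative to the if/else dispatch in utils.ts.
// StubChatModel and the factories are stand-ins, not LangChain types.
type Settings = Record<string, unknown>;
interface StubChatModel {
  providerName: string;
  settings: Settings;
}

const chatModelFactories: Record<string, (settings: Settings) => StubChatModel> = {
  MistralAI: settings => ({ providerName: 'MistralAI', settings }),
  Anthropic: settings => ({ providerName: 'Anthropic', settings })
};

function getChatModel(name: string, settings: Settings): StubChatModel | null {
  const factory = chatModelFactories[name];
  // Unknown providers fall through to null, as in the PR's helpers.
  return factory ? factory(settings) : null;
}

console.log(getChatModel('Anthropic', { temperature: 0 })?.providerName); // Anthropic
console.log(getChatModel('None', {})); // null
```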