`docs/how-tos/vs-code/datacoves-copilot/v2.md` (88 changes: 23 additions, 65 deletions)

This section describes how to configure and use Datacoves Copilot v2, which comes installed on Datacoves v4+ and supports the following LLM providers:

- [Anthropic](#anthropic-llm-provider)
- [Azure OpenAI](#azure-openai-llm-provider)
- [DeepSeek](#deepseek-llm-provider)
- [Google Gemini](#google-gemini-llm-provider)
- [OpenAI](#openai-llm-provider)
- [OpenAI Compatible](#openai-compatible-llm-providers)
- [Open Router](#open-router-llm-provider)
- [xAI (Grok)](#xai-grok-llm-provider)

## How Tos

## Configure your LLM in Datacoves Copilot v2

## Create a Datacoves Secret
Creating a [Datacoves Secret](/how-tos/datacoves/how_to_secrets.md) requires some specific settings:

- **Name:** The secret must be named `datacoves-copilot-api-configs`
- **Description:** Provide a simple description such as: `Datacoves Copilot config`
- **Format:** Select `Raw JSON`
- **Value:** The value will vary depending on the LLM you are using; see the provider sections below.
- **Scope:** Select the desired scope, either `Project` or `Environment`.
- **Project/Environment:** Select the `Project` or `Environment` that will access this LLM.

Lastly, be sure to toggle on the `Share with developers` option so that users will have access to it.

## LLM Providers

## Anthropic LLM Provider

Datacoves Copilot supports Anthropic's Claude family of models.

See [Anthropic's Model Documentation](https://docs.claude.com/en/docs/about-claude/models/overview) for more details on each model's capabilities.
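
As a rough illustration of the secret value (the `apiProvider` and `apiKey` field names and the model ID below are assumptions rather than confirmed configuration keys), an Anthropic entry might look something like this:

```json
{
  "apiProvider": "anthropic",
  "apiKey": "<YOUR_ANTHROPIC_API_KEY>",
  "apiModelId": "claude-sonnet-4-20250514"
}
```

Swap in your own API key and a model ID taken from the documentation linked above.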

## OpenAI LLM Provider

Datacoves Copilot supports accessing models directly through the official OpenAI API, including optimized GPT-4 models and the latest GPT-5 family with advanced features such as reasoning effort control and verbosity settings.

Refer to the [OpenAI Models documentation](https://platform.openai.com/docs/models) for the most up-to-date list of models and capabilities.
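
As an unverified sketch (the `apiProvider` and `openAiNativeApiKey` field names are assumptions; only the overall shape is illustrated), an OpenAI entry could look like:

```json
{
  "apiProvider": "openai-native",
  "openAiNativeApiKey": "<YOUR_OPENAI_API_KEY>",
  "apiModelId": "gpt-5"
}
```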

## Azure OpenAI LLM Provider

Datacoves Copilot supports Azure OpenAI models through the OpenAI-compatible API interface.
Use your **deployment name** from Azure AI Foundry as the `openAiModelId`.

Refer to [Azure OpenAI documentation](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/concepts/models) for the most current model availability and regional deployment options.
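
Putting the deployment-name guidance together, a sketch of an Azure OpenAI value might look like the following. The `openAiBaseUrl`, `openAiApiKey`, and `azureApiVersion` field names and the API version shown are assumptions; `openAiModelId` carries the deployment name as described above:

```json
{
  "apiProvider": "openai",
  "openAiBaseUrl": "https://<your-resource>.openai.azure.com",
  "openAiApiKey": "<YOUR_AZURE_OPENAI_API_KEY>",
  "openAiModelId": "<your-deployment-name>",
  "azureApiVersion": "2024-08-01-preview"
}
```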

## OpenAI Compatible LLM Providers

Datacoves Copilot supports a wide range of AI model providers that offer APIs compatible with the OpenAI API standard. This means you can use models from providers other than OpenAI, while still using a familiar API interface. This includes providers like:
- Cloud providers like Perplexity, Together AI, Anyscale, and others.
- Any other provider offering an OpenAI-compatible API endpoint.

**Note:** For Azure OpenAI, see the dedicated [Azure OpenAI](#azure-openai-llm-provider) section for specific setup instructions.

### Secret value format

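A minimal sketch of a value for an OpenAI-compatible endpoint. The `apiProvider`, `openAiBaseUrl`, and `openAiApiKey` field names are assumptions and should be checked against your Datacoves Copilot provider settings; use whatever base URL and model name your provider documents:

```json
{
  "apiProvider": "openai",
  "openAiBaseUrl": "https://api.together.xyz/v1",
  "openAiApiKey": "<YOUR_PROVIDER_API_KEY>",
  "openAiModelId": "<model-id-from-your-provider>"
}
```

Some deployments may also accept additional fine-tuning fields for model usage (for example, a custom model info object); those keys are not covered in this sketch.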

## Google Gemini LLM Provider

Datacoves Copilot supports Google's Gemini family of models through the Google AI Gemini API.
Expand Down Expand Up @@ -436,20 +405,13 @@ Datacoves Copilot supports the following Gemini models:

Refer to the [Gemini documentation](https://ai.google.dev/gemini-api/docs/models) for more details on each model.
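
As a hedged example only (the `apiProvider` and `geminiApiKey` field names are assumptions), a Gemini entry might look like:

```json
{
  "apiProvider": "gemini",
  "geminiApiKey": "<YOUR_GEMINI_API_KEY>",
  "apiModelId": "gemini-2.5-pro"
}
```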

## DeepSeek LLM Provider

Datacoves Copilot supports accessing models through the DeepSeek API, including deepseek-chat and deepseek-reasoner.

Website: https://platform.deepseek.com/

### Secret value format

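A minimal sketch of a DeepSeek value. The `apiProvider` and `deepSeekApiKey` field names are assumptions; `apiModelId` takes one of the model IDs listed under Supported Models below:

```json
{
  "apiProvider": "deepseek",
  "deepSeekApiKey": "<YOUR_DEEPSEEK_API_KEY>",
  "apiModelId": "deepseek-chat"
}
```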

### Getting an API Key

1. Sign Up/Sign In: Go to the DeepSeek Platform. Create an account or sign in.
2. Navigate to API Keys: Find your API keys in the API keys section of the platform.
3. Create a Key: Click "Create new API key". Give your key a descriptive name (e.g., "Datacoves").
4. Copy the Key: Important: Copy the API key immediately. You will not be able to see it again. Store it securely.

### Supported Models (`apiModelId`)

- deepseek-chat (Recommended for coding tasks)
- deepseek-reasoner (Recommended for reasoning tasks)
- deepseek-r1

## Open Router LLM Provider

OpenRouter is an AI platform that provides access to a wide variety of language models from different providers, all through a single API. This can simplify setup and allow you to easily experiment with different models.

Website: https://openrouter.ai/

### Secret value format

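A minimal sketch of an Open Router value. The `apiProvider` and `openRouterApiKey` field names are assumptions; `openRouterModelId` uses OpenRouter's `vendor/model` naming:

```json
{
  "apiProvider": "openrouter",
  "openRouterApiKey": "<YOUR_OPENROUTER_API_KEY>",
  "openRouterModelId": "anthropic/claude-3.5-sonnet"
}
```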

### Getting an API Key

1. Sign Up/Sign In: Go to the OpenRouter website. Sign in with your Google or GitHub account.
2. Get an API Key: Go to the keys page. You should see an API key listed. If not, create a new key.
3. Copy the Key: Copy the API key.

### Supported Models (`openRouterModelId`)

OpenRouter supports a large and growing number of models.
Refer to the [OpenRouter Models page](https://openrouter.ai/models) for the complete and up-to-date list.

## xAI Grok LLM Provider

xAI is the company behind Grok, a large language model known for its conversational abilities and large context window.
Grok models are designed to provide helpful, informative, and contextually relevant responses.

Website: https://x.ai/

### Secret value format

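A minimal sketch of an xAI value. The `apiProvider` and `xaiApiKey` field names are assumptions; `apiModelId` takes one of the Grok model IDs listed under Supported Models below:

```json
{
  "apiProvider": "xai",
  "xaiApiKey": "<YOUR_XAI_API_KEY>",
  "apiModelId": "grok-code-fast-1"
}
```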

### Getting an API Key

1. Sign Up/Sign In: Go to the xAI Console. Create an account or sign in.
2. Navigate to API Keys: Go to the API keys section in your dashboard.
3. Create a Key: Click to create a new API key. Give your key a descriptive name (e.g., "Datacoves").
4. Copy the Key: Important: Copy the API key immediately. You will not be able to see it again. Store it securely.

### Supported Models (`apiModelId`)

- grok-code-fast-1 (Default) - xAI's Grok Code Fast model with 262K context window and prompt caching, optimized for reasoning and coding tasks
- grok-4 - xAI's Grok-4 model with 262K context window, image support, and prompt caching
- grok-2-vision-1212 - xAI's Grok-2 Vision model (version 1212) with image support and 32K context window

Learn more about available models at [xAI Docs](https://docs.x.ai/docs/models).
