Common Questions About Model Selection
Does Dessix Support Custom Third-party Models?
Answer: No
Design Philosophy
To reduce users' cognitive burden, Dessix pre-configures the most suitable default model for each usage scenario.
Scenario-based Model Configuration
- Scenario Matching: Dessix configures the most suitable default model for each usage scenario
- Cost Optimization: Simple scenarios use economical models; complex scenarios use advanced models
- User Control: In supported scenarios, users can specify a particular model via @mention
Scenarios and Models
Dessix uses a pre-configured default model for each usage scenario:
Scenario-to-Model Mapping
| Usage Scenario | Default Model | Supports Specification | Selection Reason |
|---|---|---|---|
| Chat Conversations | Gemini 2.5 Pro | ✅ Yes | Complex conversations require stronger reasoning |
| Custom Generation | Dynamic Selection | ✅ Yes | Based on user specification or system default |
| Summary Generation | Gemini 2.5 Flash | ❌ No | Summary tasks are simpler, use cost-effective models |
| Metadata Extraction | Gemini 2.5 Flash | ❌ No | Structured extraction, prioritize cost-effectiveness |
| Chat Title Generation | Gemini 2.5 Flash | ❌ No | Simple text generation, optimize response speed |
| Chat Suggestions | Gemini 2.5 Flash | ❌ No | Lightweight task, optimize cost |
| Text Completion | Gemini 2.5 Flash | ❌ No | Quick completion, optimize response speed |
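As a rough illustration of the mapping above, the sketch below shows one way such a scenario-to-default-model table could be represented in code. The type names, the config shape, and the resolveModel helper are assumptions made for this example; they do not describe Dessix's actual implementation.

```typescript
// Hypothetical scenario-to-default-model mapping, for illustration only.
// Model IDs mirror the table above; the shape and names are assumptions.
type Scenario =
  | "chat"
  | "customGeneration"
  | "summary"
  | "metadataExtraction"
  | "chatTitle"
  | "chatSuggestions"
  | "textCompletion";

interface ScenarioConfig {
  defaultModel: string;    // model ID used when the user does not specify one
  userSelectable: boolean; // whether a user-specified model is honored
}

const SCENARIO_DEFAULTS: Record<Scenario, ScenarioConfig> = {
  chat:               { defaultModel: "google/gemini-2.5-pro",   userSelectable: true },
  // "Custom Generation" is listed as dynamic above; a fixed fallback is assumed here.
  customGeneration:   { defaultModel: "google/gemini-2.5-pro",   userSelectable: true },
  summary:            { defaultModel: "google/gemini-2.5-flash", userSelectable: false },
  metadataExtraction: { defaultModel: "google/gemini-2.5-flash", userSelectable: false },
  chatTitle:          { defaultModel: "google/gemini-2.5-flash", userSelectable: false },
  chatSuggestions:    { defaultModel: "google/gemini-2.5-flash", userSelectable: false },
  textCompletion:     { defaultModel: "google/gemini-2.5-flash", userSelectable: false },
};

// Honor a user-specified model only in scenarios that allow it.
function resolveModel(scenario: Scenario, userChoice?: string): string {
  const cfg = SCENARIO_DEFAULTS[scenario];
  return cfg.userSelectable && userChoice ? userChoice : cfg.defaultModel;
}

resolveModel("chat", "anthropic/claude-sonnet-4.5");    // -> "anthropic/claude-sonnet-4.5"
resolveModel("summary", "anthropic/claude-sonnet-4.5"); // -> "google/gemini-2.5-flash"
```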
Advantages of Scenario-based Configuration
- Cost Optimization: Simple scenarios use economical models; complex scenarios use advanced models
- Performance Balance: Models are pre-configured to match each scenario's characteristics
- Seamless Experience: Automatic selection, no manual intervention needed
How to Specify a Model?
In scenarios that support model specification (Chat Conversations, Custom Generation), you can specify models through the following methods:
Method 1: Specify via Mention (Recommended)
Use the @ symbol to mention a model name in conversations.
Prerequisite
You need to install the corresponding model from the Dessix Community first. You can open the Community via the button in the upper right corner (shortcut Ctrl + M).
Examples:
@claude-sonnet-4 Please help me analyze the performance issues in this code
@gemini-3-pro This is a complex reasoning task, please help me solve it
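For illustration, the sketch below shows one way a client could pull the mentioned model name out of a message like the examples above. The function name and the parsing rule (an @ followed by the model name, ending at whitespace) are assumptions; Dessix's actual mention handling is not documented here.

```typescript
// Hypothetical sketch of extracting an @mention from a message.
// The regex and function name are assumptions, not Dessix's real parser.
function extractMentionedModel(message: string): string | undefined {
  const match = message.match(/@([A-Za-z0-9._\/-]+)/);
  return match?.[1];
}

extractMentionedModel("@claude-sonnet-4 Please help me analyze the performance issues in this code");
// -> "claude-sonnet-4"
extractMentionedModel("No mention here"); // -> undefined
```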
Method 2: Declare Model ID in Text
Directly state the model ID you want to use in the conversation:
Examples:
Use anthropic/claude-sonnet-4.5 to complete this task, ......
Please use google/gemini-2.5-pro to handle this problem
Method 3: Use Model Aliases
Each model has multiple aliases, allowing you to use shorter names:
Use gemini25pro to handle this task
Complete this with claude4sonnet
Note
The system does not support automatic model switching based on task type descriptions (such as "this is a programming task"). If you need to use a specific model, please explicitly specify the model ID, name, or alias.
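To make ID and alias resolution concrete, here is a hypothetical sketch of how a short alias or a full model ID might be mapped to a model. The alias table entries and the resolveModelId helper are assumptions based on the examples above, not Dessix's actual implementation.

```typescript
// Hypothetical alias lookup, for illustration only.
const MODEL_ALIASES: Record<string, string> = {
  gemini25pro: "google/gemini-2.5-pro",
  claude4sonnet: "anthropic/claude-sonnet-4.5", // assumed target for this alias
};

// Resolve user input (full ID or short alias) to a model ID, if possible.
function resolveModelId(input: string): string | undefined {
  const normalized = input.trim().toLowerCase();
  if (normalized.includes("/")) return normalized; // already looks like a full model ID
  return MODEL_ALIASES[normalized];
}

resolveModelId("gemini25pro");           // -> "google/gemini-2.5-pro"
resolveModelId("google/gemini-2.5-pro"); // -> "google/gemini-2.5-pro"
resolveModelId("unknown-alias");         // -> undefined
```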
Built-in Model List
Dessix currently includes the following high-quality models:
Gemini Series (Google)
| Model ID | Name | Features |
|---|---|---|
| google/gemini-2.0-flash-001 | Gemini 2.0 Flash | Fast response, image support, 1M context |
| google/gemini-2.5-flash | Gemini 2.5 Flash | Enhanced fast model, tool calling support |
| google/gemini-2.5-pro | Gemini 2.5 Pro | Professional version, powerful reasoning |
| google/gemini-3-pro-preview | Gemini 3 Pro Preview | Latest flagship, multimodal reasoning |
Grok Series (xAI)
| Model ID | Name | Features |
|---|---|---|
| x-ai/grok-4-fast | Grok 4 Fast | 2M context, multimodal support |
| x-ai/grok-code-fast-1 | Grok Code Fast 1 | Specially optimized programming model |
Claude Series (Anthropic)
| Model ID | Name | Features |
|---|---|---|
| anthropic/claude-sonnet-4.5 | Claude Sonnet 4.5 | Balanced performance and efficiency |
| anthropic/claude-opus-4.5 | Claude Opus 4.5 | Frontier reasoning model |
DeepSeek Series
| Model ID | Name | Features |
|---|---|---|
| deepseek/deepseek-chat-v3.1 | DeepSeek V3.1 | Conversation-optimized version |
OpenAI Series
| Model ID | Name | Features |
|---|---|---|
| openai/gpt-5 | GPT-5 | OpenAI flagship model, tool calling support |
Other Models
| Model ID | Name | Features |
|---|---|---|
| z-ai/glm-4.6 | GLM 4.6 | Z.AI model, 200K context |
Design Considerations
Why Not Expose Model Selection?
Avoid Becoming a Playground Tool
- If model selection is exposed, Dessix might become a complex Playground tool
- Dessix would need to explain to users which models are suitable for which tasks
- The state-of-the-art model changes constantly, and tracking it would increase users' cognitive burden
Reduce Cognitive Load
- Users don't need to understand the characteristics and applicable scenarios of various models
- Focus on the task itself rather than tool selection
- Simplify usage flow and improve efficiency
Transparency Needs for Advanced Users
For advanced users who need to know which models are being used:
Recommended Approach
- Use @mention to explicitly specify models
- Use model IDs or aliases when asking questions
- Refer to the model list to see supported models
Usage Tips
- Use @model-name for quick specification
- Use full model IDs for precise specification
- Use short aliases (like gemini25pro) for efficiency
Best Practices
General Users
- Directly describe your needs and tasks
- Dessix will automatically select the default model based on the usage scenario
- Focus on result quality rather than model selection
Advanced Users
- Use @mention for precise model control
- Specify models via ID or alias in Chat conversations
- Use model aliases for quick specification
- Understand different models' strengths and select manually when needed
Through this design, Dessix maintains ease of use while providing sufficient control for advanced users.