AI-assisted development
You can use mittwald's AI models to assist you in your development process. The models are GDPR-compliant, hosted in Germany, and neither store user data nor use it for training.
Available models
The service offers several AI models optimized for different development tasks. For a complete list of available models and their specifications, see the AI models documentation.
IDE integration
Overview
| IDE | Integrations |
|---|---|
| JetBrains IDEs (IntelliJ, PhpStorm, ...) | Continue, JetBrains AI |
| Visual Studio Code | Continue, Cline |
| Cursor | Cline |
| Windsurf | Cline |
JetBrains AI (for JetBrains IDEs such as IntelliJ, PhpStorm, WebStorm, ...)
To use the models, you need to set up a proxy service on your local machine to handle authentication and forward requests to the mittwald API.
For detailed instructions on integrating the proxy with your IDE, refer to the IDE integration section in the project's README.
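In essence, such a proxy listens on your local machine, attaches your API key to each request, and passes it through to the mittwald endpoint. The Python sketch below only illustrates that idea under simplified assumptions (no streaming, no error handling, and a made-up environment variable name); it is not the mittwald proxy project itself, whose README remains the authoritative setup guide:

```python
# Illustrative sketch only, not the actual mittwald proxy: a local HTTP server
# that injects the API key and forwards requests to the mittwald endpoint.
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "https://llm.aihosting.mittwald.de"   # mittwald LLM API
API_KEY = os.environ["MITTWALD_LLM_API_KEY"]     # hypothetical variable name


class ProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body that the IDE sent to the local proxy.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        request = urllib.request.Request(
            UPSTREAM + self.path,
            data=body,
            headers={
                "Content-Type": "application/json",
                # Authentication is handled here, so the IDE never needs the key.
                "Authorization": f"Bearer {API_KEY}",
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as upstream:
            self.send_response(upstream.status)
            self.send_header(
                "Content-Type",
                upstream.headers.get("Content-Type", "application/json"),
            )
            self.end_headers()
            self.wfile.write(upstream.read())


if __name__ == "__main__":
    # The IDE integration is then pointed at http://localhost:8080
    # instead of the remote API.
    HTTPServer(("127.0.0.1", 8080), ProxyHandler).serve_forever()
```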
Continue (for Visual Studio Code or JetBrains IDEs)
Continue works with both JetBrains IDEs and Visual Studio Code.
Start by installing the plugin for your IDE.
Models are configured in the `config.yaml` file, usually found in the `~/.continue/` directory. You can configure the mittwald models (`Qwen3-Coder-30B-Instruct` works well for code generation) using the following configuration:
```yaml
name: mittwald
version: 1.0.0
schema: v1

models:
  - name: Qwen3-Coder-30B-Instruct
    provider: openai
    model: Qwen3-Coder-30B-Instruct
    apiBase: https://llm.aihosting.mittwald.de/v1
    apiKey: <INSERT-API-KEY>
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use

context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
```
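Because the endpoint is OpenAI-compatible, you can sanity-check the base URL, API key, and model name used above outside the IDE, for example with the official `openai` Python package. This snippet is only an optional check, not part of the Continue setup; the placeholder key and the prompt are examples:

```python
# Optional sanity check for the values used in config.yaml above.
# Requires `pip install openai`; replace the placeholder API key.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.aihosting.mittwald.de/v1",  # same as apiBase above
    api_key="<INSERT-API-KEY>",                       # your LLM API key from the mStudio
)

response = client.chat.completions.create(
    model="Qwen3-Coder-30B-Instruct",
    messages=[{"role": "user", "content": "Write a hello-world function in PHP."}],
)
print(response.choices[0].message.content)
```

If this prints a completion, Continue (and any other OpenAI-compatible client) should work with the same values.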
Cline (for Visual Studio Code, Cursor or Windsurf)
Cline works with Visual Studio Code, Cursor and Windsurf. Start by installing the appropriate IDE plugin.
When configuring Cline via its UI, select the following options:
- API Provider: Select "OpenAI Compatible"
- Base URL: Enter `https://llm.aihosting.mittwald.de/v1`
- OpenAI Compatible API Key: Enter your LLM API key from the mStudio
- Model ID: Enter `Qwen3-Coder-30B-Instruct`
Usage limits
During the beta phase, each API key is limited to 300 requests per minute. The service is currently free while in beta, with full release expected in 2025.
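If your tooling sends many requests in a short time, it may hit this limit and receive HTTP 429 responses. One simple way to handle that is exponential backoff, sketched here with the `openai` Python package; the retry count and wait times are arbitrary examples, not recommended values:

```python
# Retry with exponential backoff when the beta rate limit (HTTP 429) is hit.
# The client setup mirrors the earlier examples; replace the placeholder key.
import time

import openai
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.aihosting.mittwald.de/v1",
    api_key="<INSERT-API-KEY>",
)


def chat_with_retry(messages, retries=5):
    """Send a chat completion, backing off on rate-limit errors."""
    for attempt in range(retries):
        try:
            return client.chat.completions.create(
                model="Qwen3-Coder-30B-Instruct",
                messages=messages,
            )
        except openai.RateLimitError:
            # 300 requests per minute per API key during the beta.
            time.sleep(2 ** attempt)
    raise RuntimeError("Rate limit still exceeded after retries")
```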