
Dify

Low-code LLM app platform

Dify is a low-code platform for building LLM applications with visual workflows. Connect it to Qwen Cloud's pay-as-you-go API to create chat assistants, agents, and knowledge bases powered by Qwen models.

Quick start

Get running in a few minutes:
# 1. Install plugin
Go to Dify marketplace → Models → Find "TONGYI" → Install

# 2. Configure (Settings → Model Providers → TONGYI → Settings)
API Key: sk-xxx
Use international endpoint: Yes

# 3. Test (Create blank app → Chat assistant)
Select model: qwen3.5-plus
Type message: "Write a Python hello world program"
You should see: The model responds with Python code

Configuration

Basic setup

Configure Dify to use Qwen Cloud:
  • Plugin: TONGYI (from Dify marketplace)
  • API endpoint: International (set to "Yes")
  • Authentication: API key required
  • Model selection: Choose from available models in plugin
Free quota and billing:
  • First-time users get a free quota (valid for 90 days)
  • Enable the "Free quota only" option to prevent unexpected charges

Step-by-step configuration

1. Install TONGYI plugin

Go to Dify marketplace → Models → TONGYI → Install
2. Configure API key

Click profile → Settings → Model Providers → TONGYI → Settings
  • API Key: Your API key
  • Use international endpoint: Yes
3. Enable models

Click Models on the TONGYI card → Toggle on desired models
For the newest models not yet available in the TONGYI plugin, use the OpenAI-API-compatible plugin with the endpoint https://dashscope-intl.aliyuncs.com/compatible-mode/v1
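The OpenAI-API-compatible route can also be exercised outside Dify, which is a quick way to confirm a model ID and API key before wiring them into a plugin. A minimal stdlib-only sketch (assumes your key is in the DASHSCOPE_API_KEY environment variable; qwen3.5-plus is the model used elsewhere in this guide):

```python
# Sketch: call a Qwen model through the same OpenAI-compatible
# endpoint the plugin wraps. Stdlib only; no SDK required.
import json
import os
import urllib.request

BASE_URL = "https://dashscope-intl.aliyuncs.com/compatible-mode/v1"

# OpenAI-style chat completion payload
payload = {
    "model": "qwen3.5-plus",
    "messages": [
        {"role": "user", "content": "Write a Python hello world program"}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {os.environ.get('DASHSCOPE_API_KEY', 'sk-xxx')}",
        "Content-Type": "application/json",
    },
)

# Uncomment to send (requires a valid key and network access):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this request succeeds from the command line but the plugin fails, the problem is in the Dify configuration rather than the key or endpoint.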

Limitations

  • Plugin maintenance: TONGYI plugin is maintained by Dify, not Qwen Cloud
  • Model availability: Some of the newest models may require the OpenAI-API-compatible plugin

Examples

  • Chat assistant
  • Workflow with LLM node
  • Knowledge base
  • Vision models
1. Create app

Workspace → Create blank app → Chat assistant
2. Configure model

Select qwen3.5-plus → Enable thinking mode if available
3. Test conversation

Type: "Explain how neural networks work"

Troubleshooting

"Invalid API-key provided" error
Solution:
  • Try an earlier version of the TONGYI plugin
  • Use API key from default workspace (not sub-workspace)
  • Verify "Use international endpoint" is set to Yes
Models with -latest suffix not available
Solution: Use OpenAI-API-compatible plugin with:
  • Endpoint: https://dashscope-intl.aliyuncs.com/compatible-mode/v1
  • API key: Your DashScope key
  • Model: Enter model ID manually
High token consumption
Solution:
  • Match the model to the task (smaller, cheaper models for simple tasks)
  • Configure reasonable context windows
  • Clear conversation history regularly
Vision toggle not appearing
Solution: Ensure you've selected a vision-capable model (qwen3.5-plus or qwen3-vl-plus)

Advanced features

Thinking mode

For models that support reasoning:
  1. Select model with thinking support
  2. Enable thinking mode toggle
  3. Set to "True" for step-by-step reasoning
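When calling the compatible-mode endpoint directly instead of through Dify's toggle, thinking is typically switched on with a flag in the request body. A hedged sketch of what that payload might look like (the enable_thinking parameter name follows DashScope's convention, but verify it against the documentation for your specific model):

```python
# Sketch of a direct API payload with reasoning enabled.
# "enable_thinking" is the DashScope-style flag; confirm the exact
# parameter name for your model before relying on it.
payload = {
    "model": "qwen3.5-plus",
    "messages": [
        {"role": "user", "content": "Explain how neural networks work"}
    ],
    "enable_thinking": True,  # request step-by-step reasoning
    "stream": True,           # reasoning models usually stream output
}
```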

Code execution nodes

Extract reasoning from responses:
  • Use regex in code execution nodes
  • Separate thinking process from final answer
  • Format output as needed
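The regex step above can be sketched as a small function, of the kind a code-execution node would run. This assumes the model wraps its reasoning in &lt;think&gt;...&lt;/think&gt; tags (a common convention for Qwen thinking models); adjust the pattern if your model uses a different delimiter:

```python
# Sketch: separate a model's reasoning from its final answer,
# assuming reasoning arrives wrapped in <think>...</think> tags.
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Return (thinking, final_answer) from a raw model response."""
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    thinking = match.group(1).strip() if match else ""
    # Remove the thinking block to leave only the final answer
    answer = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()
    return thinking, answer

raw = "<think>2+2 is basic arithmetic.</think>The answer is 4."
thinking, answer = split_reasoning(raw)
print(answer)  # The answer is 4.
```

Inside a Dify code-execution node, the function body would take the upstream LLM node's output as its input variable and return the two strings as separate output variables for downstream formatting.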