Language Model
- 2026-01-06 02:09:57
- Last edited by WANG JING on 2026-01-06 02:10:30
The Language Model module is used to configure the parameters required for integrating Sanplex with a large language model (LLM). You can create and edit model configurations, and enable or disable them as needed. Key capabilities include:
- Sanplex provides integration with OpenAI models (e.g., GPT-3.5).
- Sanplex supports different integration parameters depending on the selected model.
1. Configure integration parameters for OpenAI (e.g., GPT-3.5)
- Language model: OpenAI (e.g., GPT-3.5).
- Vendor: OpenAI.
- API key: Used for authentication and authorization when calling the OpenAI API.
- Proxy type: Select No proxy or SOCKS5 to connect to the API directly or via a proxy.
- Description: An optional note describing the purpose of this model configuration.
- Status: Select Enabled or Disabled.
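The parameters above can be pictured as a single configuration record. The following is a minimal sketch, assuming illustrative field names and values; it is not Sanplex's actual schema:

```python
from dataclasses import dataclass

# Allowed values are assumptions based on the options listed above.
VALID_PROXY_TYPES = {"no_proxy", "socks5"}
VALID_STATUSES = {"enabled", "disabled"}

@dataclass
class LLMConfig:
    """Illustrative record of the integration parameters described above."""
    model: str                    # e.g. "gpt-3.5-turbo"
    vendor: str                   # e.g. "OpenAI"
    api_key: str                  # secret used to authenticate API calls
    proxy_type: str = "no_proxy"  # connect directly or via SOCKS5
    description: str = ""         # free-text note about this configuration
    status: str = "enabled"       # whether the configuration is active

    def __post_init__(self):
        # Reject values outside the documented options.
        if self.proxy_type not in VALID_PROXY_TYPES:
            raise ValueError(f"unknown proxy type: {self.proxy_type}")
        if self.status not in VALID_STATUSES:
            raise ValueError(f"unknown status: {self.status}")

cfg = LLMConfig(model="gpt-3.5-turbo", vendor="OpenAI", api_key="sk-test")
print(cfg.status)  # enabled
```

Validating the proxy type and status up front surfaces a typo at configuration time rather than at request time.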
How to obtain an API key
- Sign in to the OpenAI Platform with your OpenAI account.
- Open your account/organization settings, then go to API keys.
- Click Create new secret key (or Create API key), name it if required, and generate the key.
- Copy the key and store it securely. Do not share it publicly or embed it in client-side code.
- Paste the key into the API Key field in Sanplex.
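One common way to keep the key out of source code, as the steps above advise, is to read it from an environment variable. A minimal sketch (the variable name `OPENAI_API_KEY` is the conventional one, but confirm what your deployment expects):

```python
import os

def load_openai_key() -> str:
    """Fetch the API key from the environment rather than hard-coding it."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("Set the OPENAI_API_KEY environment variable first.")
    return key
```

Failing loudly when the variable is missing is preferable to sending an empty key and debugging an opaque authentication error later.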
Figure 1
2. Apply the configured language model to Agent Chat
The language model configured in Admin is applied to Agent Chat:
- Click the Sanplex Agent entry on the right side of the bottom navigation bar.
- A chat window will pop up.
- You can ask the agent assistant questions based on Sanplex content, or ask any general questions.
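Behind the chat window, the configured model receives the conversation. An OpenAI-style chat request payload looks roughly like the sketch below; this illustrates the public Chat Completions message format, not Sanplex's internal wiring, and the system prompt is invented for illustration:

```python
def build_chat_request(model: str, user_message: str) -> dict:
    """Assemble an OpenAI-style chat completion payload (illustrative)."""
    return {
        "model": model,  # taken from the enabled Language Model configuration
        "messages": [
            # Hypothetical system prompt grounding the agent in Sanplex content.
            {"role": "system",
             "content": "You are a helpful assistant for Sanplex content."},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_request("gpt-3.5-turbo", "How do I enable a language model?")
```

Each user turn appends another `{"role": "user", ...}` entry, so the model sees the full conversation history on every request.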
Figure 2