Prerequisites
Please install the required packages in advance with the following command, and set your API key in the .env file.
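The exact install command and environment settings are not reproduced here. As a minimal sketch, assuming the chat logic below uses the official OpenAI Python SDK and that the project loads the .env file into the environment, the setup might look like this (the package list and the key name are assumptions):

```bash
# Assumed dependency for the OpenAI-based chat logic in this tutorial;
# add whatever packages your project template already requires.
pip install openai
```

```
# .env (the OpenAI SDK reads OPENAI_API_KEY from the environment)
OPENAI_API_KEY=your-api-key-here
```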
Tutorial
Create a chat app using the <LLM /> component and the LLM API.
Implement the chat app logic in a Python function and use yield to stream the chat log, allowing the <LLM /> component to display the streaming results automatically.
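As a minimal sketch of this idea (the function name is hypothetical, and any registration or decorators the framework may require are omitted), a generator that yields text chunks is all that is needed on the Python side:

```python
# Hypothetical minimal example: per the description above, each yielded string
# is streamed to the chat log as soon as it is produced.
def echo_stream(prompt: str, thread_id: str):
    for word in f"You said: {prompt}".split():
        yield word + " "
```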
The implementation consists of two parts:
- 1. Python
- 2. MDX (pages)
Implement the logic to answer user questions using the OpenAI SDK.
The function specified in the postData attribute of the <LLM /> component takes prompt and thread_id as arguments (a sketch follows the list below).
- prompt: The prompt entered by the user
- thread_id: The ID of the chat thread. A new ID is issued when a new thread is opened.
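As a rough sketch of what this function might look like, assuming the official OpenAI Python SDK and its streaming chat completions API (the function name llm_chat and the model name are assumptions, and any framework-specific decorators or registration steps are not shown):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def llm_chat(prompt: str, thread_id: str):
    """Answer the user's prompt and stream the reply chunk by chunk."""
    # thread_id identifies the chat thread; it could be used to look up and
    # prepend earlier messages from the same thread (omitted in this sketch).
    stream = client.chat.completions.create(
        model="gpt-4o-mini",  # model name is an assumption
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks carry no text content
            yield delta  # streamed to the <LLM /> component as it arrives
```

On the MDX side, the name of this function would then be passed to the postData attribute of the <LLM /> component, for example <LLM postData="llm_chat" /> (llm_chat being the hypothetical name used above).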