Tutorial 1: Getting Started
Install the OpenKBS CLI, create your first agent, and deploy it to the cloud.
1.1 Install the CLI
```
npm install -g openkbs
```

Verify the installation:

```
openkbs --version
```

1.2 Create Your Agent
```
openkbs create telegram-agent
cd telegram-agent
```

This creates the following structure:
```
telegram-agent/
├── app/
│   ├── settings.json        # Agent configuration
│   ├── instructions.txt     # System prompt for the LLM
│   └── icon.png             # Agent icon
├── src/
│   ├── Events/              # Backend handlers
│   │   ├── actions.js       # Command implementations
│   │   ├── handler.js       # Common handler logic
│   │   ├── onRequest.js     # Handles user messages
│   │   ├── onResponse.js    # Handles LLM responses
│   │   └── *.json           # NPM dependencies
│   └── Frontend/            # UI customization
│       ├── contentRender.js
│       └── contentRender.json
├── index.js                 # Local dev server
└── package.json
```
Key Files
| File | Purpose |
|---|---|
| `app/settings.json` | Model, itemTypes, memory settings |
| `app/instructions.txt` | System prompt defining agent behavior |
| `src/Events/actions.js` | Command implementations |
| `src/Events/handler.js` | Processes commands from messages |
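
For orientation, `app/instructions.txt` is plain text that becomes the LLM's system prompt. A minimal hypothetical example (the wording below is illustrative only, not what the CLI generates):

```
You are a helpful assistant for a Telegram channel.
Answer user questions concisely.
When you need to look something up, output a command such as:
<googleSearch>{"query": "..."}</googleSearch>
```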
1.3 Deploy to Cloud
Log in to OpenKBS:

```
openkbs login
```

This opens a browser for authentication. After logging in, deploy:

```
openkbs push
```

The CLI will:
- Register your application
- Upload frontend and backend code
- Deploy everything
- Return your agent URL: `https://{kbId}.apps.openkbs.com/`
Open the URL and chat with your agent!
1.4 Understanding the Flow
```
   User sends message
           ↓
┌─────────────────────────┐
│    onRequest handler    │ ← Execute commands from user message
└───────────┬─────────────┘
            ↓
┌─────────────────────────┐
│     LLM Processing      │ ← System prompt + conversation
└───────────┬─────────────┘
            ↓
┌─────────────────────────┐
│   onResponse handler    │ ← Execute commands from LLM output
└───────────┬─────────────┘
            ↓
    Display to user
```
How it works:
- The user sends a message
- `onRequest` can execute commands from the user message (useful for API integrations)
- The LLM processes the message with the system prompt from `instructions.txt`
- The LLM may output XML commands like `<googleSearch>{"query": "..."}</googleSearch>`
- `onResponse` parses these commands and executes them via `actions.js`
- Results can loop back to the LLM or be displayed to the user
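
As a rough illustration of the parsing step, the sketch below extracts XML-style commands such as `<googleSearch>{"query": "..."}</googleSearch>` from an LLM reply. The function name and command shape are assumptions for this sketch, not the actual OpenKBS internals:

```javascript
// Hypothetical sketch of command extraction; not the real OpenKBS parser.
// Matches <tag>{...json...}</tag> pairs and parses each JSON payload.
const extractCommands = (llmOutput) => {
    const commands = [];
    const re = /<(\w+)>([\s\S]*?)<\/\1>/g;
    let match;
    while ((match = re.exec(llmOutput)) !== null) {
        commands.push({ name: match[1], args: JSON.parse(match[2]) });
    }
    return commands;
};

// Example LLM reply containing one command
const reply = 'Searching now.\n<googleSearch>{"query": "OpenKBS docs"}</googleSearch>';
console.log(extractCommands(reply)); // logs the extracted googleSearch command
```

A real handler would then dispatch each command `name` to a function in `actions.js` and decide whether the result loops back to the LLM or goes straight to the user.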
1.5 Local Development
For faster frontend iteration:
```
npm install
npm start
```

This opens http://localhost:38593 with hot-reload.
Note: Backend changes require `openkbs push` to take effect.
1.6 Extend Frontend
Let's enhance your application with additional libraries and features.
For example, to properly render chat messages with Markdown, you can integrate react-markdown:
Add react-markdown

1. Add `react-markdown` to your dependencies:

   ```
   openkbs contentRender i react-markdown
   ```

2. Edit `./src/Frontend/contentRender.js` to use `react-markdown`:

   ```jsx
   import ReactMarkdown from 'react-markdown';

   const onRenderChatMessage = async (params) => {
       const { content } = params.messages[params.msgIndex];
       return <ReactMarkdown>{content}</ReactMarkdown>;
   };
   ```

3. Ask the agent: "Write a professional testing plan."

   You'll see the Markdown rendered with proper formatting: headers, bullet points, code blocks, etc.
Note: Frontend changes are visible immediately with hot-reload. Backend changes require `openkbs push`.
Summary
- Installed the OpenKBS CLI
- Created an agent with `openkbs create`
- Deployed with `openkbs push`
- Understood the message flow
- Set up local development
- Customized the frontend with `react-markdown`
Next
Tutorial 2: Backend Commands - Create custom commands.