You can use large language models (LLMs) to assist in building Qwoty integrations. This page explains how to access our documentation in LLM-friendly formats and how to connect AI assistants directly to our documentation.

Using documentation with LLMs

llms.txt

The llms.txt file is an industry standard that helps LLMs index content efficiently, similar to how a sitemap helps search engines. AI tools use this file to understand your documentation structure and find relevant content. We automatically host an llms.txt file that lists all available pages: developer.qwoty.io/llms.txt
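To illustrate the structure, here is a minimal sketch of reading page titles and URLs out of an llms.txt file. The sample content below is hypothetical; the real file at developer.qwoty.io/llms.txt may list different pages:

```python
import re

# Hypothetical excerpt of an llms.txt file; fetch the real one from
# developer.qwoty.io/llms.txt to see the actual page list.
SAMPLE = """\
# Qwoty Developer Docs

## API Reference
- [Authentication](https://developer.qwoty.io/docs/auth): How to authenticate requests
- [Webhooks](https://developer.qwoty.io/docs/webhooks): Receiving event notifications
"""

# llms.txt lists pages as markdown links: "- [Title](url): description"
LINK = re.compile(r"-\s*\[([^\]]+)\]\(([^)]+)\)")

def list_pages(text: str) -> list[tuple[str, str]]:
    """Return (title, url) pairs for every page listed in an llms.txt file."""
    return LINK.findall(text)

print(list_pages(SAMPLE))
```

An AI tool can walk this list to decide which pages to fetch for a given task, much as a crawler walks a sitemap.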

skill.md

The skill.md specification is a structured, machine-readable format that describes what AI agents can accomplish with Qwoty. While llms.txt tells agents where to find information, skill.md tells them what capabilities are available, what inputs are required, and what constraints apply. We automatically host a skill.md file at developer.qwoty.io/skill.md. Agents can process this file using the skills CLI:
npx skills add developer.qwoty.io/skill.md
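Skill files commonly open with a `---`-delimited metadata header. The sketch below pulls simple `key: value` pairs out of such a header; the sample content and field names (`name`, `description`) are illustrative assumptions, so consult the hosted file at developer.qwoty.io/skill.md for the actual fields:

```python
# Hypothetical skill.md excerpt; the real file may use different
# fields and structure.
SAMPLE = """\
---
name: qwoty
description: Build and manage Qwoty integrations
---

# Qwoty skill
"""

def read_frontmatter(text: str) -> dict[str, str]:
    """Parse simple `key: value` pairs from a ----delimited header."""
    meta = {}
    lines = text.splitlines()
    if lines and lines[0] == "---":
        for line in lines[1:]:
            if line == "---":
                break  # end of header
            key, _, value = line.partition(":")
            if key.strip():
                meta[key.strip()] = value.strip()
    return meta

print(read_frontmatter(SAMPLE))
```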

Documentation MCP Server

The Model Context Protocol (MCP) is an open standard that allows AI assistants to connect directly to external services. When connected to our documentation MCP server, AI tools can search our docs during response generation, providing more accurate answers than generic web searches. Our documentation MCP server is available at https://developer.qwoty.io/mcp. To connect from Claude:
  1. Navigate to the Connectors page in the Claude settings
  2. Select Add custom connector
  3. Add the server name (e.g., qwoty-docs) and URL: https://developer.qwoty.io/mcp
  4. Select Add
  5. When using Claude, select the attachments button (the + icon)
  6. Select your MCP server
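Under the hood, MCP clients speak JSON-RPC 2.0 with the server. As a rough sketch, this is the shape of the `initialize` request a client sends when it first connects; the `protocolVersion` and `clientInfo` values here are example placeholders, not what Claude actually sends:

```python
import json

# Sketch of an MCP JSON-RPC `initialize` request, the first message a
# client sends to a server such as https://developer.qwoty.io/mcp.
# protocolVersion and clientInfo are illustrative placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialize for the HTTP request body.
body = json.dumps(request)
print(body)
```

After the handshake, the client can list the server's tools and call them, which is how Claude searches our docs mid-conversation.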