Building with LLMs
You can use large language models (LLMs) to assist in integrating Knock into your application. We provide a set of tools to support LLM-driven workflows, such as when you're working in an AI-assisted editor like Cursor, VS Code with Copilot, or Windsurf.
Plain text docs
Every page on our docs site is accessible as a plain text file by appending a .md extension to its URL. For example, this page is accessible as building-with-llms.md. Our plain text pages are useful to feed to an LLM when building your integration.
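As a sketch of how you might build these plain text URLs programmatically (the helper name and the example docs path below are illustrative assumptions, not part of an official Knock SDK):

```typescript
// Hypothetical helper for illustration: derive the plain text (.md)
// variant of a docs page URL, as described above.
function toPlainTextDocUrl(pageUrl: string): string {
  // Strip any trailing slashes, then append the .md extension.
  const trimmed = pageUrl.replace(/\/+$/, "");
  return `${trimmed}.md`;
}

// Example (assumed path): fetch this URL and feed the response
// body into your LLM's context when building your integration.
const url = toPlainTextDocUrl("https://docs.knock.app/developers/building-with-llms");
console.log(url); // https://docs.knock.app/developers/building-with-llms.md
```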
We also host /llms.txt and /llms-full.txt files, which instruct AI tools and agents on how to retrieve the plain text versions of our pages.
Knock Model Context Protocol (MCP) Server
We ship an MCP server that exposes the primitives of Knock to LLMs and AI agents via the Model Context Protocol.
You can use the Knock MCP server to aid in building your Knock integration, and to integrate Knock into any MCP-compatible agent application.
Learn more in our MCP Server docs.
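MCP-compatible clients (such as Cursor or Claude Desktop) are typically configured with a JSON entry per server. The sketch below only illustrates the shape of such a config; the package name and token flag are placeholders, so consult the MCP Server docs for the actual command and authentication details:

```json
{
  "mcpServers": {
    "knock": {
      "command": "npx",
      "args": ["-y", "<knock-mcp-server-package>", "--service-token", "<your-service-token>"]
    }
  }
}
```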
Knock Agent Toolkit SDK
We provide an Agent Toolkit that allows you to integrate Knock into AI agent workflows via function calling. Using the Agent Toolkit gives your AI agents the ability to send cross-channel messages to your customers, as well as to power human-in-the-loop interactions.
Learn more in our Agent Toolkit docs.
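To illustrate the general function calling pattern the Agent Toolkit builds on, here is a minimal, self-contained sketch. The tool name, its parameters, and the dispatcher are hand-rolled for illustration and are not the Agent Toolkit's actual API; see the Agent Toolkit docs for that.

```typescript
// A minimal sketch of function calling: a tool the LLM can select,
// plus a dispatcher that executes the tool call the model produced.
// Names and parameters here are illustrative, not the Agent Toolkit API.
type ToolCall = { name: string; arguments: Record<string, unknown> };

const tools = {
  // Hypothetical tool: ask a messaging backend (e.g. Knock) to notify a user.
  send_notification: (args: Record<string, unknown>): string => {
    const { userId, message } = args as { userId: string; message: string };
    // A real integration would call the messaging API here;
    // this sketch just returns a description of the side effect.
    return `queued notification for ${userId}: ${message}`;
  },
};

// Route a tool call emitted by the LLM to the matching handler.
function dispatch(call: ToolCall): string {
  const handler = tools[call.name as keyof typeof tools];
  if (!handler) throw new Error(`unknown tool: ${call.name}`);
  return handler(call.arguments);
}

// Example: the model decided to notify a user.
const result = dispatch({
  name: "send_notification",
  arguments: { userId: "user_123", message: "Your report is ready" },
});
console.log(result); // queued notification for user_123: Your report is ready
```

In the real toolkit, the tool definitions are provided for you and wired into your agent framework's function calling interface, rather than hand-written as above.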