The first Design Intelligence Layer your AI agent can natively speak.
Give your AI agent live access to your expression infrastructure: tokens, brands, and guardrails, straight from Claude Code or Cursor. No manual lookups. No copy-pasting values.
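Connecting looks like any other MCP server registration. A minimal sketch of an `mcpServers` entry, the shape Claude Code and Cursor both read from their `mcp.json` files; the server name and `npx` package here are illustrative assumptions, not the published package name:

```json
{
  "mcpServers": {
    "designless": {
      "command": "npx",
      "args": ["-y", "@designless/mcp"]
    }
  }
}
```

Once registered, the agent discovers the server's tools automatically and can read tokens and guardrails without any values pasted into the prompt.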
Google Stitch ships an MCP server with five tools: build_site, get_screen_code, get_screen_image, extract_design_context, and project management. Stitch's server connects its canvas to coding agents; the designless MCP server connects expression infrastructure to coding agents. Different sources, different paradigms.
MCP tools are atomic operations. Skills chain them into workflows. A skill tells an agent: read the brand, check accessibility, apply tokens, validate the output. Skills built on skill.design can orchestrate the designless MCP alongside other servers: Figma MCP for canvas context, LESS MCP for design judgment. One skill, multiple intelligence sources.
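The read-check-apply-validate chain above can be sketched as a pipeline over atomic tool calls. This is a minimal illustration with stubbed tools standing in for live MCP calls; the tool names, payload shapes, and token values are assumptions for the sketch, not the real designless API:

```typescript
// A skill chains atomic MCP tool calls into one workflow.
// Stubbed tools below stand in for calls to live MCP servers.
type ToolCall = (args: Record<string, unknown>) => Record<string, unknown>;

const tools: Record<string, ToolCall> = {
  // Hypothetical tool names and payloads, for illustration only.
  read_brand: () => ({ brand: "acme", primary: "#0050ff" }),
  check_accessibility: (args) => ({ contrastOk: true, ...args }),
  apply_tokens: (args) => ({
    css: `--color-primary: ${String(args.primary)};`,
  }),
};

// The skill: an ordered pipeline over those atomic operations.
function runSkill(): string {
  const brand = tools.read_brand({});              // 1. read the brand
  const audit = tools.check_accessibility(brand);  // 2. check accessibility
  const output = tools.apply_tokens(audit);        // 3. apply tokens
  // 4. validate output before handing it back to the agent
  if (!audit.contrastOk || typeof output.css !== "string") {
    throw new Error("skill validation failed");
  }
  return output.css;
}

console.log(runSkill());
```

Each step could target a different MCP server: the brand read against designless, the canvas context against Figma MCP, while the skill itself stays a single, reusable unit of orchestration.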
Expression infrastructure is designed to be consumed by autonomous agents at runtime. Agents read tokens, execute skills, and resolve against the same system the MCP server exposes, making design decisions without human intervention. More to come.
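A runtime design decision can be as small as a guardrail check over two tokens. A minimal sketch, assuming hypothetical token names and the standard WCAG relative-luminance contrast formula as the guardrail:

```typescript
// Hypothetical tokens an agent reads at runtime; names and values are
// illustrative, not drawn from a real token set.
const tokens: Record<string, string> = {
  "color.primary": "#0050ff",
  "color.surface": "#ffffff",
};

// Standard WCAG relative luminance of a #rrggbb hex color.
function luminance(hex: string): number {
  const channel = (i: number) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * channel(1) + 0.7152 * channel(3) + 0.0722 * channel(5);
}

// WCAG contrast ratio between two colors, always >= 1.
function contrast(a: string, b: string): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Guardrail: the agent approves the pairing only if it clears WCAG AA (4.5:1).
const ratio = contrast(tokens["color.primary"], tokens["color.surface"]);
const decision = ratio >= 4.5 ? "approve" : "escalate";
```

The agent resolves `decision` on its own: no human reviews the pairing, because the guardrail travels with the tokens.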