Frequently Asked Questions

Find answers to common questions about Automiel, OpenAPI specs, LLM tool generation, and integration.

Getting Started

Basic questions about Automiel and how to begin

What is Automiel?
Automiel turns your existing API into a tool that LLMs can use reliably. Provide your OpenAPI spec, and Automiel generates the tool definitions, parameter schemas, and MCP server code automatically.
How does Automiel work?
Point Automiel at your OpenAPI spec file or URL. It parses your endpoints, generates LLM-compatible tool definitions, and creates an MCP server that handles the translation between LLM tool calls and HTTP requests.
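As a rough sketch of that translation step, here is what turning an LLM tool call into an HTTP request can look like. This is an illustration under assumptions, not Automiel's actual code: the routing table, tool name, and parameter names are all hypothetical.

```python
import urllib.parse
import urllib.request

def tool_call_to_request(base_url, tool_call, routes):
    """Translate an LLM tool call into an HTTP request (simplified sketch).

    `routes` maps tool names to (method, path_template) pairs derived
    from the OpenAPI spec, e.g. {"get_pet": ("GET", "/pets/{petId}")}.
    """
    method, path = routes[tool_call["name"]]
    args = dict(tool_call["arguments"])
    # Fill path parameters first, then send the rest as a query string.
    for key in list(args):
        placeholder = "{" + key + "}"
        if placeholder in path:
            path = path.replace(placeholder, urllib.parse.quote(str(args.pop(key))))
    url = base_url + path
    if args:
        url += "?" + urllib.parse.urlencode(args)
    return urllib.request.Request(url, method=method)

req = tool_call_to_request(
    "https://api.example.com",
    {"name": "get_pet", "arguments": {"petId": 42, "verbose": "true"}},
    {"get_pet": ("GET", "/pets/{petId}")},
)
# req.full_url -> "https://api.example.com/pets/42?verbose=true"
```

A real generated server also handles request bodies, headers, and error mapping; the point is that the LLM never speaks HTTP directly — the MCP server does the translation.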
Do I need to change my existing API?
No. Automiel reads your OpenAPI spec and generates the integration layer. Your API stays exactly as it is. If your spec accurately describes your API, the tools will work.
How long does it take to set up?
Most developers have their API available to LLMs within minutes. Run the CLI, point it at your spec, and the generated MCP server is ready to deploy.

OpenAPI & Specs

Questions about supported spec formats and requirements

What spec formats are supported?
Automiel works with OpenAPI 3.0 and 3.1 specifications. Swagger 2.0 specs are automatically converted. You can provide a local file path or a URL.
What if my spec is incomplete?
Automiel uses whatever is in your spec. Missing descriptions or examples mean less context for the LLM. The more complete your spec, the better the tool definitions.
Can I exclude certain endpoints?
Yes. You can configure which endpoints to include or exclude. This is useful if some endpoints are internal-only or not suitable for LLM access.
How do I update when my API changes?
Re-run Automiel with your updated spec. The tool definitions regenerate automatically. No manual sync required.

LLM Integration

Questions about LLM compatibility and tool generation

Which LLMs are supported?
Automiel generates standard tool definitions that work with Claude, GPT-4, Gemini, Llama, and any model that supports function calling. Output format can be customized per provider.
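To make the per-provider difference concrete, here is one canonical tool definition rendered for two providers. The outer field names follow the publicly documented OpenAI and Anthropic function-calling formats; the canonical shape and the `list_pets` tool itself are illustrative, not Automiel's internal format.

```python
# One canonical tool definition, rendered for two providers.
canonical = {
    "name": "list_pets",
    "description": "List pets, optionally filtered by status.",
    "parameters": {
        "type": "object",
        "properties": {"status": {"type": "string", "enum": ["available", "sold"]}},
        "required": [],
    },
}

def to_openai(tool):
    # OpenAI-style function calling nests the schema under "function".
    return {"type": "function", "function": {
        "name": tool["name"],
        "description": tool["description"],
        "parameters": tool["parameters"],
    }}

def to_anthropic(tool):
    # Anthropic's Messages API calls the schema field "input_schema".
    return {
        "name": tool["name"],
        "description": tool["description"],
        "input_schema": tool["parameters"],
    }
```

The JSON Schema describing the parameters is the same either way; only the envelope around it changes per provider.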
What is an MCP server?
MCP (Model Context Protocol) is a standard for connecting LLMs to external tools. Automiel generates MCP servers that expose your API endpoints as tools the LLM can call.
How do LLMs know how to use my API?
Automiel extracts descriptions, parameter schemas, and examples from your OpenAPI spec to create tool definitions. The LLM reads these definitions to understand what each endpoint does and how to call it.
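A minimal sketch of that extraction, assuming a single OpenAPI operation (the pet-store paths and names are hypothetical, and a real generator also handles request bodies, `$ref` resolution, and response schemas):

```python
def operation_to_tool(path, method, operation):
    """Turn one OpenAPI operation into a function-calling tool definition."""
    properties, required = {}, []
    for param in operation.get("parameters", []):
        schema = dict(param.get("schema", {}))
        # Carry parameter descriptions through so the LLM sees them.
        if param.get("description"):
            schema["description"] = param["description"]
        properties[param["name"]] = schema
        if param.get("required"):
            required.append(param["name"])
    return {
        "name": operation.get("operationId")
                or f"{method}_{path.strip('/').replace('/', '_')}",
        "description": operation.get("summary", ""),
        "parameters": {"type": "object", "properties": properties,
                       "required": required},
    }

tool = operation_to_tool("/pets/{petId}", "get", {
    "operationId": "getPetById",
    "summary": "Fetch a single pet by its ID.",
    "parameters": [{"name": "petId", "in": "path", "required": True,
                    "schema": {"type": "integer"},
                    "description": "Numeric ID of the pet."}],
})
```

This is why spec completeness matters: `summary` and parameter `description` fields flow straight into what the LLM reads.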
Can I customize the generated tool definitions?
Yes. Override descriptions, add examples, rename tools, or modify parameter handling. The generated code is yours to extend.

Authentication & Security

Questions about API authentication and request handling

How does authentication work?
Automiel supports OAuth and webhook forwarding out of the box. Each request includes headers with identifiers for the user and the LLM making the call, so you always know who triggered what.
Does Automiel store my API credentials?
No. Automiel generates code that runs in your infrastructure. Your API keys, tokens, and secrets stay with you.
How do I control which users can access my API via LLMs?
The generated MCP server includes user and LLM identifiers in each request. Your API can validate these headers and apply your existing authorization rules.
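For example, a simple authorization check on those identity headers might look like this. The header names (`X-User-Id`, `X-LLM-Id`) are placeholders for illustration; use whatever identifiers your generated server actually forwards.

```python
def authorize(headers, allowed_users):
    """Validate identity headers before handling a tool-originated request.

    Returns (allowed, reason). Header names here are hypothetical.
    """
    user = headers.get("X-User-Id")
    llm = headers.get("X-LLM-Id")
    if not user or not llm:
        return False, "missing identity headers"
    if user not in allowed_users:
        return False, f"user {user} not permitted"
    return True, "ok"

ok, reason = authorize({"X-User-Id": "alice", "X-LLM-Id": "claude"}, {"alice"})
# ok -> True
```

Because the check runs in your own API, your existing authorization rules (roles, scopes, rate limits) apply unchanged.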
Is my OpenAPI spec kept private?
Yes. Your spec is processed locally or in your own environment. We do not store or access your API specifications.

Deployment

Questions about running and deploying the generated code

Where does the MCP server run?
Anywhere you want. The generated code is yours to deploy. Run it as a standalone service, embed it in your application, or deploy to serverless platforms.
What dependencies does the generated code have?
The generated MCP server is lightweight with minimal dependencies. It handles HTTP requests, parameter validation, and response formatting.
Can I self-host everything?
Yes. Automiel generates code that runs entirely in your infrastructure. No runtime dependencies on Automiel services.
How do I monitor LLM calls to my API?
The generated server includes observability hooks. You can log every call, response, and error. Integrate with your existing monitoring tools.
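One common hook pattern is a logging wrapper around each tool handler — sketched below with Python's standard `logging` module. The tool name and handler are hypothetical; the actual hook API in your generated server may differ.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mcp-calls")

def with_logging(tool_name, handler):
    """Wrap a tool handler so every call, result, and error is logged."""
    def wrapped(arguments):
        start = time.monotonic()
        log.info("call %s args=%s", tool_name, arguments)
        try:
            result = handler(arguments)
            elapsed_ms = (time.monotonic() - start) * 1000
            log.info("ok %s (%.0f ms)", tool_name, elapsed_ms)
            return result
        except Exception:
            log.exception("error in %s", tool_name)
            raise
    return wrapped

# Hypothetical handler wrapped with the logging hook.
list_pets = with_logging("list_pets", lambda args: [{"id": 1, "name": "Rex"}])
pets = list_pets({"status": "available"})
```

Swapping `logging` for your metrics or tracing client (StatsD, OpenTelemetry, etc.) follows the same wrapper shape.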

Pricing

Questions about costs and plans

Is there a free tier?
Yes. Generate tool definitions for small APIs at no cost. Larger specs and advanced features are available on paid plans.
How is pricing calculated?
Pricing is based on the number of endpoints and generation frequency. The generated code runs in your infrastructure with no per-call fees from Automiel.
Do I pay per LLM call?
No. Once generated, the MCP server is yours. There are no usage fees from Automiel for API calls made through the generated tools.

Still have questions?

Can't find what you're looking for? We're here to help.

Contact us