An open-source LLM Gateway for managing requests across leading providers.

Multiple LLM providers. One API.

Simplify access to multiple large language models with a single, unified API.
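To make the "one API" idea concrete, here is a minimal sketch of what a provider-agnostic request might look like. The endpoint path, payload fields, and model identifiers below are illustrative assumptions, not Apiary's actual API.

```python
# Hypothetical sketch: one request shape for every provider.
# GATEWAY_URL, the payload fields, and the model names are assumptions.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Build the same payload shape regardless of the underlying provider."""
    return {
        "model": model,  # e.g. "openai/gpt-4o" or "anthropic/claude-3-5-sonnet"
        "messages": [{"role": "user", "content": prompt}],
    }

# The caller never changes shape; a gateway like this can read the model
# prefix and route the request to the matching provider.
openai_req = build_request("openai/gpt-4o", "Hello")
anthropic_req = build_request("anthropic/claude-3-5-sonnet", "Hello")
```

Because the payload is identical across providers, swapping models is a one-string change rather than an SDK migration.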

Centralized Management

Manage common concerns like authentication and traffic routing in one location. Ensure availability with LLM fallbacks. Optimize costs and performance with configurable features like semantic caching and guardrails.
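The fallback behavior described above can be sketched as "try providers in order until one answers." This is a simplified illustration under assumed names and interfaces, not Apiary's internal implementation.

```python
# Hypothetical sketch of LLM fallback routing: try each provider in
# priority order; the first successful response wins. Provider names
# and the call interface are illustrative assumptions.
from typing import Callable, Optional

def complete_with_fallback(providers: list[Callable[[str], str]], prompt: str) -> str:
    last_error: Optional[Exception] = None
    for call in providers:
        try:
            return call(prompt)           # first provider that answers wins
        except Exception as err:          # outage, rate limit, timeout, etc.
            last_error = err              # remember why, then try the next one
    raise RuntimeError("all providers failed") from last_error

# Stub providers: the primary is down, the fallback answers.
def primary(prompt: str) -> str:
    raise ConnectionError("primary unavailable")

def fallback(prompt: str) -> str:
    return f"echo: {prompt}"

result = complete_with_fallback([primary, fallback], "ping")
```

Centralizing this loop in the gateway means every client gets the same availability guarantee without duplicating retry logic.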

Built-in Observability

Apiary includes logging and cost tracking for multiple LLM providers through a single developer dashboard.

Our Team

Karen Jayne Poccia

Ping Honzay

Raphael Costa

Shwetank Tewari