In-Depth Look: Kong Introduces AI Gateway to Assist Enterprises in Managing and Expanding Generative AI Capabilities
Kong Inc., a leading developer of cloud API technologies, has announced the general availability of the Kong AI Gateway, an AI-native API gateway that helps businesses manage and secure generative AI workloads across any cloud environment. Since its beta release in February, the gateway has seen broad adoption as companies work to operationalize generative AI.

The Kong AI Gateway provides AI-specific infrastructure features, including support for multiple large language models (LLMs), semantic caching, semantic routing, semantic firewalling, and model lifecycle management. It integrates with Kong's existing API platform, letting companies manage AI and traditional APIs side by side.
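To make the idea of semantic caching concrete, the sketch below is a simplified, illustrative example rather than Kong's implementation: it reuses a stored LLM response when a new prompt is semantically close to one already answered, judged by embedding cosine similarity. The `embed_fn` and `call_llm` functions are hypothetical stand-ins for whatever embedding model and upstream LLM a deployment uses.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


class SemanticCache:
    """Toy semantic cache: returns a stored response when a new prompt's
    embedding is close enough to a previously answered prompt."""

    def __init__(self, embed_fn, threshold=0.9):
        self.embed_fn = embed_fn      # hypothetical embedding function
        self.threshold = threshold    # similarity required for a cache hit
        self.entries = []             # list of (embedding, response) pairs

    def lookup(self, prompt):
        query = self.embed_fn(prompt)
        for embedding, response in self.entries:
            if cosine_similarity(query, embedding) >= self.threshold:
                return response       # semantically similar prompt seen before
        return None

    def store(self, prompt, response):
        self.entries.append((self.embed_fn(prompt), response))


def answer(prompt, cache, call_llm):
    """Serve from the semantic cache when possible, otherwise call the model."""
    cached = cache.lookup(prompt)
    if cached is not None:
        return cached
    response = call_llm(prompt)       # hypothetical upstream LLM call
    cache.store(prompt, response)
    return response
```

The payoff of this pattern is that near-duplicate questions ("What's your refund policy?" vs. "How do refunds work?") can be served from cache, cutting token spend and latency without requiring exact string matches.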

Marco Palladino, CTO and co-founder of Kong, said organizations are building new generative AI use cases to improve user and customer experiences, but scaling new technology in production requires the right infrastructure. He described the Kong AI Gateway as likely the most advanced AI infrastructure technology available, building on Kong's existing gateway capabilities while adding deep AI-specific features.

The Kong AI Gateway acts as a central hub, offering a unified API interface to manage and secure AI technologies across different applications. It provides AI-specific capabilities such as governance, observability, security, and more, helping businesses deploy and scale generative AI projects effectively.

A standout feature of the Kong AI Gateway is its ability to analyze AI traffic and offer a unified API to consume multiple AI providers. Unlike other API management platforms that treat LLM access as just another API, Kong delves deeper, providing prompt security, compliance, governance, templating, and a lifecycle around AI prompts. It also offers advanced observability metrics for insights into provider performance, token usage, and costs.
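As a rough sketch of what consuming multiple providers through a single gateway interface could look like, the snippet below sends the same chat-style payload to two hypothetical gateway routes, each assumed to proxy a different LLM provider. The gateway address, route paths, and request shape are illustrative assumptions, not Kong's documented values.

```python
import json
import urllib.request

GATEWAY = "http://localhost:8000"          # hypothetical gateway proxy address
ROUTES = {
    "openai": "/ai/openai/chat",           # illustrative route paths; actual
    "anthropic": "/ai/anthropic/chat",     # values depend on gateway config
}


def chat(provider: str, prompt: str) -> dict:
    """Send the same chat payload regardless of which provider the chosen
    route proxies to; the gateway handles provider-specific details."""
    payload = json.dumps({
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    request = urllib.request.Request(
        GATEWAY + ROUTES[provider],
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)


# Calling code stays identical; only the route (and thus the provider) changes.
print(chat("openai", "Summarize our refund policy in one sentence."))
print(chat("anthropic", "Summarize our refund policy in one sentence."))
```

The point of the sketch is the calling pattern: application code targets one consistent interface, while provider selection, credentials, and policy enforcement are handled centrally at the gateway.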

Kong’s unified control plane, Kong Konnect, will allow organizations to monetize their fine-tuned AI models alongside traditional APIs. Palladino noted that the next step in API monetization is the monetization of AI models, particularly those fine-tuned with unique corporate intelligence.

Amid the surge of interest in generative AI that followed the success of OpenAI's ChatGPT, many enterprises are struggling with deployment and governance, and Kong aims to simplify both with its AI gateway. Due to high demand, the product was accelerated to general availability, and some enterprise customers have been running it in production since the beta release.

Kong was originally founded as Mashape in 2009 and rebranded in 2017. The company has secured over $170 million in venture funding and supports trillions of API transactions for over 500 organizations globally. With its focus on AI-native infrastructure, Kong is well-positioned to facilitate the next wave of generative AI adoption in the business world.

Palladino highlighted the importance of deploying the right AI infrastructure to scale generative AI effectively, asserting that the Kong AI Gateway meets that need.