Chet Kapoor, CEO of DataStax, recently claimed that Cassandra is the “best database for generative AI.” Speaking at the Linux Foundation’s AI.Dev event in Silicon Valley, Kapoor highlighted the competitive landscape of generative AI, where startups and established companies are vying for leadership. That competition extends to the databases used to store and retrieve data for large language model (LLM) applications.
During his keynote, Kapoor explained why he believes DataStax’s Cassandra database stands out. Known for its reliability, Cassandra is widely used by enterprise companies, and some of its early adopters are already deploying generative AI at scale — an edge, Kapoor argued, over competitors like MongoDB and Pinecone. DataStax is also considering going public, having raised $115 million last year at a $1.6 billion valuation.
Kapoor’s confidence in Cassandra stems from its widespread use and reliability. He noted that while cloud giants like Microsoft and Amazon offer multiple specialized databases, enterprise CIOs now prefer integrated solutions for easier and more efficient data querying. Cassandra, being a popular operational database, is well-suited for this need, unlike the primarily analytical databases from Microsoft and Amazon, which can be costly for operational workloads.
DataStax has focused on optimizing price and performance, making Cassandra a favorite among Fortune 500 companies. Companies like Netflix, FedEx, Apple, and Home Depot rely on Cassandra for various applications, and as they develop new AI apps, they are likely to continue using it. Additionally, Amazon offers the option to use Cassandra within its cloud services, providing flexibility and avoiding vendor lock-in.
Kapoor also highlighted that DataStax has several customers actively deploying generative AI. Companies like Physics Wallah and Skypoint have moved to production with generative AI applications, demonstrating the practical use of DataStax’s Astra DB. While many enterprises are still experimenting, these early adopters show the potential for broader adoption in the near future.
DataStax’s technology also performs well on key benchmarks for LLM applications, according to Kapoor. He said the company’s JVector vector search technology outperforms competitors like Pinecone, offering more relevant results and superior throughput. He also claimed Astra DB is unique in providing zero-latency access to vectorized data, making it highly efficient for AI applications.
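For readers unfamiliar with what a vector database does, the core operation is ranking stored embeddings by similarity to a query vector. The sketch below is a minimal, illustrative cosine-similarity ranker in plain Python — it is not DataStax’s JVector (which uses approximate-nearest-neighbor indexing to make this fast at scale), and the toy embeddings are invented for the example:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_by_similarity(query, documents):
    # Return (doc_id, score) pairs sorted by descending similarity.
    scored = [(doc_id, cosine_similarity(query, vec))
              for doc_id, vec in documents.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy 3-dimensional embeddings; real embedding vectors have
# hundreds or thousands of dimensions.
docs = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.9, 0.0],
    "doc_c": [0.7, 0.7, 0.0],
}
query = [1.0, 0.0, 0.0]
print(rank_by_similarity(query, docs))  # doc_a ranks first
```

Production systems avoid this brute-force scan: they build an index so the top-k neighbors can be found without comparing the query against every stored vector, which is where throughput benchmarks like the ones Kapoor cited come into play.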
Kapoor believes that the adoption of generative AI will accelerate rapidly, building on existing web, mobile, and cloud technologies. He anticipates significant revenue growth from generative AI deployments starting next year, with more transformative use cases emerging. While Cassandra has clear advantages, Kapoor acknowledged that the entire database sector will benefit from the increased demand driven by AI applications.