Weaviate Fast-Tracks AI Applications Into ‘Production Era’
30 July 2024 - 8:01AM
AI-native vector database company Weaviate announced today that it
is releasing a developer “workbench” of tools and apps along with
flexible tiered storage to meet the needs of organizations putting
AI into production.
Inspired by Weaviate’s vibrant open source community, Weaviate’s
new developer offerings accelerate AI application development and
provide end-to-end solutions for some of the most common AI use
cases, helping organizations make the leap from AI prototypes to
production.
They include:
- Recommender app: Provides a fully managed, low-code solution
for rapid development of scalable, personalized recommendation
systems. Recommender offers configurable endpoints for
item-to-item, item-to-user, and user-to-user recommendation
scenarios and supports images, text, audio, and other forms of
multimodal data. Sign up to be part of the private beta.
- Query tool: Enables developers to query data in Weaviate Cloud
using a GraphQL interface. Available now through the Weaviate Cloud
Console.
- Collections tool: Allows users to create and manage collections
in Weaviate Cloud without writing any code. Available now through
the Weaviate Cloud Console.
- Explorer tool: Lets users search and validate object data
through a graphical user interface (GUI). Coming soon to Weaviate
Cloud Console.
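For context on the GraphQL interface the Query tool exposes, a query against Weaviate might look like the sketch below; the "Article" collection and its properties are hypothetical, stand-ins for whatever data a user has loaded:

```graphql
# Hypothetical "Article" collection; class and property names are illustrative.
{
  Get {
    Article(
      nearText: { concepts: ["vector databases"] }
      limit: 3
    ) {
      title
      url
    }
  }
}
```

A query of this shape performs a semantic (vector) search and returns the three closest matching objects with the requested properties.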
To fuel development of new apps, Weaviate has debuted a Labs
division dedicated to testing daring ideas and turning the best
into Weaviate products. Among its first projects, Weaviate Labs is
developing an app to help teams quickly deploy production-ready
Generative Feedback Loops for AI agents and take the next step
beyond RAG.
To meet the needs of diverse AI use cases, Weaviate’s new
storage tiers and tenant offloading capabilities allow users to
optimize for speed, cost, or performance. Low-latency applications
closely tied to revenue, such as e-commerce and recommendation
engines, can continue to be optimized for performance, while
applications with higher latency tolerances, such as chatbots, can
scale cost-efficiently.
“We’ve seen AI applications move into production at scale. Now
the AI-native stack needs to evolve so organizations can build AI
applications faster and deploy them at lower cost. We’re entering
the ‘Production Era,’ where we start to see real impact from AI,”
said Bob van Luijt, CEO and co-founder of Weaviate. “Listening to
our community, it’s clear that to take the next step, developers
need an AI-native framework with flexible storage tiers, modular
GUI tools to interact with their data, and a pipeline of new
concepts to spark their creativity.”
The storage options are:
- Hot - for the highest-performance, real-time read/write data
access.
- Warm - balances accessibility and cost for data that is readily
available but used less frequently.
- Cold - for cost-effective, long-term storage of data with
infrequent use and slower access.
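As a conceptual sketch only (not Weaviate's actual API or policy), the tier trade-off can be pictured as a routing rule that assigns data a tier based on how recently it was accessed; the thresholds and function name here are purely illustrative:

```python
from datetime import datetime, timedelta

# Illustrative thresholds only; real tiering is configured in Weaviate itself.
WARM_AFTER = timedelta(days=7)    # idle for a week -> Warm
COLD_AFTER = timedelta(days=90)   # idle for ~3 months -> Cold

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Map a tenant's last access time to a hypothetical storage tier."""
    idle = now - last_access
    if idle >= COLD_AFTER:
        return "Cold"   # cheapest storage, slowest access
    if idle >= WARM_AFTER:
        return "Warm"   # balanced cost and accessibility
    return "Hot"        # fastest, real-time access

now = datetime(2024, 7, 30)
print(choose_tier(datetime(2024, 7, 29), now))  # Hot
print(choose_tier(datetime(2024, 7, 10), now))  # Warm
print(choose_tier(datetime(2024, 3, 1), now))   # Cold
```

The point of the sketch is the cost/latency gradient: recently used tenants stay Hot for real-time workloads, while rarely touched tenants can be offloaded to cheaper tiers.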
At present, most vector databases that power AI applications
treat all data as ‘Hot,’ offering rapid access but at the highest
price. This means AI applications can be costly to scale from
prototype to production. Other vector databases only offer Warm and
Cold data tiers, which makes them incompatible with real-time use
cases like e-commerce. By offering a trifecta of Hot, Warm, and
Cold tiers, Weaviate enables AI application developers to balance
cost, performance, and speed depending on the workload and use
case.
"Weaviate’s scalable multi-tenant architecture has been crucial
in maintaining fast and reliable AI-driven customer service and
engagement experiences for our thousands of users on Botsonic,"
said Samanyou Garg, CEO of Writesonic. "Their vector search
capabilities enable our users to build highly accurate AI chatbots
trained on their own data. We look forward to leveraging their new
flexible storage tiers to efficiently allocate resources for each
tenant."
Weaviate’s new Query and Collections tools are now available to
Weaviate Cloud users through the Weaviate Cloud Console. The new
storage tiers are available for all users of Weaviate Enterprise
Cloud and Weaviate Database (Open Source).
About Weaviate
Weaviate is an open-source
AI-native vector database that makes it easier for developers to
build and scale AI applications. With powerful hybrid search out of
the box, seamless connection to machine learning models, and a
purpose-built architecture that scales to billions of vectors and
millions of tenants—Weaviate is a foundation for modern, AI-native
software development. Customers and open-source users, including
Instabase, NetApp, and Red Hat, power search and generative AI
applications with Weaviate while maintaining control over their own
data. The company was founded in 2019 and is funded by Battery
Ventures, Cortical Ventures, Index Ventures, ING Ventures, NEA, and
Zetta Venture Partners. For more information, visit
Weaviate.io.
Media Contact
Chris Ulbrich
weaviate@firebrand.marketing
415 848 9175