Anthropic has introduced a new Message Batches API, offering developers a cost-effective way to process large volumes of queries asynchronously. Developers can submit batches of up to 10,000 queries, which are processed within 24 hours at half the cost of standard API calls.
The Batches API is now available in public beta, supporting Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku on the Anthropic API. Amazon Bedrock customers can use batch inference with Claude, while support for Google Cloud's Vertex AI is forthcoming.
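As a rough illustration, submitting a batch through the Python SDK's beta interface looks something like the sketch below. The model string, custom_id scheme, and prompts are placeholders, and exact method names may vary between SDK versions:

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Each request carries a custom_id so its result can be matched back later,
# plus the same params a standard Messages API call would take.
batch = client.beta.messages.batches.create(
    requests=[
        {
            "custom_id": f"doc-{i}",  # illustrative ID scheme
            "params": {
                "model": "claude-3-5-sonnet-20240620",  # example model string
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": f"Summarise document {i}."}],
            },
        }
        for i in range(3)
    ]
)
print(batch.id, batch.processing_status)
```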
The feature is designed for developers who use Claude to process large amounts of data where real-time responses aren't necessary. Batch jobs run with their own higher rate limits, giving better throughput and scalability for bulk workloads without counting against standard API rate limits.
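Because processing is asynchronous, results are fetched after the fact rather than returned inline. Continuing the sketch above, under the same SDK assumptions, a polling loop might look like this:

```python
import time

# Poll until the batch finishes; Anthropic processes batches within 24 hours.
while True:
    status = client.beta.messages.batches.retrieve(batch.id)
    if status.processing_status == "ended":
        break
    time.sleep(60)

# Stream per-request results and match them back via custom_id.
for entry in client.beta.messages.batches.results(batch.id):
    if entry.result.type == "succeeded":
        print(entry.custom_id, entry.result.message.content[0].text)
    else:
        # Individual requests can also end up errored, canceled, or expired.
        print(entry.custom_id, "did not succeed:", entry.result.type)
```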
Pricing for the Batches API is set at a 50% discount for both input and output tokens across all three Claude models. For instance, Claude 3.5 Sonnet's batch input is priced at $1.50 per million tokens, while batch output costs $7.50 per million tokens.
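To make the discount concrete, here is a back-of-the-envelope comparison for a hypothetical Claude 3.5 Sonnet job. The token volumes are illustrative; the per-million-token rates follow from the pricing above (standard $3 input / $15 output, batch half that):

```python
# Hypothetical job: 10M input tokens, 2M output tokens on Claude 3.5 Sonnet.
input_tokens, output_tokens = 10_000_000, 2_000_000

standard = (input_tokens / 1e6) * 3.00 + (output_tokens / 1e6) * 15.00  # $3 / $15 per Mtok
batch = (input_tokens / 1e6) * 1.50 + (output_tokens / 1e6) * 7.50      # 50% discount

print(f"standard: ${standard:.2f}, batch: ${batch:.2f}")
# standard: $60.00, batch: $30.00
```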
Quora, the question-and-answer platform, is already using the Batches API for summarisation and highlight extraction. Andy Edmonds, Product Manager at Quora, praised the API's convenience and cost-effectiveness, stating: "It's very convenient to submit a batch and download the results within 24 hours, instead of having to deal with the complexity of running many parallel live queries to get the same result."
Anthropic's new Batches API aims to unlock new possibilities for large-scale data processing, making tasks like analysing entire corporate document repositories more economically viable.