The new batch API service, now available on Mistral's La Plateforme according to the company's announcement, arrives as the AI developer community has faced several API price hikes in recent weeks.
The service enables users to process high-volume requests to Mistral models at half the cost of synchronous API calls. Users can upload a batch file and download the output file once requests have been processed.
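In practice, the workflow is: write each request as a line in a JSONL file, upload it, create a batch job, and retrieve the results file when the job completes. The following is a minimal sketch using the mistralai Python client; the method names (files.upload, batch.jobs.create, batch.jobs.get, files.download), the JSONL request format, and the job status values are based on Mistral's published client documentation and may differ in current releases, so treat this as illustrative rather than definitive.

```python
import json
import time

from mistralai import Mistral  # official Python client, assumed installed via `pip install mistralai`

client = Mistral(api_key="YOUR_API_KEY")  # placeholder key

# Build a JSONL batch file: one request per line, each with a custom_id and a
# request body. The model itself is chosen when the batch job is created.
prompts = ["Summarise this customer review: ...", "Translate this paragraph to French: ..."]
with open("batch_input.jsonl", "w") as f:
    for i, text in enumerate(prompts):
        line = {
            "custom_id": str(i),
            "body": {"messages": [{"role": "user", "content": text}], "max_tokens": 256},
        }
        f.write(json.dumps(line) + "\n")

# Upload the batch file, then create an asynchronous batch job against it.
batch_file = client.files.upload(
    file={"file_name": "batch_input.jsonl", "content": open("batch_input.jsonl", "rb")},
    purpose="batch",
)
job = client.batch.jobs.create(
    input_files=[batch_file.id],
    model="mistral-small-latest",
    endpoint="/v1/chat/completions",
)

# Poll until the job leaves the queued/running states (status strings assumed),
# then download the output file containing one result per input line.
while job.status in ("QUEUED", "RUNNING"):
    time.sleep(30)
    job = client.batch.jobs.get(job_id=job.id)

output_stream = client.files.download(file_id=job.output_file)
with open("batch_output.jsonl", "wb") as f:
    f.write(output_stream.read())
```

Each line of the downloaded file can then be matched back to its input via the custom_id field, which is what makes bulk jobs such as sentiment analysis or translation straightforward to reconcile.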
According to Mistral AI, the batch API is particularly suited for applications including customer feedback and sentiment analysis, document summarisation and translation in bulk, vector embedding to prepare search indexes, and data labelling.
The company confirmed that the batch API is available for all models served on La Plateforme and will soon be offered through its cloud provider partners. Usage is limited to 1 million ongoing requests per workspace.