
The Batch API has received a significant upgrade. It now supports embedding and vision models, making it well suited to tasks like offline indexing and bulk image processing. The requests-per-batch limit has been raised from 10,000 to 50,000, allowing far more data to be handled in a single batch, and batches can now be created and managed directly from the dashboard. A new guide helps users get started with these features. Batch embeddings are notably cheap: just $0.01 per 1 million tokens for the small model, with a 4 billion-token batch rate limit.
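As a sketch of how a batch embeddings job might be assembled: the Batch API takes a JSONL input file with one request per line, each targeting an endpoint such as `/v1/embeddings`. The texts, custom IDs, and model name below are illustrative, not from the announcement.

```python
import json

# Documents to embed offline; contents are placeholder examples.
texts = ["first document to embed", "second document to embed"]

# Each line of a Batch API input file is one self-contained JSON request.
requests = [
    {
        "custom_id": f"doc-{i}",          # caller-chosen ID, echoed back in the results
        "method": "POST",
        "url": "/v1/embeddings",          # batch requests can now target embedding models
        "body": {"model": "text-embedding-3-small", "input": text},
    }
    for i, text in enumerate(texts)
]

# Serialize as JSONL (one request per line), ready to upload as the batch input file.
jsonl = "\n".join(json.dumps(r) for r in requests)
print(jsonl.splitlines()[0])
```

With up to 50,000 requests per batch, a single input file like this can cover a sizable corpus in one job.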

Another day, another ship! 🚢 The Batch API now supports a top-requested feature: batch embeddings! We’ve also added support for vision, increased rate limits, and made it easy to create batches directly from the dashboard. Don’t miss our new guide: https://t.co/bB2yOaarrN https://t.co/mTRbYLX1VR
The Batch API now supports our embedding models and our vision models – great for offline indexing or bulk image processing jobs. https://t.co/kmu9DB9AWg
Don't sleep on Batch Embeddings... it was one of our longest-running feature requests! Just $0.01 / 1M tokens for the small model and a 4B batch token rate limit! Give it a whirl https://t.co/VbPUvkDQSi https://t.co/Ie2veAvTbj
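At the quoted price, batch-embedding cost is simple arithmetic; a quick sketch (the corpus size is an illustrative assumption, the price and 4B-token limit are from the announcement):

```python
# Small-model batch embeddings: $0.01 per 1M tokens (per the announcement).
PRICE_PER_MILLION_TOKENS = 0.01

def batch_embedding_cost(tokens: int) -> float:
    """Cost in dollars to embed `tokens` tokens with the small model."""
    return tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS

# Embedding a hypothetical 500M-token corpus:
print(f"${batch_embedding_cost(500_000_000):.2f}")    # → $5.00
# Filling the full 4B-token batch rate limit:
print(f"${batch_embedding_cost(4_000_000_000):.2f}")  # → $40.00
```

In other words, even a batch at the full 4B-token rate limit comes to about $40 at the small-model price.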