Announcing the Batch API - receive results for your requests within 24 hours at 50% of the cost and with higher rate limits https://t.co/Bf52xjkbGh
Batch API — 50% off for async requests to our models: https://t.co/TYp0rJ6jAy
Developers managing large async AI tasks, this is for you! Our new Batch API is here to reduce costs and increase your rate limits: https://t.co/xOCpAXhsX4
OpenAI has launched a new Batch API designed to help developers handle large asynchronous AI workloads more efficiently and cost-effectively. The API offers a 50% discount on all tokens, substantially higher rate limits (up to 250 million enqueued tokens for GPT-4T), and the ability to upload requests in bulk files, with results returned within 24 hours. Beyond cutting costs, it aims to streamline tasks such as summarization, translation, and image classification. OpenAI also plans to make the API available on Azure soon.
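As an illustration of the workflow described above, the sketch below uses the official `openai` Python SDK: requests are written to a JSONL file, the file is uploaded, a batch is enqueued with a 24-hour completion window, and the output file is polled and parsed. The model name, file names, prompts, and polling interval are placeholders, not part of the announcement.

```python
import json
import time

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Write requests as JSON Lines: one chat-completion request per line,
#    each with a unique custom_id so results can be matched back later.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4-turbo",  # placeholder model name
            "messages": [{"role": "user", "content": f"Summarize document {i}."}],
        },
    }
    for i in range(3)
]
with open("batch_input.jsonl", "w") as f:
    for r in requests:
        f.write(json.dumps(r) + "\n")

# 2. Upload the file and enqueue the batch with a 24-hour completion window.
batch_file = client.files.create(file=open("batch_input.jsonl", "rb"), purpose="batch")
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)

# 3. Poll until the batch reaches a terminal state, then download the results.
while True:
    batch = client.batches.retrieve(batch.id)
    if batch.status in ("completed", "failed", "expired", "cancelled"):
        break
    time.sleep(60)

if batch.status == "completed":
    output = client.files.content(batch.output_file_id)
    for line in output.text.splitlines():
        result = json.loads(line)
        reply = result["response"]["body"]["choices"][0]["message"]["content"]
        print(result["custom_id"], reply)
```

Because results arrive asynchronously within the 24-hour window, the `custom_id` field is what ties each line of the output file back to the original request.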