Google has released Gemma-2-JPN, a fine-tuned version of its Gemma 2 2B AI language model optimized for Japanese text. The model handles Japanese queries at the same level of quality that Gemma 2 delivers for English-only queries, without degrading its English performance. It can run locally in browsers using WebGPU and Transformers.js, is optimized for mobile, and is available for demo in a Gradio space shared by Hugging Face. Alongside the release, Google announced a $150,000 Kaggle competition to build Gemma models for additional languages and published training materials to help developers adapt Gemma to their own languages. The model's capabilities have been demonstrated in applications including code generation and translation.
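To illustrate the in-browser path, here is a minimal Transformers.js sketch for running the model as a text-generation pipeline on WebGPU. The ONNX checkpoint id `onnx-community/gemma-2-2b-jpn-it` and the `q4` quantization setting are assumptions for illustration, not details confirmed by the release.

```typescript
// Minimal sketch: one Japanese chat turn with Gemma 2 JPN in the browser via Transformers.js.
// Assumption: the converted checkpoint id and the "q4" dtype below are illustrative;
// swap in whichever ONNX export of gemma-2-2b-jpn-it you actually use.
import { pipeline } from "@huggingface/transformers";

// Load the text-generation pipeline on the WebGPU backend.
const generator = await pipeline(
  "text-generation",
  "onnx-community/gemma-2-2b-jpn-it",
  { device: "webgpu", dtype: "q4" },
);

// Chat-style prompt in Japanese ("Explain machine learning in one sentence.");
// the pipeline applies the model's chat template automatically.
const messages = [
  { role: "user", content: "機械学習を一文で説明してください。" },
];

// generated_text contains the full conversation; the last message is the model's reply.
const [result] = (await generator(messages, { max_new_tokens: 128 })) as Array<{
  generated_text: { role: string; content: string }[];
}>;
console.log(result.generated_text.at(-1)?.content);
```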
Yesterday, I had the great privilege to present our newest Gemma model in Tokyo: a 2B post-trained model specifically designed for Japanese, without degradation of English performance. https://t.co/wUKxatPF5Y
Check out a demo of the new Gemma 2 2B Japanese language model in the @Gradio space shared by @huggingface https://t.co/09qZvqEvhB
Gemma 2 2B: Advanced AI language model, optimized for mobile, and fluent in both Japanese and English https://t.co/yblmrjyXnF