Cohere Launches Tiny Aya for Offline AI
17 Feb
Summary
- Tiny Aya models support over 70 languages for offline device use.
- Models are open-weight and can be used and modified by anyone.
- Regional variants cater to specific language groups, such as South Asian languages.

Enterprise AI company Cohere, through its research arm Cohere Labs, has launched Tiny Aya, a new suite of open-weight multilingual models. The models support more than 70 languages and are designed to run on devices such as laptops without a constant internet connection. The launch was announced at the India AI Summit.
The Tiny Aya family includes specialized regional variants such as TinyAya-Fire for South Asian languages like Hindi and Tamil, and TinyAya-Earth for African languages. The base model features 3.35 billion parameters. Cohere emphasized that these models were trained using comparatively modest computing resources and are ideal for developers building applications for native language speakers.
These offline-capable AI models are particularly useful in linguistically diverse regions such as India, enabling a wider range of applications that do not depend on internet access. The models are available on platforms including HuggingFace and Kaggle, so developers can download them and run them locally. Cohere is also sharing training datasets to foster further research.
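For developers, local deployment of an open-weight model typically means downloading the weights once and then running inference entirely on the device. The sketch below shows one common approach using the Hugging Face transformers library; the model identifier is a hypothetical placeholder, as the article does not give the exact repository name.

```python
# Minimal sketch of downloading and running an open-weight model locally.
# The model ID is a placeholder -- check Cohere Labs' Hugging Face page
# for the actual Tiny Aya repository name.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "CohereLabs/tiny-aya-base"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# After the first download, the weights are cached and inference
# runs entirely on the local machine, with no network connection needed.
prompt = "नमस्ते, आप कैसे हैं?"  # "Hello, how are you?" in Hindi
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```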




