
At DevSparks 2024, Ola Krutrim reveals plans to supercharge developer ecosystem


Ola Krutrim blazed a new trail for Indian AI at YourStory’s first-ever developer conference, DevSparks 2024. Held on May 4, 2024, in Bangalore, the event featured a series of planned activities, presentations, and interactive sessions. These included a product showcase: a lineup of product and service launches designed to bring Ola Krutrim one step closer to its goal of building full-stack AI capabilities. 

The one-hour session showcased the deep inroads that Krutrim has made in AI, and how it aims to empower a new wave of innovations in the space. The session was split into smaller presentations, where experts from Ola Krutrim launched new products built on Krutrim’s AI stack. 

Innovation for India, from India and for the world

Ravi Jain, VP, Ola Krutrim, opened the session with an encouraging thought: India boasts a deep talent pool and a wide developer community, and is flush with capital. Each of these factors, he believes, will help “power our developer ecosystem and push India to the forefront of AI innovation”. 

So what is stymieing India’s progress? Currently, all development, design, and deployment happen on global providers and platforms, which are expensive and exclude Indian use cases and India-specific solutions. 

However, the time is right for change. The advent of technologies like AI, robotics, and cloud computing makes it possible for Indian developers to start up, and build for India and the world. 

Jain summed up Ola Krutrim’s vision, stating that the company is looking to “supercharge the developer-led innovation ecosystem in India”. “Every developer should be able to create a new business solution easily and at a price point which is affordable, and in line with what the Indian end user or enterprise will pay. Our belief is that it can be done in a very fresh way.”  

The soul of AI: multimodal models from Ola Krutrim

Gautam Bhargava, VP, Ola Krutrim, spoke at length about foundational models, the India-first AI Cloud, and platform services. As Ola Krutrim’s focus is to build for India, the first order of business was to understand and create Indian Large Language Models (LLMs) from the ground up. Currently, Ola Krutrim LLMs support 11 languages, which will increase to 22 in the future. Eventually, Ola Krutrim wants to target the hundreds of different dialects used in India. The models will also navigate the complex cultural, religious, and political sensibilities in the country. 

What does this mean for developers in India? The company offers APIs (Application Programming Interfaces) and SDKs (Software Development Kits) that turn those ambitions into action. As has been widely reported, Krutrim’s AI model was trained on over 2 trillion tokens, well beyond the compute-optimal budget suggested by the Chinchilla scaling laws for LLMs. 
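The Chinchilla scaling laws (Hoffmann et al., 2022) suggest a compute-optimal budget of roughly 20 training tokens per model parameter, so training "beyond Chinchilla" means going well past that point. A quick back-of-envelope sketch makes the claim concrete; the 7-billion-parameter figure below is purely illustrative, as the article does not state Krutrim's model size:

```python
# Back-of-envelope Chinchilla check. The ~20 tokens-per-parameter rule of
# thumb comes from Hoffmann et al. (2022). The 7B parameter count is an
# assumption for illustration only; the article does not give Krutrim's size.

CHINCHILLA_TOKENS_PER_PARAM = 20

def chinchilla_optimal_tokens(n_params: float) -> float:
    """Compute-optimal number of training tokens for a model of n_params parameters."""
    return CHINCHILLA_TOKENS_PER_PARAM * n_params

params = 7e9                                   # hypothetical 7B-parameter model
optimal = chinchilla_optimal_tokens(params)    # 1.4e11, i.e. 140B tokens
trained = 2e12                                 # "over 2 trillion tokens", per the article

print(f"Chinchilla-optimal budget: {optimal / 1e9:.0f}B tokens")
print(f"Trained on roughly {trained / optimal:.0f}x that budget")
```

Under that illustrative assumption, 2 trillion tokens is about 14 times the Chinchilla-optimal budget, which is what "overtraining" a model of that size would mean in practice.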

Bhargava wrapped up his session with big news on Ola Krutrim’s latest developments. The company is in the process of building speech, image, and multimodal models that are India-specific to serve the Indian AI developer. 

The Android app for Krutrim was launched on May 4, 2024. Next in the pipeline? The iOS app for Krutrim and the Krutrim Python SDK.
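Until the official Python SDK ships, a raw HTTP call is the likely integration path for developers. Below is a minimal sketch of building such a request; note that the endpoint URL, model name, and OpenAI-style payload shape are all assumptions for illustration, not documented Krutrim API details:

```python
import json
import urllib.request

# Hypothetical sketch of a Krutrim-style chat-completion request.
# API_URL and the model name are ASSUMPTIONS modelled on the common
# OpenAI-compatible convention, not the documented Krutrim API.
API_URL = "https://cloud.olakrutrim.com/v1/chat/completions"  # assumed endpoint

def build_chat_request(api_key: str, prompt: str,
                       model: str = "Krutrim-spectre-v2") -> urllib.request.Request:
    """Build (but do not send) an HTTP request for a chat completion."""
    payload = {
        "model": model,  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("YOUR_API_KEY", "Namaste! Summarise today's news in Hindi.")
print(req.get_method())  # POST
```

Sending the request would then be a matter of `urllib.request.urlopen(req)` with a valid key; the sketch stops short of the network call so the shape of the integration is visible on its own.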

Optimising the AI cloud at Ola Krutrim

Vipul Shah, VP of Product, Ola Krutrim, and Raghuraman Bharathalwar, Head of Systems and Software Engineering, Ola Krutrim, discussed Ola Krutrim’s AI-first Cloud. Shah said AI is at the core of everything at Ola Krutrim. The AI-first approach has allowed the cloud to improve efficiency, cut costs, reduce resource wastage, and anticipate the developer’s needs. This has led Krutrim to rethink every layer of the cloud and optimise every service and layer with AI. 

Shah said the AI cloud will be affordable, offer simple user experiences, and will interact through natural language and speech. It will be rooted in the Indian ethos, which will be built into multiple services offered on the cloud. 

All this has been made possible through a vertically integrated full stack, which features purpose-built AI silicon and optimised, sustainable data centres. The middle layer comprises services that developers will leverage. At the very top is the family of models that Krutrim is building, including Krutrim’s own and open-source LLMs.  

In the context of power-consumption of GPUs and the heat generated by them, Shah introduced two completely indigenous methods of cooling: liquid immersion and direct-to-chip cooling. Both will help Krutrim build highly efficient data centres in India.   

Bharathalwar delivered an exciting announcement: various services from Ola Krutrim would be going live on May 4, 2024, including GPU as a Service, Model as a Service, and location services. He also announced that Ola Krutrim would be partnering with 1,000 developers who have the ambition to drive change in India. These developers will receive Krutrim cloud credits worth Rs 10,000 to kick-start their journey of innovation. 

Location services now on the map

Location services APIs were one of the big reveals at the Ola Krutrim product showcase. Prasad Kavuri, SD of Ola Maps, said location service APIs and easily integrated SDKs will empower developers to create innovative mapping solutions. 

The company considered several factors when building efficient location services, including wide coverage, fresh data, real-time traffic data, dynamic rerouting, accurate ETAs, and reliable sources. 
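As a trivial illustration of the kind of geometric primitive that routing and ETA services build on, here is a haversine great-circle distance calculation. This is not Ola Maps code; real ETAs also account for the road network, traffic, and vehicle type:

```python
import math

# Illustrative only: great-circle (haversine) distance between two
# coordinates. A routing service layers road networks and live traffic
# on top of primitives like this; straight-line distance alone is a
# lower bound, not an ETA.

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Bengaluru city centre to Kempegowda International Airport (approx. coordinates)
dist = haversine_km(12.9716, 77.5946, 13.1986, 77.7066)
print(f"{dist:.1f} km as the crow flies")
```

The straight-line answer (roughly 28 km for this pair) is far shorter than the actual driving distance, which is exactly the gap that fresh map data and real-time traffic are meant to close.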

Kavuri stressed the specific challenges that Indian roads and layouts present, and how the platform will accommodate the complexities of traffic, including two-wheeler and three-wheeler access. He ended his presentation by announcing that Maps services were live on Krutrim Cloud.   


From silicon chiplets to supercomputers

The final presentation was made by Sambit Sahu, SVP Silicon Design, Ola Krutrim, and Raghuraman Bharathalwar, Head of Systems and Software Engineering, Ola Krutrim, on building indigenous silicon tech. 

“We are building some cutting-edge AI server products using the most advanced technologies, and we will do it at a cost structure palatable to India,” Sahu said. Ola Krutrim’s approach allows it to optimise every layer of the stack, including the silicon design. 

Sahu added that cost, power, and time to market were crucial factors that Ola Krutrim kept an eye on. He also spoke about building AI chiplets: smaller, modular chips that can be packaged together to form larger, more powerful processors. 





