Google to provide data localisation option to AI innovators in India



With Google’s Gemini 1.5 Flash large language model (LLM), Indian organisations across all sectors, including the public sector, will now have the option to both store their data at rest and carry out machine learning processing entirely within India. The move comes amid growing demand from Indian enterprises for local data processing.
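For readers who want a sense of what this looks like in practice, the sketch below (not part of Google’s announcement) shows how a developer would typically pin Gemini 1.5 Flash requests to Google Cloud’s Mumbai region, asia-south1, using the Vertex AI Python SDK. The project ID and prompt are placeholders, and availability of the model in that region should be confirmed against Google’s own documentation.

```python
# Illustrative only: pinning Gemini 1.5 Flash calls to the Mumbai region.
# Requires: pip install google-cloud-aiplatform
import vertexai
from vertexai.generative_models import GenerativeModel

# "your-gcp-project" is a placeholder; asia-south1 is Google Cloud's Mumbai region.
vertexai.init(project="your-gcp-project", location="asia-south1")

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "Summarise the benefits of in-country data processing in one sentence."
)
print(response.text)
```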


This option will be particularly helpful for enterprises in the public sector and financial services, given the sensitive nature of the data they handle, said Bikram Bedi, vice president and country managing director, Google Cloud India.



At a media roundtable on Thursday, Bedi said that local data storage and processing for Gemini 1.5 Flash will also benefit startups in the country by lowering the cost of using the models and reducing latency in some sectors.



“Google is the only vendor out there providing the entire AI stack end-to-end. Every other vendor is either buying GPUs, marking them up and selling them, or outsourcing other AI technology, marking it up and selling it. The reason I point this out is that price performance becomes a very important factor when you are starting to go into production and scale. Hence, Google’s ability to provide the right performance for the price,” said Bedi.


He added that regulatory requirements and demand from industry were key reasons Google initiated local processing.

First Published: Oct 03 2024 | 9:50 PM IST