How Apple used Google’s help to train its AI models

On Monday, Apple CEO Tim Cook used a presentation to announce a partnership with OpenAI that will build the startup's artificial intelligence model into Siri, Apple's voice assistant.

However, a detailed technical document Apple published after the event shows that Alphabet's Google also played a key role in Apple's AI effort. To build its foundational AI models, Apple's engineers used the company's own framework software along with a range of hardware, including Apple's own on-premise GPUs and Google's cloud-based tensor processing units (TPUs).

Google has been building TPUs for about a decade and has publicly detailed two variants of its fifth-generation chips that can be used for AI training; according to Google, the performance version of the fifth generation is competitive with Nvidia's H100 AI chips. Google also announced at its annual developer conference that a sixth-generation TPU will launch this year.

The processors are designed specifically for running AI applications and training AI models, and Google has built a cloud computing hardware and software platform around them.
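Apple's document does not include sample code, but the hardware-agnostic workflow it describes can be sketched. The minimal JAX example below is illustrative only, not Apple's actual training code (JAX is one widely used framework for targeting TPUs, and Apple's published AXLearn framework is built on it). The point it shows: the same training step is compiled through XLA for whatever accelerator is attached, which is how a single codebase can train on on-premise GPUs or on cloud TPUs.

# A minimal, illustrative JAX sketch -- not Apple's actual training code.
# JAX compiles the same program via XLA for whatever backend is present,
# so the training step is identical on TPUs, GPUs, or CPUs.
import jax
import jax.numpy as jnp

print("backend:", jax.default_backend())  # e.g. "tpu", "gpu", or "cpu"
print("devices:", jax.devices())

def loss_fn(params, x, y):
    # Toy linear model: predictions = x @ w + b
    preds = x @ params["w"] + params["b"]
    return jnp.mean((preds - y) ** 2)

@jax.jit  # XLA compiles this step for the local accelerator
def train_step(params, x, y, lr=0.01):
    grads = jax.grad(loss_fn)(params, x, y)
    # One step of plain gradient descent on every parameter
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = {"w": jax.random.normal(key, (8, 1)), "b": jnp.zeros((1,))}
x = jax.random.normal(key, (32, 8))
y = jnp.ones((32, 1))

for _ in range(5):
    params = train_step(params, x, y)
print("loss:", loss_fn(params, x, y))

On a Cloud TPU virtual machine the backend reports "tpu"; on an Nvidia box it reports "gpu". Nothing in the training step itself changes, which is what lets a lab mix its own GPUs with rented TPUs.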

Neither Apple nor Google immediately responded to requests for comment regarding this collaboration.

Apple did not say how heavily it relied on Google's chips and software compared with hardware from Nvidia or other AI vendors. Using Google's chips, however, typically requires buying access through Google's cloud division, much as customers buy computing power from Amazon's AWS or Microsoft's Azure.
