Apple Leverages Google TPUs for AI Advancements
A recent research paper from Apple has unveiled a strategic shift in the company’s AI infrastructure. The tech giant has opted to utilise Google Tensor Processing Units (TPUs) for training its AI models, a departure from the industry norm dominated by Nvidia GPUs.
According to the paper, Apple trained its on-device model on a cluster of 2,048 TPUv5p chips and its larger server model on 8,192 TPUv4 processors, marking a significant investment in Google's cloud platform. The decision is particularly noteworthy given Nvidia's dominant position in the AI hardware market.
The integration of TPUs into Google's cloud ecosystem offers distinct advantages for AI development, including specialised tooling and a streamlined workflow. Apple's engineers have highlighted the efficiency of TPUs in handling the computational demands of large-scale model training.
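To make that workflow concrete, the sketch below is a minimal, hypothetical example (not taken from Apple's paper) of how a training step might look in JAX, one of the frameworks commonly used with Cloud TPUs. The model, data, and learning rate are placeholders, and the same code falls back to CPU or GPU when no TPU is attached.

```python
# Illustrative sketch only: a tiny JAX training step compiled via XLA,
# the same compiler stack that targets TPUs. Not Apple's actual code.
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Hypothetical linear model: predictions = x @ w + b
    preds = x @ params["w"] + params["b"]
    return jnp.mean((preds - y) ** 2)

@jax.jit  # JIT-compiled; runs on whatever accelerators JAX detects
def train_step(params, x, y, lr=0.01):
    grads = jax.grad(loss_fn)(params, x, y)
    # Simple gradient-descent update applied leaf-by-leaf
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

devices = jax.devices()  # lists TPU cores when run on a Cloud TPU VM
print(f"Running on {len(devices)} device(s): {devices[0].platform}")

key = jax.random.PRNGKey(0)
params = {"w": jax.random.normal(key, (8, 1)), "b": jnp.zeros((1,))}
x = jax.random.normal(key, (32, 8))   # placeholder training inputs
y = jax.random.normal(key, (32, 1))   # placeholder training targets

for step in range(100):
    params = train_step(params, x, y)
print("final loss:", loss_fn(params, x, y))
```

Real large-scale training would shard data and model state across thousands of TPU cores rather than a single device, but the pattern of a compiled, differentiable training step is the same.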
While this strategic partnership with Google is a notable development, Apple has outlined plans for substantial investments in its AI server infrastructure over the next two years. This initiative aims to enhance the company’s AI capabilities and reduce its reliance on external hardware providers.
The research paper also underscores Apple's commitment to ethical AI practices: the company emphasises that its models are trained on publicly available, licensed, and open-source datasets, protecting user privacy.
Apple’s decision to adopt Google Cloud for its AI endeavours represents a strategic pivot with potential implications for the broader tech industry. As the AI landscape continues to evolve, this move will undoubtedly be closely monitored by competitors and industry observers alike.