Google claims AI supercomputer ‘faster than Nvidia’

Supercomputers Google uses to train its artificial intelligence models are “faster and more power-efficient” than similar tech developed by Nvidia, the company has claimed.

A recent scientific paper authored by Google engineers David Patterson and Norm Jouppi said its fourth-generation Tensor Processing Unit (TPU) – a custom AI accelerator chip that Google designed to run its TensorFlow machine learning workloads – is up to 1.7 times faster and 1.9 times more power-efficient than Nvidia’s comparable A100 chip.
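
For context, TensorFlow programs are typically pointed at TPU hardware through a distribution strategy. The snippet below is a minimal illustrative sketch of that pattern, assuming a Cloud TPU VM environment; it is not taken from the paper.

```python
# Minimal sketch (assumption: a Cloud TPU VM environment) of targeting TPUs
# from TensorFlow via tf.distribute.TPUStrategy.
import tensorflow as tf

# Locate and initialise the TPU system; "local" works on Cloud TPU VMs,
# elsewhere a TPU name or gRPC address would be supplied instead.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="local")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model across all available TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Any Keras model built inside the scope is compiled to run on the TPU.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```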

"Circuit switching makes it easy to route around failed components," the authors said. "This flexibility even allows us to change the topology of the supercomputer interconnect to accelerate the performance of a machine learning model."

The paper also reported that Google’s fourth-generation TPU outperforms its predecessor, TPU v3, by 2.1 times and improves performance per watt by 2.7 times.

Google recently launched its own chatbot named Bard, which is intended to rival OpenAI’s ChatGPT.

As competition ramps up in the AI space, a number of practitioners have called for a pause on training AI systems.

More than 1,000 people recently signed a letter issued by the Elon Musk-funded Future of Life Institute, which warned of the risk of economic and political disruption posed by “human-competitive” AI systems.

Apple co-founder Steve Wozniak and Emma Bluemke of the Centre for the Governance of AI, who holds a PhD in engineering from the University of Oxford, were among the letter’s signatories.
