Google I/O: Custom TPU chip amplifies machine learning performance

Google on Wednesday revealed that for the past year it has been powering its data centers with a custom-built Tensor Processing Unit (TPU), a chip designed for machine learning and tailored for TensorFlow. Google started working on the chip because “for machine learning, the scale at which we need to do computing is incredible,” CEO Sundar Pichai said in his keynote address at the Google I/O conference. The result, the company announced in a blog post, is that TPUs “deliver an order of magnitude better-optimized performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).” Because the chip is tailored for machine learning, it is more tolerant of reduced computational precision and requires fewer transistors per operation. “We’re innovating…
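The reduced-precision point is the heart of the efficiency claim: multiplying low-bit integers takes far less silicon and energy than multiplying 32-bit floats, so a chip built around low-precision arithmetic can pack more operations into each watt. The sketch below is an illustration under assumptions, not Google’s actual TPU arithmetic: it quantizes float32 weights and inputs to 8-bit integers with a simple symmetric scheme (the quantize helper and its parameters are made up for this example), performs the multiply-accumulate in integers, and rescales the result to show how little accuracy a neural-network-style dot product loses.

```python
import numpy as np

# Illustrative sketch only: a simple symmetric 8-bit quantization scheme,
# not Google's actual TPU design. The quantize() helper and its parameters
# are assumptions made for this example.

def quantize(x, num_bits=8):
    """Map a float array onto signed integers of the given bit width,
    returning the integer values and the scale needed to recover them."""
    qmax = 2 ** (num_bits - 1) - 1            # 127 for 8-bit signed
    scale = np.max(np.abs(x)) / qmax
    scale = scale if scale > 0 else 1.0       # avoid division by zero
    return np.round(x / scale).astype(np.int32), scale

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 8)).astype(np.float32)
inputs = rng.standard_normal(8).astype(np.float32)

# Full-precision reference result
reference = weights @ inputs

# Reduced-precision path: integer multiply-accumulate, one rescale at the end
q_w, s_w = quantize(weights)
q_x, s_x = quantize(inputs)
approx = (q_w @ q_x) * (s_w * s_x)

print("max absolute error:", np.max(np.abs(reference - approx)))
```

In hardware, the saving comes from where the work is done: an 8-bit integer multiplier needs far fewer gates than a 32-bit floating-point unit, which is the “fewer transistors per operation” the article refers to.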

