Google Constructs Novel Chip To Make Machine Learning Faster

It all started several years ago as a stealthy project at Google to gauge what could be accomplished with custom, in-house accelerators for machine learning applications. The result is a custom chip dubbed the Tensor Processing Unit, or TPU, an ASIC named after TensorFlow, the software Google uses for its machine learning programs. An ASIC, or application-specific integrated circuit, is tailored here to deep neural networks: networks of hardware and software that learn specific tasks by analyzing vast amounts of data. This technology has been imperative for the revamp of the search engine. The accelerator chip speeds up a specific task, providing better performance per watt than existing chips for machine learning workloads. Owing to this, more operations per second can be squeezed into the silicon,…
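For readers unfamiliar with the software side, here is a minimal sketch of the kind of deep neural network TensorFlow is used to define and train; the layer sizes, dataset variables, and training call are illustrative placeholders, not part of the article:

```python
import tensorflow as tf

# A minimal sketch: a tiny dense neural network built with TensorFlow's Keras API.
# Layer sizes and the (commented-out) training data are hypothetical examples.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),  # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),                    # output layer
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Training would analyze large amounts of labeled data, e.g.:
# model.fit(train_images, train_labels, epochs=5)  # hypothetical dataset
```

Accelerators like the TPU are aimed at running exactly these kinds of dense matrix operations more efficiently than general-purpose chips.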
