Google is on the verge of making a significant mark in technology markets with a one-of-a-kind artificial-intelligence chip designed by its own engineers for its own supercomputers. The company is also about to establish itself as a powerhouse in AI-focused hardware.
Google CEO Sundar Pichai announced at the company's annual developer conference that it has designed a computer processor capable of the kind of work that will have a major impact on the technology industry in the coming years.
One can't help but notice how artificial intelligence is transforming Google itself. The announcement was a clear sign that the company plans to make major strides in both hardware and software.
It is important to note that the new processor not only works at lightning speed but is also unmatched in its efficiency at learning and training. The hardware is known as the Cloud Tensor Processing Unit (Cloud TPU), a name derived from the company's open-source TensorFlow machine-learning framework.
A unique and efficient piece of hardware
The new processor is one of a kind: it is meant both to train and to run deep neural networks (machine-learning systems driving the rapid evolution of everything from speech and image recognition to robotics and automated translation). According to Google, the chip will not be sold directly to others. Instead, interested developers and businesses can run software over the internet that provides access to thousands of these processors packed into Google's data centers.
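To make the train-versus-run distinction concrete, here is a minimal sketch in plain Python (purely illustrative, not Google's API or TensorFlow code): a toy one-parameter model is first trained by gradient descent, the compute-heavy workload the Cloud TPU newly accelerates, and then used for inference, the workload the first-generation TPU already handled.

```python
# Illustrative sketch of the two workloads: training (adjusting a model's
# weights from example data) and inference (running the trained model on
# new inputs). Toy data follows the "true" relationship y = 3 * x.
data = [(x, 3.0 * x) for x in range(1, 6)]

# --- Training: repeatedly nudge the weight to reduce squared error. ---
w = 0.0      # model parameter, starts untrained
lr = 0.01    # learning rate
for _ in range(1000):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x   # derivative of (w*x - y)**2 w.r.t. w
        w -= lr * grad

# --- Inference: apply the trained model to an unseen input. ---
print(round(w, 2))       # learned weight, close to 3.0
print(round(w * 10, 1))  # prediction for x = 10, close to 30.0
```

Training dominates the cost here (thousands of weight updates versus a single multiply for inference), which is why a chip that speeds up training is the bigger prize.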
The past decade has seen the most influential company on the internet build new data-center hardware, from computer servers to networking gear, that has fueled the swift growth of its online empire. The cloud service and the new chips are good examples of innovations meant to drive the company's long-term evolution. Most of Google's revenue still comes from advertising, but it is worth noting that cloud computing has recently been a major revenue boost and has shown potential to become an important part of Google's future.
The new chip, referred to as the Cloud TPU or TPU 2.0, is an improved version of a custom-built processor that has powered Google's own AI services for more than two years, supporting functions such as translation and image recognition. This new version not only runs already-trained neural networks but also provides an environment for training them. The chip is accessible through a dedicated cloud service.
Currently, developers and businesses train their neural networks using large farms of GPUs (chips originally designed to render graphics for games and other software), a market in which the Silicon Valley chipmaker Nvidia has proved a dominant force. Google, however, has become a competitor to reckon with by going a step further and creating a chip designed specifically to train neural networks. The Cloud TPU trains neural networks much faster than the other processors on the market. According to Jeff Dean, who oversees Google Brain, the company's central AI lab, the new processor can cut neural-network training time from a day to a few hours. Google will clearly produce more of these chips, as they play a major role in the success of its cloud business, which has proved a big source of revenue for the company.