Social networking giant Facebook is working on a semiconductor chip intended for training deep learning algorithms. According to Facebook's chief artificial intelligence (AI) researcher, the company is developing next-generation semiconductors designed to work differently from today's chips.
According to Yann LeCun, these future chips will be used for training machine learning and deep learning models, and will work differently: they will not need to break a training job into multiple batches. Currently, most machine learning jobs split a huge dataset into batches and then run the training job over those batches. A chip that could train on the data without splitting it into batch sequences would give a significant boost to future machine learning and deep learning workloads.
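To make the batching concept concrete, here is a minimal sketch of the conventional mini-batch approach the article describes: the dataset is shuffled, split into fixed-size batches, and the model is updated once per batch. This is an illustrative plain-Python example of standard mini-batch gradient descent, not Facebook's or anyone's actual training pipeline; all names and values are hypothetical.

```python
import random

# Illustrative only: fit y = w * x with mini-batch gradient descent.
random.seed(0)
data = [(x, 2.0 * x) for x in [i / 100 for i in range(200)]]  # toy dataset, true w = 2.0

w = 0.0          # model parameter to learn
batch_size = 32  # the dataset is processed in chunks of this size
lr = 0.05        # learning rate

for epoch in range(50):
    random.shuffle(data)                     # shuffle, then split into batches
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        # mean gradient of the squared error over this batch only
        grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
        w -= lr * grad                       # one parameter update per batch

print(round(w, 3))  # converges toward 2.0
```

The key point is the inner loop: each update sees only one batch of the data, which is what the chips described above would aim to avoid.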
Earlier, Intel and Facebook announced their venture into future AI chips, saying they were working on a new class of chip for artificial intelligence applications. According to a statement released by Intel in January, the company is working on an AI chip that will be ready in mid-2019.
Facebook's recent announcements about its future chip further suggest that a new class of semiconductors for machine learning will be available in the near future.
Deep learning technologies process huge amounts of data and use those data to train complex models. Model training is a compute-intensive task, and currently most deep learning models are trained on GPU clusters. Training also becomes very expensive when the amount of training data is huge. So there is a good market for enhanced chips tailored to deep learning and artificial intelligence.
These days, several startups are also trying to create chips that can handle such huge volumes of data for efficient training.