Meta announces AI training and inference chip project

Meta Platforms (META.O) on Thursday revealed new details of its data centre projects to support artificial intelligence work, including a proprietary chip “family” that it is developing in-house.

In a series of blog posts, the owner of Facebook and Instagram said it had developed a first-generation chip in 2020 as part of the Meta Training and Inference Accelerator (MTIA) programme. The goal was to make recommendation models, which are used to serve adverts and other content in news feeds, more efficient.

[Image: Meta. Source: moneycontrol.com]

Reuters previously reported that the company was not planning a full deployment of its first in-house AI chip and was already working on a successor. In the blog posts, Meta described that first MTIA chip as a learning opportunity.


The posts say the early MTIA chip handled only inference, the AI process in which models that have already been trained on massive amounts of data make judgements about what content should be shown.
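For readers unfamiliar with the term, the sketch below is a minimal, purely illustrative Python example (using PyTorch, which Meta maintains) of what inference for a recommendation-style model looks like: a model whose parameters are already trained is asked to score candidate items, with no further learning taking place. The model, sizes and IDs are invented for this illustration and do not reflect MTIA’s actual workloads.

import torch

# Toy recommendation model for illustration only: scores candidate items
# for a user via a dot product of learned embeddings. Not Meta's model.
class TinyRecModel(torch.nn.Module):
    def __init__(self, num_users: int, num_items: int, dim: int = 16):
        super().__init__()
        self.user_emb = torch.nn.Embedding(num_users, dim)
        self.item_emb = torch.nn.Embedding(num_items, dim)

    def forward(self, user_id: torch.Tensor, item_ids: torch.Tensor) -> torch.Tensor:
        u = self.user_emb(user_id)      # (1, dim) user vector
        v = self.item_emb(item_ids)     # (num_candidates, dim) item vectors
        return v @ u.squeeze(0)         # (num_candidates,) relevance scores

model = TinyRecModel(num_users=1000, num_items=5000)
model.eval()                            # inference mode: parameters are not updated

with torch.no_grad():                   # no gradients are computed at inference time
    user = torch.tensor([42])
    candidates = torch.tensor([7, 99, 1234, 4000])
    scores = model(user, candidates)
    ranked = candidates[scores.argsort(descending=True)]
    print(ranked.tolist())              # candidate items ordered by predicted relevance

Inference chips like MTIA are built to run this kind of forward pass cheaply and at enormous volume; training the embeddings in the first place is the separate, heavier workload that Meta’s planned future chip may also handle.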

Joel Coburn, a software engineer at Meta, said during a presentation on the new chip that the company had initially used GPUs for inference work, but found they were not a good fit.

“Their efficiency is low for real models, despite significant software optimizations. This makes them challenging and expensive to deploy in practice,” Coburn said. “This is why we need MTIA.”

Source: reuters.com

A Meta spokesperson declined to give a release timeline for the forthcoming chip or to elaborate on the company’s plans to develop chips that could also train models.

Meta has been working on a major effort to upgrade its AI infrastructure since executives realised it lacked the hardware and software needed to handle demand from product teams building AI-powered features.

As a result, the company scrapped plans for a wide-scale rollout of an in-house inference chip and began developing a more ambitious chip capable of both training and inference, according to Reuters.

Although Meta’s original MTIA chip struggled with high-complexity artificial intelligence (AI) models, it handled low- and medium-complexity models more efficiently than rival chips, according to Meta’s blog posts.

The MTIA chip also uses an open-source chip architecture known as RISC-V and consumes only 25 watts of power, far less than leading chips on the market from suppliers such as Nvidia Corporation, according to Meta.
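To see why such a low power budget matters at data-centre scale, here is a rough, hypothetical sketch in Python. Only the 25-watt figure comes from Meta’s posts; the throughput numbers and the 300-watt GPU figure are invented placeholders, not measurements of MTIA or any Nvidia part. The point is simply that inferences served per joule, rather than raw speed, tends to dominate cost across a large fleet.

# All numbers below are invented placeholders, NOT Meta's or Nvidia's figures.
def perf_per_watt(inferences_per_sec: float, power_watts: float) -> float:
    """Efficiency metric: inferences served per joule of energy consumed."""
    return inferences_per_sec / power_watts

# Hypothetical comparison: a low-power accelerator vs. a high-power GPU.
accelerator = perf_per_watt(inferences_per_sec=10_000, power_watts=25)
gpu = perf_per_watt(inferences_per_sec=40_000, power_watts=300)

print(f"accelerator: {accelerator:.0f} inferences per joule")
print(f"gpu:         {gpu:.0f} inferences per joule")
# Even if the GPU is faster in absolute terms, the lower-power chip can come
# out ahead on efficiency, which is what multiplies across thousands of servers.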


Meta also said it would start construction this year on its first facility built to a new design, and gave further details of plans to reorganise its data centres around more advanced AI-focused networking and cooling technologies.

In a video describing the changes, an employee said the new design would be 31 per cent cheaper and could be built twice as fast as the company’s current data centres.

Meta also said it has built an artificial intelligence-powered system to help its developers write code, comparable to tools offered by Alphabet Inc., Amazon.com Inc. and Microsoft Corp.
