In-House Processing Brings Cost-Effective Cloud Coverage

Author:
Ally Winning, European Editor, PSD

Date:
05/09/2018


Earlier this year, I wrote a blog on Google and its tensor processing units. The company had just launched the second generation of its powerful ICs that provide the computing power for its vast datacentres around the world. The processors are designed specifically for the complex AI computational tasks found in the cloud and can be trained to perform particular workloads. Having its own bespoke processing elements gives Google a competitive advantage over other companies that have to use off-the-shelf processors from suppliers such as the current market leader, Nvidia.

Now it seems other AI and cloud companies with heavy processing needs are looking to take a similar route to gain share efficiently in this nascent market. Facebook had initially been working with Intel on a new chip, but the social media giant has now begun advertising for ASIC designers, presumably to develop an AI processor in-house for the company’s own datacentres. Chinese corporation Alibaba, currently better known for e-commerce than for cloud services, followed by hiring former Qualcomm silicon designer Liang Han as an AI chip architect and then placing recruitment adverts for a Silicon Valley-based team to work with him. The company confirmed that it was working on a neural network processor, the Ali-NPU, which it hoped would be almost 40 times more cost-effective than off-the-shelf components.

Alibaba made further moves toward its new processor by buying fellow Chinese company C-Sky Microsystems this month. C-Sky is a developer of proprietary 32-bit CPU cores and claims to be the only high-volume CPU provider in China with its own instruction set. As well as designing its own processors, the company is a platinum member of the RISC-V Foundation, which could allow it to expand easily into 64-bit cores or even complete neural network devices. Alibaba is currently quite a small player in the global cloud market, but there is no doubt that it sees the cloud as a huge potential growth area. Having its own processor would allow it to scale up its operations much more cost-effectively than buying from chip manufacturers, and would also give it a simpler supply chain to manage. With China’s protected internal marketplace, Alibaba could scale up fairly quickly, as Huawei and ZTE have done in the mobile telecoms market.

Other giant corporations, including Apple, Microsoft and Tesla, are also working on AI processing. Although their processors are not initially intended for datacentres, the experience of designing a low-power processor for a car, phone or headset would be an ideal foundation from which to expand into datacentre processing. At the start of the computing age, companies built a chip and looked for a market. Now, in the information age, it seems companies have built the market and are looking for a chip.

PSD

www.powersystemsdesign.com
