Pingtouge and Huawei's AI chip layouts: the cloud-terminal integration you need to understand

On September 25, Alibaba's Pingtouge launched the Hanguang 800 AI chip, whose leading performance stunned the industry. Just a month earlier, in August, Huawei had released its own AI chip, the Ascend 910. Notably, both companies chose cloud AI chips. Why do the giants value this field, and with their entry, what impact will cloud chips have on terminal AI chips?

Internet giants scramble to deploy AI cloud chips

On September 25, at the Yunqi Conference in Hangzhou, the Hanguang 800, the first self-developed AI chip from Alibaba's Pingtouge, was officially released. As Alibaba's first taped-out AI chip since Pingtouge was founded, it is another demonstration of strength following the XuanTie 910 processor IP core and the Wujian SoC platform.

In the industry-standard ResNet-50 test, the Hanguang 800's inference performance reached 78,563 IPS (roughly 78,000 images processed per second), about 4 times the best AI chip performance in the industry; its energy efficiency is 500 IPS/W (500 images processed per watt of power), 3.3 times that of the second-place chip.
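To make the two metrics concrete, here is a minimal sketch of how inference throughput (IPS) and energy efficiency (IPS/W) relate in a timed benchmark run; the power figure below is only what the article's own numbers imply, not a published specification, and none of this reflects Alibaba's actual test harness.

```python
# Illustrative only: how the two quoted benchmark metrics relate.
# The 157 W power draw is merely what the quoted IPS and IPS/W figures imply,
# not a published specification of the Hanguang 800.

def throughput_ips(images_processed: int, elapsed_seconds: float) -> float:
    """Inference throughput: images processed per second (IPS)."""
    return images_processed / elapsed_seconds

def efficiency_ips_per_watt(ips: float, average_power_watts: float) -> float:
    """Energy efficiency: images processed per second per watt."""
    return ips / average_power_watts

ips = throughput_ips(78_563, 1.0)          # ≈ 78,563 IPS
eff = efficiency_ips_per_watt(ips, 157.0)  # ≈ 500 IPS/W
print(f"{ips:,.0f} IPS, {eff:.0f} IPS/W")
```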

Huawei: The fastest cloud training chip

After the release of the Hanguang 800, comparisons were inevitably drawn with the Ascend 910, the AI processor Huawei released at its Huawei Connect event on August 23.

Both Huawei's Ascend 910 and the Hanguang 800 were billed as the most powerful AI processors when they were released, but in fact they serve different purposes. AI chips can be divided into training chips and inference chips: the Ascend 910 is the fastest training chip, while the Hanguang 800 is the strongest inference chip. Training means "learning" from large amounts of labeled data on a platform to form a neural network model with a specific function; inference means feeding new data into the trained model and computing conclusions from it. Both, however, are cloud chips.
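The training/inference split can be illustrated with a short, generic PyTorch sketch; the toy model and data below are placeholders and have nothing to do with the actual software stacks of the Ascend 910 or Hanguang 800.

```python
import torch
import torch.nn as nn

# A toy classifier standing in for a real neural network.
model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))

# --- Training: what a training chip like the Ascend 910 is built to accelerate ---
# Labeled data drives a loss, and gradients update the model's weights ("learning").
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
inputs, labels = torch.randn(8, 64), torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()   # propagate error backward through the network
optimizer.step()  # adjust the weights

# --- Inference: what an inference chip like the Hanguang 800 is built to accelerate ---
# The trained, frozen model is simply run forward on new data to reach a conclusion.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 64)).argmax(dim=1)
```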

AI chip application scenarios are usually divided into cloud and terminal: the cloud mainly refers to large-scale data centers and servers, while terminals cover mobile phones, vehicles, security cameras, robots and many other scenarios.

In addition to Alibaba and Huawei, domestic players such as Baidu and Tencent have also continued to build out their chip presence. Baidu's layout focuses on applications such as image, speech, and driverless driving, while Tencent has filled out its position in the AI chip industry chain by investing in companies such as Bitmain, Diffbot, iCarbonX, CloudMedx, Skymind, and ScaledInference.

The AI chip race: why pile into such a "crowded" track?

The primary reason is that when the giants move into cloud AI chips, they are thinking chiefly about integrating with and optimizing their own businesses. As data volumes grow, whether in Baidu's layout across search, voice interaction and driverless driving, Tencent's across social networking, new retail and intelligent hardware, or Alibaba's across cloud computing and the Internet of Things, general-purpose AI chips have struggled to meet each company's own algorithm and hardware design requirements.

Compared with traditional chip manufacturers, the Internet giants share one trait in the chip field: their chips are not sold separately as commodity parts but are bundled with their own products and services.

The Hanguang 800 will deliver AI computing power through Alibaba Cloud, and in the future enterprises will be able to obtain its compute via Alibaba Cloud. It is aimed mainly at AI inference workloads in business scenarios such as image and video recognition and cloud computing, improving computing efficiency, reducing costs, and powering data centers.

Huawei's Ascend 910, likewise, fills the gap in its cloud AI chip lineup and serves Huawei's own servers and cloud business.

Of course, the giants' layouts are not limited to cloud AI chips; terminal chips are part of them too. Another important reason so many manufacturers have entered this track is to keep their chips independent and under their own control.

To fight a tough battle on the AI track, the chip is a key link. Especially against the backdrop of the China-US trade war, core chips need to be independent and controllable; after all, no one wants to be the next ZTE.

In 2019, the AI chip battle is in full swing

In fact, with the help of capital, an "incremental" market for AI chips is rapidly taking shape. Beyond veteran players such as Intel and Nvidia, AI chip startups, Internet giant Google, electric-vehicle maker Tesla, and social media pioneer Facebook have all begun to get involved in AI chips.

Startup AI chip companies, Internet giants, and traditional chip manufacturers now compete on the same stage, and the AI chip industry has become a contest among many players.

According to a research report by the market research company ReportLinker, the AI chip market is expected to reach US$10.8 billion by 2023, a compound annual growth rate of 53.6%.
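As a sanity check on what that growth rate implies, the standard compound-growth formula can be applied; the 2018 base year below is an assumption made purely for illustration, since the report's starting point is not quoted here.

```python
# Compound annual growth: future_value = present_value * (1 + cagr) ** years.
# Assuming (for illustration only) that the 53.6% CAGR runs from 2018 to 2023.
cagr = 0.536
market_2023 = 10.8  # billions of USD, per the quoted report

implied_2018_base = market_2023 / (1 + cagr) ** 5
print(f"Implied 2018 market size: ~${implied_2018_base:.1f}B")  # ≈ $1.3B
```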

The market is large and the competitors are many. By the end of 2018, the number of domestic chip design companies had grown to nearly 1,700. Although the growth rate, dampened by the broader economy, fell short of the previous two years, there is still no shortage of players eager for a slice of the cake.

Over the past two years, AI chip makers have released new products one after another, and the division of labor across each link of the chain has gradually become clear. AI chip application scenarios are no longer confined to the cloud: products deployed in terminals such as smartphones, security cameras, and autonomous vehicles are increasingly abundant.

Among the entrants are domestic companies that originally provided solutions in security, speech and semantics, face recognition, and cloud computing and are now pursuing self-developed AI chips, as well as AI algorithm companies that have turned to chip research within their own fields.

Unlike the traditional chip market, AI chips are compute-oriented parts with high efficiency and relatively simple functions, and they involve fewer IP licensing issues. They consume fewer resources to apply and have a lower entry threshold than general-purpose chips, so joining this still-unsettled AI chip market has become commonplace.

In fact, driven by the real-world deployment of AI chips and the push from the giants, the structure and direction of this market are gradually becoming clear.

The AI chip landscape is taking shape, and cloud-terminal integration has become a consensus

According to CCID Consulting, cloud AI chips accounted for 76.9% of China's AI chip sales in 2018, but this does not mean cloud chips will replace terminal chips. On the contrary, terminal chips are becoming increasingly important: to truly unleash AI's empowering potential, cooperation between cloud and terminal has become the consensus.

“Using AI chips in the cloud will break through the cost constraints on AI products such as the ‘city brain’, ‘industrial brain’ and ‘agricultural brain’, and accelerate large-scale adoption,” the chip technology team at DAMO Academy has said. Just as Alibaba's Taobao set out to make it easy to do business anywhere, the stated goal of the cloud AI chip Hanguang 800 points directly at making it easy to do AI anywhere.

Cambricon CEO Chen Tianshi has said: “The amount of data on a terminal is limited, but the cloud can see data from a huge number of users, and intelligent processing on the cloud side can bring together information from many terminals.” Cloud-side intelligent processing has an irreplaceable advantage where data is concerned: it can use massive data to train very powerful models.

The advantages of cloud chips lie in powerful compute and strong learning and inference capabilities, but in actual deployment their biggest dilemma is that responses are not timely enough. In areas such as autonomous driving and security, inference and response must be immediate, so terminal chips are needed as a complement to make up for the cloud's latency.

In the early land-grab phase of AI chips, cloud chips did hold considerable advantages in acquiring massive data and lowering customer costs. In recent years the industry landscape has begun to take shape, with several major forces emerging: startup chip companies such as Cambricon and Horizon Robotics, artificial intelligence companies such as Megvii and Yitu, and giants such as Huawei and Alibaba.

As competition for the AI chip market gradually stabilizes, the next stage of real-world deployment is steadily uncovering demand for terminal AI chips.

At the 2019 World Artificial Intelligence Conference in August this year, CCID Consulting's “White Paper on the Development of China's Artificial Intelligence Chip Industry” noted that an important trend in AI chip development will be the move from the cloud alone toward cloud-terminal integration.

With the rise of edge computing, “cloud-terminal integrated” solutions are gradually becoming mainstream: intelligent algorithms are pushed to the front end, freeing up computing resources at the center, combining edge computing with cloud computing from terminal to cloud, speeding up processing and enabling flexible applications.

In edge computing scenarios, AI chips mainly handle inference: data collected by sensors on the terminal device (microphone arrays, cameras and so on) is fed into the trained model to produce results, as sketched below. Because edge-side scenarios vary widely, so do the hardware considerations: the chip may be an IP block inside an SoC or sit in an edge server, and requirements for compute and power consumption span a wide range.
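A rough sketch of that edge-side division of labor: a pre-trained model runs locally on captured sensor data, and only the compact result travels upstream. The model file and the capture/upload helpers are hypothetical placeholders, not any vendor's actual API.

```python
# Hypothetical edge-inference loop: sensor data in, local inference, compact result out.
import numpy as np
import onnxruntime as ort  # one common runtime for deploying already-trained models

session = ort.InferenceSession("detector.onnx")  # model assumed trained in the cloud
input_name = session.get_inputs()[0].name

def capture_frame() -> np.ndarray:
    """Placeholder for reading one frame from a camera or microphone array."""
    return np.random.rand(1, 3, 224, 224).astype(np.float32)

def upload_result(result: np.ndarray) -> None:
    """Placeholder for sending only the small inference result to the cloud."""
    pass

for _ in range(1):                                       # one iteration, for illustration
    frame = capture_frame()                              # raw data stays on the device
    result = session.run(None, {input_name: frame})[0]   # low-latency local inference
    upload_result(result)                                # only results travel to the cloud
```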

Therefore, unlike the “high-end, general-purpose” chips of cloud scenarios, compute chips for the edge need to be designed around specific scenarios to reach an optimal solution and to solve the speed and cost problems of the GPU computing era.

Terminal chips are used mainly in four scenarios: intelligent security, autonomous driving, mobile Internet and the Internet of Things. Before AI chips became the mainstream solution, GPUs were the industry's main choice, but GPUs are not designed specifically for AI computing, run slower than dedicated AI chips, and cost more. Take smart security as an example: mainstream solutions are mostly built on NVIDIA's Jetson TX1 GPU platform, whose largest customer is the domestic security giant Hikvision. A single chip is reportedly estimated to cost about 70 to 150 US dollars, and a module 200 to 300 US dollars.

Cloud-terminal integrated solutions will also address the original cost problem. Because the marginal cost of a chip falls sharply at scale, once dedicated chips are mass-produced, the extra per-chip and storage cost added by an AI module is expected to be under 2 US dollars, and the deployment cost of AI cameras built on ASIC solutions will drop substantially.
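The cost argument above is essentially one of amortizing a fixed design cost over volume. A back-of-the-envelope sketch, using entirely assumed figures rather than any published data, looks like this:

```python
# Back-of-the-envelope amortization of a dedicated AI chip's cost.
# All figures are assumptions for illustration, not published numbers.

def per_unit_cost(nre_cost: float, volume: int, marginal_cost: float) -> float:
    """Fixed design/tooling (NRE) cost spread over volume, plus per-die cost."""
    return nre_cost / volume + marginal_cost

# At low volume the fixed cost dominates the unit price...
print(per_unit_cost(nre_cost=20_000_000, volume=100_000, marginal_cost=1.5))     # ~201.5
# ...while at mass-production volume the unit cost approaches the marginal cost.
print(per_unit_cost(nre_cost=20_000_000, volume=50_000_000, marginal_cost=1.5))  # ~1.9
```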

Many companies in China have already sensed the “cloud-terminal integration” trend and laid out in depth, including giants such as Alibaba and Huawei as well as “industry veterans” such as Cambricon.

Cambricon was one of the earliest Chinese companies to enter terminal chips. In 2016 it released the Cambricon 1A, the world's first commercial dedicated deep learning processor IP, and in November last year it released three new intelligent processor IP products, aimed respectively at vision applications in low-power scenarios, high-versatility and high-performance scenarios, and terminal artificial intelligence products.

In May last year, Cambricon released its first cloud AI chip, the MLU100, making it a company with both terminal and cloud intelligent processor products. Judging from Cambricon's leap from terminal to cloud, such a layout may be the main development direction for the AI chip field.

As the development trend becomes clear, the global competitiveness of China's AI chip industry also keeps improving.

Competing for the AI chip high ground: the key to China's counterattack

The AI chip market has long been dominated by Nvidia, Intel, Google and others, who are now facing challenges from Habana Labs, Qualcomm and more. The recent product releases from Cambricon, Huawei and Alibaba mark the entry of domestic companies onto this high ground.

In the 2018 list of the top 24 global AI chip companies released last year by the market research and consulting firm Compass Intelligence, Chinese companies took 7 seats: Huawei (HiSilicon) ranked 12th, MediaTek 14th, Imagination 15th, Rockchip 20th, VeriSilicon 21st, Cambricon 23rd, and Horizon 24th.

By contrast, South Korea and Japan, long at the top tier of technology, have gone quiet in this wave of artificial intelligence, even though Japan in particular has always held a leading position in robotics research.

Samsung is South Korea's only entry on the list. It did not release its first AI chip, the Exynos 9610, until 2018, and the chip is used mainly in Samsung's own flagship phones. With Samsung's handset market share declining year by year, the new AI chip has not been able to reverse the slide.

In this wave of AI chips, China and the United States are leading the entire industry. Although gaps remain in market share and in the overall strength of its companies, with the latest products launched by firms such as Alibaba and Huawei, Chinese AI chips can now contend for the high ground with high-end parts.

Strength in upstream fields such as chips will shape the overall development of China's artificial intelligence industry. Whether in the current trade war or in the next round of global industrial upgrading, China has undoubtedly drawn a good card early.
