
AI algorithms hit a bottleneck, but a hardware revolution is pushing the field into the mainstream

(Original Title: Hardware revolution pushes AI into the mainstream)


Netease Technology News, December 17: according to foreign media reports, a hardware revolution has pushed artificial intelligence into the mainstream, dramatically cutting the time and cost of training AI systems, even as it turns frontier research into an arms race that only a few can join.


In recent years, as computers have demonstrated their superiority over humans at increasingly complex tasks, clever algorithms have driven the major breakthroughs in artificial intelligence.


Today, however, another force may have a bigger impact on pushing AI forward. Advances in specialized chips and other hardware have enhanced the capabilities of state-of-the-art artificial intelligence systems, while also pushing the technology into the mainstream. Whether this can produce tangible business benefits is another matter.


The AI Index, a project run by a research group at Stanford University, makes the importance of the AI hardware revolution clear. The latest edition, which attempts to summarize progress in artificial intelligence, captures a shift over the past 18 months in where the field's biggest gains are coming from.


In many ways, algorithms have not delivered big leaps in recent years. Part of the reason is that on some tasks the gains have flattened: in image recognition, for example, once computers surpassed humans there was little headroom left for further improvement.


This also reflects the fact that the remaining problems are harder and progress on them is slower. Language is widely seen as the next frontier of machine intelligence, and it is proving especially difficult. Although tasks such as speech recognition and language translation have largely been solved, understanding and reasoning remain domains where humans dominate.


Instead, the most dramatic advances have come from hardware: chips specially designed to handle the massive data processing that machine learning requires, and, increasingly, whole systems purpose-built for this work.


The American research institute OpenAI has identified 2012 as a hardware inflection point. Before then, Moore's Law, the chip industry's rule of thumb that processing power doubles roughly every two years, set the pace for artificial intelligence computing.


Since then, artificial intelligence has far outstripped Moore's Law. As new hardware and more resources have been devoted to the problem, the computing power consumed by state-of-the-art AI systems has doubled every 3.4 months.
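To make the gap between the two doubling rates concrete, here is a minimal Python sketch. The 24-month and 3.4-month periods come from the article; the two-year horizon is an arbitrary illustration, not a figure from the source.

```python
def growth_factor(months: float, doubling_period_months: float) -> float:
    """Total compute multiplier after `months` of exponential doubling."""
    return 2 ** (months / doubling_period_months)

# Compare growth over the same two-year window.
years = 2
moore = growth_factor(12 * years, 24)      # Moore's Law: doubling every 24 months
ai_trend = growth_factor(12 * years, 3.4)  # post-2012 AI trend: every 3.4 months

print(f"After {years} years: Moore's Law x{moore:.0f}, AI compute trend x{ai_trend:.0f}")
```

Over two years, Moore's Law yields a mere doubling, while the 3.4-month cadence compounds to a multiplier in the low hundreds, which is why 2012 reads as an inflection point.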


There is a paradox in this hardware acceleration. On the one hand, at the scientific frontier, it has turned artificial intelligence into an arms race that few can join.


Only big companies and governments that command huge computing resources can compete in this race. OpenAI's operating philosophy has long been that the artificial intelligence researchers with the biggest computers will inherit the field. The organization recently took a $1 billion investment from Microsoft to stay in the running.


However, the other effect of the hardware revolution has been to bring the technology to the mainstream. Google's TPUs are among the world's most advanced machine-learning chips, yet outsiders can rent them by the hour through the company's cloud computing platform (for as little as $1.35).


Silicon Valley is full of people who claim to be "democratizing" new technologies, but in artificial intelligence the claim is justified. With cloud services such as Amazon Web Services (AWS) making low-cost hardware and machine learning tools widely available, training neural networks, the most computationally intensive part of artificial intelligence, has suddenly become broadly accessible.


Stanford's DawnBench project provides a way to benchmark artificial intelligence systems. According to the project's data, in less than two years the time required to train a system on the widely used ImageNet dataset fell from 3 hours to 88 seconds, with a corresponding drop in cost from $2,323 to $12.
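A quick sanity check of the arithmetic, using only the figures quoted above, shows that both improvements are on the order of two orders of magnitude:

```python
# DawnBench figures cited in the article: ImageNet training went from
# 3 hours to 88 seconds, and from $2,323 to $12, in under two years.
time_before_s = 3 * 60 * 60   # 3 hours expressed in seconds
time_after_s = 88
cost_before, cost_after = 2323, 12

time_speedup = time_before_s / time_after_s
cost_reduction = cost_before / cost_after

print(f"Training time cut by ~{time_speedup:.0f}x, cost by ~{cost_reduction:.0f}x")
```

That works out to a speedup of roughly 120x in wall-clock time and roughly 190x in dollar cost.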


Whether this dramatic reduction in training time and cost will make advanced artificial intelligence a practical technology is another matter. The broad impact of machine learning is hard to measure, but the AI Index points to one promising metric: in October of this year, about 1.32% of US job postings were related to artificial intelligence, up from 0.26% in 2010. The number is still small, and the definition of an "AI job" is contested, but the direction is clear.


Erik Brynjolfsson, an MIT professor who studies the economic impact of new technologies, warns that companies hiring data scientists and machine learning experts should not expect immediate returns: they must first develop the new workflows needed to make the most of the technology and overcome internal bottlenecks.


The race to extract tangible returns from this much-touted technology has begun. (Lebon)


Source: Netease Technology Report, translated by Google Translate

Statement: this article is reprinted from authoritative news media for the purpose of sharing information and academic exchange. It is not used for commercial purposes and does not imply agreement with its views or confirmation of its claims. The content is for reference only. If it infringes the rights of any third party, please contact us and we will address it promptly.
