
Date: 2016-10-13 8:31:54

DeepMind overcomes memory block to bring thinking computers a step closer

Google DeepMind team builds a "differentiable neural computer"

 

This learning machine, which combines a "neural network" computing system with conventional computer memory, can solve small-scale problems without prior knowledge.

 

Google’s artificial intelligence arm has made a breakthrough in the development of thinking computers by creating a learning machine that combines a “neural network” computing system with conventional computer memory.


 

Scientists at DeepMind, the tech group’s London-based AI unit, have built a “differentiable neural computer”, or DNC, that for the first time can solve small-scale problems without prior knowledge, such as planning the best route between distant stations on the London Underground or working out relationships between relatives on family trees.


 

Neural networks — connected systems modelled on biological networks such as the brain — have played a big role in the recent and rapid progress in AI research. They are excellent at deducing patterns, for example, to enable speech recognition in digital assistants such as Google Voice or Apple’s Siri. But until now they have only been able to access the data contained within their own network. In the journal Nature the 20-strong DeepMind team said the DNC provides neural networks with access to previously incompatible external data, such as text encoded in conventional digital form.

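To make the idea concrete, here is a minimal, hypothetical sketch of the content-based addressing the paper builds on: the controller network emits a query key, and a differentiable soft-attention read pulls information out of an external memory matrix. This is not DeepMind's implementation; the function, variable names and sizes below are invented for illustration only.

```python
import numpy as np

def content_read(memory, key, strength):
    """Soft, differentiable read from an external memory matrix.

    memory:   (N, W) array -- N slots, each a W-dimensional word
    key:      (W,)  array  -- query emitted by the controller network
    strength: float        -- sharpens or flattens the attention
    """
    eps = 1e-8
    # Cosine similarity between the key and every memory row.
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + eps
    )
    # Softmax over similarities gives differentiable read weights,
    # so gradients can flow back into both the key and the memory.
    w = np.exp(strength * sims)
    w /= w.sum()
    # The read vector is a weighted blend of memory rows.
    return w @ memory, w

# Toy usage: 128 memory slots of width 16, query resembling slot 3.
M = np.random.randn(128, 16)
key = M[3] + 0.1 * np.random.randn(16)
read_vector, weights = content_read(M, key, strength=10.0)
print(weights.argmax())  # most strongly attends to slot 3
```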

 

“The trouble is that the memory in a neural network is bound up within the computation itself, which makes it rather fragile and hard to scale up,” said Alex Graves, head of the DNC project. “We decided that the way to make it more robust is to separate out the memory, so that we can expand it without affecting the processor.”

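Graves's point about separating memory from computation can be sketched in a few lines: the controller's parameter count depends only on its input and word sizes, never on the number of memory slots, so the external memory can be enlarged without touching the "processor". The toy class below is an assumption made for illustration, not the DNC implementation, and its names are hypothetical.

```python
import numpy as np

class TinyMemoryMachine:
    """Hypothetical sketch: a fixed-size controller paired with an
    external memory whose number of slots can change freely."""

    def __init__(self, input_size, word_size, num_slots, seed=0):
        rng = np.random.default_rng(seed)
        # Controller parameters depend on input_size and word_size only,
        # never on num_slots -- that is the "separated memory" point.
        self.W_key = rng.standard_normal((input_size, word_size)) * 0.1
        self.memory = np.zeros((num_slots, word_size))

    def resize_memory(self, new_num_slots):
        # Expanding memory copies old contents; controller weights untouched.
        word_size = self.memory.shape[1]
        new_mem = np.zeros((new_num_slots, word_size))
        n = min(new_num_slots, self.memory.shape[0])
        new_mem[:n] = self.memory[:n]
        self.memory = new_mem

    def num_parameters(self):
        return self.W_key.size

machine = TinyMemoryMachine(input_size=32, word_size=16, num_slots=128)
before = machine.num_parameters()
machine.resize_memory(16_384)              # 128x more memory...
assert machine.num_parameters() == before  # ...same processor
```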

 

Jay McClelland, director of Stanford University’s Centre for Mind, Brain and Computation, called the DeepMind paper “a very interesting and important milestone in AI research”.


 

However, to make the DNC more useful in the real world than existing AI systems, it will need to be expanded to access far larger memories. “That will require a lot of engineering work,” said Mr Graves. “This is a research paper and I don’t want to speculate too much about where this is going in terms of practical problems.”


 

Even so, independent computer scientists who reviewed the paper before publication said the range of applications for a general purpose DNC could be vast. Possible applications might include generating video commentaries and extracting meaning from text.


 

DeepMind was founded in London as an AI start-up in 2010 and acquired by Google for £400m in 2014.


 
