Boston Dynamics' robot dog now reads gauges and thermometers using Google AI

Ars Technica | AI Models
#other ai #ai models #gemini

Summary

According to Google DeepMind, the new Gemini Robotics-ER 1.6 model, announced on April 14, serves as a "high-level reasoning model for robots" that can plan and execute tasks. Robots such as Boston Dynamics' four-legged Spot can now accurately read analog thermometers and pressure gauges while roaming factories and warehouses.

Full text

Robots such as Boston Dynamics' four-legged Spot can now accurately read analog thermometers and pressure gauges while roaming around factories and warehouses. Those improvements come courtesy of Google DeepMind's newest robotic AI model, which aims to enhance robotic capabilities for "embodied reasoning" when interacting with physical environments.

The new Gemini Robotics-ER 1.6 model announced on April 14 performs as a "high-level reasoning model for a robot" that can plan and execute tasks, according to Google DeepMind. This model also unlocks the capability of accurately reading instruments such as complex gauges and doing visual inspections using sight glasses that provide a transparent window to peek inside tanks and pipes—a performance upgrade that came about through Google DeepMind's ongoing collaboration with robotics company Boston Dynamics.

Boston Dynamics has a keen interest in testing both quadruped and humanoid robotic workers in a wide range of industrial facilities, including the automotive factories of the robotic company's corporate owner, Hyundai Motor Group. The company's robot "dog," Spot, is being trialed as a robotic inspector that roams throughout industrial facilities to check up on everything. Such inspection duties require "complex visual reasoning" to interpret the multiple needles, liquid levels, container boundaries, and tick marks, along with text, in various instruments.
