VR study participant with AI prosthetic arm reaching naturally in one second, lab graphs highlighting optimal speed for body ownership.
Image generated by AI

VR study finds AI prosthetic arms feel most natural when a reach takes about one second

Fact checked

AI-driven prosthetic arms may feel most like part of the body when their autonomous reaching motion lasts about one second, a virtual reality experiment reported in *Scientific Reports* suggests. In the study, that mid-range speed produced the highest ratings of body ownership, sense of control and usability, while very fast and very slow movements reduced acceptance and increased discomfort.

The study examined how the movement speed of an autonomous prosthetic arm affects whether people experience it as "part of me" and how positively they evaluate it.

The paper—"Movement speed of an autonomous prosthetic limb shapes embodiment, usability and robotic social attributes in virtual reality"—was authored by Harin Hapuarachchi, Yasuyuki Inoue, Hiroaki Shigemasu and Michiteru Kitazaki and published on Feb. 7, 2026. The research used a virtual reality (VR) setup in which participants embodied an avatar whose left lower arm was replaced by a prosthetic limb that moved on its own during a reaching task.

In the experiment, the virtual prosthetic autonomously flexed toward a target along a minimum-jerk trajectory, with the duration of the movement varied across six speed conditions ranging from 125 milliseconds to 4 seconds. After each condition, participants rated multiple measures commonly used in embodiment and human-robot interaction research: sense of body ownership, sense of agency, perceived usability using the System Usability Scale (SUS), and social impressions using the Robotic Social Attributes Scale (RoSAS), which includes competence, warmth and discomfort.
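The minimum-jerk profile mentioned here is a standard model of smooth, human-like reaching: a fifth-order polynomial that starts and ends with zero velocity and acceleration. A minimal sketch of how such a trajectory could be generated (illustrative only; the function name and parameters are assumptions, not the authors' implementation):

```python
def minimum_jerk(x0: float, xf: float, duration: float, t: float) -> float:
    """Position on a minimum-jerk trajectory from x0 to xf at time t.

    Uses the standard profile
    x(t) = x0 + (xf - x0) * (10*tau^3 - 15*tau^4 + 6*tau^5), tau = t/duration,
    which has zero velocity and acceleration at both endpoints.
    """
    tau = min(max(t / duration, 0.0), 1.0)  # clamp to [0, 1]
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

# Example: sample a 1-second reach (the study's mid-range duration),
# moving a hand 0.4 m toward a target, at five evenly spaced times.
samples = [round(minimum_jerk(0.0, 0.4, 1.0, t / 4), 4) for t in range(5)]
```

Varying only `duration` (e.g. from 0.125 s to 4 s, as in the six speed conditions) changes how fast the same smooth path is traversed, which is exactly the manipulation the study rated participants on.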

Across measures tied to embodiment and practical acceptance, the study found a consistent “middle-speed” advantage. Ownership, agency and usability ratings were highest when the movement took about 1 second, and were significantly lower at both extremes—the fastest condition (125 milliseconds) and the slowest (4 seconds). The fastest movement also produced the highest discomfort ratings. Perceived competence was rated higher at moderate to moderately fast speeds than at slower speeds, while warmth did not show a clear dependence on speed.

The findings add to ongoing efforts to design prosthetic devices that may include autonomous or semi-autonomous assistance—systems that can move without continuous user input in order to help with everyday actions. Such autonomy could improve functionality, but the results suggest that designers may need to tune movement timing to match what users readily accept as human-like, rather than prioritizing speed alone.

The researchers said the implications could extend beyond prosthetic arms to other technologies that function as body extensions—such as exoskeletons and wearable robots—where movement that feels “off” may undermine comfort and acceptance. They also pointed to VR as a way to evaluate user perceptions early and safely, and noted that future research could test whether longer-term exposure changes how people perceive different movement speeds.

The work was supported by Japanese research funding programs and foundations, including JSPS KAKENHI, JST and MEXT, as well as the Murata Science and Education Foundation.

What people are saying

Early reactions on X to the VR study on AI prosthetic arms highlight the one-second reach as optimal for natural feel, body ownership, and control, with users noting it avoids creepy fast or awkward slow movements. Positive views call it a breakthrough for intuitive prosthetics, while skeptics question the 'AI' label, pointing to prior non-AI research.

Related articles


Tesla's Optimus robot shows jogging capability in lab video

Reported by AI

Tesla's Optimus humanoid robot demonstrated a new milestone by jogging across a lab floor in a video shared on December 2, 2025. The footage highlights improved mobility with natural form, as progress accelerates toward mass production. CEO Elon Musk envisions the robot transforming labor by handling monotonous tasks and potentially making work optional within 20 years.

Researchers at Karolinska Institutet have identified how alpha oscillations in the brain help distinguish the body from the surroundings. Faster alpha rhythms enable precise integration of visual and tactile signals, strengthening the feeling of bodily self. The findings, published in Nature Communications, could inform treatments for conditions like schizophrenia and improve prosthetic designs.


Neuroscientists have identified eight body-like maps in the visual cortex that mirror the organization of touch sensations, enabling the brain to physically feel what it sees in others. This discovery, based on brain scans during movie viewing, enhances understanding of empathy and holds promise for treatments in autism and advancements in AI. The findings were published in Nature.

Aging societies worldwide face rising demand for elder care amid caregiver shortages. In China, robots in care facilities assist with reminders, medication schedules, and vital sign monitoring. In Latin America, including Cuba, adoption of these technologies remains in early stages but shows promise in complementing family care.


Two Chinese patients with severe paralysis used brain-machine interface (BMI) technology to steer a powered wheelchair, direct a robot dog to fetch a delivery, and pick up a cup with a robotic arm to drink water, all by thought alone. The achievement was announced Wednesday at a media briefing held by the Center for Excellence in Brain Science and Intelligence Technology of the Chinese Academy of Sciences in Shanghai. It marks a significant step toward practical clinical applications of BMIs.

Researchers have developed a noninvasive method using EEG brain scans to detect movement intentions in people with spinal cord injuries. By capturing signals from the brain and potentially routing them to spinal stimulators, the approach aims to bypass damaged nerves. While promising, the technology still struggles with precise control, especially for lower limbs.


A Chinese robotics firm, EngineAI, has developed a humanoid robot capable of delivering forceful Bruce Lee-style kicks, priced at US$150,000 and set for mass production two years ahead of Tesla's timeline. In contrast, Elon Musk's Optimus robot recently jogged a few steps but fell over during a demo while handing a water bottle. Backed by China's engineering talent and supply chains, such startups are accelerating humanoid robotics development.

