Researchers have found that visual-motor synchronicity of only the hands and feet can induce a sense of illusory ownership over an invisible body interpolated between virtual hands and feet. This active method of inducing illusory ownership over an invisible body at a distance has potential future applications in skill learning and transfer, as well as in body-appearance-irrelevant communication.
Virtual avatar-to-avatar eyewitness interviews may increase the quantity and quality of recalled information compared to face-to-face interviews. A first-of-its-kind study shows that eyewitnesses of a mock car theft provided as much as 60 percent more information when interviewed in an avatar-to-avatar context compared to face-to-face interviews. Study participants also found it easier to talk to the avatar and were more likely to admit when they didn't know the answer to a question.
A new study shows no gender difference or negative effect on a video game player's performance or subjective involvement based on whether a photorealistic avatar looked like them or like their friend.
Locating and discriminating sound sources is extremely complex because the brain must process spatial information from many, sometimes conflicting, cues. Virtual reality and other immersive technologies give researchers new methods for investigating how we make sense of the world through sound.
New research has shown for the first time that a social robot can deliver a 'helpful' and 'enjoyable' motivational interview (MI) -- a counseling technique designed to support behavior change.
Driverless cars will encounter situations requiring moral assessment -- and new research suggests that people may not be happy with the decisions their cars make. Experiments designed to test people's reactions to a driving dilemma that endangers human life revealed a high willingness for self-sacrifice, a consideration of the age of potential victims, and swerving onto the sidewalk to save more lives -- intuitions that are sometimes at odds with ethically acceptable behavior or political guidelines.
TeleHuman 2 -- the world's first truly holographic videoconferencing system -- is being unveiled. TeleHuman 2 allows people in different locations to appear before one another in life-size 3-D -- as if they were in the same room.
Animation in film and video games is hard to make realistic: each action typically requires creating a separate controller, and deep reinforcement learning has yet to generate realistic human or animal motion. Computer scientists have now developed an algorithm that uses reinforcement learning to generate realistic motion, in which simulated characters can even recover realistically after tripping, for example. The same algorithm works for 25 acrobatic and dance tricks, with one month of learning per skill.
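The core reinforcement-learning loop -- act, observe a reward, update value estimates -- can be illustrated in a few lines. The sketch below is a generic textbook tabular Q-learning example on a toy corridor task, not the researchers' method, which uses deep RL with motion-imitation rewards; all states, actions, and parameters here are hypothetical.

```python
# Toy tabular Q-learning on a five-state corridor: illustrates the basic
# reinforcement-learning loop, not the animation researchers' deep-RL method.
import random

random.seed(0)

N_STATES, GOAL = 5, 4
ACTIONS = (1, -1)  # move right / move left
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(s, a):
    """Deterministic environment: reward 1 only on reaching the goal."""
    s2 = max(0, min(GOAL, s + a))
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration
for _ in range(200):
    s, done = 0, False
    while not done:
        if random.random() < eps:
            a = random.choice(ACTIONS)  # explore
        else:
            a = max(ACTIONS, key=lambda a: q[(s, a)])  # exploit
        s2, r, done = step(s, a)
        target = r if done else r + gamma * max(q[(s2, b)] for b in ACTIONS)
        q[(s, a)] += alpha * (target - q[(s, a)])
        s = s2

# The learned greedy policy should move right from every state.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(GOAL)}
print(policy)
```

Scaling this idea up -- replacing the table with a deep network and the corridor with a physics simulator whose reward measures similarity to motion-capture clips -- is, in broad strokes, how RL-based character animation works.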
A smartphone application using the phone's camera function performed better than traditional physical examination to assess blood flow in a wrist artery for patients undergoing coronary angiography, according to a randomized trial.
A Zika vaccine could have a substantial effect on mitigating and preventing future Zika virus outbreaks. Through a combination of direct protection and indirect reduction of transmissions, virtual elimination is achievable, even with imperfect vaccine efficacy and coverage, according to a new computer model.
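The combination of direct and indirect protection described above can be illustrated with a minimal compartmental (SIR-style) sketch. This is not the study's actual model; the transmission rate, recovery rate, vaccine coverage, and efficacy below are all hypothetical values chosen for illustration.

```python
# Minimal SIR-style sketch of direct + indirect vaccine protection.
# Illustrative only: parameters are hypothetical, not the study's model.

def attack_rate(beta=0.4, gamma=0.2, coverage=0.0, efficacy=0.75,
                days=365, dt=0.1):
    """Fraction of the population ever infected, by forward Euler."""
    immune = coverage * efficacy        # effectively immunized at t = 0
    s, i = 1.0 - immune - 1e-4, 1e-4    # susceptible, infectious fractions
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt     # transmission
        new_rec = gamma * i * dt        # recovery
        s -= new_inf
        i += new_inf - new_rec
    return (1.0 - immune) - s           # everyone who was ever infected

no_vax = attack_rate(coverage=0.0)
with_vax = attack_rate(coverage=0.6, efficacy=0.75)
print(f"attack rate, no vaccine:        {no_vax:.3f}")
print(f"attack rate, imperfect vaccine: {with_vax:.3f}")
```

Even though 40 percent of the population is unvaccinated and the vaccine is only partially effective, the attack rate collapses, because immunized individuals also break chains of transmission -- the herd-effect mechanism behind "virtual elimination" with imperfect coverage.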
A new perspective bridges two approaches to understanding quantum gravity.
Interactive virtual reality (VR) brings medical images to life on screen, showing interventional radiologists a patient's unique internal anatomy to help physicians effectively prepare and tailor their approach to complex treatments, such as splenic artery aneurysm repair, according to new research.
A new lightweight, low-cost agricultural robot could transform data collection and field scouting for agronomists, seed companies and farmers. The TerraSentia crop phenotyping robot measures the traits of individual plants using a variety of sensors, including cameras, transmitting the data in real time to the operator's phone or laptop computer.
Scientists report that they've built an artificially intelligent ocean predator that behaves a lot like the original flesh-and-blood organism on which it was modeled. The virtual creature, 'Cyberslug,' reacts to food and responds to members of its own kind much like the actual animal, the sea slug Pleurobranchaea californica, does.
Physicists have used machine learning to teach a computer how to predict the outcomes of quantum experiments. The results could prove to be essential for testing future quantum computers.
Researchers have long been applying AI to protect wildlife. Initially, computer scientists used AI and game theory to anticipate poachers' haunts; now they have applied artificial intelligence and deep learning to spot poachers in near real time.
A team of researchers has developed a pair of '4-D goggles' that allows wearers to be physically 'touched' by a movie when they see a looming object on the screen, such as an approaching spacecraft.
Researchers have discovered ways to further improve computing efficiency using management tools for containers -- lightweight, cloud-based replacements for virtual machines.
The tiny worm C. elegans is the only living being whose neural network has been analyzed completely. It can therefore be transferred to a computer, creating a virtual copy of the worm that responds to external stimuli in exactly the same way. Such a 'virtual worm' can learn amazing tricks -- its neural network can even be used to balance a pole, which is a standard control problem in computer science.
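The pole-balancing benchmark mentioned above can be sketched in a few lines. The example below uses the textbook cart-pole dynamics with a hand-tuned linear feedback controller -- not the worm's neural network -- and every constant is an illustrative assumption.

```python
# Classic cart-pole balancing: textbook dynamics plus a hand-tuned linear
# feedback controller. Illustrative sketch only -- not the worm's network.
import math

def step(state, force, dt=0.02, g=9.8, mc=1.0, mp=0.1, length=0.5):
    """One Euler step of the standard cart-pole equations of motion."""
    x, x_dot, th, th_dot = state
    total = mc + mp
    sin_t, cos_t = math.sin(th), math.cos(th)
    temp = (force + mp * length * th_dot**2 * sin_t) / total
    th_acc = (g * sin_t - cos_t * temp) / (
        length * (4.0 / 3.0 - mp * cos_t**2 / total))
    x_acc = temp - mp * length * th_acc * cos_t / total
    return (x + dt * x_dot, x_dot + dt * x_acc,
            th + dt * th_dot, th_dot + dt * th_acc)

def controller(state):
    """Push the cart in the direction the pole is falling (keeps the pole
    upright, though the cart itself is allowed to drift)."""
    _, _, th, th_dot = state
    return 20.0 * th + 3.0 * th_dot

state = (0.0, 0.0, 0.05, 0.0)   # start with the pole tilted 0.05 rad
for _ in range(1000):           # 20 simulated seconds
    state = step(state, controller(state))
print(f"pole angle after 20 s: {state[2]:.4f} rad")
```

The worm's transferred network solves the same task by a very different route: its sensory-to-motor connectivity, rather than an explicit feedback law, produces the corrective pushes.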
Using patient measurement data, researchers have succeeded in further refining the brain modeling platform 'The Virtual Brain'. The software, which has been downloaded almost 11,000 times to date, has been used in projects and publications across the globe.