Recently in the Human Computer Interactions Category


Robotic glove technology developed out of a partnership between General Motors and NASA for use on the International Space Station is being brought to life in health care, manufacturing and other applications through a licensing agreement between GM and Bioservo Technologies AB, a Swedish medical technology company.

In a milestone for space robotics, the International Space Station has hosted the first full run of ESA's experiment with a force-reflecting joystick.

NASA and GM Are Developing New Droids

GM, NASA Jointly Developing Robotic Gloves for Human Use

"General Motors and NASA are jointly developing a robotic glove that auto workers and astronauts can wear to help do their respective jobs better while potentially reducing the risk of repetitive stress injuries. The Human Grasp Assist device, known internally in both organizations as the K-glove or Robo-Glove, resulted from GM and NASA's Robonaut 2 (R2) project, which launched the first human-like robot into space in 2011. R2 is a permanent resident of the International Space Station."

DARPA Has A Program Called "Avatar"

Keith's note: According to io9: "In [DARPA's] $2.8 billion budget for 2013, unveiled on Monday, they've allotted $7 million for a project titled "Avatar." The project's ultimate goal, not surprisingly, sounds a lot like the plot of the same-named (but much more expensive) flick. According to the agency, "the Avatar program will develop interfaces and algorithms to enable a soldier to effectively partner with a semi-autonomous bi-pedal machine and allow it to act as the soldier's surrogate." These robots should be smart and agile enough to do the dirty work of war, DARPA notes. That includes the "room clearing, sentry control [and] combat casualty recovery." And all at the bidding of their human partner."

Imagine if the same technology could be used to let astronauts inhabit spacecraft that could also walk across a planetary surface. There are many places where the terrain simply cannot be accessed by rovers.

NASA ARC Solicitation: Eye Tracker Systems

"NASA Ames invites responses to this inquiry for the purpose of purchasing three (3) eye tracker systems to support a task order under the Intelligent Systems Research and Development Support contract (ISRDS--NNA08CG83C). The purpose of the task is to undertake advanced development of the first prototype version of the Investigator Aid system. This version shall incorporate both improvements in existing routines and the ability to incorporate foreign language speech to text, visual, and specialized physiological data such as eye tracking and thermal imaging. The work is part of a longer-term research effort for the development of improved information validity assessment."

Contact Lens Computer Displays

A single-pixel wireless contact lens display, Journal of Micromechanics and Microengineering

"We present the design, construction and in vivo rabbit testing of a wirelessly powered contact lens display. The display consists of an antenna, a 500 × 500 µm2 silicon power harvesting and radio integrated circuit, metal interconnects, insulation layers and a 750 × 750 µm2 transparent sapphire chip containing a custom-designed micro-light emitting diode with peak emission at 475 nm, all integrated onto a contact lens. The display can be powered wirelessly from ~1 m in free space and ~2 cm in vivo on a rabbit. The display was tested on live, anesthetized rabbits with no observed adverse effect. In order to extend display capabilities, design and fabrication of micro-Fresnel lenses on a contact lens are presented to move toward a multipixel display that can be worn in the form of a contact lens. Contact lenses with integrated micro-Fresnel lenses were also tested on live rabbits and showed no adverse effect."

Wireless contact lens display now a reality, Extremetech

"It has finally been done: A team of US and Finnish bioengineers have embedded an antenna, radio receiver, control circuitry, and LED into a wearable contact lens. If you're a rabbit, you can hop along to their research lab at the University of Washington, Seattle, and try it out right now -- but if you're a human, you'll still have to wait a couple more years for the bionic, Terminator-like HUD of your dreams."

Think about this: Imagine having this system instead of the heads-up displays currently used to land spacecraft - or while doing an EVA in space or out on the surface of the Moon. You could use augmented reality apps and "see" overlays of maps on the terrain ahead of you, schematics of things in front of you, system status without the distraction of averting your gaze to a screen, or frequencies your eyes normally do not see. It could be like having Geordi's VISOR - without the visor.

Space Droids Using Sign Language?

Photos: Robonaut-2 Gestures In Space

Keith's note: I have seen Robonaut-2 in action and its dexterity is interesting - and rather facile. So ... how could NASA demonstrate this dexterity in new ways, make it a little more "human" or approachable - and reach a new segment of the populace that is normally overlooked? Program it to use sign language. Background: I worked for more than a decade as a professional certified (educational) sign language interpreter. This idea occurred to me when I was looking at this picture and instantly wondered what Robonaut-2 "wanted" or why it was seemingly in the process of saying "here" or maybe "give". Imagine how fast a video of Robonaut-2 saying something in American Sign Language from space would go viral. NASA could hold a competition wherein people submit questions for it to answer. NASA already has a signing astronaut, and SMD and NLSI already put out books in Braille. Just a thought.

P.S. Maybe he could repeat what that alien signed in "Close Encounters of the Third Kind" (video). I first saw this film when it came out with my hearing-impaired roommates - none of us knew that aliens were going to sign, so we all freaked out when one of them did. Of course, it was natural to us that all aliens would know how to sign - since they all already speak English, right?

Video: Hacking Kinect - NASA Applications?

Think for a moment: Remember all of the things in "Avatar", "Star Trek", and other sci-fi films that were controlled by people waving their hands over sexy-looking devices, wandering around holodecks, or using remotely controlled bodies. When Kinect was first released, Microsoft was against anyone hacking it. A similar thing happened when LEGO Mindstorms was released and hobbyists began to fiddle with the software. As was the case with LEGO, Microsoft has since done a complete 180 and overtly embraced the notion that people can take technology and do things that its originators never imagined. How could Kinect hacks change the way that NASA does things? What would it be like to use Kinect as a whole-body interface with 360 degrees of movement while living in microgravity aboard the ISS? Could NASA control Robonaut this way?
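The core of any such Kinect-to-robot hack is mapping tracked skeleton joints to robot joint commands. A minimal Python sketch of that idea, assuming a hypothetical skeleton feed that supplies 3D joint positions - the joint names and servo limits here are illustrative assumptions, not any actual Robonaut interface:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by segments b->a and b->c,
    e.g. the elbow angle given shoulder, elbow, and wrist positions."""
    v1 = [x - y for x, y in zip(a, b)]
    v2 = [x - y for x, y in zip(c, b)]
    dot = sum(p * q for p, q in zip(v1, v2))
    n1 = math.sqrt(sum(p * p for p in v1))
    n2 = math.sqrt(sum(p * p for p in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def to_servo_command(angle_deg, lo=0.0, hi=150.0):
    """Clamp a human joint angle into a (hypothetical) servo's safe range."""
    return max(lo, min(hi, angle_deg))

# Example: shoulder, elbow, wrist positions from a skeleton tracker (meters)
shoulder, elbow, wrist = (0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.3, 0.3, 0.0)
angle = joint_angle(shoulder, elbow, wrist)   # a right angle: 90.0
command = to_servo_command(angle)
```

A real teleoperation pipeline would add filtering and rate limiting on top of this, but the geometry - joint positions in, clamped joint angles out - is the same.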

Mind Reading Computer System May Allow Paralyzed People To Communicate and Control Robots, NSF

"Imagine living a life in which you are completely aware of the world around you but you're prevented from engaging in it because you are completely paralyzed. Even speaking is impossible. For an estimated 50,000 Americans, this is a harsh reality. It's called locked-in syndrome, a condition in which people with normal cognitive brain activity suffer severe paralysis, often from injuries or an illness such as Lou Gehrig's disease."

Think about this: this is clearly one of the major steps on the path toward creating the human/machine interface used in the film "Avatar", wherein a paraplegic person was able to remotely control an "avatar" body. The same technology could aid astronauts in the operation of various robotic vehicles in remote and hazardous locations.

Monkeys move virtual arm with their minds, CNN

"Remember the hit movie Avatar, where the human brain alone could control a lifelike hybrid body, seeing what it sees and feeling what it feels? Scientists at Duke University are one step closer to making that concept a reality, with important applications for medicine. They have developed a system through which a monkey can control a virtual arm with its brain and also feel sensations from the appendage. The ultimate goal is to build a robotic body suit controlled entirely by brain activity, which will provide tactile feedback to the wearer, says Dr. Miguel Nicolelis, study co-author and neuroscientist at Duke University. This could potentially enable quadriplegic individuals and people with locked-in syndrome to move, walk and feel textures with robotic hands and feet."

Think about this: The same technology could allow intimate interactions with robotic systems in remote locations in space and on other worlds. It could also allow humans inside exoskeletons to operate on worlds with higher gravity levels.

How to Build a Borg Drone: First Step

Proton-based transistor could let machines communicate with living things, University of Washington

"Human devices, from light bulbs to iPods, send information using electrons. Human bodies and all other living things, on the other hand, send signals and perform work using ions or protons. Materials scientists at the University of Washington have built a novel transistor that uses protons, creating a key piece for devices that can communicate directly with living things. The study is published online this week in the interdisciplinary journal Nature Communications."

Think about this: imagine having this human/machine technology as a sensor system for crew health - this could be a quantum leap beyond the stick-on electrodes that have been used for half a century. It would certainly make it easier for Tricorders and sickbay to check up on the crew. It could also allow a more seamless interface between humans and remotely operated robotic arms, rovers, and other mechanical systems. Add in Wifi and ...

Think about this: imagine having this gene chip technology aboard on long duration spaceflight as a diagnostic tool for crew health, for characterizing environmental microbial contamination, and to assay crop health within life support systems. Add a WiFi, WiMAX, or Bluetooth link and Tricorders could get quick updates.

White House Boosts Translational Medicine, Drug Chip Project, Science Insider

"National Institutes of Health (NIH) Director Francis Collins's controversial plan to launch a new center for translational biomedical research got a boost today in a White House announcement on science initiatives. NIH also rolled out an early project for the planned center, promising up to $140 million over 5 years to develop a chip for predicting drug toxicity."

"With wonderful heart and an impressive sense of scale, Tiffany Shlain's vibrant and insightful documentary, Connected, explores the visible and invisible connections linking major issues of our time--the environment, consumption, population growth, technology, human rights, the global economy--while searching for her place in the world during a transformative time in her life. Connected illuminates the beauty and tragedy of human endeavor while boldly championing the importance of personal connectedness for understanding and coping with today's global conditions." via NASA IT Summit 2011 and open.nasa.gov

Editor's note: In the film "Avatar" the line between organic and electronic is often blurred. Hybrid "avatars" containing both human and Na'vi DNA are created - but with blank minds. Humans enter a chamber that integrates with both their nervous system and that of the avatar so as to allow the human to teleoperate the avatar's body as if it were their own. Clearly, such an integration between biology and technology needs to be seamless and intimate. A new technology is now emerging that would lend itself to such cybernetic/biological applications. As described below, ultra-thin patches with embedded electronics have been developed that can be easily applied to your skin so as to allow integration with electronic systems. In the near term, such an interface could find application in space by allowing astronauts to be better monitored in terms of their health and also allow them to remotely operate robots.

NASA is gearing up for a conference in San Francisco that aims to improve the quality of Information Technology (IT) at the agency, while drawing on the expertise and innovative spirit of California's Silicon Valley.

The second NASA IT Summit will take place Aug. 15-17 at San Francisco's Marriott Marquis Hotel. The theme of this year's event is, "Making IT Stellar at NASA." NASA's Chief Information Officer (CIO) Linda Cureton will host the event.

The X PRIZE Foundation, the leading nonprofit organization solving the world's Grand Challenges of our time by creating and managing large-scale, global incentivized competitions, today announced a collaboration with Qualcomm Incorporated to design the Tricorder X PRIZE, a $10 million prize to develop a mobile solution that can diagnose patients better than or equal to a panel of board-certified physicians. The X PRIZE Foundation and Qualcomm seek to achieve this by combining advancements in expert systems and medical point-of-care data such as wireless sensors, advancements in medical imaging and microfluidics.

Control the cosmos with your fingers

"What do you get when you cross a WorldWide Telescope with a Kinect motion-sensing game controller? You get the "universe at your fingertips," according to Microsoft Research's Curtis Wong, who demonstrated the gesture-controlled cosmos today at the MIX11 conference in Las Vegas. Actually, having the universe at your fingertips is how Wong has thought of the freely available WorldWide Telescope project since it was first unveiled in 2008. The software, which is freely available through a Web-based interface and as a standalone program, displays the night sky and lets users zoom in on cosmic imagery from a wide variety of sources. You can even go on 3-D fly-throughs of distant galaxies, or create your own tours of celestial hot spots." More by Alan Boyle at MSNBC

Image: ECG signals wirelessly transmitted to an Android mobile phone via a low-power interface. Click on the picture to download the high-res version.

Imec and Holst Centre, together with TASS software professionals, have developed a mobile heart monitoring system that allows you to view your electrocardiogram on an Android mobile phone. The innovation is a low-power interface that transmits signals from a wireless ECG (electrocardiogram, or heart monitoring) sensor system to an Android mobile phone. With this interface, imec, Holst Centre and TASS are the first to demonstrate a complete Body Area Network (BAN) connected to a mobile phone, enabling reliable long-term ambulatory monitoring of various health parameters such as cardiac performance (ECG), brain activity (EEG), muscle activity (EMG), etc. The system will be demonstrated at the Wireless Health Conference in San Diego (US, October 5-7).

More (video) below

The Real Science of Avatar

The Real Science of Avatar - How James Cameron drew inspiration for the flora and fauna on Pandora from life forms on Earth, Time

"The message of James Cameron's Avatar, which comes out on DVD and Blu-ray April 22 in conjunction with Earth Day, is unapologetically green. "All life on Earth is connected," the director told me, when I interviewed him for my book, The Futurist: The Life and Films of James Cameron. "We have taken from nature without giving back, and the time to pay the piper is coming.""

A Hackable, Wrist-mounted Tricorder?

From Texas Instruments (available soon for $49.00): eZ430-Chronos Wireless Watch Development Tool (915 MHz US Version): "The eZ430-Chronos is a highly integrated, wireless development system based on the CC430 in a sports watch. It may be used as a reference platform for watch systems, a personal display for personal area networks, or as a wireless sensor node for remote data collection. Based on the CC430F6137 <1 GHz RF SoC, the eZ430-Chronos is a complete CC430-based development system contained in a watch. This tool features a 96-segment LCD display and provides an integrated pressure sensor and 3-axis accelerometer for motion sensitive control. The integrated wireless feature allows the Chronos to act as a central hub for nearby wireless sensors such as pedometers and heart rate monitors. The eZ430-Chronos offers temperature and battery voltage measurement and is complete with a USB-based CC1111 wireless interface to a PC. The eZ430-Chronos watch may be disassembled to be reprogrammed with a custom application and includes an eZ430 USB programming interface." [More at TI]
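As a sensor hub, the Chronos relays small binary packets from the watch and nearby sensors to a PC through its USB RF interface. A minimal Python sketch of decoding such a packet - note that the 4-byte frame layout here (a sync byte followed by three signed 8-bit axis values) is purely hypothetical for illustration, not TI's actual protocol:

```python
def decode_accel_frame(frame):
    """Decode a hypothetical 4-byte accelerometer frame:
    [0xFF sync byte, x, y, z], each axis a signed 8-bit value."""
    if len(frame) != 4 or frame[0] != 0xFF:
        raise ValueError("not a valid accelerometer frame")

    def signed8(b):
        # Reinterpret an unsigned byte as a signed 8-bit integer
        return b - 256 if b > 127 else b

    return tuple(signed8(b) for b in frame[1:])

# Example: a frame reporting x = +5, y = -5, z = 0
x, y, z = decode_accel_frame(bytes([0xFF, 0x05, 0xFB, 0x00]))
```

Hobbyist Chronos hacks have used exactly this kind of small-frame decoding to turn the watch's accelerometer into a PC input device.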

Brain-Computer Interfacing

New research from the University of Southampton has demonstrated that person-to-person communication through the power of thought alone is possible. Brain-Computer Interfacing (BCI) can be used to capture brain signals and translate them into commands that allow humans to control (just by thinking) devices such as computers, robots, rehabilitation technology and virtual reality environments.
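At its simplest, translating brain signals into commands means extracting a feature from the EEG - often power in a frequency band - and thresholding it. A toy Python sketch of that pipeline, where the band choices and command names are illustrative assumptions rather than the Southampton team's actual method:

```python
import math

def band_power(signal, fs, lo, hi):
    """Power in the [lo, hi] Hz band via a direct DFT (fine for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += re * re + im * im
    return power

def classify(window, fs=128):
    """Map an EEG window to a command: a dominant alpha rhythm (8-12 Hz,
    typical of relaxed states) means 'rest', otherwise 'move'.
    The command mapping is illustrative."""
    alpha = band_power(window, fs, 8, 12)
    beta = band_power(window, fs, 18, 25)
    return "rest" if alpha > beta else "move"

# Example: a 1-second window dominated by a 10 Hz rhythm
fs = 128
window = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
```

Real BCI systems replace the threshold with a trained classifier and add artifact rejection, but the structure - window the signal, extract band power, emit a command - is the same.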