Brain-computer interface for Second Life

November 14, 2007

A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts — the avatar walks forward when the user thinks about moving his/her own feet, and it turns right and left when the user imagines moving his/her right and left arms.

The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user’s imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle.
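The article does not publish the analysis code, but the final stage of the pipeline, where a classified imagined movement becomes an emulated key press, can be sketched in a few lines. In the Python sketch below, the class labels and key names are hypothetical stand-ins (the real system works on raw EEG, and Second Life's actual key bindings are not given in the article):

```python
# Illustrative sketch only: the real BCI classifies EEG activity from three
# motor-cortex areas; here the classifier's decision is assumed to already
# be one of three labels. Key names are invented for illustration and are
# not Second Life's actual bindings.

# Mapping from imagined movement to an emulated key press.
KEY_MAP = {
    "feet": "UP",          # imagining foot movement  -> avatar walks forward
    "left_arm": "LEFT",    # imagining left-arm movement  -> avatar turns left
    "right_arm": "RIGHT",  # imagining right-arm movement -> avatar turns right
}

def emulate_key(imagined_movement: str) -> str:
    """Translate a classified motor-imagery label into a key event name."""
    try:
        return KEY_MAP[imagined_movement]
    except KeyError:
        raise ValueError(f"unrecognised movement label: {imagined_movement!r}")

if __name__ == "__main__":
    for label in ("feet", "right_arm", "left_arm"):
        print(label, "->", emulate_key(label))
```

In the real system this mapping sits behind an EEG analysis algorithm, so the keyboard emulator never sees brain signals directly, only discrete decisions like these.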

Future plans are to improve the BCI so that users can make Second Life avatars perform more complex movements and gestures. The researchers hope the mind-controlled avatar, which was created through a joint medical engineering project involving Keio’s Department of Rehabilitation Medicine and the Tsukigase Rehabilitation Center, will one day help people with serious physical impairments communicate and do business in Second Life.


User Research in Virtual Worlds

November 14, 2007

At a recent conference in Sydney, Australia, Gary Bunker and Gabriele Hermansson presented their new project, 'User Research in Virtual Worlds', led by Hyro. Hyro's aim is to build a research platform for studying users within virtual worlds like Second Life, covering not only their experiences in-world but also their needs outside of it. The researchers will attempt to apply virtual research methods (focus groups, interviews and user testing) in a practical way on design projects requiring complex user input.

The focus group in Second Life was conducted as a trial to test the method's feasibility for future use, and a second focus group was run in parallel in the physical world. This allowed the team to benchmark the findings gathered in Second Life against those recorded during the traditional focus group session.

Testing in-world also removes some of the potential issues around testing with people with disabilities in real life, such as:

  • Travel and costs
  • Supplying equipment, assistive technologies and support
  • Payment and possible conflict of interest if a tester already earns a salary or is on benefits. Presumably payment in Linden Dollars transcends these issues.

Lisa Herrod in her blog wrote:

"Virtual participants were recruited from both Australia and the UK, with the focus group being recorded with a media camera and a chat logger."

Things that worked

* There was a high level of feedback
* Participants were comfortable
* There was a good level of interaction with the participants
* The focus group had an international reach, which was a requirement of the testing
* Findings of the online session matched those recorded in the offline session

Things that didn’t work so well

* Online sessions took about one and a half times as long (i.e. 1.5 hrs online and 1hr offline)
* There were multiple conversation threads running at the same time, which were difficult to track
* The response time of some participants was slow
* It was confusing when participants weren't addressed directly by name, as it was at times difficult to tell whom the facilitator was speaking to

Second Life Accessibility Project

October 30, 2007

This is a blog dedicated to Second Life accessibility. The growing popularity of virtual worlds as social networking platforms means that the issues of accessibility must be addressed sooner or later. Second Life has millions of registered users, yet it remains inaccessible to many users with visual impairments. If you use Second Life and have ideas on how it can be made more accessible, please reply to this blog with your comments or suggestions. Also, if you have a visual impairment and have an opinion on the subject, get in touch. Your contributions will be of great value to this research project.

SL accessibility issues can be split into 3 categories:

1. SL website accessibility: registration process

At present, not only is SL incompatible with screen readers, but the SL website itself is largely inaccessible to people with visual impairments. Feedback from an online questionnaire I designed shows that 8 out of 10 visually impaired respondents were unable to register for an account on the SL website. This is because the site does not conform to the W3C accessibility guidelines: linked images have no alt attributes, and form fields are not correctly associated with their labels.
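The two failures just mentioned, missing alt attributes and unassociated form labels, are mechanically detectable. As a minimal sketch (using only Python's standard library, and a made-up HTML fragment rather than the real SL registration page), an automated check could look like this:

```python
# Sketch of the two checks described above: flag <img> tags with no alt
# attribute, and <input> fields whose id has no matching <label for="...">.
# The HTML fragment at the bottom is invented for illustration.
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.images_without_alt = []   # src values of images missing alt text
        self.input_ids = []            # ids of form inputs seen so far
        self.label_targets = set()     # ids referenced by a <label for="...">

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.images_without_alt.append(attrs.get("src", "?"))
        elif tag == "input":
            self.input_ids.append(attrs.get("id"))
        elif tag == "label" and "for" in attrs:
            self.label_targets.add(attrs["for"])

    def unlabelled_inputs(self):
        """Inputs with no label pointing at them -- invisible to screen readers."""
        return [i for i in self.input_ids if i not in self.label_targets]

html = ('<img src="logo.png">'
        '<label for="name">Name</label><input id="name">'
        '<input id="email">')
checker = AccessibilityChecker()
checker.feed(html)
print(checker.images_without_alt)   # images missing alt text
print(checker.unlabelled_inputs())  # inputs with no associated label
```

A real audit would of course use the full W3C guidelines rather than these two rules, but even this much would have caught the problems the questionnaire respondents ran into.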

After attempting to register for an account one questionnaire participant responded by saying:

"I found no easy step-by-step guide that would say what to expect, or even give me any reason to overcome the obstacles to joining." On their reasons for wanting to join SL: "… online community to join. But only if it represented a cross-section of real life. I'm not interested in anything that so flagrantly excludes disabled people."

2. SL viewer accessibility

On January 8, 2007, Linden Lab announced that the viewer source code was being released as open source. In its official blog, Linden Lab referred to the SL client going open source as "embracing the inevitable".

At present a number of alternative viewers are available for download; however, none of them has yet been made fully accessible to visually impaired users.

3. Grid accessibility

Making the world itself accessible is perhaps one of the hardest tasks facing Linden Lab and SL users themselves. All the content within Second Life is user-generated, so one solution would be to introduce guidelines requiring users to label all the content they create (the equivalent of alt text on websites). These labels could then be read by assistive technologies such as screen readers, allowing visually impaired users to navigate the grid.
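To make the labelling idea concrete, here is a small sketch of how an in-world object might carry an author-supplied text label, with a screen-reader hook falling back to a generic description when the creator supplied none. Everything here (the class, field names and fallback wording) is invented for illustration; Second Life has no such API:

```python
# Hypothetical sketch of the alt-text-for-objects idea described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorldObject:
    creator: str
    label: Optional[str] = None  # text description supplied by the creator

def screen_reader_text(obj: WorldObject) -> str:
    """Return the text an assistive technology would announce for an object."""
    if obj.label:
        return obj.label
    # Fallback for unlabelled content, analogous to an image with no alt text.
    return f"unlabelled object created by {obj.creator}"

bench = WorldObject(creator="Resident42", label="wooden park bench")
mystery = WorldObject(creator="Resident42")
print(screen_reader_text(bench))
print(screen_reader_text(mystery))
```

The fallback branch matters: since all grid content is user-generated, any labelling scheme has to degrade gracefully for the vast amount of existing content that would never be labelled retroactively.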