Brain-computer interface for Second Life

November 14, 2007

A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts — the avatar walks forward when the user thinks about moving his/her own feet, and it turns right and left when the user imagines moving his/her right and left arms.

The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user’s imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle.
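
A classified motor-imagery label ultimately becomes a key press that the Second Life client already understands. Purely as an illustration (not the Keio team's actual code), a minimal TypeScript sketch of the keyboard-emulator stage might look like this, with the classifier output type and the key-injection function treated as hypothetical:

```typescript
// Hypothetical labels produced by the EEG motor-imagery classifier.
type MotorImagery = "feet" | "left_hand" | "right_hand" | "rest";

// Hypothetical key injector; in practice this would be a platform-specific
// keyboard emulator feeding keystrokes to the Second Life client.
declare function sendKey(key: "ArrowUp" | "ArrowLeft" | "ArrowRight"): void;

// Map each imagined movement to the avatar control it should trigger:
// imagined foot movement walks forward, imagined arm movements turn.
function relayToSecondLife(imagery: MotorImagery): void {
  switch (imagery) {
    case "feet":
      sendKey("ArrowUp"); // walk forward
      break;
    case "left_hand":
      sendKey("ArrowLeft"); // turn left
      break;
    case "right_hand":
      sendKey("ArrowRight"); // turn right
      break;
    case "rest":
      break; // no imagined movement: the avatar stays put
  }
}
```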

Future plans are to improve the BCI so that users can make Second Life avatars perform more complex movements and gestures. The researchers hope the mind-controlled avatar, which was created through a joint medical engineering project involving Keio’s Department of Rehabilitation Medicine and the Tsukigase Rehabilitation Center, will one day help people with serious physical impairments communicate and do business in Second Life.


User Research in Virtual Worlds

November 14, 2007

At a recent conference in Sydney, Australia, Gary Bunker and Gabriele Hermansson presented their new project, ‘User Research in Virtual Worlds’, led by Hyro. Hyro’s aim is to build a research platform for studying users within virtual worlds like Second Life, covering not only their experiences there but also their needs outside of it. Researchers will attempt to use virtual research – focus groups, interviews and user testing – in a practical way in design projects requiring complex user input.

The focus group in Second Life was conducted as a trial to test its feasibility for future use, so a second focus group was conducted in parallel in the physical world. This allowed them to benchmark the findings gathered in Second Life against those recorded during the traditional focus group session.

Testing in-world also removes some of the potential issues around testing with people with disabilities in real life, such as:

  • Travel and costs
  • Supplying equipment, assistive technologies and support
  • Payment and possible conflict of interest if a tester already earns a salary or is on benefits. Presumably payment in Linden Dollars transcends these issues.

Lisa Herrod in her blog wrote:

‘Virtual participants’ were recruited from both Australia and the UK, with the focus group being recorded with a media camera and chat logger.

Things that worked

* There was a high level of feedback
* Participants were comfortable
* There was a good level of interaction with the participants
* The focus group had an international reach, which was a requirement of the testing
* Findings of the online session matched those recorded in the offline session

Things that didn’t work so well

* Online sessions took about one and a half times as long (i.e. 1.5 hrs online vs. 1 hr offline)
* There were multiple conversation threads running at the same time, which were difficult to track
* The response time of some participants was slow
* It was confusing when participants weren’t addressed directly by name during discussion, as it was at times difficult to tell whom the focus group facilitator was addressing.


Accessibility conference in Second Life

November 2, 2007

A recent conference sponsored by the U.S. Department of State Bureau of International Information Programs (IIP) and the University of Southern California Annenberg School for Communication explored the issue of accessibility in 3D worlds.

Judy Brewer, director of the Web Accessibility Initiative (WAI) at the World Wide Web Consortium (W3C), and Bruce Bailey, an accessibility IT specialist at the United States Access Board, acted as virtual ambassadors, speaking about accessibility issues. Thirty guests listened and participated in the Q&A.

According to this article, Brewer sounded hopeful about making Second Life accessible to everyone, saying:
“Accessibility software does already exist but much of it is not currently available in SL. Content in SL is created by the community. There need to be easy and reliable ways to be able to add text descriptions to all content created in SL. We need better operability with assistive technology that is handled on the level of the core software and in the servers. This could include in-world technology. Keyboards could be accessibility driven. One can attach different devices to the keyboard port and that can give us a broader range of devices that can be added. Objects can be announced, so that it can be heard that they are there. Users could have the option to self-announce their location when entering a place.”


Second Life accessibility survey

November 2, 2007

I have created an online survey to help identify problematic areas of Second Life. If you use or have used SL before, please take some time to answer a few questions about your experience.

The questionnaire can be found here.

If you have a visual impairment and are unable to create an account but would like to help, I have created an SL account that you can log in to with the following details:

First name: Access

Second name: Westland

Password: accessibility

You can download the Second Life client here.


Second Life Accessibility Center opens on HealthInfo Island in Second Life

November 2, 2007

The Second Life Accessibility Center opened on September 9, 2007 on HealthInfo Island (Teleport SLURL). The Accessibility Center provides continuing education and awareness about disabilities to the residents of and visitors to SL. The information available at the Center includes material about specific types of disabilities, accessibility in electronic games and virtual worlds, as well as assistive technology in the real world.

Displays at the Accessibility Center currently focus on mobility, vision, hearing and learning impairments. In-world resources for people with disabilities are also highlighted. Several seating areas provide a pleasant setting for frank discussions on disabilities.


Accessible Flash: nothing is impossible

November 2, 2007

Although this has nothing to do with Second Life, I found it quite interesting. An article on the RNIB website describes how Lightmaker pushed the boundaries of Flash to create an accessible Flash website.
The case study analyses the steps taken by Lightmaker to pioneer new accessibility features for blind, partially sighted, mobility-impaired, deaf and cognitively disabled internet users with the development of an accessibility-enabled Flash version of jkrowling.com. Lightmaker worked with the RNIB, RNID and Macromedia in the development of this project.

The Flash site was created by Lightmaker to provide an accurate portrayal of J.K. Rowling’s world. It is an explorative area in which the user can hunt for hidden content and interact with their surroundings in order to access special and collectable content.

This case study can be used as an example of the steps that should be taken to make technologies such as Flash, notorious for being inaccessible, compliant with today’s accessibility standards.

Accessibility features

The site’s key accessibility features include:

  • Accessibility menu

An accessibility menu was created and positioned at the top of the page so that users can quickly gain access to accessibility features. The tab order was also set so that screen reader and braille output users will tab to this area of content before any other. Features in the menu include the ability to pause movement, turn off background sounds, enlarge certain pieces of text and access a site help area.
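
The article describes a Flash implementation; purely as an illustration of the same idea in a plain web context, a minimal TypeScript/DOM sketch of such a menu might look like the following (the element ids and the mute event are hypothetical):

```typescript
// Hypothetical controls inside the accessibility menu.
const menu = document.getElementById("accessibility-menu")!;
const pauseButton = document.getElementById("pause-movement")!;
const muteButton = document.getElementById("mute-background-sounds")!;

// Put the menu first in the tab order so screen reader and braille
// output users reach it before any other content on the page.
menu.tabIndex = 1;

// Toggle the site-wide settings the menu exposes.
pauseButton.addEventListener("click", () => {
  document.body.classList.toggle("animations-paused");
});
muteButton.addEventListener("click", () => {
  document.dispatchEvent(new CustomEvent("mute-background-audio"));
});
```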

  • Site help

It was felt that this new version of the site needed explaining for two reasons. For many users, the original release of jkrowling.com offered everything they wanted, and they may not realise the importance of broadening the audience. Similarly, new users who make use of the accessible features may not be aware of what kind of experience is on offer, so the ‘Site Overview’ provides this information.

The option in the main accessibility menu to turn off sounds only applies to background sounds, so these can be silenced so as not to conflict with screen reader narration. Foreground sounds are not encompassed in this muting as they provide meaning and prompts for the user. With several different sounds scattered around the site, the meaning of each sound is not obvious from the noise. To combat this we created a Sound Glossary, where every sound on the site is sorted by category and can be listened to along with a description.

One of the biggest challenges faced when starting our work in accessibility was knowing how to operate screen readers correctly. When a screen reader is running, it overrides standard keyboard commands with its own. To navigate a site efficiently with a screen reader, knowing a handful of the main keyboard commands is vital. To help with this, we added a Keyboard Controls section, which lists the most useful keyboard commands for the two most popular Flash-compatible screen readers, JAWS and Window-Eyes.

  • Adjustable text size

The user can determine the size of the body text, and enlarge all other content and clues, by using Flash’s built-in zoom feature, accessible via the right mouse button.

  • Labels

Labels have been placed across the site which state the function of every control. For example, the landing page is a direct representation of J.K. Rowling’s desk; however, a person with a visual impairment may not know this, so the label states ‘Welcome to my desk, which was specifically tidied for your visit. Please wander around and explore all of the objects you find here.’

Each time the user moves to a new area, the label description is updated to reflect this; for example, when moving to the ‘Extra Stuff’ area the label changes to ‘You have entered the extra stuff area, the news board is full of interesting bits and pieces for you to browse around.’
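
In a web context the same idea could be expressed with a live region that announces the current area. A minimal TypeScript sketch, assuming a hypothetical announcer element and reusing the two example descriptions above:

```typescript
// Descriptions announced when the user arrives in each area.
const areaLabels: Record<string, string> = {
  desk:
    "Welcome to my desk, which was specifically tidied for your visit. " +
    "Please wander around and explore all of the objects you find here.",
  extraStuff:
    "You have entered the extra stuff area, the news board is full of " +
    "interesting bits and pieces for you to browse around.",
};

// A polite live region (hypothetical element) lets screen readers read out
// the new description without cutting off whatever is currently being spoken.
const announcer = document.getElementById("area-announcer")!;
announcer.setAttribute("aria-live", "polite");

function enterArea(area: string): void {
  announcer.textContent = areaLabels[area] ?? "";
}
```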

  • Handling audio

The site contains a number of background and foreground sounds; however, for vision-impaired users these sounds may become confusing, especially if a screen reader is being used, so the site allows the user to mute the sound if required. Captions for audio have been built into the user interface for hearing-impaired users.

As mentioned previously, the ‘Site Help’ section also contains a sound glossary to reinforce events on the site and look up the meaning of important sound clues. Examples of general sounds include a pen scribble, the portkey journey and reward certificates. There are also sounds for when the user has revealed a hidden clue, for the spider running across the screen and for dialling a number on the phone.
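
Purely as an illustration of how the background/foreground split and the glossary could be organised (the actual site is Flash; the sound names, file paths and descriptions below are hypothetical), a small TypeScript sketch might register each sound with a category and a description so that muting only touches background audio:

```typescript
type SoundCategory = "background" | "foreground";

interface SiteSound {
  audio: HTMLAudioElement;
  category: SoundCategory;
  description: string; // shown in the sound glossary
}

// Hypothetical registry of the site's sounds.
const sounds: Record<string, SiteSound> = {
  penScribble: {
    audio: new Audio("/sounds/pen-scribble.mp3"),
    category: "foreground",
    description: "A pen scribbling on paper.",
  },
  ambience: {
    audio: new Audio("/sounds/ambience.mp3"),
    category: "background",
    description: "Gentle background ambience.",
  },
};

// Mute only background sounds, leaving meaningful foreground cues audible
// so they do not conflict with screen reader narration.
function setBackgroundMuted(muted: boolean): void {
  for (const sound of Object.values(sounds)) {
    if (sound.category === "background") sound.audio.muted = muted;
  }
}

// The sound glossary simply lists every registered sound with its description.
function soundGlossary(): { name: string; description: string }[] {
  return Object.entries(sounds).map(([name, s]) => ({
    name,
    description: s.description,
  }));
}
```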

  • Keyboard navigation

Navigating around the site is possible without a mouse. The keyboard navigation is rebuilt dynamically on the fly as users move to different areas of the site. A logical sequence is applied to the order in which items are navigated, so the pattern follows a natural path around the screen. Important links have been placed further up the tabbing order to make life easier.

This non-linear keyboard navigation can be seen whilst navigating the desk, specifically the mobile phone. It makes use of progressive disclosure, which keeps all the buttons on the phone silent until the phone is selected, at which point the whole desk is made silent to screen readers apart from the buttons on the phone. The progressive disclosure technique allows the desk to be navigated quickly without having to tab through all 12 buttons on the phone; when the phone is required it receives exclusive focus as the desk items are made silent.
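
Again as a web-flavoured sketch rather than the site's actual ActionScript, the progressive-disclosure idea can be expressed in a few lines of TypeScript: while the phone is closed its buttons are removed from the tab order and hidden from screen readers, and once it is opened the rest of the desk is silenced instead (the selectors are hypothetical):

```typescript
// Hypothetical element groups: the items on the desk and the phone's buttons.
const deskItems = Array.from(document.querySelectorAll<HTMLElement>(".desk-item"));
const phoneButtons = Array.from(document.querySelectorAll<HTMLElement>(".phone-button"));

// Remove a group from the tab order and hide it from screen readers.
function setSilent(elements: HTMLElement[], silent: boolean): void {
  for (const el of elements) {
    el.tabIndex = silent ? -1 : 0;
    el.setAttribute("aria-hidden", String(silent));
  }
}

// While browsing the desk the 12 phone buttons stay silent, so the user
// never has to tab through them; selecting the phone flips the focus.
function setPhoneOpen(open: boolean): void {
  setSilent(phoneButtons, !open);
  setSilent(deskItems, open);
}

setPhoneOpen(false); // initial state: desk active, phone buttons silent
```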

  • Screen readers

Jkrowling.com works with screen reader technology. If a screen reader is detected, the user experience is adapted so that hidden content is easier to find, while non-screen-reader users get a slightly more challenging game experience, since they can use visual clues to assist them.
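
Flash can report whether a screen reader is attached, and the experience branches on that. A hedged TypeScript sketch of the branching, with the detection flag and both presentation hooks treated as assumptions rather than the site's real API:

```typescript
// Assumed to be supplied by the runtime (Flash exposes such a flag);
// treated here as a plain boolean input to the sketch.
declare const screenReaderActive: boolean;

// Hypothetical hooks for the two presentation modes.
declare function exposeHiddenContentInReadingOrder(): void; // surface hidden items for screen readers
declare function keepHiddenContentVisualOnly(): void; // default hunt-and-find experience

if (screenReaderActive) {
  exposeHiddenContentInReadingOrder();
} else {
  keepHiddenContentVisualOnly();
}
```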


Second Life class action

November 2, 2007

When I began my research, I came across an interesting discussion about Second Life being inaccessible to the visually impaired. I was surprised to see that some people still believe a 3D virtual world ‘is not meant to be accessible’ to those with visual impairments. Still, many seem to support the calls for making SL accessible. It’s an engaging discussion that can be found by following this link.

Joshua Linden, one of the developers at Linden Lab, made a post on January 8th, 2007, encouraging developers to take the initiative in developing an accessible SL client:

“8th January 2007: ‘Joshua Linden’ said:

Hey there – I’m Joshua Linden from Linden Lab, one of the folks who help create the Second Life platform. (The world, of course, is created by the residents.) We are deeply committed to making Second Life usable by everyone. A large number of Second Life residents have “First Life” disabilities and enjoy the freedoms that a virtual world offers – from communication to movement. However, we’re still a very small company and have limited development resources, so we have not been able to do everything we want to – yet! That includes standard interfaces for accessibility tools.

We have recently done a substantial rework of our keyboard focus code to make things more predictable. As a benefit that was clearly in mind at the time, this will make it easier to eventually hook up focus-based screen readers (which typically work by interrogating the active application for changes in the displayed text) and support alternative input technologies. This is a much longer term project than simply saying “we support the W3C WAI” since accessibility hooks are built into many Web browsers already, whereas the Second Life viewer is a stand-alone application. (Long term, one could hope that content-creators in-world can tag their creations in such a way as to be more accessible once the viewer is fully accessible!)

Speaking of opening up, today (8 Jan 2007) we announced that the Second Life viewer source code is available under an open source license. See http://blog.secondlife.com/2007/01/08/embracing-the-inevitable/ for the announcement and links. We strongly encourage other developers to take on projects such as interfacing the Second Life viewer to work with alternative input and output mechanisms such as screen readers. Don’t just wait for us to do it!”

