We often think of Microsoft’s Kinect as a cool part of video games, but researchers created a new prototype that uses Kinect to translate sign language into spoken and written words and spoken words into sign.
Every year when school rolls around, I make a list. I don’t make the typical list of school supplies and socks. I make a list of the adaptations that need to be in place so my son can complete his school work. Adapting a mainstream school experience for a child who does not talk or write is a big task, but at its core it is simply helping my son live his life.
In a previous post, Alex Li challenged the assertion that a keyboard must control all of a computer’s software.
So, if a keyboard is not required, what alternatives could meet the needs of users with disabilities?
In a new public-private partnership, Microsoft teamed up with New York City’s Department for the Aging (DFTA) and Department of Information Technology & Telecommunications (DoITT) to develop Exergamers NYC. This program uses Kinect for Xbox in unexpected ways to promote more active and social lifestyles for New York City seniors. Participants bowl on virtual lanes, compete in boxing matches, swing for the fences in baseball games, and enjoy Zumba dance competitions. Senior centers from all five boroughs can compete against one another on Xbox and then discuss and celebrate their achievements together over Skype. The partnership builds upon the success of a 2010 collaboration between Microsoft and New York City that created the region’s first Virtual Senior Center.
Public procurement policies have been powerful tools in efforts to make information technology more accessible, helping to leverage the tremendous purchasing power of governments to encourage accessibility development.
For more than a decade, Section 508, a policy that requires the U.S. government to consider accessibility when buying any information technology, has been the de facto accessibility standard around the world.
I have always loved learning languages. In sixth grade, I was excited the day the junior high school language teachers visited my class. I knew immediately that I wanted to learn French, even though everyone said it was much harder than Spanish. Later in high school, I was flipping through television channels one lazy Saturday when I realized I could follow an Italian opera without subtitles because of the similarities between Italian and French.
When I entered the field of accessibility ten years ago, the first golden rule I learned was that a keyboard must control all software and content features.
Today, this golden rule is found throughout accessibility public policy. But it is time to re-examine the underlying assumptions of this once indisputable idea. Let’s start by looking at the problems it was supposed to solve.
People with disabilities belong at the heart of international development.
That was the message hundreds of government officials and representatives from around the world delivered when they gathered for the United Nations General Assembly’s High-Level Meeting on Disability and Development (HLMDD) in September.
What does the word accessible mean to you? You may first think of the word available. Something that is accessible is available, or perhaps within reach for you to use. However, just because something is available doesn’t necessarily mean it is accessible. For instance, if you are hard of hearing, have poor vision, or have a neuromuscular or cognitive disability, some things around you may be available but hard or impossible to use because they are not accessible. Furthermore, accessibility isn’t just for what you might traditionally think of as "people with disabilities." For a person who is aging, diminished physical stamina can make otherwise available things difficult to use. Even someone with excellent visual acuity will struggle to read a mobile phone outside in bright sunlight without automatic screen-contrast adjustment, or to operate a smartphone in the car without voice controls.