Hello,
Yes, some people are beginning to explore solutions to the touchscreen
accessibility problems, and some people are even coming up with
reasonably good ones: Apple are among them, and so are some of the
Android developers. I've never tried VoiceOver on iOS, but I do hear
good things about it.
Unlike Julien, after a period of resistance, I gave in to the smartphone
wave, and I'm fairly glad I did, since, for the first time, I have an
extremely mobile set-up as a writer (Bluetooth keyboard and braille
display connected to my Symbian phone running Quickoffice).
<shameless-self-advertising> If anyone would care to learn more about
this, I started a blog a few months ago discussing some of the issues
faced by blind people, on which there is an article about smartphones and
their implications for us:
http://inperspextive.wordpress.com/2010/11/26/getting-smart/
It's meant to be a three-part series, and I intend to discuss some of
the other issues which came up in the course of this thread as well.
</shameless-self-advertising>
Anyway, this thread has gotten way off-topic and it's probably time to
let it drop. But, to end on a positive note, I would like to add that
general accessibility has vastly improved over the last couple of years,
with things like VoiceOver built into all Apple products, Orca rendering
most GTK/ATK applications accessible, and at least a bit of
accessibility work being done on Android platforms. It's only when one
comes to specialised software, such as DAWs, that things fall apart.
Cheers,
S.M.
On Wed, Apr 20, 2011 at 07:33:13PM +0200, Julien Claassen wrote:
Hello Robert!
Well, this thread is already no longer on-topic, which is why I
renamed it slightly. :-)
It really started with the iPhones. At least, that's where I noticed it
for the first time. They only have their touchscreen, like many
other smartphones nowadays. So Apple released their screenreader
VoiceOver. It uses different gestures: for example, take two fingers
and pull them down on the display and the volume is lowered. You can
move your finger across the full screen and have element after
element read to you, or you can activate a mode where you jump ahead
to the next element. So they actually started getting famous through
their phones, and now they make the same system available for the
notebooks. You can use the pad - as I understand - to make your
gestures and use the different modes to jump across the screen, find
things, activate them and so on. There are many more ideas to make
it easier. And I've heard that more and more blind people switch to
iPhones or the iPod Touch, because they like it and can be very fast
with them if they have to be. It takes some learning, but it always
takes learning to adapt to an assistive technology. You always have
to get to know the OS and then the logic behind the accessibility
software. So I guess that's fair enough. :-)
I never used them myself though. I'm not a mobile phone person, and I
can't afford any notebook, let alone expensive MacBooks. :-)
Anyway, a desktop is more reliable in the long run and can be more
powerful. For the time being I'm happy to have no mobile system.
Warm regards
Julien
--------
Music was my first love and it will be my last (John Miles)
======== FIND MY WEB-PROJECT AT: ========
http://ltsb.sourceforge.net
the Linux TextBased Studio guide
======= AND MY PERSONAL PAGES AT: =======
http://www.juliencoder.de