Notes on some Web4All talks
Back at Web4All, part of The Web Conference. Here are some notes on talks I liked.
Dragan presented the talk “AudioFunctions.web: Multimodal Exploration of Mathematical Function Graphs”, and I also got to see a demo on Monday afternoon. AudioFunctions works in desktop and mobile browsers and shows visually impaired users the shape of a function through sound pitch and volume, using a touch screen or mouse. As you move around on the graph with mouse or finger (or keyboard with arrow keys), the pitch rises and falls according to your position on the y axis, and the further away from the line you are, the softer the tone. You can also double tap to have the graph read out the coordinates. It’s very elegant! I also had a nice time talking with Dragan and his friend Cole from CMU about Ingress, indoor and outdoor navigation apps and maps, Inform7, and other fun stuff.
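To give a sense of that mapping, here is a rough sketch of my own using the Web Audio API (not the AudioFunctions.web code; the function, coordinate ranges, and frequency range are all made up for the example): pitch follows the pointer’s y position, and volume falls off with distance from the curve.

```typescript
// Rough sketch only (not the AudioFunctions.web code): sonify a function graph
// by mapping pointer y-position to pitch and distance-from-curve to volume.
// Assumes a <canvas id="graph"> plotting f(x) with x in [-5, 5] and y in [-2, 2].

const f = (x: number): number => Math.sin(x); // stand-in for the explored function

// Most browsers require a user gesture before an AudioContext will produce sound.
const audio = new AudioContext();
const osc = audio.createOscillator();
const gain = audio.createGain();
osc.connect(gain).connect(audio.destination);
gain.gain.value = 0;
osc.start();

const canvas = document.getElementById("graph") as HTMLCanvasElement;

canvas.addEventListener("pointermove", (e) => {
  const rect = canvas.getBoundingClientRect();
  // Pixel coordinates -> graph coordinates.
  const x = ((e.clientX - rect.left) / rect.width) * 10 - 5;
  const y = (1 - (e.clientY - rect.top) / rect.height) * 4 - 2;

  // Pitch rises and falls with the pointer's y position (220–880 Hz here).
  const freq = 220 + ((y + 2) / 4) * 660;
  osc.frequency.setTargetAtTime(freq, audio.currentTime, 0.01);

  // The further from the line, the softer the tone.
  const dist = Math.abs(y - f(x));
  const vol = Math.max(0, 1 - dist / 2) * 0.3;
  gain.gain.setTargetAtTime(vol, audio.currentTime, 0.01);
});
```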
I caught just a bit of a talk by Lora Aroyo on CrowdTruth, which is a system for annotating information while allowing for ambiguity and disagreement, and had a look at the tutorial. It made me think of the CYC project (which I nearly worked on 30 years ago… I turned down the job offer, with regrets) – I wonder if they are putting data into CYC now through Mechanical Turk instead of by hiring poets.
A talk called “Addressing the Situational Impairments Encountered by Firefighters through the Design of Alerts”: very interesting to see the design and testing process explained. Though I was annoyed that it was pitched as more likely to convince designers to pay attention “because firefighters are so rugged and not permanently disabled.”
Another talk covered designing and piloting web dev classes for blind/visually impaired screen reader users. Looks like a good project. (Opinion: as so often with these kinds of classes, the level of knowledge your students start with matters a lot, and you have to spend 90% of the initial time providing background on computers, the internet, programming languages, tools, etc. Either you acknowledge that and make it a hella long course, or you end up creating a sort of hothouse dev environment so that students can experience the gratification of publishing something that works.)
An interesting trend in some of the talks: use fairly sophisticated analysis to figure out, on the fly, what users might want, what they’re trying to do, or what barriers they’re running into, based on how they interact with a page. For example, it doesn’t matter to the vast majority of users that a site or a browser is infinitely configurable, because they are reluctant to change defaults for fear they will break something. So it works better in most cases to analyze and guess: are they having trouble, or going very slowly, in clicking links in small elements of the UI? Detect that and serve them a redesigned page with larger UI elements to interact with. (IMHO, just design it to be less persnickety in the first place… but… ok.)
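As a back-of-the-envelope illustration (mine, not from any of the talks, with made-up thresholds and a hypothetical CSS class), a page could watch for people hovering small links for a long time without clicking, and switch to a larger-target layout after a few of those signals:

```typescript
// Illustrative sketch only: guess that a user is struggling with small click
// targets and swap in a larger-target layout. The 32px size cutoff, 1.5s hover
// threshold, three-strikes trigger, and "large-targets" CSS class are all
// made-up values for the example.

const SMALL_TARGET_PX = 32;
const SLOW_HOVER_MS = 1500;
const TRIGGER_COUNT = 3;
let struggleSignals = 0;

document.addEventListener("pointerover", (e) => {
  const link = (e.target as Element).closest("a");
  if (!link || link.getBoundingClientRect().height >= SMALL_TARGET_PX) return;

  const enteredAt = performance.now();
  link.addEventListener(
    "pointerleave",
    () => {
      // Lingering over a small link without clicking is treated as a (rough)
      // signal that the target is hard to hit.
      if (performance.now() - enteredAt > SLOW_HOVER_MS) {
        struggleSignals += 1;
        if (struggleSignals >= TRIGGER_COUNT) {
          // Hypothetical stylesheet hook that enlarges fonts, padding, and hit areas.
          document.body.classList.add("large-targets");
        }
      }
    },
    { once: true }
  );
});
```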
This is an issue near and dear to my heart, or maybe my butt or my feet depending on how you look at it, because my powerchair “guesses” what I might want to (or “should”) do; when it thinks I am going down a steep slope, it slows to a crawl no matter what I tell it. So, sure, you can guess what I want, but give me the capability to override and reprogram it!!! omg.
At lunch one day I talked with Alex Jaimes about the company he works at, DataMinr, and today I hung out with Nathanel, who works on RDF data systems, and his fiancée, whose name I didn’t catch but who works in digital humanities. I also had a nice gossip with Amy Hurst, who was at CMU and now teaches HCI at NYU and heads up the Ability Project.
One last note on Chet Cooper’s project for Ability Magazine, Ability Job Fair. Basically this is a super accessible online job fair: employers pay to list their jobs, and for job-seekers it’s free. The next job fair will be on July 25, 2019. I’m interested in this project because other disabled people ask me all the time how to get a job “in tech”, and I’d like somewhere to refer them! During the job fair, they have sign language interpreters on call to assist with live meetings, real-time talk-to-text for people who have difficulty hearing, a system set up to be compatible with JAWS and other screen reader applications, and support for using voice or SMS. Chet was explaining more stuff to me during a coffee break, and I got the impression he has a hydra with a lot of heads.

There is also Ability Jobs, which has job listings aside from the actual event of the job fair, and another thing called Ability Corps, a volunteer org that, from the sound of it, is trying not to be a charity model of abled-contributor-to-disabled-client, but to actually include disabled people in some kind of mutual aid process and to be enthusiastic, in a serious way, about disabled people doing volunteer work.