My Experience at ASSETS 2012

From October 22nd to October 24th, we (Iftekhar Anam and I) attended the ASSETS 2012 conference in Boulder, CO, to demonstrate our sensory substitution system named “FEPS”. It was an amazing experience to meet so many of the major researchers in accessibility and assistive technologies for people with disabilities. In this post, I am going to share my experience of attending the conference.

We received an overwhelming amount of positive feedback on our FEPS system. While demonstrating the system, we met Dr. Nicholas A. Giudice, a professor at the University of Maine. He himself has a visual impairment and uses a nice big guide dog. He described our system as “intriguing” and showed interest in getting involved in our research endeavor. After learning how cumbersome it is to simultaneously develop the system and evaluate it, he mentioned that his lab (the VEMI Lab) would be very glad to share some of the burden of evaluating the system, and that he has access to quite a large number of blind and visually impaired users. Dr. Giudice is interested in vibro-tactile interfaces, and he came to ASSETS specifically to present the paper “Learning Non-Visual Graphical Information Using a Touch-Based Vibro-Audio Interface”. He discussed with us the possibility of using tactile feedback, since audio feedback can interfere with speech in dyadic conversational settings.

Apart from this, Dr. Ravi Kuber, an assistant professor at the University of Maryland, Baltimore County (UMBC), and Mr. Sina Bahram, a Ph.D. student who is also visually impaired, showed much interest in the system and gave a lot of positive feedback. Both of them mentioned that, in their experience, blind users (especially novice ones) would be more interested in a voice-over or text-to-speech feature than in abstract audio feedback. Mr. Bahram also mentioned that the system would be even more useful if it could detect head gestures such as shaking and nodding.

The ASSETS 2012 conference also provided us with a wonderful opportunity to learn about the current research going on in most of the major labs working on accessibility. For example, Marynel Vazquez, a Ph.D. student at Carnegie Mellon University, presented a paper named “Helping Visually Impaired Users Properly Aim a Camera”. This paper addresses a problem we have been discussing in our lab for a long time: how can a blind person take a good picture? Their solution assumes that blind users can point the camera in roughly the right direction, so that the desired object is already inside the camera frame. They then use visual saliency to help the user slightly move the camera so that the most salient object stays in the middle of the frame.
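
To make the idea concrete, here is a minimal sketch of saliency-guided aiming in Python. This is my own illustration rather than their implementation: it assumes the opencv-contrib-python package (for the cv2.saliency module), and the function name aiming_hint and the tolerance threshold are hypothetical choices of mine.

    import cv2

    def aiming_hint(image, tolerance=0.1):
        """Suggest how to move the camera so the most salient object is centered."""
        h, w = image.shape[:2]

        # Spectral-residual saliency: a classic, lightweight saliency detector.
        detector = cv2.saliency.StaticSaliencySpectralResidual_create()
        ok, saliency_map = detector.computeSaliency(image)
        if not ok:
            return "could not compute saliency"

        # Location of the most salient pixel in the map.
        _, _, _, (x, y) = cv2.minMaxLoc(saliency_map)

        # Offset of that point from the frame center, normalized to [-0.5, 0.5].
        dx = (x - w / 2) / w
        dy = (y - h / 2) / h

        hints = []
        if dx > tolerance:
            hints.append("pan right")   # object is right of center
        elif dx < -tolerance:
            hints.append("pan left")
        if dy > tolerance:
            hints.append("tilt down")   # object is below center
        elif dy < -tolerance:
            hints.append("tilt up")
        return ", ".join(hints) if hints else "object is centered"

    # Example usage on a single photo:
    frame = cv2.imread("photo.jpg")
    print(aiming_hint(frame))

In a real assistive system the hints would of course be spoken aloud in a feedback loop over live camera frames, rather than computed once on a stored image.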