Augmented Reality and Autism
People with autism often find it difficult to recognize facial emotions, body language, and conversational subtext, which makes social interactions hard and friendships difficult to develop. Can augmented reality (AR) help them better recognize and respond to social cues? Or does it infantilize them and draw attention to them while pressuring them to behave “normally”? Why not respect their autonomy and start from there?
Dominant Framing of the Problem
Autism is widespread in the United States: over 1 million American children under the age of 17 are on the autism spectrum. These children often struggle to recognize basic facial emotions, which makes social interactions harder and friendships more difficult to form and sustain. The traditional approach to building these skills relies on intensive behavioral interventions that are expensive, difficult to access, and inconsistently administered. In collaboration with Stanford, Google Glass was used to develop a cheaper, more accessible solution: machine learning reads facial expressions, eye contact, and other cues from the wearer and the people around them, and provides social prompts through AR.
However, to help these autistic children and protect them from misunderstandings with others, Google Glass must surveil them and collect their data. And surveilling children can rob them of their privacy and autonomy, quite literally directing them to behave in specific ways through a combination of surveillance and automated guidance.
How do we design AR in a way that respects students’ privacy and autonomy while also enabling them to engage meaningfully with their peers?
AR for autism requires surveilling nearly everything children do, robbing them of their privacy and autonomy
Concerns and Considerations
The concern with this framing of the problem is that it reduces social connection to a matter of compliance. It does not engage with the question of who must change: the autistic child is surveilled and corrected, while their peers and environment are left untouched. Is it just for an algorithm (and by extension the companies that make those algorithms) to direct how children behave?
This switch from focusing on the student to focusing on their environment opens up the problem in a way that helps us move beyond paternalistic solutions
Reframing the Problem
Instead of placing the burden of improvement and conformity on the autistic student, we can focus on the culture and society they are a part of: “Can we support more compassionate behavior toward children with autism by other children, in a way that respects their autonomy and dignity?” or “Can we develop technologies that help others learn to understand autistic behavior and to respond to it meaningfully and compassionately?” This switch from focusing on the student to focusing on their environment opens up the problem in a way that helps us move beyond this binary trade-off and begin to resolve it.