Autism: iPad and Proloquo2go
Our son, Jake, is now 7 years old. He was diagnosed with autism and developmental delay at the age of 3.
Jake falls in the 1st percentile for development, which means that in a room of 100 children of the same age, 99 of them could out-perform Jake. Seeing the bright little boy that he was at the time, it was hard to take in the severity of his disability.
The World Opens Up for Jake
About a month ago, we bought Jake an iPad to help him learn to communicate. So far, we have funded it ourselves ($600!), but we have applied for funding and our fingers are crossed. Here in Ontario, we've been told, the Ministry (Assistance for Children with Severe Disabilities) will cover the cost of the software (the app), but not the hardware.
The app that he's using is called Proloquo2go, and it was the one that the school wanted for him. At a whopping $189, we were thrilled to hear that the government would fund it and take that burden from us.
The school was right; Proloquo2go is going to evolve with Jake and be the app that he uses while he continues to be non-verbal, which could be for the rest of his life. It comes pre-loaded with symbols for pretty much everything. We've set Jake's up to work with photographs rather than symbols or diagrams, as his needs are more basic.
I imagine that Jake might one day start to understand symbols and eventually move on to Proloquo2go’s on-screen keyboard.
The app is advanced enough that it will speak, out loud, whatever you type.
How does Proloquo2go work?
The app itself is easy to use and completely customizable. It really can be as advanced or as basic as the user wants it to be. Using the iPad’s camera, adding new choices for the end user is as simple as it possibly could be, without having to sync with a computer or take the pictures off of a camera's memory card.
For us, Jake’s new favorite food is cheesies, and he needs to be able to let us know when that is his food of choice. So I open Proloquo2go, go into edit mode, and touch ‘add’. I give the choice a name, ‘cheesies’, then use the iPad’s camera to take a picture of the bowl of cheesies, and hit ‘done’.
The picture of the bowl of cheesies will now appear alongside Jake's other food choices and he can simply tap the picture. When he does, he'll hear a voice say “cheesies”. Straightforward and simple.
At the moment, Jake likes to tap the iPad’s screen with an open hand, so we’re getting very few precise choices from him. I hope that he'll start to see that making himself heard and understood is only a touch away. Although I'm pretty sure he does get it; he can just be a bit stubborn!