Today we’re releasing SwiftKey Symbols, a symbol-based assistive communication app, running on Android, targeted at (but not limited to) young, non-verbal individuals with special needs. A new beta app in SwiftKey Greenhouse, SwiftKey Symbols was developed over the course of our last two Innovation Week events.
Earlier this year a small team of SwiftKey staff, some with experience of autism in their own families, came up with the idea of developing an assistive app powered by SwiftKey’s core contextual language prediction technology. We wanted to bring an accessible, free app to people with speech and learning difficulties so that they could communicate more easily with their friends and family.
This team visited Riverside School in southwest England earlier this year, where the pupils have a range of learning difficulties and many are on the autistic spectrum.
Many of the communication tools currently on the market are too slow when a child wants to select a particular image. We realized that SwiftKey’s core prediction and personalization technology – which learns from each individual as they use it – would be a natural fit for people on the autistic spectrum, who often respond particularly well to routine-based activity. While other apps make it easy to define favorites, SwiftKey Symbols goes further, using machine learning prediction to simplify finding the right symbol. The ability to provide the technology for free is also a huge benefit to a community where assistive tools can be costly and inaccessible.
How SwiftKey Symbols works
Users of SwiftKey Symbols build a sentence by choosing images, hand-drawn by a SwiftKey team member, from a set of categories or from a prediction slider powered by the SwiftKey SDK. Built on SwiftKey’s input and prediction technology, the app complements routine-based activity and learns from each individual’s behavior to quickly surface the images most relevant to them.
One key feature of SwiftKey Symbols is that it factors in the time of day and the day of the week, so symbol predictions are as accurate and personalized as possible. For example, if the child has music class on Tuesdays at 11:00am and has previously selected symbols during that time, those symbols will appear as predictions in the sentence strip. SwiftKey Symbols can also be deeply customized to be even more useful: users may add their own images and categories from their device, and an audio playback (text-to-speech) feature can read out the sentence that is formed, for a child who has verbal impairments.
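To make the idea concrete, here is a minimal, purely illustrative sketch of time-aware symbol prediction: a toy predictor that counts how often each symbol is chosen in a given (weekday, hour) slot and suggests the most frequent ones for the current slot. This is our own simplified example, not SwiftKey’s actual model or SDK API.

```python
from collections import Counter, defaultdict


class TimeAwareSymbolPredictor:
    """Toy predictor: suggests symbols by how often they were chosen
    in the same (weekday, hour) slot in the past. Illustrative only."""

    def __init__(self):
        # (weekday, hour) -> Counter of symbol choices in that slot
        self.counts = defaultdict(Counter)

    def record(self, weekday, hour, symbol):
        """Log that the user picked `symbol` at this weekday/hour."""
        self.counts[(weekday, hour)][symbol] += 1

    def predict(self, weekday, hour, n=3):
        """Return up to `n` symbols, most frequently chosen first."""
        return [s for s, _ in self.counts[(weekday, hour)].most_common(n)]


predictor = TimeAwareSymbolPredictor()
# The child repeatedly picks music-related symbols on Tuesdays at 11:00
for _ in range(3):
    predictor.record("Tue", 11, "music")
predictor.record("Tue", 11, "drum")
predictor.record("Wed", 12, "lunch")

print(predictor.predict("Tue", 11))  # → ['music', 'drum']
```

A real system would smooth across nearby time slots and blend in overall frequency so predictions still appear for unseen slots, but the core idea, keying learned preferences to routine, is the same.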
The team worked closely with staff at Riverside School, including Charlotte Parkhouse, Speech & Language Therapist. She says:
“The communication opportunities that this app will provide are amazing. The flexible use of symbols will allow pupils with severe communication difficulties to express themselves in meaningful ways and the predictive symbol function means that it can be truly personalised. Brilliant!”
We want to use our technology to help people with communication challenges as much as we can. SwiftKey Symbols follows our ongoing work with Professor Stephen Hawking and Israeli start-up Click2Speak, both of which use SwiftKey’s core technology to enable people with mobility issues to communicate more easily.
Head to SwiftKey Greenhouse to share the app with anyone in your life who you think might benefit from it. Like it? Have feedback for us? We want to hear from you – use the feedback button in the app to get in touch and share your thoughts.
Ryan Barnes and the SwiftKey Team