As we formed the team and got set up for Project Playtime in Greensboro, we knew we were tasked with creating something that could positively impact the lives of differently abled individuals. If we were going to succeed, it was important to know what problem we were going to solve. Most of us could only imagine what the problems might be; as we would learn later, our ideas were shaped by what we had consumed from mainstream media and didn't reflect the most basic problems.

We did have one team member, though, who had been interacting closely for quite some time with an organization that might help deepen our understanding, so he suggested a tour. While we wanted to work with a partner organization to gain access to domain-specific knowledge and a group of participants we could get feedback from, the goal was to create something with the potential to benefit a similar group of participants anywhere in the world.

A group of us made a trip to AfterGateway and had an opportunity to understand what they are trying to achieve in their work with individuals with multiple or severe developmental disabilities. It didn't take us long to realize that one of the fundamental challenges every participant in the program faced was the ability to communicate effectively. Some people communicated better than others, but almost all of the folks we met would be considered non-verbal.

As AfterGateway staff walked us through the activities they engage in with the participants over the course of the day and the equipment and tools they use, we realized there was an opportunity to leverage the power of technology to make an impact. During our brainstorming session at the kick-off event, we began by mapping out what a typical day looked like for a participant and how our design could help.

Our problem was rooted in one of the most basic human needs: communication. We made an effort to ensure our solution was simple but effective.

Above: Participants' day schedule, used by the Greensboro team to understand the required functionality of the prototype design.

At AfterGateway, everyone starts their day with a greeting. Currently they use large buttons with audio recorded in them: you hit the button and the recorded voice does the greeting for you.

Once greetings are complete, some time is spent developing fine and gross motor skills and general life skills. Improving the ability to communicate remains the underlying goal of this session, in addition to being a key facilitator. For this purpose they use several games: one that matches shapes, another for colors, and even a tic-tac-toe game that is operated using buttons and has audio-visual feedback.

Mealtime follows, and with it another need to communicate: what food a participant wants, what drink they would like, and whether the food was good.

We decided that our problem would be best served by a modular, adaptive controller that wirelessly interfaces with an application on a tablet or PC. The application will have a series of question-and-answer panes to facilitate ease of communication through simple graphics and audio, as well as some fun panes to assist with life-skills development.
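As a rough sketch of how the controller and the application might talk to each other, here is the app side of the wireless link in Python. The JSON-lines event format, the serial-style `link` object, and the function name are all illustrative assumptions on our part; the actual transport is still being worked out.

```python
import json

def read_button_event(link) -> int:
    """Block until the controller reports a press; return the button index (0-3).

    Assumes a hypothetical wire format of one JSON object per line,
    for example {"button": 2}. This is a placeholder, not our final protocol.
    """
    line = link.readline()
    return json.loads(line)["button"]
```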

Since the button is the primary interface, we sought input on its design from the staff at AfterGateway. Some of the requirements pertained to size, shape, and texture, along with the need for some kind of visual cue on button press. We then took our first 3D-printed prototype back to AfterGateway, and they suggested further modifications. Here are pictures of our initial button design and the updated one that we are working on getting printed. We intend for four such buttons to be connectable in different ways based on the capability of the participant.

Above: Graphic showing makeup of old button used by participants for single greeting response.

Above: Image of new button design integrated with user interface on computer.

For the application, we decided on a four-panel format for each pane. The top shows a question or a task, and each panel holds an answer, or a picture if the task requires one. Each panel is connected to an individual button, so when that button is pressed the option on the panel gets selected.
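A minimal sketch of that pane structure in Python follows. The class and field names here are our illustrative assumptions, not the application's actual code:

```python
from dataclasses import dataclass

@dataclass
class Panel:
    """One of the four answer/picture panels; each maps to one physical button."""
    label: str
    image: str | None = None   # optional picture, for picture-based tasks
    audio: str | None = None   # optional audio clip played on selection

@dataclass
class Pane:
    """One question/task screen with four button-mapped answer panels."""
    prompt: str          # the question or task shown at the top
    panels: list[Panel]  # panels[0..3] correspond to physical buttons 0..3

    def select(self, button: int) -> Panel:
        """Pressing button N selects the option shown on panel N."""
        return self.panels[button]

# Hypothetical example: the mealtime drink question from the daily schedule.
drinks = Pane(
    prompt="What would you like to drink?",
    panels=[Panel("Water"), Panel("Juice"), Panel("Milk"), Panel("Tea")],
)
```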

Above: Image of the ADI application as participants would view it, with each on-screen option corresponding to a button.

The flow in the app is controlled based on whether the answer is correct, whether the question is answered at all, and whether the task is completed. We received input from the staff at AfterGateway on what the questions should be for each category and are working on implementing them.
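To make that flow concrete, here is one plausible policy sketched in Python: stay on a pane until a correct answer (or, for preference questions, any answer) arrives, then advance. The question list and the retry-on-wrong-answer behavior are assumptions for illustration, not the staff's actual question set:

```python
import itertools

# Each entry pairs a prompt with the index of the correct panel;
# None means any answer advances (a preference question has no wrong answer).
QUESTIONS = [
    ("Match the circle", 1),                  # task pane: only panel 1 is correct
    ("What would you like to drink?", None),  # preference: any panel advances
]

def run_flow(questions, get_press):
    """Advance through panes based on whether each answer is correct."""
    for prompt, correct in questions:
        answered = False
        while not answered:
            button = get_press(prompt)        # wait for a button press (0-3)
            if correct is None or button == correct:
                answered = True               # correct (or preference): next pane
            # otherwise stay on this pane so the participant can retry

if __name__ == "__main__":
    # Simulated presses standing in for the wireless controller.
    presses = itertools.cycle([0, 1, 2])
    run_flow(QUESTIONS, lambda prompt: next(presses))
```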

Above: The Greensboro team integrating and testing before taking the first integrated prototype to AfterGateway.

Above: Son of one of the Greensboro team members testing the 3D printed button prototype at the AfterGateway Institution.

After integration and testing, we took our first prototype to AfterGateway in early March, and the staff and participants got an opportunity to test it.

The feedback we got from having staff and participants play with the design is helping us refine our prototype. The response was very positive, and the suggestions centered on making the buttons more accessible: shrinking their size, angling the surfaces for cupped hands, and adding more feedback through lights and vibrations. Overall the trip was a huge success, and we are excited to continue working on the next revision!

Anonymous