Motorola Assist detects your current activity and adapts the phone to your needs during that activity. It can recognize when you shouldn't be bothered, such as when you're sleeping or in a meeting, and after hours it lets only important calls through. It even knows when you're driving and can automatically read new text messages aloud or play music, and when you're at home it can announce incoming calls. You can set actions and exceptions so that they work exactly how you want based on your context.
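Assist's actual sensing stack was proprietary and isn't described here, but as a rough illustration of the kind of detection involved, here is a minimal Kotlin sketch using the public Google Play Services ActivityRecognition API as a stand-in; the `DrivingMode` hook is hypothetical.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognitionResult
import com.google.android.gms.location.DetectedActivity

// Illustrative sketch only, not Motorola's implementation. Assumes the
// play-services-location dependency and that a PendingIntent targeting this
// receiver was registered via ActivityRecognition.getClient(context)
//     .requestActivityUpdates(intervalMillis, pendingIntent).
class DrivingDetectionReceiver : BroadcastReceiver() {

    override fun onReceive(context: Context, intent: Intent) {
        if (!ActivityRecognitionResult.hasResult(intent)) return

        val result = ActivityRecognitionResult.extractResult(intent) ?: return
        val activity = result.mostProbableActivity

        // Act only on a confident "in vehicle" signal to avoid false positives.
        if (activity.type == DetectedActivity.IN_VEHICLE && activity.confidence >= 75) {
            // Hypothetical hook: perform whatever driving actions the user
            // opted into (read texts aloud, play music, ...).
            DrivingMode.activate(context)
        }
    }
}
```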
Central to the theme of Assist was the idea of an app that is smart, aware, and performs tasks for you automatically, so it was essential that the design live up to that promise. The app was designed around zero or minimal setup: it would do the heavy lifting for the user, detecting an activity (like sleeping or driving) and then recommending ways the device could help. For the user, it was as simple as checking off the functions the app should perform during that activity.
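To make this conceptual model concrete, here is a minimal sketch of the detect-recommend-toggle flow in Kotlin. All names and the specific recommendations are hypothetical illustrations, not Motorola's actual code or feature list.

```kotlin
// The contexts the phone can detect on its own.
enum class DetectedContext { SLEEPING, DRIVING, MEETING, HOME }

// One thing the phone can do on the user's behalf during an activity.
data class AssistAction(
    val label: String,           // e.g. "Read texts aloud"
    var enabled: Boolean = true  // the user's only job: check or uncheck it
)

// The app pre-populates recommendations per context; the user never builds
// a rule from scratch, they just toggle what they want.
val recommendations: Map<DetectedContext, List<AssistAction>> = mapOf(
    DetectedContext.SLEEPING to listOf(
        AssistAction("Silence notifications"),
        AssistAction("Let important calls through")
    ),
    DetectedContext.DRIVING to listOf(
        AssistAction("Read texts aloud"),
        AssistAction("Auto-play music", enabled = false)
    )
)

// When a context is detected, run only the actions the user left checked.
fun onContextDetected(context: DetectedContext) {
    recommendations[context]
        ?.filter { it.enabled }
        ?.forEach { action -> println("Performing: ${action.label}") }
}
```

The key design choice this captures is that the rules already exist before the user ever opens the app; the user edits defaults rather than authoring behavior.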
Motorola Assist originally started out as SmartActions. SmartActions performed functions similar to Assist's, but it required users to create activities manually. Although very powerful, it was hard to use because users had to script each activity by hand. Motorola Assist broke away from this mold, creating activities that the phone could detect and automate on its own with little or no user input.
This was a daring project for Motorola and required everyone's attention and total involvement. To get buy-in from the entire team, we ran a 3-day workshop that included the product manager, the software engineering team, and design team members. The workshop defined the core vision of the product: each team member generated ideas about what the app should do, and we reached consensus as a team on its direction.
Once the workshop was complete, the design team began defining and refining ideas around the app's conceptual model, interaction model, and feature set. During ideation, the team focused on the idea that the app should serve the user 24 hours a day, helping automate important tasks both day and night.
As the lead interaction designer on the team, I focused on defining the different ways content and controls could be surfaced to the user. The goal was to keep interaction light so the app would feel magical, and we ultimately discarded models that required deep navigation.
As the project continued, we moved into the detailed requirements phase, defining key user flows and documenting the details in a UI spec for the developers.