funDrive SmartWatch App


funDrive smartwatch project - Applying Human-Centered Design
17 February 2016 — 5 months
M.Sc. Usability Engineering — Rhine-Waal University of Applied Sciences
Supervisors
Prof. Frank Feyrer
Prof. Dr.-Ing. Rolf Becker
Team
Jassim Talat — UI/UX Designer, Usability Engineer, Project Manager
Hafeez Sani — Usability Engineer

A smartwatch app that can detect and normalize human emotions
funDrive is a smartwatch project made by Usability Engineering students at Rhine-Waal University of Applied Sciences. The aim of the project was to apply usability engineering knowledge in practice. This required applying the ISO 9241-210 standard as well as other subject areas such as human factors design, strategic usability, user experience design, usability testing, and psychology. It was a fun, exciting, challenging, and educational project that helped us understand and develop skills in usability, human-centered design, agile process models, user research, and user experience design.
"A lonely writer develops an unlikely relationship with an operating system designed to meet his every need."
The project idea is inspired by the movie Her (2013). In the movie, Theodore (the protagonist) buys a new OS (a digital personal assistant) and communicates with it to perform tasks and activities, like an advanced version of Siri but with a more human-like feel. The OS (Samantha) acts, behaves, and speaks much like a real person: it experiences emotions, has dreams, and even falls in love with Theodore.
We wanted to design a smartwatch app that would work in a similar fashion: it would detect the user's emotions and interact with the user like a real person, much like Samantha from the movie. The app could calm the user down whenever the user experiences extreme emotions such as anger, fear, or joy.
Emotion Detection
The app would monitor the user's voice and detect emotions from the user's voice patterns. Additionally, it would monitor the user's heartbeat using the watch's built-in heart rate sensor. The app would use artificial intelligence to learn the user's behavior and personality, provide emotion readings, and help the user stabilize uncontrolled, risky emotions.
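As a rough illustration of this idea, the Swift sketch below combines a heart-rate reading with two voice-derived features into a single emotion estimate. The feature names, thresholds, and decision rule are illustrative assumptions only, not the project's actual detection algorithm.

```swift
// Hypothetical sketch: combining heart-rate and voice features into one
// emotion estimate. Thresholds and rules are illustrative assumptions.

enum Emotion {
    case neutral, anger, fear, excitement
}

struct SensorSample {
    let heartRateBPM: Double      // from the watch's heart-rate sensor
    let voicePitchHz: Double      // dominant pitch extracted from the microphone
    let voiceEnergy: Double       // normalised loudness, 0...1
}

func estimateEmotion(_ sample: SensorSample, restingBPM: Double) -> Emotion {
    // An elevated heart rate alone is ambiguous (see the "Are you OK?" screen),
    // so the voice features decide between the three target emotions.
    let elevated = sample.heartRateBPM > restingBPM * 1.3
    guard elevated else { return .neutral }

    switch (sample.voicePitchHz, sample.voiceEnergy) {
    case (let pitch, let energy) where energy > 0.7 && pitch < 200:
        return .anger        // loud, low-pitched speech
    case (let pitch, _) where pitch > 300:
        return .fear         // high-pitched, strained speech
    default:
        return .excitement   // energetic but relaxed speech
    }
}

// Example: a reading well above the resting rate with loud, low-pitched speech.
let sample = SensorSample(heartRateBPM: 112, voicePitchHz: 160, voiceEnergy: 0.85)
print(estimateEmotion(sample, restingBPM: 68))   // anger
```

In the actual concept, this fixed rule would be replaced by the AI component that learns the user's personal baseline and behavior over time.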

The project plan was created by studying and following ISO 9241-210, ISO/TR 18529, and human factors design.

Project Plan
1.0    Researching Usability Methods
2.0    Project Scope
3.0    User Research
4.0    Analysis
5.0    Design 
6.0    Build
   6.1    Paper Prototyping
   6.2    Testing - Paper Prototype
   6.3    Digital Prototyping
   6.4    Testing - Digital Prototype 
   6.5    Final Build
7.0    Documentation
Project Plan
Team Collaboration — Trello
We needed a safe, reliable communication and collaboration environment to work according to the project plan. This was achieved using Trello, an online team and project management and collaboration tool. Project tasks and subtasks were created and scheduled in Trello and assigned to the respective team members. Each member could attach his or her work documents to the respective task and set the task's status accordingly. Labels → To Do, In Progress, Finished, Deadlock, Help needed.
Documents Sharing
Google Apps and Google Drive were used for file sharing, project research, and documentation. The Google Drive cloud-based working environment is free, safe, easy to use, and allows everyone to work and collaborate on documents at the same time. Team members could view, review, edit, and leave comments in documents.
Usability Methods Used
One major comprehensive online resource used for this project was usability.gov. It offers in-depth information as well as templates that were helpful for the successful completion of this project. All in all, working on the project was exciting and informative, and we now have working knowledge of several of the methods involved in usability engineering.
User Research
   • Interview
User Experience
   • Storyboard
   • Personas
   • Use case
   • Wireframe
   • Paper Prototype / testing
   • Mockups
   • Digital Prototype
Testing
   • Formative Usability Testing
ISO 9241-210 - Human Centered Design for Interactive Systems
ISO 9241-210 (Ergonomics of human-system interaction -- Part 210: Human-centred design for interactive systems) was followed for project planning and the dialogue principles, and ISO/TR 18529 (Ergonomics of human-system interaction -- Human-centred lifecycle process descriptions) was used for planning human-centered design in the project. Users were involved from the very beginning of the project life cycle through to its very last step.


User Research
The user research was conducted using the interview method. The method is further divided into sub-phases, each with its own in-depth research rules and methods. Following the human-centered design approach, the research started with the end users, who stayed involved throughout the user research phase.
User Interviews
   • Interview Guideline Definition
   • Identify Interview Questions
   • Research on technical feasibility
   • Test run Interview
   • Revise Interview Questions
   • Conduct Interviews
   • Gather Results
   • Derive Analytics in a meeting
   • Reflecting the Interview as a Method
   • User Research Documentation
Technological Feasibility
Five smartwatches were researched and compared, and based on the results one watch that could fulfil all system requirements was selected for the project. The watches compared were the Apple Watch, Samsung Gear S, LG G Watch R, Sony SmartWatch 3, and Moto 360.
Technical Requirements
funDrive's goal is to normalize the user's emotions, and achieving this goal involves collecting a lot of data from the user. For this purpose, the main sensor and hardware requirements for the watch are:
   • Heart Rate Sensor
   • Microphone
   • Speaker
Only two of the compared smartwatches had all the required features.
Based on the smartwatch specifications chart, the Apple Watch was chosen: its heart rate monitor is highly accurate, and it also has an oximeter that makes heart rate readings even more accurate. It has a built-in speaker to give voice output in case an external speaker is not present.
Personas were created based on the data gathered from the user research interviews. The data was distributed into user groups, and one persona was made for each of the three user groups. These personas were based on three different types of user emotions: anger, fear, and over-excitement.
After the personas were created, use case scenarios were written based on the persona user profiles, and tasks were derived from the user and system requirements.
Main Use case Scenario

Flow Diagram



After reviewing the scope, user requirements, personas, and use cases, storyboards were created using quick sketches to get a rough overview of a use case scenario. The storyboard focused on only one emotion, as the core functionality of the funDrive workflow is very similar for all of them.


User experience (UX) involves understanding user needs and wants as well as business and organizational goals. UX ensures the quality, efficiency, and effectiveness with which users achieve their goals in the system.
Wireframes / Mockups
The wireframes were designed in Balsamiq based on the use cases and the storyboard. The Apple design guidelines were followed so that even the rough wireframes matched the design aspects and dimensions of the Apple Watch.
Paper Prototype
The paper prototype was sketched on a large sheet of graph paper. A paper cutout mask was used as the smartwatch and placed on top of one screen at a time. As the user interacted with the paper prototype, the mask moved to the respective screen, so the user's focus stayed on that screen only. Multiple ideas and sketches were made for the paper prototype; after reviews and testing they went back into the iterative process, which resulted in the best working, tangible ideas. This process also led to new, innovative ideas and fixed some missing elements and bugs.
The emphasis on short cycles kept the process highly productive and enhanced the generative nature of prototyping. Ideas were sketched, presented, and critiqued, then prototyped, presented, and critiqued again, in repeated cycles, and finally tested.

Testing guidelines were written on how to conduct the paper prototype test, and the task scenarios the user needed to perform with the paper prototype were documented. The testing session with each participant lasted approximately one hour. The System Usability Scale (SUS) was used to evaluate the paper prototype test.
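For reference, the SUS turns each participant's ten 1-5 responses into a single 0-100 score. The sketch below applies the standard SUS scoring rule; the sample responses are made up for illustration.

```swift
// Standard SUS scoring: odd items contribute (answer - 1), even items
// contribute (5 - answer); the 0-40 sum is scaled to 0-100.

func susScore(responses: [Int]) -> Double {
    precondition(responses.count == 10, "SUS has exactly ten items")
    var total = 0
    for (index, answer) in responses.enumerated() {
        total += index % 2 == 0 ? answer - 1 : 5 - answer
    }
    return Double(total) * 2.5
}

// Fictional participant from a paper-prototype session.
print(susScore(responses: [4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))   // 85.0
```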
Digital Prototype
The digital prototype was built in Marvel App. The Marvel App prototyping tool was preferred over the other prototyping tools available because it allows linking and embedding audio and video files.
Platforms
The prototype was built for the web/mobile platform so that it could be tested on a laptop or a smartphone. The prototype was made with the dimensions of a smartwatch; displayed and tested on an iPhone 6 screen, it mimicked the exact size of a smartwatch.
Challenges
Audio input and output were a limitation of the prototyping tools. Audio embedding was used for playing system audio messages or songs to keep the prototype as close to the final version as possible. The emotion detection part was placed on the right side of the prototype and controlled by the moderator. This area was barely visible to the test user, whose focus stayed only on the watch and its interface.
App Start
App screen with the funDrive icon. Tap the funDrive icon to launch the app.

Check my wife's mood...
The user can check the mood/emotions of someone else and decide whether or not to call.

Emotions Details Screen - Neutral
From the overview screen, tapping the emotion icon takes the user to the emotion details screen. Here the user can see the details of each emotion with its percentage.

Emotions History
Swiping left on the emotion details screen takes the user to the emotions history screen, where he can see his mood swings during the last 24 hours.

Heart Rate Details
From the overview screen, if the user taps on the heartbeat, he gets to the detailed view of the heart rate. This screen shows the current heart rate and also the maximum and minimum heart rate bpm since the app started.

Heart Rate History
Swiping left on the heart rate details screen takes the user to the heart rate history screen, where he can see his heart rate readings for the day (previous 24 hours).

Anger Detected Emotion
The app detects anger and shows the ANGER Icon and percentage. The app then takes further steps to calm the user down.

"Are you OK?" Question  
The app cannot detect an emotion from a rise in heartbeat alone, so it asks the user for voice input. On this screen, the user can choose one of the voice answers shown on the right.

App Music Play
The app asks the user whether it should play a certain song. The user answers by voice: saying yes, or asking for a song of his choice. Saying no puts the app into pause mode, and the watch will not bother the user for some time.

Music Playing...
If the user says yes, the song is played on the watch, the phone's speaker, or the car's multimedia system.




To start with the design, the Apple design guidelines were followed for the funDrive UI screens, while the user interface, user experience, and usability aspects were kept in line with the ISO 9241-210 guidelines.
Photoshop
The designs for the app were created in Adobe Photoshop CC 2015 (build 2015.1). Photoshop CC 2015 supports artboards, a feature already available in Adobe Illustrator. This then-new Photoshop feature really helped with creating multiple screens within a single Photoshop document, and it also made naming and exporting PNG screens (with a single click) much easier.
Screens were designed along horizontal and vertical levels: the horizontal level covers the detailed functionality of one feature (or emotion), whereas the vertical level covers the different features.
Smart objects were used to keep the Photoshop document small and easier and faster to work with. The smart object technique also made it possible to change one object/element and have all other instances update automatically.

We conducted usability tests, evaluated user performance, and identified potential design bugs and improvements in terms of efficiency, speed, and user-friendliness.

Usability Testing with Andre 
Usability Test Objectives
The following were the usability test objectives:
• To measure whether all user goals are met
• To measure the efficiency of the system
• To find any design or user experience bugs
• To measure the responsiveness of the system
• To find any known bugs or errors
• To test whether the system complies with the seven dialogue principles
• To measure user satisfaction and happiness

ISO 9241-110 Seven Dialogue Principles
To test and evaluate the seven dialogue principles, questions were written based on ISO 9241 part 110 (see the aggregation sketch after this list):
• Suitability for the task
• Self-descriptiveness
• Conformity with user expectations
• Suitability for learning
• Controllability
• Error Tolerance
• Suitability for individualization
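Purely as an illustration of how such questionnaire answers could be summarized, the sketch below averages hypothetical 1-5 ratings per principle; the ratings and the averaging step are assumptions, not the project's actual evaluation data.

```swift
// Illustrative only: averaging per-principle ratings from the questionnaire.
// Principle names follow ISO 9241-110; the ratings are made up.

let answers: [String: [Int]] = [            // 1 = strongly disagree ... 5 = strongly agree
    "Suitability for the task": [4, 5],
    "Self-descriptiveness": [3, 4],
    "Conformity with user expectations": [5, 4],
    "Suitability for learning": [4, 4],
    "Controllability": [3, 3],
    "Error tolerance": [2, 3],
    "Suitability for individualization": [4, 3]
]

for (principle, ratings) in answers.sorted(by: { $0.key < $1.key }) {
    let mean = Double(ratings.reduce(0, +)) / Double(ratings.count)
    print("\(principle): \(mean)")
}
```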
Define Test Setup
The usability test was performed to measure and evaluate the system's ability to meet user goals and requirements. This was done by presenting the users with a scenario and some tasks to complete on the system. The tests were conducted at the users' preferred locations (office, university, dorms) on a touchscreen laptop and an iPhone 6. Testing on the smartphone was preferred, since the watch in the digital prototype then mimicked the size of a real smartwatch. The sessions were recorded, and the data was later analyzed from the recordings.
The Practical Test
The participants also tested the prototype in a real-life driving scenario while interacting with the watch (through voice only). Three participants were selected to test the three different emotions while driving with the system running on the smartwatch. Each user then tested the system with his or her voice input, and the watch (faked by the test conductor) replied just as the system would. The tests were video recorded, and the results were later analyzed for task completion rate and task completion time; a small analysis sketch follows the session captions below. Each participant signed a consent form for the video recording.
Testing Anger with Viktoria (speech)
Testing Fear with Nanosch
Testing Excitement with Viktoria
Testing Anger with Andre
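As a hedged sketch of the analysis mentioned above, the code below computes the task completion rate and the mean completion time from a set of recorded sessions. The TaskResult type and the sample data are hypothetical, not the project's real measurements.

```swift
// Hypothetical post-hoc analysis: completion rate and mean completion time.

struct TaskResult {
    let participant: String
    let completed: Bool
    let seconds: Double
}

func summarize(_ results: [TaskResult]) -> (completionRate: Double, meanTime: Double) {
    let completed = results.filter { $0.completed }
    let rate = Double(completed.count) / Double(results.count)
    let meanTime = completed.map(\.seconds).reduce(0, +) / Double(completed.count)
    return (rate, meanTime)
}

// Fictional sample: three driving-scenario sessions, one failed task.
let results = [
    TaskResult(participant: "P1", completed: true,  seconds: 42),
    TaskResult(participant: "P2", completed: true,  seconds: 58),
    TaskResult(participant: "P3", completed: false, seconds: 90)
]
let summary = summarize(results)
print("Completion rate: \(summary.completionRate), mean time: \(summary.meanTime)s")
```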
A smartwatch app to keep you calm at all times.
