The three ideas above were rapidly prototyped following the process of…
Affective Mirror for Better Morning Mood
Help users correlate daily behaviors with morning moods.
Having personally experienced how “waking up on the wrong side of the bed” impacts mood and productivity for the rest of the day, I became curious about how different behavioral factors such as eating and exercising habits could contribute to our moods immediately upon waking up.
Data on a user’s food choices, exercise routine, and sleep schedule would be collected through a notification-driven questionnaire in a mobile app. Users could customize fixed notification times tailored to their schedule…
…Or they could allow the app to provide timely reminders on its own.
Morning moods would be recorded through interaction with a smart mirror.
The mirror was chosen because it is usually the first thing most people use upon waking up. One goal is to collect user information seamlessly, and the mirror was the most natural medium found for users to report morning moods. In addition, computer vision could serve as an extra modality, identifying facial attributes associated with morning moods.
Once insights are drawn from the data, the mirror would offer behavioral suggestions.
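The insight step described above could start as simply as correlating each logged behavior with the mood scores reported at the mirror. The sketch below is a minimal, hypothetical illustration of that idea; all field names and data values are invented for the example.

```python
# Hypothetical sketch of the mirror's insight step: correlate each
# logged behavior with next-morning mood scores and surface the
# strongest correlate. Field names and data are illustrative only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One week of questionnaire answers, plus the mood score (1-5)
# reported at the mirror the following morning.
behaviors = {
    "sleep_hours":   [6.0, 7.5, 8.0, 5.5, 7.0, 8.5],
    "exercised":     [0, 1, 1, 0, 1, 1],
    "late_caffeine": [1, 0, 0, 1, 1, 0],
}
morning_mood = [2, 4, 5, 1, 3, 5]

correlations = {name: pearson(vals, morning_mood)
                for name, vals in behaviors.items()}

# The behavior most strongly (anti-)correlated with mood would
# drive the mirror's suggestion.
strongest = max(correlations, key=lambda k: abs(correlations[k]))
```

In practice, correlation alone cannot establish that a behavior causes a mood, so the mirror's suggestions would need to be framed as hypotheses for the user to experiment with rather than conclusions.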
Ve aRe (VR) in Their Shoes
Help users develop empathy for others by letting them experience an interaction with themselves from the other person’s perspective.
Miscommunication often arises when two people interpret the same expression differently. To address this, a user’s interactions with others would be recorded on camera. A BCI headset would detect and record the user’s emotions in real time while the video is captured, and the two data streams would be synchronized.
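One simple way to synchronize the two streams is to align each BCI emotion reading with the nearest video frame by timestamp, so the annotations can later be reviewed frame by frame. The sketch below assumes both devices share one clock; the frame rate, timestamps, and emotion labels are illustrative.

```python
# Hypothetical sketch of the synchronization step: map each BCI
# emotion reading to the video frame whose timestamp is closest.
# Assumes both streams are timestamped against a shared clock.
from bisect import bisect_left

def nearest_frame(frame_times, t):
    """Index of the video frame whose timestamp is closest to t."""
    i = bisect_left(frame_times, t)
    if i == 0:
        return 0
    if i == len(frame_times):
        return len(frame_times) - 1
    # Choose whichever neighboring frame is closer in time.
    return i if frame_times[i] - t < t - frame_times[i - 1] else i - 1

# Video recorded at 30 fps; the BCI headset emits an emotion label
# a few times per second. All values below are invented examples.
frame_times = [i / 30 for i in range(90)]          # 3 seconds of video
emotions = [(0.10, "calm"), (0.55, "surprised"),
            (1.40, "frustrated"), (2.80, "calm")]

# frame index -> emotion label, ready for frame-by-frame review.
annotated = {nearest_frame(frame_times, t): label for t, label in emotions}
```

A real implementation would also have to handle clock drift between the camera and the headset, but nearest-timestamp matching is a reasonable starting point when both streams are short and share one recording device.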
When users want to share the emotions they experienced during a discussion, they would be able to send the VR video to the other person through the mobile application.
A BCI headset is used to achieve automatic emotion detection with minimal user input in casual settings, assuming a near future in which BCI headsets become mainstream wearables. VR is used to create a fully immersive environment in an attempt to elicit the same emotions felt by the person who sent the video.
After a video is recorded and emotion classifications are annotated, the user would be able to review the interaction and correct the detected emotions in the mobile application before sending it out.
When the person the user interacted with receives the video, they would be able to use a low-cost VR headset such as Google Cardboard to watch it and “be in the shoes of the other person” while looking at the interaction with their past self.
They would then be given the opportunity to discuss the interaction with the sender of the video.
Help maintain a positive emotional relationship between children and divorcing parents.
Divorce can be overwhelming for everyone involved, so it is important to cultivate emotional wellbeing as best one can, especially for the children. This idea was conceived to help parents keep track of their children’s emotional wellbeing during the process.
A young child’s interactions with a stuffed toy would be recorded. Parents would have access to a preliminary, algorithm-generated analysis of their child’s emotional health, and could contact a family therapist about the situation if they felt it necessary.
The family therapist would also have access to the child’s recorded interactions and be able to offer advice to the parents.