
WEEK 2 - 301&302

#302week2  #capstone  #reflection  #what?so what?now what?

Blog 2

During this week, I needed to revamp and redefine my How Might We (HMW) Statement and become more knowledgeable about the topic. So I decided to start using the reflective framework "What? So what? Now what?" to delve deeper into the topic and move forward.

What? So what? Now what? 


[Figure 1: Rolfe et al.'s (2001) reflective model "What? So what? Now what?"]

During the "What?" stage, I used the questions Mairi asked in class to ask myself: What do I know? What don't I know? What would I like to know? After asking myself these three questions, I realised I know about precedents of apps and VR experiences for mental stress relief. However, I still needed to understand the technology that could connect the app to AR or VR. I am also determining whether I can combine unfamiliar 3D-modelling and VR software to create an intervention, within the given timeframe, that genuinely helps individuals with memory loss alleviate psychological stress.

Moving into the "So what?" stage, my experience told me I needed to learn about different technological tools and software, through research and interviews, to gain more inspiration. During this thinking process, I felt very anxious and even lost sight of the original purpose of the project. However, by doing this research I can further clarify my How Might We Statement, the required technology and the project plan.


Finally, in the "Now what?" phase, I realised that I needed to thoroughly understand the learning curve of each piece of technical software and how to use it, so that I could complete the project plan to a high quality within the required timeframe. I also needed to test the software's different features and effects to ensure the project would run smoothly. Through this approach, I will gain a better grasp of the required technology and be able to solve any problems that arise during project implementation.

HMW Statement


[Figure 2: How Might We Statement. My Miro.]

The vision of this project is to leverage various technological methods to help users detect emotions, manage stress, and improve their mental health through interactive means.

Project Plan

  • Plan A: Virtual Meditation

  1. VR: The app provides virtual meditation sessions combined with VR technology to immerse users in a relaxing natural environment. Appropriate ambient sounds or relaxing music can also be added to the VR environment.

  2. Stress Management: The app regularly detects users' heart rate and mood fluctuations and provides personalised stress-management suggestions and tasks.

  3. Emotional Record: Users can record their daily emotional state in the app.

  • Plan B: Emotion Detection and Memory Card

  1. Emotion Detection: The app uses the camera and sensors to detect the user's expression and heart rate, and analyses the user's emotional state.

  2. Memory Cards: Depending on the emotional state, the app recommends suitable memory cards, and users can write down their mood or experiences of the day on the cards.

  3. AR/VR Display: Users can either (a) display these cards in a natural environment through the AR function, or generate an animation through VR, while the app plays relaxing background music or natural sound effects to help them relax; or (b) generate avatars based on the mood cards (inspired by Inside Out).


[Figure 3: Disney Australia. (n.d.). Inside Out 2.]
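Plan B's card-recommendation step can be sketched in a few lines. This is a purely illustrative sketch: the emotion labels, card themes, and function name are my own assumptions, not part of any real emotion-detection API.

```python
# Illustrative sketch of Plan B's logic: once the app has detected an
# emotional state, map it to a suggested memory-card theme.
# All names and mappings here are hypothetical placeholders.
EMOTION_TO_CARD = {
    "joy": "gratitude card",
    "sadness": "comfort card",
    "anger": "cool-down card",
    "fear": "reassurance card",
    "neutral": "daily reflection card",
}

def recommend_card(emotion: str) -> str:
    """Return a memory-card theme for the detected emotion.

    Falls back to a neutral reflection card if the emotion is
    unrecognised, so the app always has something to offer.
    """
    return EMOTION_TO_CARD.get(emotion.lower(), "daily reflection card")
```

In the real app, the `emotion` input would come from the camera- and sensor-based detection step, which still needs in-depth research.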

  • Plan C: User drawing with AR display

  1. User Illustration Creation: Users can draw their own illustrations in the app, choosing different tools and colours and expressing their creativity freely for relaxation.

  2. AR Display: After completing an illustration, users can use the AR function to display it in a natural environment to enhance the interactive experience.

  3. Audio-Visual Combination: When the illustration is displayed through AR, the app plays relaxing background music or natural sound effects to help users relax.

Timeline

Based on my vision for the project, I developed three plans. The most ambitious was to combine the app with VR, but this approach faces many challenges, including high development costs and complex technical software requirements.

Considering real-world limitations, I developed a more realistic plan focusing on emotion detection with an AR reminiscence-card function. The phone camera will be used to analyse the user's emotional state and recommend corresponding personalised emotion cards, which will then be displayed in real-life scenes using AR. However, Plan B still requires in-depth research into emotion detection.

Finally, combining user drawings with an AR display in the app is the most feasible plan: users can draw illustrations and display them in a natural environment through AR. This plan still demonstrates technical capability while providing a simple and effective way for users to relax.

Based on these three plans, I created the corresponding timeline.


[Figure 4: Timeline. My Miro.]

Reference

Disney Australia. (n.d.). Inside Out 2. https://www.disney.com.au/movies/inside-out-2

The University of Edinburgh. (2020, January 30). What? So what? Now what? https://www.ed.ac.uk/reflection/reflectors-toolkit/reflecting-on-experience/what-so-what-now-what
