EXPERIENTIAL DESIGN - TASK 1 TRENDING EXPERIENCES


Week 1 - Week 4
Task 1 - Trending Experiences
Siti Zara Sophia Binti Mohammad Reeza (0359881)
Bachelor of Interactive Spatial Design (Honours)


INSTRUCTIONS





TRENDING EXPERIENCES


Classroom Exercises


Week 1

In week 1, we were introduced to Experiential Design by Mr Razif! First, he explained what exactly Experiential Design is. Experiential Design, or XD, falls under the umbrella of User Experience (UX). It focuses on designing better experiences that more deeply engage users' feelings throughout the experience, such as their five senses and emotions. In this module, we are to create an AR mobile experience made with the Unity Engine!


After explaining, Mr Razif proceeded to show us work from previous students and what could be improved in their AR applications. While watching the video, a creeping sense of dread built up in me. I had never done something like this before, and I had never coded in Unity either. How was I, alone, supposed to create a fully functioning AR mobile experience in less than 14 weeks with such limited experience?! Then, Mr Razif mentioned that we could team up for our assignment. I didn't really know anybody in class, but my eyes locked with a girl from across the room, Iman, and we decided to team up together (THANK GOODNESS!!).


Thus, after learning a few more keywords and tips about experiential design (such as MVP, minimum viable product), Iman and I set out on our XD journey!


Week 2

Copy of Slides - Experiential Design Week 2



Ahhh, Week 2! We learnt more about the world of design by touching on the terminology for the different related types of design, such as service design (SD) and brand experience (BX), and the terms that fall under each umbrella! Then, we touched upon the specific definition of experiential design (XD)!


XD is....

The practice of designing products, processes, services, events, omnichannel journeys, and environments with a focus placed on the quality of the user experience and culturally relevant solutions.

Experience design is not driven by a single design discipline. Instead, it requires a cross-discipline perspective that considers multiple aspects of the brand/business/environment/experience from product, packaging, and retail environment to the clothing and attitude of employees. Experience design seeks to develop the experience of a product, service, or event...


After learning more about XD, UX and how it all overlaps, we jumped into our first in-class group activity! In this activity, we revisited user personas and user journey maps, both of which I've touched upon in previous semesters in Application Design and Spatial Design, so I was familiar with what they were and how to do them! To conduct the activity, we were to form a group with the people sitting at our respective tables. It was a bit difficult to find a common location we had all previously visited for our user journey map, but we eventually landed on a shopping mall, specifically Sunway Pyramid! Below is the user journey map my teammates and I created using a Figma template!




After presenting our user journey map, Mr Razif informed us that the template we used could definitely be improved through a few key changes. This includes changing "expectations" and "experiences" to "gain points" and "pain points" instead, so we could clearly visualise the positives and negatives of the experience rather than a mixed bag, which could be confusing! However, it was good that we added the solutions, especially those relating to AR experiences that could be implemented in the mall. The feelings the user experiences throughout their journey within the mall were also a good addition!


After the activity and lesson, Iman and I approached Mr Razif to show him some of the app ideas we had come up with! After listening to the pros and cons he listed, we proceeded to filter the ideas so we knew roughly which one we liked and wanted to focus on, while keeping it viable and realistic within our short timeframe!


Week 3

In week 3, we were introduced to the differences between AR, MR and VR! I know what you might be thinking, "HUH?! There's so many different types of realities??? I only know VR...." Well, no worries, confused wanderer, here's a quick breakdown of the differences between all these types of realities!


Augmented Reality (AR) - Extends visuals. AR overlays digital content onto the real world without replacing the physical world! For example, adding a virtual tiger to your space to see what a tiger looks like up close without leaving your house or getting eaten alive!

Virtual Reality (VR) - Extends experiences. Have you ever wanted to enter the realm of The Walking Dead? No, not really? Well, too bad, cause that's exactly what VR can help all you adrenaline junkies do! You can jump into any fictional experience with the use of VR. From zombieland to a theme park or even learning how to do surgery, VR can help you experience all those things by just putting on a headset and picking up some controllers!

Mixed Reality (MR) - Extends what AR can do in a deeper way. Like AR, MR lets you manipulate virtual objects, but it also brings real-life physics into the mix with deeper interactions, such as features that respond to voice commands or touch. What AR can do, MR can do, but what MR can do, AR may not always be able to do. It's kinda like that "not all rectangles are squares" analogy!


Mr Razif then showed us AR experiences that are commonly available just through our mobile devices! These included a project by the Fashion Design students, who displayed different fashion designs on Barbies and converted them into a markerless AR mobile experience, making it easier for users to view from home!


We also checked out the Google AR experience where they allowed us to view animated AR animals at home (hence the tiger analogy above)!


I felt these experiences were so cool! I mean, who doesn't want to see things come to life! AR allows users to experience things they usually can't. For example, I could fully experience the clothes created by the fashion design students even though I wasn't there to witness their designs on the (Barbie) runway, and individuals who have never set foot in a zoo can see animals right in front of their eyes!


Next, it was time for the class activity! In this activity, we were to create AR solutions and mockups pertaining to a certain problem! Building on the previous week, my teammates and I decided to extend the shopping mall experience by fully diving in and creating mockups for the solutions we proposed earlier! Here is our work!



We produced a few AR solutions pertaining to common problems shoppers have when heading to a mall and trying to shop! These include:

  • AR parking detector with virtual parking directions that will take you to the nearest available parking spot based on your location
  • AR mall directions for those who get easily lost in the mall, including the price ranges of shops
  • AR promotional advertising which is meant for promoting the best and latest deals to shoppers 
  • AR try on which will help individuals who don't like to try on clothes know if the piece of clothing they picked up is actually what they're looking for!

Based on Mr Razif's feedback, we listed how we could improve each mockup in the future (mainly pertaining to the visual accuracy displayed on the mockup)!


Finally, we ventured on into the realm of Unity and Vuforia!


Now, we were to learn how to create a super basic marker-based AR with Unity and Vuforia! This seemed super daunting! Buuuut, it turned out to be a lot simpler than I thought! Based on my in-class notes, I asked ChatGPT to help me describe the step-by-step process for ease of viewing (if you don't wanna watch the video above)! I only did this cause my in-class notes are horribly formatted, I fear, so bear with me!

Step 1: Preparing Vuforia in Unity

  1. Open your Unity project. Make sure your project is active before importing the Vuforia Engine.
  2. Import the Vuforia Engine. You can get it from Unity's Package Manager or directly from the Vuforia Developer Portal.
  3. Configure max simultaneous tracked images. In Vuforia Configuration, set how many images can be tracked at once. Increase this if your app uses multiple markers simultaneously.
  4. Device Pose setting:
    • Check it if you want the AR content to respond to device rotation or vertical tracking (e.g., looking up at a tower).
    • Uncheck it if you're scanning multiple flat images and need a simpler setup.

Step 2: Creating and Setting Up the Vuforia Image Database

  1. Log in to the Vuforia Developer Portal: developer.vuforia.com
  2. Create a license key. Go to the License Manager, create a new license, and copy the key for use in Unity later.
  3. Create a new image database. Use the Target Manager to start a new database. Set it to 'Device' tracking.
  4. Upload image targets.
    • Upload images (preferably .jpg) and define their width in meters.
    • Vuforia will rate each image from 1 to 5 stars based on how easy it is to track.
    • Ideal: 4 or 5 stars for stable detection. 3 stars is acceptable but may result in weaker tracking.
  5. Download the Unity database package. Select "Download Database for Unity" and import the .unitypackage into your project.
  6. Paste your license key in Unity. Go to Window > Vuforia Engine > Configuration and paste your license under "App License Key."

Step 3: Setting Up the AR Camera

  1. In the Unity Hierarchy, right-click and select Vuforia Engine > AR Camera.
  2. This replaces the default Unity camera with a Vuforia-compatible AR Camera.
  3. In the Inspector, open the Vuforia Configuration and confirm your license key is present.

Step 4: Adding an Image Target

  1. Right-click in the Hierarchy and select Vuforia Engine > Image Target.
  2. In the Inspector, set the database to your imported image database and select the appropriate target image.
  3. Make sure the scale of the Image Target matches the physical print size (in meters).

Step 5: Attaching a 3D Model

  1. Right-click on the Image Target in the Hierarchy.
  2. Select 3D Object and choose a primitive (e.g., cube, sphere) or import your own 3D model.
  3. Position and scale the object so that it fits nicely on the image when viewed in AR.

Notes and Tips

  • Use high-contrast, non-repetitive images for better detection results. The more unique features an image has, the better.
  • If using multiple targets, make sure each has a distinct visual design to avoid recognition conflicts.
  • To simulate real-world physics or movement, enable lighting and interaction components within Unity as needed.

Once everything is in place, build and deploy to your Android or iOS device, scan your printed image target, and watch your AR content come to life.
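As a little extra (not something we did in class, just a sketch of my own), once a cube is parented under the Image Target, a tiny Unity C# script can make it obvious the moment tracking kicks in. The script name SpinOnTarget is my own invention:

```csharp
using UnityEngine;

// Attach this to the 3D object parented under the Image Target.
// It spins the object slowly, so you can instantly tell when
// Vuforia has detected the marker and the content is "live".
public class SpinOnTarget : MonoBehaviour
{
    [SerializeField] private float degreesPerSecond = 45f;

    void Update()
    {
        // Rotate around the object's local Y axis every frame,
        // scaled by deltaTime so the speed is framerate-independent.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Since the object is a child of the Image Target, Vuforia automatically hides and shows it with the marker, so the script only needs to worry about the spinning itself!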


Tadaaa! Those are the steps, and here are the Vuforia Target Manager and the marker-based AR I did in class following them (though I admit my cube was a bit too big in proportion to my AR marker, which is why you can't see it huhu)!






Week 4


In this class, we jumped right into Unity and Vuforia and learnt how to create UI features in Unity and also how to animate our 3D models using keyframes! You can see the full tutorial above!


Mr Razif explained how we can create a functioning UI in Unity with no coding as long as the object alignments were the same! It's like a little trick hehe! We resized our screens to a mobile phone size to match the platform we chose for our future application, so I picked Android. Then, we created the buttons using the UI option in the Unity hierarchy and edited the TextMeshPro (TMP) text. Before all this though, we actually created a very simple animation for our 3D objects using keyframes and just moving the objects around! I thought creating the animation would be really hard, but it was honestly quite doable and simple, which was a nice surprise! After all that, we utilised the object's GameObject.SetActive(bool) function to create the button-to-object response we were looking for!
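In class we wired GameObject.SetActive straight into the Button's OnClick event in the Inspector, with zero code. For anyone curious, here's roughly what that wiring looks like if you wrote it as a script instead (the name ToggleTarget and the toggle behaviour are my own sketch, not exactly what we built):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Code equivalent of dragging a GameObject into a Button's OnClick
// event and picking GameObject.SetActive in the Inspector.
public class ToggleTarget : MonoBehaviour
{
    [SerializeField] private Button button;      // the UI (TMP) button
    [SerializeField] private GameObject target;  // the animated 3D model

    void Start()
    {
        // Flip the model's active state each time the button is pressed,
        // so one button can both show and hide the object.
        button.onClick.AddListener(() => target.SetActive(!target.activeSelf));
    }
}
```

The Inspector route is great for quick prototypes, while a script like this becomes handy once you want fancier behaviour than a plain on/off switch!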


Below is my fully done animation and button!


https://drive.google.com/file/d/1crpq_56a9bwDkAEI1dmfU-dIsGBFXmLF/view?usp=sharing


It all worked out perfectly, though at the end you can see me having a slight issue respawning the models. So, ensure there's no glare on your device or AR marker when trying to spawn the AR!


Research and Exploration


AR in Gaming: 

  • Pokémon Go (Niantic)


Globally, Pokémon Go remains one of the most successful and enduring AR games. Using your phone’s GPS, camera, and real-time location data, the app populates your local map with digital Pokémon. These appear overlaid on real-world environments through your camera screen. Players physically walk to different landmarks—called PokéStops and Gyms—to catch Pokémon, collect items, and battle. Seasonal events, PvP battles, and cooperative raids keep players engaged. Niantic continues to update the app with new features like the Buddy Adventure (where Pokémon walk with you in AR mode) and AR Mapping tasks (where users scan real-world locations to improve AR placement). It's definitely one of the most (if not THE most) iconic game that uses and implements AR technology so effectively in their gameplay.


  • Peridot (Niantic) 


Made by Niantic (the company that also brought you Pokémon Go), Peridot is an AR pet simulator inspired by Tamagotchi, where users hatch, raise, and care for virtual magical creatures called “Peridots.” Each Dot is genetically unique, with traits influenced by real-world environments and breeding patterns. What sets this game apart is its use of advanced surface detection—Peridots recognize floors, couches, and walls, interacting with these surfaces in realistic ways. They respond to voice and touch, play with virtual toys, and explore their surroundings. Users can walk their Peridots outside, where the creatures discover new items based on the environment (parks, roads, beaches). Peridot combines AI-driven creature behavior with immersive AR, offering a Tamagotchi-style experience in a modern, innovative form.


  • Pikmin Bloom


Pikmin Bloom encourages users to turn everyday walks into a gardening adventure. By tracking steps and routes, users grow Pikmin—tiny plant-based creatures—that follow them around and help plant virtual flowers on the path walked. The app overlays blooming flowers onto your map, turning mundane, daily movement into a fun activity users can actually look forward to! AR mode allows users to see and interact with their Pikmin through their phone cameras, capturing photos and animations. It’s not competitive but focuses on self-care and exploration, showing how AR can also enhance mindfulness and habitual movement while being enjoyable.


These experiences show how AR in gaming is not only about visuals, but about building meaningful connections between the user’s physical environment and digital content—which is exactly what AR is supposed to do! It turns cities, parks, and sidewalks into dynamic game boards, thus making everyday life more playful and connected.


AR in Retail:

  • IKEA Place


IKEA Place allows customers to digitally furnish their space before making a purchase. By scanning a room with your phone camera, the app creates a spatial map and enables placement of IKEA furniture in 1:1 scale. Users can rotate, reposition, and walk around items to get a sense of size, texture, and how the furniture fits with existing décor. The app uses Apple’s ARKit or Android’s ARCore for real-time surface detection and space estimation, ensuring realistic integration. It reduces buyer hesitation, especially when it comes to the DIY build-style of IKEA products—both online and offline—and has proven effective in decreasing product returns, all while providing an intuitive, design-focused interface.


  • Snapchat x Gucci AR Try-On


Snapchat’s Lens Studio allows brands to create custom AR experiences. Gucci’s partnership with Snapchat produced an AR lens that lets users virtually try on their sneaker lines using selfie and rear-facing camera modes. The shoes appear anchored to the user’s feet with accurate shadows and motion tracking. It blends entertainment with commerce: once a user finds a pair they like, they can instantly tap to buy directly through the app. Gucci’s campaign saw significant user interaction, setting the bar for future luxury AR e-commerce.


AR is transforming the retail experience from passive browsing to active product interaction. Consumers are no longer limited by physical stores or guesswork—AR lets them see, try, and buy with greater confidence and convenience.


AR in Education: 

  • BBC Civilisations AR


This free educational app from the BBC brings historical artifacts and ancient relics into students' living rooms. Using a smartphone or tablet, users can place and examine 3D models of objects from various civilizations, including Egyptian mummies, Roman mosaics, and the Rosetta Stone. Each object comes with annotations, narration, and contextual history. The app allows full 360-degree viewing and even simulated damage restoration features. It bridges classroom learning with museum-quality experiences, particularly valuable for remote learners or under-resourced schools.


  • Quiver (formerly colAR Mix)


Quiver combines traditional colouring with AR. Children download printable colouring sheets from the Quiver website, colour them in, and then scan them with the app. The images come to life in 3D animation, moving, speaking, or reacting depending on the chosen colouring sheet. It encourages creativity, hand-eye coordination, and tech literacy in children all at once! It could definitely be used by teachers when teaching STEM subjects, helping students visualise processes (e.g., volcano eruptions, ecosystems) in a hands-on, memorable way.


  • Mondly AR



An extra addition and support to the Mondly language learning app, Mondly AR introduces an animated virtual teacher who appears in your space and guides you through vocabulary and dialogue lessons. Objects (like fruits, tools, or animals) also appear in 3D, helping users associate words with visual and spatial cues. The app supports voice input, pronunciation feedback, and conversation practice, offering an immersive supplement to textbook learning. It’s particularly helpful for learners who struggle with memorisation, as AR anchors abstract language to concrete visuals.


AR in education supports multiple learning styles—visual, auditory, and kinesthetic—while making lessons more interactive and experiential. It’s a powerful tool for increasing retention, sparking curiosity, and democratizing access to resources that might otherwise be out of reach.


Through these examples, we've seen that AR is already integrated into how we play, shop, and learn. By adding digital layers to our physical world, these experiences make everyday life richer, more efficient, and more joyful. Whether navigating your city to catch rare Pokémon, testing a sofa in your living room before buying it, or walking through ancient history from your classroom, AR connects us to both technology and the world around us in transformative ways. AR isn't limited to the markerless examples above either; marker-based AR also has a place in today's increasingly digital world, with its own unique functions to support and enrich our daily lives.


3 AR Project Ideas


1. Storybook Alive

Problem Statement

Today’s children are growing up with screens and interactive media. In contrast, traditional storybooks—though rich in content—often appear static and unengaging to younger readers. This creates a disconnect that may reduce interest in reading and make it harder for children to develop early literacy, emotional awareness, and comprehension skills. Many parents and educators struggle to maintain attention during reading sessions, especially with learners who benefit more from visual or hands-on experiences.


Proposed AR Solution

Storybook Alive is an augmented reality application that transforms ordinary children’s storybooks into interactive, living experiences. Using a smartphone or tablet, readers scan illustrated pages, which then trigger vivid 3D animations of the characters and scenes. Each scene includes:

  • Character animation: Characters move, speak, or perform actions described on the page.
  • Narration & Sound Effects: A voice-over reads the text aloud, accompanied by ambient sounds to enrich storytelling (e.g., birds chirping in a forest scene).
  • Interactive elements: Children can tap characters to reveal their emotions, hear internal thoughts, or get definitions of difficult words.
  • Post-story quizzes: A short interactive quiz appears after the story, reinforcing the moral, characters’ motivations, and key vocabulary.

The idea is to keep the child’s attention while also encouraging empathy (understanding what characters feel), imagination, and retention through multi-sensory engagement.


Technology & AR Inspiration

  • AR Framework: Unity 3D with Vuforia (image target detection) or 8th Wall for web AR compatibility.
  • Tracking Method: Book illustrations act as unique markers that launch the specific animations and scenes.


AR References:

  • Quiver – for turning 2D pages into animated 3D models using image tracking.
  • Peridot – for life-like character behavior and interaction with real-world surfaces.
  • BBC Civilisations AR – for contextual storytelling and spatial engagement.


Target Audience

Kindergarten to lower primary school children (ages 4–10), especially early or reluctant readers. The app also supports educators, librarians, and parents looking for creative tools to make reading time more engaging and developmentally enriching.


Sketches + Mockups






2. Lost Recipes – AR Grandma’s Kitchen

Problem Statement

Generational recipes—those passed down by grandmothers, aunts, and great-grandparents—are slowly being forgotten. Written recipes can be confusing or incomplete, and many young or novice cooks lack the confidence to try them. Cultural context, preparation techniques, and ingredient handling are often lost when reduced to static instructions, which makes traditional cooking less accessible to modern learners who are used to video, audio, and interactive content.


Proposed AR Solution

Lost Recipes is a mobile AR application that reintroduces cultural cooking traditions through step-by-step augmented guidance. By scanning a recipe card or utensil, users launch an AR kitchen assistant—styled as a grandmotherly figure, elder chef, or cultural narrator—who appears in the user’s actual kitchen. Through AR overlays, the guide provides hands-on instruction such as:

  • Projecting a virtual pot on the stove with stirring animation
  • Showing slicing techniques in the correct order and style (e.g., julienne, dice)
  • Overlaying timers and heat settings onto physical appliances
  • Offering voice instructions and fun historical facts about the recipe’s origins

Throughout the process, the app highlights cultural tips, such as ingredient substitutions, seasonal variations, or etiquette during communal meals. The experience can be paused, restarted, or saved, allowing learners to move at their own pace.


Technology & AR Inspiration

  • AR Framework: ARKit (iOS) or ARCore (Android), with plane detection for realistic object placement on kitchen surfaces.
  • Marker-Based Interactions: Recipe cards or utensils scanned to trigger specific regional recipe tutorials.


AR References:

  • IKEA Place – for accurate, fixed-position 3D object placement.
  • Pokémon Go – for integrating gamification and interactive progression.
  • BBC Civilisations AR – for cultural storytelling anchored to physical context.


Target Audience

Young adults and adults learning to cook (ages 13+), cultural preservationists, food historians, or anyone interested in heritage cuisine. Particularly useful for multicultural families, cooking classes, and diaspora communities reconnecting with their roots.


Sketches + Mockups




3. AR Solar System Explorer

Problem Statement

Understanding astronomical scale, planetary size, and orbit speed is difficult using 2D diagrams or models. The vastness of space is often compressed into inaccurate visuals, leading students to develop misconceptions about the size and position of planets. This lack of spatial comprehension limits curiosity and makes astronomy feel distant and abstract.


Proposed AR Solution

AR Solar System Explorer is an educational AR experience that transforms any flat surface into an interactive model of the solar system. Users scan a table, floor, or playground with their mobile device to place a scaled-down Sun at the center. Orbiting planets appear at proportionally scaled distances, moving in real-time or accelerated motion. Features include:

  • Tap-to-learn: Tapping a planet brings up fun facts, animations (e.g., Jupiter’s Great Red Spot), and narrated audio descriptions.
  • Scale toggle: Switch between accurate-scale and “room-fit” view so users can inspect planetary details without sacrificing comprehension.
  • Guided tours: A voice guide walks users from Mercury to Neptune, explaining key features and fun comparisons (e.g., “Jupiter is so large that 1,300 Earths could fit inside it!”).

The app encourages movement and spatial learning—users physically walk from planet to planet, simulating the real distances and developing a better understanding of the solar system’s layout.
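Just to sanity-check the "scale toggle" idea, here's a quick C# sketch of how real planetary distances could be squeezed into a room (the AU values are real mean orbital distances; the helper name RoomFit and the 2-metre play area are my own assumptions for illustration):

```csharp
using System;
using System.Collections.Generic;

// Rescales real mean orbital distances (in astronomical units) so the
// farthest planet lands exactly at the edge of the available play area.
public static class RoomFit
{
    public static Dictionary<string, double> Scale(
        Dictionary<string, double> distancesAu, double roomRadiusMetres)
    {
        // Find the largest orbit so it can anchor the room boundary.
        double maxAu = 0;
        foreach (double d in distancesAu.Values) maxAu = Math.Max(maxAu, d);

        // Scale every planet proportionally to the room radius.
        var scaled = new Dictionary<string, double>();
        foreach (var kv in distancesAu)
            scaled[kv.Key] = kv.Value / maxAu * roomRadiusMetres;
        return scaled;
    }
}

// With a 2 m play area, Neptune (~30.1 AU) sits at the 2 m edge while
// Earth (1 AU) ends up only ~6.6 cm from the Sun -- which shows exactly
// why a pure accurate-scale mode is unusable indoors and a "room-fit"
// toggle is needed!
```

In the actual app, the "room-fit" view would likely exaggerate the inner planets' spacing for readability, while the accurate-scale mode would use proportions like these to teach the true vastness of the solar system.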


Technology & AR Inspiration

  • AR Framework: ARKit/ARCore, using plane detection and ambient light estimation.
  • Data Layers: Interactive information panels, dynamic animations (e.g., asteroid belts, planetary rotation), and 3D labels.

AR References:

  • solAR – for accurate celestial model scaling.
  • IKEA Place – for maintaining consistent scale and placement.
  • Pokémon Go – for encouraging exploration and movement.

Target Audience

Elementary to middle school students (ages 8–14), science educators, museum programs, or curious learners of all ages. Can be used in classrooms, planetariums, or homes as a visual supplement to lessons on space and astronomy.


Sketches + Mockups

