Game Log 4 - Create
Game Description:
“Stand with Them” is an AR narrative experience that transforms your surroundings into a stage of power and resistance. Inspired by a striking painting of red giants oppressing green figures, the game overlays symbolic scenes of social control onto real‐world spaces and invites you to make moral choices: intervene, stay silent, or side with the oppressors. Each decision reshapes the virtual environment and characters, bringing Europe’s cultural memory of oppression and resistance to life while embodying core EU human rights values. By stepping into the role of an active citizen, you move from bystander to empowered agent, confronting injustice in both the game world and beyond.
Mechanism, European value, creativity:
Our game is structured like a visual novel (gal game): the story is delivered through dialogue, letting players immerse themselves in the details of each group's story. The EU's human rights values are reflected in the demeanor of the oppressors and the oppressed in each group.
This game is closely connected with the CGJ theme and the values of the European Union, emphasizing freedom, equality, and human rights.
a. CREATE
To support our project, we adopted several methods from CREATE and PLAY.
1. Playable Prototypes
We drew the entire game design on a paper board and discussed in detail how playable the game is and whether it is easy to understand.
2. 6-8-5 Game Sketching
In this step, every group member took part and produced 5-10 preliminary "culture + value + gameplay" game concepts. We then held a group vote to rank the results and decide on the final plan.
3. Concept Convergence
Based on the 6-8-5 Game Sketching, we carried out a detailed design of the game prototype, clearly writing out the concept name, background story, core gameplay, and value proposition.
b. Process Documentation
Initial stage:
In the initial stage of the project, we adopted the simplest design: the three groups of characters had no actions or voice, and this version only featured simple modal-box interactions. The teaching assistant's suggestion was to enhance the interactivity.
After expert review and test feedback:
In the early-stage feedback, the experts felt that the game's overall interactivity was insufficient and that we should not stick to the dialogue format alone; the player should also be able to interact with the models directly. We therefore tried both poke and grab interactions, enriched the details of the story, and added voice to make the game feel more realistic overall.
Prototyping
We built our AR prototype in Unity 2022.3 LTS using OpenXR and Meta XR Building Blocks, relying solely on the headset’s inside-out SLAM and hand-tracking—no external trackers.
The experience comprises two scenes: StartScene, which presents a simple StartUI Canvas with a “Start” button, and GameScene, which instantiates three green/red NPC pairs, spawning a world-space DialogueUI at each pair’s midpoint facing the player as well as a SpeechBubble above the green NPC’s head (with smooth-follow and audio playback).
Interaction is driven by four inputs:
1. Controller ray and poke for all world-space UI (StartUI, EndUI) and for pointing at the floor to spawn NPCs at user-chosen real-world locations.
2. Hand poke to tap DialogueUI and EndUI buttons.
3. Hand grab (via the SDK’s GrabInteractable) to “touch” a green NPC (Help) or a red NPC (Join).
4. A proximity trigger in PersonInteraction.cs (head-to-NPC distance) to reveal the dialogue bubble/UI on approach and auto-ignore on departure.
Once all interactions complete, GameManager sums the scores and spawns a world-space EndUI with final results and Restart/Quit buttons that follow and face the user.
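The proximity trigger is essentially a per-frame distance check. A minimal sketch of that idea (class and field names here are illustrative, not the project's actual PersonInteraction.cs):

```csharp
using UnityEngine;

// Illustrative sketch of the distance-based trigger: the dialogue UI and
// speech bubble are shown when the player's head comes close to the NPC
// pair and hidden again when the player walks away.
public class ProximityTriggerSketch : MonoBehaviour
{
    [SerializeField] private Transform playerHead;     // e.g. the centre-eye camera
    [SerializeField] private GameObject dialogueUI;    // world-space DialogueUI for this pair
    [SerializeField] private GameObject speechBubble;  // bubble above the green NPC
    [SerializeField] private float triggerDistance = 1.5f;

    private bool inRange;

    private void Update()
    {
        float distance = Vector3.Distance(playerHead.position, transform.position);
        bool nowInRange = distance <= triggerDistance;

        if (nowInRange != inRange)          // only react on enter/leave
        {
            inRange = nowInRange;
            dialogueUI.SetActive(inRange);  // reveal on approach, auto-ignore on departure
            speechBubble.SetActive(inRange);
        }
    }
}
```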
Under the hood, a ScriptableObject (InteractionData) holds scene text, button labels, audio clips, and score deltas; PersonInteraction handles distance checks, UI/grab/poke events, animations, bubble updates, and score registration; and a singleton GameManager tracks when to display the EndUI.
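As a rough illustration of that split between data and logic (field names, menus, and the Ignore option are simplifications and assumptions, not the exact project code):

```csharp
using UnityEngine;

// Simplified version of the InteractionData asset: one per NPC pair,
// holding the text, labels, audio and score deltas used by PersonInteraction.
[CreateAssetMenu(menuName = "StandWithThem/InteractionData")]
public class InteractionData : ScriptableObject
{
    [TextArea] public string sceneText;   // dialogue shown in the world-space DialogueUI
    public string helpButtonLabel;
    public string joinButtonLabel;
    public string ignoreButtonLabel;
    public AudioClip voiceClip;           // played through the SpeechBubble
    public int helpScoreDelta;            // e.g. +1 for helping
    public int joinScoreDelta;            // e.g. -1 for joining the oppressors
}

// Minimal singleton that collects scores and decides when to show the EndUI.
public class GameManager : MonoBehaviour
{
    public static GameManager Instance { get; private set; }

    [SerializeField] private int totalInteractions = 3;
    [SerializeField] private GameObject endUIPrefab;

    private int completedInteractions;
    private int totalScore;

    private void Awake()
    {
        if (Instance != null && Instance != this) { Destroy(gameObject); return; }
        Instance = this;
    }

    public void RegisterResult(int scoreDelta)
    {
        totalScore += scoreDelta;
        completedInteractions++;
        if (completedInteractions >= totalInteractions)
            Instantiate(endUIPrefab);     // EndUI then follows and faces the user
    }
}
```

In this sketch, each PersonInteraction would read one InteractionData asset and call GameManager.Instance.RegisterResult(...) once the player has made a choice for that pair.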
Challenges:
Challenge 1: Enabling "touch" interactions with virtual characters
Our initial attempt used hand pokes via collider triggers in OnTriggerEnter, but tuning collider size and position was error-prone, NPC body shapes varied too much, and physical versus scripted player movement conflicted—distance checks broke when the player moved. We switched to Meta XR's Hand Grab Interactable, subscribing to grab events instead. By disabling all Position/Rotation constraints in GrabFreeTransformer, grabs no longer move the NPC but still fire events. We also disabled controller rays on the hands to avoid ray/poke conflicts, binding the grab event in script to our Help/Join methods.
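A hedged sketch of how the grab event can end up calling Help/Join. The actual subscription goes through the Hand Grab Interactable's events, whose exact names vary by Meta XR SDK version; here the event is assumed to be wired to OnGrabbed() in the Inspector, and InteractionData/GameManager refer to the sketches above:

```csharp
using UnityEngine;

// Illustrative handler on each NPC: the Hand Grab Interactable's select/grab
// event is routed to OnGrabbed(). Because the GrabFreeTransformer constraints
// are disabled, the grab fires this event without actually moving the NPC.
public class NpcGrabHandlerSketch : MonoBehaviour
{
    public enum NpcRole { Green, Red }

    [SerializeField] private NpcRole role;
    [SerializeField] private InteractionData data;  // see the ScriptableObject sketch above

    // Wired to the grab/select event in the Inspector (exact SDK event name may differ).
    public void OnGrabbed()
    {
        if (role == NpcRole.Green)
            Help();
        else
            Join();
    }

    private void Help() => GameManager.Instance.RegisterResult(data.helpScoreDelta);
    private void Join() => GameManager.Instance.RegisterResult(data.joinScoreDelta);
}
```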
Challenge 2: Stable, context-aware UI and speech bubbles
Initially, both the dialogue panels and bubbles used a per-frame UIFollower, causing them to jitter with headset micro-movements and frame drops. We refactored so that only the EndUI uses UIFollower to constantly face the user. In GameScene, DialogueUI is instantiated once at each NPC pair's midpoint, while SpeechBubble employs a SmoothFollow script with Lerp for smooth head-anchored tracking—balancing steadiness with responsiveness.
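A minimal sketch of the Lerp-based smooth follow (names assumed for illustration):

```csharp
using UnityEngine;

// Rough sketch of the SpeechBubble's smooth-follow behaviour: the bubble
// lerps toward an anchor above the green NPC's head and turns to face the
// player, trading a little latency for stability against head micro-movement.
public class SmoothFollowSketch : MonoBehaviour
{
    [SerializeField] private Transform headAnchor;   // empty transform above the NPC's head
    [SerializeField] private Transform playerCamera; // used to orient the bubble
    [SerializeField] private float followSpeed = 5f;

    private void LateUpdate()
    {
        // Exponential-style smoothing: higher followSpeed = snappier tracking.
        transform.position = Vector3.Lerp(
            transform.position, headAnchor.position, followSpeed * Time.deltaTime);

        // Point the canvas forward away from the camera so the text stays readable.
        Vector3 awayFromCamera = transform.position - playerCamera.position;
        if (awayFromCamera.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(awayFromCamera);
    }
}
```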
Challenge 3: Real-world placement of NPCs
Because NPCs originally spawned at fixed start positions, they could be occluded by real furniture or clip through walls. We added a placement mode: upon loading GameScene, the player uses a controller ray to point at the floor. Pressing the trigger confirms the location, and the three NPC pairs are instantiated around that point using predefined offsets, ensuring they always appear in visible, unobstructed areas.
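A simplified sketch of the placement flow; the trigger check is stubbed out as an assumption (the project reads the controller's trigger through its own Meta/OpenXR input binding), and the offsets are placeholder values:

```csharp
using UnityEngine;

// Sketch of placement mode: point a controller ray at the floor, press the
// trigger to confirm, then spawn the three NPC pairs around that point
// using predefined offsets.
public class NpcPlacementSketch : MonoBehaviour
{
    [SerializeField] private Transform controllerRayOrigin; // controller pose used for the ray
    [SerializeField] private GameObject npcPairPrefab;
    [SerializeField] private LayerMask floorMask;
    [SerializeField] private Vector3[] pairOffsets =         // illustrative layout around the point
    {
        new Vector3(0f, 0f, 1.5f),
        new Vector3(-1.5f, 0f, 0.5f),
        new Vector3(1.5f, 0f, 0.5f),
    };

    private bool placed;

    private void Update()
    {
        if (placed) return;

        // Hypothetical input check; replace with the project's actual trigger binding.
        bool triggerPressed = Input.GetButtonDown("Fire1");

        Ray ray = new Ray(controllerRayOrigin.position, controllerRayOrigin.forward);
        if (triggerPressed && Physics.Raycast(ray, out RaycastHit hit, 10f, floorMask))
        {
            foreach (Vector3 offset in pairOffsets)
                Instantiate(npcPairPrefab, hit.point + offset, Quaternion.identity);
            placed = true;
        }
    }
}
```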
c. EXPO Preparation
Before the activity officially begins, we make thorough preparations to ensure participants can fully engage with the experience. To set the context, a carefully designed poster is displayed, providing users with a clear and engaging background introduction that explains the theme and relevance of the activity. This poster not only highlights the basic idea of the game but also outlines the intended mode of expression, helping users understand what to expect and how to interact. Additionally, we present a concrete example of an interactive presentation to visually demonstrate the flow of interaction, allowing participants to grasp the mechanics and objectives more intuitively before they begin.
What we hope to show at the EXPO:
How games can portray the relationship between oppressors and the oppressed, how to make this oppressive relationship clearer, and how to reflect the EU's human rights values.
FEEDBACK
Feedback was collected through a public Microsoft form that users were invited to fill out after playing the game. It includes two user-background questions, five rating-scale questions, and one open question.
According to the two background questions, more than half of the users had experience using VR/AR, and half had experience playing games on mobile phones and computers. Since the EXPO was mainly attended by students from the course, basic familiarity with games was guaranteed.
Based on the five rating-scale questions, most users found the game immersive (85.7%) and fun (71.4%). 78.6% of users found the game easy to use, and more than half found grabbing intuitive and accurate.
These results suggest that the core game design succeeded.
The final open question pointed out directions for us: most responses suggested that the game flow should be more complex, the UI richer, and so on.
In the future, we will focus on the following directions:
1. Make the story more complex, so that decisions are harder for users to make
2. Make the UI more polished, without unnecessary occlusion
AR-WXW-Group 9