VisionHack: Workshop
Entry Presentation⌗
Post-Mortem and Lessons Learned⌗
The inaugural Vision Hack was a 2-day global event held between September 13th and September 15th, 2024.
Overall I think this is a fantastic start to what, I hope, will be a vibrant community of passionate developers for Apple Vision Pro and Spatial Computing.
Vision Hack gave developers 2 days to create an original work for Apple Vision Pro. Developers could choose to work in teams of up to 5 or solo.
I chose to develop solo, as I have with all of the game jams I’ve done this year. I have a few main reasons for working solo.
- Hackathons and game jams are a way to reinforce what I’ve learned about a platform.
- I use the deadline as a way to force myself to implement something non-trivial within an externally defined time box.
- My schedule may fluctuate and I may not always be able to complete the jam. If so, I can drop out without inconveniencing a team.
Entry Details⌗
My entry into Vision Hack was titled ‘Workshop’.
The general premise is that you are an individual who is tasked with repairing broken electronics in order to prevent them from going to a landfill.
Workshop primarily takes place in an immersive environment where the main play area would be a workbench presented in front of the player.
The workbench itself would consist of 2 main areas.
- Workspace - this is where players pick the part to repair and where the mini games would appear.
- Backboard - the part being repaired appears here along with the mini game’s UI.
The workspace is a centered rectangular region that lies flat on the workbench.
Game Play⌗
- The player is presented with a part to be repaired.
- The player selects the part to be repaired.
- A mini game is presented.
- The player completes the mini game.
- The player is rewarded.
- A new part is presented to continue the flow.
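Internally, that loop maps neatly onto a small state machine that the main app advances as the player progresses. A minimal sketch of the idea (the names here are mine, not the actual implementation):

```swift
/// Illustrative phases of the repair loop; the names are hypothetical.
enum RepairPhase {
    case presentingPart      // a broken part appears on the workbench
    case awaitingSelection   // waiting for the player to pick the part
    case playingMiniGame     // a themed mini game is running
    case rewarding           // the player receives their reward
}

struct RepairFlow {
    private(set) var phase: RepairPhase = .presentingPart

    /// Advance to the next phase, looping back to a fresh part after the reward.
    mutating func advance() {
        switch phase {
        case .presentingPart:    phase = .awaitingSelection
        case .awaitingSelection: phase = .playingMiniGame
        case .playingMiniGame:   phase = .rewarding
        case .rewarding:         phase = .presentingPart
        }
    }
}
```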
When selecting a device to repair, the player enters a themed mini game that corresponds to both the device and the potential defect.
For example, if you saw smoke the device could be overheating, or if you saw sparks the device could be short-circuiting. The device itself could have multiple defects representing difficulty levels. To resolve the defects the player would complete the themed mini-game.
Devices⌗
The devices presented could be any electronic device, with the more complex devices representing more difficult mini-games or a higher number of mini-games necessary to complete the repair.
For the VisionHack I implemented the following devices.
- CPU
- Hard Drive
- Floppy Disk
Mini Games⌗
I decided to implement only 2 mini games for VisionHack.
- Defender - Player has to use a cannon to fend off corruption trying to invade the device
- Recovery - Player has to use a puck to push away components from a data block
Defender Game Play⌗
Players would drag a targeting reticle to aim the cannon. The cannon would fire at a fixed rate.
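On visionOS this kind of interaction can be built from a DragGesture targeted at the reticle entity plus a fixed-interval timer for the cannon. A rough sketch, with hypothetical entity and helper names rather than Workshop’s actual code:

```swift
import SwiftUI
import RealityKit
import Combine

struct DefenderView: View {
    // Hypothetical reticle entity the player drags to aim the cannon.
    @State private var reticle: ModelEntity = {
        let entity = ModelEntity(
            mesh: .generateSphere(radius: 0.01),
            materials: [SimpleMaterial(color: .red, isMetallic: false)]
        )
        // Needed so the entity can receive SwiftUI gestures.
        entity.components.set(InputTargetComponent())
        entity.generateCollisionShapes(recursive: false)
        return entity
    }()

    // The cannon fires at a fixed rate, independent of player input.
    private let fireTimer = Timer.publish(every: 0.5, on: .main, in: .common).autoconnect()

    var body: some View {
        RealityView { content in
            content.add(reticle)
        }
        // Dragging moves the targeting reticle; the cannon aims at it.
        .gesture(
            DragGesture()
                .targetedToEntity(reticle)
                .onChanged { value in
                    value.entity.position = value.convert(value.location3D,
                                                          from: .local,
                                                          to: value.entity.parent!)
                }
        )
        .onReceive(fireTimer) { _ in
            fireCannon(at: reticle.position)   // hypothetical helper
        }
    }

    private func fireCannon(at target: SIMD3<Float>) {
        // Spawn a projectile toward `target`; omitted for brevity.
    }
}
```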
Win Condition⌗
Destroy 10 enemies
Lose Condition⌗
Let 10 enemies past the cannon
Recovery Game Play⌗
Enemies representing corrupted data would spawn along the edges of the workspace and advance towards the data block.
If a corrupted enemy reached the data block, it would infect the data block.
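A simple way to get that behavior is to spawn each enemy at a random point on the workspace rectangle’s edge and step it toward the data block every frame. A minimal sketch, assuming the data block sits at the workspace origin (the sizes and names are illustrative):

```swift
import RealityKit
import simd

/// Spawns a corrupted-data enemy at a random point on the edge of the
/// workspace rectangle (dimensions in meters, purely illustrative).
func spawnEnemy(in workspace: Entity,
                width: Float = 0.6,
                depth: Float = 0.4) -> ModelEntity {
    let enemy = ModelEntity(
        mesh: .generateBox(size: 0.03),
        materials: [SimpleMaterial(color: .purple, isMetallic: false)]
    )
    let halfW = width / 2
    let halfD = depth / 2
    // Pick one of the four edges, then a random point along it.
    switch Int.random(in: 0..<4) {
    case 0:  enemy.position = [Float.random(in: -halfW...halfW), 0, -halfD] // back edge
    case 1:  enemy.position = [Float.random(in: -halfW...halfW), 0,  halfD] // front edge
    case 2:  enemy.position = [-halfW, 0, Float.random(in: -halfD...halfD)] // left edge
    default: enemy.position = [ halfW, 0, Float.random(in: -halfD...halfD)] // right edge
    }
    workspace.addChild(enemy)
    return enemy
}

/// Moves an enemy toward the data block at the workspace origin; call once per frame.
func step(_ enemy: Entity, deltaTime: Float, speed: Float = 0.05) {
    let toBlock = -enemy.position
    let distance = simd_length(toBlock)
    guard distance > 0.001 else { return }  // reached the block; the host handles the infection
    enemy.position += simd_normalize(toBlock) * min(speed * deltaTime, distance)
}
```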
Win Condition⌗
Successfully defend the data block for the length of the mini game.
Lose Condition⌗
Let 3 enemies touch the data block
Challenges and Thoughts⌗
Volume vs Immersive Space⌗
I faced a key decision on whether I wanted the game to operate in a volume or an immersive space.
Something that I constantly run into with Apple Vision Pro is that most games that I own require an immersive space. Unfortunately this means that it’s difficult to multitask and play a game.
Want to watch a movie and play a quick game? Nope. Want to have a web page open in the background? Nope.
The main reason for this requirement is that Apple prevents an app from accessing user hand tracking and world sensing if it is not running as an immersive space.
This restriction is in place to preserve user privacy and is absolutely the right thing to do. However, this means that games launched in a volume have to rely on the visionOS default gestures for all game interaction. This leaves you with a very limited set of tools for advanced game mechanics. You can require a controller, but that will further limit the potential audience of your game.
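For context, this is roughly what the requirement looks like in code: hand tracking comes from an ARKitSession running a HandTrackingProvider, and the session only delivers anchor updates while an immersive space is open. The scene id and view names below are my own sketch, not Workshop’s actual code:

```swift
import SwiftUI
import RealityKit
import ARKit

@main
struct WorkshopApp: App {
    var body: some Scene {
        // Hand tracking and world sensing are only available while an
        // immersive space is open; a window or volume alone cannot use them.
        // (A real app would declare its 2D windows here as well.)
        ImmersiveSpace(id: "workbench") {
            WorkbenchImmersiveView()
        }
    }
}

struct WorkbenchImmersiveView: View {
    private let session = ARKitSession()
    private let handTracking = HandTrackingProvider()

    var body: some View {
        RealityView { _ in
            // Workbench content would be added here.
        }
        .task {
            do {
                try await session.run([handTracking])
                for await update in handTracking.anchorUpdates {
                    // update.anchor carries the joint transforms for one hand.
                    _ = update.anchor
                }
            } catch {
                print("Hand tracking unavailable: \(error)")
            }
        }
    }
}
```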
I ended up choosing an immersive space because I wanted access to hand tracking and the possibility of anchoring the workbench to a horizontal surface, which I ended up not implementing due to time constraints.
This had 2 major impacts on the project.
- I spent time working on workbench repositioning in an immersive space, effectively duplicating what a volume already gives you. If you long press on the workbench, a reposition bar appears and allows you to drag the workbench (sketched below).
- I didn’t fully embrace the immersive space and as a result did not take advantage of the features available to immersive applications.
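The repositioning interaction from the first point can be sketched like this: a long press reveals a reposition bar entity, and dragging that bar moves the workbench root. The names are mine, and both entities would also need an InputTargetComponent and collision shapes to receive gestures:

```swift
import SwiftUI
import RealityKit

/// Rough sketch of the reposition interaction; entity names are illustrative.
struct WorkbenchRepositionModifier: ViewModifier {
    let workbenchRoot: Entity   // parent of all workbench content
    let repositionBar: Entity   // hidden until a long press reveals it

    func body(content: Content) -> some View {
        content
            // A long press on the workbench reveals the reposition bar.
            .gesture(
                LongPressGesture(minimumDuration: 0.5)
                    .targetedToEntity(workbenchRoot)
                    .onEnded { _ in repositionBar.isEnabled = true }
            )
            // Dragging the bar moves the entire workbench.
            .simultaneousGesture(
                DragGesture()
                    .targetedToEntity(repositionBar)
                    .onChanged { value in
                        workbenchRoot.position = value.convert(value.location3D,
                                                               from: .local,
                                                               to: workbenchRoot.parent!)
                    }
                    .onEnded { _ in repositionBar.isEnabled = false }
            )
    }
}
```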
Fighting Neck Strain⌗
I believed that having the workspace directly on the desk would make players experience neck pain from having to look down at the content.
The backboard was created to address this. Having the game’s UI here would encourage users to look up and hopefully move their head into a more neutral position.
Additionally, the mini game’s origin would be elevated away from the desk so that the content of the game would sit at a more neutral position.
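Concretely this was just entity placement relative to the workbench root; something like the following, with made-up offsets:

```swift
import RealityKit

// Illustrative layout only; the offsets are made up.
let workbench = Entity()        // root, sitting at desk height in front of the player
let backboard = Entity()        // UI surface at the rear of the workbench
let miniGameRoot = Entity()     // origin of the active mini game

backboard.position = [0, 0.35, -0.25]   // raised and pushed back, closer to eye level
miniGameRoot.position = [0, 0.15, 0]    // lifted off the desk surface

workbench.addChild(backboard)
workbench.addChild(miniGameRoot)
```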
Scope⌗
The scope for this project was too large for a 2-day event. In hindsight, I should have just worked on a series of mini games that appear in front of the player.
Mini Game infrastructure⌗
The mini games should have been selectable from a list that appeared in front of the player or randomly picked until the user had completed a set of mini games.
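Even something as simple as drawing from the remaining set at random until it is exhausted would have covered it; a hypothetical sketch:

```swift
/// Hypothetical picker that serves mini games in a random order until the
/// player has completed the whole set.
struct MiniGameQueue {
    private var remaining: [String]

    init(games: [String]) {
        remaining = games.shuffled()
    }

    /// The next mini game to play, or nil once the set is complete.
    mutating func next() -> String? {
        remaining.popLast()
    }
}

var queue = MiniGameQueue(games: ["defender", "recovery"])
while let game = queue.next() {
    print("Play \(game)")
}
```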
Assets⌗
I waited too long to decide what I should do regarding assets. I ended up trying to use Blender to create some components but abandoned this due to time constraints.
In the end I used a tool called Kenney Shape to create a few voxel representations of the components. However, because I was running low on time, I was left with a mix of RealityKit primitive shapes and assets created in Kenney Shape. Doing it over again, I would have used this tool from the start.
Effects⌗
I did not spend enough time on the particle effects and celebration effects. Due to time constraints I used the default particle emitters from Reality Composer Pro. These should have been left out entirely or done at the end.
Presentation⌗
I did not leave myself enough time to work on the presentation. Like most of the previous points this all stems from not scoping the project well enough.
Poor scoping had a cascading effect and really limited what I could do. I don’t feel like I was able to accurately express what I accomplished during VisionHack.
Lessons Learned⌗
Two days to implement ‘Workshop’ was certainly a challenging goal. In the end, I am happy with what I was able to accomplish even while being disappointed that the visuals suffered.
During the two days I was able to build an infrastructure that allowed arbitrary mini games to be loaded and played by the user. Each mini game coordinated with the main application to
- load required assets
- manage mini game state
- process game events
- run the game play loop
- respond to user interaction
This was all done with an eye towards being able to quickly add more mini games in the future.
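The shape of that coordination can be summed up as a protocol the host app drives; this is an illustrative reconstruction rather than the exact interface:

```swift
import Foundation
import RealityKit

/// Illustrative sketch of the responsibilities each mini game handled;
/// the real interface differs, but covered the same ground.
protocol MiniGame: AnyObject {
    /// Load the models, materials, and sounds the game needs.
    func loadAssets() async throws

    /// Reset internal state and attach the game's entities under `root`.
    func start(in root: Entity)

    /// Process events from the host app (pause, resume, and so on).
    func handle(_ event: MiniGameEvent)

    /// Advance the game play loop; called by the host once per frame.
    func update(deltaTime: TimeInterval)

    /// Respond to user interaction the host has hit-tested to an entity.
    func interact(with entity: Entity)

    /// How the game ended, if it has.
    var outcome: MiniGameOutcome? { get }
}

enum MiniGameEvent { case pause, resume }
enum MiniGameOutcome { case won, lost }
```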
Not bad if I say so myself (and I do).