Remaining To-Do For My Next Unity 3D Adventure

I enjoyed my Unity 3D adventure this time around, starting from the LEGO microgame tutorials through to the Essentials pathway and finally venturing out and learning pieces at my own pace in my own order. My result for this Unity 3D session was Bouncy Bouncy Lights and while I acknowledge it is a beginner effort, it was more than I had accomplished on any of my past adventures in Unity 3D. Unfortunately, once again I find myself at a pause point, without a specific motivation to do more with Unity. But there are a few items still on the list of things that might be interesting to explore for the future.

The biggest gap I see in my Unity skill set is creating my own unique assets. Unity supports both 2D and 3D creations, but I don’t have art skills in either field. I’ve dabbled in Inkscape enough that I might be able to build some rudimentary 2D things if I need to, and for 3D meshes I could import STL files and apply my 3D-printing CAD skills. But the real answer is Blender or similar 3D modeling software, and that’s an entirely different learning curve to climb.

Combing through Unity documentation, I learned of a “world building” tool called ProBuilder. I’m not entirely sure where it fits in the greater scheme of things, but I can see it has tools to manipulate meshes and even create them from scratch. It doesn’t claim to be a full modeling tool, but supposedly it’s good for whipping up quick mockups and placeholders. Most of the information about ProBuilder focuses on UV mapping, a term I didn’t know at the start. All of the ProBuilder documentation assumes I already knew what UV meant, and all I could tell was that it didn’t mean ultraviolet in this context. Fortunately, searching for UV in the context of 3D graphics led me to this Wikipedia article on UV mapping. There is a dearth of written documentation for ProBuilder; what little I found all points to a YouTube playlist. Maybe I’ll find the time to sit through it later.

I skimmed through the Unity Essentials sections on audio because Bouncy Bouncy Lights was to be silent, so audio is still on the to-do list. And like 2D/3D asset creation, I’m neither a musician nor a sound engineer. But if I ever come across motivation to climb this learning curve, I know where to go to pick up where I left off. I know I have a lot to learn, since my meager audio experimentation already produced one lesson: AudioSource.Play stops any prior playback on that source before starting over. If I want instances of the same sound to overlap, I have to use AudioSource.PlayOneShot.
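
Here is a minimal sketch of that lesson, assuming a hypothetical bounce sound played on collision; the clip and the trigger are my own illustration, not code from Bouncy Bouncy Lights:

  using UnityEngine;

  public class BounceSound : MonoBehaviour
  {
      public AudioClip bounceClip;   // assigned in the Inspector (hypothetical asset)
      private AudioSource source;

      void Start()
      {
          source = GetComponent<AudioSource>();
      }

      void OnCollisionEnter(Collision collision)
      {
          // source.Play() would restart the source's clip, cutting off a bounce
          // that is still ringing. PlayOneShot lets overlapping bounces finish.
          source.PlayOneShot(bounceClip);
      }
  }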

Incorporating video is an interesting way to make Unity scenes more dynamic without adding complexity to the scene or overloading the physics or animation engine. There’s a Unity Learn tutorial on this topic, but I found that video assets are not incorporated into WebGL builds. The documentation says video files must be hosted independently for playback by WebGL, which adds to the hosting complications if I want to go down that route.

WebGL
The Video Clip Importer is not used for WebGL game builds. You must use the Video Player component’s URL option.
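
So for a WebGL build, the playback sketch would look something like this, with the URL being a placeholder for wherever the video file ends up hosted:

  using UnityEngine;
  using UnityEngine.Video;

  public class WebHostedVideo : MonoBehaviour
  {
      void Start()
      {
          VideoPlayer player = gameObject.AddComponent<VideoPlayer>();
          player.source = VideoSource.Url;
          // Placeholder address: the actual file must be hosted somewhere
          // the browser can reach, separate from the WebGL build itself.
          player.url = "https://example.com/video/clip.mp4";
          player.Play();
      }
  }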

And finally, I should set aside time to learn about shaders. Unity’s default shader is effective, but it has become quite recognizable, and there are jokes about the “Unity Look” of games that never modified default shader properties. I personally have no problem with this, as long as the gameplay is good. (I highly recommend the Overcooked game series, which was built in Unity and has the look.) But I am curious about how to make a game look distinctive, and shaders are the best tool to do so. I found a short Unity Learn tutorial, but it doesn’t cover very much before dumping readers into the Writing Shaders section of the manual. I was also dismayed to learn that we don’t have IntelliSense or similar helpfulness in Visual Studio when writing shader files. This is going to be a course of study all on its own, and again I await a good motivation to go climb that learning curve.

I enjoyed this session of Unity 3D adventure, and I really loved that I got far enough this time to build my own thing. I’ve summarized this adventure in my talk to ART.HAPPENS, hoping that others might find my experience informative in video form in addition to written form on this blog. I’ve only barely scratched the surface of Unity. There’s a lot more to learn, but that’ll be left to future Unity adventures because I’m returning to rover work.

Venturing Beyond Unity Essentials Pathway

To help beginners learn how to create something simple from scratch, Unity Learn set up the Essentials pathway, which I followed. Building from an empty scene taught me a lot of basic tasks that had already been done for us in the LEGO microgame tutorial template. That was enough to make me feel ready to start building my own project for ART.HAPPENS. It was a learning exercise, running into one barrier after another, but I felt confident I knew the vocabulary to search for answers on my own.

Exercises in the Essentials pathway got me started on the Unity 3D physics engine, with information about setting up colliders and physics materials. Building off the rolling ball exercise, I created a big plane for balls to bounce around in and increased the bounciness for both ball and plane. The first draft was a disaster: unlike real life, it is trivial to build a perfectly flat plane in a digital world, so the balls kept bouncing in the same place forever. I had to introduce a tilt to make the bounces more interesting.
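
Bounciness lives in a physics material assigned to a collider. I set mine up in the editor, but here is a rough sketch of the same idea from a script, with illustrative values rather than the ones I actually used:

  using UnityEngine;

  public class BouncySetup : MonoBehaviour
  {
      void Start()
      {
          // Create a very bouncy physics material and apply it to this
          // object's collider. Values are illustrative only.
          PhysicMaterial bouncy = new PhysicMaterial("Bouncy");
          bouncy.bounciness = 0.95f;                             // 1.0 would be perfectly elastic
          bouncy.bounceCombine = PhysicMaterialCombine.Maximum;  // use the bouncier of the two surfaces

          GetComponent<Collider>().material = bouncy;
      }
  }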

But while bouncing balls look fun (title image), they weren’t quite good enough. I thought adding a light source might help, but that still wasn’t interesting enough. Switching from ball to cube gave me a clearly illuminated surface with falloff in brightness, which I thought looked more interesting than a highlight point on a ball. However, cubes don’t roll and would stop on the plane. For a while I was torn: cubes look better but spheres move better. Which way should I go? Then came a stroke of realization: this is a digital world and I can change the rules if I want. So I used a cube mesh for visuals, but attached a sphere collider for physics. Now I have objects that look like cubes but bounce like balls, something nearly impossible in the real world but trivial in the digital world.
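
In Unity terms this is just a mesh and a collider that deliberately don’t match. A minimal sketch of the idea (not the actual Bouncy Bouncy Lights prefab) could build such an object like this:

  using UnityEngine;

  public class CubeThatBouncesLikeABall : MonoBehaviour
  {
      void Start()
      {
          // Visuals: a cube mesh as a child object, with its box collider
          // removed so it contributes nothing to physics.
          GameObject visual = GameObject.CreatePrimitive(PrimitiveType.Cube);
          Destroy(visual.GetComponent<BoxCollider>());
          visual.transform.SetParent(transform, false);

          // Physics: a sphere collider plus rigidbody on the parent, so the
          // whole thing tumbles and bounces like a ball.
          SphereCollider sphere = gameObject.AddComponent<SphereCollider>();
          sphere.radius = 0.5f;   // half the width of the unit cube
          gameObject.AddComponent<Rigidbody>();
      }
  }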

To make these lights show up better, I wanted a dark environment. This was a multi-step procedure. First I did the obvious: delete the default light source that came with the 3D project template. Then I had to look up environment settings to turn off the “Skybox”. That still wasn’t dark, until I edited the camera settings to change the default background color to black. Once everything went black, I noticed the cubes weren’t immediately discernible as cubes anymore, so I turned the lights back up… but then decided it was more fun to start dark and turned the lights back off.
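
I made those changes in the editor’s Lighting and Camera settings, but the same darkness can be expressed in a script, which may make the steps clearer. A sketch, not my actual setup code:

  using UnityEngine;
  using UnityEngine.Rendering;

  public class DarkRoomSetup : MonoBehaviour
  {
      void Start()
      {
          // Stop rendering the skybox and fill the background with black instead.
          Camera cam = Camera.main;
          cam.clearFlags = CameraClearFlags.SolidColor;
          cam.backgroundColor = Color.black;

          // Remove the ambient light contribution from the environment as well.
          RenderSettings.ambientMode = AmbientMode.Flat;
          RenderSettings.ambientLight = Color.black;
      }
  }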

I wanted to add user interactivity, but realized the LEGO microgame used an entirely different input system than standard Unity, and nothing on the Essentials path taught me about user input. Searching around on Unity Learn, I got very confused by contradictory information until I eventually figured out there are two Unity user input systems. There’s the “Input Manager”, which is the legacy system, and its candidate replacement, the “Input System Package”, which is intended to solve problems with the old system. Since I had no investment in the old system, I decided to try the new one. Unfortunately, even though there’s a Unity Learn session, I still found it frustrating, as did others. I got far enough to add interactivity to Bouncy Bouncy Lights, but it wasn’t fun. I’m not even sure I should be using it yet, seeing how none of the microgames did. Now that I know enough to know what to look for, I could see that the LEGO microgame used the old input system. Either way, there’s more climbing of the learning curve ahead. [UPDATE: After I wrote this post, but before I published it, Unity released another tutorial for the new Input System. Judging by that demo, Bouncy Bouncy Lights is using it incorrectly.]
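
For the record, the simplest way I know to read the new Input System Package is to poll devices directly, roughly as sketched below. The recommended workflow apparently goes through Input Action assets instead, which is likely where my usage falls short:

  using UnityEngine;
  using UnityEngine.InputSystem;   // the new Input System Package

  public class DirectKeyboardPolling : MonoBehaviour
  {
      void Update()
      {
          Keyboard keyboard = Keyboard.current;
          if (keyboard == null) return;   // no keyboard attached

          if (keyboard.leftArrowKey.isPressed || keyboard.rightArrowKey.isPressed)
          {
              // tilt the platform, etc.
          }
          if (keyboard.spaceKey.wasPressedThisFrame)
          {
              // spawn an extra light, etc.
          }
      }
  }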

The next to-do item was to add the title and interactivity instructions. After the frustration of exploring a new input system, I went back to the LEGO microgame and looked up exactly how they presented their text. I learned it was a system called TextMesh Pro; thankfully it has a Unity Learn section, and a PDF manual was installed as part of the asset download. Following those instructions, it was straightforward to put up some text using the default font. After my input system frustration, I didn’t want to get much more adventurous than that.
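
Putting up text really is about that simple. A sketch, assuming a TextMeshPro object already exists in the scene and this script is attached to it:

  using TMPro;
  using UnityEngine;

  public class TitleCardText : MonoBehaviour
  {
      void Start()
      {
          // TMP_Text is the shared base class for both the 3D TextMeshPro
          // component and its UI (TextMeshProUGUI) counterpart.
          TMP_Text label = GetComponent<TMP_Text>();
          label.text = "Bouncy Bouncy Lights";
      }
  }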

I had debated when to present the interactivity instructions. Ideally I would present them just as the audience got oriented and recognized the default setup, possibly starting to get bored and ready to move on, so I could offer interactivity to keep their attention. But I have no idea when that would be. When I read the requirement that the title of the piece should be in the presentation, I added it as a title card before showing the bouncing lights. And once I added a title card, it was easy to add another card with the instructions, shown before the bouncing lights. The final twist was the realization that I shouldn’t present them as static cards that fade out: since I already had all these physical interactions in place, the cards are presented as falling, bouncing objects in their own right.

The art submission instructions said to put in my name and a way for people to reach me, so I put my name and newscrewdriver.com at the bottom of the screen using TextMesh Pro. Then it occurred to me the URL should be a clickable link, which led me down the path of finding out how a Unity WebGL title can interact with the web browser. There seemed to be several different deprecated ways to do it, but they all pointed to the current recommended approach, and now my URL is clickable! For fun, I added a little spotlight effect when the mouse cursor is over the URL.
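
I won’t reproduce that recommended approach from memory, but the general shape of a clickable object is easy to sketch: detect the click, then hand the URL to the browser. Application.OpenURL is the simplest built-in option, though what I actually ended up using may differ:

  using UnityEngine;

  // Requires a collider on the same object so the mouse events fire.
  public class ClickableUrl : MonoBehaviour
  {
      public string url = "https://newscrewdriver.com";

      void OnMouseUpAsButton()
      {
          // In a WebGL build this asks the browser to open the URL.
          Application.OpenURL(url);
      }
  }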

The final touch was to modify the presentation HTML to suit the Gather virtual space used by ART.HAPPENS. By default a Unity WebGL build generates an index.html file that puts the project inside a fixed-size box. Outside that box are the title and a button to go full screen. I didn’t want the full-screen option for presenting this work in Gather; instead I wanted to fill my <iframe> rather than a little box within it. My CSS layout skills are still pretty weak and I couldn’t figure it out on my own, but I found this forum thread which taught me to replace the <body> tag with the following:

  <body>
      <div class="webgl-content" style="width:100%; height:100%">
        <div id="unityContainer" style="width:100%; height:100%">
        </div>
      </div>
  </body>

I don’t understand why we need to put 100% styles on two elements before it works, but hopefully I will understand once I get around to my long-overdue study session on CSS layout. The final result of my project can be viewed at my GitHub Pages hosting location. That’s a satisfying result, but there is a lot more of Unity left to learn.

Notes on Unity Essentials Pathway

As far as Unity 3D creations go, my Bouncy Bouncy Lights project is pretty simple, as expected of a beginner’s learning project. My Unity (re)learning session started with their LEGO microgame tutorial, but I didn’t want to submit a LEGO-derived Unity project for ART.HAPPENS. (And it might not have been legal under the LEGO EULA anyway.) So after completing the LEGO microgame tutorial and its suggested Creative Mods exercises, I still had more to learn.

The good news is that Unity Learn has no shortage of instruction materials; the bad news is that a beginner gets lost on where to start. To help with this, they’ve recently (or at least since the last time I investigated Unity) rolled out the concept of “Pathways”, which organize a set of lessons targeted at a particular audience. People looking for something to do after completing their microgame tutorial are sent towards the Unity Essentials Pathway.

Before throwing people into the deep pool that is Unity Editor, the Essentials Pathway starts by setting us up with a lot of background information in the form of video interview clips with industry professionals who use Unity. I prefer to read instead of watching videos, but I wanted to hear these words of wisdom so I sat through them. I loved that they allocated time to assure beginners that they’re not alone if they find Unity Editor intimidating at first glance. The best part was the person who said their first experience was taking one look, saying “Um, no.”, closing Unity, and not returning for several weeks.

Other interviews covered the history of Unity, how it enabled the creation of real-time interactive content, and how the tool evolved alongside the industry. There was also information for people interested in building a new career using Unity, introducing terminology and even common job titles that can be used in searches on sites like LinkedIn. I felt this section offered more applicable career advice than I ever received in college for my own field. I was mildly amused and surprised to see the Unity classes ended with a quiz to make sure I understood everything.

After this background we are finally set loose on Unity Editor, starting from scratch. Well, from an empty 3D project template, which is as close to scratch as I cared to get. The template has a camera and a light source but not much else, unlike the microgames, which come already filled with assets and code. This is what I wanted to see: how do I start from geometry primitives and work my way up, pulling from the Unity Asset Store as needed for useful prebuilt pieces? One of the exercises was to make a ball roll down a contraption of our own design (title image), and I paid special attention to that one. The Unity physics engine was the main reason I chose to study Unity instead of three.js or A-Frame, and it became the core of Bouncy Bouncy Lights.

I’ve had a lot of experience writing C# code, so I was able to quickly breeze through the C# scripting portions of Unity Essentials. But I’m not sure this is enough to get a non-coder up and running on Unity scripting. Perhaps Unity decided they’re not a coding boot camp and didn’t bother to start at the beginning. People who have never coded before will need to go elsewhere before coming back to Unity scripting, and a few pointers in that direction would be nice.

I skimmed through a few sections that I decided were unimportant for my ART.HAPPENS project. Sound was one of them: very important for an immersive gaming experience, but my project will be silent because the Gather virtual space has a video chatting component and I didn’t want my sounds to interfere with people talking. Another area I quickly skimmed was using Unity for 2D games, which was not my goal this time, but perhaps I’ll return to it later.

And finally, there was information pointing us to Unity Connect and setting up a profile. At first glance it looked like Unity tried to set up a social network for Unity users, but it is shutting down, with portions redistributed to other parts of the Unity network. I had no investment here so I was unaffected, but it made me curious how often Unity shuts things down. Hopefully not as often as Google, which has become infamous for doing so.

I now have a basic grasp on this incredibly capable tool, and it’s time to start venturing beyond guided paths.

Bouncy Bouncy Lights

My motivation for going through Unity’s LEGO microgame tutorial (plus associated exercises) was to learn Unity Editor in the hopes of building something for ART.HAPPENS, a community virtual art show. I didn’t expect to build anything significant with my meager skills, but I made something with the skill I have. It definitely fit with the theme of everyone sharing works that they had fun with, and learned from. I arrived at something I felt was a visually interesting interactive experience which I titled Bouncy Bouncy Lights and, if selected, should be part of the exhibition opening today. If it was not selected, or if the show has concluded and its site taken down, my project will remain available at my own GitHub Pages hosting location.

There are still a few traces of my original idea, which was to build a follow-up to Glow Flow: something colorful with Pixelblaze-controlled LED lights. But I decided to move from the physical to the digital domain, so now I have random brightly colored lights in a dark room, each reflecting off an associated cube. By default there isn’t enough light for the viewer to immediately see the whole cube, just the illuminated face. I want them to observe the colorful lights moving around for a bit before they recognize what’s happening, prompting the delight of discovery.

Interactivity comes in two forms. Arrow keys change the angle of the platform, which changes the direction of the bouncing cubes. There is also a default time interval for new falling cubes; I chose it so that there will always be a few lights on screen, but not so many as to make the cubes obvious. The user can press the space bar to add lights faster than the default interval. If the space bar is held down, the extra lights will add enough illumination to make the cubes obvious, and they’ll frequently collide with each other. I limited it to a certain rate (sketched below) because the aesthetics change if too many lights all jump in. Thankfully I don’t have to worry about things like ensuring sufficient voltage supply for lights when working in the digital world, but too many lights in the digital world add up to white, washing out the individual colors to pastel shades. And too many cubes interfere with bouncing, so we get an avalanche of cubes trying to get out of each other’s way. It’s not the look I want for the project, but I left in a way to do it as an Easter egg. Maybe people will enjoy bringing it up once in a while for laughs.
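
The spawn logic boils down to a timer whose interval shrinks, down to a floor, while the space bar is held. This sketch uses hypothetical names and values, and the legacy Input class for brevity rather than the Input System Package I actually used:

  using UnityEngine;

  public class LightSpawner : MonoBehaviour
  {
      public GameObject lightCubePrefab;       // hypothetical prefab reference
      public float defaultInterval = 2.0f;     // seconds between automatic spawns
      public float minimumInterval = 0.25f;    // cap on how fast the space bar can add lights

      private float timeSinceLastSpawn;

      void Update()
      {
          timeSinceLastSpawn += Time.deltaTime;

          // Holding space shortens the interval, but never below the minimum.
          float interval = Input.GetKey(KeyCode.Space) ? minimumInterval : defaultInterval;

          if (timeSinceLastSpawn >= interval)
          {
              Instantiate(lightCubePrefab, transform.position, Random.rotation);
              timeSinceLastSpawn = 0f;
          }
      }
  }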

I’m happy with how Bouncy Bouncy Lights turned out, but I’m even happier with it as a motivation for my journey learning how to work with a blank Unity canvas.

Notes on Unity LEGO Microgame Creative Mods

Once a Unity 3D beginner completes the tightly scripted microgame tutorial, we are directed towards a collection of “Creative Mods”. These suggested exercises build on top of what we created in the scripted tutorial, except now individual tasks are more loosely described and we are encouraged to introduce our own variations. We are also allowed to experiment freely, as the Unity Editor is no longer partially locked down to keep us from going astray. The upside of complete freedom is balanced by the downside of easily shooting ourselves in the foot. But now we know enough not to do that, or how to fix it if we do. (In theory.)

Each of the Unity introductory microgames has its own list of suggested modifications, and since I had just completed the LEGO microgame, I went through the LEGO list. I was mildly surprised to see this list grow while I was in the middle of doing it — as of this writing, new suggested activities are still being added. Some of these weren’t actually activities at all, such as one entirely focused on a PDF (apparently created from PowerPoint) serving as a manual for the list of available LEGO Behaviour Bricks. But most of the others introduce something new and interesting.

In addition to the LEGO-themed Unity assets from the initial microgame tutorial, other sets exist for us to import and use in our LEGO microgame projects. There was a Christmas-themed set with Santa Claus (causing me to run through the Visual Studio 2019 installer again from Unity Hub to get Unity integration), a set resembling LEGO City except on a tropical island, a set for LEGO castles, and my personal favorite: LEGO Space. Most of my personal LEGO collection was from their space theme, and I was happy to see a lot of my old friends available for play in the digital world.

When I noticed the list of activities growing while I was working on them, it gave me the feeling this was a work in progress. That feeling continued when I imported some of these asset collections and fired up their example scenes. Not all of them worked correctly; the problems mostly centered around how LEGO pieces attach to each other, especially the Behaviour Bricks. Models detach and break apart at unexpected points. Sometimes I could fix it by using the Unity Editor to detach and re-attach bricks, but not always. This brick attachment system is not a standard Unity Editor feature but an extension built for the LEGO microgame theme, and I guess there are still some bugs to be ironed out.

The most exciting part of the tutorial was an opportunity to go beyond the LEGO prefab assets they gave us and build our own LEGO creations for use in Unity games. A separate “Build your own Enemy” tutorial gave us instructions on how to build with LEGO piece by piece within Unity Editor, but that’s cumbersome compared to using dedicated LEGO design software like BrickLink Studio and exporting the results to Unity. We don’t get to use arbitrary LEGO pieces; we have to stay within a prescribed parts palette, but it’s still a lot of freedom. I immediately built myself a little LEGO spaceship, because old habits die hard.

I knew software like BrickLink Studio existed, but this was the first time I sat down and tried to use one. The parts palette was disorienting, because it was completely unlike how I work with LEGO in the real world. I’m used to pawing through my bin of parts looking for the one I want, not selecting parts from a menu organized under an unfamiliar taxonomy. I wanted my little spaceship to have maneuvering thrusters, something I add to almost all of my LEGO space creations, but they seemed to be absent from the approved list. (UPDATE: A few days later I found it listed under “3963 Brick, Modified 1 x 1 with 3 Loudspeakers / Space Positioning Rockets”) The strangest omission seems to be wheels. I see a lot of parts for automobiles, including car doors and windshields and even fender arches, but the only wheels I found in the approved list are steering wheels. I doubt they would include so many different fender arches without wheels to put under them, but I can’t find a single ground vehicle wheel in the palette! Oversight, puzzling intentional choice, or my own blindness? I lean towards the last, but for now it’s just one more reason for me to stick with spaceships.

My little LEGO spaceship, alongside many other LEGO microgame Creative Mods exercises (but not all, since the list is still growing), was integrated into my variant of the LEGO microgame and uploaded as “Desert Dusk Demo”. The first time I uploaded, I closed the window and panicked because I hadn’t copied down the URL and didn’t know how to find it again. Eventually I figured out that everything I uploaded to Unity Play is visible at https://play.unity.com/discover/mygames.

But since the legal terms restrict LEGO microgame assets to that site, I had to do something else for my learn-and-share creation for ART.HAPPENS. There were a few more steps to take before I had my exhibit, Bouncy Bouncy Lights.

Notes on Unity LEGO Microgame Tutorial

To help Unity beginners get their bearings inside a tremendously complex and powerful tool, Unity publishes small tutorials called microgames. Each of them represents a particular game genre, with the recently released LEGO microgame as the default option. Since I love LEGO, I saw no reason to deviate from this default. These microgame tutorials are implemented as Unity project templates that we can launch from Unity’s Hub launcher; they’re just filled out with far more content than the typical empty Unity project template.

Once a Unity project is created with the LEGO microgame template (and after we accept all the legal conditions of using these LEGO digital assets), we see the complex Unity interface. Well aware of how intimidating it may look to a beginner, the tutorial darkens the majority of options and highlights just the one we need for that step in the tutorial. Which got me wondering: the presence of these tutorial microgames implies the Unity Editor UI itself can be scripted and controlled, so how is that done? But that wasn’t my goal today, so I set that observation aside.

The LEGO microgame starts with the basics: how to save our progress and how to play-test the game in its current state. The very first change is adjusting a single variable, our character’s movement speed, and testing the result. We are completely on rails at this point: the Unity Editor is locked off so I couldn’t change any other character variable, and I couldn’t even proceed unless I changed the character speed to exactly the prescribed value. This is a good way to make sure beginners don’t inadvertently change something, since we’d have no idea how to fix it yet!

Subsequent chapters of the tutorial gradually open up the editor, allowing us to use more and more editor options and giving us gradually more latitude to change the microgame as we like. We are introduced to the concept of “assets”, the pieces we use to assemble our game. In an ideal world they snap together like LEGO pieces, and in the case of building this microgame they occasionally do represent actual LEGO pieces.

Aside from in-game objects, the LEGO microgame also allows us to define and change in-game behavior using “Behaviour Bricks”: assets that look just like any other LEGO brick in game, except they are linked to Unity code behind the scenes, giving them more functionality than a static plastic brick. I appreciated how it makes game development super easy, as the most literal implementation of “object-oriented programming” I have ever seen. However, I was conscious of the fact that these behavior bricks are limited to the LEGO microgame environment. Anyone who wishes to venture beyond will have to learn entirely different ways to implement Unity behavior, and these training wheels will be of limited help.

The final chapter of this LEGO microgame tutorial walked us through how to build and publish our project to Unity Play, their hosting service for people to upload their Unity projects. I followed those steps to publish my own LEGO microgame, but what’s online now isn’t just the tutorial. It also includes what they call “Creative Mods” for a microgame.

Unity Tutorial LEGO Microgame

Once I made the decision to try learning Unity again, it was time to revisit Unity’s learning resources. This is one aspect I’ve appreciated about Unity: they have continuously worked to lower their barrier to entry. Complete beginners start with tutorials that walk us through building microgames, which are prebuilt Unity projects that show many of the basic elements of a game. Several different microgames are available, each representing a different game genre, so a beginner can choose whichever one appeals to them.

But first an ambitious Unity student has to install Unity itself. Right now Unity releases are named by year, much like other software such as Ubuntu. Today, the microgame tutorials tell beginners to install version 2019.4 but do not explain why. I was curious why they tell people to install a version that is approaching two years old, so I did a little digging. The answer is that Unity designates specific versions as LTS (Long Term Support) releases. Unity LTS is intended to be a stable and reliable version, with the best library compatibility and the most complete product documentation. More recent releases may have shiny new features, but a beginner wouldn’t need them, and it makes sense to start with the latest LTS, which, as of this writing, is 2019.4.

I vaguely recall running through one of these microgame exercises on an earlier attempt at Unity. I chose the karting microgame because I had always liked driving games; Gran Turismo on the Sony PlayStation (the originals in both cases, before either got numbers) was what drew me into console gaming. But I ran out of steam on the karting microgame and those lessons did not stick. Since I’m effectively starting from scratch, I might as well start with a new microgame, and the newest hotness, released just a few months ago, is the LEGO microgame, representing third-person view games like Tomb Raider and, well, the LEGO video games we can buy right now!

I don’t know what kind of business arrangement behind the scenes made it possible to have digital LEGO resources in our Unity projects, but I am thankful it exists. And since Unity doesn’t own the rights to these assets, the EULA for starting a LEGO microgame is far longer than for the other microgames, which use generic game assets. I was not surprised to find clauses forbidding use of these assets in commercial projects, but I was mildly surprised that we are only allowed to host them on Unity’s project hosting site. We can’t even host them on our own sites elsewhere. But the most unexpected clause in the EULA is that all LEGO creations depicted in our microgames must be buildable with real LEGO bricks. We are not allowed to invent LEGO bricks that do not exist in real life. I don’t find that restriction onerous, just surprising, though it made sense in hindsight. I’m not planning to invent an implausible LEGO brick in my own tutorial run, so I should be fine.

Checking In on Unity 3D

Deciding to participate in ART.HAPPENS is my latest motivation to look at Unity 3D, something I’ve done several times before. My first look was almost five years ago, and my most recent look was about a year and a half ago in the context of machine learning. Unity is a tremendously powerful tool and I’ve gone through a few beginner tutorials, but I never got as far as building my own Unity project. Will that finally change this time?

My previous look at Unity was motivated by an interest in getting into the exciting world of machine learning, specifically in the field of reinforcement learning. That line of investigation did not get very far, but as most machine learning tools are focused on Linux there was the question of Unity’s Linux support. Not just to build a game (which is supported) but also to run the Unity editor itself on Linux. My investigation was right around the time Unity Editor for Linux entered beta with expectation for release in 2020, but that has been pushed to 2021.

For my current motivation, it’s not as important to run the editor on Linux. I can just as easily create something fun and interactive by running Unity on Windows. Which led to the next question: could I output something that can work inside an <iframe> hosted within Gather, the virtual space for ART.HAPPENS? On paper the answer is yes. Unity has had the ability to render content using WebGL for a while, and their code has matured alongside browser support for WebGL. But even better is the development (and even more importantly, browser adoption) of WebAssembly for running code in a browser. This results in Unity titles that are faster to download and to execute than the previous approach of compiling Unity projects to JavaScript. These advancements are far more encouraging than what Unity competitor Unreal Engine has done, which was to boot HTML5 support out of core to a community project. Quite a sharp contrast to Unity’s continued effort to make web output a first class citizen among all of its platforms, and this gives me the confidence to proceed and dive in to the latest Unity tutorial for beginners: LEGO!

ART.HAPPENS Motivates Return to Unity 3D

I’ve been talking about rovers on this blog for several weeks nonstop. I thought it would be fun to have a micro Sawppy rover up and running in time for Perseverance landing on February 18th, but I don’t think I’ll make that self-imposed deadline. I have discovered I cannot sustain “all rovers all the time” and need a break from rover work. I’m not abandoning the micro rover project, I just need to switch to another project for a while as a change of pace.

I was invited to participate in ART.HAPPENS, a community art show. My first instinct was to say “I’m not an artist!” but I was gently corrected. This is not the fancy schmancy elitist art world; it is the kind of world where people have fun and share their works. Yes, some of the people present are bona fide artists, but I was assured anyone who wants to share something done for the sheer joy of creating can join in the fun.

OK then, I can probably take a stab at it. Most of my projects are done to accomplish a specific purpose or task, so it’s a rare break to not worry about meeting objectives and build something for fun. My first line of thought was to build a follow-up to Glow Flow, something visually pleasing and interactive built out of 3D printed parts and illuminated by colorful LEDs controlled with a Pixelblaze. It’s been on my to-do list to explore more ideas on how else to use a Pixelblaze.

Since we’re in the middle of a pandemic, this art show is a virtual affair. I learned that people will be sharing photos and videos of their projects, to be shown in a virtual meeting space called Gather. Gather was chosen partially because the platform was built to be friendly to all computer skill levels, trying to eliminate the friction of digital gatherings.

I poked my head into Gather and saw an aesthetic that reminded me of old Apple //e video games that used a top-down tiled view. For those old games, it was a necessity due to the limited computing power and memory of an old Apple computer. And those same traits are helpful here to build a system with minimal hardware requirements.

Sharing photos and videos of something like Glow Flow would be fun, but wouldn’t be the full experience. Glow Flow looked good, but the real fun comes from handling it with our own hands. I was willing to make some compromises given the reality of the world today, until I noticed that individual projects would be shared as web content hosted in an <iframe>. That changed the equation, because it meant I could build something interactive after all. If I have control of content inside an <iframe>, I can build interactive web content for this show.

I briefly looked at a few things that might have been interesting, like three.js and A-Frame. But as I read documentation for those platforms, my enthusiasm was dampened by shortcomings I came across. For example, building experiences incorporating physics simulation seems to be a big can of worms on those platforms. Eventually I decided: screw it, if I’m going to do this, I’m going to go big. It’s time to revisit Unity 3D.