Test Run of Quest 2 and Eyeglasses

OK so sticking some googly eyes on my Quest 2 wasn’t a serious solution to any problem, but there was another aspect of Apple Vision Pro I found interesting: they didn’t make any allowances for eyeglasses. Users need to have perfect vision, or wear contacts, or order lens inserts that clip onto their headset. This particular design decision allows a much slimmer headset, and it is a very Apple thing to do.

The Quest 3 headset has similar provisions for clip-on lenses, but my Quest 2 does not. And even though the Quest 2 technically allows for eyeglasses, it is a tiny bit too narrow for my head and pinches my glasses’ metal arms against my temples. I thought having corrective lenses inside the headset would eliminate that side pressure and was worth investigating.

Since Zenni isn’t standing by to make clip-on lenses for my Quest 2, I thought I would get creative and reuse a pair of my retired eyeglasses. I have several that were retired due to damaged arms, and they would be perfect for this experiment. I selected a pair, pulled out my small screwdriver set, and unfastened the arms, leaving just the front frame.

My aim for this first test was quick-and-dirty: I used tape to hold the sides in place and didn’t bother trying to find an ideal position.

The center was held with two rolled-up wads of double-sided foam tape. I believe the ideal spacing between the eyeglass lenses and the headset lenses is something greater than zero, but zero was easy for a quick test.

Clipping the face interface back on held my side strips of tape in place. I put this on my face and… it’s marginally usable! My eyesight is bad enough that I would see just a blur without my eyeglasses. With this taped-on solution, made without any consideration for properly aligned position, I could make out the majority of features. I still couldn’t read small text, but I could definitely see well enough to navigate virtual environments. I declare this first proof-of-concept test a success. I will need to follow it up with a more precise positioning system to see if I can indeed make my own corrective lens accessory for my Quest 2.

Reducing VR Headset Isolation

One advantage of Quest 2’s standalone operation capability is easy portability. I have a friend who was curious about VR but wanted to get some first-hand experience, and we were able to meet up for a demo with my Quest 2. No need to lug around a powerful PC plus two lighthouse beacons for a Valve Index.

At one point during the test drive, my friend turned towards me to talk about something. He could see where I sat because he had the pass-through camera view active, but all I saw in return was the blank white plastic front surface of my Quest 2. It was a little disconcerting, like conversing through a one-way mirror. After that experience I understood the problem Apple wanted to solve with Vision Pro’s EyeSight feature.

It’s a really cool idea! EyeSight is a screen mounted on the front of the headset that displays a rendering of the wearer’s eyes, so people around them have something to focus on. There’s a lot of technical sophistication behind that eye rendering: because Vision Pro tracks the direction of the wearer’s gaze, those replicated eyes reflect the actual direction the wearer is looking. Our brains are highly evolved to interpret gaze direction (a very useful skill out in the wilderness to know if a saber-toothed cat is looking at us) and EyeSight aimed to keep all our normal instincts and social conventions intact, effortlessly and naturally.

I have not seen this myself, but online reports indicate EyeSight falls short of its intention. The screen is too dark to be visible in many environments, a problem made worse by the glossy clear outer layer reflecting ambient light. It is further dimmed by a lenticular lens layer that tries to give it a 3D effect, which is reportedly not very convincing as those rendered eyes are still obviously in the wrong place relative to the real eyes.

Given Apple’s history of hardware iteration, I expect future iterations of EyeSight to become more convincing and natural for people to interact with. In the meantime, I can build something with 80% of the functionality for 1% of the cost.

I stuck a pair of self-adhesive googly eyes(*) to the front of my headset, giving human eyes something to look at instead of a blank white plastic face. It bears no resemblance to the wearer’s eyes within (or at least I hope not) and does not reflect actual gaze direction. On the upside, it is a lot more visible in bright environments and far more amusing. Yeah, it’s a silly thing, but don’t worry, I have serious headset modification project ideas too.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases

Quest 2 Standalone and Mixed Reality Operation

While it was instructive to compare Quest 2 specifications with my other VR headsets, the biggest reason I wanted to try a Quest 2 is its standalone capability. After spending some time with it, I’ve decided I’m a fan. It’s much easier to enjoy a virtual environment when I’m not constantly kicking away the cable tethering me to my gaming PC. All else being equal, a wireless experience is superior. Unfortunately, all else is not equal. The cell-phone-level hardware in a Quest 2 renders a decidedly lower fidelity world relative to what a modern gaming PC can render. It’s a nonissue for something simple and abstract like Beat Saber, but anything even slightly ambitious looks like a PC game from at least ten years ago.

One way to have the best of both worlds is wireless streaming from a gaming PC to my Quest over home WiFi. I tried Steam Link on Quest and was impressed by how well it worked. Unfortunately, it doesn’t work quite well enough just yet. When I’m playing games on a monitor, a few milliseconds of latency plus an occasional (about once per minute) stutter of one or two frames is fine. But on a VR headset, it quickly gives me motion sickness and a headache. Supposedly this can be improved with a WiFi 6 router, but I’m not willing to replace my home WiFi infrastructure for this feature. For the immediate future, I’m happy using my Valve Index for SteamVR experiences.

Mixed Reality

And finally, Meta’s push for Mixed Reality is still a question mark. All three of my VR headsets let me use their cameras to see real-world surroundings. But Quest is the only one of the three to do the work to map that camera footage into a convincingly realistic spatial layout around me. The HP WMR and Valve Index camera views can give me a rough idea if I’m about to run into a wall, but neither is mapped well enough for me to, say, reach out and grab something.

To support mixed reality scenarios, Quest advertises hand-tracking capabilities for controller-free experiences. Supposedly this works well on the Quest 3, which has additional color cameras for the purpose. On my Quest 2 it’s pretty unreliable today: my house has beige walls and carpet, so my hand offers poor contrast for its black-and-white cameras to pick out.

Both of these capabilities show promise, but they’re relatively new, and I will have to wait for novel uses to emerge in mixed reality experiences yet to come. Apple’s Vision Pro is all-in on mixed reality, though, and aims to solve problems that the Quest 2 does not.

And A Quest 2 Too

One reason I was willing to take apart my old HP Windows Mixed Reality system is the Meta Quest 2. Now that the Quest 3 has taken the mainstream position in Meta’s product line, Quest 2 inventory is getting cleared out at $200. That price was too tempting to resist, so I got one even though I had a perfectly functional Valve Index. Here are some notes from my firsthand experience.

Versus HP Windows Mixed Reality

I did not find a single spec sheet advantage my old WMR headset had over the much younger Quest 2. Technology moves fast! Quest 2 has higher screen resolution, an integrated microphone and speakers, and controllers that were happy to run on a single nominal AA battery instead of demanding two fully-charged AAs. Both used camera-based inside-out tracking, but Quest 2 maintained better tracking because it used four cameras instead of two, and those cameras did not demand I turn on every light in the house if I wanted to use it at night. Quest 2 has some level of IPD adjustment with three settings, whereas the HP had no IPD adjustment at all.

I have not yet decided if I prefer Quest 2’s elastic headband versus HP WMR’s headband. I think the HP headband was the best part of the device and I may try 3D printing an adapter to use it with my Quest 2 to see if that’s an improvement.

Versus Valve Index

On the spec sheet, the Valve Index is at a resolution disadvantage against the Quest 2: fewer display pixels spread out across a wider field of view. In practice, I found the wider field of view much more important for immersive VR. I am happy making that tradeoff, but obviously I wouldn’t say no to both if I can get them in a future headset.

Beacon-based tracking used by the Index meant I had to add those two little boxes to my room, but the results are worth it. The Index has consistently better tracking, especially for games where my hands have to move out of my field of view (reach behind my back, or rest a hand on my chest while looking up). The Index controllers themselves are also much better than Quest controllers, with individual finger sensing, grip pressure sensing, and straps allowing me to open my grip without the controllers falling out. It’s a great immersion advantage; too bad Half-Life: Alyx is the only game that takes full advantage of Index controllers.

Both have integrated microphone and speakers, but the Valve Index delivers much better positional audio. The Index is significantly heavier, but part of that weight is a headband that balances the load across my head, versus Quest 2’s thin elastic band. And finally, the Index has better optical adjustment capabilities: not only smooth IPD adjustment (instead of three fixed positions) but also fore-aft adjustment.

The Index is a much more comfortable headset for longer sessions and provides a more immersive VR experience compared to a Quest 2. But we have to consider their relative price tags: it’s better, but it’s not five times better. Even more if you count the cost of a gaming PC! Plus, the comparisons here overlook what’s arguably Quest 2’s greatest advantage: it doesn’t need an associated gaming PC at all.

End of Windows Mixed Reality

In December 2023 Microsoft announced that Windows Mixed Reality has been deprecated and will be removed from Windows 11 24H2. This did not come as a surprise, as the platform hasn’t seen any investment in years. But it does mean my HP WMR headset will officially become a paperweight later this year.

This is fine by me, because my headset has pretty much been a paperweight since I damaged its cord. I tried fixing it and was seemingly successful, but there was a chance my fix was flawed. An errant pin could potentially ruin an expensive video card, so I never really put the headset back into use. It is old anyway, lacking features of newer headsets. Heck, it was old and out of date when I got it! At that time, WMR was already… not a resounding success… and my local Best Buy decided to clear out their slow-moving inventory with heavy discounts.

What could I do with it now? There was never any compelling WMR exclusive experience for me, so I don’t have anything to revisit before it’s gone. And since I’ve upgraded to a Valve Index headset, I have a superior experience for everything in SteamVR. I guess I could use the deprecated WMR headset for experiments that I don’t want to risk on my expensive Valve Index, but I don’t have any project ideas along that direction. There’s no particular reason to hang on to it “just in case” an idea comes up because (1) it’ll stop working by the end of the year, and (2) if I want VR experiments with an affordable headset, I have the option to go pick up a Meta Quest 2. Which is not only affordable, but would let me explore untethered VR as well as opening the door to Quest exclusive experiences.

During my long inkjet teardown/Dell XPS debugging saga, I would frequently think about what I could do with this obsolete WMR headset. After a few months of not coming up with anything interesting, I will proceed with the ultimate fallback option: it is teardown time!

My Cell Phones Before Android, 1998-2013

I recently rediscovered this picture of all my cell phones from 1998 to 2013. I took this group picture shortly before sending most of them to electronic waste disposal. At the beginning of that fifteen-year period, these were “cell phones” to specify they worked on a wireless network. By the end of that period, they were just “phones”, and what used to be “phones” had become “landlines”. It would have been symbolic to post this note on August 30th, 2023, the picture’s 10th anniversary, but I’m a few months late.

The oldest phone on the far left is a Sony CM-H888. I bought it in September 1998, and at the time it was a wonder of miniaturization, much smaller than its contemporary analog peers. Yes, analog! This was a telephone for making voice calls over an analog cellular network and nothing else. No internet, no apps, not even SMS. It looks bulky compared to the rest of this lineup mostly because of its 4×AA NiMH battery pack consuming over half of its volume. It is the only device on this list not powered by a lithium-ion battery.

Rapid technology advancement motivated me to part with my money. I upgraded to a Nokia 8260 a year later (October 1999), which weighed less than half as much (97g vs. 220g), eliminated the protruding antenna, and fit comfortably in my pocket instead of being a barely-fitting bulge. Multiple different technologies helped make this possible, including a lithium-ion battery and a switch from AirTouch Cellular‘s analog network to AT&T Wireless TDMA digital cellular. It also gave me my first exposure to a phone app in the form of Nokia’s legendary snake game.

A few years after getting the Nokia 8260, I bought a Compaq iPaq personal digital assistant (PDA) to help track my calendar and related adulting information that I could no longer keep entirely in my head. I appreciated having a pocket reminder of my responsibilities, and I admit to a certain level of Geek Cred for carrying around these electronic devices, but it still meant I was carrying two devices!

Consolidation came in December 2003, when I upgraded to a Motorola MPx200. It was the device that launched the “Windows Mobile Smartphone” OS, which gave me phone apps to functionally replace my PDA. The screen resolution of 176×220 was a huge upgrade over the Nokia brick, but lower than the iPaq’s 240×320. Plus, both of those screens were monochrome, and now I had a color screen. Upgrading from TDMA to GSM digital cellular also meant I gained access to SMS text messaging. And finally, switching to a flip phone eliminated accidental butt-dials.

But it was a lot thicker than the Nokia and didn’t fit in my pocket as nicely. So a year later (December 2004) I upgraded to an Audiovox 5600 (HTC Typhoon). It had all the features of the Motorola MPx200 at the size of the Nokia 8260, so it was almost the best of both worlds. The only thing I consider a downgrade is the fact that butt-dials started happening again. Especially annoying was a feature where holding down “9” would automatically dial “911”, and I could not figure out how to disable it.

So when the Cingular 3125 (HTC Startrek) launched, it caught my attention and I bought one in March 2007. It’s a flip phone, eliminating embarrassing butt-dials again, but far thinner than the Motorola MPx200. Hardware had advanced enough to put an iPaq-resolution screen (240×320 and in color) into a phone, and the laser-etched metal keypad looks way better in person than in pictures.

The first Apple iPhone also launched in 2007, but as an expensive premium product. My Cingular 3125 cost a small fraction of the iPhone up front, and did not require an expensive cellular data plan as the iPhone did. But the cost gap narrowed over the following years. Apple iPhone prices (along with corresponding data plan prices) eventually dropped to within reach of mass market consumers, and it was clear slabs of touch screen glass were the way of the future.

The AT&T HTC Pure (HTC Touch Diamond2) weighed about as much as my Cingular 3125. It lost the cool laser-etched keypad in exchange for a much larger and higher resolution (480×800) screen. It was one of several non-Apple efforts to follow iPhone’s lead as of January 2010, and a pretty poor showing at that. The marketing team tried their best to find advantages, but it was pretty futile. Example: the 480×800 screen resolution was higher than the iPhone 3GS, but that marketing item was quickly buried by the “retina display” of iPhone 4. Phones like the HTC Pure could only compete at a lower price, and I was fine with that. My Cingular 3125 was falling apart, held together with glue and tape. A cheap not-as-good-as-iPhone unit would suffice.

Minimizing usage of an expensive data plan meant my HTC Pure did not get used as a smartphone very much. Mostly just voice calls and calendar, similar to how I had used my earlier phones. I didn’t know what I was missing out on until I upgraded to a Samsung Focus in November 2010. Windows Phone 7 was a huge advancement. Its first-party experience became a credible competitor to iPhone and Android, but third-party app support was inferior and would never catch up.

My biggest complaint with the Samsung Focus was its AMOLED screen. The bright high-contrast colors worked well for video and pictures, but its RGBG PenTile matrix proved horrible for text legibility at that resolution. So when the Nokia Lumia 900 launched with classic RGB color pixels, I jumped over in July 2012. I was happy to accept some color and brightness limitations of an LCD screen in exchange for more legible text. Beyond its screen, I preferred Nokia’s sleek industrial design over Samsung’s anonymous black blob.

And finally, at the far right of this lineup, is a Nokia Lumia 620 I bought in May 2013. All the Nokia design and RGB matrix of the 900, but in a smaller package running Windows Phone 8. It was fine, but it was still a Windows Phone. After multiple major updates (7.5, 7.8 and 8) it became clear Microsoft was unable or unwilling to match iOS/Android on third-party app support. After losing faith in Microsoft, I never upgraded to Windows Phone 10… er, sorry, “Windows 10 Mobile”. Because rebranding always solves fundamental product issues.

I switched to Android in 2015 with a Nexus 5 and I’ve had Android phones ever since. I still have many of them (and try to keep them running) but a group photo wouldn’t be very interesting as they’re all touch screen slabs. (Effectively this photo.) RGBG PenTile AMOLED panels came back into my life with recent phones, but I found that I don’t mind as much at modern phone screen resolutions. I never got into apps in a big way; I have fewer than a dozen installed on my current phone. But if I need one, I can be confident an Android app exists. I no longer have to worry about whether an app exists for Windows Phone.

I hardly noticed when Microsoft finally pulled the plug on their phone OS efforts. I was long gone. It’s hardly the only platform I own that Microsoft axed.

Dell XPS 8950 Components Replaced Under Warranty

My six-month-old Dell XPS 8950 has been exhibiting intermittent bug checks. (Blue screens of death.) Since it was still under warranty, I wanted Dell to fix it. The tech support department tried their best to fix it in software, but they eventually decided hardware component replacement would be required to get this system back up and running reliably.

The premium I paid for an XPS included on-site service visits as a perk. Dell dispatched a technician (an employee of WorldWide Tech Services) to my home with a job order to replace the SSD and power supply. This made sense: a bad SSD would corrupt system files and cause the kind of seemingly random and unpredictable errors I saw. If the power supply had gone bad, intermittent power glitches could do the same. As far as system components go, they are relatively inexpensive and easy to replace, so it made sense for Dell to try that first.

Unfortunately, this repair job went awry. When the technician powered my system back up, there was no video from the RTX 3080 GPU. Intel’s integrated video worked if the GPU was removed so the rest of the system seemed fine. A follow-up visit had to be scheduled for another technician to arrive with a replacement RTX 3080 GPU to get things back up and running. I hope the first technician didn’t get in too much trouble for this problem as RTX 3080 cards are not cheap.

The evening after the system was back up, another bug check occurred. Two more occurred within the 24 hours that followed. I reported this back to Dell and they asked if I would be willing to send the system to a repair depot. I didn’t care how it was done, I just wanted my system fixed, so I agreed. They sent me a shipping box with packing material and a shipping label. I guess they didn’t expect people to hang on to the original box! (I did.)

Looking up the shipping label address, I found a match for CSAT Solutions, apparently contracted by Dell to perform such repairs. These people worked fast! According to FedEx tracking information, the box was delivered to CSAT at 11AM, and by 4PM it was back in FedEx possession for the return trip. I had set up the machine to run Folding@Home and included instructions to reproduce the problem, but it’s clear they ain’t got time for that nonsense.

An invoice in the box indicated they replaced the CPU and RAM: two more components that, if faulty, can cause random bug checks. They are significantly more expensive than an SSD or power supply, so I understand why they weren’t first to be replaced. (An RTX 3080 costs more, but that wasn’t part of the plan.)

I reinstalled Windows 11 again and fired up Folding@Home. This time there were no bug checks after running for seven days nonstop. Hooray! I’m curious whether it was the CPU or RAM at fault (or both?) but at this point I have no way to know.

Due to component replacements, I almost have a different computer. Of its original parts, the metal enclosure and main logic board are all that remain. Dell has fixed the computer under warranty at no financial cost to me, but at a significant time cost. If I value my time at, say, $50 an hour, I would have been better off just buying a new computer. As for Dell, whatever profit they made on this sale has been completely erased and became a net loss. I’m glad this problem was fixed under warranty, but both sides would have preferred to avoid the whole ordeal. I hope this gives them a financial incentive to improve system reliability!

Notes On Diagnostics From Dell Support

My Dell XPS 8950 has started exhibiting unpredictable bug checks. (Blue Screen of Death.) I poked around Dell’s SupportAssist software and found a lot of promising troubleshooting tools, but none of them fixed it. Out of ideas on software fixes, and unwilling to void the warranty by modifying hardware, I used SupportAssist’s text chat feature to open an official trouble ticket with Dell technical support. They eventually fixed the issue, but it took a few weeks to get there.

As expected, they wanted to try the easy things first. This meant repeating many SupportAssist tools which I already knew were doomed to fail, and Windows tools (like restore points) that did no better. Since hardware diagnostics tests passed, their suspicion moved to operating system corruption. This involved a lot of procedures I already knew about, and had already run, but they wanted to do them again. There were a few bits of novelty:

Throughout this arduous process, I was instructed to reinstall Windows three separate times in three different ways: first with SupportAssist’s OS reinstall option, then Windows’ built-in recovery option, and finally a clean install via a USB drive created with Microsoft’s Media Creation Tool. This is on top of the reinstallation I had already performed before contacting Dell support. With all this practice, I got really good at Windows setup!

Each time I reinstalled Windows, I had to reinstall SupportAssist, and clicking on text chat created a new chat session. That meant I was sent to someone expecting to open a new ticket, and I’d have to spend time getting them straightened out with my existing ticket number.

With each bug check, I got a crash memory dump to prove their latest idea hadn’t resolved my issue. Sadly, Dell’s support ticket web interface allowed a maximum of five attachments. I quickly reached my limit, and additional memory dumps had to be submitted by sharing files via my Microsoft OneDrive and Google Drive accounts and sending a link via text chat. This was… sub-optimal.

Weeks later, I had exhausted all their scripted solutions and was finally granted an escalation to senior support technicians. They reviewed my ticket and came to the conclusion I hoped they would: some hardware components would need to be replaced.

Notes on Dell SupportAssist

I have a thorny issue with my XPS 8950. The symptom is an intermittent bug check (a.k.a. blue screen of death) that is not readily reproducible and, even when it occurs, the error code varies wildly in type and in location. My previous trouble-free Dell computers had allowed me to ignore Dell’s tech support portal. Now I have a troubled PC and have to learn what’s in Dell’s SupportAssist software.

Dell SupportAssist is primarily a native Windows application that is pre-installed on every Dell PC. If it is lost, SupportAssist can be downloaded from Dell’s website. (I had to do this several times after performing operating system reinstall as a diagnostic procedure.) It has several roles to play in regular maintenance:

  • Looks for common configuration problems and tries to fix them.
  • Downloads drivers and other system files, though this role has mostly been supplanted by Windows Update. I even got BIOS update 1.16.0 from Windows Update before it showed up as an option in SupportAssist.
  • Cleans up unused files to free up disk space.

SupportAssist also includes troubleshooting tools:

  • A viewer for Windows system events. SupportAssist recognized that I had been experiencing bug checks, and even offered a “Fix Now” option. It’s not obvious what that did, but it didn’t help.
  • A suite of hardware tests: CPU tests, memory tests, disk tests. I was amused it even spun up each of the fans.

Regarding the hardware tests: there’s also a separate piece of software that can run independently of Windows. Its title bar calls itself “SupportAssist | On-board Diagnostics” and it lives on a separate disk partition. To launch it, we have to trigger the BIOS boot select menu and select “Diagnostics”. My computer passed all of these tests as well, including running everything under “Advanced Test” with “Thorough mode” selected.

This diagnostics partition was deleted when I followed directions from Dell tech support to perform a completely clean install. I was worried about that — it seemed useful! — but I later learned the SupportAssist Windows application can re-partition the hard drive and reinstall that Diagnostics partition.

There is one worrisome aspect of SupportAssist. When this native Windows application is installed on a system, the Dell web site running in a browser seems to be able to query hardware configuration in order to offer the appropriate documentation and driver downloads. How are those components communicating? I’m worried about that channel being a potential avenue for security exploits.

There are many other features of SupportAssist I didn’t investigate because they didn’t seem helpful to me. Like tools to migrate data from one PC to another, and naturally an upsell for extended warranty coverage.

I ran every SupportAssist maintenance task and diagnostic test I could find; none helped. As a last resort I activated its operating system reinstall procedure, and that didn’t help either. I’m out of ideas for software fixes. If this were one of my home-built desktop PCs, I would start swapping out hardware to see if I could isolate it to a particular component. However, this computer is still under warranty, so I don’t want to do anything that would void said warranty. If hardware replacements are to be done, they will have to be done by Dell people on Dell’s dime under warranty. To get that process started, I have to contact Dell technical support. I could call them over the phone, but that doesn’t seem like the best approach for an intermittent error that takes a day to reproduce. Fortunately, SupportAssist includes a text chat client, and that seems more practical for my situation.

Dell XPS 8950 Bug Check Codes List

The Dell XPS 8950 I bought primarily for SteamVR started exhibiting bug checks at around six months old. It was eventually fixed under Dell’s one-year warranty, but the journey started with an attempt to diagnose it myself. Stressing it with Folding@Home would crash it roughly once every 12-24 hours.

When Windows halts with a bug check, a memory dump file is written to disk for debug purposes. It takes significant expertise to dig through a memory dump file to pinpoint a root cause. However, it’s pretty easy to get a general idea of what we are dealing with. We can install Windows debugger (WinDbg) and use its built-in automated analyzer to extract a top-level error code we can then look up online. Over the course of two weeks I ran Folding@Home to build a collection of memory dump files, hoping to find commonalities that might point at a source.
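
For anyone following along, the analysis boils down to two debugger commands after opening the .dmp file in WinDbg. Both are standard WinDbg commands: .symfix points the debugger at Microsoft’s public symbol server, and !analyze -v runs the automated analysis, printing the bug check code near the top of its report.

    .symfix
    !analyze -v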

The best case scenario is to have the same bug check code on every dump, occurring in the same operating system component. What I got instead is a list of thirteen codes (appended at the bottom of this post), some occurring more often than others. Even worse, they didn’t all happen at the same place in the system, but were spread all around. The only vague commonality between them is an invalid memory operation. Sadly, “invalid memory operation” is too broad of a category to tie to a root cause. I became quite discouraged looking over those memory dumps.

I know Dell tech support has a database of bug check codes and a list of diagnostic steps to address each of them. First-level support technicians are trained to tell the customer to try each item in turn. Figure a half dozen things they’d want me to try (probably starting with “please turn it off and back on again”…) for each of thirteen possible codes, and I will have to trudge through a lot of those procedures.

Eventually my support ticket will establish a widespread pattern that escalates my case to more senior support staff who will look at the problem more holistically, but I have to earn it with persistence! I will be spending a lot of time with Dell tech support, starting with their preinstalled troubleshooting tool called SupportAssist.


Bug check codes encountered, with URL of the Microsoft reference page and the first sentence of their explanation pasted in after the code.

Dell XPS 8950 Stress Test with Folding@Home

I had another saga running in parallel with my lengthy Canon Pixma MX340 teardown. The Dell XPS 8950 I bought primarily for SteamVR with my Valve Index began exhibiting bug checks on an irregular basis. This is not good. I paid a premium over similar-spec computers on the expectation that an XPS would be more reliable and, failing that, Dell would be more likely to fix things that go wrong. Well, the first part turned out to be wrong. Thankfully the second part eventually proved to be true, but it took some work to get there.

The first thing I needed was a better way to reproduce the issue. I wanted to collect many bug check memory dumps to compare against each other, and I needed a way to verify whether the problem had been resolved. Since I bought this computer mainly for SteamVR, the bug check usually happened while I was in the middle of a VR session. It spoiled a few Beat Saber songs and abruptly ended firefights with Combine soldiers in Half-Life: Alyx, but not every VR session triggered the problem, and I wasn’t going to just stay in VR until it occurred.

I found hardware tests in Dell’s SupportAssist tool (more on SupportAssist in a future post) and ran those. My computer passed the tests with no errors. I looked for a way to run these tests in a loop but didn’t find a way to do so.

I tried just leaving the computer on and running, but not doing anything in particular. After a week, I got two bug checks. This is better than unpredictable crashes in VR sessions, but waiting 3-4 days between failures is still not great.

I increased system workload by installing and running Folding@Home. It kept the GPU busy but CPU utilization would drop off after a few minutes. I eventually figured out Windows 11 detected a long-running compute process and decided to restrict Folding@Home to the four power-efficient E-Cores on my i7-12700 CPU. Gah, foiled! I worked around this by disabling the E-Cores in system BIOS. (Where they were called Atom Cores.) With E-Cores out of the picture, CPU utilization stays at 100% with all eight hyper-threaded P-cores running at full blast.

I would rather have a procedure to consistently and immediately reproduce the crash, but I never found one. Running Folding@Home, the bug check would usually occur within 12-24 hours, and that was the best I had. Over the course of about two weeks, Folding@Home helped me generate a decently sized collection of bug check crash memory dumps to examine.

Options for Improving Timestamp Precision

After a quick test determined that my Arduino sketch will be dealing with data changing faster than 1kHz, I switched the timestamp query from calling millis() to micros(). As per Arduino documentation, this change improved time resolution by a factor of 250, from 1 millisecond precision to 4 microsecond precision. Since I had time on my mind anyway, I took a research detour to learn how this might be improved further. After learning how much work it’d take, I weighed it against my project and decided… nah, never mind.

Hardware: ATmega328P

A web search for ATmega328P processor programming found good information on the page Developing in C for the ATmega328: Marking Time and Measuring Time. The highest possible timing resolution is a counter that increments upon every clock cycle of the processor. For an ATmega328P running at 16MHz, that’s a resolution of 62.5 nanoseconds from ticks(). This 16-bit counter overflows very quickly (once every 4.096 milliseconds) so there’s another 16-bit counter ticks_ro() that increments whenever ticks() overflows. Together they form a 32-bit counter that overflows every 4.47 minutes; after that, we’re on our own to track overflows.

However, ticks() and ticks_ro() are very specific to AVR microcontrollers and not (easily) accessible from Arduino code, because exposing them would kill its portability. Other microcontrollers have similar concepts, but they would not be called the same thing. (Example: ESP32 has cpu_hal_get_cycle_count())
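
Out of curiosity, I sketched what tapping that hardware directly might look like. This is my own illustration rather than code from that article: it configures the ATmega328P’s Timer1 (register names straight from the datasheet) as a free-running cycle counter in the spirit of the ticks() and ticks_ro() pair.

    #include <Arduino.h>

    volatile uint16_t overflows = 0; // plays the role of ticks_ro()

    ISR(TIMER1_OVF_vect) {
      overflows++; // fires every 65536 cycles: ~4.096 ms at 16 MHz
    }

    void setup() {
      Serial.begin(115200);
      noInterrupts();
      TCCR1A = 0;          // normal mode: just count
      TCCR1B = _BV(CS10);  // no prescaler: one tick per 62.5 ns clock cycle
      TIMSK1 = _BV(TOIE1); // interrupt on overflow
      interrupts();
    }

    void loop() {
      noInterrupts();
      uint16_t low = TCNT1;      // cycles within the current 4.096 ms window
      uint16_t high = overflows; // completed windows so far
      interrupts();
      // The combined 32-bit count wraps every ~4.47 minutes (2^32 * 62.5 ns).
      // A robust version would also handle an overflow pending between reads.
      Serial.println(((uint32_t)high << 16) | low);
      delay(1000);
    }

The cost is losing the Arduino conveniences tied to Timer1 (such as PWM on pins 9 and 10), which is exactly the portability tradeoff that keeps this sort of code out of the core libraries.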

Software: Encoder Library

Another factor in timing precision is the fact that I’m not getting the micros() value when the encoder position is updated. The encoder position counter is updated within the quadrature decoding library, and I call micros() sometime afterwards.

timestamp,position,count
16,0,448737
6489548,1,1
6490076,2,1
6490688,5,1
6491300,8,1
6491912,12,1
6492540,17,1
6493220,21,1
6493876,25,1

Looking at the final two lines of this excerpt, I see my code recorded an encoder update from position 21 to 25 over a period of 6493876-6493220 = 656 microseconds. But 6493876 is only when my code ran; it’s not when the encoder clicked over from 24 to 25! There’s a delay on the order of three-digit microseconds, an approximation derived from 656/(25-21) = 164.

One potential way to improve upon this is to add a variable to the Encoder library, tracking the micros() timestamp of the most recent position update. I could then query that timestamp from my code later, instead of calling micros() myself, which pads the measurement with an unknown delay. I pulled up the Encoder library source code at https://github.com/PaulStoffregen/Encoder and found an update() function with a switch() statement that looked at pin states and updated the counter as needed. I could add my micros() update in the cases that updated position. Easy, or so I thought.
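
Conceptually, the change I had in mind looks something like the sketch below. To keep the illustration simple, this is a bare-bones decoder of my own that counts only rising edges of channel A (a quarter of full quadrature resolution), not the actual Encoder library code, and the pin assignments are arbitrary.

    #include <Arduino.h>

    const int PIN_A = 2; // encoder channel A, on an external-interrupt pin
    const int PIN_B = 3; // encoder channel B

    volatile long position = 0;
    volatile unsigned long lastUpdateMicros = 0;

    void onChannelA() {
      // Count up or down based on channel B, then record the moment of
      // change. Reading this stored timestamp later avoids padding the
      // measurement with loop() latency.
      position += (digitalRead(PIN_B) == HIGH) ? 1 : -1;
      lastUpdateMicros = micros(); // fine for brief use inside an ISR
    }

    void setup() {
      Serial.begin(115200);
      pinMode(PIN_A, INPUT_PULLUP);
      pinMode(PIN_B, INPUT_PULLUP);
      attachInterrupt(digitalPinToInterrupt(PIN_A), onChannelA, RISING);
    }

    void loop() {
      noInterrupts(); // copy both values atomically
      long pos = position;
      unsigned long stamp = lastUpdateMicros;
      interrupts();
      Serial.print(stamp);
      Serial.print(',');
      Serial.println(pos);
      delay(100);
    }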

Looking at the code more closely, I realized the function I found is actually in a comment. It was labeled the “Simple, easy-to-read ‘documentation’ version 🙂”, implying the actual code was not as simple or easy to read. I was properly warned as I scrolled down further and found… AVR assembly code. Dang! That’s hard core.

On the upside, AVR assembly code means it can access the hardware registers behind ticks() and ticks_ro() for the ultimate in timer resolution. On the downside, I don’t know AVR assembly and, after some thought, I decided I’m not motivated enough to learn it for this particular project.

This was a fun side detour and I learned things I hadn’t known before, but I don’t think the cost/benefit ratio makes sense for my Canon MX340 teardown project. I want to try some other easy things before I contemplate the harder stuff.



Captured CSV and Excel worksheets are included in the companion GitHub repository.

Quadrature Decoding with Arduino

I want to understand the internal workings of a Canon Pixma MX340 multi-function inkjet. Right now my focus is on its paper feed motor assembly, and I want to record data reported by the quadrature rotation encoder inside that assembly. I want to track behavior over several seconds, possibly a minute or two, which gets a little unwieldy with a logic analyzer timeline interface. So I thought I should create a tool tailored to my project, and I found a promising lead in an ESP32’s pulse counter (PCNT) peripheral.

As I started preparing for the project, thinking through and writing down what I’d need to do, a lot of details felt very familiar in a “wait… I’ve done this before” way. I had forgotten I had played with quadrature encoders before! A search for “quadrature” on my project notebook (this blog site) found entries on reading the knob on a Toyota audio head unit, an inexpensive knob from Amazon, and an investigative detour during my Honda audio head unit adventures.

Following my earlier footsteps would be an easier way to go, because the Arduino IDE and Paul Stoffregen’s quadrature decoder library are already installed on my machine. But this will be the first time I apply it to something turned by a motor instead of a human hand. Is it fast enough to keep up? Decoder library documentation says a 100-127kHz sampling rate is possible on a Teensy 3, which was the library’s original target hardware. Running on an ATmega328 would be slower.
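
For reference, the easy path looks something like this minimal sketch. Encoder and its read() method are the library’s documented API; the pin choices are my assumption, picked because both support external interrupts on an ATmega328 Arduino.

    #include <Encoder.h>

    Encoder feedEncoder(2, 3); // both pins interrupt-capable for best speed
    long lastPosition = 0;

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      long newPosition = feedEncoder.read();
      if (newPosition != lastPosition) {
        Serial.print(micros()); // when loop() noticed the change
        Serial.print(',');
        Serial.println(newPosition);
        lastPosition = newPosition;
      }
    }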

Aside: I found this Gammon forum thread listing technical detail on ATmega328 interrupt service routines, which tallied up 5.125µs of ISR overhead before any ISR code actually runs. That puts a hard upper bound of roughly 195 kHz (1 ÷ 5.125µs) on the response rate of even an ISR that does nothing.

In the spirit of “try the easy thing first” I’ll start with ATmega328 Arduino. If it proves too slow, I have a Teensy LC somewhere, and I definitely have ESP8266 boards. In the unlikely case they all fail to meet my need, I can resume my examination of ESP32’s pulse counter (PCNT) peripheral.



Configuring Laptop for Proxmox VE

I’m migrating my light-duty server workload from my Dell Latitude E6230 to my Dell Inspiron 7577. When I started playing with the KVM hypervisor on the E6230, I installed Ubuntu Desktop instead of Server for two reasons: I didn’t know how to deal with the laptop screen, and I didn’t know how to work with KVM via the command line. But the experience allowed me to learn things I will incorporate into my 7577 configuration.

Dealing with the Screen

By default, Proxmox VE leaves a simple text prompt on screen, which is fine because most server hardware doesn’t even have a screen attached. On a laptop, keeping the screen on wastes power and probably causes long-term damage as well. I found an answer on the Proxmox forums:

  • Edit /etc/default/grub to add “consoleblank=30” (30 is the timeout in seconds) to GRUB_CMDLINE_LINUX if an entry already exists. If not, add a single line GRUB_CMDLINE_LINUX="consoleblank=30"
  • Run update-grub to apply this configuration.
  • Reboot

Another default behavior: when closing the laptop lid, the laptop goes to sleep. I don’t want this behavior when I’m using it as a mini-server. I was surprised to learn the technique I found for Ubuntu Desktop also works for the server edition: edit /etc/systemd/logind.conf and change HandleLidSwitch to ignore.

Making the two changes above turns off my laptop screen after the set number of seconds of inactivity, and leaves the computer running when the lid is closed.
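
As a concrete sketch, the two files end up with lines like these (standard Debian file locations; 30 seconds is just my chosen timeout). A reboot picks up both changes at once.

    # /etc/default/grub -- blank the console after 30 idle seconds,
    # then run update-grub to apply
    GRUB_CMDLINE_LINUX="consoleblank=30"

    # /etc/systemd/logind.conf -- keep running when the lid closes
    HandleLidSwitch=ignore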

Dealing with KVM

KVM is a big piece of software with lots of knobs. I was intimidated by the thought of learning all the command line options and switches on my own. So, for my earlier experiment, I ran Virtual Machine Manager on Ubuntu Desktop edition to keep my settings straight. I’ve learned bits and pieces of interacting with KVM via its virsh command line tool, but I have yet to get comfortable enough to make the command line my default interface.

Fortunately, many others felt similarly, and there are other ways to work with a KVM hypervisor. My personal data storage solution TrueNAS has moved from a FreeBSD-based system (now named TrueNAS CORE) to a Linux-based system (a parallel sibling product called TrueNAS SCALE). TrueNAS SCALE includes virtual machine capability with the KVM hypervisor, which looked pretty good. After a quick evaluation session, I decided I preferred working with KVM using Proxmox VE, a whole operating system built on top of Debian and dedicated to the job: hosting virtual machines with the KVM hypervisor, plus tools to monitor and manage those virtual machines. Instead of Virtual Machine Manager’s UI running on Ubuntu Desktop, both TrueNAS SCALE and Proxmox VE expose their UI as a browser-based interface accessible over the network.

I liked the idea of doing everything on a single server running TrueNAS SCALE, and may eventually move in that direction. But there is something to be said for keeping two isolated machines. I need my TrueNAS SCALE machine to be absolutely reliable, an appliance I can leave running its job of data storage. It can be argued it’s a good idea to use a different machine for more experimental things like ESPHome and Home Assistant Operating System. Besides, unlike normal people, I have plenty of PC hardware sitting around. Put some of it to work!

Dell Inspiron 7577 Laptop as Light Duty Server

I’m setting aside my old Dell Latitude E6230 laptop due to its multiple hardware failures. At the moment I am using it to play with virtualization server software. Virtualization hosts usually run on rack-mounted server hardware in a datacenter somewhere. But an old laptop works well for light-duty exploration at home by curious hobbyists: they sip power for small electric bill impact, they’re compact so we can stash them in a corner somewhere, and they come with a battery for surviving power failures.

I bought my Dell Inspiron 7577 15″ laptop five years ago, because at the time that was the only reasonable way to get my hands on an NVIDIA GPU. The market situation has improved since then, so I now have a better GPU in my gaming desktop. I’ve also learned I don’t need mobile gaming power enough to justify carrying a heavy laptop around, so I got a lighter laptop.

RAM turned out to be a big constraint on what I could explore on the E6230, which had a meager 4GB, and I couldn’t justify spending money on old outdated DDR3 memory for it. Now I look forward to having 16GB of elbow room on the 7577.

While none of my virtualization experiments demanded much processing power, more is always better. This move will upgrade from a 3rd-gen Core i5 3320M processor to a 7th-gen Core i5 7300HQ. Getting four hardware cores instead of two hyperthreaded cores should be a good boost, in addition to all the other improvements made over four generations of Intel engineering.

For data storage, I’ve upgraded the 7577’s factory M.2 NVMe SSD from a 256GB unit to a 1TB unit, and the 7577 chassis has an open 2.5″ SATA slot for even more storage if I need it. The E6230 had only a single 2.5″ SATA slot. Neither of these machines has an optical drive, but if they did, it could be converted to another 2.5″ SATA slot with adapters made for the purpose.

Both of these laptops have a wired gigabit Ethernet port, sadly a fast-disappearing luxury in laptops. It eliminates all the unreliable hassle of wireless networking, but an Ethernet jack is a huge and bulky component in an industry aiming for ever thinner and lighter designs. [UPDATE: The 7577’s Ethernet port would prove to be a source of headaches.]

And finally, the Inspiron 7577 has a hardware-level feature to improve battery longevity: I could configure its BIOS to stop battery charging at 80% full. This should be less stressful on the battery than being kept at 100% full all the time, which is what the E6230 did with no option to configure otherwise. I believe that deviation from the typical laptop usage pattern contributed to the battery’s demise and the E6230’s retirement, so I hope the 80% state-of-charge limit will keep the 7577 battery alive for longer.

When I started playing with KVM hypervisor on the E6230, I installed Ubuntu Desktop instead of server for two reasons: I didn’t know how to deal with the laptop screen, and I didn’t know how to work with KVM via the command line. Now this 7577 configuration will incorporate what I’ve learned since then.

Dell Latitude E6230 Getting Benched

I’ve got one set of dead batteries upgraded and tested, and now attention turns to a different set of expired batteries. I bought this refurbished Dell Latitude E6230 several years ago intending to take it apart and use it as a robot brain. I changed my mind when it turned out to be a pretty nifty little laptop to take on the go, much smaller and lighter than my Dell Inspiron 7577. With lower specs than the 7577, it also had longer battery run time, and its performance didn’t throttle as much while on battery. It has helped me field-program many microcontrollers and performed other mobile computing duties admirably.

I retired it from laptop duty when I got an Apple Silicon MacBook Air, but I brought it back out to serve as my introduction to running virtual machines under the KVM hypervisor. Retired laptops work well as low-power machines for exploratory server duty. Running things like Home Assistant hasn’t required much in the way of raw processing power; it was more important for a machine to run reliably around the clock while stashed unobtrusively in a corner somewhere. Laptops are built to be compact and energy-efficient, and they already have a built-in battery backup. Though the battery usage pattern is different from normal laptop use, and that caused problems long term.

Before that happened though, this Latitude E6230 developed a problem starting up when warm. If I select “restart” it’ll reboot just fine, but if I select “shut down” and press the power button immediately to turn it back on, it’ll give me an error light pattern instead of starting up: The power LED is off, the hard drive LED is on, and the battery LED blinks. Given the blinking battery LED I thought it indicated a problem with the battery, but if I pull out the battery to run strictly on AC, I still see the same lights. The workaround is to leave the machine alone for 20-30 minutes to cool down, after which it is happy to start up either with or without battery.

But if the blinking battery LED doesn’t mean a problem with the battery, what did it mean? I looked for the Dell troubleshooting procedure that would explain this particular pattern. I didn’t get very far and, once I found the workaround, I didn’t invest any more time looking. Acting as a mini-server meant it was running most of the time and rarely powered off. And if it does power off for any reason, this mini-server isn’t running anything critical so waiting 20 minutes isn’t a huge deal. I decided to just live with this annoyance for a long time, until the second problem cropped up recently:

Now when the machine is running, the battery LED blinks yellow. This time it does indicate a problem with the battery. The BIOS screen says “Battery needs to be replaced”. The Ubuntu desktop gives me a red battery icon with an exclamation mark. And if I unplug the machine, there’s zero battery runtime: the machine powers off immediately. (Which has to be followed by that 20 minute wait for it to cool down before I can start it up again.)

I knew keeping lithium-ion batteries at 100% full charge is bad for their longevity, so this was somewhat expected. I would have preferred the ability to limit state of charge at 80% or so. Newer Dell laptops like my 7577 have such an option in BIOS but this older E6230 did not. Given its weird warm startup issue and dead battery, low-power mini-server duty will now migrate to my Inspiron 7577.

PC Power Supply Fan Replacement (CWT GPS650S)

While learning electronics by reverse-engineering board schematics, one of my computers started making an intermittent growling noise. I suspected a failing fan bearing. Probably not a big deal, as mechanical things wear and failure is inevitable. I traced the sound to a Channel Well Technology GPS650S power supply’s internal fan. This computer has a 9th-gen Core i7 CPU, which launched in 2019, so this power supply has been running for roughly four years. That is early for a PC cooling fan to fail, but hopefully it was just bad luck of landing on the short end of the bell curve.

Looking on the bright side, I know how to replace a failing fan. So given a choice, I prefer this failure mode over blowing a non-user-replaceable fuse or burning up.

Getting past a few “no user serviceable parts inside” and “warranty void if removed” stickers opened up the enclosure to access the 120mm 12VDC fan.

Something’s definitely wrong with the fan, as the label isn’t supposed to get puffy and shiny in the middle like that. This is consistent with friction heat generated by a failing bearing.

Fortunately, the fan seems to be plugged in to the power supply control board with a commodity JST-XH 2-position connector.

Sitting on my shelf are multiple 120mm 12VDC cooling fans that can serve as suitable replacements. One of them even has a JST-XH connector already installed. Judging by the sheet of airflow control plastic on this fan, it was salvaged from another power supply. Probably the one that blew an inaccessible fuse.

Unfortunately it was not that easy, but that was my own fault. I connected it to my bench power supply dialed up to 12V DC for a test. It spun up nicely, but when I reached over to disconnect power I knocked the fan grill into the fan. The fan, spinning at full speed, dealt with the sudden stop by snapping off a blade, rendering it useless. D’oh!

But I had other fans to spare, including one with an Antec sticker that probably meant it came from the power supply that went up in smoke. It should work just as well, merely a bit less convenient for me because I had to cut off its existing connector and crimp my own JST-XH compatible connector. This time I was more careful with the spin-up test and did not break a blade.

The power supply is now back in action, running quietly with a replacement salvaged fan. And now I have two broken fans on hand: one with a bad bearing and another with a broken blade.

Lenovo Mirage AR was a Huge Disappointment

I’m fascinated by the significant promise and potential of Apple Vision Pro, but I’m waiting to see real-world feedback. That feedback would have to be very positive for multiple product generations (at the very least, a more affordable non-Pro edition) before I would consider pulling out my own credit card. The last time I paid money for an AR experience, it was on the opposite end of the spectrum: barely more than an old-school Pepper’s Ghost illusion.

This was the Star Wars: Jedi Challenges product, with a Lenovo Mirage AR headset as the main hardware component. With all hype and no substance, there was no follow-up to this now-retired product. The promised third-party software development kit never materialized. The lone app has been removed from app stores. Its main URL now redirects to Lenovo’s general website, though its product support page still exists for the moment.

My first experience with an AR headset was at automaker promotions featuring Microsoft’s Hololens, and I was impressed. Sometime after that, the Star Wars: Jedi Challenges promotion hype machine started spinning. I was intrigued but skeptical. It cost a tiny fraction of a Microsoft Hololens, so I knew there were compromises involved. It is built around a cell phone like all the lackluster 3DOF VR headsets, but this headset adds a pair of cameras with onboard processing hardware that sends data to the phone via a USB cable. Based on that description, it was possible there was enough hardware for a rudimentary AR experience.

The reality was disappointing. While we did have 6DoF tracking, it was restricted to the lightsaber peripheral, just barely good enough to draw a virtual lightsaber blade on the AR headset at a rate of (unscientific guess) 30fps. There was a clearly perceptible lag between our lightsaber movement and the glowing line onscreen. In addition to the lightsaber, the cameras could also track an external beacon: a squishy rubber ball with a colorful LED inside. Since it was a sphere, there was no meaningful orientation tracking as with the lightsaber, just position relative to the headset.

There was no further understanding of our environment and no tracking of the AR headset itself. Not even 3DoF tracking like in Google Cardboard. Kylo Ren is directly in front of us regardless of which way we are looking. If we are looking down, Kylo Ren is in the floor. If we look up, Kylo Ren is in the ceiling. As far as I can tell, the only reality this headset augmented was the lightsaber, drawing a lightsaber blade over a fixed and scripted experience projected Pepper’s Ghost-style in front of my face. As far as an immersive experience goes, this rated even lower than what we can get from Google ARCore.

The good news was that I didn’t waste too much money on this disappointment, as I had waited until these things were heavily discounted just so stores could clear them out of inventory. If I had paid full MSRP I would have definitely demanded a refund! The bad news is that, since I got them on clearance, there was no refund and no return. They sat gathering dust until recently, when I decided to write up my VR/AR/XR experiences here. There’s no reason to keep taking up space with this garbage, so now is a good time to take it apart before disposing of it.

Dell XPS 8950 with RTX 3080 and i7-12700

Looking over Memorial Day Sale discounted computers, I decided a Dell XPS 8950 configured with an RTX 3080 GPU and i7-12700 CPU had the grunt to drive my new Valve Index VR headset at a price I’m willing to pay. My next stop was the XPS 8950 service manual. Flipping through component replacement procedures let me get a look at the design of this system. I liked seeing its clever layout and tool-less operation. I especially liked the brace that helps support video card weight. GPUs have been growing bigger and heavier, and I’ve been grumpy that the industry has not yet coalesced on a de-facto standard to support all that mass. For the most part they’re still just mounted on the backplate and cantilevered way out past the PCI Express slot, placing a great deal of strain on it. Motherboard manufacturers have started putting metal reinforcements on their PCI Express slots, which I consider a hack and not a solution, but that’s an entirely separate rant so I’ll stop here.

The downside of novel capabilities is a nonstandard form factor. Historically, by the time I want to upgrade a CPU/motherboard combo I’m ready for a new system anyway. (Like right now.) Therefore, I’m not terribly bothered by the fact neither the mainboard nor case are standard ATX: they’re tailored specifically to each other. There’s an upside, though. Front panel ports here are actually mounted directly on the mainboard and not connected via cables as is typical of a standard ATX case. I’ve had intermittent connection issues with such front panel ports, so I see this design as a positive.

The only part that made me pause was the proprietary power supply. Unlike CPU/mainboard combos, I have had to replace power supplies on my own PCs. Mitigating this worry is (1) XPS power supply should last longer than lowest-bidder ATX PSU, and (2) power plug could (might?) be compatible with the new ATX12VO standard. So I could rig up something to keep the machine running, even if it wouldn’t fit in the case properly.

That was enough information for me to decide on buying one. One thing the manual couldn’t definitively show is the cables, so I had to wait until my system showed up to see them. They are laid out very tidily, as expected of a customized power supply with all wires trimmed to the necessary length. It is also free of all legacy power plugs: no floppy connector, no CD-ROM connector. The lack of clutter ensures great airflow through the airy middle section of the case.

I was happy to see robust provision for GPU power. There are a pair of 8-pin PCIe power plugs to feed the existing RTX 3080 card. Waiting in the wings just below them, tucked into a plastic bracket, are a duplicate set of extra PCIe power plugs. Together they are enough to feed a RTX 4090 card and I feel comfortable they are ready for whatever video card I might want to upgrade to in the future.

Only a single PCIe x1 slot is still open for future expansion, but historically that has been sufficient for me. This system came with 32GB of DDR5 RAM in two 16GB modules, leaving two additional memory slots open. There are two M.2 slots, one of which is occupied by a terabyte Samsung NVMe SSD, the other open. If I want to add some bulk HDD storage, there are two unoccupied 3.5″ drive bays, both of which have SATA power plugs ready to go. However, only one of the two bays has a SATA data cable and the proprietary tool-less drive caddy installed. (Look for blue plastic in the upper-right corner of the picture.) SATA cables are easy to get, and there are open SATA ports on the mainboard. It might not be as tidy since the length isn’t customized for the case, but I’m not worried about that. I’m considering buying one caddy now: it’s pretty cheap, and it ensures the bay is usable even if Dell stops carrying the part. Or I could measure the dimensions of my existing caddy and 3D print a clone.

I saw no open 2.5″ drive bays, but that is a solvable problem. This system came with a laptop-sized DVD-R/W optical drive that I do not expect to ever use. However, that gives me the option to swap it out with a 2.5″ drive adapter. I’ll just have to remember to get the correct height adapter this time.

With the exception of the power supply, I see standardized form factors for everything else I anticipate installing as either replacements or upgrades. The non-standardized elements have offsetting benefits like GPU support, tidy cabling, good airflow, etc. This compact integrated package seems well worth a ~15% premium over a DIY build. Now I have to see if it stands up to the test of time. I hope this machine will support many adventures (VR and otherwise) for years to come, starting with revisiting my favorites to feel its upgraded power.

Narrowed Field Down to Dell XPS 8950

When I upgraded my VR headset to a Valve Index, I knew there was a chance I’d want a new video card to go with it. The increased display resolution and refresh rate of the Index is significantly more demanding, possibly outpacing my existing RTX 2070 video card. Well, that expectation proved to be true. And to my small surprise, the challenge of running an Index has outpaced not only my video card but my processor as well! Faced with a major upgrade, I decided to get an entire system built around an RTX 3080. I combed through all of the Memorial Day 2023 sales I could find, and the winner of this competition was a Dell XPS 8950.

Most PCs built with an RTX 3080 video card cater to a market infatuated with multicolor LEDs. I don’t care to have them on my own computer, but I know how they work electronically and am confident I can turn down the garish lights if they bother me. The bigger problem is the functional tradeoffs made by cases optimized around those lights and related aesthetic accessories. For example, many of these PCs have clear sides to show off the hardware and lights inside. Glass and acrylic are poor thermal conductors and obviously obstruct airflow, not great for a power-hungry machine that needs to dissipate a lot of heat. Such is the silliness in HP Omen, Lenovo Legion, Dell Alienware, and other PCs competing in this market.

Dell’s Inspiron product line is their economy class for competing on price. If price were the biggest concern, I would buy parts and build a PC myself. I’m willing to pay a small premium to have a well-engineered system that I can expect to work well for several years. After that point I will contemplate piecemeal upgrades like a new video card. Between Dell’s low-priced Inspiron and their high-end Alienware is their XPS product line. Extending the airline analogy: if Inspiron is economy class and Alienware is first class, XPS is their business class. I think I can find my Goldilocks “just right” point here. These products aren’t penny-pinched to last barely as long as the warranty, and they should give all the performance I want with none of the gratuitous LEDs.

Dell’s list prices for XPS systems are roughly double what I would pay to buy similar components and assemble a system myself, but I have never paid Dell MSRP and I don’t intend to start. The current Dell XPS desktop is the 8960, with 4000-series NVIDIA GPUs and 13th-generation Intel CPUs. This means remaining stock of the older XPS 8950, with the previous-generation RTX 3080 video card and older-generation CPU, can sometimes be found at clearance pricing. Combined with Memorial Day and additional discounts, these not-bargain-basement machines can be had for less than a 15% premium over self-assembly with bargain-basement components. This was good enough for a closer look.