First Few Weeks With Dell Latitude 9410

Shortly after I bought an off-lease Dell Latitude 9410, I opened it up to verify everything met expectations. After finding no deal-breakers, I buttoned it back up, and I’ve been using it for a few weeks. I’m happy with my purchase; here are some notes:

Intel Core i7

The Core i7 processor in my 9410 gives me snappy and responsive performance when I want it, then throttles itself down for low power consumption when I don’t. This means I get useful battery life on par with power-frugal Intel Atom machines, yet never feel sluggish during demanding tasks like I would with an Atom-based machine. My typical usage pattern results in 4-6 hours of runtime even with just an 80% charge (more on that below), which ranks favorably among my past Intel-powered laptops. However, it falls short of my Apple MacBook Air with M1 Apple silicon. Both machines deliver snappy performance, but the M1 never gets as hot and runs far longer on battery.

The power and heat situation is a tradeoff against benefits of having an Intel CPU. All my developer tools are available here. I can run Docker containers without worrying about whether I need to find an ARM64 build. And pretty much every random USB peripheral will have Windows drivers. The biggest stumble so far is dual-booting Windows/Ubuntu: Dell configured this machine with Intel RST, and Ubuntu chose not to play well with RST. There are several possible solutions to this problem. I just haven’t been motivated enough to implement any of them yet.

Tablet/Laptop Convertible

Another thing my MacBook Air can’t do is fold its screen around and turn into a tablet. This was a deliberate design decision by Apple, who chose to keep MacBooks differentiated from iPads. After living with a Windows convertible for a few weeks, I’ve decided I’m a fan. I’ve used Windows convertibles before, but they’ve all been budget machines with limited hardware that hampered my user experience. This is my first full-power convertible, and it means I can finally enjoy the benefits of a transformer without tripping over tradeoffs at every turn.

It’s nice to be able to switch back and forth. This is most useful when I’m reading documentation for software tools. I can switch to laptop mode to type a few commands for a quick hands-on exercise, then return to tablet mode and continue reading. With this positive experience I am much more receptive to such machines in the future, but I’m still not willing to pay the large price premium usually associated with such capability. Buying off-lease secondhand machines is likely to remain my pattern.

Power And Charging

Like my M1 MacBook Air, the 9410 charges via a USB Type-C connector instead of a proprietary power plug. However, this doesn’t necessarily mean I can use any USB-C power source. When up and running, it demands 60 watts. (My USB power meter says 20V @ 3A.) This is even more demanding than the M1 MacBook Air, which is happy with 45W. If I plug in a less powerful USB-C supply, the MacBook alerts me that the battery is still draining, though at a slower rate. In contrast, this Dell refuses to accept any power at all.

Fortunately, there are two charging systems on every laptop: there’s an operating system driver when the laptop is on, either running or in low-power sleep mode. Then there’s a completely separate firmware-based mechanism when the laptop is off, either in hibernation or completely shut down. Dell’s firmware-based charging system is willing to accept power from sources that can’t deliver 60W. So if the bundled 60W power adapter fails, or if I lose it, I still have alternatives.

And finally: like my Dell Inspiron 7577, there is a BIOS setting for me to restrict battery charging percentage. I set mine to stop charging at 80%. That still delivers enough battery runtime for most of my usage sessions, and avoiding charging to 100% should improve battery longevity. I can always change that setting back to 100% if I ever need extra runtime.

Whither iPad?

I justified this purchase as an alternative to upgrading my soon-to-be-obsolete 6th gen iPad, and for my usage it was a success. At the moment I can imagine only two reasons why I might still want an iPad. The first is weight. Three pounds is light for a laptop, but that’s triple the weight of an iPad. In practice, this hasn’t been a huge problem as I don’t like holding things up by hand for long periods of time, whether one pound or three. The second differentiator is the Apple app store, but at the moment I don’t need any iPad app exclusives. Maybe something else will arise as motivation for spending several hundred dollars? Until I encounter such motivation, I expect to be well-served by this Dell Latitude 9410.

Dell Latitude 9410 Internals: M.2 2280 Confirmed

Buying a used off-lease computer means giving up shiny-new cosmetic perfection in exchange for a hefty discount, a tradeoff I was willing to make. I don’t think I’ll be bothered by the “Grade B” blemishes on my unit and besides, it’s what’s inside that really counts. Purchasing from Dell Financial Services via their retail site https://dellrefurbished.com includes a 100-day warranty to back up their claim that all machines are fully functional regardless of cosmetic state.

My order confirmation email had a pleasant unexpected bonus: it included the Dell service tag for the specific machine I had just bought. I could then put that service tag into Dell’s support site to learn information about that specific unit before it had even arrived. There were still a few weeks left on its original 3-year warranty, though it would expire before the 100-day refurbished warranty would. Another positive attribute was the fact my unit was not equipped with a cellular data (WWAN) module. I don’t expect to use WWAN, and without the module, the machine should have room for an M.2 SSD in the more common, longer, and cheaper 2280 form factor. The factory configuration list included a model number for the factory SSD, which is associated with an M.2 2280 drive.

The machine arrived a few days later and, once I confirmed the machine functioned as expected, I brought up Dell’s service manual and opened up my unit to familiarize myself with its internals and to confirm information listed on Dell support. Compared with my previous Dell laptops, there was much more extensive use of thin adhesive-backed sheets of various materials. Are they for RF shielding? For airflow management? Other purposes? It’s hard for me to tell but their presence is not surprising in a device designed to be thin and light. I just have to keep in mind I can only remove them a few times before their adhesive gives out.

Modern component miniaturization allowed smaller circuit boards, freeing up more internal volume for the battery, which needs to be disconnected and the system depowered before I disconnect anything else. This battery plug was very securely fastened and difficult to remove, far more difficult than any previous Dell laptop battery connector I’ve encountered. Hard enough that I triple-checked I didn’t overlook some other mechanism I was supposed to release before unplugging the connector. But there were no other mechanisms; it was just a really tight fit.

After the system was depowered, I quickly made my way to the SSD to confirm it was indeed an M.2 2280 unit. This will make future upgrades easier and cheaper than the less common M.2 2230 type used in WWAN-equipped units. Speaking of which, I don’t think I can (easily) retrofit mine with one. The module connector is there on the logic board, but I don’t see any loose wires that would be appropriate for plugging into a cellular modem. So my machine probably lacks cellular antennae as well. Though if I ever come into possession of an M.2 2230 SSD in the future, I might be tempted to give it a shot anyway for curiosity’s sake. Compatible WWAN modules seem to cost about $30-$50 from various Amazon vendors, and maybe I can rig up a less elegant antenna. As long as I keep my expectations modest for such a project, it might still be an interesting data point. In the meantime I’m content to use the machine as-is.

Dell Latitude 9410 Cosmetic Grade B

I’ve decided to buy an off-lease Dell Latitude 9410 from Dell Financial Services, via their retail web site https://dellrefurbished.com. All of the machines have been evaluated to be in good functional order, but some of them have cosmetic blemishes separated into cosmetic grades. Cosmetic grade A is for machines in good shape, and grade B indicates machines that are in… less good shape, at a lower price. Since I’m a cheapskate, I ordered a grade B unit and in my specific case, it wasn’t bad at all!

When the machine arrived, my first surprise was the label at the bottom: “Refurbished to Dell specifications by FedEx Supply Chain.” I expected the evaluation and refurbishment process to be done by a Dell subcontractor; I just didn’t expect to see the FedEx name. Prompted by this surprise, I did a bit of research and found FedEx TechConnect, with an uncertain relationship to “FedEx Supply Chain,” formerly GENCO. To me it seems like an odd side gig for FedEx to take on, but I’m not an MBA in FedEx business development.

I found some damage on the keyboard: a few key caps have worn corners, so the backlight shines through them. It’s not something I’d notice while I’m typing and looking at the screen. This damage is not a surprise in hindsight: when the screen is flipped around to turn this into a tablet, its keyboard becomes the exposed bottom of the device. The previous user of this laptop must have set the tablet down on something that caused this key cap damage.

Another problem with this device is the rubber strip at the bottom: its soft layer is gone. There’s supposed to be a layer of light gray soft material overmolded onto this hard black plastic core strip. With the soft layer gone, all I have is this ugly-looking strip. Fortunately I don’t have to look at it when I’m using the laptop. As a substitute, the refurbishing process added cheap square stick-on rubber feet.

The stick-on squares are much thicker than the missing rubber strip. In laptop mode, this means a larger gap at the bottom for better air cooling. But the thickness gets in the way when I fold the screen around for tablet mode: I can’t fold the screen completely flat with these thick rubber pads in place. I will look for slightly thinner stick-on rubber feet to replace these thick squares, with the goal of restoring the full tablet mode form factor while preserving laptop mode air cooling. I consider this a minor detail that is within my ability to fix, and not a big deal.

I found no noticeable damage on the screen, the metal body, or the lid. Those were my bigger concerns when buying “Grade B,” and my unit is practically pristine on those fronts. I don’t think the slightly scratched keyboard will bother me very much, and I can replace the stick-on rubber feet. I’m perfectly happy accepting those blemishes in exchange for >80% off original MSRP, especially when the internals look perfectly fine.

Dell Latitude 9410 2-in-1 Laptop/Tablet Convertible

When I learned I would need to replace my iPad in the near future, I saw an opportunity to give the Windows laptop/tablet convertible concept another try. My earlier encounters were with machines marred by bloatware, or machines that weighed too much to be practical tablets. A secondhand Acer Aspire Switch 10 worked admirably well, but it was still heavily constrained by its modest hardware. I had been curious to see what the form factor is capable of in a modern sleek lightweight powerhouse. Getting one meant paying a lot of money to buy new, which I was unwilling to do. But now that I can buy a heavily discounted off-lease unit from https://dellrefurbished.com, I’m going to give that a try.

I started keeping an eye on the “Tablets and 2-in-1 Laptops” section of the site. Batches of various machines came and went. Low-end offerings are built around humble Intel Atom processors, going up through high-end machines with Intel Core i7 CPUs. I would pull up Dell’s specifications for various model numbers and eventually started focusing on a high-end model that came through with some regularity: the Dell Latitude 9410. When new, these cost in the ballpark of two thousand dollars and frequently ended up in the hands of senior corporate executives, as much status symbol as productivity tool. Now they are listed for around $700, which is a decent price for their capabilities. But if I can get one with a 50% discount code, that would bring it down to $350: over 80% off original MSRP, and exactly the cost of a new 10th generation Apple iPad.

Latitude 9410s that come through dellrefurbished are pretty well equipped. Usually a CPU from Intel’s Core i7 line, and usually with 16GB of memory and 256GB or 512GB SSDs for storage. The touchscreen has 1920×1080 Full HD resolution, mounted on a double-jointed hinge that allows the user to fold the screen all the way around for a tablet-like form factor. All this in a package that weighs in the ballpark of 3 pounds. As is typical of Dell machines, there is a service manual available showing its internals. I was mildly disappointed to see its memory chips are soldered to the board and cannot be upgraded, but at least SSD storage uses standard M.2 NVMe form factors. Units with cellular data (WWAN) modems are constrained to short M.2 2230 SSDs. Units without WWAN have room for M.2 2280 SSDs, which are more common.

Contemporary reviews say the Latitude 9410 is a very capable machine in a great form factor but came at a very high price. Well, a dellrefurbished discount code can solve that last part, but why would they need to discount so heavily to move inventory? My conjecture is that, while this is a great Windows laptop/tablet, it doesn’t exist in a vacuum. The Latitude 9410 launched in 2020, and what else launched around the same time? Apple’s M1, which caught Intel flat-footed. The Apple Silicon launch had all the buzz in the tech press, soundly beating equivalent Intel chips in power efficiency: either more processing power at the same level of electrical power consumption, or far lower electrical power consumption at the same level of computing power. Apple laptops with the M1 chip have battery life that puts Intel-based machines to shame. Four to six hours of battery runtime isn’t bad for an Intel CPU laptop, comparing well to those that came before, but looks pretty sad next to the all-day (or even multi-day) use people get out of an Apple Silicon laptop battery. This and many other advantages of Apple’s 2020 laptops meant the Latitude 9410 had stiff competition, both as a useful tool and as a status symbol.

Now in 2024, people who want to spend a few hundred dollars on a few-years-old used laptop might be more inclined to look at old Apple Silicon machines instead of an Intel-based Latitude 9410. If so, that would explain why Dell Financial Services has to discount them heavily to find buyers. Whether my conjecture is correct or not, the fact is I can now get a great deal on what used to be Dell’s top-of-the-line Windows laptop/tablet convertible. Especially if I’m willing to accept some cosmetic flaws.

Windows Convertible As iPad Replacement

I recently learned that Dell Financial Services operates a site to sell their off-lease computers. And even better, they’re willing to crank up discounts to move inventory. I spent a few fun weeks window-shopping their machines for sale, ranging from super tiny thin clients that bolt to the back of a monitor up to beefy servers designed for a data center equipment rack. But I have no real need to buy a computer. I’ve got my XPS 8950 desktop for gaming and VR. I’ve got my laptop for portable computing. I’ve got an old machine running TrueNAS, another running Proxmox, and several more old computers standing by waiting for a purpose. After a while I realized I was thinking too conventionally.

During this time, Apple released the iPadOS 18 preview and a list of hardware they intend to support. Absent from that list is my 6th generation iPad. I bought it after disappointing experiences with a Windows 8 tablet and an Amazon Fire tablet. The tablet ecosystem is built around Apple’s iPad, and I’ve found products that undercut an iPad on price lack the hardware for smooth user experiences. For tablet-centric usage scenarios, it was much more pleasant to use an iPad than the Samsung or Amazon Fire. On the flip side, I frequently felt limited by a tablet’s intentionally restricted capabilities. One example: I enjoy reading digital documents on an iPad, including documentation for software development tools. But when I get to a hands-on section, I have to switch hardware because an iPad is very deliberately not a general purpose computer, and I can’t develop software on it.

With the knowledge that the clock is ticking on my 6th generation iPad, I started browsing for a replacement. My key priority is a USB type-C connector because I don’t want to deal with Apple Lightning cables anymore. That meant the 10th generation iPad, available for around $350. As I shopped around to see if I could get one for less than $350, it occurred to me that I should consider a Windows tablet/laptop convertible device. I had dismissed them for a long time because of first-hand experience with underpowered hardware, and I wasn’t willing to pay the premium for high-end convertibles. But now I have a resource for heavily discounted Dell business machines! I am willing to give Windows tablets another try when I can buy a powerful off-lease tablet/laptop convertible for new-iPad money.

Adafruit PyCamera Library Includes Custom OV5640 Support

I am playing with my Adafruit Memento a.k.a. PyCamera, which means digging into Adafruit’s CircuitPython library and sample code. I first looked over its photography parameters under software control, and now I move on to the software implementation. Skimming through source code for the PyCamera CircuitPython library, I see its functionality was largely stitched together from existing Adafruit CircuitPython libraries corresponding to hardware components on board. The notable exception is that PyCamera has its own code to interface with the OV5640 camera module instead of leveraging the existing OV5640 CircuitPython library. This is interesting: why might Adafruit choose to split their OV5640 support?

The PyCamera library is very much a “batteries included” bundle optimized for a particular piece of hardware, and the split fits with this design priority. Not only does the library expose interfaces to the camera module and the LCD screen module, it has code optimized for them to work together. The viewfinder capability is one example. This is literally all it takes to continuously send camera sensor data to the LCD:

import adafruit_pycamera

# Initialize all of Memento's onboard hardware in one call
pycam = adafruit_pycamera.PyCamera()
while True:
    # Copy each captured frame straight to the LCD
    pycam.blit(pycam.continuous_capture())

Adafruit’s OV5640 library does not have a continuous_capture() method, and their ST7789 library lacks a blit() method. Together these two PyCamera-specific APIs minimize delay between camera capture and LCD output so we can have a responsive viewfinder. Code comments in blit() explain that it bypasses their displayio graphics library for speed, at the cost of not playing well with overlapping graphics elements. To mitigate this problem, camera UI text is not rendered by camera app code; it is rendered by make_camera_ui() within the PyCamera library to ensure it stays clear of the blit() zone. This was not how I had expected the implementation to go. Interesting!

Another difference I found is that PyCamera OV5640 code is built on top of the espcamera module, restricting it to Espressif microcontrollers. In contrast, the standalone OV5640 library uses an abstraction layer called imagecapture. Searching CircuitPython source code, I see implementations for Atmel SAMD and Raspberry Pi Pico microcontrollers, but not for Espressif. I’m sure it would have been possible to add Espressif support to the existing OV5640 library, but I can see how it was easier for PyCamera to go its own way: it knows it has an ESP32-S3, and it wants speed optimizations tailored to that hardware.

While I can understand this approach to designing the PyCamera library, it does make exploration more difficult. A big monolithic library means it’s harder to compose its elements in different ways to experiment off the designed path. I want a quick experiment, and the monolithic nature means I have to design something that largely resembles existing PyCamera sample code but with my added twist.


Appendix: On the Arduino side, PyCamera is also tied to Espressif’s camera library. For the OV5640 by itself, Adafruit’s guide to their breakout module didn’t mention Arduino at all.

Adafruit Memento a.k.a. PyCamera Photography Parameters

I would love to build upon Adafruit’s work and make something cool with their Memento camera module at its core, but before I brainstorm ideas I need to know what’s already on hand. After reviewing the hardware side of this system, I moved on to the software side. Looking at sample code I immediately saw mention of a “PyCamera”. As far as I can tell, it’s the same thing. Adafruit’s Arduino sample code documentation uses the two names interchangeably. Perhaps PyCamera was a development code name for the product that eventually launched as the Memento? Perhaps Adafruit was afraid Arduino fans would pass over a product named PyCamera thinking it implied CircuitPython exclusivity?

One angle Adafruit used to promote Memento is the programmatic control we have over our photography. Given this sales pitch, I wanted to check out this camera’s capability in photography terms I’m familiar with. Between reading Adafruit source code and the “OV5640 register datasheet” available on their downloads page, here is my understanding:

Aperture

I found nothing that I recognize as a counterpart to controlling camera aperture. Maybe I’ll find something later, but for now I believe aperture is fixed and we can’t play with our depth of field or other aperture-controlled photography techniques.

Shutter Speed

There’s no physical shutter in an OV5640, but “exposure” affects how much time the camera takes to read sensor values. The default setting is to use its built-in automatic exposure control (AEC), which varies image integration time based on an internal algorithm, but it is also possible to switch the camera over to manual exposure mode for deliberately over- or under-exposed pictures. To a limited degree, at least. Even manual control is limited to the range of “normal” photography, so no multi-hour exposures here. The register datasheet outlines the range of values, but I don’t understand what they mean yet.
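For a concrete picture of what manual mode involves: per my reading of the register datasheet, manual exposure is a 20-bit value split across registers 0x3500 through 0x3502. A helper to pack a value into those three register writes might look like this (a sketch based on the datasheet layout; the function name is my own and I haven’t tested this against hardware):

```python
def exposure_to_registers(exposure):
    """Split a 20-bit OV5640 manual exposure value across its
    three AEC registers, per the register datasheet layout."""
    if not 0 <= exposure < (1 << 20):
        raise ValueError("exposure must fit in 20 bits")
    return {
        0x3500: (exposure >> 16) & 0x0F,  # EXPOSURE[19:16]
        0x3501: (exposure >> 8) & 0xFF,   # EXPOSURE[15:8]
        0x3502: exposure & 0xFF,          # EXPOSURE[7:0]
    }
```

The resulting dictionary maps register addresses to the bytes I’d write over I2C after switching the camera into manual exposure mode.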

Sensitivity (ISO)

The conceptual counterpart for OV5640 is “gain”, and there is again the default of automatic gain control (AGC) with the option to turn off AGC and write values to specific registers to control gain. The register datasheet discusses the range of values, but I don’t understand what they mean yet.
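Again for illustration: my reading of the datasheet is that manual gain is a 10-bit value in steps of 1/16, split across registers 0x350A and 0x350B, so a register value of 16 corresponds to 1x gain. A sketch of converting a desired gain multiplier into those two register writes (my own helper name, untested on hardware):

```python
def gain_to_registers(multiplier):
    """Convert a gain multiplier (1x, 2x, ...) into the two OV5640
    manual gain registers. Gain is in 1/16 steps, so a register
    value of 16 means 1x."""
    value = int(multiplier * 16)
    if not 0 <= value < (1 << 10):
        raise ValueError("gain out of 10-bit range")
    return {
        0x350A: (value >> 8) & 0x03,  # gain[9:8]
        0x350B: value & 0xFF,         # gain[7:0]
    }
```

As with exposure, these writes would only take effect after turning off AGC.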

White Balance

We can turn automatic white balance (AWB) on or off, but that’s all I know from this document. What happens when AWB is turned off is out of scope. The Adafruit library exposes set_camera_wb(), but then we’re out of luck for the actual values passed into that API: “For advanced AWB settings, contact your local OmniVision FAE.”

Focus

This was the most exciting part for me, because the vast majority of camera modules available to electronics hobbyists have a fixed focus. The OV5640 on board the Memento has a voice coil motor (VCM) to move its optical path and adjust focus. One of the Adafruit demos performed focus-stacking, so I know we have programmatic access, and the camera test app exposes the ability to perform auto-focus. I was looking forward to seeing an auto-focus algorithm in detail!

Unfortunately my hopes were dashed. Indeed we have programmatic access to move the lens within its range of positions, and indeed we have access to an auto-focus algorithm, but the two are separate things. The auto-focus algorithm is an opaque binary blob uploaded to the camera running on its built-in microcontroller. We do not get to see how it works.

On the upside, there are a few auto-focus modes we should be able to select, including ones that allow us to specify a region for focus. These controls were designed to support the “tap to focus” usage pattern common to touchscreen cell phone camera apps. So while we don’t get to see the magic inside the box, we have some amount of control over what happens inside. On the downside, this capability is not exposed via the Adafruit PyCamera CircuitPython library API, so some modifications will be required before experimentation can commence. Since I might be doing that, I should dig in to see what’s under the hood.
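Even though the OV5640’s own algorithm is hidden in that binary blob, contrast-detection auto-focus in general works by sweeping the lens through positions and keeping the one that maximizes a sharpness score over the region of interest. Here is a toy sharpness metric in plain Python to illustrate the idea; this is my own sketch of the general technique, not the firmware’s actual method:

```python
def sharpness(pixels, width, height):
    """Sum of squared differences between neighboring pixels: a
    simple contrast metric that peaks when the image is in focus."""
    total = 0
    for y in range(height - 1):
        for x in range(width - 1):
            p = pixels[y * width + x]
            total += (p - pixels[y * width + x + 1]) ** 2  # horizontal neighbor
            total += (p - pixels[(y + 1) * width + x]) ** 2  # vertical neighbor
    return total

# A crisp edge scores higher than a smooth ramp covering the same range
sharp = [0, 0, 255, 255] * 4    # 4x4 image with one hard vertical edge
blurry = [0, 85, 170, 255] * 4  # same brightness range, spread as a gradient
assert sharpness(sharp, 4, 4) > sharpness(blurry, 4, 4)
```

An auto-focus loop would step the VCM through its range, score each frame’s focus region this way, and settle on the lens position with the highest score.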

Adafruit Memento Camera Hardware

I’ve opened my Adabox 021 and assembled the Memento camera within. Installing and running their “Fancy Camera” CircuitPython test app was fun for a bit and a good test to verify everything worked, but my objective with Adafruit products is always to try building something of my own. At the moment I’m on a theme of learning CircuitPython. But before I start writing code, I need to know what hardware peripherals are available.

As per usual, Adafruit publishes great documentation for their products, and I quickly found their Memento camera hardware peripheral pinout page. It is pretty packed! The good news is this means a lot of peripherals are already on board and available for experimentation. The bad news is that packing in all this hardware doesn’t leave much room for adding anything else. I had expected to find a handful of pads we could solder to access extra unused pins, but looking at the schematic, nearly every pin is already in use. In fact, the design had already “run out”: one of the peripherals on board is a GPIO expansion chip adding even more pins in order to read user button presses.

The only pin not already in use was routed to expansion port A0, which is open and available. (Port A1 is already occupied driving the front face plate LEDs.) There’s another open port marked STEMMA QT, a connector Adafruit uses to make I2C connections easy. It taps into the I2C bus, which already has several passengers on board.

Mechanically, the most obvious approach would be to tap into existing front and rear faceplate mounting points. They are held with small fasteners and it would be easy to use them to bolt on something else. I may have to find longer fasteners when I do, though. Another approach would be to go tool-free and use something with a bit of spring/flexibility to clip onto support posts from the side. Other than those four support posts (one at each corner, with fasteners front and back for faceplates) I saw no provisions for mounting hardware. But I think those four posts will be enough.

Now that I have some idea of the hardware on board, time to look at the software side of Memento.

Adafruit Memento Camera Assembly (Adabox 021)

I’m having fun learning Adafruit’s CircuitPython, alternating between beginner-friendly tutorials and diving into more advanced topics like USB endpoints. Next step is popping back into tutorial world as I open my Adabox 021.

This was my first Adabox. I started my subscription sometime during the pandemic hoping for a nice distraction, but global supply chain issues meant Adafruit had to put the program on hold. Subscribers are not charged until a box ships, so the hold didn’t cost me any money; it just meant I didn’t get the distraction I wanted. Finally the program resumed with Spring 2024’s Adabox 021, and I liked what I saw when I opened the box. Everything was wrapped up with a nice presentation, like a gift box.

The core of Adabox 021 is their Memento camera board, which can be purchased on its own, but subscribers get several accessories to go with the board, like face plates front and back to protect fragile electronics on the main camera board. I was amused that fun photography-related quotes have been placed on inner surfaces where they can be read during assembly but are hidden out of sight once assembled.

I followed assembly instructions and everything came together smoothly. But the camera couldn’t take any pictures. A bit of troubleshooting pointed to the bundled 256MB microSD card as my problem. When I swapped it out with a 4GB microSD card I had on hand, the camera started working. My computer couldn’t read the 256MB card either. So I followed “microSD Card Formatting Notes” and used the SD Association’s formatting utility on the 256MB card, which seemed to run as expected. After formatting, I could use my PC to write a text file to the card and, once ejected and re-inserted, read the text. But if I then put that card into Memento, it would not be recognized as a valid storage device. And after that, my PC could no longer read the card either, and I had to format it again. Something’s weird here, and I’m not alone, but since I had a 4GB card that worked, I’m not going to worry too much about it. It’s much more interesting to start examining details of this device.

Dusting Off Adafruit KB2040

My latest distraction is a Raspberry Pi Pico W, a promising alternative to ESP32 for a WiFi-capable microcontroller board. After a cursory experiment with a single LED, I’m ready to connect more components to the board to explore its capabilities. Since it’s new hardware, though, there’s always the chance I’ll goof up somewhere and destroy the microcontroller. I don’t want to take that risk because I’m borrowing this particular Pico W. It’s not mine to blow up! Thinking over my options, I remembered that I already had microcontrollers built around the same RP2040 chip: a pair of Adafruit KB2040.

I got my KB2040s as an Adafruit promotion. They typically have something set up to entice us to put a few more items in our shopping cart, and these were a “free KB2040 with $X purchase” type of deal I got across two separate purchases. As the product page made clear, the KB2040 was designed for people building their own custom keyboards. So it was built to be physically compatible with an existing board (Pro Micro) popular with that niche. I had no intention of building my own custom keyboard, but I thought the RP2040 chip was worth a look at some point in the future. That time is now.

There are a few tradeoffs in using a KB2040 as a substitute for a Pico W. The first and most obvious is that the KB2040 doesn’t have the WiFi/Bluetooth chip, so I could only practice aspects of programming an RP2040 before bringing in wireless connectivity. Equally obvious is the different physical form factor, which means a different pinout. I didn’t think that was a huge deal because I’m not trying to install the board on anything that expects a specific pinout. I printed Adafruit’s pinout reference chart and thought I was good to go, but I wasn’t. While the RP2040 chip lies at the heart of both a Pico W and a KB2040, they have different peripheral chips. This means I’d need a KB2040-specific build of software, and it is not on MicroPython’s list of supported Adafruit boards. D’oh! Fortunately, Adafruit supports the KB2040 for CircuitPython. Is it close enough for my purposes? I now have motivation to find out.

Test Run of Quest 2 and Eyeglasses

OK, so sticking some googly eyes on my Quest 2 wasn’t a serious solution to any problem, but there was another aspect of Apple Vision Pro I found interesting: they didn’t make any allowances for eyeglasses. Users need to have perfect vision, wear contacts, or order lens inserts that clip onto their headset. This particular design decision allows a much slimmer headset, and is a very Apple thing to do.

The Quest 3 headset has similar provisions for clip-on lenses, but my Quest 2 does not. And even though the Quest 2 technically allows for eyeglasses, it is a tiny bit too narrow for my head and pinches my glasses’ metal arms against my head. I thought having corrective lenses inside the headset would eliminate that side pressure and was worth investigating.

Since Zenni isn’t standing by to make clip-on lenses for my Quest 2, I thought I would get creative and reuse one of my retired pairs of eyeglasses. I have several that were retired due to damaged arms, and they would be perfect for this experiment. I selected a pair, pulled out my small screwdriver set, and unfastened the arms, leaving just the front frame.

My aim for this first test was quick-and-dirty: I used tape to hold the sides in place and didn’t bother trying to find an ideal location.

The center was held with two rolled-up wads of double-sided foam tape. I believe the ideal spacing is something greater than zero, but this was easy for a quick test.

Clipping the face interface back on held my side strips of tape in place. I put this on my face and… it’s marginally usable! My eyesight is bad enough that I would just see a blur without my eyeglasses. With this taped-on solution, made without any consideration for properly aligned position, I could make out the majority of features. I still couldn’t read small text, but I could definitely see well enough to navigate virtual environments. I declare this first proof-of-concept test a success. I will need to follow it up with a more precise positioning system to see if I can indeed make my own corrective lens accessory for my Quest 2.

Reducing VR Headset Isolation

One advantage of Quest 2’s standalone operation capability is easy portability. I have a friend who was curious about VR and wanted to get some first-hand experience, and we were able to meet up for a demo with my Quest 2. No need to lug around a powerful PC plus two lighthouse beacons for a Valve Index.

At one point during the test drive, my friend turned towards me to talk about something. He could see where I sat because he had the pass-through camera view active, but all I saw in return was the blank white plastic front surface of my Quest 2. It was a little disconcerting, like conversing through a one-way mirror. After that experience I understood the problem Apple wanted to solve with Vision Pro’s EyeSight feature.

It’s a really cool idea! EyeSight is a screen mounted on the front of the headset that displays a rendering of the wearer’s eyes so people around them have something to focus on. There’s a lot of technical sophistication behind that eye rendering: because Vision Pro tracks the direction of the wearer’s gaze, those replicated eyes reflect the actual direction the wearer is looking. Our brains are highly evolved to interpret gaze direction (a very useful skill out in the wilderness, to know if a saber-toothed cat is looking at us) and EyeSight aimed to make it effortlessly natural for all our normal instincts and social conventions to stay intact.

I have not seen this myself, but online reports indicate EyeSight falls short of its intentions. The screen is too dark to be visible in many environments, a problem made worse by the glossy clear outer layer reflecting ambient light. It is further dimmed by a lenticular lens layer that tries to give it a 3D effect, which is reportedly not very convincing: those rendered eyes are still obviously in the wrong place, not where the real eyes are.

Given Apple’s history of hardware iteration, I expect future iterations of EyeSight to become more convincing and natural for people to interact with. In the meantime, I can build something with 80% of the functionality for 1% of the cost.

I stuck a pair of self-adhesive googly eyes(*) to the front of my headset, and that gives human eyes something to look at instead of a blank white plastic face. It bears no resemblance to the wearer’s eyes within (or at least I hope not) and does not reflect actual gaze direction. On the upside, it is a lot more visible in bright environments and far more amusing. Yeah, it’s a silly thing, but don’t worry, I have serious headset modification project ideas too.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases

Quest 2 Standalone and Mixed Reality Operation

While it was instructive to compare Quest 2 specifications with my other VR headsets, the biggest reason I wanted to try a Quest 2 is its standalone capability. After spending some time with it, I’ve decided I’m a fan. It’s much easier to enjoy a virtual environment when I’m not constantly kicking away the cable tethering me to my gaming PC. All else being equal, a wireless experience is superior. Unfortunately, all else is not equal. The cell-phone-level hardware in a Quest 2 renders a decidedly lower-fidelity world relative to what a modern gaming PC can render. It’s a nonissue for something simple and abstract like Beat Saber, but anything even slightly ambitious looks like a PC game from at least ten years ago.

One way to have the best of both worlds is wireless streaming from a gaming PC to my Quest over home WiFi. I tried Steam Link on Quest and was impressed by how well it worked. Unfortunately, it doesn’t work quite well enough just yet. When I’m playing games on a monitor, a few milliseconds of latency plus an occasional (about once per minute) stutter of one or two frames is fine. But on a VR headset, it quickly gives me motion sickness and a headache. Supposedly this can be improved with a WiFi 6 router, but I’m not willing to replace my home WiFi infrastructure for this feature. For the immediate future, I’m happy using my Valve Index for SteamVR experiences.

Mixed Reality

And finally, Meta’s push for Mixed Reality is still a question mark. All three of my VR headsets let me use their cameras to see real-world surroundings. But Quest is the only one of the three that does the work to map that camera footage into a convincingly realistic spatial layout around me. The HP WMR and Valve Index camera views can give me a rough idea if I’m about to run into a wall, but neither is mapped precisely enough for me to, say, reach out and grab something.

To support mixed reality scenarios, Quest advertises hand-tracking capabilities for controller-free experiences. Supposedly this works well on the Quest 3, which has additional color cameras for the purpose. On my Quest 2 it’s pretty unreliable today: my house has beige walls and carpet, so my hands have poor contrast for its black-and-white cameras to pick out.

Both of these capabilities show promise, but they’re both relatively new and I will have to wait for novel uses to emerge in mixed reality experiences yet to come. Apple’s Vision Pro is all-in on mixed reality, though, and offers to solve a problem that the Quest 2 does not.

And A Quest 2 Too

One reason I was willing to take apart my old HP Windows Mixed Reality system is the Meta Quest 2. Now that the Quest 3 has taken the mainstream position in their product line, Quest 2 inventory is getting cleared out at $200. That price was too tempting to resist, so I got one even though I had a perfectly functional Valve Index. Here are some notes from my first-hand experience.

Versus HP Windows Mixed Reality

I did not find a single spec-sheet advantage my old WMR headset had over the much younger Quest 2. Technology moves fast! The Quest 2 has higher screen resolution, an integrated microphone and speakers, and controllers happy to run on a single nominal AA battery instead of demanding two fully-charged AAs. Both use camera-based inside-out tracking, but the Quest 2 maintains better tracking because it uses four cameras instead of two, and those cameras do not demand I turn on every light in the house if I want to use it at night. The Quest 2 has some level of IPD adjustment with three settings, whereas the HP had no IPD adjustment at all.

I have not yet decided if I prefer Quest 2’s elastic headband versus HP WMR’s headband. I think the HP headband was the best part of the device and I may try 3D printing an adapter to use it with my Quest 2 to see if that’s an improvement.

Versus Valve Index

On the spec sheet, the Valve Index has a resolution disadvantage against the Quest 2: fewer display pixels, spread across a wider field of view. In practice, I found the wider field of view much more important for immersive VR. I am happy making the tradeoff for a better field of view, but obviously I wouldn’t say no to both if I could get them in a future headset.

Beacon-based tracking used by the Index meant I had to add those two little boxes to my room, but the results are worth it. The Index has consistently better tracking, especially for games where my hands have to move out of my field of view. (Reaching behind my back, or holding a hand on my chest while looking up.) The Index controllers themselves are also much better than Quest controllers, with individual finger sensing, grip pressure sensing, and straps allowing me to open my grip without the controllers falling out. It’s a great immersion advantage; too bad Half-Life: Alyx is the only game that takes full advantage of Index controllers.

Both have an integrated microphone and speakers, but the Valve Index delivers much better positional audio. The Index is significantly heavier, but part of that weight is a headband that balances things across my head, versus Quest 2’s thin elastic band. And finally, the Index has better optical adjustment capabilities: not only smooth IPD adjustment (instead of three fixed positions) but also fore-aft adjustment.

The Index is a much more comfortable headset for longer sessions and provides a more immersive VR experience compared to a Quest 2. But we have to consider their relative price tags: it’s better, but it’s not five times better. Even more if you count the cost of a gaming PC! Plus, the comparisons here overlook what’s arguably the Quest 2’s greatest advantage: it doesn’t need an associated gaming PC at all.

End of Windows Mixed Reality

In December 2023, Microsoft announced that Windows Mixed Reality had been deprecated and would be removed from Windows 11 24H2. This did not come as a surprise, as the platform hadn’t seen any investment in years. But it does mean my HP WMR headset will officially become a paperweight later this year.

This is fine by me, because my headset has pretty much been a paperweight since I damaged its cord. I tried fixing it and was seemingly successful, but there was a chance my fix was flawed. An errant pin could potentially ruin an expensive video card, so I never really put the headset back into use. It is old anyway, lacking features of newer headsets. Heck, it was old and out of date when I got it! At that time, WMR was already… not a resounding success… and my local Best Buy decided to clear out their slow-moving inventory with heavy discounts.

What could I do with it now? There was never any compelling WMR-exclusive experience for me, so I don’t have anything to revisit before it’s gone. And since I’ve upgraded to a Valve Index, I have a superior experience for everything in SteamVR. I guess I could use the deprecated WMR headset for experiments I don’t want to risk on my expensive Valve Index, but I don’t have any project ideas along that direction. There’s no particular reason to hang on to it “just in case” an idea comes up because (1) it’ll stop working by the end of the year, and (2) if I want to run VR experiments on an affordable headset, I have the option to go pick up a Meta Quest 2, which is not only affordable but would let me explore untethered VR as well as opening the door to Quest-exclusive experiences.

During my long inkjet teardown/Dell XPS debugging saga, I would frequently think about what I could do with this obsolete WMR headset. After a few months of not coming up with anything interesting, I will proceed with the ultimate fallback option: it is teardown time!

My Cell Phones Before Android, 1998-2013

I recently rediscovered this picture of all my cell phones from 1998 to 2013. I took this group picture shortly before sending most of them to electronic waste disposal. At the beginning of that fifteen-year period, these were “cell phones” to specify that they worked on a wireless network. By the end of that period, they were just “phones,” and what used to be “phones” had become “landlines.” It would have been symbolic to post this note on August 30th, 2023, the picture’s 10th anniversary, but I’m a few months late.

The oldest phone on the far left is a Sony CM-H888. I bought it in September 1998, and at the time it was a wonder of miniaturization, much smaller than its contemporary analog peers. Yes, analog! This was a telephone for making voice calls over an analog cellular network and nothing else. No internet, no apps, not even SMS. It looks bulky compared to the rest of this lineup mostly because of its 4*AA NiMH battery pack consuming over half of its volume. It is the only device on this list not powered by a lithium-ion battery.

Rapid technology advancement motivated me to part with my money. I upgraded to a Nokia 8260 a year later (October 1999), which weighed less than half as much (97g, down from 220g), eliminated the protruding antenna, and was a comfortable fit in my pocket instead of a barely-fitting bulge. Multiple technologies helped make this possible, including the lithium-ion battery and a switch from AirTouch Cellular‘s analog network to AT&T Wireless TDMA digital cellular. It also gave me my first exposure to a phone app in the form of Nokia’s legendary snake game.

A few years after getting the Nokia 8260, I bought a Compaq iPaq personal digital assistant (PDA) to help track my calendar and related adulting information that I could no longer keep entirely in my head. I appreciated having a pocket reminder of my responsibilities, and I admit to a certain level of Geek Cred for carrying around these electronic devices, but it still meant I was carrying both of them!

Consolidation came in December 2003, when I upgraded to a Motorola MPx200. It was the device that launched the “Windows Mobile Smartphone” OS, which gave me phone apps to functionally replace my PDA. The screen resolution of 176×220 was a huge upgrade over the Nokia brick, but lower than the iPaq’s 240×320. Plus, both of those screens were monochrome, and now I had a color screen. Upgrading from TDMA to GSM digital cellular also meant I gained access to SMS text messaging. And finally, switching to a flip phone eliminated accidental butt-dials.

But it was a lot thicker than the Nokia and didn’t fit in my pocket as nicely. So a year later (December 2004) I upgraded to an Audiovox 5600 (HTC Typhoon). It had all the features of the Motorola MPx200 at the size of the Nokia 8260, so it was almost the best of both worlds. The only thing I considered a downgrade was that butt-dials started happening again. Especially annoying was a feature where holding down “9” would automatically dial “911,” and I could not figure out how to disable it.

So when the Cingular 3125 (HTC Startrek) launched, it caught my attention, and I bought one in March 2007. It was a flip phone, eliminating embarrassing butt-dials again, but far thinner than the Motorola MPx200. Hardware had advanced enough to put an iPaq-resolution screen (240×320, and in color) into a phone, and the laser-etched metal keypad looks way better in person than in pictures.

The first Apple iPhone also launched in 2007, but as an expensive premium product. My Cingular 3125 cost a small fraction of the iPhone up front and did not require an expensive cellular data plan as the iPhone did. But the cost gap narrowed over the following years. Apple iPhone prices (along with corresponding data plan prices) eventually dropped within reach of mass-market consumers, and it was clear slabs of touchscreen glass were the way of the future.

The AT&T HTC Pure (HTC Touch Diamond2) weighed about as much as my Cingular 3125. It lost the cool laser-etched keypad in exchange for a much larger and higher-resolution (480×800) screen. It was one of several non-Apple efforts to follow the iPhone’s lead as of January 2010, and a pretty poor showing at that. The marketing team tried their best to find advantages, but it was pretty futile. Example: the 480×800 screen resolution was higher than the iPhone 3GS’s, but that marketing item was quickly buried by the “retina display” of the iPhone 4. Phones like the HTC Pure could only compete at a lower price, and I was fine with that. My Cingular 3125 was falling apart, held together with glue and tape. A cheap not-as-good-as-iPhone unit would suffice.

Minimizing usage of an expensive data plan meant my HTC Pure did not get used as a smartphone very much: mostly just voice calls and calendar, similar to how I had used my earlier phones. I didn’t know what I was missing until I upgraded to a Samsung Focus in November 2010. Windows Phone 7 was a huge advancement. Its first-party experience became a credible competitor to iPhone and Android, but third-party app support was inferior and would never catch up.

My biggest complaint with the Samsung Focus was its AMOLED screen. The bright, high-contrast colors worked well for video and pictures, but its RGBG PenTile matrix proved horrible for text legibility at those resolutions. So when the Nokia Lumia 900 launched with classic RGB color pixels, I jumped over in July 2012. I was happy to accept some color and brightness limitations of an LCD screen in exchange for more legible text. Beyond its screen, I preferred Nokia’s sleek industrial design over Samsung’s anonymous black blob.

And finally, at the far right of this lineup, is a Nokia Lumia 620 I bought in May 2013. All the Nokia design and RGB matrix of the 900, but in a smaller package running Windows Phone 8. It was fine, but it was still a Windows Phone. After multiple major updates (7.5, 7.8 and 8) it became clear Microsoft was unable or unwilling to match iOS/Android on third-party app support. After losing faith in Microsoft, I never upgraded to Windows Phone 10… er, sorry, “Windows 10 Mobile”. Because rebranding always solves fundamental product issues.

I switched to Android in 2015 with a Nexus 5, and I’ve had Android phones ever since. I still have many of them (and try to keep them running) but a group photo wouldn’t be very interesting as they’re all touchscreen slabs. (Effectively this photo.) RGBG PenTile AMOLED panels came back into my life with recent phones, but I found that I don’t mind as much at modern phone screen resolutions. I have fewer than a dozen apps installed on my current phone, so I never got into apps in a big way. But if I need one, I can be confident an Android app exists. I no longer have to worry about whether an app exists for Windows Phone.

I hardly noticed when Microsoft finally pulled the plug on their phone OS efforts. I was long gone. It’s hardly the only platform I own that Microsoft axed.

Dell XPS 8950 Components Replaced Under Warranty

My six-month-old Dell XPS 8950 had been exhibiting intermittent bug checks. (Blue screens of death.) Since it was still under warranty, I wanted Dell to fix it. The tech support department tried their best to fix it in software, but they eventually decided hardware component replacement would be required to get this system back up and running reliably.

The premium I paid for the XPS included on-site service visits as a perk. Dell dispatched a technician (an employee of WorldWide Tech Services) to my home with a job order to replace the SSD and power supply. This made sense: a bad SSD would corrupt system files and cause the kind of seemingly random and unpredictable errors I saw, and if the power supply had gone bad, intermittent power glitches could do the same. As far as system components go, they are relatively inexpensive and easy to replace, so it made sense for Dell to try them first.

Unfortunately, this repair job went awry. When the technician powered my system back up, there was no video from the RTX 3080 GPU. Intel’s integrated video worked once the GPU was removed, so the rest of the system seemed fine. A follow-up visit had to be scheduled for another technician to arrive with a replacement RTX 3080 GPU to get things back up and running. I hope the first technician didn’t get in too much trouble over this, as RTX 3080 cards are not cheap.

The evening after the system was back up, another bug check occurred. Two more occurred within the 24 hours that followed. I reported this back to Dell and they asked if I would be willing to send the system to a repair depot. I didn’t care how it was done, I just wanted my system fixed, so I agreed. They sent me a shipping box with packing material and a shipping label. I guess they didn’t expect people to hang on to the original box! (I did.)

Looking up the shipping label address, I found a match for CSAT Solutions, apparently contracted by Dell to perform such repairs. These people worked fast! According to FedEx tracking information, the box was delivered to CSAT at 11AM, and by 4PM it was back in FedEx possession for the return trip. I had set up the machine to run Folding@Home and included instructions to reproduce the problem, but it’s clear they ain’t got time for that nonsense.

An invoice in the box indicated they replaced the CPU and RAM: two more components that, if faulty, can cause random bug checks. They are significantly more expensive than an SSD or power supply, so I understand why they weren’t first to be replaced. (An RTX 3080 costs more, but that wasn’t part of the plan.)

I reinstalled Windows 11 again and fired up Folding@Home. This time there were no bug checks even after running for seven days nonstop. Hooray! I’m curious whether it was the CPU or RAM at fault (or both?) but at this point I have no way to know.

Due to component replacements, I almost have a different computer: of its original parts, the metal enclosure and main logic board are all that remain. Dell fixed the computer under warranty at no financial cost to me, but at significant time cost. If I value my time at, say, $50 an hour, I would have been better off just buying a new computer. As for Dell, whatever profit they made on this sale has been completely erased and became a net loss. I’m glad this problem was fixed under warranty, but both sides would prefer to avoid the whole exercise. I hope this gives them a financial incentive to improve system reliability!

Despite the headaches of this particular episode, the fact it was repaired under warranty made me quite willing to buy more refurbished Dell computers.

Notes On Diagnostics From Dell Support

My Dell XPS 8950 started exhibiting unpredictable bug checks. (Blue Screens of Death.) I poked around Dell’s SupportAssist software and found a lot of promising troubleshooting tools, but none of them fixed it. Out of ideas for software fixes, and unwilling to void the warranty by modifying hardware, I used SupportAssist’s text chat feature to open an official trouble ticket with Dell technical support. They eventually fixed the issue, but it took a few weeks to get there.

As expected, they wanted to try the easy things first. This meant repeating many SupportAssist tools which I already knew were doomed to fail, and Windows tools (like restore points) that did no better. Since hardware diagnostic tests passed, their suspicion moved to operating system corruption. This involved a lot of procedures I already knew about and had already run, but they wanted them done again. There were a few bits of novelty:

Throughout this arduous process, I was instructed to reinstall Windows three separate times in three different ways: first with SupportAssist’s OS reinstall option, then Windows’ built-in recovery option, and finally a clean install via a USB drive created with Microsoft’s Media Creation Tool. This is on top of the reinstallation I had already performed before contacting Dell support. With all this practice, I got really good at Windows setup!

Each time I reinstalled Windows, I had to reinstall SupportAssist, and clicking on text chat created a new chat session. This meant I was sent to someone expecting to open a new ticket, and I’d have to spend time getting them straightened out with my existing ticket number.

With each bug check, I got a crash memory dump to prove their latest idea hadn’t resolved my issue. Sadly, Dell’s support ticket web interface allowed a maximum of five attachments. I quickly reached my limit, and additional memory dumps had to be submitted by sharing files via my Microsoft OneDrive and Google Drive accounts and sending a link via text chat. This was… sub-optimal.

Weeks later, I had exhausted all their scripted solutions and was finally granted an escalation to senior support technicians. They reviewed my ticket and came to the conclusion I hoped they would: some hardware components would need to be replaced.

Notes on Dell SupportAssist

I have a thorny issue with my XPS 8950. The symptom is an intermittent bug check (a.k.a. blue screen of death) that is not readily reproducible and, even when it occurs, the error code varies wildly in type and location. My previous trouble-free Dell computers allowed me to ignore Dell’s tech support portal. Now I have a troubled PC and have to learn what’s in Dell’s SupportAssist software.

Dell SupportAssist is primarily a native Windows application that is pre-installed on every Dell PC. If it is lost, SupportAssist can be downloaded from Dell’s website. (I had to do this several times after performing operating system reinstall as a diagnostic procedure.) It has several roles to play in regular maintenance:

  • Looks for common configuration problems and tries to fix them.
  • Downloads drivers and other system files, though this role is mostly supplanted by Windows Update. I even got BIOS update 1.16.0 from Windows Update before it showed up as an option in SupportAssist.
  • Cleans up unused files to free up disk space.

SupportAssist also includes troubleshooting tools:

  • Examines Windows system events. SupportAssist recognized that I had been experiencing bug checks, and even offered a “Fix Now” option. It’s not obvious what that did, but it didn’t help.
  • Performs a suite of hardware tests: CPU tests, memory tests, disk tests. I was amused it even spun up each of the fans.

Regarding the hardware tests: there’s also a separate piece of software that can run independently of Windows. Its title bar calls itself “SupportAssist | On-board Diagnostics” and it lives on a separate disk partition. To launch it, we have to trigger the BIOS boot select menu and select “Diagnostics”. My computer passed all of these tests as well, including running everything under “Advanced Test” with “Thorough mode” selected.

This diagnostics partition was deleted when I followed directions from Dell tech support to perform a completely clean install. I was worried about that — it seemed useful! — but I later learned the SupportAssist Windows application can re-partition the hard drive and reinstall that diagnostics partition.

There is one worrisome aspect of SupportAssist: when this native Windows application is installed on a system, the Dell web site running in a browser seems to be able to query hardware configuration in order to offer the appropriate documentation and driver downloads. How are those components communicating? I’m worried about that channel being a potential avenue for security exploits.

There are many other features of SupportAssist I didn’t investigate because they didn’t seem helpful to me. Like tools to migrate data from one PC to another, and naturally an upsell for extended warranty coverage.

I ran every SupportAssist maintenance task and diagnostic test I could find; none helped. As a last resort I activated its operating system reinstall procedure, and that didn’t help either. I’m out of ideas for software fixes. If this were one of my home-built desktop PCs, I would start swapping out hardware to see if I could isolate the problem to a particular component. However, this computer is still under warranty, and I don’t want to do anything that would void said warranty. If hardware replacements are to be done, they will have to be done by Dell people on Dell’s dime under warranty. To get that process started, I have to contact Dell technical support. I could call them over the phone, but that doesn’t seem like the best approach for an intermittent error that takes a day to reproduce. Fortunately, SupportAssist includes a text chat client, which seems more practical for my situation.

Dell XPS 8950 Bug Check Codes List

The Dell XPS 8950 I bought primarily for SteamVR started exhibiting bug checks at around six months old. It was eventually fixed under Dell’s one-year warranty, but the journey started with an attempt to diagnose it myself. Stressing the machine with Folding@Home would crash it roughly once every 12-24 hours.

When Windows halts with a bug check, a memory dump file is written to disk for debugging purposes. It takes significant expertise to dig through a memory dump file to pinpoint a root cause. However, it’s pretty easy to get a general idea of what we are dealing with: install the Windows debugger (WinDbg) and use its built-in automated analyzer to extract a top-level error code we can then look up online. Over the course of two weeks I ran Folding@Home to build a collection of memory dump files, hoping to find commonalities that might point at a source.
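That automated analysis boils down to opening the dump and running one command. A sketch of the session, assuming the default kernel dump location; the output shown is an illustrative excerpt, not from my actual dumps:

```text
C:\> windbg -z C:\Windows\MEMORY.DMP    (open the crash dump in WinDbg)

kd> !analyze -v                         (run the automated analyzer)
*******************************************************************************
*                        Bugcheck Analysis                                    *
*******************************************************************************
MEMORY_MANAGEMENT (1a)                  <- the top-level code to look up online
```

The hexadecimal code on that first line (here, a hypothetical 0x1a) is what maps to a page on Microsoft’s bug check code reference.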

The best-case scenario is to have the same bug check code in every dump, occurring in the same operating system component. What I got instead is a list of thirteen codes (appended at the bottom of this post), some more frequent than others. Even worse, they didn’t all happen at the same place in the system but were spread all around. The only vague commonality between them is an invalid memory operation. Sadly, “invalid memory operation” is too broad a category to tie to a root cause. I became quite discouraged looking over those memory dumps.

I know Dell tech support has a database of bug check codes and a list of diagnostic steps to address each of them. First-level support technicians are trained to tell the customer to try each item in turn. Figure a half dozen things they want me to try (probably starting with “please turn it off and back on again”…) for each of 13 possible codes, and I will have to trudge through a lot of those procedures.

Eventually my support ticket will establish a widespread pattern that escalates my case to more senior support staff who will look at the problem more holistically, but I have to earn it with persistence! I will be spending a lot of time with Dell tech support, starting with their preinstalled troubleshooting tool called SupportAssist.


Bug check codes encountered, with the URL of the Microsoft reference page and the first sentence of its explanation pasted in after each code.