ESP_8_BIT_Composite Version 1.3.2

Over a year ago I released my first Arduino library, not knowing if anyone would care. The good news is that they do: people have been using ESP_8_BIT_Composite to drive composite video devices. The bad news is that they have been filing issues for me to fix. This backlog piled up over several months, and it was long overdue for me to go in and get things fixed up.


Two of the issues were merely compiler warnings, but I should still address them to minimize noise. What was weird to me was that I didn’t see either of those warnings myself in the Arduino IDE. I had to switch over to using PlatformIO under Visual Studio Code, where I learned I could edit my platformio.ini file to add build_flags = […] to enable warnings of my choosing. Issue #24 was a printf() formatting issue that I couldn’t see until I added -Wformat, and issue #35 was invisible to me until I added -Wreturn-type.

Since I was on the subject anyway, I executed a build with all warnings turned on (-Wall). This gave me far too many warnings to review. Not only did this slow compilation to a snail’s pace, but most of the hits were outside my code. Of the items in my code, some appear to be overzealous rules giving me false positives. But I did see a few valid complaints of unused variables (-Wunused-variable) and I removed them.


Issue #27 took a bit more work, mostly because I started out “knowing” things that were later proven to be wrong. I had added support for setRotation() and I tested it with some text drawn via the AdafruitGFX library. (This test code became my example project GFX_RotatedText.) I didn’t explicitly test drawing rectangles because when I reviewed code for Adafruit_GFX::drawChar() I saw that they use writePixel() for text size 1 and fillRect() for text sizes greater than one. So when my rotated text sample code worked correctly, I inferred that fillRect() was correct as well.

That was wrong, and because I didn’t know it was wrong, I kept looking in the wrong places, not realizing that my coordinate transform math for fillRect() (and drawRect()) was fundamentally broken. These APIs pass in X/Y coordinates for the rectangle’s upper-left corner, and my mistake was forgetting that drawing commands are always executed in the original non-rotated orientation. When a rectangle is rotated, its upper-left corner is no longer the upper-left for the actual low-level drawing operations.
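For the curious, here is roughly what the corrected math looks like. This is an illustrative sketch rather than the library’s exact code: a rectangle specified in rotated coordinates gets its upper-left corner recomputed for the native non-rotated frame.

// Illustrative sketch (not the library's exact code) of transforming a
// rectangle from rotated coordinates into the native non-rotated frame.
// Note the new upper-left corner comes from a different corner of the
// original rectangle, which was the mistake described above.
static const int16_t WIDTH = 256;  // native ESP_8_BIT frame buffer width
static const int16_t HEIGHT = 240; // native ESP_8_BIT frame buffer height

void transformRect(uint8_t rotation, int16_t &x, int16_t &y, int16_t &w, int16_t &h)
{
  int16_t t;
  switch (rotation)
  {
  case 1:
    t = x;
    x = WIDTH - y - h; // new corner depends on the rectangle's height
    y = t;
    t = w; w = h; h = t; // width and height swap
    break;
  case 2:
    x = WIDTH - x - w; // new corner maps from the rotated lower-right
    y = HEIGHT - y - h;
    break;
  case 3:
    t = x;
    x = y;
    y = HEIGHT - t - w;
    t = w; w = h; h = t; // width and height swap
    break;
  }
}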

My incorrect foundation blinded me to the real problem, even though I saw failures across multiple test programs. The test programs evolved until one drew four rectangles every frame, one in each supported orientation, and cycled through modifying one of four parameters in a one-second-long animation. Only then could I see a pattern in the error and realize my mistake. This test code became my new example project GFX_RotatedRect.


Finally, I had no luck with issue #23. I was not able to reproduce the compilation error myself and therefore I could not diagnose it. I reluctantly closed it out as “unable to reproduce” before tagging version 1.3.2 for release.

ESP32 as Driver for Simple Segmented LCD

Once I had some confidence that I could write ESP-IDF code to control voltage levels at a speed relevant for driving an LCD, it was time to move beyond a simple breadboard test circuit. Onward to a perforated prototype board for all ten pins of a simple 2-digit 7-segment LCD, salvaged from an electric blanket controller. In the interest of keeping the wiring simple, I chose ten GPIO pins on one side of the ESP32 dev module I’m using (*) so I could wire everything (almost) in parallel.

A 1kΩ resistor connects each ESP32 GPIO pin to an LCD pin, and a 330pF capacitor connects each ESP32 GPIO pin to ground. It isn’t quite a straight shot of ten parallel pins: there is a two-wide gap on the left (serial communication TX/RX pins) and a single pin gap on the right (GPIO2, which this dev board connects to a blue LED that I’m using as a system heartbeat indicator). On the far right is a loop of wire connected to the ground plane, so I have a convenient place for my oscilloscope to clip onto.

That completes the hardware side for my initial test. Moving on to my ESP-IDF project, I started by hard-coding a sequence of LEDC PWM duty cycle adjustments that would drive this LCD in a fashion that I believe is called 1/2 duty, 1/2 bias. I think 1/2 duty means alternating between the two common segment pins so each set of segments gets 1/2 of the time, and 1/2 bias means the inactive common pin is held at 1/2 of the voltage difference of the active pins. I’m still a beginner so that might be wrong!
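To make that concrete, here is a sketch of how one phase of such a cycle might be driven through LEDC duty cycles. The helper function is hypothetical and the duty values assume 5-bit resolution (0 = ground, 16 = half, 32 = full 3.3V); my actual hard-coded sequence is in the source linked below.

#include "driver/ledc.h"

// Hypothetical helper: drive one phase of a 1/2 duty, 1/2 bias cycle.
// Duty values assume 5-bit resolution: 0 = GND, 16 = half, 32 = full 3.3V.
static void drive_phase(ledc_channel_t com_active, ledc_channel_t com_inactive, bool active_high)
{
  // The active common pin swings the full range.
  ledc_set_duty(LEDC_LOW_SPEED_MODE, com_active, active_high ? 32 : 0);
  ledc_update_duty(LEDC_LOW_SPEED_MODE, com_active);

  // The inactive common pin is held at half the voltage swing (1/2 bias).
  ledc_set_duty(LEDC_LOW_SPEED_MODE, com_inactive, 16);
  ledc_update_duty(LEDC_LOW_SPEED_MODE, com_inactive);

  // Segment pins (not shown) are driven opposite the active common to show
  // a segment, or equal to the active common to leave it blank.
}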

I chose to hard-code the test because it avoids all the lookup table code I’d have to write if this were to become an interactive, changeable display. I have two numeric digits to work with, so my test pattern is a Hitchhiker’s Guide reference.

The result is clearly legible at the optimal viewing angle, but it fades off quite a bit as the perspective changes. I remember a much wider range of legible viewing angles in its original use, so I assume I’m doing something different from real LCD driver circuits in this little hack. Possibly related is the observation that, if I illuminate this screen from behind with an LED, its light washes out the LCD.

The original device used two LEDs behind the LCD for backlight, and it didn’t make the digits hard to read, so there’s definitely something missing in my amateur hour LCD controller. But the fact remains it is under ESP32 control, and I learned a lot on my way here. This was the first tangible result of a lot of fumbling around after listening to Joey Castillo’s Remoticon talk on hacking LCDs. Seeing “42” show up on screen is a good milestone to stop and review what I’ve learned.


Source code for this experiment is available on GitHub.

(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Switching to ESP-IDF For PWM Waveform Control

I used ESPHome and Home Assistant to quickly experiment with parameters for the ESP32 chip’s LEDC peripheral for generating PWM (pulse-width modulated) signals, seeing how they looked under a cheap oscilloscope. But for actually driving a segmented LCD, I will need even better control over signal behavior. It is an issue of timing: I need to toggle between high and low states for each common segment pin to generate an alternating signal, and I have two common segments to cycle through. In order to avoid flickering the LCD, this cycle needs to occur at least several tens of times a second.

The tightest control over timing I could get with ESPHome appears to be the on_loop automation, which is triggered roughly every 16 milliseconds. This translates to roughly 62Hz which, if I could complete the entire cycle within it, would be sufficient. But performing all of those toggles within a single on_loop would be too fast for our eyes to see, so I can only take one step of the cycle per on_loop. Toggling both high and low on consecutive on_loop passes cuts me down to 31Hz. Then there are two common segments, which cuts it further to 15Hz. I need something faster.

Until I have other tools in my toolbox, the “something faster” I have on hand requires going to Espressif’s ESP-IDF SDK. PlatformIO makes ESP-IDF easier to work with, and I’ve had experience in this arena. My starting point (chosen because I’ve done similar things before) is to write a FreeRTOS task dedicated to toggling voltage levels by changing PWM parameters. In between steps of the cycle, I use a FreeRTOS wait (vTaskDelay) to send this task into the background until the next step. This mechanism allows finer control over timing than the ~16ms of on_loop, though it is only slightly better at 10ms by default. Repeating the math above, that works out to 25Hz, which would at least be as good as 24fps film. But that is not the limit. Once I’m working within ESP-IDF, I have the option to get even finer timing control. I can get a little bit faster by reconfiguring the FreeRTOS tick rate via ESP-IDF’s menuconfig tool. And for ultimate timing control I can start working with hardware timers.

I whipped up a test program to generate a staircase pattern: from 0% duty cycle, to 50% duty cycle, to 100%, back to 50%, and repeating from 0%. Running at 20ms per step in the cycle, the timing looks solid. I can easily move this to 10ms and still have a solid square wave.
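Here is a sketch of that staircase test as described, using ESP-IDF LEDC and FreeRTOS calls. Pin, channel, and frequency choices are illustrative; the actual source is on GitHub.

#include "driver/ledc.h"
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

// Staircase pattern: 0% -> 50% -> 100% -> 50% -> repeat.
// 5-bit duty resolution at 2.5MHz (80MHz / 2^5), so duty range is 0..32.
static void staircase_task(void *pvParameters)
{
  const uint32_t steps[] = {0, 16, 32, 16};
  int i = 0;
  while (true)
  {
    ledc_set_duty(LEDC_LOW_SPEED_MODE, LEDC_CHANNEL_0, steps[i]);
    ledc_update_duty(LEDC_LOW_SPEED_MODE, LEDC_CHANNEL_0);
    i = (i + 1) % 4;
    vTaskDelay(pdMS_TO_TICKS(20)); // 20ms per step; 10ms also held up
  }
}

extern "C" void app_main(void)
{
  ledc_timer_config_t timer = {};
  timer.speed_mode = LEDC_LOW_SPEED_MODE;
  timer.duty_resolution = LEDC_TIMER_5_BIT;
  timer.timer_num = LEDC_TIMER_0;
  timer.freq_hz = 2500000;
  timer.clk_cfg = LEDC_AUTO_CLK;
  ledc_timer_config(&timer);

  ledc_channel_config_t channel = {};
  channel.gpio_num = 23; // illustrative pin choice
  channel.speed_mode = LEDC_LOW_SPEED_MODE;
  channel.channel = LEDC_CHANNEL_0;
  channel.timer_sel = LEDC_TIMER_0;
  channel.duty = 0;
  ledc_channel_config(&channel);

  xTaskCreate(staircase_task, "staircase", 2048, NULL, 5, NULL);
}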

The 50% PWM value looked almost good enough without a capacitor (left). I have a huge pile of 330pF capacitors on hand, so I tried one and the waveform looked much better (right). This is good enough for me to move forward with wiring this signal into a segmented LCD.


Source code for this experiment is available on GitHub.

Quick ESP32 PWM Experiment via ESPHome

I’ve mapped out the segments of a small LCD salvaged from an electric blanket controller. I activated these segments with an ESP8266 that alternated between 0V and 3.3V on two GPIO pins. Good for individual segments, but not good enough to drive the whole display. There are eight segment pins and two common pins for 8*2=16 total possible combinations (14 of which are used for the two 7-segment digits) controlled from ten total pins.

Technically speaking, an ESP8266 has enough GPIO pins, but we’d start intruding into the realm of sharing pins between multiple tasks, which is more complexity than I wanted to tackle for a quick test. When driving an LCD, we also want to control voltage levels on these pins, and the ESP8266 lacks a hardware PWM peripheral. For these reasons I will use an ESP32. It has more than enough pins, and hardware PWM peripherals to support generating a voltage on all those pins. The ESP32 LEDC PWM peripheral is very flexible, and I needed to determine how I wanted to configure it.

I used ESPHome because it is a great platform for quick experiments like this. I don’t strictly need WiFi here, but easy integration with Home Assistant and the ability to update code over the network are great conveniences. Before I found ESPHome, I would have wired up a few potentiometers to the circuit board and written code to read their positions via ADC. A bit more code would then have let me interactively play with parameters and see their results. But now, with ESPHome in my toolbox, I don’t need to solder potentiometers or write code for ADC. I can get such interactivity from the Home Assistant dashboard.

By default, ESPHome configures the LEDC PWM peripheral to run at a frequency of 1kHz. According to Espressif documentation, it can be configured to run as high as 40MHz, though at that point it really isn’t a “PWM” signal anymore, only a fixed 50% duty cycle. Slowing down the frequency increases the number of bits available for duty cycle specification, and I wanted to find a tradeoff that would work well for this project. Here is an excerpt of the ESPHome configuration YAML I used for this test.

output:
  - platform: ledc
    pin: GPIO23
    id: pwm_23
  - platform: template
    id: pwm_freq
    type: float
    write_action:
      lambda: |-
        // Map slider state (0.0 to 1.0) to a frequency from 1kHz up to ~10MHz
        int newFreq = (int)(10000000.0*state)+1000;
        id(pwm_23).update_frequency(newFreq);
    
light:
  - platform: monochromatic
    gamma_correct: 1.0
    output: pwm_23
    name: "PWM23"
  - platform: monochromatic
    gamma_correct: 1.0
    output: pwm_freq
    name: "PWM Freq"

This allows me to adjust two variables, each exposed to Home Assistant as a monochromatic dimmable light, which I can change via a slider on a web page instead of a potentiometer soldered to the board. (The gamma correction value was set to 1.0 because we’re not actually controlling a visible light that requires gamma correction.)

  1. pwm_23 controls the PWM duty cycle.
  2. pwm_freq controls the PWM frequency via ESPHome Template Lambda. Theoretically from 1kHz to 10MHz, though in practice we won’t reach either end as the state never gets all the way down to 0.0 nor all the way up to 1.0.

As I adjusted the frequency, ESPHome automatically calculated the duty cycle bit depth.

[15:32:00][D][ledc.output:041]: Calculating resolution bit-depth for frequency 2491148.000000
[15:32:00][D][ledc.output:046]: Resolution calculated as 5
[15:32:00][D][ledc.output:041]: Calculating resolution bit-depth for frequency 2494322.000000
[15:32:00][D][ledc.output:046]: Resolution calculated as 5
[15:32:00][D][ledc.output:041]: Calculating resolution bit-depth for frequency 2497869.000000
[15:32:00][D][ledc.output:046]: Resolution calculated as 5
[15:32:00][D][ledc.output:041]: Calculating resolution bit-depth for frequency 2501411.000000
[15:32:00][D][ledc.output:046]: Resolution calculated as 4
[15:32:00][D][ledc.output:041]: Calculating resolution bit-depth for frequency 2503623.000000
[15:32:00][D][ledc.output:046]: Resolution calculated as 4
[15:32:00][D][ledc.output:041]: Calculating resolution bit-depth for frequency 2505428.000000
[15:32:00][D][ledc.output:046]: Resolution calculated as 4

From this snippet, we can see that 2.5MHz is the limit for 5 bits of resolution. 2^5 = 32 levels, which gives me control of the resulting voltage in (3.3V / 32) ~= 0.1V increments. I think that’ll be good enough.


Here are some oscilloscope plots: first the starting point of 50% duty cycle at the default 1kHz frequency, before and after adding a 100nF capacitor into the mix.

Not nearly good enough to output 1.65V. To make this better, I can increase the capacitance or increase the frequency. Increasing capacitance would dull system response (and I need pretty quick response to rapidly cycle between LCD common pins) so I started cranking up the frequency.

At hundreds of kilohertz without a capacitor, the resulting wave is faster than this little oscilloscope can capture at this timescale. When the 100nF capacitor is added in, we see a pretty respectable 1.65V signal, which might even be good enough. But there’s plenty of room left to go faster.

Getting into the megahertz range, there’s enough natural capacitance in the system (wires, breadboard, etc.) that we see a pretty good waveform even without a real capacitor in line with the output. But with just a 330pF capacitor (much smaller than the 100nF I started with) the resulting voltage looks pretty good. At least, at this time scale. Now I need to move beyond ESPHome for better control of 2.5MHz PWM signals.

FabGL Experimental Color Composite Video Output

I don’t fully understand the magic of the ESP_8_BIT color composite video generation code, but I understood enough to extract just the video display portion to render a frame buffer. Since a raw frame buffer was a bit too low-level, for my ESP_8_BIT_Composite Arduino library I added an implementation of the Adafruit GFX API. Before I committed to Adafruit GFX, I window-shopped FabGL as an alternate candidate. I decided against FabGL because, unlike Adafruit GFX, it was not designed for others to extend to different video output devices. The only practical way for FabGL to gain color composite video output support is if the author tackles the challenge directly.

And this, apparently, is happening! I learned of this via a post to my Arduino library’s discussion forum on GitHub. This is very exciting, because the FabGL author’s show-and-tell thread shows they are tackling this problem with far more resources than I had invested in my library. They start out with an advantage in knowledge built over time, as FabGL has been under development for several years. They also have an advantage in equipment, as this effort is aided by a Tektronix VM700 instrument to analyze the results and find problems. I didn’t even have a decent oscilloscope to tell me what happened when things went wrong!

I like the potential here, so I installed FabGL via Arduino Library Manager to take a look. This was a mistake, as the composite video feature is an experimental work in progress and not part of the most recent official release v1.0.6. I deleted that library and cloned the FabGL repository directly into my Arduino Libraries folder, and this time I could build and run the basic composite video test.

The FabGL author said they’re not happy with the results yet and there is lots more work to be done, but what already exists today looks very promising. Visual content looks as stable as can be reasonably expected of analog composite video. However, the color capability isn’t quite there yet. On my TV, the red test block fades to pink in its horizontal center. The blue test block has difficulty transitioning from the adjacent green test block. As the beam scans horizontally from green to blue, it makes an initial transition to blue and then overshoots (?) to purple before returning to blue. None of this is visible in the demo YouTube video (embedded below) so perhaps this is a problem with my hardware.

I’m curious if any existing FabGL features would prove to be incompatible with color composite video output. I suspect the sound output feature might go, as color composite video generation uses several of the same ESP32 peripherals and outputs to the same pin GPIO 25. (Similar to ESP_8_BIT.) Regardless of the tradeoffs that FabGL users may have to make, I hope this experiment will eventually succeed and become a part of a future official FabGL release.


Window Shopping LovyanGFX

One part of having an open-source project is that anyone can offer their contribution for others to use in the future. Most of them have been help that I was grateful to accept, such as people filling gaps in my Sawppy documentation. But occasionally, a proposed contribution unexpectedly pops out of left field and I need to do some homework before I can even understand what’s going on. This was the case for pull request #30 on my ESP_8_BIT_composite Arduino library for generating color composite video signals from an ESP32. The author “riraosan” said it merged LovyanGFX and my library, to which I thought “Uh… what’s that?”

A web search found https://github.com/lovyan03/LovyanGFX, which is a graphics library for embedded controllers including the ESP32, plus many others that ESP_8_BIT_composite does not support. While the API mimics AdafruitGFX, this library adds features like sprite support and palette manipulation. It looks like a pretty nifty library! Based on the README of that repository, the author’s primary language is Japanese and they are a big fan of M5Stack modules. So in addition to the software’s technical merits, LovyanGFX has extra appeal to native Japanese speakers who are playing with M5Stack modules. Roughly two dozen display modules were listed, but I don’t think I have any of them on hand to play with LovyanGFX myself.

Given this information and riraosan’s Instagram post, I guess the goal was to add ESP_8_BIT composite video signal generation as another supported output display for LovyanGFX. So I started digging into how the library was architected to enable support for different displays. I found that each supported display unit has corresponding files in the src/lgfx/v1/panel subdirectory, each of which has a class that derives from the Panel_Device base class, which implements the IPanel interface. So if we want to add a composite video output capability to this library, that’s where I expected to see code. With this newfound knowledge, I returned to my pull request to see how it was handled. I saw nothing of what I expected: no IPanel implementation, no Panel_Device derived class. That work is in the contributor’s fork of LovyanGFX. The pull request to me contained merely the minimal changes needed for ESP_8_BIT_composite to be used in that fork.

Since those changes are for a specialized usage independent of the main intention of my library, I’m not inclined to incorporate such changes. I suggested to riraosan that they fork the code and create a new LovyanGFX-focused library (removing AdafruitGFX support components) and it appears that will be the direction going forward. Whatever else happens, I now know about LovyanGFX and that knowledge would not have happened without a helpful contributor. I am thankful for that!

ESPHome Remote Receiver Test: Simplistic Shooting Game

I salvaged an infrared remote control receiver from a Roku Premier 4620X (“Cooper”) and dumped out some codes using an ESP32 microcontroller running ESPHome software’s Remote Receiver component. This is great, but before I moved on, I ran a simple introduction to actually using it. The “Hello World” of ESPHome remote receiver, so to speak.

The idea is a very simple shooting game. I will add an LED to the breadboard connected to GPIO 18. I will count to five seconds in an on_loop lambda and illuminate the LED using ESPHome GPIO output component. Once illuminated, it will wait for the signal sent by a Roku RC108 remote when the “OK” button is pressed. Once received, I will darken the LED for five seconds before turning it back on. With this very simple game I pretend to “shoot out” the light with my remote.

It was silly and trivially easy as far as shooting games go. Infrared remote controls are designed so the couch potato doesn’t have to aim very accurately for them to work. The emitter sends out the signal in a very wide cone, and the receiver is also happy to receive that signal from within a wide cone. If I am to evolve this into an actually challenging target practice contraption, I would have to figure out how to narrow the cone of effectiveness on the emitter, or the receiver, or both!

But that was not the objective today. Today it was all about dipping my toes in that world before I continued with my Roku teardown. I wanted to keep the hardware trivial and the code simple, so here is the ESPHome code excerpt for this super easy shooting game:

esphome:
  on_loop:
    then:
      - lambda: |-
          // Turn the LED back on once the five-second "hit" timeout has passed
          if (id(ulNextOn) < millis())
          {
            id(led_01).turn_on();
          }

status_led:
  pin:
    number: 2

output:
  - platform: gpio
    pin:
      number: 18
      inverted: true
    id: led_01

globals:
  - id: ulNextOn
    type: unsigned long
    restore_value: no
    initial_value: '0'

remote_receiver:
  pin:
    number: GPIO36
    inverted: true  
  dump: nec
  tolerance: 10
  on_nec:
    then:
      - lambda: |-
          // 0x55AA is the NEC-decoded command for the remote's OK button
          if (0x55AA == x.command)
          {
            id(led_01).turn_off();
            id(ulNextOn) = millis() + 5000;
          }

Roku Premiere (4620X “Cooper”) Infrared Receiver

Looking over the circuit board of a Roku Premiere 4620X (“Cooper”) I saw a lot of things that might be fun to play with but require more skill than I have at the moment. But that’s fine, every electronics hobbyist has to start somewhere, so I’m going to start small with the infrared remote control subsystem.

Consumer infrared (IR) is not standardized, and signals may be sent on several different wavelengths. Since I want to play with the Roku RC108 remote control unit, I needed to remove the infrared receiver of its corresponding Premiere 4620X in order to guarantee I have a matching set. This is a small surface-mount device protected by a small metal shield that I could fold out of the way.

Once the shield was out of the way, I could turn on the Roku and probe its pins with my voltmeter to determine the power supply pin (constant 3.3V), the ground pin, and the signal pin (mostly 3.3V, but it would drop as I sent signals with the remote). I then removed this receiver with my soldering iron and connected this tiny part to a set of 0.1″ spacing headers so I could play with it on a breadboard.

In this picture, from top to bottom the pins are:

  • Ground
  • (Not connected)
  • Power 3.3V
  • (Not connected)
  • Active-low signal

I installed it on a breadboard with an ESP32 flashed with ESPHome, which is my current favorite way to explore things. In this case, ESPHome has a Remote Receiver component that has decoders for many popular infrared protocols. What if we don’t know which decoder to use? That’s fine, we can set the dump parameter to all, which will try every decoder all at once. For this experiment I chose an ESP32 because of its dedicated remote control (RMT) peripheral for accurate timing while decoding signals. After I get something up and running, I might see if it works on an ESP8266 without the RMT peripheral.

With the dump parameter set to all and listening to a Roku RC108, I got hits from the JVC, LG, and NEC decoders. Occasionally I would get a RAW message when none of them could understand the signal. If I hold the [up] button, I get one instance of:

[18:55:50][D][remote.jvc:049]: Received JVC: data=0x5743
[18:55:50][D][remote.lg:054]: Received LG: data=0x57439867, nbits=32
[18:55:50][D][remote.nec:070]: Received NEC: address=0xC2EA, command=0xE619

But just the first one. All following hits looked like this, and they repeated for as long as I held down the [up] button.

[18:55:50][D][remote.jvc:049]: Received JVC: data=0x5743
[18:55:50][D][remote.lg:054]: Received LG: data=0x57439966, nbits=32
[18:55:50][D][remote.nec:070]: Received NEC: address=0xC2EA, command=0x6699

Then if I press the [down] button on the remote, the first interpreted signal became:

[18:55:52][D][remote.jvc:049]: Received JVC: data=0x5743
[18:55:52][D][remote.lg:054]: Received LG: data=0x5743CC33, nbits=32
[18:55:52][D][remote.nec:070]: Received NEC: address=0xC2EA, command=0xCC33

Then it would repeat the following for as long as [down] was held:

[18:55:53][D][remote.jvc:049]: Received JVC: data=0x5743
[18:55:53][D][remote.lg:054]: Received LG: data=0x5743CD32, nbits=32
[18:55:53][D][remote.nec:070]: Received NEC: address=0xC2EA, command=0x4CB3

Looking at this, it seems the JVC decoder is overeager and doesn’t actually understand the protocol, as it failed to differentiate up from down. That leaves the LG and NEC decoders. To find out which is closer to the Roku protocol, I started tightening the tolerance parameter from its default of 25%. When I tightened it to 10%, only the NEC decoder returned results. Even better, each button returns a repeatable and consistent number that is different from all of the others. Even if Roku isn’t actually using a NEC protocol, the NEC decoder is good enough to understand all its buttons. I used the NEC decoder to generate this lookup table.

Roku Remote Button    Decoded Command
Back                  0x19E6
Home                  0x7C83
Up                    0x6699
Left                  0x619E
OK                    0x55AA
Right                 0x52AD
Down                  0x4CB3
Instant Replay        0x07F8
Options               0x1EE1
Rewind                0x4BB4
Pause                 0x33CC
Fast Forward          0x2AD5
Netflix               0x34CB
Sling                 0x58A7
Hulu                  0x32CD
Vudu                  0x7788

I think I got lucky this time with the NEC decoder. I had another infrared remote control in my pile of electronics (JVC RM-RK52) and none of the ESPHome decoders could decipher it, just RAW data all the time. Alternatively, it’s possible that remote is on a different infrared wavelength and thus this particular receiver is only picking up fringe gibberish. I’ll put the JVC remote back in the pile until I am in the mood to dig deeper, because right now I want to quickly play with this salvaged infrared system.

Flash Memory Wear Effects of ESPHome Recovery: ESP8266 vs. ESP32

One major difference between controlling charging of a battery and controlling power to a Raspberry Pi is the tolerance for interruptions. Briefly interrupting battery charging is nothing to worry about, we can easily pick up where we left off. But a brief interruption of Raspberry Pi power means it will reset. At the minimum we will lose in-progress work, but consequences can get worse including corruption of the microSD card. If I put an ESPHome node in control of Raspberry Pi power, what happens when that node reboots? I don’t want it to trigger a Raspberry Pi reboot as well.

This was on my mind when I read the ESPHome documentation for GPIO Switch: there is a parameter “restore_mode” that allows us to specify how that switch will behave upon bootup. ALWAYS_ON and ALWAYS_OFF are straightforward: the device is hard-coded to flip the switch on/off upon bootup. Neither of these would be acceptable for this case, so I have to use one of the restore options. I added it to my ESP32 configuration and performed an OTA firmware update to trigger a reboot. I was happy to see there was no interruption to the Pi. Or at least if there was, it was short enough that the capacitors I added to my Raspberry Pi power supply were able to bridge the gap.

This is great! But how does the device know the previous state to restore? The most obvious answer is to store information in the onboard flash memory for these devices, but flash memory has a wear life that embedded developers must keep in mind. Especially when dealing with inexpensive components like ESP8266 and ESP32 modules. Their low price point invites use of inexpensive flash with a short wear life. I don’t know how to probe flash memory to judge their life, but I do know ESPHome is an open-source project and I could dig into source code.

The ESPHome GPIO Switch page has a link to Core Configuration, where there’s a deprecated flag esp8266_restore_from_flash to dictate whether to store persistent data in flash memory. That gave me the keyword needed to find the Global Variables section on the ESPHome Automations page, where it said there are only 96 bytes available in a mechanism called “RTC memory” and that it would not survive a power-cycle. That didn’t sound very useful, but researching further I learned it survives deep sleep, so there’s utility there. Searching in the ESPHome GitHub repository, I found the file preferences.cpp for ESP8266 where I believe the implementation lives. It defaults to false, which means the default wouldn’t wear out ESP8266 flash memory, but at the expense of RTC memory not surviving a power cycle. If we really need that level of recovery and switch esp8266_restore_from_flash to true, we have an additional knob to make tradeoffs between accuracy and flash memory lifespan using the flash_write_interval parameter.

So that covers ESPHome running on an ESP8266. What about an ESP32? While I see that the ESP32 has its own concept of RTC memory, looking in ESPHome source code at the ESP32 variant of preferences.cpp, I see that it uses a different mechanism called NVS. The Non-Volatile Storage library is tailored for storing small key-value pairs in flash memory, and was written to minimize wear. This is great. Even better, the API also leaves the door open for different storage mechanisms in future hardware revisions, possibly something with better write durability.
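For reference, here’s the general shape of the NVS key-value API in ESP-IDF. This is a minimal sketch of the pattern, not ESPHome’s actual preferences code, and the namespace and key names are made up:

#include "nvs_flash.h"
#include "nvs.h"

extern "C" void app_main(void)
{
  nvs_flash_init();

  nvs_handle_t handle;
  nvs_open("storage", NVS_READWRITE, &handle);

  // Restore previous state at boot; default stands if no saved value exists.
  uint8_t switch_state = 0;
  nvs_get_u8(handle, "gpio_switch", &switch_state);

  // ...later, when the switch changes state, save the new value:
  switch_state = 1;
  nvs_set_u8(handle, "gpio_switch", switch_state);
  nvs_commit(handle); // make sure the write reaches flash

  nvs_close(handle);
}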

From this, I conclude that ESPHome projects that require restoring state through reboot events are better off running on an ESP32 and its dedicated NVS mechanism. I didn’t have this particular feature in mind when I made the decision to use an ESP32 to build my power-control board, but in hindsight that was the right choice! Armed with confidence in the hardware, I can patch up a few to-do items in my ESPHome-based software.

Power Control Board for TrueNAS Replication Raspberry Pi

Encouraged by the (mostly) successful control of my Pixel 3a phone’s charging, the next project is to control power for a Raspberry Pi dedicated to data backup for my TrueNAS CORE storage array. (It is a remote target for replication, in TrueNAS parlance.) There were a few reasons for dedicating a Raspberry Pi to the task. The first (and somewhat embarrassing) reason was that I couldn’t figure out how to set up a remote replication target using a non-root account. With full root level access wide open, I wasn’t terribly comfortable using that Pi for anything else. The second reason was that I couldn’t figure out how to have a replication target wake up for the replication process and go to sleep after it was done. So in order to keep this process autonomous, I had to leave the replication target running around the clock, and a dedicated Raspberry Pi consumes far less power than a dedicated PC.

Now I want to take a step towards power autonomy and do the easy part first. I have my TrueNAS replications kick off in response to snapshots taken, and by default that takes place daily at midnight. The first and easiest step was to turn on my Raspberry Pi a few minutes before midnight so it is booted up and ready to receive the replication snapshot shortly after midnight. For the moment, I would still have to shut it down manually sometime after replication completes, but I’ll tackle that challenge later.

From an electrical design perspective, this was no different from the Pixel 3a project. I planned to dedicate another buck converter to this task and connect its enable pin (via a cable and a 1kΩ resistor) to another GPIO pin on my existing ESP32. This would have been easy enough to implement with a generic perforated prototype circuit board, but I took it as an opportunity to play with a prototype board tailored for Raspberry Pi projects. Aside from the form factor and pre-wired connections to Raspberry Pi GPIO, these prototype kits also usually come with appropriate pin header and standoff hardware for mounting on a Pi. Looking over the various offers, I chose this particular four-pack of blank boards. (*)

Somewhat surprisingly for cheap electronics supply vendors on Amazon, this board is not a direct copy of an existing Adafruit item. Relative to the Adafruit offering, this design is missing the EEPROM provision which I did not need for my project. Roughly two-thirds of the prototype area has pins connected as they are on a breadboard, and the remaining one-third are individual pins with no connection. In comparison the Adafruit board is breadboard-like throughout.

My concern with this design is in its connection to ground. It connects only a single pin, designated #39 in most Pi GPIO diagrams and lower-left in my picture. The remaining GND pins (6, 9, 14, 20, 25, 30, and 34) appear to be unconnected. I’m not sure if I should be worried about this for digital signal integrity or other reasons, but at least it seems to work well enough for today’s simple power supply project. If I encounter problems down the line, I can always solder more grounding wires to see if that’s the cause.

I added a buck converter and a pair of 220uF capacitors: one across input and one across output. Then a JST-XH board-to-wire connector to link back to my ESP32 control board. I needed three wires: +Vin, GND and enable. But I used a four-pin connector just in case I want to surface +5Vout in the future. (Plus, I had more four-pin connectors remaining in my JST-XH assortment pack than three-pin connectors. *)

I thought about mounting the buck converter and capacitors on the underside of this board. There’s enough physical space between the board and the Raspberry Pi to fit them. I decided against it out of concern for heat dissipation, and I was glad I did. After this board was installed on top of the Pi, the CPU temperature during replication rose from 65°C to 75°C, presumably due to reduced airflow. If I had mounted components underneath, that probably would have been even worse. Perhaps even high enough to trigger throttling.

I plan to have my ESP32 control board run around the clock, so this particular node doesn’t have the GPIO deep sleep state problem of my earlier project with ESP8266. However, I am still concerned about making sure power stays on, and the potential problems of ensuring so.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Successful Quick ESPHome Test: M5Stack ESP32 Camera

I don’t really have the proper gear to test and verify my modifications to a USB cable with Type-C connectors. Flying blind, I knew there was a good chance I would fry something. I dug through my pile of electronics for the cheapest thing I have with a USB-C socket, which turned out to be an M5Stack ESP32 Camera.

I got this particular module as an add-on to the conference badge for Layer One 2019 as documented on Hackaday. It’s been gathering dust ever since, waiting for a project that needed a little camera driven by an ESP32. For conference badge purposes it ran code from this repository, which also pointed to resources that helped me find the M5Stack ESP32Cam product documentation page.

The camera module is an OV2640, which is very popular among electronics hobbyists and found in various boards like this one from ArduCam. If I want to do more work with ESP32+OV2640 I can find variations on this concept for less than $10 each. But M5Stack is at least a relatively name-brand item here, enough for this module to be explicitly described in ESPHome documentation. (Along with a warning about insufficient cooling in this design!)

A few notes about this ESP32Cam module that might not be present on other ESP32+OV2640 modules:

  1. There is a battery power management IC (IP5306) on board, making this an interesting candidate for projects if I want to run on a single lithium-ion battery cell and if I don’t want to tear apart another USB power bank. I think it handles both charge safety and boost conversion for higher voltage. I don’t know for sure because the only datasheets I’ve found so far are in Simplified Chinese and my reading comprehension isn’t great.
  2. The circuit board included footprints for a few other optional IC components. (BMP280 temperature/pressure/humidity environmental sensor, MPU6050 3-axis accelerometer + 3-axis gyroscope, SPQ2410 microphone.) They are all absent from my particular module, but worth considering if they are ICs that would be useful for a particular project.
  3. There is a red LED next to the camera connected to pin 16. I used it as an ESPHome status light.
status_led:
  pin:
    number: 16

My first attempt to put ESPHome on this module was to compile a *.bin file for installation via https://web.esphome.io. Unfortunately, it doesn’t seem to properly set up the flash memory for booting as the module gets stuck in an endless loop repeating this error:

rst:0x10 (RTCWDT_RTC_RESET),boot:0x13 (SPI_FAST_FLASH_BOOT)
flash read err, 1000
ets_main.c 371 
ets Jun  8 2016 00:22:57

To work around this problem, I fired up an Ubuntu laptop and ran the ESPHome Docker container to access a hardware USB port for flashing. This method flashed successfully and the ESP32 was able to get online, where I could make future updates wirelessly.

A web search indicates an OV2640 has a native sensor resolution of 1632×1232. But the ESPHome camera component running on this module could only handle a maximum of 800×600 resolution. The picture quality was acceptable, but only about 2-3 frames per second get pushed to Home Assistant. As expected, it is possible to trade resolution for framerate. The lowest resolution of 160×120 is very blurry, but at least motion is smooth. If I try resolutions higher than 800×600, at bootup time I would see this error message in the debug log:

[E][esp32_camera:095]:   Setup Failed: ESP_ERR_NO_MEM

This isn’t great. But considering its price point of roughly ten bucks for a WiFi-enabled camera module, it’s not terrible. This experiment was a fun detour before I return to my project of automated charging for a Pixel 3a phone.

Vertically Mounted Construction Experiment

My experiments with the INA219 DC voltage/current sensor started by monitoring the DC output of my solar storage battery, where I could count on a constant source of power and didn’t need to worry about going to sleep to conserve power. After I gained some confidence using ESPHome, I tackled the challenges of running on solar panel power with an independent battery (salvaged from a broken USB power bank) and now the first version is up and running.

But that meant I was no longer monitoring the DC output and solar battery consumption… and I liked collecting that data. So I created another ESPHome node with its own INA219 sensor to continue monitoring power output, with a few changes this time around.

The biggest hardware change is switching from ESP8266 to ESP32. I have ambitions for this node to do more than monitor power consumption; I want it to control a few things as well. The ESP8266 has very few available GPIO for these tasks, so I wanted the pins and peripherals (like hardware PWM) of an ESP32. Thanks to the abstraction offered by ESPHome, it is a minor switch in terms of software.


Side note: I found that (as of today) https://web.esphome.io fails to flash an ESP32 image correctly, leaving the flash partition table in a state that prevents an ESP32 from booting. Connecting to the USB port with a serial monitor shows an endless stream repeating this error:

rst:0x10 (RTCWDT_RTC_RESET),boot:0x13 (SPI_FAST_FLASH_BOOT)
flash read err, 1000
ets_main.c 371 
ets Jun  8 2016 00:22:57

My workaround was to fire up ESPHome Docker container on an Ubuntu laptop for direct USB port access. This allowed an ESP32 image to be flashed in a way that boots up successfully. After the initial flash, I no longer needed the laptop as I was able to refine my ESPHome configuration via wireless updates.

My ESP8266 flashed correctly with https://web.esphome.io, no problems there.


Back to the hardware: another experiment is trying to mount my various electronics modules on their edges to pack items closer together. This is pretty easy for things like my INA219 module and my new experimental buck converter board, which have their connectors all on one side of their circuit boards. I did mount an INA219 on its edge as planned, but just before I soldered a buck converter, I changed my mind and went with a known quantity MP1584 module instead. It’s still mounted vertically, though, using the legs of 220uF capacitors.

Since I expect to add various experimental peripherals for this ESP32 to control, I also added a fuse in case something goes wrong. (Generally speaking, I really should be incorporating more fuses in my projects anyway.)

The first experimental peripheral output on this board is a USB Type-A port connected to the 5V output of my MP1584. I’m starting out with a direct tap to verify everything worked as expected before I start adding ESP32 control. Thanks to vertical mounting, I have plenty of room left on this prototype board for future experiments like an aborted attempt to hack a USB Type-C cable.

A Tale of Two ADCs: ESP32 versus INA219

I started looking at Home Assistant and ESPHome because I realized I did not have enough enthusiasm to write my own sensor data gathering and processing framework. Learning how to put an ESPHome node to sleep to save power was one of many steps I had to retrace, but I’m finally ready to pick up where I left off with an INA219 DC voltage and current sensor breakout board.

I had been using an ESP32 with its integrated ADC to monitor the DC voltage output of my MPS500 battery. I started with an ESP32 because it received factory calibration from Espressif, whereas the ESP8266 did not. Sadly that wasn’t as useful as I had hoped, because the ADC was only capable of measuring up to 2.45 volts. The MPS500 battery voltage range is consistent with three lithium chemistry battery cells in series (“3S”), meaning up to 12.6V, requiring a voltage divider built with a few resistors. These were cheap resistors several percent off their nominal values, so I had to use my voltage meter to recalibrate anyway.

While off-nominal resistors would affect the accuracy of my readings, I had expected the precision to be pretty good if I followed Espressif’s recommendation on reducing ADC noise with the use of multisampling plus a bypass capacitor. And indeed the results were perfectly sufficient for me to log the change in battery voltage over time.
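For reference, multisampling is nothing fancy: just averaging a burst of readings. Here is a sketch of the idea in Arduino-ESP32 terms; the pin choice, sample count, and scale factor are illustrative, not my actual code.

#include <Arduino.h>

// Average many ADC samples to reduce noise. GPIO34 is an input-only
// ADC1-capable pin on typical ESP32 dev boards.
float readBatteryVolts()
{
  const int SAMPLES = 64;
  uint32_t total = 0;
  for (int i = 0; i < SAMPLES; i++)
  {
    total += analogRead(34); // 0..4095 at the default 12-bit width
  }
  float average = (float)total / SAMPLES;
  // The scale factor depends on ADC attenuation and the voltage divider
  // ratio; it would be calibrated against a trusted voltage meter.
  return average * (12.6f / 4095.0f);
}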

But once I had an INA219 up and running, it took over voltage monitoring duty in addition to current (and power) monitoring. After just one day, I can see the task-specific ADC circuit in an INA219 significantly outperformed the general-purpose ESP32 ADC. This graph covers two days: the day before switchover, and the day after.

The green line on the left shows voltage fluctuations recorded by the ESP32 ADC; the yellow line on the right reflects the same usage pattern but recorded by the INA219. There is a very drastic difference in noise between the two plots! The ESP32 ADC plot was a little jagged and perfectly fine for my purpose, but it was a real treat to see INA219 values tracing out a smooth curve with no visible noise. At least, at the scale of my graph. This improvement should help as I move on to the next step of my project.

Problems Making ESP32 Hold GPIO While Asleep

I had several motivations for using an ESP32 for my next exercise. In addition to those outlined earlier, I also wanted to explore using these microcontrollers to control things, not just report measurements. In other words, I wanted to see if they could be output nodes as well as data input nodes. This should be a straightforward use of GPIO pins, except for another twist: I also want the ESP32 to be asleep most of the time to save power.

The ESP32 has several sleep modes available, and I decided to go straight for the most power-saving deep sleep as my first experiment. It was straightforward to call esp_deep_sleep() at the end of my program, and this was the easiest sleep mode because I don’t have to do much configuration or handle different cases of things that might happen during sleep. When an ESP32 wakes up from deep sleep, my program starts from the beginning as if it had just been powered up. This gives me a clean slate. I don’t have to worry about testing to see if a connection is still good and maybe reconnecting if not: I always start from scratch.

So what are the states of GPIO pins while an ESP32 is asleep? Reading the documentation, I thought I could command digital output pins to be held either high or low while the ESP32 was in deep sleep. However, my program calling gpio_deep_sleep_hold_en() didn’t actually hold output state like I thought it would. I think my program is missing a critical step somewhere along the line.
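For reference, here is the sequence as I understand it from ESP-IDF documentation. This sketch is what I expected to work, so either I’ve misunderstood a step or my actual program differs in some subtle way:

#include "driver/gpio.h"
#include "esp_sleep.h"

extern "C" void app_main(void)
{
  // Drive a pin high, then ask for that state to be held through deep sleep.
  gpio_set_direction(GPIO_NUM_4, GPIO_MODE_OUTPUT);
  gpio_set_level(GPIO_NUM_4, 1);
  gpio_hold_en(GPIO_NUM_4);        // latch this pin's current state
  gpio_deep_sleep_hold_en();       // keep pad latches during deep sleep
  esp_deep_sleep(10 * 1000000ULL); // sleep ten seconds, then restart from the top
}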

Some research later, I haven’t figured out what I am missing, but I have learned I’m not alone in getting confused. I found ESP-IDF issue #3370, which was resolved as a duplicate of ESP32 Arduino Core issue #2712. Even though it was marked as resolved, it is still getting traffic from people confused about why GPIO states aren’t held during sleep.

As a workaround, I can use an IO expander chip like the PCF8574, letting it hold output pin state high or low while the ESP32 is asleep. As a relatively simple chip, I expect the PCF8574 wouldn’t use a lot of power to do what it does, but it would still be an extra chip adding extra power draw. I intend to figure out ESP32 sleep mode GPIO at some point, but for now the project is moving on. Well, at least in software; the hardware side is taking a step back to the ESP8266.


[Source code for this project (flaws and all) is publicly available on GitHub]

Switching to ESP32 For Next Exercise

After deciding to move data processing off of the microcontroller, it would make sense to repeat my exercise with an even cheaper microcontroller. But there aren’t a lot of WiFi-capable microcontrollers cheaper than an ESP8266. So I looked at the associated decision to communicate via MQTT instead, because removing the requirement for an InfluxDB client library opened up potential for other development platforms.

I thought it’d be interesting to step up to ESP8266’s big brother, the ESP32. I could still develop with the Arduino platform with an ESP32 but for the sake of practice I’m switching to Espressif’s ESP-IDF platform. There isn’t an InfluxDB client library for ESP-IDF, but it does have a MQTT library.

The ESP32 has more than one ADC channel, and they are more flexible than the single channel on board the ESP8266. However, that is not a motivation at the moment as I don’t have an immediate use for that advantage. I thought it might be interesting to measure current as well as voltage. Unfortunately, given how noisy my amateur circuits have proven to be, I doubt I could build a circuit that can pick up the tiny voltage drop across a shunt resistor. Best to delegate that to a dedicated module designed by people who know what they are doing.

One reason I wanted to use an ESP32 is actually the development board. My Wemos D1 Mini clone board used a voltage regulator I could not identify, so I powered it with a separate MP1584EN buck converter module. In contrast, the ESP32 board I have on hand has a regulator clearly marked as an AMS1117. The datasheet for AMS1117 indicated a maximum input voltage of 15V. Since I’m powering my voltage monitor with a lead-acid battery array that has a maximum voltage of 14.4V, in theory I could connect it directly to the voltage input pin on this module.

In practice, connecting ~13V to this dev board gave me an audible pop, a visible spark, and a little cloud of smoke. Uh-oh. I disconnected power and took a closer look. The regulator now has a small crack in its case, surrounded by shiny plastic that had briefly turned liquid and re-solidified. I guess this particular regulator is not genuine AMS1117. It probably works fine converting 5V to 3.3V, but it definitely did not handle a maximum of 15V like real AMS1117 chips are expected to do.

Fortunately, ESP32 development boards are cheap, counterfeit regulators and all. So I chalked this up to lesson learned and pulled another board out of my stockpile. This time voltage regulation is handled by an external MP1584EN buck converter. I still want to play with an ESP32 for its digital output pins.

Arduino Library Versioning For ESP_8_BIT_Composite

I think adding setRotation() support to my ESP_8_BIT_Composite library was a good technical exercise, but I made a few mistakes on the administrative side. These are the kind of lessons I expected to learn when I decided to publish my project as an Arduino library, but they are nevertheless a bit embarrassing as these lessons are happening in public view.

The first error was not following semantic versioning rules. Adding support for setRotation() was an implementation of missing functionality; it did not involve any change in API surface area. The way I read the versioning rules, the setRotation() update should have been an increase in patch version number from v1.2.0 to v1.2.1, not an increase in minor version from v1.2.0 to v1.3.0. I guess I thought it deserved the minor version change because I changed behavior… but by that rule every bug fix is a change in behavior. If every bug fix is a minor version change, then when would we ever increase the patch number? (Never, as far as I can tell.)

Unfortunately, since I’ve already made that mistake, I can’t go back. Because that would violate another versioning rule: the numbers always increase and never decrease.

The next mistake was with the file library.properties in the repository, which describes my library for the Arduino Library Manager. I tagged and released v1.3.0 on GitHub, but I didn’t update the version number in library.properties to match. With this oversight, the automated tools for Arduino library updates didn’t pick up v1.3.0. To fix this, I updated library.properties to v1.3.1 and re-tagged and re-released everything as v1.3.1 on GitHub. Now v1.3.1 shows up as an updated version in a way v1.3.0 did not.

Screen Rotation Support for ESP_8_BIT_Composite Arduino Library

I’ve had my head buried in modern LED-illuminated digital panels, so it was a good change of pace to switch gears to old school CRTs for a bit. Several months have passed since I added animated GIF support to my ESP_8_BIT_Composite video out Arduino library for ESP32 microcontrollers. I opened up the discussion forum option for my GitHub repository and a few items have been raised. Sadly, I haven’t been able to fulfill requests ranging from NTSC-J support (I don’t have a corresponding TV) to higher resolutions (I don’t know how). But one has just dropped in my lap, and it was something I could do.

Issue #21 was a request for the library to implement the Adafruit GFX capability to rotate display orientation. When I first looked at rotation, I had naively thought Adafruit GFX would handle that above the drawPixel() level and I wouldn’t need to write any logic for it. This turned out to be wrong: my code was expected to check rotation and alter coordinate space accordingly. I looked at the big CRT TV I had sitting on my workbench, decided I wasn’t going to sit that beast on its side, and then promptly forgot about it until now. Whoops.

Looking into Adafruit’s generic implementation of drawPixel(), I saw a code fragment that I could copy:

  int16_t t;
  switch (rotation) {
  case 1:
    t = x;
    x = WIDTH - 1 - y;
    y = t;
    break;
  case 2:
    x = WIDTH - 1 - x;
    y = HEIGHT - 1 - y;
    break;
  case 3:
    t = x;
    x = y;
    y = HEIGHT - 1 - t;
    break;
  }

Putting this into my own drawPixel() was a pretty straightforward way to handle rotated orientations. But I had overridden several other methods for the sake of performance, and they needed to be adapted as well. I had drawFastVLine, drawFastHLine, and fillRect, each optimized for their specific scenario with minimal overhead. But now the meaning of a vertical or horizontal line has become ambiguous.

Looking over what it would take to generalize the vertical or horizontal line drawing code, I realized they had become much like fillRect(). So instead of three different functions, I only needed to make fillRect() rotation aware. Then my “fast vertical line” routine can call into fillRect() with a width of one, and similarly my “fast horizontal line” routine calls into fillRect() with a height of one. This invokes some extra computing overhead relative to before, but now the library is rotation aware and I have less code to maintain. A tradeoff I’m willing to make.
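The delegation itself is tiny. Here is a simplified sketch, with signatures following Adafruit GFX and ESP_8_BIT_GFX standing in for the library’s drawing class:

// Both "fast" line overrides become thin wrappers over the
// rotation-aware fillRect().
void ESP_8_BIT_GFX::drawFastVLine(int16_t x, int16_t y, int16_t h, uint16_t color)
{
  fillRect(x, y, 1, h, color); // a vertical line is a rectangle of width one
}

void ESP_8_BIT_GFX::drawFastHLine(int16_t x, int16_t y, int16_t w, uint16_t color)
{
  fillRect(x, y, w, 1, color); // a horizontal line is a rectangle of height one
}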

While testing behavior of this new code, I found that Adafruit GFX library uses different calls when rendering text. Text size of one uses drawPixel() for single-pixel manipulation. For text sizes larger than one, they switch to using fillRect() to draw more of the screen at a time. I wrote a program to print text at all four orientations, each at three different sizes, to exercise both code paths. It has been added to the collection of code examples as GFX_RotatedText.

Satisfied that my library now supports screen rotation, I published it as version 1.3.0. But that turned out to be incomplete, as I neglected to update the file library.properties.

Cat and Galactic Squid

Emily Velasco whipped up some cool test patterns to help me diagnose problems with my port of AnimatedGIF Arduino library example, rendering to my ESP_8_BIT_composite color video out library. But that wasn’t where she first noticed a problem. That honor went to the new animated GIF she created upon my request for something nifty to demonstrate my library.

This started when I copied an example from the AnimatedGIF library for the port. After I added the code to copy between my double buffers to keep them consistent, I saw it was a short clip of Homer Simpson from The Simpsons TV show. While the legal department of Fox is unlikely to devote resources to prosecute authors of an Arduino library, I was not willing to take the risk. Another popular animated GIF is Nyan Cat, which I had used for an earlier project. But despite its online spread, there is actual legal ownership associated with the rainbow-pooping pop tart cat. Complete with lawsuits enforcing that right and, yes, an NFT. Bah.

I wanted to stay far away from any legal uncertainties. So I asked Emily if she would be willing to create something just for this demo as an alternate to Homer Simpson and Nyan Cat. For the inspirational subject, I suggested a picture she posted of her cat sleeping on her giant squid pillow.

A few messages back and forth later, Emily created Cat and Giant Squid complete with a backstory of an intergalactic adventuring duo.

Here they are on a seamlessly looping background, flying off to their next adventure. Emily has released this art under the CC BY-SA (Creative Commons Attribution-ShareAlike) 4.0 license, and I have happily incorporated it into the ESP_8_BIT_composite library as an example of how to show animated GIFs on an analog TV. When I showed the first draft, she noticed a visual artifact that I eventually diagnosed to missing X-axis offsets. After I fixed that, the animation played beautifully on my TV. Caveat: the title image of this post is hampered by the fact that it’s hard to capture a CRT on camera.

Finding X-Offset Bug in AnimatedGIF Example

Thanks to a little debugging, I figured out my ESP_8_BIT_composite color video out Arduino library required a new optional feature to make my double-buffering implementation compatible with libraries that rely on a consistent buffer, such as AnimatedGIF. I was happy that my project, modified from one of the AnimatedGIF examples, was up and running. Then I swapped out its test image for other images, and it was immediately clear the job was not yet done. These test images were created by Emily Velasco and released under the Creative Commons Attribution-ShareAlike 4.0 license (CC BY-SA 4.0).

This image resulted in the flawed rendering visible as the title image of this post. Instead of numbers continuously counting upwards in the center of the screen, various numbers were rendered at the wrong places and not erased properly in the following frames. Here is another test image to get more data.

Between the two test images and observing where they were on screen, I narrowed down the problem. Animated GIF files might only update part of the frame, and when that happens, the frame subset is to be rendered at an X/Y offset relative to the origin. The Y offset was accounted for correctly, but the X offset went unused, meaning delta frames were rendered against the left edge rather than at their correct offset. This problem was not in my library but inherited from the AnimatedGIF example, where it went unnoticed because the trademark-violating animated GIF used by that example didn’t have an X-axis offset. Once I understood the problem, I went digging into AnimatedGIF code, where I found the unused X-offset and added it into the example where it belonged.
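The fix boils down to using that X offset when computing screen coordinates. Here is a simplified version of the corrected draw callback, rendering one horizontal line per call. Field names follow the AnimatedGIF GIFDRAW structure; transparency handling is omitted, and videoOut stands in for my library’s Adafruit GFX object:

#include <AnimatedGIF.h>
#include <ESP_8_BIT_GFX.h>

extern ESP_8_BIT_GFX videoOut; // the library's Adafruit GFX object

// Simplified draw callback: render one horizontal line of the GIF frame.
// The original example ignored pDraw->iX, so delta frames with a
// horizontal offset were drawn against the left edge of the screen.
void GIFDraw(GIFDRAW *pDraw)
{
  uint8_t *s = pDraw->pPixels;            // one line of 8-bit palette indices
  int16_t screenY = pDraw->iY + pDraw->y; // frame Y offset + line within frame
  for (int16_t i = 0; i < pDraw->iWidth; i++)
  {
    // pDraw->iX is the X offset that went unused in the original example.
    videoOut.drawPixel(pDraw->iX + i, screenY, pDraw->pPalette[s[i]]);
  }
}

These test images now display correctly, but they’re not terribly interesting to look at. What we need is a cat with a galactic squid friend.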

Animated GIF Decoder Library Exposed Problem With Double Buffering

Once I resolved all the problems I knew existed in version 1.0.0 of my ESP_8_BIT_composite color video out Arduino library, I started looking around for usage scenarios that would unveil other problems. In that respect, I can declare my next effort a success.

My train of thought started with ease of use. Sure, I provided an adaptation of Adafruit’s GFX library designed to make drawing graphics easy, but how could I make things even easier? What is the easiest way for someone to throw up a bit of colorful motion picture on screen to exercise my library? The answer came pretty quickly: I should demonstrate how to display an animated GIF on an old analog TV using my library.

This is a question I’ve contemplated before in the context of the Hackaday Supercon 2018 badge. Back then I decided against porting a GIF decoder and wrote my own run-length encoding instead. The primary reason was that I was short on time for that project and didn’t want to risk losing time debugging an unfamiliar library. Now I have more time and can afford the time to debug problems porting an unfamiliar library to a new platform. In fact, since the intent was to expose problems in my library, I fully expected to do some debugging!

I looked around online for an animated GIF decoder library written in C or C++ code with the intent of being easily portable to microcontrollers. Bonus if it has already been ported to some sort of Arduino support. That search led me to the AnimatedGIF library by Larry Bank / bitbank2. The way it was structured made input easy: I don’t have to fuss with file I/O or SPIFFS, I can feed it a byte array. The output was also well matched to my library, as the output callback renders the image one horizontal line at a time, a great match for the line array of ESP_8_BIT.

Looking through the list of examples, I picked ESP32_LEDMatrix_I2S as the most promising starting point for my test. I modified the output call from the LED matrix I2S interface to my Adafruit GFX based interface, which required only minor changes. On my TV I can almost see a picture, but it is mostly gibberish. As the animation progressed, I can see deltas getting rendered, but they were not matching up with their background.

After chasing a few dead ends, the key insight was noticing my noisy background of uninitialized memory was flipping between two distinct values. That was my reminder I’m performing double-buffering, where I swap between front and back buffers for every frame. AnimatedGIF is efficient about writing only the pixels changed from one frame to the next, but double buffering meant each set of deltas was written over not the previous frame, but two frames prior. No wonder I ended up with gibberish.

Aside: The gibberish amusingly worked in my favor for this title image. The AnimatedGIF example used a clip from The Simpsons, copyrighted material I wouldn’t want to use here. But since the image is nearly unrecognizable when drawn with my bug, I can probably get away with it.

The solution is to add code to keep the two buffers in sync. This way, libraries that minimize drawing operations would be drawing against the background they expected instead of an outdated one. However, this incurs a memory copy operation, a small performance penalty that would be wasted work for libraries that don’t need it. After all of my previous efforts to keep the API surface area small, I finally surrendered and added a configuration flag copyAfterSwap. It defaults to false for fast performance, but setting it to true enables the copy and allows using libraries like AnimatedGIF. It allowed me to run the AnimatedGIF example, but I ran into problems playing back other animated GIF files due to missing X-coordinate offsets in that example code.
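Opting in looks something like this. This is a sketch based on my library’s examples; the constructor arguments shown assume the NTSC, RGB332 configuration:

#include <ESP_8_BIT_GFX.h>

ESP_8_BIT_GFX videoOut(true /* NTSC */, 8 /* RGB332 color */);

void setup()
{
  // Costs a memcpy per frame swap, but keeps both buffers consistent for
  // libraries like AnimatedGIF that only draw the pixels that changed.
  videoOut.copyAfterSwap = true;
  videoOut.begin();
}

void loop()
{
  // ... draw, then swap buffers as usual ...
}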