Options for Improving Timestamp Precision

After a quick test determined that my Arduino sketch will be dealing with data changing at a rate faster than 1kHz, I switched the timestamp query from calling millis() to micros(). As per Arduino documentation, this change improved time resolution by a factor of 250: from 1 millisecond precision to 4 microsecond precision. Since I had time on my mind anyway, I took a research detour to learn how this might be improved further. After learning how much work it’d take, I weighed it against my project and decided… nah, never mind.

Hardware: ATmega328P

A web search for ATmega328P processor programming found good information on this page: Developing in C for the ATmega328: Marking Time and Measuring Time. The highest possible timing resolution is a counter that increments upon every clock cycle of the processor. For an ATmega328P running at 16MHz, that’s a resolution of 62.5 nanoseconds from ticks(). This 16-bit counter overflows very quickly (once every 4.096 milliseconds) so there’s another 16-bit counter ticks_ro() that increments whenever ticks() overflows. Together they become a 32-bit counter that would overflow every 4.47 minutes; after that, we’re on our own to track overflows.
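Combining the two counters safely requires guarding against an overflow happening between the two reads. A minimal sketch, assuming the ticks() and ticks_ro() functions described in that article:

  // Build a 32-bit timestamp (62.5ns per tick at 16MHz) from the two
  // 16-bit counters. Re-read if ticks() overflowed mid-read.
  uint32_t ticks32() {
    uint16_t high = ticks_ro();
    uint16_t low  = ticks();
    while (high != ticks_ro()) {
      high = ticks_ro();
      low  = ticks();
    }
    return ((uint32_t)high << 16) | low;
  }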

However, ticks() and ticks_ro() are very specific to AVR microcontrollers and not (easily) accessible from Arduino code, because relying on them kills portability. Other microcontrollers have similar concepts, but they would not be called the same thing. (Example: ESP32 has cpu_hal_get_cycle_count().)

Software: Encoder Library

Another factor in timing precision is the fact that I’m not getting the micros() value at the moment the encoder position is updated. The encoder position counter is updated within the quadrature decoding library, and I call micros() sometime afterwards.

timestamp,position,count
16,0,448737
6489548,1,1
6490076,2,1
6490688,5,1
6491300,8,1
6491912,12,1
6492540,17,1
6493220,21,1
6493876,25,1

Looking at the final two lines of this excerpt, I see my code recorded an encoder update from position 21 to 25 over a period of 6493876-6493220 = 656 microseconds. But 6493876 is only when my code ran; that’s not when the encoder clicked over from 24 to 25! There’s been a delay on the order of three-digit microseconds, an approximation derived from 656/(25-21) = 164 microseconds per count.

One potential way to improve upon this is to add a variable to the Encoder library, tracking the micros() timestamp of the most recent position update. I could then query that timestamp from my code later, instead of calling micros() myself and adding an unknown delay. I found the Encoder library source code at https://github.com/PaulStoffregen/Encoder. I found an update() function with a switch() statement that looked at pin states and updated the counter as needed. I could add my micros() update in the cases that updated position. Easy, or so I thought.
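The concept I had in mind was something like this minimal sketch, with hypothetical variable names rather than the library’s actual internals:

  volatile long position = 0;             // updated by the decoding ISR
  volatile unsigned long positionMicros;  // time of most recent update

  void onEncoderChange() {                // attached via attachInterrupt()
    // ... quadrature decoding updates 'position' here ...
    positionMicros = micros();            // capture the actual transition time
  }

  void readSnapshot(long &pos, unsigned long &stamp) {
    noInterrupts();                       // read both values atomically
    pos = position;
    stamp = positionMicros;
    interrupts();
  }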

Looking at the code more closely, I realized the function I found is actually in a comment. It was labeled the “Simple, easy-to-read “documentation” version 🙂” implying the actual code was not as simple or easy to read. I was properly warned as I scrolled down further and found… AVR assembly code. Dang! That’s hard core.

On the upside, AVR assembly code means it can access the hardware registers behind ticks() and ticks_ro() for the ultimate in timer resolution. On the downside, I don’t know AVR assembly and, after some thought, I decided I’m not motivated enough to learn it for this particular project.

This was a fun side detour and I learned things I hadn’t known before, but I don’t think the cost/benefit ratio makes sense for my Canon MX340 teardown project. I want to try some other easy things before I contemplate the harder stuff.



Captured CSV and Excel worksheets are included in the companion GitHub repository.

Quadrature Decoding with Arduino

I want to understand the internal workings of a Canon Pixma MX340 multi-function inkjet. Right now my focus is on its paper feed motor assembly, and I want to record data reported by a quadrature rotation encoder inside that assembly. I want to track behavior over several seconds, possibly a minute or two, which gets a little unwieldy with a logic analyzer timeline interface. So I thought I should create a tool tailored to my project, and I found a promising lead using an ESP32’s pulse counter (PCNT) peripheral.

As I started preparing for the project, thinking through and writing down what I’d need to do, a lot of details felt very familiar in a “wait… I’ve done this before” way. I had forgotten I’ve played with quadrature encoders before! A search for “quadrature” on my project notebook (this blog site) found entries on reading the knob on a Toyota audio head unit, an inexpensive knob from Amazon, and an investigative detour during my Honda audio head unit adventures.

Following my earlier footsteps would be an easier way to go, because the Arduino IDE and Paul Stoffregen’s quadrature decoder library are already installed on my machine. But this will be the first time I apply it to something turned by a motor instead of by a human hand. Is it fast enough to keep up? Decoder library documentation says 100-127kHz sampling rate is possible on a Teensy 3, which was the library’s original target hardware. Running on an ATmega328 would be slower.

Aside: I found this Gammon forum thread listing technical detail on ATmega328 interrupt service routines, which laid out the overhead work of an ISR: 5.125us elapse before any ISR code actually runs. This puts a hard upper bound of ~200kHz (1/5.125us ≈ 195kHz) on the response rate of even an ISR that does nothing.

In the spirit of “try the easy thing first” I’ll start with ATmega328 Arduino. If it proves too slow, I have a Teensy LC somewhere, and I definitely have ESP8266 boards. In the unlikely case they all fail to meet my need, I can resume my examination of ESP32’s pulse counter (PCNT) peripheral.


Hello ESPAsyncWebServer

I have several interesting sensors in my hardware to-do pile, and I think it’d be interesting to do a series of interactive UIs, one for each sensor. My first effort in this vein was a browser UI to interactively play with an AS7341 spectral color sensor, and I learned a lot doing it.

One lesson was that I could perform all sensor data computation and visualization in the browser, so I don’t need much in the way of microcontroller capabilities. The ESP32 I used for the AS7341 web UI was overkill. All I needed was an I2C bus and WiFi, so I could easily drop down to an ESP8266. However, because I had used the WebServer from the ESP32 Arduino Core library, I would need an ESP8266-friendly replacement.

Sticking with Original Repository

I decided to look into ESPAsyncWebServer, which I had been using indirectly as a dependency of ESPHome. In addition to supporting ESP8266 as well as ESP32, it had a few other features I liked. The most important is its ability to serve files from a flash file system, so I don’t have to do silly things like hexadecimal-encoding files into program memory. Using this capability also means switching from Arduino IDE to PlatformIO, because the latter has easy tools to work with the flash file system. And finally, I was intrigued by the claim that ESPAsyncWebServer can serve GZip-compressed files, making better use of our precious megabyte of flash memory.

The only worry is that the GitHub repository looks stale. There are over 100 open issues, and 69 pull requests sitting unmerged. Maybe I should use one of the forks that saw more recent development? Since I was introduced via ESPHome, I thought I would try their fork of the repository. It has seen more recent work, but unfortunately, as of this writing, the most recent merge has a C++ syntax error.

.pio\libdeps\d1_mini\ESPAsyncWebServer-esphome\src\AsyncWebSocket.cpp: In function 'size_t webSocketSendFrame(AsyncClient*, bool, uint8_t, bool, uint8_t*, size_t)':
.pio\libdeps\d1_mini\ESPAsyncWebServer-esphome\src\AsyncWebSocket.cpp:105:23: error: expected primary-expression before ')' token
  105 |   if (!client->send();) return 0;
      |                       ^

It’s been sitting broken for over a month now. I don’t know the story behind the scenes, but it is clear the repository is in no shape to be used. I don’t know which of the other forks might be worth investigating. As an introduction, I’ll start with the original until I run into a compelling reason to do otherwise.

SPIFFS vs. LittleFS

There were a few examples to help me orient myself with ESPAsyncWebServer. Compiling them myself, though, brought up warnings that SPIFFS is now deprecated and I should switch to LittleFS. This was raised as issue #780 against the ESPAsyncWebServer library, but it was closed without further action because the warning came from an optional SPIFFS-specific component called SPIFFSEditor. Since it is optional and not relevant to my projects, I can choose to ignore the warning.

Switching over is a little tricky because the code and the file system are built separately and uploaded separately. If I upload the file system first, it gets reformatted by the code when it sees a file system it doesn’t recognize. As per documentation: “attempting to mount a SPIFFS volume under LittleFS may result in a format operation and definitely will not preserve any files, and vice-versa.” In order to end up with a functional system, I need to upload code using LittleFS first, then upload the LittleFS file system contents.
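The code side of the switch is small. A minimal sketch assuming the ESP8266 Arduino core (PlatformIO also needs board_build.filesystem = littlefs in platformio.ini so the file system image is built as LittleFS):

  #include <LittleFS.h>   // replaces the SPIFFS include

  void setup() {
    Serial.begin(115200);
    if (!LittleFS.begin()) {   // replaces SPIFFS.begin(); mounting an
      // unrecognized file system is what triggers the reformat
      Serial.println("LittleFS mount failed");
    }
  }

  void loop() {}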

Gzip Compressed Files

After verifying a simple “Hello World” web server worked, I compressed the small index.html into index.html.gz. Uploading the new LittleFS system, I was happy to see it come up in my browser! But how does this work? I queried using curl and found the answer:

HTTP/1.1 200 OK
Content-Length: 1284
Content-Type: text/html
Content-Encoding: gzip
Content-Disposition: inline; filename="index.html"
Connection: close
Accept-Ranges: none

ESPAsyncWebServer added “Content-Encoding: gzip” to the HTTP header, effectively making the browser deal with it. As the browser will be running on a far more powerful CPU than the ESP8266, it is indeed better suited to handle decompression.
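For reference, the serving code itself stays simple. A minimal sketch assuming the files live under /www in LittleFS (WiFi setup omitted):

  #include <ESPAsyncWebServer.h>
  #include <LittleFS.h>

  AsyncWebServer server(80);

  void setup() {
    LittleFS.begin();
    // When /www/index.html.gz exists, a request for /index.html is served
    // from the compressed file with "Content-Encoding: gzip" added.
    server.serveStatic("/", LittleFS, "/www/").setDefaultFile("index.html");
    server.begin();
  }

  void loop() {}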

Serve Angular Web Application

As a test of compression, I brought over my Angular tutorial app “Tour of Heroes”. It’s not as simple as just copying the files, though. As per the Angular deployment guide, I needed to add a rule so ESPAsyncWebServer will serve up index.html rather than a 404 error when a URL is not found.

  server.onNotFound([](AsyncWebServerRequest *request){
    request->send(LittleFS, "/www/index.html");
  });

Another problem was that LittleFS has a 31-character filename limit (32 bytes including null termination). Unfortunately, my Angular app bundle had a file polyfills.716cc9a230a87077.js.gz whose name is 32 characters! The hexadecimal hash was generated by the “--output-hashing” option of “ng build”. It changes on every build in order to avoid using stale cached versions of polyfills.js. I could skip that feature, but then I run the risk of stale cached files. The other workaround is to skip compressing polyfills.js. Dropping the “.gz” extension resulted in a 29-character filename that works for LittleFS.

With all files in my Angular app compressed (except for polyfills.716cc9a230a87077.js) the size of the app dropped from ~800kB down to ~230kB. This is a dramatic difference when the default available flash storage is only 1MB of the 4MB flash memory onboard.

Should I use Angular to build my ESP8266-hosted sensor web UI projects? No. It is far too heavyweight and a simple sensor UI does not need the capabilities of Angular application framework.

Will I use Angular to build my ESP8266-hosted sensor web UI projects? Possibly. Angular is not the best tool for the job, but it might be valuable for me to practice Angular by starting with something simple.


Source code for this exploratory project is publicly available on GitHub

Adafruit SSD1305 Arduino Library on ESP8266

Thanks to Adafruit publishing an Arduino library for interfacing with the SSD1305 display driver chip, I proved that it’s possible to control an OLED dot matrix display from a broken FormLabs Form 1+ laser resin 3D printer. But the process wasn’t seamless; I ran into several problems using this library:

  1. Failed to run on ESP32 Arduino Core due to watchdog timer reset.
  2. 4-pixel horizontal offset when set to 128×32 resolution.
  3. Sketch runs only once on Arduino Nano 33 BLE Sense, immediately after uploading.

Since Adafruit published the source code for this library, I thought I’d take a look to see if anything might explain any of these problems. For the first problem of watchdog reset on ESP32, I found a comment block where the author notes potential problems with watchdog timers. It sounds like an ESP8266 is a platform known to work, so I should try that.

  // ESP8266 needs a periodic yield() call to avoid watchdog reset.
  // With the limited size of SSD1305 displays, and the fast bitrate
  // being used (1 MHz or more), I think one yield() immediately before
  // a screen write and one immediately after should cover it.  But if
  // not, if this becomes a problem, yields() might be added in the
  // 32-byte transfer condition below.

While I’m setting up an ESP8266, I could also try to address the horizontal offset. It seems a column offset of four pixels was deliberately added for 32-pixel tall displays, something not done for 64-pixel tall displays.

  if (HEIGHT == 32) {
    page_offset = 4;
    column_offset = 4;
    if (!oled_commandList(init_128x32, sizeof(init_128x32))) {
      return false;
    }
  } else {
    // 128x64 high
    page_offset = 0;
    if (!oled_commandList(init_128x64, sizeof(init_128x64))) {
      return false;
    }
  }

There was no comment to explain why this column offset was here. My best guess: the relevant Adafruit product has its columns internally wired with four pixels of offset, so this code shifts the image to compensate. If I remove this line of code and rebuild, my OLED displays correctly.

As for the final problem of running just once (immediately after upload) on an Arduino Nano 33 BLE Sense, I don’t have any hypothesis. My ESP8266 happily restarted this test sketch whenever I pressed the reset button or power cycled the system. I’m going to chalk it up to a hardware-specific issue with the Arduino Nano 33 BLE Sense board. At the moment I have no knowledge (and probably no equipment and definitely no motivation) for more in-depth debugging of its nRF52840 chip or Arm Mbed OS.

Now that I have this OLED working well with an ESP8266, a hardware platform I have on hand, I can confidently describe this display module’s pinout.

First Test with Adafruit SSD1305 Library

I feel I now have a good grasp on how I would repurpose the OLED dot matrix display from a broken FormLabs Form 1+ laser resin 3D printer. I felt I could have figured out enough to play back commands captured by my logic analyzer, interspersed with my own data, similar to how I controlled a salvaged I2C LCD. But this exploration was much easier because a user on FormLabs forums recognized the SSD1305-based display module. Thanks to that information, I had a datasheet to decipher the commands, and I could go searching to see if anyone had written code to interface with an SSD1305. Adafruit, because they are awesome, published an Arduino library to do exactly that.

Adafruit’s library was written to support several of their products that used an SSD1305, including product #2675 Monochrome 2.3″ 128×32 OLED Graphic Display Module Kit which looks very similar to the display in a Form 1+ except not on a FormLabs custom circuit board. Adafruit’s board has 20 pins in a single row, much like the Newhaven Display board but visibly more compact. Adafruit added level shifters for 5V microcontroller compatibility as well as an extra 220uF capacitor to help buffer power consumption.

Since the FormLabs custom board lacked such luxuries, I need to use a 3.3V Arduino-compatible microcontroller. The most convenient module at hand (because it was used in my most recent project) happened to be an ESP32. The ssd1305test example sketch of Adafruit’s library compiled and uploaded successfully but threw the ESP32 into a reset loop. I changed the Arduino IDE Serial Monitor baud rate to 115200 and saw this error message repeating endlessly every few seconds.

ets Jun  8 2016 00:22:57

rst:0x8 (TG1WDT_SYS_RESET),boot:0x13 (SPI_FAST_FLASH_BOOT)
configsip: 0, SPIWP:0xee
clk_drv:0x00,q_drv:0x00,d_drv:0x00,cs0_drv:0x00,hd_drv:0x00,wp_drv:0x00
mode:DIO, clock div:1
load:0x3fff0030,len:1344
load:0x40078000,len:13516
load:0x40080400,len:3604
entry 0x400805f0
SSD1305 OLED test

Three letters jumped out at me: WDT, the watchdog timer. Something in this example sketch is taking too long to do its thing, causing the system to believe it has locked up and needs a reset to recover. One unusual aspect of the ssd1305test code is that all work lives in setup(), leaving an empty loop(). As an experiment, I moved the majority of the code to loop(), but that didn’t fix the problem. Something else is wrong; it’ll take more debugging.

To see if it’s the code or the hardware, I pulled out a different 3.3V microcontroller: an Arduino Nano 33 BLE Sense. I chose this hardware because its default SPI communication pins are those already used in the sample sketch, making me optimistic it is a more suitable piece of hardware. The sketch ran without triggering its watchdog timer, so there’s an ESP32 incompatibility somewhere in the Adafruit library. Once I saw the sketch was running, I connected the OLED and immediately saw the next problem: screen resolution. I see graphics, but only the lower half. To adjust, I changed the height dimension passed into the constructor (the second parameter) from 64 to 32.

Adafruit_SSD1305 display(128, 32, &SPI, OLED_DC, OLED_RESET, OLED_CS, 7000000UL);

Most of the code gracefully adjusted to render at 32 pixel height, but there’s a visual glitch where pixels are horizontally offset: the entire image has shifted to the right by 4 pixels, and what’s supposed to be the rightmost 4 pixels are shown on the left edge instead.

The third problem I encountered is this sketch only runs once, immediately after successful uploading to the Nano 33 BLE Sense. If I press the reset button or perform a power cycle, the screen never shows anything again.

Graphics onscreen prove this OLED responds to an SSD1305 library, but this behavior warrants a closer look into library code.

Analog TV Tuning Effect with ESP_8_BIT_Composite

After addressing my backlog of issues with the ESP_8_BIT_Composite video output library, I felt that I have “eaten my vegetables” and earned some “have a dessert” fun time with my own library. On Twitter I saw that Emily Velasco had taken a programming bug and turned it into a feature, and I wanted to try taking that concept further.

When calling the Adafruit GFX library’s drawBitmap() command, we have to pass in a pointer to the bytes that make up the bitmap. Since that is only a buffer of raw bytes, we also have to tell drawBitmap() how to interpret those bytes by sending in dimensions for width and height. If we accidentally pass in the wrong width value, the resulting output is garbled. If I had seen that behavior, I would have thought “Oops, my bad, let me fix the bug” and moved on. But not Emily: she saw a fun effect to play with.

This is pretty amazing: using the wrong width value messes up the stride length used to copy image data, and the result vaguely resembles tuning an analog TV that is just barely out of horizontal sync. Pushing the concept further, she added a vertical scrolling offset to emulate going out of vertical sync.
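A minimal illustration of the effect, assuming a 64×64 monochrome bitmap and any Adafruit_GFX-derived display object:

  // Correct call: width matches how the bitmap rows are actually stored.
  display.drawBitmap(0, 0, bitmap, 64, 64, 1);
  // "Tuned" call: claiming a 56-pixel width makes each row advance 7 bytes
  // instead of 8, skewing rows like a TV losing horizontal sync.
  display.drawBitmap(0, 0, bitmap, 56, 64, 1);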

However, applying the tuning effect to animations required an arduous image conversion workload and complex playback code. I was quite surprised when I learned this, as I had wrongly assumed she used the animated GIF support I had added to my library. In hindsight I should have remembered drawBitmap() was only monochrome and thus incompatible.

Hence this project: combine my animated GIF support with Emily’s analog TV tuning effect in order to avoid tedious image conversion and complex playback.

I started with my animated GIF example, which uses Larry Bank’s AnimatedGIF library to decode directly into the ESP_8_BIT_Composite back buffer. For this tuning effect, I needed to decode animation frames into an intermediate buffer, which I could then selectively copy into the back buffer with the appropriate horizontal & vertical offsets to simulate the tuning effect. Since I am bypassing drawBitmap() to copy memory myself, I switched from the Adafruit GFX API to the lower-level raw byte buffer API exposed by my library.

For my library I allocated the frame buffer in 15 separate 4KB allocations, which was a tradeoff between “ability to fit in fragmented memory spaces” and “memory allocation overhead”. Dividing up buffer memory was possible because rendering is done line by line and it didn’t matter if one line was contiguous with the next in memory or not. However, this tuning effect will be copying data across line boundaries, so I had to allocate the intermediate buffer as one single block of memory.
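Conceptually, the copy step then looks something like this sketch. The names are hypothetical; ESP_8_BIT’s frame is 256×240 at one byte per pixel, and ‘lines’ stands in for the per-scanline pointers exposed by the raw byte buffer API:

  const int WIDTH = 256, HEIGHT = 240;

  // intermediate: the single 61440-byte block holding the decoded frame.
  void copyWithTuning(const uint8_t* intermediate, uint8_t** lines,
                      int hOffset, int vOffset) {
    for (int y = 0; y < HEIGHT; y++) {
      const uint8_t* src = intermediate + ((y + vOffset) % HEIGHT) * WIDTH;
      for (int x = 0; x < WIDTH; x++) {
        lines[y][x] = src[(x + hOffset) % WIDTH]; // wrap around, like losing sync
      }
    }
  }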

My original example also asked the AnimatedGIF library to handle the wait time in between animation frames. However, that delay could vary within an animation, and I have a user-interactive component. In order to remain responsive to knob movement faster than the animation frame rate, I took over frame timing in a non-blocking fashion. Now every run of loop() reads the potentiometer knob position and updates horizontal/vertical offsets without having to wait for the next frame of the animation, resulting in more immediate feedback to the user.
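The frame-timing change looks something like this sketch, where gif is an AnimatedGIF instance and the helper functions are hypothetical stand-ins for my actual knob-reading and rendering code:

  unsigned long nextFrameMillis = 0;

  void loop() {
    readKnobsAndUpdateOffsets();           // runs every pass, never blocks
    if (millis() >= nextFrameMillis) {
      int frameDelayMs = 0;
      gif.playFrame(false, &frameDelayMs); // false = decode without waiting
      nextFrameMillis = millis() + frameDelayMs;
    }
    renderFrameWithOffsets();              // copy with current h/v offsets
  }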


Animated GIF Tuner Effect with Cat and Galactic Squid is publicly available on GitHub.

ESP_8_BIT_Composite Version 1.3.2

Over a year ago I released my first Arduino library, not knowing if anyone would care. The good news is that they do: people have been using ESP_8_BIT_Composite to drive composite video devices. The bad news is that they have been filing issues for me to fix. This backlog has piled up over several months and is long overdue for me to go in and get things fixed up.


Two of the issues were merely compiler warnings, but I should still address them to minimize noise. What was weird to me was that I didn’t see either of those warnings myself in the Arduino IDE. I had to switch over to using PlatformIO under Visual Studio Code, where I learned I could edit my platformio.ini file to add build_flags = […] to enable warnings of my choosing. Issue #24 was a printf() formatting issue that I couldn’t see until I added -Wformat, and issue #35 was invisible to me until I added -Wreturn-type.
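For reference, the platformio.ini entry looks something like this (my exact flag list may have differed):

  ; enable the warnings that exposed issues #24 and #35
  build_flags =
      -Wformat
      -Wreturn-type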

Since I was on the subject anyway, I executed a build with all warnings turned on (-Wall). This gave me far too many warnings to review. Not only did it slow compilation to a snail’s pace, most of the hits were outside my code. Of the items in my code, some appear to be overzealous rules giving me false positives. But I did see a few valid complaints of unused variables (-Wunused-variable) and I removed them.


Issue #27 took a bit more work, mostly because I started out “knowing” things that were later proven to be wrong. I had added support for setRotation() and I tested it with some text drawn via the Adafruit GFX library. (This test code became my example project GFX_RotatedText.) I didn’t explicitly test drawing rectangles because when I reviewed the code for Adafruit_GFX::drawChar(), I saw that it uses writePixel() for text size 1 and fillRect() for text sizes greater than one. So when my rotated text sample code worked correctly, I inferred that fillRect() was correct as well.

That was wrong, and because I didn’t know it was wrong, I kept looking in the wrong places, not realizing that my coordinate transform math for fillRect() (and drawRect()) was fundamentally broken. These APIs pass in X/Y coordinates for the rectangle’s upper-left corner, and my mistake was forgetting that drawing commands are always in the original non-rotated orientation. When the rectangles are rotated, their upper-left corner is no longer the upper-left for the actual low-level drawing operations.
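The corrected transform looks roughly like this sketch, assuming ESP_8_BIT’s 256×240 native resolution and Adafruit GFX-style rotation values. This is the idea, not the library’s verbatim code:

  const int16_t NATIVE_W = 256, NATIVE_H = 240;

  // Map a rectangle from rotated user space into native space. The key
  // step I had missed: computing a new upper-left corner rather than
  // just transforming X and Y.
  void rotateRect(uint8_t rotation, int16_t &x, int16_t &y,
                  int16_t &w, int16_t &h) {
    int16_t t;
    switch (rotation) {
      case 1:                        // 90 degrees
        t = x;
        x = NATIVE_W - (y + h);
        y = t;
        t = w; w = h; h = t;
        break;
      case 2:                        // 180 degrees
        x = NATIVE_W - (x + w);
        y = NATIVE_H - (y + h);
        break;
      case 3:                        // 270 degrees
        t = y;
        y = NATIVE_H - (x + w);
        x = t;
        t = w; w = h; h = t;
        break;
    }
  }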

My incorrect foundation blinded me to the real problem, even though I saw failures across multiple test programs. Test programs evolved until one drew four rectangles every frame, one in each supported orientation, cycling through modifications of one of four parameters in a one-second-long animation. Only then could I see a pattern in the error and realize my mistake. This test code became my new example project GFX_RotatedRect.


Finally, I had no luck with issue #23. I was not able to reproduce the compilation error myself and therefore I could not diagnose it. I reluctantly closed it out as “unable to reproduce” before tagging version 1.3.2 for release.

Window Shopping LovyanGFX

One part of having an open-source project is that anyone can offer their contribution for others to use in the future. Most of them were help that I was grateful to accept, such as people filling gaps in my Sawppy documentation. But occasionally, a proposed contribution unexpectedly pops out of left field and I need to do some homework before I can even understand what’s going on. This was the case for pull request #30 on my ESP_8_BIT_composite Arduino library for generating color composite video signals from an ESP32. The author “riraosan” said it merged LovyanGFX and my library, to which I thought “Uh… what’s that?”

A web search found https://github.com/lovyan03/LovyanGFX, which is a graphics library for embedded controllers, including the ESP32 but also many others that ESP_8_BIT_composite does not support. While the API mimics Adafruit GFX, this library adds features like sprite support and palette manipulation. It looks like a pretty nifty library! Based on the README of that repository, the author’s primary language is Japanese and they are a big fan of M5Stack modules. So in addition to its software technical merits, LovyanGFX has extra appeal to native Japanese speakers who are playing with M5Stack modules. Roughly two dozen display modules were listed, but I don’t think I have any of them on hand to play with LovyanGFX myself.

Given this information and riraosan’s Instagram post, I guess the goal was to add ESP_8_BIT composite video signal generation as another supported output display for LovyanGFX. So I started digging into how the library was architected to support different displays. I found that each supported display unit has corresponding files in the src/lgfx/v1/panel subdirectory, each of which has a class that derives from the Panel_Device base class, which implements the IPanel interface. If we want to add composite video output capability to this library, that’s where I expected to see the code. With this newfound knowledge, I returned to my pull request to see how it was handled. I saw nothing of what I expected: no IPanel implementation, no Panel_Device derived class. That work is in the contributor’s fork of LovyanGFX. The pull request to me contains merely the minimal changes needed for ESP_8_BIT_composite to be used in that fork.

Since those changes are for a specialized usage independent of the main intention of my library, I’m not inclined to incorporate them. I suggested to riraosan that they fork the code and create a new LovyanGFX-focused library (removing the Adafruit GFX support components), and it appears that will be the direction going forward. Whatever else happens, I now know about LovyanGFX, and that knowledge would not have happened without a helpful contributor. I am thankful for that!

Arduino Library Versioning For ESP_8_BIT_Composite

I think adding setRotation() support to my ESP_8_BIT_Composite library was a good technical exercise, but I made a few mistakes on the administrative side. These are the kind of lessons I expected to learn when I decided to publish my project as an Arduino library, but they are nevertheless a bit embarrassing as these lessons are happening in public view.

The first error was not following semantic versioning rules. Adding support for setRotation() was an implementation of missing functionality; it did not involve any change in API surface area. The way I read the versioning rules, the setRotation() update should have been an increase in patch version number from v1.2.0 to v1.2.1, not an increase in minor version from v1.2.0 to v1.3.0. I guess I thought it deserved the minor version change because I changed behavior… but by that rule every bug fix is a change in behavior. If every bug fix is a minor version change, then when would we ever increase the patch number? (Never, as far as I can tell.)

Unfortunately, since I’ve already made that mistake, I can’t go back. Because that would violate another versioning rule: the numbers always increase and never decrease.

The next mistake was with the library.properties file in the repository, which describes my library for the Arduino Library Manager. I tagged and released v1.3.0 on GitHub, but I didn’t update the version number in library.properties to match. With this oversight, the automated tools for Arduino library updates didn’t pick up v1.3.0. To fix this, I updated library.properties to v1.3.1 and re-tagged and re-released everything as v1.3.1 on GitHub. Now v1.3.1 shows up as an updated version in a way v1.3.0 did not.
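For anyone hitting the same problem: the field in question is a single line in library.properties, and it has to agree with the GitHub release tag before the Arduino Library Manager will pick up the release (all other fields omitted here):

  version=1.3.1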

Installing Arduino Circuit Caused Setback

I didn’t understand why I couldn’t pull USB power through the existing jack on my Arduino Nano, but I was willing to create a small circuit board to wire up VUSB directly as a workaround and move on. I originally soldered two 0.1″ headers next to each other for power and ground, but the first test run instantly pulled those wires out of the socket. So I wired up a JST-XH connector for that beheaded USB cable instead, wanting a connection more mechanically secure than generic 0.1″ headers. Even though I needed just two conductors, I used a 4-conductor JST-XH connector for two reasons: (1) I hoped a wider connector would latch more securely, and (2) I was running low on 2- and 3-conductor connectors in my assortment box. (*)

Next to the power input connector is the potentiometer(*), now soldered and fixed to this perforated prototype board instead of dangling off somewhere via wires. I plan to mount this board on the sheet metal backing of the light, near the lower left corner, so the knob for this potentiometer will be accessible.

Next we have the two rows used for seating an Arduino Nano. Even though I’m only using four pins, I soldered all the points on these two rows so this header will sit securely. I had originally thought I would run wires around the outside of these headers, but it turns out I could put all the wires, resistors, etc. in between these two rows so I did that. I doubt it makes much of a cosmetic difference.

And finally, the star of the show: my four-conductor connector to the wires I’ve soldered to various points on the LG LP133WF2(SP)(A1) LCD panel control circuit board. The connector is standard hobbyist stuff, relatively large and easy to work with for my projects. But the other ends of the wires were soldered to points on the control circuit board which were quite a bit smaller, so I had been concerned about the strength of those joints. And when I lifted the connector to plug into my newly created perf board, I heard a “pop” and knew instantly that it was bad news. I had destroyed the LED_EN connection. It was intended as a test point so it was small, but I had soldered to the tiny circle of copper, and handling this circuit placed too much stress on the connection. The wire I added ripped the copper pad off the board, leaving non-conductive (and non-useful) bare circuit board material behind. This is not good. I need a backup plan.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Arduino Interface for Mitutoyo SPC Data Port

I started looking for an inexpensive electronic indicator with a digital output port, and ended up splurging on a genuine Mitutoyo. Sure, it is over five times the cost of the Harbor Freight alternative, but I thought it would be worth the price for two reasons. One: Mitutoyo is known for high quality precision instruments. Two: they are popular enough that the data output port should be documented somewhere online.

The second point turned out to be moot because the data output port was actually documented by pamphlet in the box, no need to go hunting online. But I went online anyway to get a second opinion, and found this project on Instructables. Most of the information matched up, but the wiring pinout specifically did not. Their cable schematic had a few apparent inconsistencies. (Example: one end of the cable had two ground pins and the other end did not.) They also had a “Menu” button that I lacked. These may just be the result of different products, but in any case it is information on the internet to be taken with a grain of salt. I took a meter to my own cable to ensure I have the pinout described by the pamphlet in my own instrument.

Their Arduino code matched the pamphlet description, so I took that code as a starting point. I then released my derivative publicly on GitHub with the following changes:

  • Calculate distance within the numeric domain instead of converting to string and back.
  • Decimal point placement with a single math expression instead of a list of six if statements. (Sketched below.)
  • Their code didn’t indicate whether a value was in inches or millimeters; I added units.
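A sketch of that decimal point math, assuming the 13-nibble SPC packet layout described by the pamphlet, indexed from zero: sign in nibble 4, six digits in nibbles 5-10, decimal point position in nibble 11, units flag in nibble 12.

  // Assemble six BCD digits, then place the decimal point with a single
  // expression instead of six if statements.
  float decodeReading(const uint8_t nibble[13]) {
    long digits = 0;
    for (int i = 5; i <= 10; i++) {
      digits = digits * 10 + nibble[i];    // most significant digit first
    }
    float value = digits / pow(10, nibble[11]);
    if (nibble[4] == 8) value = -value;    // sign nibble
    return value;                          // nibble[12]: 0 = mm, 1 = inch
  }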

A limitation of their code (that I did not fix) is the lack of a recovery path should the Arduino fall out of sync. The Mitutoyo protocol was designed with a recovery provision: if communication gets out of sync, we can sync back up by watching for the opening 0xFFFF. But since there’s no code watching for that situation, if it falls out of sync our code would just be permanently confused until reset by the user.

For debugging I added the capability to output in raw hex. I was going to remove it once I had the distance calculation and decimal point code figured out, but I left it in place as a compile-time parameter just in case that would become handy in the future. Sending just hexadecimal data and skipping conversion to human-readable text would allow faster loops.

Note that this Mitutoyo Statistical Process Control (SPC) protocol has no interactive control — it just reports whatever is on the display. Switching units, switching direction, zeroing, all such functions are done through device buttons.

Once it all appeared to work on the prototyping breadboard, I again soldered up a compact version and put it inside a custom 3D-printed enclosure.


Notes on ROS2 and rosserial

I’m happy to see ROS2 improve over the past several releases, each release more mature and suitable for adoption than the last, tackling long-standing problems like cross compilation as well as new frontiers. I know a lot of my current reservations about ROS2 are on the to-do list, but there’s a fairly significant item that appears to be deliberately absent: rosserial.

In ROS, the rosserial module is the default way for something simple to communicate with the rest of a ROS robot. It’s been a popular way for robot builders to add small dedicated modules that serve their little niche simply and effectively with only an Arduino or similar 8-bit microcontroller. By following its conventions for translating ROS messages into simple serial byte sequences, robot builders don’t have to constantly reinvent this wheel. However, it is only really applicable when we are in control of both the computer and the Arduino end of the communication. When one side is outside of our control — such as the case for LX-16A servos used on Sawppy — we can’t use the rosserial protocol and a custom node has to be created.

But while I couldn’t use rosserial for communication with the servos on my own Sawppy, I’ve seen it deployed for other Sawppy builds in different contexts. Rhys Mainwaring’s Curio rover ROS stack uses rosserial to communicate with its Arduino Mega, and Steve [jetdillo] has just installed a battery voltage monitor that reports via rosserial.

With its proven value to budget robotics, I’m sad to see it’s not currently in the cards for ROS2. Officially, the replacement for rosserial is micro-ROS built on the Micro XRCE-DDS Client. DDS is the standardized communication protocol used by ROS2, and XRCE stands for “eXtremely Resource Constrained Environment.” It’s an admirable goal to keep the protocol running with low resource consumption, but “low” is relative. Micro XRCE-DDS proudly listed its resource requirements as follows:

From the point of view of memory footprint, the latest version of this library has a memory consumption of less than 75 KB of Flash memory and 2.5 KB of RAM for a complete publisher and subscriber application.

If we look at the ATmega328P chip at the heart of a basic Arduino, we see it has 32KB of Flash and 2KB of RAM, smaller than that footprint before we even add our own code, and that’s just not going to work. A straightforward port of rosserial was aborted due to intrinsic ties to ROS, but that Github issue still sees traffic because people want the thing that does not exist. [UPDATE: Now we have a ROS Discourse discussion thread about it too.]

I found a ros2arduino library built on Micro XRCE DDS, and was eager to see how it managed to fit on a simple ATmega328. Disappointingly, it doesn’t. The “Arduino” in the name referred to newer high end boards like the Arduino MKR ZERO, leaving the humble ATmega328 Arduino boards out in the cold.

As far as I can tell, this is by design. Much as how ROS2 has focused on 64-bit computing platforms over 32-bit CPUs, their “low end microcontroller support” is graduating from old school 8-bit chips to newer designs like the ESP32 and various ARM Cortex flavors such as the STM32 family. Given how those microcontrollers have fallen to single digit dollar amounts of cost, it’s hard to argue for the economic advantage of old 8-bit chips. (The processing power per dollar argument was lost long ago.) So even though the old 8-bit chips still hold some advantages, I can see the reasoning, and have resigned to accept it as the way of the future.

Sparklecon 2020 Day 2: Arduino VGAX

Unlike the first day of Sparklecon 2020, I had no obligations on the second day so I was a lot more relaxed and took advantage of the opportunity to chat and socialize with others. I brought Sawppy back for day two and the cute little rover made more friends. I hope that even if they don’t decide to build their own rover, Sawppy’s new friends might pass along information to someone who would.

I also brought some stuff to tinker at the facilities made available by NUCC. Give me a table, a power strip, and WiFi and I can get a lot of work done. And having projects in progress is always a great icebreaker for fellow hardware hackers to come up and ask what I’m doing.

Last night I was surprised to learn that one of the lighting panels at NUCC is actually the backlight of an old computer LCD monitor. The LCD is gone, leaving the brilliant white background illuminating part of the room. That motivated me to dust off the giant 30-inch monitor I had with a bizarre failure mode making it useless as a computer monitor. I wasn’t quite willing to modify it destructively just yet, but I did want to explore the idea of using the monitor as a lighting panel. Preserving the LCD layer, I can illuminate things selectively without overly worrying about the pixel accuracy problems that made it useless as a monitor.

The next decision was the hardest: what hardware platform to use? I brought two flavors of Arduino Nano, two flavors of Teensy, and a Raspberry Pi. There were solutions for ESP32 as well, but I didn’t bring my dev board. I decided to start at the bottom of the ladder and started searching for Arduino libraries that generate VGA signals.

I found VGAX, which can pump out a very low resolution VGA signal of 120 x 60 pixels. The color capability is also constrained, limited to a few solid colors that reminded me of old PC CGA graphics. Perhaps they share similar root causes!

To connect my Arduino Nano to my monitor, I needed to sacrifice a VGA cable and cut it in half to expose its wires. Fortunately NUCC had a literal bucketful of them and I put one to use on this project. An electrical testing meter helped me find the right wires to use, and we were in business.

Arduino VGAX breadboard

The results were impressive in that a humble 8-bit microcontroller could produce color VGA signals. But they were not very useful, in that this particular library cannot generate full-screen video; only part of the screen was filled. I thought I might have done something wrong, but the FAQ covered “How do I center the picture,” so this was completely expected.

I would prefer to use the whole screen in my project, so my search for signal generation must continue elsewhere. But seeing VGAX up and running started gears turning in Emily’s brain. She had a few project ideas that might involve VGA. Today’s work gave a few more data points on technical feasibility, so some of those ideas might get dusted off in the near future. Stay tuned. In the meantime, I’ll continue my VGA exploration with a Teensy microcontroller.

Arduino Mozzi Wilhelm Exercise

After completing a quick Mozzi exercise, I found a few problems that I wanted to fix in round 2.

The first problem was the use of an audio clip from Portal 2. While Valve Software is unlikely to pursue legal action against a hobbyist exercise project for using one short sound from their game, they indisputably own the rights to that sound. If I wanted a short code exercise as any kind of example I can point people to, I should avoid using copyrighted work. Hunting around for a sound that would be recognizably popular but less encumbered by copyright restrictions, I settled on the famous stock sound effect known as the Wilhelm scream. Information about this effect — as well as the sound clip itself — is found all over, making it a much better candidate.

The second problem was audible noise even when not playing sampled audio. Reading through Mozzi sound code and its under-the-hood diagram, I didn’t understand why this noise was coming through. I explicitly wrote code to emit zero when there’s no sound, which I thought meant silence, but something else was happening that I don’t yet understand.

As a workaround, I will call stopMozzi() when playback ends, and when the button is pressed, I’ll call startMozzi(). The upside is that the noise between playback disappears, the downside is that I now have two very loud pops, one each at the start and end of playback. If connected to a powerful audio amplifier, this sudden impulse can destroy speakers. But I’ll be using it with a small battery-powered amplifier chip, so the destruction might not be as immediate. I would prefer to have neither the noise nor the pop, but until I figure out how, I would have to choose one of them. The decision today is for quick pops rather than ongoing noise.
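The workaround looks something like this sketch, where buttonPressed() and playbackDone() are hypothetical stand-ins for my actual input check and sample-position check:

  bool playing = false;

  void loop() {
    if (!playing && buttonPressed()) {
      startMozzi();              // resume audio output: one loud pop
      playing = true;
    }
    if (playing && playbackDone()) {
      stopMozzi();               // stop audio output: another pop, then silence
      playing = false;
    }
    if (playing) audioHook();    // keep Mozzi's audio buffer filled
  }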

This improved Arduino Mozzi exercise is publicly available on Github.

Arduino Mozzi Space Core Exercise

Temporarily stymied in my Teensy adventures, I dropped back to Arduino for a Mozzi exercise. I’ve helped Emily through bits and pieces of putting sampled audio on an Arduino using Mozzi, but I had yet to run through the whole process myself.

The hardware side was kept as simple as possible: there’s only a single switch wired to be normally open and momentarily closed. For audio output, I used the wire salvaged from the Project MC2 Pixel Purse(*) that was briefly a hacker darling due to its clearance-sale price. (As of this writing, the price is up to $39.58, far above the $6 – $8 leading to its “buy to take it apart” fame.) Since this cable was designed to be plugged into a cell phone, it had a TRRS plug that I rewired into an imitation of a monophonic TS plug by using the T wire and connecting RRS together into another wire.

With hardware sorted out, I dived into the software tasks. There were more than a few annoyances that make this task not very beginner-friendly. The Huffman audio compression utility audio2huff.py had dependencies listed only in a comment block easily overlooked by beginners. They didn’t all install in the same way (purehuff required a download and install, while the others were installed via pip.) And since they are a few years old and not actively maintained, they were all written for Python 2.

This will become more and more of a problem as we go, since Python 2 support officially ended at the start of 2020. Older Linux distributions would launch Python 2 when the user runs python at the command line, and those that want to use Python 3 would need to run python3. This is getting flipped around: some environments now launch Python 3 when the user runs python, and those that need the old stuff have to run python2.

What happens when we run these utilities under Python 3? It’d be nice if the error message were “Hey, this needs Python 2,” but that’s not the reality. It is more likely to be something like “foobar is not iterable,” which is completely bewildering to beginners.

To be fair, none of these problems are specific to Mozzi; there’s a large body of work online that was written for Python 2, waiting for someone motivated enough to bring it up to date.

Anyway, after those problems are sorted out, I got my Arduino to play a sound when the button is pressed. My test sound clip was from the game Portal 2: the ecstatic “SPAAACE!” emitted by the corrupted Space Core when it finally got its wish to go flying through space.

This Arduino Mozzi exercise is publicly available on Github.

[UPDATE]: “Wilhelm” exercise (round 2) has some improvements.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases. But you shouldn’t buy a Pixel Purse until it drops back down to clearance prices.

Examining Composite Video Signal Generated By Microcontrollers

There was a several-years-long period of my life when I spent money to build a home theater. This was sometime after DVD became popular, because the motivation was my realization of how much superior DVD picture quality was over VHS. With movies on VHS, noisy visual artifacts were a limitation of the analog magnetic medium. With movies on DVD, the media-imposed limitations were gone and now there are all these other limitations I could remove by spending money, lots of it, to do things like upgrade from composite video to S-Video connections.

Eventually home theater moved to all-digital HDMI, and I stopped spending big money because even the cheapest flat panels could completely eliminate classic CRT problems like color convergence. (My personal peeve.) I thought I had left the era of CRT and composite video behind, but throwing out my pile of analog interconnects and video equipment turned out to be premature.

Now I’ve found an interest in old school video again, because it is accessible for the electronics hobbyist. It is much easier to build something to output a composite video signal than HDMI. Local fellow maker and tinkerer Emily likes the old school tech for aesthetic reasons in addition to accessibility. So one day we got together at one of our regular SGVTech meets to dig a little deeper into this world.

Emily brought an old portable TV with composite video input, and two candidate Arduino sketches each purporting to generate composite video. (arduino-tvout and one other whose name I can’t remember now.) I brought my ESP32 dev module running Bitluni’s composite video demo. For reference Emily had an actual composite video camera, the composite video Wikipedia page and the reference document used by Bitluni for his demo.

All three were able to get the little TV to show a picture. However, they looked very different under the oscilloscope. The [name will be filled in once I remember] sketch had the wildest waveform, whose oscilloscope trace didn’t look anything like a composite video signal, but the proof is in the fact that an animated 3D vector graphic cube showed up on the TV anyway. The waveform generated by arduino-tvout was a little rougher than expected but, unlike the previous one, was clearly recognizable as a composite video waveform on the oscilloscope and accepted by the TV. The waveform generated by Bitluni was the best fit with what we expected to see, and matched most closely with output generated by the composite video camera.

Knowledge from tonight’s investigation will inform several of our project candidates.


Optical Drive Carriage, The Sequel

During my learning and exploration of stepper motor control, I managed to destroy an optical carriage I salvaged from a laptop drive. In order to continue experimentation I need another stepper motor linear actuator of some kind. I rummaged in my pile of parts and came up empty-handed, but I am fortunate to have another local resource: Emily‘s pile of parts. My request was granted with an assembly that had an intact motor, drive screw, and linear carriage. The optical assembly in that carriage had been torn apart, but that is irrelevant for the purposes of this project.

Unlike the previous two stepper motors I played with, this one has exposed pads on the flexible control cable, so I tried to solder to them. Given my experience soldering fine-pitched parts, I knew it would be problematic to solder one at a time: the pads are so close together that heat from one pad will melt solder on an adjacent pad. My best bet was to set things up so all the wires are soldered at the same time.

Emily salvaged stepper motor wiring

There is slightly less solder than I would have preferred on each of these joints, but several efforts to add solder created solder bridges across pins, requiring removal by solder sucker, which reduced the amount of solder even more. Since there was enough for electrical conductivity, I left it as-is to continue my experiments.


Unrelated to stepper motors or Grbl:

While I was taking the above picture to document my work, I noticed I was being watched and took a picture of the watcher. This little beauty was surveying my workbench perched on top of a cardboard box of M3 fasteners from McMaster-Carr. The spider was only about 5mm long overall including legs. Unfortunately at the time of this shot I had set it for shallow depth of field to photograph the above solder joint. I adjusted my camera to bring more into focus, but this little one was camera shy and jumped away before I could take a better shot. Still, I’m quite pleased with my camera’s macro lens.

Spider on McMaster Carr box

Panasonic UJ-867 Optical Carriage (Briefly) Under A4988 Control

Once I extracted the optical assembly carriage of a Panasonic UJ-867 optical drive, the next step was to interface it with a Pololu A4988 driver board. And just as with the previous optical drive stepper motor, there are four visible pins indicating a bipolar motor suitable for control via an A4988. However, this motor is far smaller, as befits a laptop computer component.

Panasonic UJ-867 stepper motor connector

The existing motor control cable actually passed through the spindle motor, meaning there was no convenient place to solder new wires on the flexible connector. So the cable was removed and new wires were soldered in its place.

Panasonic UJ-867 stepper motor new wires

Given the fine pitch of these pins, it was very difficult to avoid solder bridges. But it appeared to run standalone, so I reinstalled it into the carriage, where it still ran, but very weakly. Hardly any power at all. When I tilted it up so the axis of travel was vertical, the carriage couldn’t win its fight against gravity. Since the job is only to move an optical assembly, I didn’t expect these carriages to exert a great deal of force. But I’ve seen vertically mounted slot-loading optical drives; I thought it should at least be able to fight against gravity.

A Dell laptop charger delivers 19.2V. I’m not sure how many volts this motor was intended to run at, but 12V seemed reasonable. Then I increased the current beyond the 50mA of the previous motor. Increasing both voltage and amperage seemed to deliver more power, but it remained too weak to overcome gravity.

As I was tilting the metal carriage assembly in my hand, I started noticing it was warming. Oh no! The motor! I checked the temperature with my finger, which was a mistake, as it was hot enough to be painful to human skin. I shut down my test program but it was too late; the carriage never moved again.

Lessons learned: don’t get too overzealous with power and check temperature more frequently.

And if I want to continue these experiments, I’ll need another stepper motor assembly.

Resuming Pololu Stepper Driver Adventures with Arduino and A4988

By the time I got around to playing with homing switches on a salvaged industrial XY stage, it was getting late. I only had a few minutes to perform one experiment: connect the normally open homing switch to X_LIMIT_PIN (GPIO02 as per cpu_map.h), set HOMING_CYCLE_0 to X_AXIS in config.h, and send the command to home the X-axis. The motor moved right past the switch into the hard stops, so I turned off the ESP32, marking an unsatisfying end to the work session.
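For reference, the config.h edit amounts to something like this, following stock Grbl conventions (my exact change may have differed):

  // Home only the X axis in the first homing cycle. The default
  // Z-then-XY cycle definitions are disabled for this single-axis test.
  #define HOMING_CYCLE_0 (1<<X_AXIS)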

I wanted to be able to continue learning Grbl while at home, away from the salvaged hardware, so I dug up the A4988 motor control board I had played with briefly. It’s time to get a little further in depth with this thing. Motivated by my current state in the XY stage project, the first goal is to get a stepper motor to activate a homing switch. If I can get that far, I’ll think about other projects. People have done some pretty creative things with little stepper motors controlled by Grbl; I could join that fun.

Rummaging through my pile of parts, the first stepper motor I retrieved was one pulled from an optical drive. This particular stepper motor had only the drive screw; the carriage had been lost. But with four exposed pins in two groups of two, it is a bipolar motor suitable for an A4988 motor control board. I just had to solder some wires to make it usable with a breadboard.

Since this stepper motor was a lot smaller than the one used in my previous A4988 stepper motor experience, I thought this was a good opportunity to learn how to tune the current limits on these modules by following instructions published by Pololu, using an Arduino as a test controller running code published on this page. I started with a limit of 100mA, but the motor got quite toasty at that level after running for a minute. I turned it down to 50mA, and it no longer got hot to the touch running that Arduino sketch.
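The tuning procedure boils down to one formula from the A4988 datasheet. A sketch, assuming the 50 milliohm sense resistors of the standard Pololu carrier (other board editions use different values, so check first):

  // I_TripMax = VREF / (8 * R_sense), rearranged to answer the question
  // "what VREF do I set for a given current limit?"
  // 100mA -> 40mV, 50mA -> 20mV at the trimpot's VREF measurement point.
  constexpr float vrefForCurrentLimit(float amps, float senseOhms = 0.05f) {
    return 8.0f * amps * senseOhms;
  }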

This is a good start, but a motor with just a drive shaft is not useful for motion control. The next step is to find something that could push on a limit switch. I don’t seem to have anything handy, so it’s time to start digging into salvaged hardware.

Evaluate Retired Melzi Board for XY Stage

In an effort to put a salvaged industrial XY table back to work, the Arduino AccelStepper library was used as a quick test to see if the motors and controllers still respond to input signals. Things moved, which is a good sign, but the high precision of the Parker ZETA4 controller demands step pulses at a far higher frequency than AccelStepper could deliver.

The search then moved on to something that could generate the pulses required. I’m confident the hardware is capable of more: AccelStepper topped out at less than 5 kHz on an 8 MHz chip, and pulsing a pin isn’t a task that should require over 1,000 instruction cycles. Given my familiarity with the 3D printer world, I started looking at Marlin, which runs on Arduino hardware.

The problem with running Marlin on my Arduino Nano is that I would have none of the associated accessories: no control panel, no SD reader, etc. However, I do have a full control board retired from one of my 3D printers. This board calls itself a Melzi Ardentissimo, and a search led to the Melzi page of the RepRap wiki. Thanks to the open nature of this design, I could trace through its PCB layout. Much to my disappointment, the step and direction signals connect straight from the tiny pins on the main processor to the motor driver without surfacing in an easily tapped fashion. The intent of this board is integration, and it would be quite some work to defeat that intent in order to decouple the processor from its integrated stepper driver.

Fortunately, I’m not limited to the world of AVR ATmega chips, nor to Marlin software. There’s another very capable processor on hand waiting for such a project… an ESP32 running Grbl software.