A Curiously Clean Thrift Store Neato XV-21

I went to a thrift store to take pictures, but came home with a Neato XV-21 vacuum that did not power on. At $7.99, it was a cheap gamble to see what might be usable in this robot vacuum cleaner. In my cursory in-store examination, it looked to be in good shape. After I took a little time for a closer look, I realized it was actually in even better shape than I had initially thought.

Super clean Neato XV-21

Of course, there’s no mistaking the vacuum for new. There is dirt in the dust bin and there are visible black smudges on top of the vacuum. But beyond that I found almost no other signs of wear. The exterior sides of this unit are clean, free of the scuff marks and other marring I associate with robot vacuum veterans. Admittedly, my experience is mostly with affordable models of Roomba and similar vacuums that perform a random walk through their environment, bumping into things along the way. The selling point of a Neato is its lidar for object avoidance, so the nearly pristine sides of this unit may just be a result of excellent object avoidance and not lack of use.

Super clean Neato XV-21 brush

Its floor cleaning brushes show some dirt but are actually quite clean. My own home vacuum would show more dirt than this after a minute of use. The brush roller drive belt also looks to be in good shape.

At this point I’m still undecided whether this was a barely used robot vacuum or one whose previous owner was just very meticulous about keeping their tools clean. The clean brush and dust bin certainly imply the former, but the latter is also quite possible, as indicated by the original protective plastic still covering the control panel. The previous owner was one of those people who never remove protective plastic film that was meant to be removed, a mindset very different from mine.

Me? I peel that stuff off before I start taking this vacuum apart to examine its internals.

New Project: Neato Hacking

My ROS learning robot Phoebe was built mainly around a laser distance scanner module salvaged from a Neato robot vacuum cleaner. At the time I knew people were selling them for $50-$75 on Craigslist in varying condition, but I was content to pay $45 for just the laser scanner. It was all I needed for my own robot exploration purposes. I thought a full Neato vacuum might be fun to play with, but I have enough projects on my to-do list that I didn’t feel the need to go out and find one.

That is, until one crossed my path. I recently wrote a Hackaday article about bargain shopping in a thrift store, and I needed a picture to go with my article. I went into my local thrift store to take some pictures, but it was also an opportunity to practice what I preached. I spent most of my time in the electronics section and didn’t find anything I wanted to take home with me. On my way out the door, though, I took a glance at the kitchen electronics section and spotted this beauty: a Neato robot vacuum with a price tag of only $7.99.

Savers Neato XV-21 1600

It doesn’t power on, and its external accessories were nowhere to be found: neither the wall wart charger nor the charging dock. But it looked to be in pretty good condition with only minor cosmetic blemishes on the exterior. Aside from the missing charger, all other major components appear to be present. The purchase decision came down to the most interesting part: I looked inside the top bump to verify the presence of a familiar looking laser scanner unit. If all I get out of this $7.99 is a Neato lidar, I’ll be happy. Anything else that works will just be icing on the cake.

It’s a big question mark, but it’s one I’m buying to take home for a closer look.

SGVHAK Rover, Sawppy, and Phoebe at SGVLUG February 2019 Meeting

At the February 2019 meeting of the San Gabriel Valley Linux Users Group (SGVLUG), Lan and I presented the story of rover building in our hardware hackers spinoff group, a.k.a. SGVHAK. This was a practice run for our presentation at Southern California Linux Expo (SCaLE) in March. Naturally, the rovers themselves had to be present as visual aids.

20190214 Rovers at SGVLUG

We started the story in January 2018, when Lan gathered the SGVHAK group to serve as beta testers for Jet Propulsion Laboratory’s Open Source Rover project. Then we went through our construction process, which was greatly motivated by our desire to have SGVHAK rover up and running at last year’s SCaLE. Having a rover at SCaLE was not the end, it was only the beginning. I started building my own rover, Sawppy, and SGVHAK rover continued to pick up hardware upgrades along the way.

On the software side, we have ambitions to increase sophistication by adopting the open source Robot Operating System (ROS), which led to a small digression to Phoebe, my tool for learning ROS. Getting a rover to work effectively under ROS poses some significant challenges that we have yet to address, but if it were easy it wouldn’t be fun!

Since this was a practice talk, the Q&A session at the end was also a forum for feedback on how we could improve the talk for SCaLE. We had some good suggestions on how we might build a smoother narrative through the story, and we’ll see what we can figure out by March.

Sawppy at Brawerman East STEAM Makers Fair

Sawppy’s publicity appearance today was at the Brawerman East STEAM Makers Fair, a supercharged science fair at a private elementary school. Sawppy earned this invitation by way of a January presentation at the Robotics Society of Southern California. The intent was to show students that building things goes beyond their assignments at the on-campus Innovation Lab: there are bigger projects they can strive for outside the classroom. But the format was, indeed, just like a school science fair, where Sawppy got a display table and a poster board.

Brawerman STEAM Makers Fair - Sawppy on table

But Sawppy is not very interesting sitting on a table, so it didn’t take long before the rover started roving amongst the other exhibits. The school’s 3D printer, a Zortrax M200, is visible on the left.

Brawerman STEAM Makers Fair - Sawppy roaming

Sawppy was not the only project from grown-ups present. I admire the ambition of this laser cutter project undertaken by one of the parents. Look at the size of that thing! It is currently a work in progress, and its incomplete wiring was removed entirely for this event so little fingers would not be tempted to unplug things and possibly plug them back in the wrong place.

Brawerman STEAM Makers Fair - laser cutter

The center table held some old retired electronics equipment for kids to take apart. This was a huge hit at the event, but by the end of the night this side of the room was a huge mess of tiny plastic pieces scattered all over.

Brawerman STEAM Makers Fair - deconstruction zone

I brought my iPad with the idea I could have Sawppy’s Onshape CAD data visible for browsing, but it turned out the iOS Onshape app required a live internet connection and refused to work from cache. As an alternate activity, I rigged it up to show live video footage from Sawppy’s onboard camera. This was surprisingly popular with the elementary school age crowd, who got a kick out of making faces at the camera and seeing their faces on the iPad. I need to remember to do this for future Sawppy outings.

Brawerman STEAM Makers Fair - Sawppy camera ipad

After Sawppy was already committed to the event, I learned that a Star Wars themed art car would also be present. So I mentioned my #rxbb8 project, which earned me a prime parking spot on the first floor next to the far more extensively modified “Z-Wing.” Prepare to jump to hyperspace!


(Cross-posted to Hackaday.io)

Sawppy at Space Carnival Long Beach

Sawppy at Space Carnival Long Beach

Space Carnival, held at the Expo Arts Center in Long Beach, California, welcomed Sawppy as one of several exhibits Monday afternoon. It turned out to be part of a week-long LEGO robotics camp for elementary school students. Most of the events were for campers, but the Monday evening Space Carnival was open to the public.

Since the focus was on LEGO, there were plenty of plastic bricks in attendance. The middle of the room had a big pile of bricks on a plastic tarp, and kids were crawling all over the pile building their creations. Sawppy mostly spent time outside of the tarp, occasionally venturing onto some of the colorful game boards meant for LEGO robots to line-follow and perform other tasks.

Sawppy at Space Carnival Long Beach LEGO tarp

As usual, I handed the controls over for kids in attendance to play with. Running over feet is still more popular an activity than I can hope to understand, but if it makes them excited, so be it.

Sawppy at Space Carnival Long Beach running over feet

Sawppy was not the only non-LEGO robot in attendance; there was also a selection of Star Wars licensed merchandise, including this R2-D2. I forget whether this particular unit was made by Sphero or Hasbro.

Sawppy at Space Carnival Long Beach R2D2

This event was not the first time I crossed paths with Barnabas Robotics, but it was the first time I got to speak with them beyond the standard sales pitch type of discussions. Since their business is STEM education for students K-12, they have a good feel for what type of material is appropriate for various age groups. It’s possible Sawppy can find a role in a high school curriculum.

At the end of the night, the LEGO tarp cleared out enough for me to drive Sawppy across the field. Unfortunately I saw Emily’s tweet too late to replicate the movie clip she had suggested. Maybe another day!

(Cross-posted to Hackaday.io)

Sawppy Has A Busy Schedule This Week

Since I declared Sawppy version 1.0 (mechanical maturity), I’ve been taking my rover out to various events: from casual social gatherings, to large official events, to presentations in front of people who might appreciate my project. Sawppy has been a great icebreaker for me to start talking with people about their interests, and sometimes this leads to even more events added to Sawppy’s calendar. This coming week will be especially busy.

Space Carnival

On Monday, February 11th, from 3pm-6:30pm, Sawppy will be at Space Carnival, a FIRST themed event on Lincoln’s Birthday held at the Expo Arts Center, a community space in Long Beach, CA. This event is organized by the people behind local FIRST robotics teams. This year’s competition is titled “Destination: Deep Space” and has a very explicit space exploration angle to all of its challenges. So even though Sawppy is nothing like a FIRST robotics competitor, an invitation was still extended to join the fun.

This event will be unique in that I had the option to be a roaming exhibit, and I chose it for novelty. I think a rover that is actually roving will be much more engaging than a rover sitting on a table. It also means I will not be tied to a booth, so I can check out other exhibitors as I roam around with Sawppy. This eliminates the problem I had with Sawppy at DTLA Mini Maker Faire, where I had to stay in one place for most of the event and couldn’t see what other people had brought.

On Wednesday, February 13th, Sawppy will join the STEAM Makers Fair at Brawerman East, a private elementary school. This is a small event catering to students and parents at the school. I believe the atmosphere will be similar to a school science fair, with exhibits of student projects. To augment these core exhibits, Sawppy and a few others were invited. The intent is to show that concepts covered in their on-campus Innovation Lab projects are just as applicable to bigger projects outside of class.

And finally, on Thursday, February 14th, Sawppy will be part of another SGVLUG presentation. A follow-up to the previous rover themed SGVLUG presentation, this one will still set up the background but will talk more about what has happened since our initial rover construction. It also serves as a practice run for a presentation to be given at Southern California Linux Expo (SCaLE) next month.

(Cross-posted to Hackaday.io)

SparkleCon Sidetrack: Does It Have A Name?

spool holder with two stage straightener 1600x1200

My simple afternoon hack of a copper wire straightener got more attention – both online and off – than I had expected. One of those moments came as a fun sidetrack during my Sparklecon talk about my KISS Tindie wire sculptures. As part of the background on my wire form project, I mentioned creating this holder. That kicked off a few questions, which I answered, but I had the most fun with “Does it have a name?”

I gave the actual answer first, which was that I had only been calling it a very straightforward “wire spool holder with straightener,” but I followed up with an off-the-cuff joke: “Or did you mean a name like Felicia?” I think I saw a smile from the person asking the question (hard to tell, he had a beard) and I also got a few laughs out of the audience, which was great. I had intended to leave it at that, but as I was returning to my presentation another joke occurred to me: “Felicia will set you straight.”

Since my script was already derailed, I saw no reason not to run with it: “Is there a fictional character who is a disciplinarian? That might be fitting.” I opened it up to the audience for suggestions. We got “Mary Poppins,” which isn’t bad, but things went downhill from there. The fact is, the disciplinarian in a story is almost always a killjoy obstacle in our hero’s journey. Or worse, one of the villains, as in the suggestion of “Dolores Umbridge” given by a woman wearing a Harry Potter shirt. My reaction was immediate: “No.” But two seconds later I remembered to make it a tad more positive: “Thank you, she is indeed a disciplinarian, but no.” Hopefully she doesn’t feel like I bit her head off.

After the talk, there were additional suggestions interpreting my second joke “Felicia will set you straight” in the sense of personal relationship preferences. This went down a path of politically conservative zealots who believe it is their public duty to dictate what people do in private. This direction of thinking never even occurred to me when I threw out the joke on a whim.

I think I’ll leave it at Mary Poppins.

Using LibPNG To Encode Spooky Eye Data

Sclera array and bitmap

Emily and I thought it would be cool to have the Spooky Eye visualization running on platforms beyond the Teensy and Adafruit SAMD boards. The first target: a Raspberry Pi Zero. Reading through the project documentation and source code gave us a good idea how the data is encoded, but the best test is always to make use of that data and see if it turns out as expected.

This would be a new coding experiment, so a new GitHub repository was created. I added the header files for the various eyeballs, then started looking for ways to use that data. Since the header files are in C, it made sense to look for a C library to work with them. I decided to output the data to PNG bitmap files: verifying the output would be as simple as opening the bitmap in an image viewer.

The canonical reference library for PNG image compression is libpng. Since I expect my use to be fairly mainstream, I skipped over the official documentation that covers all the corner cases a full application would need to consider. In the spirit of a quick hack prototype, I looked for sample code to modify. I found one by Dr. Andrew Greensted that looked simple and amenable to this project.

I fired up my Ubuntu 18.04 WSL and installed gcc and libpng-dev as per instructions. The sample failed to compile at first with this error:

/tmp/ccT3r4xP.o: In function `writeImage':
makePNG.c:(.text+0x36f): undefined reference to `setRGB'
collect2: error: ld returned 1 exit status

Since there were a lot of references to this sample code, I figured this wouldn’t be a new problem. A web search for “makePNG undefined reference to setRGB” sent me to this page on Stack Overflow, which indicated there was a problem with the use of the C keyword inline. There were two options to get around the problem: either remove inline or use the -Ofast compiler option to bypass some standards compliance. I chose to remove inline.

That was enough to get the baseline sample code up and running, and modification began. The first act was to #include "defaultEye.h" and see if that even compiles… it did not.

In file included from makePNG.c:20:0:
defaultEye.h:4:7: error: unknown type name ‘uint16_t’

Again this was a fairly straightforward fix: #include <stdint.h> takes care of defining the standard integer type uint16_t.

Once everything compiled, the makePNG sample code for generating a fractal was removed, as was the code translating the fractal’s floating point values into color. The image data was replaced with data from the Spooky Eye header files. If all went well, I would have a PNG bitmap. The first few efforts generated odd-looking images because there were bugs in my code to convert the Spooky Eyes image array, encoded in 16-bit RGB565 format, to be written out in 24-bit RGB888 format. Once my bitwise manipulation errors were fixed, I had my eyeballs!

Looking Under The Hood Of Adafruit Spooky Eyes

Sclera array and bitmap

Adafruit’s Hallowing was easily the most memorable part of the 2018 Superconference attendee gift bag. Having a little moving, blinking eye looking around is far more interesting than a blinking LED. It is so cool, in fact, that Emily has ambitions to put the same visual effect on other devices.

Since the Hallowing was one of the headline platforms supporting CircuitPython, the original hope was that the code would be easy to translate to a Raspberry Pi. Sadly, it turns out “Spooky Eyes” is actually a sketch created in the Arduino IDE for a Teensy board that also runs on the Hallowing.

As I found out in my own Nyan cat project for the Superconference 2018 badge, modern image compression algorithms are a tough fit for small microcontrollers. And just as I translated an animated GIF into raw byte data for that project, Spooky Eyes represents its image data as raw bytes in a header file.

Adafruit always has excellent documentation, so of course there’s a page describing what these bytes represent and where they came from, for the purpose of letting people create their own eye bitmaps. Apparently this project came from this forum thread. I was a little grumpy that the Adafruit page said “from the Github repository” without providing a link, but the forum thread pointed here for the Python script tablegen.py.

There was a chance the source bitmaps would be on GitHub as well, but I had no luck finding them. They certainly weren’t in the same repository as tablegen.py or the Arduino sketches I examined. Still, the data is there; we just need to figure out what format would be most useful for putting the eye on another project.

As a first step, I’ll try to extract and translate the data into a more familiar lossless bitmap format, something directly usable by more powerful devices like a Raspberry Pi. A successful translation would confirm I understand the eyeball data format correctly, which would be good to know for any future project that needs to encode the data into different formats for other devices.

KISS Tindies: Ace/Spaceman II

One of the scrambles before Sparklecon was getting my KISS band back together. On the weekend prior to Sparklecon, the band went on tour in their first public appearance. The good news: they were very well received and people loved them! The bad news: someone loved them so much they decided to adopt my Spaceman, taking him home without my permission. I was missing a member of the band.

I had already signed up to talk about the band for Sparklecon, and it would be a bit lame not to have the full band at my talk. This meant making Spaceman II, but for that I needed another KISS Tindie PCB. Fortunately, Emily came to the rescue! She was also at Superconference and had picked up a KISS Tindie PCB of her own. She generously donated her panel so I could rebuild my blinky band.

Emily KISS Tindie Panel.jpg

Emily had already soldered a pair of yellow non-blinking LEDs to her Spaceman. For consistency with the rest of my blinky band, those two LEDs were removed. Then I got to work rebuilding a wire frame body. Given the time crunch, I tried to skimp a bit on details and initially tried to make Spaceman’s guitar out of a single length of wire.

Spaceman 2 one piece guitar

I only got this far, though, before I decided it didn’t look good. I aborted and returned to multi-piece construction. It is more time consuming, but it conveys superior detail.

Spaceman 2 multi piece guitar.jpg

Unfortunately that aborted experiment put me further behind schedule. This was not the time to experiment; I needed to stick with known solutions. For the rest of this reconstruction, I mostly stuck with what I knew worked.

Spaceman 2 complete

I’m sad that I lost my first KISS Tindie Spaceman, but this experience also gave me the opportunity to answer one question I was asked: How long does it take to build a wire frame body for a KISS Tindie? I honestly did not know because when I’m focused on a project like this I lose track of time. Bend wire, compare against drawing, repeat until the curve is right. Then solder that piece, and repeat the whole process for the next piece.

I had guessed maybe each Tindie would take 30-45 minutes. This time, I started a timer just before I removed the yellow LEDs and stopped it right after I took the above picture of a completed KISS Spaceman II. Total time: 2 hours 45 minutes. Even though this included the aborted single-wire guitar, my estimate was clearly way off!

But that time was well spent: I had the full band again for my Sparklecon presentation.

Party Bling in 30 Minutes: LED Blinky Collar

It’s good to have grand long term projects like Sawppy, but sometimes it’s nice to take a side trip and bang out a quick fun project. The KISS Tindies were originally supposed to be such a project but grew beyond their original scope. Today’s project had to be short by necessity: at less than 30 minutes, it’s one of my shortest projects ever.

Collar LED blinky final curved.jpg

The motivation for this project was an office party, but I didn’t know what the crowd was going to be like. My fallback outfit for such unknowns is a long sleeved dress shirt and a sport jacket. If it turns out to be formal, I’ll be under-dressed, but at least I’ll have a jacket on. If it turns out to be semi-formal, I should fit in. If it is casual, I can take off the jacket. But these are people in the electronics industry, so there’s a chance I’ll find a room full of people wearing flashing LEDs. Less than an hour before I had to leave, I decided that instead of my usual necktie I would substitute a little LED bling of my own.

The objective was to take the same self-blinking LEDs I used on my KISS Tindies and install them under my shirt collar. Since these LEDs can be obnoxiously bright (especially in dark rooms), the light would be indirect, bouncing off the fabric underneath my collar. This way I wouldn’t blind whoever I was trying to hold a conversation with.

When I bought the self-blinking LEDs for the KISS Tindies project, I bought a bag of 100, so there were plenty left to play with. For a battery holder I used the same copper wire design I created for the Tindies. There was no time to 3D print a structure, so I just used paper, with copper foil tape forming the circuitry on that sheet of paper. Here’s the initial prototype. I folded the paper in half to give it more strength, and cut out a chunk of paper to help the battery holder stay in place.

collar led blinky prototype parts

Assembling these parts resulted in a successfully blinking LED, good enough to proceed.

collar led blinky prototype works

The final version used a longer sheet of paper. I measured my shirt collar and cut a sheet of paper so the ends would sit roughly 3mm inside the collar. This was longer than a normal sheet of paper so I had to pull a sheet of legal size paper out of my paper recycle bin. I think it was the legal disclosure form for a pre-approved credit card offer.

collar led blinky final

The LEDs sit a few centimeters inside the paper’s edge. The other side of the paper had extra copper tape to shield the light from shining through. I wanted the light to reflect inside my collar, not show through it. The first test showed a few circular spotlights on my shirt, so I added a sheet of Scotch tape to diffuse light. Once I was happy with the layout of this contraption, I soldered all components to copper foil for reliability.

Less than 30 minutes from start to finish, I had a blinky LED accessory for my shirt.


As it turned out, there was only one other person wearing electronics, in the form of some electroluminescent wire. My blinky LED collar was more subtle about announcing itself, but it was noticed by enough people to make me happy.

(Cross-posted to Hackaday.io)

Sawppy Odometry Candidate: Flow Breakout Board

When I presented Sawppy the Rover at the Robotics Society of Southern California, one of the things I brought up as an open problem is how to determine Sawppy’s movement through its environment. Wheel odometry is sufficient for a robot traveling on flat ground like Phoebe, but when Sawppy travels over rough terrain things can get messy in more ways than one.

In the question-and-answer session, some people brought up the idea of calculating odometry by visual means, much the way a modern optical computer mouse determines its movement across a desk. This is something I could whip up with a downward pointing webcam and open source software, but there are also pieces of hardware designed specifically for this task. One example is the PMW3901 chip, which I could experiment with using breakout boards like this item on Tindie.

However, that visual calculation is only part of the challenge, because translating what that camera sees into a physical dimension requires one more piece of data: the distance from the camera to the surface it is looking at. Depending on application, this distance might be a known quantity. But for robotic applications where the distance may vary, a distance sensor would be required.

As a follow-up to my presentation, RSSC’s online discussion forum brought up the Flow Breakout Board: a small lightweight module that puts the aforementioned PMW3901 chip alongside a VL53L0x distance sensor. This is an interesting candidate for helping Sawppy gain awareness of how it is moving through its environment (or failing to do so, as the case may be.)


The breakout board only handles the electrical connections – an external computer or microcontroller will be necessary to make the whole system sing. That external module will need to communicate with the PMW3901 via SPI and, separately, with the VL53L0x via I2C. Then it will need to perform the math to calculate the actual X-Y distance traveled. This in itself isn’t a problem.

The problem comes from the fact that the PMW3901 was designed to be used on small multirotor aircraft to help them hold position. Several design decisions that make sense for its intended purpose turn out to be problems for Sawppy.

  1. The chip is designed to help hold position, which is why it is not concerned with the height above the surface or the physical dimensions of the motion: the sensor only needs to detect movement so the aircraft can be brought back into position.
  2. Multirotor aircraft all have built-in gyroscopes to stabilize themselves, so they already detect rotation about their Z axis. Sawppy has no such sensor and would not be able to calculate its position in global space if it doesn’t know how much it has turned in place.
  3. Multirotor aircraft fly through the air, so the designed working range of 80mm to infinity is perfectly fine. However, Sawppy has only 160mm between the bottom of the equipment bay and nominal floor distance. When traversing obstacles more than 80mm tall, or rough terrain bringing the surface within 80mm of the sensor, this sensor would become disoriented.

This is a very cool sensor module that has a lot of possibilities, and despite its potential problems it has been added to the list of things to try for Sawppy in the future.

Give The People What They Want: Wire Straightener Now On Thingiverse

My wire straightener project was focused on simplicity and reliability. There are no mechanical adjustments for different gauge wires or to correct for a 3D printer’s dimensional accuracy (or lack thereof.) Every adjustment had to be made in CAD by changing the relevant dimensions and printing a test unit. This requires more work up front, but once all the dimensions are dialed in, the single piece tool will never fall apart and will never need readjustment.

spool holder with two stage straightener 1600x1200

It also means the raw STL files generated by Onshape for my printer would probably not work properly for anyone else. For starters, they were tailored to my specific spool of 18 gauge copper wire. According to Google, 18 gauge translates to a diameter of 1.02mm. My calipers say my spool is 1.00 +/- 0.01 mm, slightly smaller than specified. The model is then processed into G-code by Simplify3D, my slicer, and finally that G-code is translated into plastic by my printer, with all its individual quirks.

So while I was happy to share my Onshape CAD file, I resisted sharing the STL because it almost certainly would not work correctly, and I don’t want people to have a bad experience with my design. But people asked for it anyway, over and over.

I have since changed my mind about posting the STL. I will post it, but never by itself. I will also post information describing why the STL is probably not going to work, a link to the Onshape CAD, and what people need to do to make their own. I foresee the following possibilities:

  1. People who don’t read the instructions will print the file as-is:
    • If it works for them – great!
    • If it doesn’t, they will either:
      • Abandon it with “This design is stupid and it sucks.” – Well, let’s face it, I was not going to reach this audience anyway.
      • Think “Maybe I should go back and read the instructions.”
  2. People who read the instructions will either:
    • Fine-tune parameters to successfully make their own straightener – great!
    • Try to follow the directions but encounter problems and need help – I’m happy to help.

Unless I’ve failed to consider something horrible, these possibilities have more upsides than downsides, so let’s try it. I’m going to share the STL files on the Hackaday.io project page, and I’ve created a Thingiverse page for it as well.

(Cross-posted to Hackaday.io)

ROS In Three Dimensions: Navigation and Planning Will Be Hard

backyard sawppy 1600

At this point my research has led me to the ROS module RTAB-Map, which will create a three dimensional representation of a robot’s environment. It seems very promising… but building such a representation is only half the battle. How would a robot make good use of this data? My research has not yet uncovered applicable solutions.

The easy thing to do is fall back to two dimensions, which allows use of the standard ROS navigation stack. The RTAB-Map ROS module appears to make this super easy, with an option to output a two dimensional occupancy grid just like what the navigation stack wants. That is a solid baseline for handling indoor environments, navigating from room to room and such.

But where’s the fun in that? I could already do that with the strictly two-dimensional Phoebe. Sawppy is a six wheel rover built for rougher terrain, and it would be far preferable to make Sawppy autonomous with ROS modules that can understand and navigate outdoor environments. But doing so successfully will require solving a lot of related problems that I don’t have answers for yet.

We can see a few challenges in the picture of Sawppy in a back yard environment:

  • Grass is a rough surface that would be very noisy to robot sensors due to individual blades of grass. With its six wheel drivetrain, Sawppy can almost treat grassy terrain as flat ground. But not quite! There are hidden dangers – like sprinkler heads – which could hamper movement and should be considered in path planning.
  • In the lower right corner we can see a transition from grass to flat red brick. This would show up as a transition to robot sensors as well, but deciding whether that transition is important will need to be part of path planning. It even introduces a new consideration in the form of direction: Sawppy has no problem dropping from grass to brick, but it takes effort to climb from brick back onto grass. This asymmetry in cost would need to be accounted for.
  • In the upper left corner we see a row of short bricks. An autonomous Sawppy would need to evaluate those short bricks and decide if they could be climbed, or if they are obstacles to be avoided. Experimentally I have found that they are obstacles, but how would Sawppy know that? Or more interestingly: how would Sawppy perform its own experiment autonomously?
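
The asymmetric grass-to-brick cost can be captured by planning over a directed graph, where each direction of travel gets its own edge weight. Here is a minimal sketch (all names and cost values are made up for illustration, not measurements from Sawppy):

```python
import heapq

def shortest_cost(graph, start, goal):
    """Dijkstra's algorithm over a directed graph with asymmetric edge costs."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return float("inf")

# Hypothetical terrain graph: dropping from grass down to brick is cheap,
# climbing from brick back up onto grass costs more.
terrain = {
    "grass": {"brick": 1.0},
    "brick": {"grass": 3.0},
}

print(shortest_cost(terrain, "grass", "brick"))  # 1.0
print(shortest_cost(terrain, "brick", "grass"))  # 3.0
```

The same trip costs differently depending on direction, which is exactly the kind of information a planner for outdoor terrain would need to carry that a plain 2D occupancy grid cannot.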

So many interesting problems, so little time!

(Cross-posted to Hackaday.io)

ROS In Three Dimensions: Data Structure and Sensor

rosorg-logo1

One of the ways a TurtleBot makes ROS easier and more approachable for beginners is by simplifying a robot’s world into two dimensions. It’s somewhat like the introductory chapters of a physics textbook, where all surfaces are frictionless and all collisions are perfectly elastic. The world of a TurtleBot is perfectly flat and all obstacles have an infinite height. This simplification allows the robot’s environment to be represented as a 2D array called an occupancy grid.
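
As a concrete sketch of that simplification, here is a toy occupancy grid in plain Python. The sizes and contents are made up; the -1/0/100 cell values follow the convention used by ROS occupancy grid messages (unknown, free, occupied):

```python
# Cell values per the ROS nav_msgs/OccupancyGrid convention.
UNKNOWN, FREE, OCCUPIED = -1, 0, 100

width, height = 5, 4     # grid size in cells (arbitrary for this sketch)
resolution = 0.05        # meters per cell, a common ROS map resolution

# Everything starts unknown until a sensor observes it.
grid = [[UNKNOWN] * width for _ in range(height)]

# Mark a cleared corridor and one obstacle, as a sensor sweep might.
for x in range(width):
    grid[1][x] = FREE
grid[1][3] = OCCUPIED

def world_to_cell(wx, wy):
    """Convert world coordinates (meters) to grid cell indices."""
    return int(wx / resolution), int(wy / resolution)

print(world_to_cell(0.17, 0.08))  # (3, 1)
```

Everything the robot knows about its world fits in that one small array, which is why so much of the beginner-friendly ROS tooling is built around it.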

Of course, the real world is more complicated. My TurtleBot clone Phoebe encountered several problems just trying to navigate my home. The real world does not have flat floors, and obstacles come in all shapes, sizes, and heights. Fortunately, researchers have been working on the problems encountered by robots venturing outside that simplified world; it’s a matter of reading research papers and following their citation links to find the tools.

One area of research improves upon the 2D occupancy grid by building data structures that can represent a robot’s environment in 3D. I’ve found several papers that built upon the octree concept, so that seems to be a good place to start.

But for a robot to build a representation of its environment in 3D, it needs 3D sensors. Phoebe’s Neato vacuum LIDAR works in a simplified 2D world but won’t cut it anymore in a 3D world. The most affordable entry point here is the Microsoft Kinect sensor bar from an old Xbox 360, which can function as an RGBD (red + green + blue + depth) input source for ROS.

Phoebe used Gmapping for SLAM, but that takes 2D laser scan data and generates a 2D occupancy grid. Searching for a 3D SLAM algorithm that can digest RGBD camera data, a query for “RGBD SLAM” led immediately to a straightforwardly named package. But of course, that’s not the only one around. I’ve also come across RTAB-Map, which seems to be better maintained and updated for recent ROS releases. And best of all, RTAB-Map can generate odometry data purely from the RGBD input stream, which might allow me to bypass the challenges of calculating Sawppy’s chassis odometry from unreliable servo angle readings.

(Cross-posted to Hackaday.io)

Sawppy on ROS: Open Problems

A great side effect of giving a presentation is that it requires me to gather my thoughts in order to present them to others. Since members of RSSC are familiar with ROS, I collected my scattered thoughts on ROS from the past few weeks and condensed the essence into a single slide that I’ve added to my presentation.

backyard sawppy 1600

From building and running my Phoebe robot, I learned the basics of ROS using a two-wheeled robot on flat terrain. Sticking to 2D simplifies a lot of robotics problems, and I thought it would help me expand to a six-wheeled rover on rough terrain. Well, I was right on the former, but the latter is still a big step to climb.

The bare basic responsibilities of a ROS TurtleBot chassis (and derivatives like Phoebe) are twofold: subscribe to topic /cmd_vel and execute movement commands published to that topic, and from the resulting movement, calculate and publish odometry data to topic /odom.
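
Those two responsibilities boil down to integrating velocity commands into a pose estimate. Here is the dead-reckoning math behind that in a minimal, ROS-free Python sketch (class and variable names are mine, and it assumes the wheels never slip – exactly the assumption that breaks down on rough terrain):

```python
import math

class DiffDriveOdometry:
    """Dead-reckoning pose estimate from commanded velocities.

    This is the math behind publishing /odom from /cmd_vel when motion is
    assumed to match the command; real chassis code would use measured
    wheel motion instead of trusting the command.
    """
    def __init__(self):
        self.x = self.y = self.theta = 0.0

    def update(self, linear, angular, dt):
        # Integrate one time step of (linear m/s, angular rad/s) commands.
        self.theta += angular * dt
        self.x += linear * math.cos(self.theta) * dt
        self.y += linear * math.sin(self.theta) * dt
        return self.x, self.y, self.theta

odom = DiffDriveOdometry()
for _ in range(10):  # one second of driving straight at 0.2 m/s
    odom.update(linear=0.2, angular=0.0, dt=0.1)
print(round(odom.x, 3))  # 0.2
```

The moment a wheel slips, the commanded velocity and the actual motion diverge, and this estimate silently drifts away from reality – which is the heart of the problem described below.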

Executing commands sent to /cmd_vel is relatively straightforward when Sawppy is on level ground. It would not be terribly different from existing code. The challenge comes from uneven terrain with unpredictable traction. Some of Sawppy’s wheels might slip, and the resulting motion might be very different from what was commanded. My experience with Phoebe showed that while it is possible to correct for minor drift, major sudden unexpected shifts in position or orientation (such as when Phoebe runs into an unseen obstacle) throw everything out of whack.

Given the potential for wheel slip on uneven terrain, calculating Sawppy’s odometry is a challenge. And that’s before we consider another problem: the inexpensive serial bus servos I use do not have fine-pitched rotation encoders, just a servo angle sensor that only covers ~240 of 360 degrees. While I would be happy if it simply returned no data when out of range, it actually returns random data. I haven’t yet figured out a good way to separate the signal from the noise, which would be required to calculate odometry.
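
One possible starting point for that filtering is to reject readings that fall outside the sensor’s range, or that imply an impossible jump between consecutive samples. A rough sketch, with function name and threshold values guessed purely for illustration, not tuned against real servos:

```python
def filter_servo_angle(previous, reading, max_step=15.0):
    """Return a plausible servo angle, falling back to the previous value.

    The servo only senses roughly 0-240 of 360 degrees and returns garbage
    outside that range; max_step bounds how far the angle can believably
    move between two samples (a guess for illustration).
    """
    if not 0.0 <= reading <= 240.0:
        return previous  # outside sensing range: junk data, discard
    if previous is not None and abs(reading - previous) > max_step:
        return previous  # implausible jump between samples, discard
    return reading

# Simulated stream: valid readings mixed with out-of-range and spike noise.
history = None
for raw in [120.0, 122.0, 297.5, 123.0, 50.0, 124.5]:
    history = filter_servo_angle(history, raw)
print(history)  # 124.5
```

This simple gating would reject obvious garbage, but it can also mask real sudden motion, so it is only a first step and not a solution to the odometry problem.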

And these are just the challenges within the chassis I built; there are more out there!

(Cross-posted to Hackaday.io)

Sawppy Presented at January 2019 RSSC Meeting

Today I introduced my rover project Sawppy to members of the Robotics Society of Southern California. Before the presentations started, Sawppy sat on a table so interested people could come by for a closer look. My visual aid PowerPoint slide deck is available here.

sawppy at rssc

My presentation is an extended version of what I gave at Downtown LA Mini Maker Faire. Some of the additions came at the beginning: this time I wasn’t following a JPL Open Source Rover presentation, so I had to give people the background story on ROV-E, JPL OSR, and SGVHAK rover to properly explain Sawppy’s inspiration. Some of the additions came at the end: there were technical details I was able to discuss with a technical audience. (I’ll expand on them in future blog posts.)

I was very happy with the positive reception Sawppy received. The first talk of the morning covered autonomous robots, so I was afraid the audience would look down on Sawppy’s lack of autonomy. Thankfully that did not turn out to be a big deal. Many were impressed by the mechanical design and construction. Quite a few were also appreciative when I stressed my emphasis on keeping Sawppy affordable and accessible. In the Q&A session we covered a few issues that had easy solutions… if one had a metalworking machine shop. I insisted that Sawppy could be built without a machine shop, and that’s why I made some of the design decisions I did.

A few people were not aware of Onshape and my presentation stirred their interest to look into it. There was also a surprising level of interest in my mention of Monoprice Maker Select v2 as an affordable entry level 3D printer, enough hands were raised that I signed up to give a future talk about my experience.

(Cross-posted to Hackaday.io)

Dell Alienware Area-51m vs. Luggable PC

On the Hackaday.io project page of my Luggable PC, I wrote the following as part of my reason for undertaking the project:

The laptop market has seen a great deal of innovation in form factors. From super thin-and-light convertible tablets to heavyweight expensive “Gamer Laptops.” The latter pushes the limits of laptop form factor towards the desktop segment.

In contrast, the PC desktop market has not seen a similar level of innovation.

It was true when I wrote it, and to the best of my knowledge it has continued to be the case. CES (Consumer Electronics Show) 2019 is underway and there are some pretty crazy gamer laptops getting launched, and I have heard nothing similar to my Luggable PC from a major computer maker.

laptops-aw-alienware-area-51m-nt-pdp-mod-hero

So what’s new in 2019? A representative of the current leading edge in gamer laptops is the newly launched Dell Alienware Area-51m. It is a beast of a machine pushing ten pounds, almost half the weight of my luggable. Granted, that weight includes a battery for some duration of operation away from a plug, something my luggable lacks. It’s not clear if that weight includes the AC power adapter – or possibly adapters, plural, since I see two power sockets in pictures. As the machine has not yet officially launched, there isn’t yet an online manual for me to read to learn what that’s about.

It offers impressive upgrade options for a laptop. Unlike most laptops, it uses a desktop CPU complete with a desktop processor socket on its motherboard. The memory and M.2 SSD are not huge surprises – they’re fairly par for the course even in mid-tier laptops. What is a surprise is the separate detachable video card that can be upgraded, at least in theory. Unlike my luggable, which takes standard desktop video cards, this machine takes a format I do not recognize. Ars Technica said it is the “Dell Graphics Form Factor,” which I had never heard of and found no documentation for. I share Ars Technica’s skepticism about the upgrade claims. Almost ten years ago I bought a 17-inch Dell laptop with a separate video card, and I never saw an upgrade option for it.

There are many different upgrade options for the 17″ screen, but they are all 1080p displays. I found this curious – I would have expected a 4K option in this day and age. Or at least something like the 2560×1440 resolution of the monitor I used in Mark II.

And finally – that price tag! It’s an impressive piece of engineering, and obviously a low volume niche, but the starting price of over $2,500 still came as a shock. While current market prices may make buying a mid-tier computer more sensible than building one, I could definitely build a high end luggable with specifications similar to the Alienware Area-51m for less.

I am clearly not the target demographic for this product, but it was still fun to look at.

Sawppy Will Be Presented At RSSC

Roger Sawppy

I brought Sawppy to the Downtown Los Angeles Mini Maker Faire this past December, where I had the opportunity to give a short presentation about my project. (Above.) Also at the event were the Robotics Society of Southern California (RSSC) and a few members asked if I would be interested in presenting Sawppy at an upcoming RSSC meeting.

Since I’m always happy to share Sawppy with anyone interested in my little rover, I said yes, and I’m on their calendar for the RSSC meeting on Saturday, January 12th. From 11AM to noon, I plan to talk for about 35-40 minutes and leave the remaining time for Q&A and for driving Sawppy over obstacles to demonstrate the rocker-bogie suspension.

This group has collective expertise in Robot Operating System, which I’ve been learning on-and-off at a self-guided pace. If conversations go in a direction where it makes sense, I’ll ask my audience for their input on how best to put Sawppy on ROS. I also plan to bring Phoebe, the ROS TurtleBot clone that I built to learn ROS, just for fun.

And I’m sure I’ll see other cool robotics projects there!

(Cross-posted to Hackaday.io)

KISS Tindies: On Stage

Now it’s time to wrap up the KISS Tindies wireform practice project with a few images for posterity. Here’s the band together on stage:

kiss tindie band on stage

An earlier version of this picture had Catman’s drum set in its bare copper wire form. It worked fine, but I recalled that most drum sets I’ve seen in stage performances had a band logo or something on the bass drum surface facing the audience. I hastily soldered another self-blinking LED to my drum set, arranged so it can draw power from a coin cell battery sitting in the drum. Calling this a “battery holder” would be generous; it’s a far simpler hack than that.

kiss tindie drum set blinky led

I then printed a Tindie logo, scaled to fit on my little drum. Nothing fancy, just a standard laser printer on normal office paper. I then traced out the drum diameter and cut out a little circle with a knife. Old-fashioned white glue worked well enough to attach it to the copper wire, and that was enough to make the drum set far more interesting than bare wire.

A black cardboard box lid served as a stage, with a 4xAA battery tray serving as an elevated platform for the drummer. I watched a few YouTube videos to see roughly where Demon, Spaceman, and Starchild stand relative to each other and Catman as drummer. It may not be a representative sample, but hopefully it eliminated the risk of: “They never stand that way.”

With batteries installed in everyone, it was time for lights, camera, action! It was just a simple short video shot on my cell phone camera, one continuous pull-back motion executed as smoothly as I could with my bare hands, helped by the phone’s built-in stabilization. I had one of the aforementioned YouTube videos running for background sound.

I wanted to start the video focused on the drum logo, but doing so required the phone to be right where I wanted Demon to stand. After a few unsatisfactory tests, I decided to just add Demon mid-scene, after the phone had moved out of the way. It solved my position problem and added a nice bit of dynamism to the shot.