Successful Polycarbonate Plastic Engraving Session

The first test run for CNC engraving was done on a piece of MDF, mainly because the piece was already in the machine, surfaced, and ready to go. It was also a forgiving material in case of mistakes, but MDF doesn’t show engraved details very well.

The next session increased the difficulty level: now we have a piece of scrap polycarbonate plastic (“Lexan”) for our next engraving test. This material is interesting because it has different properties than PMMA (a.k.a. acrylic). The latter is a popular material for laser cutting, but it is also very brittle and vulnerable to cracking under stress. Polycarbonate plastics are much more robust and a better choice when physical strength is important in a project.

Acrylic is also popular for laser engraving projects, but polycarbonate does not engrave or cut easily under laser power due to its different properties. It is not particularly friendly to CNC machining either, but we’ll start with an engraving project before we contemplate milling it.

Thankfully this first polycarbonate session was a success, and it illustrated some of the challenges of working with such materials. The toughness of the material meant the little strings of cut chips wanted to remain attached to the stock, making cleanup a hassle. Upon close examination, we saw the engraved groove is slightly deeper on the left side than the right. That is proof our scrap MDF working surface is not flat, which was not a surprise, but it is “flat enough” to within 4-8 thousandths of an inch (1-2 sheets of normal office paper), which was better than expected.

Even with its imperfections, performance on this test indicates the machine is capable of engraving on materials we can’t use in the laser cutter. That might be useful, and a good example of how we can still learn lessons on this machine despite its flawed Z-axis and other problems. We should still fix them, of course, but the machine can already be useful while we work on those improvements.

Preparing For ROS 2 Transition Looks Complicated

Before I decided to embark on a ROS Melodic software stack for Sawppy, I thought about ignoring the long legacy of ROS 1 and going to the newer ROS 2, built on more modern infrastructure. I mean, I told people to look into it, so I should walk the walk, right? Eventually I decided against putting Sawppy on ROS 2; the deal breaker was that the Raspberry Pi is not a tier 1 platform for ROS 2. This means there’s no guarantee of regular binary releases for it, or that it will always function. I may have to build my own arm32 binaries for Raspbian from source code, and I would be on my own to verify functionality. I’ve done a superficial survey of other candidates for a Sawppy brain, but for today Sawppy is still thinking with a Raspberry Pi.

But even after making that decision I wanted to keep ROS 2 in mind. Open Robotics has a ROS 2 migration guide to help ROS node authors navigate the transition, and it doesn’t look trivial to me. But then again, I don’t have the ROS expertise to accurately judge the effort involved.

The biggest headache for some nodes will be the lack of Python 2 support. This mainly impacts ROS nodes with a long legacy of Python 2 code; it does not affect a new project written against ROS Melodic, which is supposed to support Python 3.

The next headache is the fact that it’s not possible to write if/else blocks to allow a single ROS node to simultaneously support ROS 1 and ROS 2. The recommendation is to put all specialized logic into generic, non-ROS-specific code in a library that can be shared, then write separate code tailored to the infrastructure paradigms of ROS 1 and ROS 2. This way, all the code integrating with a particular ROS platform is kept separate, each calling into the shared library.
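
To make the recommendation concrete, here is a minimal sketch of that split, assuming a hypothetical steering node. None of the names below come from actual Sawppy or Curio code, and in a real project the three sections would be separate files, with the wrappers living on separate branches as described below.

import math

# ---- Shared, ROS-agnostic library (hypothetical drive_math module). ----
# Pure Python with no ROS imports, so it can be reused unchanged by both wrappers.
def wheel_angle(turn_radius, wheel_x, wheel_y):
    """Placeholder geometry calculation standing in for real rover logic."""
    return math.atan2(wheel_x, turn_radius - wheel_y)

# ---- Thin ROS 1 wrapper (would live on the ROS 1 branch). ----
def ros1_node():
    import rospy  # ROS 1 client library
    rospy.init_node('steering')
    # ... subscribe to drive commands, call wheel_angle(), publish servo commands ...
    rospy.spin()

# ---- Thin ROS 2 wrapper (would live on the ROS 2 branch). ----
def ros2_node():
    import rclpy  # ROS 2 client library
    rclpy.init()
    node = rclpy.create_node('steering')
    # ... subscribe to drive commands, call wheel_angle(), publish servo commands ...
    rclpy.spin(node)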

It also sounds like the ROS 1 and ROS 2 build systems conflict, so they can’t even coexist side by side. Different variants of a node have to live in separate branches of a repository, with the shared library code merged across branches as development continues, while the ROS 1 and ROS 2 specific infrastructure code lives only in its respective branch.

I can see why a vocal fraction of ROS developers are unhappy with this “best practice.” And since ROS is open source, I foresee one or more groups joining forces to keep ROS 1 alive and working with old code even as Open Robotics moves on to ROS 2. Right now there are noises being made by people who proclaim they will do a similar thing for Python, saying they’ll keep Python 2 alive past its official EOL. In a few years we can look back and see whether those Python 2 holdouts actually thrived, and we can also see how the ROS 1/ROS 2 situation has evolved.

Wish List: Modular Sawppy Motor Controllers

One of the goals for my now-abandoned ROS Melodic Sawppy software project is something I still believe to be interesting. In contrast with the non-rover specific goals I outlined over the past few days, this one is still a rover item: I would like Sawppy motor control to be encapsulated in modules that can be easily swapped so Sawppy siblings are not required to use LX-16A servos.

My SGVHAK rover software had an infantile version of this option, written under extreme time pressure to support our hack of using an RC servo controller to steer the right-front corner during SGVHAK rover’s SCaLE debut. In that software, the code for all supported motor controllers is always loaded, an unnecessary amount of complexity and overhead. It would be nice for a particular rover to bring in just the code it needs.

The HBRC crew up in the SF Bay Area (Marco, Steve, and friends) have swapped out the six drive wheels for something faster while keeping the servos for steering, so a starting point is to offer separate options for steering and driving controllers. But keeping in mind that the original scenario was using an RC servo to hack a single steering corner, we want to make it possible to use heterogeneous motor controllers across all ten axes of motion.
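
As a rough sketch of what I have in mind (hypothetical class and function names, not actual Sawppy code), each axis of motion would be handed a controller object behind a common interface, and a particular rover would bring in only the implementations it needs:

from abc import ABC, abstractmethod

class SteeringController(ABC):
    """Common interface for anything that can position a steering joint."""
    @abstractmethod
    def set_angle(self, radians):
        """Command the joint to an absolute steering angle."""

class DriveController(ABC):
    """Common interface for anything that can spin a drive wheel."""
    @abstractmethod
    def set_velocity(self, radians_per_second):
        """Command the wheel to a rolling velocity."""

class LX16ASteering(SteeringController):
    def __init__(self, servo_id):
        self.servo_id = servo_id
    def set_angle(self, radians):
        pass  # would translate the angle into an LX-16A serial bus command

class RCServoSteering(SteeringController):
    def __init__(self, channel):
        self.channel = channel
    def set_angle(self, radians):
        pass  # would translate the angle into an RC PWM pulse width

# Each of the ten axes gets its own controller instance, so one rover could
# mix LX-16A steering on three corners with an RC servo hack on the fourth.
steering = {
    'front_left': LX16ASteering(servo_id=1),
    'front_right': RCServoSteering(channel=0),
}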

I need to better understand Rhys’ code to know if this is something I can contribute back to the Curio ROS Melodic software project. Rhys has stated an intent to bring ros_control into the Curio software stack, primarily for better Gazebo simulation, but it would also abstract the rover motor control logic: generic velocity controllers for driving wheels and position controllers for steering. From there, we can have individual implementations responding to those controllers. Is that how it will work? I need to ramp up on Gazebo and ros_control before I can speak knowledgeably about it.

Learning Github Actions For Automating Verification

Once I wrote up some basic unit tests for my Sawppy rover Ackermann math, I wanted to make sure the tests are executed automatically. I don’t always remember to run the tests, and a test that isn’t getting executed isn’t very useful, obviously. I knew there were multiple tools available for this task, but lacking the correct terminology I wasted time looking in the wrong places. I eventually learned this falls under the umbrella of CI/CD (continuous integration/continuous deployment) tools. Not only that, a tool to build my own process had been sitting quietly, waiting for me to get around to using it: GitHub Actions.

The GitHub Actions documentation was helpful in laying out the foundation for me as a novice, but I learn best when the general foundation is grounded by a concrete example. When looking around for an example, I realized one was staring me right in the face: the wemake Python style guide code analysis tool is also available as a prebuilt GitHub Action.

Using it as a template, I modified my YAML configuration file so it ran my Python unit tests in addition to analyzing my Python code style, and so it would do this upon every push to the repository or whenever someone opens a pull request. Now we have insight into the condition of my code style and basic functionality upon every GitHub interaction, ensuring that nobody can get away with pushing (or opening a pull request with) code that is completely untried and fundamentally broken. If they try, GitHub will catch them doing it and deliver proof. It’s not extensive enough to catch esoteric problems, but it provides a baseline sanity check.

I feel like this is something good to keep going and put into practice for all my future coding projects. Well, at least the nontrivial ones… I’ll probably skip doing it for simple Arduino demo sketches and such.

First Foray Into Python Unit Tests

When a Sawppy community member stepped up and released a ROS Melodic rover software stack, I abandoned my own efforts since there was little point in duplicating effort. But in addition to rover control, that project was also a test run for a few other ideas. I used a Jupyter notebook to help work through the math involved in rover geometry, and I started using a Python coding style static analysis tool to enforce my code style consistency.

I also wanted to start writing a test suite in parallel with my code development. It’s something I thought would be useful in past projects but never put enough focus into. It always seemed intimidating to build test suites robust enough to catch all the bugs, when it takes a climb up the learning curve just to verify the most basic functionality. What would be the point of that? Surely basic functionality would have been verified before code is pushed to a GitHub repository.

Then I had the misfortune of wasting many hours on a different project because another developer did not even verify their code was valid Python syntax before committing and pushing to the repository. My idealism meant I spent too long digging for another explanation, because “surely they’ve at least run their code.” I was wrong. This taught me there’s value in unit tests that verify even the most basic functionality.

So I brought up the Python unit test library documentation and started writing a few basic tests for the rover Ackermann geometry calculation. The biggest hurdle was that binary floating point arithmetic doesn’t reliably produce exactly equal results, so the normal equality comparison is too strict, and we don’t need that much precision anyway. Calculating Sawppy steering geometry isn’t like calculating an orbital trajectory for an actual mission to Mars. For production code using Python 3.5 onwards, there’s math.isclose(), available as a result of PEP 485. For the purposes of Python unit tests, we can use assertAlmostEqual(). And how did I generate my test data? I used my Jupyter notebook! It’s a nice way to verify my wemake-compliant code generates the same output as the original calculations hashed out in the Jupyter notebook.
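
Here is a minimal sketch of what such a test looks like, using a made-up stand-in function rather than the actual Sawppy calculation; in a real test, the expected values would come from the Jupyter notebook.

import math
import unittest

def steering_angle(turn_radius, wheel_x, wheel_y):
    """Hypothetical stand-in for the Ackermann calculation under test."""
    return math.atan2(wheel_x, turn_radius - wheel_y)

class TestAckermann(unittest.TestCase):
    def test_nearly_straight(self):
        # A huge turn radius should steer the wheel nearly straight ahead.
        self.assertAlmostEqual(steering_angle(1e9, 0.2, 0.15), 0.0, places=5)

    def test_tight_turn(self):
        # Compare against an independently computed expected value, to a
        # tolerance instead of exact floating point equality.
        self.assertAlmostEqual(steering_angle(1.0, 0.2, 0.15), 0.23109, places=4)

if __name__ == '__main__':
    unittest.main()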

And finally, none of this would do any good if it doesn’t get executed. If someone is going to commit and push bad code they didn’t even try to run, they’re certainly not going to run the unit tests, either. What I need is to learn how to make a machine perform the verification for me.

Reworking Sawppy Ackermann Math in a Jupyter Notebook

The biggest difference between driving Sawppy and most other robotic platforms is the calculation behind operating the six-wheel-drive, four-wheel-steering chassis. Making tight turns in such a platform demands proper handling of Ackermann steering geometry calculations. While Sawppy’s original code (adapted from SGVHAK rover) was functional, I thought it was more complex than necessary.

So when I decided to rewrite Sawppy code for ROS Melodic (since abandoned) I also wanted to rework the math involved. I’ve done this a few times, most recently to make the calculations in C for an Arduino implementation of Sawppy control code, and it always starts with a sketch on paper so I can visualize the problem and keep critical components in mind.

Once satisfied with the layout on paper, I translated it into code. As typically happens, that code did not work properly on the first try. The test/debug/repeat loop is a lot more pleasant in Python than it was in C, so I was happy to work with the tools I knew. But I was convinced that if the iterative process were even faster, I could write even better code.

Thus I had my first real world use of a Jupyter notebook: my Sawppy Python Ackermann code. I could document my thinking in Markdown right alongside the code, and I could test ideas for simplification right in the notebook and see their results in numerical form.

But I’m not limited to numerical form: Jupyter notebooks can access a tremendous library of data visualization tools. It was quite overwhelming to wade through all of my options; I ended up using matplotlib’s quiver plot. It plots a 2D field of arrows, and I used arrow direction to represent steering angle and arrow length to represent rolling speed. This plot gave a quick visual confirmation that those numbers made sense.
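
For reference, here is a rough sketch of the kind of notebook cell described above, with made-up wheel positions and a single example turn. It is not the actual notebook code, just an illustration of driving a quiver plot from per-wheel steering angles and speeds.

import math
import matplotlib.pyplot as plt

# Hypothetical wheel positions relative to rover center, in meters (+x forward, +y left).
wheels = [
    (0.2, 0.15), (0.2, -0.15),    # front corners
    (0.0, 0.17), (0.0, -0.17),    # middle wheels
    (-0.2, 0.15), (-0.2, -0.15),  # rear corners
]
turn_center_y = 0.5  # turning about a point 0.5 m to the rover's left

xs, ys, us, vs = [], [], [], []
for x, y in wheels:
    angle = math.atan2(x, turn_center_y - y)   # steering angle for this wheel
    speed = math.hypot(x, turn_center_y - y)   # rolling speed, proportional to turn radius
    xs.append(x)
    ys.append(y)
    us.append(speed * math.cos(angle))         # arrow direction = steering angle
    vs.append(speed * math.sin(angle))         # arrow length = rolling speed

plt.quiver(xs, ys, us, vs, angles='xy')
plt.gca().set_aspect('equal')
plt.title('Per-wheel steering angle (direction) and rolling speed (length)')
plt.show()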

In the Jupyter notebook I could work freely without worrying about whether I was adhering properly to style guides. It made the iterative work faster, but that did mean spending time to rework the code to satisfy wemake style guides. The basic logic remains identical between the two implementations.

I think this calculation is better than what I had used on SGVHAK rover, but it feels like there’s still room for improvement. I don’t know exactly how to improve just yet, but when I have ideas, I know I can bring up the Jupyter notebook for some quick experiments.

Inviting wemake to Nitpick My Python Code Style

I’m very happy Rhys Mainwaring released a ROS Melodic software stack for their Curio rover, a sibling of my Sawppy rover. It looks good, so I’ve abandoned my own ROS Melodic project, but not before writing down some notes. Part 1 dealt with ROS itself, much of which Rhys covered nicely. This post about Python style is part 2, something I had hoped to do for the sake of my own learning and will revisit on my next Python project. (Which may or may not be a robotics project.)

The original motivation was to get more proficient at writing Python code that conforms to recommended best practices. It’s not something I can yet do instinctively, so every time I tackle a new Python project I have to keep PEP8 open in a browser window for reference. And the items not explicitly covered by PEP8 are probably covered by another style guide like Google’s Python style guide.

But the rules are useless without enforcement. While it’s perfectly acceptable for a personal project to stop with “looks good to me,” I wanted to practice going a step further with static code analysis tools called “linters.” For PEP8 rules, the canonical linter is Flake8, a Python source code analysis tool packaged with a set of default rules for enforcing PEP8. But as mentioned earlier, PEP8 doesn’t cover everything, so Flake8 can load additional plugins to enforce even more style rules. While browsing these packages, I was amused to find the wemake Python style guide, which calls itself “the strictest and most opinionated python linter ever.”

I installed the wemake packages and made the Python code in my abandoned ROS Melodic project wemake-compliant. While I can’t say I was thrilled by all of the rules (it did get quite tedious!) I can confirm it does result in very consistent code. I’m glad I gave it a try, but I’m still undecided whether I’m going to commit to wemake for future Python projects. No matter the final decision, I’ll definitely keep running at least plain Flake8.

But while consistent code structure is useful for ease of maintenance, during the initial prototyping and algorithm design it’s nice to have something with more flexibility and immediate feedback. And I’ve only just discovered Jupyter notebooks for that purpose.

Original Goals For Sawppy ROS Melodic Project

Since a member of the Sawppy builder community has stepped up to deliver a ROS Melodic software stack, I’ve decided to abandon my own project because continuing would mean duplicating a lot of effort for no good reason. I will write down some thoughts about the project before I leave it behind. It’s not exactly a decent burial, but it’ll be something to review if I ever want to revisit the topic.

Move to ROS Melodic

My previous ROS adventure was building the Phoebe Turtlebot project, which was based on ROS Kinetic. I wanted to move up to the latest long term support release, ROS Melodic, something Rhys has done as well in the Curio project.

Move to Python 3

I had also wanted to move all of my Python code to Python 3. ROS Kinetic was very much tied to Python 2, which reached end-of-life at the beginning of 2020. It was not possible to move the entire ROS community to Python 3 overnight, but a lot of work for this transition was done for ROS Melodic. Python 2 is still the official version for Melodic, but they encourage all Python modules to be tested against Python 3, and supposedly all of the core infrastructure has been made compatible with Python 3. Looking over the Curio project, I saw nothing offhand indicating a dependency on either Python version, so I’m cautiously optimistic it is Python 3 compatible.

Conform to ROS Project Structure

I originally thought I could create a Sawppy ROS subdirectory under Sawppy’s main Github repository, but decided to create a new repository for two reasons:

  1. The ROS build system Catkin imposes its own directory structure, and
  2. The existing name “Sawppy_Rover” does not conform to ROS package naming recommendations (https://www.ros.org/reps/rep-0144.html): names must be all lowercase to avoid ambiguity between case-sensitive and case-insensitive file systems.

Rhys’ Curio project addresses all of these concerns.

Conform to ROS Conventions

Another motivation for a rewrite of my Sawppy code was to change things to fit ROS conventions for axis orientation and units (a small conversion sketch follows this list):

  • Sawppy had been using +Y as forward, ROS uses +X as forward.
  • Sawppy had been using turn angle of positive degrees as clockwise, ROS uses right hand rule along +Z axis meaning counter-clockwise.
  • Math functions prefer to work in radians, but older code had been written in terms of degrees. Going with ROS convention of radians would skip a lot of unnecessary conversion math.
  • One potential source of confusion: “angular velocity” flips direction from “turn direction” when velocity is negative; old Sawppy code didn’t do that.
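
Here is a minimal sketch of what those sign and unit conversions boil down to; the function names are made up for illustration and are not from the actual Sawppy code.

import math

def turn_angle_to_ros(angle_degrees_clockwise):
    """Old convention: positive degrees, clockwise.
    ROS convention: positive radians, counter-clockwise (right hand rule about +Z)."""
    return -math.radians(angle_degrees_clockwise)

def chassis_angular_velocity(linear_velocity, turn_radius):
    """Angular velocity = v / r, so it naturally flips sign when driving in
    reverse, which the old 'turn direction' convention did not capture."""
    return linear_velocity / turn_radius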

Rhys’ Curio project appears to adhere to ROS conventions.

All of that looks great! Up next in this set of notes: my original intent to practice better Python coding style in my project.

Rhys Mainwaring’s ROS Melodic Software and Simulator for Curio

When I created Sawppy, my first goal was to deliver something that could be fun for robotics enthusiasts to play with. The target demographic was high school students and up, which meant creating a software stack that is self-contained and focused enough to be easy to learn and modify.

To cater to Sawppy builders with ambitions for more, one item on the future to-do list was to write the necessary modules to drive Sawppy via the open source Robot Operating System (ROS). It is a platform with far more capability, with access to modules created by robotics researchers, but it is not easy for robotics beginners to pick up. I’ve played with ROS on and off since then, never quite reaching the level of proficiency needed to make it happen.

So I was very excited to learn of Rhys Mainwaring’s Curio rover. Curio is a Sawppy sibling with largely the same body but running a completely different software stack built on ROS Melodic. Browsing the Curio code repository, I saw far more than just a set of nodes to run the physical rover; it includes two significant contributions towards a smarter rover.

Curio Rover in Simulation

There’s a common problem with intelligent robotics research today: evolving machine learning algorithms require many iterations, and it would take far too long to run them on physical robots. Even more so here because, true to their real-life counterparts, Sawppy and its siblings are slow. Rhys has taken Sawppy’s CAD data and translated the physical form and all joint kinematics into the Gazebo robot simulator used by ROS researchers. Now it is possible to work on intelligent rovers in the virtual world before adapting lessons to the real world.

Rover Odometry

One of the challenges I recognized (but didn’t know how to solve) was calculating rover wheel odometry. The LX-16A servos used on Sawppy can return wheel position, but only within an approximately 240 degree arc of the entire 360 degree circle. Outside of that range, the position data is noisy and unreliable.

Rhys has managed to overcome this problem with an encoder filter that learned to recognize when the servo position data is unreliable. This forms the basis of a system to calculate odometry that works well with existing hardware and can be even faster with an additional Arduino.

ROS Software Stack For Sawppy

Several people have asked me for ROS software for Sawppy, and I’m glad Rhys stepped up to the challenge and contributed this work back to the community. I encourage all the Sawppy builders who wanted ROS to look over Rhys’ work and contribute if it is within your skills to do so. As a ROS beginner myself, I will be alongside you, learning from this project and trying to run it on my own rover.

https://github.com/srmainwaring/curio

(Cross-posted to Sawppy’s Hackaday.io page)

Sparklecon 2020: Sawppy’s First Day

I brought Sawppy to Sparklecon VII because I was telling the story of Sawppy’s life so far. It’s also an environment where a lot of people would appreciate a miniature Mars rover running amongst them.

Sparklecon 2020 2 Sawppy near battlebot arena

Part of it was because a battlebot competition was held at Sparklecon, with many teams participating. I’m not entirely sure what the age range of participants was, because some of the youngest may just have been siblings dragged along for the ride and the adults may have been supervising parents. While Sawppy is not built for combat, some of the participants still had enough of a general interest in robotics to take a closer look at Sawppy.

Sparklecon 2020 3 Barb video hosting survey

The first talk I attended was Barb relaying her story of investigating video hosting. The beginning of 2020 ushered in some very disruptive changes in YouTube policies on how they treat “For Kids” videos. But as Barb explained, this is less about swear words in videos and more about Google tracking. Many YouTube content authors, including Barb, were unhappy with the changes, so she started looking elsewhere.

Sparklecon 2020 4 Sawppy talk

The next talk I was present for was my own, as I presented Sawppy’s story. Much of the new material in this edition was the addition of pictures and stories of rovers built by other people around the country and around the world. Plus we recorded a cool climbing capability demonstration:

Sparklecon 2020 5 Emily annoying things

Emily gave a version of the talk she gave at Supercon. Even though some of us were at Supercon, not all of us were able to make it to her talk. And she brought different visual aids this time around, so even people who were at the Supercon talk had new things to play with.

Sparklecon 2020 6 8 inch floppy drive

After we gave our talks, the weight was off our shoulders and we started exploring the rest of the con. During some conversation, Dual-D of NUCC dug up an old school eight inch floppy drive. Here I am failing to insert a 3.5″ floppy disk in that gargantuan device.

Sparklecon 2020 7 sand table above

Last year after Supercon I saw photographs of a sand table and was sad that I missed it. This year I made sure to scour all locations so I would find it if it was present. I found it in the display area of the Plasmatorium, drawing “SPARKLE CON” in the sand.

Sparklecon 2020 8 sand table below

Here’s the mechanism below – two stepper motors with belts control the works.

Sparklecon 2020 9 tesla coil winding on lathe

There are a full sized manual (not CNC) lathe and mill at the 23b shop, but I didn’t get to see them run last year. This year we got to see a Tesla coil winding being built on the lathe.

For last year’s Sparklecon Day 2 writeup, I took a picture of a rather disturbing Barbie doll head transplanted on top of a baseball trophy. And I hereby present this year’s disturbing transplant.

Sparklecon 2020 WTF

Sawppy has no idea what to do about this… thing.

Sawppy Servo Experiment: Standard Servo with Metal Horn

From birth my Sawppy has been running around with LX-16A servos made by LewanSoul (also seen sold under the Hiwonder brand *), but that is not an explicit requirement. From the outset I designed Sawppy to accommodate multiple different servo types, primarily the three I investigated. In theory any servo would work, as long as it physically fits within the available space and someone puts in the effort to design a servo-specific mounting bracket and output adapter.

In an exploration to lower the cost of rover building, today’s experiment is to validate my design goal of flexibility, putting theory into practice by adapting a standard RC servo. This servo was also interesting because it has a metal horn, which would replace the plastic servo horns that have been a common point of failure. This particular pairing of servo and horn came from the now-defunct Roboterra (as of writing, the link shows up as a Squarespace site whose subscription has expired), but the same concepts should apply to other servos and their horns.

Roboterra servo with metal horn

An adapter bracket was quickly whipped up to bolt to Sawppy. The bracket surrounds the entire perimeter of the servo, including the four empty mounting points. The servo is not mounted as RC servos usually are, because Sawppy was designed so the servo only needs to provide twisting force. It is free to slide along its axis of rotation, letting the 608 bearings built elsewhere into Sawppy handle the forces of a rover on the move.

Roboterra servo bracket and coupler

The metal horn on this servo is much larger in diameter than the one on the LX-16A, allowing a larger 3D-printed coupler. The metal horn was tapped to accept screws with M3 thread. The larger coupler, held with M3 machine screws, is far sturdier than the LX-16A solution of coarse threads cut by self-tapping screws.

This is a promising first step towards using commodity RC servos on a rover build. A large selection of RC servos is out there for every budget and power level, making them a tempting option for some rover builders. But there’s still work ahead: the wiring will get more complicated, and the electronics control system will need a revamp.

A Vortex (or Cyclone) Separator Appears

After each of the test cuts on our project CNC, I’ve used the shop vacuum to clean up the mess afterwards. However, this does not help with the mess during cutting, and the most important parts to protect are our machine’s ways and drive screws, which are vulnerable to debris. What we really need is some kind of collection system that we can run while the machine is cutting.

One problem with this plan is that vacuum filters quickly clog up when used in this manner. The standard solution is to separate the bulk of the debris from the airflow before it reaches the filter, extending the life of the filter by reducing the amount of debris it has to catch out of the air. Since this is a standard solution, many products are available for purchase. But being makers, our first thought was how we might make one for less money, and 3D printing seemed like the way to go. Since the device is mainly a hollow shell, in theory we could print one in plastic filament for less than the cost of buying one.

However, none of my 3D printers is well suited to printing a tall cylindrical object that exceeds their print volume. And if I split it across several pieces, I risk introducing gaps that could compromise the vacuum and disrupt the debris extraction airflow. This type of project is ideally suited to a tall delta-style 3D printer, so I started asking fellow makers of SGVHAK whether anyone had one of those printers.

One member did have such a printer, and asked what I wanted to print. When I described the project, he suggested that we skip the printing. Some time ago he purchased a vortex separator (*) for another project, and it is now available for this project CNC. I agree taking a manufactured unit is much easier than printing one! It is even a perfect fit with the nature of our project, which is mostly built from parts salvaged or recycled from earlier projects.

But the vortex separator is only the single core component; we’ll have to build the rest of the dust collection system.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Contamination Concern for CNC Ways And Drive Screw

Thinking over potential tasks to be tackled on our project CNC, we thought it might be occasionally useful to mill our own circuit boards. But before we start cutting bits of copper off fiberglass, we should make sure those flakes of copper won’t end up where they’ll do harm. One of the open questions involves how we should protect the ways and drive screws of our Parker motion control XY stage.

The XY stage was salvaged from an optical inspection machine, so it was not a surprise to see this mechanism has limited protection against contamination, as most items under optical inspection don’t shed debris. Hence, unlike real CNC mills, the ways here have no cover; on this machine they are exposed when an axis moves off center. Cursory inspection indicates the critical surfaces are those in the center facing the side, so what we see as top surfaces are not areas of direct contact. But it’s still better not to let any contaminants build up here, because of the next item:

The drive screws have a thin metal cover to protect against dust, but the cover opens towards the ways. When the table moves off center, there is a window for debris to fall from the exposed ways into the screw compartment and end up sticking to the lubricant coating the mechanism. In the picture above we can see through this opening. While the screw itself is dark and out of line of sight, we can see the colors of wires also living in that compartment. (They connect to three magnetic switches for the axis: a location/homing switch, and limit switches at either extreme.)

We realized this would be a problem once we started cutting into MDF and making a big mess. Powdered MDF may cause abrasion and should be kept out of the ways and screws if we can. Milling circuit boards would generate shredded copper. I’m not sure whether that would be considered abrasive, but it is definitely conductive and we should keep it away from machine internals as much as possible. A subtractive manufacturing machine like this one will always make big messes; how might we keep that under control?

Contemplating CNC Milling Circuit Boards

Another activity that we will be investigating, in addition to CNC engraving, is the potential of making our own circuit boards. Mechanically speaking, milling circuit boards is very similar to engraving. Both types of tasks stay within a very shallow range of Z and would suffer little impact from our wobbly Z-axis. Milling boards could involve larger tools than a pointy engraving bit, but they should still be relatively small and not drastically strain our limited gantry rigidity.

Experimentation will start with the cheapest option: blank circuit boards that have a layer of copper on one side (“single-sided copper clad”). This will be suitable for small projects with a few simple connections, the kind we had previously tackled with premade perforated board and some wires. For example, Sawppy’s handheld controller could easily have been a single-layer board. We would need to go to dual layers for more sophisticated projects like the Death Clock controller board, and the ambition for this line of investigation is for the machine to make a replacement control circuit board for itself.

We don’t yet know how feasible that will be. As the level of complexity increases, at some point it won’t be worth trying to make the board ourselves and we’re better off sending the job to a professional shop like OSH Park. The first few boards are expected to be amateur-hour disasters, hence we didn’t care very much about the quality of the initial batch of test boards; they were purchased from that day’s lowest bidder and second lowest bidder on Amazon. (*)

But even though circuit board milling is mechanically similar to engraving, the software side is an entirely different beast that will need some ramp-up time. And before we start cutting metal in the form of a thin layer of copper, we need to pay some attention to the machine’s needs.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Comparing CNC Engraving Tool To Milling Tool

The decision to explore CNC engraving was made so we could learn machine tool operation while sidestepping the weaknesses currently present in our project CNC machine. Projects staying within a single Z depth suffer minimally from the Z-axis wobble imparted by our bent Z-axis ballscrew. But engraving also reduces the impact of our lack of rigidity, thanks to differences in the cutting tools.

CNC with cutter 80mm past motor bearing

Here’s the 1/4″ diameter endmill as it was installed in our CNC spindle. In the pursuit of rigidity I wanted the largest diameter we could put in an ER11 collet, not realizing the larger diameter also meant longer length. I bought this one solely because it was available quickly, but a more detailed search found no shorter cutters. The end of this particular cutting tool extends roughly 80mm beyond the spindle motor bearing.

In comparison, the engraving tool has a 1/8″ diameter. Judging just by diameter, the 1/8″ tool would be weaker, but that overlooks the fact that it is also shorter, with its tip extending only about 55mm beyond the spindle motor bearing. So not only does the engraving bit remove less material and place less stress on the spindle, it also has a roughly 30% shorter lever arm to twist the Z-axis assembly about.

Now I understand why such simple, inexpensive endmills and small diameter tools are a common part of modest desktop CNC machines. (*) The load imparted by such a Z-axis assembly is very modest, making it possible to have machines that are barely more rigid than a 3D printer. (And in some cases, not even as rigid.) While our Parker XY table is far more capable than the XY stage in these machines, our Z-axis isn’t much better (yet), so we’ll stay in a similar arena of low lateral load and material removal.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Arduino Mozzi Wilhelm Exercise

After completing a quick Mozzi exercise, I found a few problems that I wanted to fix in round 2.

The first problem was the use of an audio clip from Portal 2. While Valve Software is unlikely to pursue legal action against a hobbyist exercise project for using one short sound from their game, they indisputably own the rights to that sound. If I want a short code exercise to serve as any kind of example I can point people to, I should avoid using copyrighted work. Hunting around for a sound that would be recognizably popular but less encumbered by copyright restrictions, I settled on the famous stock sound effect known as the Wilhelm scream. Information about this effect, as well as the sound clip itself, is found all over, making it a much better candidate.

The second problem was audible noise even when not playing sampled audio. Reading through the Mozzi sound code and its under-the-hood diagram, I don’t understand why this noise is coming through. I explicitly wrote code to emit zero when there’s no sound, which I thought meant silence, but something else is happening that I don’t understand yet.

As a workaround, I will call stopMozzi() when playback ends, and call startMozzi() when the button is pressed. The upside is that the noise between playbacks disappears; the downside is that I now have two very loud pops, one each at the start and end of playback. If connected to a powerful audio amplifier, this sudden impulse can destroy speakers. But I’ll be using it with a small battery-powered amplifier chip, so the destruction might not be as immediate. I would prefer to have neither the noise nor the pops, but until I figure out how, I have to choose one of them. The decision today is for quick pops rather than ongoing noise.

This improved Arduino Mozzi exercise is publicly available on Github.

CNC Exploration Via Flat Cutting Projects

We got far enough on the project CNC mill (built out of mostly salvaged parts) to make test cuts and evaluate the results. I honestly didn’t think we would get this far. Back when I first plugged in the salvaged Parker motion control XY table, I had only a vague idea of where I might go with it, knowing only that I would be learning a lot as I went. Now here’s a machine capable of making a decent effort at executing G-code programs generated from Autodesk Fusion 360.

There was never a real solid goal for this project, no “North Star” to guide the direction nor a finish line to mark completion. I think I can now articulate the underlying goal for this project: to learn as much as I can about the world of automated machine tools with the smallest possible budget. This is why I didn’t worry overly much about imperfections like a bent Z-axis ballscrew or a Z-axis gantry lacking in rigidity: they were good enough to move forward and learn lessons.

At this point the Parker XY table, our old industrial equipment at the heart of everything, has proven to be a solid core. In contrast, our problematic Z-axis has proven to be the weak point. We could fix those problems, but solutions all cost money. So before I pull out the credit card again, a question: are there things we can learn with excellent XY axes but a lackluster Z?

The answer is yes: there exist CNC projects with exacting requirements in the XY axes but much less demanding ones for Z. We’ve briefly toyed with one category: pen plotters. For a pen holder, it only matters that the pen is put on paper at the appropriate time and lifted otherwise. Factors like precisely square vertical alignment are not important.

Since we’ve already had some fun with pen plotting, I decided to start exploring the next step up in difficulty: CNC engraving. We will be using a cutting tool in our spindle to remove a minimal amount of material. So while the Z-axis demands are similar to a pen plotter’s, engraving requires a little more rigidity and precision. All the same toolpath generation tasks apply, so as a Hello World to CNC engraving, I engraved “SGVHAK” into the previously prepared surface. With this success, we can look at other projects we can use to learn CNC tasks with the flawed machine we have.

Evaluating Results Of Cutting Tests On Our CNC Project

Our project CNC, pieced together from stuff around the shop, has performed several very informative test cuts. Several items we suspected might be issues have been proven as such. Our Z-axis was indeed unreliable in its vertical alignment due to a bent ballscrew. Beyond the ballscrew, the entire Z-axis gantry assembly doesn’t have the rigidity to avoid tool chatter when pushing a quarter inch diameter endmill through MDF. The Z-axis rollers prone to loosening were only the weakest link in this chain; we’re confident there are additional problems lying in wait.

On the upside, some items we worried about have not become limiting factors. Using an inexpensive ESP32 for stepper motor control timing was a question mark. We knew the real-time guarantees of a shared core were not going to be as precise as a dedicated real-time processor like the PRU of a BeagleBone, but we didn’t know whether it would be good enough. And finally, we didn’t know if the salvaged Parker motion control XY stage at the heart of this project had hidden problems that could have sunk the project. We think it might have been retired due to an electrical problem we fixed, but it might have been retired due to some other problem we couldn’t fix. Given the consistency we saw between runs, it looks like an ESP32 running Grbl is a fine match for the decades-old (but still precise) Parker table.

We’ve learned a lot of lessons in the software realm as well, from configuring Grbl, to switching our G-code sender to bCNC, to the CAM parameters of Fusion 360. It feels like there is tons more to learn on the software side of CNC projects, so that’s where the focus will remain for the near future. It’d be wonderful to have a rigid and dependably vertical axis capable of swinging large tools, but even without one, there’s lots to learn using what we’ve put together to date. The next area of exploration will be CNC engraving.

Arduino Mozzi Space Core Exercise

Temporarily stymied in my Teensy adventures, I dropped back to Arduino for a Mozzi exercise. I’ve helped Emily through bits and pieces of putting sampled audio on an Arduino using Mozzi, but I had yet to run through the whole process myself.

The hardware side was kept as simple as possible: there’s only a single switch, wired to be normally open and momentarily closed. For audio output, I used a cable salvaged from the Project MC2 Pixel Purse (*) that was briefly a hacker darling due to its clearance-sale price. (As of this writing, the price is up to $39.58, far above the $6-$8 price that led to its “buy it to take it apart” fame.) Since this cable was designed to be plugged into a cell phone, it had a TRRS plug that I rewired into an imitation of a monophonic TS plug by using the T wire and connecting R, R, and S together as the other wire.

With the hardware sorted out, I dove into the software tasks. There were more than a few annoyances that made this task not very beginner-friendly. The Huffman audio compression utility audio2huff.py had dependencies listed only in a comment block easily overlooked by beginners. They didn’t all install the same way (purehuff required a download and install, while the others were installed via pip). And since they are a few years old and not actively maintained, they were all written for Python 2.

This will become more and more of a problem as we go, since Python 2 support officially ended at the start of 2020. Older Linux distributions launch Python 2 when the user runs python at the command line, and those who want to use Python 3 need to run python3. This is getting flipped around: some environments now launch Python 3 when the user runs python, and those who need the old version have to run python2.

What happens when we run these utilities under Python 3? It’d be nice if the error message were “Hey, this needs Python 2,” but that’s not the reality. It is more likely to be something like “foobar is not iterable,” which is completely bewildering to beginners.

To be fair, none of these problems are specific to Mozzi; there’s a large body of work online that was written for Python 2 and is waiting for someone motivated enough to bring it up to date.

Anyway, after those problems were sorted out, I got my Arduino to play a sound when the button is pressed. My test sound clip was from the game Portal 2: the ecstatic “SPAAACE!” emitted by the corrupted Space Core when it finally got its wish to go flying through space.

This Arduino Mozzi exercise is publicly available on Github.

[UPDATE]: “Wilhelm” exercise (round 2) has some improvements.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases. But you shouldn’t buy a Pixel Purse until it drops back down to clearance prices.

First Experiment in Teensy Audio Foiled By CPU Instruction Set

The original line of Arduino boards is known for a lot of very flexible features that made them the champion of entry into physical computing. “Large storage space” is not one of those features, as the ATmega328P has only 32KB of flash. That is plenty for blinking little LEDs and a great many other projects, but severely limiting for anything that requires a nontrivial amount of data.

One such class of projects is playing back digital audio samples. Simple sounds like tones, beeps, and bloops take little data, but recorded digital audio consumes a great many bytes. I was introduced to the Mozzi Arduino sound synthesis library via exposure to Emily’s projects. While the emphasis is on synthesis, it does have provision to handle sampled audio. However, due to the small flash storage, even with compression it could only handle short sounds.

Looking around for an upgrade, I remembered the Teensy LC OSH Park edition I had purchased alongside an OSH Park order years ago, and thought it was worth an audio playback experiment. The CPU is much faster and it has almost double the flash storage. Exploring the Arduino-compatibility layer called Teensyduino, I saw it offers a full analog audio API, complete with a graphical tool to help compose the tree of audio input/processing/output nodes. I was very impressed!

I decided to start small with just two nodes: sound is generated by playing sampled audio data from memory, and sent to the built-in DAC. Pressing “Export” generated the following boilerplate code:

#include <Audio.h>
#include <Wire.h>
#include <SPI.h>
#include <SD.h>
#include <SerialFlash.h>

// GUItool: begin automatically generated code
AudioPlayMemory playMem1; //xy=262,658
AudioOutputAnalog dac1; //xy=436,657
AudioConnection patchCord1(playMem1, dac1);
// GUItool: end automatically generated code

void setup() {
  // put your setup code here, to run once:

}

void loop() {
  // put your main code here, to run repeatedly:

}

Sadly, this code did not compile. Instead, I encountered this error:

/tmp/ccLBGUGU.s: Assembler messages:
/tmp/ccLBGUGU.s:291: Error: selected processor does not support `smull r0,ip,r3,r5' in Thumb mode
/tmp/ccLBGUGU.s:292: Error: shifts in CMP/MOV instructions are only supported in unified syntax -- `mov ip,ip,asl r6'
/tmp/ccLBGUGU.s:293: Error: unshifted register required -- `orr r0,ip,r0,lsr r7'

A web search on this error message indicated the chip on the Teensy LC does not support the instructions described in the error message; I’d need a Teensy 3 or 4. To test this hypothesis, I changed the compile target to Teensy 3 instead of LC and hit “Verify” again. This time, it was successful. But I don’t have a Teensy 3 on hand, so I can’t actually run that code.

Given this limitation I’ll have to find some other project for my Teensy LC. In the meantime, I’ll drop back to the tried and true Arduino for my Mozzi exercise.