Dustbuster Battery: Next Steps

The battery of my old Black & Decker Dustbuster BDH9600CHV is too weak to do its job. Now that we’ve found enough evidence to suggest the battery was degraded by constant over-charging, there are a few options forward: repair, replace, or upgrade.

Cells Removed

Repair

The first option is to try to repair the existing battery cells. Information on Wikipedia indicates the cells could be restored by performing several deep discharge+charge cycles. This must be done on an individual-cell basis, because deep-discharging a whole pack risks irreparable damage to the weaker cells from cell reversal. It would be time-consuming even if the equipment to automate the process were on hand.

Replace

The second option is to purchase new Ni-Cad battery cells and replace the degraded ones in the Dustbuster. Ni-Cad cells are cheap, but assembling them into packs is usually done with the help of a small spot welder. In theory the battery tabs could be attached with a soldering iron, but it’s very difficult to solder a battery cell because the metal can acts as a heat sink, drawing heat away from the solder joint. If too much heat reaches the cell chemistry inside, the battery will be damaged.


Whether repaired or replaced, the existing power adapter that constantly overcharges the battery pack will need to be retired. What we would need is a good charger with a controller that knows how to properly charge a Ni-Cad battery. Without one, we’d quickly return to the same predicament: a battery pack ruined by overcharging.

Usually, when working on a project, it’s fun to buy new tools necessary for the job. Our candidates are:

  • Individual Ni-Cad cell deep discharge+charge cycler.
  • Ni-Cad battery tab spot welder.
  • Ni-Cad aware smart battery charger.

Unfortunately, they all deal with Ni-Cad battery cells, a chemistry that has faded to niche applications and is not expected to appear in future projects. The present (and foreseeable future) solution for portable power is lithium-ion chemistry, and that’s the motivation for the next option:

Upgrade

Since the battery charger would need to be replaced anyway, that removes the main motivation to stay with an electrically compatible chemistry. Freed from that constraint, the most interesting path forward is to find a way to power this old Dustbuster with an entirely different type of battery.

Let’s bring this Dustbuster into the 21st century with a lithium battery upgrade.

Investigating Dustbuster Battery Degradation

When embarking on a project to repair something, it’s always helpful to understand and articulate what went wrong so we have confidence we’re fixing the right thing. The starting point for this project is this old Black & Decker BDH9600CHV Dustbuster struggling to do its job: it couldn’t spin its motor fast enough to generate the vacuum needed to pick up household debris. The most obvious suspect is the battery pack, so let’s examine the batteries.

Dustbuster screws.jpg

Five small screws held the two halves of the vacuum enclosure together. Once the screws were removed, the two halves separated easily without any additional glue or plastic clips to worry about. The internals were as expected – a battery pack, hooked up to a switch, and wired to the motor driving the vacuum vanes.

The battery pack is built from eight nickel-cadmium (Ni-Cad) battery cells: six arrayed around the motor, and two more tucked in the handle. The first hypothesis is that some of the cells have died. The cell voltage levels were probed as the motor spun, looking for any cell that had sunk to zero volts or possibly into cell reversal. All eight cells delivered under 1 volt but well above zero, disproving the initial “dead cell” hypothesis.

The next hypothesis is battery memory effect. Technically the term applies to a very specific issue with Ni-Cad batteries, but in popular use it has become an umbrella for several different conditions that afflict them.

The most promising item under the umbrella was “voltage depression due to long-term over-charging”. The voltage had already been verified to be low but not zero. There should be a charging control circuit to prevent over-charging; perhaps that had failed? A search came up empty: there didn’t seem to be a charge controller at all. The batteries appeared to be connected directly to the output of the charging stand’s AC power adapter.

The nominal voltage of this battery pack is 8 * 1.2V = 9.6V. The maximum output of this Dustbuster’s AC to DC adapter? 24V. Ouch! That’s significantly over nominal and the battery pack has been held at that level for years.

These circumstances imply this battery pack has indeed suffered long-term over-charging, which explains why it now delivers depressed voltage levels.

24V DC

New Project: Handheld Vacuum Upgrade

Taking a break from reviving old computers, the next project is to revive a small household appliance. The subject of the upgrade is a handheld vacuum. Specifically a Black & Decker BDH9600CHV, a member of the “Dustbuster” line whose success defined a whole new product category.

Dustbuster

A major factor in that success is ease of use. Whenever there’s a cleanup task, it’s easy to pull the vacuum off its charging stand and clean up the mess. No need to pull a big heavy vacuum out of the closet, no need to look for the nearest plug. A small cordless handheld vacuum is very convenient, and people are willing to pay for that convenience.

The basic design of a Dustbuster is straightforward: a battery pack hooked up to a motor controlled by a switch. As a result, most of the vacuum is durable and reliable thanks to its simplicity – with the notable exception of the battery pack. The battery pack is what makes the cordless vacuum possible and easy to use, but it is also the weakest link.

This particular Dustbuster had been sitting in its standby charging base, and its battery capacity gradually dwindled over the past few years. Now the battery pack, even when freshly charged on its stand, could only offer a little bit of power before the motor slowed down and couldn’t generate enough vacuum to be useful.

In today’s disposable society, it’s easy to just throw away such a simple and inexpensive appliance and buy a new one. But where’s the fun in that? Since the rest of the vacuum seems to be OK, the goal of the new project is to give this vacuum new life by some combination of repairing, restoring, and/or revamping the battery pack.

Let’s open it up and see what we find…

Time-of-Use (TOU) Electric Bill: Good Concept, Poor Execution.

Household electric power is a great convenience of modern living and a triumph of engineering. Most people living in first-world countries are so used to having power whenever they want it that they only think about the work behind the scenes when there’s a power failure. Most of the electricity generated is immediately consumed, so when consumption changes throughout the day, generation has to be kept in sync. This balancing act goes on at all hours of the day, every day, and most of us never have to think about it.

One detail of this balancing act is the increasing cost of power as demand rises. Obviously the power company would like to keep costs as low as possible, so the power plants that are cheapest to run (base load power stations) run constantly. As demand outstrips the capacity of the base load stations, the utility turns to increasingly expensive means of generation. These peaking power plants are usually cheaper to build but more expensive to run, so they are only used to handle peaks in demand.

Residential electric bills are typically insulated from this variation. Standard home electric meters simply record a running tally, so the homeowner is billed on the total amount of electricity consumed. With the introduction of more sophisticated meters, it becomes feasible to track energy usage in more detail, which makes it possible to bill based on time of use (TOU).

TOU rates correlate cost of consumption to cost of generation. This gives the consumer a financial incentive to be more energy-efficient. If enough energy usage is shifted around, the consumer can save money. Over a year ago, the local electric utility sent out an invitation to join a TOU pilot study. It would be an interesting experience to see that theory put into practice so the invitation was accepted.

The TOU rates were listed on the bill, and the peak-hour rates were indeed appropriately expensive, roughly triple the non-TOU rate. But the off-peak rates were tremendously disappointing: only a 20% discount off the non-TOU rate. Tripling the on-peak rate while discounting off-peak by a mere 20% means it’ll be difficult to actually save money overall under this plan.

But the study was on, and it was time to put in the best effort. The most significant changes came from running the laundry machines and the dishwasher during off-peak hours as much as possible. On some days this was a severe inconvenience. The effort continued, but the consumer was not always happy about it.

At the end of the one-year study, they mailed out a TOU cost summary. They took the year’s electric use and computed it two ways: Once through the TOU pilot study rate, and again on the non-TOU rate.

The reward for ecological awareness? The windfall for severe inconvenience?

SCE TOU Unimpressed

$1.93 over the entire year.

From the perspective of encouraging people to save, this was a complete failure. The utility needs to discount the off-peak rate much more significantly than they did during this study before people would see enough savings to be worth the inconvenience. The kind of time and effort expended during this year was not remotely worth saving $1.93.

Mazda Vision Coupe: Design Highlights

Different people go to an auto show to see different things. My personal target for the LA Auto Show was a concept car in the Mazda pavilion: the Vision Coupe. Mazda unveiled it at the Tokyo Auto Show a little over a month ago and it has been pretty well received. When I found out Mazda would bring it to Los Angeles I had to go see it for myself.

Front three quarter

A significant aspect of the design is the evolution away from creases in the sheet metal. About ten years ago the Mazda Nagare concept car illustrated the use of creases, and the idea spread through the Mazda line. I thought the show car was novel, but I am not a fan of the translation into production cars. While some of the creases do tell a story flowing from one design element to another, too many of them feel forced. They form from seemingly nowhere and fade out to nothing, contributing little to (or distracting from) the story told by the overall shape of the car.

With the Vision Coupe (and the RX-Vision before it), Mazda design declared sharp creases are played out. Moving on to smooth sculpted surfaces carries a risk – they do not show up in photographs as well as creases do, so Mazda risks losing sales among people who car-shop by looking at pictures. Photos miss the full impact of a design that can only be appreciated by seeing how light reflects and moves around the body. I look forward to seeing how these ideas translate to Mazda showrooms.

Headlight

Another idea I want to see translated to production is the lights. We’ve had big round headlights for most of the history of the automobile. With the introduction of compact LEDs bright enough to serve as headlights, designers gained new flexibility and have explored different styles of LED illumination. Some designs weren’t very bold and laid out the LEDs in a straightforward grid. Some tried to spread them in a creative pattern, but an array of LEDs can easily evoke arthropod eyes, which can be unsettling. Some of those designs have been quite polarizing.

The solution presented on the Vision Coupe concept is to arrange the LEDs into a circle so our anthropomorphizing human brains can see an eye. But the design isn’t limited to a circle – it plays with a line of light that carries through the eye (but doesn’t cut into the ‘eyeball’) and with the shadows cast by the sculpted surround that evokes eyelids. Futuristic yet familiar. I like this design, though I’m not sure it’ll survive translation to production. There are a lot of legal requirements on headlights that are difficult to satisfy and so are usually ignored on show cars.

Taillight

On the opposite end, the taillights use a similar theme of a line of light through a circle. But now, rendered in red, the circles look like rocket engine exhausts instead of eyeballs. There are far fewer legal requirements around taillights so I hope this translates intact to some future showroom Mazda.

Sharp nose

The final detail that really attracted me is the staggered levels of the nose, led by the hood that ends in a sharp beak. Sleek and full of personality, it sadly has no chance of surviving translation to production. Real production cars need front bumpers, license plate holders, and are not allowed to cut pedestrian legs off at the knee.

But it does look awesome.

Technology for Promotion at the Los Angeles Auto Show

The 2017 Los Angeles Auto Show is underway this week. The cars are the stars, but you can read about them elsewhere. Instead, here is some of the interesting technology installed on the show floor.


There were two venues that featured the Microsoft HoloLens. I had been interested in this augmented reality headset and was happy for the chance to try one myself.

Hololens 1 - Nissan

Nissan chose to use HoloLens to showcase their driving assist technologies. Up to six people (three front, three rear) can put on a HoloLens and look at the little toy Nissan on the table. Each wearer sees an interactive environment projected around the toy car to illustrate how various features assist the driver. It’s possible to walk around a bit to check different perspectives, but movement was limited because the HoloLens units were tethered to the table.

I felt this presentation underutilized HoloLens. It didn’t feel significantly superior to what you can accomplish with a cell phone, Pokemon-Go style.

Hololens 2 - Petersen

The other HoloLens exhibit was actually an exhibit from the Petersen Automotive Museum brought to the auto show floor. The program is not interactive, but the wearer can walk around and check out different perspectives as the narrated presentation proceeds.

The best part was when they started illustrating airflow over and through the insides of the physical 2017 Ford GT in front of us. It’s quite informative to be able to move your head around to get a better feel for where the airflow is going, especially in the X-ray view of airflow through and under the car.

This was a much better demonstration of what’s possible with the superior precision and response rate of HoloLens tracking.


Several booth displays had some sort of virtual reality equipment. It is interesting that none of them were used to showcase any kind of driving. Just the opposite – most of them were there to showcase autonomous vehicle technology, a.k.a. the lack of driving.

VR 1 - Ford motion couch

Ford brought this motion-controlled couch with four seats, each of which can seat somebody with a Google Cardboard-style headset to experience riding in a Ford autonomous vehicle.

VR 2 - Volvo

Volvo brought in four Vive headsets to illustrate their safety technologies, much as Nissan did with Hololens. One random technical point of interest: I only found a single location beacon in the installation. Vive usually needs two beacons. I wonder where the other beacon was or if they’ve managed to do without the second.

VR 3 - Infiniti

Infiniti’s VR experience takes the guest on a virtual ride in the QX50. The most novel part of the program was the beginning, where parts of the SUV flew through space and assembled themselves around the viewer into a QX50.

VR 4 - VW

In contrast to the compact seating of the Volvo booth, VW put up this huge glass ring to give their I.D. Crozz VR ride plenty of elbow room. The guest seemed to stay seated through the whole experience so it’s not clear why this amount of room was necessary.

VR 5 - Nissan

I’m sure Nissan paid a lot of money for their Star Wars license for car promotion. And they were not afraid to use it! Liberally customized Nissans modeled after various Star Wars properties were on display. Their pavilion included this “Droid Repair Bay” VR activity. It looked so cool I almost didn’t wonder what it had to do with cars.


None of the AR or VR experiences featured any actual driving. For that, there were plenty of old-fashioned driving simulators on display.

Driving Sim 1 - Forza

Forza Motorsport was here to promote… itself! I have Forza at home, so I didn’t bother to spend time playing it here.

Driving Sim 2 - Hyundai

Hyundai Racing had a four-seat configuration. What caught my eye is that they’re using Forza for the driving experience but the race car is sponsored by Gran Turismo. (See banner on top of the windshield.) Hmm…

Driving Sim 3 - Demon

Not all the driving sims were about the race track; the simulator set up in the Dodge pavilion lets people try their hands at drag racing. A fitting way to promote their drag strip focused Dodge Demon.

Driving Sim 4 - Ford

Ford brought in a full-motion driving simulator to promote the off-road focused F-150 Raptor. The hydraulic cylinders simulate the rough and tumble of racing head-to-head (back-to-back?) through a dirt track.


Every good marketing team has worked to think up ways to build customer connections through social media. There were plenty of photo booths for people to post company-sponsored images to their social media accounts. Two stood out for their novelty.

Selfie 1 - Honda

Honda’s “Dream Machine” is a selfie cam mounted in a little pivoting pod at floor level. After the person takes the picture, they press a trigger and a smoke ring puffs out of the pod towards the big screen, “sending” the picture to be displayed on the big screen.

Selfie 2 - Toyota

Toyota brought an array of cameras that all take a picture at the same time, so the guest receives an animated GIF of themselves in Matrix-style “bullet time”.

Acer Aspire Switch Runs Windows 10 (Fall Creator’s Update)

After Secure Boot discouraged me from putting a Linux variant on the recently revived Acer SW5-012 (Aspire Switch 10) convertible laptop, I tried to replace the existing Windows 8 installation (locked with passwords I don’t have) with the latest Windows 10.

The first thing to check was the BIOS, to verify the CPU is not a member of the ill-fated Intel Clover Trail series, whose support was dropped. Fortunately, the machine uses a newer CPU, so I could try installing Windows 10 Fall Creator’s Update. I had an installation USB flash drive built with Microsoft’s Media Creation Tool.

I needed a USB OTG cable to start the installation. Once in progress, I deleted the existing Windows 8 system partition (~20 GB) and the recovery image partition (~7 GB), leaving the remaining two system partitions intact before proceeding.

When Windows 10 initially came up, there were significant problems with hardware support. The touchscreen didn’t work, there was no sound, and the machine was ignorant of its own battery charge level. Fortunately all of these hardware issues were resolved by downloading and running the “Platform Drivers Installer” from Acer’s support site.

Acer Win10

After the driver situation was sorted out, I started poking around elsewhere on the system and found a happy surprise regarding Windows licensing. Since I couldn’t get into the Windows 8 installation, I couldn’t perform a Windows upgrade, and because I wiped the system, I thought I had lost the Windows license on this machine. But I was wrong! I don’t know exactly what happened, but when I went to look at the computer’s information, it claims “Windows is Activated.”

The sticker on the bottom of the machine says it came with Windows 8 Pro. The new Windows 10 installation activated itself as Windows 10 Home. It is technically a step down from Pro to Home but I am not going to complain at the unexpectedly functional Windows license.

The machine exceeded my expectations, handily outperforming my other computers with Intel Atom processors. I think the key is its 2GB of RAM, double the 1GB of the other Atom machines. The machine is surprisingly usable relative to its Atom peers.

Some credit is due to Acer for building a low-end computer in 2014 that is still capable of running the software of 2017 (almost 2018).

Acer Aspire Switch is Linux Unfriendly

Now that the hardware of an Acer SW5-012 (Aspire Switch 10) is back up and running, the focus turns to software. Windows 8 is installed but locked with passwords I don’t have. I didn’t care much for Windows 8 anyway, and whatever data exists is not mine to recover. So – a clean wipe is in order.

As with the Latitude X1, my first thought was to turn this little old machine into an almost-Chromebook with Neverware CloudReady. And just like with the Latitude X1, the attempt was foiled. The Latitude X1 was too old and did not support some processor features required by CloudReady. The Acer problem is just the opposite – the hardware is too new and deliberately blocks the installation.

The blocking mechanism is Secure Boot, which according to its own web site is a “security standard developed by members of the PC industry to help make sure that a device boots using only software that is trusted by the Original Equipment Manufacturer.” I would describe it with different terms. Either way, trying to install CloudReady – or a Linux distribution – results in the error screen “Secure Boot Error”.

Intentional or not, this puts the Acer in a bad state: stuck neither fully on nor off, with the screen dark but the machine burning battery power and making itself warm. I had to disassemble the computer again and disconnect the battery from the main circuit board in order to reboot the machine.

In theory Secure Boot can be disabled, but reports from other people on the internet indicated this isn’t straightforward. I certainly had no better luck when I tried: I could see the menu option, and I could change it from black on white (disabled) to white on gray (enabled) by creating an admin password, but I couldn’t figure out how to actually change the Secure Boot mode out of “Standard”.

Acer Secure Boot Menu

And it might not even be worth the effort, as forum traffic indicates very poor Linux driver support for this class of hardware. That is probably related to the Secure Boot barrier, but either way I’m giving up. I’ll stay with Windows on this machine.

No AC Adapter, No Problem! Alternate Power Source for an Acer Aspire Switch.

Once I was done gawking at the clever magnetic attachment mechanism of the Acer SW5-012, it was time to get back to trying to get it to run. The machine was able to power up on its remaining battery charge for a little while, but now it needs more juice. Since I was given this computer in nonfunctional “as-is” state, the AC power adapter was not part of the package.

Disinclined to spend any money on this machine, but willing to spend time, I went online to look for information about the AC adapter. Unfortunately there appear to have been several similar but different computers sold under the “Acer Aspire Switch 10” name. And while it’s unclear whether all of them use the same AC power adapter, the adapters were consistently described as units that output 12V DC.

This is great news, as I have many ways to deliver 12V DC among my collection of tools and parts. But I had no plug on hand that fit the existing power socket. I examined the power connector to the motherboard and saw four wires. A continuity check confirmed it’s a simple arrangement of positive and ground terminals, with a pair of wires electrically connected for each. None of the wires carried anything other than power, so I didn’t have to worry about the data handshaking signals involved in charging certain other laptops.

Armed with this information, I removed the existing 12V power socket and the associated bracket. I cut the wire connecting the socket to the motherboard and soldered a JST RCY connector in its place.

Acer JST RCY adaptation

This type of connector is popular with remote-control aircraft and frequently used to carry roughly 12 volts (3-cell lithium rechargeable battery) at up to 3 amps. I reassembled the tablet, connected a 12V power source, and was reassured by illumination of the charging activity light. After a few hours, the tablet was charged up and ready to go again. Success!

 

Functional Simplicity of the “Acer Smart Hinge”

Yesterday’s post was about bringing an Acer SW5-012 back to life, which was fortunately as easy as reseating a ribbon cable. One of the reasons I was so eager to crack that thing open was my fascination with its hinge attachment mechanism. This was one of the “convertible” machines launched in the Windows 8 era, and evolution of the category continues to this day with computers like the Microsoft Surface Book.

The hinge attachment/release mechanism for the Surface Book featured precisely machined components and electronics to control a wire of memory alloy. This Acer is a much cheaper machine so its nifty connector must also be simpler. Before I pried it open, I mentally tried to figure out how I would design such a mechanism.

At the time I thought the battery was flat, so I excluded any electronics from my imagined design. It had to work without power, which made me think of magnets: a few small magnets to detect when the base is close to the screen, pulling against some spring-loaded arms to hold the thing together. Pulling on the screen would overcome the springs and release the arms.

Once I popped off the back cover of the computer, I could check my design against the answer and… well, I got the magnets part right even though it was based on a false premise (the battery was not flat like I thought.) And all the spring-loaded arms and clips and levers? Unnecessary complexity. I knew it had to be simpler than the Surface Book mechanism, but it was far simpler than what I imagined.

The actual mechanism consisted of magnets and… that’s it. Just some very cleverly placed magnets. When the screen is installed on the base, the magnets attract like we expect them to do, holding things together.

Acer Hinge Engaged

So what happens when we lift the screen away from the base? What’s causing that mechanical “click” sound?

When the screen is lifted away from the base, the magnets in the screen are pulled away from the magnets in the base. Lacking that strong attraction, the screen magnets settle for the next best thing: a few metal plates slightly recessed into the cavity. The “click” is the magnet snapping from the no-longer-there base magnet to the metal plate. When the magnets are attached to this inner metal plate, they sit a few millimeters away from the edge of the unit – far enough to keep them from picking up errant metal bits (paperclips, staples, etc.) while the screen is in tablet mode.

Acer Hinge Released

When the screen is reinstalled on the base, the screen magnets leave the metal plate in favor of the magnets in the base, making another “click”.

The Acer manual calls it the “Acer Smart Hinge” and I agree it’s very smart – on the part of the people who designed it. Its simplicity lends itself to lower manufacturing cost and also to reliability – no springs to break, no latch to wear out.

I am impressed.

Acer Aspire Disabled By Loose Cable.

I recently received an old Acer Aspire Switch 10 computer that no longer ran: there was no response when pushing the power button. The most obvious hypothesis is that the batteries are flat and need to be charged. Unfortunately, my gift of the computer did not include its matching AC power adapter.

If I was confident that was the only issue, I would go out and buy a power adapter. But I didn’t know if there were more serious problems in this machine and didn’t want to throw money at an unknown quantity. Besides, I received this computer on the premise that I wanted to take it apart for fun, so that’s exactly what I’m going to do.

Putting its serial number into Acer’s support site told me the model number (SW5-012) and part number (NT.L4TAA.018), but no service manual. I’m spoiled by Dell who usually releases a service manual detailing how to take apart and service a computer. Apparently Acer does not follow the practice.

There were no obvious external fasteners to loosen, so I started prying at the visible seams to see if I could release the plastic clips. Once I had three loose, the remaining clips (roughly 25 in all) popped off easily in sequence.

My target was the battery module which I planned to remove and charge directly. Removing the battery required removing several pieces of tape. Some of these pieces of tape were applied over connectors, presumably to help the cables stay in place. One of these cables traversed the length of the battery so I had to remove the tape and the cable to free the battery. After I carefully peeled off the tape, I reached out to disconnect the cable and… it fell off freely.

Hmm, that wasn’t supposed to happen.

This cable connects the motherboard on one side of the machine to a small circuit board on the other side. The small circuit board hosts the Windows button, the volume up/down buttons, the headphone jack, and… the power button. If this cable was disconnected, it would explain why pushing the power button had no response.

Acer Power Ribbon

Since the battery was now accessible, I checked its voltage: 4.01V, comfortably above the ~3.7V nominal voltage of a lithium-ion battery, so the problem with this computer was not a dead battery. Maybe it was the loose cable I had just come across? I reinstalled the cable and pushed the power button again.

And… it’s alive!

Hands-On Fun for Kids at DTLA Mini Maker Faire

Today was the DTLA Mini Maker Faire (DTLA = downtown Los Angeles) and I went to see who the event would draw. I know there’s plenty of maker activity in the greater Los Angeles area, but it’s such a big area, hindered by world-famous traffic congestion, that it’s rare for everyone with a common interest to gather in one place. Any group of like-minded people is more likely to congregate in several local clusters than in one big Los Angeles group, so a call to gather in one place promised to be interesting.

Entrance

I was not disappointed! Groups came from all across the Los Angeles basin, and I saw many interesting things I didn’t know existed. The event took place in the downtown Los Angeles Public Library during regular hours. Maker Faire exhibits were tucked into various rooms scattered throughout the library, adding a scavenger hunt to the experience.

My favorite part was seeing so many booths offering hands-on activities for young children. I had expected grown-ups showing off their hobbies, since that’s what I had read about other Maker Faires in the past. I certainly got that, but I was more amused watching little kids engrossed in their own activities, so this post focuses on the little ones.


The loudest booth in the courtyard was definitely the reDiscover Center booth. They had a basic woodworking shop set up, and kids were building things with real woodworking tools (not plastic pretend tools) under adult supervision.

reDiscover

At the other end of the courtyard, SGVLUG (San Gabriel Valley Linux User’s Group) had multiple activities but the most popular was where kids were given old computer hard drives and the tools to take them apart. It looked like hard drive platters were being extracted to become Christmas tree ornaments.

SGVLUG

MatterHackers is a 3D-printing retailer within driving distance, but not close enough for me to have visited yet. They had an Ultimaker 3 running, but more interestingly, they had two 3D-printing pens set up for kids to freehand their own plastic creations.

MatterHackers

HexLab Makerspace came prepared with laser-cut wood kits of dinosaurs. But they didn’t just hand them out to kids for assembly – they also had paint set out for kids to color their dinosaurs and staffers offered encouragement to the children creating their own masterpieces.

Hexlab

The scattered nature of the event meant some attractions were harder to find than others, which is unfortunate. Getting to the auditorium required walking through a few uninviting-looking hallways that probably caused it to be overlooked by many. Those who entered could see robots for the FIRST robotics competition set up on stage. Kids could get in line to drive one of them on stage.

FIRST driving

One of the robots was built to launch balls into the air, a task required in one particular competition. This robot got all the attention whenever the ball launch mechanism was demonstrated.

FIRST firing


This event had fun for tinkerers young and old alike. It has made me much more interested in attending more Maker Faires.

Fusion 360 Script Engine Uses Python Version 3

One of the differences between Python 2 and 3 is that print changed from a statement to a function. This means a Python beginner like myself immediately runs into the breaking change when trying to print “Hello World!” to the console. (print "Hello World!" versus print("Hello World!")) It sets the tone pretty clearly up front: when working with Python, developers need to know which version they’re writing for.

Which is why I was very puzzled when I searched for this information in the Autodesk documentation and came up empty-handed. The most obvious place in my mind would be the “Python Specific Issues” page of the user’s manual, but it wasn’t mentioned there. Searching various other combinations of terms – on and off Autodesk’s web site – failed to find a straightforward answer. Given the relatively young age of Fusion 360’s Python scripting support, I would expect them to use Python 3 but I wanted confirmation.

Well, if I can’t find it in the documentation, there’s always asking the code itself. The answer takes only a tiny baby step beyond the simple boilerplate script generated by Fusion 360 – not quite printing “Hello World”, but almost that simple.

First I imported the Python sys module for querying system parameters.

import sys

Then I changed the boilerplate message box output string to include the version number.

ui.messageBox("We're running on Python " + str(sys.version_info.major), "Version detection")
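
Put together, here is roughly what the complete script looks like. This is a sketch based on the standard add-in skeleton Fusion 360 generates (the try/except error handler and variable names may differ slightly from the boilerplate on your machine):

import sys
import traceback
import adsk.core

def run(context):
    ui = None
    try:
        # Standard boilerplate: get the running application and its user interface.
        app = adsk.core.Application.get()
        ui = app.userInterface
        # Report the major version of the Python interpreter executing this script.
        ui.messageBox("We're running on Python " + str(sys.version_info.major),
                      "Version detection")
    except:
        if ui:
            ui.messageBox('Failed:\n{}'.format(traceback.format_exc()))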

Executing the script confirms the scripting engine is running Python 3.

F360Python3

Once the web search engines index this post, people who have this question in the future will know the answer just by reading the title of this post on the search results. They won’t even need to click on the link to read this page.

(This really simple bit of code is almost not worth committing to Github… but it’s a slow day so it was pushed up to be publicly available.)

Fusion 360 Scripting: Learning Resources Overview

fusion-360-logo31

I don’t know how much Autodesk expects their Fusion 360 users to write their own custom scripts, but Autodesk has certainly made enough information available free online for anybody to give it a try. Broadly, the resources are divided into three categories:

  1. “Why”: the Fusion 360 API User’s Manual describes the overall concepts of how the API is designed and laid out. It is written to be the starting point before the programmer dives into actual code.
  2. “What”: the Fusion 360 API Reference Manual documents the specific nuts and bolts of how a script communicates with Fusion 360. This is where developers go to find the small but important details necessary to write good Fusion 360 code.
  3. “How”: Autodesk provides sample code so we can see some already-written scripts and get a feel for how they work. Some people may prefer to start with the code before (or possibly ‘instead of’) going to the concepts described in the user’s manual. But every learner will need to cross-reference the sample code against the reference manual to understand everything a sample does.

I appreciated the foundation laid out by the user’s manual. It left me feeling confident that I could march into the scripts and be properly oriented to understand what I’m seeing and how to find answers when I need them. Whether this confidence is misplaced or not is too early to tell at the moment.

One thing I found interesting: Autodesk provides sample code in different styles across multiple venues. There’s a fairly large set of samples alongside the manuals on Autodesk’s own help website, and in addition there is a Github account “AutodeskFusion360” where script code is published. Some are samples, some are hackathon projects, and some are scripts they’ve released to solve problems that customers have raised in the forums.

Together they cover a pretty wide spectrum of code to learn from, from simplified educational code snippets to complete scripts intended to run on user workstations.

Windows Subsystem Returns for Linux

One of the newest features in Windows 10 is the “Windows Subsystem for Linux” (WSL), which allows a limited set of Linux binaries to run on the latest 64-bit edition of Windows 10. It may be a sign of open-source friendliness from the new Microsoft CEO, but for trivia’s sake: it is not a new concept.

The lineage of Windows 10 traces all the way back to Windows NT, built in the early 1990s as a heavier-duty operating system (or according to some, “a real operating system”) to move upscale relative to the existing DOS-based Windows (“not a real operating system”). As consumer-level hardware grew more capable, the old DOS core was phased out and the NT kernel took over. Windows 2000 was the modest start, followed by the successful Windows XP.

But back when Windows NT launched, it was intended to capture the business, enterprise, and government markets, which carried higher margins than the consumer market. At the time, one requirement to compete for government contracts was support for POSIX, an IEEE-defined subset of Unix. The software architects of Windows NT built a modular design that supported multiple subsystems. In addition to the home-grown Microsoft Win32 subsystem and the POSIX subsystem to meet government requirements, there was also a subsystem for IBM OS/2 to compete in enterprises that had invested in OS/2.

History showed those subsystems were barely more than lip service. They were not used much and gradually faded away in later evolutions of the NT lineage.

But now, the concept returns.

Microsoft has a healthy and profitable market in Windows desktop software development, but it is only a marginal player in the web + cloud world. The people writing code there are more likely to be using a Linux workstation or a Macintosh with its FreeBSD-based MacOS. To make Windows more relevant to this world, Microsoft needs to provide access to the already entrenched tools.

So just like before, Microsoft is building a Linux subsystem for business competitive reasons. But unlike the POSIX subsystem, they couldn’t get away with just lip service to satisfy a checklist. It will actually need to be useful to gain meaningful traction.

The method of installation is a little odd – the supported Linux distributions are listed on the Microsoft Windows app store. But once we get past this square peg jammed in a round hole, it works well enough.

Ubuntu in app store

WSL is not a virtual machine or even a container. The Linux executables were not recompiled for Windows; they’re the exact same binaries. And they’re not isolated – they run side by side with the rest of Windows and have access to the same file system.

Personally, I’m using WSL so I can use the same git source control commands I learned while working in Ubuntu. I know Github has a Windows GUI and an associated command-line toolkit, but I expect running Ubuntu’s git via WSL will work better with git hosts outside of Github (Bitbucket, Heroku, etc.).

This is a good start. I hope WSL has a real life ahead to help Windows play well with others, and not fade away like its predecessors.

Dell Latitude X1: A 2005 Laptop Tries To Fit In 2017

I thought it might be fun to try to get the twelve-year-old Dell Latitude X1 laptop up and running. My expectations were not high, but when I looked over the hardware specs I found the out-of-date hardware surprisingly within reason to run current software.

The computer came with Windows XP, which is long out of service. The previous owner of this laptop switched to running Ubuntu 11. Since that’s far out of date as well and I had no login information anyway, a clean wipe is in order.

I thought I’d jump straight to the latest Ubuntu 17.10, but was unable to find a 32-bit installer. The lack of a 32-bit installer turns out to be an intentional omission, part of Ubuntu’s plans to phase out 32-bit support. So I installed an older version (16.04 LTS) which did have a 32-bit installer, and upgraded from there. The resulting system was quite sluggish. After using it a bit, I decided part of the problem was the spinning-platter hard drive but there’s also the old graphics chip struggling to handle the visual effects of a modern OS.

To isolate the latter, I installed Ubuntu MATE, a variant of Ubuntu with the MATE desktop. MATE is a simpler alternative that is supposed to run better on lower-end hardware. That part was true – after installing Ubuntu MATE, the Latitude X1 didn’t spend as much time chugging through graphical transitions. But the overall experience was still slow – the spinning-platter hard drive remained a significant drag on performance.

Switching to MATE would have made a larger difference on a larger screen (or multiple monitors) running multiple windows. But since the Latitude X1 screen is so small, I only ran one window at a time, full-screen, reducing the influence of the desktop environment.

The Latitude X1’s performance on modern software is held back by the spinning-platter hard drive, which led to the next idea: can we upgrade the hard drive to an SSD? I have a few old SSDs available for such a project.

Dell always publishes excellent manuals for working with their machines. They also keep them online and available, even for twelve-year old machines. So getting to the hard drive was no problem. As soon as the hard drive was visible, though, I knew I was in trouble. The drive is much smaller than the standard laptop hard drive.

HDD18HDD35

Even if the SSD could physically fit, it did not have the correct data interface. The interface connector is unlike anything I’ve seen in a laptop hard drive. The closest thing I can recall is a CompactFlash connector.

HDD18Plug

The label on the drive proclaims it to be a Toshiba MK3006GAL. Sadly, unlike Dell, Toshiba does not keep documentation online for old hardware, so I remain ignorant of the details and the industry specification behind this hard drive’s interface and form factor. Maybe it is rare enough that no SSD upgrade is possible at all. Since I was not planning to spend money on this project, though, the details are irrelevant: this old computer will stick with its old spinning-platter hard drive.


If I had to make a prediction 12 years ago about how well the Latitude X1 would hold up to the years, I probably would have predicted the CPU speed as the largest bottleneck, followed by the quantity of RAM. I would not have guessed that the growth of cheap tablets would demand that operating systems continue to run on a 1 gigahertz processor and within 1 gigabyte of RAM.

I also would not have guessed that solid state drives would have dropped in price and become such a cost-effective boost to overall system performance. The hard drive turned out to be the most significant sign of age in this twelve-year-old laptop.

Dell Latitude X1 is Almost a Teenager

Today’s new toy is actually an old toy: a Dell Latitude X1 ultra-portable laptop originally released in early 2005. The fact that it is still running twelve years later is fairly impressive. I was once skeptical of the price premium Dell charged over their consumer product line, but I’ve seen enough consumer Dells die off while their business counterparts kept trucking to change my mind. While I still might not choose to pay that premium, I now believe the price difference buys a more durable product.

Or perhaps the credit should go to Samsung? When I searched for reviews of this old laptop, I found this review which claimed the laptop is a rebadged Samsung Q30. The article even helpfully included a picture of the Q30 so we can see cosmetic similarities (and the differences.)

There are dings and dents from over a decade of service, but aside from the expected degradation in battery capacity, the machine seems to be running much as it did over a decade ago. I booted it up to verify that it could still do so (Looks like the previous owner installed Ubuntu 11) before I started digging into the hardware.

Looking into the BIOS, I found the processor is an Intel Pentium M ULV 733, a 32-bit single-core low-power processor running at a modest 1.1 GHz. It is definitely out of date in the current age of 64-bit multi-core multi-gigahertz CPUs, but we might still be able to work with it.

There is 1.2 gigabytes of RAM, an unusual amount that I’m sure was quite luxurious in its day. Not so much today, but not as bad as it could have been. In the days of Windows Vista there was an expectation that the computer memory baseline would keep moving up – 2, then 4, then 8 gigabytes and beyond – but it hasn’t panned out that way. Demand emerged to run on lower-end hardware, so recent builds of Linux and Windows 10 both include provisions to run on inexpensive tablets with 1 gigabyte or less of RAM.

The same break in the capacity trend also applies to storage. This machine has only a 30 gigabyte hard drive, and hard drive capacities have grown to multiple terabytes within the past decade. But the advent of solid-state storage, plus the desire for inexpensive tablets with modest storage, meant operating systems had to stay slim.

All the remaining accessories follow the same trend – definitely out of date but surprisingly still within the realm of relevance. A screen with resolution of 1280×768, Bluetooth and Wi-Fi, Ethernet and USB, SD card reader, all the trappings expected of a modern laptop.

There are a few amusing anachronisms: a CompactFlash reader in addition to the SD reader. There is no HDMI video out port – just VGA. And the best one of all – a telephone jack for dial-up modem connectivity.

LatitudeX1Modem

 

 

Overview: Fusion 360 vs. Onshape Scripting

My fascination with gears led me to the scripting mechanisms in both Fusion 360 and Onshape. Both CAD packages offer only minimal gear generation capabilities, and not even built into the main software: spur gear generation is provided in the form of an external add-in. This made them ideal introductory entry points for examining the two different design philosophies.

The Onshape team invented their own scripting language called FeatureScript. It has some superficial similarities to C-style languages, which helps a software developer get up and running quickly in the new language. Its code libraries are designed specifically around designing parts within an Onshape Feature Studio.

In contrast, Autodesk did not try to invent a new language. Instead, they decided to support multiple existing languages and let the user choose the one that best suits their purpose: C++ or Python. (Fusion 360 used to also support JavaScript, but that support has been deprecated.)

Readability: Because FeatureScript was designed specifically for its task, the code is much more direct and readable. All of its APIs are designed to fit FeatureScript because that’s the only language supported. Fusion 360 C++/Python code must call into the Autodesk library code, which had to be adapted to be usable across languages. Advantage: Onshape.
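
To illustrate the difference, here is a rough sketch (my own example, not taken from either vendor’s documentation) of what a trivial operation looks like through the Fusion 360 Python API – the script must explicitly walk the application’s object model before it can draw anything:

import adsk.core, adsk.fusion

def run(context):
    # Walk the object model: application -> active design -> root component.
    app = adsk.core.Application.get()
    design = adsk.fusion.Design.cast(app.activeProduct)
    root = design.rootComponent
    # Create a sketch on the XY construction plane and draw a circle
    # of radius 1.0 (Fusion 360 API lengths are in centimeters).
    sketch = root.sketches.add(root.xYConstructionPlane)
    sketch.sketchCurves.sketchCircles.addByCenterRadius(
        adsk.core.Point3D.create(0, 0, 0), 1.0)

The equivalent FeatureScript stays within a vocabulary purpose-built for part modeling, which is where its readability advantage comes from.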

Performance: C++ can be compiled to native code and executed on the local computer. For small tasks, this will be the fastest possible option. Python code running locally wouldn’t be quite as fast, but unlike FeatureScript, it does not have to deal with network latency. Advantage: Fusion 360

Security: FeatureScript executes in a secure sandbox on Onshape server, and thus limited in risk for exploitation by bad actors. In contrast, a native C++ binary can easily host hostile code. Python is slightly better in this regard, but it’s not clear how much effort (if any) Fusion 360 puts into running Python securely. Advantage: Onshape

Development: Like the rest of Onshape, all work with FeatureScript takes place within the user’s web browser window. If the developer does not like the Onshape FeatureScript editor, they are unfortunately stuck. In contrast, Fusion 360 presents many options. C++ development takes place on a platform-appropriate IDE (Visual Studio for Windows, Xcode for MacOS) and for Python the Spyder IDE is provided. Advantage: Fusion 360

Evolution: FeatureScript is owned by Onshape and any future evolution is fully under Onshape’s control. Since all FeatureScript written by users is stored on Onshape servers, they can validate any future changes for compatibility. In contrast, Autodesk owns neither C++ nor Python and cannot direct their future evolution. And since Autodesk does not host the plug-in code, they have no visibility into breaks in compatibility. Lastly, as history has shown, Autodesk can choose to abandon a language (JavaScript) and all the users on it. Advantage: Onshape

Capability: FeatureScript is constrained to feature creation in a parts studio. Autodesk API is currently roughly similar, focused on the model designer portion of Fusion 360, but they are starting to branch out. As of this writing they’ve just started rolling out some basic scripting capabilities for the CAM module and stated intention to do more. Advantage: Fusion 360


Python Logo

For the immediate future, I intend to focus on Fusion 360 scripting via Python. I’ve wanted to become more proficient in Python, and this would be a good way to look at the language through a lens entirely different from writing Qt code on a Raspberry Pi. I also have ambitions to write Fusion 360 plug-ins that leverage external libraries to provide novel services, which isn’t possible when stuck inside the FeatureScript sandbox. I will just have to keep a careful eye on the Python library imports to keep the security risks under control.

 

Fusion 360 and Onshape: Spur Gears via Scripts

I’ve been learning Onshape and Fusion 360 as they are two of the cloud-based CAD solutions with a subscription tier that’s free for me. They each have their strengths and weaknesses and it’s always interesting to compare and contrast between them.

One item they share that I found surprising is that neither has a built-in capability to add gears to a mechanical design. I’ve always thought of gears as a critical part of any nontrivial machinery, so I had expected to find a significant section of the CAD package devoted to creating various types of gears for different applications and to simulating and analyzing their suitability for the task, but there was nothing.

For basic projects, a simple spur gear would suffice. Thankfully, both companies have heard the user requests and made simple spur gear creation available as an add-in created with their own respective scripting mechanism.

Extending Onshape requires learning to write code in their custom language, FeatureScript. A member of the Onshape team used it to create the spur gear script and made it available in the public documents library. One downside of this approach is that (1) Onshape users need to make a copy of a public document before they can use it for their own purposes, and (2) all documents created with the free subscription tier of Onshape are public. These two factors combined mean many, many copies of the spur gear script in the public documents library. Correction: Custom FeatureScript can be added to the toolbar without making a copy. Thanks [FS] for the comment!

OnShape Spur Gear

Fusion 360 did not introduce its own scripting language. Instead, it exposes its extension API to two languages: C++ for the high-performance native code crowd, and the interpreted Python language for the less performance-focused audience. It used to also support JavaScript, but as of July 2017 that has been moved into maintenance mode due to lack of usage.

The spur gear script is part of the sample script library installed alongside Fusion 360, so I didn’t have to find a separate download nor was copying of public document necessary. They presented it in both C++ and Python format. I found the Python version in the “Scripts and Add-ins” menu and clicked “Run”.

F360 Script Addin

Onshape and Fusion 360’s spur gear scripts present their gear parameters in slightly different formats, but both should suffice for simple projects. If I want to do something more complex, I will have to dive into coding my own extension script.

I’ve wanted to learn more about what is and isn’t possible in an extension script anyway, so… challenge accepted!

F360 Spur Gear

 

 

 

Waiting For Efficient Voice Control

When I started playing with computers, audio output was primitive and there were no means of audio input at all. Voice controlled computers were pure science fiction. When the Sound Blaster gave my computer a voice, it also enabled primitive voice recognition. The mechanics were primitive and the recognition poor, but the promise was there.

Voice recognition capabilities have improved in the years since. Phone-operated systems have offered voice-controlled menus within the past decade or so. (“For balance and payment information, press or say 4.”) It is now considered easy to recognize words within a specific domain.

Just within the past few years, advances in neural networks (“deep learning”) have enabled tremendous leaps in recognition accuracy. No longer constrained to differentiating numbers, like navigating a voice menu, the voice commands can be general enough to be interesting. And so now we have digital assistants like Apple’s Siri, Google’s Assistant, Amazon’s Alexa, and Microsoft’s Cortana.

But when I try to communicate with them, I still feel frustrated by the experience. The problems are rarely technological now – the recognition rate is pretty good – but there is something else. I had been describing it as “low-bandwidth communication”, but I just read an article from Wired that offers a much better explanation: these voice-controlled robots are designed to be chatty.

Chatty Bots
Link to “Stop the Chitchat” article on Wired

The problem has moved from one of technical implementation (“how do we recognize words”) to one of user experience (“how do we react to words we recognize”), and I do not appreciate the current state of the art at all. The article lays out why designers choose to do this: to make the audio assistants less intimidating to people new to the technology, they make them sound like a polite butler instead of an efficient computer. I understand the reasoning, but I’m eager for the industry to move past this introductory phase, or at least start offering a “power user” mode.

After all, when I perform a Google search, I don’t type the query the way I would ask a person. I don’t type “Hey I’d like to read about neural networks, could you bring up the Wikipedia article, please?” No, I type “wikipedia neural network”.

Voice interaction with a computer should be just as terse and efficient, but we’re not there yet. Even worse, we’re intentionally not there due to user experience design intent, and that just makes me grind my teeth.

Today, if I want a voice-controlled light, I have to say something like “Alexa, turn on the hallway lights.”

I look forward to the day when I can call out:

AZIZ! LIGHT!