Tria Beauty Hair Removal Laser 4X Teardown

The best part of a local meetup is having different people bring in cool things that I would never have encountered in my own life. This week’s SGVTech meet is a prime example: I got to look inside an old hair removal laser, a product category I didn’t even know existed beforehand. The product’s web site proclaims it to be far more powerful than any of its competitors on the market, second only to medical grade equipment not sold to consumers. I don’t know if that is true, but power is certainly a theme, as are the multiple safety hoops the consumer must jump through. A skin tone sensor in the base must verify the user’s skin color is within an acceptable range before the device will even power on. And before the laser will fire, a secondary verification performed by a different sensor in the tip must pass.

The motivation for tonight’s teardown is the power subsystem. When plugged in for charging, the device gives every indication of a successful full charge, but it could never get through the first stage of its extensive power-up procedure. The manufacturer does not supply replacement batteries: while it’s possible the device is no longer safe to use beyond the life of the battery, it’s also easy to be more cynical about planned obsolescence. Tonight’s mission is to open it up and see if it can be brought back to life.

The exterior enclosure is an impressive work of industrial design, presenting a user-friendly appearance that hides the power and sophistication within. Popping a few plastic clips revealed a device dominated by equipment supporting a powerful laser. The battery module consumes the majority of the interior volume, followed by a large heat sink and fan for thermal management.

Tria 4X laser components

The battery pack consists of two cells, and our first surprise was the 3.2V nominal voltage listed on its label. This typically indicates lithium iron phosphate (LiFePO4) battery cells, which are known for the high power delivery befitting a high powered handheld laser. However, these cells typically trade that power delivery for lower energy capacity, yet this pack claims 4.4 AHr, far higher than any LiFePO4 cell I’ve seen. Tearing apart the outer plastic, we saw the pack is two cells wired in parallel, supported by measuring an open circuit voltage of 3.4V. Two cells at 2.2 AHr apiece is still very high by LiFePO4 standards.
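The parallel-pack arithmetic is simple enough to sketch in a few lines of Python. This is just an illustration of the math, not anything from the product itself, and the function name is my own:

```python
# Quick illustration of how identical cells combine into a pack.
# Values below are from the label and our own measurements.

def pack_specs(cell_voltage, cell_capacity_ah, count, wiring):
    """Combine `count` identical cells wired in 'series' or 'parallel'."""
    if wiring == "parallel":
        # Parallel wiring sums capacity; voltage stays the same.
        return cell_voltage, cell_capacity_ah * count
    elif wiring == "series":
        # Series wiring sums voltage; capacity stays the same.
        return cell_voltage * count, cell_capacity_ah
    raise ValueError(f"unknown wiring: {wiring}")

# Two 3.2V, 2.2 AHr LiFePO4 cells in parallel:
voltage, capacity = pack_specs(3.2, 2.2, 2, "parallel")
print(voltage, capacity)  # 3.2 V nominal, 4.4 AHr -- matching the label
```

This matches what we saw: the pack keeps the single-cell 3.2V nominal voltage while doubling capacity to the 4.4 AHr on the label.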

We had a wide selection of battery cells we intended to try swapping in, from NiMH (visible in picture above) to standard 18650 cylindrical lithium batteries to lithium polymer packs intended for the high amperage draw of remote control vehicles. But before we started hooking up batteries, we wanted to understand what power the device is looking for. So the battery pack’s wires were cut off and replaced with connectors to a bench power supply.

Tria 4X laser on power supply

We expected the device to power up with the power supply set to 3.2V, the nominal voltage listed on the battery pack, but there was no response. Turning it up to 3.4V got no response either. Something inside this device is looking for a very specific power profile before it will activate. This may be one of the safety hoops, but it certainly dims our prospects of getting it up and running on other batteries.

Then a mistake was made: thinking the device might be running into the power supply’s current limit, a hand reached out to increase current but instead turned the knob to increase voltage far higher than 3.4V. A component on the circuit board started glowing and smoking, leaving behind a burnt hole so we can’t even read the part number anymore to figure out what it used to be.

Tria 4X fried chip

Oh well, so much for bringing the device back to life.

Now that it is well and truly dead, we had to abort the revitalization project and revert to our typical mode of disassembly for curiosity. The heat sink appeared to be a custom piece of machined metal; even the cooling fan might be custom, given how it clips on in a way that conforms to the shape of the heat sink. But obviously the star attraction is the laser assembly, and it didn’t look anything like what we expected. Behind the optical assembly we see… two pieces of golden colored metal?

Tria 4X laser with optics

Most of our collective experience is with LEDs in plastic packaging. Under magnification we can see a little bit of the semiconductor within, but they don’t look anything like this. Our ignorance of solid state lasers meant we didn’t understand what we were looking at.

Tria 4X laser

Looking on the bright side, maybe it’s just as well we didn’t understand enough to play with it. This is a powerful piece of equipment, operating on wavelengths of light that we could not see. There is no blink reflex to save our eyesight in case of accident.


New Project: Neato Hacking

My ROS learning robot Phoebe was built mainly around a laser distance scanner module salvaged from a Neato robot vacuum cleaner. At the time I knew people were selling them for $50-$75 on Craigslist in varying condition, but I was content to pay $45 for just the laser scanner. It was all I needed for my own robot exploration purposes. I thought a full Neato vacuum might be fun to play with, but I have enough projects on my to-do list that I didn’t feel the need to go out and find one.

That is, until I did. I recently wrote a Hackaday article about bargain shopping in a thrift store, and I needed a picture to go with my article. I went into my local thrift store to take some pictures, but it was also an opportunity to practice what I preached. I spent most of my time in the electronics section and didn’t find anything I wanted to take home with me. On my way out the door, though, I took a glance at the kitchen electronics section and spotted this beauty: a Neato robot vacuum with a price tag of only $7.99.

Savers Neato XV-21 1600

It doesn’t power on, and external accessories were nowhere to be found: neither a wall wart charger nor its charging dock. But it looked to be in pretty good condition with only minor cosmetic blemishes on the exterior. Aside from the missing charger, all other major components appear to be present. The purchase decision, though, was based on the most interesting part: I looked inside the top bump to verify the presence of a familiar looking laser scanner unit. If all I get out of this $7.99 is a Neato lidar, I’ll be happy. Anything else that works will just be icing on the cake.

Samsung Is Getting Into Physical Stores

Wandering around town yesterday, I stumbled across a store under construction with a name that’s familiar to me, but not in association with a physical store: Samsung.

Samsung Store 1600

I’ve noticed Samsung’s retail ambitions growing beyond just another item on a shelf, beyond buying an end-aisle display in a store. There is a Samsung mini-store inside my local Best Buy, staffed by people wearing Samsung logo shirts instead of Best Buy logo shirts. I never asked whether they were Best Buy employees in a different uniform or Samsung employees, but I don’t suppose that matters. Obviously, whenever I set foot within the Samsung zone I was getting pitched the Samsung life.

But that mini-store was only about a hundred square feet; this looks significantly larger and is obviously entirely focused on Samsung products. According to Engadget, this is one of three Samsung retail stores opening in conjunction with the “10 years of Galaxy” event where Samsung is expected to introduce the Galaxy S10. One in New York, one in Texas, and this one on the west coast in California.

The location speaks to Samsung’s ambition. Americana at Brand is an upscale shopping center featuring brands like Tesla, Diane von Furstenberg, and Sephora. It is across the street from Glendale Galleria, a sizable mall all by itself, with retail presence from Microsoft, Amazon, and Apple. (Americana also has an Apple store, only a few hundred feet from Glendale Galleria’s Apple store, for some reason.) Beyond the borders of these two giant malls, the neighborhood also includes other amenities such as the Alex Theatre for performing arts, surrounded by restaurants catering to that audience.

In short, this storefront is a major investment by Samsung to stake a claim in prime real estate. I’ll be curious to see what kind of foot traffic I see in the Samsung store on my next visit to Americana.

Window Shopping RobotC For My NXT

I remember my excitement when LEGO launched their Mindstorm NXT product line. I grew up with LEGO and was always a fan of the Technic line, which lets a small child experiment with mechanical designs without the physical dangers of fabrication shop tools. Building an intuitive grasp of the powers of gear reduction was the first big step on my learning curve of mechanical design.

Starting with simple machines that were operated by hand cranks and levers, LEGO added actuators like pneumatic cylinders and electric motors. This trend eventually grew to programmable electronic logic. Some of the more affordable Mindstorm products only allowed the user to select between a few fixed behaviors, but with the NXT it became possible for users to write their own fully general programs.

Lego Mindstorm NXT Smart Brick

At this point I was quite comfortable programming in languages like C, but that was not suitable for LEGO’s intended audience. So they packaged a LabVIEW-based programming environment, a visual block-based system like today’s Scratch & friends. It lowered the barrier to entry but exacted a cost in performance. The brick is respectably powerful inside, and many others thought it was worthwhile to unlock its full power. I saw enough efforts underway that I thought I’d check back later… and I finally got around to it.

Over a decade later now, I see Wikipedia has a long list of alternative methods of programming a LEGO Mindstorm NXT. My motivation to look at this list came from Jim Dinunzio, a member of Robotics Society of Southern California, presenting his project TotalCanzRecall at the February 2019 RSSC meeting. Mechanically his robot was built from the stock set of components in a NXT kit, but the software was written with RobotC. Jim reviewed the capabilities he had with RobotC that were not available with default LEGO software, the most impressive one being the ability to cooperatively multitask.

A review of information on the RobotC web site told me it is almost exactly what I had wanted when I picked up a new NXT off the shelf circa 2006: a full IDE with debugging tools, among a long list of interesting features, plus documentation to help people learn those features.

Unfortunately, we are no longer in 2006. My means of mechanical construction has evolved beyond LEGO to 3D-printing, and I have a wide spectrum of electronic brainpower at my disposal, from low-end 8-bit PIC microcontrollers (mostly PIC16F18345) to the powerful Raspberry Pi 3, both of which can already be programmed with C.

There may be a day when I will need to build something using my dusty Mindstorm set and program it using RobotC. When that day comes I’ll go buy a license of RobotC and sink my teeth into the problem, but that day is not today.

Window Shopping JeVois Machine Vision Camera

In the discussion period that followed my Sawppy presentation at RSSC, the topic of machine vision came up. Amid the talk of problems & potential solutions, the JeVois camera was mentioned as one potential tool. I wrote down the name and resolved to look it up later. I have done so, and I like what I see.

First thing that made me smile was the fact it was a Kickstarter success story. I haven’t committed any of my own money to any Kickstarter project, but I’ve certainly read more about failed projects than successful ones. It’s nice when the occasional success story comes across my radar.

The camera module is of the type commonly used in cell phones, and behind the camera is a small machine vision computer again built mostly of portable electronics components. The idea is to have a completely self-contained vision processing system, requiring only power input and delivering processed data output. Various machine vision tasks can be handled completely inside the little module as long as the user is realistic about the limited processing power available. It is less powerful but also less expensive and smaller than Google’s AIY Vision module.

The small size is impressive, and led to my next note of happiness: it looks pretty well documented. When I looked at its size, I had wondered how to best mount the camera on a project. It took less than 5 minutes to decipher the documentation hierarchy and find details on physical dimensions and how to mount the camera case. Similarly, my curiosity about power requirements was quickly answered with confirmation that its power draw does indeed exceed the baseline USB 500mA.

Ease of programming was the next investigation. Some of the claims around this camera made it sound like its open source software stack can run on a developer’s PC and be debugged before publishing to the camera. However, the few tutorials I skimmed through (one example here) all required an actual JeVois camera to run vision code. I interpret this to mean that the JeVois software stack is indeed specific to the camera. The whole “develop on your PC first” pitch only means the general practice of developing vision algorithms on a PC before porting them to the JeVois software stack for deployment on the camera itself. If I find out I’m wrong, I’ll come back and update this paragraph.

When I looked on Hackaday, I saw that one of the writers thought the JeVois camera’s demo mode was a very effective piece of software. It should be quite effective at its job: get users interested in digging deeper. Project-wise, I see a squirrel detector and a front door camera already online.

The JeVois camera has certainly earned a place on my “might be interesting to look into” list for more detailed later investigation.

SGVHAK Rover, Sawppy, and Phoebe at SGVLUG February 2019 Meeting

At the February 2019 meet for San Gabriel Valley Linux User’s Group (SGVLUG), Lan and I presented the story of rover building in our hardware hackers spinoff group a.k.a. SGVHAK. This is a practice run for our presentation at Southern California Linux Expo (SCaLE) in March. Naturally, the rovers themselves had to be present as visual aids.

20190214 Rovers at SGVLUG

We started the story in January 2018, when Lan gathered the SGVHAK group to serve as beta testers for Jet Propulsion Laboratory’s Open Source Rover project. Then we went through our construction process, which was greatly motivated by our desire to have SGVHAK rover up and running at last year’s SCaLE. Having a rover at SCaLE was not the end; it was only the beginning. I started building my own rover Sawppy, and SGVHAK rover continued to pick up hardware upgrades along the way.

On the software side, we have ambitions to increase sophistication by adopting the open source Robot Operating System (ROS), which led to a small digression into Phoebe, my tool for learning ROS. Getting a rover to work effectively under ROS poses some significant challenges that we have yet to address, but if it was easy it wouldn’t be fun!

Since this was a practice talk, the Q&A session at the end was also a forum for feedback on how we could improve the talk for SCaLE. We had some good suggestions on how we might build a smoother narrative through the story, and we’ll see what we can figure out by March.

Sawppy at Brawerman East STEAM Makers Fair

Sawppy’s publicity appearance today was at Brawerman East STEAM Maker’s Fair, a supercharged science fair at a private elementary school. Sawppy earned this invitation by way of my January presentation at the Robotics Society of Southern California. The intent is to show students that building things goes beyond their assignments at the on-campus Innovation Lab; there are bigger projects they can strive for beyond the classroom. But the format is, indeed, just like a school science fair, where Sawppy got a display table and a poster board.

Brawerman STEAM Makers Fair - Sawppy on table

But Sawppy is not very interesting sitting on a table, so it didn’t take long before the rover started roving amongst other exhibits. The school’s 3D printer is visible on the left – a Zortrax M200.

Brawerman STEAM Makers Fair - Sawppy roaming

Sawppy was not the only project from grown-ups present. I admire the ambition of this laser cutter project undertaken by one of the parents. Look at the size of that thing. It is currently a work in progress, and its incomplete wiring was completely removed for this event so little fingers are not tempted to unplug things and possibly plug them in the wrong place.

Brawerman STEAM Makers Fair - laser cutter

The center tables held some old retired electronics equipment that kids were able to take apart. This was a huge hit at the event, but by the end of the night this side of the room was a huge mess of tiny plastic pieces scattered all over.

Brawerman STEAM Makers Fair - deconstruction zone

I brought my iPad with the idea I could have Sawppy’s Onshape CAD data visible for browsing, but it turned out the iOS Onshape app required a live internet connection and refused to work from cache. As an alternate activity, I rigged it up to show live video footage from Sawppy’s onboard camera. This was surprisingly popular with the elementary school age crowd, who got a kick out of making faces at the camera and seeing their faces on the iPad. I need to remember to do this for future Sawppy outings.

Brawerman STEAM Makers Fair - Sawppy camera ipad

After Sawppy was already committed to the event, I learned that a Star Wars themed art car was also going to be present. So I mentioned my #rxbb8 project which earned me a prime parking spot on the first floor next to the far more extensively modified “Z-Wing.” Prepare to jump to hyperspace!

rxbb8zwingcropped

Window Shopping AWS DeepRacer

At AWS re:Invent 2018 a few weeks ago, Amazon announced their DeepRacer project. At first glance it appears to be a more formalized version of DonkeyCar, complete with an Amazon-sponsored racing league to take place both online digitally and physically at future Amazon events. Since the time I wrote up a quick snapshot for Hackaday, I went through and tried to learn more about the project.

While it would have been nice to get hands-on time, it is still in pre-release, and my application to join the program received an acknowledgement that boils down to “don’t call us, we’ll call you.” There have been no updates since, but I can still learn a lot by reading their pre-release documentation.

Based on the (still subject to change) Developer Guide, I’ve found interesting differences between DeepRacer and DonkeyCar. While they are both built on a 1/18th scale toy truck chassis, there are differences almost everywhere above that. Starting with the on board computer: a standard DonkeyCar uses a Raspberry Pi, but the DeepRacer has a more capable onboard computer built around an Intel Atom processor.

The software behind DonkeyCar is focused just on driving a DonkeyCar. In contrast, DeepRacer’s software infrastructure is built on ROS, a more generalized system that just happens to have preset resources to help people get up and running on a DeepRacer. The theme continues to the simulator: DonkeyCar has a task specific simulator, while DeepRacer uses Gazebo, which can simulate an environment for anything from a DeepRacer to a humanoid robot on Mars. Amazon provides a preset Gazebo environment to make it easy to start DeepRacer simulations.

And of course, for training the neural networks, DonkeyCar uses your desktop machine while DeepRacer wants you to train on AWS hardware. And again there are presets available for DeepRacer. It’s no surprise that Amazon wants people to build skills that are easily transferable to robots other than DeepRacer while staying in their ecosystem, but it’s interesting to see them build a gentle on-ramp with DeepRacer.

Both cars boil down to a line-following robot controlled by a neural network. In the case of DonkeyCar, the user trains the network to drive like a human driver. In DeepRacer, the network is trained via reinforcement learning. This is a subset of deep learning where the developer provides a way to score robot behavior, the higher the better, in the form of a reward function. Reinforcement learning trains a neural network to explore different behaviors and remember the ones that earn a higher score from the developer-provided evaluation function. The AWS developer guide starts people off with a “stay on the track” function which won’t work very well, but it is a simple starting point for further enhancements.
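As a rough sketch of what such a reward function looks like: it is a Python function receiving a dictionary describing the car’s state and returning a score. The parameter names below follow my reading of the pre-release developer guide, so treat them as assumptions that may change before release.

```python
# Hedged sketch of a DeepRacer-style reward function. The params dict keys
# ('all_wheels_on_track', 'distance_from_center', 'track_width') follow my
# reading of the pre-release developer guide and may differ in the final API.

def reward_function(params):
    """Score higher for staying on track and near the center line."""
    if not params["all_wheels_on_track"]:
        return 1e-3  # near-zero reward once the car leaves the track

    # Reward falls off linearly as the car drifts from the center line.
    half_width = params["track_width"] / 2.0
    distance = params["distance_from_center"]
    reward = max(1e-3, 1.0 - distance / half_width)
    return float(reward)

# A car centered on a 1 meter wide track earns the maximum reward:
print(reward_function({"all_wheels_on_track": True,
                       "distance_from_center": 0.0,
                       "track_width": 1.0}))  # 1.0
```

The simulator evaluates a function like this at every step, and training nudges the network toward behaviors that accumulate higher scores over a lap.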

Based on reading through documentation, but before any hands-on time, it’s clear DonkeyCar and DeepRacer serve different audiences with different priorities.

  • Using AWS machine learning requires minimal up-front investment but can add up over time. Training a DonkeyCar requires higher up-front investment in computer hardware for machine learning with TensorFlow.
  • DonkeyCar is trained to emulate behavior of a human, which is less likely to make silly mistakes but will never be better than the trainer. DeepRacer is trained to optimize reward scoring, which will start by making lots of mistakes but has the potential to drive in a way no human would think of… both for better and worse!
  • DonkeyCar has simpler software which looks easier to get started. DeepRacer uses generalized robot software like ROS and Gazebo that, while presets are available to simplify use, still adds more complexity than strictly necessary. On the flipside, what’s learned by using ROS and Gazebo can be transferred to other robot projects.
  • The physical AWS DeepRacer car is a single pre-built and tested unit. DonkeyCar is a DIY project. Which is better depends on whether a person views building their own car as a fun project or a chore.

I’m sure other differences will surface with some hands-on time; I plan to return and look at AWS DeepRacer in more detail after they open it up to the public.

Sawppy at Space Carnival Long Beach

Sawppy at Space Carnival Long Beach

Space Carnival, held at the Expo Arts Center in Long Beach, California, welcomed Sawppy as one of several exhibits Monday afternoon. It turned out to be part of a week-long LEGO robotics camp for elementary school students. Most of the events are for campers, but the Monday evening Space Carnival was open to the public.

Since the focus was on LEGO, there were plenty of plastic bricks in attendance. The middle of the room had a big pile of bricks on a plastic tarp, and kids were crawling all over the pile building their creations. Sawppy mostly spent time outside of the tarp, occasionally venturing onto some of the colorful game boards built for LEGO robots to line-follow and perform other tasks.

Sawppy at Space Carnival Long Beach LEGO tarp

As usual, I handed controls over for kids in attendance to play with. Running over feet is still more popular an activity than I can hope to understand, but if it makes them excited, so be it.

Sawppy at Space Carnival Long Beach running over feet

Sawppy was not the only non-LEGO robot in attendance; there was also a selection of Star Wars licensed merchandise, including this R2D2. I forgot whether this particular unit was made by Sphero or Hasbro.

Sawppy at Space Carnival Long Beach R2D2

This event was not the first time I crossed paths with Barnabas Robotics, but it was the first time I got to speak with them beyond the standard sales pitch type of discussions. Since their business is STEM education for students K-12, they have a good feel of what type of material is appropriate for various age groups. It’s possible Sawppy can find a role in high school curriculum.

At the end of the night, the LEGO tarp cleared out enough for me to drive Sawppy across the field. Unfortunately I saw Emily’s tweet too late to replicate the movie clip she had suggested. Maybe another day!

Sawppy Has A Busy Schedule This Week

Since the time I declared Sawppy version 1.0 (mechanical maturity), I’ve been taking my rover out to various events. From casual social gatherings to large official events to presentation in front of others who might appreciate my project. Sawppy has been a great icebreaker for me to start talking with people about their interests, and sometimes this leads to even more events added to Sawppy’s event calendar. This coming week will be especially busy.

Space Carnival

On Monday February 11th from 3pm-6:30pm Sawppy will be at Space Carnival, a FIRST themed event on Lincoln’s Birthday held at Expo Arts Center, a community space in Long Beach, CA. This event is organized by people behind local FIRST robotics teams. This year’s competition is titled “Destination: Deep Space” and has a very explicit space exploration angle to all the challenges. So even though Sawppy is nothing like a FIRST robotics competitor, an invitation was still extended to join the fun.

This event will be unique in that I had the option to be a roaming exhibit, and I chose it for novelty. I think a rover that is roving will be much more engaging than a rover sitting on a table. It also means I will not be tied to a booth, so I could check out other exhibitors as I roam around with Sawppy. This eliminates the problem I had with Sawppy at DTLA Mini Maker Faire, where I had to stay in one place for most of the event and couldn’t see what other people had brought.

On Wednesday February 13th Sawppy will join a STEAM Maker’s Fair at Brawerman East, a private elementary school. This is a small event catering to students and parents at the school. I believe the atmosphere will be similar to a school science fair, with exhibits of student projects. To augment these core exhibits, Sawppy and a few others were invited. The intent is to show that concepts covered in their on-campus Innovation Lab projects are just as applicable to bigger projects outside of their class.

And finally, on Thursday February 14th, Sawppy will be part of another SGVLUG presentation. A follow-up to the previous rover themed SGVLUG presentation, this one will still set up the background but will talk more about what has happened since our initial rover construction. This also serves as a practice run for a presentation to be given at Southern California Linux Expo (SCaLE) next month.

(Cross-posted to Hackaday.io)

My Monoprice 3D Printers at February 2019 RSSC Meeting

When I presented the story of my Sawppy rover project last month at the January 2019 meet of Robotics Society of Southern California (RSSC) I made an offhand comment about my 3D printers. Later on, in a discussion on potential speakers, there were people who wanted to know more about 3D printers and I offered to summarize my 3D printer experience in a follow-on talk. Originally scheduled for March, I asked to be rescheduled when I realized the March RSSC meet would take place at the same time as Southern California Linux Expo (SCaLE).

My talk (presentation slide deck) starts with a disclaimer that my experience and knowledge are limited. I explained why I chose Monoprice printers, backed by a short history lesson on Monoprice, because that sets the proper expectations. Then I ran through my three Monoprice printers: the Select Mini, the Maker Select V2, and the Maker Ultimate. Each of these printers has its strengths and weaknesses.

Monoprice Select Mini

  • Simple low-cost printer that still covers all the basic concepts of FDM printers.
  • Closest we have to a “Fisher Price My First 3D Printer”
  • Recommended for beginners to find out if they’ll like 3D printing.

Monoprice Maker Select

  • Classic Prusa i3 design.
  • Easiest to take apart for modifications and/or repairs.
  • Recommended for people who like to tinker with their equipment.

Monoprice Maker Ultimate

  • Design “inspired by” Ultimaker.
  • Highest precision and most reliable operation.
  • Recommended for people who just want their equipment to work.
  • But price level approaches that of many other good printers, like a genuine Prusa i3.

I brought my printers to the meet so interested people could look them over up close. I did not perform any print demos, because I had almost certainly knocked the beds out of level during transit. Plus, I forgot my spools of filament at home. But these are robotics people; they can gain a lot just by looking over the mechanical bits.

20190209 RSSC 3D Printers

SparkleCon Sidetrack: Does It Have A Name?

spool holder with two stage straightener 1600x1200

My simple afternoon hack of a copper wire straightener got more attention – both online and off – than I had expected. One fun sidetrack came during my Sparklecon talk about my KISS Tindie wire sculptures. As part of the background on my wire form project, I mentioned creating this holder. That kicked off a few questions, which I answered, but I had the most fun with “Does it have a name?”

I gave the actual answer first, which was that I had only been calling it a very straightforward “wire spool holder with straightener” but I followed it up with an off-the-cuff joke “Or did you mean a name like Felicia?” I think I saw a smile by the person asking the question (hard to tell, he had a beard) and I also got a few laughs out of the audience which is great. I had intended to leave it at that, but as I was returning to my presentation another joke occurred to me: “Felicia will set you straight.”

Since my script was already derailed, I saw no reason not to run with it: “Is there a fictional character who is a disciplinarian? That might be fitting.” and opened it up to the audience for suggestions. We got “Mary Poppins” which isn’t bad, but things went downhill from there. The fact is: the disciplinarian in a story is almost always a killjoy obstacle in our hero’s journey. Or worse, one of the villains, as in the suggestion of “Dolores Umbridge” given by a woman wearing a Harry Potter shirt. My reaction was immediate: “No.” But two seconds later I remembered to make it a tad more positive: “Thank you, she is indeed a disciplinarian, but no.” Hopefully she doesn’t feel like I bit her head off.

After the talk, there were additional suggestions interpreting my second joke “Felicia will set you straight” in the sense of personal relationship preferences. This went down a path of politically conservative zealots who believe it is their public duty to dictate what people do in private. This direction of thinking never even occurred to me when I threw out the joke on a whim.

I think I’ll leave it at Mary Poppins.

UltraViolet Shutdown Does Not Inspire Confidence

I consider myself a technology enthusiast, but it’s not a blank check. Reliability and dependability are a big deal, and I view with skepticism technologies which fail on those fronts. This is the reason I have not started talking to Google Assistant on my Android phone – voice recognition is too unreliable. It’s also why I would spend extra money for CAT6 Ethernet in a house – wireless is always less reliable than wired. And finally, it’s why I have a DVD (now Blu-ray) collection, even though almost anything is available online.

To ease skeptics like myself into the digital world, many of my recent movie purchases on physical media also included a code to grant me a digital license of the film. I was willing to participate in this experiment because, if the digital arm folds, I still have my physical media. This proved wise when the digital film was provided by a service a studio created for its own films, as those services closed down one by one. I also have digital licenses for movies on platforms like Windows Media; even though that platform lives on, the studio-specific license servers have been taken down, making my content unplayable.

UltraViolet was an effort to build a more permanent platform, with support from multiple studios for the content and multiple services for playback. Movies Anywhere started as a Disney-only effort (which drew my skepticism) but it has since grown into a multi-studio offering. Playback quality is uneven across the various streaming services, but having a centralized license store made it very consumer friendly – I could sample the quality of different feeds and play the best one. I’ve been quite satisfied with recent releases on Vudu and Fandango Now, both of which offer high bandwidth 4K HDR streams with quality high enough I have a hard time distinguishing from Blu-ray media playback on my Roku-equipped TCL television.

I started feeling more comfortable with the idea of making digital-only movie purchases, easing into the digital library concept. Hey, maybe this is going to work after all and my money won’t vaporize overnight.

Then UltraViolet announced they are shutting down.

ultravioletwillclose

Just like the little startup services that never matured, Variety reports the studios involved have collectively agreed to call it quits. This shutdown notice seems to imply that my digital licenses will still survive in linked retailers, but then I’m beholden to individual retailers honoring this agreement and also staying in business.

I always knew these licenses are subject to variables outside my control, but I was gradually easing into the idea perhaps those variables aren’t as volatile as they were. This is a reminder otherwise.

Looks like I will continue to buy physical media.

Using LibPNG To Encode Spooky Eye Data

Sclera array and bitmap

Emily and I thought it would be cool to have the Spooky Eye visualization running on platforms in addition to Teensy and Adafruit SAMD boards. The first target: a Raspberry Pi Zero. Reading through the project documentation and source code gave us a good idea of how the data is encoded, but the best test is always to make use of that data and see if it turns out as expected.

This would be a new coding experiment, so a new Github repository was created. I added the header files for the various eyeballs, then started looking for ways to use that data. Since the header files are in C, it made sense to look for a C library that could make use of it. I decided to output the data to PNG bitmap files. Verifying the output looks correct would be as simple as opening the bitmap in an image viewer.

The canonical reference library for PNG image compression is libpng. Since I expect my use to be fairly mainstream, I skipped over the official documentation that covers all the corner cases a full application would need to consider. In the spirit of a quick hack prototype, I looked for sample code to modify. I found one by Dr. Andrew Greensted that looked simple and amenable to this project.

I fired up my Ubuntu 18.04 WSL and installed gcc and libpng-dev as per instructions. The sample failed to compile at first with this error:

/tmp/ccT3r4xP.o: In function `writeImage':
makePNG.c:(.text+0x36f): undefined reference to `setRGB'
collect2: error: ld returned 1 exit status

Since there are a lot of references to this sample code, I thought this wouldn’t be a new problem. A web search on “makePNG undefined reference to setRGB” sent me to this page on Stack Overflow, which indicated there was a problem with the use of the C keyword inline. There were two options to get around this problem: either remove inline or use the -Ofast compiler option to bypass some standards compliance. I chose to remove inline.
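For anyone hitting the same wall: under C99/C11 semantics, a function defined with plain inline does not by itself provide an externally visible definition, so when the compiler decides not to inline a call, the linker has no symbol to resolve. A simplified stand-in for setRGB illustrates the fix – note this signature is my own simplification, not the one from the makePNG sample:

```c
#include <stdint.h>

/* Under C99/C11 inline semantics, a definition like
 *
 *     inline void setRGB(uint8_t *ptr, uint8_t val) { ... }
 *
 * does not emit an externally visible symbol; if the compiler chooses
 * not to inline a call, the link fails with
 * "undefined reference to `setRGB'". Dropping inline, or making the
 * function static inline as below, keeps a usable definition in this
 * translation unit. */
static inline void setRGB(uint8_t *ptr, uint8_t val)
{
    ptr[0] = val;  /* red   */
    ptr[1] = val;  /* green */
    ptr[2] = val;  /* blue  */
}
```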

That was enough to get the baseline sample code up and running, and modification began. The first step was to #include "defaultEye.h" and see if that even compiled… it did not.

In file included from makePNG.c:20:0:
defaultEye.h:4:7: error: unknown type name ‘uint16_t’

Again the fix was straightforward: #include <stdint.h>, which takes care of defining the standard integer type uint16_t.

Once everything compiled, the makePNG sample code for generating a fractal was removed, as was the code translating the fractal’s floating point value into color. The image data was replaced with data from the Spooky Eye header files. If all worked well, I should have a PNG bitmap. The first few efforts generated odd-looking images because there were bugs in my code to convert the Spooky Eyes image array, encoded in 16-bit RGB565 format, for output in 24-bit RGB888 format. Once my bitwise manipulation errors were fixed, I had my eyeballs!
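For reference, the RGB565-to-RGB888 expansion that tripped me up boils down to a few shifts and masks. This is a minimal sketch (the helper name is mine, not from the project); replicating the high bits into the vacated low bits ensures a full-scale 5- or 6-bit value maps to exactly 255:

```c
#include <stdint.h>

/* Expand one RGB565 pixel into separate 8-bit channels.
 * RGB565 layout: RRRRRGGG GGGBBBBB (red 5 bits, green 6, blue 5).
 * Replicating the top bits into the low bits maps the full-scale
 * values 31 and 63 to exactly 255, avoiding a dim, tinted image. */
static void rgb565_to_rgb888(uint16_t px,
                             uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r5 = (px >> 11) & 0x1F;  /* top 5 bits    */
    uint8_t g6 = (px >> 5)  & 0x3F;  /* middle 6 bits */
    uint8_t b5 =  px        & 0x1F;  /* low 5 bits    */

    *r = (uint8_t)((r5 << 3) | (r5 >> 2));
    *g = (uint8_t)((g6 << 2) | (g6 >> 4));
    *b = (uint8_t)((b5 << 3) | (b5 >> 2));
}
```

Getting the green channel’s 6-bit width wrong, or skipping the bit replication, produces exactly the kind of odd-looking output I saw in my first attempts.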

Looking Under The Hood Of Adafruit Spooky Eyes

Sclera array and bitmap

Adafruit’s Hallowing was easily the most memorable part of the 2018 Superconference attendee gift bag. Having a little moving blinking eye looking around is far more interesting than a blinking LED. It is so cool, in fact, that Emily has ambition to put the same visual effect on other devices.

Since the Hallowing was one of the headline platforms that supported CircuitPython, the original hope was that it would be very easy to translate to a Raspberry Pi. Sadly, it turns out “Spooky Eyes” is actually a sketch created using Arduino IDE for a Teensy board that also ran on the Hallowing.

As I found out in my own Nyan cat project for the Superconference 2018 badge, modern image compression algorithms are a tough fit for small microcontrollers. And just as I translated an animated GIF into raw byte data for my project, Spooky Eyes represents its image data in the form of raw bytes in a header file.

Adafruit always has excellent documentation, so of course there’s a page describing what these bytes represent and where they came from for the purposes of letting people create their own eye bitmaps. Apparently this project came from this forum thread. I was a little grumpy the Adafruit page said “from the Github repository” without providing a link, but the forum thread pointed here for the Python script tablegen.py.

There was a chance the source bitmaps would be on Github as well, but I had no luck finding them. They certainly weren’t in the same repository as tablegen.py or the Arduino sketches I examined. Still, the data is there, we just need to figure out what format would be most useful for putting the eye on another project.

As a first step, I’ll try to extract and translate them into a more familiar lossless bitmap format. Something that can be directly usable by more powerful devices like a Raspberry Pi. A successful translation would confirm I understand the eyeball data format correctly, which would be good to know for any future projects that might want to encode that data into different formats as needed for other devices.

KISS Tindies: Ace/Spaceman II

Part of the scramble before Sparklecon was getting my KISS band back together. On the weekend prior to Sparklecon, the band went on tour in their first public appearance. The good news: they were very well received and people loved them! The bad news: someone loved them so much they decided to adopt my Spaceman, taking him home without my permission. I was missing a member of the band.

I had already signed up to talk about the band for Sparklecon, and it would be a bit lame not to have the full band at my talk. This means making Spaceman II, but for that I would need another KISS Tindie PCB. Fortunately, Emily came to the rescue! She was also at Superconference and had also picked up a KISS Tindie PCB of her own. She generously donated her panel so I could rebuild my blinky band.

Emily KISS Tindie Panel.jpg

Emily had already soldered a pair of yellow non-blinking LEDs to her Spaceman. For the sake of consistency with the rest of my blinky band, those two LEDs were removed. Then I got to work rebuilding a wire frame body. Given the time crunch, I tried to skimp a bit on details and initially started trying to make Spaceman’s guitar out of a single length of wire.

Spaceman 2 one piece guitar

I only got this far, though, before I decided it didn’t look good. I aborted and returned to multi-piece construction. It is more time consuming, but it conveys superior detail.

Spaceman 2 multi piece guitar.jpg

Unfortunately that aborted experiment put me further behind schedule. This was not the time to experiment; I needed to stick with known solutions. For the most part, I stuck with what I knew worked for the rest of this reconstruction.

Spaceman 2 complete

I’m sad that I lost my first KISS Tindie Spaceman, but this experience also gave me the opportunity to answer one question I was asked: How long does it take to build a wire frame body for a KISS Tindie? I honestly did not know because when I’m focused on a project like this I lose track of time. Bend wire, compare against drawing, repeat until the curve is right. Then solder that piece, and repeat the whole process for the next piece.

I had guessed maybe each Tindie would take 30-45 minutes. This time, I started a timer just before I removed the yellow LEDs and stopped it right after I took the above picture of a completed KISS Spaceman II. Total time: 2 hours 45 minutes. Even though this included the aborted single-wire guitar, my estimate was clearly way off!

But that time was well spent, I had the full band again for my Sparklecon presentation.

SparkleCon Day 2

A great part of SparkleCon is its atmosphere. It is basically a block party held by 23b Shop and friends in the same business park. Located in Fullerton, CA, the venue’s neighborhood is a mix of residential, retail, and commercial properties. As a practical matter, this meant good eats like Don Carlos Mexican Restaurant and Monkey Business Cafe were within easy walking distance.

Originally my Day 2 was going to start bright and (too) early for me at 9AM with the KISS Tindies presentation, but the relaxed easygoing nature of the event meant a schedule change was possible and we did it at noon instead. I loved talking with everyone who thought my circuit sculptures were more interesting than a certain football game taking place around the same time.

Roger presenting at Sparklecon - Drew Fustini
Photo by Drew Fustini
Roger presenting at Sparklecon
Photo by Jaren Havell

It was another great opportunity to practice public speaking. I think it went well and some people let me know afterwards that they enjoyed the talk. Success!

The table and couches of NUCC once again hosted various hacks. Emily’s little green-tinted CRT attracted immediate attention.

Emily green tinted LED on NUCC table at Sparklecon

It wasn’t long before it hosted a Matrix waterfall of characters.

Emily wants to host a version of Adafruit Hallowing’s default eyeball program on her tiny round CRT. To see how it would look, Emily and Jaren took a video of the Hallowing eyeball and played it back on a Raspberry Pi.

While this was underway, I was unwinding by playing with my copper wires. Yesterday I made a crude taco truck, today I tried to make an abstract steam locomotive out of a single wire. There was no planning involved, so it was no surprise I ran out of wire before I could finish.

Single wire steam engine

Elsewhere on the table were electronic noisemakers to play with. To the left is a “Dronie” assembled by @hackabax this weekend, next to another of his noisemaker devices whose name I forgot. Inside the metal case on the right is one of Emily’s earlier projects, a simple sequencer powered by a pair of 555 timers.

Noisemakers Unite.jpg

One casualty of the pouring rain was the robot competitions, but the Hebocon boxes were still set out for people to play with. Maybe we won’t have robots this time, but we can still have other interesting contraptions.

Hebocon boxes at Sparklecon

Sometimes “interesting” veered into “unsettling”…

Barbie head baseball flag thing

It was a great weekend, rain or no rain. I had the opportunity to present one of my projects and saw it was well-received. I got to see people I’ve met before at other events, and met some new people too. And it was a great way to learn about spaces I’ve only heard about before. Chances are good I’ll be back at 23b Shop and/or NUCC before next Sparklecon rolls around.

SparkleCon Day 1

I have arrived at SparkleCon! I had thought this event was held just at the hackerspace 23b Shop, but it is actually spread across several venues in the same business park. The original plan also included activity in the parking lot between these venues, but a powerful storm ruined those plans. Given this was in Southern California, the locals are not very well equipped to handle any amount of rain, never mind the amount that came pouring from the sky today. So people packed into the indoor venues where it was warm and dry. STAGESTheatre is where some talks were held, like Helen Leigh‘s talk From Music Tech Make to Manufacture demonstrating her Mini Mu.

Sparklecon Day 1 mini mu

The doors of Plasmatorium were also open, and it was the primary source of music. And finally there was the National Upcycling Computer Collective, which had this festive sign displayed.

Sparklecon Day 1 sign

One corner of NUCC was set up with a pair of couches and a table, which grew into KISS Tindie headquarters. The original plan was to set up an inflatable couch and table someplace in the outdoor region, but the rain cancelled those plans and we took over this space instead.

Sparklecon Day 1 NUCC Couch

The table started the day empty, and there was a time when it was populated by scattered stickers, but towards the evening it became an electronics workshop. Here we can see multiple simultaneous projects underway.

Sparklecon Day 1 Workbench

I had a few taco, fries, and octopus kits to give out. While talking about tacos and KISS Tindie sculptures, it was suggested that I use my newfound circuit sculpture skills to build a taco truck. So I did!

Sparklecon Day 1 taco truck

KISS Tindies will be at SparkleCon

SparkleCon IV, the annual event held by 23b Shop, will be this upcoming weekend. It will be my first opportunity to attend and it looks like I’ll be jumping in with both feet and presenting part of a talk. Currently scheduled for Sunday morning at 9AM, the topic is Hackaday and Tindie, with focus on the recently concluded circuit sculpture project.

Ironically, there won’t be any actual contest entries at the presentation, because staff members like myself were not eligible to enter. So I’m bringing the next best thing: my KISS Tindies band which I built because I thought circuit sculptures looked like fun.

kiss tindie band on stage

The talk will be a condensed summary of my circuit sculpturing adventures documented on this blog. From my initial Tindie puppy, to my wire straightening tool, to the four members of the band and finally the drum set. The topic will neatly tie into both Hackaday and Tindie and it’s my way of making sure I hit the standard points without being too much of a corporate commercial.

We’ll see how successful the venture will be… my brain isn’t typically working at its best Sunday mornings at 9AM, and some fraction of the conference attendees will be hungover in bed. I choose to see this as a positive thing: it’s good practice for my public speaking skills, and any goofs would likely go unnoticed (or at least forgiven) by an equally night-owl-heavy crowd.

Party Bling in 30 Minutes: LED Blinky Collar

It’s good to have grand long term projects like Sawppy, but sometimes it’s nice to take a side trip and bang out a quick fun project. The KISS Tindies were originally supposed to be such a project but grew beyond their original scope. Today’s project had to be short by necessity. At less than 30 minutes, it’s one of my shortest projects ever.

Collar LED blinky final curved.jpg

The motivation for this project was an office party, but I didn’t know what the crowd was going to be like. My fallback outfit for such unknowns is a long-sleeved dress shirt and a sport jacket. If it turns out to be formal, I’ll be under-dressed but at least I’ll have a jacket on. If it turns out to be semi-formal, I should fit in. If it is casual, I can take off the jacket. But these are people in the electronics industry, so there’s a chance I will find a room full of people wearing flashing LEDs. Less than an hour before I had to leave, I decided that instead of my usual necktie I would wear a little bit of LED bling of my own.

The objective is to take the same self-blinking LEDs I used on my KISS Tindies and install them under my shirt collar. Since these LEDs can be obnoxiously bright (especially in dark rooms) the light will be indirect, bouncing off fabric underneath my collar. This way I don’t blind whoever I’m trying to hold a conversation with.

When I bought the self-blinking LEDs for the KISS Tindies project, I bought a bag of 100 so there’s plenty left to play with. For a battery holder I’ll use the same design I created for the Tindies out of copper wire. There’s no time to 3D print a structure, so I’m just going to use paper. Copper foil tape will form circuitry on that sheet of paper. Here’s the initial prototype. I folded the paper in half to give it more strength. I had also cut out a chunk of paper to help the battery holder stay in place.

collar led blinky prototype parts

Assembling these parts resulted in a successfully blinking LED and good enough to proceed.

collar led blinky prototype works

The final version used a longer sheet of paper. I measured my shirt collar and cut a sheet of paper so the ends would sit roughly 3mm inside the collar. This was longer than a normal sheet of paper so I had to pull a sheet of legal size paper out of my paper recycle bin. I think it was the legal disclosure form for a pre-approved credit card offer.

collar led blinky final

The LEDs sit a few centimeters inside the paper’s edge. The other side of the paper had extra copper tape to shield the light from shining through. I wanted the light to reflect inside my collar, not show through it. The first test showed a few circular spotlights on my shirt, so I added a sheet of Scotch tape to diffuse light. Once I was happy with the layout of this contraption, I soldered all components to copper foil for reliability.

Less than 30 minutes from start to finish, I had a blinky LED accessory for my shirt.

collar-led-blinky

As it turned out, there was only one other person wearing electronics, in the form of some electroluminescent wire. My blinky LED collar was more subtle about announcing itself, but it was noticed by enough people to make me happy.

(Cross-posted to Hackaday.io)