Salvage Surface Mount Switches For Homing Test

After I managed to destroy a stepper motor during experimentation, Emily graciously donated another one to the cause. Hooking it up to my A4988 test board, I could tell this optical drive carriage had more power than the one I salvaged from a laptop drive. At least this one could win a fight against gravity. However, I was dismayed to find it is still quite weak in absolute terms, and this time I'm wary of cranking up the power for fear of destroying another motor.

So to set up a rig for learning how to wire a homing switch for Grbl, I need to find switches that take less force to activate than the switches I already have on hand. Where might I find tiny switches that require only tiny force to activate? The recently disassembled laptop optical drive!

SMD switches

Its control board had a few tiny surface mount switches. They were connected to small mechanical linkages throughout the drive so the control board could determine various states of the eject mechanism and such. There were four switches in total, and I put them under a heat gun in an effort to remove them.

This was a tricky procedure, as I had to melt the solder without melting the plastic. I put too much heat into two of the switches and destroyed them in the process. Fortunately, two of them were removed relatively undamaged. I put one of them under a meter to check for continuity, and it appeared to still work as a switch.

And soon, if all goes well, it will be a homing switch.

Panasonic UJ-867 Optical Carriage (Briefly) Under A4988 Control

Once I extracted the optical assembly carriage from a Panasonic UJ-867 optical drive, the next step was to interface it with a Pololu A4988 driver board. Just as with the previous optical drive stepper motor, there were four visible pins, indicating a bipolar motor suitable for control via A4988. However, this motor is far smaller, as befits a laptop computer component.

Panasonic UJ-867 70 stepper motor connector

The existing motor control cable actually passed through the spindle motor, meaning there was no convenient place to solder new wires on the flexible connector. So the cable was removed and new wires were soldered in its place.

Panasonic UJ-867 80 stepper motor new wires

Given the fine pitch of these pins, it was very difficult to avoid solder bridges. But the motor appeared to run standalone, so I reinstalled it into the carriage, where it still ran – but was very weak. Hardly any power at all. When I tilted it up so the axis of travel was vertical, the carriage couldn't win its fight against gravity. Since the job is only to move an optical assembly, I didn't expect these carriages to exert a great deal of force. But I've seen vertically mounted slot-loading optical drives, so I thought it should at least be able to fight against gravity.

A Dell laptop charger delivers 19.2V. I'm not sure what voltage this motor was intended to run at, but 12V seemed reasonable. Then I increased current beyond the 50mA I used for the previous motor. Increasing both voltage and current did seem to deliver more power, but the carriage remained too weak to overcome gravity.
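
For future reference, the current limit on an A4988 carrier is set by its reference voltage: current limit = VREF / (8 × Rsense). Here is that math as a quick Python helper, assuming the 0.068 Ω sense resistors found on recent Pololu carriers (older boards used 0.050 Ω, so check the actual board before trusting the numbers):

# Back-of-envelope A4988 current limit math.
# Assumes 0.068 ohm sense resistors (recent Pololu carriers);
# older carriers used 0.050 ohm, so verify on the actual board.
R_SENSE = 0.068  # ohms

def vref_for_current(limit_amps):
    """VREF needed to set a desired A4988 current limit."""
    return limit_amps * 8 * R_SENSE

def current_for_vref(vref_volts):
    """Current limit implied by a measured VREF."""
    return vref_volts / (8 * R_SENSE)

print(vref_for_current(0.050))  # 50 mA target -> ~0.027 V VREF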

As I tilted the metal carriage assembly in my hand, I started noticing it was warming up. Oh no! The motor! I checked the temperature with my finger, which was a mistake: it was hot enough to be painful to human skin. I shut down my test program, but it was too late. The carriage never moved again.

Lessons learned: don't get overzealous with power, and check temperature more frequently.

And if I want to continue these experiments, I’ll need another stepper motor assembly.

Examining Pixelblaze Sensor Expansion Board

With my RGB-XYZ 3D sweep test program, I've verified my LED helix is fully set up: a Pixelblaze controller programmed with the geometry of the helix wound around a 3D printed chassis. I have a blank canvas – what shall I create on it? A Pixelblaze by itself is capable of generating some pretty amazing patterns, but by default it has no sense of the world around it; it can only run a programmed sequence like my sweep test. I could focus on writing patterns for spectacular LED light shows, but I decided to dig deeper for sensor-reactive patterns.

There are a few provisions on board for analog and digital inputs, so patterns could react to buttons or knobs. Going beyond such simple input is the Sensor Expansion Board. It is an optional Pixelblaze add-on board which provides the following:

  • A microphone specifically designed to continue functioning in loud environments
  • An ambient light level sensor
  • A 3-axis accelerometer
  • Five additional analog inputs

A Pixelblaze fresh out of the package includes a few sound-reactive patterns that work with the microphone. They are fun to play with, but that ground has been covered. Seeking fresh under-explored territory and an opportunity to write something useful for future users, I looked at the other sensors available and was drawn to the accelerometer. With it, I could write patterns that react to direction of gravity relative to the sensor board. This should be interesting.

The sensor board is fully documented on Github, including a description of the protocol used to send data to a Pixelblaze – or, really, to any other microcontroller capable of decoding 3.3 volt serial data at 115200 baud, which should be all of them! In my case I'll be using it with my Pixelblaze, and the first lesson here is that we only really need 3 pins out of its 7-pin expansion header: 3.3V power and ground, obviously, and since the protocol is unidirectional, only one of the two serial transmit pins is used by the sensor board. The remaining pins are passed through, available for other purposes. I'll explore those possibilities later; for now it's time to get set up to experiment with the accelerometer.
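
Since the data stream is one-way, any host with a 3.3V serial port can eavesdrop on it for experimentation. Below is a minimal Python/pyserial sketch of the general approach; the frame header string, frame length, and accelerometer field offset are all assumptions that must be verified against the protocol description in the Github documentation before trusting any numbers.

# Minimal sketch: read the sensor board's one-way serial stream.
# Frame layout values below (header, length, offset) are placeholder
# assumptions - verify them against the sensor board's Github docs.
import struct
import serial

PORT = '/dev/ttyUSB0'   # assumption: adjust for your serial adapter
HEADER = b'SB1.0'       # assumed frame header - verify against docs
FRAME_LEN = 100         # assumed frame length - verify against docs
ACCEL_OFFSET = 70       # assumed offset of 3x int16 accel - verify

with serial.Serial(PORT, 115200, timeout=1) as port:
    buf = bytearray()
    while True:
        buf += port.read(256)
        start = buf.find(HEADER)
        if start < 0 or len(buf) - start < FRAME_LEN:
            continue  # keep reading until a full frame is buffered
        frame = bytes(buf[start:start + FRAME_LEN])
        ax, ay, az = struct.unpack_from('<3h', frame, ACCEL_OFFSET)
        print(f'accel: {ax} {ay} {az}')
        buf = buf[start + FRAME_LEN:]  # resync after consumed frame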

Examining Adafruit AT42QT1070 Capacitive Touch Sensor Breakout

The Death Clock logic is built around user action to trigger its little show for amusement. While we could easily incorporate a micro switch or some such simple mechanical input, Emily felt it would make more sense to have a capacitive touch sensor. This fits the theme of the clock: sensing and reading a person's body instead of merely detecting a mechanical movement. So we needed something to perform this touch sensing, and she procured an Adafruit #1362, AT42QT1070 5-Pad Capacitive Touch Sensor Breakout Board. Inside the package was a set of leads for breadboard experimentation, so we soldered them on, put the works on a breadboard, and started playing with it.

Initially the board worked mostly as advertised on the Adafruit product page, but it is a lot more finicky than we had anticipated. We encountered frequent false positives (signaling touch when we hadn't touched the contact) and false negatives (not signaling touch when we had touched the contact). Generally the unpredictability got worse as we used larger pieces of conductive material, either in the form of longer wires or in the form of large metal blocks we could touch.

Digging into the datasheet linked from Adafruit's site, we learned that the sensor runs through a self-calibration routine upon powerup, and about a "guard" key that can be connected to something not intended as a touch contact to form a reference for the intended touch contacts. The calibration routine explains why we got wild readings as we experimented with different touch pads – we should have power cycled the chip with each new arrangement to let it recalibrate.

After we started power-cycling the chip, we got slightly better results, but we still needed to keep conductive material to a minimum for reliable operation. We played with the guard key but failed to discern any noticeable improvement in touch sense reliability; perhaps we're not using it right?

For Death Clock implementation we will try to keep the sensor board as close to the touch point as we can, and minimize wire length and electrical connections. Hopefully that’ll give us enough reliability.
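
As a sketch of how that implementation might consume the sensor's output: Adafruit's breakout runs the chip in standalone mode, where each sensing pad gets its own digital output pin, so the host only needs to watch a GPIO line. The pin number below, and my assumption that the output is active-low while touched, are both placeholders to verify against the actual wiring and the datasheet.

# Polling one AT42QT1070 output pin from a Raspberry Pi.
# Assumptions to verify: BCM pin 17 wiring, and active-low output
# (pin pulled low while a touch is detected).
import time
import RPi.GPIO as GPIO

TOUCH_PIN = 17  # hypothetical wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(TOUCH_PIN, GPIO.IN)

try:
    while True:
        if GPIO.input(TOUCH_PIN) == GPIO.LOW:  # assumed active-low
            print("Touch detected - start the Death Clock show")
            time.sleep(0.5)  # crude debounce after triggering
        time.sleep(0.02)
finally:
    GPIO.cleanup()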

Evaluating Microchip HV5812 For VFD Projects

[Emily] and I started our vacuum fluorescent display (VFD) project because there was an interesting unit available, with a look distinctly different from modern LEDs. We just had to salvage it out of an obsolete piece of electronics and figure out how to make it work. We now have a prototype VFD driver circuit up and running, and we can command it to light up arbitrary combinations of segments at arbitrary times from a Python program running on an attached Raspberry Pi. This is a satisfying milestone marking completion of our first generation hardware, allowing us to shift focus to what to actually put on that display.

The first few experiments with VFD patterns confirmed that we really like how a VFD looks! As people who love to take things apart to see how they work, we enjoy all the components of a VFD visible through its glass case. Their intricate internals qualify them as desktop sculpture just sitting there; making them light up is just icing on the cake.

With this early success and desire for more, chances are good that we’ll embark on additional VFD projects in the future. For our first VFD project we chose to stick with generic chips for the sake of learning the basic principles, but if we’re going to start building more we should look at using chips designed for the purpose.

According to Digi-Key's online catalog, there are dedicated vacuum fluorescent drivers available from Maxim and Microchip. None of Maxim's chips are available in hobbyist-friendly through-hole designs, but two of Microchip's three lines are. HV5812P-G is the 20-channel model in 28-pin DIP format, and HV518P-G is the 32-channel counterpart in 40-pin DIP format. Curiously, for 60% more channels, the HV518P-G costs over double the price. So it made sense to start with the HV5812.

With data and clock pins for straightforward serial data input, it was designed to be easy to drive from pretty much any microcontroller. The only thing that caught my attention is that the logic input lines expect 5V signals, with a minimum of 3.5V required to be interpreted as logic high. This means we couldn't drive it directly from 3.3V hosts like a Raspberry Pi or an ESP32; we'd need level shifters, or a 5V-capable part like a PIC to act as an intermediary.
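
Interface-wise, the HV5812 is essentially a 20-bit shift register with an output latch, so the driving code could be very simple. Here's a hypothetical Raspberry Pi sketch bit-banging the control lines (through level shifters, per the logic level issue above); the pin assignments and the clock/strobe polarity are assumptions to check against the datasheet timing diagrams, and the blanking pin is assumed tied inactive in hardware.

# Bit-banging 20 bits into an HV5812 from a Raspberry Pi, via
# 3.3V-to-5V level shifters. Pin numbers are hypothetical; clock and
# strobe polarity should be verified against the datasheet.
import RPi.GPIO as GPIO

DATA, CLOCK, STROBE = 5, 6, 13  # hypothetical BCM pin choices

GPIO.setmode(GPIO.BCM)
for pin in (DATA, CLOCK, STROBE):
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def write_channels(bits20):
    """Shift 20 bits MSB-first, then pulse strobe to latch outputs."""
    for i in range(19, -1, -1):
        GPIO.output(DATA, (bits20 >> i) & 1)
        GPIO.output(CLOCK, GPIO.HIGH)  # assume data clocked on rising edge
        GPIO.output(CLOCK, GPIO.LOW)
    GPIO.output(STROBE, GPIO.HIGH)     # copy shift register to outputs
    GPIO.output(STROBE, GPIO.LOW)

write_channels(0b0000_0000_0000_0000_0101)  # two hypothetical segments on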

It looks promising enough — and priced cheaply enough — to be a consideration for potential follow-on VFD projects. So we’ll add that to the Digi-Key shopping cart and see where things go from there.

A Close-Up Look At VFD Internals

When we first pulled the vacuum fluorescent display (VFD) from an old Canon tuner timer unit, we could see a lot of intricate details inside the sealed glass. We had work to do that day – probing the pinout and such – but part of its overall allure comes from the beauty of the details visible within. It is still something of interest; I just had to remember to bring my camera with a close-up macro lens to take a few photos.

VFD under macro lens off

One of the reasons a VFD is so interesting to look at is the fact that the actual illuminating elements sit beneath other parts which contribute to the process. Closest to the camera are the heating filaments, visible as four horizontal wires. This is where electrons begin their journey to trigger fluorescence.

Between these filaments and the individual illuminating segments are the control grids, visible literally as a very fine mesh, mostly – but not entirely – built on a pattern of hexagons.

And beyond the control grids are the individual phosphor-coated segments that we command to illuminate using our prototype driver board. (Once it is fully debugged, at least.) These phosphor elements are what actually emit the light visible to the user. The grid and filament are thin, which helps them block as little of this light as possible.

Fortunately an illuminated VFD segment emits plenty of light to make it through the fine mesh of grids and fine wires of filament. From a distance those fine elements aren’t even visible, but up close they provide a sophisticated look that can’t be matched by the simplicity of a modern segmented LED unit.

VFD under macro lens on

Original NEC VSL0010-A VFD Power Source

Now that we have a better understanding of how a NEC VSL0010-A vacuum fluorescent display (VFD) works, having figured out its control pinout with the help of an inkjet power supply, we returned to the carcass we salvaged that VFD from. Knowing each pin's function, we picked the pins that supplied 2.5V AC filament power to trace, expecting them to be the least likely to pass through or be shared with other devices. We traced through multiple circuit boards back to the main power transformer's output plug. We think it's the two gray wires on the left side of this picture, but our volt meter probes were too big to reach the visible contact points. And the potential risk of high voltage made us wary of poking bare wires into that connector as we did for the inkjet power supply.

NEC VSL0010-A VFD Power Supply - Probes Too Fat

Our solution came as a side benefit of a decision made earlier for other reasons. Since we were new to VFD technology, our curiosity-fueled exploratory session was undertaken with an inexpensive Harbor Freight meter instead of the nice Fluke in the shop. Originally the motivation was to reduce risk: we won't cry if we fry the Harbor Freight meter. But now we see a secondary benefit: with such an inexpensive device, we also feel free to modify its probes for the project at hand. Off we go to the bench grinder!

NEC VSL0010-A VFD Power Supply - Probes Meet Grinder

A few taps on the grinding wheel, and we have much slimmer probes that could reach in towards those contacts.

NEC VSL0010-A VFD Power Supply - Probes Now Thin

Suitably modified, we could get to work.

NEC VSL0010-A VFD Power Supply - Probes At Work

We were able to confirm the leftmost pair of wires, with gray insulation, is the (nominally 2.5V) AC supply for the VFD filament. The full set of output wires from this transformer, listed by insulation color:

  • Gray pair (leftmost in picture): 2.6V AC
  • Brown pair (spanning left and right sides): 41V AC
  • Dark blue pair (center in picture): 17.2V AC
  • Black pair (rightmost in picture): 26.6V AC

There was also a single light-blue wire adjacent to the pair of dark blue wires. Probing with the volt meter indicated it was a center tap between the dark blue pair.

NEC VSL0010-A VFD Power Supply Transformer

Once that was determined, we extracted the transformer as a single usable unit: there was a fuse holder and an extra power plug between it and the device's AC power cord. We're optimistic this assembly will find a role in whatever project that VFD eventually ends up in. 2.6V AC can warm the filament, and rectified 26.6V AC should work well for the VFD grid and segments: rectified and filtered, 26.6V AC peaks around 37V DC, which should sag under load toward the neighborhood of the ~30V DC we measured on the original board. And with proper rectification and filtering, a microcontroller could run off one of these rails. It'll be more complex than driving an LED display unit, but it'll be worth it for that distinctive VFD glow.

HP Inkjet Printer Power Supply For NEC VSL0010-A VFD

One of the reasons LEDs have overtaken VFDs in electronics is reduced power requirements: not just the raw wattage consumed, but also the varying voltage levels required to drive a VFD. The NEC VSL0010-A VFD whose pinout we just probed ran on 2.5V AC and ~30V DC. In contrast, most LEDs can run on the same 5V or 3.3V DC power plane as their digital drive logic, vastly simplifying design.

We didn't have a low voltage AC source handy for probing, so we used 2.5V DC. We expected this to have only cosmetic effects: one side of our VFD will be brighter than the other, since one side will have a filament-to-grid/element voltage difference of 30V while the other will have only 27.5V.

But putting 2.5V DC on the filament occupied the only bench power supply we had available at the time. What would we use for our 30V DC source? The answer came from our pile of previously disassembled electronics: a retired HP inkjet printer's power supply module, labeled with the number CM751-60190.

HP CM751-60190 AC Power Adapter

According to the label, this module could deliver DC at 32V and 12V. Looking at its three-conductor output plug, it was easy to conclude that we have one wire for ground, one wire for 32V, and one wire for 12V. But that easy conclusion would be wrong. Look closer at the label…

HP CM751-60190 AC Power Adapter pinout

We do indeed have a ground wire in the center, but there is only one power supply wire, labelled +32V/+12V. It actually delivers 32 or 12 volts, not 32 and 12 volts. And the last pin on the left is marked only with an icon – what did that mean? Our hint comes from the power output specifications: +32V 1095mA or +12V 170mA. We deduced the icon is a moon, indicating a way to toggle a low-power sleep mode where the power supply delivers only 12V × 170mA ≈ 2 watts versus the full 32V × 1095mA ≈ 35 watts.

With that hypothesis in hand, it’s time to hook up some wires and test its behavior.

HP CM751-60190 AC Power Adapter test

When the "sleep mode" pin is left floating, voltage output is 32V DC. When that pin is grounded, voltage output drops to 12V DC. Since we're looking for 32V DC to drive our VFD grid and elements, it was easy enough to leave the sleep wire unconnected and solder wires to the remaining two conductors to obtain 32V DC for our VFD adventures.

HP CM751-60190 AC Power Adapter new wires

Sleuthing NEC VSL0010-A VFD Control Pinout

Vacuum fluorescent display (VFD) technology used to be the dominant form of electronics display, but once LEDs became cheap and bright enough, they displaced VFDs across much of the electronics industry. Now a VFD is associated with vintage technology, and its distinctive glow has become a novelty in and of itself. Our star attraction today served as the display for a timer and tuner unit that plugs into the tape handling unit of a Canon VC-10 camera to turn it into a VCR. A VFD is very age-appropriate for a device that tuned into now-obsolete NTSC video broadcasts for recording to now-obsolete VHS magnetic tape.

Obviously, in this age of on-demand internet streaming video, there’s little point in bringing the whole system back to life. But the VFD appears to be in good shape, so in pursuit of that VFD glow, salvage operation began at a SGVHAK meetup.

NEC VSL0010-A VFD Before

We had the luxury of probing it while running, aided by the fact that we can see much of its implementation inside the vacuum chamber through the clear glass. The far right and left pins are visibly connected to filament wires; probing those pins showed approximately 2.5V AC. We can also see eight grids, each with a visible connection to its corresponding pin. That leaves ten pins to control the elements within each grid. Probing the grid and element pins indicated they were driven at roughly 30V DC. (It was hard to be sure because we didn't have a constant-on element to probe… like all VCRs, it was blinking 12:00.)

This was enough of a preliminary scouting report for us to proceed with desoldering.

NEC VSL0010-A VFD Unsoldering

Its solder predates finicky lead-free RoHS formulations, so the display was quickly freed.

NEC VSL0010-A VFD Freed

Now we can see its back side and, more importantly, its part number which immediately went into a web search on how to control it.

NEC VSL0010-A VFD Rear

The top hit on this query was this StackExchange thread, started by someone who had also salvaged one of these displays and wanted to get it up and running with an Arduino. Sadly the answers were unhelpful and not at all supportive, discouraging the effort with "don't bother with it".

We shrugged, undeterred, and continued working to figure it out by ourselves.

NEC VSL0010-A VFD Front

If presented with an unknown VFD in isolation, the biggest unknown would have been what voltage levels to use. But since we had that information from our earlier probing, we could proceed with confidence that we wouldn't burn up our VFD. We powered up the filament, then powered one of the pins visibly connected to a grid, and touched each of the remaining ten non-grid pins to see what lit up. For this part of the experiment, we got our 32V DC from the power supply unit of an HP inkjet printer.

We then repeated the ten-element probe for each grid, writing down what we found along the way.

NEC VSL0010-A VFD Annotated

We hope to make use of this newfound knowledge in a future project, and we hope this blog post will be found by someone in the future and help them return a VFD to its former glowing glory.

Window Shopping JeVois Machine Vision Camera

In the discussion period that followed my Sawppy presentation at RSSC, the topic of machine vision came up. As we discussed problems and potential solutions, the JeVois camera was mentioned as one potential tool for machine vision problems. I wrote down the name and resolved to look it up later. I have done so, and I like what I see.

The first thing that made me smile was the fact that it is a Kickstarter success story. I haven't committed any of my own money to any Kickstarter project, but I've certainly read more about failed projects than successful ones. It's nice when the occasional success story comes across my radar.

The camera module is of the type commonly used in cell phones, and behind the camera is a small machine vision computer, again built mostly of portable electronics components. The idea is a completely self-contained vision processing system that requires only power input and delivers processed data output. Various machine vision tasks can be handled completely inside the little module, as long as the user is realistic about the limited processing power available. It is less powerful, but also less expensive and smaller, than Google's AIY Vision module.

The small size is impressive, and it led to my next note of happiness: it looks pretty well documented. When I looked at its size, I had wondered how best to mount the camera on a project. It took less than 5 minutes to decipher the documentation hierarchy and find details on physical dimensions and how to mount the camera case. Similarly, my curiosity about power requirements was quickly answered with confirmation that its power draw does indeed exceed the baseline USB allowance of 500mA.

Ease of programming was the next investigation. Some of the claims around this camera made it sound like its open source software stack can run on a developer's PC and be debugged there before publishing to the camera. However, the few tutorials I skimmed through (one example here) all required an actual JeVois camera to run vision code. I interpret this to mean that the JeVois software stack is indeed specific to the camera, and that "develop on your PC first" only means developing vision algorithms on a PC in a general sense before porting them to the JeVois software stack for deployment on the camera itself. If I find out I'm wrong, I'll come back and update this paragraph.

When I looked on Hackaday, I saw that one of the writers thought the JeVois camera's demo mode was a very effective piece of software. It should be quite effective at its job: getting users interested in digging deeper. Project-wise, I see a squirrel detector and a front door camera already online.

The JeVois camera has certainly earned a place on my “might be interesting to look into” list for more detailed later investigation.

Sawppy Odometry Candidate: Flow Breakout Board

When I presented Sawppy the Rover at the Robotics Society of Southern California, one of the things I brought up as an open problem was how to determine Sawppy's movement through its environment. Wheel odometry is sufficient for a robot traveling on flat ground like Phoebe, but when Sawppy travels on rough terrain things can get messy in more ways than one.

In the question-and-answer session, some people brought up the idea of calculating odometry by visual means, much in the way a modern optical computer mouse determines its movement on a desk. This is something I could whip up with a downward-pointing webcam and open source software, but there are also pieces of hardware designed specifically to perform this task. One example is the PMW3901 chip, which I could experiment with using breakout boards like this item on Tindie.

However, that visual calculation is only part of the challenge, because translating what that camera sees into a physical dimension requires one more piece of data: the distance from the camera to the surface it is looking at. Depending on the application, this distance might be a known quantity, but for robotic applications where the distance may vary, a distance sensor would be required.

As a follow-up to my presentation, RSSC's online discussion forum brought up the Flow Breakout Board, a small lightweight module that puts the aforementioned PMW3901 chip alongside a VL53L0X distance sensor. This is an interesting candidate for helping Sawppy gain awareness of how it is moving through its environment (or failing to do so, as the case may be).

flow_breakout_585px-1

The breakout board only handles the electrical connections – an external computer or microcontroller will be necessary to make the whole system sing. That external module will need to communicate with the PMW3901 via SPI and, separately, with the VL53L0X via I2C. Then it will need to perform the math to calculate the actual X-Y distance traveled. This in itself isn't a problem.
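
To illustrate that math: the PMW3901 reports motion as unitless counts proportional to the angle the ground sweeps across its view, so physical translation is roughly that angle times the height reported by the VL53L0X. A sketch of the conversion, with the counts-per-radian constant as a made-up placeholder that would need calibration against the real optics:

# Converting optical flow counts to ground distance using a height
# reading. COUNTS_PER_RADIAN is a placeholder, not a datasheet value;
# it must be calibrated for the actual sensor and lens.
COUNTS_PER_RADIAN = 500.0

def flow_to_meters(flow_counts, height_m):
    """Small-angle approximation: distance = height * angle swept."""
    angle = flow_counts / COUNTS_PER_RADIAN
    return height_m * angle

# Example: 120 counts of X motion while 0.15 m above the floor.
print(flow_to_meters(120, 0.15))  # ~0.036 m traveled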

The problem comes from the fact that the PMW3901 was designed to be used on small multirotor aircraft to help them hold position. Several design decisions that make sense for its intended purpose turn out to be problems for Sawppy.

  1. This chip is designed to help hold position, which is why it was not concerned with knowing the height above the surface or the physical dimension of the translation: the sensor only needs to detect movement so the aircraft can be brought back to position.
  2. Multirotor aircraft all have built-in gyroscopes to stabilize themselves, so they already detect rotation about their Z axis. Sawppy has no such sensor and would not be able to calculate its position in global space without knowing how much it has turned in place.
  3. Multirotor aircraft fly in the air, so the designed working range of 80mm to infinity is perfectly fine. However, Sawppy has only 160mm between the bottom of the equipment bay and nominal floor distance. When traversing obstacles more than 80mm tall, or rough terrain that brings the surface within 80mm of the sensor, this sensor would become disoriented.

This is a very cool sensor module that has a lot of possibilities, and despite its potential problems it has been added to the list of things to try for Sawppy in the future.

Intel RealSense T265 Tracking Camera

In the middle of these experiments with an Xbox 360 Kinect as a robot depth sensor, Intel announced a new product along similar lines, and a tempting venue for robotic exploration: the Intel RealSense T265 Tracking Camera. Here's a picture from Intel's website announcing the product:

intel_realsense_tracking_camera_t265_hero

The T265 is not a direct replacement for the Kinect, at least not as a depth sensing camera. For that, we need to look at Intel's D415 and D435. Those would be fun to play with too, but I already had the Kinect, so I'm learning on what I have before I spend money.

So if the T265 is not a Kinect replacement, how is it interesting? It can act as a complement to a depth sensing camera. The point of the thing is not to capture the environment – it is to track its own motion and position within that environment. Yes, there is the option of an image output stream, but the primary data output of this device is a position and orientation.

This type of camera-based "inside-out" tracking is used by Windows Mixed Reality headsets to determine their user's head position and orientation. These sensors require low latency and high accuracy to avoid VR motion sickness, and the capability has obvious applications in robotics. Now Intel's T265 offers it in a standalone device.

According to Intel, the implementation is based on a pair of video cameras and an inertial measurement unit (IMU). Data feeds into internal electronics running a V-SLAM (visual simultaneous localization and mapping) algorithm aided by a Movidius neural network chip. This process generates the position and orientation output. It seems pretty impressive to me that this is done in such a small form factor and at high speed (or at least low latency) on 1.5 watts of power.

At $200, it is a tempting toy for experimentation. Before I spend that money, though, I’ll want to read more about how to interface with this device. The USB 2 connection is not surprising, but there’s a phrase that I don’t yet understand: “non volatile memory to boot the device” makes it sound like the host is responsible for some portion of the device’s boot process, which isn’t like any other sensor I’ve worked with before.

Xbox 360 Kinect Depth Sensor Data via OpenKinect (freenect)

I want to try using my Xbox 360 Kinect as a robot sensor. After making the necessary electrical modifications, I decided to try talking to my sensor bar via the OpenKinect (a.k.a. freenect, a.k.a. libfreenect) driver software. Getting it up and running on my Ubuntu 16.04 installation was surprisingly easy: someone has put in the work to make it a part of the standard Ubuntu software repository. Whoever it was, thank you!

Once installed, though, I wasn't sure what to do next. I found documentation telling me to launch a test/demonstration viewer application called glview. That turned out to be old information: the test app is actually called freenect-glview. Also, it is no longer added to the default user search path, so I had to launch it with the full path /usr/bin/freenect-glview.

Once I got past that minor hurdle, I had on my screen a window showing two video feeds from my Kinect sensor: on the left, depth information represented by colors; on the right, normal full-color video. Here's my Kinect pointed at its intended home: on board my rover Sawppy.

freenect-glview with sawppy

This gave me a good close look at Kinect depth data. What's visible and represented by color is pretty good, but the black areas worry me: they represent places where the Kinect was not able to extract depth information. I didn't expect it to be able to pick out fine surface details of Sawppy components, but I did expect it to see the whole chassis in some form. This was not the case, with areas of black all over Sawppy's chassis.

Some observations of what an Xbox 360 Kinect could not see:

  • The top of Sawppy's camera mast: neither the webcam nor the 3D-printed mount for that camera. This part is the most concerning because I have no hypothesis why.
  • The bottom of Sawppy's payload bay. This is unfortunate but understandable: it is a piece of laser-cut acrylic, which would reflect the Kinect's projected pattern away from the receiving IR camera.
  • The battery pack in the rear, which has a smooth clear plastic package and would also not reflect much back to the camera.
  • The wiring bundles, which are enclosed in a braided sleeve. It would scatter the majority of the IR pattern, and whatever makes it back to the receiving camera would probably be jumbled.

None of these are deal-breakers on their own; they're part of the challenges of building a robot that functions outside of a controlled environment. In addition to those, I'm also concerned about the frame-to-frame inconsistency of the depth data. Problematic areas are sometimes visible in one frame and gone in the next. The noisiness of this information might confuse a robot trying to make sense of its environment with this data. It's not visible in the screenshot above, but here's an animated GIF showing a short snippet for illustration:

kinect-looking-at-sawppy-inconsistencies
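
Both the dead zones and the flicker can be quantified with the libfreenect Python bindings: grab raw depth frames and count the pixels flagged as invalid (2047 in the default 11-bit depth format), then watch how that count jumps from frame to frame. A quick sketch of the idea:

# Count how many pixels the Kinect failed to measure in each frame.
# In the default 11-bit depth format, the value 2047 marks "no data".
import freenect
import numpy as np

INVALID = 2047

for i in range(30):
    depth, _ = freenect.sync_get_depth()       # one raw depth frame
    bad = np.count_nonzero(depth == INVALID)   # pixels with no reading
    print(f"frame {i}: {100.0 * bad / depth.size:.1f}% invalid")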

Xbox 360 Kinect Driver: OpenNI or OpenKinect (freenect)?

The Kinect sensor bar from my Xbox 360 has long been retired from gaming duty. For its second career as a robot sensor, I have cut off its proprietary plug and rewired it for computer use. Once I verified the sensor bar was electrically compatible with a computer running Ubuntu, the first order of business was to turn fragile test connections into properly soldered wires protected by heat shrink tubing. Here's my sensor bar with its new standard USB 2.0 connector and a JST-RCY connector for 12 volt power.

xbox 360 kinect with modified plugs 12v usb

With the electrical side settled, attention turns to software. The sensor bar can tell the computer it is a USB device, but we'll need additional driver software to access all the data it can provide. I chose to start with the Xbox 360 Kinect because of its wider software support, which means I have multiple choices of software stack to work with.

OpenNI is one option. This open source SDK is still around thanks to Occipital, one of the companies that partnered with PrimeSense. PrimeSense was the company that originally developed the technology behind the Xbox 360 Kinect sensor, but they have since been acquired by Apple and their technology incorporated into the iPhone X. Occipital itself is still in the depth sensor business with their Structure sensor bar, available standalone or incorporated into products like Misty.

OpenKinect is another option. It doesn't have a clear corporate sponsor like OpenNI, and it seems to have its roots in the winning entry of the Adafruit contest to create an open source Kinect driver. Confusingly, it is also sometimes called freenect or variants thereof (its software library is libfreenect, etc.).

Both of these appear to still be receiving maintenance updates, and both have been used in a lot of cool Kinect projects outside of Xbox 360 games, ensuring there will be a body of source code available as reference for using either. Neither is focused on ROS, but people have written ROS drivers for both OpenNI and OpenKinect (freenect), and there's even an effort to rationalize across both.

One advantage of OpenNI is that it provides an abstraction layer for many different depth cameras built on PrimeSense technology, making code more portable across different hardware. This does not, however, include the second generation Xbox One Kinect, as that was built with a different (not PrimeSense) technology.

In contrast, OpenKinect is specific to the Xbox 360 Kinect sensor bar. It provides access to parts beyond the PrimeSense sensor: the microphone array, tilt motor, and accelerometer. While this means it doesn't support the second generation Xbox One Kinect either, there's a standalone sibling project, libfreenect2, for meeting that need.

I don't foresee using any other PrimeSense-based sensors, so OpenNI's abstraction doesn't draw me; the access to other hardware offered by OpenKinect does. Plus, I do hope to upgrade to an Xbox One Kinect in the future. So I decided to start my Xbox 360 Kinect experimentation using OpenKinect.

Modify Xbox 360 Kinect for PC Use

I want to get some first-hand experience working with depth cameras in a robotics context, and a little research implied the Xbox 360 Kinect sensor bar is the cheapest hardware that also has decent open source software support. So it's time to dust off the Kinect sensor bar from my Halo 4 Edition Xbox 360.

I was a huge Halo fan and purchased this Halo 4 console bundle as an upgrade from my first generation Xbox 360. My Halo enthusiasm has since faded, and so has the Xbox 360. After I upgraded to an Xbox One, I lent this console (plus all accessories and games) to a friend with young children. Eventually the children lost interest in an antiquated console that didn't play any of the cool new games, and it resumed gathering dust. When I asked if I could reclaim my Kinect sensor bar, I was told to reclaim the whole works. The first accessory to undergo disassembly at SGVTech was the Xbox 360 Racing Steering Wheel; now it is time for the second accessory: my Kinect sensor bar.

The sensor bar connected to my console via a proprietary connector. Most Xbox 360 accessories are wireless battery-powered devices, but the Kinect sends far more data than normal wireless controllers and requires much more power than rechargeable AA batteries can handle. Thus the proprietary connector is a combination of a 12 volt power supply alongside standard USB 2.0 data at 5 volts. To convert this sensor bar for computer use instead of an Xbox 360, the proprietary connector needs to be replaced by two separate connectors: a standard USB 2.0 plug plus a 12V power supply plug.

Having a functioning Xbox 360 made the task easier. First, by going into the Kinect diagnostics menu, I could verify the Kinect was in working condition before I started cutting things up. Second, after I severed the proprietary plug and splayed out the wires in the cable, a multimeter was able to easily determine the wires for 12 volts (tan), 5 volts (red), and ground (black) by detecting the voltages placed on those wires by a running Xbox 360.

That left only the two USB data wires, colored green and white. Thankfully, these colors appear to be fairly standardized across USB cables: when I cut apart a USB 2.0 cable to use as my new plug, I found the same red, black, green, and white wire colors inside. To test the easy thing first, I matched wire colors, kept them from shorting against each other with small pieces of tape, and put 12V power on the tan wire using a bench power supply.

xbox 360 kinect on workbench

Since I was not confident in this wiring, I used my cheap laptop to test the suspect USB wiring instead of my good laptop. Fortunately, the color matching appeared to work and the sensor bar enumerated properly. Ubuntu's dmesg utility lists a Kinect sensor bar as a USB hub with three attached devices:

  1. Xbox NUI Motor: a small motor that can tilt the sensor bar up or down.
  2. Xbox Kinect Audio: on board microphone array.
  3. Xbox NUI Camera: this is our depth-sensing star!

xbox nui camera detected

[ 84.825198] usb 2-3.3: new high-speed USB device number 10 using xhci_hcd
[ 84.931728] usb 2-3.3: New USB device found, idVendor=045e, idProduct=02ae
[ 84.931733] usb 2-3.3: New USB device strings: Mfr=2, Product=1, SerialNumber=3
[ 84.931735] usb 2-3.3: Product: Xbox NUI Camera
[ 84.931737] usb 2-3.3: Manufacturer: Microsoft


ROS In Three Dimensions: Starting With Xbox 360 Kinect

The long-term goal driving my robotics investigations is to build something that has an awareness of its environment and can intelligently plan actions within it. (This is a goal shared by many other members of RSSC as well.) Building Phoebe gave me an introduction to ROS running in two dimensions, and now I have ambitions to graduate to three. A robot working in three dimensions needs a sensor that works in three dimensions, so that's where I'm going to start.

Phoebe started with a 2D laser scanner purchased off eBay that I learned to get up and running in ROS. Similarly, the cheapest 3D sensor that can be put on a ROS robot is a repurposed Kinect sensor bar from an Xbox game console. Even better, since I've been an Xbox gamer (more specifically a Halo and Forza gamer), I don't even need to visit eBay: I have my own Kinect to draft into this project. In fact, I have more than one. I have both the first generation Kinect sensor accessory for the Xbox 360, and the second generation that was released alongside the Xbox One.

xbox 360 kinect and xbox one kinect

The newer Xbox One Kinect is a superior sensor with a wider field of view, higher resolution, and better precision. But that doesn’t necessarily make it the best choice to start off with, because hardware capability is only part of the story.

When the Xbox 360 Kinect launched, it was a completely novel device offering depth sensing at a fraction of the price of existing depth sensors. There was a lot of enthusiasm, both in the context of video gaming and in hacking them for use outside of Xbox 360 games. Unfortunately, the breathless hype wrote checks that the reality of a low-cost depth camera couldn't quite cash. By the time the Xbox One launched with an updated Kinect, interest had waned, and far fewer open source projects aimed to work with the second generation Kinect.

The superior capabilities of the second generation sensor bar also brought downsides: it required more data bandwidth, and hence a move to USB 3.0. At the time, the USB 3.0 ecosystem was still maturing, and the new Kinect had problems working with certain USB 3.0 implementations. Even when the data could get into a computer, the sheer amount of it placed more demands on processing code. Coupled with reduced public interest, this meant software support for the second generation Kinect is less robust. A web search found a lot of people who encountered problems trying to get their second generation bar to work.

In the interest of learning the ropes and getting an introduction to the world of 3D sensing, I decided a larger and more stable software base is more interesting than raw hardware capabilities. I’ll use the first generation Xbox 360 Kinect sensor bar to climb the learning curve of building a three-dimensional solution in ROS. Once that is up and running, I can try to tame the more finicky second generation Kinect.

Happy Octopus Eating Taco and Fries

As a short break from larger scale projects, I decided to get some more SMD soldering practice. When I was at Supercon, I received these little soldering kits designed to the simple "Shitty Add-On" specification. It's a way for people to get a simple start with PCB projects made with aesthetics in mind as well as functionality, headlined by sophisticated #badgelife creations.

As befits their simple nature, all I had to start with were little zip lock bags of parts. The first hint toward assembly instructions was printed on the circuit board: the text @Michelle4904, which pointed to the Twitter page of Michelle Grau and a Google Doc of assembly instructions. One notable aspect of these kits is that there were no spare parts to replace any that might accidentally fly off into space, which meant I had to be extra careful handling them. Fortunately, the parts were larger than those in my most recent SMD LED project, and while I did drop a few, they were large enough that they did not fly far and I was able to recover them.

I started with the fries. It was slow going at first because I was very afraid of losing parts, but I built up a feel for handling them and things got gradually faster. After a bit of experimentation, I settled on a pattern of:

  1. Tin one pad with a small blob of solder.
  2. Place the SMD component next to the blob, and melt the blob again to connect them.
  3. Solder the other end of the SMD component to the other pad.

michelle4904 fries in progress

This is the rapid start portion of the learning curve – every LED felt faster and easier than the last. Soon the pack of fries was finished and illuminated. I was a little confused as to why the five LEDs were green; I had expected them to be yellow. Maybe they are seasoned fries with parsley or something?

Tangential note: when I visit Red Robin I like to upgrade to their garlic herbed fries.

michelle4904 fries illuminated

Once the fries were done, I moved on to the taco, which had a denser arrangement of components. It was an appropriate next step up in difficulty.

michelle4904 tacos in progress

Once completed, I had a taco with yellow LEDs for the shell (the same yellow I had expected in the fries…), a red LED for tomato, a green LED for lettuce, and a white LED for sour cream. It's a fun little project.

Tangential note: Ixtaco Taqueria is my favorite taco place in my neighborhood.

michelle4904 tacos illuminated

The last zip lock bag held a smiling octopus, and it was the easiest one to solder. It appears the original intention was for it to be a 1-to-4 expansion board for shitty add-ons, but if so, the connector genders are backwards.

michelle4904 octopus with connectors

No matter, I'll just solder the taco and fries permanently to my happy octopus's tentacles. To let this assembly of PCBs stand on its own, I soldered on one of the battery holders I designed for my KISS Tindies.

michelle4904 kit combo battery

And here’s a happy octopus enjoying Taco Tuesday and a side of fries.

michelle4904 kit combo illuminated

Strange Failure Of Monoprice Monitor 10734

Under examination at a recent SGVTech meet was a strange failure mode for a big monitor. Monoprice item #10734 is a 30-inch diagonal monitor with 2560×1600 resolution. The display panel is IPS type with an LED backlight – in short, it's a pretty capable monitor. Unfortunately, it has suffered a failure of some sort that makes it pretty useless as a computer monitor: certain subpixels on the screen no longer respond to display input.

Here the monitor is supposed to display the black text "What" on a white background. But as we can see, some parts inside the letters that are supposed to be dark are still illuminated, most visibly here in some green subpixels. The misbehaving pixels are not random – they follow a regular pattern – but I'm at a loss as to why this is happening.

monoprice 10734 subpixel failure

And these misbehaving subpixels drift in value. Over the course of several minutes, they shift slightly in response to adjacent pixels. The end result is that anything left on screen for more than a few minutes leaves a ghost behind, its colors remembered by the misbehaving subpixels. It looks like old-school CRT screen burn-in, but takes only a few minutes to create. Unplugging the monitor, waiting a few minutes, plugging it back in, and turning it on does not eliminate the ghost; I need to wait a week before the ghosting fades to an undetectable level. This behavior means something, but I don't know what.

In the hope that something would be obviously broken, out came the screwdrivers to take the monitor apart. We were not surprised to find minimal components inside the metal case: there is the big panel unit, connected to a small interface board which hosts the power plug and all the video input ports, connected in turn to an even smaller circuit board hosting the user control buttons.

monoprice 10734 backless

The panel is made by LG, model number LM300WQ6(SL)(A1).

monoprice 10734 panel lg lm300wq6 sl a1

A web search found the LG datasheet for this panel. Because it is a 10-bit-per-color panel with so many pixels, its data bandwidth requirements are enormous: 4 LVDS channels clocking over 70 MHz. There were no visibly damaged parts on the panel's integrated circuit board, and on the other end of the LVDS interface cables, no visible damage on the interface board either.

monoprice 10734 interface board
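
Running rough numbers shows why that bandwidth figure is what it is. The ~20% blanking overhead below is a typical figure I assumed for illustration, not a number from the datasheet:

# Rough LVDS bandwidth check for a 2560x1600 @ 60 Hz panel.
# The 20% blanking overhead is a typical assumption, not from the
# datasheet; actual timing will differ somewhat.
active_pixels = 2560 * 1600          # ~4.1 Mpix per frame
pixel_rate = active_pixels * 60      # ~245.8 Mpix/s of active video
with_blanking = pixel_rate * 1.2     # ~295 Mpix/s including blanking
per_channel = with_blanking / 4      # split across 4 LVDS channels
print(per_channel / 1e6)             # ~73.7 MHz per channel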

Before I opened this up and found the datasheet, I had thought maybe I could generate signals to feed this panel using a PIC or something. But the chips I have are grossly underpowered for talking to a 4-channel LVDS panel at the required speed. Perhaps this could be an FPGA project in the future? For now this misbehaving monitor will sit in a corner gathering dust, waiting either for the time I get around to that project, or for the next housecleaning pass, when it departs as electronic recycling – whichever comes first.

Freeform Fun with Salvaged SMD LEDs

There's a freeform circuit contest going on at Hackaday right now. I'm not eligible to enter, but I can still have fun on my own. I haven't had any experience creating freeform circuit sculptures, and now is as good a time as any to play around for a bit.

Where should I start? The sensible thing would be to start simple with a few large through-hole light-emitting diodes (LEDs), but I didn't do that. I decided to start higher up the difficulty scale because of a few other events. The first was that I learned to salvage surface mount devices (SMD) from circuit boards with a cheap hot air gun originally designed for paint stripping. I had pulled a few SMD LEDs from a retired landline telephone, and they were sitting in a jar.

The second was the arrival of a fine soldering iron tip (*). I had ordered it in anticipation of trying to repair a damaged ESP32 module. I thought I should practice using these new tips on something expendable before tackling an actual project, and a freeform exercise with salvaged SMD LEDs seemed like a great opportunity to do so.

As a beginner at free form soldering, and a beginner at SMD soldering, the results were predictably terrible. I will win no prizes for fine workmanship here! But everyone has to start somewhere, and there will be many opportunities for practice in the future.

7 survivors light

It's just a simple circuit with seven LEDs in parallel, hanging off the lead of their shared current-limiting resistor. Not visible here is another aspect of learning to work with surface mount components: they are really, really small. I had actually salvaged nine LEDs; two of the nine flew off somewhere in my workshop and didn't make it into the final product.

9 salvaged LEDs
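
A note on the circuit design: paralleling LEDs behind a single shared resistor is a shortcut that's fine for a practice piece like this, though it does let whichever LED has the lowest forward voltage hog more than its share of current. For the record, here's the back-of-the-envelope resistor math, with supply and forward voltages as assumed placeholders since salvaged parts come with no datasheet:

# Shared current-limiting resistor for 7 parallel LEDs.
# Assumed values - salvaged parts have no datasheet:
supply_v = 5.0     # hypothetical 5V supply
led_vf = 2.0       # roughly typical forward voltage for these LEDs
per_led_ma = 5     # modest target current per LED

total_a = 7 * per_led_ma / 1000.0
resistor_ohms = (supply_v - led_vf) / total_a
print(resistor_ohms)  # ~86 ohms; nearest standard values are 82 or 91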

For comparison, here is the "I ❤ SMD" soldering kit that was my gentle introduction to surface mount soldering. The LEDs in that soldering kit were significantly larger and easier to manipulate than those I salvaged from an obsolete landline telephone.

SMD intro size comparison

Moral of the story: for future projects practicing SMD assembly, be sure to have spare components on hand to replace those that fly off into space.


(*) Disclosure: As an Amazon Associate I earn from qualifying purchases.

Sony KP-53S35 Signal Board “A” Components

Here are the prizes earned from an afternoon spent desoldering parts from a Sony KP-53S35's signal board "A".

Signal board A top before

The most visually striking components were the shiny metal boxes in the corner. This is where the signal from the TV antenna enters the system: the RF F-type connectors on the back panel connect to these modules via their RCA-type connectors. Since this TV tuned in analog TV broadcast signals that have long since been retired, I doubt these parts are good for any functional purpose anymore. But they are still shiny and likely to end up in a nonfunctional sculpture project.

Near these modules on the signal board was circuit board "P". It was the only module installed as a plug-in card, which caught our eye. Why would Sony design an easily removable module? There were two candidate explanations: (1) easy replacement because it was expected to fail frequently, or (2) easy replacement because it was meant to be swapped out. Since the module worked flawlessly for 21 years, it's probably the latter. A web search for the two main ICs on board found that the Philips TDA8315T is an NTSC decoder, which confirmed hypothesis #2: this "P" board was designed to be easily swapped so the TV could support other broadcast standards.

The RCA jacks are simple and quite likely to find use in another project.

Miscellaneous ICs and other modules were removed mostly as practice. I may look up their identifiers to see if anything is useful, but some of the parts (like the chips with the Sony logo on top) are going to be proprietary and not worth the effort to figure out what they do.

The largest surface mount chip – which I used as hot air SMD removal practice – is labeled BH3856FS: an audio processing chip handling volume and tone control. Looking at the flip side of the circuit board, we can see it has a large supporting cast of components clustered near it. It might be fun to see if I can power it up for a simple "Hello World" circuit, but returning it to full operation depends on the next item:

What's far more interesting sits nearby: the TDA7262, a stereo audio amplifier with 20W per channel. That might be powerful enough to drive deflection coils to create Lissajous curves. The possibility was enough to make me spend the time and effort to remove its heat sinks gently and also recover all the nearby components that might support it. I think it would be a lot of fun to get this chip back up and running in a CRT Lissajous curve project, either with or without its former partner, the BH3856FS audio chip above.