Solar Eclipse High Altitude Balloon, part 2

Planning

If there’s one piece of advice I could give to anyone looking to launch a high altitude balloon, it’s this: don’t pick a date and try to prepare for it; instead, get everything prepared and then pick a date. The former is a recipe for disaster; you’ll start cutting corners and leaving things unprepared and unchecked. In my case, I didn’t even think of the idea until the eclipse was a month away, and I couldn’t really reschedule.

On the plus side, I had the pleasure of working with some fantastic people at the Boulder Hacker Space, Solid State Depot, and I’m confident that the project would not have succeeded without their help.

Overall, it’s important that you do everything you can to keep your launch safe and prevent it from interfering with or presenting dangers to other people. Launching a balloon presents a risk to other aircraft and to people on the ground. The FAA has stated that they think this risk is acceptable because of the scientific value that balloon launches can provide. In our case, we were doing this for fun and to learn, but there was little scientific value otherwise, so be mindful and polite and do everything you can to reduce risk.

Coordinating

As this was a group project, we tried to meet often in person. We also kept documentation on our wiki, which I highly recommend. It was useful any time we had some documentation we wanted to keep or share, and it’s easy and quick to edit. We kept track of ideas for the payload, experiments, testing results, and a launch checklist.

The process

There are a whole lot of things you need to do to launch a balloon:

  1. Pick a launch site
  2. Prepare the payload: recording, sensors, tracking, electronics, case, parachute
  3. Prepare ground tracking equipment
  4. Test everything
  5. Prepare lifting gas and launch setup
  6. Notify the FAA
  7. Launch!

All of these steps ended up being more complicated than I had realized. I’ll try to go through each one with a different post. First up, picking a launch site.

Picking a launch site

Before anything else, you should consider where to launch from. If you only want to get views of the curvature of the Earth or just some nice aerial views, then your launch site can be pretty flexible. In our case, we wanted to capture the eclipse and view it for as long as possible, so there was a pretty narrow band of land that we wanted to launch from. We looked at a map of the eclipse in Wyoming and decided to launch from there.

The FAA has some requirements for anyone launching a balloon that further restrict launch sites: you’re not allowed to launch your balloon over populated or congested areas (for the first 1000 feet of launch), you’re not allowed to pass through class A-E airspace (near an airport), you can’t operate at any altitude if there’s more than five-tenths cloud coverage, and you’re not allowed to operate your balloon in a hazardous manner. That last one is ambiguous, but you should avoid highways at the least.

For viewing restricted airspace, SkyVector is a fantastic resource. The FAA also has a map tool named Know Before You Fly aimed at drone operators, and you should avoid any restricted places that it lists.

You should also consider how to recover the balloon after it lands. Wyoming has large amounts of state and federally owned land which you are allowed to walk on, which makes recovery convenient. If you do land on private property, you will need to contact the landowner to get permission to recover your balloon. Note that landowners are not required to give you access, nor are they required to recover and provide you with the payload, but they also can’t claim ownership of it. It gets into a weird legal limbo that is best avoided if possible. There are several hunting apps for your phone that show which land is public and which is private, which can aid in recovery. Some will even list the owners’ names, so you can start to contact them if you need to.

Once you have a general sense of where you want to launch, you can start running predictions of your balloon’s flight path using several online tools. These will give you a good idea of how your balloon will travel with the jet stream at different heights and where it will land. Two tools that I like are HabHub’s CUSF predictor and the University of Southampton’s ASTRA predictor. Note that HabHub’s launch time is in UTC, while ASTRA’s is local. Both sites require you to enter some details about your payload – you won’t know the details yet, but an ascent rate of 5 m/s, descent rate of 5 m/s, and burst height of 30,000m are good defaults. ASTRA gives more control over your parameters, but at this stage, I would recommend using the parameters we launched with: 1.1kg payload, Rocketman 3 ft. parachute, 2.3kg neck lift, helium, standard weather data, and standard flight type.

You have two choices for launch gas: helium and hydrogen. Helium is a nonrenewable resource and costs more than hydrogen, but unlike hydrogen, it’s not explosive. You will need a gas regulator for either one. For helium, you can use carbon dioxide regulators, which are commonly used in kegs, so you might be able to borrow one. I’d recommend using helium for your first launch because it’s safer.

In the next part, I’ll talk about the payload – what electronics we selected, tracking mechanisms, the enclosure, testing, and picking a balloon and parachute.


Solar Eclipse High Altitude Balloon, part 1

During the 2017 solar eclipse, some friends and I at Solid State Depot decided to launch a weather balloon and record the eclipse from the stratosphere. Jumping to the punchline: it was a success! We tracked and recovered the payload, and captured the eclipse on video with both a 360 degree camera and a vertically oriented camera (sorry).

Why?

Several years ago, some coworkers at Yelp launched and tracked a weather balloon. After seeing their footage and work, I was interested in trying it myself. To that end, a few years ago, I bought a tracker that broadcasts its global coordinates over ham radio frequencies. I had recently gotten my ham radio license, and was programming the tracker with my call sign, when I realized that the eclipse was next month. What better way to see it than from the air?

Looking online, I found another person who recorded an eclipse from a weather balloon, so this wasn’t uncharted territory, but I wanted to try it myself. Plus, I wanted to use a 360 degree camera to get a unique view.

I figured I would put my thoughts out there for anyone looking to launch their own balloon. Although we did recover the balloon and footage, there were a lot of things that went wrong. If I ever launch another balloon, I’ll know what to avoid, and hopefully you will too.


SparkFun AVC 2015 recap

Another year, another AVC. This was my second year competing, and although I had a lot more time to prepare, things still didn’t go very well. I figured I’d give a recap of the things that went well, the things that didn’t, and plans for next year.

Last-minute modifications

SparkFun allowed teams to do some practice runs the day before the competition. I attended, and the main thing that I discovered was that the fence surrounding the course was about 5 cm off the ground, and my vehicle had a propensity to get itself wedged underneath whenever it ran into it. The suspension on the Grasshopper lets the wheels kind of bow inward while moving the chassis down; this caused the tires to move just enough to get wedged underneath or pushed all the way to the other side of the fence, causing the vehicle to get stuck. I didn’t have any padding or other bumpers on the front, and this ended most of my runs. I ended up going to Solid State Depot afterwards and attaching a clear plastic bump plate to prevent the vehicle from getting wedged again. I zip tied it to the front bumper of the Grasshopper so that it was sloped up and forward.

The morning of the competition, they allowed some more practice runs. The bump plate worked well in preventing the vehicle from getting wedged, but my zip ties snapped almost immediately. I ended up duct taping the plate to the bumper instead. Based on some more testing, I also fiddled with my collision recovery algorithm. Unfortunately, I didn’t have time to test this…

Heats

So! On to my first heat. My vehicle took off the line and then turned slightly to the left and into the fence. It ended up stopping there, and due to a bug in my last-minute code changes, it never tried to back up and try again. Another competitor made a good suggestion: drive the vehicle straight for a few seconds in order to get a better GPS heading, and then switch to the normal navigation algorithm. This should hopefully avoid other competitors and other navigation problems.

The second heat revealed a problem with my bump plate: because it was mounted at an angle, it had the tendency to drive over and on top of smaller obstacles. Another vehicle drove in front of me right at the start and hit the curb, and my vehicle ended up running into them and getting high-centered on top of them. I think I also damaged their vehicle… Sorry!

My third run ended similarly. The car appeared to be navigating well but was driving very closely along the fence. The fence itself was weighted down by some sandbags and my car made it over one, but got high-centered on the second.

Practice runs

There was some good news in the midst of all of this though. In between heats, we were allowed to do some test runs, and I did collect some interesting data. And, there were even some successful runs!

[Figure: run-1-1, GPS trace of the first test run.]

Here’s what the course looked like. I’ve highlighted the starting line in blue; the course starts travelling northwest and then makes 4 right turns to return to the starting line. Each red dot represents a single GPS reading. The first problem that I saw was that the GPS readings started a little to the south of the starting line, but eventually corrected. My vehicle initially drove into the curb at the start (because it thought that it needed to drive further north) but detected a collision and successfully recovered.

The next weird thing I saw was a jump in the GPS data that I highlighted in green. At first, I thought this was due to incorrect readings from the GPS module, but after looking at the logs, the module just failed to switch into NMEA mode. The module I used is modal, and you need to switch between having it report magnetometer readings and NMEA messages. For whatever reason, this switching failed, but my car used dead reckoning to interpolate the gaps, so there was no problem.

[Figure: run-1-2, continuation of the first run’s GPS trace.]

The next jump occurred as I was nearing the second turn. The GPS started reporting coordinates about 200 meters south, so my car tried to drive north and ended up running into the fence. The GPS jumped a few times after that, first to the west of the course, then to the inside of the course, and finally near the fence perimeter, which is close to where my vehicle was. I’m not too sure how to deal with these discrepancies. One idea that was suggested by a member of Solid State Depot is to ignore data after it jumps; this would work for a while and I could fill in the gaps using dead reckoning, but how do you recover from this error? If the GPS readings jump back but are 5 meters off from your estimate, is the estimate wrong, or is the GPS wrong? I haven’t come up with a good plan, so I decided to scrap this idea. Another idea is to define the fence perimeter around the course and ignore readings outside of it. This would help with obviously bad readings, but the readings near the start and end of my runs were close to the vehicle’s position but outside of the fence, so this strategy would discard somewhat useful data. Maybe I could fudge readings that are just outside of the fence to appear as just inside the fence and provide the Kalman filter with a confidence inversely proportional to the distance outside of the fence.
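
If I try that fudging approach, it might look something like this minimal sketch, assuming the course fence is approximated by an axis-aligned bounding box in local coordinates (all names here are hypothetical, not from my actual navigation code):

import math
from collections import namedtuple

Fence = namedtuple('Fence', ['min_x', 'min_y', 'max_x', 'max_y'])

def clamp_reading(x, y, fence):
    """Move a GPS reading that falls outside the fence onto its boundary."""
    clamped_x = min(max(x, fence.min_x), fence.max_x)
    clamped_y = min(max(y, fence.min_y), fence.max_y)
    return clamped_x, clamped_y

def confidence(x, y, fence):
    """Weight readings inversely to how far outside the fence they fall."""
    clamped_x, clamped_y = clamp_reading(x, y, fence)
    meters_outside = math.hypot(x - clamped_x, y - clamped_y)  # 0 if already inside
    return 1.0 / (1.0 + meters_outside)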

[Figure: run-2, GPS trace of the second test run.]

My second test run looked a lot better. The car didn’t run into anything and made it around the course in about 1:15. Unfortunately it stopped right before the finish line and claimed that it had reached all of its waypoints. I’m not sure why it stopped there, so I set up for a third test run.

[Figure: run-3, GPS trace of the third test run.]

This test run also went well: no collisions, the GPS data looked sound, and the vehicle made it around in a similar time. However, like last time, it stopped before the finish line. For later runs, I just added more waypoints past the first turn, which worked well in further testing.

[Figure: runs-combined, both successful test runs overlaid.]

Here are both test runs overlaid on each other in white, along with the path that I was trying to follow in yellow. I’m surprised by how closely they follow each other, especially at the start and between the 2nd and 3rd turns. I think the vehicle was turning too wide past some waypoints and ended up overcompensating as it approached the next waypoint.

Crowd pleaser!

This year, I dressed my vehicle up as a blue shell from Mario Kart, and dressed myself as Luigi. I was inspired after watching this video. This was a lot of fun and seemed to get a good reaction from the crowd. My family also dressed up! At the end of the day, I won the crowd pleaser award. Yippee!

[Figure: family, dressed up in Mario Kart costumes.]


SparkFun AVC build log 2015-06-14

It’s less than a week until the SparkFun AVC, so I figured it’s probably time for a quick update. There are still a lot of things I need to get working (such as a physical start button), which is why I haven’t been writing much, but I figured this would be a good break.

Progress

Since my post a few months ago, I’ve gotten a few things checked off my to-do list. Here’s what I had written, with completed items marked off:

  • Critical
    • Control the car from the onboard computer (done!)
    • Complete sensor data processing (done!)
    • Complete basic navigation to waypoints (done!)
    • Add a physical start button
  • Nice to have
    • Include basic collision detection and recovery (done!)
    • Use a Kalman filter for position estimation
    • Add a remote kill switch (done!)
    • Implement obstacle avoidance
    • Consider the shortcut, non-GPS option, and ramp and hoop
    • Estimate speed from accelerating from a standstill
    • Look at faster motors and other upgrades
    • Get remote monitoring working (done!)

Reading data from the SUP800F module was a bit tricky. It’s modal in that it either emits NMEA (i.e. GPS) messages or messages in a proprietary binary format. The binary format contains raw magnetometer, accelerometer, temperature, and pressure readings. I wanted to use the raw magnetometer readings rather than relying on the module converting these measurements into compass readings (more on that later), so my code needed to constantly switch the mode on the device.

Collision detection is very basic. I had wanted to include accelerometer readings to detect a collision, but for now, if the reported GPS speed is 0 for more than 2 seconds, then a collision is reported and the car backs up, turns, and tries again.
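
In code, the check is about as simple as this sketch suggests (the names are placeholders, not my actual module):

import time

STALL_SECONDS = 2.0

def update_collision(gps_speed, state):
    """Report a collision once the reported GPS speed has been 0 for 2+ seconds."""
    now = time.monotonic()
    if gps_speed > 0.0:
        state['last_moving'] = now
        return False
    return now - state.setdefault('last_moving', now) > STALL_SECONDS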

I had wanted to do some basic image processing to avoid the barrels, go under the hoop, and go over the jump, but one of my camera mounts ended up snapping after I ran into a curb during a test run. I still have the camera on board for recording a first person view of runs, but I don’t think it’s secured well enough to do any kind of avoidance.

Compass

In years past, a lot of people seemed to have compass problems, myself included. A lot of people (again, including myself) tended to turn left immediately and drive into the hay bales. From looking at my logs, it looked like my car was picking up some interference, because it thought it was facing 90 degrees clockwise of where it actually was.

I saw similar problems when I was testing this year. Debugging this problem was aided immensely by remote monitoring. I was able to reproduce this problem and found it occurred consistently when the car would drive over certain parts of my test runs. The compass would visibly deflect as the car drove past a point and then return back after the car had put some distance between itself and the point. It appeared that there was something in the ground that was interfering with the magnetometer.

I started logging the raw magnetometer readings and found that when there was significant deflection, the magnitude of the readings increased dramatically. I figured that I could monitor these magnitudes and drop readings when they are far away from the expected magnitude. I added some code to monitor the mean and standard deviation of the magnitude readings while I was calibrating the hard iron offsets of the compass; then I added code to drop readings if they were more than 2 standard deviations from the mean. This helped, but a lot of readings were being dropped, even though they appeared legitimate, i.e. the heading looked correct. Pushing that up to 4 standard deviations seemed to provide a better balance.
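
A minimal sketch of that filter, assuming the mean and standard deviation were recorded during compass calibration (the function names are illustrative):

import math

def magnitude(x, y, z):
    return math.sqrt(x * x + y * y + z * z)

def is_plausible(reading, mean_magnitude, std_magnitude, max_sigma=4.0):
    """Drop magnetometer readings whose magnitude strays too far from the calibrated mean."""
    deviation = abs(magnitude(*reading) - mean_magnitude)
    return deviation <= max_sigma * std_magnitude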

Demo

Part of the requirements for entering the AVC is providing a demo video or pictures a few weeks before the competition to prove that you have some progress. Here was mine:

The car mostly works and is following waypoints! I still need to do more testing, investigate some better waypoint following techniques, and get a physical button wired up.


Sparkfun AVC Build Log 2015-04-13

In my previous post, I laid out a list of basic things that I needed or wanted to get done, in order of importance. Since then, I ordered a GPS module (specifically the SUP800F, which integrates a GPS receiver, compass, accelerometer, thermometer, and barometer) and have gotten it working with my Raspberry Pi. I used the directions from Adafruit to get it to appear as a TTY on the Pi and read data from it.

The SUP800F is definitely a good value, but it has some quirks. The documentation is acceptable, although I’ve found a few mistakes and some things are missing. When you order it, it comes with a USB breakout board that you can use to read data from the module or configure it, which is really handy. You can also download a proprietary Windows program to read and visualize data from the module, and configure it this way too. If you want to configure it manually, or from something other than Windows, then you’ll need to find a separate AN0028 document. This is one area where the documentation falls short; the module contains some features that are not described in the AN0028 document, and I’m not sure how to configure them manually. You can configure them from the proprietary program, but that’s annoying when I already have the module connected to the Pi. Another annoying thing is that the module is modal: it’s always either returning human-readable GPS (or specifically, NMEA) messages or a proprietary binary format that includes accelerometer, temperature, and raw magnetometer values. Switching between modes is slow, but I’m hoping that’s a problem on my end that I can fix. The documentation also claims that the compass self-calibrates when you rotate it twice in 5 seconds, but I’m not sure how to clear the calibration bit if it’s miscalibrated.

When I first got the module hooked up, I took the car out for a test spin. And spin it did, in circles, for a long time. Trying to debug what the car was doing was tricky; I was logging all my data, but reading it back and trying to compare that with what the car was doing at any particular time was painful. I could SSH into the Pi and tail the log in real time, but the data were scrolling by so quickly that it was hard to scan. I decided that I needed a monitoring solution.

I wanted to be able to access the remote monitor without needing a specialized program, so I decided to run a small web server on the Pi. Any WiFi enabled device with a web browser would be able to view the data. This was also nice because I could add an easy remote kill switch; click a button on the page, and the car stops.

I ended up choosing CherryPy because it’s a lightweight framework, and because it supports websockets. Without websockets, the page would need to poll the web server to keep getting the most recent information about the car, which would require renegotiating a new TCP connection every time. The Pi continually sends data to any connected clients, and the web page updates as it arrives. I also added some buttons at the top to remotely control the car, including starting, stopping, and calibrating the compass. Finally, I plugged my logging infrastructure into the page so that I can view recent info, warning, and error messages. Here’s what it looks like:

[Figure: sparkfun-avc-monitor, a screenshot of the web monitoring page.]
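
For anyone curious about the wiring, here’s a minimal sketch of a CherryPy app using the ws4py plugin for its websockets; the class names and telemetry format are illustrative, not my actual code:

import cherrypy
from ws4py.server.cherrypyserver import WebSocketPlugin, WebSocketTool
from ws4py.websocket import WebSocket

class TelemetrySocket(WebSocket):
    """One instance per connected browser; broadcasting is handled below."""
    pass

class Monitor:
    @cherrypy.expose
    def index(self):
        return '<html><!-- page with a websocket client goes here --></html>'

    @cherrypy.expose
    def ws(self):
        pass  # the websocket tool upgrades this request

WebSocketPlugin(cherrypy.engine).subscribe()
cherrypy.tools.websocket = WebSocketTool()

# Elsewhere, the main loop pushes the latest telemetry to every client with:
#   cherrypy.engine.publish('websocket-broadcast', json.dumps(telemetry))

cherrypy.quickstart(Monitor(), '/', config={
    '/ws': {
        'tools.websocket.on': True,
        'tools.websocket.handler_cls': TelemetrySocket,
    },
})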

Based on the monitor output, it looks like my compass was having some issues. I’m going to focus on debugging that next.


Sparkfun AVC Build Log 2015-03-09

I entered the Sparkfun Autonomous Vehicle Competition for the first time last year. It didn’t go as well as I had hoped, so I’m hoping to start earlier and make some changes this year. This is going to be a short recap of the progress I’ve made so far.

Process

Last year, I started late and spent too much time working on less important things at the expense of critical functions. This year, I’m going to prioritize things by importance and ease of implementation and get the earlier things tested and working well before trying other things. Here’s a rough list of things that need to be done in order of priority:

  • Critical
    • Control the car from the onboard computer (done!)
    • Complete sensor data processing (in progress)
    • Complete basic navigation to waypoints
    • Add a physical start button
  • Nice to have
    • Include basic collision detection and recovery
    • Use a Kalman filter for position estimation
    • Add a remote kill switch
    • Implement obstacle avoidance
    • Consider the shortcut, non-GPS option, and ramp and hoop
    • Estimate speed from accelerating from a standstill
    • Look at faster motors and other upgrades
    • Get remote monitoring working

Code

Last year, all of my code was in Python. Python is nice because it is easy to test, easy to prototype new ideas with, easy to edit on the Pi so that you can make changes in the field, and has fantastic library support (want to output PWM from the Raspberry Pi’s GPIO pins with hard real-time requirements without using much CPU? There’s a library for that). Its biggest drawback is that it’s slow. I’ve heard estimates of it being around 40 times slower than optimized C code.

Now, before you write Python off as an option, I would strongly urge you to benchmark your code and consider the tradeoffs. Last year, I was running my code on a Raspberry Pi model B with an ARMv6 processor at 700 MHz. The GPS I was using only provided updates at 1 Hz, while the compass provided updates at 5 Hz. My vehicle had a top speed of about 2 meters per second, and my code directly used the GPS readings without any filtering. With my code processing this information and controlling the car at 5 Hz, I was comfortably sitting under 20% CPU utilization. Python was perfectly adequate.

This year, I have a much faster car, a GPS module that updates more frequently, and I’m hoping to do some extra position processing and image processing for obstacle avoidance. I started modifying my original code but found after running some benchmarks that the CPU usage was uncomfortably high, so I started looking for a new language.

Language

My initial thought was to try Rust, a new programming language whose development is sponsored by Mozilla. Rust compiles to machine code, so it should be about as fast as modern C++. It also has a rich type system that prevents null pointer errors, use-after-free errors, and data races.

Rust is an evolving language. They only recently released an alpha version, and they recommend that you always use the nightly builds. Unfortunately, there is no official ARMv6 support yet. I did find a guide for compiling a cross-compiler targeting the Pi, but I was worried that that would prevent me from easily modifying the code in the field. After some digging, I found a build from a few months back that ran on the Pi. This worked fine, but trying to get help when I ran into issues was difficult because most people were only familiar with the nightly version. Also, I found that the trigonometric functions did not work.

At this point, I wasn’t sure what to do. I started rewriting my code in C++ but became frustrated with, well, everything about C++. Build systems, Makefiles, Sconstruct files, header files, circular dependencies, manual memory management, race conditions, linking, preprocessor directives, unsafe macros, null pointer errors… During my brief time with Rust, I’d forgotten about all of the annoyances of C++. Fortunately, another user started making unofficial builds of Rust for the Pi and I was able to use them. And, the trigonometric functions worked!

Rust hasn’t been all rainbows and sunshine though. Being a pre-1.0 language, it’s still undergoing a lot of syntax changes, and I’ve had to rewrite parts of my code several times. This should stop in a month or two though. Partly as a result of this constant change, it’s also lacking an extensive library. Fortunately, I’ve found that it’s very easy to call external C functions. Finally, the biggest problem I’ve had is the learning curve. With other languages, it’s easy to pick up bits and pieces of the syntax and learn as you go, but I’ve found that you really need a good handle on the type system before you can do much of anything. Be prepared to read through the Rust book before setting off.

I guess that’s it for now. Over the next few weeks I hope to get the GPS sensors working and the car doing some basic waypoint navigation. Once that’s done, I’ll add a physical start button. This should leave me in decent shape for the competition if everything else goes as planned.


2014 Side Projects

I recently saw This Year in Side Projects on Hacker News, and it seemed like it would be a good idea to document my year too.

Side Projects


Node JavaScript libraries cache

I’ve always thought that one of the most clever things that Google has done is incentivize website owners to allow Google to track their users by providing them with free tools. Install Google Analytics on your site and they’ll provide you with real-time information about how many users are visiting your site, where they are coming from, and how they got there! Of course, this information is collected by sending requests from your browser to Google, which they presumably use to track users.

It’s not that I explicitly distrust Google, but after the revelations that the NSA has backdoors inside many companies’ servers, I’d rather not send the information at all. It’s really easy to install privacy add-ons in browsers that block such tracking, but Google also offers hosting for popular JavaScript libraries. They sold this with the argument that sites would load faster and that users wouldn’t need to download the same library on a dozen different sites. These are good points, but the latest minified version of jQuery is only 33 KiB uncompressed, and most sites that I’ve seen use different minor versions of jQuery anyway, so it’s not like a lot of bandwidth is being saved. So now, you’re given the choice between keeping your viewing habits private and having a website that functions.

To avoid hitting a CDN and potentially being tracked, I wrote a small app in Node that you run locally. It intercepts requests for JS libraries and either serves them from a local cache, or downloads them to the cache and then serves them. Just edit your hosts file to point the CDNs at localhost and you should be good to go. There’s still some room for improvement. I haven’t tried to get it to work with HTTPS, although I haven’t found any sites that use CDNs over HTTPS, so that hasn’t been a problem yet. Google also offers font hosting, and they do some magic on the backend to dynamically generate CSS for only the font sizes you need; that isn’t supported yet, but it’s a start.

TLS upgrader

Earlier in the year, I installed the browser add-on Calomel SSL Validation, which ranks the strength of your SSL connection. Initially I thought something was broken because it was ranking many sites as very weak, but then I realized that many websites just defaulted to terrible encryption.

The way that TLS negotiation works is your browser sends a list of algorithms that it supports in the order that it prefers them, and then the server you’re connecting to picks one that it also supports and responds back. Unfortunately, a lot of servers prefer weaker algorithms because they require less CPU resources, and will ignore your browser’s preferences to use stronger algorithms.

Initially, I tried just disabling the weakest algorithms, but some sites (Amazon, Wikipedia) just stopped working. Some sites would load most of the page, but one particular part of the page would use different encryption than the rest, so some part of the page wouldn’t work. It was really frustrating to load YouTube and see the entire page load, and then the video would mysteriously fail. To try to work around this, I wrote a SOCKS proxy that you would run locally that would intercept HTTPS requests and incrementally renegotiate TLS connections. It would prefer strong algorithms, and if no common algorithm was found, it would fall back to weaker ones until a match was found. It would then forward that connection to the browser.

Halfway through the project, I realized that an attacker could use this to force weaker connections. I figured that there must have been some protection in the protocol against these kinds of attacks, and after reading enough of the IETF TLS document, it turned out there was. I finished up the project just because it was a good learning experience, and sure enough, Firefox gives a big scary warning that someone is tampering with your connection when you use it, so that was a dead end.

I had originally wanted to use an external proxy so that the same proxy would work with any browser or client, but that wasn’t going to work. I figured that any program that tried to incrementally renegotiate TLS would need to be run in the client. I started looking at modifying SSL Anywhere to see if that would work, but got pretty lost and didn’t make much progress.

Raspberry Pi radio control

Ever since I read about how you can use software to turn a Raspberry Pi GPIO pin into an FM transmitter, I was intrigued by the idea of modifying it to drive a radio-controlled car. It would be really easy to make a robot by buying an off-the-shelf RC car and just taping a Raspberry Pi to the top. You can read more about it here.

SparkFun Autonomous Vehicle Competition

Related to the above, I entered the SparkFun Autonomous Vehicle Competition. I used an RC car from Radio Shack and an Android phone for telemetry data. It didn’t go well – I started too late and probably spent too much time writing the Raspberry Pi radio control. On the plus side, my car never had problems moving; a lot of people had trouble even making it off the line. My design was completely disconnected – the phone connected to the Pi over WiFi and the Pi drove the car by broadcasting signals. I think avoiding wires and connections helped avoid a lot of gremlins.

OpenCV distance estimator

I wanted a quick and easy way to estimate the distance to a target for another project I’m working on. Initially I considered using two targets and using image recognition to identify and measure the angle between them, but after some testing I found that I could get very accurate estimates using only a single target. You can read more about this project here.

Frozen pipes monitor

The pipes in my apartment froze last year. My landlord had never dealt with frozen pipes before, which is understandable, and his solution was to have me turn my heat up to 80 and wait for the pipes to thaw. He was worried that there might be breaks in the pipes, so he wanted me to turn the faucet on and call him as soon as it started running.

Unfortunately, after a few hours, I needed to use the bathroom. I also wanted to shower and drink some water, but I couldn’t leave the house. I ended up pointing my webcam at the faucet and spinning up a quick website so that I could view the faucet from my phone. Later, I wanted to go out to dinner and not need to constantly check my phone, so I used OpenCV to detect changes in the image and Twilio to send me an SMS when the image changed. I didn’t have an accurate way to test the code because I couldn’t turn the water on, but it worked! When the water came back on, I got a message and checked the page to confirm.

[Figure: webcam setup, pointed at the faucet.]
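
A minimal sketch of the idea: frame differencing with OpenCV and an SMS via Twilio’s Python client. The credentials, phone numbers, and thresholds are all placeholders, and this isn’t the exact code I ran:

import time

import cv2
from twilio.rest import Client

twilio_client = Client('ACCOUNT_SID', 'AUTH_TOKEN')  # placeholder credentials

capture = cv2.VideoCapture(0)
_, base = capture.read()
base = cv2.cvtColor(base, cv2.COLOR_BGR2GRAY)

while True:
    _, frame = capture.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Count the pixels that changed noticeably since the base frame
    changed_pixels = (cv2.absdiff(gray, base) > 25).sum()
    if changed_pixels > 5000:  # threshold is a guess; tune for your webcam
        twilio_client.messages.create(
            to='+15551230000', from_='+15559870000',
            body='The faucet is running!')
        break
    time.sleep(5)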

Collaborative multiplayer GameBoy

After seeing the hilarity of Twitch Plays Pokémon, I wanted to see if it would be possible to reduce the lag for players. The original version on Twitch was accepting commands from users, feeding them into a GameBoy emulator, and then streaming the result back out to thousands of connected users. This turnaround caused about a 10 second lag, which was really frustrating. I wanted to see if I could modify a JavaScript GameBoy emulator and only multiplex the commands out to each user. Each user’s browser would then emulate the GameBoy normally.

Because the emulator was written in JavaScript, I used Node on the server side so that I could emulate the game simultaneously and provide the full game state to newly connected clients. As with many emulation projects, the easy part was getting the graphical stack to work, but audio synchronization was a lot harder. If a client was emulating faster than the server, then it would need to slow down, which caused noticeable audio artifacts.

Although I got multiple clients to connect and emulate simultaneously, I never ironed out the audio issues, and another site beat me to it. I was also worried about the legality of distributing ROM images along with web pages, which wasn’t a problem with the original Twitch Plays Pokémon.

2015

Looking forward, I’d like to get a head start on another SparkFun AVC entry. I’ve started reading a bunch of papers on improved position and other state estimation using Kalman filters, and I’m hoping to do more manual testing. I’ve also started learning Rust by writing my command code for the car in it.


Sparkfun AVC 2014 report

I first heard about the AVC about 2 months before it was held in 2014. It sounded like something I would love to enter, so with the help of my friends at the Boulder Hacker Space, Solid State Depot, I managed to scrape together a somewhat working entry. I figure it’s about time I write up my experience and my plans for next year.

Background

The AVC has a few different categories: fixed wing aircraft, rotating wing aircraft, and ground vehicles. I entered the ground competition because I don’t have much experience with flying, and I was worried about the expense and consequences of crashes during testing: a ground vehicle just bounces off whatever obstacle it runs into, but a flying vehicle can be destroyed. Ground vehicles that malfunction also won’t go very far, but an aerial vehicle might never be seen again.

The ground competition has a few different classes: 3 based on the price and size of the vehicle, while the 4th contains vehicles with non-traditional (i.e. not wheeled) locomotion. The cheapest class encompasses vehicles under $350 total. This might just be sour grapes, but I think this class is the most interesting, because it really forces you to prioritize your spending and come up with creative, cost-effective solutions. I entered this class and barely came under the limit.

Design

Control

I don’t know a lot about electronics, so I wanted to use an off the shelf vehicle and modify it as little as possible. Since reading how to turn the Raspberry Pi into an FM transmitter, I figured that it would be possible to modify the code to control an off the shelf remote controlled car. This would solve two things: I could avoid hardware hacking by keeping the control entirely in software and I could easily swap out a car for another model if I needed to. I spent a while tinkering with it, but I eventually got it working!

All of my command software ran on a Raspberry Pi. I wrote the navigation code in Python. I was initially worried about CPU load with using an interpreted language, but with an update interval of 10 Hz, the CPU load was around 5%, so this wasn’t a problem. For next year, I’m going to try rewriting it in Rust so that I can increase the update interval and do some image processing concurrently.

I’ve had several problems with the Raspberry Pi corrupting its file system when the power was suddenly cut. The best solution is to always shut down the system cleanly when powering it down, but with the vehicle running into obstacles and wires becoming dislodged, this wasn’t always an option. I ended up replacing Raspbian with Industrial Perennial Environment, a distro that mounts the root filesystem as read-only by default. This fixed all of the corruption problems I was seeing. For logging and video and other things where I needed a writeable filesystem, I just made a separate partition on the SD card.

Vehicle

The Raspberry Pi can only broadcast at frequencies up to 250 MHz. A lot of new, high-end RC cars operate at 2.4 GHz, so I was limited to lower end vehicles. A lot of cheap RC cars only have 6 directional controls: forward, reverse, and some combination of left, right, or straight. This would have worked fine, but I wanted to have proportional steering. I ended up using a RadioShack Dune Warrior, which has proportional steering and operates in the 27 MHz spectrum. I used an oscilloscope attached to the transmitter to observe the signals that it broadcast and reverse engineered the commands.

The Dune Warrior worked pretty well at first, but if you read the reviews, you’ll see that one big flaw is that the steering gears wear down over time. This ended up being a problem during the competition, as the vehicle would drift slightly left or right when it was in the resting straight position. I’m planning on seeing if I can 3D print a new gear to replace the current one, but if that doesn’t work, I might go with a simpler 6 directional control vehicle instead next year.

Navigation

Most of the people I talked to at the competition used GPS to navigate around the track, and I was no exception. The cheapest module I could find cost around $40 and required an Arduino or other real-time processing. Linux isn’t a real-time OS, so it really wasn’t appropriate. Instead, I ended up purchasing a cheap Android phone that already had GPS sensors built in. I liked this plan because the phone is already ruggedized and shielded, comes with its own battery supply, has a touch-screen display, and it comes with a whole bunch of sensors.

To collect data from an Android phone, you need some way for it to communicate with the Raspberry Pi. I originally tried to set up an adhoc wireless network using a USB WiFi adapter, but older versions of Android can’t connect to adhoc networks. I ended up setting up the Pi to act as an access point and then having the phone connect to it. This also had the benefit of making it easy to monitor the vehicle – just connect a laptop to the virtual access point and SSH into the Pi.

I spoke with Michael Shimniok at the Colorado Maker Faire, and he gave me a lot of advice. One thing he suggested is that compasses are unreliable, and that one should instead use the heading computed by the GPS module. I ended up ignoring this advice because the compass seemed reliable when I was testing in my local parking lot, but sure enough, I had problems at the competition. A lot of other competitors I spoke to had similar problems. Next year, I’ll use the GPS heading instead. It should be noted that there was interference in the magnetometer readings from the vehicle itself, but I worked around this by subtracting the raw magnetometer readings taken when the phone was mounted on the vehicle from the raw readings taken when the phone was alone.
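
The correction itself is just a constant offset, something like this sketch (the numbers are made up; your offsets would come from your own measurements):

# Difference between raw (x, y, z) readings with the phone mounted on the
# vehicle and readings with the phone alone; these values are hypothetical.
VEHICLE_OFFSET = (12.0, -3.5, 7.25)

def corrected(raw_reading):
    """Remove the vehicle's constant (hard iron) magnetic contribution."""
    return tuple(r - o for r, o in zip(raw_reading, VEHICLE_OFFSET))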

I read a few different blog posts where people’s otherwise successful runs were stopped short when the vehicle collided with an obstacle. Some entrants I talked to used echolocation distance sensors to avoid obstacles, while others used image processing. I was already close to the spending limit, so I took a different approach. Rather than trying to avoid obstacles, I detected collisions and then backed the vehicle up to try to drive around it. The accelerometer sensors in the phone were too noisy to provide useful data when the car was moving, but when it stopped, there was a huge drop in the magnitude of the readings, which made it easy to detect collisions.

Results

So how did my entry wind up doing in the end? Simply, not great. I only got the car to follow some basic waypoints the night before the competition at 3 AM (note to self: start earlier next year!). One big problem I had was that the compass was slow to update when the vehicle was turning, so it ended up turning past the correct heading and driving a swerving S-pattern towards the next waypoint. To work around this, I calculated the estimated turn time required to point to the next waypoint, turned for that amount of time, and then drove straight for 2 seconds to let the compass update.

At the competition, the compass caused even more problems. In the 1st heat, my car took off the line and immediately veered left into the curb. Looking at the logs, I was getting weird readings from the compass. On the plus side, the vehicle recognized that it had stopped, backed up, and tried to go around the obstacle. It ended up repeating this action a dozen times, eventually going around the first corner and netting me 25 points (the only points I would get that day). The other 2 heats ended similarly, with my vehicle turning completely around, driving straight for 2 seconds, and then turning back.

In all, I had a lot of fun working on my vehicle and am looking forward to the competition next year. I’m planning on starting earlier so that I can hopefully avoid 3 AM debugging sessions before the competition this time 🙂


Using SimpleCV to estimate distance to a target

For a recent project, I wanted to be able to estimate the distance and direction to a known target using a webcam. I decided to use SimpleCV, a Python computer vision library, to process the images. I wanted to use a Raspberry Pi along with a Raspberry Pi camera module so that the entire project could be mobile; the Raspberry Pi is somewhat underpowered, so I spent a lot of time trying to optimize my code to get an acceptable framerate. The full code is available on GitHub.

Target

To start, I needed a target that would be easily and unambiguously recognizable. SimpleCV has some primitives to recognize specific colors and shapes (including circles, squares and rectangles), so I decided to put a high contrast circle against a contrasting background. I decided on the circle because it shouldn’t be affected by roll of either the camera or the target. I tried a few different things, but ended up covering a CD in bright orange duct tape and attaching it to some black posterboard. The CD worked well because it has a well known and specific diameter.

[Figure: SimpleCV target, an orange CD placed against black posterboard.]

To calculate the distance to the target, we can use some trigonometry as long as we know the angle to the target and the size of some object. CDs have a diameter of 120mm, so that part’s easy. We can find the angle by calculating the radius in pixels of the identified CD in the image, dividing that number by the width of the image, and then multiplying the ratio by the horizontal field of view of the camera. According to fastmapper in the Raspberry Pi forums, the Raspberry Pi camera module’s HFOV is 53.50±0.13 degrees. Using this information, we can calculate the distance with a tangent.
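
Concretely, the calculation looks something like this (the HFOV and image width match the setup below; the function name is mine):

import math

HFOV_DEGREES = 53.50   # Raspberry Pi camera module horizontal field of view
IMAGE_WIDTH_PX = 320   # capture resolution used below
CD_RADIUS_M = 0.060    # CDs are 120mm in diameter

def distance_to_cd(radius_px):
    """Estimate the distance from the CD's apparent radius in pixels."""
    # The fraction of the image that the radius spans, as an angle
    angle = math.radians(radius_px / IMAGE_WIDTH_PX * HFOV_DEGREES)
    # That angle subtends the CD's physical radius: tan(angle) = radius / distance
    return CD_RADIUS_M / math.tan(angle)

print(distance_to_cd(5.0))  # a 5 pixel radius puts the CD about 4.1m away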

Raspberry Pi camera

The Raspberry Pi camera module doesn’t show up as a Video4Linux device, so accessing it through SimpleCV isn’t as easy as other webcams. There are some projects to get the module to appear as a Video4Linux device, but they looked more convoluted than I wanted. I instead used the picamera module to access the Raspberry Pi camera and saved the data directly into an io.BytesIO buffer and used PIL to open the image.

import io

import PIL.Image
import SimpleCV
import picamera

camera = picamera.PiCamera()
camera.resolution = (320, 240)
camera.start_preview()  # Start the camera
with io.BytesIO() as byte_buffer:
    # Capture a JPEG into the in-memory buffer, then rewind and decode it
    camera.capture(byte_buffer, format='jpeg', use_video_port=True)
    byte_buffer.seek(0)
    image = SimpleCV.Image(PIL.Image.open(byte_buffer))

The use_video_port option takes pictures faster, but the individual pictures come out grainier. Without it, I was getting less than one picture per second. It’s worth noting that I’m encoding the image to JPEG format and then immediately decoding it, which is wasteful and unnecessary; it should be possible to use the raw data and avoid the transcoding, but I haven’t figured out how to do that yet.

Processing

To process the image, I needed to find circular orange blobs in the picture. SimpleCV has a method colorDistance that transforms an image into a grayscale image that represents each pixel’s distance from a particular color. This would work well for our bright orange CD if I could control the lighting at all times, but I plan on using the target outside, and any time the target enters a shadow, the color would change. Instead, I used hueDistance which uses the hue of a color and is less susceptible to changing lighting conditions.

[Figure: orange hue distance.]

One problem with just using the hue is that black is generally a constant distance from any particular hue, and I was having trouble separating the black background from the orange CD. To work around this, I used colorDistance to find a black mask that I could ignore and then only run image processing on the masked portion.

[Figure: black areas, represented as white.]

SimpleCV has some functions to detect blobs in the image and then determine if the blobs are circles. Once any circles are found, it can also estimate the radius of the circle to subpixel accuracy.

# ORANGE_DUCT_TAPE_RGB is the sampled color of the duct tape on the CD
black_mask = image.colorDistance(SimpleCV.Color.BLACK).binarize()
distance = image.hueDistance(ORANGE_DUCT_TAPE_RGB).invert()
# Subtract the black mask so the posterboard can't match, then find blobs
blobs = (distance - black_mask).findBlobs()
if blobs is not None:
    # Keep only the blobs that are plausibly circles
    circles = [b for b in blobs if b.isCircle(0.25)]
    if len(circles) == 0:
        return None
    # The circles should already be sorted by area ascending, so grab the biggest one
    cd = circles[-1]

Speed

Using this setup with 320×240 resolution, I was only able to get about 1 frame per second. There were a few things that I tried to speed everything up. First, I knew that the blobs I was interested in were always in a particular pixel size range, so it wasn’t worth calling isCircle on ones that were too small or too large. The findBlobs function takes optional minsize and maxsize parameters, so invalid blobs can be filtered out early. This made a small improvement, but it wasn’t consistently better because the image doesn’t always have blobs that can be filtered out.

The next thing I tried was cropping the image before doing any processing. For my application, the target is always going to be in the middle half of the image, so cropping the image before doing any processing should improve performance a lot. Unfortunately, I ran into some problems where cropping the image before calling colorDistance would produce clearly incorrect results. It’s worth noting that cropping the image before calling hueDistance worked fine. I don’t know if this is a bug in SimpleCV or in my code. To work around this, I just called colorDistance and cropped the result, and then did the rest of the processing from there. With these optimizations, I was able to about double the framerate.

Results

Using the nominal values for the CD diameter and the camera’s HFOV, I calculated the distance to the target at a known distance of 1.155m 10 times. The total framerate was 2.04 frames per second and the readings averaged out to 1.298m with a standard deviation of 0.00134m. The precision was surprisingly high, even if the accuracy was not. I believe I read that the field of view changes when using the use_video_port option, but I can’t find if or where I read that, so I’m not going to mess with that value. I was concerned that maybe the outline of the CD was bleeding into the background, so I tried changing the diameter to 110mm and got readings of 1.157m. I repeated this at distances of 2m and 0.5m and got readings of 2.000m with a standard deviation of 0.00554m, and 0.496m with a standard deviation of 0.000199m. It’s worth noting that at 2m, the framerate dropped to 1.40 FPS; I’m not sure why that is. Considering how precise these measurements were, it might be worth reducing the resolution even more in order to increase the framerate.

I did notice that when I reduced the lighting (the above tests were done with ample lighting), the detected radius of the CD dropped slightly, which affected the distance calculations. It might be possible to adjust the expected CD radius in response to changing lighting conditions, but the overall error in lower light conditions was only a few centimeters, so I didn’t try it.


Turn your Raspberry Pi into a radio controller for RC vehicles

Ever since I tried turning the Raspberry Pi into an FM transmitter, I had wondered if it would be possible to make it drive a radio-controlled car. It turns out that it is! With no modifications to your Pi, you can be driving around a toy-grade RC car, and I’ve read about a few other ones online that should work as well. Any RC toy that works at a frequency in the range of 1-250 MHz should be controllable by the Pi once you figure out the command signals.

I’m going to talk a little bit about how RC cars work, how the code and hardware works on the Pi, and about RC controls in general. If you just want to start driving around, you can jump to the end.

Raspberry Pi FM radio

The first version of the Raspberry Pi FM transmitter was written by Oliver Mattos and Oskar Weigl. It works by using the hardware that is normally meant to produce spread-spectrum clock signals on the GPIO pins to generate a radio frequency. The GPIO pin itself acts as an antenna, which you can extend by attaching a length of wire such as a jumper cable. By reading samples from an audio file and writing them out to the GPIO pin at a certain rate, it is possible to tune in a radio and hear the audio.

The original version of this code had a small problem though. FM requires a continuous stream of radio signals, but the Raspberry Pi normally runs Linux, which is not a real-time operating system. This means that you can start, stop, and run different programs at the same time, but because the Pi only has a single CPU core, it has to simulate multiple programs running at the same time by constantly switching between them. Whenever there was a context switch away from the program that was writing to the GPIO pins, the radio signal would skip and pop. Richard Hirst came up with a great solution to this problem: use the DMA controller to continually write data to the GPIO pins. Because DMA bypasses the CPU altogether, it is not affected by context switches, and it has the side effect of reducing the total CPU load because the controlling program now only has to periodically refill a circular buffer. Unfortunately, it also means that it’s not possible to stop broadcasting for short periods of time (at least not that I’ve been able to figure out), which will be problematic later when we try to control an RC car.

Frequency Modulation, Amplitude Modulation, and Pulse Modulation

There are two common ways of encoding audio over a radio signal: frequency modulation (FM) and amplitude modulation (AM). FM works by continually varying the frequency of the signal in small amounts while keeping the amplitude the same, while AM keeps the frequency the same while varying the amplitude. The difference is best described with a picture:

[Figure: am-fm. Animated diagram representing the difference between radio waves modulated by amplitude and by frequency. By Berserkus, from Wikipedia. Used under CC BY-SA 2.5.]

The Raspberry Pi FM program uses the fractional divider to control and change the frequency of the generated signal. This allows it to generate an FM audio signal. Because the Pi is just writing values to the GPIO pin, I don’t think that it’s possible to control the amplitude of the signal, so I don’t think that it’s possible to have the Pi generate AM audio signals.

There are a few different control schemes that remote control toys use. Most toy-grade RC cars (ones with forward/back + left/right control) use pulse modulation. The radio controller broadcasts a signal at a constant frequency for a certain amount of time, and then stops broadcasting for another amount of time. It then repeats this cycle with varying lengths of broadcast and pause time in a particular pattern. The RC car recognizes this pattern as a command, such as “forward” or “reverse right” and starts moving.

The specific pattern typically consists of some base time length, such as 400 microseconds, and all pulses and pauses are multiples of that length. A control signal starts with a synchronization pattern consisting of a broadcast for some length of time and a pause, repeated a certain number of times. Then a number of bursts and pauses are repeated for the command. For example, the New Bright 1:24 truck sends a synchronization pattern of 1200 microsecond bursts with 400 microsecond pauses repeated 4 times, followed by a number of 400 microsecond bursts and 400 microsecond pauses. If that burst/pause pair is repeated 40 times, the vehicle drives forward, while 28 bursts tell it to drive in reverse and to the right.
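
As a rough data structure, that forward command might look like this sketch (the channel frequency is an assumption, and the field names are mine, not pi_pcm’s wire format):

# The New Bright forward command from above, as a hypothetical description.
NEW_BRIGHT_FORWARD = {
    'frequency_mhz': 27.145,          # assumed channel; check your own car
    'synchronization_burst_us': 1200,
    'synchronization_spacing_us': 400,
    'total_synchronizations': 4,
    'signal_burst_us': 400,
    'signal_spacing_us': 400,
    'total_signals': 40,              # 40 repeats = forward, 28 = reverse right
}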

Remember earlier when we had to use DMA to ensure a constant signal? Because of this, I haven’t been able to figure out how to make the Pi stop broadcasting for a consistent amount of time, which is needed for the signal pauses. However, think back to how FM radio signals work. Because the Pi sends FM, it can very quickly change the frequency that it is broadcasting on. To simulate a broadcast pause, it instead broadcasts at a different frequency for a length of time. Because the RC car is only listening for signals at one particular frequency, as far as it is concerned, the Pi has stopped broadcasting altogether.

A step up from toy-grade cars, hobby-grade vehicles usually have proportional steering and throttle control, so you can more precisely control the vehicle. Most high-end hobby-grade devices use pulse-position modulation. Andrew Hazelden has an excellent explanation of how PPM works. I wanted to get a car with proportional steering and throttle control, but most RC cars that I looked at operate in the gigahertz range, which is too high for the Pi, so I instead bought a RadioShack Dune Warrior, which runs in the 27 MHz range.

Finding the signal

With toy-grade cars, I’ve found that the easiest way to find the signals that control them is to just use brute force. They mostly use very similar signal patterns, so you can simply iterate through different control patterns until the car responds. For more complex patterns, you’ll probably need to use an oscilloscope to analyze the patterns.

Brute force

For toy-grade RC cars, you can just turn on the car and iterate through different control patterns until you find one that causes the car to move. Every car that I’ve looked at uses one synchronization pattern for all of the controls that it can respond to, so once you find one signal, it’s just a matter of guessing how many signal repeats control the remaining actions.

Iterating through all possible patterns can take a few hours, and I didn’t want to wait around for the car to move, so I instead pointed a webcam at the car and then waited until the image changed. Once the car moves, the computer will see that the image has changed and will save the image and the pattern.

I tried a few different techniques to monitor the webcam to decide if the car has moved. First, I tried just computing the percent difference of the color values of the pixels between a base photo and the most recent photo. This generated a lot of false positives due to changing lighting conditions, such as when a cloud passes overhead. If you run this program, I recommend placing the car in a closet or some other place where you can keep the lighting consistent.

Next, I converted the image to greyscale and then normalized the colors so that the darkest color was black and the lightest color was white. This reduced the noise between pictures a lot and helped with my varying lighting conditions, but I was still worried about false positives. I tried one more thing: reducing the image depth to 1 bit, so it was only black and white.

[Figure: 1-bit-posterized-rc-car, the posterized webcam view of the car.]

This reduced the noise to a really low level, and because the wall behind the car was white, once it moved, the difference was very clear. This reduced my false positives to 0, so I stopped here. I had considered running edge detection on the base image and then ignoring differences along the detected edges because that’s where most of the difference between photos appeared but I ended up not needing to.
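
The final approach boils down to something like this sketch (done here with PIL; the difference threshold is a guess you’d tune for your own setup):

import PIL.Image
import PIL.ImageOps

def posterize_1bit(filename):
    """Grayscale, stretch the contrast to the full range, then threshold to 1 bit."""
    image = PIL.Image.open(filename).convert('L')
    image = PIL.ImageOps.autocontrast(image)  # darkest -> black, lightest -> white
    return image.point(lambda value: 255 if value >= 128 else 0)

def has_moved(base_file, latest_file, threshold=500):
    """Report movement if enough posterized pixels differ between the frames."""
    base = posterize_1bit(base_file)
    latest = posterize_1bit(latest_file)
    differing = sum(1 for a, b in zip(base.getdata(), latest.getdata()) if a != b)
    return differing > threshold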

Oscilloscope

For more complex vehicles, it’s easiest to hook an oscilloscope up to the radio controller and then observe how the command signal changes as you move the controls. This is what we ended up doing with the Dune Warrior.

[Figure: dune-warrior-oscilloscope. Dune Warrior control signal, straight and full throttle.]

We hooked up the oscilloscope and watched how the signal changed as we adjusted the throttle and the steering. If you want to do this yourself, I strongly suggest recording the signal to video so that it can be reviewed later. The signal starts with a 500 microsecond synchronization burst and pause, followed by 22 more bursts. Each burst is either 127 microseconds or 200 microseconds, with a pause of the same length. The burst lengths are used to encode a binary pattern, where the shorter bursts are a 0 and the longer bursts are a 1. 6 bits are for steering, 5 are used for the throttle, and 1 is used as an even parity bit; the remainder never appear to change and I’m not sure what their function is. What’s strange is that the bits for steering are not contiguous; bits 7-8 form the 2 most significant bits, while 1-4 form the rest. 32 is centered for steering, while 0 is sharp left and 63 is sharp right. Similarly, 16 is idle for throttle while 0 is full reverse and 31 is full forward.
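
Here’s a sketch of the steering split and the parity, with hypothetical helper names (the throttle bit positions aren’t shown because I didn’t note them down):

def split_steering(steering):
    """Split a 6-bit steering value (0 = sharp left, 32 = centered, 63 = sharp right)."""
    steering = max(0, min(63, steering))
    high_bits = (steering >> 4) & 0b11  # the 2 most significant bits, sent as bits 7-8
    low_bits = steering & 0b1111        # the remaining bits, sent as bits 1-4
    return high_bits, low_bits

def even_parity(bits):
    """The parity bit makes the total count of 1 bits in the command even."""
    return sum(bits) % 2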

Running your own car

If you have a cheap toy-grade RC car (i.e. one that only has controls for forward/backward and left/right) then you should be able to find the command signals and control the car pretty easily. Clone the repository from my GitHub on your Raspberry Pi and compile pi_pcm by running scons. Run pi_pcm as root. By default, the program listens for JSON commands on port 5432.

Turn on the car and put it in a location where you can control the lighting (such as a closet). Point a computer that has a webcam at the car (it can be the same computer or a separate one), run python watch.py -f [frequency] -s [Pi IP address] and enter the frequency of your RC car (this should normally be 27 or 49 MHz). You’ll need to have gstreamer installed to capture the webcam stream. Most cars operate on one of several different channels in the 27 or 49 MHz range; if you don’t know the exact channel, then just enter 27 or 49 and the program will iterate through each channel for you.

Once the program sees the car move, it will save a picture and rename it according to the command it just sent. The filename format is frequency-microseconds-synchronizationBurstLength-synchronizationBurstCount-signalCount. From there, you can run python control.py -f [frequency] -s [Pi IP address]. It will prompt you for the command information and ask you for different signal bursts; try different ones (1 – 100 or so) and observe how the RC car reacts. Once you have all of the signals, save the information in a JSON file (see control-specs/pro-dirt.json for an example). Now you should be able to run python interactive_control.py -s [Pi IP address] [JSON control file] and drive the car using the arrow keys.

More thoughts

There are a few other projects that you can build from here. For one, I’m planning on entering the SparkFun Autonomous Vehicle Competition with my Dune Warrior. You can make an autonomous vehicle yourself by buying a USB battery pack for the Pi, taping both to the top of your car, and writing a simple control program. I’m also planning on buying a Raspberry Pi camera, slapping on a WiFi USB dongle, and implementing Twitch drives your car in my apartment.
