AGI again?

I was thinking about the GPS stuff, in particular about how the accuracy doesn’t seem that great. Of course so far I’ve only tested at walking speed – it could be much better at more typical flying speeds. But even if it isn’t, I’m thinking that the accelerometer (which I still haven’t received – I’m really starting to like Amazon Prime) could help here. When you put the two together, GPS and accelerometer, you might be able to determine location and velocity much more accurately than with either one on its own.
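To make that idea concrete, here’s a minimal sketch of what combining the two could look like: a one-axis complementary filter that integrates the accelerometer between GPS fixes and nudges the estimate toward the GPS-reported velocity. The class name, gain, and rates are all my own invention – this is a speculation, not something I’ve built yet.

```java
// A minimal 1-D complementary filter. The accelerometer is smooth but
// drifts as integration error accumulates; GPS is noisy but drift-free.
// ALPHA controls how much we trust the integrated value over the GPS fix.
public class VelocityFuser {
    private static final double ALPHA = 0.9;
    private double velocity; // m/s, current best estimate

    // Call at the accelerometer's sample rate.
    public void onAccel(double accel, double dt) {
        velocity += accel * dt;
    }

    // Call whenever a GPS fix arrives; blends the estimate toward it.
    public void onGpsVelocity(double gpsVelocity) {
        velocity = ALPHA * velocity + (1 - ALPHA) * gpsVelocity;
    }

    public double getVelocity() {
        return velocity;
    }
}
```

The same shape should work per-axis for position too, once the accelerometer actually shows up.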

This got me thinking about mental modalities, i.e. vision, hearing, proprioception, and the vestibular system, which work together to provide similar, otherwise hidden, information. Which is kind of neat since this is the sort of thing I previously wanted to research, but for which I never had any data. It seems, whether I meant it or not, I’m back in the AGI world.

Oh, BTW, I got some code hooked up to use Google maps to visualize a GPS recording. It’s based on code from PubNub.

GPS testing

So I took my GPS for a walk around the block. I wrote a little thingy to read the serial stream. I will get a github repo created soon, but in the meantime here is the code. It depends on jSSC, which is available from maven here. (I just added it to the build.gradle file, which I will put in the repo.)

package lohbihler.manfred.nmea;

import java.io.PrintWriter;

import jssc.SerialPort;

public class SimpleLogger {
    public static void main(String[] args) throws Exception {
        // The Pi's on-board UART shows up as /dev/ttyAMA0.
        SerialPort port = new SerialPort("/dev/ttyAMA0");
        if (!port.openPort())
            throw new RuntimeException("Failed to open serial port");

        // The Ultimate GPS defaults to 9600 baud, 8N1.
        if (!port.setParams(9600, SerialPort.DATABITS_8, SerialPort.STOPBITS_1, SerialPort.PARITY_NONE))
            throw new RuntimeException("Failed to set serial port params");

        try (PrintWriter out = new PrintWriter("nmea.out")) {
            while (true) {
                // readString returns null when no bytes are waiting.
                String s = port.readString();
                if (s != null)
                    out.write(s);
                else
                    Thread.sleep(20); // don't busy-spin the CPU
            }
        }
    }
}

Anyway, shortly after this outing my hard drive decided it was going to die. I had actually known about this for a while, but only recently did it become clear that I was going to have to figure out how to get everything onto the new hard drive that had been sitting around since last November. Sadly it was too late to do a simple clone. Reflect would die with various error codes at various moments, and diagnostics declared unilaterally that there were too many bad sectors. Kind of annoying, since the drive had started up without any issues besides a bit of slowness for months, and there had only been one BSOD that whole time. Long story short: $170 CAD for a new Windows license, about a week of getting it installed (don’t bother cutting a CD, go straight for the bootable USB), and getting everything else installed, copied over, and configured, and we’re back up and running.

I managed to get my NMEA parsing code all written up too. Or at least enough code for my purposes.

The bottom line so far is that the GPS is accurate within about a meter, which might be good enough. I will probably have to look into what I mentioned before: how to smooth the variability.
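As a first cut at the smoothing, I’m imagining nothing fancier than a sliding-window average over the last N fixes. A sketch (class name and window size are just placeholders for now):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sliding-window moving average over the most recent readings.
// Smooths out GPS jitter at the cost of a little lag.
public class Smoother {
    private final int window;
    private final Deque<Double> values = new ArrayDeque<>();
    private double sum;

    public Smoother(int window) {
        this.window = window;
    }

    // Feed in a new reading; returns the current smoothed value.
    public double add(double value) {
        values.addLast(value);
        sum += value;
        if (values.size() > window)
            sum -= values.removeFirst();
        return sum / values.size();
    }
}
```

The lag is the obvious downside – at flying speed a window of even a few one-second fixes means the smoothed position trails the plane, which is where blending in accelerometer predictions should help.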

It also occurred to me that there is a landing/takeoff problem, i.e. how does the plane know where the ground is? For landing I’m thinking that if it’s maybe 10 cm or so off the ground it could just cut the engine and hope for the best, but 2 cm would be a lot better. Even if it knew the current elevation above sea level (I could perhaps always take off and land at the same place(s)), the GPS isn’t nearly accurate enough to detect lift-off and touch-down. So with only this speculation I threw fiscal caution to the wind and blew $4.42 for this. Can’t wait to test out that baby.

Weather is getting warmer, so it might be time to pick up the actual aircraft. I’m still worried about fitting everything inside. I’ll have to become good friends with the nice fellows at the hobby shop so that maybe they’ll let me take a look inside some of the boxes before I buy. Or maybe I should just expect to spend a few hundred dollars. Chances are I’ll destroy the first plane anyway just trying to learn to fly it.

Using the Raspberry Pi serial port to listen to the Adafruit Ultimate GPS

Setting up the Ultimate GPS to work with the Arduino was easy. I was happy this worked, because it proved that my soldering of the header was all good. But what I wanted was to have my RaspPi listen to it. This turned out to be pretty easy too. I managed it with the following:

  • Raspberry Pi
  • Adafruit Ultimate GPS
  • A breadboard
  • 4 jumper wires

The Raspi has a serial TX/RX at GPIO 14 (TX) and 15 (RX). (Pins 8 and 10 respectively.) Here’s a map that I’ve found gaudy but handy. The first thing I did was prove that my serial port was working by wiring RX and TX together. Then I tried using a test program (included in the raspbian-jessie distribution) to echo characters:

pi@raspberrypi:~ $ miniterm.py

This didn’t work. (I.e. key presses didn’t do anything.) After some searching, I found this page, with which I was able to determine that the problem was that the serial port was being used for console logins. As described there, the solution was to use “sudo raspi-config” to turn this feature off and reboot. Using this command:

dmesg | grep tty

… I found that the port that I wanted was ttyAMA0. Trying the test program again:

pi@raspberrypi:~ $ miniterm.py /dev/ttyAMA0

… This time I got characters echoing. (I.e. when I press a key, the value shows up on the console.)

Ok, so the serial port works. Now (using the GPIO map link above), I connected 5V to VIN, GND to GND, RX to TX, and TX to RX. Running miniterm.py again, I saw it dumping out GPS data.

$GPGSA,A,1,,,,,,,,,,,,,,,*1E
$GPGSV,1,1,00*79
$GPRMC,232008.086,V,,,,,0.00,0.00,060180,,,N*47
$GPVTG,0.00,T,,M,0.00,N,0.00,K,N*32
$GPGGA,232009.086,,,,,0,00,,,M,,M,,*7C
$GPGSA,A,1,,,,,,,,,,,,,,,*1E
$GPRMC,232009.086,V,,,,,0.00,0.00,060180,,,N*46
$GPVTG,0.00,T,,M,0.00,N,0.00,K,N*32
$GPGGA,232010.086,,,,,0,00,,,M,,M,,*74
$GPGSA,A,1,,,,,,,,,,,,,,,*1E
$GPRMC,232010.086,V,,,,,0.00,0.00,060180,,,N*4E
$GPVTG,0.00,T,,M,0.00,N,0.00,K,N*32
$GPGGA,232011.086,,,,,0,00,,,M,,M,,*75
$GPGSA,A,1,,,,,,,,,,,,,,,*1E
$GPRMC,232011.086,V,,,,,0.00,0.00,060180,,,N*4F
$GPVTG,0.00,T,,M,0.00,N,0.00,K,N*32
$GPGGA,232012.086,,,,,0,00,,,M,,M,,*76
$GPGSA,A,1,,,,,,,,,,,,,,,*1E
$GPRMC,232012.086,V,,,,,0.00,0.00,060180,,,N*4C
$GPVTG,0.00,T,,M,0.00,N,0.00,K,N*32

Huzzah!

Oh, while poking around I also answered my question about the update rates of the GPS. It turns out the .h file in the Adafruit GPS Library defines a bunch of commands for such settings, including setting the update rate, setting the position fix update rate, filtering sentence types, and others. I tried a couple of commands using miniterm, but they didn’t work, so I’ll eventually need to RTFM, but at least now I know where to look.
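My guess is that my miniterm attempts failed because PMTK commands are ordinary NMEA sentences and need the trailing checksum – the XOR of every character between the $ and the *. Here’s a little helper (names mine) that builds a complete sentence; the PMTK220 payload shown matches the update-rate command defined in the Adafruit header, but treat the specifics as an assumption until I’ve actually RTFM’d:

```java
public class Pmtk {
    // NMEA checksum: XOR of all characters between '$' and '*',
    // rendered as two uppercase hex digits.
    public static String checksum(String payload) {
        int sum = 0;
        for (char c : payload.toCharArray())
            sum ^= c;
        return String.format("%02X", sum);
    }

    // Wrap a payload into a complete sentence, e.g. "PMTK220,100"
    // (set the output interval to 100 ms) becomes "$PMTK220,100*2F".
    public static String sentence(String payload) {
        return "$" + payload + "*" + checksum(payload) + "\r\n";
    }
}
```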

I’ll probably end up writing my own parsing code because, although the above library probably does all I need, my plan is to use Java for this project. I thought about using Python, but I haven’t used it enough to be anything like fluent, and I don’t want a language learning curve being a drag on the project; time is already limited enough. I could wrap the C++ code in JNI, but having written a ton of parsing code in my past (including an NMEA data source for Mango, which I didn’t port to M2M2 because I didn’t have any equipment with which to test), I think I can handle this as well. Besides, then I can architect the code the way I want it. So the next step is to prototype the parser.
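As a warm-up for the prototype, here’s the sort of thing I have in mind – a minimal parser for just the GPRMC sentence. Class and field names are placeholders; the real thing will verify checksums, handle more sentence types, and tolerate malformed input.

```java
// Minimal GPRMC parser. A GPRMC sentence looks like:
// $GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A
public class Gprmc {
    public final boolean valid;
    public final double latitude;   // decimal degrees, south negative
    public final double longitude;  // decimal degrees, west negative
    public final double speedKnots;

    public Gprmc(String sentence) {
        // Strip the checksum and split on commas. Real code should
        // verify the checksum rather than discarding it.
        String body = sentence.substring(1, sentence.indexOf('*'));
        String[] f = body.split(",", -1);
        valid = "A".equals(f[2]); // A = active fix, V = void
        latitude = valid ? coord(f[3], 2) * ("S".equals(f[4]) ? -1 : 1) : 0;
        longitude = valid ? coord(f[5], 3) * ("W".equals(f[6]) ? -1 : 1) : 0;
        speedKnots = valid ? Double.parseDouble(f[7]) : 0;
    }

    // NMEA packs coordinates as [d]ddmm.mmmm: whole degrees followed
    // by decimal minutes, which we convert to decimal degrees.
    private static double coord(String s, int degDigits) {
        double deg = Double.parseDouble(s.substring(0, degDigits));
        double min = Double.parseDouble(s.substring(degDigits));
        return deg + min / 60;
    }
}
```

Note how the empty fields in my logged sentences (no fix yet) would make `valid` false, which is exactly the signal the pilot needs to ignore a reading.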

On-board pilot parts, and an introduction

Ok, so here’s what I’m planning on using for this project. But first, let’s decide on a name that’s better than “this project” and “on-board auto-pilot”. Yeah, “Otto” sounds like “auto”, but that’s too cute. I think I’d rather do homage, so I’m trying to decide between Manfred and Screwball. The danger of homage is that I might end up doing someone a disservice if the project tanks or is abandoned, but I prefer to start with a positive attitude. I like the latter, but it’s too long to say. So is the former, but at least it can be shortened to just Fred. So there, that’s done. Say hello to Fred.

Also, let’s set some expectations. This is going to be a long-term project for a few reasons. I might not get this going this year.

Oh, also, about Amazon. Yes, I said it was boring, but that’s very different from what the New York Times had to say. I can’t speak for other people, but my personal experience there is that:

  1. The people are smart and decent, and compared to some other situations I’ve been in are a pleasure to work with.
  2. Timelines for getting tasks done are very generous. At first I thought everyone was seriously sand-bagging, but now that I have a better idea how much time research takes, the overhead of working within the Amazon tech infrastructure (where it takes about 10 minutes to restart my service instance … 10 minutes! I’m used to 15 seconds), and how much some unforeseen thingy can destroy your productivity, I’m starting to come around.
  3. Even if you go over, a reasonable explanation is typically accepted without any grief. You just need to remember that the other people there are pretty smart too, so your excuse better be airtight. And don’t let it happen too often. Never, if possible.
  4. I generally felt more stressed out when I was on my own. Timelines were much more aggressive, clients less forgiving, and payment sometimes slow. I’m still getting used to this idea of money showing up magically in my bank account every second Friday. Bliss!

The bottom line is that I like it there, and would easily recommend it to anyone who can dig the culture. I was not paid to say all this.

Ok, so back to Freddy. I recently got myself an Arduino (aka Genuino in my neighbourhood), but I’m pretty sure, without any evidence, that it’s not going to be able to handle the processing or memory requirements. So the plan is to use my Raspberry Pi 2 (Model B V1.1) that I bought last year. It currently has an 8 GB micro SD in it; I may have to bump this up eventually, but it should be good for now.

What inputs is Fred going to need?

  • Location
  • Attitude
  • Vision

The first should be easy: GPS. It turns out to be really easy in fact. I ordered an Adafruit Ultimate GPS breakout board on Amazon. (Again, not paid. But willing to be.) I soldered on the provided header and attached it to my Arduino via a breadboard, and it just worked. Sweet. I’m not sure that the input rate is going to be quick enough though, nor that the accuracy of the readings will do. The device is supposed to provide updates at 10 Hz, but it looked more like 1 Hz in my simple test. It’s not clear to me what the 10 Hz actually means. It wrote 50 NMEA lines in ~10s, consisting of around 3K characters, but the lines are groups of GPRMC, GPVTG, GPGGA, and GPGSA lines, which are dumped out about every second. So depending on how you look at it, it’s ~5 lines/s, or readings every second. I’m suspecting that the 10 Hz is how often it internally reads satellite signals. Or maybe outputs are configurable somehow. Whatever.

Readings are accurate to within 3 m, which at an RC plane’s speed might not be so great. If there are problems, I think I might be able to get around them to some degree by simultaneously calculating expected readings and comparing them with actuals. For this to work well I might need to do some testing by:

  • Getting a tolerance of readings while staying still
  • Getting a tolerance of deltas after moving a known distance over a given time

This would tell me how much I can generally trust the readings. If I collect enough data I should be able to determine how I can combine readings with projections to get the most believable values. I’ll post data and source here along the way.
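The “tolerance while staying still” number is just the spread of a stationary recording, so the analysis can start very simply: mean and standard deviation per axis. A sketch (the real version would run over parsed lat/lon series from the logs):

```java
public class Tolerance {
    // Population mean of a set of readings (e.g. latitudes while parked).
    public static double mean(double[] xs) {
        double sum = 0;
        for (double x : xs)
            sum += x;
        return sum / xs.length;
    }

    // Population standard deviation: how far a typical reading strays
    // from the mean. This is the per-axis "tolerance" of the receiver.
    public static double stdDev(double[] xs) {
        double m = mean(xs), sq = 0;
        for (double x : xs)
            sq += (x - m) * (x - m);
        return Math.sqrt(sq / xs.length);
    }
}
```

With those numbers in hand, a reading that lands several standard deviations away from the prediction can be discounted accordingly.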

On the prediction side of things, I should be able to enhance predictions by factoring in throttle, elevators, ailerons, rudder, and attitude. I think the only thing that might throw off predictions is wind speed and gusts, but I don’t have a solution for that.

Ok, attitude. I ordered a SODIAL(R) GY-521 MPU-6050 module (3-axis gyroscope + 3-axis accelerometer – yep, still not getting paid) because it looks like it will work and it was too cheap to pass up. I’ll talk about it when I get it.

Finally, vision. Cameras are probably small and cheap, but I’m not going to do this for now. I doubt that the RaspPi will be able to do any useful vision processing, and even if it could it would probably drain the battery in minutes. Also, the software itself is an unknown, and writing anything like it myself would likely stall the project way too long. I might put a camera in eventually just to have some pics to review after landing, but that’s about it.

But let’s not forget the plane itself. I’m looking at something like the Apprentice. The main considerations here are the ease of learning to fly (I’ve never flown an RC plane before), and the space for payload (i.e. will the RaspPi and everything else fit into the fuselage?). I’m also expecting some serious crashing, so it should be reasonably priced too, and replacement parts should be cheap and available. Oh, and it should be battery powered so that I can leech power for my own stuff.

One final thing. This project is meant to be entirely open source. The hope is that some folks may find these posts and get involved. A guy can dream, can’t he?

New start…

I was considering calling this post “giving up”, but I realized that, whether I liked it or not, it’s not true. I did finally give up on freelancing, which was occasionally providing time to work on personal projects – like AGI research. I ended up taking a job with Amazon here in Toronto. I’ve been there for about 4 months now. The learning curve was absolutely brutal; it was like starting over in technology again… There are so many internal tools to figure out that even now I still feel like a fraud, trying to talk about things I barely know anything about.

But I do believe that I am actually being productive, finally!, maybe even being almost worth what they pay me. And from that perspective I think I can honestly declare that the work there is, well… boring. Amazon has quite successfully advanced software development into software engineering, such that there is little one could do there – relevant to the business – that hasn’t been done in some similar way before. When given a task, the majority of one’s time is spent finding the best examples of how something like it has already been done. Then you spend a little time copying and tweaking the code to your task’s requirements. Finally, you spend a good chunk of time testing and writing automated tests. Tool and process overhead can be significant too. The bottom line is that software engineers actually spend relatively little time writing original, thought-provoking code.

I could go on, but the point is that when I took the job I thought I was giving up on AGI. I was OK with this for two reasons. First, it seemed like there were a lot of really great projects going on, and that my contributions – especially working on my own – were not going to be of any great help overall. Second, one place where such projects are going on, of course, is Amazon; maybe I would eventually end up on one, who knows?

And maybe I will, but in the meantime it became pretty clear that I needed something interesting to work on. Initial thoughts were around farming. Automated farming, that is, i.e. using sensors and code to monitor the health of a crop, and over time having the automation also deal with maintenance. I still think this is a really interesting area, but I eventually decided that it would be better to actually be a farmer to do this well. Plus, I suspect the sensor technology (e.g. for soil phosphorus levels, etc.) isn’t very good yet, and I’m not about to start trying to develop that sort of thing.

Home automation was an idea too, and still has a lot of potential. Even more now with the Amazon Echo. I played with Alexa Skills a bit, and they’re pretty much dead easy to make. Integrating with your home thermostat is another matter. (I contacted Honeywell for technical details on mine. They got back to me at 9:30am the next business day, and seemed perfectly happy to give me whatever I asked for, except for the fact that my t-stat was provided by my utilities provider Powerstream, which meant I would have to contact them. Despite the friendly content on the latter’s contact page, I have yet to even get a response from them. Losers.) Anyway, too many bad memories working in building control… Moving on…

I finally landed on a really old project idea: develop an on-board pilot that can fly (what would otherwise be) a remote control airplane. (“Old” in the sense that it had been kicking around in my head for a long time.) My plan is that subsequent posts here will provide the details of what I’m doing, to the extent that anyone could replicate it. I’ve already done a bit of research and testing, but I’ll get to that later. Suffice it for now to explain why this is a “new start”.

I’ve droned (pun intended) on before about how intelligence sprang from movement, from living in real time. (Read some of the earliest posts here for more details on that.) This isn’t to say that deep learning – which to my knowledge has no temporal notions in it – couldn’t eventually do intelligent things. It’s saying that movement is the source of biological intelligence. I tried developing simple machines in simulations, but found that simulations can easily go wrong. Although I wasn’t thinking about this when I landed (pun intended) on this project, this will in fact provide me with a full temporal-based feedback cycle from which my AGI ideas can finally take … some form.

Step one: what equipment would an on-board pilot need?