Category: Uncategorized

Raspbian Still Bad

The command “sudo apt-get install arduino” doesn’t get you a working Arduino development environment. Instead, when you try to launch the IDE from the command line, you get “Error occurred during initialization of VM. Server VM is only supported on ARMv7+ VFP”. I get that cross-compilation can be tricky, but this software didn’t get released, it escaped. Arduino isn’t exactly an obscure package.

To be fair, this isn’t because Arduino is broken. It’s because apt automatically chooses a version of Java that can’t run on the Zero W. So any package that uses Java is probably also broken, but I’m not looking into it, because I’m trying to actually do something with the Raspberry Pi. If I wanted to fuck around with a broken package chain, I’d… have gotten my fill of that like 10-13 years ago, so I guess I’d need a time machine. Really, the Pi Zero W is a lot like a time machine, taking me back to a time when Linux was a project, rather than being something you can use for projects.

If you, for some reason, want to use the Arduino IDE on a Pi Zero W, the incantation is to install Java 8 with ‘sudo apt-get install openjdk-8-jre-headless openjdk-8-jre’ and then select it with ‘sudo update-alternatives --config java’. Then you’ll be running a 5-year-old Java and a 6-year-old Arduino IDE, but at least the IDE will start up.

Curious Effects Getting List Extents

I have a program that gets a list of GPS waypoints and wants to figure out their bounding box. The naive way[1] to do this is to find the maximum and minimum latitude and longitude, and use the maxima as one corner and the minima as the other.

Off the top of my head, I can think of two ways to do this. The first is to iterate over the list of waypoints, comparing each point to the maximum and minimum seen so far and updating them as I go. The list has N points and I have to look at all of them, so it’s O(N); so far so good.

The other way is to use list comprehensions to pull the latitudes and longitudes out as separate lists, and then call max() and min() on each of those. I assumed that each list comprehension is O(N), and that each call to max() or min() is also O(N), since they have to look through the whole list to find the extreme value. That makes 6 passes over the list (2 comprehensions, 2 max() calls, 2 min() calls), so I expected this to be the slower way to do it.

It turns out, not so much.

I ran the code below on Repl.it. Usually the list comprehension version is anywhere from very slightly faster to about twice as fast. Occasionally the 10,000-point case comes out slower, but not consistently.

import random 
from timeit import default_timer as timer

#Try some different sizes of lists
for jj in [10, 100, 1000, 10000, 100000, 1000000]:

  #Set up N waypoints
  waypoints = []
  for ii in range(jj):
    lat = (random.random() * 360) - 180
    lon = (random.random() * 360) - 180
    waypoints.append({"lat":lat, "lon":lon})

  start = timer()

  # One loop
  maxLat = maxLon = -float("inf")
  minLat = minLon = float("inf")
  for point in waypoints:
      lat = float(point["lat"])
      if lat < minLat:
          minLat = lat
      if lat > maxLat:
          maxLat = lat
      lon = float(point["lon"])
      if lon < minLon:
          minLon = lon
      if lon > maxLon:
          maxLon = lon

  mid = timer()

  # List comprehensions
  lons = [float(point["lon"]) for point in waypoints]
  lats = [float(point["lat"]) for point in waypoints]
  minLat1 = min(lats)
  minLon1 = min(lons)
  maxLat1 = max(lats)
  maxLon1 = max(lons)

  end = timer()

  #Print the results
  print(f"{jj} points")
  print(f"  first way {mid-start}")
  print(f"  second way {end-mid}")
  print(f"  speedup {(mid-start)/(end-mid)}")
  assert(minLat == minLat1)
  assert(maxLat == maxLat1)
  assert(minLon == minLon1)
  assert(maxLon == maxLon1)

So why is it faster? Clearly, I’m assuming something wrong. I suspect the main thing that I’m assuming wrong is that the constant 6 multiplied by the O(N) matters. It probably doesn’t, and that’s why we typically drop the constant multipliers in runtime comparisons. It’s likely that list comprehensions and max()/min() of iterables are calls to a very fast C implementation, and are just so much faster than my loop in Python that the fact that I’m doing 6 iterations doesn’t really matter.
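To isolate that effect, here’s a smaller experiment (a separate sketch, not part of the program above) that times the C-implemented min() builtin against an equivalent explicit Python loop over the same list:

import random
from timeit import default_timer as timer

values = [random.random() for _ in range(1000000)]

start = timer()
lowest_builtin = min(values)       # one O(N) pass, but the loop runs in C
mid = timer()

lowest_loop = float("inf")         # the same O(N) pass, in interpreted Python
for v in values:
    if v < lowest_loop:
        lowest_loop = v
end = timer()

assert lowest_builtin == lowest_loop
print(f"builtin min(): {mid - start}")
print(f"Python loop:   {end - mid}")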

Another thing that I’m assuming is that max and min are implemented as linear searches over the list. It’s entirely possible that lists store references to their maximum and minimum values, and just return those when asked, rather than going looking. I doubt it, since the overhead on removing an element would be large [2], but it is possible.
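For concreteness, here’s roughly what such a structure would have to do. This is a hypothetical sketch for illustration only, not how Python’s list actually works, and the class name is made up:

class MaxCachingList:
    # Hypothetical list wrapper that caches its maximum. Reading the
    # maximum is O(1), but removing the current maximum forces an O(N)
    # re-scan, which is the overhead footnote [2] is talking about.
    def __init__(self, items):
        self._items = list(items)
        self._max = max(self._items) if self._items else None

    def append(self, item):
        self._items.append(item)
        if self._max is None or item > self._max:
            self._max = item

    def remove(self, item):
        self._items.remove(item)
        if item == self._max:
            # The cached answer just left; go searching again.
            self._max = max(self._items) if self._items else None

    def maximum(self):
        return self._max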

I haven’t looked into either of these assumptions, since timing the runs answered the question I had (“Which is faster?”), and the follow-on question (“Why?”) isn’t useful to me at this time.

[1] It does some stupid stuff around the poles and across the international date line, for example.
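For example, two points that straddle the date line (a quick sketch, separate from the program above):

points = [{"lat": 0.0, "lon": 179.0}, {"lat": 0.0, "lon": -179.0}]
lons = [point["lon"] for point in points]
# The points are 2 degrees of longitude apart, but the naive bounding
# box spans 358 degrees, wrapping the long way around the globe.
print(min(lons), max(lons))   # -179.0 179.0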

[2] You’ve removed the largest element. What is the new largest? Time to go searching…

Alternatively, the list could be implemented as a pair of parallel linked lists, where one set of links keeps the elements in their list order and the other keeps them in their sorted order, but then the list [1, 3, 5, “pickles”, <built-in method foo of Bar object at 0x1f34b881>, 6] doesn’t have a well-defined sorted order.

Drag and Drop Python Objects in WxPython

I’m working on a UI for a system that has agents, which have some qualities, and units, which have multiple agents. I want to be able to display each unit as a list, and drag agents from that list to other units, to reassign them.

There are a lot of examples of drag and drop with text and files, because WxPython already provides drop targets for text and files. One way to build on that would be to serialize the dragged objects to JSON, drop them as text, and then de-serialize them on the other side. This has some disadvantages, notably that you’re restricted to what JSON can represent. I wanted to pass Python objects, so I used pickle.
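For comparison, the JSON-as-text route would look roughly like this. This is a sketch of the alternative I decided against, not code from the program below, and it assumes the dragged agents can be flattened into plain dicts and rebuilt with the Agent class defined later:

import json
import wx

class AgentTextDropTarget(wx.TextDropTarget):
    # Receives dropped text and expects it to be a JSON list of agent dicts.
    def __init__(self, olv):
        super().__init__()
        self.olv = olv

    def OnDropText(self, x, y, text):
        agent_dicts = json.loads(text)
        # Rebuild model objects from the dicts before adding them to the
        # ObjectListView (Agent here is the class from the demo below).
        self.olv.AddObjects([Agent(**d) for d in agent_dicts])
        return True

# On the drag side, something like:
#   payload = json.dumps([{"ident": a.ident, "range": a.range,
#                          "speed": a.speed, "capacity": a.capacity}
#                         for a in selected])
#   src = wx.DropSource(self.dataOlv)
#   src.SetData(wx.TextDataObject(payload))
#   src.DoDragDrop(wx.Drag_DefaultMove)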

What I eventually came up with is below. It uses ObjectListView, which is a great wrapper around WxPython’s ListCtrl. On drag, the selected items of the source list are pickled and passed through the Wx drag and drop mechanism. When they are dropped on another ObjectListView, they are then unpickled and added to that ObjectListView (OLV), and removed from the source OLV.

One thing that this code does leave up to the programmer is ensuring that what goes in on one side of the drag and drop is the same as what is expected out on the other side. Another, slightly more subtle thing, is that this uses pickle on the drop data, so it would be possible to have a script that generates malicious pickle data, and lets you drag it from another UI window to my script’s OLV, whereupon it unpickles into something nasty.

That said, if your attacker is sitting at your computer, launching malicious programs and dragging and dropping stuff out of them, you have already lost, and should probably invest in better door locks.
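If you do want to harden the drop handler a bit anyway, pickle lets you restrict which classes it will reconstruct. Here’s a minimal sketch, assuming the only legitimate payload is a list of Agent objects and that this lives in the same file as the demo below (where Agent is defined in __main__):

import io
import pickle

class AgentOnlyUnpickler(pickle.Unpickler):
    # Refuse to reconstruct anything except the Agent class. Plain
    # containers and numbers don't go through find_class, so the
    # whitelist stays this small.
    def find_class(self, module, name):
        if module == "__main__" and name == "Agent":
            return Agent
        raise pickle.UnpicklingError(f"forbidden class {module}.{name}")

def safe_loads(data):
    return AgentOnlyUnpickler(io.BytesIO(data)).load()

# In GenericDropTarget.OnData, safe_loads(pickled_stuff) would then
# replace pickle.loads(pickled_stuff).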

#!/usr/bin/env python
import wx
import pickle
from ObjectListView import ObjectListView, ColumnDefn  #pip install objectlistview
import wx.lib.scrolledpanel as scrolled

# UI with a few list boxes that can be drag/dropped between, and have title bars

class Agent(object):
    # Model of a single agent, has:
    # identifier (string?)
    # range (km)
    # speed (kph)
    # capacity (integer)
    def __init__(self, ident, range, speed, capacity):
        self.ident = ident
        self.range = range
        self.speed = speed
        self.capacity = capacity

    def __repr__(self):
        return f"<Agent: {self.ident}>"

# Drag and drop Drop target that supports receiving pickled 
# python data structures and doing something with them. 
class GenericDropTarget(wx.DropTarget):
    def __init__(self, object):
        super(GenericDropTarget, self).__init__()
        self.object = object

        self.data = wx.CustomDataObject("PickleData")
        self.SetDataObject(self.data)

    def OnData(self, x, y, defResult):
        #print(f"OnData({x},{y})")
        
        if self.GetData():
            # unpickle data and do something with it
            pickled_stuff = self.data.GetData()
            cukes = pickle.loads(pickled_stuff)

            # TODO We are making the assumption that a "PickleData"
            # actually only has a list of Agents in it. 
            # Add some checking before making this a real thing, or
            # limit the type to a more-specific format like "AgentList"
            self.object.AddObjects(cukes)

        return defResult

    def OnDrop(self, x, y):
        #print(f"OnDrop({x},{y})")
        return True

    def OnDragOver(self, x, y, defResult):
        #print(f"OnDragOver({x},{y})")
        return defResult

    def OnEnter(self, x, y, defResult):
        #print(f"OnEnter({x},{y})")
        return defResult

class UnitPanel(wx.Panel):
 
    def __init__(self, parent, unitName="No name set"):
        wx.Panel.__init__(self, parent=parent, id=wx.ID_ANY)

        self.dataOlv = ObjectListView(self, wx.ID_ANY, style=wx.LC_REPORT|wx.SUNKEN_BORDER)

        self.dataOlv.SetColumns([
            ColumnDefn("ID", "left", -1, "ident", minimumWidth=100),
            ColumnDefn("Range", "right", -1, "range", minimumWidth=60),
            ColumnDefn("Speed", "right", -1, "speed", minimumWidth=60),
            ColumnDefn("Capacity", "right", -1, "capacity", minimumWidth=60)
        ])
 
        self.agents = []
        self.dataOlv.SetObjects(self.agents)
        self.dataOlv.Bind(wx.EVT_LIST_BEGIN_DRAG, self.OnDragInit)
        
        # Set up a drop target on the listview
        dt = GenericDropTarget(self.dataOlv) 
        self.dataOlv.SetDropTarget(dt) 

        # Set up a title for this box
        self.unitLabel = wx.StaticText(self, id=wx.ID_ANY, label=unitName)

        mainSizer = wx.BoxSizer(wx.VERTICAL)
        mainSizer.Add(self.unitLabel, proportion=0, flag=wx.ALL, border=5)
        mainSizer.Add(self.dataOlv, proportion=1, flag=wx.ALL|wx.EXPAND, border=5)
        self.SetSizer(mainSizer)

    def populate(self, units):
        self.agents = []
        # for unit in units:
        #     self.agents.append(Agent(unit["ident"], unit["range"], unit["speed"], unit["capacity"]))
        # self.dataOlv.SetObjects(self.agents)
        for unit in units:
            a = Agent(unit["ident"], unit["range"], unit["speed"], unit["capacity"])
            self.dataOlv.AddObject(a)
            #self.draggableURLText.Bind(wx.EVT_MOTION, self.OnStartDrag)

    def OnDragInit(self, event): 
        # Get all the selected items from this list
        selected = self.dataOlv.GetSelectedObjects()

        # Pickle them and put them in a custom data object
        pickled_selection = pickle.dumps(selected)
        drag_obj = wx.CustomDataObject("PickleData")
        drag_obj.SetData(pickled_selection) 

        #Create a drag and drop source from this ObjectListView
        src = wx.DropSource(self.dataOlv) 
        src.SetData(drag_obj)
        print("Drag started")
        result = src.DoDragDrop(wx.Drag_DefaultMove) 
        if result == wx.DragCopy:
            # We don't copy because agents are hardware
            self.dataOlv.RemoveObjects(selected)
        elif result == wx.DragMove:
            # Remove the data from here, add it to another list
            self.dataOlv.RemoveObjects(selected)
        else:
            # Default, do nothing
            print("Nothing, nothing, nothing at all")
        

class AssetFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, parent=None, id=wx.ID_ANY, 
                          title="ObjectListView Demo", size=(800, 600))
        
        self.panel = scrolled.ScrolledPanel(self, id=wx.ID_ANY)

        self.mainSizer = wx.BoxSizer(wx.VERTICAL)
        self.panel.SetSizer(self.mainSizer)
        self.panel.SetupScrolling()
        self.Show()

    def populate(self, config):
        for unit in config["units"]:
            unit_panel = UnitPanel(self.panel, unitName=unit["name"])
            unit_panel.populate(unit["agents"])
            self.mainSizer.Add(unit_panel)

 
if __name__ == "__main__":
    app = wx.App(False)

    # We're going to need to be able to populate the frame from the config,
    # so represent the data as a data structure and initialize with that
    config = {"units": [{"name": "Unimatrix 01",
                        "agents": [
                            {"ident": "7 of 9", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "Third of 5", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "Unit 2",
                        "agents": [
                            {"ident": "u2s1", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "u2a2", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "u2a3", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "Unit Foo",
                        "agents": [
                            {"ident": "bar", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "baz", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "quux", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "Enterprise",
                        "agents": [
                            {"ident": "Galileo", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "Copernicus", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "GSV Insufficent Gravitas",
                        "agents": [
                            {"ident": "GCU Nervous Energy", "range": 1000000, "speed": 3452334, "capacity": 13452342},
                            {"ident": "GCU Grey Area", "range": 1000000, "speed": 234523546, "capacity": 234562312}
                        ]}] 
              }

    frame = AssetFrame()
    frame.populate(config)
    app.MainLoop()

I called the unpickled data “cukes” because this is demo code, and I was being silly. A cucumber is, after all, what you get when you undo the process of making pickles. You may want to change that if you copy/paste this into production code.

A Flat Earth Would Be Odd

I’m not sure what “flat earthers” actually believe, but assuming the world is literally flat leads to some interesting results. For the sake of keeping things simple, let’s assume that the world is flat like a coin: locally bumpy, but overall shaped like a disk. This assumption is based on what the word “flat” means. A saddle isn’t flat, so a hyperbolically-curved earth doesn’t count as “flat” in any reasonable way. Further, let’s assume that it keeps the same surface area that the, shall we say, “conventional” model claims it has. This assumption lets us avoid the absurdities [1] required to preserve distances, and so travel times, while unwrapping a sphere to cover a disk.

These assumptions tell us how big the disk is. The earth’s surface area is allegedly 196.9 million square miles. A disk’s surface area is given by pi * r^2, so do the math to get r and you wind up with a radius of 7.916 million miles. This may present astronavigation problems, as the moon is only 238,900 miles away, so it might hit the disk… if the moon were real! [2]

Ok, so we all live on a disk that’s about 16 million miles across. Because I’m comically egotistical, I’m going to say that my home town is in the middle of the disk, equidistant from every edge.

Now we come to an interesting point: How thick is the disk? Let’s assume that we believe in gravity. The gravity in my home town is one G, and it causes objects to fall down, which is to say “towards the surface of the earth”. Under the conventional model, this is because the earth is under me, and so the gravity caused by the large mass of the earth exerts a force on objects above it. Now there are two ways we can go: either the earth is made of the stuff that it is observed to be made of, and has the density it is generally observed to have, or it’s made out of something far more dense. All that this really varies is how thick the disk is under my home town. To have 1G there, using the conventional materials, requires at least the alleged thickness of the round earth, which is to say about 7,917 miles. Using something denser makes it thinner, without affecting the gravity, but there are limits on how dense matter can get.

Ok, so far, so good. However, we’ve set a little bit of a trap for ourselves here. Everywhere you go on the surface of the earth, the gravity is about 1G, so everywhere you go, the disk earth has to be about 8k miles thick. In my home town, at the center of the disk, this isn’t a problem, because the gravity in all the other directions balances out.

What’s that you say? Yes, gravity in other directions. You see, we’re talking about a disk that measures about 8k miles thick and 16M miles across. If you’re in the center, there are equal amounts of disk around you in all directions, so the pull in all the other directions balances out. If you’re off center, there’s more mass on one side of you than on the other, so there’s a component of the gravitational pull that isn’t straight down, and nothing balances it out.

Initially, this would probably be pretty subtle. Things would fall a little bit in the direction of my home town, but more or less straight down. Friction would suffice to keep things on surfaces, but round things would always roll towards my home town. It would only be a little harder to walk away from my home town than towards it. I doubt it would affect which way water goes down the toilet all that badly, although the water would pile up on one side of the bowl. However, as you got towards the edge, things would get FUCKING DIRE.

How dire? Well, let’s look at the volume of that disk earth, which is properly a cylinder now that we know how thick it is. A cylinder that’s 16,000,000 miles across and 8,000 miles thick has a volume of 6,430,000,000,000,000,000 cubic miles. The conventional model earth has a volume of 260,000,000,000 (that is to say, 260 billion) cubic miles, and exerts a gravitational pull of 1G when you’re on the surface, which is to say that all of it is under you. When you’re on the edge of the disk earth, which is to say that almost all of it is, say, east of you, it exerts a gravitational pull of (around) 24,730,769.2308G. So to understand the force exerted on some poor schmuck who happens to get teleported from my town to the literal eastern edge of the world, assume that the ISO standard schmuck weighs 150 lbs. He will suddenly experience a thrust of about four billion pounds of force to the west. For comparison, a Saturn V rocket generated about 7.5 million (with an “m”) pounds of thrust, roughly 1/500th of what people living on the edge would experience as a consequence of just being there.

Unfortunately, since some of that force (at least 1G of it, thanks to the thickness of the disk) is downwards, towards the center of mass of the disk, rather than the location of my home town, the schmuck is going to hit the ground going absurdly fast and get spread all over it.

But wait, what is that ground made of? We said earlier that this disk is made out of normal earth stuff, which you can go out and observe to be mostly silicate-based rocks. The rocks under my home town have about 8 million miles of rock around them, pressing in towards the center of mass of the disk. Extremely weird stuff happens to matter under those kinds of pressures. Hydrogen (theoretically) becomes a metal. Atomic nuclei get mashed into each other.

That said, my home town is going to have other problems. For example, all the water in the world, and all the air, and all the stuff that’s far enough away to experience mostly-sideways gravity, is all going to flow towards the center of mass of the disk, and some of it will be coming in very fast. Since my town is barely above sea level under the conventional model, I think it’s going to get both extremely hot, due to the abrupt change in pressure, and rather wet, although possibly not before the disaster of degenerate matter that’s forming under it gets to the surface.

Alright flat-earthers, you got my home town into this, you get it out. Why are none of these effects observed on the ostensibly flat earth that we live on?

Well, maybe it’s not 8,000 miles thick. Maybe it is in fact quite thin, and They can manipulate gravity to provide ~1G everywhere you go. They put some thickness everywhere that anyone decides to dig, or anywhere a tree falls over, but everywhere else it’s… 1 inch thick. One inch is pretty thin, but that still puts the volume of the disk at around 99,804,672,000,000,000 cubic miles, and the inwards G-force experienced by someone on the edge at around 383,864.123G. This is still a troubling amount of force, but clearly if They can provide 1G over the whole surface of the earth, They can sort this out too.

That said, how long have They been doing this? If the earth has always been a disk, then obviously They were doing it before we evolved, so they’re not human. If They did it recently, why did no one notice the change? Was that the “road work” that was making my commute to work slow this morning? Either way, this moves “Them” from the category of “Federal/NWO government conspiracy to hide the truth, man!” to “Capricious god or gods with odd senses of humor”. At that point, there’s no use arguing what shape the world is, because it might be different tomorrow.


[1] These absurdities extend from the simple “everyone who has ever traveled is part of the conspiracy and lies about how long it takes” to the complex “THEY (It’s always ‘they’, innit?) can alter spacetime to slow or speed up travel”.

[2] Spoiler alert: it is.

ToyBrain Rides Again

Now with WiFi, current monitoring, and the ever-worrying li-poly battery chemistry.


I haven’t tested the WiFi module or motor drivers yet, but the board is fully populated and the battery regulation circuit works.

There are a few design changes in the pipeline, but I’m going to go for a full hardware test and try to find more bugs before I create version 2 of the boards.

The current design is on GitHub, along with all the docs.

You've failed me for the last time, Register.com.

I tried paying to renew my domain at register.com. I tried 5 times. In a normal business, if you try to pay, the business usually takes your money. Not register.com. So I’ve transferred my domain to 1&1, who will hopefully be able to take my money and provide me with goods and services in exchange.

Current State of The ToyBrains

The fuse settings on the current device are low: 0xe2, high: 0xda, and extended: 0x5. I can talk to it via ICSP, and get the correct device signature (0x1e9514) back. What all of this says to me is that the ICSP settings are correct and the onboard oscillator is running, so the chip is capable of having the bootloader installed.
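For reference, that low fuse byte decodes like this. This is a quick sketch of the bit layout as I understand it from the ATmega328 datasheet; double-check it against the datasheet rather than trusting my bit-twiddling:

low_fuse = 0xE2
cksel  = low_fuse & 0x0F         # 0b0010: calibrated internal RC oscillator
sut    = (low_fuse >> 4) & 0x03  # start-up time selection
ckout  = (low_fuse >> 6) & 0x01  # 1 = no clock output on the CKOUT pin
ckdiv8 = (low_fuse >> 7) & 0x01  # 1 = clock NOT divided by 8, so a full 8MHz
print(f"CKSEL={cksel:04b} SUT={sut:02b} CKOUT={ckout} CKDIV8={ckdiv8}")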

It is entirely likely that I was using the wrong bootloader for my boards. I am using an ATMega328 running at 8MHz, so I suspect that the correct bootloader is ATmegaBOOT_168_atmega328_pro_8MHz.hex. This is important because a bootloader created for the wrong clock speed can still be loaded onto a board, but it won’t be able to communicate over the serial port. The timing of the serial signals would be messed up, because any delay operations become either too long or too short, depending on whether the clock is too slow or too fast.
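As a back-of-the-envelope illustration of how badly that goes wrong (the 16MHz figure and the 57600 baud rate here are assumptions for the example, not measurements from my board):

compiled_for_hz = 16000000  # assume the bootloader was built for a 16MHz board
actual_hz = 8000000         # this board runs its internal oscillator at 8MHz
nominal_baud = 57600        # a typical Arduino bootloader baud rate (assumed)

# Every timing loop runs at half speed, so the effective baud rate is halved,
# and avrdude on the other end of the serial link hears garbage.
effective_baud = nominal_baud * actual_hz / compiled_for_hz
print(effective_baud)       # 28800.0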

I got the latest version of the Arduino IDE, and modified the appropriate files as described at the end of this entry. In order to burn the bootloader, I had to be root, so I started the Arduino IDE as root and burned the bootloader, which apparently worked on the first try.

I quit and restarted the IDE, because I didn’t want to keep running as root, and plugged in my FTDI cable. Unfortunately, the IDE couldn’t compile my little test program, because the Arduino IDE ships with an old version of avr-gcc (4.3.2) and the ATMega328 wasn’t supported until later. I have avr-gcc 4.5.3, so I renamed the avr folder in /arduino-1.0.5/hardware/tools to avr_old. This forces the IDE to use the system avr-gcc, because it can’t find its own. With this, I was able to compile.

Next, I attempted to upload the compiled program to the board. The upload failed with the error message “avrdude: stk500_recv(): programmer is not responding”.

I switched to using the programmer to upload the sketch, and wrote a sketch that blinks an LED on analog pin 5. Originally, the sketch used analog pin 7, because that’s where my debug LED is hooked up, but it turns out that while you can use A0-A5 as digital outputs, you can’t do that with A6 or A7 (on the ATMega328, those two pins are analog inputs only).

At any rate, the system can now blink an LED on A5. This verifies that the onboard clock is working, that the memory can be written to, and that the compiler is generating valid code. The clock speed is even correct: a blink program with a 1-second period really does produce 1-second blinks.

Now I just need to figure out why uploading via USB/serial doesn’t work, and I’ll be golden.


Modkit Micro Alpha

Modkit Micro Alpha does not run on Ubuntu 12.04 64-bit without the user moving some directories around and installing ia32-libs.

It also does not save projects more than once per launch, or compile code at all, on Ubuntu or Mac OSX.

I appreciate that the point of Modkit’s Kickstarter campaign was not to sell a finished product, but to get the money to produce one. That said, people who have paid money for a thing may want that thing to work. Modkit is now in the uncomfortable position of having sold the moon, and being expected to deliver it.

ToyBrain V2 Ordered

I’ve ordered the boards for the second version of the ToyBrain project. These boards are smaller than the originals, and should correct most of the problems with the first run (swapped TX/RX lines, a misplaced ICSP header, etc.). I’m planning to outfit them with ATMega328s.

If these work and are all correct, the next version will be done in black, possibly with gold for looks. After that, perhaps I’ll make a kickstarter of it and see if I can’t sell a few boards.

FFMPEG recipes

ffmpeg -y -i video.avi -vframes 1 -ss 00:00:10 -an -vcodec png -f rawvideo -s 320x240 frame.png

Extract the frame at 10 seconds into the video file video.avi, resize it to 320×240, and save it to frame.png.
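The same recipe from inside a Python script, as a minimal sketch that assumes ffmpeg is on the PATH:

import subprocess

subprocess.run(
    ["ffmpeg", "-y", "-i", "video.avi",
     "-vframes", "1", "-ss", "00:00:10", "-an",
     "-vcodec", "png", "-f", "rawvideo", "-s", "320x240",
     "frame.png"],
    check=True,  # raise CalledProcessError if ffmpeg exits nonzero
)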