Vape Sensor Plotting

The values calculated here are differences from the average of the first 20 samples. The X axis is time, but it’s not well-specified because I didn’t actually set a regular timer; I just collected samples as fast as possible.
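For the record, the processing is just baseline subtraction. Here’s a minimal sketch of it, assuming the serial output got captured to a CSV file with one column per sensor (the file name is made up; the 20-sample baseline is the one described above):

import csv

BASELINE_SAMPLES = 20

# Hypothetical capture of the Arduino serial output: one row per loop() pass,
# one integer column per sensor.
with open("vape_log.csv") as f:
    rows = [[int(v) for v in line] for line in csv.reader(f) if line]

sensor_count = len(rows[0])

# Baseline for each sensor: the average of its first 20 samples
baselines = [sum(row[s] for row in rows[:BASELINE_SAMPLES]) / BASELINE_SAMPLES
             for s in range(sensor_count)]

# What the plots show: each sample's difference from that sensor's baseline
deltas = [[row[s] - baselines[s] for s in range(sensor_count)] for row in rows]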

As was predicted in the previous entry, the MQ-135 and MQ-3, both of which can sense alcohol, had the strongest response to vape clouds. These are the ones I’d use for the primary detection if I was building a serious vape detecting product.

The MQ-9 and MQ-8 had similar responses, but not as strong. This is kind of interesting, since they are supposed to be good for butane, propane, LPG, and hydrogen, but maybe they just do a decent job detecting light molecules with carbon in them? The MQ-2 response is interesting, since it’s sold as a LPG, propane, and hydrogen detector, but has a moderate response to vapes too.

The much lower responses seem to indicate that vape clouds have no VOCs (CJMCU-1100) and no town gas or LPG (MQ-5, but I didn’t need a sensor to tell me that).

In Which Vapes May Yet Be Sensed

I’m coming to the “being a narc” thing a little late, since everyone is now having a panic about cutting agents in vapes killing people (Body count stands at ~15. The cops killed about 678 people so far this year, so y’know, keep vaping and avoid cops, it’s 45 times safer). At any rate, a school in my area was sold fantastically expensive devices that monitor areas for vaping, and report to the school administration when someone does it. The devices are on a subscription model, so they’re not just expensive once, they stay expensive.

This got me wondering what it actually takes to detect vape… vapor. Vape juice is mostly propylene glycol and glycerine, plus a dab of flavor and possibly nicotine. I have two theories about ways to detect this. One is that there will be some material produced by vaping that is detectable with a gas sensor. The other way is that phat clouds can be picked up by a particulate sensor, and the fact that it picks up smoke too is just fine, since the kids aren’t supposed to be smoking in the bathroom either.

MQ-2: Combustible gases, LPG, propane, hydrogen, probably also methane
MQ-8: Just hydrogen
MQ-4: Methane and natural gas
MQ-3: Alcohol
MQ-6: LPG, iso-butane, propane
MQ-5: LPG, natural gas, “town gas”
MQ-135: Carbon monoxide, carbon dioxide, ammonia, nitrogen oxide, alcohols, “aromatic compounds”, “sulfide”, and smoke
MQ-9: Carbon monoxide, LPG
CJMCU-1100: VOCs, toluene, formaldehyde, benzene, and so on

There’s a lot of overlap in these sensors, as well as a lot of ambiguity in their datasheets. “Sulfide” isn’t a thing without whatever it’s a sulfide of. Hydrogen sulfide is a toxic and terrible-smelling gas. Cadmium sulfide is a bright yellow solid. “Town gas” is typically a mix of carbon monoxide and hydrogen. LPG is also a mix, including propane, butane, and isobutane, so having it in a list with any of those is kind of redundant.

I haven’t tested yet, but I suspect that the sensors most likely to detect sick clouds are the MQ-3, the MQ-135, and maaaaybe the CJMCU-1100. The MQ-3 claims to detect alcohol, as in a breathalyzer, but the family of alcohols is actually pretty large. There’s ethyl (drinkin’ alcohol) and isopropyl (cleanin’ alcohol) and methyl (killin’ alcohol), in addition to some stuff people don’t typically think of as alcohols, like the sugar alcohols, which include glycerine. Since glycerine is in vape juice, perhaps the sensor will detect it.

The actual mechanism of these sensors is interesting. They appear to the circuit as a resistor that changes resistance in the presence of a gas. The resistor is made of tin dioxide, which has a resistance that drops when exposed to the gasses, but it only responds quickly if the device is hot, so there is also a built-in heater for the sensors.
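To put numbers on that: these breakout boards wire the sensing element as a voltage divider with a fixed load resistor, and the analog pin reads the midpoint. Here’s a quick sketch (in Python, just for the arithmetic) of turning an ADC count back into a sensor resistance, assuming a 10-bit ADC and a 10k load resistor, both of which you should check against your particular board:

ADC_MAX = 1023       # 10-bit Arduino ADC (assumption)
R_LOAD = 10_000.0    # load resistor on the breakout, commonly 10k (assumption)

def sensor_resistance(adc_count):
    """Solve the divider Vout/Vcc = R_load / (R_sensor + R_load) for R_sensor."""
    v_ratio = max(adc_count, 1) / ADC_MAX   # avoid dividing by zero on a dead pin
    return R_LOAD * (1.0 / v_ratio - 1.0)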

Because the sensors have little heaters in them, A) they smell weird when they power up for the first time and B) they take a while to stabilize. I powered them up and watched them on the Arduino serial plotter until the outputs more or less leveled out. Then I vaped at them.

Pretty much all of the sensors had some sort of response, but the thing that seemed to vary between them was that some responded faster than others. The next step is going to be logging which ones are outputting what, so I can tell which ones had the fast response and which ones had the strongest response.

The sensors also have a slight response when I just blow at them with normal breath (I have a control! I’m a scientist, doin’ a science!). Interestingly, for some of the sensors, the reaction to my normal breath was a deviation downwards, towards a lower value, while the vape reactions were uniformly deviations upwards, towards a higher value. This suggests a sanity check: include one of the sensors that dips on normal breath, and only call it vaping when the main detectors and that sensor all go up, since plain breath would push the main detectors up but push the check sensor down.
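As a sketch of what that check might look like in code, with made-up sensor roles and an arbitrary threshold, operating on the baseline-subtracted values from the plots:

def looks_like_vape(mq3_delta, mq135_delta, sign_check_delta, threshold=20.0):
    """Vape pushed every sensor up; plain breath pushed the sign-check sensor
    down. So require the main detectors to rise AND the check sensor not to fall."""
    main_sensors_rising = mq3_delta > threshold and mq135_delta > threshold
    not_just_breath = sign_check_delta > 0
    return main_sensors_rising and not_just_breath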

int readings[9] = {0,0,0,0,0,0,0,0,0};
int index = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  //Read all the analog inputs from A0-A8 (I think they're consecutive...)
  index = 0;
  for(int ii = A0; ii <= A8; ii++ ){
    readings[index] = analogRead(ii);
    index++;    
  }

  for(int jj = 0; jj < index-1; jj++){
    Serial.print(readings[jj]);
    Serial.print(",");
  }
  Serial.println(readings[index-1]);
}

On Looking into Mouse Sensors

That is to say, the sensors from optical mice, rather than a sensor intended to detect the small rodent.

I have 10 boards from optical mice and three desoldered sensors. Among that bunch there are two common IC packages, a 16-pin staggered DIP (4 units) and an 8-pin staggered DIP (6 units). There is also a 20-pin staggered DIP, and two 12-pin DIP packages.

Most of the chips were made by Agilent, or Avago, a spin-off of Agilent that eventually bought Broadcom and started operating under that name. A couple are from At Lab, or as they style themselves “@lab”.

The chip interfaces are very heterogeneous. Some of them just output PS2 data and clock signals, and so are essentially a complete single-chip mouse. Some of them output quadrature signals for x and y motion.

I had high hopes for using these mouse sensors for a couple of hacks, because they are essentially optical flow processors. One use is getting velocity from the observed motion of stationary objects as seen from a moving platform, assuming you know how far away the objects are (and so getting odometry for a moving robot by watching the ground roll by). The inverse is getting how far away an object is, assuming that it is stationary and you know your own speed (for height-over-ground detection in a drone, for example).
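Both of those uses boil down to the same small-angle relationship. Here’s a sketch for a downward-facing sensor over flat ground, ignoring rotation; the focal length in pixels and the frame rate would come from whatever sensor and optics you actually end up with:

def ground_speed(pixel_flow, height_m, focal_px, frame_rate_hz):
    """Odometry case: flow in pixels/frame -> angular rate -> velocity, given height."""
    angular_rate = (pixel_flow / focal_px) * frame_rate_hz   # radians per second
    return angular_rate * height_m                           # meters per second

def height_above_ground(pixel_flow, speed_m_s, focal_px, frame_rate_hz):
    """Drone case: the inverse, height from flow given a known velocity."""
    angular_rate = (pixel_flow / focal_px) * frame_rate_hz
    return speed_m_s / angular_rate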

Ultimately, though, I don’t think this stash of ICs is going to do the job I want. What I want is something I can drop into projects I’m working on, and reverse engineering each of these, finding the datasheets for ICs old enough to support PS2 protocol, and so forth, would be its own hassle of a project. USB optical mice are $7 or so, so I can’t really justify the effort to get these working, sort out optics for them, etc.

On top of that, drone optical flow sensors with the optics already sorted are like $10-20, so for that use case, I can just buy the part. For robot odometry, I can use the same part, or put optics on a USB mouse that can actually plug into a recent computer, instead of decoding quadrature or PS2.

It feels kind of weird to pick up one of my old projects that I had been kind of looking forward to, and realize that it’s simply not useful or interesting, but I guess that’s just how it goes. At least I can free up that parts drawer now!

Algorithmic bar stocking

Generally, stocking a bar is a pretty simple affair. You get a big bottle of your basic alcohols (rum, gin, vodka, whiskey, tequila), smaller bottles of some other stuff (vermouth, bitters, etc), and a bunch of mixers.

Given a well-stocked bar, you can make a lot of drinks. However, you can’t make all the drinks, or rather, you shouldn’t. Not all combinations of ingredients will work well together. A rum and Coke is fine. A Coke and Coke won’t impress anyone, and a Midori and Coke would be awful, so the number of drinks actually worth making from a set of N ingredients is much lower than the raw combinatorics would suggest.

I’m very gradually building a barbot, which adds a further complication: the robot only has five pumps. So from all the available alcohols, mixers, and so on, I have to choose five (there’s also a practical concern, which is that pumping soda shakes it up badly, but I’m choosing to ignore that for now).

For example, gin, tonic, vodka, orange juice, and cranberry juice would let me make a Vodka and Tonic, G&T, Screwdriver, Gin & Juice, and a Cape Cod. That’s pretty good, with 5 different drinks available from 5 different ingredients (10 if you count “shot of gin”, “glass of orange juice”, and so on as drinks).

But I want to know what the set of five liquors with the most possible mixed drinks is. To that end, I’ve downloaded the complete set of mixed drinks from the webtender, which I plan to use as the data for making my drink set.

The algorithm is another matter.
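The dumbest thing that could possibly work is brute force: score every 5-ingredient subset by how many recipes it completely covers, and keep the best one. Here’s a sketch of that, with a little stand-in dictionary where the Webtender data will eventually go:

from itertools import combinations

# Stand-in for the Webtender dump: drink name -> set of required ingredients
recipes = {
    "Gin & Tonic":  {"gin", "tonic"},
    "Vodka Tonic":  {"vodka", "tonic"},
    "Screwdriver":  {"vodka", "orange juice"},
    "Gin & Juice":  {"gin", "orange juice"},
    "Cape Cod":     {"vodka", "cranberry juice"},
    "Cuba Libre":   {"rum", "cola", "lime juice"},
}

PUMPS = 5
all_ingredients = sorted(set().union(*recipes.values()))

def drinks_makeable(stocked):
    """Every recipe whose ingredients are all in the stocked set."""
    stocked = set(stocked)
    return [name for name, needs in recipes.items() if needs <= stocked]

best = max(combinations(all_ingredients, PUMPS),
           key=lambda combo: len(drinks_makeable(combo)))
print(best, "->", drinks_makeable(best))

That’s O(N choose 5) with a full recipe scan per subset, which is going to hurt once the real ingredient list goes in, which is part of why the algorithm is another matter.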

Color All The FloatCanvas Objects!

I’m drawing stuff over a background image of the ocean, which looks bluish-green. Naturally, bluish-greens and some grays don’t have enough contrast to stand out, so they kind of get lost in the image.

I initially tried calculating luminance from RGB triplets, which does work for readability, but I was using the web readability thresholds for the luminance ratio (4.5 for normal, 7 for high contrast). That didn’t work very well, because my background is middling in luminance, so the thresholds rejected most of the middle range of colors, leaving everything either dark and similar-looking, or light and similar-looking.
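For reference, the ratio in question is the WCAG contrast ratio, which is where those 4.5 and 7 thresholds come from. A minimal sketch of that calculation:

def relative_luminance(rgb):
    """WCAG relative luminance of an 8-bit-per-channel RGB triplet."""
    def linearize(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """Lighter luminance over darker, per WCAG; ranges from 1 to 21."""
    lighter, darker = sorted((relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)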

I switched to RGB color distance (which, yes, isn’t perceptually flat, but this isn’t really a color-sensitive application, beyond not making them too similar). In order to figure out where my threshold should be, I wanted to get a list of all the wxPython named colors, and what they looked like, and their distance from “cadetblue”, which is about the same color as the average of my ocean background.

#!/usr/bin/python

# Generate a page that previews all the colors from WX
import math  # used in get_d() below
import wx

app = wx.App(False)

import wx.lib.colourdb as wxColorDB

def get_d(colorname1, colorname2="cadetblue"):
    target_color = wx.Colour(colorname1).Get()
    source_color = wx.Colour(colorname2).Get()
    # Euclidian distance
    d = math.sqrt(sum([pow(x[0] - x[1], 2) for x in zip(target_color[:3], source_color)]))
    return d
    
print("<html>")
print("  <body>")

colors = list(set(wxColorDB.getColourList()))
colors.sort(key=lambda c: get_d(c, "white"))

for color in colors:
    cRGB = wx.Colour(color).Get()
    print("    <div style=\"white-space: nowrap\">")
    print(f"      <div style=\"display: inline-block; width:300px\">{color}</div>")
    print(f"      <div style=\"display: inline-block;height:30px;overflow:auto;background-color:rgb({cRGB[0]},{cRGB[1]},{cRGB[2]}); width:100px\"> </div>")
    d = get_d(color, "white")
    print(f"      <div style=\"display: inline-block\">{d}</div>")
    print("    </div>")

print("  </body>")
print("</html>")

That script generates an HTML page with the colors in it, ranked by distance from the given color name. I picked white for the version published here, but as you can see, the default is “cadetblue”. If you pick a color name WX doesn’t know, you’re going to have a bad time.

A distance of 80 seemed to work pretty well for me, so as a rule of thumb, 80 units of color distance gets you a distinct color in 8-bit-per-color RGB color space.

There are, of course, some problems to be aware of. For instance, distance folds hue and value together, so getting brighter or darker and remaining the same hue can make up a lot of that 80 units, without necessarily getting good contrast.

Raspbian Still Bad

The command “sudo apt-get install arduino” doesn’t get you the Arduino development environment. Instead, when you try to run it from the command line, you get “Error occured during initialization of VM. Server VM is only supported on ARMv7+ VFP”. I get that cross-compilation can be tricky, but this software didn’t get released, it escaped. Arduino isn’t exactly an obscure package.

To be fair, this isn’t because Arduino is broken. It’s because apt automatically chooses a version of Java that can’t run on the Zero W. So any package that uses Java is probably also broken, but I’m not looking into it, because I’m trying to actually do something with the Raspberry Pi. If I wanted to fuck around with a broken package chain, I’d… have gotten my fill of that like 10-13 years ago, so I guess I’d need a time machine. Really, the Pi Zero W is a lot like a time machine, taking me back to a time when Linux was a project, rather than being something you can use for projects.

If you, for some reason, want to use the Arduino IDE on a Pi Zero W, the incantation is to install Java 8 with ‘sudo apt-get install openjdk-8-jre-headless openjdk-8-jre’ and then use ‘sudo update-alternatives --config java’. Then you’ll be running a 5-year-old Java and a 6-year-old Arduino IDE, but at least the IDE will start up.

The Year of Linux on the Desktop is “Ha, Get Fucked”

People talk about usability, and in a lot of ways, Ubuntu is pretty close. It certainly beats Windows 10, at least for me, since you’re allowed to know what has gone wrong and fix it, instead of just rebooting whenever your computer gets infested with ghosts. I do complain about Ubuntu, but it’s usually because most things are decent, so the things that are bad are particularly egregious. And, after all, I did get what I paid for.

This time, though, I’m trying to work with a Raspberry Pi Zero W, and Raspbian. It has a little setup walkthrough that clearly took a serious blow to the head at some point, since it asks you to connect to WiFi, and then if you didn’t connect to a network, still asks if you want to download software. From what, I might ask, since there’s not a network connection?

Of course, the reason I didn’t connect to the network is that it’s hidden. Not a problem, I have the SSID on paper here… and no way to tell it to the Pi. There’s a network configurator, but it only lets you deal with unhidden networks, not enter your own SSIDs. Maybe this was done because “SSID” is one of those worrying “techie” terms, and this is supposed to be for somewhat less technical users? Protip: it’s not. It ships as a bare board in an antistatic bag, for fuck’s sake. Maybe this was not done because it’s somehow hard?

At any rate, the solution appears to be: manually edit wpa_supplicant.conf. Manually edit a text file that only root can edit, in this, the Year of Our Lord 2019. I don’t have a problem with this. I have a PhD in making tiny robots go and have been using Linux for 15 years, because everything else is worse for my use cases. Normal humans, who are perhaps entering college and just want to check out Linux and maybe try writing a little Scratch or Python, are going to have a problem with this.

Also, the same startup script asked me if my screen had black bars around the edges. It did, so I said yes (more fool me!). When I rebooted, the edges of the desktop (you know, where the UI goes) were mostly off the edge of the screen. Setting the hostname with the network config tool on the toolbar caused hostname resolution errors every time I used sudo, apparently because sudo wanted “ouija1” but “ouija_1” got written into /etc/hosts. That’s not actually a valid hostname (my error), but it got written into /etc/hosts anyway (an error by whoever wrote the alleged network config tool). Again, I know what /etc/hosts is, and editing it isn’t an issue. I’m weird. For most people, this is an issue.

So in general, my experience with the Raspberry Pi and Raspbian is that it’s not ready for end users who want to use it to do things. If you want it in order to do things to it, rather than with it, and are already an experienced electronics enthusiast and Linux user, you’ll be fine.

Curious Effects Getting List Extents

I have a program that gets a list of GPS waypoints and wants to figure out their bounding box. The naive way[1] to do this is to find the maximum and minimum latitude and longitude, and use the maxes as one corner and the minimums as the other corner.

Off the top of my head, I can think of two ways to do this. The first is to iterate the list of waypoints, comparing each to the max and minimum so far, and updating as I go. The list has N points and I have to look at all of them, so that’s O(N); so far so good.

The other way is to use list comprehensions to get the latitudes and longitudes as separate lists, and then call max() and min() on each of those. I would assume that each list comprehension is O(N), and each call to max() or min() is also O(N), since they have to look through the whole list to find the maximum or minimum. That makes 6 passes over the list (2 comprehensions, 2 max() calls, 2 min() calls), so this should be the slower way to do it.

It turns out, not so much.

I ran the code below on Repl.it, and usually the list comprehension version was somewhere between very slightly faster and twice as fast. Occasionally the 10,000-point case is slower, but not all the time.

import random 
from timeit import default_timer as timer

#Try some different sizes of lists
for jj in [10, 100, 1000, 10000, 100000, 1000000]:

  #Set up N waypoints
  waypoints = []
  for ii in range(jj):
    lat = (random.random() * 360) - 180
    lon = (random.random() * 360) - 180
    waypoints.append({"lat":lat, "lon":lon})

  start = timer()

  # One loop
  maxLat = maxLon = -float("inf")
  minLat = minLon = float("inf")
  for point in waypoints:
      lat = float(point["lat"])
      if lat < minLat:
          minLat = lat
      if lat > maxLat:
          maxLat = lat
      lon = float(point["lon"])
      if lon < minLon:
          minLon = lon
      if lon > maxLon:
          maxLon = lon

  mid = timer()

  # List comprehensions
  lons = [float(point["lon"]) for point in waypoints]
  lats = [float(point["lat"]) for point in waypoints]
  minLat1 = min(lats)
  minLon1 = min(lons)
  maxLat1 = max(lats)
  maxLon1 = max(lons)

  end = timer()

  #Print the results
  print(f"{jj} points")
  print(f"  first way {mid-start}")
  print(f"  second way {end-mid}")
  print(f"  speedup {(mid-start)/(end-mid)}")
  assert(minLat == minLat1)
  assert(maxLat == maxLat1)
  assert(minLon == minLon1)
  assert(maxLon == maxLon1)

So why is it faster? Clearly, I’m assuming something wrong. I suspect the main thing that I’m assuming wrong is that the constant 6 multiplied by the O(N) matters. It probably doesn’t, and that’s why we typically drop the constant multipliers in runtime comparisons. It’s likely that list comprehensions and max()/min() of iterables are calls to a very fast C implementation, and are just so much faster than my loop in Python that the fact that I’m doing 6 iterations doesn’t really matter.

Another thing that I’m assuming is that max and min are implemented as linear searches over iterables. It’s entirely possible that iterables store references to their maximum and minimum values, and just return that when asked, rather than going looking. I doubt it, since the overhead on removing an element would be large [2], but it is possible.

I haven’t looked into either of these assumptions, since timing the runs answered the question I had (“Which is faster?”), and the follow-on question (“Why?”) isn’t useful to me at this time.

[1] It does some stupid stuff around the poles and across the international date line, for example.

[2] You’ve removed the largest element. What is the new largest? Time to go searching…

Alternatively, the list could be implemented with two parallel sets of links, where one set of links is the elements in their list order, and the other set is the elements in their sorted order, but then the list [1, 3, 5, “pickles”, <built-in method foo of Bar object at 0x1f34b881>, 6] doesn’t have well-defined links for the sorted order.

Drag and Drop Python Objects in WxPython

I’m working on a UI for a system that has agents, which have some qualities, and units, which have multiple agents. I want to be able to display each unit as a list, and drag agents from that list to other units, to reassign them.

There are a lot of examples for using drag and drop with text and files, because WxPython provides drop targets for text and files already. One way that this could be implemented is to serialize the dragged objects to JSON, drop them as text, and then de-serialize them. This has some disadvantages, notably that you end up restricted by what you can pass via JSON. I wanted to pass Python objects, so I used pickle.

What I eventually came up with is below. It uses ObjectListView, which is a great wrapper around WxPython’s ListCtrl. On drag, the selected items of the source list are pickled and passed through the Wx drag and drop mechanism. When they are dropped on another ObjectListView, they are then unpickled and added to that ObjectListView (OLV), and removed from the source OLV.

One thing that this code does leave up to the programmer is ensuring that what goes in on one side of the drag and drop is the same as what is expected out on the other side. Another, slightly more subtle thing, is that this uses pickle on the drop data, so it would be possible to have a script that generates malicious pickle data, and lets you drag it from another UI window to my script’s OLV, whereupon it unpickles into something nasty.

That said, if your attacker is sitting at your computer, launching malicious programs and dragging and dropping stuff out of them, you have already lost, and should probably invest in better door locks.
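If you did want to harden it anyway, the usual move is to subclass pickle.Unpickler and refuse to reconstruct anything you don’t expect. Here’s a sketch of that, which could stand in for the bare pickle.loads() call in OnData below (this narrows the blast radius, it is not a security review):

import io
import pickle

class RestrictedUnpickler(pickle.Unpickler):
    """Only reconstruct classes that are explicitly allowed."""
    def __init__(self, file, allowed):
        super().__init__(file)
        self.allowed = allowed  # dict of (module, name) -> class

    def find_class(self, module, name):
        try:
            return self.allowed[(module, name)]
        except KeyError:
            raise pickle.UnpicklingError(f"Refusing to unpickle {module}.{name}")

def restricted_loads(data, allowed):
    return RestrictedUnpickler(io.BytesIO(data), allowed).load()

# In OnData, something like:
#   agents = restricted_loads(pickled_stuff, {("__main__", "Agent"): Agent})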

#!/usr/bin/env python
import wx
import pickle
from ObjectListView import ObjectListView, ColumnDefn  #pip install objectlistview
import wx.lib.scrolledpanel as scrolled

# UI with a few list boxes that can be drag/dropped between, and have title bars

class Agent(object):
    # Model of a single agent, has:
    # identifier (string?)
    # range (km)
    # speed (kph)
    # capacity (integer)
    def __init__(self, ident, range, speed, capacity):
        self.ident = ident
        self.range = range
        self.speed = speed
        self.capacity = capacity

    def __repr__(self):
        return f"<Agent: {self.ident}>"

# Drag and drop Drop target that supports receiving pickled 
# python data structures and doing something with them. 
class GenericDropTarget(wx.DropTarget):
    def __init__(self, object):
        super(GenericDropTarget, self).__init__()
        self.object = object

        self.data = wx.CustomDataObject("PickleData")
        self.SetDataObject(self.data)

    def OnData(self, x, y, defResult):
        #print(f"OnData({x},{y})")
        
        if self.GetData():
            # unpickle data and do something with it
            pickled_stuff = self.data.GetData()
            cukes = pickle.loads(pickled_stuff)

            # TODO We are making the assumption that a "PickleData"
            # actually only has a list of Agents in it. 
            # Add some checking before making this a real thing, or
            # limit the type to a more-specific format like "AgentList"
            self.object.AddObjects(cukes)

        return defResult

    def OnDrop(self, x, y):
        #print(f"OnDrop({x},{y})")
        return True

    def OnDragOver(self, x, y, defResult):
        #print(f"OnDragOver({x},{y})")
        return defResult

    def OnEnter(self, x, y, defResult):
        #print(f"OnEnter({x},{y})")
        return defResult

class UnitPanel(wx.Panel):
 
    def __init__(self, parent, unitName="No name set"):
        wx.Panel.__init__(self, parent=parent, id=wx.ID_ANY)

        self.dataOlv = ObjectListView(self, wx.ID_ANY, style=wx.LC_REPORT|wx.SUNKEN_BORDER)

        self.dataOlv.SetColumns([
            ColumnDefn("ID", "left", -1, "ident", minimumWidth=100),
            ColumnDefn("Range", "right", -1, "range", minimumWidth=60),
            ColumnDefn("Speed", "right", -1, "speed", minimumWidth=60),
            ColumnDefn("Capacity", "right", -1, "capacity", minimumWidth=60)
        ])
 
        self.agents = []
        self.dataOlv.SetObjects(self.agents)
        self.dataOlv.Bind(wx.EVT_LIST_BEGIN_DRAG, self.OnDragInit)
        
        # Set up a drop target on the listview
        dt = GenericDropTarget(self.dataOlv) 
        self.dataOlv.SetDropTarget(dt) 

        # Set up a title for this box
        self.unitLabel = wx.StaticText(self, id=wx.ID_ANY, label=unitName)

        mainSizer = wx.BoxSizer(wx.VERTICAL)
        mainSizer.Add(self.unitLabel, proportion=0, flag=wx.ALL, border=5)
        mainSizer.Add(self.dataOlv, proportion=1, flag=wx.ALL|wx.EXPAND, border=5)
        self.SetSizer(mainSizer)

    def populate(self, units):
        self.agents = []
        # for unit in units:
        #     self.agents.append(Agent(unit["ident"], unit["range"], unit["speed"], unit["capacity"]))
        # self.dataOlv.SetObjects(self.agents)
        for unit in units:
            a = Agent(unit["ident"], unit["range"], unit["speed"], unit["capacity"])
            self.dataOlv.AddObject(a)
            #self.draggableURLText.Bind(wx.EVT_MOTION, self.OnStartDrag)

    def OnDragInit(self, event): 
        # Get all the selected items from this list
        selected = self.dataOlv.GetSelectedObjects()

        # Pickle them and put them in a custom data object
        pickled_selection = pickle.dumps(selected)
        drag_obj = wx.CustomDataObject("PickleData")
        drag_obj.SetData(pickled_selection) 

        #Create a drag and drop source from this ObjectListView
        src = wx.DropSource(self.dataOlv) 
        src.SetData(drag_obj)
        print("Drag started")
        result = src.DoDragDrop(wx.Drag_DefaultMove) 
        if result == wx.DragCopy:
            # We don't copy because agents are hardware
            self.dataOlv.RemoveObjects(selected)
        elif result == wx.DragMove:
            # Remove the data from here, add it to another list
            self.dataOlv.RemoveObjects(selected)
        else:
            # Default, do nothing
            print("Nothing, nothing, nothing at all")
        

class AssetFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, parent=None, id=wx.ID_ANY, 
                          title="ObjectListView Demo", size=(800, 600))
        
        self.panel = scrolled.ScrolledPanel(self, id=wx.ID_ANY)

        self.mainSizer = wx.BoxSizer(wx.VERTICAL)
        self.panel.SetSizer(self.mainSizer)
        self.panel.SetupScrolling()
        self.Show()

    def populate(self, config):
        for unit in config["units"]:
            unit_panel = UnitPanel(self.panel, unitName=unit["name"])
            unit_panel.populate(unit["agents"])
            self.mainSizer.Add(unit_panel)

 
if __name__ == "__main__":
    app = wx.App(False)

    # We're going to need to be able to populate the frame from the config,
    # so represent the data as a data structure and initialize with that
    config = {"units": [{"name": "Unimatrix 01",
                        "agents": [
                            {"ident": "7 of 9", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "Third of 5", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "Unit 2",
                        "agents": [
                            {"ident": "u2s1", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "u2a2", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "u2a3", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "Unit Foo",
                        "agents": [
                            {"ident": "bar", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "baz", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "quux", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "Enterprise",
                        "agents": [
                            {"ident": "Galileo", "range": 10, "speed": 10, "capacity": 12},
                            {"ident": "Copernicus", "range": 10, "speed": 10, "capacity": 12}
                        ]},
                        {"name": "GSV Insufficent Gravitas",
                        "agents": [
                            {"ident": "GCU Nervous Energy", "range": 1000000, "speed": 3452334, "capacity": 13452342},
                            {"ident": "GCU Grey Area", "range": 1000000, "speed": 234523546, "capacity": 234562312}
                        ]}] 
              }

    frame = AssetFrame()
    frame.populate(config)
    app.MainLoop()

I called the unpickled data “cukes” because this is demo code, and I was being silly. A cucumber is, after all, what you get when you undo the process of making pickles. You may want to change that if you copy/paste this into production code.

A Flat Earth Would Be Odd

I’m not sure what “flat earthers” actually believe, but assuming the world is literally flat leads to some interesting results. For the sake of keeping things simple, let’s assume that the world is flat like a coin: locally bumpy, but overall shaped like a disk. This assumption is based on what the word “flat” means. A saddle isn’t flat, so a hyperbolic-curved earth doesn’t count as “flat” in any reasonable way. Further, let’s assume that it keeps the same surface area that the, shall we say, “conventional” model claims that it has. This assumption lets us avoid the absurdities [1] required to preserve distances, and so travel times, while unwrapping a sphere to cover a disk.

These assumptions tell us how big the disk is. The earth’s surface area is allegedly 196.9 million square miles. A disk’s surface area is given by pi * r², so do the math to get r and you wind up with a radius of 7.916 million miles. This may present astronavigation problems, as the moon is only 238,900 miles away, so it might hit the disk… if the moon were real! [2]

Ok, so we all live on a disk that’s about 16 million miles across. Because I’m comically egotistical, I’m going to say that my home town is in the middle of the disk, equidistant from every edge.

Now we come to an interesting point: How thick is the disk? Let’s assume that we believe in gravity. The gravity in my home town is one G, and it causes objects to fall down, which is to say “towards the surface of the earth”. Under the conventional model, this is because the earth is under me, and so the gravity caused by the large mass of the earth exerts a force on objects above it. Now there are two ways we can go: either the earth is made of the stuff that it is observed to be made of, and has the density it is generally observed to have, or it’s made out of something far more dense. All that this really varies is how thick the disk is under my home town. To have 1G there, using the conventional materials, requires at least the alleged thickness of the round earth, which is to say about 7,917 miles. Using something denser makes it thinner, without affecting the gravity, but there are limits on how dense matter can get.

Ok, so far, so good. However, we’ve set a little bit of a trap for ourselves here. Everywhere you go on the surface of the earth, the gravity is about 1G, so everywhere you go, the disk earth has to be about 8k miles thick. In my home town, at the center of the disk, this isn’t a problem, because the gravity in all the other directions balances out.

What’s that you say? Yes, gravity in other directions. You see, we’re talking about a disk that measures about 8k miles thick and 16M miles across. If you’re in the center, there are equal amounts of disk around you in all directions, so the pull in all the other directions balances out. If you’re off center, there’s more mass on one side of you than on the other, and so there is a component to the gravitational pull that isn’t straight down, and is unbalanced by the ratio of how much mass is on each side.

Initially, this would probably be pretty subtle. Things would fall a little bit in the direction of my home town, but more or less straight down. Friction would suffice to keep things on surfaces, but round things would always roll towards my home town. It would only be a little harder to walk away from my home town than towards it. I doubt it would affect which way water goes down the toilet all that badly, although the water would pile up on one side of the bowl. However, as you got towards the edge, things would get FUCKING DIRE.

How dire? Well, let’s look at the volume of that disk earth, which is properly a cylinder now that we know how thick it is. A cylinder that’s 16,000,000 miles across and 8,000 miles thick has a volume of 6,430,000,000,000,000,000 cubic miles. The conventional model earth has a volume of 260,000,000,000 (that is to say, 260 billion) cubic miles, and exerts a gravitational pull of 1G when you’re on the surface, which is to say that all of it is under you. When you’re on the edge of the disk earth, which is to say that almost all of it is, say, east of you, it exerts a gravitational pull of (around) 24,730,769.2308G. So to understand the force exerted on some poor schmuck who happens to get teleported from my town to the literal eastern edge of the world, assume that the ISO standard schmuck weighs 150 lbs. He will suddenly experience a thrust of about four billion pounds of force to the west. For comparison, a Saturn V rocket generated about 7.5 million (with an “m”) pounds of thrust, or about 500 times less thrust than people living on the edge experience as a consequence of just being there.

Unfortunately, since some of that force (at least 1G of it, thanks to the thickness of the disk) is downwards, towards the center of mass of the disk, rather than the location of my home town, the schmuck is going to hit the ground going absurdly fast and get spread all over it.

But wait, what is that ground made of? We said earlier that this disk is made out of normal earth stuff, which you can go out and observe to be mostly silicate-based rocks. The rocks under my home town have about 8 million miles of rock around them, pressing in towards the center of mass of the disk. Extremely weird stuff happens to matter under those kinds of pressures. Hydrogen (theoretically) becomes a metal. Atomic nuclei get mashed into each other.

That said, my home town is going to have other problems. For example, all the water in the world, and all the air, and all the stuff that’s far enough away to experience mostly-sideways gravity, is all going to flow towards the center of mass of the disk, and some of it will be coming in very fast. Since my town is barely above sea level under the conventional model, I think it’s going to get both extremely hot, due to the abrupt change in pressure, and rather wet, although possibly not before the disaster of degenerate matter that’s forming under it gets to the surface.

Alright flat-earthers, you got my home town into this, you get it out. Why are none of these effects observed on the ostensibly flat earth that we live on?

Well, maybe it’s not 8,000 miles thick. Maybe, it is in fact quite thin, and They can manipulate gravity to provide ~1G everywhere you go. They put some thickness everywhere that anyone decides to dig, or anywhere a tree falls over, but everywhere else it’s…. 1 inch thick. One inch is pretty thin, but that still puts the volume of the disk around 99,804,672,000,000,000 cubic miles, and so the inwards G-force experienced by someone on the edge at around 383,864.123G. This is still a troubling amount of force, but clearly if They can provide 1G over the whole surface of the earth, They can sort this out too.

That said, how long have They been doing this? If the earth has always been a disk, then obviously They were doing it before we evolved, so they’re not human. If They did it recently, why did no one notice the change? Was that the “road work” that was making my commute to work slow this morning? Either way, this moves “Them” from the category of “Federal/NWO government conspiracy to hide the truth, man!” to “Capricious god or gods with odd senses of humor”. At that point, there’s no use arguing what shape the world is, because it might be different tomorrow.


[1] These absurdities extend from the simple “everyone who has ever traveled is part of the conspiracy and lies about how long it takes” to the complex “THEY (It’s always ‘they’, innit?) can alter spacetime to slow or speed up travel”.

[2] Spoiler alert: it is.