Author: john

  • Report from the AI trenches

    I work for a small company that specializes in digital signage, on-hold messaging, and overhead music. I have my hands in everything, front to back.

    Years ago, I wrote the code behind our on-hold and overhead music players in C#, which has been an absolute pain to deal with since these players are on-site all across the country (thousands of them). There is just no way for a small shop to keep up with Microsoft’s .NET release pace anymore.

    So I decided I’d like to simplify my programming life a bit and learn Go, then port my C# project to Go as a capstone of sorts. Even though I know better than to add more than one new technology to any project, I decided to break my rule and also mix in AI… Oh, and also Wails for the UI. How hard could all of this be?

    You can probably guess how accurate my estimate ended up being for this project.

    The first thing to point out is just how much AI coding agents have progressed in the last 6-8 months. I started with Qwen Coder, then moved to JetBrains’ original AI agent (did it have a name?), then to Junie, and finally to Claude Code. Junie was where I really started to see the potential of agents. With Qwen I had to have it convert one file at a time, starting with the files at the end of the dependency chain and working (slowly) back up to the main file. If I told it to convert the whole codebase, it would go into Roomba mode, hoover up everything in sight, run out of context, then start a downward spiral into madness. Thank god for git.

    I’m the type of guy who likes to know where he’s going BEFORE he gets in the car. I know, I know, I should be more agile. You can imagine how happy I was once I discovered Claude could handle huge amounts of context and I could explain the whole summer vacation itinerary to it.

    The more I explained what I was trying to accomplish and how I wanted to get there, the better Claude did. Imagine that.

    Planning mode is a stroke of genius. I would spend 30 minutes or more explaining to Claude how I wanted a feature implemented; Claude would create a plan, I would pick it apart, Claude would provide a rebuttal, and back and forth we’d go. The more we collaborated (argued?), the higher the likelihood Claude would nail the feature on the first try. It’s getting harder and harder not to assign the “he” pronoun to Claude. I name my cars, so I’m not sure why I’m hesitant to personify Claude.

    To put this in context, the project is about 12k lines of code, nothing back-breaking for sure. But there is significant complexity in modifying existing code with new features that require touching dozens of individual files. Understanding the abstractions, dependencies, and even the style of the existing code, then modifying it and getting it all right the first time is pretty damn impressive. Claude managed that feat a few times there near the end of the project.

    I’m not sure who is training who at this point.

  • The idea is called LayeredIO

    There’s a map of Oklahoma I’d like to see. It has three layers: a base map showing county boundaries, a second layer showing injection well activity with points sized by how much wastewater is being pumped underground, and a third layer showing seismic activity with points sized by earthquake magnitude. Then I want to set it in motion and watch it play like a video, looking for cause-and-effect patterns over time.

    That capability, scrubbing through time to watch geographic data change on a map like video playback, is at the heart of what I’m building with LayeredIO. But the temporal visualization is only part of the story. There’s a deeper problem: trust.
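
    To make that concrete, here’s a minimal Go sketch of how I picture the data behind that map: layers made of timestamped features, and a “what’s visible at time t” query that a playback slider would call as it advances. The names (Layer, Feature, VisibleAt) and the sample values are purely illustrative, not an actual LayeredIO API.

```go
package main

import (
	"fmt"
	"time"
)

// Feature is one point on a layer: a location, a magnitude used to size the
// point on the map (wastewater volume, earthquake magnitude), and when it
// was observed.
type Feature struct {
	Lat, Lon  float64
	Magnitude float64
	Observed  time.Time
}

// Layer is one slice of the map: county boundaries, injection wells,
// seismic events, and so on.
type Layer struct {
	Name     string
	Features []Feature
}

// VisibleAt returns the features observed on or before t. "Playing the map
// like a video" is just calling this repeatedly as t advances and redrawing.
func (l Layer) VisibleAt(t time.Time) []Feature {
	var visible []Feature
	for _, f := range l.Features {
		if !f.Observed.After(t) {
			visible = append(visible, f)
		}
	}
	return visible
}

func main() {
	// Made-up sample points, purely for illustration.
	wells := Layer{Name: "injection wells", Features: []Feature{
		{Lat: 35.5, Lon: -97.5, Magnitude: 1.2e6, Observed: time.Date(2014, 3, 1, 0, 0, 0, 0, time.UTC)},
	}}
	quakes := Layer{Name: "seismic activity", Features: []Feature{
		{Lat: 35.6, Lon: -97.4, Magnitude: 4.1, Observed: time.Date(2014, 6, 1, 0, 0, 0, 0, time.UTC)},
	}}

	t := time.Date(2014, 4, 1, 0, 0, 0, 0, time.UTC)
	fmt.Printf("at %s: %d wells, %d quakes\n", t.Format("2006-01"),
		len(wells.VisibleAt(t)), len(quakes.VisibleAt(t)))
}
```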

    Health and environmental data shouldn’t be “sold” to the public with marketing. It should exist without motive.

    The Idea That Wouldn’t Let Go

    I fell in love with GIS a long time ago while solving a problem for a hospital group. Unfortunately, my career took a different path and I was never able to explore it the way I wanted to. 

    It was around that time, 10 years ago, that I had the idea for LayeredIO. I was having difficulty finding datasets to test my code with, and I realized it would be nice to have a kind of “GitHub for GIS data.” Why? Because it’s hard to understand what a dataset really is (and the area it covers) if all you have to go on is a cryptic series of abbreviations for a filename from some random FTP server. The idea has been rattling around in the back of my head ever since. I’ve made a few passes at building it before and realized just how complex an endeavor it would be. So I put it back on the shelf.

    What’s different now? I’m more experienced, technology has made huge leaps forward, particularly AI, and I think I may be a bit more motivated now that I see the trust problem getting worse instead of better.

    What LayeredIO Evolved Into

    LayeredIO is version control for community geographic data. Whether you’re mapping bus routes, bike paths, parks, crime patterns, construction projects, or environmental sensors, LayeredIO provides version history and a reputation system that makes crowdsourced data trustworthy.

    Think of it like civic resilience infrastructure for when official data sources simply don’t exist, become unreliable, or disappear.

    When I was working with that hospital group, I had access to historical patient demographic data, but the best I could do was create a series of still images. They got the point across as far as what patient demographics looked like “at one point on one day last year,” but I thought a more compelling story could be made from animating the changes over time in a video-like fashion. Then decision makers might be able to glean more from the data.

    Enter the version control idea. If I have all the previous versions of the sample points over time, then I could fast forward and rewind through time interactively instead of just staring at a single image.
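
    In version-control terms that’s a simple idea: keep every revision of the dataset, and “check out” whichever revision was current at the moment you’re looking at. Here’s a rough Go sketch of that idea; the Dataset and Revision names are placeholders, not the real design.

```go
package main

import (
	"fmt"
	"time"
)

// Point is one sample in a dataset.
type Point struct {
	Lat, Lon float64
	Value    float64
}

// Revision is one committed version of a dataset: who changed it, when,
// and the full set of points as of that commit.
type Revision struct {
	Author    string
	Committed time.Time
	Points    []Point
}

// Dataset keeps its revisions in commit order, oldest first.
type Dataset struct {
	Name      string
	Revisions []Revision
}

// At returns the revision that was current at time t, which is all that
// fast-forwarding and rewinding really is. ok is false if the dataset
// didn't exist yet at t.
func (d Dataset) At(t time.Time) (rev Revision, ok bool) {
	for _, r := range d.Revisions {
		if r.Committed.After(t) {
			break
		}
		rev, ok = r, true
	}
	return rev, ok
}

func main() {
	// Made-up revisions, purely for illustration.
	d := Dataset{Name: "patient-demographics", Revisions: []Revision{
		{Author: "john", Committed: time.Date(2015, 1, 1, 0, 0, 0, 0, time.UTC)},
		{Author: "john", Committed: time.Date(2015, 6, 1, 0, 0, 0, 0, time.UTC)},
	}}
	if r, ok := d.At(time.Date(2015, 3, 15, 0, 0, 0, 0, time.UTC)); ok {
		fmt.Println("showing the dataset as it looked on", r.Committed.Format("2006-01-02"))
	}
}
```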

    Add recent political events into the mix, and now I see where versioning of government data could help detect the politicization of data that’s supposed to be trustworthy.

    The Solution: Reputation as the Trust Mechanism

    If government data is suspect, why would crowdsourced data be any better? Anyone can post anything on Wikipedia, right?

    In a word: reputation. The reputation of the dataset and the reputation of the author of the dataset.

    Wikipedia works in no small part due to the peer review process. The conspiracy theory you post on Wikipedia will almost immediately be flagged by the community as suspect. Will every dataset on LayeredIO be trustworthy? No. But you’ll have a clue that it’s suspect based on peer reviews and ratings.

    Can reputation systems be gamed? Absolutely. We’ve seen it on Amazon reviews, YouTube likes, even academic citations. But we have several guardrails in place to prevent gaming of the system: account-age requirements, a ban on self-rating, and rate limits. Those systems will evolve over time to detect whatever new games bad actors come up with. It’s never a solved problem; you just have to be vigilant and adapt.
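
    To give a flavor of what those guardrails look like in practice, here’s a hedged Go sketch. The thresholds, type names, and error messages are placeholders for illustration, not final policy.

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// Account is a LayeredIO user, reduced to what the guardrails care about.
type Account struct {
	ID        string
	CreatedAt time.Time
}

// Rating is one user's review of one dataset.
type Rating struct {
	RaterID   string
	AuthorID  string // who published the dataset being rated
	DatasetID string
	Stars     int
}

// Placeholder thresholds; real values would be tuned over time.
const (
	minAccountAge    = 30 * 24 * time.Hour
	maxRatingsPerDay = 20
)

// CheckRating applies the guardrails named above: account age, no
// self-rating, and a per-day rate limit.
func CheckRating(r Rating, rater Account, ratingsToday int, now time.Time) error {
	if now.Sub(rater.CreatedAt) < minAccountAge {
		return errors.New("account is too new to rate datasets")
	}
	if r.RaterID == r.AuthorID {
		return errors.New("authors cannot rate their own datasets")
	}
	if ratingsToday >= maxRatingsPerDay {
		return errors.New("daily rating limit exceeded")
	}
	return nil
}

func main() {
	rater := Account{ID: "u1", CreatedAt: time.Now().Add(-90 * 24 * time.Hour)}
	r := Rating{RaterID: "u1", AuthorID: "u2", DatasetID: "okc-injection-wells", Stars: 4}
	fmt.Println(CheckRating(r, rater, 3, time.Now())) // prints <nil>: all guardrails pass
}
```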

    Who This Is For

    It won’t be the enterprise market, that’s for sure; their needs are too specialized. Initially, I’m hoping this might be helpful in the more mundane aspects of life:

    • School superintendents keeping their bus routes up to date
    • County officials documenting their snow routes
    • Citizens documenting potholes, broken street lights, bike trails
    • Recording historical markers
    • Tracking environmental sensors that communities install themselves

    The list is endless, really.

    These aren’t sexy problems, but they’re real problems that affect people’s daily lives. And they’re all situations where the “official” version might lag behind reality or might not exist at all.

    If You Aren’t Measuring It, You Aren’t Managing It

    There are lots of gaps that need addressing. There’s a saying: “If you aren’t measuring it, you aren’t managing it.” If we all have an easy way to measure the world around us, maybe we could do a better job of managing our resources.

    That’s what LayeredIO is really about. It’s civic infrastructure for a world where we can’t always trust official sources. It’s a way for communities to create, maintain, and validate their own geographic knowledge. It’s version control that lets us see how our world changes over time… and who changed it.