an eddy in the bitstream


Ready for the Fall

Back in 1998 I was living in Georgia and I got lonely for all my friends in Minnesota. During that year I wrote a lot of songs because that’s what (some) lonely people do, especially lonely people doing a lot of driving and spending tons of time alone.

In 2020, like a lot of people with time on their hands, I decided to revisit some of those songs that had never gotten recorded before. Inspired by the process of recording music again, after nearly 20 years of not recording my own songs, I decided to (slowly) make my way through the shoebox of old tunes and see if they still made me happy to play. Some of them do.

I’ve been putting the new recordings up here as I finish them.

I’m also going to write up some notes about each song, since one of the things I like to know when I listen to an album is details like when and where a song was written. A little context helps frame my experience.

“Ready for the Fall” was written in 1998 right after I moved back to Minnesota, around Labor Day. Something about the wordplay of autumn and lapsarian and some struggles that my family was having at that time. This was a song that we recorded in the winter of 1998-99 for the first House of Mercy Band album (the “white album”) but our version never quite hit what I heard in my head. This version is closer, and I am grateful that Chris Larson contributed harmony vocals on it, as he and I used to sing it together in those days.


Over the last several years I’ve found myself needing to explain/justify my habit of using a Makefile in software projects. I figure it’s time to create a post about it, so I can just point people here in the future.

I’ve a long-standing (decade+) habit of (ab)using Makefiles in projects, regardless of what the language(s) are and what other kinds of management tools are in use. Here’s one example. I don’t actually use them to compile things, or for keeping track of when files change, but more as a convenient mnemonic standard.

My rationale has been:

  • make is ubiquitous so there’s usually nothing to install
  • heterogeneous projects involve multiple languages, with multiple invocation syntaxes. make allows me to easily remember a short command that is meaningful for what I want to accomplish (execute a task, start or stop a service, etc), rather than needing to first think “what language is this?”
  • make is language agnostic. It’s just a handy way to group shell invocations together with environment variables and comments/context.
  • make foobar is easier to remember (for me) than foobar with --all the --usual but sometimes --forgettable options.
  • during the workday, switching between repos that have different languages/tools/frameworks can create cognitive overhead, and make build or make run will just work regardless of what directory I’m in
  • it’s both a way of documenting common tasks for shared developer knowledge/utility, and a way of making it more convenient to onboard developers, regardless of what other tools they might be familiar with.
  • a Makefile is like an executable README
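To make the habit concrete, here is a minimal sketch of the kind of Makefile I mean. The target names and the commands behind them are hypothetical, not from any particular project; the point is that the short, memorable target names stay the same across repos while the commands vary:

```make
# Common project tasks. Run `make help` to list them.

.PHONY: help build test run stop

help:  ## list available targets and their descriptions
	@grep -E '^[a-z]+:.*##' $(MAKEFILE_LIST) | sed 's/:.*## / - /'

build: ## compile/bundle the app, whatever that means for this repo
	npm run build

test:  ## run the test suite with the usual (forgettable) flags
	pytest --cov=app --maxfail=1 tests/

run:   ## start the local dev services
	docker compose up -d

stop:  ## stop the local dev services
	docker compose down
```

In a different repo, `make test` might invoke `go test ./...` or `cargo test` instead; the muscle memory is the same either way, which is the whole point.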

HB 2196 Testimony – Unemployment Insurance

The Kansas legislature is considering a bill that would create a legislative oversight committee specific to the technical modernization of the Kansas Department of Labor. The following is the testimony I wrote for the committee.

Testimony on Substitute for HB2196
Senate Commerce Committee
Peter Karman
March 17, 2021

Mr. Chairman and Committee Members,

I am Peter Karman, resident of Lawrence. I am here to present testimony on Substitute for House Bill 2196.

I am a technology professional with over 25 years of experience. I am also a former federal employee where I played a significant engineering role in the development of several IT projects, including the identity verification, security and fraud prevention, and authentication features of, which currently serves over 30 million Americans. I also worked on the modernization of the Department of Veterans Affairs case appeals management system.

During April and May of 2020 I assisted the Kansas Department of Labor as a volunteer with the United States Digital Response. The USDR is a group of several thousand volunteer technologists who stepped forward during the early days of the COVID-19 pandemic to assist state and local governments with the sudden demands on their technical infrastructure. I spent hundreds of hours in the depths of the KDOL technical systems and with the KDOL IT team during those weeks and have remained in touch with the department in the year since.

I also served on Governor Kelly’s transition team in 2018 and have knowledge of many of the IT systems, architectures and challenges within Kansas state government.

First, I applaud the committee’s desire to see KDOL technology systems modernized. The state’s reliance on mainframe technology poses several challenges to providing key services to constituents, particularly in the quickly evolving online environment on which many Kansans depend. I also support the legislature’s desire to understand the unemployment application process from the perspective of the users of the system. Filing a claim and receiving benefits should combine the best of design research, security and fraud prevention practices. Kansans facing these difficult circumstances deserve an experience filled with empathy and respect.

Second, there are provisions in the current bill that concern me, both as a professional and as a taxpayer. Creating a new oversight committee will add redundant bureaucracy to the existing legislative oversight structure that includes the Joint Committee on Information Technology. Empowering this committee with architectural decision-making power will place incredible demands on the committee and will prevent professional technologists and architects from using their industry knowledge and expertise to address complex design and security issues. We should let professional technologists determine the best technology solutions.

Page 3 of the current bill enumerates several technical security remediations and strategies that have no reason to exist in statute. I have worked on several IT modernization projects at the federal level that were initiated via Congressional action, and every one of them that made specific technological requirements part of the law resulted in delays and inferior technological outcomes. Digital security practices must evolve as quickly as the threats they hope to defeat. When technical specifications make their way into statute, IT professionals face a dilemma between implementing the legal requirements and implementing secure systems. This false choice can lead, in the best case, to security theater and, in the worst case, to insecure implementations. Please don’t enshrine today’s technical specifications, which must constantly evolve, into law, preventing IT professionals from solving tomorrow’s security threats.

Thank you to the committee for considering this testimony and for your continued work in helping KDOL better serve Kansans.


Peter Karman

KS SOS election results

The Kansas Secretary of State elections web page says that “precinct specific election results are available upon request.”

However, according to the statute:

“The secretary of state shall publish on the official secretary of state website results by precinct for all federal offices, statewide offices and for state legislative offices not later than 30 days after the final canvass of the primary election is complete.”

So it appears that the SOS office is not following the law.

Since the general election precinct results have a similar requirement to be published no later than 30 days after the final canvass, which must be held no later than December 1, I would expect general election precinct results to be available on the SOS website by December 30, 2020.

I hope that by then the SOS office will be following the law.

Update 2021-01-05 Happy to report that the 2020 general election results have been posted. The primary results still have not.

Shame, software, and government

This summer I wrote a long essay about shame and software projects. Shame is a topic I’ve been thinking about for nearly 30 years, so the gestation period for the essay was unusually long. I wanted to capture here some of my context and motivation for writing it.

As I’ve written before, working in and around government has highlighted for me the extent to which people in public service will avoid taking risks. The shame essay is my attempt to understand why risk avoidance is such a strong institutional and cultural norm.

Public servants avoid risk because they are trying to avoid the shame of failure. The “public” in public service means that the fallout from what might otherwise be a “normal” failure ratchets up the sense of exposure. In government, failure can mean public scrutiny, investigations, inspections, audits, and newspaper headlines, coupled with the internal, personal sense that the work is crucial and that many people are relying on us to do it well. High ideals, high stakes, a field ripe for shame events.

When it comes to information technology, a field that continues to evolve at a blistering pace, the patterns of shame avoidance are even more acute. The most common phrases about technology I hear from people in government, sometimes at the highest levels, are variations on “I’m not a tech person” and “I don’t understand how it works.” These statements are ways of lowering expectations, and therefore lowering the risk of shame by shrinking the gap between the ideal and the real.

In our modern, interconnected world, especially in the middle of a pandemic with people reliant on unemployment websites and remote work environments, IT systems pose a huge risk to government’s ability to fulfill its mission. So somebody in government needs to know how the IT works. But for years we’ve consistently under-invested in our government IT systems and in the processes (hiring and procurement) that support them. We’ve made it risky to work in and around government IT, and perversely, we continue to make the problem worse by trying to outsource all that risk to private sector vendors.

So we end up in a vicious knot of low tech IQ capacity, poor risk management practices and shame. What I tried to lay out in the shame essay was how some Agile software practices can help cut the knot.

The Agile process grew out of a reaction to the Waterfall model. All by itself the Waterfall model of software development is not rooted in shame. However, the way it has been applied, particularly in large organizational contexts like government, has reinforced cultures of shame. One reason is that Waterfall requires a careful and detailed (and often time-consuming) upfront articulation of the ideal final result, analogous to the ideal self. A long, detailed list of requirements and a long development cycle incentivize development teams to build to the plan rather than to continually seek input directly from stakeholders while building. The ideal plan drifts further and further from an evolving reality. Since shame is directly connected to failings of the ideal self, Waterfall projects are perfectly set up for shameful patterns.

This is why bringing Agile patterns to government can be met with so much resistance and why their successful application can be so transformative. The existing patterns of managing high risk are ingrained at literally a cellular level because shame operates at that level of primary biological affect. Shame shapes the fundamental story the organization tells itself. Like the shame experience itself, organizations can feel stuck and powerless. The way out is with trust and empathy, and those are the traits and patterns that Agile encourages us to build, through Agile’s insistence on smaller, less risky changes.

When I see a digital service team helping to transform how government manages risk, the pattern I observe is a group of people learning to negotiate with shame and shame culture. That’s why I often say that the hardest work in government IT is the emotional labor. When I see success, outcomes often include not just stronger systems but stronger teams.


© 2022 peknet
