
The Craft, the Machine, and the Algorithm

What happens when no one remembers who built the things we depend on


Somewhere not far from you, code is keeping a hospital lit through the night. Somewhere else, code is timing the lights on a morning commute so the city can breathe in rhythm. Somewhere beneath your feet, code is guiding water through pipes with enough steadiness that ordinary life can keep pretending it is effortless.

Most people will never see this code. They will never meet the people who wrote it. They will benefit from it the way they benefit from gravity: constantly, invisibly, and without the slightest awareness that it could ever stop working. But it can. And the question of who writes it, and how, and with what degree of understanding, is about to change more dramatically than at any point in the history of computing.

To understand why that matters, it helps to go back further than the invention of computers.

When Everything Was Craft

For most of human history, everything people used was made by someone who understood it completely.

A sculptor in Benin casting bronze figures of kings and warriors spent months shaping wax forms before the metal ever touched fire. Blackfoot knowledge keepers painted buffalo hides with symbols that carried generations of history forward, each mark a deliberate act of memory that only the maker's lineage could fully read. A weaver in Kyoto layered silk threads into patterns so intricate that a single kimono could take a year to complete. A stonemason in Petra carved an entire temple facade out of a cliff face, every column and cornice shaped by hand, by eye, by judgment refined over a lifetime.

What made their work valuable was not simply effort. It was presence. The maker understood the material, the purpose, the failure modes, the consequence of each choice. The work was slow because understanding is slow. It was expensive because judgment is expensive. It was beautiful because intention leaves a signature that no shortcut can counterfeit.

Then came the machine, and the nature of making changed for the first time.

Suddenly a factory could produce in a day what a workshop produced in a year. Cotton textiles that once took skilled hands weeks to weave could be stamped out by the mile. Food that once came from a kitchen with a name attached to it started coming from facilities with batch numbers. Products multiplied. Costs collapsed. Access expanded to people who had never been able to afford the handmade version. This was genuine progress, and billions of lives improved because of it.

But something was lost in the exchange, and it took a generation to notice. The bread was softer but somehow less nourishing. The furniture was cheaper but somehow less durable. The clothes fit the body but somehow didn't fit the person. Mass production solved the problem of scarcity and introduced a quieter problem in its place: the slow erosion of intentionality. No single human held the whole picture anymore. The system held it. And the system optimized for volume, not for meaning.

Now we are entering a second transformation. And this time, the thing being industrialized is not cotton or steel or bread. It is thought itself.

The Commit as Brushstroke

If you have never written software, a "commit" is the smallest unit of intentional change a developer makes to a codebase. It is a record that says: I was here. I understood this. I changed it on purpose, for this reason. A commit history is a timeline of decisions, and reading one is like tracing the thought process of the minds behind the architecture.

A good commit history reads like disciplined thought. Not perfect thought, but accountable thought. You can see hesitation, revision, tradeoff, restraint. You can see someone carrying a problem long enough to deserve an answer.
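For readers who have never seen one, here is a minimal sketch of what a commit history records. The repository, file, and messages are invented for illustration; the commands are ordinary Git:

```shell
# Create a throwaway repository (identity set locally so commits work anywhere).
git init -q demo
git -C demo config user.name "Demo"
git -C demo config user.email "demo@example.com"

# First decision: add a behavior, and say why.
echo "retry on gateway timeout" > demo/notes.txt
git -C demo add notes.txt
git -C demo commit -q -m "Add retry on gateway timeout: upstream drops a fraction of requests"

# Second decision: constrain the first one, and say why.
echo "cap retries at 3" >> demo/notes.txt
git -C demo add notes.txt
git -C demo commit -q -m "Cap retries at 3: unbounded retries amplified a past outage"

# The history is a timeline of decisions, newest first.
git -C demo log --oneline
```

Each one-line message pairs the change with its reason. That pairing, repeated over years, is the "accountable thought" a good history preserves.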

When a human writes code, every commit is a brushstroke. I mean that precisely. The developer looks at the whole canvas, identifies what needs to change, understands why, and makes a deliberate mark. Sometimes the mark is bold. Sometimes it is a single-pixel correction that nobody else would notice. But it carries intent. The person who made it could tell you why.

Artificial intelligence can now generate in minutes what once took teams days. The code compiles. The tests pass. The feature ships. From the outside, the output looks identical. Maybe even cleaner. But open the commit history and you will feel the absence before you can name it. There is no struggle in it. No moment where a developer paused, reconsidered, and chose the harder path because it was the ethically right one. No sign that anyone stood inside the consequences of the choice, or rejected the easy structure for the sound one, or accepted the slower path because they knew what the system would eventually have to bear. Every decision was derived, not arrived at. Extrapolated from the patterns of engineers who once sat with the problem long enough to earn the answer.

The system produced the code the way a factory produces bread: correctly, efficiently, and without the quality that made the handmade version worth remembering.

The Soft Logic Around the Hard World

Here is where the analogy to bread and furniture stops being comfortable.

Because software is no longer something that lives only in screens and browsers. Software is now the nervous system of the physical world. It manages water pressure in pipes beneath your street. It sequences the traffic lights your children cross under. It monitors structural loads in buildings where thousands of people sleep at night. It balances electrical grids that hospitals depend on to keep patients alive. It regulates access, pressure, load, timing, routing, recovery. It is the soft logic wrapped around the hard world, the hidden choreography that gives modern life its steadiness.

There is, in other words, an operating system beneath ordinary life. Most people have never seen it. Almost no one designed it as a whole. It is old, fragmented, overloaded, and it is about to be rewritten by the fastest, cheapest, least intentional tools humanity has ever created.

We are entering an era where the code that governs physical infrastructure will be increasingly written by systems that have never stood in front of what they are controlling. That have never felt the vibration of a process running wrong. We are doing to software what industrialization did to food, to textiles, to furniture. We are solving the problem of scarcity (not enough developers, not enough time, not enough code) by trading away the quality that made the scarce version valuable: the unbroken chain of human judgment from the first line to the last.

This is not a story about whether AI is good or bad. The loom was not good or bad. The assembly line was not good or bad. They were tools that changed what "making" meant, and the people who understood the change early were the ones who shaped what came next.

What We Carry Forward

The craftsman, the machine, and the algorithm. Three ages of making, each one faster than the last, each one producing more than the last, each one asking us to decide what we are willing to lose for what we gain. Three different relationships between speed and judgment, scale and responsibility, output and understanding.

The question facing us now is not whether AI will write most of the world's code. It will. The question is whether we will remember what it means for a human to be present in every decision, or whether we will let that standard dissolve so gradually that we forget it was ever there. The same way most people alive today have never tasted bread made by someone who grew the wheat.

If we do not pay attention, we will look back the way we look back at industrialized food and mass-produced housing and disposable everything, and we will ask ourselves why we traded away the standard we had before we understood what it cost us. Except this time, the consequences will not be a less nourishing loaf of bread. They will be systems that govern the safety of millions, built by processes that never understood the weight of what they were building.

The next era of technology will not be defined by the apps we touch. It will be defined by the systems we do not see, and by whether anyone still understands them when they begin to fail.


Written by

Zeshan Nurani

Published on

Sun Mar 08 2026
