nick.cloud

Occasional posts about web development

Productizing the meta


There is a pattern that shows up whenever a craft gets easier.

In the past, the hard thing was doing the work at all. But now with better tools, the hard thing becomes deciding what to do, in what order, with which constraints, so that the result is actually useful.

Coding is speed-running this change. “Coding is solved” is perhaps too strong a statement. There will still be difficult engineering problems. But it is directionally right. More and more of the mechanical act of producing code will be handled by machines. The bottleneck moves upstream, into problem framing, context assembly, orchestration, evaluation, and refinement.

This lightning-speed shift is causing a fair bit of anxiety. In a bid not to be left behind, software engineers are furiously pushing the boundaries of AI coding agents, trying to become expert wielders of these new tools.

And that’s justified, because wielding these tools well is definitely a valuable skill to possess. The raw power of AI coding agents is undeniable, but awkward to wrangle. To get consistently good results, you need a surprising amount of invisible expertise. Prompting is part of this, but “prompting” is too small a word. What matters is the whole system around the prompt.

Call that the meta. The meta is the layer above the model: prompting, context engineering, orchestration, evaluation, workflow design, and interface design. It is the difference between having access to intelligence and being able to reliably create a useful artefact.

The mere availability of powerful tools is not enough to make something useful. You can buy cameras, lenses, lights, editing software, sound equipment, and special effects tools. That does not mean you can make a decent feature film. The scarcity is not access to the tools. It is knowing how to use them together to produce something coherent.

So one opportunity is to become an expert practitioner of the “meta” of AI coding agents. Someone who can wield models the way a good director wields actors, cameras, editors, and a script. That will be valuable, and already is. There will be people who are unusually good at making AI systems produce results that look almost magical to everyone else.

But the bigger opportunity, perhaps, is not just practicing the meta. It is productizing it.

What does that mean? It means taking the expert skills required to get useful outputs from AI and packaging them into software that ordinary people can use.

Not software engineers. Everyone else.

The lawyer who wants a first-pass brief. The founder who wants a business plan. The analyst who wants a financial model. The team lead who wants an internal tool. The marketer who wants a campaign plan. The designer who wants a working prototype.

These people do not want “model access.” They do not want to think about prompts, context windows, routing logic, tool calls, or evaluation harnesses. They want an artefact. They want to start with intent and end with something concrete.

That is what productizing the meta means: building the abstraction layer that turns expert AI operation into a product for non-experts.

The user brings the goal, the constraints, and maybe some source material. The product handles the rest. It gathers context. It asks the right follow-up questions. It chooses a model. It decides whether the task should be broken into pieces. It uses tools where needed. It checks the output. It reformats it. It revises it. It delivers something the user can actually use.
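To make that machinery concrete, here is a minimal sketch of what such an orchestration layer might look like. Everything in it is hypothetical: the model names, the helper functions, and the checks are all stand-ins for real model calls, tool invocations, and evaluation harnesses, chosen only to show the shape of the loop the user never sees.

```typescript
// Hypothetical orchestration pipeline: intent in, artefact out.
// Each step is a stub standing in for a real model call or tool.

interface Task {
  goal: string;
  constraints: string[];
  sources: string[];
}

interface Draft {
  content: string;
  passedChecks: boolean;
}

// Gather context: a real product might pull documents, ask
// follow-up questions, and summarize source material.
function gatherContext(task: Task): string {
  return [task.goal, ...task.constraints, ...task.sources].join("\n");
}

// Choose a model: route small tasks to a cheap model, big ones to a
// stronger one. These names are placeholders, not real model APIs.
function chooseModel(context: string): string {
  return context.length > 500 ? "strong-model" : "fast-model";
}

// Decompose: naively split a compound goal into subtasks.
function decompose(task: Task): string[] {
  return task.goal.split(" and ").map((s) => s.trim());
}

// Generate: stand-in for the actual model call.
function generate(model: string, subtask: string, context: string): string {
  return `[${model}] draft for: ${subtask}`;
}

// Evaluate: a real system would check format, facts, and constraints,
// and send failures back for revision.
function evaluate(draft: string): Draft {
  return { content: draft, passedChecks: draft.length > 0 };
}

// The whole loop the user never sees.
export function produceArtefact(task: Task): string {
  const context = gatherContext(task);
  const model = chooseModel(context);
  const pieces = decompose(task).map((sub) => {
    let result = evaluate(generate(model, sub, context));
    // Revise until checks pass, bounded so the loop always terminates.
    for (let i = 0; i < 3 && !result.passedChecks; i++) {
      result = evaluate(generate(model, `${sub} (revised)`, context));
    }
    return result.content;
  });
  return pieces.join("\n");
}
```

The point is not the stub logic, which is trivially simple here. It is that routing, decomposition, revision loops, and quality gates live in the product, so the user only ever supplies the goal and receives the artefact.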

In that world, the real product is not the model output. The real product is the invisible machinery that makes good outputs routine.

Raw models are likely less defensible than products that wrap human context around them: the domain knowledge, workflow logic, interaction design, error handling, quality controls, and accumulated understanding of what users in a particular field are actually trying to do. General intelligence may become abundant. Expert human-like judgment probably won’t.

So the next wave of valuable products may not look like better demos of model capability. They may look like quiet software products that let normal people produce things that previously required either specialists or unusually skilled AI operators.

Not because the users learned how to use AI expertly, but because someone else productized that expertise for them.