Gesamtkunstwerk

A few months ago I applied for a product manager position. The company gave me a technical test: analyze a problem, propose a solution, document it with good judgment. I did it the way any reasonable professional in 2026 would, with the help of an AI agent. While reviewing the result, an uncomfortable thought struck me: if the other candidates were also using AI, and AI is fundamentally statistical, a sophisticated average of everything it has seen, then the logical outcome would be that we all converged toward a similar point. Similar outputs. Similar structures. Similar conclusions.

Where was the difference then?

Part of the answer came before writing a single line. Before asking the agent for anything, I had asked myself questions: questions about what the company was really proposing, about what was left unsaid, about the assumptions that deserved to be challenged. Those initial questions were the differentiator, because they were mine: they came from the books I've read, from the projects where I've failed, from those where I've succeeded, from the experiences that have shaped me. That can't be replicated statistically. That is sediment. And sediment is hard to copy.

I must be honest: I didn't get the job. Part of the distance lay in something more basic: not knowing how things were done inside. The expected format of an analysis, for example: whether they wanted a spreadsheet or a narrative paragraph. Corporate culture is an invisible context that no agent can infer if you don't know it yourself. What I wonder is whether that also has a solution: whether the onboarding of the future should be double, yours and your agent's, and whether handing a newcomer a document with the company's context, language, and criteria to feed their agent wouldn't be simply good practice but something almost necessary.

I think of a craftsman from three thousand years ago. A knife maker, for example. He learned the craft from his father or a master, practiced it for decades, and in that mastery found his identity and livelihood. He didn't make bowls. He didn't tan hides. He made knives, and made them better than anyone within a two-day walking radius. Specialization was the natural order of things, a logical consequence of knowledge being transmitted slowly, from person to person, and mastering something taking a lifetime.

That began to change much earlier than we usually think. The Renaissance was already pushing toward a different figure: the man who knew about anatomy and hydraulics and perspective and military engineering. Da Vinci as an anomaly that prefigured something. But it was the Bauhaus, in the 1920s, that turned that anomaly into a program. Walter Gropius wanted to erase the distinction between craftsman and artist, between the one who built and the one who conceived. The concept that articulated it was the Gesamtkunstwerk, the total work of art, in which architecture, industrial design, painting, photography, and craftsmanship merged into a single discipline without hierarchies. Specialization remained valuable, but synthesis between disciplines became the true horizon.

What's happening now with AI seems to me an acceleration of that same impulse, but of a qualitatively different magnitude. It's not just that barriers to technical knowledge are dissolving; barriers of all kinds are dissolving: conceptual, creative, disciplinary. A mechanic with curiosity and access to AI tools can today design a complete engine, simulate its behavior, and manufacture it with a 3D printer. Not at industrial scale. Not optimally. But he can. And that's a rupture.

The shape of professional knowledge has gone by several names in recent years. The I-shaped profile: depth in a single area, almost none in the rest. The T-shaped: one main specialty with a horizontal arm of adjacent knowledge. The comb-shaped: several specializations with bridges between them. What I wonder is whether these models are still useful as descriptions, or whether we're entering something more fluid: not a fixed form of knowledge but a capacity to take whatever form the moment requires.

Here is where the paradox that has stayed with me since that technical test appears. If AI democratizes access to knowledge, if anyone can, with the right agent, produce competent analysis in areas they don't master, then what is hard to copy no longer lives in knowledge itself. It moves somewhere else. And that somewhere else, it seems to me, has two opposite poles that coexist without canceling each other out.

The first is the pole of absolute depth. The ceramist who has spent thirty years with hands in clay. The doctoral student who has dedicated five years to a problem so specific that barely twenty people in the world can follow his reasoning. When everyone can produce something acceptable, the extraordinary becomes extraordinarily valuable. Extreme mastery doesn't get automated; it becomes rarer and more sought after.

The second pole is harder to name. It isn't exactly creativity, nor experience, nor general culture. It has to do with the quality of the questions one is capable of asking oneself. With the particular way each person connects what they know with what they observe. With what Jules, Samuel L. Jackson's character in Pulp Fiction, summarized in a phrase that sounds simple and isn't: personality goes a long way.

Follow me on LinkedIn to stay updated on new posts.