In the foreword to the revised edition of The Magus, John Fowles confesses something unusual for a novelist: that this novel remains, years after its publication, the one that leaves him most professionally unsatisfied. What gives him the impulse to try again is realizing that, of all his books, it's the one that has sparked the most interest among his readers. His own dissatisfaction plus the external signal: together, they justify reopening something that was already closed. Any product manager would recognize that process without needing translation.
But there's another idea in The Magus that has stuck with me since I read it. Fowles writes that men tend to see the world as a collection of objects, while women tend to see it as the relationships between them. It's a statement about two radically different ways of understanding reality. One looks at things. The other looks at what happens between things.
I've been thinking about this distinction for weeks from a different angle: what happens when AI starts solving objects for us.
The research is clear and increasingly abundant. Language models homogenize creative output. Not at the individual level (for each person who uses them, AI tends to improve the perceived quality of what they produce) but at the collective level. As more people use the same tools to generate solutions, ideas don't disappear or diminish; they simply take shape in an increasingly homogeneous way. Each AI-assisted story becomes a little more like the others. Each product strategy generated with the same model converges toward the same center of gravity. The variety that once emerged from thousands of minds working independently gets compressed into the patterns the model has learned to reproduce.
It's the paradox that researchers point out in Science Advances: AI improves individual creativity while reducing collective diversity. A social dilemma in miniature: each writer benefits, but the ecosystem's outputs become impoverished. Not because there are fewer ideas, but because they all start to look alike.
This is nothing new. Every technology that gets adopted massively ends up conditioning the outputs of its era. The printing press standardized typography and with it reading rhythms. Photography pushed painting away from realism, but at the same time created its own conventions. Digital cinema made production cheaper, but the look of films shot with the same cameras and the same post-production software converged toward a recognizable aesthetic.
The tool is never neutral: it always leaves its mark on what's done with it, and when everyone uses the same one, the mark becomes landscape. In the digital world, this has been happening for years before generative AI entered the scene. The massive adoption of minimalist aesthetics, the proliferation of shared design systems, or the influence of platforms like Pinterest (direct source for mood board construction in agencies and studios worldwide) have long been compressing the available visual space toward a common center of gravity. When everyone starts from the same references, results look alike before any algorithm even intervenes.
Perhaps the most eloquent example isn't digital: if a car's exterior design is decided exclusively from data collected in a wind tunnel, optimizing every curve to improve aerodynamics, all manufacturers following that same criterion will end up arriving at the same place. Physics doesn't leave much room: there are shapes that air prefers, and when data rules, diversity yields. Modern cars increasingly look alike not due to lack of imagination from their designers, but because they're all optimizing against the same parameter.
What gets homogenized, ultimately, are the objects. The outputs. The solutions.
And here is where Fowles' distinction becomes operational.
If objects tend to converge, if the solutions AI generates for your problem and those it generates for mine look increasingly alike, then what remains differential are the relationships. The how of getting there. The process by which an organization, a team, a person moves from problem to solution. The conversations that happen along the way. The frictions that arise and get resolved. The decisions made in the margins.
The gaze that sees relationships, in a world where objects standardize, becomes strategy.
This has concrete consequences for those working in product. If the functionality you can build today with AI is approximately the same that any competitor with access to the same tools can build, differentiation doesn't live in the what. It lives in how that functionality emerges from a particular understanding of your users, from a team culture that makes decisions in a certain way, from an accumulation of context that no model can synthesize because it wasn't in the room when it happened.
The product as object becomes democratized. The process as relationship remains unrepeatable.
This idea is equal parts reassuring and uncomfortable. Reassuring because homogenization doesn't eliminate the possibility of differentiation; it pushes it toward a place that was always harder to copy: how a team thinks together, the quality of the conversations behind each decision, the friction between different perspectives. But it's also uncomfortable because it forces us to give importance to things we usually treat as secondary. Meetings stop being formalities and become the place where meaning is built. Diversity of perspectives stops being noise and becomes raw material. The slow process by which an idea transforms into something good through contact with other minds stops being a cost and becomes the advantage.
Ed Catmull formulated it better than anyone in Creativity Inc.: "Give a good idea to a mediocre team, and they will screw it up. Give a mediocre idea to a brilliant team, and they will either fix it or throw it away and come up with something better." What Catmull describes isn't a theory about creativity, it's a theory about relationships.
Fowles revised his novel because he still felt it was unfinished and because his readers told him it was worth it. Not because he had a tool that told him how to improve it. The difference between those two things is exactly the difference between objects and relationships.
In a world where AI solves objects with growing efficiency, the question worth asking isn't what the model can generate. It's what kind of relationships you're building around what the model generates.
Solutions converge. Processes don't.