Sean Taylor

✱ Unintended consequences and the antifragile imagination

D.A. Wallach discussing drug development:

So you have to think about delivering to the right place. You have to think about the drug actually working when it gets there. You have to be cognizant of all the other places it could go and the damage it could do there. You have to figure out how much you need to give to affect the right level of activity and how long you need to be giving that amount of the drug. You have to think about the patient's immune system and the way that it will react to the drug. Is the body gonna mount an immune response to this and generate, say, anti-drug antibodies that then neutralize the effect of the medicine? You have to think about how much it's gonna cost to manufacture the drug. Is that going to be commercially viable? You have to think about how much the US government will pay for that drug and what the economists are going to conclude when they think about the economic impact of everyone paying for that drug to treat this disease versus paying less for a slightly worse drug, but one that may have significant enough benefits that the cost-benefit favors that one.

Every serious innovation touches more than it intends. The bigger the system, the longer the tail of its consequences - and the slower we are to make sense of them. What's dangerous isn't just the fallout we didn't foresee but the futures we failed to imagine.

Sir Jony Ive on Victorian sewage and the unintended consequences of technological change:

The thing that I think is so challenging is there was time for society to stop and consider what was happening, and there was time for structure - whether that was sort of infrastructure, whether it was sort of social frameworks - to try and assimilate and deal with these shifts. And I think what's been very challenging is we are moving so fast the discussion comes far too late. I mean, the thing that I find encouraging about AI is it's very rare for there to be a discussion about AI and there not to be the appropriate concerns about safety.

Victorian engineers rerouted centuries of street-level sewage with cathedral-like pumping stations. They had time - time to build slowly, absorb impact and adapt society around the shift. Today, speed outpaces reflection. And while Jony Ive is right to value early, serious conversations about safety - especially in paradigm shifts like AI - that’s not the same as what often passes for ethical imagination today: moralistic, myopic and quick to shut down exploration before anything is even built.

We need sharper foresight and stronger systems thinking, yes - but also stronger backbones. Not just to anticipate harm, but to imagine systems that grow stronger through disruption. Innovation demands moral courage, not moral panic. Antifragility begins in the imagination.

#words