Adam talks about narcissistic music and narcissistic media, and I figured this is what the future would eventually be: a world of people with hyper-individualized, perfect TV shows. I say TV shows to be more understandable, but really I mean hyper-individualized realities. Replace parts of your brain procedurally with computronium. Every time a neuron dies, instead of being replaced by carbon, we replace it with silicon. Over time we can Ship of Theseus our grey matter into something that can directly interface with a larger server. At which point, we go into our own virtual realities.

At this point you would never have a problem ever again. You would have a perfect partner, or maybe nobody, or maybe many bodies, depending on what perfect is to you. The world as a whole would be your custom-tailored version of perfect. And I need to clarify that perfect does not mean consistent: a perfect rollercoaster is not a straight line. We are going for a perfect life of enlightened hedonism, one in which overall satisfaction is maximized. Surprises at just the right moments, or maybe just a little off if that makes it better. A life you would not trade for any other.

But what exactly does that look like for the individual, and what does it result in? A narcissist might enter a world in which they are objectively the most important, where everything always goes their way. The things that would be perceived as character flaws in modern society would never have to be changed, because they wouldn't cause any issues in this virtual utopia. And if they never have to leave their utopia, what exactly would be the issue? Thieves, rapists, murderers: all the people who are evil in modern society could enter a utopia where they can indulge without consequence. Of course, with omniscience comes practical omnipotence. With the level of technology at this point, they could be changed. It could be done by directly altering neuronal connections.
Or a series of life lessons could be perfectly played out in their utopia that turns them into a paragon of virtue anyone would want to befriend. Is that ethically correct, though? I think the most coherent ethics system we have is the golden rule: "Do unto others as you would have them do unto you." But in this hypothetical future there are no others.

To hold onto the modern viewpoint that "interacting with other flesh and blood humans is the healthy thing to do" would be tantamount to a knight telling their child that the arts are unhealthy and the sword is the only healthy way of life. It's outdated. There was a point in time in which there were no alternative means of socialization. To have your social needs adequately met, you'd have to befriend another human being. You'd have to take their needs into account and deal with the ways in which they were less than ideal. But then comes the invention of the artificial human. They laugh at your jokes and understand you perfectly. They're funny, charismatic, intelligent, kind, and a daredevil in perfect amounts. They're perfect in every possible way, they enjoy your company, and they're available 24/7. This is simply impossible to ask of an ordinary flesh and blood human born of natural selection on Earth. It would be as unreasonable as asking someone to complete a trillion mathematical operations in their head.

But once we have one, what exactly is wrong with becoming dependent on it? You could argue, "What if ASI (an artificial intelligence that is smarter than all of humanity combined) goes away?" To which I would ask, "What if modern civilization went away?" Today, do we teach our children primitive technologies like building a fire or mining and refining iron as an essential skill, or as a hobby? To be clear, I have no idea if it'd be more ethical to program AI towards indulgent worlds or towards a hybrid of indulgence and emotional education.
While I, and probably also you (the reader), have a preference towards the latter, I can make no reasonable argument for it if the entire lifespan is lived in bliss with no possibility of harming anything capable of being harmed. My ethics system is based on a utilitarian minimization of suffering combined with informed consent, and with that combination, I don't see why this is an issue.

Maybe I got a bit sidetracked. This is all to say: I don't know if hyper-personalized media eventually disintegrating cultural touchstones will be a problem for long. I think that once artificially generated media becomes good enough to replace that much media, we'll be at AGI, and ASI very soon after. And in a post-ASI landscape, media will change forever. This is getting a bit long and I might expand later, but my "individual personalized matrix" future is only one possibility. Think of how fast AI can process a book. Once human brains become enhanced and are able to experience all media humanity has ever created in an afternoon, what will media look like at that point? I could propose theories, but they're all speculation.

Anyway, I just wanted to record some ideas about the future that I've been thinking about for a while.

February 8th, 2026