Exploring the Non-Essential Nature of Human Subjective Experience

Published on 3/8/2026, 12:38:59 AM

Why would a human have subjective experience? It's just a blob of cells designed to take inputs and react accordingly, for the purpose of survival and passing on genes.

There's no proof that consciousness requires neurochemistry, and most theories of consciousness are substrate-independent. It's like saying that because all the cars you've ever seen are red, only red cars exist. Seeing only red cars doesn't mean a car can't be white.

I'm not saying we don't have it; I'm saying there's nothing necessary about having a "feeling" of something for a human to function. If we could be P-Zombies and still do everything a human does, then why aren't we? Maybe the answer is that something actually is necessary about…

"because subjective experience proved useful for our genes' survival" Ah, so it has a functional role in enabling more long-term abstract thought? Makes sense. "there's no reason to believe that it would be similarly propped up by the reward function of machine learning designed…"

"models which *actually* do complicated tasks have that programmed in separately from their pure LLM mechanics, not even using…" This isn't really true. They sometimes have harnesses that help structure their outputs, but when they complete tasks it's pretty much by the same…

No. Not currently, at least, but I think there is nothing in principle that would prevent a sufficiently complex information-processing system from becoming sentient, since sentience is what that information processing feels like from the inside.
