Eric Hoel in The Intrinsic Perspective:
There’s an unkillable myth that the very definition of the word “consciousness” is somehow so slippery, so bedeviled with problems, that we must first specify what we mean out of ten different notions. When this definitional objection is raised, its implicit point is often—not always, but often—that the people who wish to study consciousness scientifically (or philosophically) are so fundamentally confused they can’t even agree on a definition. And if a definition cannot be agreed upon, we should question whether there is anything to say at all.
Unfortunately, this “argument from undefinability” shows up regularly among a certain set of well-educated people. Just to give an example, there was recently an interesting LessWrong post wherein the writer reported on his attempts to get people to define consciousness, drawn from a group of:
Mostly academics I met in grad school, in cognitive science, AI, ML, and mathematics.
He found that such people would regularly conflate “consciousness” with things like introspection, purposefulness, pleasure and pain, intelligence, and so on. That these sorts of conflations are common matches my impression as well: I run into them whenever I give public talks about the neuroscience of consciousness, and I too have found them most prominent among those with a computer science, math, or tech background. They are especially prominent right now among AI researchers.
So I am here to say that, at least linguistically, “consciousness” is well-defined, and that this isn’t really a matter of opinion.