How Worried Are You By The Symbol Grounding Problem?
Imagine you're a mental representation. You are a computational symbol system, and your job is to contain knowledge that is about the world and that can help your organism interact with that world (Newell, 1980). The 'aboutness' thing is the most important part of you - you are an intentional system, which means you have content that is meaningful. So where did your content come from? (I'd like to know your thoughts, so please help by answering the questions at the end!)
This question is the issue of symbol grounding, first posed by Searle (1980), who made it famous with the Chinese Room thought experiment, and then taken up in earnest by Harnad (1990). The problem is that you can have a system that deals in nothing but syntax (the form and structure of a communication transaction) but that will pass the Turing Test, i.e. look like it trades in semantics (meaning), even though that syntax is definitely not grounded in any real semantics. There is currently no solution to the problem of endowing a mental representational symbol system with content/meaning/intentionality that doesn't involve that meaning having come from somewhere else. If the meaning is not intrinsic to the system's form (Bickhard, 2009, calls this being 'internally related') then the meaning has to come from something else, but then how did that thing get its meaning, and so on... it quickly becomes turtles all the way down. This means that mental representations cannot do the things they need to do in order to play the role they need to play in our cognitive economy to make us functional, intentional beings rather than philosophical zombies.

This has always struck me as an absolute disaster for the standard cognitive approach. But my question here is, do other people worry about this? I would love it if people would comment below and answer the following questions:
What flavour of cognitive scientist are you? (psychologist, philosopher, enactivist, representationalist, Jerry Fodor in the actual flesh, etc)
Do you know about the symbol grounding problem?
Then, if you do,
Are you concerned by the implications of the symbol grounding problem for mental representations?
Do you think the problem has already been solved? If so, how?
Obviously I have opinions, but this time I am very much interested in yours!

References

Bickhard, M. H. (2009). The interactivist model. Synthese, 166(3), 547-591.

Harnad, S. (1990). The symbol grounding problem. Physica D: Nonlinear Phenomena, 42(1), 335-346.

Newell, A. (1980). Physical symbol systems. Cognitive Science, 4(2), 135-183.

Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3(3), 417-424.