Does Synthetic Intelligence have an Embodiment Problem?
July 16, 2018
By Jason M. Pittman, Sc.D.
In my last post, I mentioned that an intelligent, non-carbon-based lifeform might be able to embody its consciousness in the world. At the same time, in previous work I have argued that plants present a likely model for synthetic intelligence. That raises the question in the title: does synthetic intelligence have an embodiment problem at all?
Embodiment
Embodiment is a tricky concept to nail down. First, things can embody ideas. That is, things can render intangible concepts tangible. Second, and specific to consciousness, embodiment seems to be how we humans iteratively frame our cognition. In other words, we use our bodies to create, participate in, and filter experiences so that the ideas of those experiences can be made real.
This should make us wonder: can a mind have consciousness without a body (or the equivalent of a body)?
Therein lies the destruction of the dualism made famous by Descartes. That is to say, the mind (and thus consciousness) is not disembodied from the physical substrate (i.e., a brain) but rather tightly coupled with it. Brains can exist without minds, but minds cannot exist without brains.
Simple enough, right?
Well, no. As it turns out, embodiment is a bit more complex than that. It is not enough to say that minds depend on brains to exist. Rather, we need to understand that the ability to think, to form cognition at all, depends on a body beyond just the brain!
This notion exists at the heart of what is known as the Embodiment Problem in AI. Stated simply, AI cannot achieve human levels of intelligence because it lacks a body with which to generate situated experiences. For example, an AI can trivially play games (Chess, Go, DOTA2), but such an AI cannot describe what it is like to play those games. The reason the AI lacks this ability is that it cannot embody what it is like to play a game.
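To make that asymmetry concrete, consider a minimal sketch (a toy tic-tac-toe agent chosen purely for illustration; the game, the minimax search, and every name below are my assumptions, not anything from the systems mentioned above): the entire "player" is a function from board positions to moves. It plays perfectly, yet nothing in it corresponds to what playing is like.

```python
# Toy illustration only: a perfect tic-tac-toe player reduced to a pure
# state-to-move function via minimax search. It "plays" flawlessly, yet
# contains nothing that corresponds to the embodied experience of playing.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that side has three in a row, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) for `player`: +1 if X wins, -1 if O wins, 0 for a draw."""
    w = winner(board)
    if w is not None:
        return (1 if w == 'X' else -1), None
    open_cells = [i for i, cell in enumerate(board) if cell == ' ']
    if not open_cells:
        return 0, None                       # board full: draw
    best_score, best_move = None, None
    for move in open_cells:
        child = board[:move] + player + board[move + 1:]
        score, _ = minimax(child, 'O' if player == 'X' else 'X')
        if (best_score is None or
                (player == 'X' and score > best_score) or
                (player == 'O' and score < best_score)):
            best_score, best_move = score, move
    return best_score, best_move

if __name__ == "__main__":
    empty = ' ' * 9                          # board as a 9-character string
    score, move = minimax(empty, 'X')
    print(score, move)                       # perfect play from the opening: 0 (draw) and a move
```

Ask this function what the game felt like and there is simply nothing there to answer; that, in miniature, is the gap the Embodiment Problem points at.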
You would think the most straightforward resolution would be to create a body for our game-playing AI, right? Maybe, but I say that's the wrong move.
Embodiment Redux
Even if we presuppose the need for a body and a brain, we need not assume that such bodies and brains have to be like ours. In other words, a synthetic consciousness does not have to look like us. Okay, let's ask a few probing questions then.
First, what is necessary for something to be considered as having a body? Second, what is necessary for something to be considered as having a brain? We want to avoid the trappings of top-down deconstruction of biological assemblies. Rather, we want to understand how far a synthetic body or brain can vary from the biological assemblies we readily know and still count as a body or brain.
Once again, we can look at plants as a paradigm. Plants, after all, follow a decidedly non-animal model both in their local assembly as individuals and in their collective assemblies (plants tend not to grow as solitary individuals). Yet, plants do have both bodies and brains, so to speak.
Thus, I would suggest that while artificial intelligence indubitably has a serious embodiment problem, synthetic intelligence can avoid it, because synthetic intelligence is not confined to imitating the human version of embodiment.