A Philosopher's Blog

Brain Games

Posted in Philosophy, Technology by Michael LaBossiere on April 7, 2014
Brain Games box art (Photo credit: Wikipedia)

As a general rule, people want to gain as much as possible for as little effort as possible. For example, people often buy exercise equipment thinking that it will make exercise easier. They usually find out that is not the case—thus the brisk trade in lightly used exercise equipment and its tendency to end up buried under clothes. The latest brain training games seem to be offering the same temptation: if a person plays these brain games, she will become smarter. The appeal is, of course, that the games are supposed to be fun rather than burdensome in the way education tends to be. The obvious question is whether such games work.

On the face of it, the idea that playing these brain games can have positive effects does make some sense. After all, exercising the body improves it—so, by analogy, the same should hold for the brain. The obvious concern is that not everything that people think is exercise actually improves the body. Likewise, the brain games might be like useless exercises for the body: you are doing something, but it is having no effect. To address this matter, the thing to do is to turn to some actual science.

As it stands, the unbiased research seems to show that the current crop of commercial brain training games has no meaningful impact. While people do get better at the games, this is most likely due to familiarity. To use an analogy to another type of video game, doing the same scripted event over and over in a game like World of Warcraft or Dead Space 3 will cause a person to improve at that specific task. To use a specific example, in Dead Space 3 the player has to “fly” through a field of debris and avoid being smashed. My friend and I smashed into the debris repeatedly until we finally made it—through familiarity with the process rather than by getting “better.” The same seems to be true of the current brain games: getting better at such a game does not entail that one is smarter or more mentally capable. In light of the existing evidence, buying the commercial brain games would be a waste of money—unless one is just playing them for fun.

Interestingly enough, video games of the more “traditional” sort can improve memory and mental skills. This is not surprising: such games typically place players in challenging environments that mimic general challenges in the real world. Rather than focusing on a single, narrow task, the gamer is forced to fully engage the general challenge and develop a broader set of capabilities. As such, video games of this sort probably improve mental abilities in a way analogous to how reality does, except that the challenges tend to be more difficult and more frequent than what a person would generally encounter in the real world. For example, participating in a World of Warcraft raid involves tracking abilities, maintaining situational awareness, following (or giving) orders, executing a strategy and so on. That is, it provides an actual mental workout. So, a person looking for games to make her smarter would be better off getting a gaming console or PC and selecting challenging games. They will probably be much more fun than the brain games and apparently more effective.

I would also like to put in a plug for traditional tabletop games, be they games like Risk or D&D. These games provide enjoyable challenges that seem to have a positive impact on cognitive abilities. Plus, they are social activities—and that is no doubt better for a person than playing brain games online solo.

 

My Amazon Author Page

My Paizo Page

My DriveThru RPG Page


The Chipped Brain & You

Posted in Ethics, Metaphysics, Philosophy by Michael LaBossiere on August 26, 2013
Cover of Cyberpunk 2020 (Photo credit: Wikipedia)

Back in the heyday of the cyberpunk genre I made some of my Ramen noodle money coming up with “cybertech” for use in various science-fiction role-playing games. As might be guessed, this included implants, nanotechnology, cyberforms, smart weapons, robots and other such technological make-believe. While cyberpunk waned over the years, it never quite died off. These days, there is a fair amount of mostly empty hype about a post-human future, and folks have been brushing the silicon dust off cyberpunk.

One stock bit of cybertech is the brain chip. In the genre, there is a rather impressive variety of these chips. Some are fairly basic—they act like flash drives for the brain and store data. Others are rather more impressive—they can store skillsets that allow a person, for example, to temporarily gain the ability to fly a helicopter. The upper level chips are supposed to do even more, such as increasing a person’s intelligence. Not surprisingly, the chipping of the brain is supposed to be part of the end of the human race—presumably we will be eventually replaced by a newly designed humanity (or cybermanity).

On the face of it, adding cybertech upgrades to the brain seems rather plausible. After all, in many cases this will just be a matter of bypassing the sense organs and directly connecting the brain to the data. So, for example, instead of holding my tablet in my hands so I can see the results of Google searches with my eyes, I’ll have a computer implanted in my body that links into the appropriate parts of my brain. While this will be a major change in the nature of the interface (far more so than going from the command line to an icon-based GUI), it will not be as radical a change as some people might think. After all, it is still just me doing a Google search; I simply do not need to hold the tablet or see it with my eyes. This will not, obviously enough, make me any smarter and presumably would not alter my humanity in any meaningful way relative to what the tablet did to me. To put it crudely, sticking a cell phone in your head might be cool (or creepy), but it is still just a phone. Only now it is in your head.

The more interesting sort of chip would, of course, be one that actually changes the person. For example, when many folks talk about the coming new world, they speak of brain enhancements that will improve intelligence. This is, presumably, not just a matter of sticking a calculator in someone’s head. While this would make getting answers to math problems more convenient, it would not make a person any more capable at math than does a conventional outside-the-head calculator. Likewise for sticking in a general computer. Having a PC on my desktop does not make me any smarter. Moving it into my head would not change this. It could, obviously enough, make me seem smarter—at least to those unaware of my headputer.

What would be needed, then, would be a chip (or whatever) that would actually make a change within the person herself, altering intelligence rather than merely closing the interface gap. This sort of modification does raise various concerns.

One obvious practical concern is whether or not this is even possible. That is, while it makes sense to install a computer into the body that the person uses via an internal interface, the idea of dissolving the distinction between the user and the technology seems rather more questionable. It might be replied that this distinction does not really matter. However, it does. After all, plugging my phone and PC into my body still keeps the distinction between the user and the machine in place. Whether the computer is on my desk or in my body, I am still using it and it is still not me. After all, I do not use me; I am me. As such, my abilities remain the same—it is just a tool that I am using. In order for cybertech to make me more intelligent, it would need to change the person I am, not just change how I interface with my tools. Perhaps the user-tool gap can be bridged. If so, this would have numerous interesting implications for philosophy.

Another concern is more philosophical. If a way is found to create a chip (or whatever) that becomes part of the person, and not just a tool that resides in the body, then what effect would this have on the person in regards to his personhood? Would Chipped Sally be the same person as Sally, or would there be a new person? What if Sally is chipped and then de-chipped? I am confident that armies of arguments can be marshalled on the various sides of this matter. There are also the moral questions about making such alterations to people.


Canine Cognition

Posted in Metaphysics, Philosophy by Michael LaBossiere on September 21, 2009

Descartes, most famous for writing “I think, therefore I am,” also wrote about the minds of animals. Roughly put, his view was that animals lacked minds, at least as he saw minds (as immaterial metaphysical thinking substances). He had two main arguments for this. First, animal behavior can be explained without such minds, using purely physical explanations; so, by Occam’s Razor, there is no need to accept that animals have minds. The second argument he gave is that animals do not use true language, and this is the surest sign that they lack minds.

Descartes was well aware that clever animals, like dogs and horses, could learn various tricks and that all animals can make noises to express feelings. However, he held that these facts did not show that animals think.

In recent years, researchers have begun to accept what dog folks have known since humans started having dogs as pets: dogs are smart. For example, research has revealed that dogs can recognize the use of a pointed finger. While recognizing what a pointed finger means (“that”) seems simple enough, it actually requires fairly advanced cognition. The intent of the action must be understood and the object of the action (what is pointed at) must also be recognized. This sort of sign seems to be more abstract than a direct physical gesture, such as a display of anger or joy. As such, this sort of interpretation requires fairly impressive communication skills.

Dogs, as all dog folks know, are very good at conveying their feelings and desires. They are also quite good at understanding words and can have rather complex vocabularies. For example, my husky can distinguish between numerous words and phrases and react accordingly. She also has various vocalizations and behaviors that make it clear what she wants or seems to be thinking at the time. While this might be dismissed as mere habituation, even habituation that complicated would require significant mental horsepower.

While dogs do not use true language, they certainly seem to have a rather good grasp of our use of language as well as our gestures. Because of this, I am inclined to regard dogs as having minds, albeit less complex than those of most humans (of course, I believe that my husky is smarter than some humans). Unlike Descartes, my view is that having a mind is not a “you do or you don’t” sort of thing in all cases. Rather, minds seem to come in varying degrees. Of course, what the mind actually might be is something that is still under considerable debate.
