A Philosopher's Blog

The Better than Average Delusion

Posted in Reasoning/Logic by Michael LaBossiere on March 28, 2014
Average Joe copy (Photo credit: Wikipedia)

One interesting, but hardly surprising, cognitive bias is the tendency of a person to regard herself as better than average—even when no evidence exists for that view. Surveys in which Americans are asked to compare themselves to their fellows are quite common and nicely illustrate this bias: the overwhelming majority of Americans rank themselves as above average in everything ranging from leadership ability to accuracy in self-assessment.

Obviously enough, the majority of people cannot be better than average—at most half of us can sit above the median, which is just how averages work. As to why people think otherwise, the disparity between what is claimed and what is the case can be explained in at least two ways. One is another well-established cognitive bias, namely the tendency people have to believe that their performance is better than it actually is. Teachers get to see this in action quite often—students generally believe that they did better on the test than they actually did. For example, I have long since lost count of the students who have gotten Cs or worse on papers and said to me “but it felt like an A!” I have no doubt that it felt like an A to the student—after all, people tend to rather like their own work. Given that people tend to regard their own performance as better than it is, it certainly makes sense that they would regard their abilities as better than average—after all, we tend to think that we are all really good.
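To make the arithmetic concrete, here is a minimal sketch in Python (purely an illustration; the population, the ability scale, and the optimistic “bump” people add to their self-assessments are all invented for the example, not taken from any survey). However generously people rate themselves, only about half of them can actually sit above the median.

import random

random.seed(0)

# An invented population of "true" ability scores.
population = [random.gauss(100, 15) for _ in range(10000)]
median_ability = sorted(population)[len(population) // 2]

# Assume, purely for illustration, that each person's self-assessment is
# his or her true ability plus a flat optimistic bump of 10 points.
self_assessments = [ability + 10 for ability in population]

claim_above = sum(1 for s in self_assessments if s > median_ability)
truly_above = sum(1 for a in population if a > median_ability)

print("Rate themselves above the median: {:.0%}".format(claim_above / len(population)))
print("Actually above the median:        {:.0%}".format(truly_above / len(population)))
# The first figure is well above 50%; the second is about 50% by definition.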

Another reason is yet another bias: people tend to give more weight to the negative than to the positive. As such, when assessing other people, we will tend to consider negative things about them as having more significance than the positive things. So, for example, when Sally is assessing the honesty of Bill, she will give more weight to incidents in which Bill was dishonest relative to those in which he was honest. Consequently, Sally will most likely see herself as being more honest than Bill. After enough comparisons, she will most likely see herself as above average.

This self-delusion probably has some positive effects—for example, it no doubt allows people to maintain a sense of value and to enjoy the smug self-satisfaction that they are better than most other folks. This surely helps people get by day-to-day.

There are, of course, downsides to this—after all, a person who does not do a good job assessing himself and others will be operating on the basis of inaccurate information and this rarely leads to good decision making.

Interestingly enough, the better-than-average delusion holds up quite well even in the face of clear evidence to the contrary. For example, a study published in the British Journal of Social Psychology surveyed British prisoners, asking them to compare themselves to other prisoners and to the general population in terms of such traits as honesty, compassion, and trustworthiness. Not surprisingly, the prisoners ranked themselves as above average. They did, however, rank themselves as merely average when it came to the trait of law-abidingness. This suggests that reality has some slight impact on people, but not as much as one might hope.

My Amazon Author Page

My Paizo Page

My DriveThru RPG Page

Discerning Lies

Posted in Philosophy, Universities & Colleges by Michael LaBossiere on December 6, 2013
“Lie Detector” (Photo credit: Wikipedia)

In my profession, I am lied to on a regular basis. Since I have been a professor for quite some time, I have learned a fair amount about the ways of lying. It also helps that I teach critical thinking and include a section on assessing credibility.

Interestingly enough, many people think that they can discern whether or not a person is lying by examining their behavior. This is presumably based on the assumption that there are behavioral clues to indicate when a person is lying. These include the usual signs of nervousness or anxiety such as a failure to make eye contact, sweaty palms, seeming tense, and so on. Unfortunately for those who need to discern lies from truth, such behavior is not actually an indicator of lying. People lie without showing any signs of nervousness and people show these signs when they are not lying.

One argument for this can be based on the obvious fact that there are plenty of successful liars. As has been said, a successful con artist will seem quite honest-after all, if he seemed dishonest he would not be successful. As such, it is quite reasonable to believe that there are people who lie without showing the signs that a liar is alleged to show.

Naturally, it might be countered that some people are skilled at lying and show no signs of their deceit or that others are merely bad at detecting said behavior (or both). This is, of course, worth considering.

My own experience has been that people are generally good at lying without showing any physical signs they are lying. Over the years, I have lost count of the number of times students have lied to me with perfectly calm voices, solid eye contact and no sign of discomfort. The only reason I knew they were lying was because I had clear evidence of this. One of my most amazing experiences was when a student came to my office to say that I had lost her test. I showed her that I had counted the tests, had everyone sign in, and also collected an assignment. All tests were accounted for, she had not signed in and I had no assignment for her. She left, then came back to say that she missed the test because she was sick. Her behavior was completely normal-no change of tone, no nervous behavior and so on. I said that she would need to bring an excuse from the Dean. She left, then came back a bit later to tell me that she had been in a car accident, which is why she missed the test. I again said she would need an excuse from the Dean. Now, it might be claimed that I was oblivious to incredibly subtle behavioral clues that she was lying that someone else might have caught. However, she showed no discernible anxiety-she merely conveyed an air of boredom and disdain. It seemed like the whole thing was routine to her and she was apparently put out by my failure to simply go along with her.

After a spike in students trying to change answers or names on returned tests, I started taking photos of all the answer sheets with my iPod touch. When a student would come in with an exam claiming that I had made a mistake, failed to record a test grade, or lost a test, I would whip out my iPod and look through the photos. Interestingly, the students never showed any signs of being nervous, even when the photos exposed a lie. It seemed that lying was but a tactic that had failed, and they took it in stride with no change in behavior.

It might be countered that I was dealing with a few master liars or that I am simply incapable of basic observations regarding anxiety. In regard to the first, it is certainly possible that I have encountered many master liars over the years. After all, I have about 300 students a year and have been teaching for a couple of decades, so I could certainly be encountering people who are unusually skilled at lying. It is also possible that, although I am capable of observing behavior, I simply cannot discern the anxiety caused by lying (or other lying behavior), although I can detect other anxiety just fine (such as that shown by students who are worried about failing).

I do not want to convey the impression that my students are unusually prone to lying. Based on my experience with other people, they seem to be quite normal and most of my students do not lie to me (if only because there is no need to do so).

Interestingly enough, I do see what would be regarded as behavioral indicators of lying from people who I know are not lying. After all, signs of anxiety or stress can result from many causes that have nothing at all to do with lying. Also, some people just normally exhibit behavior that is wrongly taken as signs of lying, such as being unwilling to make eye contact, nervous laughter, and seeming tense.

Not surprisingly, the only really reliable way to tell when people are lying is to have actual evidence indicating that they are saying untrue things. This is not to say that people never show physical signs that they are lying; sometimes people do show nervousness or guilt when lying. But trying to go by the alleged behavioral clues is a recipe for getting things wrong.

My Amazon Author Page

My Paizo Page

My DriveThru RPG Page

Attraction & Clarity

Posted in Relationships/Dating, Science by Michael LaBossiere on February 19, 2011
Facebook logo (Image via Wikipedia)

One common filler in the news these days is the study story. These are the stories about various studies, often in psychology, that purport to tell us things on the basis of rather limited samples and often with somewhat amazing inferential leaps. One recent filler piece I came across purports to show that women are more attracted to men whose feelings about them are not clear.

This study was originally published in Psychological Science, where one may find a plethora of similar studies.

Obviously, academic types have to keep the gears of the research machine going. Status, funding, promotion and tenure all depend on this.

The study mentioned above was conducted by Dr. Erin R. Whitchurch and Timothy D. Wilson of the University of Virginia and Daniel T. Gilbert of Harvard University. In the study, 47 female UVA undergrads were told that male students from other schools had looked at their Facebook profiles and rated them. Each woman was given “fake” profiles of four men. For the study, the women were divided into three groups: the women in the first group were told that the men had given them high ratings, those in the second group were told that the men had rated them as average, and those in the third group were not told whether the men had rated them highly or as average.

The women were most attracted to the men whose ratings they did not know. In second place were the men who had rated them highly, and last were the men who had rated them as average.

While this study is somewhat interesting, the media coverage certainly outstrips the foundation that it provides.

First, the sample is extremely small, and this makes inferences to the general population of women questionable at best. Second, the sample consists of undergraduates at a specific university. This raises the obvious question of whether the sample is adequately representative of women in general. After all, the age of the women, their education, and so on could be factors that affected the outcome of the study.

Another concern is that some folks presented the study as if it provided findings relevant to dating and relationships. While using the fictional Facebook scenario might tell us something about psychology, making inferences about what would occur in dating or in relationships from such a scenario would be quite a leap.

However, the study did make good press and spread widely on the web. Perhaps someone should do a psychology study on that.

I’ve posted links to articles about the study below.

Stephen Colbert a Threat to Philosophy

Posted in Humor, Philosophy by Michael LaBossiere on October 18, 2010
Stephen Colbert and his wife Evelyn McGee-Colb... (Image via Wikipedia)

In a recent show, Stephen Colbert used the expression “begs the question” in a way that moved him onto the Threats to Philosophy Board (#1 Threat: Hemlock). This is because he used the phrase incorrectly and while wearing a tie.

When people use “begs the question” in this manner, they actually mean “asks the question” or “raises the question.” However, the term “beg the question” already has an established usage as the name of a logical fallacy.

Begging the question is a logical fallacy that involves assuming what is to be proven. For example, if someone says “cheating on a test is wrong because it is wrongfully taking a test”, then he is begging the question. In effect, the person is saying “the reason cheating on a test is wrong is because it is wrong.”

One might wonder why this should be regarded as a problem. After all, it might be argued, people ought to be able to use words any way they wish. If people use “beg the question” to mean “raises the question” then so be it.

While it is true that the meaning of terms is largely a matter of convention, it seems to make little sense to use “begs the question” to mean “asks the question.” After all, there are already perfectly good phrases to say “asks the question”, “raises the question” and so on. There thus seems to be little need to steal “begs the question.”

The matter of putting him on the threat board was not taken lightly. He was already under consideration for his use of “truthiness” and the damage that was doing to logic. Students now ask about “truthiness” tables and think that there is something called validityness (an argument such that if all the premises have truthiness then the conclusion has truthiness).

Obviously, Colbert must be stopped before he does to philosophy what hemlock did to Socrates and what MSNBC does to itself.

Politics of Anger V: Counterfactual

Posted in Philosophy, Politics by Michael LaBossiere on October 17, 2010
Angry Talk (Comic Style) (Image via Wikipedia)

While the angry voters are angry about many things, one main focus of the anger is on the vast sums of money dumped into the bailout. While people are angry that the very people who caused much of the trouble got bailed out, people are also angry because they think that the bailout wasted money. To be specific, some people believe that the bailout did not work. The basis for this is that while the corporations are generally doing well, many individuals are in financial trouble. Unemployment is high and foreclosures are all too common. As such, it is hardly surprising that people are mad about what seems to have been useless spending.

Those who defend the stimulus spending claim that things would be worse now if the spending had not occurred. This sort of claim is what philosophers call a counterfactual. A counterfactual claim is a claim about what would be (or would have been) the case if things were different. People make such claims quite often. For example, someone might say “we would have won if Ted hadn’t missed that foul shot.”

In some cases, the truth value of a counterfactual is easy enough to sort out. For example, if a game is tied and the final shot is Ted’s foul shot, then it is clear that if he had made the shot, his team would have won the game. In the case of the stimulus spending, the truth value of the counterfactual is rather hard to sort out because of the complexity of the situation and the fact that economics is hardly an exact science.

While various economic experts claim that the stimulus did prevent something worse, some people are skeptical. In some cases this skepticism is well founded and is based on legitimate concerns about the limits of economics as a science in particular and about counterfactual reasoning in general. In other cases this skepticism is driven by emotional factors and is thus not well founded.

One powerful emotional factor is that people feel afraid and angry about what is happening now. After all, people typically feel what is the case far more strongly than what might have been. So, it is hardly shocking that telling people that things would be worse now without the spending does not make them feel better about how things are now.

It seems to be a basic feature of human psychology that telling someone how things could be worse does not, in general, cheer them up. In my own case, this point was very strongly made when I fell from my roof and tore my quadriceps tendon. People telling me “it could have been worse” did not make me feel any better about my leg. Of course I knew, intellectually, that it could have been worse and I was glad that I had not wrecked both legs or died. But being told that it could have been worse did not make me feel any better about my leg.

This certainly seems to apply to the current situation: telling someone who is unemployed or who has lost her house that things would be worse now without the stimulus does not make her feel any better. In fact, being told that things could be worse might have the opposite effect. In my own case, I got sick very quickly of people telling me that my fall could have been worse. Since I knew they were trying to express sympathy, I was careful to hide my irritation at being told that it could have been worse. But, as with my leg, what makes people feel better is not being told that it could be worse, but having someone actually do something to make things clearly better. Or, at the very least, saying things that make someone feel better.

Turning back to the stimulus spending, the Democrats obviously chose a poor tactic in trying to focus on how things would have been worse without the spending. While it is a point worth making, telling people this is (as noted above) not going to make them feel better. What the Democrats needed to do was to either find a way to make things better now or (this being politics) find a way to make people feel better.  However, they have clearly failed in both of these areas and people are not happy. In fact, some people are rather angry. Unless the Democrats are able to do something positive (or make people believe they are doing something positive) then things might go very badly for them in the next election cycle.

Politics of Anger IV: Coming Down

Posted in Philosophy, Politics by Michael LaBossiere on October 16, 2010
A Cannabis sativa leaf (Image via Wikipedia)

The voters are supposed to be very angry these days and I have been exploring various issues relating to this alleged anger. Today I am looking at one of the causes of this anger.

One interesting thing about human psychology is that being full of hope is remarkably similar to being high. Just as drugs wear off, what causes people to have hope also wears off (and fairly quickly). Also, just as with drugs, people seem to build up a tolerance for what gives them hope, so it takes more and more to make a person feel hopeful. Likewise, if a drug is supposed to make someone really high and it does not deliver, then the person tends to be unhappy. Finally, when someone’s high or hope fades and they are coming down from it, they tend to be rather angry.

The Democrats came into office promising a big high, metaphorically speaking. After all, Obama rolled in promising hope and change.  However, they failed to deliver on this hope and this created resentment.

So, America is metaphorically coming down hard from a hope high and people are mad as hell and looking for the next fix. The Republicans are, of course, offering to hook America up. Last time around, the Republicans were dealing in fear. Maybe they will hook America with that old standby or maybe they have something else brewing in their labs.

Meltdown?

Posted in Politics by Michael LaBossiere on September 4, 2010

Many folks in the media have turned their attention to Arizona’s Governor Jan Brewer. Unfortunately for her, this attention has been rather negative. In fact, the theme running through much of the media is that she suffered from a meltdown (or two). The first “meltdown” occurred during a debate and is captured in this video.

The second incident occurred when reporters tried to ask her questions. Their specific focus was on her remark about decapitated bodies in the desert of Arizona.

Since I am often accused of being blinded by my alleged liberal views, it might surprise some that I believe that the media is in error here. While I do not agree with Brewer on most matters, my own principles require that I come to her defense in this matter.

While I do agree that she handled the situations poorly, the media folks are engaged in hyperbole with their repeated use of the term “meltdown.” When I heard that she had a meltdown, I expected something that would, in fact, fit the term. Perhaps she had started screaming at her opponent or had gone into a wild frenzy. But, when I saw the video of the debate, all I saw was her losing her train of thought. True, it was a major loss, but hardly what I would consider a meltdown. While one would expect an experienced politician to be able to handle such a debate easily, there might be factors affecting her that we are not aware of or, perhaps, it was just one of those bad moments. Even though I have taught for years, I will sometimes lose my train of thought and literally draw a complete blank. While this might happen only once a year, it has happened. I did not consider this a meltdown on my part, nor do I consider the governor’s situation a meltdown.

In regard to the second incident, she should have answered the question and addressed the concerns being raised. After all, she is running for office and owes the people an answer. There is also the practical matter: not answering makes her come across poorly.

However, not answering questions does not seem to qualify as a meltdown. Now, if she had screamed at the reporters or done a classic Sean Penn maneuver, then that would be a meltdown. As such, the use of the term is hyperbole.

This is not to say that the media should not have covered the story. However, more accurate terms should have been used and the story should receive the degree of coverage that it deserves (which is less than it is getting). The approach being taken now serves to lend credence to the claims that some folks in the media are biased.

Creativity

Posted in Philosophy by Michael LaBossiere on August 8, 2010
Rendering of human brain (Image via Wikipedia)

A while back I saw a Newsweek article about the decline of creativity in America. While I have some doubts about the methodology, I do agree that developing creativity is generally a good thing. As such, I’ll provide a little advice about how to be more creative.

It might seem odd that I would offer such advice. After all, it is often believed that creativity is something that you are born with (or not). While it is true that people are born with varying degrees of creativity, creativity seems like any other quality in that it can be developed (or stifled) by many other factors ranging from environment to training. In any case, it is easy enough to test whether creativity can be improved by trying to do so.

One way to enhance creativity is to develop your foundation of knowledge. This might seem a bit odd; after all, creativity and knowledge are different. A person might know a great deal, yet not be very creative, and vice versa. However, consider the following analogy. Imagine a person with a small set of Legos. She can be fairly creative with them, but she will be limited in what she can do. By adding more Legos she can do more with her creativity. Likewise, the more you know, the more you have to work with.

A second way to enhance creativity is to expand your experiences. This can be done indirectly (reading, for example) or directly (travel, meeting new people, etc.). The focus should be on exposure to ideas, views and ways of life different from your own. In addition to boosting what you have to work with, this can be a help in expanding your perspective. To use an analogy, it is like going and seeing what other people are doing with their legos.

A third way to boost creativity is to be healthy. Eat well, get enough sleep and exercise. Being unhealthy tends to dampen creativity. True, some very unhealthy people are very creative and some very fit people are as creative as bricks. However, good health improves a person’s mental abilities and this includes creativity. You can test this yourself-the next time you are exhausted, sick and hurt, try to be really creative. Then try again when you are well rested, fit and feeling good.

A fourth way to boost creativity is to allow yourself time to mentally drift (or daydream). This allows you to let ideas drift around, merge, blend and split apart. It also allows your subconscious processes to work away at things. In my own case, I have found that I come up with my best ideas when I am running or sort of napping while I am a passenger in a car. Even when I am not consciously focusing on a problem, it seems that once I sort of set my mind in that direction it will “grind away” on the matter (much like my computer is running all sorts of background tasks while I write this blog). This process is, of course, formalized as brainstorming, which is a form of semi-directed drift.

A fifth way to boost creativity is to get rid of interruptions and let your mind rest. Some banes of creativity are obvious. For example, if I were trying to write while someone kept interrupting me with demands that I print this or look up that, it would impede my creativity. Other banes are somewhat less obvious, such as the electronic interrupters most people have invited into their lives. One example is the smartphone: it is almost always on and leaps in to disrupt with its beeps, rings and bings. It also seduces people, drawing their attention to it. Another example is the computer: the web, chat and email are all there, ready to intrude. They provide an endless parade of pokes and prods that keep a person from ever truly settling down into creativity. So, people think a thousand shallow and disconnected thoughts, yet rarely have the chance for that deep creative dive.

Human

Posted in Philosophy, Science by Michael LaBossiere on August 2, 2010
Kuhn used the duck-rabbit optical illusion to ... (Image via Wikipedia)

Sharon Begley recently wrote an interesting article, “What’s Really Human?” In this piece, she presents her concern that American psychologists have been making hasty generalizations over the years. To be more specific, she is concerned that such researchers have been extending the results gleaned from studies of undergraduates at American universities to the entire human race. For example, findings about what college students think about self image are extended to all of humanity.

She notes that some researchers have begun to question this approach and have contended that American undergraduates are not adequate representatives of the entire human race in terms of psychology.

In one example, she considers the optical illusion involving two line segments. Although the segments have the same length, one has arrows on the ends pointing outward and the other has the arrows pointing inward. To most American undergraduates, the one with the inward-pointing arrows looks longer. But when the San of the Kalahari, African hunter-gatherers, look at the lines, they judge them to be the same length. This is taken to reflect the differing visual environments in which the two groups live.

This result is, of course, hardly surprising. After all, people who live in different conditions will tend to have different perceptual skill sets.

Begley’s second example involves the “ultimatum game,” which is typical of the tests that are intended to reveal truths about human nature via games played with money. The gist of the game is that there are two players, A and B. In this game, the experimenter gives player A $10. A then must decide how much to offer B. If B accepts the deal, they both get the money. If B rejects the deal, both leave empty-handed.

When undergraduates in the States play, player A will typically offer $4-5 while those playing B will most often refuse anything below $3. This is taken as evidence that humans have evolved a sense of justice that leads us to make fair offers and also to punish unfair ones, even when doing so means a loss. According to the theorists, humans do this because we evolved in small tribal societies in which social cohesion mattered and freeloaders (or free riders, as they are sometimes called) had to be prevented from getting away with their freeloading.
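To make the mechanics of the game concrete, here is a minimal sketch in Python (the function name is my own invention, and the offer and threshold values are just the illustrative dollar figures mentioned above, not data from any study):

def ultimatum_round(pot, offer, acceptance_threshold):
    # Player A proposes giving 'offer' out of 'pot' to player B.
    # B accepts any offer at or above her threshold; a rejection
    # leaves both players with nothing.
    if offer >= acceptance_threshold:
        return pot - offer, offer  # (A's payoff, B's payoff)
    return 0, 0

# Roughly the American undergraduate pattern described above:
print(ultimatum_round(10, 4, 3))  # (6, 4): a "fair" offer is accepted
print(ultimatum_round(10, 2, 3))  # (0, 0): a low offer is punished, and both players lose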

As Begley points out, “people from small, nonindustrial societies, such as the Hadza foragers of Tanzania, offer about $2.50 to the other player—who accepts it. A ‘universal’ sense of fairness and willingness to punish injustice may instead be a vestige of living in WEIRD, market economies.” (“WEIRD” is the psychologists’ shorthand for Western, educated, industrialized, rich, and democratic.)

While this does provide some evidence for Begley’s view, it does seem rather weak. The difference between the Americans and the Hadza does not seem to be one of kind (that is, Americans are motivated by fairness and the Hadza are not). Rather, it seems plausible to see this in terms of quantity. After all, Americans refuse anything below $3 while the Hadza’s refusal level seems to be only 50 cents less. This difference could be explained in terms not of culture but of relative affluence. After all, to a typical American undergrad, it is no big deal to forgo $3. However, someone who has far less (as is probably the case with the Hadza) would probably be willing to settle for less. To use an analogy, imagine playing a comparable game using food instead of money. If I had recently eaten and knew I had a meal waiting at home, I would be more inclined to punish a small offer than to accept it. After all, I would have nothing to lose by doing so and would gain the satisfaction of denying my “opponent” her prize. However, if we were both very hungry and I knew that my cupboards were bare, then I would be much more inclined to accept a smaller offer on the principle that some food is better than none.

Naturally, cultural factors could also play a role in determining what is fair or not. After all, if A is given the money, B might regard the money as A’s property and regard A as being generous in sharing anything at all. This would show that culture is a factor, but this is hardly a shock. The idea of a universal human nature is quite consistent with it being modified by specific conditions. After all, individual behavior is modified by such conditions. To use an obvious example, my level of generosity depends on the specifics of the situation, such as the who, why, when and so on.

There is also the broader question of whether such money games actually reveal truths about justice and fairness. This topic goes beyond the scope of this brief essay, however.

Begley finishes her article by noting that “the list of universals-that-aren’t kept growing.” That is, allegedly universal ways of thinking and behaving have been found to not be so universal after all.

This shows that contemporary psychology is discovering what Herodotus noted thousands of years ago, namely that “custom is king” and what the Sophists argued for, namely relativism. Later thinkers, such as Locke and other empiricists, were also critical of the idea of universal (specifically innate) ideas. In contrast, thinkers such as Descartes and Leibniz argued for the idea of universal (specifically innate) ideas.

I am not claiming that these thinkers are right (or wrong), but it is certainly interesting to see that these alleged “new discoveries” are actually very, very old news. What seems to be happening in this cutting-edge psychology is a return to the rationalist and empiricist battles over the innate content of the mind (or lack thereof).

Being a Man I: Social Construct

Posted in Ethics, Philosophy by Michael LaBossiere on April 19, 2010
my 1960s wedding suit (Image by Chaymation via Flickr)

Apparently some men are having trouble figuring out what it is to be a man. There are various groups and individuals that purport to be able to teach men how to be men (or at least dress like the male actors on the show Mad Men).

Before a person can become a man, it must be known what it is to be a man.  There are, of course, many conceptions about what it is to be a man.

One option is to take the easy and obvious approach: just go with the generally accepted standards of  society. After all, a significant part of being a man is being accepted as a man by other people.

On a large scale, each society has a set of expectations, stereotypes and assumptions about what it is to be a man. These can be taken as forming a set of standards regarding what one needs to be and do in order to be a man.

Naturally, there will be conflicting (even contradictory) expectations, so meeting the standards for being a man will require selecting a specific subset. One option is to select the ones that are accepted by the majority or by the dominant segment of the population. This has the obvious advantage that this sort of manliness will be broadly accepted.

Another option is to narrow the field by selecting the standards held by a specific group. For example, a person in a fraternity might elect to go with the fraternity’s view of what it is to be a man (which will probably involve the mass consumption of beer). On the plus side, this enables a person to clearly be a man in that specific group. On the minus side, if the standards (or mandards) of the group differ in significant ways from the more general view of manliness, then the individual can run into problems if he strays outside of his mangroup.

A third option is to attempt to create your own standards of being a man and to get them accepted by others (or not). Good luck with that.

Of course, there is also the question of whether there is something more to being a man above and beyond the social construction of manliness. For some theorists, gender roles and identities are simply that: social constructs. Naturally, there is also the biological matter of being male, but being biologically male and being a man are two distinct matters. There is a clear normative aspect to being a man and merely a biological aspect to being male.

If being a man is purely a matter of social construction (that is, we create and make up gender roles), then being a man in group X simply involves meeting the standards of being a man in group X. If that involves owning guns, killing animals, and chugging beer while watching porn and sports, then do that to be a man. If it involves sipping lattes, talking about Proust, listening to NPR and praising a scrumptious quiche, then do that. So, to be a man, just pick your group, sort out the standards and then meet them as best you can.

In many ways, this is comparable to being good: if being good is merely a social construct, then to be good you just meet the standards of the group in question.

But perhaps being a man is more than just meeting socially constructed gender standards. If so, a person who merely meets the “mandards” of being a man in a specific group might think he is a man, but he might be mistaken. I’m reasonably sure this happens often. (You know who you are…don’t deny it.) This is, of course, a subject for another time and another blog.
