Graduate English Multi-Dimensional Course (Explorations): Original Texts

Unit 2 Lies and Truth

What is truth? – and the opposite question that goes with it: what makes a lie? Philosophers, teachers, and religious leaders from all cultures and periods of history have offered many answers to these questions. Among Euro-North-American writers, there is general agreement on two points. The first is that what we call a “lie” must be told intentionally – that is, if someone tells an untruth but they believe it to be true, we don’t consider them a liar. The second point is that practically everyone lies, and lies frequently. But there the agreement ends.

One rather extreme point of view is that lying is always bad and that we should try to find ways to avoid doing it. The reason is that lying hurts not only the listener, but also the liar. Each lie makes the next one easier to tell, and the liar comes not only to disrespect herself, but to mistrust others, who she believes will lie as easily as she does. In a society where lying is common, trust becomes impossible, and without trust, cooperation cannot exist. Furthermore, by lying to people, we remove their power to make important choices about how to spend money, what future career to pursue, and what medical treatment to accept.

Toward the opposite extreme is the position that although some lies are evil, many others are not – in fact, they are necessary to hold our society together. We lie in harmless ways to protect others’ feelings and to better our relationships. These are not lies that try to hurt others. We laugh at the boss’s joke which we have heard before and which she doesn’t tell very well; we pretend interest in a friend’s story of something uninteresting that happened to him. If someone asks us a question that is very personal and is none of their business, we may lie in response. Sometimes we lie to protect the reputation or even the life of another person. On a larger scale, a government may protect national security by lying.

Each person seems to have some point at which they draw the line between an acceptable lie and a bad lie. Obviously, this point varies from individual to individual and from culture to culture.

A sometimes painful part of growing up is realizing that not everyone shares your own individual definition of honesty. Your parents and your culture may teach you that liars suffer, but as you go through life, you find that often they don’t: in fact, dishonest people often seem to prosper more than honest ones. What are you to do with this realization? It may make your moral beliefs look weak and silly in comparison, and you may begin to question them. It takes a great deal of strength and courage to continue living an honest life in the face of such reality.

Little white lies: This is our name for lies that we consider harmless and socially acceptable. They are usually told to protect the liar or the feelings of the listener. Most of them would be considered social lies, and they include apologies and excuses: “I tried to call you, but your line was busy.” “You’re kidding! You don’t look like you’ve gained a pound.” Some people, however, would consider it acceptable to lie to save themselves from responsibility in a business transaction: “After I got home, I noticed that it was broken, so I’m returning it and would like my money back.”

Occasionally a “little white lie” may have a very profound effect on the lives of the listeners, and may even backfire. Author Stephanie Ericsson tells of the well-meaning U.S. Army sergeant who told a lie about one of his men who had been killed in action. The sergeant reported the man as “missing in action,” not killed, so that the military would continue sending money to the dead man’s family every month. What he didn’t consider was that because of his lie, the family continued to live in that narrow space between hope and loss, always watching for the mail or jumping when the telephone or the doorbell rang. They never were able to go through the normal process of sorrowing for, and then accepting, the death of their father and husband. The wife never remarried. Which was worse, the lie or the truth? Did the sergeant have the right to do what he did to them?

What we really mean when we call an untruth a “little white lie” is that we think it was justifiable. Into this category fall many of the lies told within the walls of government. A person may lie to the government, or a government official may lie to the public, and believe that by doing so, he becomes a hero. Clearly, however, one person’s “little white lie” is another person’s “dirty lie.” That brings us to the second category:

Dirty lies: These are lies told with intent to harm the listener or a third party and to benefit the liar. Into this category fall the lies of some dishonest salespersons, mechanics, and repairmen; husbands or wives who are having an affair with someone else; teenagers who lie to get out of the house in order to do things that their parents would die if they knew about them; and drug addicts who beg family members for money to support their habit. Dirty lies may be told to improve one person’s reputation by destroying another’s, or to hurt a colleague’s chances of promotion so that the liar will be advanced.

Lies of omission: Some people believe that lying covers not only what you say, but also what you choose not to say. If you’re trying to sell a car that burns a lot of oil, but the buyer doesn’t ask about that particular feature, is it a lie not to tell them? In the United States, a favorite place to withhold the truth is on people’s income tax returns. The government considers this an unquestionable lie, and if caught, these people are severely punished. If omission can be lying, history books are great liars. Until recently, most U.S. history textbooks painted Christopher Columbus purely as a hero, the man who “discovered America,” and had nothing to say about his darker side. Moreover, most Native American and African-American contributors to science, technology, invention, literature, art, discovery, and other areas of civilization used to be omitted from children’s schoolbooks. Many people considered this a lie, and today’s history books usually mention at least some of it, though not as much as some people might like.

False promises: This category is made up of promises that the promiser knows are false, that he has no intention of keeping even as the words leave his lips. While some are fairly harmless and social, others are taken more seriously and can hurt the listener: “I’ll never do it again, I promise.” Advertisers and politicians suffer from terrible stereotypes because of the false promises of some of their number: “Lose 50 pounds in two weeks.” “Read my lips: No new taxes.” Probably everyone would agree that if we make a promise but have no intention of keeping it, we lie. But what if we really do plan to keep it, and then something happens to prevent it? Consider the journalist who promises not to identify his sources, but then is pressured by his newspaper or by the law. How far should he go to keep his word? If he breaks his promise, is he dishonest?

Lies to oneself: This is perhaps the saddest and most pathetic kind of lying. These are the lies that prevent us from making needed changes in ourselves: “I know I drank/spent/ate too much yesterday, but I can control it any time I really want to.” But there is a fine line between normal dreams and ambitions on the one hand, and deceiving ourselves on the other, and we have to be careful where we draw it. It’s common for young people to dream of rising to the top of their company, of winning a Nobel Prize, of becoming famous or rich; but is that self-deception, or simply human nature? Were they lying to themselves? More likely, they really believed that such a future was open to them, because they had seen it happen to others. We shouldn’t be too hard on ourselves, but if we have turned a blind eye to our faults, we should take an honest look in the mirror.

There is no question that the terms “lying” and “honesty” have definitions that vary across cultural boundaries. Members of one culture may stereotype members of another as “great liars,” “untrustworthy,” or “afraid to face the truth.” But what may lie behind these differences is that one culture values factual information even if it hurts, while another places more value on sensitivity to other people’s feelings. While the members of each culture believe that of course their values are the right ones, they are unlikely to convince members of other cultures to change over. And that’s “the truth.”

Unit 3 Generation X

It’s often said that kids today aren’t what they used to be. But is this new generation of teenagers and young adults, commonly referred to as “Generation X” or the “baby busters,” really so different from previous generations? What makes them tick? What impact will they have on us and our institutions as we move into the future?

Current Trends

Twenty years ago, employers didn’t worry about finding enough good people. Just like a box of tissues, there was always another candidate that would pop right up. But the 18-year baby boom of 1946-1964, when birth rates peaked at 25.3 births per 1,000 population, was followed by the 11-year “baby bust,” when the rate fell to a low of 14.6 births per 1,000. This means the smallest pool of entry-level workers since the 1930s. “Generation X,” as they were dubbed in a 1991 novel by Canadian writer Douglas Coupland, realize the numbers are on their side. They are now mainly in their 20s, and they see themselves as very marketable in the workplace. They feel that they can be patient when choosing a job, and they can look for the best wages.

This generation has watched more TV, and as a result has probably witnessed more violence and murders, than any generation in history. In addition, their gloomy view of the world has been shaped by numerous negative events, such as the Persian Gulf War, escalating crime, riots, AIDS, the nuclear threat, and pollution.

Their parents practiced birth control and abortion and were highly concerned about “making it” financially. About 40% of X’ers are products of divorce, and many were brought up in single-parent homes. The emotional upheaval and conflict this causes helped shape their view of the family and the world. It seems to have sent out a negative message to X’ers about their value and worth.

Many young people believe that their economic prospects are gloomy. They believe that they will not do as well financially as their parents or their grandparents. They know that the average income for young people, even with two or three college degrees, has declined significantly over the past generations. Many feel that their chances of finding the job and salary they want are bleak. Couple the high divorce rate with the fact that many were latchkey children and you get a generation who may have had more time alone than any in history. They are also the first to spend considerable time in day care. At home, they were weaned on TV, high tech, video games, and computers. They became independent at a young age. Many had to grow up fast, taking on family responsibilities or part-time jobs to help out. All this has helped them become very freedom-minded, individualistic, and self-absorbed.

Many resent the fact that their parents were not home to spend more time with them. An often heard sentiment is that things will be different when they raise their own families.

The loyalty and commitment to the workplace that previous generations had is gone. Generation X’ers watched their grandparents slave away only to receive a gold watch and pension upon retirement. Thirty or more years of loyalty sometimes ended with a security guard helping them to clean out their desks and escorting them out the door. Their parents’ dedication to the company has been repaid with downsizing and layoffs.

Young people feel there is no such thing as job security. They feel they don’t want to wait around and pay their dues when there is no long-term commitment from the top. They can’t believe that their boomer bosses spend 60 or more hours a week at a job that they constantly complain about. They strongly believe there is life after work.

Generation X’ers take longer to make job choices. They look upon a job as temporary instead of as a career, partly because they want to keep their options open. They are always looking to jump ship when they can upgrade their situation. They will often leave a job at the hint of a better position.

This generation seems to do things at a much later age than their parents. They graduate from college later, stay at home longer, and marry much later. Many who leave home come back again, sometimes more than once. This is due in part to the high cost of living and the fact that many have piled up huge student-loan debts. In contrast with the baby boomers, who couldn’t wait to leave home, Generation X’ers save their money so they can live better when they do leave. It may be that some just want to delay the time when they are on their own, because they spent so much time alone as children.

Many of X’ers’ parents were busy in the morning getting ready for work and too tired to have any quality time with their children at night. X’er classrooms were often overcrowded. It was hard for the X’ers to get noticed, so as adults they have a need to be noticed. Often, they seek that attention in the workplace.

Whether from watching TV or from being spoiled by their guilt-ridden, seldom-home parents or grandparents, X’ers have come to expect a whole lot for nothing. They have a strong propensity for instant gratification, wanting it all and wanting it fast. Their favorite TV programs are soap operas. They would like their world to be filled with the same good-looking people, dressed in the latest fashions, with lots of money and prestige, and without having to work too hard.

It is not uncommon for X’ers to get out of high school and expect to be paid well despite minimal skills. Many disdain low-wage “McJobs” at fast-food chains. Young college graduates look to start at high-paying positions with power and perks. They have little patience for working their way up.

Yet, the X’ers feel that making money is not as important as experiencing life. To be a workaholic is to have no life. Consequently, a paradox exists between how they view life and what they think they need from it.

Future Trends

The first boomers are only 10 or 12 years away from retiring – and finally out of the way of the next generation. The X’ers will begin to take over in politics, arts and culture, education, media, and business. This should lead to a time of better problem solving and quicker solutions, as they hate political maneuvering and want to get to solutions in a fast, no-nonsense way.

X’ers don’t like the fact that their parents spent so many hours working. They promise to do better with their children, being more accessible and providing a more stable home life. Since many of them will marry later when they are more mature, the divorce rate will finally begin to dip.

When X’ers control the organizations of tomorrow, they will create a shorter workweek, so people will have more time to spend with their families and on leisure activities. Productivity won’t suffer, as technology will enable people to be more productive. In addition, the X’ers’ disdain for office politics and desire to solve problems faster will improve productivity. If organizations do not manage their human resources better, X’ers will leave to find or create a more humane workplace.

Many Generation X’ers have a freedom-minded and individualistic nature. They like to be left alone to solve problems. They are a perfect group to become consultants, as already evidenced by so many venturing out on their own.

Organizations will come to rely on the X’ers’ entrepreneurial spirit to foster innovation. They will create systems that will allow “intrapreneurs” to create and run small businesses within a business. The organization’s financial support will allow young people to research and create new products at unparalleled rates. Outside entrepreneurs of this generation will team up with these “intrapreneurs” to create joint ventures.

Generation X’ers have started to use their technology skills to create virtual businesses, and they will be the driving force behind this marketplace in the future. They have been quick to take advantage of the lower overhead and quick start-ups that the Internet provides. Being able to reach millions of people with new ideas and products instantly attracts this generation.

Generation X has evolved in dramatically different ways than previous generations. What motivated past generations is far different from what motivates this new breed. But the changes will be for the better in many ways. Kids may not be what they used to be, but if we listen, there is a lot we can learn from them. The future will be a better place if we do.

Unit 7 To Err Is Human

Everyone must have had at least one personal experience with a computer error by this time. Bank balances are suddenly reported to have jumped from $379 into the millions, appeals for charitable contributions are mailed over and over to people with crazy-sounding names at your address, department stores send the wrong bills, utility companies write that they’re turning everything off, that sort of thing. If you manage to get in touch with someone and complain, you then get instantaneously typed guilty letters from the same computer, saying, “Our computer was in error, and an adjustment is being made to your account.”

These are supposed to be the sheerest, blindest accidents. Mistakes are not believed to be part of the normal behavior of a good machine. If things go wrong, it must be a personal, human error, the result of fingering, tampering, a button getting stuck, someone hitting the wrong key. The computer, at its normal best, is infallible.

I wonder whether this can be true. After all, the whole point of computers is that they represent an extension of the human brain, vastly improved upon but nonetheless human, superhuman maybe.

A good computer can think clearly and quickly enough to beat you at chess, and some of them have even been programmed to write obscure verse. They can do anything we can do, and more besides. It is not known whether a computer has its own consciousness, and it would be hard to find out about this. When you walk into one of those great halls now built for the huge machines, and stand listening, it is easy to imagine that the faint, distant noises are the sound of thinking, and the turning of the spools gives them the look of wild creatures rolling their eyes in the effort to concentrate, choking with information. But real thinking, and dreaming, are other matters.

On the other hand, the evidences of something like an unconscious, equivalent to ours, are all around, in every mail. As extensions of the human brain, they have been constructed with the same property of error, spontaneous, uncontrolled, and rich in possibilities.

Mistakes are the very base of human thought, embedded there, feeding the structure like root nodules. If we were not provided with the knack of being wrong, we could never get anything useful done. We think our way along by choosing between right and wrong alternatives, and wrong choices have to be made as frequently as the right ones. We get along in life this way. We are built to make mistakes.

A good laboratory, like a good bank or a corporation or government, has to run like a computer. Almost everything is done flawlessly, by the book, and all the numbers add up to the predicted sums. The days go by. And then, if it is a lucky day, and a lucky laboratory, somebody makes a mistake: the wrong buffer, something in one of the blanks, a decimal misplaced in reading counts, the warm room off by a degree and a half, a mouse out of his box, or just a misreading of the day’s protocol. Whatever, then the results come in, something is obviously screwed up, and then the action can begin.

The misreading is not the important error; it opens the way. The next step is the crucial one. If the investigator can bring himself to say, “But even so, look at that!” then the new finding, whatever it is, is ready for snatching. What is needed, for progress to be made, is the move based on the error.

Whenever new kinds of thinking are about to be accomplished, or new varieties of music, there has to be an argument beforehand. With two sides debating in the same mind, haranguing, there is an amiable understanding that one is right and the other wrong. Sooner or later the thing is settled, but there can be no action at all if there are not the two sides, and the argument. The hope is in the faculty of wrongness, the tendency toward error. The capacity to leap across mountains of information to land lightly on the wrong side represents the highest of human endowments.

It may be that this is a uniquely human gift, perhaps even stipulated in our genetic instructions. Other creatures do not seem to have DNA sequences for making mistakes as a routine part of daily living, certainly not for programmed error as a guide for action.

We are at our human finest, dancing with our minds, when there are more choices than two. Sometimes there are ten, even twenty different ways to go, all but one bound to be wrong, and the richness of selection in such situations can lift us onto totally new ground. This process is called exploration and is based on human fallibility. If we had only a single center in our brains, capable of responding only when a correct decision was to be made, instead of the jumble of different, credulous, easily conned clusters of neurons that provide for being flung off into blind alleys, up trees, down dead ends, out into blue sky, along wrong turnings, around bends, we could only stay the way we are today, stuck fast.

The lower animals do not have this splendid freedom. They are limited, most of them, to absolute infallibility. Cats, for all their good side, never make mistakes. I have never seen a maladroit, clumsy, or blundering cat. Dogs are sometimes fallible, occasionally able to make charming minor mistakes, but they get this way by trying to mimic their masters. Fish are flawless in everything they do. Individual cells in a tissue are mindless machines, perfect in their performance, as absolutely inhuman as bees.

We should have this in mind as we become dependent on more complex computers for the arrangement of our affairs. Give the computers their head, I say; let them go their way. If we can learn to do this, turning our heads to one side and wincing while the work proceeds, the possibilities for the future of mankind, and computerkind, are limitless. Your average good computer can make calculations in an instant which would take a lifetime of slide rules for any of us. Think of what we could gain from the near infinity of precise, machine-made miscomputation which is now so easily within our grasp. We could begin the solving of some of our hardest problems. How, for instance, should we go about organizing ourselves for social living on a planetary scale, now that we have become, as a plain fact of life, a single community? We can assume, as a working hypothesis, that all the right ways of doing this are unworkable. What we need, then, for moving ahead, is a set of wrong alternatives much longer and more interesting than the short list of mistaken courses that any of us can think up right now. We need, in fact, an infinite list, and when it is printed out we need the computer to turn on itself and select, at random, the next way to go. If it is a big enough mistake, we could find ourselves on a new level, stunned, out in the clear, ready to move again.

Unit 8 Throwing Away the Key

Lock up a criminal and society will be spared whatever other crimes he might have committed if he were still on the street. That much is true.

Lock up two criminals, keep them in prison twice as long, and crime should decrease that much more? Not necessarily.

The logic behind what criminologists call incapacitation – the restraint on prisoners’ ability to commit crime – is irresistible. It has helped to fuel the “get tough” response to crime in the United States that has resulted, over the past two decades, in a major shift toward mandatory minimum sentences, increased use of the death penalty, the introduction of “three strikes” laws, and even, in a few places, the reinstitution of chain gangs.

Unfortunately, most criminologists argue, the logic is flawed. Researchers agree that prison sentences avert some crimes. The question is the degree to which they do: At what point does prison lose its effectiveness in fighting crime? Answers to the question vary widely.

At one extreme, heating up the argument considerably is John J. Dilulio, Jr., a political scientist at Princeton University and the Brookings Institution. In the past three years, he has emerged as one of the most outspoken proponents of tougher prison terms for serious offenders – and as a favorite of politicians who want to look serious about crime.

“Yes, we’ve tripled the prison population,” he says. “Yes, we’ve doubled spending. That doesn’t tell me that we shouldn’t do more.”

Most criminal-justice scholars – to say the very least about what they think of Mr. Dilulio’s work – disagree.

“It’s distressingly easy to fill prisons,” says Franklin E. Zimring, a law professor at the University of California at Berkeley, “but not with the kind of exceptionally threatening offenders you want.”

A Massive Natural Experiment

The United States is in the midst of a massive natural experiment in imprisonment’s effect on crime – the results of which, so far, are inconclusive.

Responding to a steep increase in the crime rate in the 1960s, political leaders fell back on tougher sanctions. Since the mid-1970s, every state has passed some kind of mandatory-minimum sentencing law. In the past two decades, the rate of imprisonment has tripled, to about 350 prisoners for every 100,000 people from about 110. Today about 1.5 million men and women are behind bars.

Over the same period, the rate of crime has remained fairly stable. According to data from the federal Bureau of Justice Statistics, which surveys households to determine the number of people victimized by crime, the number of victims of all types of crime is down somewhat, to slightly less than 35 million in 1992 from slightly more than that in 1973. The other major source of national crime data, the Federal Bureau of Investigation’s compilation of offenses that are reported to the police, shows an increase in total criminal offenses, although the rates of certain kinds of crime have held steady. Since the mid-1970s, for example, the yearly murder rate has stayed between 8 and 10 per 100,000 people.

Criminal-justice experts and policy makers see in those sets of statistics what they want to see. Some say the fact that crime rates have held steady (or gone down, depending on the source) must be due at least in part to the increase in imprisonment. In a 1991 article in Science, for example, Patrick A. Langan, a senior statistician for the Bureau of Justice Statistics, calculated that there were 66,000 fewer rapes in 1989 than in 1973, 323,000 fewer robberies, 380,000 fewer assaults, and 3.3 million fewer burglaries.

“If only one-half or even one-fourth of the reductions were the result of rising incarceration rates,” he wrote, “that would still leave prisons responsible for sizable reductions in crime.”

Yet crime has certainly not decreased in proportion to the rise in imprisonment. Experts say the law of diminishing returns is at work here: As judges send more and more people to jail, a greater proportion of prisoners will inevitably be less-frequent offenders. What’s more, most criminologists agree that the steep rise in incarceration rates has been fueled largely by low-level drug offenders. Giving them more and longer sentences has done little to stop the drug trade, scholars say, since there always seem to be others out on the street to take their place.

The fact that scholars treat the data on rates of crime and incarceration a little like tea leaves is at least partly because of the difficulty of analyzing the interplay of crime and punishment. Not many researchers have been drawn to the task.

“When you look at the relationship between crime rates and prison populations, it’s hard to tell what’s causing what,” says Daniel Nagin, a criminologist at Carnegie Mellon who is writing a review of research into the effects of imprisonment on crime. “Given the scale of this thing, it’s an underresearched question.”

‘A Very Interesting Puzzle’

They also found an anomaly, however: Most of the drop in those two crimes occurred among juvenile offenders, while most of California’s increased incarceration (and, theoretically, incapacitation) occurred among adults. “The age pattern is a very interesting puzzle,” Mr. Zimring says.

Researchers do know a lot about the ages at which offenders commit crimes. Generally speaking, they are most likely to be involved in criminal activity in their late teens and early 20’s. By the time they hit 35, most have virtually stopped committing crimes. For that reason, many scholars argue, long sentences have limited usefulness as a way to incapacitate prisoners.

That’s especially true of the current rash of “three strikes” laws, says Michael Tonry, a law professor at the University of Minnesota. By the time a person is convicted of a third serious offense, he’s typically around 30 or 35; it makes little sense to throw him in prison for life, as many such laws require.

Most criminal-justice scholars agree that the research on incapacitation offers little solid evidence for the belief, apparently widespread among politicians and members of the public, that locking up more people for longer terms will proportionately cut down on crime. Criminologists as a group are politically somewhere left of center; most of them believe that, for many offenders, prison is costly, unnecessary, and overly punitive.

‘They Live It Every Day’

Mr. Dilulio blames the liberal biases of the criminal-justice field. He argues that many criminologists are more in touch with the interests of prisoners than with the interests of victims. He points to what he believes is a tendency in the literature to trivialize nonviolent crime.

“You can’t tell the American people it’s not a problem,” he says, “because they live it every day.”

Many other criminologists blame political expediency and the need in recent years for all politicians to act tough on the issue of crime. They point to the continuing policy of putting more and more drug offenders away, in the face of overwhelming evidence that doing so has little effect.

“If something fits with our view of the world,” says Mr. Zimring of Berkeley, “our willingness to assume it works has no limit.”

Unit 9 On Being Black and Middle Class

The black middle class has always defined its class identity by means of positive images gleaned from middle- and upper-class white society, and by means of negative images of lower-class blacks. This habit goes back to the institution of slavery itself, when “house” slaves both mimicked the whites they served and held themselves above the “field” slaves. But in the 60’s the old bourgeois impulse to dissociate from the lower classes (the “we-they” distinction) backfired when racial identity suddenly called for the celebration of this same black lower class. One of the qualities of a double bind is that one feels it more than sees it, and I distinctly remember the tension and strange sense of dishonesty I felt in those days as I moved back and forth like a bigamist between the demands of class and race.

Though my father was born poor, he achieved middle-class standing through much hard work and sacrifice (one of his favorite words) and by identifying fully with solid middle-class values – mainly hard work, family life, property ownership, and education for his children (all four of whom have advanced degrees). In his mind these were not so much values as laws of nature. People who embodied them made up the positive images in his class polarity. The negative images came largely from the blacks he had left behind because they were “going nowhere.”

No one in my family remembers how it happened, but as time went on, the negative images congealed into an imaginary character named Sam who, from the extensive service we put him to, quickly grew to mythic proportions. In our family lore he was sometimes a trickster, sometimes a boob, but always possessed of a catalogue of sly faults that gave up graphic images of everything we should not be. On sacrifice: “Sam never thinks about tomorrow. He wants it now or he doesn’t care about it.” On work: “Sam doesn’t favor it too much.” On children: “Sam likes to have them but not to raise them.” On money: “Sam drinks it up and pisses it out.” On fidelity: “Sam has to have two or three women.” On clothes: “Sam features loud clothes. He likes to see and be seen.” And so on. Sam’s persona amounted to a negative instruction manual in class identity.

I don’t think that any of us believed Sam’s faults were accurate representations of lower-class black life. He was an instrument of self-definition, not of sociological accuracy. It never occurred to us that he looked very much like the white racist stereotype of blacks, or that he might have been a manifestation of our own racial self-hatred. He simply gave us a counterpoint against which to express our aspirations. If self-hatred was a factor, it was not, for us, a matter of hating lower-class blacks but of hating what we did not want to be.

Still, hate or love aside, it is fundamentally true that my middle-class identity involved a dissociation from images of lower-class black life and a corresponding identification with values and patterns of responsibility that are common to the middle class everywhere. These values sent me a clear message: be both an individual and a responsible citizen, understand that the quality of your life will approximately reflect the quality of effort you put into it, know that individual responsibility is the basis of freedom and that the limitations imposed by fate (whether fair or unfair) are no excuse for passivity.

Whether I live up to these values or not, I know that my acceptance of them is the result of lifelong conditioning. I know also that I share this conditioning with middle-class people of all races and that I can no more easily be free of it than I can be free of my race. Whether all this got started because the black middle class modeled itself on the white middle class is no longer relevant. For the middle-class black, conditioned by these values from birth, the sense of meaning they provide is as immutable as the color of his skin.

The discomfort and vulnerability felt by middle-class blacks in the 60’s, it could be argued, was a worthwhile price to pay considering the progress achieved during that time of racial confrontation. But what may have been tolerable then is intolerable now. Though changes in American society have made it an anachronism, the monolithic form of racial identification that came out of the 60’s is still very much with us. It may be more loosely held, and its power to punish heretics has probably diminished, but it continues to catch middle-class blacks in a double bind, thus impeding not only their own advancement but even, I would contend, that of blacks as a group.

The victim-focused black identity encourages the individual to feel that his advancement depends entirely on that of the group. Thus he loses sight not only of his own possibilities but of the inextricable connection between individual effort and individual advancement. This is a profound encumbrance today, when there is more opportunity for blacks than ever before, for it reimposes limitations that can have the same oppressive effect as those the society has only recently begun to remove.

It was the emphasis on mass action in the 60’s that made the victim-focused black identity a necessity. But in the 80’s and beyond, when racial advancement will come only through a multitude of individual advancements, this form of identity inadvertently adds itself to the forces that hold us back. Hard work, education, individual initiative, stable family life, property ownership – these have always been the means by which ethnic groups have moved ahead in America. Regardless of past or present victimization, these “laws” of advancement apply absolutely to black Americans also. There is no getting around this. What we need is a form of racial identity that energizes the individual by putting him in touch with both his possibilities and his responsibilities.

It has always annoyed me to hear from the mouth of certain arbiters of blackness that middle-class blacks should “reach back” and pull up those blacks less fortunate than they – as though middle-class status were an unearned and essentially passive condition in which one needed a large measure of noblesse oblige to occupy one’s time. My own image is of reaching back from a moving train to lift on board those who have no tickets. A noble enough sentiment – but might it not be wiser to show them the entire structure of principles, effort, and sacrifice that puts one in a position to buy a ticket at any time one likes? This, I think, is something members of the black middle class can realistically offer to other blacks. Their example is not only a testament to possibility but also a lesson in method. But they cannot lead by example until they are released from a black identity that regards that example as suspect, that sees them as “marginally” black, indeed that holds them back by catching them in a double bind.

To move beyond the victim-focused black identity we must learn to make a difficult but crucial distinction: between actual victimization, which we must resist with every resource, and identification with the victim’s status. Until we do this we will continue to wrestle more with ourselves than with the new opportunities which so many paid so dearly to win.

Unit 10 Depression and How to Beat It

The march of science has produced this arresting tidbit: Though most of us are in a blah or foul mood three days out of ten, an annoying 0.5% of the population is in a good mood all the time. And just your luck, one of those happy-go-lucky types works in the adjoining office. Bill is perky. Bill is chatty. Bill, in fact, is getting on your nerves – more and more so, all the time. And if Mister Happy Face slaps you on the back one more goddamn morning and bubbles about work, he’s gonna get a fat lip.

Except lately you haven’t had the energy to bust anybody’s lip. It’s a colossal effort to drag yourself out of bed. Bill and everybody else make it impossible to concentrate. The headaches and lack of sleep just make it worse. Weekends are no better. Golf used to be fun, but it turned into such a useless, boring game. Work stinks. Home stinks.

What to do? Bark a few orders to the staff to show them who’s boss, close the office door, and hope no one notices you can’t get anything done? Jump on a plane and hide out touring the regional offices? An alarmingly pleasant thought flashes by: “Maybe this misery will end if the plane loses a wing and …”

Warning signal, warning signal: Pal, it sounds as if you’ve got more than the ordinary blahs or even burnout. Quite possibly, you are in the throes of a very common illness: major depression. Because it masquerades as a dozen different ailments – including backache, stomach problems, anxiety – and because it often gets dismissed as a touch of common blues, true depression is far more prevalent than most people, including your family doctor, realize. Goof-offs and bumblers aren’t necessarily the ones who get clobbered either. Abraham Lincoln, J.P. Morgan and Winston Churchill got it, too. Says Jeffrey Lynn Speller, a Harvard-trained psychiatrist practicing near Boston who specializes in depressed execs: “Often it hits the most ambitious, creative, and conscientious.”

The bad news: Depression can be a dreadful, even fatal, disease. It screws up careers and marriages. Or, as in the case of President Clinton’s lawyer friend Vincent Foster Jr., it can lead to suicide.

The good news: Depression is one of the most easily treated emotional ailments. More than 80% of depressives can recover, most within a few weeks, thanks to the variety of effective treatments that have become available. Doctors have a far better handle than they once did on when psychotherapy is useful and when it isn’t helpful. They also have at their disposal a new generation of antidepressant drugs that come with fewer side effects, like weight gain, and are safer if taken in excess than their predecessors. So good are these newer antidepressants, in fact, that some mental health experts worry that normal people might use them to get personality improvements of the kind described in Listening to Prozac, a current bestseller.

How widespread is depression? Estimates vary, but indications are that about 15% of the U.S. population – one man in ten and one woman in five – will have a serious depression at some time in their lives, usually before they hit their 40s. At any given moment, about 3% of men and – probably because of a mix of greater life stresses and subtle differences in brain chemistry – 6% of women are depressed. Studies show that the incidence of depression has been rising sharply among people born since the 1940s. Baby-boomers and busters are three to six times more likely to report a depression than someone born at the turn of the century. Reasons for the increase are not clear, but are more than just a greater awareness of depression among young people and a willingness to admit it. Some experts theorize that successive generations have higher expectations from life and are more likely to be disappointed; others blame diminished family stability.

Depression comes in different forms. Manic depression, for example, is characterized by moods that swing from wild euphoria to deep despair. About 1% of the population – as many men as women –suffers from this disorder. A variant is something called chronic hypomania, which is more likely to hit highly intelligent people than average ones. Hypomanics may go for years with extraordinary energy, remarkable creativity, and a talent for synthesizing seemingly disparate bits of data. If their condition changes, however, which it often does, it veers toward depression 70% of the time and toward mania most other times. Speller says half the execs he treats are hypomanics who have fallen into depression.

Corporate turmoil and the flat economy are also feeding the increase in depression. More executives and professionals are seeking help than in the go-go 1980s, at least according to anecdotal evidence supplied by shrinks. This trend is supported by one of the few surveys of depression among corporate types. Some four years ago, experienced clinicians interviewed more than 1,800 managers and engineers, mostly white males in their 40s, who worked at Westinghouse Electric, a company then in upheaval. Evalyn Bromet, a psychiatry professor at the State University of New York at Stony Brook, who ran the survey, describes the results as “astounding.” Bromet found 23% of the people interviewed had experienced a major depression in their lives; of the 23%, nearly one out of ten had been depressed in the preceding 12 months. “All the literature said depression was more prevalent among poorer or less educated people,” she says. “That was not the case.”

Scientists are increasingly persuaded that what depressed people suffer from is usually a biochemical problem, a bit like diabetes, caused by an imbalance in the brain. The imbalance is triggered often, but not always, by stressful life events – certainly a personal loss, but sometimes even positive news like a promotion, a baby, or a new home. A tendency to depression is not a weakness or a self-indulgence, but runs in families; indeed, it’s probably a genetic trait. The parents, siblings, and children of a depressed person are four times more likely to get depressed than a nonrelative; his identical twin is at ten times the risk.
