Tuesday, October 7, 2008

Are Whites More Intelligent than Blacks?

YES, say many people. Whites, as a race, have inherited more intelligence than blacks.

William Shockley, a Nobel laureate in physics, strongly asserts that this is so. He says: “My research leads me inescapably to the opinion that the major cause of American Negroes’ intellectual and social deficits is . . . racially genetic in origin.”

Professor Arthur R. Jensen of the University of California in Berkeley is a leading exponent of the view that in intelligence whites are biologically superior to blacks. He declares: “The number of intelligence genes seems to be lower, overall, in the black population than in the white.”

What is the basis for such claims?

Basis for Claims

Inheritance, many will point out, has a lot to do with racial differences. Blacks have inherited dark skin, thick lips and kinky hair, and whites have inherited strikingly different features. So, if whole groups of people have inherited such different physical characteristics, it is only reasonable, some will argue, that the races would inherit different degrees of intelligence. But do they? Why is it claimed that blacks, as a race, have inherited less intelligence than whites?

The claim rests principally on results from Intelligence Quotient (IQ) tests. In these tests blacks score, on the average, about 15 points lower than whites. Even when whites and blacks of a similar social and economic status are tested, the scores of whites average significantly higher than do the scores of blacks. So Jensen concludes from such evidence “that something between one-half and three-fourths of the average IQ difference between American Negroes and whites is attributable to genetic factors.”

The results from IQ tests, coupled with conclusions based on the evolution theory, have reinforced the opinion of many that blacks are mentally inferior. Some scientists have argued that the races evolved, to a large extent, independently over hundreds of thousands of years. Blacks, it is claimed, crossed the evolutionary threshold into the category of Homo sapiens later than whites.

Since IQ tests today are the principal basis for the claim that blacks are inherently less intelligent than whites, let us look at those tests.

Intelligence and IQ Tests

First of all, what is meant by intelligence?

That is a surprisingly difficult question to answer. A great many different qualities might be called intelligence. People may be “intelligent” in one context, perhaps being able to memorize names and dates easily, but be “stupid” in another, such as in doing arithmetic problems. So there is no universally accepted definition of what intelligence is.

What about IQ tests, then? Do they measure intelligence? Commenting on this, Patrick Meredith, professor of psychophysics at Leeds University, England, said: “It might be held that Frenchmen are brighter than pygmies, but if you see pygmies in their natural environment making bridges out of fibre and living life successfully you might ask what you mean by intelligence. The IQ rating is no indication of how a person will behave in a defined situation. The IQ test is a totally unscientific concept.”

It is generally agreed that IQ tests fail to give a complete picture of the many factors involved in intelligence. Circumstances and backgrounds of peoples are too varied for them to be able to do this. What, then, do IQ tests measure?

Arthur Whimbey, professor of psychology at a university in the southern United States, observes: “Studies lead to the conclusion that IQ tests do not measure innate intellectual capacity, but rather a group of learned skills that can be taught in the classroom or in the home.”

To confirm this, it has been demonstrated that persons can be taught how to take IQ tests, with startling results. One investigator reports that a young Mississippi black student was given instruction about taking such tests, and in six weeks he raised his IQ score dramatically.

You can easily imagine the wrong conclusions a person might draw from IQ scores, and the effects this can have. An American black, who is now a university professor, writes:

“At 15 I earned an IQ test score of 82 . . . Based on this score, my counselor suggested that I take up bricklaying because I was ‘good with my hands.’ . . . I went to Philander Smith College anyway, graduating with honors, earned my master’s degree at Wayne State University and my Ph.D. at Washington University in St. Louis. Other blacks, equally as qualified, have been wiped out.”

Yet, the fact remains that whites score, on the average, 15 points higher than do blacks on IQ tests. Why? If one is going to argue that blacks are innately just as intelligent as whites, then why don’t they score better than they do?

Examining the Question in Context

There are many factors that can account for their lower average IQ scores. In particular, American blacks have been greatly disadvantaged by their treatment by whites as inferiors, and as undesirables. Former Supreme Court Chief Justice Earl Warren illustrated modern racial attitudes in an April 1977 Atlantic article.

When the Supreme Court’s school segregation decision was pending in the mid-1950’s, President Dwight Eisenhower of the United States invited Warren to a White House dinner for the purpose of influencing him to decide in favor of upholding the segregation law. “The President,” Warren writes, “took me by the arm, and, as we walked along, speaking of the southern states in the segregation cases, he said, ‘These [Southerners] are not bad people. All they are concerned about is to see that their sweet little girls are not required to sit in school alongside some big overgrown Negroes.’”

As voiced by this president, whites have commonly attempted to “keep blacks in their place”—in a segregated, subordinate position cut off from the benefits enjoyed by whites. During slavery, and later during legalized segregation, this was easy to do. Blacks who stepped out of line were whipped, lynched or otherwise punished. The effect was to produce the childlike, subservient, mentally slow “Sambo” personality. Whites have commonly believed that this personality was inherent in blacks. However, Harvard professor Thomas F. Pettigrew explains:

“No African anthropological data have ever shown any personality type resembling Sambo; and the concentration camps [in Nazi Germany] molded the equivalent personality pattern in a wide variety of Caucasian prisoners. Nor was Sambo merely a product of ‘slavery’ in the abstract, for the less devastating Latin American system [of slavery] never developed such a type.”

Thus, IQ test results must be considered in this context of over 300 years of oppression during which many blacks, for their own defense and survival, adopted a subservient personality. And remember, until the latter part of the last century it was against the law in many places of the United States for blacks to learn to read or write. Even since then, blacks, taken as a whole, simply have not had the same educational opportunities as whites.

Effect of Environment

The quality of preschool home education also bears directly on intellectual achievements. It is of interest that the full 15-point IQ gap is manifest in the United States between black and white children by age five, even before they go to school. Some may claim that this is proof that blacks are born with less intelligence than whites, but there is evidence that other factors can be responsible.

Early childhood is a principal period of intellectual growth. Dr. Benjamin Bloom of the University of Chicago, as well as other educators, maintains that by the time a child reaches age five he has undergone as much intellectual growth as will occur over the next thirteen years. In keeping with such conclusions, Science News Letter observes: “During the early years, a child’s intelligence can be greatly influenced by a responsive environment conducive to learning and exploring.”

But consider the home situation of many American blacks. Their families are more frequently disrupted than are white families. The father is often not at home, perhaps being forced to look in another area for employment. Often, in black families, the mother alone must rear the children. Under such circumstances, can it be expected that the young will be provided the early educational training that will equip them to match the intellectual achievements of whites?

Further, recent studies show that in larger families, black or white, where parents usually give less individual attention to their children, the children have lower IQ scores. Since black families are, on the average, larger than white ones, this may also be a contributing factor to blacks’ lower intellectual achievements.

Another factor to consider is that home environments are not the same—white and black cultures are significantly different. And traditional IQ tests have clear cultural biases that favor whites. As an example, a Stanford-Binet picture test showed a prim-looking white woman and a woman with Negroid features and slightly unkempt hair. The child was marked “right” for picking the white woman as “pretty,” and “wrong” if he picked the black.

Another thing to keep in mind is that a large number of blacks have achieved IQ scores well above the average score of all whites. In fact, during World War I blacks from certain parts of the northern U.S. scored higher on IQ tests than whites from certain parts of the South, which would indicate that blacks are not born with lesser intelligence. Theodosius Dobzhansky, an American biologist, made this telling observation: “The race differences in the averages are much smaller than the variations within any race. In other words, large brains and high I.Q.’s of persons of every race are much larger and higher than the averages for their own or any other race.”

The book Intelligence—Genetic and Environmental Influences, edited by medical doctor and university professor Robert Cancro, examines at length environmental factors that contribute to the lower intellectual achievements of blacks. In view of all the disadvantages blacks have had, the writers conclude: “It is really surprising to find the mean IQ of black Americans only 15 points below that of white Americans. No reason whatever exists to consider this discrepancy as biologically inevitable.”

The well-known anthropologist Ashley Montagu reached a similar conclusion. He writes: “If nutrition is poor, health care deficient, housing debasing, family income low, family disorganization prevalent, discipline anarchic, ghettoization more or less complete, personal worth consistently diminished, expectations low, and aspirations frustrated, as well as numerous other environmental handicaps, then one may expect the kind of failures in intellectual development that are so often gratuitously attributed to genetic factors.”

Montagu concludes: “There is no evidence that any people is either biologically or mentally superior or inferior to any other people in any way whatever.”

Yet is there proof that the difference in average IQ scores of the races is not due to whites inheriting more intelligence than blacks?

Conclusions from the Evidence

There is no proof that whites either have, or have not, inherited more intelligence than blacks. What is clear, however, is that environment has a big effect on intellectual development. In Israel, for example, deprived Oriental Jewish children, who were placed in communes called kibbutzim and brought up collectively, showed higher IQ’s than children of the same background reared by their parents. Also, American Indian children reared in white foster homes obtain significantly higher IQ’s than their brothers and sisters on the Reservation. But does the same hold true for blacks?

A recent study of black children reared in white homes revealed that it does. The study, which included over a hundred white families who adopted black children at an early age and reared them in their homes, showed that the IQ’s of these blacks compared favorably with those of whites. “Overall,” write the investigators, “our study impressed us with the strength of environmental factors. . . . If a different environment can cause the IQ scores of black children to shift from a norm of 90 or 95 to 110, then the views advanced by the genetic determinists cannot account for the current IQ gap between blacks and whites.”

The weight of scientific opinion, therefore, seems to be that the lower average IQ scores of blacks can be explained largely, if not entirely, by environmental factors. In the book The Biological and Social Meaning of Race, Frederick Osborn of the Population Council of New York sums up: “Only one conclusion is possible from the studies which have been made to date. Differences in test intelligence between the major races are no greater than can be accounted for by the known differences in their environments. On this there is general scientific agreement.”

It is of interest that, as opportunities have opened to them, more and more blacks are succeeding in fields of business, education, medicine, and so forth.

Yet, it must be acknowledged, the question of the relative intelligence of the races cannot be positively determined. The evidence is now inconclusive, open to various interpretations, as one writer noted: “A hundred different conclusions can, and have been, drawn from the same body of evidence. The conclusion one arrives at depends as much on emotion as reason.”

So, then, why bring up the matter of IQ scores in an attempt to prove that blacks are less intelligent than whites? Steven Rose, professor of biology at the Open University, England, explains why some people do: “The question of the genetic basis of racial or class differences in IQ . . . achieves meaning only in a racist or classist society attempting to justify its discriminatory practices ideologically.”

As a result of the storm of controversy over the alleged lower inherent intelligence of blacks, the National Academy of Sciences declared: “There is no scientific basis for a statement that there are or that there are not substantial hereditary differences in intelligence between Negro and white populations. In the absence of some now-unforeseen way of equalizing all aspects of the environment, answers to this question can hardly be more than reasonable guesses.”

One thing is certain, however, and that is that there is no sound basis for viewing people of another race as inferior. Without making any distinctions as to race, the Bible wholesomely advises us to have “lowliness of mind considering that the others are superior to you.”—Phil. 2:3.

But still there are persistent views that hinder persons from applying this fine Scriptural counsel. A prominent one is that persons of other races than one’s own have an objectionable body odor.


[The environment in which children grow up affects their intellectual development]

Races Are Strikingly Different

IT WAS 1955, at an international gathering in Nuremberg, Germany. A group of Europeans had surrounded a couple of American blacks, visibly happy to have them. They rubbed their skin and felt their hair. Apparently they had never seen a black person before and were intrigued by the striking differences. The blacks enjoyed being warmly accepted. Back home, however, racial attitudes had developed over the centuries to create a much different situation.

Consider the Spencers, a black family who moved into a nice section of New York City. It was the eve of 1975. A pipe bomb came flying into their house, with a note attached: NIGGER, BE WARNED. “It was intended to wipe out the family,” the police captain who investigated said.

A reporter, who later spoke with white residents, explains: “I kept pressing: why don’t you want blacks here? ‘If you really want to know,’ answered the fellow with the flag, ‘they’re basically uncivilized. Wherever they go, the crime rate goes up, neighborhoods fall apart, whites have to leave.’”

Many whites feel differently about association with blacks, developing friendly relations with them. In the southern United States fine strides have been made in improving race relations. Many schools and other public places have been racially integrated. Yet, there are still many persons who feel that differences in the races are so great that they warrant racial segregation.

Basis for Segregation?

In 1954 the United States Supreme Court ruled against racial segregation in the public schools. But many Americans do not agree with that decision. Nor do they agree with the Court’s 1969 order for public-school districts to desegregate “at once.” This is evidenced by the fact that in the late 1960’s a larger percentage of black children attended predominantly black schools than in 1954!

Also, there are many persons in the United States who don’t agree with the Supreme Court’s 1967 ruling that it is unconstitutional “to prevent marriages between persons solely on the basis of racial classifications.” This decree invalidated all laws in the United States against interracial marriages. Yet people are still commonly heard to say that they don’t believe blacks and whites should marry.

The situation in the churches is further evidence that many persons believe racial differences warrant segregation. Kyle Haselden, as editor of The Christian Century, wrote in 1964: “Everyone knows that 11 o’clock on Sunday morning is the most segregated hour in American life.” And segregation persists. This year the minister of the Plains, Georgia, Baptist Church “said his resignation stemmed from ‘backlash’ over his efforts to integrate the church.”—New York Post, February 22, 1977.

Although much progress has been made in improving race relations, some persons have recently seen causes for discouragement. A black, writing in The Christian Century of April 28, 1976, said: “I am worried, really worried, about the serious deterioration in relations between blacks and whites. Black friends share their sense of frustration and powerlessness with me.”

There is often a polarizing, with races harboring hostility and sticking to themselves. As the above writer noted: “I went for a walk on the Yale campus. Two white students joined me. They complained of being forced into segregation by their black classmates who chose to live and take their meals alone, and to maintain little or no social intercourse with their white male peers.”

How Great the Differences?

Really, how great are racial differences? Are they of such a degree that people of different races cannot live together as equals, and take real pleasure in one another’s company? For example, is there a big gap between the intelligence of people of various races? Or, do the races have a distinct body odor, making it objectionable for blacks and whites to live in close quarters with one another?

Obviously differences do exist. Skin color and texture of hair are among the most observable. There are also differences in the shape of the nose, eyelids and lips. Thick lips are common among blacks, while persons of other races tend to have thinner lips.

Yet some whites are quick to point to what they call “more important differences.” As noted earlier, it is claimed that blacks are “basically uncivilized.” It is said that “they have looser morals.” Higher illegitimacy rates among them are given as evidence for this claim. But there are more assertions that are commonly made.

Some are: “Blacks care less for family.” And, as evidence of this, the higher rate of separations in black families is pointed to. “Crime rates go up when blacks move in; neighborhoods fall apart.” To support this statement, persons will point to black neighborhoods that are generally more run down, and to statistics that show that, proportionately, blacks commit more crimes. “Blacks are less intelligent than whites.” And it is a fact that, on the average, blacks score lower on IQ tests than whites of comparable socio-economic status and generally do poorer in schoolwork.

But why do blacks show up unfavorably in such comparisons? A publication of the United States Commission on Civil Rights put the matter in focus. It said that the obvious inferior “status of nonwhites can result from only two factors. Either nonwhites are inferior as persons, or white racism has prevented their natural equality with whites from asserting itself in actual attainments during their more than 300 years in America.”—Racism in America—How to Combat It.

What do you believe is the answer?

The Once Prevalent View

At one time the prevailing view was that blacks are inferior as persons. The Encyclopædia Britannica, Ninth Edition, 1884, said: “No full-blood Negro has ever been distinguished as a man of science, a poet, or artist, and the fundamental equality claimed for him by ignorant philanthropists is belied by the whole history of the race throughout the historic period.” It also spoke of “the inherent mental inferiority of the blacks, an inferiority which is even more marked than their physical differences.”

This encyclopedia said that, as children, blacks and whites seem to have equal intelligence. “Nearly all observers admit,” it notes, “that the Negro child is on the whole quite as intelligent as those of other human varieties.” However, it was said that in blacks there is a “premature ossification of the skull, preventing all further development of the brain.” Thus, the Britannica asserted: “On arriving at puberty all further progress [of blacks] seems to be arrested.” Chambers’ Encyclopædia, 1882, although not agreeing with the Britannica, spoke of the view “that the Negro forms a connecting link between the higher order of apes and the rest of mankind.”

The view that blacks are inferior as persons is still held by some; it is by no means dead. One person wrote of the common views held where he lived: “I grew up in a southern rural community where it was said that black people are black because of a curse God placed on them. . . . In fact, it was said that black people were not really people after all but a part of the animal kingdom.”

Even certain men of science today hold that blacks are biologically inferior to whites. In 1974 a long work of authoritative appearance, endorsed by leading educators, argued in favor of this view. Of the writer, John R. Baker, The Guardian of April 6, 1974, said: “He is skilled at piling up, ostensibly as data, quotations and references which, taken with the powerfully repulsive atmosphere generated by the style, would convey to any reader quite unacquainted with any ‘Negrids’ an impression of them as subhuman (for example, ‘Long says that the Negroes are distinguished by their “bestial or fetid smell”’).”

So what about racial differences? Really, how great are they?

Stay Clean, Stay Healthy!

MAN’S struggle to stay healthy has been waged almost since the dawn of history. But it has been a ‘losing battle’ against disease, plague and epidemic. Despite advances in science and medicine, people continue to get sick and die.

At one time, it was thought that diseases were caused by evil spirits, and physicians fought this influence with charms or incantations, even with bloodletting. Sometimes herbs were used, doubtless with greater benefit. It was the discovery of germs, however, that resulted in more successful treatments of sick people. And this led to an understanding of the relationship between good health and cleanliness.

Today it is understood that many maladies—the communicable diseases—are the result of three factors: the agent, the environment, and the host. The agent is the original cause of the sickness. Disease agents include bacteria (causing such maladies as typhoid fever and cholera), protozoa (resulting in diseases like amoebic dysentery), viruses (causing polio, infectious hepatitis, and so forth), parasites (causing malaria, and so forth), and fungi (responsible for problems like athlete’s foot). There are also nonmicrobic agents like lead and mercury, which can cause poisoning.

The disease agent exists in what are called reservoirs. These may be an already sick person, a carrier (someone who carries the agent, but who has no symptoms of the disease), an animal, or even a part of the inanimate environment. When the agent is expelled from the reservoir—by coughing, sneezing or in some other way—it may be picked up and transported to a potential host, that is, someone who is susceptible to catching the disease. If the agent finds the right way into the host, illness will result. The importance of the way that the agent enters is seen in the case of tetanus. If the germ enters through the mouth, it is harmless. However, if it gets in through a deep cut in the skin, the host probably will become sick with the disease.

Today men try to break this chain of disease transmission by sanitation. By this means they endeavor to control the environment so as to prevent the disease agent from getting to a new host. The relative success of this approach has been seen in many countries where garbage has been disposed of properly, sewage has been treated and the government has been able to provide for a clean water supply. In these lands diseases like typhoid fever, cholera and plague have almost been eliminated. Even in the more developed nations, however, people still fall victim to communicable diseases like influenza. Especially is this true during times of crisis, when public services break down and diseases can surface once again. These facts emphasize that sanitation is not just a government responsibility. All of us should be aware of how disease travels and what we individually can do to prevent it.

Spreading by Touch

The world today is in the grip of a pandemic of venereal disease, spread almost entirely by the direct contact of sexual intercourse. These sexually transmitted infections are among the principal diseases spread by contact transmission.

Controlling venereal diseases is largely a matter of moral cleanness, while physical cleanliness will help to prevent the spread of many other maladies. (1 Cor. 6:9, 10) Regarding the latter, one doctor said: “Washing your hands after using the toilet and before eating should be as automatic as breathing.” As a matter of fact, diseases spread by contact transmission should be the easiest for an individual to avoid.

Food and Water

Humans use automobiles or buses as vehicles for travel. Similarly, disease agents can travel in vehicles—water, milk or even food. This is called vehicle transmission. Milk, so good for growing children, may be a disease carrier if it comes from a dirty or infected animal, which is why, in Western lands, milk must be pasteurized. Many people prefer to boil milk if there is any doubt about it. Food can carry sickness if prepared by unwashed hands or if it has been in contact with rodents or insects. But maybe the most commonly contaminated material is water. We cannot live more than four or five days without it, but if our drinking water is contaminated, it will be a vehicle of entry into our body for countless millions of disease agents. And what disease agents can travel in water? Bacteria, protozoa, worms, viruses and nonmicrobic poisons.

Nowadays, many modern cities are supplied with chemically treated water; but drinking water should never be taken for granted, especially in times of flood, earthquake or similar crisis. In case of doubt, it is wise to treat water perhaps with chloride of lime, or, if that is not available, tincture of iodine. In the absence of these substances, it can be sterilized by boiling for at least ten minutes. Remember, though, that water can be contaminated after boiling as well as before. So the sterilized water should be kept in a clean and protected place until it is used.

In the countryside, particularly in developing lands, households rely on different water sources that must be protected from contamination. Those using rainwater, for example, should be sure that no dirt gets washed into the storage tank along with the rainwater. Also, the tank should be protected from insects, rodents and other animals. Persons relying on surface water, such as streams or brooks, are almost certainly drinking polluted water. It is nearly impossible to protect these from contamination by animals or runoff (rainwater running in from the surface of the ground). The only exception might be a fairly fast-running spring-fed stream where the water looks clean and sparkling and where there are no residents on the watershed spilling pollution into it.

Naturally occurring springs are better, although most householders build a concrete cover around these to protect them from animals and surface runoff. Possibly the best sources, however, are wells, particularly deep wells. Shallow ones need to be examined to make sure that they are not being contaminated by someone’s latrine. Even deep wells can be polluted by surface water runoff. Therefore, many well owners build a small platform around the well, to prevent the surface water from getting in.

Remember, too, that clean water is easily polluted. Even if it comes from a clean well, the water is not fit to drink if it is carried in a dirty container or comes in contact with dirty hands.

Another class of vehicles on which germs can travel is fomites. These are objects (such as towels or cups) that come in contact with a sick person, and then with someone else. The new handler or user inherits the payload of disease agents left by the previous individual. Fomites should be washed in boiling water to make them harmless.

Insects and Vermin

Between the years 1347 and 1350 C.E., from a quarter to a half of the whole population of Europe died of the Black Death. This disease, also called bubonic plague, is one of the many maladies spread by what is called vector transmission. “Vector” means “carrier,” and in the field of sanitation it denotes an animal or an insect that carries the disease agent to the new host. Mostly, vectors are insects. Some, like the rat fleas that spread bubonic plague and the mosquitoes that carry malaria, actually inject the disease into the new host by biting or piercing the skin. Others, such as flies and cockroaches, walk on contaminated areas, particularly human excrement, and then walk on food, or on areas where food is being prepared. Diseases like cholera and typhoid fever can be spread in this way.

To protect themselves from malaria-carrying mosquitoes, many people in the tropics sleep under a mosquito net. Governments have tried to limit the breeding of such mosquitoes by eliminating their breeding places. Householders can cooperate with these efforts by removing potential ‘breeding grounds’ in or near their homes—things such as bottles with water in the bottom, stagnant puddles or even drains not properly covered.

Certain insects are a bigger problem. In some places, such creatures as cockroaches and flies are not regarded as enemies, just as nuisances. But they truly are health hazards, and their movement in a home should be prevented as much as possible. Dirty kitchens, however, with cracks or holes where insects can hide, are like a playground for them. Garbage not properly covered is an open invitation to flies, cockroaches and vermin. Also, hogs raised near the house encourage flies to congregate. By all means, insects and rodents should be kept away from family members and from food. You can never tell where they have been!

Clean habits, then, will help to break this link in the chain of infection. Another way to reduce the potential for harm from vectors is by seeing to the proper disposal of human waste or excrement. To persons living in cities having proper sewage disposal facilities, this may not seem like a problem. But in many parts of the world, diseases like cholera, typhoid and dysentery are spread because of improper waste disposal. In this regard, when the Israelites were wandering in the wilderness, they were commanded to go to a private place outside the camp, dig a hole with a peg, and bury their excrement there. (Deut. 23:12-14) It may be noted that when one digs into the soil, the first few feet are teeming with tiny organisms that will quickly work on the waste and render it harmless. If the waste is left on the surface, however, insects can crawl over it and carry diseases back to the household. Also, if it is left untreated and is used as fertilizer, such disease agents as amoebas and worms are likely to be transferred onto the food crop being fertilized.

So, burying is the best way of handling this problem if there are no sewage facilities. Of course, if there is a family living in one place and not moving around like the Israelites, something a little more sophisticated will be needed than just a peg or stick to dig a hole! It is surprising, however, how simple it is to make a sanitary toilet. A pit dug about six feet (2 meters) deep, and three feet (1 meter) square, raised around the top to keep surface water from draining in, with a floor cover and a seat that can be covered to prevent insects and rodents from entering, will satisfactorily serve a family for some years. Of course, more sophisticated units can be used if money is available. But there is one thing to watch. These facilities should be built well away (and, if possible, downhill) from any water source.

Carried in the Air

After the trauma of the first global war, in 1918 the world faced another grim experience. In one year the Spanish flu killed ten million more persons than had died during the entire war. Most of those who suffered from the sickness probably caught it from the very air they breathed. Influenza is one of those diseases communicated by means of what is called aerial transmission. When an infected person sneezes or coughs, he sprays the air with little droplets of water that are teeming with germs just waiting to get into a new host. Fortunately, sunlight and dryness tend to kill most germs. While they are still alive, however, they can be breathed in from the air. Aside from influenza, some diseases that can be spread in this way are tuberculosis, measles, pneumonia, scarlet fever and whooping cough. Yet, the spread of these illnesses can be lessened greatly by clean personal habits, such as using a tissue or handkerchief when sneezing (and disposing of the tissue in a sanitary way) and not spitting indiscriminately.

Yes, indeed, sanitary, or clean, habits have a part to play in the matter of staying healthy. In many cases, of course, the good habits we have may prevent our disease from being spread to someone else, whereas others may not be so considerate. However, the principle of ‘loving your neighbor as yourself’ surely will guide a Christian in this regard. (Matt. 22:39) True, some people become fanatical in the matter of cleanliness and sanitation; so the spirit of a sound mind is needed too. We can be sanitary, but we cannot live in an antiseptic environment. Besides, Jehovah God has provided wonderful power right within our own bodies to overcome the attacks of most diseases. Yet, it is wise and loving to be reasonably clean and sanitary, and thus not spread germs unnecessarily.

Attention to sanitation and cleanliness will help us, though this will not remove sickness from the earth. For that, Christians patiently await God’s new order wherein Jehovah will remove sickness and other distresses afflicting mankind. At that time, there will be a full realization of the Bible’s promise that “no resident will say: ‘I am sick.’” (Isa. 33:24) Then, finally, man’s struggle to stay healthy will have been won.

A Search for an Identity

MAN has always been interested in his genealogy. The Bible itself provides a complete record of Jesus Christ’s ancestry going all the way back to the first man, Adam. The Jews, as a nation, meticulously preserved genealogical records, and it was one of their major tragedies that these records were destroyed when Jerusalem was laid waste by the Roman armies in the year 70 of our Common Era.

The Jews’ return to Palestine and the establishing of modern-day Israel was an expression of a need for an identity—in this case, a national identity. Whereas the Jews’ quest for an established identity may have had strong political overtones, families in other nations often become caught up in such a quest so as to establish claim to the inheritance of property, to royal lineage, to descent from a famous character of history or just to find out who they are.

People all over the world now are focusing attention on what has been described as the “Black man’s search for identity.” The recent Black and African Festival of Arts and Culture (FESTAC) held in Nigeria was a noted expression of this quest.

The African Identity

FESTAC ’77 was held in Lagos, Nigeria, and ran from January 15 to February 12. It was the second gathering of its kind to be convened in Africa. The first was held in Dakar, Senegal, in 1966. FESTAC ’77 drew delegates from all the nations of Africa, black communities in the Americas, Europe and Australia, and black states outside Africa. Some 17,000 artists, dancers and intellectuals came from fifty-six countries. Interestingly, representatives from the Arab states of North Africa and from among the Aborigines and Maoris of Australasia were present—all subscribing to the “attempts of Black [and African] people to revive their culture in order to integrate themselves in a world of co-operation and conflict diplomacy.”

The rich variety of presentations at the festival included cultural and traditional dances, music and singing, dramas, films and literary presentations by black and African writers. There were exhibitions of art, literature and artifacts, as well as fashion shows and a colloquium, that is, a seminar, on the theme “Black Civilisation and Education.” The principal site of these presentations was the ultramodern National Theatre in Lagos. The colorful boat regatta drew large crowds to the waterways in Lagos to watch competitive canoe races and mock battles. And the Grand Durbar, displaying the spectacular traditional horsemanship of the tribes of northern Nigeria, took the festival to Kaduna, 500 miles (800 kilometers) from Lagos.

In summing up the aims of the festival, Dr. Emiko Atimomo said: “These aims suggest that Africa and the Black World must begin to reconstruct their societies so as to revive the lost heritage of their ancestors, because it is in so doing that co-operation can better be achieved between the Black peoples of the world and other societies of the universe.” The announced objective was to promote better international and interracial understanding, which eventually would facilitate, among black communities in foreign lands, a “return to origin.” The black communities in foreign lands are called the Diaspora.

The desire to “return to origin” was expressed throughout the festival in dramas, dances, songs and the colloquium by a rejection and condemnation of colonialism and an extolling of African culture and political emancipation. A typical example of this was seen in the musical play called “The Drum,” presented by the Somali troupe. This play traced the black man’s experience from his seemingly primitive tranquillity, through the slave trade and colonial subjugation, to his regaining of independence. This “revolt against European civilization” was considered necessary because the conviction has been expressed that “time and colonialism have cut Black Africa from its authentic culture of the past” and that the “traditional culture has been undermined by foreign religion, foreign technology, foreign culture and foreign rule.”

For this reason the scholars who took part in the colloquium appealed for “unity and the solidarity of black people in spite of their ideological differences and the diversity of their geographical and historical conditions.” They held the view that the common factor shared in the destiny of the world’s black peoples is their aspiration toward liberation, toward regaining their cultural identity and their legitimate place in the world. Therefore, recommendations were presented for cooperation in various fields, such as education, government, language and religion, with an African orientation. Black peoples in the Diaspora expressed the view that Africa is the foundation of their ethnic and cultural identity, and so it is around Africa that they intend to rebuild their unity.

Acknowledged Obstacles

While recommendations were made that Swahili be adopted as Africa’s lingua franca, that there be a revival of African traditional religion and culture, and that an ideology of African Socialism be adopted, some saw the need for caution. In his analysis of FESTAC, Dr. Opeyemi Ola said that “certain aspects of the traditional culture do not deserve to be retained or revived . . . because they are either negative or outdated.” He advocated an African technology in order for “Black Africa to move rapidly into the modern present and ultra-modern future.” Therefore, Dr. Ola recommended the establishing of a Pan-African University of Science and Technology.

Dr. Ola further cautioned that “whatever FESTAC may record today in the scoreboard of triumphs, politics may offset and neutralize it tomorrow.” This is perhaps why he later wrote that some of the leaders in “their mini-nations have been more cruel and more unfaithful to the black men under their rule than the white colonial masters!” Such leaders are seen as standing between Black Africa and transformation.

Nevertheless, the nations and communities at FESTAC felt that they had established a basis for confirming African culture as a world culture for achieving progress toward a civilization that would equal that of the already developed nations.

Unsolved Problems

Yet, modern civilization as a whole has not removed the areas of social, cultural and political stress that exist among mankind. Rather, it has enlarged and accentuated them. Indeed, civilization’s technology has been directed largely in a negative way, in the production and distribution of sophisticated weapons of aggression and defense. Moreover, the breakdown of human relations has become critical, with an increase of crime, immorality and drug addiction, and a weakening of the family structure. In fact, some aspects of the increasing crime in developing countries are viewed as a legacy of modern civilization.

Nigerian journalists now are speaking of their country as a “nation threatened from within.” They lament the increasing of violent crimes among citizens to whom “the sanctity of property and of person is a meaningless concept.” In spite of the large sums of money spent in modernizing the cities and building highways, the citizens live in fear of being victims of violent crime. Even the public execution of armed robbers has not been a completely effective deterrent to such violence.

Looking back to the social situation that existed in Nigeria prior to the colonial era and the introduction of modern civilization, writers refer to the time when “daily living was more leisurely . . . Parents, children and indeed the extended family . . . were well aware of their civic and family responsibilities. There were fewer police and fewer prisoners.”

The alarming change toward a moral breakdown has been viewed as largely an economic problem. Growing corruption and dishonesty among those who make a showy display of wealth arouse envy and greed among others, who begin feeling that they, too, must be dishonest to acquire wealth and the many possessions that modern civilization offers. The materialistic outlook further expresses itself in the ‘new morality’ and the resulting promiscuity that threatens the family structure in most countries and has made venereal disease a major epidemic. In Nigeria some have termed gonorrhea a “gentleman’s disease” because to them it appears that promiscuity is more evident among the wealthy or the scholars, who are most influenced by the social ways and materialistic philosophy of modern civilization. Not surprisingly, gonorrhea and syphilis are on the increase in this country.

A “Return to Origin” the Answer?

Of course, the world in general faces formidable social, political, racial, health and other problems. So, what should nations and individuals do? Is it desirable to dispense with modern scientific aids and laborsaving devices and ‘return to their origins’ of several centuries ago, when these things were lacking, life had greater hardships, and health hazards may have been more common?

Would it not be better to ‘return to the origin’ that Jehovah God gave the human race? God gave man a perfect start and the prospect of eternal life in an earthly paradise. Most important of all, the first man, Adam, was a “son of God.” (Luke 3:38; Gen. 1:26-28; 2:7-15) Choosing to sin, however, Adam lost his position as a son of God, and he bequeathed sin and death to his offspring. (Rom. 5:12) Only by availing oneself of the ransom sacrifice of Jesus Christ can a person again entertain the prospect of everlasting life in a restored paradise on this earth. (John 3:16; 17:3; Luke 23:43) What a “return to origin” that will be!

Soon, under the rule of God’s heavenly kingdom, a new civilization will be achieved on this earth. Man then will have full opportunity to use his intellectual capacities in various fields of endeavor. But this will be more than a new civilization. It will be a true “return to origin,” because obedient mankind will become real children of God. “For,” wrote the Christian apostle Paul, “the creation was subjected to futility . . . on the basis of hope that the creation itself also will be set free from enslavement to corruption and have the glorious freedom of the children of God.” (Rom. 8:20, 21)

The Continents Beneath Your Feet—Are They Drifting?

HAVE you ever noticed, when looking at a map of the Atlantic Ocean, how the east coast of South America seems to match the west coast of Africa? If you fit the hump of Brazil into Africa’s Gulf of Guinea, the shoreline all the way from Guyana to Argentina matches amazingly well with the line from Ghana to Cape Town. The two continents seem like pieces of a gigantic jigsaw puzzle.

Perhaps when you noticed this, the thought crossed your mind that at one time South America and Africa may have actually been joined, and that somehow they split and drifted apart. If so, you probably dismissed the idea as preposterous, just a curious coincidence.

But do you know that this idea is now considered seriously by most geologists? A theory that proposes that the continents actually move about over the earth’s fluid mantle, beneath its rigid crust, has, since the 1960’s, won general acceptance.

Theory of Continental Drift

The theory was first proposed, not by a geologist, but by a meteorologist in Germany, named Alfred Wegener. He suggested that, not only had South America and Africa once been joined, but all the continents had formed part of a single huge landmass. He called this hypothetical ancient continent Pangaea (meaning “all land”). He found that the fit of the continents was better when the outlines of the continental shelves were used, rather than the now-existing shorelines.

Today geologists use computers to slide and turn the continental outlines over a globe to obtain the best fit. In a typical reconstruction of the supposed ancient supercontinent, the southeastern coast of North America lies against the northwest coast of Africa. Eurasia is pivoted about Spain so that the west coast of Europe nuzzles in against Newfoundland and Greenland. Antarctica lies against southeast Africa, with Australia attached to its opposite side.

When Wegener first proposed this revolutionary concept in 1912, it aroused mixed feelings among geologists. Any theory that goes counter to prevailing notions in science is usually received cautiously. Continental drift met with a reception even cooler than usual, perhaps because its author was not a member of the geologists’ circles. Although there were solid bits of evidence to support the theory, it was “proved” mathematically that the earth’s crust is too strong to allow any lateral movement of the continents. And, it was asked, Where would any force originate to push the continents one way or another? No one could suggest anything that stood up under analysis. The idea gradually came to be ignored by reputable scientists.

Evidence for the Theory: Conformity

Why, then, have geologists changed their minds about continental drift? In the first place, there have gradually accumulated several kinds of evidence that they find hard to explain any other way. Among these are the similarity of geological formations and of fossil deposits on continents now widely separated, as well as the wandering of the magnetic poles of the earth.

As an example of geological conformity, we are told of a succession of sedimentary deposits, laid down during what is called the Paleozoic geologic era, and later exposed when they were lifted up into mountain ranges. Deposits of red sandstone, gray shales, and coal beds are found in the Appalachian mountain system in eastern North America, extending to eastern Greenland. They are also found in the highlands of the British Isles. Similar sediments are found in the Kjölen range in Scandinavia, and along the Atlas range in northwest Africa. In the theoretical parent continent of Pangaea, all these rock formations are believed to have been part of a continuous mountain system whose remnants are now widely separated on three continents.

The similarity in fossils found in these strata on both sides of the Atlantic is used as a further argument for the theory. Fish fossils are abundant, also land plants, even forests of tall tree ferns and great scale trees. Another oft-cited example of conformity of the fossil record is that of the mesosaurus, a small aquatic reptile that lived during the so-called Paleozoic era. Its fossils are found in southwest Africa and in Brazil, but they have not been found in other parts of the earth. If South America and Africa were joined at that time, then the range of the mesosaurus would have been one continuous area.

Wandering Magnetic Poles

More convincing proof has come from study of the mysterious phenomenon of polar wandering. The belief that the magnetic poles of the earth have moved about is based on measurements of the magnetization of igneous rocks. When a hot rock is cooled in a magnetic field, it is left weakly magnetized, because particles of magnetic minerals in the rock line up in the direction of the magnetic field. This shows the direction of the earth’s magnetic field at the time the rock was formed, like a “frozen compass.”

Now you might expect that all such fossil compasses would point north, but, surprisingly, rocks of different geologic ages show magnetization in many different directions. It is as if the magnetic pole were wandering widely and aimlessly all over the earth—hence the expression “polar wandering.”

However, when the directions are arranged in order according to the apparent successive ages of the rocks, it is found that the pole does follow a definite path from age to age. Furthermore, when the magnetism of rocks in other places on the same continent is measured, it is found that they consistently trace out the same path.

This discovery put the geophysicist in a quandary. Although no one knows what causes the earth’s magnetic field, it seems that it must be in some way related to the earth’s rotation, and it is hard to believe that the magnetic pole can stray very far from the geographic pole, surely not clear across the equator as the rock compasses indicated. Now, of course, the wandering magnetic paths would be explained equally well if the pole stayed fixed while the continents slid around over the globe, but that seemed even harder to believe.

What tipped the balance between two incredible explanations was the discovery that magnetic measurements on different continents usually indicate entirely different paths for the pole. This could not be explained by movements of the pole, because the earth has only one north pole, and it can’t go in several different directions at the same time. This appeared to geologists as a strong indication that the continents had actually moved independently of each other, over many thousands of miles.

Evidence from the Ocean Floors

New evidence that finally converted geologists to belief in continental drift came from the bottom of the sea. Exploration of the ocean floors really got under way in the International Geophysical Year of 1957-58. Oceanographers used elaborate sounding devices to chart the ocean floors. By timing echoes, they probed, not only the floor of sediment on the bottom, but also the depth of the basement of basalt rock underneath. They came to an astonishing conclusion about the ocean floors: They concluded that these are not fixed, but appear to be forming continuously at definite boundaries and spreading on a global scale.

Let us examine the discoveries that led to this startling hypothesis. The first clue to come to light was a long mountain ridge in the middle of the Atlantic Ocean. Starting there, geologists have mapped a system of mid-ocean ridges that literally encircles the earth. A typical ridge rises from the ocean floor, some three miles (5 kilometers) deep, to a peak about two miles (3 kilometers) above the floor. It is flanked on both sides by a strip of hilly terrain hundreds of miles wide. A striking feature is a valley that runs like a crack right along the crest of the ridge, thus dividing it into a pair of parallel ridges.

The acoustic soundings from the surface have been supplemented by using vessels equipped to drill holes in the bottom of the sea. These have brought up cores of rock for close inspection and analysis, some as long as 1,500 feet (460 meters), from many parts of the ocean. These surveys disclose that the ridges themselves are bare igneous rock, and that there is little or no sediment up to 60 miles (97 kilometers) on either side. Farther away, they show increasingly thicker layers of sediment, up to a mile thick.

Magnetic surveys over the oceans in the vicinity of the ridges resulted in another striking discovery. There are strips of rock lying parallel to the ridges in which the magnetism is reversed. It is as if the north and south poles had been reversed when the rocks formed. This reverse magnetization had been noted earlier in certain volcanic lava flows, but near the oceanic ridges there appears to be a continuous record of normal and reverse magnetic polarities frozen into the ocean bed. There is no explanation for this mysterious change; after all, no one knows why the earth has a magnetic field, much less why it reverses itself. It is just an observed fact of creation.

Sea-Floor Spreading

Geologists explain all three of these observations by a single hypothesis, called sea-floor spreading. They suppose that the mid-ocean ridge is being formed continuously by the upwelling of magma from the earth’s plastic mantle through a crack in the earth’s crust, and that the ocean floor is moving away from both sides of the crack as it is formed. The newly formed rock is clean, and sediment accumulates slowly and becomes noticeable only after the new rock has been exposed for some time and has moved away from the ridge. The parallel bands of normal and reverse magnetic polarity result when the magma oozes out and solidifies for a time while the earth’s poles are normal, and then for a time while they are reversed.

The findings indicate that at the present time the floor of the Atlantic Ocean is spreading a little more than an inch (2.5 centimeters) a year, and the Pacific Ocean about six inches (15 centimeters) a year. But if the earth is forming new crust on the ocean floor on this prodigious scale, it must be getting rid of its old crust somewhere else. After all, the total surface of the earth is not increasing. Geophysicists speculate that this takes place along certain boundaries where one part of the crust slides under another part and descends into the hot interior, where it melts and is consumed into the fluid mantle again. They believe that this is not a smooth process, but is accompanied by earthquakes and volcanic eruptions. It forms deep ocean trenches and high mountain ranges along the consumption boundary lines.
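To appreciate what even “an inch a year” adds up to, consider a rough calculation of how long such spreading would take to open a basin as wide as the Atlantic. (The ocean width used below is an assumed round figure, chosen only to illustrate the arithmetic.)

```python
# Illustrative arithmetic: how long would sea-floor spreading of
# about 2.5 centimeters a year take to open an Atlantic-sized basin?
# The basin width is an assumed round figure, not a measured value.

ATLANTIC_WIDTH_KM = 5_000         # assumed width, roughly 3,000 miles
SPREADING_RATE_CM_PER_YEAR = 2.5  # "a little more than an inch" a year

width_cm = ATLANTIC_WIDTH_KM * 1000 * 100   # kilometers -> centimeters
years = width_cm / SPREADING_RATE_CM_PER_YEAR

print(f"About {years / 1e6:.0f} million years")  # About 200 million years
```

So, at the observed rate, an Atlantic-sized ocean could open in roughly 200 million years on the geologists’ time scale.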

The Theory of Tectonic Plates

From a world map of the mid-ocean ridges and the consumption boundaries, geologists have divided up the whole earth’s surface into six large (and several smaller) plates of rigid rock. These plates, they postulate, are being formed at the ridges and move like a conveyor belt toward boundaries with other plates, where one of them is thrust underneath into the mantle and is dissolved. The continents are carried on these plates, like an Eskimo’s igloo on an ice floe.

This is called the tectonic-plate theory, from the Greek word for “builder.” Both the continental drift and the sea-floor spreading are included as parts of the broader theory.

Let us look at a few examples of how this theory is used to explain observed features of the earth’s crust. The American plate, which carries both North and South America, as well as the western half of the Atlantic Ocean, theoretically is being formed at the mid-Atlantic ridge and moving west. Along the western coast of South America, a smaller plate arising in the eastern Pacific collides with and plunges under the American plate. This supposedly causes a deep trench in the ocean off the coast of South America, and lifts the Andes mountains to the highest peaks in the Americas. The crumpling of the oceanic plate causes frequent earthquakes all along the Pacific coast. When, according to the theory, the lighter rock carried down into the mantle melts, it rises through cracks in the continental crust above it to form the volcanoes in the Andean Cordillera.

A detailed map of the mid-oceanic ridge shows that it is not really continuous, but it is offset by numerous faults at right angles. Along these transform faults, as they are called, the two theoretical plates slide horizontally. Geologists suggest that the friction from this movement is another cause of earthquakes. One of the longest of these transform faults lies between the American plate and the Pacific plate along the west coast of North America. Along this line, well known to Californians as the San Andreas fault, the Pacific plate is moving northwest against the American plate at about two inches (5 centimeters) per year. The resulting strains cause frequent earthquakes.

The city of San Francisco lies athwart this fault, and the coast of California to the south lies west of it, on the Pacific plate. So if the present movement is not interrupted, it is predicted that at some far-distant time the site of Los Angeles will lie close to where San Francisco is today.
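How “far-distant” would that time be? Simple arithmetic gives a feel for it, assuming the stated slip rate continues unchanged. (The 350-mile separation used below is an assumed round figure for the Los Angeles-San Francisco distance.)

```python
# Illustrative estimate: at about 5 centimeters of slip a year along
# the San Andreas fault, how long until a point now at Los Angeles
# moves the (assumed) 350 miles to the present site of San Francisco?

DISTANCE_MILES = 350          # assumed LA-to-SF distance, round figure
SLIP_RATE_CM_PER_YEAR = 5.0   # "about two inches" per year

KM_PER_MILE = 1.609
distance_cm = DISTANCE_MILES * KM_PER_MILE * 1000 * 100
years = distance_cm / SLIP_RATE_CM_PER_YEAR

print(f"Roughly {years / 1e6:.0f} million years")
```

That works out to something on the order of ten million years, a “far-distant time” indeed.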

Evidences that some places once had a climate very different from the present one also are viewed by geologists as fitting the theory of continental drift. In the postulated Pangaea, the present-day continents were all much farther south than now, excepting Antarctica. North America and the Spanish peninsula were on the equator. South America, Africa, India, and Australia were all clustered around Antarctica in the south polar regions.

Will the Theory Stand?

Scientists take satisfaction in finding a theory that apparently brings many disparate kinds of information together into a unified picture. That is what they believe the tectonic-plate theory has done for the science of geology. But does that mean that it is therefore the final and correct answer? Not necessarily.

In spite of seeming wide-ranging successes of the theory, there are still many bits of information that do not fit into it. Geologists argue over the interpretation of details. As research continues, some of these questions may be answered in a way that harmonizes with the theory. On the other hand, there may remain stubborn facts that cannot be reconciled with it.

One major shortcoming is acknowledged in the present state of the theory. The forces that cause the upwelling magma along the ridges are not explained. Some geologists have been content with the general statement that convection currents inside the earth’s mantle are responsible. But what generates the convection, and why does its pattern change? When this idea is examined in detail, it breaks down. A convection current in air or water rises around a central axis, not in a long slender sheet that would form a ridge. It is even more difficult to imagine how the displacements along the transform faults can result from convection currents.

Professors Flint and Skinner of Yale University offer this word of caution in their book Physical Geology:

“The theory of plate tectonics seems to provide answers for so many questions that we are tempted to believe it is the long-sought unifying theory that explains the lithosphere [the earth’s rigid outer shell, made up of the crust and the uppermost mantle]. But we must be careful. Other theories, too, have seemed overwhelming in their promise, yet in the long run have proved incorrect. The theory of plate tectonics is still only a theory.”

Whether the tectonic-plate theory survives the test of time and proves correct or not, we have abundant evidence of the great power and wisdom of earth’s Creator. Of him the psalmist wrote: “Long ago you laid the foundations of the earth itself, and the heavens are the work of your hands.” (Ps. 102:25) The questions Jehovah put to Job thousands of years ago still remain unanswerable by modern geologists: “Where did you happen to be when I founded the earth? Tell me, if you do know understanding. Who set its measurements, in case you know, or who stretched out upon it the measuring line? Into what have its socket pedestals been sunk down, or who laid its cornerstone?” (Job 38:4-6)

Mother’s Milk—a Wasted Resource

WHAT is the economic loss involved in the bottle-feeding of babies instead of letting them breast-feed? Science magazine answers that in some countries, ‘a laborer may need to spend a third of his daily wage to buy milk for his baby.’ It is estimated that in one South American country “the annual loss of human milk is equivalent to that produced by 32,000 cows.” The magazine then observes: “For the developing world as a whole, the cost of wasted human milk can be put at more than three-quarters of a billion dollars at the very least, and losses are ‘more likely in the billions,’ according to Alan Berg, World Bank deputy director for nutrition.”

And in buying cow’s milk instead of breast-feeding the infant, is the baby benefited by the money spent? It is now recognized that human milk possesses “unidentified factors” that protect against bacterial infections, and perhaps even influenza virus. Cow’s milk lacks these qualities.

Is It Progress?

MANY persons today are greatly impressed by the achievements of science and technology. But what has been the effect of these advancements upon people as a whole? Does the standard of living that science and technology have made possible really make life richer and more meaningful?

In the book Environment, William W. Murdoch, associate professor of biology at the University of California, comments:

“I have said that in an affluent society we can conceive of an optimum standard of living which is below the maximum we could achieve, and that we should therefore consider putting an end to economic growth as we know it. This implies that increasing affluence is not necessarily correlated with an increasing quality of life, and that in the United States we may already be experiencing a decline in the average quality of life as our COLLECTIVE wealth increases. . . . The weight of evidence as exemplified in this book favors the hypothesis that as we grow richer in the United States the quality of our shared environment declines.”

Clearly, what modern science and technology can offer does not necessarily mean progress in every aspect of life. The World Book Encyclopedia acknowledges: “In spite of his scientific and technological progress, man has not been so successful in dealing with human problems.” Why is this? One reason is that all too often the guidelines of the source of wisdom, Jehovah God, are ignored.

A scientist and university professor in Chile came to appreciate this fact after some months of study with Jehovah’s Witnesses. When resigning from a political position in the university, he wrote:

“There are powerful reasons leading me to this decision. Everywhere we see fraud, lying, envy, hatred, violence and an unmerciful fight for fame and power. Universities are not free from these stigmas.

“We impart superior teaching. But what real benefits has it given to the society in which we live? We observe pollution that increases day after day, drug addiction within a large part of youth, increase of delinquency and collective neurosis.

“We have encouraged exaltation through diplomas and degrees—perhaps to dazzle the people. We have stimulated others to fight for fame and power and at the same time we have undoubtedly helped to create selfishness and disunity. As a Christian, I have become convinced of how mistaken all of this is.”

How Long Would You Like To Live?

WHEN things go well, life is enjoyable. The thought of living on and on, even forever, may well appeal to you. But then hardships, perhaps great obstacles and tragedies, may enter your life. Yet, even then, you are not eager to die.

The fact is that people generally cling to life, cost what it may. In 1974, in the United States alone, cancer patients paid out seven billion dollars in an effort to stop that killer and continue living.

The New York Times of July 22, 1974, reported concerning a cancer patient, a doctor, who used every conceivable means to fight his illness and yet died at the age of thirty-nine, as follows:

“There are many other dying patients who, like Dr. Leinbach, put up a fight to the very last. . . . Their will to live is a basic human instinct . . . his widow insisted that every day he managed to stay alive was of great value to him. ‘Of all the things Gary wanted,’ she said, ‘it was life.’ . . . Just before his death, she had asked him if he considered the vigor of his efforts to stay alive worthwhile. She said he had answered clearly: ‘Yes.’”

When we have health there is a tendency to take life for granted. A magazine writer, after a brush with death during a serious illness, writes: “I don’t know when I have been so happy in terms of enjoying the simplest things—things which I had taken completely for granted before. I sometimes laugh at myself. It’s like going through a second childhood. I enjoy a drink of water. I enjoy a piece of fruit. I enjoy the sunlight. I go into my garden and look at the trees. I discover that I had never really seen what a tree looked like in all those years that I had good health. And I enjoy the birds’ singing—just everything!”

A teacher of philosophy expressed the sentiment of many others when he said: “It is outrageous that such a beautiful phenomenon as intelligent, sentient life should be encased in such fleeting vulnerable bodies.”

Potential to Live How Long?

One may grant that it is reasonable that man should live much longer, even forever, but is it scientifically possible? In its discussion of “Death,” under the subheading “Potential Immortality,” the Encyclopædia Britannica (1959 ed., Vol. 7, p. 112A) states:

“It may fairly be said that the potential immortality of all essential cellular elements of the body either has been fully demonstrated, or has been carried far enough to make the probability very great, that properly conducted experiments would demonstrate the continuance of the life of these cells in culture to any indefinite extent.”

Of course, this is the result of experiments with cells in the laboratory. The Encyclopædia goes on to say that the cause of death by degeneration, that is, of old age, is not surely known. It may be from cell deterioration in the body. Or it may be from a gradual breaking down of the organized functions of the cells and their inability to “cooperate” within a total organism, rather than the dying off of individual cells, which, when destroyed, are, in the natural process, replaced by new cells. An exception to this restorative ability is found in the nerve cells, which, when destroyed, are not replaceable. However, a damaged nerve cell can heal itself. Even a severed nerve, if properly sutured, can regenerate, though healing of the nerves is a relatively slow process.

Says Gary K. Frykman, assistant professor of orthopedic surgery at the Loma Linda, California, School of Medicine, where one or two reattachments of severed fingers are performed every month: “If more than one finger has been lost, or a thumb, the patient may feel that he needs to have them reattached to carry out his job, or even for cosmetic reasons.”

Frykman continues: “Under those circumstances, we tell the patient there is a 50-50 chance that we can reattach the fingers or thumb successfully, but we warn him that it may be several months before he will be able to get anything like full use out of them.” Thus, nerves do possess regenerative or healing power.

What Hope from the Scientific Field?

Medical researchers have labored hard and long on ways to delay aging and to prolong life. Can we look to them with hope? They can help a little. But there is no solid evidence of any progress toward a dramatic increase in the human life-span. The increase of the average life expectancy during the past fifty years is due primarily to a decrease in infant and child mortality. Writing in Bestways magazine, Graduate Pharmacist Louis Stambovsky decries the fact that mankind, maturing at twenty-one years of age, lives only about forty or fifty years of mature life. He calls attention to this interesting fact:

“It seems that every mammal [among animals] who lives in the manner and intent normal for his species, lives six to seven times its maturity age. The horse matures in about three years and dies between 18 and 21. The dog reaches a total growth in about three years and should attain the same span as the horse. This formula is applicable to the monkey, cat, bear, etc. Man’s maturity age is 21. By parallel deduction, he should live between 120 and 140 years.”
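Stambovsky’s rule of thumb is simple arithmetic. As a sketch only (the six-to-seven multiple and the maturity ages are his figures, not established biology), it can be expressed as:

```python
def projected_lifespan(maturity_years):
    """Apply Stambovsky's rule of thumb: lifespan is
    six to seven times the age of maturity."""
    return (maturity_years * 6, maturity_years * 7)

# The horse matures in about 3 years: 3 x 6..7 gives 18 to 21 years,
# matching the figures quoted above.
print(projected_lifespan(3))   # (18, 21)

# Man's maturity age is given as 21, yielding 126 to 147 years;
# the article rounds this to "between 120 and 140."
print(projected_lifespan(21))  # (126, 147)
```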

What prospect do science and medicine hold out? The Scientific American, summing up the matter, said:

“Even if the major causes of death in old age—heart disease, stroke and cancer—were eliminated, the average life expectancy would not be lengthened by much more than 10 years. It would then be about 80 years instead of the expectancy of about 70 years that now prevails in advanced countries.”

These statements are in agreement with the Bible writer Moses, who described the experience of most persons who reach old age: “In themselves the days of our years are seventy years; and if because of special mightiness they are eighty years, yet their insistence is on trouble and hurtful things; for it must quickly pass by, and away we fly.”—Ps. 90:10.

No Reason to Give Up

Do these sobering facts mean that a young person should not care for life, to make it as long as possible, or that an aged person should give up the idea of doing any worthwhile work or of making any contribution to the welfare of his fellowman? Not at all. We can take courage from a statement by Pharmacist Stambovsky:

“Longevity . . . can be of inestimable value to the community, to the nation and to the world. Such persons are rich in valuable experience, gained through years of trial and error, successes and failures. Witness Edison whose fertile brain was active in the eighties; Gladstone was selected prime minister of England at 60, many years ago when 60 was really ancient, a position he held until 82. Walter Damrosch embarked upon a career as a concert pianist at 78.”

There are reasons, then, for doing the best we can with this life. How can it be made more enjoyable and profitable? Furthermore, is there an even better hope—that of everlasting life? Let us survey the matter further.

“This Is a Bulletin!”

NO MATTER what we are doing, the words “This is a bulletin” grip our attention. Everyone’s routine suddenly halts at those urgent words. Motorists turn up their car radios. Housewives stop their work. Conversations abruptly cease. The announcer’s next words could be anything—a disaster in your own community, the assassination of a world leader.

Such scenes are repeated somewhere in the world almost every day. But what we do not see is what happens behind the scenes in the few moments before “This is a bulletin” shatters the normal routine of broadcasting. We can find out by stepping inside the nerve center of a national news agency, the newsroom.

One of our first impressions is the quiet. Newsrooms have an almost traditional reputation for noisy, but organized, “confusion”—dozens of teletypes loudly banging out news and sports stories from all over the world, the clickety-clack of many typewriters as reporters and editors work on stories, and the copyboys rushing completed stories to and from the editors. And, indeed, for many decades this description was accurate.

But in this computer age the news agency has also kept up with the advance of science. Noisy teletypes are gone. In their place are modern machines with special electronic heads that slide noiselessly back and forth across the teletype paper. Some high-speed machines produce material at the rate of twelve hundred words per minute—entire paragraphs of six lines in only three seconds!
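The speed figure just cited is easy to check. A quick arithmetic sketch (assuming, as the text implies, about ten words per teletype line):

```python
WORDS_PER_MINUTE = 1200
words_per_second = WORDS_PER_MINUTE / 60      # 20 words each second

# A six-line paragraph at roughly 10 words per line:
paragraph_words = 6 * 10
seconds_needed = paragraph_words / words_per_second
print(seconds_needed)  # 3.0 -- the "three seconds" cited above
```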

Gone, too, are the typewriters. Instead, newsmen sit at computer terminals resembling television sets with a keyboard. As a writer strikes the keys, letters appear on the screen and the story takes shape. With such equipment, the newsman can make changes on the spot. He can rephrase statements, take out sentences or entire paragraphs and reinsert them somewhere else in the story, or simply delete them entirely.

The only noise now is conversation, an occasional telephone ringing, and, of course—the bells. Bells signal the editor that an urgent story is coming in. They are not heard often, and a visitor may not even notice the quick series of quiet rings. But the machine that sounds the alarm gets prompt attention from at least one of the newsmen on duty.

How It All Began

In Paris, in 1835, a man named Charles Havas decided to go into a new business for himself. He subscribed to a number of foreign newspapers, and as they arrived, he had the financial information translated and printed. He sold this to businessmen in the city. Newspapers also became interested. So Havas expanded his operation, translating and selling news stories, as well as financial information.

Soon Havas was collecting news from across France—by messenger, carrier pigeon, and later by the telegraph. Thus Agence France-Presse, the news agency of France, was born. Meanwhile, in New York City, six publishers formed a news-gathering agency that later became known as the Associated Press (AP). Soon others were springing up all over the world—Reuters in London, the Canadian Press in Toronto.

Hundreds of newspapers were finding that their readers wanted to be informed of events happening throughout the world, not just in their own communities. It was out of the question for newspapers to provide such broad coverage on their own. But by pooling resources to operate a news agency, this kind of coverage became possible.

Yet, how do these agencies get all their news?

Agencies in Operation

There are two kinds of news agencies—national and international. A national agency disseminates information within a particular country. It sets up a series of bureaus, usually one in each state or province. The agency may sell its service to hundreds, even thousands of newspapers, radio and television stations across the nation. The cost generally depends on the size of a particular station or paper.

Each newspaper and radio or television station has its own news staff to handle local news in that area. But when a story breaks that may be of interest to people outside their own community, they send it to the national news-agency bureau for that region. The bureau, in turn, transmits news of regional interest to all clients in the area it covers.

Meanwhile, the agency’s head office monitors all the regional news items from its bureaus nationwide. When items of broad interest appear, they are picked up and sent out nationally. In addition, the national news agency has its own staff of reporters and editors who gather news and cover major stories.

To get information on world events, national news agencies subscribe to one or more international news agencies. These cover several countries, selling their service to national agencies and sometimes larger newspapers and radio and television stations. In turn, international agencies monitor each of the national services. When a story with an international flavor appears, the international service picks it up and the incident becomes an international story.

Agencies monitoring one another have their computers interconnected. That is, once a story moves on the wire of one agency, it automatically goes also into the computer of each agency that has bought that service. Consider what happens when a major story breaks:

Assume it happens in San Francisco. The Associated Press could be the first to have the story and a reporter there may prepare a bulletin of four or five lines in just a few seconds on his computer terminal. His editor checks it for accuracy and moves it immediately. Seconds later the item has been picked up and relayed nationally by editors at the head office in New York, to appear on teletypes in newspaper, radio and television newsrooms across the United States.

Meanwhile, an editor at the Canadian Press in Toronto, alerted by the bulletin bells, calls the story up on his computer terminal, checks it and moves it across Canada. By now AP has also moved the story on its international wire, and its affiliated national news services are transmitting the story within their own countries. Within four or five minutes of the time that the San Francisco reporter completed his bulletin, the story—never retyped or rewritten by anyone—could be appearing on the teletype of a radio station in Newfoundland, or of a newspaper in Rome.
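The interconnection described above amounts to a fan-out: a story filed on one agency’s wire is copied automatically into the computer of every agency that buys that service. A minimal sketch (the class, agency names and queue structure here are illustrative, not any real wire protocol):

```python
from collections import defaultdict

class Wire:
    """Toy model of interconnected wire services: each agency's
    stories fan out automatically to its subscribers' queues."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # agency -> subscriber names
        self.queues = defaultdict(list)        # subscriber -> incoming stories

    def subscribe(self, subscriber, agency):
        self.subscribers[agency].append(subscriber)

    def file_story(self, agency, story):
        # Once a story moves on one agency's wire, it goes
        # into the queue of every agency that bought the service.
        for sub in self.subscribers[agency]:
            self.queues[sub].append((agency, story))

wire = Wire()
wire.subscribe("Canadian Press", "AP")
wire.subscribe("Reuters", "AP")
wire.file_story("AP", "Bulletin: major story in San Francisco")
print(wire.queues["Canadian Press"])
# [('AP', 'Bulletin: major story in San Francisco')]
```

Nothing is retyped along the way, which is why the same bulletin can surface unaltered in Newfoundland or Rome minutes later.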

While all of this is going on, other news agencies—Reuters, United Press International and the rest—are also picking up the story.

Television and Satellites

Television news has similar information sources. Local stations get much of their programming from a television network that provides both news and entertainment. Though usually joining the network at least once a day for a national newscast, local stations often subscribe to one or more news agencies and provide news programs of their own.

Networks and some larger television stations are equipped with mobile studios that can drive to the scene of a breaking story and broadcast developments live. The story can either be carried on the one station or broadcast over an entire network of stations. Thus, in 1970, several million Canadians watched as kidnappers of British diplomat James Cross drove their bomb-laden car through Montreal streets after negotiating an agreement that allowed them to fly out of the country.

Affiliated networks in other countries may also pick up major stories and carry them live or broadcast them later. This is often done by means of a complicated system of space satellites and microwave relay stations.

For example, if a Canadian television network wanted film of a serious air crash in Australia, the local television station would transmit it through a series of microwave systems to the nearest earth station of a satellite system. From there it would be broadcast to an Intelsat satellite somewhere over the Pacific. This satellite would rebroadcast it to an earth station in British Columbia. From there it would be sent to a Telesat (Canadian communications satellite system) earth station and relayed to another satellite over western Canada. The signal then would be broadcast to an earth station at Rivière-Rouge, Quebec, and sent by microwave to Montreal or Toronto.
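The relay path just traced is a fixed chain of hops. Listed in order (the hop names are taken from the example above; this is not a real routing table):

```python
# Illustrative Australia-to-Canada relay path from the example above.
hops = [
    "Australian TV station (microwave link)",
    "Pacific earth station",
    "Intelsat satellite over the Pacific",
    "Earth station, British Columbia",
    "Telesat earth station",
    "Telesat satellite over western Canada",
    "Earth station, Riviere-Rouge, Quebec",
    "Microwave to Montreal or Toronto",
]
for i, hop in enumerate(hops, 1):
    print(f"{i}. {hop}")
```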

All of this takes place in just a fraction of a second. Of course, it is quite expensive—costing several thousand dollars for just a few minutes. Since satellite time is sold for a minimum of ten minutes, networks often bring in material “piggyback.” Two or three together may rent a certain period of time to transmit films that they want for later use on a newscast.

News Affects You

With all this technology, do we get all the news? No. News agencies receive far, far more information than they can possibly use. Many use only about 5 to 7 percent of their total material. In turn, the subscribers to the wire services use only a part of the information they receive. So no matter where we live or what we read, likely there is far more going on in the world than we realize.

What people living in smaller cities and communities learn about world events may depend on the decisions of just half a dozen men and women a thousand miles away. But even in major cities where the media have access to several agencies, the number of people who ultimately decide what to use is relatively small. And since any newsman is going to use the most important stories of the hour, much of the news appearing on wire services is the same, shaping your view of the world according to those particular stories.

When a government changes hands, whether by election, revolution or war, it is front-page news. But, ironically, news agencies are completely unaware of the impending greatest news story of all time. For today we are at the threshold of a worldwide change in government, the end of this entire global system.

And it is only unknowingly that, by means of their fast and often thorough coverage of certain world events, news agencies make Christians ever more aware of the evidence that we are deep into the “last days” of this world’s system.

How Much Confidence Should You Have in Science?

ADVANCES in various fields of science have certainly made contributions to the welfare of the human family. Various medical procedures have helped to prolong life and ease suffering. Advances in technology have improved the quality of our lives in some ways, and have made jobs easier.

Because of such advances, many people view science with an almost worshipful awe. The successful moon landings by astronauts reinforce this feeling. As a result, the ideas expressed by scientists in other matters are also highly respected by many people. And it is a widely held belief that whatever problems the human family faces will eventually be solved, with science and technology leading the way.

This prevailing view was summed up a few years ago in a report published in connection with the 200th anniversary of the founding of the well-known J. R. Geigy pharmaceutical corporation in Basel, Switzerland. One of the commentators, German physicist Professor C. F. von Weizsäcker, is reported as having stated:

“Science today is the only thing in which men as a whole believe: it is the only universal religion of our time . . . The scientist has thus got himself into an ambiguous position: he is a priest of this new religion, possessing its secrets and marvels; for what to others is puzzling, strange or secret is plain to him.”

But is such confidence in science justified? Not according to von Weizsäcker. He notes that any scientist worthy of the name should realize “that what he knows is only a fraction of what he needs to know if he is really to be fit to carry responsibility for the lives of men.” He should appreciate that even in his speciality there is so much he does not know. And honest scientists understand that while science has produced things improving life, it has also done the opposite. It has been responsible for producing things that have made life miserable for millions of people.

The bloodshed and destruction of this century’s world wars are an example. World War II alone is reported to have taken over 50 million lives. Many of these victims died in horrible ways due to the inventions of science and technology: explosives dropped on many peace-loving civilians by speeding aircraft, rockets, tanks, flamethrowers, automatic weapons, torpedoes, atomic bombs and other engines of death. These, too, were the products of scientific and industrial “advancement.”

In more recent times science and technology have shared responsibility for making and using things that have resulted in pollution, noise, congestion and tension. All these facts should make scientists more modest in their claims, and other people more careful as to where they put their confidence.

Problems with Chemicals

Even men of science generally devoted to improving man’s life have awesome problems to face, as we can see, for instance, in the drug industry. New drugs are constantly appearing on the market, but the supervision and testing of such drugs have not always been thorough enough.

What happened in West Germany (as well as on a minor scale in Sweden, Canada and Brazil) a few years ago demonstrates the tragic results that can come from the misuse of drugs. The drug Thalidomide was widely used as a tranquilizer. Expectant mothers also used it. But some of them found, to their horror, that upon their giving birth their babies were malformed because of the drug. Thousands of these children were physically or mentally retarded, and remain so to this day. Of these children, the West German news magazine Der Spiegel said:

“They are the victims of a catastrophic mishap, brewed together in the test tubes of a scientifically persuaded generation; the ones forced to suffer because of a mysteriously effective mechanism built into one tenth of a gram of white substance; into the sleeping pill Thalidomide.”

Der Spiegel noted that 310,000,000 dosages of the sedative had been sold between 1957 and 1961. It had been advertised as “nontoxic,” “harmless,” and “completely nonpoisonous.” The magazine added: “Nine men were indicted. Not indicted is the willingness of a scientifically persuaded generation to consume medicines by the ton, although scientists in most cases do not know even today just how these affect the human organism.”

Since that time drug procedures have been tightened. Yet the quantity of drugs pouring out of factories is staggering. People all over the world are consuming billions of various drug pills each year. And newer ones are continually being put on the market. The damage to health may appear only after a long period of usage, as proved in the case of cigarette smoking. That is why H. Weicker, professor of human genetics at Bonn University and one of the leading medical experts called to testify at the Thalidomide trial in West Germany, said: “A disaster such as the Thalidomide catastrophe can again overtake us at any time.”

Naturwissenschaftliche Rundschau (Natural Science Review) of West Germany, in its September 1975 issue, stated: “Not only the feared Thalidomide, but apparently many other medicines could also cause deformities in newborn babies if taken by their mothers during the first six weeks of pregnancy, when the embryo is especially sensitive.”

At the School of Public Health in Berkeley, California, L. Milkovich and B. J. vanden Berg studied the effects of drugs in 19,044 newborn babies. Those whose mothers took no tranquilizers during the first 42 days of pregnancy had an average of 2.7 percent deformities. Where the mothers had taken a popular tranquilizer (Equanil), the deformity rate of the newborn babies was 12.1 percent. In the case of another popular drug (Librium), the deformity rate was 11.4 percent. Mothers who took other tranquilizers had about twice as many deformed babies as the mothers who took no drugs at all.
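From the figures just cited, the relative rates are easy to compute. A simple arithmetic check (not part of the original study’s analysis):

```python
baseline = 2.7   # percent deformities, mothers who took no tranquilizers
equanil = 12.1   # percent deformities, Equanil group
librium = 11.4   # percent deformities, Librium group

# Each popular tranquilizer's rate relative to the no-drug baseline:
print(round(equanil / baseline, 1))  # 4.5 -- over four times the baseline
print(round(librium / baseline, 1))  # 4.2
```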

In the book Thalidomide and the Power of the Drug Companies (1972, p. 279), authors H. Sjöström and R. Nilsson declared: “In spite of all warnings, we shall evidently have to wait for a ‘genetic’ disaster to occur before the authorities and the chemical industry wake up. When this occurs owing to the failure to control the properties of some widely used chemical to induce hereditary change, we shall certainly hear from the authorities and from industry that ‘nobody ever thought of such a possibility,’ that ‘this catastrophe was unavoidable.’”

Yet, away back at the beginning of this century scientists were able to induce malformations in lower animals by the use of chemicals. And in spite of all the knowledge and experience gained since then, the load of chemicals (the effects of which on the human body when consumed over years are not yet fully understood, and which are introduced into pills, as well as food, drink and air) continues to mount. Although further factors are also involved, it is no wonder that so many ailments, such as cancer and heart disease, are on the increase.

These few examples from the history of medicine and pharmacy suffice to show that blind and absolute faith in scientific “progress” is not justified.

This is certainly the case, too, in another field of science, where gullibility is even more pronounced and unjustified.

Tracing Man’s Origin

In the past century, the theory of evolution has been widely accepted and promoted by most scientists. This is the belief that humans have evolved from apelike beasts over a period of millions of years. Although some scientists reject the evolution theory and believe the Bible account that man was created directly by God, the majority of scientists speak as if evolution were a fact proved beyond dispute.

But that is not the case at all. If it were so, many scientists would not still be spending much of their time trying to prove it. They would not be devoting years to crawling around on their hands and knees in the heat of Africa and other places trying to find fossils to prove their theory.

But many evolutionary scientists are guilty of very unscientific procedures in being dogmatic on little or nonexistent evidence. Worse, they have at times been guilty of gross deceptions to try to convince the public that they are proving their theory.

For example, there was the infamous “Piltdown man.” This was asserted to be a vital “missing link” between man and beast. It was “discovered” by Charles Dawson at Piltdown, England, early in this century. But decades later it was exposed as a hoax, a fake. It turned out to be the skull of a modern man combined with the jawbone of an ape that had been “doctored” with chemicals to try to make it look ancient.

One of the broadcasts last year of a West German radio program dealing with science and education was entitled “Forgers in Science”; it told of more recent frauds. An interesting example was of a corpse that came to the attention of the Belgian Royal Academy of Science in 1969. The corpse was preserved in ice and appeared to be a first-rate scientific sensation. Dr. Bernard Heuvelmans, a zoologist and member of the Brussels Academy, said that it was a proof of the evolution theory. He submitted to the Academy the opinion that the apelike creature was a “missing link” between man and ape.

The creature was located in a freezer in the United States, in Minnesota. The zoologist spent days observing and appraising this supposed ancestor of man lying in icy armor. But after examinations, it was discovered that this apelike creature had been on ice, not for millions of years, but for only a few years!

What did Dr. Heuvelmans and other scientists conclude? Not that it was a fake. Instead, they concluded that in our modern era there must have been a remnant of pre-historic man living upon the earth! In a bulletin from the Belgian Academy of Natural Science, Dr. Heuvelmans tried to document his presumptuous theory with extensive illustrations. He even gave the creature the “scientific” name of homo pongoides, that is, “apelike man.”

However, the Academy was perplexed and suspicious. Further extensive and difficult investigations were made. With what conclusion? Was this the biological discovery of the century? The German radio program related: “By no means. Once again forgers had made fools of the scientists. The public was presented with a comedy which was difficult to see through, but it was very evident that it was well staged. The main characters, although unwillingly, were zoologists, anthropologists, paleontologists and other scientists.”

W. R. Lützenkirchen, who wrote the script for this radio program, said: “The ‘missing link’ between man and anthropoid ape is a swindle, a clear forgery. The primitive man . . . came out of the bag of tricks used in the film industry in Hollywood.” He noted that “trick specialists . . . brewed up the ‘missing link.’”

Other Frauds

While this forged “prehistoric man” was one of the more spectacular fakes in recent years, it was not the only one. The program commented on the discovery of supposed works of art of ‘prehistoric Neanderthal man’ in Dithmarschen, a rural section bordering on the North Sea in the northernmost German state of Schleswig-Holstein. North German historians felt that they had a sensational find. In the Dithmarsch State Museum in the city of Meldorf a display of these artifacts was quickly organized.

What happened next? Says Mr. Lützenkirchen: “The well-known professor, Herbert Kühn, who specializes in pre-history and is an expert in pre-historic cave paintings, spoke at the opening of the display in Meldorf of a ‘climactic moment of archaeology.’ In exuberance and with euphoria the scientist announced ‘discoveries’ which could ‘compete with that of Galileo Galilei.’ In reality he was caught in a forgery comedy.”

The discoveries had been dated as being from 100,000 to 180,000 years old. But it was found that these works of art, supposedly Neanderthal, had been produced just recently! Responsible for the whole affair was a sales clerk from a village named “Albersdorf.” That was an appropriate coincidence, since in the German language “albern” means “silly.” The clerk had taken old wood and bones from animals and cleverly worked them over.

Some such forgeries were discovered after only a few months. But others, such as the Piltdown fraud, took decades to uncover. Another example, which took years to expose, had to do with the ‘tools’ that the allegedly primitive ‘Steinheimer man’ was supposed to have used. Until recently these were in museums and display cases.

In the publication Stuttgarter Beiträge zur Naturkunde (Stuttgart’s Contributions to Natural History), May 1974, evolutionist Professor K. D. Adam, chief curator of the State Museum of Natural Sciences in Stuttgart, stated that the supposed 250,000-year-old artifacts of ‘homo steinheimensis’ were a proof, not of evolution, but of scientific error. He added: “It is stated as a result of the discussed research that none of the countless, ostensible stone- and bone-tools can be proved to be an implement produced and used by man: they are pebbles of limestone and, to a lesser extent, of sandstone and dolomite, as well as bone fragments, mostly indeterminable.”

Where Confidence Can Be Placed

Of course, there are finds relative to man’s origin that are much better documented than the forgeries. These clearly show that the ‘historical period’ of man began some five to six thousand years ago. And there has been solid scientific progress in gathering information about this earth and its life systems. Also in other fields, scientists have made genuine contributions to the welfare of mankind, all of which is commendable and very much appreciated.

But what is also clearly shown in the history of science is that scientists are only imperfect humans. They make mistakes just like everybody else. And often, because of the desire for fame, or because of pride and stubbornness, they will cling to ideas that are not the truth and that can even result in harm to people.

More and more people, including scientists, are acknowledging this. Especially is this the case in our time when the negative fruits of science and technology have become more obvious, and many times these backfire, to the torment of the human family. So it should be apparent that we cannot put total confidence and unshakable faith in humans, be they scientists or others.

There is only one source that merits total confidence and unshakable faith. That source is our Creator, Jehovah God. The Bible writer of Proverbs says: “The eyes of Jehovah are in every place.” (Prov. 15:3) Nothing is hidden from the Creator. Since he originated the universe and all life in it, he certainly knows where man came from and where he is going. He also makes available to those who trust him accurate information about such matters.

It is comforting to men and women of faith to know that their future does not depend upon what mere humans do. They appreciate that the record of human failures in past centuries gives no basis for confidence. Rather, faith in the dependable Creator does inspire confidence in the future. And the future He promises is one without sickness and sorrow. “‘For I myself well know the thoughts that I am thinking toward you,’ is the utterance of Jehovah, ‘thoughts of peace, and not of calamity, to give you a future and a hope.’”—Jer. 29:11.
