Nazca Lines - lines for aliens?


Few archaeological enigmas have excited so much fanciful speculation as the lines and figures etched into the desert near Nazca in southern Peru. Few of the theories are scientifically tenable, and many are pure fantasy.

However, behind the speculation lies a unique cultural phenomenon that for almost a century has attracted the attention of scientists and archaeologists alike.

The coastal strip of southern Peru, which is in effect the northern extension of the Atacama desert in Chile, is one of the most arid and desolate regions of the world. The landscape here comprises a series of flat desert plains, or pampas, separated by oasis-like river valleys. Measurable rainfall occurs on average only once in several years.

Rivers are dry for much of the year, and water is plentiful for a short period only, when the seasonal melt-water flows down from the snow-capped Andes to the east. The small town of Nazca, some four hundred kilometers south of Lima, is situated in such a valley. A distinctive culture flourished here between the first and sixth centuries C.E., leaving an abundant archaeological record including a fine and distinctive style of pottery, brightly colorful and richly decorated. During the centuries before the arrival of the Europeans, the people who lived in this area also seem to have channeled considerable efforts into etching monumental drawings on the desert. The Nazca pampa, an arid plain to the north of the town, covers an area of some two hundred square kilometers (about eighty square miles) and is covered in a vast array of long, straight lines, rectangles and trapezoids, labyrinths and spirals.

The greatest concentration of markings is in the northern corner, where a number of large stylized bird and animal figures, as well as less readily identifiable forms, are also found. The overall impression, as viewed from one of the many light aircraft that carry tourists over the plain, is one of a giant sketchpad, much scribbled upon.

The desert surface here is composed of black ferrous oxide pebbles darkened by oxidation over many centuries. Simply brushing them aside reveals a bright yellow sandy soil beneath. This means that the desert markings, often termed geoglyphs, are highly susceptible to modern damage.

Merely walking on the pampa is often enough to leave conspicuous footprints, which, owing to the lack of precipitation, will endure almost indefinitely. Worse still, many of the ancient lines and figures are scarred by deep ruts created by cars and even large commercial vehicles, which drive across the open desert in order to avoid paying highway tolls.

On the other hand, there is no great mystery about how the Nazca geoglyphs were created, at least in principle. Armed with nothing more than a piece of string and a few sighting poles, a group of six volunteers was able to produce a ten-meter (thirty-foot) long straight line ending in a spiral on a nearby pampa in less than ninety minutes. Yet the Nazca markings were more than casual doodles. Some lines run for several kilometers, remaining dead straight even where they pass over small hills and dips. The figures, generally too large to be made out when standing close by, must have been constructed by scaling up from a template of manageable size.

The enigma lies not in how the etchings were made, but in why. A few visits to the pampa convinced Maria Reiche that the lines were directed toward horizon positions where the sun, moon, or stars appeared and disappeared; solving the riddle of the mathematical and astronomical meaning of the lines and figures subsequently became her mission in life. She began to visit the pampa regularly, living the life of a recluse, spending hours, days, and weeks walking on the desert and making measurements. Despite Reiche's unremitting devotion to the investigation of the lines, which lasted for the rest of her life (she died in 1998, aged ninety-five), it produced precious little published hard data. Reiche's book Mystery on the Desert, which has run to several editions, concentrates mainly on descriptive material.

In 1968, the astronomer Gerald Hawkins, who had proposed that Stonehenge in England was an astronomical observatory or computer, visited the pampa and carried out a statistical examination of the line orientations. His conclusion, which came as a surprise to many, was that they had no astronomical significance whatsoever, beyond what might be expected by chance. Although very different, both these approaches failed in one fundamental respect: they were divorced from the cultural context. Each, in its different way, was an intellectual exercise dictated by Western concepts of science and mathematics but unrelated to the rich cultural traditions of pre-Columbian America.
From Ancient Astronomy: An Encyclopedia of Cosmologies and Myth, by Clive L. N. Ruggles

Biomedical Technologies - the future?


The widely publicized Human Genome Project has been a massive undertaking requiring enormous computing power. Its goal has been to produce an accurate map of the chemical structures that make a person a person—the blueprint of human life, involving several billion pieces of information.

Along the way, the project has also become a test case for weighing the benefits of private versus public science, represented on each side by highly visible scientists with charismatic personalities, with the largely government-funded public project at the U.S. National Institutes of Health led by Francis Collins competing head-to-head with a for-profit commercial operation led by J. Craig Venter of Celera Genomics to crack the code first.
Each group has claimed victory on several occasions as new milestones have been reached; most observers seem to accept the proposition that the project would not have advanced as quickly as it did were it not for the intensity of the competition.

The rivalry may have peaked during a joint presentation at the 2001 American Association for the Advancement of Science meeting, at which the public consortium, publishing in an issue of Nature, and Venter's Celera group, publishing in an issue of Science that was literally hot off the presses when it arrived at the meeting, simultaneously revealed similar "draft" human genome maps. (Actually, neither map was entirely complete, and the work of both groups continues.) While the rivalry may indeed have spurred both groups to work harder and faster, it also spurred a heated debate about the conflict between the preservation of commercial patent rights, based on keeping details proprietary, and the advancement of public science, based on a policy of open information sharing.

Knowing most of the genetic code, or even all of it, does not, however, mean understanding it and does not translate directly into effective therapies for genetics-related problems. Genes with specific known effects must still be identified, defined, and distinguished from the amorphous chunks of code. Many traits are believed to be the result of the interaction of multiple genes and often reflective of environmental influences as well. Even identical twins do not always have the same personalities, problems, or diseases. Nevertheless, the project is an important and highly visible step in the direction of linking genetic heritage with a variety of conditions. This step has also raised people's awareness of the profound social and ethical issues associated with the complete mastery of human genetics that the project appears to promise in the not-so-distant future. The issues include the following:
• Genetic testing and privacy: What will employers and insurance companies do with the information about individuals' susceptibility to particular diseases?
• The question of "designer babies": Is it right for parents to choose their children's gender, height, weight, coloration, athletic ability, or intelligence?
• The essential nature of human individuality and identity: Should the code ever be duplicated to produce a new human, and if it is, will this clone be the same person, or a new one?
• The relations between genetics and ethnic identity, genetics and personality, and genetics and human behavior: How much of our decision making is based in biology and how much is actually a matter of choice?

Against this backdrop, public controversies have raged about the heritability of homosexuality (the "gay gene" idea), of obesity (the so-called "fat gene"), and of individual predispositions to mentally disturbed, aggressive, or criminal behavior. The idea that so much of human behavior might be "in the genes" represents an assault on the Western legal system (and some Western theology) by undermining the presumption that humans make behavioral choices that are free and that may be rationally determined.

Some worry that the kind of genetic determinism that this line of research seems to reinforce will blind us to the social and environmental determinants of behavior, such as learned social values, economic influences, and family dynamics. At the same time the eventual promise of the project may be to enable us to transcend the tyranny of biology, making possible the achievement of human control over human evolution and destiny to an unprecedented degree. But the minute we have the capability to correct defective genes, we will be faced with the dilemma of having to choose which human characteristics actually fall in the "defect" category. As the old saying goes, we must be careful what we ask for—we might get it.

Despite the ethical challenges that a complete knowledge of human genetics may eventually engender, there are some cases where social consensus on the right course of action is more likely than in other cases. If diseases such as cystic fibrosis or diabetes can be treated with gene therapy, if dysfunctional organs can be replaced with substitutes from modified animals, or if nerve cells or other critical human tissues can be made to regenerate themselves, the benefits would appear overwhelming.

In fact, most people in both North America and Europe (the areas where opinion data are generally available) are much more supportive of medical biotechnology than of agricultural biotechnology, most likely because the benefits to the quality of human life are so readily apparent for medical interventions. Yet in each of these examples - gene therapy, xenotransplantation, and the use of stem cells - substantial controversy has arisen.

Artificial Intelligence - A.I.


Artificial intelligence is the attempt to make machines that think like humans. In this article the abbreviation AI will be used to refer to this enterprise, and the phrase artificial intelligence technologies (AIT) will be reserved for new technologies that initially grew out of AI but that mimic only some aspects of human abilities, such as a subset of speech recognition or pattern recognition, while avoiding the deep problems. AIT and AI are often confused with one another.

AI proper has an important bearing on sociology in general, and social studies of science in particular, because of the light it can shed on the notion of the social. Most sociologists believe that most of a person's capacities are gained through the person's embedding in social groups. If machines could succeed in mimicking human reasoning, then either humans would have learned to "socialize" machines or there would be something wrong with the idea of "the social." At the moment, humans have no idea how to socialize machines; there are no machines that can be raised from birth and learn language within a family, nor any that are imprinted with a ready-made set of social abilities and a capacity to continue to build them through social interaction.

From time to time such abilities have been claimed for machines as they have evolved. For example, neural nets appear to be capable of learning by themselves, but only in a crude, behavioristic way, as one might train a pigeon or the like, so the deep problem of socialization has not been approached. This means that any real successes in AI would threaten the sociologist's idea of the social.
This is not the only kind of relationship between artificial intelligence and the social sciences. Sociologists are interested in the way new technologies change society, and the changes brought about by AIT are one such area of inquiry. One might think of machines of all sorts as already an integral part of society, but this is to use the notion of "the social" in a way that bears less directly on sociology as an enterprise.

Also, social studies of science have a legitimate concern with the development of AITs of various kinds, and especially their relationship to military projects.

Returning to AI proper, the attempt to automate scientific discovery is a hard case for those who would wish to maintain the idea of the social. At the heart of the sociology of scientific knowledge (SSK) is the idea that even scientific knowledge is deeply invested with the social, whereas dominant models of science take it to be a paradigm of universality divorced from social influence. Thus, imagine a human community that has developed in isolation from other communities: There would be no grounds to expect such a community to develop, say, the English language, still less the nuances of any particular dialect spoken at a given moment in history. One accepts that such capacities would not develop in the absence of social contact between the community and the social group that embodied the dialect.

On the other hand, the dominant view of science would lead us to be less surprised should such an isolated community rediscover many of our scientific and mathematical laws. It is this view of science that is challenged by SSK, which treats any body of scientific knowledge as very much like a dialect in a natural language. If the current generation of asocial machines could rediscover scientific laws on their own, this would support the dominant view and challenge the sociological view of science. The sociology of scientific knowledge is, then, a hard case for the larger argument about Al and the social. If the sociologists can hold their ground in the case of science, then the ground can be held much more easily in the case of social activity as a whole.

Attempts to develop AI can be seen, then, as an expensive experiment to test the deep ideas of sociology in general and of SSK in particular. Like all experiments, however, this one suffers from indeterminacies in its outcome associated with the experimenter's regress and the like. One important source of confusion is the confusion between AIT and AI, which is amplified by humans' ability to "repair" the deficiencies in others' communication and attribute far more competence to partners in discourse than they deserve. The need for repair is crucial to ordinary communication, because speech is normally indistinct, broken, overlaid with other sounds, and invested with allusion to shared but unspoken contexts. It is only by "reading" within context and repairing the "mistakes" that we are able to make sense of others' speech and action. On these tendencies depends the success of confidence tricksters and fraudsters of various kinds, who can rely on the "mark" to do most of the work necessary to see what they do as a competent performance.

The Big Bang theory


The Big Bang is actually not a "theory" at all, but rather a scenario or model about the early moments of our universe, for which the evidence is overwhelming.

It is a common misconception that the Big Bang was the origin of the universe. In reality, the Big Bang scenario is completely silent about how the universe came into existence in the first place. In fact, the closer we look to time "zero," the less certain we are about what actually happened, because our current descriptions of physical laws do not yet apply to such extremes of nature.
The Big Bang scenario simply assumes that space, time, and energy already existed. But it tells us nothing about where they came from or why the universe was born hot and dense to begin with.

But if space and everything with it is expanding now, then the universe must have been much denser in the past. That is, all the matter and energy (such as light) that we observe in the universe would have been compressed into a much smaller space in the past. Einstein's theory of gravity enables us to run the "movie" of the universe backwards—i.e., to calculate the density that the universe must have had in the past.
The result: any chunk of the universe we can observe—no matter how large—must have expanded from an infinitesimally small volume of space.
By determining how fast the universe is expanding now, and then "running the movie of the universe" backwards in time, we can determine the age of the universe. The result is that space started expanding 13.7 billion years ago. This number has now been experimentally determined to within 1% accuracy. It's a common misconception that the entire universe began from a point. If the whole universe is infinitely large today (and we don't yet know whether it is), then it would have been infinitely large in the past, including during the Big Bang. But any finite chunk of the universe—such as the part of the universe we can observe today—is predicted to have started from an extremely small volume.
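The simplest version of "running the movie backwards" can be sketched in a few lines: if the expansion rate had always been constant, the age of the universe would just be the reciprocal of the Hubble constant. The sketch below assumes a representative value H0 = 70 km/s/Mpc (not given in the text) and ignores the fact that the expansion rate has changed over time, which is why it only roughly matches the measured 13.7-billion-year figure.

```python
# Naive age-of-the-universe estimate: the Hubble time 1/H0.
# Assumption (not from the text): H0 = 70 km/s/Mpc, a representative
# modern value. A constant expansion rate is assumed, so this is only
# an order-of-magnitude sketch, not the precise 13.7 Gyr measurement.

KM_PER_MPC = 3.086e19        # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7   # seconds in one year

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return 1/H0 expressed in billions of years."""
    h0_per_second = h0_km_s_mpc / KM_PER_MPC   # convert H0 to units of 1/s
    seconds = 1.0 / h0_per_second              # Hubble time in seconds
    return seconds / SECONDS_PER_YEAR / 1e9    # convert seconds -> Gyr

print(round(hubble_time_gyr(70.0), 1))  # about 14 Gyr
```

That this crude estimate lands within a few percent of the measured age is a coincidence of our particular cosmic epoch; a careful calculation has to integrate the changing expansion rate over time.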

Part of the confusion is that scientists sometimes use the term "universe" when they're referring to just the part we can see ("the observable universe"). And sometimes they use the term universe to refer to everything, including the part of the universe beyond what we can see. It's also a common misconception that the Big Bang was an "explosion" that took place somewhere in space. But the Big Bang was an expansion of space itself. Every part of space participated in it. For example, the part of space occupied by the Earth, the Sun, and our Milky Way galaxy was once, during the Big Bang, incredibly hot and dense. The same holds true of every other part of the universe we can see. We observe that galaxies are rushing apart in just the way predicted by the Big Bang model. But there are other important observations that support the Big Bang.

First, astronomers have detected, throughout the universe, two chemical elements that could only have been created during the Big Bang: hydrogen and helium. Furthermore, these elements are observed in just the proportions (roughly 75% hydrogen, 25% helium) predicted to have been produced during the Big Bang. This is the nucleosynthesis of the light elements. This prediction is based on our well-established understanding of nuclear reactions—independent of Einstein's theory of gravity. Second, we can actually detect the light left over from the era of the Big Bang. This is the origin of the cosmic microwave background radiation. The blinding light that was present in our region of space has long since traveled off to the far reaches of the universe. But light from distant parts of the universe is just now arriving here at Earth, billions of years after the Big Bang. This light is observed to have all the characteristics expected from the Big Bang scenario and from our understanding of heat and light.
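The roughly 25% helium figure follows from simple bookkeeping. The standard textbook assumption (not stated in the text) is that when nuclei formed there was about 1 neutron for every 7 protons, and that essentially every neutron ended up bound into helium-4, which contains 2 protons and 2 neutrons. The mass locked into helium is then twice the neutron mass, giving Y = 2n / (n + p):

```python
# Helium mass fraction from the neutron-to-proton ratio.
# Assumption (standard textbook value, not from the text): the ratio
# froze out at about 1 neutron per 7 protons, and every neutron was
# bound into helium-4 (2 protons + 2 neutrons per nucleus).

def helium_mass_fraction(n_over_p: float) -> float:
    """Mass fraction in helium-4 if all neutrons end up bound in He-4.

    Each He-4 nucleus pairs 2 neutrons with 2 protons, so the nucleon
    mass locked in helium is twice the neutron count: Y = 2n / (n + p),
    which equals 2(n/p) / (1 + n/p).
    """
    return 2.0 * n_over_p / (1.0 + n_over_p)

print(helium_mass_fraction(1.0 / 7.0))  # 0.25 -> 25% helium, 75% hydrogen
```

With 1 neutron per 7 protons, the 2 neutrons in every 16 nucleons pull 2 protons into helium, so 4 of every 16 mass units end up as helium: exactly 25%.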

The standard Hot Big Bang model also provides a framework in which to understand the collapse of matter to form galaxies and other large-scale structures observed in the Universe today. At about 10,000 years after the Big Bang, the temperature had fallen to such an extent that the energy density of the Universe began to be dominated by massive particles, rather than the light and other radiation which had
predominated earlier. This change in the form of matter density meant that the gravitational forces between the massive particles could begin to take effect, so that any small perturbations in their density would grow. Thirteen point seven billion years later we see the results of this collapse in the structure and distribution of the galaxies.