The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Chardin
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Monthly Archives: June 2016
The Problem with Seasteading | Bottom-up
Posted: June 19, 2016 at 3:36 am
I first wrote about seasteading two years ago, shortly after the Seasteading Institute launched. The brainchild of Patri Friedman (grandson of Milton) and others, seasteading is a program for political reform based on a proliferation of self-governing ocean colonies. As I described it in 2008:
A key advantage of seasteads is what Friedman calls dynamic geography, the fact that any given seasteading unit is free to join or leave larger units within seasteading communities. Seasteading platforms would likely band together to provide common services like police protection, but with the key difference that any platform that was dissatisfied with the value it was receiving from such jurisdictions could leave them at any time. [Friedman] argues that this would move power downward, giving smaller units within society greater leverage to ensure the interests of their members are being served.
Seasteading is based on a delightfully bottom-up argument: that the problem with government is the lack of choice. If I don't like my job, my apartment, or my grocery store, I can easily pick up and go somewhere else. The threat of exit induces employers, landlords, store owners, and the like to treat us well without a lot of top-down oversight. In contrast, switching governments is hard, so governments treat us poorly. Seasteaders aim to change that.
The pragmatic incrementalism of seasteading is also appealing. Friedman doesn't have to foment a revolution, or even win an election, to give seasteading a try. If he can convince just a few hundred people of the merits of his ideas, they can go try it without needing assistance or support from the rest of us. If the experiment fails, the cost is relatively small.
Yet seasteading is a deeply flawed project. In particular, the theory of dynamic geography is based on a fundamental misunderstanding of the relationships among mobility, wealth creation, and government power. In a real-world seasteading community, powerful economic forces would cripple dynamic geography and leave seasteaders no freer than the rest of us.
To see the problem, imagine if someone developed the technology to transform my apartment building in Manhattan into a floating platform. Its owners could, at any time, float us out into the Hudson River and move to another state or country. Would they do it? Obviously not. They have hundreds of tenants who are paying good money to live in Manhattan. We'd be furious if we woke up one morning and found ourselves off the coast of South Carolina. Things get more, not less, difficult at larger scales. Imagine if Long Island (which includes the New York boroughs of Queens and Brooklyn and a lot of suburbs) were a huge ocean-going vessel. The residents of Long Island would overwhelmingly oppose moving; most of them have jobs, friends, family, churches, favorite restaurants, and other connections to the rest of the New York metro area. The value of being adjacent to Manhattan swamps whatever benefits there might be to being part of a state with lower taxes or better regulations.
Successful cities need a variety of infrastructure: roads, electricity, network connectivity, water and sewer lines, and so forth. At small scales you could probably design this infrastructure to be completely modular. But that approach doesn't scale; at some point you need expensive fixed infrastructure (multi-lane highways, bridges, water mains, subway lines, power plants) that only makes economic sense if built on a geographically stable foundation. Such infrastructure wouldn't be feasible in a dynamic city, and without such infrastructure it's hard to imagine a city of even modest size being viable.
I think the seasteaders' response to this is that the advantages of increased liberty would be so large that people would be willing to deal with the inconveniences necessary to preserve dynamic geography. But here's the thing: the question of whether the advantages of freedom (in the "leave me alone" sense) outweigh the benefits of living in large urban areas is not a theoretical one. If all you care about is avoiding the long arm of the law, that's actually pretty easy to do. Buy a cabin in the woods in Wyoming and the government will pretty much leave you alone. Pick a job that allows you to deal in cash and you can probably get away without filing a tax return. In reality, hardly anyone does this. To the contrary, people have been leaving rural areas for high-tax, high-regulation cities for decades.
Almost no one's goal in life is to maximize their liberty in this abstract sense. Rather, liberty is valuable because it enables us to achieve other goals, like raising a family, having a successful career, making friends, and so forth. To achieve those kinds of goals, you pretty much have to live near other people, conform to social norms, and make long-term investments. And people who live close together for long periods of time need a system of mechanisms for resolving disputes, which is to say they need a government.
The power of governments rests not on the immobility of real estate, but on the fact that people want to form durable relationships with other people. The residents of a seastead city would be no more enthusiastic about dynamic geography than the residents of Brooklyn. Which means that the government of the city would have the same kind of power Mayor Bloomberg has. Indeed, it would likely have more power, because the seastead city wouldn't have New Jersey a few hundred yards away, ready to take in disaffected residents.
Posted in Seasteading
Technology – Wikipedia, the free encyclopedia
Posted: at 3:35 am
This article is about the use and knowledge of techniques and processes for producing goods and services. For other uses, see Technology (disambiguation).
Technology ("science of craft", from Greek , techne, "art, skill, cunning of hand"; and -, -logia[3]) is the collection of techniques, skills, methods and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. Technology can be the knowledge of techniques, processes, etc. or it can be embedded in machines, computers, devices and factories, which can be operated by individuals without detailed knowledge of the workings of such things.
The human species' use of technology began with the conversion of natural resources into simple tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. The steady progress of military technology has brought weapons of ever-increasing destructive power, from clubs to nuclear weapons.
Technology has many effects. It has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of Earth's environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticise the pervasiveness of technology in the modern world, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.
Until recently, it was believed that the development of technology was restricted only to human beings, but 21st century scientific studies indicate that other primates and certain dolphin communities have developed simple tools and passed their knowledge to other generations.
The use of the term "technology" has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts.[4] The term was often connected to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[5]
The term "technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term's meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology". In German and other European languages, a distinction exists between technik and technologie that is absent in English, which usually translates both terms as "technology". By the 1930s, "technology" referred not only to the study of the industrial arts but to the industrial arts themselves.[6]
In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them."[7] Bain's definition remains common among scholars today, especially social scientists. But equally prominent is the definition of technology as applied science, especially among scientists and engineers, although most social scientists who study technology reject this definition.[8] More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self (techniques de soi).
Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge".[9] Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here".[10] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[11] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter."[12]
Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, fall under this definition of technology.[13] W. Brian Arthur defines technology in a similarly broad way as "a means to fulfill a human purpose".[14]
The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology", it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.
Technology can be viewed as an activity that forms or changes culture.[15] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture has, at its basis, the development of the Internet and the computer.[16] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.
The distinction between science, engineering and technology is not always clear. Science is the reasoned investigation or study of natural phenomena, aimed at discovering enduring principles among elements of the phenomenal world by employing formal techniques such as the scientific method.[17] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability and safety.
Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.
Technology is often a consequence of science and engineering although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors, by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines, such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[18]
The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, in the United States it was widely considered that technology was simply "applied science" and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush's treatise on postwar science policy, Science: The Endless Frontier: "New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature... This essential new knowledge can be obtained only through basic scientific research." In the late 1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[19][20]
The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[21] with a brain mass approximately one third that of modern humans.[22] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and a complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[23]
Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 40,000 years ago, pressure flaking provided a way to make much finer work.
The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[24] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1,000,000 BC;[25] scholarly consensus indicates that Homo erectus had controlled fire by between 500,000 BC and 400,000 BC.[26][27] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[28]
Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380,000 BC, humans were constructing temporary wood huts.[29][30] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200,000 BC and into other continents, such as Eurasia.[31]
Man's technological ascent began in earnest in what is known as the Neolithic period ("New stone age"). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. Agriculture fed larger populations, and the transition to sedentism allowed simultaneously raising more children, as infants no longer needed to be carried, as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[32][33]
With this increase in population and availability of labor came an increase in labor specialization.[34] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[35]
Continuing improvements led to the furnace and bellows and provided the ability to smelt and forge native metals (naturally occurring in relatively pure form).[36] Gold, copper, silver, and lead were such early metals. The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 8000 BC).[37] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BC). The first uses of iron alloys such as steel date to around 1400 BC.
Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailboat.[38] The earliest record of a ship under sail is shown on an Egyptian pot dating back to 3200 BC.[39] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and 'catch' basins. Similarly, the early peoples of Mesopotamia, the Sumerians, learned to use the Tigris and Euphrates rivers for much the same purposes. But more extensive use of wind and water (and even human) power required another invention.
According to archaeologists, the wheel was invented around 4000 B.C. probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe. Estimates on when this may have occurred range from 5500 to 3000 B.C., with most experts putting it closer to 4000 B.C. The oldest artifacts with drawings that depict wheeled carts date from about 3000 B.C.; however, the wheel may have been in use for millennia before these drawings were made. There is also evidence from the same period for the use of the potter's wheel. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[40]
The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. Fast (rotary) potters' wheels enabled early mass production of pottery. But it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources.
Innovation continued through the Middle Ages with the introduction of silk, the horse collar, and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.
Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight, and advancements in medicine, chemistry, physics and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their daily bread. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.
The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.
Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education; their designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation and architecture.
Generally, technicism is a reliance or confidence in technology as a benefactor of society. Taken to extreme, technicism is the belief that humanity will ultimately be able to control the entirety of existence using technology. In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[41] connect these ideas to the abdication of religion as a higher moral authority.
Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good. Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[42]
On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become ever more technological at the cost of freedom and psychological health.
Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see "The Question Concerning Technology"[43]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, "Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that 'in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.' Indeed, he promises that 'when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.'[44]" What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.[45]
Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics, for example Aldous Huxley's Brave New World and other writings, Anthony Burgess's A Clockwork Orange, and George Orwell's Nineteen Eighty-Four. In Goethe's Faust, Faust's selling his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction, such as those by Philip K. Dick and William Gibson, and films (e.g. Blade Runner, Ghost in the Shell) project highly ambivalent or cautionary attitudes toward technology's impact on human society and identity.
The late cultural critic Neil Postman distinguished tool-using societies from technological societies and, finally, what he called "technopolies," that is, societies that are dominated by the ideology of technological and scientific progress, to the exclusion or harm of other cultural practices, values and world-views.[46]
Darin Barney has written about technology's impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible, because they already give an answer to the question: a good life is one that includes the use of more and more technology.[47]
Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[48]
Another prominent critic of technology is Hubert Dreyfus, who has published books On the Internet and What Computers Still Can't Do.
Another, more infamous anti-technological treatise is Industrial Society and Its Future, written by Theodore Kaczynski (aka The Unabomber) and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure.
The notion of appropriate technology, however, was developed in the 20th century (e.g., see the work of E. F. Schumacher and of Jacques Ellul) to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The eco-village movement emerged in part due to this concern.
This article mainly focuses on American concerns, even if it can reasonably be generalized to other Western countries.
The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. [...] What's the linkage between technology and this fundamental problem?
In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[49] questions the widespread idea that automation, and more broadly technological advances, have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Basically, he stands for a neutral approach to the linkage between technology and American issues concerning unemployment and eroding wages.
He uses two main arguments to defend his point. First, because of recent technological advances, an increasing number of workers are losing their jobs. Yet, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual jobs that require "flexibility, judgment and common sense"[50] remain hard to replace with machines. Second, studies have not defined clear links between recent technology advances and the wage trends of the last decades.
Therefore, according to Jared Bernstein, instead of focusing on technology and its hypothetical influences on current American increasing unemployment and eroding wages, one needs to worry more about "bad policy that fails to offset the imbalances in demand, trade, income and opportunity."[50]
Thomas P. Hughes pointed out that because technology has been considered as a key way to solve problems, we need to be aware of its complex and varied characters to use it more efficiently.[51] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them or none of them as technologies?
Technology is often considered too narrowly: according to Thomas P. Hughes, "Technology is a creative process involving human ingenuity."[51] This definition, emphasizing creativity, avoids unbounded definitions that may mistakenly include cooking technologies, but it also highlights the prominent role of humans and therefore their responsibilities for the use of complex technological systems.
Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argued that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[51] For instance, Evgeny Morozov particularly challenges two concepts: Internet-centrism and solutionism.[52] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology and especially thanks to the internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal's critique of Morozov's theory, to ignore these will lead to unexpected consequences that could eventually cause more damage than the problems they seek to address.[53] Benjamin Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[54]
Therefore, recognition of the limitations of technology, and more broadly of scientific knowledge, is needed, especially in cases dealing with environmental justice and health issues. Gwen Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists' and engineers' new comprehension of their role. Such an approach to technology and science "[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions".[55]
Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane but it is all technology, and its exploitation is the foundation of all competitive advantage.
Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and it is what was used to transform the US into a superpower. It was not economic-based planning.
In 1983 Project Socrates was initiated in the US intelligence community to determine the source of declining US economic and military competitiveness. Project Socrates concluded that technology exploitation is the foundation of all competitive advantage and that declining US competitiveness was from decision-making in the private and public sectors switching from technology exploitation (technology-based planning) to money exploitation (economic-based planning) at the end of World War II.
Project Socrates determined that to rebuild US competitiveness, decision making throughout the US had to readopt technology-based planning. Project Socrates also determined that countries like China and India had continued executing technology-based (while the US took its detour into economic-based) planning, and as a result had considerably advanced the process and were using it to build themselves into superpowers. To rebuild US competitiveness the US decision-makers needed to adopt a form of technology-based planning that was far more advanced than that used by China and India.
Project Socrates determined that technology-based planning makes an evolutionary leap forward every few hundred years and the next evolutionary leap, the Automated Innovation Revolution, was poised to occur. In the Automated Innovation Revolution the process for determining how to acquire and utilize technology for a competitive advantage (which includes R&D) is automated so that it can be executed with unprecedented speed, efficiency and agility.
Project Socrates developed the means for automated innovation so that the US could lead the Automated Innovation Revolution in order to rebuild and maintain the country's economic competitiveness for many generations.[56][57][58]
The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees, some dolphin communities,[59][60] and crows.[61][62] Considering a more generic perspective of technology as ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.
The ability to make and use tools was once considered a defining characteristic of the genus Homo.[63] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion of the use of technology as unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[64] West African chimpanzees also use stone hammers and anvils for cracking nuts,[65] as do capuchin monkeys of Boa Vista, Brazil.[66]
Posted in Technology
CATHOLIC ENCYCLOPEDIA: Rationalism – NEW ADVENT
Posted: at 3:34 am
(Latin ratio, reason, the faculty of the mind which forms the ground of calculation, i.e. discursive reason. See APOLOGETICS; ATHEISM; BIBLE; DEISM; EMPIRICISM; ETHICS; BIBLICAL EXEGESIS; FAITH; MATERIALISM; MIRACLE; REVELATION).
The term is used: (1) in an exact sense, to designate a particular moment in the development of Protestant thought in Germany; (2) in a broader, and more usual, sense to cover the view (in relation to which many schools may be classed as rationalistic) that the human reason, or understanding, is the sole source and final test of all truth. It has further: (3) occasionally been applied to the method of treating revealed truth theologically, by casting it into a reasoned form, and employing philosophical categories in its elaboration. These three uses of the term will be discussed in the present article.
The German school of theological Rationalism formed a part of the more general movement of the eighteenth-century "Enlightenment". It may be said to owe its immediate origin to the philosophical system of Christian Wolff (1679-1754), which was a modification, with Aristotelean features, of that of Leibniz, especially characterized by its spiritualism, determinism, and dogmatism. This philosophy and its method exerted a profound influence upon contemporaneous German religious thought, providing it with a rationalistic point of view in theology and exegesis. German philosophy in the eighteenth century was, as a whole, tributary to Leibniz, whose "Théodicée" was written principally against the Rationalism of Bayle: it was marked by an infiltration of English Deism and French Materialism, to which the Rationalism at present considered had great affinity, and towards which it progressively developed: and it was vulgarized by its union with popular literature. Wolff himself was expelled from his chair at the University of Halle on account of the Rationalistic nature of his teaching, principally owing to the action of Lange (1670-1744; cf. "Causa Dei et religionis naturalis adversus atheismum", and "Modesta Disputatio", Halle, 1723). Retiring to Marburg, he taught there until 1740, when he was recalled to Halle by Frederick II. Wolff's attempt to demonstrate natural religion rationally was in no sense an attack upon revelation. As a "supranaturalist" he admitted truths above reason, and he attempted to support by reason the supernatural truths contained in Holy Scripture. But his attempt, while it incensed the pietistic school and was readily welcomed by the more liberal and moderate among the orthodox Lutherans, in reality turned out to be strongly in favour of the Naturalism that he wished to condemn. Natural religion, he asserted, is demonstrable; revealed religion is to be found in the Bible alone. But in his method of proof of the authority of Scripture recourse was had to reason, and thus the human mind became, logically, the ultimate arbiter in the case of both. Supranaturalism in theology, which it was Wolff's intention to uphold, proved incompatible with such a philosophical position, and Rationalism took its place. This, however, is to be distinguished from pure Naturalism, to which it led, but with which it never became theoretically identified. Revelation was not denied by the Rationalists; though, as a matter of fact, if not of theory, it was quietly suppressed by the claim, with its ever-increasing application, that reason is the competent judge of all truth. Naturalists, on the other hand, denied the fact of revelation. As with Deism and Materialism, the German Rationalism invaded the department of Biblical exegesis. Here a destructive criticism, very similar to that of the Deists, was levelled against the miracles recorded in, and the authenticity of, the Holy Scripture. Nevertheless, the distinction between Rationalism and Naturalism still obtained. The great Biblical critic Semler (1725-91), who is one of the principal representatives of the school, was a strong opponent of the latter; in company with Teller (1734-1804) and others he endeavoured to show that the records of the Bible have no more than a local and temporary character, thus attempting to safeguard the deeper revelation, while sacrificing to the critics its superficial vehicle. He makes the distinction between theology and religion (by which he signifies ethics).
The distinction made between natural and revealed religion necessitated a closer definition of the latter. For Supernaturalists and Rationalists alike religion was held to be "a way of knowing and worshipping the Deity", but consisting chiefly, for the Rationalists, in the observance of God's law. This identification of religion with morals, which at the time was utilitarian in character (see UTILITARIANISM), led to further developments in the conceptions of the nature of religion, the meaning of revelation, and the value of the Bible as a collection of inspired writings. The earlier orthodox Protestant view of religion as a body of truths published and taught by God to man in revelation was in process of disintegration. In Semler's distinction between religion (ethics) on the one hand and theology on the other, with Herder's similar separation of religion from theological opinions and religious usages, the cause of the Christian religion, as they conceived it, seemed to be put beyond the reach of the shock of criticism, which, by destroying the foundations upon which it claimed to rest, had gone so far to discredit the older form of Lutheranism. Kant's (1724-1804) criticism of the reason, however, formed a turning-point in the development of Rationalism. For a full understanding of his attitude, the reader must be acquainted with the nature of his pietistic upbringing and later scientific and philosophical formation in the Leibniz-Wolff school of thought (see PHILOSOPHY OF KANT). As far as concerns the point that occupies us at present, Kant was a Rationalist. For him religion was coextensive with natural, though not utilitarian, morals. When he met with the criticisms of Hume and undertook his famous "Kritik", his preoccupation was to safeguard his religious opinions, his rigorous morality, from the danger of criticism. This he did, not by means of the old Rationalism, but by throwing discredit upon metaphysics. The accepted proofs of the existence of God, immortality, and liberty were thus, in his opinion, overthrown, and the well-known set of postulates of the "categoric imperative" put forward in their place. This, obviously, was the end of Rationalism in its earlier form, in which the fundamental truths of religion were set out as demonstrable by reason. But, despite the shifting of the burden of religion from the pure to the practical reason, Kant himself never seems to have reached the view (to which all his work pointed) that religion is not mere ethics, "conceiving moral laws as divine commands", no matter how far removed from Utilitarianism; not an affair of the mind, but of the heart and will; and that revelation does not reach man by way of an exterior promulgation, but consists in a personal adaptation towards God. This conception was reached gradually with the advance of the theory that man possesses a religious sense, or faculty, distinct from the rational (Fries, 1773-1843; Jacobi, 1743-1819; Herder, 1744-1803; all opposed to the Intellectualism of Kant), and ultimately found expression with Schleiermacher (1768-1834), for whom religion is to be found neither in knowledge nor in action, but in a peculiar attitude of mind which consists in the consciousness of absolute dependence upon God. Here the older distinction between natural and revealed religion disappears. All that can be called religion, the consciousness of dependence, is at the same time revelational, and all religion is of the same character.
There is no special revelation in the older Protestant (the Catholic) sense, but merely this attitude of dependence brought into being in the individual by the teaching of various great personalities who, from time to time, have manifested an extraordinary sense of the religious. Schleiermacher was a contemporary of Fichte, Schelling, and Hegel, whose philosophical speculations had influence, with his own, in ultimately subverting Rationalism as here dealt with. The movement may be said to have ended with him, in the opinion of Teller "the greatest theologian that the Protestant Church has had since the period of the Reformation". The majority of modern Protestant theologians accept his views, not, however, to the exclusion of knowledge as a basis of religion.
Parallel with the development of the philosophical and theological views as to the nature of religion and the worth of revelation, which provided it with its critical principles, took place an exegetical evolution. The first phase consisted in replacing the orthodox Protestant doctrine (i.e. that the Sacred Scriptures are the Word of God) by a distinction between the Word of God contained in the Bible and the Bible itself (Töllner, Herder), though the Rationalists still held that the purer source of revelation lies rather in the written than in the traditional word. This distinction led inevitably to the destruction of the rigid view of inspiration, and prepared the ground for the second phase. The principle of accommodation was now employed to explain the difficulties raised by the Scripture records of miraculous events and demoniacal manifestations (Senf, Vogel), and arbitrary methods of exegesis were also used to the same end (Paulus, Eichhorn). In the third phase Rationalists had reached the point of allowing the possibility of mistakes having been made by Christ and the Apostles, at any rate with regard to non-essential parts of religion. All the devices of exegesis were employed vainly; and, in the end, Rationalists found themselves forced to admit that the authors of the New Testament must have written from a point of view different from that which a modern theologian would adopt (Henke, Wegscheider). This principle, which is sufficiently elastic to admit of usage by nearly every variety of opinion, was admitted by several of the Supernaturalists (Reinhard, Storr), and is very generally accepted by modern Protestant divines, in the rejection of verbal inspiration. Herder is very clear on the distinction: the truly inspired must be discerned from that which is not; and de Wette lays down as the canon of interpretation "the religious perception of the divine operation, or of the Holy Spirit, in the sacred writers as regards their belief and inspiration, but not respecting their faculty of forming ideas..." In an extreme form it may be seen employed in such works as Strauss's "Leben Jesu", where the hypothesis of the mythical nature of miracles is developed to a greater extent than by Schleiermacher or de Wette.
Rationalism, in the broader, popular meaning of the term, is used to designate any mode of thought in which human reason holds the place of supreme criterion of truth; in this sense, it is especially applied to such modes of thought as contrasted with faith. Thus Atheism, Materialism, Naturalism, Pantheism, Scepticism, etc., fall under the head of rationalistic systems. As such, the rationalistic tendency has always existed in philosophy, and has generally shown itself powerful in all the critical schools. As has been noted in the preceding paragraph, German Rationalism had strong affinities with English Deism and French Materialism, two historic forms in which the tendency has manifested itself. But with the vulgarization of the ideas contained in the various systems that composed these movements, Rationalism has degenerated. It has become connected in the popular mind with the shallow and misleading philosophy frequently put forward in the name of science, so that a double confusion has arisen, in which:
This Rationalism is now rather a spirit, or attitude, ready to seize upon any arguments, from any source and of any or no value, to urge against the doctrines and practices of faith. Beside this crude and popular form it has taken, for which the publication of cheap reprints and a vigorous propaganda are mainly responsible, there runs the deeper and more thoughtful current of critical-philosophical Rationalism, which either rejects religion and revelation altogether or treats them in much the same manner as did the Germans. Its various manifestations have little in common in method or content, save the general appeal to reason as supreme. No better description of the position can be given than the statements of the objects of the Rationalist Press Association. Among these are: "To stimulate the habits of reflection and inquiry and the free exercise of individual intellect . . . and generally to assert the supremacy of reason as the natural and necessary means to all such knowledge and wisdom as man can achieve". A perusal of the publications of the same will show in what sense this representative body interprets the above statement. It may be said, finally, that Rationalism is the direct and logical outcome of the principles of Protestantism; and that the intermediary form, in which assent is given to revealed truth as possessing the imprimatur of reason, is only a phase in the evolution of ideas towards general disbelief. Official condemnations of the various forms of Rationalism, absolute and mitigated, are to be found in the Syllabus of Pius IX.
The term Rationalism is perhaps not usually applied to the theological method of the Catholic Church. All forms of theological statement, however, and pre-eminently the dialectical form of Catholic theology, are rationalistic in the truest sense. Indeed, the claim of such Rationalism as is dealt with above is directly met by the counter claim of the Church: that it is at best but a mutilated and unreasonable Rationalism, not worthy of the name, while that of the Church is rationally complete, and integrated, moreover, with super-rational truth. In this sense Catholic theology presupposes the certain truths of natural reason as the preambula fidei, philosophy (the ancilla theologiae) is employed in the defence of revealed truth (see APOLOGETICS), and the content of Divine revelation is treated and systematized in the categories of natural thought. This systematization is carried out both in dogmatic and moral theology. It is a process contemporaneous with the first attempt at a scientific statement of religious truth, comes to perfection of method in the works of such writers as St. Thomas Aquinas and St. Alphonsus, and is consistently employed and developed in the Schools.
APA citation. Aveling, F. (1911). Rationalism. In The Catholic Encyclopedia. New York: Robert Appleton Company. http://www.newadvent.org/cathen/12652a.htm
MLA citation. Aveling, Francis. "Rationalism." The Catholic Encyclopedia. Vol. 12. New York: Robert Appleton Company, 1911. <http://www.newadvent.org/cathen/12652a.htm>.
Ecclesiastical approbation. Nihil Obstat. June 1, 1911. Remy Lafort, S.T.D., Censor. Imprimatur. +John Cardinal Farley, Archbishop of New York.
Posted in Rationalism
Ore.Bz – Cryptocurrency Exchange
Posted: at 3:31 am
Ore.Bz
Low fee cryptocurrency exchange
A comfortable and responsive Ajax-updated trading platform interface will not leave any trader indifferent, beginner or professional.
Simple, easy, and fast deposits. Only 6 confirmations are required to confirm receipt of cryptocoins. USD and EUR are credited immediately after payment reaches the payment gateway. Withdrawals are always immediate!
Add your USD and EUR deposits in whatever way is convenient for you, through safe and secure payment gateways. You can use your favorite payment systems or credit cards.
The Ore.Bz exchange supports API v2, which you can use for trading bots, monitoring, obtaining current rates, and more. API access is available in your account panel after registration.
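By way of illustration only, a minimal Python sketch for polling a current rate through such an API might look like the following. The endpoint path, query parameter, and JSON field names here are assumptions made for the example, not the documented Ore.Bz API v2 interface; consult your account panel for the real specification.

# Hypothetical sketch: the endpoint path, parameters, and JSON fields below
# are assumptions, not the documented Ore.Bz API v2 interface.
import requests

API_BASE = "https://ore.bz/api/v2"  # assumed base URL

def get_ticker(pair="BTC_USD"):
    """Fetch the current rate for a trading pair (hypothetical endpoint)."""
    resp = requests.get(f"{API_BASE}/ticker", params={"pair": pair}, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # Assumed response fields: "last", "bid", "ask"
    return data.get("last"), data.get("bid"), data.get("ask")

if __name__ == "__main__":
    last, bid, ask = get_ticker("BTC_USD")
    print(f"BTC_USD last={last} bid={bid} ask={ask}")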
Below is the list of currently supported cryptocurrencies. The list may change!
Bitcoin is a digital asset and a payment system invented by Satoshi Nakamoto, who published the invention in 2008 and released it as open-source software in 2009. The system is peer-to-peer - users can transact directly without an intermediary.
Ethereum is a cryptocurrency and a blockchain platform with smart contract functionality. It provides a decentralized virtual machine. Ethereum was proposed by Vitalik Buterin in late 2013 and the network went live on 30 July 2015.
Dash (formerly known as Darkcoin and XCoin) is an open source peer-to-peer cryptocurrency that uses a system called Darksend to add privacy to transactions. It was rebranded from "Darkcoin" to "Dash" on March 25, 2015, a portmanteau of "Digital Cash".
Litecoin is a peer-to-peer cryptocurrency and open source software project released under the MIT/X11 license. Inspired by and technically nearly identical to bitcoin (BTC), Litecoin creation and transfer is based on an open source protocol and is not managed by any central authority.
Dogecoin is a cryptocurrency featuring a likeness of the Shiba Inu dog from the "Doge" Internet meme as its logo. It was introduced on December 8, 2013. Started as a "joke currency" in late 2013, Dogecoin quickly developed its own online community and reached a capitalization of USD 60 million in January 2014.
GBCGoldCoin is a Scrypt-algorithm cryptocurrency. Under its proof-of-stake (PoS) model, extra coins are earned on the basis of the coins already held. GBC Gold Coin advertises a 60% return per annum on balances; GBCGoldCoin is a 100% PoS cryptocurrency.
BlackCoin is a peer-to-peer cryptocurrency. BlackCoin uses a proof-of-stake system and is open-source. BlackCoin was created by the developer Rat4, with the goal of proving that BlackCoin's way of disabling proof-of-work is stable and secure.
You can leave your comments and suggestions in the following form to contact us.
Visit link:
Ore.Bz - Cryptocurrency Exchange
Posted in Cryptocurrency
Comments Off on Ore.Bz – Cryptocurrency Exchange
Jiaogulan the Chinese Herb of Immortality | Underground …
Posted: at 3:28 am
The Jiaogulan plant from the far-distant mountains of China is so powerful that locals call it the herb of immortality. In the Guizhou Province of China, one of the places where jiaogulan (literally, twisting-vine orchid) grows most abundantly, the natives have nicknamed the herb they regularly consume as tea the herb of immortality. They also credit it with their reportedly long life spans.
The list of jiaogulan's benefits is impressive.
Jiaogulan may even help inhibit cancer growth. At its essence, this Chinese herb is an incredible balancing compound. If something (like your cholesterol, for instance) is higher than it should be, it lowers it. And if something is too low, it raises it.
The reason jiaogulan can work in such a range of ways in your body is that it's a powerful adaptogen. All adaptogens, including garlic, ginkgo biloba, and ginseng, are substances that can, once ingested, be adapted by your body in whatever way necessary to restore balance. Only 1 out of every 4,000 plants meets the criteria to be classified as an adaptogen, also called a biological response modifier.
The criteria for adaptogens were devised by Russian doctors N.V. Lazarev and I.I. Brekhman in the mid-20th century.
Jiaogulan may actually be the most powerful adaptogen out there.
For example, in studies conducted by Dr. Tsunematsu Takemoto and others over the course of a decade, jiaogulan was found to have 82 saponins (the chemical source of adaptogenesis) compared to 28 found in standard ginseng.
Unlike conventional drugs, adaptogens have no side effects, do nothing to disturb the body, and work only when the body requires them.
Jiaogulan's adaptogenic properties are especially evident in the way it benefits your brain. Jiaogulan has a biphasic effect on brain functioning, meaning that it can energize or calm the system depending on what's needed.
And while it may seem impossible, jiaogulan is both a weight-loss and a weight-gain aid. Again, it's due to the herb's adaptogenic nature. It interacts with your digestive system and corrects any areas of imbalance. If you're overweight, it helps your body process food more efficiently. And if you're underweight, it helps your body absorb the maximum amount of nutrients from everything you consume.
Known officially as Gynostemma pentaphyllum, this hardy climbing vine flourishes even when untended, like some other members of the Cucurbitaceae family that includes cucumbers, gourds, and melons.
In English, the plant has a number of names, from the technical (Five-Leaf Ginseng and Southern Ginseng) to the fantastical (Miracle Grass and Fairy Herb). But whatever you call it, the real mystery is why this amazing plant is not yet found in every household.
The leaves of the plant can be eaten directly, put in a salad, or stored for tea. They have a sweet, fresh taste and are frequently used as alternative sweeteners in Asia.
Adding this adaptogenic, antioxidant supplement to your health regimen is such a simple step you might wonder why you haven't done it already.
Jiaogulan is most often consumed as a tea, but you can also get it in extract, pill, and capsule form. It's readily available from many alternative medicine pharmacies, natural foods stores, and online sources.
Because jiaogulan is non-toxic, there is no risk of overdose. A recommended starter dose is 75-225 mg taken 2-3 times a day.
Go here to read the rest:
Jiaogulan the Chinese Herb of Immortality | Underground ...
Posted in Immortality Medicine
Comments Off on Jiaogulan the Chinese Herb of Immortality | Underground …
JET 14(1) – April 2005 – Bostrom – Transhumanist Thought
Posted: at 3:28 am
Nick Bostrom Faculty of Philosophy, Oxford University
Journal of Evolution and Technology - Vol. 14 Issue 1 - April 2005 http://jetpress.org/volume14/bostrom.html
PDF Version
This paper traces the cultural and philosophical roots of transhumanist thought and describes some of the influences and contributions that led to the development of contemporary transhumanism.
The human desire to acquire new capacities is as ancient as our species itself. We have always sought to expand the boundaries of our existence, be it socially, geographically, or mentally. There is a tendency in at least some individuals always to search for a way around every obstacle and limitation to human life and happiness.
Ceremonial burial and preserved fragments of religious writings show that prehistoric man and woman were deeply disturbed by the death of loved ones. Although the belief in an afterlife was common, this did not preclude efforts to extend the present life. In the Sumerian Epic of Gilgamesh (approx. 1700 B.C.), a king sets out on a quest for immortality. Gilgamesh learns that there exists a natural means: an herb that grows at the bottom of the sea.[1] He successfully retrieves the plant, but a snake steals it from him before he can eat it. In later times, explorers sought the Fountain of Youth, alchemists labored to concoct the Elixir of Life, and various schools of esoteric Taoism in China strove for physical immortality by way of control over or harmony with the forces of nature. The boundary between mythos and science, between magic and technology, was blurry, and almost all conceivable means to the preservation of life were attempted by somebody or other. Yet while explorers made many interesting discoveries and alchemists invented some useful things, such as new dyes and improvements in metallurgy, the goal of life-extension proved elusive.
The quest to transcend our natural confines, however, has long been viewed with ambivalence. On the one hand there is fascination. On the other there is the concept of hubris: that some ambitions are off-limits and will backfire if pursued. The ancient Greeks exhibited this ambivalence in their mythology. Prometheus stole the fire from Zeus and gave it to the humans, thereby permanently improving the human condition. Yet for this act he was severely punished by Zeus. In the myth of Daedalus, the gods are repeatedly challenged, quite successfully, by the clever engineer and artist who uses non-magical means to extend human capabilities. In the end, however, disaster ensues when his son Icarus ignores paternal warnings and flies too close to the sun, causing the wax in his wings to melt.
Medieval Christianity had similarly conflicted views about the pursuits of the alchemists, who tried to transmute substances, create homunculi in test tubes, and invent a panacea. Some scholastics, following the anti-experimentalist teachings of Aquinas, believed that alchemy was an ungodly activity. There were allegations that it involved the invocation of daemonic powers. But other theologians, such as Albertus Magnus, defended the practice.[2]
The otherworldliness and stale scholastic philosophy that dominated Europe during the Middle Ages gave way to a renewed intellectual vigor in the Renaissance. The human being and the natural world again became legitimate objects of study. Renaissance humanism encouraged people to rely on their own observations and their own judgment rather than to defer in every matter to religious authorities. Renaissance humanism also created the ideal of the well-rounded person, one who is highly developed scientifically, morally, culturally, and spiritually. A landmark of the period is Giovanni Pico della Mirandola's Oration on the Dignity of Man (1486), which proclaims that man does not have a ready-made form and is responsible for shaping himself:
We have made you a creature neither of heaven nor of earth, neither mortal nor immortal, in order that you may, as the free and proud shaper of your own being, fashion yourself in the form you may prefer. It will be in your power to descend to the lower, brutish forms of life; you will be able, through your own decision, to rise again to the superior orders whose life is divine.[3]
The Age of Enlightenment is often said to have started with the publication of Francis Bacon's Novum Organum, the "new tool" (1620), which proposes a scientific methodology based on empirical investigation rather than a priori reasoning.[4] Bacon advocated the project of "effecting all things possible," by which he meant using science to achieve mastery over nature in order to improve the living condition of human beings. The heritage from the Renaissance combines with the influence of Isaac Newton, Thomas Hobbes, John Locke, Immanuel Kant, the Marquis de Condorcet, and others to form the basis for rational humanism, which emphasizes empirical science and critical reason rather than revelation and religious authority as ways of learning about the natural world and our place within it, and of providing a grounding for morality. Transhumanism has roots in rational humanism.
In the 18th and 19th centuries we begin to see glimpses of the idea that even humans themselves can be developed through the appliance of science. Condorcet speculated about extending human life span through medical science:
Would it be absurd now to suppose that the improvement of the human race should be regarded as capable of unlimited progress? That a time will come when death would result only from extraordinary accidents or the more and more gradual wearing out of vitality, and that, finally, the duration of the average interval between birth and wearing out has itself no specific limit whatsoever? No doubt man will not become immortal, but cannot the span constantly increase between the moment he begins to live and the time when naturally, without illness or accident, he finds life a burden?[5]
Benjamin Franklin longed wistfully for suspended animation, foreshadowing the cryonics movement:
I wish it were possible... to invent a method of embalming drowned persons, in such a manner that they might be recalled to life at any period, however distant; for having a very ardent desire to see and observe the state of America a hundred years hence, I should prefer to an ordinary death, being immersed with a few friends in a cask of Madeira, until that time, then to be recalled to life by the solar warmth of my dear country! But... in all probability, we live in a century too little advanced, and too near the infancy of science, to see such an art brought in our time to its perfection.[6]
After the publication of Darwin's Origin of Species (1859), it became increasingly plausible to view the current version of humanity not as the endpoint of evolution but rather as a possibly quite early phase.[7] The rise of scientific physicalism might also have contributed to the foundations of the idea that technology could be used to improve the human organism. For example, a simple kind of materialist view was boldly proposed in 1750 by the French physician and materialist philosopher Julien Offray de La Mettrie in L'Homme Machine, where he argued that "man is but an animal, or a collection of springs which wind each other up."[8] If human beings are constituted by matter that obeys the same laws of physics that operate outside us, then it should in principle be possible to learn to manipulate human nature in the same way that we manipulate external objects.
It has been said that the Enlightenment expired as the victim of its own excesses. It gave way to Romanticism, and to latter-day reactions against the rule of instrumental reason and the attempt to rationally control nature, such as can be found in some postmodernist writings, the New Age movement, deep environmentalism, and in some parts of the anti-globalization movement. However, the Enlightenment's legacy, including a belief in the power of human rationality and science, is still an important shaper of modern culture. In his famous 1784 essay "What Is Enlightenment?", Kant summed it up as follows:
Enlightenment is man's leaving his self-caused immaturity. Immaturity is the incapacity to use one's own understanding without the guidance of another. Such immaturity is self-caused if its cause is not lack of intelligence, but lack of determination and courage to use one's intelligence without being guided by another. The motto of enlightenment is therefore: Sapere aude! Have courage to use your own intelligence![9]
It might be thought that the German philosopher Friedrich Nietzsche (1844-1900) would have been a major inspiration for transhumanism. Nietzsche is famous for his doctrine of der Übermensch (the "overman"):
I teach you the overman. Man is something that shall be overcome. What have you done to overcome him? All beings so far have created something beyond themselves; and do you want to be the ebb of this great flood and even go back to the beasts rather than overcome man?"[10]
What Nietzsche had in mind, however, was not technological transformation but rather a kind of soaring personal growth and cultural refinement in exceptional individuals (who he thought would have to overcome the life-sapping "slave-morality" of Christianity). Despite some surface-level similarities with the Nietzschean vision, transhumanism, with its Enlightenment roots, its emphasis on individual liberties, and its humanistic concern for the welfare of all humans (and other sentient beings), probably has as much or more in common with Nietzsche's contemporary J.S. Mill, the English liberal thinker and utilitarian.
In 1923, the noted British biochemist J. B. S. Haldane published the essay Daedalus: Science and the Future, in which he argued that great benefits would come from controlling our own genetics and from science in general. He projected a future society that would be richer and have abundant clean energy, where genetics would be employed to make people taller, healthier, and smarter, and where the use of ectogenesis (gestating fetuses in artificial wombs) would be commonplace. He also commented on what has in more recent years become known as the "yuck factor":
The chemical or physical inventor is always a Prometheus. There is no great invention, from fire to flying, which has not been hailed as an insult to some god. But if every physical and chemical invention is a blasphemy, every biological invention is a perversion. There is hardly one which, on first being brought to the notice of an observer from any nation which has not previously heard of their existence, would not appear to him as indecent and unnatural.[11]
Haldane's essay became a bestseller and set off a chain reaction of future-oriented discussions, including The World, the Flesh and the Devil by J. D. Bernal (1929)[12], which speculated about space colonization and bionic implants as well as mental improvements through advanced social science and psychology; the works of Olaf Stapledon, a philosopher and science fiction author; and the essay "Icarus: the Future of Science" (1924) by Bertrand Russell.[13] Russell took a more pessimistic view, arguing that without more kindliness in the world, technological power would mainly serve to increase men's ability to inflict harm on one another. Science fiction authors such as H. G. Wells and Stapledon got many people thinking about the future evolution of the human race.
Aldous Huxley's Brave New World, published in 1932, has had an enduring impact on debates about human technological transformation[14] matched by few other works of fiction (a possible exception would be Mary Shelley's Frankenstein, 1818[15]). Huxley describes a dystopia where psychological conditioning, promiscuous sexuality, biotechnology, and the opiate drug "soma" are used to keep the population placid and contented in a static, totally conformist caste society that is governed by ten world controllers. Children are manufactured in fertility clinics and artificially gestated. The lower castes are chemically stunted or deprived of oxygen during their maturation process to limit their physical and intellectual development. From birth, members of every caste are indoctrinated during their sleep, by recorded voices repeating the slogans of the official "Fordist" religion, and are conditioned to believe that their own caste is the best one to belong to. The society depicted in Brave New World is often compared and contrasted with that of another influential 20th century dystopia, George Orwell's 1984.[16] 1984 features a more overt form of oppression, including ubiquitous surveillance by "Big Brother" and brutal police coercion. Huxley's world controllers, by contrast, rely on more "humane means", including bio-engineered predestination, soma, and psychological conditioning to prevent people from wanting to think for themselves. Herd-mentality and promiscuity are promoted, while high art, individuality, knowledge of history, and romantic love are discouraged. It should be noted that in neither 1984 nor Brave New World has technology been used to increase human capacities. Rather, society is set up to repress the full development of humanity. Both dystopias curtail scientific and technological exploration for fear of upsetting the social equilibrium. Nevertheless, Brave New World in particular has become an emblem of the dehumanizing potential of the use of technology to promote social conformism and shallow contentment.
In the postwar era, many optimistic futurists who had become suspicious of collectively orchestrated social change found a new home for their hopes in scientific and technological progress. Space travel, medicine, and computers seemed to offer a path to a better world. The shift of attention also reflected the breathtaking pace of development taking place in these fields. Science had begun to catch up with speculation. Yesterday's science fiction was turning into today's science fact, or at least into a somewhat realistic mid-term prospect.
Transhumanist themes during this period were discussed and analyzed chiefly in the science fiction literature. Authors such as Arthur C. Clarke, Isaac Asimov, Robert Heinlein, and Stanislaw Lem explored how technological development could come to profoundly alter the human condition.
The word "transhumanism " appears to have been first used by Aldous Huxleys brother, Julian Huxley, a distinguished biologist (who was also the first director-general of UNESCO and founder of the World Wildlife Fund). In Religion Without Revelation (1927), he wrote:
The human species can, if it wishes, transcend itself not just sporadically, an individual here in one way, an individual there in another way but in its entirety, as humanity. We need a name for this new belief. Perhaps transhumanism will serve: man remaining man, but transcending himself, by realizing new possibilities of and for his human nature.[17]
Human-like automata have always fascinated the human imagination. Mechanical engineers since the early Greeks have constructed clever self-moving devices.
In Judaic mysticism, a "golem" refers to an animated being crafted from inanimate material. In the early golem stories, a golem could be created by a holy person who was able to share some of God's wisdom and power (although the golem, not being able to speak, was never more than a shadow of God's creations). Having a golem servant was the ultimate symbol of wisdom and holiness. In the later stories, which had been influenced by the more Islamic concern about humanity getting too close to God, the golem became a creation of overreaching mystics, who would inevitably be punished for their blasphemy. The story of the Sorcerer's Apprentice is a variation of this theme: the apprentice animates a broomstick to fetch water but is unable to make the broom stop: like Frankenstein, a story of technology out of control. The word "robot" was coined by the Czech Karel Čapek in his dark play R.U.R. (1921), in which a robot labor force destroys its human creators.[18] With the invention of the electronic computer, the idea of human-like automata graduated from the kindergarten of mythology to the school of science fiction (e.g. Isaac Asimov, Stanislaw Lem, Arthur C. Clarke) and eventually to the college of technological prediction.
Could continued progress in artificial intelligence lead to the creation of machines that can think in the same general way as human beings? Alan Turing gave an operational definition to this question in his classic "Computing Machinery and Intelligence" (1950), and predicted that computers would eventually pass what came to be known as the Turing Test. (In the Turing Test, a human experimenter interviews a computer and another human via a text interface, and the computer succeeds if the interviewer cannot reliably distinguish the computer from the human.)[19] Much ink has been spilt in debates on whether this test furnishes a necessary and sufficient condition for a computer being able to think, but what matters more from a practical perspective is whether, and if so when, computers will be able to match human performance on tasks involving general reasoning ability. With the benefit of hindsight, we can say that many of the early AI researchers turned out to be overoptimistic about the timescale for this hypothetical development. Of course, the fact that we have not yet reached human-level artificial intelligence does not mean that we never will, and a number of people, e.g. Marvin Minsky, Hans Moravec, Ray Kurzweil, and Nick Bostrom, have put forward reasons for thinking that this could happen within the first half of this century.[20]
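For concreteness, the interrogation protocol described above can be sketched in a few lines of Python. This is only a toy illustration: the respondent functions are placeholders (in a real test one party would be a person at a terminal and the other a candidate program), not anything specified in Turing's paper.

# Toy sketch of the Turing Test protocol described above. The respondent
# functions are placeholders, not part of any real test implementation.
import random

def human_respondent(question):
    return input(f"[human] {question}\n> ")  # a person types the answer

def machine_respondent(question):
    return "That is an interesting question."  # stand-in candidate program

def run_trial(question):
    """Ask both parties one question in random order; the interrogator then
    guesses which answer came from the machine."""
    respondents = [("A", human_respondent), ("B", machine_respondent)]
    random.shuffle(respondents)
    answers = {label: fn(question) for label, fn in respondents}
    for label, answer in answers.items():
        print(f"{label}: {answer}")
    guess = input("Which respondent is the machine, A or B? ").strip().upper()
    truth = next(label for label, fn in respondents if fn is machine_respondent)
    return guess == truth

# The machine "passes" if, over many such trials, the interrogator's guesses
# are no better than chance.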
In 1958, Stanislaw Ulam, referring to a meeting with John von Neumann, wrote:
One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.[21]
The rapidity of technological change in recent times leads naturally to the idea that continued technological innovation will have a large impact on humanity in the decades ahead. This prediction is strengthened if one believes that some of those variables that currently exhibit exponential growth will continue to do so and that they will be among the main drivers of change. Gordon E. Moore, co-founder of Intel, noticed in 1965 that the number of transistors on a chip exhibited exponential growth. This led to the formulation of "Moore's law", which states (roughly) that computing power doubles every 18 months to two years.[22] More recently, Kurzweil has documented similar exponential growth rates in a number of other technologies. (The world economy, which is a kind of general index of humanity's productive capacity, has doubled about every 15 years in modern times.)
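The doubling arithmetic behind such claims is simple to make explicit. The short Python sketch below merely restates the approximate figures quoted above (an 18-month doubling period for chips, a 15-year doubling period for the world economy); the periods are rough illustrative values, not precise constants.

# Illustrative doubling arithmetic using the approximate periods quoted above.

def growth_factor(years, doubling_period_years):
    """How much a quantity grows over `years` if it doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Transistor counts doubling roughly every 1.5 years (Moore's law, as stated above):
print(growth_factor(10, 1.5))  # about a 100-fold increase over a decade

# The world economy doubling roughly every 15 years:
print(growth_factor(30, 15))   # about a 4-fold increase over thirty years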
The singularity hypothesis, which von Neumann seems to have alluded to in the quoted passage above, is that these changes will lead to some kind of discontinuity. But nowadays, it often refers to a more specific prediction, namely that the creation of self-improving artificial intelligence will at some point result in radical changes within a very short time span. This hypothesis was first clearly stated in 1965 by the statistician I. J. Good:
Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an intelligence explosion, and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.[23]
Vernor Vinge discussed this idea in a little more detail in his influential 1993 paper "Technological Singularity", in which he predicted:
Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.[24]
Transhumanists today hold diverging views about the singularity: some see it as a likely scenario, others believe that it is more probable that there will never be any very sudden and dramatic changes as the result of progress in artificial intelligence.
The singularity idea also comes in a somewhat different eschatological version, which traces its lineage to the writings of Pierre Teilhard de Chardin, a paleontologist and Jesuit theologian who saw an evolutionary telos in the development of an encompassing noosphere (a global consciousness), via physicist Frank Tipler, who argued that advanced civilizations might come to have a defining influence on the future evolution of the cosmos, and, in the final moments of the Big Crunch, might manage to extract an infinite number of computations by harnessing the sheer energy of the collapsing matter.[25],[26] However, while these ideas might appeal to those who fancy a marriage between mysticism and science, they have not caught on either among transhumanists or the larger scientific community. Current cosmological theories indicate that the universe will continue to expand forever (falsifying Tipler's prediction). But the more general point that the transhumanist might make in this context is that we need to learn to think about "big-picture questions" without resorting to wishful thinking or mysticism. Big-picture questions, including ones about our place in the world and the long-term fate of intelligent life, are part of transhumanism; however, these questions should be addressed in a sober, disinterested way, using critical reason and our best available scientific evidence. One reason why such questions are of transhumanist interest is that their answers might affect what outcomes we should expect from our own technological development, and therefore indirectly what policies it makes sense for humanity to pursue.
In 1986, Eric Drexler published Engines of Creation, the first book-length exposition of molecular manufacturing.[27] (The possibility of nanotechnology had been anticipated by Nobel laureate physicist Richard Feynman in his famous after-dinner address in 1959 entitled "There is Plenty of Room at the Bottom".[28]) In this seminal work, Drexler not only argued for the feasibility of assembler-based nanotechnology but also explored its consequences and began charting the strategic challenges posed by its development. Drexler's later book Nanosystems (1992) supplied a more technical analysis that seemed to confirm his original conclusions.[29] To prepare the world for nanotechnology and work towards its safe implementation, he founded the Foresight Institute together with his then wife, Christine Peterson, in 1986.
In the last several years, nanotechnology has become big business, with worldwide research funding amounting to billions of dollars. Yet little of this work fits Drexler's ambitious vision of nanotechnology as an assembler-based, near-universal construction technology. The mainstream nanotechnology community has sought to distance itself from Drexler's claims. The chemist Richard Smalley (another Nobel laureate) has debated Drexler, asserting that non-biological molecular assemblers are impossible.[30] To date, however, no technical critique of Drexler's work in the published literature has found any significant flaws in his reasoning. If molecular nanotechnology is indeed physically possible, as Drexler maintains, the question becomes just how difficult it will be to develop it, and how long it will take. These issues are very difficult to settle in advance.
If molecular nanotechnology could be developed as Drexler envisions it, it would have momentous ramifications:
Coal and diamonds, sand and computer chips, cancer and healthy tissue: throughout history, variations in the arrangement of atoms have distinguished the cheap from the cherished, the diseased from the healthy. Arranged one way, atoms make up soil, air, and water; arranged another, they make up ripe strawberries. Arranged one way, they make up homes and fresh air; arranged another, they make up ash and smoke.[31]
Molecular nanotechnology would enable us to transform coal into diamonds, sand into supercomputers, and to remove pollution from the air and tumors from healthy tissue. In its mature form, it could help us abolish most disease and aging, make possible the reanimation of cryonics patients, enable affordable space colonization, and, more ominously, lead to the rapid creation of vast arsenals of lethal or non-lethal weapons.
Another hypothetical technology that would have a revolutionary impact is uploading, the transfer of a human mind to a computer. This would involve the following steps: First, create a sufficiently detailed scan of a particular human brain, perhaps by deconstructing it with nanobots or by feeding thin slices of brain tissue into powerful microscopes for automatic image analysis. Second, from this scan, reconstruct the neuronal network that the brain implemented, and combine this with computational models of the different types of neurons. Third, emulate the whole computational structure on a powerful supercomputer. If successful, the procedure would result in the original mind, with memory and personality intact, being transferred to the computer, where it would then exist as software; it could either inhabit a robot body or live in a virtual reality.[32] While it is often thought that, under suitable circumstances, the upload would be conscious and that the original person would have survived the transfer to the new medium, individual transhumanists take different views on these philosophical matters.
If either superintelligence, or molecular nanotechnology, or uploading, or some other technology of a similarly revolutionary kind is developed, the human condition could clearly be radically transformed. Even if one believed that the probability of this happening any time soon is quite small, these prospects would nevertheless merit serious attention in view of their extreme impact. However, transhumanism does not depend on the feasibility of such radical technologies. Virtual reality; preimplantation genetic diagnosis; genetic engineering; pharmaceuticals that improve memory, concentration, wakefulness, and mood; performance-enhancing drugs; cosmetic surgery; sex change operations; prosthetics; anti-aging medicine; closer human-computer interfaces: these technologies are already here or can be expected within the next few decades. The combination of these technological capabilities, as they mature, could profoundly transform the human condition. The transhumanist agenda, which is to make such enhancement options safely available to all persons, will become increasingly relevant and practical in the coming years as these and other anticipated technologies come online.
Benjamin Franklin wished to be preserved in a cask of Madeira and later recalled to life, and regretted that he was living too near the infancy of science for this to be possible. Since then, science has grown up a bit. In 1962, Robert Ettinger published the book The Prospect of Immortality, which launched the idea of cryonic suspension.[33] Ettinger argued that as medical technology seems to be constantly progressing, and since science has discovered that chemical activity comes to a complete halt at low-enough temperatures, it should be possible to freeze a person today (in liquid nitrogen) and preserve the body until a time when technology is advanced enough to repair the freezing damage and reverse the original cause of deanimation. Cryonics, Ettinger believed, offered a ticket to the future.
Alas, the masses did not line up for the ride. Cryonics has remained a fringe alternative to more traditional methods of treating the terminally diseased, such as cremation and burial. The practice of cryonics was not integrated into the mainstream clinical setting and was instead conducted on the cheap by a small number of enthusiasts. Two early cryonics organizations went bankrupt, allowing their patients to thaw out. At that point, the problem of massive cellular damage that occurs when ice crystals form in the body also became more widely known. As a result, cryonics acquired a reputation as a macabre scam. The media controversy over the suspension of baseball star Ted Williams in 2002 showed that public perception of cryonics has not changed much over the past decades.
Despite its image problem and its early failures of implementation, the cryonics community continues to be active, and it counts among its members several eminent scientists and intellectuals. Suspension protocols have been improved, and the infusion of cryoprotectants prior to freezing to suppress the formation of ice crystals has become standard practice. The prospect of nanotechnology has given a more concrete shape to the hypothesized future technology that could enable reanimation. There are currently two organizations that offer full-service suspension, the Alcor Life Extension Foundation (founded in 1972) and the Cryonics Institute (founded in 1976). Alcor has recently introduced a new suspension method, which relies on a process known as "vitrification", which further reduces micro-structural damage during suspension.
In a later work, Man into Superman (1972), Ettinger discussed a number of conceivable technological improvements of the human organism, continuing the tradition started by Haldane and Bernal.[34]
Another early transhumanist was F. M. Esfandiary, who later changed his name to FM-2030. One of the first professors of future studies, FM taught at the New School for Social Research in New York in the 1960s and formed a group of optimistic futurists known as the UpWingers.
Who are the new revolutionaries of our time? They are the geneticists, biologists, physicists, cryonologists, biotechnologists, nuclear scientists, cosmologists, radio astronomers, cosmonauts, social scientists, youth corps volunteers, internationalists, humanists, science-fiction writers, normative thinkers, inventors... They and others are revolutionizing the human condition in a fundamental way. Their achievements and goals go far beyond the most radical ideologies of the Old Order.[35]
In his book Are You a Transhuman? (1989), FM described what he regarded as the signs of the emergence of the "transhuman".[36] In FM's terminology, a transhuman is a "transitional human," someone who by virtue of their technology usage, cultural values, and lifestyle constitutes an evolutionary link to the coming era of posthumanity. The signs that FM saw as indicative of transhuman status included prostheses, plastic surgery, intensive use of telecommunications, a cosmopolitan outlook and a globetrotting lifestyle, androgyny, mediated reproduction (such as in vitro fertilization), absence of religious belief, and a rejection of traditional family values. However, it was never satisfactorily explained why somebody who, say, rejects family values, has a nose job, and spends a lot of time on jet planes is in closer proximity to posthumanity than the rest of us.
In the 1970s and 1980s, many organizations sprang up that focused on a particular topic such as life extension, cryonics, space colonization, science fiction, and futurism. These groups were often isolated from one another, and whatever shared views and values they had did not yet amount to any unified worldview. Ed Regis's Great Mambo Chicken and the Transhuman Condition (1990) took a humorous look at these proto-transhumanist fringes, which included eccentric and otherwise intelligent individuals trying to build space rockets in their backyards or experimenting with biofeedback machines and psychedelic drugs, as well as scientists pursuing more serious lines of work but who had imbibed too deeply of the Californian spirit.[37]
In 1988, the first issue of the Extropy Magazine was published by Max More and Tom Morrow, and in 1992 they founded the Extropy Institute (the term "extropy" being coined as a metaphorical opposite of entropy). The Institute served as a catalyst that brought together disparate groups of people with futuristic ideas and facilitated the formation of novel memetic compounds. The Institute ran a series of conferences, but perhaps most important was the extropians mailing list, an online discussion forum where new ideas were shared and debated. In the mid-nineties, many got their first exposure to transhumanist views from the Extropy Institute's listserve.
More had immigrated to California from Britain after changing his name from Max O'Connor. Of his new name, he said:
It seemed to really encapsulate the essence of what my goal is: always to improve, never to be static. I was going to get better at everything, become smarter, fitter, and healthier. It would be a constant reminder to keep moving forward.[38]
Max More wrote the first definition of transhumanism in its modern sense, and created his own distinctive brand of transhumanism, "extropianism," which emphasized the principles of "boundless expansion," "self-transformation," "dynamic optimism," "intelligent technology," and "spontaneous order". Originally, extropianism had a clear libertarian flavor, but in later years More has distanced himself from this ingredient, replacing "spontaneous order" with "open society," a principle that opposes authoritarian social control and promotes decentralization of power and responsibility.[39]
Natasha Vita-More (married to Max) is the Extropy Institute's current president. She is an artist and designer, and has over the years issued a number of manifestos on transhumanist and extropic art.[40]
The Extropy Institute's conferences and mailing list also served as a hangout place for some people who liked to discuss futuristic ideas but who were not necessarily joiners. Those who were around in the mid-nineties will remember individuals such as Anders Sandberg, Alexander "Sasha" Chislenko, Hal Finney, and Robin Hanson from among the more thoughtful regulars in the transhumanist milieu at the time. An enormous amount of discussion about transhumanism has taken place on various email lists in the past decade. The quality of postings has been varied (putting it mildly). Yet at their best, these online conversations explored ideas about the implications of future technologies that were, in some respects, far advanced over what could be found in printed books or journals. The Internet played an important role in incubating modern transhumanism by facilitating these meetings of minds, and perhaps more indirectly, too, via the "irrational exuberance" that pervaded the dot-com era.
The World Transhumanist Association was founded in early 1998 by Nick Bostrom and David Pearce, to provide a general organizational basis for all transhumanist groups and interests, across the political spectrum. The aim was also to develop a more mature and academically respectable form of transhumanism, freed from the "cultishness" which, at least in the eyes of some critics, had afflicted some of its earlier convocations. The two founding documents of the WTA were the Transhumanist Declaration (see appendix), and the Transhumanist FAQ (v. 1.0).[41] The Declaration was intended as a concise consensus statement of the basic principle of transhumanism. The FAQ was also a consensus or near-consensus document, but it was more ambitious in its philosophical scope in that it developed a number of themes that had previously been, at most, implicit in the movement. More than fifty people contributed comments on drafts of the FAQ. The document was produced by Bostrom, but major parts and ideas were also contributed by several others, including the British utilitarian thinker David Pearce, Max More, the American feminist and disability rights activist Kathryn Aegis, and the walking encyclopedia Anders Sandberg, who was at the time a neuroscience student in Sweden.
A number of related organizations have also cropped up in recent years, focusing more narrowly on particular transhumanist issues, such as life-extension, artificial intelligence, or the legal implications of "converging technologies" (nano-bio-info-neuro technologies). The Institute for Ethics and Emerging Technologies, a non-profit think tank, was established in 2004, to "promote the ethical use of technology to expand human capacities".
Over the past couple of decades, academia has picked up the ball and started to analyze various "transhumanist matters," both normative and positive. The contributions are far too many to comprehensively describe here, so we will pick out just a few threads, beginning with ethics.
For most of its history, moral philosophy did not shy away from addressing practical problems. In the early and mid-parts of the twentieth century, during the heyday of logical positivism, applied ethics became a backwater as moral philosophers concentrated on linguistic or meta-ethical problems. Since then, however, practical ethics has reemerged as a field of academic inquiry. The comeback started in medical ethics. Revelations of the horrific experiments that the Nazis had conducted on human subjects in the name of science led to the adoption of the Nuremberg code (1947) and the Declaration of Helsinki (1964), which laid down strict safeguards for medical experimentation, emphasizing the need for patient consent.[42],[43] But the rise of the modern health care system spawned new ethical dilemmas: turning off life-support, organ donation, resource allocation, abortion, advance directives, doctor-patient relationships, protocols for obtaining informed consent and for dealing with incompetent patients. In the 1970s, a broader kind of enquiry began to emerge, stimulated particularly by developments in assisted reproduction and genetics. This field became known as bioethics. Many of the ethical issues most directly linked to transhumanism would now fall under this rubric, although other normative discourses are also involved, e.g. population ethics, meta-ethics, political philosophy, and bioethics' younger sisters: computer ethics, engineering ethics, environmental ethics.
Bioethics was from the beginning an interdisciplinary endeavor, dominated by theologians, legal scholars, physicians, and, increasingly, philosophers, with occasional participation by representatives of patients' rights groups, disability advocates, and other interested parties.[44] Lacking a clear methodology, and operating on a plain often swept by the winds of political or religious controversy, the standard of scholarship has frequently been underwhelming. Despite these difficulties, bioethics burgeoned. A cynic might ascribe this accomplishment to the ample fertilization that the field received from a number of practical imperatives: absolving doctors of moral dilemmas, training medical students to behave, enabling hospital boards to trumpet their commitment to the highest ethical standards of care, providing sound bites for the mass media, and allowing politicians to cover their behinds by delegating controversial issues to ethics committees. But a kinder gloss is possible: decent people recognized that difficult moral problems arose in modern biomedicine, that these problems needed to be addressed, and that having some professional scholars trying to clarify these problems in some sort of systematic way might be helpful. While higher-caliber scholarship and a more robust methodology would be nice, in the meantime we make the most of what we have.
Moral philosophers have in the last couple of decades made many contributions that bear on the ethics of human transformation, and we must limit ourselves to a few mentions. Derek Parfit's classic Reasons and Persons (1984) discussed many relevant normative issues.[45] In addition to personal identity and foundational ethical theory, this book treats population ethics, person-affecting moral principles, and duties to future generations. Although Parfit's analysis takes place on an idealized level, his arguments elucidate many moral considerations that emerge within the transhumanist program.
Jonathan Glover's What Sort of People Should There Be? (1984) addressed technology-enabled human transformation at a somewhat more concrete level, focusing especially on genetics and various technologies that could increase social transparency. Glover gave a clear and balanced analytic treatment of these issues that was well ahead of its time. His general conclusion is that
not just any aspect of present human nature is worth preserving. Rather it is especially those features which contribute to self-development and self-expression, to certain kinds of relationships, and to the development of our consciousness and understanding. And some of these features may be extended rather than threatened by technology.[46]
James Hughes has argued that biopolitics is emerging as a fundamental new dimension of political opinion. In Hughes's model, biopolitics joins with the more familiar dimensions of cultural and economic politics, to form a three-dimensional opinion-space. We have already seen that in the early 90s, the extropians combined liberal cultural politics and laissez-faire economic politics with transhumanist biopolitics. In Citizen Cyborg (2004), Hughes sets forward what he terms "democratic transhumanism," which mates transhumanist biopolitics with social democratic economic politics and liberal cultural politics.[68] He argues that we will achieve the best posthuman future when we ensure that technologies are safe, make them available to everyone, and respect the right of individuals to control their own bodies. The key difference between extropian transhumanism and democratic transhumanism is that the latter accords a much bigger role for government in regulating new technologies for safety and ensuring that the benefits will be available to all, not just a wealthy or tech-savvy elite.
In principle, transhumanism can be combined with a wide range of political and cultural views, and many such combinations are indeed represented, e.g. within the membership of the World Transhumanist Association. One combination that is not often found is the coupling of transhumanism to a culture-conservative outlook. Whether this is because of an irresolvable tension between the transformative agenda of transhumanism and the cultural conservatives' preference for traditional arrangements is not clear. It could instead be because nobody has yet seriously attempted to develop such a position. It is possible to imagine how new technologies could be used to reinforce some culture-conservative values. For instance, a pharmaceutical that facilitated long-term pair bonding could help protect the traditional family. Developing ways of using our growing technological powers to help people realize widely held cultural or spiritual values in their lives would seem a worthwhile undertaking.
This is not, however, the route for which cultural conservatives have so far opted. Instead, they have gravitated towards transhumanism's opposite, bioconservatism, which opposes the use of technology to expand human capacities or to modify aspects of our biological nature. People drawn to bioconservatism come from groups that traditionally have had little in common. Right-wing religious conservatives and left-wing environmentalists and anti-globalists have found common causes, for example in their opposition to the genetic modification of humans.
The different strands of contemporary bioconservatism can be traced to a multifarious set of origins: ancient notions of taboo; the Greek concept of hubris; the Romanticist view of nature; certain religious (anti-humanistic) interpretations of the concept of human dignity and of a God-given natural order; the Luddite workers' revolt against industrialization; Karl Marx's analysis of technology under capitalism; various Continental philosophers' critiques of technology, technocracy, and the rationalistic mindset that accompanies modern technoscience; foes of the military-industrial complex and multinational corporations; and objectors to the consumerist rat-race. The proposed remedies have ranged from machine-smashing (the original Luddites), to communist revolution (Marx), to buying "organic", to yoga (José Ortega y Gasset), but nowadays bioconservatism more commonly takes the form of calls for national or international bans on various human enhancement technologies (Fukuyama, Annas, etc.).
Feminist writers have come down on both sides of the debate. Ecofeminists have suspected biotechnology, especially its use to reshape bodies or control reproduction, of being an extension of traditional patriarchal exploitation of women, or, alternatively, have seen it as a symptom of a control-obsessed, unempathic, gadget-fixated, body-loathing mindset. Some have offered a kind of psychoanalysis of transhumanism, concluding that it represents an embarrassing rationalization of self-centered immaturity and social failure. But others have welcomed the liberatory potential of biotechnology. Shulamith Firestone argued in the feminist classic The Dialectic of Sex (1971) that women will be fully liberated only when technology has freed them from having to incubate children.[69] Cyberfeminist Donna Haraway says that she would "rather be a cyborg than a goddess" and argues against the dualistic view that associates men with culture and technology and women with nature.[70]
Perhaps the most prominent bioconservative voice today is that of Leon Kass, chairman of President Bush's Council on Bioethics. Kass acknowledges an intellectual debt to three other distinguished bioconservatives: Protestant theologian Paul Ramsey, Christian apologist C. S. Lewis, and German-born philosopher-theologian Hans Jonas (who studied under Martin Heidegger).[71] Kass's concerns center on human dignity and the subtle ways in which our attempts to assert technological mastery over human nature could end up dehumanizing us by undermining various traditional "meanings", such as the meaning of the life cycle, the meaning of sex, the meaning of eating, and the meaning of work. Kass is well-known for his advocacy of "the wisdom of repugnance" (which echoes Hans Jonas's "heuristics of fear"). While Kass stresses that a gut feeling of revulsion is not a moral argument, he nevertheless insists that the yuck factor merits our respectful attention:
In crucial cases repugnance is the emotional expression of deep wisdom, beyond reason's power to fully articulate... we intuit and feel, immediately and without argument, the violation of things we rightfully hold dear... To pollution and perversion, the fitting response can only be horror and revulsion; and conversely, generalized horror and revulsion are prima facie evidence of foulness and violation.[72]
Francis Fukuyama, another prominent bioconservative and member of the President's Council, has recently identified transhumanism as "the world's most dangerous idea".[73] For Fukuyama, however, the chief concern is not the subtle undermining of "meanings" but the prospect of violence and oppression. He argues that liberal democracy depends on the fact that all humans share an undefined "Factor X", which grounds their equal dignity and rights. The use of enhancing technologies, he fears, could destroy Factor X.[74]
Bioethicists George Annas, Lori Andrews, and Rosario Isasi have proposed legislation to make inheritable genetic modification in humans a "crime against humanity", like torture and genocide. Their rationale is similar to Fukuyama's:
The new species, or "posthuman," will likely view the old "normal" humans as inferior, even savages, and fit for slavery or slaughter. The normals, on the other hand, may see the posthumans as a threat and, if they can, may engage in a preemptive strike by killing the posthumans before they themselves are killed or enslaved by them. It is ultimately this predictable potential for genocide that makes species-altering experiments potential weapons of mass destruction, and makes the unaccountable genetic engineer a potential bioterrorist.[75]
There is some common ground between Annas et al. and the transhumanists: they agree that murder and enslavement, whether of humans by posthumans or the other way around, would be a moral atrocity and a crime. Transhumanists deny, however, that this is a likely consequence of germ-line therapy to enhance health, memory, longevity, or other similar traits in humans. If and when we develop the capability to create some singular entity that could potentially destroy the human race, such as a superintelligent machine, then we could indeed regard it as a crime against humanity to proceed without a thorough risk analysis and the installation of adequate safety features. As we saw in the previous section, the effort to understand and find ways to reduce existential risks has been a central preoccupation for some transhumanists, such as Eric Drexler, Nick Bostrom, and Eliezer Yudkowsky.
There are other commonalities between bioconservatives and transhumanists. Both agree that we face a realistic prospect that technology could be used to substantially transform the human condition in this century. Both agree that this imposes an obligation on the current generation to think hard about the practical and ethical implications. Both are concerned with medical risks of side-effects, of course, although bioconservatives are more worried that the technology might succeed than that it might fail. Both camps agree that technology in general and medicine in particular have a legitimate role to play, although bioconservatives tend to oppose many uses of medicine that go beyond therapy to enhancement. Both sides condemn the racist and coercive state-sponsored eugenics programs of the twentieth century. Bioconservatives draw attention to the possibility that subtle human values could get eroded by technological advances, and transhumanists should perhaps learn to be more sensitive to these concerns. On the other hand, transhumanists emphasize the enormous potential for genuine improvements in human well-being and human flourishing that are attainable only via technological transformation, and bioconservatives could try to be more appreciative of the possibility that we could realize great values by venturing beyond our current biological limitations.
The Transhumanist Declaration
(1) Humanity will be radically changed by technology in the future. We foresee the feasibility of redesigning the human condition, including such parameters as the inevitability of aging, limitations on human and artificial intellects, unchosen psychology, suffering, and our confinement to the planet earth.
(2) Systematic research should be put into understanding these coming developments and their long-term consequences.
(3) Transhumanists think that by being generally open and embracing of new technology we have a better chance of turning it to our advantage than if we try to ban or prohibit it.
(4) Transhumanists advocate the moral right for those who so wish to use technology to extend their mental and physical (including reproductive) capacities and to improve their control over their own lives. We seek personal growth beyond our current biological limitations.
(5) In planning for the future, it is mandatory to take into account the prospect of dramatic progress in technological capabilities. It would be tragic if the potential benefits failed to materialize because of technophobia and unnecessary prohibitions. On the other hand, it would also be tragic if intelligent life went extinct because of some disaster or war involving advanced technologies.
(6) We need to create forums where people can rationally debate what needs to be done, and a social order where responsible decisions can be implemented.
(7) Transhumanism advocates the well-being of all sentience (whether in artificial intellects, humans, posthumans, or non-human animals) and encompasses many principles of modern humanism. Transhumanism does not support any particular party, politician or political platform.
Annas, G., L. Andrews, and R. Isasi (2002), "Protecting the Endangered Human: Toward an International Treaty Prohibiting Cloning and Inheritable Alterations", American Journal of Law and Medicine 28 (2&3): 151-178.
Bacon, F. (1620), Novum Organum. Translated by R. L. Ellis and J. Spedding. In Robertson, J. (ed.), The Philosophical Works of Francis Bacon, 1905. London: Routledge.
Bernal, J. D. (1969), The World, the Flesh & the Devil: An Enquiry into the Future of the Three Enemies of the Rational Soul. Bloomington: Indiana University Press.
Bostrom, N. (1998), "How Long Before Superintelligence?" International Journal of Futures Studies 2.
(2002), "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards", Journal of Evolution and Technology 9.
(2002), "When Machines Outsmart Humans", Futures 35 (7): 759-764.
(2003), "Are You Living in a Computer Simulation?" Philosophical Quarterly 53 (211): 243-255.
(2003), "Human Genetic Enhancements: A Transhumanist Perspective", Journal of Value Inquiry 37 (4): 493-506.
(2003), The Transhumanist FAQ: v 2.1. World Transhumanist Association. http://transhumanism.org/index.php/WTA/faq/.
(2004), "Transhumanism - The World's Most Dangerous Idea?" Betterhumans, 10/19/2004.
See original here:
JET 14(1) - April 2005 - Bostrom - Transhumanist Thought
Posted in Transhumanist
Comments Off on JET 14(1) – April 2005 – Bostrom – Transhumanist Thought
Socio-Economic Collapse | Prometheism.net
Posted: June 17, 2016 at 5:04 am
In archaeology, the classic Maya collapse refers to the decline of Maya civilization and abandonment of Maya cities in the southern Maya lowlands of Mesoamerica between the 8th and 9th centuries, at the end of the Classic Maya Period. The Preclassic Maya experienced a similar collapse in the 2nd century.
The Classic Period of Mesoamerican chronology is generally defined as the period from 250 to 900, the last century of which is referred to as the Terminal Classic.[1] The classic Maya collapse is one of the greatest unsolved mysteries in archaeology. Urban centers of the southern lowlands, among them Palenque, Copán, Tikal, and Calakmul, went into decline during the 8th and 9th centuries and were abandoned shortly thereafter. Archaeologically, this decline is indicated by the cessation of monumental inscriptions and the reduction of large-scale architectural construction at the primary urban centers of the classic period.
Although termed a collapse, it did not mark the end of the Maya civilization; Northern Yucatán in particular prospered afterwards, although with very different artistic and architectural styles, and with much less use of monumental hieroglyphic writing. In the post-classic period following the collapse, the state of Chichén Itzá built an empire that briefly united much of the Maya region,[citation needed] and centers such as Mayapán and Uxmal flourished, as did the Highland states of the K'iche' and Kaqchikel Maya. Independent Maya civilization continued until 1697 when the Spanish conquered Nojpetén, the last independent city-state. Millions of Maya people still inhabit the Yucatán peninsula today.
Because parts of Maya civilization unambiguously continued, a number of scholars strongly dislike the term "collapse".[2] Regarding the proposed collapse, E. W. Andrews IV went as far as to say, "in my belief no such thing happened".[3]
The Maya often recorded dates on monuments they built. Few dated monuments were being built circa 500 (around ten per year in 514, for example). The number steadily increased to twenty per year by 672 and forty by around 750. After this, the number of dated monuments begins to falter relatively quickly, collapsing back to ten by 800 and to zero by 900. Likewise, recorded lists of kings complement this analysis. Altar Q shows a reign of kings from 426 to 763. One last king not recorded on Altar Q was Ukit Took, "Patron of Flint", who was probably a usurper. The dynasty is believed to have collapsed entirely shortly thereafter. In Quiriguá, twenty miles north of Copán, the last king Jade Sky began his rule between 895 and 900, and throughout the Maya area all kingdoms similarly fell around that time.[4]
A third piece of evidence of the progression of Maya decline, gathered by Ann Corinne Freter, Nancy Gonlin, and David Webster, uses a technique called obsidian hydration. The technique allowed them to map the spread and growth of settlements in the Copán Valley and estimate their populations. Population estimates begin between 400 and 450; the population then rose to an estimated peak of twenty-eight thousand between 750 and 800, larger than London at the time. The population then began to steadily decline. By 900 the population had fallen to fifteen thousand, and by 1200 the population was again less than 1000.
Some 88 different theories or variations of theories attempting to explain the Classic Maya Collapse have been identified.[5] From climate change to deforestation to lack of action by Mayan kings, there is no universally accepted collapse theory, although drought is gaining momentum as the leading explanation.[6]
The archaeological evidence of the Toltec intrusion into Seibal, Petén, suggests to some the theory of foreign invasion. The latest hypothesis states that the southern lowlands were invaded by a non-Maya group whose homelands were probably in the Gulf Coast lowlands. This invasion began in the 9th century and set off, within 100 years, a group of events that destroyed the Classic Maya. It is believed that this invasion was somehow influenced by the Toltec people of central Mexico. However, most Mayanists do not believe that foreign invasion was the main cause of the Classic Maya Collapse; they postulate that no military defeat can explain or be the cause of the protracted and complex Classic Collapse process. Teotihuacan influence across the Maya region may have involved some form of military invasion; however, it is generally noted that significant Teotihuacan-Maya interactions date from at least the Early Classic period, well before the episodes of Late Classic collapse.[7]
The foreign invasion theory does not answer the question of where the inhabitants went. David Webster believed that the population should have increased because of the lack of elite power. Further, it is not understood why the governmental institutions were not remade following the revolts, which actually happened under similar circumstances in places like China. A study by anthropologist Elliot M. Abrams came to the conclusion that buildings, specifically in Copan, did not actually require an extensive amount of time and workers to construct.[8] However, this theory was developed during a time period when the archaeological evidence showed that there were fewer Maya people than there are now known to have been.[9] Revolutions, peasant revolts, and social turmoil change circumstances, and are often followed by foreign wars, but they run their course. There are no documented revolutions that caused wholesale abandonment of entire regions.
It has been hypothesized that the decline of the Maya is related to the collapse of their intricate trade systems, especially those connected to the central Mexican city of Teotihuacán. Preceding improved knowledge of the chronology of Mesoamerica, Teotihuacan was believed to have fallen during 700–750, forcing the restructuring of economic relations throughout highland Mesoamerica and the Gulf Coast.[10] This remaking of relationships between civilizations would have then given the collapse of the Classic Maya a slightly later date. However, after knowing more about the events and the time periods in which they occurred, it is now believed that the strongest Teotihuacan influence was during the 4th and 5th centuries. In addition, the civilization of Teotihuacan started to lose its power, and maybe even abandoned the city, during 600–650. This differs greatly from the previous belief that Teotihuacano power decreased during 700–750.[11] But since the new decline date of 600–650 has been accepted, the Maya civilizations are now thought to have lived on and prospered for another century and more[12] than what was previously believed. Rather than the decline of Teotihuacan directly preceding the collapse of the Maya, their decline is now seen as contributing to the 6th-century hiatus.[12]
The disease theory is also a contender as a factor in the Classic Maya Collapse. Widespread disease could explain some rapid depopulation, both directly through the spread of infection itself and indirectly as an inhibition to recovery over the long run. According to Dunn (1968) and Shimkin (1973), infectious diseases spread by parasites are common in tropical rainforest regions, such as the Maya lowlands. Shimkin specifically suggests that the Maya may have encountered endemic infections related to American trypanosomiasis, Ascaris, and some enteropathogens that cause acute diarrheal illness. Furthermore, some experts believe that, through development of their civilization (that is, development of agriculture and settlements), the Maya could have created a disturbed environment, in which parasitic and pathogen-carrying insects often thrive.[13] Among the pathogens listed above, it is thought that those that cause the acute diarrheal illnesses would have been the most devastating to the Maya population. This is because such illness would have struck a victim at an early age, thereby hampering nutritional health and the natural growth and development of a child. This would have made them more susceptible to other diseases later in life. Such ideas as this could explain the role of disease as at least a possible partial reason for the Classic Maya Collapse.[14]
Mega-droughts hit the Yucatán Peninsula and Petén Basin areas with particular ferocity, as thin tropical soils decline in fertility and become unworkable when deprived of forest cover,[15] and due to regular seasonal drought drying up surface water.[16] Colonial Spanish officials accurately documented cycles of drought, famine, disease, and war, providing a reliable historical record of the basic drought pattern in the Maya region.[17]
Climatic factors were first implicated in the Collapse as early as 1931 by Mayanists Thomas Gann and J.E.S. Thompson.[18] In The Great Maya Droughts, Richardson Gill gathers and analyzes an array of climatic, historical, hydrologic, tree ring, volcanic, geologic, lake bed, and archeological research, and demonstrates that a prolonged series of droughts probably caused the Classic Maya Collapse.[19] The drought theory provides a comprehensive explanation, because non-environmental and cultural factors (excessive warfare, foreign invasion, peasant revolt, less trade, etc.) can all be explained by the effects of prolonged drought on Classic Maya civilization.[20]
Climatic changes are, with increasing frequency, found to be major drivers in the rise and fall of civilizations all over the world.[21] Professors Harvey Weiss of Yale University and Raymond S. Bradley of the University of Massachusetts have written, "Many lines of evidence now point to climate forcing as the primary agent in repeated social collapse."[22] In a separate publication, Weiss illustrates an emerging understanding of scientists:
Within the past five years new tools and new data for archaeologists, climatologists, and historians have brought us to the edge of a new era in the study of global and hemispheric climate change and its cultural impacts. The climate of the Holocene, previously assumed static, now displays a surprising dynamism, which has affected the agricultural bases of pre-industrial societies. The list of Holocene climate alterations and their socio-economic effects has rapidly become too complex for brief summary.[23]
The drought theory holds that rapid climate change in the form of severe drought brought about the Classic Maya collapse. According to the particular version put forward by Gill in The Great Maya Droughts,
[Studies of] Yucatecan lake sediment cores provide unambiguous evidence for a severe 200-year drought from AD 800 to 1000, the most severe in the last 7,000 years, precisely at the time of the Maya Collapse.[24]
Climatic modeling, tree ring data, and historical climate data show that cold weather in the Northern Hemisphere is associated with drought in Mesoamerica.[25] Northern Europe suffered extremely low temperatures around the same time as the Maya droughts. The same connection between drought in the Maya areas and extreme cold in northern Europe was found again at the beginning of the 20th century. Volcanic activity, within and outside Mesoamerica, is also correlated with colder weather and resulting drought, as the effects of the Tambora volcano eruption in 1815 indicate.[26]
Mesoamerican civilization provides a remarkable exception: civilization prospering in the tropical swampland. The Maya are often perceived as having lived in a rainforest, but technically, they lived in a seasonal desert without access to stable sources of drinking water.[27] The exceptional accomplishments of the Maya are even more remarkable because of their engineered response to the fundamental environmental difficulty of relying upon rainwater rather than permanent sources of water. The Maya succeeded in creating a civilization in a seasonal desert by creating a system of water storage and management which was totally dependent on consistent rainfall.[28] The constant need for water kept the Maya on the edge of survival. Given this precarious balance of wet and dry conditions, even a slight shift in the distribution of annual precipitation can have serious consequences.[16] Water and civilization were vitally connected in ancient Mesoamerica. Archaeologist and specialist in pre-industrial land and water usage practices, Vernon Scarborough, believes water management and access were critical to the development of Maya civilization.[29]
Critics of the drought theory wonder why the southern and central lowland cities were abandoned while northern cities like Chichén Itzá, Uxmal, and Cobá continued to thrive.[30] One critic argued that Chichén Itzá revamped its political, military, religious, and economic institutions away from powerful lords or kings.[31] Inhabitants of the northern Yucatán also had access to seafood, which might have explained the survival of Chichén Itzá and Mayapán, cities away from the coast but within reach of coastal food supplies.[32] Critics of the drought theory also point to current weather patterns: much heavier rainfall in the southern lowlands compared to the lighter amount of rain in the northern Yucatán. Drought theory supporters state that the entire regional climate changed, including the amount of rainfall, so that modern rainfall patterns are not indicative of rainfall from 800 to 900. LSU archaeologist Heather McKillop found a significant rise in sea level along the coast nearest the southern Maya lowlands, coinciding with the end of the Classic period, and indicating climate change.[33]
David Webster, a critic of the megadrought theory, says that much of the evidence provided by Gill comes from the northern Yucatán and not the southern part of the peninsula, where Classic Maya civilization flourished. He also states that if water sources had dried up, then several city-states would have moved to other water sources. The fact that Gill suggests that all water in the region would have dried up and destroyed Maya civilization is a stretch, according to Webster.[34]
A study published in Science in 2012 found that modest rainfall reductions, amounting to only 25 to 40 percent of annual rainfall, may have been the tipping point for the Mayan collapse. Based on samples of lake and cave sediments in the areas surrounding major Mayan cities, the researchers were able to determine the amount of annual rainfall in the region. The mild droughts that took place between 800 and 950 would therefore be enough to rapidly deplete seasonal water supplies in the Yucatán lowlands, where there are no rivers.[35][36][37]
Some ecological theories of Maya decline focus on the worsening agricultural and resource conditions in the late Classic period. It was originally thought that the majority of Maya agriculture was dependent on a simple slash-and-burn system. Based on this method, the hypothesis of soil exhaustion was advanced by Orator F. Cook in 1921. Similar soil exhaustion assumptions are associated with erosion, intensive agriculture, and savanna grass competition.
More recent investigations have shown a complicated variety of intensive agricultural techniques utilized by the Maya, explaining the high population of the Classic Maya polities. Modern archaeologists now comprehend the sophisticated intensive and productive agricultural techniques of the ancient Maya, and several of the Maya agricultural methods have not yet been reproduced. Intensive agricultural methods were developed and utilized by all the Mesoamerican cultures to boost their food production and give them a competitive advantage over less skillful peoples.[38] These intensive agricultural methods included canals, terracing, raised fields, ridged fields, chinampas, the use of human feces as fertilizer, seasonal swamps or bajos, using muck from the bajos to create fertile fields, dikes, dams, irrigation, water reservoirs, several types of water storage systems, hydraulic systems, swamp reclamation, swidden systems, and other agricultural techniques that have not yet been fully understood.[39] Systemic ecological collapse is said to be evidenced by deforestation, siltation, and the decline of biological diversity.
In addition to mountainous terrain, Mesoamericans successfully exploited the very problematic tropical rainforest for 1,500 years.[40] The agricultural techniques utilized by the Maya were entirely dependent upon ample supplies of water. The Maya thrived in territory that would be uninhabitable to most peoples. Their success over two millennia in this environment was amazing.[41]
Anthropologist Joseph Tainter wrote extensively about the collapse of the Southern Lowland Maya in his 1988 study, The Collapse of Complex Societies. His theory about Mayan collapse encompasses some of the above explanations, but focuses specifically on the development of and the declining marginal returns from the increasing social complexity of the competing Mayan city-states.[42] Psychologist Julian Jaynes suggested that the collapse was due to a failure in the social control systems of religion and political authority, due to increasing socioeconomic complexity that overwhelmed the power of traditional rituals and the king's authority to compel obedience.[43]
Originally posted here:
Classic Maya collapse Wikipedia, the free encyclopedia
Posted in Socio-economic Collapse
Comments Off on Socio-Economic Collapse | Prometheism.net
The History Of Germ Warfare – Very Long, Very Deadly
Posted: at 5:04 am
WASHINGTON - Although anthrax and other biological weapons seem like 21st-century threats, they have been tools of terror for ages. Ancient armies, for instance, tainted water supplies of entire cities with herbs and fungus that gave people horrible diarrhea and hallucinations. One germ-warfare assault in the 1300s apparently got out of hand, triggering an epidemic that ravaged the population of Europe. British troops in the French and Indian War launched a stealth smallpox attack on Indians. During World War I, German agents ran an anthrax factory in Washington, D.C. World War II anthrax bombs left a whole island uninhabitable for almost 50 years.
"The earliest reference to anthrax is found in the Fifth Plague," said Dr. Philip Brachman, an anthrax expert at Emory University in Atlanta. It took 10 calamities inflicted on the Egyptians to finally convince an obstinate pharaoh to liberate the ancient Hebrews, according to the Bible. The plagues probably date to about 1300 B.C. They ranged from Nile River water turned blood-red and undrinkable to the one-night destruction of all the first-born of Egypt. The Fifth Plague (Exodus 9:3) was an infectious disease that killed all the cattle in Egypt, while sparing the Hebrews' cattle. Brachman and other experts think the biblical account actually refers to a natural epidemic of anthrax. Such epidemics periodically decimated domestic animals in the ancient Middle East. The anthrax might have spared the Israelites because their sheep would have been grazing on poorer pastures where infections don't take hold as well. Domestic animals (and wild animals such as deer and bison) get anthrax by eating spores of the bacteria while grazing on contaminated land, or from eating contaminated feed. Animal anthrax still is an important problem in developing countries, especially in the Middle East, Africa and Asia. Humans can catch the disease from contact with infected animals, their meat, hide or hair.
Medical historians see anthrax's fingerprints in manuscripts from the ancient Greeks, Romans, and Hindus in India, which contain descriptions of animal and human anthrax. They think history's most serious anthrax outbreak was "Black Bane," a terrible epidemic that swept Europe in the 1600s. It killed at least 60,000 people and many more domestic and wild animals. People called it "Black Bane" because many cases involved the cutaneous, or skin, form of anthrax, which involves a blackish sore. Anthrax actually was named from a Greek word that refers to coal and charcoal. Cutaneous anthrax can be quickly cured today with Cipro, penicillin, doxycycline or other antibiotics. Like other infections in the pre-antibiotic era, however, it often killed.
Brachman said that epidemics of anthrax were common in Europe during the 1700s and 1800s, with up to 100,000 cases of human anthrax annually. Medicine's first major advance against anthrax occurred in Germany as the United States celebrated its 100th birthday. A physician named Robert Koch discovered how to grow bacteria on gelatin-like material in glass laboratory dishes, and rules to prove that specific bacteria caused specific diseases. In 1876, Koch identified the anthrax bacteria. It led to development of a vaccine that was first used to immunize livestock in 1880, and later humans.
Other biological agents have roots almost as ancient as anthrax. Some of the first recorded biological terror attacks occurred in the 6th century B.C. The ancient Assyrians (whose civilization began around 2400 B.C. in modern Turkey, Iran, Syria and Iraq) poisoned enemy wells with ergot, a fungus that can grow on wheat, rye and other grains. It produces LSD-like chemicals that cause hallucinations and other symptoms. In another 6th-century biological assault, the ancient Greeks, besieging a city called Krissa, poisoned its water supply with the herb hellebore. It causes violent diarrhea. During their sieges, ancient Roman soldiers threw decaying human corpses and carcasses of dead animals into their enemies' water supplies, and catapulted them over the walls of enemy towns. A Tartar army in 1346 launched a biological assault that may have gotten out of control - big time. While besieging a city in modern-day Crimea, soldiers hurled corpses of bubonic plague victims over city walls. Fleas from the corpses infested people and rats in the city. Plague spread as people and rats escaped and fled. Some experts believe it triggered the great epidemic of bubonic plague - the "Black Death" - that swept Europe, killing 25 million people.
In 1797, Napoleon tried to infect residents of a besieged city in Italy with malaria. During the French and Indian War, the British suspected American Indians of siding with the French. In an "act of good will," the British gave the Indians nice, warm blankets - straight from the beds of smallpox victims. The resulting epidemic killed hundreds of Indians. Dr. Anton Dilger, an agent of the Imperial German Government during World War I, grew anthrax and other bacteria in a corner of his Washington home. His henchmen on the docks in Baltimore used the anthrax to infect 3,000 horses and mules destined for the Allied forces in Europe. Many of the animals died, and hundreds of soldiers on the Western Front in Europe were infected.
In 1937, Japan began a biological warfare program that included anthrax, and later tested anthrax weapons in China. During World War II, Japan spread fleas infected with bubonic plague in a dozen Chinese cities. The United States, Great Britain and other countries developed anthrax weapons during World War II. The British military in 1942 began testing "anthrax bombs" on Gruinard Island, a 500-acre dot of land off the northwestern coast of Scotland. After the war, the project was abandoned. However, the Gruinard experiments established the terrible environmental consequences of using anthrax as a weapon of mass destruction. British scientists thought the anthrax spores would quickly die or blow away into the ocean. But the spores lived on. Huge numbers remained infectious year after year. Finally, in 1986, after critics labeled Gruinard "Anthrax Island," the British government decided to clean up the mess. Workers built an irrigation system over the entire test range. It saturated the ground with 280 tons of formaldehyde - "embalming fluid" - diluted in 2,000 tons of seawater. The fluid flowed 24 hours a day for more than a year. Gruinard finally was declared decontaminated in 1990. It remains uninhabited today.
Modern biological warfare programs have resulted in environmental contamination as well. An accident in 1979 at a Soviet biological warfare plant in Sverdlovsk (Ekaterinburg) released anthrax that killed at least 68 people who lived downwind. A 1972 treaty, ratified by 143 countries, banned production, deployment, possession and use of biological weapons. Analysts think that a dozen countries still may have clandestine biological weapons programs, including Iraq. Iraq is believed to have hidden stockpiles of weapons-grade anthrax and other biological agents, plus artillery shells and other weapons to deliver the germs.
Posted in Germ Warfare
Comments Off on The History Of Germ Warfare – Very Long, Very Deadly
The US Government's Oppression of the Poor, Homeless
Posted: at 5:04 am
By Rev. Rebecca
The United States is far from being a "righteous" nation. Many people do not realize how much we as a nation oppress the poor, weak, homeless, and strangers among us. Additionally, most Americans are willfully ignorant of the oppression we cause overseas in poor nations with our consumeristic, capitalistic, and wealthy lifestyles.
Many of our laws are set up to favor the middle and upper classes and oppress the lower and homeless classes in the United States. I believe our nation is very guilty of the Old Testament prophetic charges against nations who oppress the poor and orphaned. Having worked for years with homeless children and youth (most of whom are homeless because they were abandoned or abused severely), I have seen the way our nation's laws oppress them.
For example, it is illegal in many US cities to be homeless. This means that, as a homeless person, you can be arrested for any reason anywhere, simply because you have no home address. This gives businesses and anyone else the right to call the police and have a homeless person removed or arrested simply for being somewhere they don't want them to be (even because they don't like how the person smells!). This includes all public and private places. Most middle and upper class folks have absolutely no sense of their human rights being taken away to such a radical degree...they can't even fathom it.
For people who think that "homeless shelters" are the answer, please understand that almost all shelters are only open at night and there are only enough shelters to house a very small percentage of the homeless on any given night. This means the majority of the homeless have to go "somewhere" to sleep and keep warm but are always in danger of being berated, removed, or arrested simply for being there. I could recount for days the stories of homeless youths who tried to hide in parks or buildings because they were so exhausted and in need of sleep, only to be berated, beaten, or arrested for sleeping in public or private locations. They are treated as less than human beings simply because they are homeless. There is nowhere for them to go.
Another law which is common in most US cities outlaws sitting or "loitering" on sidewalks in the city. Spokane, WA is a city that enforces this law diligently. Do you know the purpose of the law? It is primarily to prevent homeless youths from sitting or panhandling on the sidewalks (panhandling is illegal). However, most homeless youths have no other way of getting food and money (and nowhere else to go during the day)...they have to go somewhere and so they go where the people are to seek aid. However, businesses complain that it is bad for business to have homeless people around and suburban shoppers complain that they don't like "seeing" homeless youth...so this law is enacted. However, I can assure you that if you are dressed well, this law will never be enforced. Middle and upper class youth wearing the latest from the Gap will never be berated, beaten, or arrested for sitting on the sidewalks. But if you look homeless, you will be. I have witnessed police and security kick and beat homeless youths for sitting on the sidewalk on numerous occasions. Having homeless people around is "bad" for commercial industries and apparently insults middle and upper class sensibilities. Just because I was with homeless youths, police have threatened to beat me too. This is not uncommon...this happens in some form in every US city and goes totally unnoticed. Sadly, our nation does not look out for the poor and orphaned.
Posted in Government Oppression
Comments Off on The US Government's Oppression of the Poor, Homeless
War on Drugs – Wikipedia, the free encyclopedia
Posted: at 5:04 am
"The War on Drugs" is an American term commonly applied to a campaign of prohibition of drugs, military aid, and military intervention, with the stated aim being to reduce the illegal drug trade.[6][7] This initiative includes a set of drug policies that are intended to discourage the production, distribution, and consumption of psychoactive drugs that the participating governments and the UN have made illegal. The term was popularized by the media shortly after a press conference given on June 18, 1971, by United States President Richard Nixonthe day after publication of a special message from President Nixon to the Congress on Drug Abuse Prevention and Controlduring which he declared drug abuse "public enemy number one". That message to the Congress included text about devoting more federal resources to the "prevention of new addicts, and the rehabilitation of those who are addicted", but that part did not receive the same public attention as the term "war on drugs".[8][9][10] However, two years even prior to this, Nixon had formally declared a "war on drugs" that would be directed toward eradication, interdiction, and incarceration.[11] Today, the Drug Policy Alliance, which advocates for an end to the War on Drugs, estimates that the United States spends $51 billion annually on these initiatives.[12]
On May 13, 2009, Gil Kerlikowske, the Director of the Office of National Drug Control Policy (ONDCP), signaled that the Obama administration did not plan to significantly alter drug enforcement policy, but also that the administration would not use the term "War on Drugs", because Kerlikowske considers the term to be "counter-productive".[13] ONDCP's view is that "drug addiction is a disease that can be successfully prevented and treated... making drugs more available will make it harder to keep our communities healthy and safe".[14] One of the alternatives that Kerlikowske has showcased is the drug policy of Sweden, which seeks to balance public health concerns with opposition to drug legalization. The prevalence rates for cocaine use in Sweden are barely one-fifth of those in Spain, the biggest consumer of the drug.[15]
In June 2011, a self-appointed Global Commission on Drug Policy released a critical report on the War on Drugs, declaring: "The global war on drugs has failed, with devastating consequences for individuals and societies around the world. Fifty years after the initiation of the UN Single Convention on Narcotic Drugs, and years after President Nixon launched the US government's war on drugs, fundamental reforms in national and global drug control policies are urgently needed."[16] The report was criticized by organizations that oppose a general legalization of drugs.[14]
The first U.S. law that restricted the distribution and use of certain drugs was the Harrison Narcotics Tax Act of 1914. The first local laws came as early as 1860.[17]
In 1919, the United States passed the 18th Amendment, prohibiting the sale, manufacture, and transportation of alcohol, with exceptions for religious and medical use.
In 1920, the United States passed the National Prohibition Act (Volstead Act), enacted to carry out the provisions in law of the 18th Amendment.
The Federal Bureau of Narcotics was established in the United States Department of the Treasury by an act of June 14, 1930 (46 Stat. 585).[18]
In 1933, the federal prohibition for alcohol was repealed by passage of the 21st Amendment.
In 1935, President Franklin D. Roosevelt publicly supported the adoption of the Uniform State Narcotic Drug Act. The New York Times used the headline "Roosevelt Asks Narcotic War Aid".[19][20]
In 1937, the Marijuana Transfer Tax Act was passed. Several scholars have claimed that the goal was to destroy the hemp industry,[21][22][23] largely as an effort of businessmen Andrew Mellon, Randolph Hearst, and the Du Pont family.[21][23] These scholars argue that with the invention of the decorticator, hemp became a very cheap substitute for the paper pulp that was used in the newspaper industry.[21][24] These scholars believe that Hearst felt[dubious discuss] that this was a threat to his extensive timber holdings. Mellon, United States Secretary of the Treasury and the wealthiest man in America, had invested heavily in DuPont's new synthetic fiber, nylon, and considered[dubious discuss] its success to depend on its replacement of the traditional resource, hemp.[21][25][26][27][28][29][30][31] However, there were circumstances that contradict these claims. One reason for doubts about those claims is that the new decorticators did not perform fully satisfactorily in commercial production.[32] Producing fiber from hemp was a labor-intensive process if harvest, transport and processing are included. Technological developments decreased the labor required for hemp but not sufficiently to eliminate this disadvantage.[33][34]
The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. You understand what I'm saying? We knew we couldn't make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.
Although Nixon declared "drug abuse" to be public enemy number one in 1971,[37] the policies that his administration implemented as part of the Comprehensive Drug Abuse Prevention and Control Act of 1970 were a continuation of drug prohibition policies in the U.S., which started in 1914.[38][39]
The Nixon Administration also repealed the federal 2–10-year mandatory minimum sentences for possession of marijuana and started federal demand reduction programs and drug-treatment programs. Robert DuPont, the "Drug czar" in the Nixon Administration, stated it would be more accurate to say that Nixon ended, rather than launched, the "war on drugs". DuPont also argued that it was the proponents of drug legalization that popularized the term "war on drugs".[14][unreliable source?]
On October 27, 1970, Congress passed the Comprehensive Drug Abuse Prevention and Control Act of 1970, which, among other things, categorized controlled substances based on their medicinal use and potential for addiction.[38]
In 1971, two congressmen released an explosive report on the growing heroin epidemic among U.S. servicemen in Vietnam; ten to fifteen percent of the servicemen were addicted to heroin, and President Nixon declared drug abuse to be "public enemy number one".[38][40]
In 1973, the Drug Enforcement Administration was created to replace the Bureau of Narcotics and Dangerous Drugs.[38]
In 1982, Vice President George H. W. Bush and his aides began pushing for the involvement of the CIA and U.S. military in drug interdiction efforts.[41]
The Office of National Drug Control Policy (ONDCP) was originally established by the National Narcotics Leadership Act of 1988,[42][43] which mandated a national anti-drug media campaign for youth, which would later become the National Youth Anti-Drug Media Campaign.[44] The director of ONDCP is commonly known as the Drug czar,[38] and the position was first implemented in 1989 under President George H. W. Bush,[45] and raised to cabinet-level status by Bill Clinton in 1993.[46] These activities were subsequently funded by the Treasury and General Government Appropriations Act of 1998.[47][48] The Drug-Free Media Campaign Act of 1998 codified the campaign at 21 U.S.C. 1708.[49]
The Global Commission on Drug Policy released a report on June 2, 2011, alleging that "The War On Drugs Has Failed". The commission was made up of 22 self-appointed members, including a number of prominent international politicians and writers. U.S. Surgeon General Regina Benjamin also released the first ever National Prevention Strategy.[50]
On May 21, 2012, the U.S. Government published an updated version of its Drug Policy.[51] The director of ONDCP stated simultaneously that this policy is something different from the "War on Drugs".
At the same meeting, a declaration was signed by the representatives of Italy, the Russian Federation, Sweden, the United Kingdom and the United States in line with this: "Our approach must be a balanced one, combining effective enforcement to restrict the supply of drugs, with efforts to reduce demand and build recovery; supporting people to live a life free of addiction".[53]
According to Human Rights Watch, the War on Drugs caused soaring arrest rates which deliberately disproportionately targeted African Americans.[55] This was also confirmed by John Ehrlichman, an aide to Nixon, who said that the war on drugs was designed to criminalize and disrupt black and hippie communities.[56]
The present state of incarceration in the U.S. as a result of the war on drugs arrived in several stages. By 1971, various drug prohibitions had been in effect for more than 50 years (e.g., since 1914, 1937, etc.), with only a very small increase in inmates per 100,000 citizens. During the first 9 years after Nixon coined the expression "War on Drugs", statistics showed only a minor increase in the total number of people imprisoned.
After 1980, the situation began to change. In the 1980s, while the number of arrests for all crimes had risen by 28%, the number of arrests for drug offenses rose 126%.[57] The US Department of Justice, reporting on the effects of state initiatives, has stated that, from 1990 through 2000, "the increasing number of drug offenses accounted for 27% of the total growth among black inmates, 7% of the total growth among Hispanic inmates, and 15% of the growth among white inmates." In addition to prison or jail, the United States provides for the deportation of many non-citizens convicted of drug offenses.[58]
In 1994, the New England Journal of Medicine reported that the "War on Drugs" resulted in the incarceration of one million Americans each year.[59]
In 2008, the Washington Post reported that of 1.5 million Americans arrested each year for drug offenses, half a million would be incarcerated. In addition, one in five black Americans would spend time behind bars due to drug laws.[60]
Federal and state policies also impose collateral consequences on those convicted of drug offenses, such as denial of public benefits or licenses, that are not applicable to those convicted of other types of crime.[61]
In 1986, the U.S. Congress passed laws that created a 100 to 1 sentencing disparity for the possession or trafficking of crack when compared to penalties for trafficking of powder cocaine,[62][63][64][65] which had been widely criticized as discriminatory against minorities, mostly blacks, who were more likely to use crack than powder cocaine.[66] This 100:1 ratio had been required under federal law since 1986.[67] Persons convicted in federal court of possession of 5 grams of crack cocaine received a minimum mandatory sentence of 5 years in federal prison. On the other hand, possession of 500 grams of powder cocaine carried the same sentence.[63][64] In 2010, the Fair Sentencing Act cut the sentencing disparity to 18:1.[66]
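To make the disparity concrete, here is a minimal arithmetic sketch using only the quantities quoted above (5 grams of crack versus 500 grams of powder cocaine for the same five-year minimum, and the later 18:1 ratio); the roughly 28-gram crack threshold it prints is derived from those figures for illustration, not stated in the text.

```python
# Illustrative arithmetic only: the 5 g / 500 g thresholds and the 100:1 and
# 18:1 ratios come from the text above; the ~28 g figure is derived here.
POWDER_THRESHOLD_G = 500  # powder cocaine triggering the 5-year federal minimum
CRACK_THRESHOLD_G = 5     # crack cocaine triggering the same minimum (pre-2010)

ratio_1986 = POWDER_THRESHOLD_G / CRACK_THRESHOLD_G
print(f"1986 disparity: {ratio_1986:.0f}:1")  # -> 100:1

# The Fair Sentencing Act of 2010 reduced the disparity to 18:1. Holding the
# powder threshold fixed, the implied crack threshold is roughly:
implied_crack_g = POWDER_THRESHOLD_G / 18
print(f"Implied crack threshold at 18:1: about {implied_crack_g:.0f} g")  # -> about 28 g
```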
According to Human Rights Watch, crime statistics show thatin the United States in 1999compared to non-minorities, African Americans were far more likely to be arrested for drug crimes, and received much stiffer penalties and sentences.[68]
Statistics from 1998 show that there were wide racial disparities in arrests, prosecutions, sentencing and deaths. African-American drug users made up 35% of drug arrests, 55% of convictions, and 74% of people sent to prison for drug possession crimes.[63] Nationwide, African-Americans were sent to state prisons for drug offenses 13 times more often than other races,[69] even though they supposedly comprised only 13% of regular drug users.[63]
Anti-drug legislation over time has also displayed an apparent racial bias. University of Minnesota Professor and social justice author Michael Tonry writes, "The War on Drugs foreseeably and unnecessarily blighted the lives of hundreds and thousands of young disadvantaged black Americans and undermined decades of effort to improve the life chances of members of the urban black underclass."[70]
In 1968, President Lyndon B. Johnson decided that the government needed to make an effort to curtail the social unrest that blanketed the country at the time. He decided to focus his efforts on illegal drug use, an approach which was in line with expert opinion on the subject at the time. In the 1960s, it was believed that at least half of the crime in the U.S. was drug related, and this number grew as high as 90 percent in the next decade.[71] He created the Reorganization Plan of 1968 which merged the Bureau of Narcotics and the Bureau of Drug Abuse to form the Bureau of Narcotics and Dangerous Drugs within the Department of Justice.[72] The belief during this time about drug use was summarized by journalist Max Lerner in his celebrated[citation needed] work America as a Civilization (1957):
As a case in point we may take the known fact of the prevalence of reefer and dope addiction in Negro areas. This is essentially explained in terms of poverty, slum living, and broken families, yet it would be easy to show the lack of drug addiction among other ethnic groups where the same conditions apply.[73]
Richard Nixon became president in 1969, and did not back away from the anti-drug precedent set by Johnson. Nixon began orchestrating drug raids nationwide to improve his "watchdog" reputation. Lois B. Defleur, a social historian who studied drug arrests during this period in Chicago, stated that "police administrators indicated they were making the kind of arrests the public wanted". Additionally, some of Nixon's newly created drug enforcement agencies would resort to illegal practices to make arrests as they tried to meet public demand for arrest numbers. From 1972 to 1973, the Office of Drug Abuse and Law Enforcement performed 6,000 drug arrests in 18 months, the majority of those arrested being black.[74]
The next two Presidents, Gerald Ford and Jimmy Carter, responded with programs that were essentially a continuation of their predecessors'. Shortly after Ronald Reagan became President in 1981, he delivered a speech on the topic. Reagan announced, "We're taking down the surrender flag that has flown over so many drug efforts; we're running up a battle flag."[75] For his first five years in office, Reagan slowly strengthened drug enforcement by creating mandatory minimum sentencing and forfeiture of cash and real estate for drug offenses, policies far more detrimental to poor blacks than any other sector affected by the new laws.[citation needed]
Then, driven by the 1986 cocaine overdose of black basketball star Len Bias,[dubious discuss] Reagan was able to pass the Anti-Drug Abuse Act through Congress. This legislation appropriated an additional $1.7 billion to fund the War on Drugs. More importantly, it established 29 new, mandatory minimum sentences for drug offenses. In the entire history of the country up until that point, the legal system had only seen 55 minimum sentences in total.[76] A major stipulation of the new sentencing rules included different mandatory minimums for powder and crack cocaine. At the time of the bill, there was public debate as to the difference in potency and effect of powder cocaine, generally used by whites, and crack cocaine, generally used by blacks, with many believing that "crack" was substantially more powerful and addictive. Crack and powder cocaine are closely related chemicals, crack being a smokeable, freebase form of powdered cocaine hydrochloride which produces a shorter, more intense high while using less of the drug. This method is more cost effective, and therefore more prevalent on the inner-city streets, while powder cocaine remains more popular in white suburbia. The Reagan administration began shoring up public opinion against "crack", encouraging DEA official Robert Putnam to play up the harmful effects of the drug. Stories of "crack whores" and "crack babies" became commonplace; by 1986, Time had declared "crack" the issue of the year.[77] Riding the wave of public fervor, Reagan established much harsher sentencing for crack cocaine, handing down stiffer felony penalties for much smaller amounts of the drug.[78]
Reagan protégé and former Vice-President George H. W. Bush was next to occupy the Oval Office, and the drug policy under his watch held true to his political background. Bush maintained the hard line drawn by his predecessor and former boss, increasing narcotics regulation when the First National Drug Control Strategy was issued by the Office of National Drug Control in 1989.[79]
The next three presidents, Clinton, Bush and Obama, continued this trend, maintaining the War on Drugs as they inherited it upon taking office.[80] During this time of passivity by the federal government, it was the states that initiated controversial legislation in the War on Drugs. Racial bias manifested itself in the states through such controversial policies as the "stop and frisk" police practices in New York City and the "three strikes" felony laws that began in California in 1994.[81]
In August 2010, President Obama signed the Fair Sentencing Act into law that dramatically reduced the 100-to-1 sentencing disparity between powder and crack cocaine, which disproportionately affected minorities.[82]
A substantial part of the "Drug War" is the "Mexican Drug War." Many drugs are transported from Mexico into the United States, such as cocaine, marijuana, methamphetamine, and heroin.[citation needed]
The possession of cocaine is illegal in all fifty states, as is crack cocaine (a cheaper form of cocaine that carries a much greater penalty). Possession means the accused knowingly has the drug on their person, or in a backpack or purse. For a first offense of cocaine possession with no prior conviction, the person will be sentenced to a maximum of one year in prison or fined $1,000, or both. If the person has a prior conviction, whether for a narcotic or cocaine, they will be sentenced to two years in prison, a $2,500 fine, or both. With two or more convictions for possession prior to the present offense, they can be sentenced to 90 days in prison along with a $5,000 fine.[83]
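As a purely illustrative sketch, the tiered penalties described in the paragraph above can be encoded as a small lookup keyed by the number of prior convictions; the tiers and amounts are restated from the text, while the table and function names are hypothetical and not legal reference data.

```python
# Hypothetical encoding of the cocaine-possession penalty tiers described
# above; figures are restated from the text for illustration only.
PENALTY_TIERS = {
    0: {"prison": "up to 1 year", "fine_usd": 1_000},  # first offense, no priors
    1: {"prison": "2 years", "fine_usd": 2_500},       # one prior conviction
    2: {"prison": "90 days", "fine_usd": 5_000},       # two or more priors, per the text
}

def penalty_for(prior_convictions: int) -> dict:
    """Look up the tier for a given number of prior convictions (capped at 2)."""
    return PENALTY_TIERS[min(prior_convictions, 2)]

print(penalty_for(0))  # {'prison': 'up to 1 year', 'fine_usd': 1000}
```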
Marijuana is the most popular illegal drug worldwide. The punishment for possession of it is less than for the possession of cocaine or heroin. In some US states the drug is legal. Over 80 million Americans have tried this type of drug. The Criminal Defense Lawyer article claims that, depending on the age of the person and how much they have been caught possessing, they will be fined and could plea bargain into going to a treatment program instead of going to prison. Convictions differ from state to state, along with how much marijuana the person has on them.[84]
Crystal meth is composed of methamphetamine hydrochloride. It is marketed as either a white powder or in a solid (rock) form. The possession of crystal meth can result in a punishment varying from a fine to a jail sentence. If the convict possessed a large amount[clarification needed] of meth, the sentence will be longer.[85]
Heroin is an opiate that is highly addictive. If caught selling or possessing heroin, a perpetrator can be charged with a felony and face two to four years in prison and could be fined a maximum of $20,000.[86]
Some scholars have claimed that the phrase "War on Drugs" is propaganda cloaking an extension of earlier military or paramilitary operations.[7] Others have argued that large amounts of "drug war" foreign aid money, training, and equipment actually goes to fighting leftist insurgencies and is often provided to groups who themselves are involved in large-scale narco-trafficking, such as corrupt members of the Colombian military.[6]
From 1963 to the end of the Vietnam War in 1975, marijuana usage became common among U.S. soldiers in non-combat situations. Some servicemen also used heroin. Many servicemen came home addicted but ended their heroin use after returning to the United States. In 1971, the U.S. military conducted a study of drug use among American servicemen and women. It found that daily usage rates for drugs on a worldwide basis were as low as two percent.[87] However, in the spring of 1971, two congressmen released an alarming report alleging that 15% of the servicemen in Vietnam were addicted to heroin. Marijuana use was also common in Vietnam. Soldiers who used drugs had more disciplinary problems. The frequent drug use had become an issue for the commanders in Vietnam; in 1971 it was estimated that 30,000 servicemen were addicted to drugs, most of them to heroin.[9]
From 1971 on, therefore, returning servicemen were required to take a mandatory heroin test. Servicemen who tested positive upon returning from Vietnam were not allowed to return home until they had passed the test with a negative result. The program also offered a treatment for heroin addicts.[88]
Elliot Borin's article "The U.S. Military Needs its Speed", published in Wired on February 10, 2003, reports:
But the Defense Department, which distributed millions of amphetamine tablets to troops during World War II, Vietnam and the Gulf War, soldiers on, insisting that they are not only harmless but beneficial.
In a news conference held in connection with Schmidt and Umbach's Article 32 hearing, Dr. Pete Demitry, an Air Force physician and a pilot, claimed that the "Air Force has used (Dexedrine) safely for 60 years" with "no known speed-related mishaps."
"The need for speed," Demitry added, "is a life-and-death issue for our military."[89]
One of the first anti-drug efforts in the realm of foreign policy was President Nixon's Operation Intercept, announced in September 1969, targeted at reducing the amount of cannabis entering the United States from Mexico. The effort began with an intense inspection crackdown that resulted in an almost shutdown of cross-border traffic.[90] Because the burden on border crossings was controversial in border states, the effort only lasted twenty days.[91]
On December 20, 1989, the United States invaded Panama as part of Operation Just Cause, which involved 25,000 American troops. Gen. Manuel Noriega, head of the government of Panama, had been giving military assistance to Contra groups in Nicaragua at the request of the U.S. which, in exchange, tolerated his drug trafficking activities, which they had known about since the 1960s.[92][93] When the Drug Enforcement Administration (DEA) tried to indict Noriega in 1971, the CIA prevented them from doing so.[92] The CIA, which was then directed by future president George H. W. Bush, provided Noriega with hundreds of thousands of dollars per year as payment for his work in Latin America.[92] When CIA pilot Eugene Hasenfus was shot down over Nicaragua by the Sandinistas, documents aboard the plane revealed many of the CIA's activities in Latin America, and the CIA's connections with Noriega became a public relations "liability" for the U.S. government, which finally allowed the DEA to indict him for drug trafficking, after decades of tolerating his drug operations.[92] Operation Just Cause's purpose was to capture Noriega and overthrow his government; Noriega found temporary asylum in the Papal Nuncio, and surrendered to U.S. soldiers on January 3, 1990.[94] He was sentenced by a court in Miami to 45 years in prison.[92]
As part of its Plan Colombia program, the United States government currently provides hundreds of millions of dollars per year of military aid, training, and equipment to Colombia,[95] to fight left-wing guerrillas such as the Revolutionary Armed Forces of Colombia (FARC-EP), which has been accused of being involved in drug trafficking.[96]
Private U.S. corporations have signed contracts to carry out anti-drug activities as part of Plan Colombia. DynCorp, the largest private company involved, was among those contracted by the State Department, while others signed contracts with the Defense Department.[97]
Colombian military personnel have received extensive counterinsurgency training from U.S. military and law enforcement agencies, including the School of the Americas (SOA). Author Grace Livingstone has stated that more Colombian SOA graduates have been implicated in human rights abuses than currently known SOA graduates from any other country. All of the commanders of the brigades highlighted in a 2001 Human Rights Watch report on Colombia were graduates of the SOA, including the III Brigade in Valle del Cauca, where the 2001 Alto Naya Massacre occurred. US-trained officers have been accused of being directly or indirectly involved in many atrocities during the 1990s, including the Massacre of Trujillo and the 1997 Mapiripán Massacre.
In 2000, the Clinton administration initially waived all but one of the human rights conditions attached to Plan Colombia, considering such aid as crucial to national security at the time.[98]
The efforts of U.S. and Colombian governments have been criticized for focusing on fighting leftist guerrillas in southern regions without applying enough pressure on right-wing paramilitaries and continuing drug smuggling operations in the north of the country.[99][100] Human Rights Watch, congressional committees and other entities have documented the existence of connections between members of the Colombian military and the AUC, which the U.S. government has listed as a terrorist group, and that Colombian military personnel have committed human rights abuses which would make them ineligible for U.S. aid under current laws.[citation needed]
In 2010, the Washington Office on Latin America concluded that both Plan Colombia and the Colombian government's security strategy "came at a high cost in lives and resources, only did part of the job, are yielding diminishing returns and have left important institutions weaker."[101]
A 2014 report by the RAND Corporation, issued to analyze viable strategies for the Mexican drug war in light of successes experienced in Colombia, noted:
Between 1999 and 2002, the United States gave Colombia $2.04 billion in aid, 81 percent of which was for military purposes, placing Colombia just below Israel and Egypt among the largest recipients of U.S. military assistance. Colombia increased its defense spending from 3.2 percent of gross domestic product (GDP) in 2000 to 4.19 percent in 2005. Overall, the results were extremely positive. Greater spending on infrastructure and social programs helped the Colombian government increase its political legitimacy, while improved security forces were better able to consolidate control over large swaths of the country previously overrun by insurgents and drug cartels.
It also notes that, "Plan Colombia has been widely hailed as a success, and some analysts believe that, by 2010, Colombian security forces had finally gained the upper hand once and for all."[102]
The Mérida Initiative is a security cooperation agreement between the United States, the government of Mexico, and the countries of Central America. It was approved on June 30, 2008, and its stated aim is combating the threats of drug trafficking and transnational crime. The Mérida Initiative appropriated $1.4 billion in a three-year commitment (2008–2010) to the Mexican government for military and law enforcement training and equipment, as well as technical advice and training to strengthen the national justice systems. The Mérida Initiative targeted many high-ranking government officials, but it failed to address the thousands of Central Americans who had to flee their countries due to the dangers they faced every day because of the war on drugs, and there is still no plan that addresses these people. No weapons are included in the plan.[103][104]
The United States regularly sponsors the spraying of large amounts of herbicides such as glyphosate over the jungles of Central and South America as part of its drug eradication programs. Environmental consequences resulting from aerial fumigation have been criticized as detrimental to some of the world's most fragile ecosystems;[105] the same aerial fumigation practices are further credited with causing health problems in local populations.[106]
In 2012, the U.S. sent DEA agents to Honduras to assist security forces in counternarcotics operations. Honduras has been a major stop for drug traffickers, who use small planes and landing strips hidden throughout the country to transport drugs. The U.S. government made agreements with several Latin American countries to share intelligence and resources to counter the drug trade. DEA agents, working with other U.S. agencies such as the State Department, the CBP, and Joint Task Force-Bravo, assisted Honduras troops in conducting raids on traffickers' sites of operation.[107]
The War on Drugs has been a highly contentious issue since its inception. A poll on October 2, 2008, found that three in four Americans believed that the War On Drugs was failing.[108]
At a meeting in Guatemala in 2012, three former presidents from Guatemala, Mexico and Colombia said that the war on drugs had failed and that they would propose a discussion on alternatives, including decriminalization, at the Summit of the Americas in April of that year.[109] Guatemalan President Otto Pérez Molina said that the war on drugs was exacting too high a price on the lives of Central Americans and that it was time to "end the taboo on discussing decriminalization".[110] At the summit, the government of Colombia pushed for the most far-reaching change to drugs policy since the war on narcotics was declared by Nixon four decades prior, citing the catastrophic effects it had had in Colombia.[111]
Several critics have compared the wholesale incarceration of the dissenting minority of drug users to the wholesale incarceration of other minorities in history. Psychiatrist Thomas Szasz, for example, writes in 1997: "Over the past thirty years, we have replaced the medical-political persecution of illegal sex users ('perverts' and 'psychopaths') with the even more ferocious medical-political persecution of illegal drug users."[112]
Penalties for drug crimes among American youth almost always involve permanent or semi-permanent removal from opportunities for education, the loss of voting rights, and the creation of criminal records that make employment more difficult.[113] Thus, some authors maintain that the War on Drugs has resulted in the creation of a permanent underclass of people who have few educational or job opportunities, often as a result of being punished for drug offenses which in turn have resulted from attempts to earn a living in spite of having no education or job opportunities.[113]
According to a 2008 study published by Harvard economist Jeffrey A. Miron, the annual savings on enforcement and incarceration costs from the legalization of drugs would amount to roughly $41.3 billion, with $25.7 billion being saved among the states and over $15.6 billion accrued for the federal government. Miron further estimated at least $46.7 billion in tax revenue based on rates comparable to those on tobacco and alcohol ($8.7 billion from marijuana, $32.6 billion from cocaine and heroin, remainder from other drugs).[114]
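The figures above are internally consistent; the following minimal Python sketch simply tallies the numbers quoted from the Miron study, with the "other drugs" share inferred by subtraction rather than taken from the report itself:

```python
# Quick tally of the Miron (2008) figures quoted above.
state_savings = 25.7      # $ billions saved by state and local governments
federal_savings = 15.6    # $ billions saved by the federal government
total_savings = state_savings + federal_savings
print(f"Enforcement/incarceration savings: ${total_savings:.1f}B")  # ~41.3

tax_total = 46.7          # $ billions, total projected tax revenue
tax_marijuana = 8.7
tax_cocaine_heroin = 32.6
tax_other = tax_total - tax_marijuana - tax_cocaine_heroin
print(f"Implied revenue from other drugs: ${tax_other:.1f}B")       # ~5.4
```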
Low taxation in Central American countries has been credited with weakening the region's response to drug traffickers. Many cartels, especially Los Zetas, have taken advantage of the limited resources of these nations. In 2010, tax revenue in El Salvador, Guatemala, and Honduras amounted to just 13.53% of GDP; by comparison, taxes were 18.6% of GDP in Chile and 26.9% in the U.S. However, direct taxes on income are very hard to enforce, and in some cases tax evasion is seen as a national pastime.[115]
The status of coca and coca growers has become an intense political issue in several countries, including Colombia and particularly Bolivia, where the president, Evo Morales, a former coca growers' union leader, has promised to legalise the traditional cultivation and use of coca.[116] Indeed, legalization efforts have yielded some successes under the Morales administration when combined with aggressive and targeted eradication efforts. The country saw a 12-13% decline in coca cultivation[116] in 2011 under Morales, who has used coca growers' federations to ensure compliance with the law rather than providing a primary role for security forces.[116]
The coca eradication policy has been criticised for its negative impact on the livelihood of coca growers in South America. In many areas of South America the coca leaf has traditionally been chewed and used in tea and for religious, medicinal and nutritional purposes by locals.[117] For this reason many insist that the illegality of traditional coca cultivation is unjust. In many areas the US government and military has forced the eradication of coca without providing for any meaningful alternative crop for farmers, and has additionally destroyed many of their food or market crops, leaving them starving and destitute.[117]
The CIA, DEA, State Department, and several other U.S. government agencies have been implicated in relations with various groups involved in drug trafficking.
Senator John Kerry's 1988 U.S. Senate Committee on Foreign Relations report on Contra drug links concludes that members of the U.S. State Department "who provided support for the Contras are involved in drug trafficking... and elements of the Contras themselves knowingly receive financial and material assistance from drug traffickers."[118] The report further states that "the Contra drug links include... payments to drug traffickers by the U.S. State Department of funds authorized by the Congress for humanitarian assistance to the Contras, in some cases after the traffickers had been indicted by federal law enforcement agencies on drug charges, in others while traffickers were under active investigation by these same agencies."
In 1996, journalist Gary Webb published reports in the San Jose Mercury News,[119] and later in his book Dark Alliance,[120] detailing how Contras had been involved in distributing crack cocaine into Los Angeles whilst receiving money from the CIA. The Contras used money from drug trafficking to buy weapons.
Webb's premise regarding the U.S. government connection was initially attacked by the media. It is now widely accepted that Webb's main assertion of government "knowledge of drug operations, and collaboration with and protection of known drug traffickers" was correct.[121] In 1998, CIA Inspector General Frederick Hitz published a two-volume report[122] that, while seemingly refuting Webb's claims of knowledge and collaboration in its conclusions, did not deny them in its body.[123] Hitz went on to admit CIA improprieties in the affair in testimony to a House congressional committee.
According to Rodney Campbell, an editorial assistant to Nelson Rockefeller, during World War II, the United States Navy, concerned that strikes and labor disputes in U.S. eastern shipping ports would disrupt wartime logistics, released the mobster Lucky Luciano from prison, and collaborated with him to help the mafia take control of those ports. Labor union members were terrorized and murdered by mafia members as a means of preventing labor unrest and ensuring smooth shipping of supplies to Europe.[124]
According to Alexander Cockburn and Jeffrey St. Clair, in order to prevent Communist party members from being elected in Italy following World War II, the CIA worked closely with the Sicilian Mafia, protecting them and assisting in their worldwide heroin smuggling operations. The mafia was in conflict with leftist groups and was involved in assassinating, torturing, and beating leftist political organizers.[125]
In 1986, the US Defense Department funded a two-year study by the RAND Corporation, which found that the use of the armed forces to interdict drugs coming into the United States would have little or no effect on cocaine traffic and might, in fact, raise the profits of cocaine cartels and manufacturers. The 175-page study, "Sealing the Borders: The Effects of Increased Military Participation in Drug Interdiction", was prepared by seven researchers, mathematicians and economists at the National Defense Research Institute, a branch of the RAND, and was released in 1988. The study noted that seven prior studies in the past nine years, including one by the Center for Naval Research and the Office of Technology Assessment, had come to similar conclusions. Interdiction efforts, using current armed forces resources, would have almost no effect on cocaine importation into the United States, the report concluded.[126]
During the early-to-mid-1990s, the Clinton administration ordered and funded a major cocaine policy study, again by RAND. The Rand Drug Policy Research Center study concluded that $3 billion should be switched from federal and local law enforcement to treatment. The report said that treatment is the cheapest way to cut drug use, stating that drug treatment is twenty-three times more effective than the supply-side "war on drugs".[127]
The National Research Council Committee on Data and Research for Policy on Illegal Drugs published its findings in 2001 on the efficacy of the drug war. The NRC Committee found that existing studies on efforts to address drug usage and smuggling, from U.S. military operations to eradicate coca fields in Colombia to domestic drug treatment centers, have all been inconclusive, if the programs have been evaluated at all: "The existing drug-use monitoring systems are strikingly inadequate to support the full range of policy decisions that the nation must make.... It is unconscionable for this country to continue to carry out a public policy of this magnitude and cost without any way of knowing whether and to what extent it is having the desired effect."[128] The study, though not ignored by the press, was ignored by top-level policymakers, leading Committee Chair Charles Manski to conclude, as one observer notes, that "the drug war has no interest in its own results".[129]
In mid-1995, the US government tried to reduce the supply of methamphetamine precursors to disrupt the market of this drug. According to a 2009 study, this effort was successful, but its effects were largely temporary.[130]
During alcohol prohibition, the period from 1920 to 1933, alcohol use initially fell but began to increase as early as 1922. It has been extrapolated that even if prohibition had not been repealed in 1933, alcohol consumption would have quickly surpassed pre-prohibition levels.[131] One argument against the War on Drugs is that it uses similar measures as Prohibition and is no more effective.
In the six years from 2000 to 2006, the U.S. spent $4.7 billion on Plan Colombia, an effort to eradicate coca production in Colombia. The main result of this effort was to shift coca production into more remote areas and force other forms of adaptation. The overall acreage cultivated for coca in Colombia at the end of the six years was found to be the same, after the U.S. Drug Czar's office announced a change in measuring methodology in 2005 and included new areas in its surveys.[132] Cultivation in the neighboring countries of Peru and Bolivia increased, an effect some describe as "squeezing a balloon".[133]
A similar lack of efficacy is observed in some other countries pursuing similar[citation needed] policies. In 1994, 28.5% of Canadians reported having consumed illicit drugs in their life; by 2004, that figure had risen to 45%. 73% of the $368 million spent by the Canadian government on targeting illicit drugs in 2004–2005 went toward law enforcement rather than treatment, prevention or harm reduction.[134]
Richard Davenport-Hines, in his book The Pursuit of Oblivion,[135] criticized the efficacy of the War on Drugs by pointing out that
10–15% of illicit heroin and 30% of illicit cocaine is intercepted. Drug traffickers have gross profit margins of up to 300%. At least 75% of illicit drug shipments would have to be intercepted before the traffickers' profits were hurt.
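The 75% figure follows from simple arithmetic under one reading of the numbers above. A minimal sketch, assuming a "300% gross profit margin" means a delivered shipment sells for four times its cost and that an intercepted shipment is a total loss to the trafficker:

```python
# Back-of-the-envelope check of the 75% interdiction threshold quoted above.
cost = 1.0                   # normalized cost of one shipment
revenue_per_delivered = 4.0  # assumption: 300% margin => sale price is 4x cost

def expected_profit(interception_rate: float) -> float:
    """Expected profit per shipment when a given fraction is intercepted."""
    return (1 - interception_rate) * revenue_per_delivered - cost

for rate in (0.15, 0.30, 0.50, 0.75, 0.80):
    print(f"interception {rate:.0%}: expected profit {expected_profit(rate):+.2f}")
# Expected profit only reaches zero at a 75% interception rate, far above the
# 10-15% (heroin) and 30% (cocaine) rates cited by Davenport-Hines.
```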
Alberto Fujimori, president of Peru from 1990 to 2000, described U.S. foreign drug policy as "failed" on grounds that "for 10 years, there has been a considerable sum invested by the Peruvian government and another sum on the part of the American government, and this has not led to a reduction in the supply of coca leaf offered for sale. Rather, in the 10 years from 1980 to 1990, it grew 10-fold."[136]
At least 500 economists, including Nobel Laureates Milton Friedman,[137] George Akerlof and Vernon L. Smith, have noted that reducing the supply of marijuana without reducing the demand causes the price, and hence the profits of marijuana sellers, to go up, according to the laws of supply and demand.[138] The increased profits encourage the producers to produce more drugs despite the risks, providing a theoretical explanation for why attacks on drug supply have failed to have any lasting effect. The aforementioned economists published an open letter to President George W. Bush stating "We urge...the country to commence an open and honest debate about marijuana prohibition... At a minimum, this debate will force advocates of current policy to show that prohibition has benefits sufficient to justify the cost to taxpayers, foregone tax revenues and numerous ancillary consequences that result from marijuana prohibition."
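The supply-and-demand argument can be illustrated with a toy model. The sketch below is not from the economists' letter; its demand curve and numbers are assumptions chosen only to show the mechanism when demand is inelastic:

```python
# Toy illustration: with inelastic demand Q = A * P**(-elasticity),
# restricting supply raises the price more than proportionally, so
# sellers' total revenue rises. All numbers are made up for illustration.
A = 100.0
elasticity = 0.5  # |elasticity| < 1 means demand is inelastic (assumption)

def price_for_quantity(q: float) -> float:
    """Invert the demand curve: the price at which buyers demand quantity q."""
    return (A / q) ** (1 / elasticity)

for q in (10.0, 8.0, 6.0):  # enforcement shrinks the quantity supplied
    p = price_for_quantity(q)
    print(f"quantity {q:4.1f}  price {p:7.2f}  seller revenue {p * q:9.2f}")
# As quantity falls, revenue (and the incentive to keep producing) goes up.
```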
The declaration from the 2008 World Forum Against Drugs states that a balanced policy of drug abuse prevention, education, treatment, law enforcement, research, and supply reduction provides the most effective platform to reduce drug abuse and its associated harms, and it calls on governments to consider demand reduction one of their first priorities in the fight against drug abuse.[139]
Despite over $7 billion spent annually towards arresting[140] and prosecuting nearly 800,000 people across the country for marijuana offenses in 2005[citation needed] (FBI Uniform Crime Reports), the federally funded Monitoring the Future Survey reports about 85% of high school seniors find marijuana "easy to obtain". That figure has remained virtually unchanged since 1975, never dropping below 82.7% in three decades of national surveys.[141] The Drug Enforcement Administration states that the number of users of marijuana in the U.S. declined between 2000 and 2005 even with many states passing new medical marijuana laws making access easier,[142] though usage rates remain higher than they were in the 1990s according to the National Survey on Drug Use and Health.[143]
ONDCP stated in April 2011 that there has been a 46 percent drop in cocaine use among young adults over the past five years, and a 65 percent drop in the rate of people testing positive for cocaine in the workplace since 2006.[144] At the same time, a 2007 study found that up to 35% of college undergraduates used stimulants not prescribed to them.[145]
A 2013 study found that prices of heroin, cocaine and cannabis had decreased from 1990 to 2007, but the purity of these drugs had increased during the same time.[146]
The legality of the War on Drugs has been challenged on four main grounds in the US.
Several authors believe that the United States' federal and state governments have chosen the wrong methods for combating the distribution of illicit substances. By funneling individuals through courts and prisons, aggressive, heavy-handed enforcement focuses government efforts on punishment instead of treating the causes of addiction. By making drugs illegal rather than regulating them, the War on Drugs also creates a highly profitable black market. Jefferson Fish has edited scholarly collections of articles offering a wide variety of public-health-based and rights-based alternative drug policies.[147][148][149]
In the year 2000, the United States drug-control budget reached 18.4 billion dollars,[150] nearly half of which was spent financing law enforcement while only one sixth was spent on treatment. In the year 2003, 53 percent of the requested drug control budget was for enforcement, 29 percent for treatment, and 18 percent for prevention.[151] The state of New York, in particular, designated 17 percent of its budget towards substance-abuse-related spending. Of that, a mere one percent was put towards prevention, treatment, and research.
In a survey taken by Substance Abuse and Mental Health Services Administration (SAMHSA), it was found that substance abusers that remain in treatment longer are less likely to resume their former drug habits. Of the people that were studied, 66 percent were cocaine users. After experiencing long-term in-patient treatment, only 22 percent returned to the use of cocaine. Treatment had reduced the number of cocaine abusers by two-thirds.[150] By spending the majority of its money on law enforcement, the federal government had underestimated the true value of drug-treatment facilities and their benefit towards reducing the number of addicts in the U.S.
In 2004 the federal government issued the National Drug Control Strategy. It supported programs designed to expand treatment options, enhance treatment delivery, and improve treatment outcomes. For example, the Strategy provided SAMHSA with a $100.6 million grant to put towards its Access to Recovery (ATR) initiative. ATR is a program that provides vouchers to addicts to give them the means to acquire clinical treatment or recovery support. The project's goals are to expand capacity, support client choice, and increase the array of faith-based and community-based providers for clinical treatment and recovery support services.[152] The ATR program will also provide a more flexible array of services based on the individual's treatment needs.
The 2004 Strategy additionally declared a significant $32 million increase in funding for the Drug Courts Program, which provides drug offenders with alternatives to incarceration. As a substitute for imprisonment, drug courts identify substance-abusing offenders and place them under strict court monitoring and community supervision, as well as provide them with long-term treatment services.[153] According to a report issued by the National Drug Court Institute, drug courts have a wide array of benefits: only 16.4 percent of the nation's drug court graduates are rearrested and charged with a felony within one year of completing the program, versus the 44.1 percent of released prisoners who end up back in prison within one year. Additionally, enrolling an addict in a drug court program costs much less than incarcerating one in prison.[154] According to the Bureau of Prisons, the fee to cover the average cost of incarceration for federal inmates in 2006 was $24,440.[155] The annual cost of receiving treatment in a drug court program ranges from $900 to $3,500. Drug courts in New York State alone saved $2.54 million in incarceration costs.[154]
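Using only the figures cited above (the $24,440 annual federal incarceration cost and the $900–$3,500 drug court range), a rough per-person comparison looks like the following sketch; it is illustrative only, since the two programs serve different populations:

```python
# Rough per-person annual cost comparison using only the figures quoted above.
incarceration_annual = 24_440     # BOP average annual cost per federal inmate (2006)
drug_court_annual = (900, 3_500)  # reported annual range for drug court treatment

for label, cost in zip(("low", "high"), drug_court_annual):
    saving = incarceration_annual - cost
    print(f"drug court ({label} estimate): ${cost:,}/yr, "
          f"about ${saving:,} less per person than incarceration")
```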