

Technology Articles, Technological News | Popular Science

Innovations in materials, artificial intelligence, and stealth technology are continually retooling modern warfare. Here we delve into the latest developments, be they ultra-stealthy fuselages, superalloys that make engines more efficient, or tanks with radar-guided munitions and projectile launchers. If you’re fascinated by drones, catapults, and aircraft with foldable wings, take heart: you’re not alone.


Technology r/technology – reddit

We have posted this before, but this needs to be reiterated.

We understand that many of you are emotionally driven to discuss your feelings on recent events, most notably the repeal of Net Neutrality. However, inciting violence towards others is never ok. It is upsetting that we even have to post this.

Do we enjoy banning people for these types of offences? No… Many of us feel as if the system has failed and want some form of repercussion. But threats of violence and harassment are not the answer here.

And to be clear – here are some examples of what will get you banned:

I hope this PoS dies in a car fire

I want to punch him in the face til his teeth fall out

And if you are trying to be slick by using this form:

I never condone violence but…

I would never say he should die but…

Im not one to wish death upon but…

Let’s keep the threads civil.

If you violate this rule, you will be banned for 30 days, no exceptions


Technology – Wikipedia


Technology (“science of craft”, from Greek τέχνη, techne, “art, skill, cunning of hand”; and -λογία, -logia[2]) was first robustly defined by Jacob Bigelow in 1829 as: “…principles, processes, and nomenclatures of the more conspicuous arts, particularly those which involve applications of science, and which may be considered useful, by promoting the benefit of society, together with the emolument [compensation[3]] of those who pursue them”.[4]

The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.

Technology has many effects. It has helped develop more advanced economies (including today’s global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth’s environment. Innovations have always influenced the values of a society and raised new questions of the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

The use of the term “technology” has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and it was used either to refer to the description or study of the useful arts[14] or to allude to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[15]

The term “technology” rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term’s meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into “technology.” In German and other European languages, a distinction exists between technik and technologie that is absent in English, which usually translates both terms as “technology.” By the 1930s, “technology” referred not only to the study of the industrial arts but to the industrial arts themselves.[16]

In 1937, the American sociologist Read Bain wrote that “technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them.”[17] Bain’s definition remains common among scholars today, especially social scientists. Scientists and engineers usually prefer to define technology as applied science, rather than as the things that people make and use.[18] More recently, scholars have borrowed from European philosophers of “technique” to extend the meaning of technology to various forms of instrumental reason, as in Foucault’s work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Learner’s Dictionary offers a definition of the term: “the use of science in industry, engineering, etc., to invent useful things or to solve problems” and “a machine, piece of equipment, method, etc., that is created by technology.”[19] Ursula Franklin, in her 1989 “Real World of Technology” lecture, gave another definition of the concept; it is “practice, the way we do things around here.”[20] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[21] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as “the pursuit of life by means other than life,” and as “organized inorganic matter.”[22]

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[23] W. Brian Arthur defines technology in a similarly broad way as “a means to fulfill a human purpose.”[24]

The word “technology” can also be used to refer to a collection of techniques. In this context, it is the current state of humanity’s knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as “medical technology” or “space technology,” it refers to the state of the respective field’s knowledge and tools. “State-of-the-art technology” refers to the high technology available to humanity in any field.

Technology can be viewed as an activity that forms or changes culture.[25] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and as a result has helped spawn new subcultures; the rise of cyberculture has at its basis the development of the Internet and the computer.[26] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

The distinction between science, engineering, and technology is not always clear. Science is systematic knowledge of the physical or material world gained through observation and experimentation.[27] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability, and safety.[citation needed]

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering, although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[28]

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, it was widely considered in the United States that technology was simply “applied science” and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush’s treatise on postwar science policy, Science, the Endless Frontier: “New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature… This essential new knowledge can be obtained only through basic scientific research.”[29] In the late 1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[30][31]

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[32] with a brain mass approximately one third of modern humans.[33] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and a complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[34]

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 75,000 years ago,[35] pressure flaking provided a way to make much finer work.

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[36] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1 Ma;[37] scholarly consensus indicates that Homo erectus had controlled fire by between 500 and 400 ka.[38][39] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[40]

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity’s progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380 ka, humans were constructing temporary wood huts.[41][42] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200 ka and into other continents such as Eurasia.[43]

Humanity’s technological ascent began in earnest in what is known as the Neolithic Period (“New Stone Age”). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. The use of polished stone axes increased greatly in the Neolithic, though they were originally used in the preceding Mesolithic in some areas such as Ireland.[44] Agriculture fed larger populations, and the transition to sedentism allowed more children to be raised simultaneously, as infants no longer needed to be carried, as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[45][46]

With this increase in population and availability of labor came an increase in labor specialization.[47] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[48]

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead, native metals found in relatively pure form in nature.[49] The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 ka).[50] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first uses of iron alloys such as steel date to around 1800 BCE.[51][52]

Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to the 8th millennium BCE.[53] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and “catch” basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[54]

According to archaeologists, the wheel was invented around 4000 BCE probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe.[55] Estimates on when this may have occurred range from 5500 to 3000 BCE with most experts putting it closer to 4000 BCE.[56] The oldest artifacts with drawings depicting wheeled carts date from about 3500 BCE;[57] however, the wheel may have been in use for millennia before these drawings were made. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[58]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used the potter’s wheel and may have invented it.[59] A stone pottery wheel found in the city-state of Ur dates to around 3429 BCE,[60] and even older fragments of wheel-thrown pottery have been found in the same area.[60] Fast (rotary) potters’ wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[61] and were first used in Mesopotamia and Iran in around 3000 BCE.[61]

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4000 BCE[62] and timber roads leading through the swamps of Glastonbury, England, dating to around the same time period.[62] The first long-distance road, which came into use around 3500 BCE,[62] spanned 1,500 miles from the Persian Gulf to the Mediterranean Sea,[62] but was not paved and was only partially maintained.[62] In around 2000 BCE, the Minoans on the Greek island of Crete built a fifty-kilometer (thirty-mile) road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[62] Unlike the earlier road, the Minoan road was completely paved.[62]

Ancient Minoan private homes had running water.[64] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[64][65] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[64] The ancient Romans had many public flush toilets,[65] which emptied into an extensive sewage system.[65] The primary sewer in Rome was the Cloaca Maxima;[65] construction began on it in the sixth century BCE and it is still in use today.[65]

The ancient Romans also had a complex system of aqueducts,[63] which were used to transport water across long distances.[63] The first Roman aqueduct was built in 312 BCE.[63] The eleventh and final ancient Roman aqueduct was built in 226 CE.[63] Put together, the Roman aqueducts extended over 450 kilometers,[63] but less than seventy kilometers of this was above ground and supported by arches.[63]

Innovation continued through the Middle Ages with the introduction of silk, the horse collar, and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills, and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.

Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb, and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight and advancements in medicine, chemistry, physics, and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their food supply. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.

Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly complex tools. Modern technology increasingly relies on training and education; its designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation, and architecture.

Generally, technicism is the belief in the utility of technology for improving human societies.[66] Taken to an extreme, technicism “reflects a fundamental attitude which seeks to control reality, to resolve all problems with the use of scientific-technological methods and tools.”[67] In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[68] connect these ideas to the abdication of religion as a higher moral authority.

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good.

Transhumanists generally believe that the point of technology is to overcome barriers, and that what we commonly refer to as the human condition is just another barrier to be surpassed.

Singularitarians believe in some sort of “accelerating change”; that the rate of technological progress accelerates as we obtain more technology, and that this will culminate in a “Singularity” after artificial general intelligence is invented in which progress is nearly infinite; hence the term. Estimates for the date of this Singularity vary,[69] but prominent futurist Ray Kurzweil estimates the Singularity will occur in 2045.
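The singularitarian claim has a simple mathematical caricature: if the growth rate of a quantity rises faster than linearly with its current level, the quantity reaches infinity in finite time, and each doubling takes less time than the one before. The sketch below illustrates this with an arbitrary toy model (the equation, constants, and function name are illustrative choices, not Kurzweil’s own formulation):

```python
# Toy model of "accelerating change": dx/dt = k * x**p with p > 1
# blows up in finite time, so successive doublings of x get faster.
# k, p, x0, dt, and cap are arbitrary illustrative constants.

def doubling_times(k=1.0, p=1.5, x0=1.0, dt=1e-5, cap=1e6):
    """Integrate dx/dt = k*x**p with forward Euler and record how
    long each successive doubling of x takes, stopping at `cap`."""
    x, t, last_t = x0, 0.0, 0.0
    target = 2 * x0
    intervals = []
    while x < cap:
        x += k * x ** p * dt   # Euler step
        t += dt
        if x >= target:        # x has doubled again
            intervals.append(t - last_t)
            last_t, target = t, 2 * target
    return intervals

times = doubling_times()
# Doubling intervals shrink toward zero as the blow-up time nears.
```

For comparison, p = 1 (ordinary exponential growth) gives a constant doubling time and no finite-time blow-up; the superlinear feedback is what distinguishes a "singularity" in this toy sense from mere exponential progress.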

Kurzweil is also known for his history of the universe in six epochs: (1) the physical/chemical epoch, (2) the life epoch, (3) the human/brain epoch, (4) the technology epoch, (5) the artificial intelligence epoch, and (6) the universal colonization epoch. Going from one epoch to the next is a Singularity in its own right, and a period of speeding up precedes it. Each epoch takes a shorter time, which means the whole history of the universe is one giant Singularity event.[70]

Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[71]

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become ever more technological at the cost of freedom and psychological health.

Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see “The Question Concerning Technology”[72]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, “Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that ‘in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.’ Indeed, he promises that ‘when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.’[73] What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.”[74]

Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics such as Aldous Huxley’s Brave New World, Anthony Burgess’s A Clockwork Orange, and George Orwell’s Nineteen Eighty-Four. In Goethe’s Faust, Faust selling his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction such as those by Philip K. Dick and William Gibson and films such as Blade Runner and Ghost in the Shell project highly ambivalent or cautionary attitudes toward technology’s impact on human society and identity.

The late cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called “technopolies,” societies that are dominated by the ideology of technological and scientific progress to the exclusion or harm of other cultural practices, values, and world-views.[75]

Darin Barney has written about technology’s impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible because they already give an answer to the question: a good life is one that includes the use of more and more technology.[76]

Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology, and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[77]

Another prominent critic of technology is Hubert Dreyfus, who has published books such as On the Internet and What Computers Still Can’t Do.

A more infamous anti-technological treatise is Industrial Society and Its Future, written by the Unabomber Ted Kaczynski and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure. There are also subcultures that disapprove of some or most technology, such as self-identified off-gridders.[78]

The notion of appropriate technology was developed in the 20th century by thinkers such as E. F. Schumacher and Jacques Ellul to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The ecovillage movement emerged in part due to this concern.

This section mainly focuses on American concerns, even if the discussion can reasonably be generalized to other Western countries.

The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. […] What’s the linkage between technology and this fundamental problem?

In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[79] questions the widespread idea that automation, and more broadly, technological advances have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Essentially, he takes a neutral view of the linkage between technology and American concerns about unemployment and declining wages.

He uses two main arguments to defend his point. First, because of recent technological advances, an increasing number of workers are losing their jobs. Yet, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual jobs that require “flexibility, judgment and common sense”[80] remain hard to replace with machines. Second, studies have not shown clear links between recent technological advances and the wage trends of the last decades.

Therefore, according to Bernstein, instead of focusing on technology and its hypothetical influence on rising American unemployment and declining wages, one needs to worry more about “bad policy that fails to offset the imbalances in demand, trade, income, and opportunity.”[80]

People who use both the Internet and mobile devices in excessive amounts are likely to experience fatigue and exhaustion as a result of disruptions in their sleeping patterns. Studies have shown that increased BMI and weight gain are associated with people who spend long hours online and do not exercise frequently.[81] Heavy Internet use is also associated with lower school grades among those who use it in excessive amounts.[82] It has also been noted that the use of mobile phones whilst driving has increased the occurrence of road accidents, particularly amongst teen drivers. Statistically, teens have four times as many road traffic incidents as drivers aged 20 or older, and a very high percentage of adolescents write (81%) and read (92%) texts while driving.[83] In this context, mass media and technology have a negative impact on people’s mental and physical health.

Thomas P. Hughes stated that because technology has been considered as a key way to solve problems, we need to be aware of its complex and varied characters to use it more efficiently.[84] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them, or none of them as technologies?

Technology is often considered too narrowly; according to Hughes, “Technology is a creative process involving human ingenuity”.[85] This definition’s emphasis on creativity avoids unbounded definitions that may mistakenly include cooking “technologies,” but it also highlights the prominent role of humans and therefore their responsibilities for the use of complex technological systems.

Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argues that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[84] For instance, Evgeny Morozov particularly challenges two concepts: Internet-centrism and solutionism.[86] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology, and especially thanks to the Internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal’s review of Morozov’s theory, ignoring these will lead to “unexpected consequences that could eventually cause more damage than the problems they seek to address.”[87] Benjamin R. Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[88]

Therefore, recognition of the limitations of technology, and more broadly, of scientific knowledge, is needed especially in cases dealing with environmental justice and health issues. Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists’ and engineers’ new comprehension of their role. Such an approach to technology and science “[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions.”[89]

Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane, but it is all technology, and its exploitation is the foundation of all competitive advantage.

Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and to transform the US into a superpower. It was not economic-based planning.

The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees,[90] some dolphin communities,[91] and crows.[92][93] Considering a more generic perspective of technology as ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.

The ability to make and use tools was once considered a defining characteristic of the genus Homo.[94] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion that the use of technology is unique to humans. For example, researchers have observed wild chimpanzees using tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[95] West African chimpanzees also use stone hammers and anvils for cracking nuts,[96] as do capuchin monkeys of Boa Vista, Brazil.[97]

Theories of technology often attempt to predict the future of technology based on the high technology and science of the time. As with all predictions of the future, however, technology's future is uncertain.

In 2005, futurist Ray Kurzweil predicted that the future of technology would mainly consist of an overlapping “GNR Revolution” of genetics, nanotechnology and robotics, with robotics being the most important of the three.[98]

Read the rest here:

Technology – Wikipedia


Technology Chevron.com

Technology plays an important role in helping us deliver affordable, reliable energy that fuels human progress and economic growth around the world. The technologies we deploy not only help us cost-effectively find and commercialize new oil and gas fields, but also help us recover more resources from existing fields. They enable us to integrate data and information so that we can manage and develop our global assets efficiently. And they help us advance emerging energy with the goal of developing scalable and economical new resources while reducing our environmental footprint.

See the rest here:

Technology Chevron.com

Technology – definition of technology by The Free Dictionary

Technology is almost magical, and ambition for a better life is now universal. I had made up my mind to be an engineer, and I went over to the Boston Institute of Technology. This article advocates that once teachers identify the components of technology to be taught, or the ones most appropriate to augment learning, as trained professionals they should also devise pedagogical strategies that they can use in their instruction to yield positive learning outcomes (Sandholtz, Ringstaff & Dyer, 1997). The S & T community is best able to identify potential technology solutions, assuming that needs have been clearly defined, but they are usually not the best qualified to make business decisions related to development of the technology or issues related to manufacturing of the product. In this article, modern technology refers to technologies which have been developed during and after the Industrial Revolution, mostly in the West, and which have now spread all over the world. A very successful Forest, Wood and Paper Industry Technology Summit was held in Peachtree City, Georgia, USA in May 2001, where leading scientists, educators, technology leaders and policymakers provided significant input to a Business Plan for Agenda 2020. Of the two broad categories of medical technology to observe, “internal” can be broadly seen as provider-to-provider and “external,” more specifically, provider-to-patient. It is because of these achievements that we bring you our choices of the Top Ten Technology Districts. Move over, David Letterman, CPAs have developed their own top 10 lists, at least in the technology area. I’ve found over the past three to five years that it’s much more important for information systems to be involved strategically, partially because of what we’ve been going through as an organization in determining which technology we want to put in and how we can apply it. For executives facing mounting competition in global markets, strategic technology alliances offer a promise of dramatic improvements in competitive position. Rehabilitation technology is the systematic application of scientific and engineering principles to the rehabilitation process for the removal of barriers to employment and independent living.

Continue reading here:

Technology – definition of technology by The Free Dictionary

Ripple Price Prediction: xRapid Shows Success, But SEC Still Holds Power

XRP Prices Hang in the Balance
Ripple bears like to claim that XRP “serves no purpose” in its technology, but recent success with the “xRapid” software says otherwise. That—plus the continual “Is XRP a security?” debate—drove Ripple prices round and round in circles last week.

I see these two forces working in opposite directions.

Investors should be happy that xRapid is providing genuine benefits to businesses that dared to take a chance on XRP. But does it matter if the U.S. Securities & Exchange Commission (SEC) designates XRP a security?
xRapid Success
For the uninitiated, Ripple has multiple offerings. One is “xCurrent,” a.

The post Ripple Price Prediction: xRapid Shows Success, But SEC Still Holds Power appeared first on Profit Confidential.

Original post:

Ripple Price Prediction: xRapid Shows Success, But SEC Still Holds Power

Technology – Wikipedia


Technology (“science of craft”, from Greek τέχνη, techne, “art, skill, cunning of hand”; and -λογία, -logia[2]) was first robustly defined by Jacob Bigelow in 1829 as: “…principles, processes, and nomenclatures of the more conspicuous arts, particularly those which involve applications of science, and which may be considered useful, by promoting the benefit of society, together with the emolument [compensation[3]] of those who pursue them”.[4]

The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.

Technology has many effects. It has helped develop more advanced economies (including today’s global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth’s environment. Innovations have always influenced the values of a society and raised new questions of the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

The use of the term “technology” has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and it was used either to refer to the description or study of the useful arts[14] or to allude to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[15]

The term “technology” rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term’s meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into “technology.” In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, which usually translates both terms as “technology.” By the 1930s, “technology” referred not only to the study of the industrial arts but to the industrial arts themselves.[16]

In 1937, the American sociologist Read Bain wrote that “technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them.”[17] Bain’s definition remains common among scholars today, especially social scientists. Scientists and engineers usually prefer to define technology as applied science, rather than as the things that people make and use.[18] More recently, scholars have borrowed from European philosophers of “technique” to extend the meaning of technology to various forms of instrumental reason, as in Foucault’s work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Learner’s Dictionary offers a definition of the term: “the use of science in industry, engineering, etc., to invent useful things or to solve problems” and “a machine, piece of equipment, method, etc., that is created by technology.”[19] Ursula Franklin, in her 1989 “Real World of Technology” lecture, gave another definition of the concept; it is “practice, the way we do things around here.”[20] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[21] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as “the pursuit of life by means other than life,” and as “organized inorganic matter.”[22]

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, fall under this definition of technology.[23] W. Brian Arthur defines technology in a similarly broad way as “a means to fulfill a human purpose.”[24]

The word “technology” can also be used to refer to a collection of techniques. In this context, it is the current state of humanity’s knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as “medical technology” or “space technology,” it refers to the state of the respective field’s knowledge and tools. “State-of-the-art technology” refers to the high technology available to humanity in any field.

Technology can be viewed as an activity that forms or changes culture.[25] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and as a result has helped spawn new subcultures; the rise of cyberculture has at its basis the development of the Internet and the computer.[26] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

The distinction between science, engineering, and technology is not always clear. Science is systematic knowledge of the physical or material world gained through observation and experimentation.[27] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability, and safety.[citation needed]

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering, although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[28]

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, it was widely considered in the United States that technology was simply “applied science” and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush’s treatise on postwar science policy, Science, the Endless Frontier: “New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature… This essential new knowledge can be obtained only through basic scientific research.”[29] In the late-1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[30][31]

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[32] with a brain mass approximately one third of modern humans.[33] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[34]

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 75,000 years ago,[35] pressure flaking provided a way to make much finer work.

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[36] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1 Ma;[37] scholarly consensus indicates that Homo erectus had controlled fire by between 500 and 400 ka.[38][39] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[40]

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity’s progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380 ka, humans were constructing temporary wood huts.[41][42] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200 ka and into other continents such as Eurasia.[43]

Humanity’s technological ascent began in earnest in what is known as the Neolithic Period (“New Stone Age”). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. The use of polished stone axes increased greatly in the Neolithic, but they were originally used in the preceding Mesolithic in some areas such as Ireland.[44] Agriculture fed larger populations, and the transition to sedentism allowed simultaneously raising more children, as infants no longer needed to be carried, as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[45][46]

With this increase in population and availability of labor came an increase in labor specialization.[47] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[48]

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead, native metals found in relatively pure form in nature.[49] The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 ka).[50] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first uses of iron alloys such as steel dates to around 1800 BCE.[51][52]

Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to the 8th millennium BCE.[53] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and “catch” basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[54]

According to archaeologists, the wheel was invented around 4000 BCE probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe.[55] Estimates on when this may have occurred range from 5500 to 3000 BCE with most experts putting it closer to 4000 BCE.[56] The oldest artifacts with drawings depicting wheeled carts date from about 3500 BCE;[57] however, the wheel may have been in use for millennia before these drawings were made. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[58]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used the potter’s wheel and may have invented it.[59] A stone pottery wheel found in the city-state of Ur dates to around 3429 BCE,[60] and even older fragments of wheel-thrown pottery have been found in the same area.[60] Fast (rotary) potters’ wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[61] and were first used in Mesopotamia and Iran in around 3000 BCE.[61]

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4000 BCE[62] and timber roads leading through the swamps of Glastonbury, England, dating to around the same time period.[62] The first long-distance road, which came into use around 3500 BCE,[62] spanned 1,500 miles from the Persian Gulf to the Mediterranean Sea,[62] but was not paved and was only partially maintained.[62] In around 2000 BCE, the Minoans on the Greek island of Crete built a fifty-kilometer (thirty-mile) road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[62] Unlike the earlier road, the Minoan road was completely paved.[62]

Ancient Minoan private homes had running water.[64] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[64][65] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[64] The ancient Romans had many public flush toilets,[65] which emptied into an extensive sewage system.[65] The primary sewer in Rome was the Cloaca Maxima;[65] construction began on it in the sixth century BCE and it is still in use today.[65]

The ancient Romans also had a complex system of aqueducts,[63] which were used to transport water across long distances.[63] The first Roman aqueduct was built in 312 BCE.[63] The eleventh and final ancient Roman aqueduct was built in 226 CE.[63] Put together, the Roman aqueducts extended over 450 kilometers,[63] but less than seventy kilometers of this was above ground and supported by arches.[63]

Innovation continued through the Middle Ages, with the introduction of silk production, the horse collar, and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.

Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb, and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight and advancements in medicine, chemistry, physics, and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their food supply. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.

Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education; its designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation and architecture.

Generally, technicism is the belief in the utility of technology for improving human societies.[66] Taken to an extreme, technicism “reflects a fundamental attitude which seeks to control reality, to resolve all problems with the use of scientific-technological methods and tools.”[67] In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[68] connect these ideas to the abdication of religion as a higher moral authority.

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good.

Transhumanists generally believe that the point of technology is to overcome barriers, and that what we commonly refer to as the human condition is just another barrier to be surpassed.

Singularitarians believe in some sort of “accelerating change”; that the rate of technological progress accelerates as we obtain more technology, and that this will culminate in a “Singularity” after artificial general intelligence is invented in which progress is nearly infinite; hence the term. Estimates for the date of this Singularity vary,[69] but prominent futurist Ray Kurzweil estimates the Singularity will occur in 2045.
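The "accelerating change" claim can be made concrete with a toy calculation (a sketch under assumed parameters, not Kurzweil's own model): if each successive doubling of capability takes a fixed fraction of the time of the previous one, the total time for all remaining doublings is a convergent geometric series, so "nearly infinite" progress fits into a finite horizon.

```python
# Toy model of "accelerating change": capability doubles at intervals
# that shrink by a constant factor each time. If each doubling takes
# `shrink` times as long as the previous one, infinitely many doublings
# fit inside a finite horizon (a convergent geometric series).
# All parameters here are illustrative assumptions, not Kurzweil's figures.

def years_until_singularity(first_interval: float = 10.0,
                            shrink: float = 0.5) -> float:
    """Total time for infinitely many doublings: the sum of the
    geometric series first_interval * (1 + shrink + shrink**2 + ...),
    which equals first_interval / (1 - shrink)."""
    if not 0.0 < shrink < 1.0:
        raise ValueError("shrink must be in (0, 1) for the series to converge")
    return first_interval / (1.0 - shrink)

# With a 10-year first doubling and each subsequent doubling taking
# half as long, every remaining doubling completes within 20 years.
print(years_until_singularity())          # 20.0
print(years_until_singularity(6.0, 0.8))  # 30.0 (gentler shrink, longer horizon)
```

The point of the sketch is only the convergence: under this assumed geometric model, an unbounded number of doublings completes in finite time, which is the mathematical intuition behind the Singularity metaphor.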

Kurzweil is also known for his history of the universe in six epochs: (1) the physical/chemical epoch, (2) the life epoch, (3) the human/brain epoch, (4) the technology epoch, (5) the artificial intelligence epoch, and (6) the universal colonization epoch. Going from one epoch to the next is a Singularity in its own right, and a period of speeding up precedes it. Each epoch takes a shorter time, which means the whole history of the universe is one giant Singularity event.[70]

Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[71]

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become evermore technological at the cost of freedom and psychological health.

Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see “The Question Concerning Technology”[72]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, “Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that ‘in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.’ Indeed, he promises that ‘when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.'[73] What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.”[74]

Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics such as Aldous Huxley’s Brave New World, Anthony Burgess’s A Clockwork Orange, and George Orwell’s Nineteen Eighty-Four. In Goethe’s Faust, Faust selling his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction such as those by Philip K. Dick and William Gibson and films such as Blade Runner and Ghost in the Shell project highly ambivalent or cautionary attitudes toward technology’s impact on human society and identity.

The late cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called “technopolies,” societies that are dominated by the ideology of technological and scientific progress to the exclusion or harm of other cultural practices, values, and world-views.[75]

Darin Barney has written about technology’s impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible because they already give an answer to the question: a good life is one that includes the use of more and more technology.[76]

Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology, and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[77]

Another prominent critic of technology is Hubert Dreyfus, who has published books such as On the Internet and What Computers Still Can’t Do.

A more infamous anti-technological treatise is Industrial Society and Its Future, written by the Unabomber Ted Kaczynski and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure. There are also subcultures that disapprove of some or most technology, such as self-identified off-gridders.[78]

The notion of appropriate technology was developed in the 20th century by thinkers such as E. F. Schumacher and Jacques Ellul to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The ecovillage movement emerged in part due to this concern.

This section mainly focuses on American concerns, though the discussion can reasonably be generalized to other Western countries.

The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. […] What’s the linkage between technology and this fundamental problem?

In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[79] questions the widespread idea that automation, and more broadly, technological advances, have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Essentially, he takes a neutral approach to the linkage between technology and American concerns about unemployment and declining wages.

He uses two main arguments to defend his point. First, because of recent technological advances, an increasing number of workers are losing their jobs. Yet, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual jobs that “require flexibility, judgment and common sense”[80] remain hard to replace with machines. Second, studies have not shown clear links between recent technological advances and the wage trends of the last decades.

Therefore, according to Bernstein, instead of focusing on technology and its hypothetical influences on current American increasing unemployment and declining wages, one needs to worry more about “bad policy that fails to offset the imbalances in demand, trade, income, and opportunity.”[80]

People who use both the Internet and mobile devices in excessive quantities are likely to experience fatigue and exhaustion as a result of disruptions in their sleeping patterns. Ongoing studies have shown that increased BMI and weight gain are associated with people who spend long hours online and do not exercise frequently.[81] Heavy Internet use is also reflected in the lower school grades of those who use it in excessive amounts.[82] It has also been noted that the use of mobile phones while driving has increased the occurrence of road accidents, particularly among teen drivers. Statistically, teens have roughly four times as many road traffic incidents as those who are 20 years or older, and a very high percentage of adolescents write (81%) and read (92%) texts while driving.[83] In this context, mass media and technology have a negative impact on people's mental and physical health.

Thomas P. Hughes stated that because technology has been considered a key way to solve problems, we need to be aware of its complex and varied character to use it more efficiently.[84] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them, or none of them as technologies?

Technology is often considered too narrowly; according to Hughes, “Technology is a creative process involving human ingenuity”.[85] This definition’s emphasis on creativity avoids unbounded definitions that may mistakenly include cooking “technologies,” but it also highlights the prominent role of humans and therefore their responsibilities for the use of complex technological systems.

Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argues that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[84] For instance, Evgeny Morozov particularly challenges two concepts: “Internet-centrism” and “solutionism”.[86] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology, and especially thanks to the Internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal’s review of Morozov’s theory, to ignore them will lead to “unexpected consequences that could eventually cause more damage than the problems they seek to address.”[87] Benjamin R. Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[88]

Therefore, recognition of the limitations of technology, and more broadly, of scientific knowledge, is needed, especially in cases dealing with environmental justice and health issues. Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists’ and engineers’ new comprehension of their role. Such an approach to technology and science “[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions.”[89]

Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane, but it is all technology, and its exploitation is the foundation of all competitive advantage.

Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and it is what was used to transform the US into a superpower. It was not economic-based planning.

The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees,[90] some dolphin communities,[91] and crows.[92][93] Considering a more generic perspective of technology as ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.

The ability to make and use tools was once considered a defining characteristic of the genus Homo.[94] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion of the use of technology as unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[95] West African chimpanzees also use stone hammers and anvils for cracking nuts,[96] as do capuchin monkeys of Boa Vista, Brazil.[97]

Theories of technology often attempt to predict the future of technology based on the high technology and science of the time. As with all predictions of the future, however, the future of technology is uncertain.

In 2005, futurist Ray Kurzweil predicted that the future of technology would mainly consist of an overlapping “GNR Revolution” of genetics, nanotechnology and robotics, with robotics being the most important of the three.[98]


Nineteen Eighty-Four – Wikipedia

Nineteen Eighty-Four, often published as 1984, is a dystopian novel published in 1949 by English author George Orwell.[2][3] The novel is set in the year 1984 when most of the world population have become victims of perpetual war, omnipresent government surveillance and public manipulation.

In the novel, Great Britain (“Airstrip One”) has become a province of a superstate named Oceania. Oceania is ruled by the “Party”, who employ the “Thought Police” to persecute individualism and independent thinking.[4] The Party’s leader is Big Brother, who enjoys an intense cult of personality but may not even exist. The protagonist of the novel, Winston Smith, is a rank-and-file Party member. Smith is an outwardly diligent and skillful worker, but he secretly hates the Party and dreams of rebellion against Big Brother. Smith rebels by entering a forbidden relationship with fellow employee Julia.

As literary political fiction and dystopian science-fiction, Nineteen Eighty-Four is a classic novel in content, plot, and style. Many of its terms and concepts, such as Big Brother, doublethink, thoughtcrime, Newspeak, Room 101, telescreen, 2 + 2 = 5, and memory hole, have entered into common usage since its publication in 1949. Nineteen Eighty-Four popularised the adjective Orwellian, which describes official deception, secret surveillance, brazenly misleading terminology, and manipulation of recorded history by a totalitarian or authoritarian state.[5] In 2005, the novel was chosen by Time magazine as one of the 100 best English-language novels from 1923 to 2005.[6] It was awarded a place on both lists of Modern Library 100 Best Novels, reaching number 13 on the editor’s list, and 6 on the readers’ list.[7] In 2003, the novel was listed at number 8 on the BBC’s survey The Big Read.[8]

In 1944, Orwell “encapsulate[d] the thesis at the heart of his unforgiving novel”: the implications of dividing the world up into zones of influence, which had been conjured up at the Tehran Conference. Three years later, he wrote most of it on the Scottish island of Jura from 1947 to 1948 despite being seriously ill with tuberculosis.[9][10] On 4 December 1948, he sent the final manuscript to the publisher Secker and Warburg, and Nineteen Eighty-Four was published on 8 June 1949.[11][12] By 1989, it had been translated into 65 languages, more than any other novel in English until then.[13] The title of the novel, its themes, the Newspeak language and the author’s surname are often invoked against control and intrusion by the state, and the adjective Orwellian describes a totalitarian dystopia characterised by government control and subjugation of the people.

Orwell’s invented language, Newspeak, satirises hypocrisy and evasion by the state: the Ministry of Love (Miniluv) oversees torture and brainwashing, the Ministry of Plenty (Miniplenty) oversees shortage and rationing, the Ministry of Peace (Minipax) oversees war and atrocity and the Ministry of Truth (Minitrue) oversees propaganda and historical revisionism.

The Last Man in Europe was an early title for the novel, but in a letter dated 22 October 1948 to his publisher Fredric Warburg, eight months before publication, Orwell wrote about hesitating between that title and Nineteen Eighty-Four.[14] Warburg suggested the latter as the main title, considering it more commercial.[15]

In the novel 1985 (1978), Anthony Burgess suggests that Orwell, disillusioned by the onset of the Cold War (1945–91), intended to call the book 1948. The introduction to the Penguin Books Modern Classics edition of Nineteen Eighty-Four reports that Orwell originally set the novel in 1980 but that he later shifted the date to 1982 and then to 1984. The introduction to the Houghton Mifflin Harcourt edition of Animal Farm and 1984 (2003) reports that the title 1984 was chosen simply as an inversion of the year 1948, the year in which it was being completed, and that the date was meant to give an immediacy and urgency to the menace of totalitarian rule.[16]

Throughout its publication history, Nineteen Eighty-Four has been either banned or legally challenged, as subversive or ideologically corrupting, like Aldous Huxley’s Brave New World (1932), We (1924) by Yevgeny Zamyatin, Darkness at Noon (1940) by Arthur Koestler, Kallocain (1940) by Karin Boye and Fahrenheit 451 (1953) by Ray Bradbury.[17] Some writers consider the Russian dystopian novel We by Zamyatin to have influenced Nineteen Eighty-Four,[18][19] and the novel bears significant similarities in its plot and characters to Darkness at Noon, written years before by Arthur Koestler, who was a personal friend of Orwell.[20]

The novel is in the public domain in Canada,[21] South Africa,[22] Argentina,[23] Australia,[24] and Oman.[25] It will be in the public domain in the United Kingdom, the EU,[26] and Brazil in 2021[27] (70 years after the author’s death), and in the United States in 2044.[28]

Nineteen Eighty-Four is set in Oceania, one of three inter-continental superstates that divided the world after a global war.

Smith’s memories and his reading of the proscribed book, The Theory and Practice of Oligarchical Collectivism by Emmanuel Goldstein, reveal that after the Second World War, the United Kingdom became involved in a war fought in Europe, western Russia, and North America during the early 1950s. Nuclear weapons were used during the war, leading to the destruction of Colchester. London would also suffer widespread aerial raids, leading Winston’s family to take refuge in a London Underground station. Britain fell to civil war, with street fighting in London, before the English Socialist Party, abbreviated as Ingsoc, emerged victorious and formed a totalitarian government in Britain. The British Commonwealth was absorbed by the United States to become Oceania.

Simultaneously, the Soviet Union conquered continental Europe and established the second superstate of Eurasia. The third superstate of Eastasia would emerge in the Far East after several decades of fighting. The three superstates wage perpetual war for the remaining unconquered lands of the world in “a rough quadrilateral with its corners at Tangier, Brazzaville, Darwin, and Hong Kong” through constantly shifting alliances. Although each of the three states is said to have sufficient natural resources, the war continues in order to maintain ideological control over the people.

However, because Winston barely remembers these events and because of the Party’s manipulation of history, the continuity and accuracy of these events are unclear. Winston himself notes that the Party has claimed credit for inventing helicopters, airplanes and trains, while Julia theorizes that the perpetual bombing of London is merely a false-flag operation designed to convince the populace that a war is occurring. If the official account was accurate, Smith’s strengthening memories and the story of his family’s dissolution suggest that the atomic bombings occurred first, followed by civil war featuring “confused street fighting in London itself” and the societal postwar reorganisation, which the Party retrospectively calls “the Revolution”.

Most of the plot takes place in London, the “chief city of Airstrip One”, the Oceanic province that “had once been called England or Britain”.[29][30] Posters of the Party leader, Big Brother, bearing the caption “BIG BROTHER IS WATCHING YOU”, dominate the city (Winston states it can be found on nearly every house), while the ubiquitous telescreen (transceiving television set) monitors the private and public lives of the populace. Military parades, propaganda films, and public executions are said to be commonplace.

The class hierarchy of Oceania has three levels: the elite Inner Party, the middle-class Outer Party, and the working-class Proles.

As the government, the Party controls the population with four ministries: the Ministry of Peace, the Ministry of Plenty, the Ministry of Truth, and the Ministry of Love.

The protagonist Winston Smith, a member of the Outer Party, works in the Records Department of the Ministry of Truth as an editor, revising historical records, to make the past conform to the ever-changing party line and deleting references to unpersons, people who have been “vaporised”, i.e., not only killed by the state but denied existence even in history or memory.

The story of Winston Smith begins on 4 April 1984: “It was a bright cold day in April, and the clocks were striking thirteen.” Yet he is uncertain of the true date, given the regime’s continual rewriting and manipulation of history.[31]

In the year 1984, civilization has been damaged by war, civil conflict, and revolution. Airstrip One (formerly Britain) is a province of Oceania, one of the three totalitarian super-states that rule the world. It is ruled by the “Party” under the ideology of “Ingsoc” and the mysterious leader Big Brother, who has an intense cult of personality. The Party stamps out anyone who does not fully conform to their regime using the Thought Police and constant surveillance, through devices such as telescreens (two-way televisions).

Winston Smith is a member of the middle class Outer Party. He works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history. Those who fall out of favour with the Party become “unpersons”, disappearing with all evidence of their existence removed. Winston revises past editions of The Times, while the original documents are destroyed by fire in a “memory hole”. He secretly opposes the Party’s rule and dreams of rebellion. He realizes that he is already a “thoughtcriminal” and likely to be caught one day.

While in a proletarian neighbourhood, he meets an antique shop owner called Mr. Charrington and buys a diary. He uses an alcove to hide it from the Telescreen in his room, and writes thoughts criticising the Party and Big Brother. In the journal, he records his sexual frustration over a young woman maintaining the novel-writing machines at the ministry named Julia, whom Winston is attracted to but suspects is an informant. He also suspects that his superior, an Inner Party official named O’Brien, is a secret agent for an enigmatic underground resistance movement known as the Brotherhood, a group formed by Big Brother’s reviled political rival Emmanuel Goldstein.

The next day, Julia secretly hands Winston a note confessing her love for him. Winston and Julia begin an affair, an act of rebellion, as the Party insists that sex may only be used for reproduction. Winston realizes that she shares his loathing of the Party. They first meet in the country, and later in a rented room above Mr. Charrington’s shop. During his affair with Julia, Winston remembers the disappearance of his family during the civil war of the 1950s and his terse relationship with his ex-wife Katharine. Winston also interacts with his colleague Syme, who is writing a dictionary for a revised version of the English language called Newspeak. After Syme admits that the true purpose of Newspeak is to reduce the capacity of human thought, Winston speculates that Syme will disappear. Not long after, Syme disappears and no one acknowledges his absence.

Weeks later, Winston is approached by O’Brien, who offers Winston a chance to join the Brotherhood. They arrange a meeting at O’Brien’s luxurious flat where both Winston and Julia swear allegiance to the Brotherhood. He sends Winston a copy of The Theory and Practice of Oligarchical Collectivism by Emmanuel Goldstein. Winston and Julia read parts of the book, which explains more about how the Party maintains power, the true meanings of its slogans and the concept of perpetual war. It argues that the Party can be overthrown if proles (proletarians) rise up against it.

Mr. Charrington is revealed to be an agent of the Thought Police. Winston and Julia are captured in the shop and imprisoned in the Ministry of Love. O’Brien reveals that he is loyal to the Party, and part of a special sting operation to catch “thoughtcriminals”. Over many months, Winston is tortured and forced to “cure” himself of his “insanity” by changing his own perception to fit the Party line, even if it requires saying that “2 + 2 = 5”. O’Brien openly admits that the Party “is not interested in the good of others; it is interested solely in power.” He says that once Winston is brainwashed into loyalty, he will be released back into society for a period of time, before they execute him. Winston points out that the Party has not managed to make him betray Julia.

O’Brien then takes Winston to Room 101 for the final stage of re-education. The room contains each prisoner’s worst fear, in Winston’s case rats. As a wire cage holding hungry rats is fitted onto his face, Winston shouts “Do it to Julia!”, thus betraying her. After being released, Winston meets Julia in a park. She says that she was also tortured, and both reveal betraying the other. Later, Winston sits alone in a café as Oceania celebrates a supposed victory over Eurasian armies in Africa, and realizes that “He loved Big Brother.”

Ingsoc (English Socialism) is the predominant ideology and pseudophilosophy of Oceania, and Newspeak is the official language of official documents.

In London, the capital city of Airstrip One, Oceania’s four government ministries are in pyramids (300 m high), the façades of which display the Party’s three slogans. The ministries’ names are the opposite (doublethink) of their true functions: “The Ministry of Peace concerns itself with war, the Ministry of Truth with lies, the Ministry of Love with torture and the Ministry of Plenty with starvation.” (Part II, Chapter IX The Theory and Practice of Oligarchical Collectivism)

The Ministry of Peace supports Oceania’s perpetual war against either of the two other superstates:

The primary aim of modern warfare (in accordance with the principles of doublethink, this aim is simultaneously recognized and not recognized by the directing brains of the Inner Party) is to use up the products of the machine without raising the general standard of living. Ever since the end of the nineteenth century, the problem of what to do with the surplus of consumption goods has been latent in industrial society. At present, when few human beings even have enough to eat, this problem is obviously not urgent, and it might not have become so, even if no artificial processes of destruction had been at work.

The Ministry of Plenty rations and controls food, goods, and domestic production; every fiscal quarter, it publishes false claims of having raised the standard of living, when it has, in fact, reduced rations, availability, and production. The Ministry of Truth substantiates the Ministry of Plenty’s claims by revising historical records to report numbers supporting the current, “increased rations”.

The Ministry of Truth controls information: news, entertainment, education, and the arts. Winston Smith works in the Minitrue RecDep (Records Department), “rectifying” historical records to concord with Big Brother’s current pronouncements so that everything the Party says is true.

The Ministry of Love identifies, monitors, arrests, and converts real and imagined dissidents. In Winston’s experience, the dissident is beaten and tortured, and, when near-broken, he is sent to Room 101 to face “the worst thing in the world”, until love for Big Brother and the Party replaces dissension.

The keyword here is blackwhite. Like so many Newspeak words, this word has two mutually contradictory meanings. Applied to an opponent, it means the habit of impudently claiming that black is white, in contradiction of the plain facts. Applied to a Party member, it means a loyal willingness to say that black is white when Party discipline demands this. But it means also the ability to believe that black is white, and more, to know that black is white, and to forget that one has ever believed the contrary. This demands a continuous alteration of the past, made possible by the system of thought which really embraces all the rest, and which is known in Newspeak as doublethink. Doublethink is basically the power of holding two contradictory beliefs in one’s mind simultaneously, and accepting both of them.

Three perpetually warring totalitarian super-states (Oceania, Eurasia, and Eastasia) control the world.[34]

The perpetual war is fought for control of the “disputed area” lying “between the frontiers of the super-states”, which forms “a rough quadrilateral with its corners at Tangier, Brazzaville, Darwin and Hong Kong”,[34] and Northern Africa, the Middle East, India and Indonesia are where the superstates capture and use slave labour. Fighting also takes place between Eurasia and Eastasia in Manchuria, Mongolia and Central Asia, and all three powers battle one another over various Atlantic and Pacific islands.

Goldstein’s book, The Theory and Practice of Oligarchical Collectivism, explains that the superstates’ ideologies are alike and that the public’s ignorance of this fact is imperative so that they might continue believing in the detestability of the opposing ideologies. The only references to the exterior world for the Oceanian citizenry (the Outer Party and the Proles) are Ministry of Truth maps and propaganda to ensure their belief in “the war”.

Winston Smith’s memory and Emmanuel Goldstein’s book communicate some of the history that precipitated the Revolution. Eurasia was formed when the Soviet Union conquered Continental Europe, creating a single state stretching from Portugal to the Bering Strait. Eurasia does not include the British Isles because the United States annexed them along with the rest of the British Empire and Latin America, thus establishing Oceania and gaining control over a quarter of the planet. Eastasia, the last superstate established, emerged only after “a decade of confused fighting”. It includes the Asian lands conquered by China and Japan. Although Eastasia is prevented from matching Eurasia’s size, its larger populace compensates for that handicap.

The annexation of Britain occurred about the same time as the atomic war that provoked civil war, but who fought whom in the war is left unclear. Nuclear weapons fell on Britain; an atomic bombing of Colchester is referenced in the text. Exactly how Ingsoc and its rival systems (Neo-Bolshevism and Death Worship) gained power in their respective countries is also unclear.

While the precise chronology cannot be traced, most of the global societal reorganization occurred between 1945 and the early 1960s. Winston and Julia once meet in the ruins of a church that was destroyed in a nuclear attack “thirty years” earlier, which suggests 1954 as the year of the atomic war that destabilised society and allowed the Party to seize power. It is stated in the novel that the “fourth quarter of 1983” was “also the sixth quarter of the Ninth Three-Year Plan”, which implies that the first quarter of the first three-year plan began in July 1958. By then, the Party was apparently in control of Oceania.

In 1984, there is a perpetual war between Oceania, Eurasia and Eastasia, the superstates that emerged from the global atomic war. The Theory and Practice of Oligarchical Collectivism, by Emmanuel Goldstein, explains that each state is so strong it cannot be defeated, even with the combined forces of two superstates, despite changing alliances. To hide such contradictions, history is rewritten to explain that the (new) alliance always was so; the populaces are accustomed to doublethink and accept it. The war is not fought in Oceanian, Eurasian or Eastasian territory but in the Arctic wastes and in a disputed zone comprising the sea and land from Tangiers (Northern Africa) to Darwin (Australia). At the start, Oceania and Eastasia are allies fighting Eurasia in northern Africa and the Malabar Coast.

That alliance ends and Oceania, allied with Eurasia, fights Eastasia, a change occurring on Hate Week, dedicated to creating patriotic fervour for the Party’s perpetual war. The public are blind to the change; in mid-sentence, an orator changes the name of the enemy from “Eurasia” to “Eastasia” without pause. When the public are enraged at noticing that the wrong flags and posters are displayed, they tear them down; the Party later claims to have captured Africa.

Goldstein’s book explains that the purpose of the unwinnable, perpetual war is to consume human labour and commodities so that the economy of a superstate cannot support economic equality, with a high standard of life for every citizen. By using up most of the produced objects like boots and rations, the proles are kept poor and uneducated and will neither realise what the government is doing nor rebel. Goldstein also details an Oceanian strategy of attacking enemy cities with atomic rockets before invasion but dismisses it as unfeasible and contrary to the war’s purpose; despite the atomic bombing of cities in the 1950s, the superstates stopped it for fear that would imbalance the powers. The military technology in the novel differs little from that of World War II, but strategic bomber aeroplanes have been replaced with rocket bombs, helicopters are heavily used as weapons of war (they did not figure in World War II in any form but prototypes) and surface combat units have been all but replaced by immense and unsinkable Floating Fortresses, island-like contraptions concentrating the firepower of a whole naval task force in a single, semi-mobile platform (in the novel, one is said to have been anchored between Iceland and the Faroe Islands, suggesting a preference for sea lane interdiction and denial).

The society of Airstrip One and, according to “The Book”, almost the whole world, lives in poverty: hunger, disease and filth are the norms. Ruined cities and towns are common: the consequence of the civil war, the atomic wars and the purportedly enemy (but possibly false flag) rockets. Social decay and wrecked buildings surround Winston; aside from the ministerial pyramids, little of London was rebuilt. Members of the Outer Party consume synthetic foodstuffs and poor-quality “luxuries” such as oily gin and loosely-packed cigarettes, distributed under the “Victory” brand. (That is a parody of the low-quality Indian-made “Victory” cigarettes, widely smoked in Britain and by British soldiers during World War II. They were smoked because it was easier to import them from India than it was to import American cigarettes from across the Atlantic because of the Battle of the Atlantic.)

Winston describes something as simple as the repair of a broken pane of glass as requiring committee approval that can take several years and so most of those living in one of the blocks usually do the repairs themselves (Winston himself is called in by Mrs. Parsons to repair her blocked sink). All Outer Party residences include telescreens that serve both as outlets for propaganda and to monitor the Party members; they can be turned down, but they cannot be turned off.

In contrast to their subordinates, the Inner Party upper class of Oceanian society reside in clean and comfortable flats in their own quarter of the city, with pantries well-stocked with foodstuffs such as wine, coffee and sugar, all denied to the general populace.[35] Winston is astonished that the lifts in O’Brien’s building work, the telescreens can be switched off and O’Brien has an Asian manservant, Martin. All members of the Inner Party are attended to by slaves captured in the disputed zone, and “The Book” suggests that many have their own motorcars or even helicopters. Nonetheless, “The Book” makes clear that even the conditions enjoyed by the Inner Party are only “relatively” comfortable, and standards would be regarded as austere by those of the prerevolutionary élite.[36]

The proles live in poverty and are kept sedated with alcohol, pornography and a national lottery whose winnings are never actually paid out; that is obscured by propaganda and the lack of communication within Oceania. At the same time, the proles are freer and less intimidated than the middle-class Outer Party: they are subject to certain levels of monitoring but are not expected to be particularly patriotic. They lack telescreens in their own homes and often jeer at the telescreens that they see. “The Book” indicates that is because the middle class, not the lower class, traditionally starts revolutions. The model demands tight control of the middle class, with ambitious Outer-Party members neutralised via promotion to the Inner Party or “reintegration” by the Ministry of Love, and proles can be allowed intellectual freedom because they lack intellect. Winston nonetheless believes that “the future belonged to the proles”.[37]

The standard of living of the populace is low overall. Consumer goods are scarce, and all those available through official channels are of low quality; for instance, despite the Party regularly reporting increased boot production, more than half of the Oceanian populace goes barefoot. The Party claims that poverty is a necessary sacrifice for the war effort, and “The Book” confirms that to be partially correct since the purpose of perpetual war consumes surplus industrial production. Outer Party members and proles occasionally gain access to better items in the market, which deals in goods that were pilfered from the residences of the Inner Party.[citation needed]

Nineteen Eighty-Four expands upon the subjects summarised in Orwell’s essay “Notes on Nationalism”[38] about the lack of vocabulary needed to explain the unrecognised phenomena behind certain political forces. In Nineteen Eighty-Four, the Party’s artificial, minimalist language ‘Newspeak’ addresses the matter.

O’Brien concludes: “The object of persecution is persecution. The object of torture is torture. The object of power is power.”

In the book, Inner Party member O’Brien describes the Party’s vision of the future:

There will be no curiosity, no enjoyment of the process of life. All competing pleasures will be destroyed. But always—do not forget this, Winston—always there will be the intoxication of power, constantly increasing and constantly growing subtler. Always, at every moment, there will be the thrill of victory, the sensation of trampling on an enemy who is helpless. If you want a picture of the future, imagine a boot stamping on a human face—forever.

Part III, Chapter III, Nineteen Eighty-Four

A major theme of Nineteen Eighty-Four is censorship, especially in the Ministry of Truth, where photographs are modified and public archives rewritten to rid them of “unpersons” (persons who are erased from history by the Party). On the telescreens, figures for all types of production are grossly exaggerated or simply invented to indicate an ever-growing economy, when the reality is the opposite. One small example of the endless censorship is Winston being charged with the task of eliminating a reference to an unperson in a newspaper article. He proceeds to write an article about Comrade Ogilvy, a made-up party member who displayed great heroism by leaping into the sea from a helicopter so that the dispatches he was carrying would not fall into enemy hands.

The inhabitants of Oceania, particularly the Outer Party members, have no real privacy. Many of them live in apartments equipped with two-way telescreens so that they may be watched or listened to at any time. Similar telescreens are found at workstations and in public places, along with hidden microphones. Written correspondence is routinely opened and read by the government before it is delivered. The Thought Police employ undercover agents, who pose as normal citizens and report any person with subversive tendencies. Children are encouraged to report suspicious persons to the government, and some denounce their parents. Citizens are controlled, and the smallest sign of rebellion, even something so small as a facial expression, can result in immediate arrest and imprisonment. Thus, citizens, particularly party members, are compelled to obedience.

“The Principles of Newspeak” is an academic essay appended to the novel. It describes the development of Newspeak, the Party’s minimalist artificial language meant to ideologically align thought and action with the principles of Ingsoc by making “all other modes of thought impossible”. (A linguistic theory about how language may direct thought is the Sapir–Whorf hypothesis.)

Whether or not the Newspeak appendix implies a hopeful end to Nineteen Eighty-Four remains a critical debate, as it is in Standard English and refers to Newspeak, Ingsoc, the Party etc., in the past tense: “Relative to our own, the Newspeak vocabulary was tiny, and new ways of reducing it were constantly being devised” (p. 422). Some critics (Atwood,[39] Benstead,[40] Milner,[41] Pynchon[42]) claim that for the essay’s author, both Newspeak and the totalitarian government are in the past.

Nineteen Eighty-Four uses themes from life in the Soviet Union and wartime life in Great Britain as sources for many of its motifs. At an unspecified date after the first American publication of the book, producer Sidney Sheldon wrote to Orwell expressing interest in adapting the novel for the Broadway stage. Orwell sold the American stage rights to Sheldon, explaining that his basic goal with Nineteen Eighty-Four was imagining the consequences of Stalinist government ruling British society:

[Nineteen Eighty-Four] was based chiefly on communism, because that is the dominant form of totalitarianism, but I was trying chiefly to imagine what communism would be like if it were firmly rooted in the English-speaking countries, and was no longer a mere extension of the Russian Foreign Office.[43]

The statement “2 + 2 = 5”, used to torment Winston Smith during his interrogation, was a communist party slogan from the second five-year plan, which encouraged fulfillment of the five-year plan in four years. The slogan was seen in electric lights on Moscow house-fronts, billboards and elsewhere.[44]

The switch of Oceania’s allegiance from Eastasia to Eurasia and the subsequent rewriting of history (“Oceania was at war with Eastasia: Oceania had always been at war with Eastasia. A large part of the political literature of five years was now completely obsolete”; ch 9) is evocative of the Soviet Union’s changing relations with Nazi Germany. The two nations were open and frequently vehement critics of each other until the signing of the 1939 Treaty of Non-Aggression. Thereafter, and continuing until the Nazi invasion of the Soviet Union in 1941, no criticism of Germany was allowed in the Soviet press, and all references to prior party lines stopped, including in the majority of non-Russian communist parties, who tended to follow the Russian line. Orwell had criticised the Communist Party of Great Britain for supporting the Treaty in his essays for Betrayal of the Left (1941). “The Hitler-Stalin pact of August 1939 reversed the Soviet Union’s stated foreign policy. It was too much for many of the fellow-travellers like Gollancz [Orwell’s sometime publisher] who had put their faith in a strategy of constructing Popular Front governments and the peace bloc between Russia, Britain and France.”[45]

The description of Emmanuel Goldstein, with a “small, goatee beard”, evokes the image of Leon Trotsky. The film of Goldstein during the Two Minutes Hate is described as showing him being transformed into a bleating sheep. This image was used in a propaganda film during the Kino-eye period of Soviet film, which showed Trotsky transforming into a goat.[46] Goldstein’s book is similar to Trotsky’s highly critical analysis of the USSR, The Revolution Betrayed, published in 1936.

The omnipresent images of Big Brother, a man described as having a moustache, bear resemblance to the cult of personality built up around Joseph Stalin.

The news in Oceania emphasised production figures, just as it did in the Soviet Union, where record-setting in factories (by “Heroes of Socialist Labor”) was especially glorified. The best known of these was Alexey Stakhanov, who purportedly set a record for coal mining in 1935.

The tortures of the Ministry of Love evoke the procedures used by the NKVD in their interrogations,[47] including the use of rubber truncheons, being forbidden to put your hands in your pockets, remaining in brightly lit rooms for days, torture through the use of provoked rodents, and the victim being shown a mirror after their physical collapse.

The random bombing of Airstrip One is based on the Buzz bombs and the V-2 rocket, which struck England at random in 1944–1945.

The Thought Police is based on the NKVD, which arrested people for random “anti-soviet” remarks.[48] The Thought Crime motif is drawn from Kempeitai, the Japanese wartime secret police, who arrested people for “unpatriotic” thoughts.

The confessions of the “Thought Criminals” Rutherford, Aaronson and Jones are based on the show trials of the 1930s, which included fabricated confessions by prominent Bolsheviks Nikolai Bukharin, Grigory Zinoviev and Lev Kamenev to the effect that they were being paid by the Nazi government to undermine the Soviet regime under Leon Trotsky’s direction.

The song “Under the Spreading Chestnut Tree” (“Under the spreading chestnut tree, I sold you, and you sold me”) was based on an old English song called “Go no more a-rushing” (“Under the spreading chestnut tree, Where I knelt upon my knee, We were as happy as could be, ‘Neath the spreading chestnut tree.”). The song was published as early as 1891. The song was a popular camp song in the 1920s, sung with corresponding movements (like touching your chest when you sing “chest”, and touching your head when you sing “nut”). Glenn Miller recorded the song in 1939.[49]

The “Hates” (Two Minutes Hate and Hate Week) were inspired by the constant rallies sponsored by party organs throughout the Stalinist period. These were often short pep-talks given to workers before their shifts began (Two Minutes Hate), but could also last for days, as in the annual celebrations of the anniversary of the October revolution (Hate Week).

Orwell fictionalized “newspeak”, “doublethink”, and “Ministry of Truth” as evinced by both the Soviet press and that of Nazi Germany.[50] In particular, he adapted Soviet ideological discourse constructed to ensure that public statements could not be questioned.[51]

Winston Smith’s job, “revising history” (and the “unperson” motif) are based on the Stalinist habit of airbrushing images of ‘fallen’ people from group photographs and removing references to them in books and newspapers.[53] In one well-known example, the Soviet encyclopaedia had an article about Lavrentiy Beria. When he fell in 1953, and was subsequently executed, institutes that had the encyclopaedia were sent an article about the Bering Strait, with instructions to paste it over the article about Beria.[54]

Big Brother’s “Orders of the Day” were inspired by Stalin’s regular wartime orders, called by the same name. A small collection of the more political of these have been published (together with his wartime speeches) in English as “On the Great Patriotic War of the Soviet Union” by Joseph Stalin.[55][56] Like Big Brother’s Orders of the Day, Stalin’s frequently lauded heroic individuals,[57] such as Comrade Ogilvy, the fictitious hero Winston Smith invented to ‘rectify’ (fabricate) a Big Brother Order of the Day.

The Ingsoc slogan “Our new, happy life”, repeated from telescreens, evokes Stalin’s 1935 statement, which became a CPSU slogan, “Life has become better, Comrades; life has become more cheerful.”[48]

In 1940 Argentine writer Jorge Luis Borges published “Tlön, Uqbar, Orbis Tertius”, which described the invention by a “benevolent secret society” of a world that would seek to remake human language and reality along human-invented lines. The story concludes with an appendix describing the success of the project. Borges’ story addresses similar themes of epistemology, language and history to Nineteen Eighty-Four.[58]

During World War II, Orwell believed that British democracy as it existed before 1939 would not survive the war, the question being: “Would it end via Fascist coup d’état from above or via Socialist revolution from below?”[citation needed] Later, he admitted that events proved him wrong: “What really matters is that I fell into the trap of assuming that ‘the war and the revolution are inseparable’.”[59]

Nineteen Eighty-Four (1949) and Animal Farm (1945) share themes of the betrayed revolution, the person’s subordination to the collective, rigorously enforced class distinctions (Inner Party, Outer Party, Proles), the cult of personality, concentration camps, Thought Police, compulsory regimented daily exercise, and youth leagues. Oceania resulted from the US annexation of the British Empire to counter the Asian peril to Australia and New Zealand. It is a naval power whose militarism venerates the sailors of the floating fortresses, from which battle is given to recapturing India, the “Jewel in the Crown” of the British Empire. Much of Oceanic society is based upon the USSR under Joseph Stalin (Big Brother). The televised Two Minutes Hate is ritual demonisation of the enemies of the State, especially Emmanuel Goldstein (viz Leon Trotsky). Altered photographs and newspaper articles create unpersons deleted from the national historical record, including even founding members of the regime (Jones, Aaronson and Rutherford) in the 1960s purges (viz the Soviet Purges of the 1930s, in which leaders of the Bolshevik Revolution were similarly treated). Something similar happened during the French Revolution, in which many of its original leaders were later put to death: Danton was executed at the instigation of Robespierre, who himself later met the same fate.

In his 1946 essay “Why I Write”, Orwell explains that the serious works he wrote since the Spanish Civil War (1936–39) were “written, directly or indirectly, against totalitarianism and for democratic socialism”.[3][60] Nineteen Eighty-Four is a cautionary tale about revolution betrayed by totalitarian defenders previously proposed in Homage to Catalonia (1938) and Animal Farm (1945), while Coming Up for Air (1939) celebrates the personal and political freedoms lost in Nineteen Eighty-Four (1949). Biographer Michael Shelden notes Orwell’s Edwardian childhood at Henley-on-Thames as the golden country; being bullied at St Cyprian’s School as his empathy with victims; his life in the Indian Imperial Police in Burma and the techniques of violence and censorship in the BBC as capricious authority.[61]

Other influences include Darkness at Noon (1940) and The Yogi and the Commissar (1945) by Arthur Koestler; The Iron Heel (1908) by Jack London; 1920: Dips into the Near Future[62] by John A. Hobson; Brave New World (1932) by Aldous Huxley; We (1921) by Yevgeny Zamyatin, which he reviewed in 1946;[63] and The Managerial Revolution (1940) by James Burnham, which predicted perpetual war among three totalitarian superstates. Orwell told Jacintha Buddicom that he would write a novel stylistically like A Modern Utopia (1905) by H. G. Wells.[citation needed]

Extrapolating from World War II, the novel’s pastiche parallels the politics and rhetoric at war’s end: the changed alliances at the beginning of the Cold War (1945–91); the Ministry of Truth derives from the BBC’s overseas service, controlled by the Ministry of Information; Room 101 derives from a conference room at BBC Broadcasting House;[64] the Senate House of the University of London, containing the Ministry of Information, is the architectural inspiration for the Minitrue; the post-war decrepitude derives from the socio-political life of the UK and the US, i.e., the impoverished Britain of 1948 losing its Empire despite newspaper-reported imperial triumph; and the war ally but peace-time foe, Soviet Russia, became Eurasia.

The term “English Socialism” has precedents in his wartime writings; in the essay “The Lion and the Unicorn: Socialism and the English Genius” (1941), he said that “the war and the revolution are inseparable…the fact that we are at war has turned Socialism from a textbook word into a realisable policy” because Britain’s superannuated social class system hindered the war effort and only a socialist economy would defeat Adolf Hitler. Once the middle class grasped this, he argued, they too would accept socialist revolution, and only reactionary Britons would oppose it, thus limiting the force revolutionaries would need to take power. An English Socialism would come about which “will never lose touch with the tradition of compromise and the belief in a law that is above the State. It will shoot traitors, but it will give them a solemn trial beforehand and occasionally it will acquit them. It will crush any open revolt promptly and cruelly, but it will interfere very little with the spoken and written word.”[65]

In the world of Nineteen Eighty-Four, “English Socialism” (or “Ingsoc” in Newspeak) is a totalitarian ideology unlike the English revolution he foresaw. Comparison of the wartime essay “The Lion and the Unicorn” with Nineteen Eighty-Four shows that he perceived a Big Brother regime as a perversion of his cherished socialist ideals and English Socialism. Thus Oceania is a corruption of the British Empire he believed would evolve “into a federation of Socialist states, like a looser and freer version of the Union of Soviet Republics”.[66][verification needed]

When first published, Nineteen Eighty-Four was generally well received by reviewers. V. S. Pritchett, reviewing the novel for the New Statesman stated: “I do not think I have ever read a novel more frightening and depressing; and yet, such are the originality, the suspense, the speed of writing and withering indignation that it is impossible to put the book down.”[67] P. H. Newby, reviewing Nineteen Eighty-Four for The Listener magazine, described it as “the most arresting political novel written by an Englishman since Rex Warner’s The Aerodrome.”[68] Nineteen Eighty-Four was also praised by Bertrand Russell, E. M. Forster and Harold Nicolson.[68] On the other hand, Edward Shanks, reviewing Nineteen Eighty-Four for The Sunday Times, was dismissive; Shanks claimed Nineteen Eighty-Four “breaks all records for gloomy vaticination”.[68] C. S. Lewis was also critical of the novel, claiming that the relationship of Julia and Winston, and especially the Party’s view on sex, lacked credibility, and that the setting was “odious rather than tragic”.[69]

Nineteen Eighty-Four has been adapted for the cinema, radio, television and theatre at least twice each, as well as for other art media, such as ballet and opera.

The effect of Nineteen Eighty-Four on the English language is extensive; the concepts of Big Brother, Room 101, the Thought Police, thoughtcrime, unperson, memory hole (oblivion), doublethink (simultaneously holding and believing contradictory beliefs) and Newspeak (ideological language) have become common phrases for denoting totalitarian authority. Doublespeak and groupthink are both deliberate elaborations of doublethink, and the adjective “Orwellian” means similar to Orwell’s writings, especially Nineteen Eighty-Four. The practice of ending words with “-speak” (such as mediaspeak) is drawn from the novel.[70] Orwell is perpetually associated with 1984; in July 1984, an asteroid was discovered by Antonín Mrkos and named after Orwell.

References to the themes, concepts and plot of Nineteen Eighty-Four have appeared frequently in other works, especially in popular music and video entertainment. An example is the worldwide hit reality television show Big Brother, in which a group of people live together in a large house, isolated from the outside world but continuously watched by television cameras.

The book touches on the invasion of privacy and ubiquitous surveillance. From mid-2013 it was publicized that the NSA has been secretly monitoring and storing global internet traffic, including the bulk data collection of email and phone call data. Sales of Nineteen Eighty-Four increased by up to seven times within the first week of the 2013 mass surveillance leaks.[79][80][81] The book again topped the Amazon.com sales charts in 2017 after a controversy involving Kellyanne Conway using the phrase “alternative facts” to explain discrepancies with the media.[82][83][84][85]

The book also shows mass media as a catalyst for the intensification of destructive emotions and violence. Since the 20th century, news and other forms of media have been publicizing violence more often.[86][87]

In 2013, the Almeida Theatre and Headlong staged a successful new adaptation (by Robert Icke and Duncan Macmillan), which twice toured the UK and played an extended run in London’s West End. The play opened on Broadway in 2017.

In the decades since the publication of Nineteen Eighty-Four, there have been numerous comparisons to Aldous Huxley’s novel Brave New World, which had been published 17 years earlier, in 1932.[88][89][90][91] They are both predictions of societies dominated by a central government and are both based on extensions of the trends of their times. However, members of the ruling class of Nineteen Eighty-Four use brutal force, torture and mind control to keep individuals in line, whereas rulers in Brave New World keep the citizens in line through addictive drugs and pleasurable distractions.

In October 1949, after reading Nineteen Eighty-Four, Huxley sent Orwell a letter arguing that it would be more efficient for rulers to stay in power through a softer touch, controlling citizens by allowing them to seek pleasure and a false sense of freedom rather than by brute force:

Within the next generation I believe that the world’s rulers will discover that infant conditioning and narco-hypnosis are more efficient, as instruments of government, than clubs and prisons, and that the lust for power can be just as completely satisfied by suggesting people into loving their servitude as by flogging and kicking them into obedience.[92]

Elements of both novels can be seen in modern-day societies, with Huxley’s vision being more dominant in the West and Orwell’s vision more prevalent with dictators in ex-communist countries, as is pointed out in essays that compare the two novels, including Huxley’s own Brave New World Revisited.[93][94][95][85]

Comparisons with other dystopian novels like The Handmaid’s Tale, Virtual Light, The Private Eye and Children of Men have also been drawn.[96][97]

More here:

Nineteen Eighty-Four – Wikipedia

Technology – Wikipedia

This article is about the use and knowledge of techniques and processes for producing goods and services. For other uses, see Technology (disambiguation).

Technology (“science of craft”, from Greek τέχνη, techne, “art, skill, cunning of hand”; and -λογία, -logia[2]) was first robustly defined by Jacob Bigelow in 1829 as: “…principles, processes, and nomenclatures of the more conspicuous arts, particularly those which involve applications of science, and which may be considered useful, by promoting the benefit of society, together with the emolument [compensation[3]] of those who pursue them”.[4]

The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.

Technology has many effects. It has helped develop more advanced economies (including today’s global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth’s environment. Innovations have always influenced the values of a society and raised new questions of the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

The use of the term “technology” has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and it was used either to refer to the description or study of the useful arts[14] or to allude to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[15]

The term “technology” rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term’s meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into “technology.” In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, which usually translates both terms as “technology.” By the 1930s, “technology” referred not only to the study of the industrial arts but to the industrial arts themselves.[16]

In 1937, the American sociologist Read Bain wrote that “technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them.”[17] Bain’s definition remains common among scholars today, especially social scientists. Scientists and engineers usually prefer to define technology as applied science, rather than as the things that people make and use.[18] More recently, scholars have borrowed from European philosophers of “technique” to extend the meaning of technology to various forms of instrumental reason, as in Foucault’s work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Learner’s Dictionary offers a definition of the term: “the use of science in industry, engineering, etc., to invent useful things or to solve problems” and “a machine, piece of equipment, method, etc., that is created by technology.”[19] Ursula Franklin, in her 1989 “Real World of Technology” lecture, gave another definition of the concept; it is “practice, the way we do things around here.”[20] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[21] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as “the pursuit of life by means other than life,” and as “organized inorganic matter.”[22]

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[23] W. Brian Arthur defines technology in a similarly broad way as “a means to fulfill a human purpose.”[24]

The word “technology” can also be used to refer to a collection of techniques. In this context, it is the current state of humanity’s knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as “medical technology” or “space technology,” it refers to the state of the respective field’s knowledge and tools. “State-of-the-art technology” refers to the high technology available to humanity in any field.

Technology can be viewed as an activity that forms or changes culture.[25] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and as a result has helped spawn new subcultures; the rise of cyberculture has at its basis the development of the Internet and the computer.[26] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

The distinction between science, engineering, and technology is not always clear. Science is systematic knowledge of the physical or material world gained through observation and experimentation.[27] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability, and safety.[citation needed]

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering, although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[28]

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, it was widely considered in the United States that technology was simply “applied science” and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush’s treatise on postwar science policy, Science The Endless Frontier: “New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature… This essential new knowledge can be obtained only through basic scientific research.”[29] In the late-1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[30][31]

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[32] with a brain mass approximately one third of modern humans.[33] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and a complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[34]

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 75,000 years ago,[35] pressure flaking provided a way to make much finer work.

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[36] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1 Ma;[37] scholarly consensus indicates that Homo erectus had controlled fire by between 500 and 400 ka.[38][39] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[40]

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity’s progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380 ka, humans were constructing temporary wood huts.[41][42] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200 ka and into other continents such as Eurasia.[43]

Humanity’s technological ascent began in earnest in what is known as the Neolithic Period (“New Stone Age”). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. The use of polished stone axes increased greatly in the Neolithic, but they were originally used in the preceding Mesolithic in some areas such as Ireland.[44] Agriculture fed larger populations, and the transition to sedentism allowed more children to be raised simultaneously, as infants no longer needed to be carried, as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[45][46]

With this increase in population and availability of labor came an increase in labor specialization.[47] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[48]

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge the native metals (gold, copper, silver, and lead) found in relatively pure form in nature.[49] The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 ka).[50] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first use of iron alloys such as steel dates to around 1800 BCE.[51][52]

Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to the 8th millennium BCE.[53] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and “catch” basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[54]

According to archaeologists, the wheel was invented around 4000 BCE probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe.[55] Estimates on when this may have occurred range from 5500 to 3000 BCE with most experts putting it closer to 4000 BCE.[56] The oldest artifacts with drawings depicting wheeled carts date from about 3500 BCE;[57] however, the wheel may have been in use for millennia before these drawings were made. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[58]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used the potter’s wheel and may have invented it.[59] A stone pottery wheel found in the city-state of Ur dates to around 3429 BCE,[60] and even older fragments of wheel-thrown pottery have been found in the same area.[60] Fast (rotary) potters’ wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[61] and were first used in Mesopotamia and Iran in around 3000 BCE.[61]

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4000 BCE[62] and timber roads leading through the swamps of Glastonbury, England, dating to around the same time period.[62] The first long-distance road, which came into use around 3500 BCE,[62] spanned 1,500 miles from the Persian Gulf to the Mediterranean Sea,[62] but was not paved and was only partially maintained.[62] In around 2000 BCE, the Minoans on the Greek island of Crete built a fifty-kilometer (thirty-mile) road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[62] Unlike the earlier road, the Minoan road was completely paved.[62]

Ancient Minoan private homes had running water.[64] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[64][65] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[64] The ancient Romans had many public flush toilets,[65] which emptied into an extensive sewage system.[65] The primary sewer in Rome was the Cloaca Maxima;[65] construction began on it in the sixth century BCE and it is still in use today.[65]

The ancient Romans also had a complex system of aqueducts,[63] which were used to transport water across long distances.[63] The first Roman aqueduct was built in 312 BCE.[63] The eleventh and final ancient Roman aqueduct was built in 226 CE.[63] Put together, the Roman aqueducts extended over 450 kilometers,[63] but less than seventy kilometers of this was above ground and supported by arches.[63]

Innovation continued through the Middle Ages with the introduction of silk, the horse collar, and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills, and clocks. The Renaissance brought forth many further innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more reliable supply of food, followed by the wider availability of consumer goods.

Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb, and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight and advancements in medicine, chemistry, physics, and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their food supply. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.

Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education; its designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation, and architecture.

Generally, technicism is the belief in the utility of technology for improving human societies.[66] Taken to an extreme, technicism “reflects a fundamental attitude which seeks to control reality, to resolve all problems with the use of scientific-technological methods and tools.”[67] In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[68] connect these ideas to the abdication of religion as a higher moral authority.

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good.

Transhumanists generally believe that the point of technology is to overcome barriers, and that what we commonly refer to as the human condition is just another barrier to be surpassed.

Singularitarians believe in some sort of “accelerating change”; that the rate of technological progress accelerates as we obtain more technology, and that this will culminate in a “Singularity” after artificial general intelligence is invented in which progress is nearly infinite; hence the term. Estimates for the date of this Singularity vary,[69] but prominent futurist Ray Kurzweil estimates the Singularity will occur in 2045.

Kurzweil is also known for his history of the universe in six epochs: (1) the physical/chemical epoch, (2) the life epoch, (3) the human/brain epoch, (4) the technology epoch, (5) the artificial intelligence epoch, and (6) the universal colonization epoch. Going from one epoch to the next is a Singularity in its own right, and a period of speeding up precedes it. Each epoch takes a shorter time, which means the whole history of the universe is one giant Singularity event.[70]

Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[71]

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become evermore technological at the cost of freedom and psychological health.

Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see “The Question Concerning Technology”[72]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, “Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that ‘in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.’ Indeed, he promises that ‘when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.’[73] What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.”[74]

Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics such as Aldous Huxley’s Brave New World, Anthony Burgess’s A Clockwork Orange, and George Orwell’s Nineteen Eighty-Four. In Goethe’s Faust, Faust selling his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction such as those by Philip K. Dick and William Gibson and films such as Blade Runner and Ghost in the Shell project highly ambivalent or cautionary attitudes toward technology’s impact on human society and identity.

The late cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called “technopolies,” societies that are dominated by the ideology of technological and scientific progress to the exclusion or harm of other cultural practices, values, and world-views.[75]

Darin Barney has written about technology’s impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible to answer, because technology already gives an answer to the question: a good life is one that includes the use of more and more technology.[76]

Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology, and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[77]

Another prominent critic of technology is Hubert Dreyfus, who has published books such as On the Internet and What Computers Still Can’t Do.

A more infamous anti-technological treatise is Industrial Society and Its Future, written by the Unabomber Ted Kaczynski and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure. There are also subcultures that disapprove of some or most technology, such as self-identified off-gridders.[78]

The notion of appropriate technology was developed in the 20th century by thinkers such as E. F. Schumacher and Jacques Ellul to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The ecovillage movement emerged in part due to this concern.

This section mainly focuses on American concerns, though much of it can reasonably be generalized to other Western countries.

The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. […] What’s the linkage between technology and this fundamental problem?

In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[79] questions the widespread idea that automation, and more broadly, technological advances, have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Essentially, he stands for a neutral approach to the linkage between technology and American issues concerning unemployment and declining wages.

He uses two main arguments to defend his point. First, because of recent technological advances, an increasing number of workers are losing their jobs. Yet, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual jobs that require “flexibility, judgment, and common sense”[80] remain hard to replace with machines. Second, studies have not shown clear links between recent technological advances and the wage trends of the last decades.

Therefore, according to Bernstein, instead of focusing on technology and its hypothetical influences on rising American unemployment and declining wages, one needs to worry more about “bad policy that fails to offset the imbalances in demand, trade, income, and opportunity.”[80]

People who use both the Internet and mobile devices in excessive quantities are likely to experience fatigue and exhaustion as a result of disruptions in their sleeping patterns. Ongoing studies have shown that increased BMI and weight gain are associated with people who spend long hours online and do not exercise frequently.[81] Heavy Internet use is also associated with lower school grades among those who use it in excessive amounts.[82] It has also been noted that the use of mobile phones while driving has increased the occurrence of road accidents, particularly among teen drivers. Statistically, teens reportedly have four times as many road traffic incidents as those who are 20 years or older, and a very high percentage of adolescents write (81%) and read (92%) texts while driving.[83] In this context, mass media and technology have a negative impact on people’s mental and physical health.

Thomas P. Hughes stated that because technology has been considered as a key way to solve problems, we need to be aware of its complex and varied characters to use it more efficiently.[84] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them, or none of them as technologies?

Technology is often considered too narrowly; according to Hughes, “Technology is a creative process involving human ingenuity”.[85] This definition’s emphasis on creativity avoids unbounded definitions that may mistakenly include cooking “technologies,” but it also highlights the prominent role of humans and therefore their responsibilities for the use of complex technological systems.

Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argues that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[84] For instance, Evgeny Morozov particularly challenges two concepts: “Internet-centrism” and “solutionism.”[86] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology, and especially thanks to the Internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal’s review of Morozov’s theory, ignoring these limitations will lead to “unexpected consequences that could eventually cause more damage than the problems they seek to address.”[87] Benjamin R. Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[88]

Therefore, recognition of the limitations of technology, and more broadly, of scientific knowledge, is needed, especially in cases dealing with environmental justice and health issues. Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists’ and engineers’ new comprehension of their role. Such an approach to technology and science “[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions.”[89]

Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane, but it is all technology, and its exploitation is the foundation of all competitive advantage.

Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and it is what was used to transform the US into a superpower. It was not economic-based planning.

The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees,[90] some dolphin communities,[91] and crows.[92][93] Taking a more generic perspective of technology as the ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.

The ability to make and use tools was once considered a defining characteristic of the genus Homo.[94] However, the discovery of tool construction among chimpanzees and related primates has discredited the notion that the use of technology is unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[95] West African chimpanzees also use stone hammers and anvils for cracking nuts,[96] as do capuchin monkeys of Boa Vista, Brazil.[97]

Theories of technology often attempt to predict the future of technology based on the high technology and science of the time. As with all predictions of the future, however, technology’s future is uncertain.

In 2005, futurist Ray Kurzweil predicted that the future of technology would mainly consist of an overlapping “GNR Revolution” of genetics, nanotechnology and robotics, with robotics being the most important of the three.[98]

War on drugs – Wikipedia

War on Drugs is an American term[6][7] usually applied to the U.S. federal government’s campaign of prohibition of drugs, military aid, and military intervention, with the stated aim being to reduce the illegal drug trade.[8][9] The initiative includes a set of drug policies that are intended to discourage the production, distribution, and consumption of psychoactive drugs that the participating governments and the UN have made illegal. The term was popularized by the media shortly after a press conference given on June 18, 1971, by President Richard Nixon, the day after publication of a special message from President Nixon to the Congress on Drug Abuse Prevention and Control, during which he declared drug abuse “public enemy number one”. That message to the Congress included text about devoting more federal resources to the “prevention of new addicts, and the rehabilitation of those who are addicted”, but that part did not receive the same public attention as the term “war on drugs”.[10][11][12] However, two years prior to this, Nixon had formally declared a “war on drugs” that would be directed toward eradication, interdiction, and incarceration.[13] Today, the Drug Policy Alliance, which advocates for an end to the War on Drugs, estimates that the United States spends $51 billion annually on these initiatives.[14]

On May 13, 2009, Gil Kerlikowske, the Director of the Office of National Drug Control Policy (ONDCP), signaled that the Obama administration did not plan to significantly alter drug enforcement policy, but also that the administration would not use the term “War on Drugs”, because Kerlikowske considers the term to be “counter-productive”.[15] ONDCP’s view is that “drug addiction is a disease that can be successfully prevented and treated… making drugs more available will make it harder to keep our communities healthy and safe”.[16] One of the alternatives that Kerlikowske has showcased is the drug policy of Sweden, which seeks to balance public health concerns with opposition to drug legalization. The prevalence rates for cocaine use in Sweden are barely one-fifth of those in Spain, the biggest consumer of the drug.[17]

In June 2011, the Global Commission on Drug Policy released a critical report on the War on Drugs, declaring: “The global war on drugs has failed, with devastating consequences for individuals and societies around the world. Fifty years after the initiation of the UN Single Convention on Narcotic Drugs, and 40 years after President Nixon launched the US government’s war on drugs, fundamental reforms in national and global drug control policies are urgently needed.”[18] The report was criticized by organizations that oppose a general legalization of drugs.[16]

The first U.S. law that restricted the distribution and use of certain drugs was the Harrison Narcotics Tax Act of 1914. The first local laws came as early as 1860.[19] In 1919, the United States passed the 18th Amendment, prohibiting the sale, manufacture, and transportation of alcohol, with exceptions for religious and medical use. In 1920, the United States passed the National Prohibition Act (Volstead Act), enacted to carry out the provisions in law of the 18th Amendment.

The Federal Bureau of Narcotics was established in the United States Department of the Treasury by an act of June 14, 1930 (46 Stat. 585).[20] In 1933, the federal prohibition for alcohol was repealed by passage of the 21st Amendment. In 1935, President Franklin D. Roosevelt publicly supported the adoption of the Uniform State Narcotic Drug Act. The New York Times used the headline “Roosevelt Asks Narcotic War Aid”.[21][22]

In 1937, the Marihuana Tax Act of 1937 was passed. Several scholars have claimed that the goal was to destroy the hemp industry,[23][24][25] largely as an effort of businessmen Andrew Mellon, Randolph Hearst, and the Du Pont family.[23][25] These scholars argue that with the invention of the decorticator, hemp became a very cheap substitute for the paper pulp that was used in the newspaper industry.[23][26] These scholars believe that Hearst felt this was a threat to his extensive timber holdings. Mellon, United States Secretary of the Treasury and the wealthiest man in America, had invested heavily in DuPont’s new synthetic fiber, nylon, and considered its success to depend on its replacement of the traditional resource, hemp.[23][27][28][29][30][31][32][33] However, there were circumstances that contradict these claims. One reason for doubting them is that the new decorticators did not perform fully satisfactorily in commercial production.[34] Producing fiber from hemp was a labor-intensive process when harvest, transport, and processing are included. Technological developments decreased the labor required for hemp, but not enough to eliminate this disadvantage.[35][36]

On October 27, 1970, Congress passed the Comprehensive Drug Abuse Prevention and Control Act of 1970, which, among other things, categorized controlled substances based on their medicinal use and potential for addiction.[37] In 1971, two congressmen released an explosive report on the growing heroin epidemic among U.S. servicemen in Vietnam; ten to fifteen percent of the servicemen were addicted to heroin, and President Nixon declared drug abuse to be “public enemy number one”.[37][38]

Although Nixon declared “drug abuse” to be public enemy number one in 1971,[39] the policies that his administration implemented as part of the Comprehensive Drug Abuse Prevention and Control Act of 1970 were a continuation of drug prohibition policies in the U.S., which started in 1914.[37][40]

“The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. You understand what I’m saying? We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.” – John Ehrlichman, to Dan Baum[41][42][43] for Harper’s Magazine[44] in 1994, about President Richard Nixon’s war on drugs, declared in 1971.[45][46]

In 1973, the Drug Enforcement Administration was created to replace the Bureau of Narcotics and Dangerous Drugs.[37]

The Nixon Administration also repealed the federal 2- to 10-year mandatory minimum sentences for possession of marijuana and started federal demand-reduction programs and drug-treatment programs. Robert DuPont, the “Drug czar” in the Nixon Administration, stated that it would be more accurate to say that Nixon ended, rather than launched, the “war on drugs”. DuPont also argued that it was the proponents of drug legalization who popularized the term “war on drugs”.[16]

In 1982, Vice President George H. W. Bush and his aides began pushing for the involvement of the CIA and U.S. military in drug interdiction efforts.[47]

The Office of National Drug Control Policy (ONDCP) was originally established by the National Narcotics Leadership Act of 1988,[48][49] which mandated a national anti-drug media campaign for youth, which would later become the National Youth Anti-Drug Media Campaign.[50] The director of ONDCP is commonly known as the Drug czar,[37] a position first implemented in 1989 under President George H. W. Bush,[51] and raised to cabinet-level status by Bill Clinton in 1993.[52] These activities were subsequently funded by the Treasury and General Government Appropriations Act of 1998.[53][54] The Drug-Free Media Campaign Act of 1998 codified the campaign at 21 U.S.C. § 1708.[55]

The Global Commission on Drug Policy released a report on June 2, 2011, alleging that “The War On Drugs Has Failed.” The commission was made up of 22 self-appointed members, including a number of prominent international politicians and writers. U.S. Surgeon General Regina Benjamin also released the first-ever National Prevention Strategy.[56]

On May 21, 2012, the U.S. Government published an updated version of its Drug Policy.[57] The director of ONDCP stated at the same time that this policy is something different from the “War on Drugs”.

At the same meeting was a declaration signed by the representatives of Italy, the Russian Federation, Sweden, the United Kingdom and the United States in line with this: “Our approach must be a balanced one, combining effective enforcement to restrict the supply of drugs, with efforts to reduce demand and build recovery; supporting people to live a life free of addiction.”[59]

In March 2016 the International Narcotics Control Board stated that the International Drug Control treaties do not mandate a “war on drugs.”[60]

According to Human Rights Watch, the War on Drugs caused soaring arrest rates that disproportionately targeted African Americans due to various factors.[62] John Ehrlichman, an aide to Nixon, said that Nixon used the war on drugs to criminalize and disrupt black and hippie communities and their leaders.[63]

The present state of incarceration in the U.S. as a result of the war on drugs arrived in several stages. By 1971, various drug prohibitions had already been in place for more than 50 years (e.g., since 1914 and 1937), with only a very small increase of inmates per 100,000 citizens. During the first nine years after Nixon coined the expression “War on Drugs”, statistics showed only a minor increase in the total number of imprisoned.

After 1980, the situation began to change. In the 1980s, while the number of arrests for all crimes had risen by 28%, the number of arrests for drug offenses rose 126%.[64] One result of the increased demand for prison space was the growth of privatization and the for-profit prison industry.[65] The US Department of Justice, reporting on the effects of state initiatives, has stated that, from 1990 through 2000, “the increasing number of drug offenses accounted for 27% of the total growth among black inmates, 7% of the total growth among Hispanic inmates, and 15% of the growth among white inmates.” In addition to prison or jail, the United States provides for the deportation of many non-citizens convicted of drug offenses.[66]

In 1994, the New England Journal of Medicine reported that the “War on Drugs” resulted in the incarceration of one million Americans each year.[67] In 2008, the Washington Post reported that of 1.5 million Americans arrested each year for drug offenses, half a million would be incarcerated.[68] In addition, one in five black Americans would spend time behind bars due to drug laws.[68]

Federal and state policies also impose collateral consequences on those convicted of drug offenses, such as denial of public benefits or licenses, that are not applicable to those convicted of other types of crime.[69] In particular, the passage of the 1990 Solomon–Lautenberg amendment led many states to impose mandatory driver’s license suspensions (of at least 6 months) for persons committing a drug offense, regardless of whether any motor vehicle was involved.[70][71] Approximately 191,000 licenses were suspended in this manner in 2016, according to a Prison Policy Initiative report.[72]

In 1986, the U.S. Congress passed laws that created a 100-to-1 sentencing disparity for the trafficking or possession of crack compared to penalties for trafficking of powder cocaine,[73][74][75][76] which has been widely criticized as discriminatory against minorities, mostly blacks, who were more likely to use crack than powder cocaine.[77] This 100:1 ratio had been required under federal law since 1986.[78] Persons convicted in federal court of possession of 5 grams of crack cocaine received a minimum mandatory sentence of 5 years in federal prison; possession of 500 grams of powder cocaine carried the same sentence.[74][75] In 2010, the Fair Sentencing Act cut the sentencing disparity to 18:1.[77]
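The quantity thresholds fully determine these ratios. A minimal sketch (an illustrative check only, not from the cited sources; the 28-gram crack threshold behind the 18:1 figure comes from the text of the Fair Sentencing Act itself, not from the passage above):

```python
# Illustrative check of the sentencing-disparity ratios cited above.

def quantity_ratio(powder_grams: float, crack_grams: float) -> float:
    """Ratio of powder to crack quantities that trigger the same sentence."""
    return powder_grams / crack_grams

# Under the 1986 law, 5 g of crack and 500 g of powder cocaine both
# triggered the same 5-year mandatory minimum:
assert quantity_ratio(500, 5) == 100  # the 100:1 disparity

# The Fair Sentencing Act of 2010 raised the crack threshold to 28 g
# (a figure from the Act, not stated above), giving the cited 18:1:
print(round(quantity_ratio(500, 28), 1))  # ≈ 17.9, reported as 18:1
```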

According to Human Rights Watch, crime statistics show that, in the United States in 1999, African Americans were far more likely than non-minorities to be arrested for drug crimes, and received much stiffer penalties and sentences.[79]

Statistics from 1998 show that there were wide racial disparities in arrests, prosecutions, sentencing and deaths. African-American drug users made up 35% of drug arrests, 55% of convictions, and 74% of people sent to prison for drug possession crimes.[74] Nationwide, African-Americans were sent to state prisons for drug offenses 13 times more often than members of other races,[80] even though they comprised only an estimated 13% of regular drug users.[74]

Anti-drug legislation over time has also displayed an apparent racial bias. University of Minnesota Professor and social justice author Michael Tonry writes, “The War on Drugs foreseeably and unnecessarily blighted the lives of hundreds of thousands of young disadvantaged black Americans and undermined decades of effort to improve the life chances of members of the urban black underclass.”[81]

In 1968, President Lyndon B. Johnson decided that the government needed to make an effort to curtail the social unrest that blanketed the country at the time. He decided to focus his efforts on illegal drug use, an approach which was in line with expert opinion on the subject at the time. In the 1960s, it was believed that at least half of the crime in the U.S. was drug related, and this number grew as high as 90 percent in the next decade.[82] He created the Reorganization Plan of 1968 which merged the Bureau of Narcotics and the Bureau of Drug Abuse to form the Bureau of Narcotics and Dangerous Drugs within the Department of Justice.[83] The belief during this time about drug use was summarized by journalist Max Lerner in his celebrated[citation needed] work America as a Civilization (1957):

As a case in point we may take the known fact of the prevalence of reefer and dope addiction in Negro areas. This is essentially explained in terms of poverty, slum living, and broken families, yet it would be easy to show the lack of drug addiction among other ethnic groups where the same conditions apply.[84]

Richard Nixon became president in 1969, and did not back away from the anti-drug precedent set by Johnson. Nixon began orchestrating drug raids nationwide to improve his “watchdog” reputation. Lois B. Defleur, a social historian who studied drug arrests during this period in Chicago, stated that “police administrators indicated they were making the kind of arrests the public wanted”. Additionally, some of Nixon’s newly created drug enforcement agencies would resort to illegal practices to make arrests as they tried to meet public demand for arrest numbers. From 1972 to 1973, the Office of Drug Abuse and Law Enforcement performed 6,000 drug arrests in 18 months, the majority of those arrested being black.[85]

The next two Presidents, Gerald Ford and Jimmy Carter, responded with programs that were essentially a continuation of their predecessors’. Shortly after Ronald Reagan became President in 1981, he delivered a speech on the topic. Reagan announced, “We’re taking down the surrender flag that has flown over so many drug efforts; we’re running up a battle flag.”[86] During his first five years in office, Reagan slowly strengthened drug enforcement by creating mandatory minimum sentencing and forfeiture of cash and real estate for drug offenses, policies far more detrimental to poor blacks than to any other sector affected by the new laws.[citation needed]

Then, driven by the 1986 cocaine overdose of black basketball star Len Bias,[dubious – discuss] Reagan was able to pass the Anti-Drug Abuse Act through Congress. This legislation appropriated an additional $1.7 billion to fund the War on Drugs. More importantly, it established 29 new mandatory minimum sentences for drug offenses. In the entire history of the country up until that point, the legal system had seen only 55 minimum sentences in total.[87] A major stipulation of the new sentencing rules was different mandatory minimums for powder and crack cocaine. At the time of the bill, there was public debate about the difference in potency and effect of powder cocaine, generally used by whites, and crack cocaine, generally used by blacks, with many believing that “crack” was substantially more powerful and addictive. Crack and powder cocaine are closely related chemicals, crack being a smokeable, freebase form of powdered cocaine hydrochloride which produces a shorter, more intense high while using less of the drug. This method is more cost-effective, and therefore more prevalent on the inner-city streets, while powder cocaine remains more popular in white suburbia. The Reagan administration began shoring up public opinion against “crack”, encouraging DEA official Robert Stutman to play up the harmful effects of the drug. Stories of “crack whores” and “crack babies” became commonplace; by 1986, Time had declared “crack” the issue of the year.[88] Riding the wave of public fervor, Reagan established much harsher sentencing for crack cocaine, handing down stiffer felony penalties for much smaller amounts of the drug.[89]

Reagan protégé and former Vice-President George H. W. Bush was next to occupy the Oval Office, and the drug policy under his watch held true to his political background. Bush maintained the hard line drawn by his predecessor and former boss, increasing narcotics regulation when the First National Drug Control Strategy was issued by the Office of National Drug Control Policy in 1989.[90]

The next three presidents, Clinton, Bush and Obama, continued this trend, maintaining the War on Drugs as they inherited it upon taking office.[91] During this time of passivity by the federal government, it was the states that initiated controversial legislation in the War on Drugs. Racial bias manifested itself in the states through such controversial policies as the “stop and frisk” police practices in New York City and the “three strikes” felony laws that began in California in 1994.[92]

In August 2010, President Obama signed the Fair Sentencing Act into law that dramatically reduced the 100-to-1 sentencing disparity between powder and crack cocaine, which disproportionately affected minorities.[93]

Commonly used illegal drugs include heroin, cocaine, methamphetamine, and marijuana.

Heroin is an opiate that is highly addictive. If caught selling or possessing heroin, a perpetrator can be charged with a felony, face two to four years in prison, and be fined up to $20,000.[94]

Crystal meth is composed of methamphetamine hydrochloride. It is marketed as either a white powder or in a solid (rock) form. The possession of crystal meth can result in a punishment varying from a fine to a jail sentence. As with other drug crimes, sentencing length may increase depending on the amount of the drug found in the possession of the defendant.[95]

Cocaine possession is illegal across the U.S., with the cheaper crack cocaine incurring even greater penalties. Possession means the accused knowingly has the drug on their person, or in a backpack or purse. For a first offense with no prior conviction, the penalty is up to one year in prison, a $1,000 fine, or both. A person with a prior conviction, whether for a narcotic or for cocaine, faces two years in prison, a $2,500 fine, or both. With two or more possession convictions prior to the present offense, they can be sentenced to 90 days in prison along with a $5,000 fine.[96]

Marijuana is the most popular illegal drug worldwide. The punishment for possessing it is less than for possession of cocaine or heroin, and in some U.S. states the drug is legal. Over 80 million Americans have tried marijuana. The Criminal Defense Lawyer article claims that, depending on the person’s age and the amount found in their possession, they will be fined and may be able to plea bargain into a treatment program rather than going to prison. Penalties vary from state to state, along with the amount of marijuana in the person’s possession.[97]

Some scholars have claimed that the phrase “War on Drugs” is propaganda cloaking an extension of earlier military or paramilitary operations.[9] Others have argued that large amounts of “drug war” foreign aid money, training, and equipment actually goes to fighting leftist insurgencies and is often provided to groups who themselves are involved in large-scale narco-trafficking, such as corrupt members of the Colombian military.[8]

From 1963 to the end of the Vietnam War in 1975, marijuana usage became common among U.S. soldiers in non-combat situations. Some servicemen also used heroin, and many of those who came home addicted ended their heroin use after returning to the United States. In 1971, the U.S. military conducted a study of drug use among American servicemen and women. It found that daily usage rates for drugs on a worldwide basis were as low as two percent.[98] However, in the spring of 1971, two congressmen released an alarming report alleging that 15% of the servicemen in Vietnam were addicted to heroin. Marijuana use was also common in Vietnam. Soldiers who used drugs had more disciplinary problems. The frequent drug use had become an issue for the commanders in Vietnam; in 1971 it was estimated that 30,000 servicemen were addicted to drugs, most of them to heroin.[11]

From 1971 on, therefore, returning servicemen were required to take a mandatory heroin test. Servicemen who tested positive upon returning from Vietnam were not allowed to return home until they had passed the test with a negative result. The program also offered treatment for heroin addicts.[99]

Elliot Borin’s article “The U.S. Military Needs its Speed”, published in Wired on February 10, 2003, reports:

But the Defense Department, which distributed millions of amphetamine tablets to troops during World War II, Vietnam and the Gulf War, soldiers on, insisting that they are not only harmless but beneficial.

In a news conference held in connection with Schmidt and Umbach’s Article 32 hearing, Dr. Pete Demitry, an Air Force physician and a pilot, claimed that the “Air Force has used (Dexedrine) safely for 60 years” with “no known speed-related mishaps.”

The need for speed, Demitry added “is a life-and-death issue for our military.”[100]

One of the first anti-drug efforts in the realm of foreign policy was President Nixon’s Operation Intercept, announced in September 1969, targeted at reducing the amount of cannabis entering the United States from Mexico. The effort began with an intense inspection crackdown that resulted in a near-shutdown of cross-border traffic.[101] Because the burden on border crossings was controversial in border states, the effort lasted only twenty days.[102]

On December 20, 1989, the United States invaded Panama as part of Operation Just Cause, which involved 25,000 American troops. Gen. Manuel Noriega, head of the government of Panama, had been giving military assistance to Contra groups in Nicaragua at the request of the U.S., which, in exchange, tolerated his drug trafficking activities, which they had known about since the 1960s.[103][104] When the Drug Enforcement Administration (DEA) tried to indict Noriega in 1971, the CIA prevented them from doing so.[103] The CIA, which was then directed by future president George H. W. Bush, provided Noriega with hundreds of thousands of dollars per year as payment for his work in Latin America.[103] When CIA pilot Eugene Hasenfus was shot down over Nicaragua by the Sandinistas, documents aboard the plane revealed many of the CIA’s activities in Latin America, and the CIA’s connections with Noriega became a public relations “liability” for the U.S. government, which finally allowed the DEA to indict him for drug trafficking, after decades of tolerating his drug operations.[103] The purpose of Operation Just Cause was to capture Noriega and overthrow his government; Noriega found temporary asylum in the Papal Nunciature, and surrendered to U.S. soldiers on January 3, 1990.[105] He was sentenced by a court in Miami to 45 years in prison.[103]

As part of its Plan Colombia program, the United States government currently provides hundreds of millions of dollars per year of military aid, training, and equipment to Colombia,[106] to fight left-wing guerrillas such as the Revolutionary Armed Forces of Colombia (FARC-EP), which has been accused of being involved in drug trafficking.[107]

Private U.S. corporations have signed contracts to carry out anti-drug activities as part of Plan Colombia. DynCorp, the largest private company involved, was among those contracted by the State Department, while others signed contracts with the Defense Department.[108]

Colombian military personnel have received extensive counterinsurgency training from U.S. military and law enforcement agencies, including the School of the Americas (SOA). Author Grace Livingstone has stated that more Colombian SOA graduates have been implicated in human rights abuses than currently known SOA graduates from any other country. All of the commanders of the brigades highlighted in a 2001 Human Rights Watch report on Colombia were graduates of the SOA, including the III brigade in Valle del Cauca, where the 2001 Alto Naya Massacre occurred. US-trained officers have been accused of being directly or indirectly involved in many atrocities during the 1990s, including the Massacre of Trujillo and the 1997 Mapiripán Massacre.

In 2000, the Clinton administration initially waived all but one of the human rights conditions attached to Plan Colombia, considering such aid as crucial to national security at the time.[109]

The efforts of U.S. and Colombian governments have been criticized for focusing on fighting leftist guerrillas in southern regions without applying enough pressure on right-wing paramilitaries and continuing drug smuggling operations in the north of the country.[110][111] Human Rights Watch, congressional committees and other entities have documented the existence of connections between members of the Colombian military and the AUC, which the U.S. government has listed as a terrorist group, and that Colombian military personnel have committed human rights abuses which would make them ineligible for U.S. aid under current laws.[citation needed]

In 2010, the Washington Office on Latin America concluded that both Plan Colombia and the Colombian government’s security strategy “came at a high cost in lives and resources, only did part of the job, are yielding diminishing returns and have left important institutions weaker.”[112]

A 2014 report by the RAND Corporation, which was issued to analyze viable strategies for the Mexican drug war considering successes experienced in Colombia, noted:

Between 1999 and 2002, the United States gave Colombia $2.04 billion in aid, 81 percent of which was for military purposes, placing Colombia just below Israel and Egypt among the largest recipients of U.S. military assistance. Colombia increased its defense spending from 3.2 percent of gross domestic product (GDP) in 2000 to 4.19 percent in 2005. Overall, the results were extremely positive. Greater spending on infrastructure and social programs helped the Colombian government increase its political legitimacy, while improved security forces were better able to consolidate control over large swaths of the country previously overrun by insurgents and drug cartels.

It also notes that, “Plan Colombia has been widely hailed as a success, and some analysts believe that, by 2010, Colombian security forces had finally gained the upper hand once and for all.”[113]

The Mérida Initiative is a security cooperation agreement between the United States, the government of Mexico, and the countries of Central America. It was approved on June 30, 2008, and its stated aim is combating the threats of drug trafficking and transnational crime. The Mérida Initiative appropriated $1.4 billion in a three-year commitment (2008–2010) to the Mexican government for military and law enforcement training and equipment, as well as technical advice and training to strengthen the national justice systems. The Mérida Initiative targeted many very important government officials, but it failed to address the thousands of Central Americans who had to flee their countries due to the danger they faced every day because of the war on drugs. There is still not any type of plan that addresses these people. No weapons are included in the plan.[114][115]

The United States regularly sponsors the spraying of large amounts of herbicides such as glyphosate over the jungles of Central and South America as part of its drug eradication programs. Environmental consequences resulting from aerial fumigation have been criticized as detrimental to some of the world’s most fragile ecosystems;[116] the same aerial fumigation practices are further credited with causing health problems in local populations.[117]

In 2012, the U.S. sent DEA agents to Honduras to assist security forces in counternarcotics operations. Honduras has been a major stop for drug traffickers, who use small planes and landing strips hidden throughout the country to transport drugs. The U.S. government made agreements with several Latin American countries to share intelligence and resources to counter the drug trade. DEA agents, working with other U.S. agencies such as the State Department, the CBP, and Joint Task Force-Bravo, assisted Honduras troops in conducting raids on traffickers’ sites of operation.[118]

The War on Drugs has been a highly contentious issue since its inception. A poll on October 2, 2008, found that three in four Americans believed that the War On Drugs was failing.[119]

At a meeting in Guatemala in 2012, three former presidents from Guatemala, Mexico and Colombia said that the war on drugs had failed and that they would propose a discussion on alternatives, including decriminalization, at the Summit of the Americas in April of that year.[120] Guatemalan President Otto Pérez Molina said that the war on drugs was exacting too high a price on the lives of Central Americans and that it was time to “end the taboo on discussing decriminalization”.[121] At the summit, the government of Colombia pushed for the most far-reaching change to drugs policy since the war on narcotics was declared by Nixon four decades prior, citing the catastrophic effects it had had in Colombia.[122]

Several critics have compared the wholesale incarceration of the dissenting minority of drug users to the wholesale incarceration of other minorities in history. Psychiatrist Thomas Szasz, for example, wrote in 1997: “Over the past thirty years, we have replaced the medical-political persecution of illegal sex users (‘perverts’ and ‘psychopaths’) with the even more ferocious medical-political persecution of illegal drug users.”[123]

Penalties for drug crimes among American youth almost always involve permanent or semi-permanent removal from opportunities for education, stripping of voting rights, and later the creation of criminal records which make employment more difficult.[124] Thus, some authors maintain that the War on Drugs has resulted in the creation of a permanent underclass of people who have few educational or job opportunities, often as a result of being punished for drug offenses which in turn have resulted from attempts to earn a living in spite of having no education or job opportunities.[124]

According to a 2008 study published by Harvard economist Jeffrey A. Miron, the annual savings on enforcement and incarceration costs from the legalization of drugs would amount to roughly $41.3 billion, with $25.7 billion being saved among the states and over $15.6 billion accrued for the federal government. Miron further estimated at least $46.7 billion in tax revenue based on rates comparable to those on tobacco and alcohol ($8.7 billion from marijuana, $32.6 billion from cocaine and heroin, remainder from other drugs).[125]
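Miron’s component figures are internally consistent; a quick cross-check (an illustrative calculation only, using just the numbers quoted above):

```python
# Cross-check of the component figures in Miron's 2008 estimate
# (all values in billions of dollars, as quoted above).
state_savings = 25.7     # savings among the states
federal_savings = 15.6   # savings accrued by the federal government
total_savings = 41.3     # total annual enforcement/incarceration savings

# The two components sum to the cited total:
assert abs((state_savings + federal_savings) - total_savings) < 0.05

marijuana_tax = 8.7
cocaine_heroin_tax = 32.6
total_tax = 46.7
# The "remainder from other drugs" implied by the cited figures:
other_drugs_tax = total_tax - (marijuana_tax + cocaine_heroin_tax)
print(f"implied revenue from other drugs: ${other_drugs_tax:.1f} billion")
```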

Low taxation in Central American countries has been credited with weakening the region’s response to drug traffickers. Many cartels, especially Los Zetas, have taken advantage of the limited resources of these nations. In 2010, tax revenue in El Salvador, Guatemala, and Honduras amounted to just 13.53% of GDP. As a comparison, in Chile and the U.S., taxes were 18.6% and 26.9% of GDP respectively. However, direct taxes on income are very hard to enforce, and in some cases tax evasion is seen as a national pastime.[126]

The status of coca and coca growers has become an intense political issue in several countries, including Colombia and particularly Bolivia, where the president, Evo Morales, a former coca growers’ union leader, has promised to legalise the traditional cultivation and use of coca.[127] Indeed, legalization efforts have yielded some successes under the Morales administration when combined with aggressive and targeted eradication efforts. The country saw a 12–13% decline in coca cultivation[127] in 2011 under Morales, who has used coca growers’ federations to ensure compliance with the law rather than providing a primary role for security forces.[127]

The coca eradication policy has been criticised for its negative impact on the livelihood of coca growers in South America. In many areas of South America the coca leaf has traditionally been chewed and used in tea and for religious, medicinal and nutritional purposes by locals.[128] For this reason many insist that the illegality of traditional coca cultivation is unjust. In many areas the U.S. government and military has forced the eradication of coca without providing for any meaningful alternative crop for farmers, and has additionally destroyed many of their food or market crops, leaving them starving and destitute.[128]

The CIA, DEA, State Department, and several other U.S. government agencies have been alleged to have relations with various groups which are involved in drug trafficking.

Senator John Kerry’s 1988 U.S. Senate Committee on Foreign Relations report on Contra drug links concludes that members of the U.S. State Department “who provided support for the Contras are involved in drug trafficking… and elements of the Contras themselves knowingly receive financial and material assistance from drug traffickers.”[129] The report further states that “the Contra drug links include… payments to drug traffickers by the U.S. State Department of funds authorized by the Congress for humanitarian assistance to the Contras, in some cases after the traffickers had been indicted by federal law enforcement agencies on drug charges, in others while traffickers were under active investigation by these same agencies.”

In 1996, journalist Gary Webb published reports in the San Jose Mercury News, and later in his book Dark Alliance, detailing how Contras had been involved in distributing crack cocaine into Los Angeles whilst receiving money from the CIA.[citation needed] Contras used money from drug trafficking to buy weapons.[citation needed]

Webb’s premise regarding the U.S. Government connection was initially attacked at the time by the media. It is now widely accepted that Webb’s main assertion of government “knowledge of drug operations, and collaboration with and protection of known drug traffickers” was correct.[130][not in citation given] In 1998, CIA Inspector General Frederick Hitz published a two-volume report[131] that while seemingly refuting Webb’s claims of knowledge and collaboration in its conclusions did not deny them in its body.[citation needed] Hitz went on to admit CIA improprieties in the affair in testimony to a House congressional committee. There has been a reversal amongst mainstream media of its position on Webb’s work, with acknowledgement made of his contribution to exposing a scandal it had ignored.

According to Rodney Campbell, an editorial assistant to Nelson Rockefeller, during World War II, the United States Navy, concerned that strikes and labor disputes in U.S. eastern shipping ports would disrupt wartime logistics, released the mobster Lucky Luciano from prison, and collaborated with him to help the mafia take control of those ports. Labor union members were terrorized and murdered by mafia members as a means of preventing labor unrest and ensuring smooth shipping of supplies to Europe.[132]

According to Alexander Cockburn and Jeffrey St. Clair, in order to prevent Communist party members from being elected in Italy following World War II, the CIA worked closely with the Sicilian Mafia, protecting them and assisting in their worldwide heroin smuggling operations. The mafia was in conflict with leftist groups and was involved in assassinating, torturing, and beating leftist political organizers.[133]

In 1986, the US Defense Department funded a two-year study by the RAND Corporation, which found that the use of the armed forces to interdict drugs coming into the United States would have little or no effect on cocaine traffic and might, in fact, raise the profits of cocaine cartels and manufacturers. The 175-page study, “Sealing the Borders: The Effects of Increased Military Participation in Drug Interdiction”, was prepared by seven researchers, mathematicians and economists at the National Defense Research Institute, a branch of RAND, and was released in 1988. The study noted that seven prior studies in the past nine years, including one by the Center for Naval Research and the Office of Technology Assessment, had come to similar conclusions. Interdiction efforts, using current armed forces resources, would have almost no effect on cocaine importation into the United States, the report concluded.[135]

During the early-to-mid-1990s, the Clinton administration ordered and funded a major cocaine policy study, again by RAND. The RAND Drug Policy Research Center study concluded that $3 billion should be switched from federal and local law enforcement to treatment. The report said that treatment is the cheapest way to cut drug use, stating that drug treatment is twenty-three times more effective than the supply-side “war on drugs”.[136]

The National Research Council Committee on Data and Research for Policy on Illegal Drugs published its findings in 2001 on the efficacy of the drug war. The NRC Committee found that existing studies on efforts to address drug usage and smuggling, from U.S. military operations to eradicate coca fields in Colombia, to domestic drug treatment centers, have all been inconclusive, if the programs have been evaluated at all: “The existing drug-use monitoring systems are strikingly inadequate to support the full range of policy decisions that the nation must make…. It is unconscionable for this country to continue to carry out a public policy of this magnitude and cost without any way of knowing whether and to what extent it is having the desired effect.”[137] The study, though not ignored by the press, was ignored by top-level policymakers, leading Committee Chair Charles Manski to conclude, as one observer notes, that “the drug war has no interest in its own results”.[138]

In mid-1995, the US government tried to reduce the supply of methamphetamine precursors to disrupt the market of this drug. According to a 2009 study, this effort was successful, but its effects were largely temporary.[139]

During alcohol prohibition, the period from 1920 to 1933, alcohol use initially fell but began to increase as early as 1922. It has been extrapolated that even if prohibition had not been repealed in 1933, alcohol consumption would have quickly surpassed pre-prohibition levels.[140] One argument against the War on Drugs is that it uses similar measures as Prohibition and is no more effective.

In the six years from 2000 to 2006, the U.S. spent $4.7 billion on Plan Colombia, an effort to eradicate coca production in Colombia. The main result of this effort was to shift coca production into more remote areas and force other forms of adaptation. The overall acreage cultivated for coca in Colombia at the end of the six years was found to be the same, after the U.S. Drug Czar’s office announced a change in measuring methodology in 2005 and included new areas in its surveys.[141] Cultivation in the neighboring countries of Peru and Bolivia increased; some describe this effect as like squeezing a balloon.[142]

Richard Davenport-Hines, in his book The Pursuit of Oblivion,[143] criticized the efficacy of the War on Drugs by pointing out that

10–15% of illicit heroin and 30% of illicit cocaine is intercepted. Drug traffickers have gross profit margins of up to 300%. At least 75% of illicit drug shipments would have to be intercepted before the traffickers’ profits were hurt.
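The 75% figure follows from the 300% margin under a simple assumption (our reading, not a derivation stated by Davenport-Hines): at a 300% gross margin, revenue is four times cost, so profit reaches zero once seizures cut revenue by three quarters.

```python
# Sketch of the break-even interception arithmetic implied above
# (assumes seizures reduce revenue proportionally while production
# costs are incurred on all shipments, seized or not).

def breakeven_interception(gross_margin: float) -> float:
    """Fraction of shipments that must be seized to erase all profit,
    for a gross margin expressed as a fraction of cost (3.0 = 300%)."""
    revenue_multiple = 1 + gross_margin   # revenue per unit of cost
    # Profit is zero when (1 - f) * revenue_multiple == 1:
    return 1 - 1 / revenue_multiple

assert breakeven_interception(3.0) == 0.75  # the cited 75% threshold
print(breakeven_interception(3.0))
```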

Alberto Fujimori, president of Peru from 1990 to 2000, described U.S. foreign drug policy as “failed” on grounds that “for 10 years, there has been a considerable sum invested by the Peruvian government and another sum on the part of the American government, and this has not led to a reduction in the supply of coca leaf offered for sale. Rather, in the 10 years from 1980 to 1990, it grew 10-fold.”[144]

At least 500 economists, including Nobel Laureates Milton Friedman,[145] George Akerlof and Vernon L. Smith, have noted that reducing the supply of marijuana without reducing the demand causes the price, and hence the profits of marijuana sellers, to go up, according to the laws of supply and demand.[146] The increased profits encourage the producers to produce more drugs despite the risks, providing a theoretical explanation for why attacks on drug supply have failed to have any lasting effect. The aforementioned economists published an open letter to President George W. Bush stating “We urge…the country to commence an open and honest debate about marijuana prohibition… At a minimum, this debate will force advocates of current policy to show that prohibition has benefits sufficient to justify the cost to taxpayers, foregone tax revenues and numerous ancillary consequences that result from marijuana prohibition.”

The declaration from the 2008 World Forum Against Drugs states that a balanced policy of drug abuse prevention, education, treatment, law enforcement, research, and supply reduction provides the most effective platform to reduce drug abuse and its associated harms, and calls on governments to consider demand reduction as one of their first priorities in the fight against drug abuse.[147]

Despite over $7 billion spent annually on arresting[148] and prosecuting nearly 800,000 people across the country for marijuana offenses in 2005[citation needed] (FBI Uniform Crime Reports), the federally funded Monitoring the Future Survey reports that about 85% of high school seniors find marijuana “easy to obtain”. That figure has remained virtually unchanged since 1975, never dropping below 82.7% in three decades of national surveys.[149] The Drug Enforcement Administration states that the number of marijuana users in the U.S. declined between 2000 and 2005, even as many states passed new medical marijuana laws making access easier,[150] though usage rates remain higher than they were in the 1990s according to the National Survey on Drug Use and Health.[151]

ONDCP stated in April 2011 that there has been a 46 percent drop in cocaine use among young adults over the past five years, and a 65 percent drop in the rate of people testing positive for cocaine in the workplace since 2006.[152] At the same time, a 2007 study found that up to 35% of college undergraduates used stimulants not prescribed to them.[153]

A 2013 study found that prices of heroin, cocaine and cannabis had decreased from 1990 to 2007, but the purity of these drugs had increased during the same time.[154]

The War on Drugs is often called a policy failure.[155][156][157][158][159]

The legality of the War on Drugs has been challenged on four main grounds in the U.S.

Several authors believe that the United States’ federal and state governments have chosen wrong methods for combatting the distribution of illicit substances. Aggressive, heavy-handed enforcement funnels individuals through courts and prisons; instead of treating the cause of the addiction, the focus of government efforts has been on punishment. By making drugs illegal rather than regulating them, the War on Drugs creates a highly profitable black market. Jefferson Fish has edited scholarly collections of articles offering a wide variety of public health based and rights based alternative drug policies.[160][161][162]

In the year 2000, the United States drug-control budget reached 18.4 billion dollars,[163] nearly half of which was spent financing law enforcement while only one sixth was spent on treatment. In the year 2003, 53 percent of the requested drug control budget was for enforcement, 29 percent for treatment, and 18 percent for prevention.[164] The state of New York, in particular, designated 17 percent of its budget towards substance-abuse-related spending. Of that, a mere one percent was put towards prevention, treatment, and research.

In a survey taken by Substance Abuse and Mental Health Services Administration (SAMHSA), it was found that substance abusers that remain in treatment longer are less likely to resume their former drug habits. Of the people that were studied, 66 percent were cocaine users. After experiencing long-term in-patient treatment, only 22 percent returned to the use of cocaine. Treatment had reduced the number of cocaine abusers by two-thirds.[163] By spending the majority of its money on law enforcement, the federal government had underestimated the true value of drug-treatment facilities and their benefit towards reducing the number of addicts in the U.S.

In 2004 the federal government issued the National Drug Control Strategy. It supported programs designed to expand treatment options, enhance treatment delivery, and improve treatment outcomes. For example, the Strategy provided SAMHSA with a $100.6 million grant to put towards their Access to Recovery (ATR) initiative. ATR is a program that provides vouchers to addicts to provide them with the means to acquire clinical treatment or recovery support. The project’s goals are to expand capacity, support client choice, and increase the array of faith-based and community based providers for clinical treatment and recovery support services.[165] The ATR program will also provide a more flexible array of services based on the individual’s treatment needs.

The 2004 Strategy additionally declared a significant 32 million dollar raise in the Drug Courts Program, which provides drug offenders with alternatives to incarceration. As a substitute for imprisonment, drug courts identify substance-abusing offenders and place them under strict court monitoring and community supervision, as well as provide them with long-term treatment services.[166] According to a report issued by the National Drug Court Institute, drug courts have a wide array of benefits, with only 16.4 percent of the nation’s drug court graduates rearrested and charged with a felony within one year of completing the program (versus the 44.1% of released prisoners who end up back in prison within one year). Additionally, enrolling an addict in a drug court program costs much less than incarcerating one in prison.[167] According to the Bureau of Prisons, the fee to cover the average cost of incarceration for Federal inmates in 2006 was $24,440.[168] The annual cost of receiving treatment in a drug court program ranges from $900 to $3,500. Drug courts in New York State alone saved $2.54 million in incarceration costs.[167]
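The cost comparison above reduces to per-offender arithmetic. A small sketch using the figures quoted in the text (the constant and function names are mine, for illustration only):

```python
INCARCERATION_COST = 24_440      # BOP average annual incarceration fee, FY2006
DRUG_COURT_RANGE = (900, 3_500)  # annual drug-court treatment cost range cited above

def annual_saving(court_cost: float) -> float:
    """Per-offender yearly saving from diversion to a drug court program."""
    return INCARCERATION_COST - court_cost

# Even at the most expensive end of the drug-court range, the saving
# exceeds $20,000 per offender per year:
worst_case = annual_saving(max(DRUG_COURT_RANGE))  # 20,940
```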

Describing the failure of the War on Drugs, New York Times columnist Eduardo Porter noted:

Jeffrey Miron, an economist at Harvard who studies drug policy closely, has suggested that legalizing all illicit drugs would produce net benefits to the United States of some $65 billion a year, mostly by cutting public spending on enforcement as well as through reduced crime and corruption. A study by analysts at the RAND Corporation, a California research organization, suggested that if marijuana were legalized in California and the drug spilled from there to other states, Mexican drug cartels would lose about a fifth of their annual income of some $6.5 billion from illegal exports to the United States.[169]

Many believe that the War on Drugs has been costly and ineffective largely because inadequate emphasis is placed on treatment of addiction. The United States leads the world in both recreational drug usage and incarceration rates. 70% of men arrested in metropolitan areas test positive for an illicit substance,[170] and 54% of all men incarcerated will be repeat offenders.[171]

There are also programs in the United States to combat the public health risks faced by injecting drug users, such as needle exchange programs, which provide users with new needles in exchange for used ones to prevent needle sharing.


Technology – Wikipedia


Technology (“science of craft”, from Greek τέχνη, techne, “art, skill, cunning of hand”; and -λογία, -logia[2]) was first robustly defined by Jacob Bigelow in 1829 as: “…principles, processes, and nomenclatures of the more conspicuous arts, particularly those which involve applications of science, and which may be considered useful, by promoting the benefit of society, together with the emolument [compensation[3]] of those who pursue them”.[4]

The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.

Technology has many effects. It has helped develop more advanced economies (including today’s global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth’s environment. Innovations have always influenced the values of a society and raised new questions of the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

The use of the term “technology” has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and it was used either to refer to the description or study of the useful arts[14] or to allude to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[15]

The term “technology” rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term’s meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into “technology.” In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, which usually translates both terms as “technology.” By the 1930s, “technology” referred not only to the study of the industrial arts but to the industrial arts themselves.[16]

In 1937, the American sociologist Read Bain wrote that “technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them.”[17] Bain’s definition remains common among scholars today, especially social scientists. Scientists and engineers usually prefer to define technology as applied science, rather than as the things that people make and use.[18] More recently, scholars have borrowed from European philosophers of “technique” to extend the meaning of technology to various forms of instrumental reason, as in Foucault’s work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Learner’s Dictionary offers a definition of the term: “the use of science in industry, engineering, etc., to invent useful things or to solve problems” and “a machine, piece of equipment, method, etc., that is created by technology.”[19] Ursula Franklin, in her 1989 “Real World of Technology” lecture, gave another definition of the concept; it is “practice, the way we do things around here.”[20] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[21] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as “the pursuit of life by means other than life,” and as “organized inorganic matter.”[22]

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[23] W. Brian Arthur defines technology in a similarly broad way as “a means to fulfill a human purpose.”[24]

The word “technology” can also be used to refer to a collection of techniques. In this context, it is the current state of humanity’s knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as “medical technology” or “space technology,” it refers to the state of the respective field’s knowledge and tools. “State-of-the-art technology” refers to the high technology available to humanity in any field.

Technology can be viewed as an activity that forms or changes culture.[25] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and as a result has helped spawn new subcultures; the rise of cyberculture has at its basis the development of the Internet and the computer.[26] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

The distinction between science, engineering, and technology is not always clear. Science is systematic knowledge of the physical or material world gained through observation and experimentation.[27] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability, and safety.[citation needed]

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering, although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[28]

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, it was widely considered in the United States that technology was simply “applied science” and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush’s treatise on postwar science policy, Science, the Endless Frontier: “New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature… This essential new knowledge can be obtained only through basic scientific research.”[29] In the late 1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[30][31]

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[32] with a brain mass approximately one third of modern humans.[33] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[34]

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 75,000 years ago,[35] pressure flaking provided a way to make much finer work.

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[36] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1 Ma;[37] scholarly consensus indicates that Homo erectus had controlled fire by between 500 and 400 ka.[38][39] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[40]

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity’s progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380 ka, humans were constructing temporary wood huts.[41][42] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200 ka and into other continents such as Eurasia.[43]

Humanity’s technological ascent began in earnest in what is known as the Neolithic Period (“New Stone Age”). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. The use of polished stone axes increased greatly in the Neolithic, though such axes were originally used in the preceding Mesolithic in some areas such as Ireland.[44] Agriculture fed larger populations, and the transition to sedentism allowed more children to be raised simultaneously, as infants no longer needed to be carried as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[45][46]

With this increase in population and availability of labor came an increase in labor specialization.[47] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[48]

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead, native metals found in relatively pure form in nature.[49] The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 ka).[50] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first use of iron alloys such as steel dates to around 1800 BCE.[51][52]

Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to the 8th millennium BCE.[53] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and “catch” basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[54]

According to archaeologists, the wheel was invented around 4000 BCE probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe.[55] Estimates on when this may have occurred range from 5500 to 3000 BCE with most experts putting it closer to 4000 BCE.[56] The oldest artifacts with drawings depicting wheeled carts date from about 3500 BCE;[57] however, the wheel may have been in use for millennia before these drawings were made. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[58]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used the potter’s wheel and may have invented it.[59] A stone pottery wheel found in the city-state of Ur dates to around 3429 BCE,[60] and even older fragments of wheel-thrown pottery have been found in the same area.[60] Fast (rotary) potters’ wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[61] and were first used in Mesopotamia and Iran in around 3000 BCE.[61]

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4000 BCE[62] and timber roads leading through the swamps of Glastonbury, England, dating to around the same time period.[62] The first long-distance road, which came into use around 3500 BCE,[62] spanned 1,500 miles from the Persian Gulf to the Mediterranean Sea,[62] but was not paved and was only partially maintained.[62] In around 2000 BCE, the Minoans on the Greek island of Crete built a fifty-kilometer (thirty-mile) road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[62] Unlike the earlier road, the Minoan road was completely paved.[62]

Ancient Minoan private homes had running water.[64] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[64][65] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[64] The ancient Romans had many public flush toilets,[65] which emptied into an extensive sewage system.[65] The primary sewer in Rome was the Cloaca Maxima;[65] construction began on it in the sixth century BCE and it is still in use today.[65]

The ancient Romans also had a complex system of aqueducts,[63] which were used to transport water across long distances.[63] The first Roman aqueduct was built in 312 BCE.[63] The eleventh and final ancient Roman aqueduct was built in 226 CE.[63] Put together, the Roman aqueducts extended over 450 kilometers,[63] but less than seventy kilometers of this was above ground and supported by arches.[63]

Innovations continued through the Middle Ages with innovations such as silk, the horse collar and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.

Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb, and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight and advancements in medicine, chemistry, physics, and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their food supply. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.

Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education; designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation, and architecture.

Generally, technicism is the belief in the utility of technology for improving human societies.[66] Taken to an extreme, technicism “reflects a fundamental attitude which seeks to control reality, to resolve all problems with the use of scientific-technological methods and tools.”[67] In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[68] connect these ideas to the abdication of religion as a higher moral authority.

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good.

Transhumanists generally believe that the point of technology is to overcome barriers, and that what we commonly refer to as the human condition is just another barrier to be surpassed.

Singularitarians believe in some sort of “accelerating change”; that the rate of technological progress accelerates as we obtain more technology, and that this will culminate in a “Singularity” after artificial general intelligence is invented in which progress is nearly infinite; hence the term. Estimates for the date of this Singularity vary,[69] but prominent futurist Ray Kurzweil estimates the Singularity will occur in 2045.

Kurzweil is also known for his history of the universe in six epochs: (1) the physical/chemical epoch, (2) the life epoch, (3) the human/brain epoch, (4) the technology epoch, (5) the artificial intelligence epoch, and (6) the universal colonization epoch. Going from one epoch to the next is a Singularity in its own right, and a period of speeding up precedes it. Each epoch takes a shorter time, which means the whole history of the universe is one giant Singularity event.[70]

Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[71]

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become evermore technological at the cost of freedom and psychological health.

Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, though not entirely deterministic, reservations about technology (see “The Question Concerning Technology”[72]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, “Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that ‘in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.’ Indeed, he promises that ‘when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.'[73] What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.”[74]

Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics such as Aldous Huxley’s Brave New World, Anthony Burgess’s A Clockwork Orange, and George Orwell’s Nineteen Eighty-Four. In Goethe’s Faust, Faust selling his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction such as those by Philip K. Dick and William Gibson and films such as Blade Runner and Ghost in the Shell project highly ambivalent or cautionary attitudes toward technology’s impact on human society and identity.

The late cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called “technopolies,” societies that are dominated by the ideology of technological and scientific progress to the exclusion or harm of other cultural practices, values, and world-views.[75]

Darin Barney has written about technology’s impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible because they already give an answer to the question: a good life is one that includes the use of more and more technology.[76]

Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology, and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[77]

Another prominent critic of technology is Hubert Dreyfus, who has published books such as On the Internet and What Computers Still Can’t Do.

A more infamous anti-technological treatise is Industrial Society and Its Future, written by the Unabomber Ted Kaczynski and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure. There are also subcultures that disapprove of some or most technology, such as self-identified off-gridders.[78]

The notion of appropriate technology was developed in the 20th century by thinkers such as E. F. Schumacher and Jacques Ellul to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The ecovillage movement emerged in part due to this concern.

This section mainly focuses on American concerns, even if its argument can reasonably be generalized to other Western countries.

The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. […] What’s the linkage between technology and this fundamental problem?

In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[79] questions the widespread idea that automation, and more broadly, technological advances, have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Essentially, he takes a neutral stance on the linkage between technology and American problems of unemployment and declining wages.

He uses two main arguments to defend his point. First, because of recent technological advances, an increasing number of workers are losing their jobs. Yet, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual jobs that require “flexibility, judgment and common sense”[80] remain hard to replace with machines. Second, studies have not shown clear links between recent technological advances and the wage trends of the last decades.

Therefore, according to Bernstein, instead of focusing on technology and its hypothetical influence on rising American unemployment and declining wages, one needs to worry more about “bad policy that fails to offset the imbalances in demand, trade, income, and opportunity.”[80]

People who use both the Internet and mobile devices excessively are likely to experience fatigue and exhaustion as a result of disrupted sleep patterns. Ongoing studies have shown that increased BMI and weight gain are associated with spending long hours online without exercising frequently.[81] Heavy Internet use is also reflected in lower school grades among those who use it excessively.[82] It has also been noted that mobile phone use while driving has increased the occurrence of road accidents, particularly among teen drivers. Statistically, teens reportedly have four times as many road traffic incidents as those 20 years or older, and a very high percentage of adolescents write (81%) and read (92%) texts while driving.[83] In this context, mass media and technology have a negative impact on people’s mental and physical health.

Thomas P. Hughes stated that because technology has been considered as a key way to solve problems, we need to be aware of its complex and varied characters to use it more efficiently.[84] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them, or none of them as technologies?

Technology is often considered too narrowly; according to Hughes, “Technology is a creative process involving human ingenuity.”[85] This definition’s emphasis on creativity avoids unbounded definitions that may mistakenly include cooking “technologies,” but it also highlights the prominent role of humans, and therefore their responsibility, for the use of complex technological systems.

Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argues that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[84] For instance, Evgeny Morozov particularly challenges two concepts: “Internet-centrism” and “solutionism.”[86] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology, and especially thanks to the Internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal’s review of Morozov’s theory, ignoring these limitations will lead to “unexpected consequences that could eventually cause more damage than the problems they seek to address.”[87] Benjamin R. Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[88]

Therefore, recognition of the limitations of technology, and more broadly, of scientific knowledge, is needed, especially in cases dealing with environmental justice and health issues. Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists’ and engineers’ new comprehension of their role. Such an approach to technology and science “[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions.”[89]

Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane, but it is all technology, and its exploitation is the foundation of all competitive advantage.

Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and it is what was used to transform the US into a superpower. It was not economic-based planning.

The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees,[90] some dolphin communities,[91] and crows.[92][93] Considering a more generic perspective of technology as ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.

The ability to make and use tools was once considered a defining characteristic of the genus Homo.[94] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion of the use of technology as unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[95] West African chimpanzees also use stone hammers and anvils for cracking nuts,[96] as do capuchin monkeys of Boa Vista, Brazil.[97]

Theories of technology often attempt to predict the future of technology based on the high technology and science of the time. As with all predictions of the future, however, the future of technology is uncertain.

In 2005, futurist Ray Kurzweil predicted that the future of technology would mainly consist of an overlapping “GNR Revolution” of genetics, nanotechnology and robotics, with robotics being the most important of the three.[98]


The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.

Technology has many effects. It has helped develop more advanced economies (including today’s global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth’s environment. Innovations have always influenced the values of a society and raised new questions of the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

The use of the term “technology” has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and it was used either to refer to the description or study of the useful arts[14] or to allude to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[15]

The term “technology” rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term’s meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into “technology.” In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, which usually translates both terms as “technology.” By the 1930s, “technology” referred not only to the study of the industrial arts but to the industrial arts themselves.[16]

In 1937, the American sociologist Read Bain wrote that “technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them.”[17] Bain’s definition remains common among scholars today, especially social scientists. Scientists and engineers usually prefer to define technology as applied science, rather than as the things that people make and use.[18] More recently, scholars have borrowed from European philosophers of “technique” to extend the meaning of technology to various forms of instrumental reason, as in Foucault’s work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Learner’s Dictionary offers a definition of the term: “the use of science in industry, engineering, etc., to invent useful things or to solve problems” and “a machine, piece of equipment, method, etc., that is created by technology.”[19] Ursula Franklin, in her 1989 “Real World of Technology” lecture, gave another definition of the concept; it is “practice, the way we do things around here.”[20] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[21] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as “the pursuit of life by means other than life,” and as “organized inorganic matter.”[22]

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[23] W. Brian Arthur defines technology in a similarly broad way as “a means to fulfill a human purpose.”[24]

The word “technology” can also be used to refer to a collection of techniques. In this context, it is the current state of humanity’s knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as “medical technology” or “space technology,” it refers to the state of the respective field’s knowledge and tools. “State-of-the-art technology” refers to the high technology available to humanity in any field.

Technology can be viewed as an activity that forms or changes culture.[25] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and as a result has helped spawn new subcultures; the rise of cyberculture has at its basis the development of the Internet and the computer.[26] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

The distinction between science, engineering, and technology is not always clear. Science is systematic knowledge of the physical or material world gained through observation and experimentation.[27] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability, and safety.[citation needed]

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering, although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[28]

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, it was widely considered in the United States that technology was simply “applied science” and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush’s treatise on postwar science policy, Science: The Endless Frontier: “New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature… This essential new knowledge can be obtained only through basic scientific research.”[29] In the late 1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[30][31]

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[32] with a brain mass approximately one third that of modern humans.[33] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and a complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[34]

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 75,000 years ago,[35] pressure flaking provided a way to make much finer work.

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[36] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1 Ma;[37] scholarly consensus indicates that Homo erectus had controlled fire by between 500 and 400 ka.[38][39] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[40]

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity’s progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380 ka, humans were constructing temporary wood huts.[41][42] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200 ka and into other continents such as Eurasia.[43]

Humanity’s technological ascent began in earnest in what is known as the Neolithic Period (“New Stone Age”). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. The use of polished stone axes increased greatly in the Neolithic, but they were originally used in the preceding Mesolithic in some areas such as Ireland.[44] Agriculture fed larger populations, and the transition to sedentism allowed simultaneously raising more children, as infants no longer needed to be carried, as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[45][46]

With this increase in population and availability of labor came an increase in labor specialization.[47] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[48]

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead, native metals found in relatively pure form in nature.[49] The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 ka).[50] Native copper does not naturally occur in large amounts, but copper ores are quite common, and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first use of iron alloys such as steel dates to around 1800 BCE.[51][52]

Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to the 8th millennium BCE.[53] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and “catch” basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[54]

According to archaeologists, the wheel was invented around 4000 BCE probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe.[55] Estimates on when this may have occurred range from 5500 to 3000 BCE with most experts putting it closer to 4000 BCE.[56] The oldest artifacts with drawings depicting wheeled carts date from about 3500 BCE;[57] however, the wheel may have been in use for millennia before these drawings were made. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[58]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used the potter’s wheel and may have invented it.[59] A stone pottery wheel found in the city-state of Ur dates to around 3429 BCE,[60] and even older fragments of wheel-thrown pottery have been found in the same area.[60] Fast (rotary) potters’ wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[61] and were first used in Mesopotamia and Iran in around 3000 BCE.[61]

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4000 BCE[62] and timber roads leading through the swamps of Glastonbury, England, dating to around the same time period.[62] The first long-distance road, which came into use around 3500 BCE,[62] spanned 1,500 miles from the Persian Gulf to the Mediterranean Sea,[62] but was not paved and was only partially maintained.[62] In around 2000 BCE, the Minoans on the Greek island of Crete built a fifty-kilometer (thirty-mile) road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[62] Unlike the earlier road, the Minoan road was completely paved.[62]

Ancient Minoan private homes had running water.[64] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[64][65] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[64] The ancient Romans had many public flush toilets,[65] which emptied into an extensive sewage system.[65] The primary sewer in Rome was the Cloaca Maxima;[65] construction began on it in the sixth century BCE and it is still in use today.[65]

The ancient Romans also had a complex system of aqueducts,[63] which were used to transport water across long distances.[63] The first Roman aqueduct was built in 312 BCE.[63] The eleventh and final ancient Roman aqueduct was built in 226 CE.[63] Put together, the Roman aqueducts extended over 450 kilometers,[63] but less than seventy kilometers of this was above ground and supported by arches.[63]

Innovation continued through the Middle Ages with the introduction of silk, the horse collar, and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills, and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.

Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb, and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight and advancements in medicine, chemistry, physics, and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their food supply. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.

Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly complex tools. Modern technology increasingly relies on training and education; designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation, and architecture.

Generally, technicism is the belief in the utility of technology for improving human societies.[66] Taken to an extreme, technicism “reflects a fundamental attitude which seeks to control reality, to resolve all problems with the use of scientific-technological methods and tools.”[67] In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[68] connect these ideas to the abdication of religion as a higher moral authority.

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good.

Transhumanists generally believe that the point of technology is to overcome barriers, and that what we commonly refer to as the human condition is just another barrier to be surpassed.

Singularitarians believe in some sort of “accelerating change”; that the rate of technological progress accelerates as we obtain more technology, and that this will culminate in a “Singularity” after artificial general intelligence is invented in which progress is nearly infinite; hence the term. Estimates for the date of this Singularity vary,[69] but prominent futurist Ray Kurzweil estimates the Singularity will occur in 2045.
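The “accelerating change” claim can be made concrete with a toy model. The doubling interval and the notion of summing “progress” per decade below are illustrative assumptions, not Kurzweil’s actual forecasting method; they simply show that when the rate of progress doubles at a fixed interval, the most recent interval accounts for more progress than all previous intervals combined.

```python
# Toy model of accelerating change (illustrative assumptions only,
# not Kurzweil's method): the rate of progress doubles each decade.

def progress_by_decade(decades: int) -> list[float]:
    """Progress achieved in each decade when the rate doubles every decade."""
    return [float(2 ** d) for d in range(decades)]

rates = progress_by_decade(10)
total = sum(rates)               # 1 + 2 + 4 + ... + 512 = 1023
final_share = rates[-1] / total  # share contributed by the last decade
# The final decade alone contributes slightly more than half of all
# progress ever made -- the arithmetic intuition behind the claim that
# change keeps compressing toward the present.
```

Under these assumptions, `final_share` is 512/1023, just over 50%, and the same dominance of the last interval holds no matter how many decades the model is run for.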

Kurzweil is also known for his history of the universe in six epochs: (1) the physical/chemical epoch, (2) the life epoch, (3) the human/brain epoch, (4) the technology epoch, (5) the artificial intelligence epoch, and (6) the universal colonization epoch. Going from one epoch to the next is a Singularity in its own right, and a period of speeding up precedes it. Each epoch takes a shorter time, which means the whole history of the universe is one giant Singularity event.[70]

Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[71]

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become evermore technological at the cost of freedom and psychological health.

Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see “The Question Concerning Technology”[72]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, “Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that ‘in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.’ Indeed, he promises that ‘when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.’[73] What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.”[74]

Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics such as Aldous Huxley’s Brave New World, Anthony Burgess’s A Clockwork Orange, and George Orwell’s Nineteen Eighty-Four. In Goethe’s Faust, Faust selling his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction such as those by Philip K. Dick and William Gibson and films such as Blade Runner and Ghost in the Shell project highly ambivalent or cautionary attitudes toward technology’s impact on human society and identity.

The late cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called “technopolies,” societies that are dominated by the ideology of technological and scientific progress to the exclusion or harm of other cultural practices, values, and world-views.[75]

Darin Barney has written about technology’s impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible because they already give an answer to the question: a good life is one that includes the use of more and more technology.[76]

Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology, and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[77]

Another prominent critic of technology is Hubert Dreyfus, who has published books such as On the Internet and What Computers Still Can’t Do.

A more infamous anti-technological treatise is Industrial Society and Its Future, written by the Unabomber Ted Kaczynski and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure.

The notion of appropriate technology was developed in the 20th century by thinkers such as E. F. Schumacher and Jacques Ellul to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The ecovillage movement emerged in part due to this concern.

This section mainly focuses on American concerns, though the discussion can reasonably be generalized to other Western countries.

The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. […] What’s the linkage between technology and this fundamental problem?

In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[78] questions the widespread idea that automation, and more broadly, technological advances, have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Essentially, he takes a neutral approach to the linkage between technology and American issues concerning unemployment and declining wages.

He uses two main arguments to defend his point. First, because of recent technological advances, an increasing number of workers are losing their jobs. Yet, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual jobs that require “flexibility, judgment and common sense”[79] remain hard to replace with machines. Second, studies have not shown clear links between recent technological advances and the wage trends of the last decades.

Therefore, according to Bernstein, instead of focusing on technology and its hypothetical influences on rising unemployment and declining wages in America, one needs to worry more about “bad policy that fails to offset the imbalances in demand, trade, income, and opportunity.”[79]

People who use both the Internet and mobile devices in excessive quantities are likely to experience fatigue and over-exhaustion as a result of disruptions in their sleeping patterns. Continuous studies have shown that increased BMI and weight gain are associated with people who spend long hours online and do not exercise frequently.[80] Heavy Internet use is also associated with lower school grades among those who use it in excessive amounts.[81] It has also been noted that the use of mobile phones whilst driving has increased the occurrence of road accidents, particularly amongst teen drivers. Statistically, teens reportedly have four times as many road traffic incidents as drivers who are 20 years or older, and a very high percentage of adolescents write (81%) and read (92%) texts while driving.[82] In this context, mass media and technology have a negative impact on people, on both their mental and physical health.

Thomas P. Hughes stated that because technology has been considered as a key way to solve problems, we need to be aware of its complex and varied character in order to use it more efficiently.[83] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them, or none of them as technologies?

Technology is often considered too narrowly; according to Hughes, “Technology is a creative process involving human ingenuity”.[84] This definition’s emphasis on creativity avoids unbounded definitions that may mistakenly include cooking “technologies,” but it also highlights the prominent role of humans and therefore their responsibilities for the use of complex technological systems.

Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argues that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[83] For instance, Evgeny Morozov particularly challenges two concepts: “Internet-centrism” and “solutionism.”[85] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology, and especially thanks to the Internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal’s review of Morozov’s theory, ignoring these “will lead to unexpected consequences that could eventually cause more damage than the problems they seek to address.”[86] Benjamin R. Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[87]

Therefore, recognition of the limitations of technology, and more broadly, scientific knowledge, is needed, especially in cases dealing with environmental justice and health issues. Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists’ and engineers’ new comprehension of their role. Such an approach to technology and science “[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions.”[88]

Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane, but it is all technology, and its exploitation is the foundation of all competitive advantage.

Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and it is what was used to transform the US into a superpower. It was not economic-based planning.

The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees,[89] some dolphin communities,[90] and crows.[91][92] Considering a more generic perspective of technology as ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.

The ability to make and use tools was once considered a defining characteristic of the genus Homo.[93] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion of the use of technology as unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[94] West African chimpanzees also use stone hammers and anvils for cracking nuts,[95] as do capuchin monkeys of Boa Vista, Brazil.[96]

Theories of technology often attempt to predict the future of technology based on the high technology and science of the time. As with all predictions of the future, however, technology’s future is uncertain.

In 2005, futurist Ray Kurzweil predicted that the future of technology would mainly consist of an overlapping “GNR Revolution” of genetics, nanotechnology and robotics, with robotics being the most important of the three.[97]
