
Category Archives: Technology

Technology – Northern Illinois University

Posted: August 27, 2016 at 7:13 pm

Technology Academic Programs

The Department of Technology is unique in the area of technical education. Our goal is to develop applied engineering and technical management skills in our students so that they excel in industry after graduation. This is accomplished with a mix of fundamental applied engineering knowledge, an understanding of the applications and tools needed in today's industry, and an understanding of basic and advanced theory.

As the name technology implies, our students learn the latest industry applications and equipment. Technology involves the application of science, mathematics, computers, and management skills to the solution of real-world problems.

The first and foremost departmental mission is providing a quality hands-on and theoretical education to our students at both the undergraduate and graduate levels. The department prepares our students so they can not only obtain employment after graduation but excel at the job from day one. The department offers undergraduate education in industrial management and technology, manufacturing engineering technology, and electrical engineering technology, and a graduate program in industrial management.

The departmental faculty is engaged in continual programmatic assessment and improvement, which allows us to educate the next generation of industrial technologists and engineering technologists who excel in the work environment. Within the departmental undergraduate programs, we offer a broad range of courses that provide both a sound introduction to and an in-depth analysis of given topics. Each program emphasizes theoretical/application instruction and laboratory instruction, skills that are demanded by industry.

The departmental faculty has taken the lead in developing a new Homeland Security Certificate, which develops the knowledge needed to obtain employment in this ever-changing and expanding security area. Our departmental Master of Industrial Management program includes course work in the fields of industrial management, occupational safety, and manufacturing.

Working with industry is another area at the forefront of the departmental mission. The NIU Department of Technology prides itself on industrial interaction. Many of the departmental senior projects are conducted with industrial involvement. In addition, students and faculty are involved with projects sponsored by industry and government. As is the case across the technology disciplines, our relationship with industry informs the concepts taught in our curricula. In addition, technology students are involved with interdisciplinary external projects, from the design of the Basic Utility Vehicle to a hovering platform for a science museum. It is this involvement with the faculty and other students which sets our program apart!

The department's vision is to offer engaged learning programs that promote strong partnerships with industry and foster a synergistic, interactive relationship between faculty and students. The Northern Illinois Department of Technology is committed to providing our students with an industry-focused technical education that emphasizes theoretical and applications-oriented approaches to problem solving. The departmental faculty will strive to provide technical programs which allow our students to excel in current and future industrial settings.

The Northern Illinois University Department of Technology is committed to technical education and programs which incorporate continuous improvement, student-centered engagement, and applied research that prepare our students to analyze, develop, and implement innovative and sustainable solutions for contemporary society.



Technology – Blue Sky Innovation – Chicago Tribune


The Cincinnati Zoo has deleted its Twitter account, a day after the zoo asked the internet to please stop making so many jokes and memes about Harambe, the gorilla who was shot and killed there in May after a child entered his enclosure. "We deactivated our Twitter accounts," zoo spokeswoman Michelle...

Pinterest is buying Instapaper, the app that lets you save an article to read later, as it works to understand the technology behind recommending stories for people. The acquisition of Instapaper, which has expertise in saving, curating and analyzing articles, aligns with Pinterest's goal to provide...

Backpack makers like to consider their products "enablers," company executives say, especially of active lifestyles. No matter where you go or what you do, a reliable backpack can keep your stuff safe and sound right there with you. Now, backpacks are enabling your smartphone addiction. Traditionally,...

Twitter is making a "quality filter" available to all users, allowing them to hide tweets that contain threats or appear to be automated or spammy. The feature was previously only available to users with "verified" accounts, which typically belong to celebrities, public figures or journalists, and who...

The option to hail a ride in a self-driving car, which was science fiction just a few years ago, will soon be available to Uber users in Pittsburgh, the first time the technology has been offered to the general public. Within weeks, the company announced Thursday, customers will be able to opt...

Twitter said Thursday it has suspended 360,000 accounts since mid-2015 for violating its policies banning the promotion of terrorism and violent extremism. The San Francisco-based company said in a blog post that it has also made progress in preventing users who were suspended from immediately...

Michael Udall earned three years' worth of tuition and a hefty medal when his college video game team won a national tournament in April. That haul marks the first of what the Arizona State junior hopes are many accomplishments in a storied e-sports career. But Udall covets one achievement not...

Today, the software economy makes hailing a ride incredibly easy. Tap a button, and a driver will arrive at your door to whisk you away. But in just a few years, the car may show up without any driver inside at all. Ford said Tuesday that it wants to be first to roll out a completely automated...

Google nailed email with the 2004 introduction of Gmail. Now it's the No. 1 form of electronic correspondence in the U.S. But as traditional email falls out of favor with a growing sliver of the population, Google has struggled to make its messaging tools relevant or introduce new ones that resonate...

Like many in Silicon Valley, technology entrepreneur Bryan Johnson sees a future in which intelligent machines can do things like drive cars on their own and anticipate our needs before we ask. What's uncommon is how Johnson wants to respond: Find a way to supercharge the human brain so that we...



History of technology – Wikipedia, the free encyclopedia


The history of technology is the history of the invention of tools and techniques, and in many ways parallels other facets of the history of humanity. Technology can refer to methods ranging from ones as simple as language and stone tools to the complex genetic engineering and information technology that have emerged since the 1980s.

New knowledge has enabled people to create new things, and conversely, many scientific endeavors are made possible by technologies which assist humans in travelling to places they could not previously reach, and by scientific instruments by which we study nature in more detail than our natural senses allow.

Since much of technology is applied science, technical history is connected to the history of science. Since technology uses resources, technical history is tightly connected to economic history. From those resources, technology produces other resources, including technological artifacts used in everyday life.

Technological change affects, and is affected by, a society's cultural traditions. It is a force for economic growth and a means to develop and project economic, political and military power.

Many sociologists and anthropologists have created social theories dealing with social and cultural evolution. Some, like Lewis H. Morgan, Leslie White, and Gerhard Lenski, have declared technological progress to be the primary factor driving the development of human civilization. Morgan's concept of three major stages of social evolution (savagery, barbarism, and civilization) can be divided by technological milestones, such as fire. White argued the measure by which to judge the evolution of culture was energy.[1]

For White, "the primary function of culture" is to "harness and control energy." White differentiates between five stages of human development: in the first, people use the energy of their own muscles; in the second, the energy of domesticated animals; in the third, the energy of plants (the agricultural revolution); in the fourth, the energy of natural resources such as coal, oil and gas; in the fifth, nuclear energy. White introduced the formula C = E × T, where C is a measure of cultural development, E is a measure of energy consumed, and T is a measure of the efficiency of the technical factors utilizing the energy. In his own words, "culture evolves as the amount of energy harnessed per capita per year is increased, or as the efficiency of the instrumental means of putting the energy to work is increased". Russian astronomer Nikolai Kardashev extrapolated this theory into the Kardashev scale, which categorizes the energy use of advanced civilizations.
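White's energy formula and Kardashev's extrapolation can be written compactly. The second formula below is Sagan's standard continuous interpolation of the Kardashev scale; it is not given in the text above and is added here for illustration:

```latex
% White's law, written here with C for cultural development (the usual
% statement of the law): culture evolves as per-capita energy harnessed
% (E) and the efficiency of the tools putting it to work (T) increase.
C = E \times T

% Sagan's interpolation of the Kardashev scale (an addition, not from
% the text above): a civilization harnessing P watts has rating
K = \frac{\log_{10} P - 6}{10}
% e.g. P = 10^{16}\,\mathrm{W} gives K = 1, roughly a Type I civilization.
```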

Lenski's approach focuses on information. The more information and knowledge (especially knowledge allowing the shaping of the natural environment) a given society has, the more advanced it is. He identifies four stages of human development, based on advances in the history of communication. In the first stage, information is passed by genes. In the second, when humans gain sentience, they can learn and pass information on through experience. In the third, humans start using signs and develop logic. In the fourth, they create symbols and develop language and writing. Advances in communications technology translate into advances in the economic and political systems, the distribution of wealth, social inequality and other spheres of social life. He also differentiates societies based on their level of technology, communication and economy.

In economics, productivity is a measure of technological progress. Productivity increases when fewer inputs (labor, energy, materials or land) are used in the production of a unit of output.[2] Another indicator of technological progress is the development of new products and services, which is necessary to offset the unemployment that would otherwise result as labor inputs are reduced. In developed countries productivity growth has been slowing since the late 1970s; however, productivity growth was higher in some economic sectors, such as manufacturing.[3] For example, employment in manufacturing in the United States declined from over 30% in the 1940s to just over 10% 70 years later. Similar changes occurred in other developed countries. This stage is referred to as post-industrial.
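The productivity measure described above can be sketched concretely. The figures below are hypothetical, chosen only to illustrate the definition (output per unit of labor input), not data from the article:

```python
def labor_productivity(output_units: float, labor_hours: float) -> float:
    """Labor productivity: output produced per hour of labor input."""
    return output_units / labor_hours

def growth_rate(before: float, after: float) -> float:
    """Fractional change between two productivity levels."""
    return (after - before) / before

# Hypothetical example: the same output is produced with fewer hours,
# so measured productivity rises even though output is unchanged.
p1 = labor_productivity(1000, 500)  # year 1: 2.0 units/hour
p2 = labor_productivity(1000, 400)  # year 2: 2.5 units/hour
print(round(growth_rate(p1, p2), 2))  # prints 0.25, i.e. 25% growth
```

This is why falling manufacturing employment alongside steady output, as in the U.S. figures above, registers as productivity growth.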

In the late 1970s, sociologists and anthropologists like Alvin Toffler (author of Future Shock), Daniel Bell and John Naisbitt developed theories of post-industrial society, arguing that the current era of industrial society is coming to an end and that services and information are becoming more important than industry and goods. Some extreme visions of the post-industrial society, especially in fiction, are strikingly similar to visions of near- and post-Singularity societies.

The following is a summary of the history of technology by time period and geography:


During most of the Paleolithic - the bulk of the Stone Age - all humans had a lifestyle which involved limited tools and few permanent settlements. The first major technologies were tied to survival, hunting, and food preparation. Stone tools and weapons, fire, and clothing were technological developments of major importance during this period.

Human ancestors have been using stone and other tools since long before the emergence of Homo sapiens approximately 200,000 years ago.[4] The earliest methods of stone tool making, known as the Oldowan "industry", date back to at least 2.3 million years ago,[5] with the earliest direct evidence of tool usage found in Ethiopia within the Great Rift Valley, dating back to 2.5 million years ago.[6] This era of stone tool use is called the Paleolithic, or "Old stone age", and spans all of human history up to the development of agriculture approximately 12,000 years ago.

To make a stone tool, a "core" of hard stone with specific flaking properties (such as flint) was struck with a hammerstone. This flaking produced sharp edges which could be used as tools, primarily in the form of choppers or scrapers.[7] These tools greatly aided the early humans in their hunter-gatherer lifestyle to perform a variety of tasks including butchering carcasses (and breaking bones to get at the marrow); chopping wood; cracking open nuts; skinning an animal for its hide; and even forming other tools out of softer materials such as bone and wood.[8]

The earliest stone tools were crude, little more than a fractured rock. In the Acheulian era, beginning approximately 1.65 million years ago, methods of working these stones into specific shapes, such as hand axes, emerged. The transitional period at the end of the Stone Age is described as Epipaleolithic or Mesolithic; the former term is generally used in areas with limited glacial impact.

The Middle Paleolithic, approximately 300,000 years ago, saw the introduction of the prepared-core technique, where multiple blades could be rapidly formed from a single core stone.[7] The Upper Paleolithic, beginning approximately 40,000 years ago, saw the introduction of pressure flaking, where a wood, bone, or antler punch could be used to shape a stone very finely.[9]

The later Stone Age, during which the rudiments of agricultural technology were developed, is called the Neolithic period. During this period, polished stone tools were made from a variety of hard rocks such as flint, jade, jadeite and greenstone, largely by working exposures as quarries, but later the valuable rocks were pursued by tunnelling underground, the first steps in mining technology. The polished axes were used for forest clearance and the establishment of crop farming, and were so effective as to remain in use when bronze and iron appeared.

Stone Age cultures developed music, and engaged in organized warfare. Stone Age humans developed ocean-worthy outrigger canoe technology, leading to migration across the Malay archipelago, across the Indian Ocean to Madagascar and also across the Pacific Ocean, which required knowledge of the ocean currents, weather patterns, sailing, and celestial navigation.

Although Paleolithic cultures left no written records, the shift from nomadic life to settlement and agriculture can be inferred from a range of archaeological evidence. Such evidence includes ancient tools,[10] cave paintings, and other prehistoric art, such as the Venus of Willendorf. Human remains also provide direct evidence, both through the examination of bones and the study of mummies. Scientists and historians have been able to form significant inferences about the lifestyle and culture of various prehistoric peoples, and especially their technology.

The Stone Age developed into the Bronze Age after the Neolithic Revolution. The Neolithic Revolution involved radical changes in agricultural technology which included development of agriculture, animal domestication, and the adoption of permanent settlements. These combined factors made possible the development of metal smelting, with copper and later bronze, an alloy of tin and copper, being the materials of choice, although polished stone tools continued to be used for a considerable time owing to their abundance compared with the less common metals (especially tin).

This technological trend apparently began in the Fertile Crescent, and spread outward over time. These developments were not, and still are not, universal. The three-age system does not accurately describe the technology history of groups outside of Eurasia, and does not apply at all in the case of some isolated populations, such as the Spinifex People, the Sentinelese, and various Amazonian tribes, which still make use of Stone Age technology, and have not developed agricultural or metal technology.

The Iron Age involved the adoption of iron smelting technology. Iron generally replaced bronze and made it possible to produce tools which were stronger, lighter and cheaper than bronze equivalents. In many Eurasian cultures, the Iron Age was the last major step before the development of written language, though again this was not universally the case. It was not possible to mass-manufacture steel because of the high furnace temperatures needed, but steel could be produced by forging bloomery iron to reduce the carbon content in a controllable way. Iron ores were much more widespread than either copper or tin. In Europe, large hill forts were built either as a refuge in time of war or sometimes as permanent settlements. In some cases, existing forts from the Bronze Age were expanded and enlarged. The pace of land clearance using the more effective iron axes increased, providing more farmland to support the growing population.

It was the growth of the ancient civilizations which produced the greatest advances in technology and engineering, advances which stimulated other societies to adopt new ways of living and governance.

The Egyptians invented and used many simple machines, such as the ramp, to aid construction processes. The Indus Valley Civilization, situated in a resource-rich area, is notable for its early application of city planning and sanitation technologies. Ancient India was also at the forefront of seafaring technology: a panel found at Mohenjo-daro depicts a sailing craft. Indian construction and architecture, called 'Vaastu Shastra', suggests a thorough understanding of materials engineering, hydrology, and sanitation.

The peoples of Mesopotamia (Sumerians, Assyrians, and Babylonians) have been credited with the invention of the wheel, but this is no longer certain. They lived in cities from c. 4000 BC,[11] and developed a sophisticated architecture in mud-brick and stone,[12] including the use of the true arch. The walls of Babylon were so massive that they were counted a Wonder of the World. They developed extensive water systems: canals for transport and irrigation in the alluvial south, and catchment systems stretching for tens of kilometres in the hilly north. Their palaces had sophisticated drainage systems.[13]

Writing was invented in Mesopotamia, using cuneiform script. Many records on clay tablets and stone inscriptions have survived. These civilizations were early adopters of bronze technologies, which they used for tools, weapons and monumental statuary. By 1200 BC they could cast objects 5 m long in a single piece. The Assyrian King Sennacherib (704–681 BC) claims to have invented automatic sluices and to have been the first to use water screws, of up to 30 tons weight, which were cast using two-part clay moulds rather than by the 'lost wax' process.[13] The Jerwan Aqueduct (c. 688 BC) is made with stone arches and lined with waterproof concrete.[14]

The Babylonian astronomical diaries spanned 800 years. They enabled meticulous astronomers to plot the motions of the planets and to predict eclipses.[15]

The Chinese made many first-known discoveries and developments. Major technological contributions from China include early seismological detectors, matches, paper, sliding calipers, the double-action piston pump, cast iron, the iron plough, the multi-tube seed drill, the wheelbarrow, the suspension bridge, the parachute, natural gas as fuel, the compass, the raised-relief map, the propeller, the crossbow, the South Pointing Chariot and gunpowder.

Other Chinese discoveries and inventions from the medieval period include block printing, movable type printing, phosphorescent paint, the endless power chain drive and the clock escapement mechanism. The solid-fuel rocket was invented in China about 1150, nearly 200 years after the invention of gunpowder (which acted as the rocket's fuel). Decades before the West's age of exploration, the Chinese emperors of the Ming Dynasty also sent large fleets on maritime voyages, some reaching Africa.

Greek and Hellenistic engineers were responsible for myriad inventions and improvements to existing technology. The Hellenistic period in particular saw a sharp increase in technological advancement, fostered by a climate of openness to new ideas, the blossoming of a mechanistic philosophy, and the establishment of the Library of Alexandria and its close association with the adjacent museion. In contrast to the typically anonymous inventors of earlier ages, ingenious minds such as Archimedes, Philo of Byzantium, Heron, Ctesibius, and Archytas remain known by name to posterity.

Ancient Greek innovations were particularly pronounced in mechanical technology, including the ground-breaking invention of the watermill, which constituted the first human-devised motive force not to rely on muscle power (besides the sail). Apart from their pioneering use of waterpower, Greek inventors were also the first to experiment with wind power (see Heron's windwheel) and even created the earliest steam engine (the aeolipile), opening up entirely new possibilities in harnessing natural forces whose full potential would not be exploited until the Industrial Revolution. The newly devised right-angled gear and screw would become particularly important to the operation of mechanical devices; with them, the age of mechanical devices had begun.

Ancient agriculture (as in any period prior to the modern age, the primary mode of production and subsistence) and its irrigation methods were considerably advanced by the invention and widespread application of a number of previously unknown water-lifting devices, such as the vertical water-wheel, the compartmented wheel, the water turbine, Archimedes' screw, the bucket-chain and pot-garland, the force pump, the suction pump, the double-action piston pump and quite possibly the chain pump.[16]

In music, the water organ, invented by Ctesibius and subsequently improved, constituted the earliest instance of a keyboard instrument. In time-keeping, the introduction of the inflow clepsydra and its mechanization by the dial and pointer, the application of a feedback system and the escapement mechanism far superseded the earlier outflow clepsydra.

The famous Antikythera mechanism, a kind of analog computer working with a differential gear, and the astrolabe both show great refinement in astronomical science.

Greek engineers were also the first to devise automata such as vending machines, suspended ink pots, automatic washstands and doors, primarily as toys, which nevertheless featured many new useful mechanisms such as the cam and gimbals.

In other fields, ancient Greek inventions include the catapult and the gastraphetes crossbow in warfare, hollow bronze-casting in metallurgy, the dioptra for surveying, and, in infrastructure, the lighthouse, central heating, the tunnel excavated from both ends by scientific calculations, the ship trackway, the dry dock and plumbing. In horizontal and vertical transport, great progress resulted from the invention of the crane, the winch, the wheelbarrow and the odometer.

Further newly created techniques and items were spiral staircases, the chain drive, sliding calipers and showers.

The Romans developed an intensive and sophisticated agriculture, expanded upon existing iron working technology, created laws providing for individual ownership, advanced stone masonry technology, advanced road-building (exceeded only in the 19th century), military engineering, civil engineering, spinning and weaving and several different machines like the Gallic reaper that helped to increase productivity in many sectors of the Roman economy. Roman engineers were the first to build monumental arches, amphitheatres, aqueducts, public baths, true arch bridges, harbours, reservoirs and dams, vaults and domes on a very large scale across their Empire. Notable Roman inventions include the book (Codex), glass blowing and concrete. Because Rome was located on a volcanic peninsula, with sand which contained suitable crystalline grains, the concrete which the Romans formulated was especially durable. Some of their buildings have lasted 2000 years, to the present day.

The engineering skills of the Inca and the Maya were great, even by today's standards. An example is their stonework, with pieces weighing upwards of one ton placed together so that not even a blade can fit between the cracks. Villages used irrigation canals and drainage systems, making agriculture very efficient. While some claim that the Incas were the first inventors of hydroponics, their agricultural technology, though advanced, was still soil-based. Though the Maya civilization had no metallurgy or wheel technology, it developed complex writing and astronomical systems and created sculptural works in stone and flint. Like the Inca, the Maya also had command of fairly advanced agricultural and construction technology. The main contribution of Aztec rule was a system of communications between the conquered cities. In Mesoamerica, without draft animals for transport (nor, as a result, wheeled vehicles), roads were designed for travel on foot, as in the Inca and Maya civilizations.

As earlier empires had done, the Muslim caliphates united in trade large areas that had previously traded little. The conquered sometimes paid lower taxes than in their earlier independence, and ideas spread even more easily than goods. Peace was more frequent than it had been. These conditions fostered improvements in agriculture and other technology as well as in sciences which largely adapted from earlier Greek, Roman and Persian empires, with improvements.

European technology in the Middle Ages may be best described as a symbiosis of traditio et innovatio. While medieval technology has long been depicted as a step backwards in the evolution of Western technology, sometimes willfully so by modern authors intent on denouncing the church as antagonistic to scientific progress (see e.g. Myth of the Flat Earth), a generation of medievalists around the American historian of science Lynn White stressed from the 1940s onwards the innovative character of many medieval techniques. Genuine medieval contributions include, for example, mechanical clocks, spectacles and vertical windmills. Medieval ingenuity was also displayed in the invention of seemingly inconspicuous items like the watermark or the functional button. In navigation, the foundation for the subsequent age of exploration was laid by the introduction of pintle-and-gudgeon rudders, lateen sails, the dry compass, the horseshoe and the astrolabe.

Significant advances were also made in military technology with the development of plate armour, steel crossbows, counterweight trebuchets and cannon. The Middle Ages are perhaps best known for their architectural heritage: While the invention of the rib vault and pointed arch gave rise to the high rising Gothic style, the ubiquitous medieval fortifications gave the era the almost proverbial title of the 'age of castles'.

Papermaking, a 2nd-century Chinese technology, was carried to the Middle East when a group of Chinese papermakers were captured in the 8th century.[17] Papermaking technology was spread to Europe by the Umayyad conquest of Hispania.[18] A paper mill was established in Sicily in the 12th century. In Europe the fiber to make pulp for making paper was obtained from linen and cotton rags. Lynn White credited the spinning wheel with increasing the supply of rags, which led to cheap paper, which was a factor in the development of printing.[19]

The era is marked by such profound technical advancements as linear perspective, double-shell domes and bastion fortresses. Notebooks of Renaissance artist-engineers such as Taccola and Leonardo da Vinci give a deep insight into the mechanical technology then known and applied. Architects and engineers were inspired by the structures of Ancient Rome, and men like Brunelleschi created the large dome of Florence Cathedral as a result. He was awarded one of the first patents ever issued to protect an ingenious crane he designed to raise the large masonry stones to the top of the structure. Military technology developed rapidly with the widespread use of the crossbow and ever more powerful artillery, as the city-states of Italy were usually in conflict with one another. Powerful families like the Medici were strong patrons of the arts and sciences. Renaissance science spawned the Scientific Revolution; science and technology began a cycle of mutual advancement.

The invention of the movable cast-metal-type printing press (c. 1441), whose pressing mechanism was adapted from an olive screw press, led to a tremendous increase in the number of books and the number of titles published.

An improved sailing ship, the nau or carrack, enabled the Age of Exploration and the European colonization of the Americas, epitomized by Francis Bacon's New Atlantis. Pioneers like Vasco da Gama, Cabral, Magellan and Christopher Columbus explored the world in search of new trade routes for their goods and contacts with Africa, India and China, to shorten the journey compared with traditional overland routes. They produced new maps and charts which enabled following mariners to explore further with greater confidence. Navigation was generally difficult, however, owing to the problem of longitude and the absence of accurate chronometers. European powers rediscovered the idea of the civil code, lost since the time of the Ancient Greeks.

The British Industrial Revolution is characterized by developments in the areas of textile manufacturing, mining, metallurgy and transport driven by the development of the steam engine. Above all else, the revolution was driven by cheap energy in the form of coal, produced in ever-increasing amounts from the abundant resources of Britain. Coal converted to coke gave the blast furnace and cast iron in much larger amounts than before, and a range of structures could be created, such as The Iron Bridge. Cheap coal meant that industry was no longer constrained by water resources driving the mills, although it continued as a valuable source of power. The steam engine helped drain the mines, so more coal reserves could be accessed, and the output of coal increased. The development of the high-pressure steam engine made locomotives possible, and a transport revolution followed.[20]

The 19th century saw astonishing developments in transportation, construction, manufacturing and communication technologies originating in Europe. The steam engine, which had existed since the early 18th century, was practically applied to both steamboat and railway transportation. The Liverpool and Manchester Railway, the first purpose-built railway line, opened in 1830, with Robert Stephenson's Rocket among its first working locomotives. Telegraphy also developed into a practical technology in the 19th century to help run the railways safely.

Other technologies were explored for the first time, including the incandescent light bulb. The invention of the incandescent light bulb had a profound effect on the workplace because factories could now have second- and third-shift workers. Manufacture of ships' pulley blocks by all-metal machines at the Portsmouth Block Mills instigated the age of mass production. Machine tools, used by engineers to manufacture parts, were developed in the first decade of the century, notably by Richard Roberts and Joseph Whitworth. The development of interchangeable parts through what is now called the American system of manufacturing began in the firearms industry at U.S. federal arsenals in the early 19th century, and became widely used by the end of the century.

Shoe production was mechanized and sewing machines introduced around the middle of the 19th century. Mass production of sewing machines and agricultural machinery such as reapers occurred in the mid to late 19th century. Bicycles were mass-produced beginning in the 1880s.

Steam-powered factories became widespread, although the conversion from water power to steam occurred earlier in England than in the U.S.

Steamships were eventually completely iron-clad, and played a role in the opening of Japan and China to trade with the West. The Second Industrial Revolution at the end of the 19th century saw rapid development of chemical, electrical, petroleum, and steel technologies connected with highly structured technology research.

The period from the last third of the 19th century until World War I is sometimes referred to as the Second Industrial Revolution.

20th century technology developed rapidly. Broad teaching and implementation of the scientific method, and increased research spending contributed to the advancement of modern science and technology. New technology improved communication and transport, thus spreading technical understanding.

Mass production brought automobiles and other high-tech goods to masses of consumers. Military research and development sped advances including electronic computing and jet engines. Radio and telephony improved greatly and spread to larger populations of users, though near-universal access would not be possible until mobile phones became affordable to developing world residents in the late 2000s and early 2010s.

Energy and engine technology improvements included nuclear power, developed after the Manhattan project which heralded the new Atomic Age. Rocket development led to long range missiles and the first space age that lasted from the 1950s with the launch of Sputnik to the mid-1980s.

Electrification spread rapidly in the 20th century. At the beginning of the century electric power was for the most part only available to wealthy people in a few major cities such as New York, London, Paris, and Newcastle upon Tyne, but by the time the World Wide Web was invented in 1990 an estimated 62 percent of homes worldwide had electric power, including about a third of households in the rural developing world.[21]

Birth control also became widespread during the 20th century. Electron microscopes were very powerful by the late 1970s and genetic theory and knowledge were expanding, leading to developments in genetic engineering.

The first "test tube baby" Louise Brown was born in 1978, which led to the first successful gestational surrogacy pregnancy in 1985 and the first pregnancy by ICSI in 1991, which is the implanting of a single sperm into an egg. Preimplantation genetic diagnosis was first performed in late 1989 and led to successful births in July 1990. These procedures have become relatively common and are changing the concept of what it means to be a parent.

The massive data analysis resources necessary for running transatlantic research programs such as the Human Genome Project and the Large Electron-Positron Collider led to a necessity for distributed communications, causing Internet protocols to be more widely adopted by researchers and also creating a justification for Tim Berners-Lee to create the World Wide Web.

Vaccination spread rapidly to the developing world from the 1980s onward due to many successful humanitarian initiatives, greatly reducing childhood mortality in many poor countries with limited medical resources.

The US National Academy of Engineering, by expert vote, established a ranking of the most important technological developments of the 20th century.[22]

In the early 21st century research is ongoing into quantum computers, gene therapy (introduced 1990), 3D printing (introduced 1981), nanotechnology (introduced 1985), bioengineering/biotechnology, nuclear technology, advanced materials (e.g., graphene), the scramjet and drones (along with railguns and high-energy laser beams for military uses), superconductivity, the memristor, and green technologies such as alternative fuels (e.g., fuel cells, self-driving electric & plug-in hybrid cars), augmented reality devices and wearable electronics, artificial intelligence, and more efficient & powerful LEDs, solar cells, integrated circuits, wireless power devices, engines, and batteries.

Perhaps the greatest research tool built in the 21st century is the Large Hadron Collider, the largest single machine ever built. The understanding of particle physics is expected to expand with better instruments, including larger particle accelerators such as the LHC[23] and better neutrino detectors. Dark matter is sought via underground detectors, and observatories like LIGO have started to detect gravitational waves.

Genetic engineering technology continues to improve, and the importance of epigenetics on development and inheritance has also become increasingly recognized.[24]

New spaceflight technology and spacecraft are also being developed, like the Orion and Dragon. New, more capable space telescopes are being designed. The International Space Station was completed in the 2000s, and NASA and ESA plan a manned mission to Mars in the 2030s. The Variable Specific Impulse Magnetoplasma Rocket (VASIMR) is an electro-magnetic thruster for spacecraft propulsion and is expected to be tested in 2015.

The first manned commercial spaceflight took place on June 21, 2004, when Mike Melvill crossed the boundary of space.

Originally posted here:

History of technology - Wikipedia, the free encyclopedia


NOAA Ocean Explorer: Technology

Posted: at 7:13 pm

Today's technologies allow us to explore the ocean in increasingly systematic, scientific, and noninvasive ways. With continuing scientific and technological advances, our ability to observe the ocean environment and its resident creatures is beginning to catch up with our imaginations, expanding our understanding and appreciation of this still largely unexplored realm.

This section of the Ocean Explorer website highlights the technologies that make today's explorations possible and the scientific achievements that result from these explorations. Technologies include platforms such as vessels and submersibles, observing systems and sensors, communication technologies, and diving technologies that transport us across ocean waters and into the depths, allowing us to scientifically examine, record, and analyze the mysteries of the ocean.

From onboard equipment that collects weather and ocean information to divers, submersibles, and other tools deployed from a ship, vessels are the most critical tool for scientists when it comes to exploring the ocean.

Darkness, cold, and crushing pressures have challenged the most experienced engineers to develop submersibles that descend to sea floor depths that are not safe for divers, allowing us to explore ocean depths firsthand, make detailed observations, and collect samples of unexplored ecosystems.

Scientists rely on an array of tools to collect weather and ocean observations such as water temperatures and salinities, the shape of the seafloor, and the speed of currents. Using tools to record and monitor water-column conditions and to collect samples for analysis allows scientists to enhance our understanding of the ocean.

Technologies that allow scientists to collaborate and transmit data more quickly and to a greater number of users are changing the way that we explore. From telepresence to shipboard computers, these technologies are increasing the pace, efficiency, and scope of ocean exploration.

When depths are not too great or conditions are not too unsafe, divers can descend into the water to explore the ocean realm. It is only through relatively recent advances in technology that this type of exploration has been possible.

These pages offer a comprehensive look at NOAA's history of ocean exploration through a series of chronological essays. Also included is a rich selection of historical quotations, arranged thematically, that capture the many advances, challenges, and misunderstandings through the years as both early and modern explorers struggled to study the mysterious ocean realm.

Read more here:

NOAA Ocean Explorer: Technology


Technology – The Atlantic

Posted: at 7:13 pm


Humans discovered two new planets this week: one remade by their efforts, the other four light-years away.

A new team improves on the old Earth at night technique.

The popularity of Alaska's Bear Cam is a testament to technology's influence on people's connections with nature.

Overused phrases such as "I hope you're well" and "Best" are more valuable than they seem.

Hackers can use artificial intelligence to mimic their targets' tweets and entice them to click on malicious links.

The billionaire justified his fight against Gawker with a misleading reference to privacy legislation.

A group calling itself the Shadow Brokers is auctioning off what it says are the agency's cyberweapons.

When officers categorize wallets or cellphones as evidence, getting them back can be nearly impossible, even if the owner isn't charged with a crime.

As a startup, it can sometimes be hard to navigate the shifting sands of race, gender, and power. But some mistakes are easy to avoid.

The clear cola's nostalgic relaunch harkens back to a time when the world's problems seemed simple.

Archaeologists say the sandstone etchings may be 400 years old.

Go here to see the original:

Technology - The Atlantic


Technology Org – Science and technology news

Posted: July 5, 2016 at 7:10 am

Posted Today

Cells contain thousands of messenger RNA molecules, which carry copies of DNA's genetic instructions to the rest of

Posted Today

Scientists from the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have discovered a possible secret to

Posted Today

Researchers have made two new scientific points with a set of experiments in which they induced people to

Posted Today

In a new report, dozens of scientists, health practitioners and children's health advocates are calling for renewed attention

Posted Today

Women who carry the BRCA1 gene mutation that dramatically increases their risk of breast and ovarian cancers are

Posted Today

Mice are one of the most commonly used laboratory organisms, widely used to study everything from autism to

Posted Today

Climate change is always in the spotlight of attention. Melting glaciers, spreading deserts and pollution are always considered

Posted Yesterday

If we continue going on the path we, as humanity, chose, someday we may have to move out


Read the rest here:

Technology Org - Science and technology news


Technology – Wikipedia, the free encyclopedia

Posted: June 19, 2016 at 3:35 am

This article is about the use and knowledge of techniques and processes for producing goods and services. For other uses, see Technology (disambiguation).

Technology ("science of craft", from Greek τέχνη, techne, "art, skill, cunning of hand"; and -λογία, -logia[3]) is the collection of techniques, skills, methods and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. Technology can be the knowledge of techniques, processes, etc. or it can be embedded in machines, computers, devices and factories, which can be operated by individuals without detailed knowledge of the workings of such things.

The human species' use of technology began with the conversion of natural resources into simple tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. The steady progress of military technology has brought weapons of ever-increasing destructive power, from clubs to nuclear weapons.

Technology has many effects. It has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of Earth's environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticise the pervasiveness of technology in the modern world, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

Until recently, it was believed that the development of technology was restricted only to human beings, but 21st century scientific studies indicate that other primates and certain dolphin communities have developed simple tools and passed their knowledge to other generations.

The use of the term "technology" has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts.[4] The term was often connected to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[5]

The term "technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term's meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology". In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, which usually translates both terms as "technology". By the 1930s, "technology" referred not only to the study of the industrial arts but to the industrial arts themselves.[6]

In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them."[7] Bain's definition remains common among scholars today, especially social scientists. But equally prominent is the definition of technology as applied science, especially among scientists and engineers, although most social scientists who study technology reject this definition.[8] More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge".[9] Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here".[10] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[11] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter."[12]

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[13] W. Brian Arthur defines technology in a similarly broad way as "a means to fulfill a human purpose".[14]

The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology", it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.

Technology can be viewed as an activity that forms or changes culture.[15] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture has, at its basis, the development of the Internet and the computer.[16] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

The distinction between science, engineering and technology is not always clear. Science is the reasoned investigation or study of natural phenomena, aimed at discovering enduring principles among elements of the phenomenal world by employing formal techniques such as the scientific method.[17] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability and safety.

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors, by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines, such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[18]

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, in the United States it was widely considered that technology was simply "applied science" and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush's treatise on postwar science policy, Science, the Endless Frontier: "New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature... This essential new knowledge can be obtained only through basic scientific research." In the late 1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology simply is a result of scientific research.[19][20]

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[21] with a brain mass approximately one third of modern humans.[22] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[23]

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 40,000 years ago, pressure flaking provided a way to make much finer work.

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[24] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1,000,000 BC;[25] scholarly consensus indicates that Homo erectus had controlled fire by between 500,000 BC and 400,000 BC.[26][27] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[28]

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380,000 BC, humans were constructing temporary wood huts.[29][30] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200,000 BC and into other continents, such as Eurasia.[31]

Man's technological ascent began in earnest in what is known as the Neolithic period ("New stone age"). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. Agriculture fed larger populations, and the transition to sedentism allowed simultaneously raising more children, as infants no longer needed to be carried, as nomadic ones must. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[32][33]

With this increase in population and availability of labor came an increase in labor specialization.[34] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[35]

Continuing improvements led to the furnace and bellows and provided the ability to smelt and forge native metals (naturally occurring in relatively pure form).[36] Gold, copper, silver, and lead were such early metals. The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 8000 BC).[37] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BC). The first use of iron alloys such as steel dates to around 1400 BC.

Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailboat.[38] The earliest record of a ship under sail is shown on an Egyptian pot dating back to 3200 BC.[39] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and 'catch' basins. Similarly, the early peoples of Mesopotamia, the Sumerians, learned to use the Tigris and Euphrates rivers for much the same purposes. But more extensive use of wind and water (and even human) power required another invention.

According to archaeologists, the wheel was invented around 4000 B.C. probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe. Estimates on when this may have occurred range from 5500 to 3000 B.C., with most experts putting it closer to 4000 B.C. The oldest artifacts with drawings that depict wheeled carts date from about 3000 B.C.; however, the wheel may have been in use for millennia before these drawings were made. There is also evidence from the same period for the use of the potter's wheel. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[40]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. Fast (rotary) potters' wheels enabled early mass production of pottery. But it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources.

Innovation continued through the Middle Ages with developments such as silk production, the horse collar and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.

Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy and transport, driven by the discovery of steam power. Technology took another step in a second industrial revolution with the harnessing of electricity to create such innovations as the electric motor, light bulb and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight, and advancements in medicine, chemistry, physics and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their daily bread. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the airplane and automobile.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized utilizing transistors and integrated circuits. Information technology subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments.

Complex manufacturing and construction techniques and organizations are needed to make and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education: their designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation and architecture.

Generally, technicism is a reliance or confidence in technology as a benefactor of society. Taken to extreme, technicism is the belief that humanity will ultimately be able to control the entirety of existence using technology. In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[41] connect these ideas to the abdication of religion as a higher moral authority.

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good. Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[42]

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become evermore technological at the cost of freedom and psychological health.

Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic reservations, about technology (see "The Question Concerning Technology"[43]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, "Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that 'in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.' Indeed, he promises that 'when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.'[44]" What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.[45]

Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics, for example Aldous Huxley's Brave New World and other writings, Anthony Burgess's A Clockwork Orange, and George Orwell's Nineteen Eighty-Four. And in Goethe's Faust, Faust's selling of his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction, such as those by Philip K. Dick and William Gibson, and films (e.g. Blade Runner, Ghost in the Shell) project highly ambivalent or cautionary attitudes toward technology's impact on human society and identity.

The late cultural critic Neil Postman distinguished tool-using societies from technological societies and, finally, what he called "technopolies," that is, societies that are dominated by the ideology of technological and scientific progress, to the exclusion or harm of other cultural practices, values and world-views.[46]

Darin Barney has written about technology's impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests, technology tends to foreclose ethical questions, including the question of what a good life consists in, because it already supplies an answer: a good life is one that includes the use of more and more technology.[47]

Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[48]

Another prominent critic of technology is Hubert Dreyfus, who has published the books On the Internet and What Computers Still Can't Do.

Another, more infamous anti-technological treatise is Industrial Society and Its Future, written by Theodore Kaczynski (aka The Unabomber) and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure.

The notion of appropriate technology, however, was developed in the 20th century (e.g., see the work of E. F. Schumacher and of Jacques Ellul) to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The eco-village movement emerged in part due to this concern.

This article focuses mainly on American concerns, though much of the discussion can reasonably be generalized to other Western countries.

The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. [...] What's the linkage between technology and this fundamental problem?

In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[49] questions the widespread idea that automation and, more broadly, technological advances have been the main contributors to this growing labor-market problem. His thesis appears to be a third way between optimism and skepticism: he takes a neutral view of the linkage between technology and American problems of unemployment and eroding wages.

He uses two main arguments to defend his point. First, although recent technological advances have cost some workers their jobs, the scientific evidence does not clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Automation threatens repetitive jobs, but higher-end jobs remain necessary because they complement technology, and manual jobs that "require flexibility, judgment and common sense"[50] remain hard for machines to replace. Second, studies have not established clear links between recent technological advances and the wage trends of the last decades.

Therefore, according to Jared Bernstein, instead of focusing on technology and its hypothetical influence on rising American unemployment and eroding wages, one needs to worry more about "bad policy that fails to offset the imbalances in demand, trade, income and opportunity."[50]

Thomas P. Hughes pointed out that because technology has been considered a key way to solve problems, we need to be aware of its complex and varied character in order to use it more efficiently.[51] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only some of them, or none of them technologies?

Technology is often considered too narrowly. According to Hughes, "Technology is a creative process involving human ingenuity."[51] This definition's emphasis on creativity avoids unbounded definitions that might mistakenly include cooking technologies, and it highlights the prominent role of humans and therefore their responsibility for the use of complex technological systems.

Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argued that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[51] For instance, Evgeny Morozov particularly challenges two concepts: Internet-centrism and solutionism.[52] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology, and especially thanks to the internet. In fact, technology intrinsically contains uncertainties and limitations; according to Alexis Madrigal's critique of Morozov's theory, ignoring them will lead to unexpected consequences that could eventually cause more damage than the problems they seek to address.[53] Benjamin Cohen and Gwen Ottinger have also discussed the multivalent effects of technology.[54]

Therefore, recognition of the limitations of technology, and more broadly of scientific knowledge, is needed, especially in cases dealing with environmental justice and health issues. Gwen Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists' and engineers' new comprehension of their role. Such an approach to technology and science "[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions".[55]

Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established, and the function can have high visibility or be significantly more mundane, but it is all technology, and its exploitation is the foundation of all competitive advantage.

Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM), and it is what was used to transform the US into a superpower. It was not economic-based planning.

In 1983 Project Socrates was initiated in the US intelligence community to determine the source of declining US economic and military competitiveness. Project Socrates concluded that technology exploitation is the foundation of all competitive advantage and that declining US competitiveness was from decision-making in the private and public sectors switching from technology exploitation (technology-based planning) to money exploitation (economic-based planning) at the end of World War II.

Project Socrates determined that to rebuild US competitiveness, decision making throughout the US had to readopt technology-based planning. Project Socrates also determined that countries such as China and India had continued executing technology-based planning while the US took its detour into economic-based planning, and as a result had considerably advanced the process and were using it to build themselves into superpowers. To rebuild US competitiveness, US decision-makers needed to adopt a form of technology-based planning far more advanced than that used by China and India.

Project Socrates determined that technology-based planning makes an evolutionary leap forward every few hundred years and the next evolutionary leap, the Automated Innovation Revolution, was poised to occur. In the Automated Innovation Revolution the process for determining how to acquire and utilize technology for a competitive advantage (which includes R&D) is automated so that it can be executed with unprecedented speed, efficiency and agility.

Project Socrates developed the means for automated innovation so that the US could lead the Automated Innovation Revolution in order to rebuild and maintain the country's economic competitiveness for many generations.[56][57][58]

The use of basic technology is also a feature of animal species other than humans. These include primates such as chimpanzees, some dolphin communities,[59][60] and crows.[61][62] Taking a more generic view of technology as the ethology of active environmental conditioning and control, we can also point to animal examples such as beavers and their dams, or bees and their honeycombs.

The ability to make and use tools was once considered a defining characteristic of the genus Homo.[63] However, the discovery of tool construction among chimpanzees and related primates has undermined the notion that the use of technology is unique to humans. For example, researchers have observed wild chimpanzees using tools for foraging: some of the tools used include leaf sponges, termite-fishing probes, pestles and levers.[64] West African chimpanzees also use stone hammers and anvils for cracking nuts,[65] as do the capuchin monkeys of Boa Vista, Brazil.[66]

Read the original here:

Technology - Wikipedia, the free encyclopedia


Information technology – Wikipedia, the free encyclopedia

Posted: March 27, 2016 at 12:44 am

Information technology (IT) is the application of computers to store, retrieve, transmit and manipulate data,[1] often in the context of a business or other enterprise.[2] IT is considered a subset of information and communications technology (ICT). In 2012, Zuppo proposed an ICT hierarchy in which the levels "contain some degree of commonality in that they are related to technologies that facilitate the transfer of information and various types of electronically mediated communications."[3] Business/IT was one level of this hierarchy.

The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several industries are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, engineering, healthcare, e-commerce and computer services.[4][a]

Humans have been storing, retrieving, manipulating and communicating information since the Sumerians in Mesopotamia developed writing in about 3000 BC,[6] but the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology (IT)." Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.[7]

Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC - 1450 AD), mechanical (1450-1840), electromechanical (1840-1940), and electronic (1940-present),[6] along with, more recently, IT as a service. This article focuses on the most recent period (electronic), which began in about 1940.

Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick.[8] The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered to be the earliest known mechanical analog computer, and the earliest known geared mechanism. Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed.

Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. Colossus, developed during the Second World War to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. The first recognisably modern electronic digital stored-program computer was the Manchester Small-Scale Experimental Machine (SSEM), which ran its first program on 21 June 1948.[13]

The development of transistors in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison the first transistorised computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.[14]

Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete. Electronic data storage, which is used in modern computers, dates from World War II, when a form of delay line memory was developed to remove the clutter from radar signals, the first practical application of which was the mercury delay line. The first random-access digital storage device was the Williams tube, based on a standard cathode ray tube,[17] but the information stored in it and delay line memory was volatile in that it had to be continuously refreshed, and thus was lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932[18] and used in the Ferranti Mark 1, the world's first commercially available general-purpose electronic computer.[19]

IBM introduced the first hard disk drive in 1956, as a component of their 305 RAMAC computer system. Most digital data today is still stored magnetically on hard disks, or optically on media such as CD-ROMs. Until 2002 most information was stored on analog devices, but that year digital storage capacity exceeded analog for the first time. As of 2007 almost 94% of the data stored worldwide was held digitally:[22] 52% on hard disks, 28% on optical devices and 11% on digital magnetic tape. It has been estimated that the worldwide capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007,[23] doubling roughly every 3 years.[24]
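The growth figures above can be checked with a short calculation. The sketch below, plain Python using only the numbers quoted in the text, recovers the "doubling roughly every 3 years" rate:

```python
import math

# Figures from the text: worldwide storage capacity grew from under
# 3 exabytes in 1986 to 295 exabytes in 2007 (a span of 21 years).
start_eb, end_eb = 3.0, 295.0
years = 2007 - 1986

doublings = math.log2(end_eb / start_eb)  # how many times capacity doubled
years_per_doubling = years / doublings

print(round(doublings, 1))           # ≈ 6.6 doublings in 21 years
print(round(years_per_doubling, 1))  # ≈ 3.2 years per doubling
```

Starting from "less than 3 exabytes" would only shorten the computed doubling time, so roughly three years per doubling is consistent with the cited estimate.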

Database management systems emerged in the 1960s to address the problem of storing and retrieving large amounts of data accurately and quickly. One of the earliest such systems was IBM's Information Management System (IMS), which is still widely deployed more than 40 years later.[26] IMS stores data hierarchically, but in the 1970s Ted Codd proposed an alternative relational storage model based on set theory and predicate logic and the familiar concepts of tables, rows and columns. The first commercially available relational database management system (RDBMS) was released by Oracle in 1980.

All database management systems consist of a number of components that together allow the data they store to be accessed simultaneously by many users while maintaining its integrity. A characteristic of all databases is that the structure of the data they contain is defined and stored separately from the data itself, in a database schema.
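A minimal sketch of this schema/data separation, using Python's built-in sqlite3 module (the table and column names are illustrative, not drawn from the text):

```python
import sqlite3

# An in-memory database: the schema is defined once, apart from the rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO employee (id, name) VALUES (1, 'Ada')")

# The database describes its own structure without touching the stored
# data: sqlite_master holds the schema, separate from the employee rows.
schema = conn.execute(
    "SELECT sql FROM sqlite_master WHERE name = 'employee'"
).fetchone()[0]
print(schema)  # the CREATE TABLE statement, not the data
```

Multiple connections to the same database file could read and write rows concurrently while the schema remains the single shared definition of their structure.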

The extensible markup language (XML) has become a popular format for data representation in recent years. Although XML data can be stored in normal file systems, it is commonly held in relational databases to take advantage of their "robust implementation verified by years of both theoretical and practical effort". As an evolution of the Standard Generalized Markup Language (SGML), XML's text-based structure offers the advantage of being both machine and human-readable.

The relational database model introduced a programming-language independent Structured Query Language (SQL), based on relational algebra.
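As an illustration of how a SQL query maps onto relational algebra, the sketch below (sqlite3 again, with a hypothetical product table) performs a selection (the WHERE clause, sigma in relational algebra) followed by a projection (the column list, pi):

```python
import sqlite3

# Hypothetical relation "product" for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (name TEXT, price REAL)")
conn.executemany("INSERT INTO product VALUES (?, ?)",
                 [("widget", 2.5), ("gadget", 9.0), ("gizmo", 4.0)])

# pi_name( sigma_{price < 5}( product ) )
rows = conn.execute(
    "SELECT name FROM product WHERE price < 5 ORDER BY name"
).fetchall()
print(rows)  # [('gizmo',), ('widget',)]
```

The query is declarative: it names the resulting relation in algebraic terms, and the database engine is free to choose how to compute it.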

The terms "data" and "information" are not synonymous. Anything stored is data, but it only becomes information when it is organized and presented meaningfully. Most of the world's digital data is unstructured, and stored in a variety of different physical formats[b] even within a single organization. Data warehouses began to be developed in the 1980s to integrate these disparate stores. They typically contain data extracted from various sources, including external sources such as the Internet, organized in such a way as to facilitate decision support systems (DSS).

Data transmission has three aspects: transmission, propagation, and reception. It can be broadly categorized as broadcasting, in which information is transmitted unidirectionally downstream, or telecommunications, with bidirectional upstream and downstream channels.[23]

XML has been increasingly employed as a means of data interchange since the early 2000s, particularly for machine-oriented interactions such as those involved in web-oriented protocols such as SOAP, describing "data-in-transit rather than... data-at-rest". One of the challenges of such usage is converting data from relational databases into XML Document Object Model (DOM) structures.
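One minimal sketch of the relational-to-DOM conversion mentioned here, using Python's standard sqlite3 and xml.dom.minidom modules with a hypothetical customer table:

```python
import sqlite3
import xml.dom.minidom as minidom

# Illustrative relational source: a small customer table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(1, "Ada"), (2, "Grace")])

# Each row becomes an element in the DOM tree: columns map to an
# attribute (id) and a text node (name).
doc = minidom.Document()
root = doc.createElement("customers")
doc.appendChild(root)
for cid, name in conn.execute("SELECT id, name FROM customer"):
    elem = doc.createElement("customer")
    elem.setAttribute("id", str(cid))
    elem.appendChild(doc.createTextNode(name))
    root.appendChild(elem)

print(doc.documentElement.toxml())
```

The resulting document is "data-in-transit": a serialisable snapshot of the rows, suitable for exchange over a protocol such as SOAP, rather than the at-rest relational storage it was drawn from.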

Hilbert and Lopez identify the exponential pace of technological change (a kind of Moore's law): machines' application-specific capacity to compute information per capita roughly doubled every 14 months between 1986 and 2007; the per capita capacity of the world's general-purpose computers doubled every 18 months during the same two decades; the global telecommunication capacity per capita doubled every 34 months; the world's storage capacity per capita required roughly 40 months to double (every 3 years); and per capita broadcast information has doubled every 12.3 years.[23]

Massive amounts of data are stored worldwide every day, but unless it can be analysed and presented effectively it essentially resides in what have been called data tombs: "data archives that are seldom visited". To address that issue, the field of data mining "the process of discovering interesting patterns and knowledge from large amounts of data" emerged in the late 1980s.

In an academic context, the Association for Computing Machinery defines IT as "undergraduate degree programs that prepare students to meet the computer technology needs of business, government, healthcare, schools, and other kinds of organizations.... IT specialists assume responsibility for selecting hardware and software products appropriate for an organization, integrating those products with organizational needs and infrastructure, and installing, customizing, and maintaining those applications for the organization's computer users."[40]

In a business context, the Information Technology Association of America has defined information technology as "the study, design, development, application, implementation, support or management of computer-based information systems". The responsibilities of those working in the field include network administration, software development and installation, and the planning and management of an organization's technology life cycle, by which hardware and software are maintained, upgraded and replaced.

The business value of information technology lies in the automation of business processes, provision of information for decision making, connecting businesses with their customers, and the provision of productivity tools to increase efficiency.

Employment distribution of computer systems design and related services, 2011[43]

Employment in the computer systems and design related services industry, in thousands, 1990-2011[43]

Occupational growth and wages in computer systems design and related services, 2010-2020[43]

Projected percent change in employment in selected occupations in computer systems design and related services, 2010-2020[43]

Projected average annual percent change in output and employment in selected industries, 2010-2020[43]

The field of information ethics was established by mathematician Norbert Wiener in the 1940s, and a number of ethical issues are associated with the use of information technology.


Continue reading here:

Information technology - Wikipedia, the free encyclopedia


Technology News | Reuters.com

Posted: at 12:44 am

Verizon Communications Inc said an attacker had exploited a security vulnerability on its enterprise client portal to steal contact information of a number of customers.

26 Mar 2016

NEW YORK Apple Inc said the U.S. Justice Department's new attempts to unlock an iPhone used by one of the San Bernardino shooters without the tech giant's help could eliminate the government's need for its assistance in a similar dispute in New York.

BOAO, China Ride hailing app company Uber Technologies Inc [UBER.UL] is generating more than $1 billion in profit a year in its top 30 cities globally, and partly using that money to bankroll its expansion in China, Chief Executive Travis Kalanick said in an interview.

25 Mar 2016

SHANGHAI China's commerce ministry said on Friday it hoped telecom equipment maker ZTE Corp could be removed as soon as possible from a U.S. list of companies slapped with tough export restrictions this month.

24 Mar 2016

Netflix Inc said it had been lowering the quality of its video for customers watching its service on wireless networks such as AT&T and Verizon Communications for more than five years, The Wall Street Journal reported on Thursday.

24 Mar 2016

View original post here:

Technology News | Reuters.com


Technology Synonyms, Technology Antonyms | Thesaurus.com

Posted: at 12:44 am

Try to tell this to the champions of technology who predicted the paperless office and who now predict the networked world.

Technology at this level uncouples the past from the present.

Understanding the degree of necessity of the technology in the first place is where the focus should be.

The Ministry of Science and Technology had sent up a lengthy one.

He couldn't duplicate the weapon; the technology required lies so far beyond this age.

Music is acquiring a technology as confusing and as extensive as bacteriology.

Patents may be consulted evenings and Sundays by arrangement with the technology librarian, Room 115.

From this time the success of schools of technology was assured.

Technology is no cure for this paranoia; in fact, it may enhance the paranoia: it turns us into prisoners of our own device.

Down the stretch Allan gained on a Technology runner, but failed to pass him.

See the original post:

Technology Synonyms, Technology Antonyms | Thesaurus.com

