Technology Types & Uses | What is Technology? – Study.com

Though mechanical technology is simple, it has allowed for extremely important advancements in the human experience. The wheel let early humans transport heavy materials faster and more easily. The first wheel was found in ancient Mesopotamia and is thought to have been used as a potter's wheel for throwing clay pots. Ancient Egypt and India saw the invention of the shaduf, a hand-operated lever and fulcrum used to lift water to irrigate crops. The ancient Greek philosopher Archimedes was the first to record simple machines, including pulleys, levers, and inclined planes, all used to lessen the work needed to accomplish a task. During the Industrial Revolution, mechanical principles were used in the invention of engines, which use a system of pistons to generate the large amounts of force needed to move trains and power factories. In modern times, mechanical technology accomplishes all sorts of engineering tasks, such as running our cars, lifting heavy objects, and transporting goods.
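To make the lever principle concrete, here is a minimal Python sketch applying the law of the lever (effort × effort arm = load × load arm) to a shaduf-style water lift; the arm lengths and load are invented for illustration, not historical measurements:

```python
# Law of the lever: effort * effort_arm = load * load_arm.
# Numbers below are assumed for illustration, not historical data.

def effort_needed(load_n: float, load_arm_m: float, effort_arm_m: float) -> float:
    """Force on the long arm needed to balance a load on the short arm."""
    return load_n * load_arm_m / effort_arm_m

bucket_weight_n = 20 * 9.81  # a ~20 kg bucket of water, in newtons
effort = effort_needed(bucket_weight_n, load_arm_m=1.0, effort_arm_m=3.0)
print(f"Effort needed: {effort:.1f} N vs. {bucket_weight_n:.1f} N lifted directly")
# Mechanical advantage = effort_arm / load_arm = 3, so one third of the force suffices.
```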

Medical technology is defined as the application of scientific principles to develop solutions to health problems, prevent or delay the onset of disease, and promote the overall health of humans. Medical technology is used to prevent, diagnose, treat, and monitor symptoms and diseases. This includes the production of drugs and medications, and the use of X-rays, MRIs, and ultrasounds, tools used to look inside the body for ailments. Ventilators are another type of medical technology, used to assist people with breathing. Medical technology can also include the equipment invented and used specifically for medical practice, from stethoscopes to scalpels.

An MRI scans the internal organs of a person, and helps doctors diagnose ailments.

Communications technology is the application of scientific knowledge to communicate. This includes everything from telegrams to landlines to cell phones. The internet is considered a communications technology because it is a system that transmits information in countless ways. Communications technology also includes systems that aid in the effectiveness and efficiency of communication, such as communication satellites.

Electronic technology is the application of scientific understanding of electricity to do work and perform tasks. We think of electronic technology as the many electronic devices, known as electronics, used in our modern world, such as tablets, laptops, and phones, all with internal computers that run on electricity. However, electronic technology includes any machine that runs on electricity. This includes washing machines, lights, and toasters. Electronic technology uses a system of electrical circuits to achieve a goal. This system can be simple, like that of a light circuit, or can be very complex, like that of a computer. Regardless of the system's complexity, if it uses electricity, it is considered electronic technology.
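As a minimal sketch of the "simple light circuit" case, the following Python snippet applies Ohm's law (I = V/R) and the power relation (P = V·I); the voltage and resistance values are assumptions chosen only for illustration:

```python
# Ohm's law for a simple light circuit: I = V / R, power P = V * I.
# Component values are assumed for illustration.

supply_voltage = 12.0   # volts
bulb_resistance = 24.0  # ohms

current = supply_voltage / bulb_resistance  # amperes
power = supply_voltage * current            # watts

print(f"Current through the bulb: {current:.2f} A")          # 0.50 A
print(f"Power dissipated as light and heat: {power:.1f} W")  # 6.0 W
```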

A Nintendo 64 controller circuit board shows the intricate ways electricity powers a computer.

Industrial and manufacturing technology is the application of scientific principles to make the production of objects faster, safer, and more efficient. It is a large field that includes many other forms of technology, including electrical and mechanical technologies. During the Industrial Revolution of the 1700s and 1800s, this type of technology revolutionized how humans travel, eat, and live. The invention of engines enabled factories to build machines that mass-produced objects. Engines also enabled those products to be shipped like never before, making a huge variety of products available to people all over the world. The advancement of industrial and manufacturing technologies also revolutionized war, making the production of weapons faster and cheaper. Through the 1940s, '50s, and '60s, manufacturing technologies brought the world fast food, paper plates, and affordable housing.

A woman working in a 1914 British wartime factory.

Forms of technology have been developed and used for as long as humans have existed. Some of the earliest tools used by humans included sharpened stones used as arrowheads, axes, and cutting tools, which can be considered mechanical technology. With the invention of the wheel, early humans were able to make more sophisticated pottery, as well as lighten the load through the use of wheelbarrows. Boats, which were invented and used before the wheel, utilized all sorts of technologies, from navigational tools to pulley systems and wind power. Plumbing and irrigation technologies allowed water to be transported for vital human needs, from watering crops and providing drinking water to disposing of waste. Building structures became easier as technology advanced, which led to some of the most impressive structures ever built by humans, including Stonehenge and the Egyptian pyramids. As technologies advanced, so too could civilization. More technology meant more solutions, which meant bigger cities, more trade, and the expansion of civilization.

What does technology mean for the future? If the past is any indication, technology is the key to solving the world's problems. Technology has revolutionized ways of doing things, improving human lives and advancing industries. The advancement of technology has also had harmful effects on both people and the environment, particularly through the overuse of fossil fuels as an energy source for so many of the technologies of modern life. Though technology has harmed the planet through our overuse of its resources, technology could also be the answer to many of the climate problems we face today. Technology has the power to solve modern problems just as it solved the problems humans faced in the past. With the world's greatest minds seeking solutions to the problems we face today, and an increasing number of people recognizing those problems, technology is sure to find ways of improving the quality of life for humans, as well as the overall health of the planet and all who live here.

Technology has advanced human society for as long as our species has been in existence. Human life is full of problems that need solving, and technology provides innovative solutions to those problems that reduce effort and increase efficiency. Technology is the use of scientific knowledge for practical purposes that benefit our everyday lives, as well as the industries created by humans. There are several types of technologies. Mechanical technology includes the use of machines to do work, and utilizes many simple machines such as wheels, levers, pulleys, and even cogs and gears that all function together to accomplish a task. Machines that move are mechanical technology. Medical technology is another type, and includes ventilators, medication, and MRIs. Communication technology is the third type, and includes all types of tools used to communicate, from telegrams to telephones. Electronic technology includes technology that requires electricity, from dishwashers to blenders to various electronic devices. Finally, industrial and manufacturing technologies advance ways of producing objects used by people around the world.

Technology – Wikipedia

Use of knowledge for practical goals

Technology is the application of knowledge for achieving practical goals in a reproducible way.[1] The word technology can also mean the products resulting from such efforts,[2]:117[3] including both tangible tools such as utensils or machines, and intangible ones such as software. Technology plays a critical role in science, engineering, and everyday life.

Technological advancements have led to significant changes in society. The earliest known technology is the stone tool, used during prehistoric times, followed by the control of fire, which contributed to the growth of the human brain and the development of language during the Ice Age. The invention of the wheel in the Bronze Age allowed greater travel and the creation of more complex machines. More recent technological inventions, including the printing press, telephone, and the Internet, have lowered barriers to communication and ushered in the knowledge economy.

While technology contributes to economic development and improves human prosperity, it can also have negative impacts like pollution or resource depletion, or may cause social harms like technological unemployment resulting from automation. As a result, there are ongoing philosophical and political debates about the role and use of technology, the ethics of technology, and ways to mitigate potential downsides.

Technology is a term dating back to the early 17th century that meant 'systematic treatment' (from Greek technologia, from technē 'art, craft' and -logia 'study, knowledge').[4][5] It is predated in use by the Ancient Greek technē, used to mean 'knowledge of how to make things', which encompassed activities like architecture.[6]

In the 19th century, continental Europeans began using the terms Technik (German) or technique (French) to refer to a 'way of doing', which included all technical arts, such as dancing, navigation, or printing, whether or not they required tools or instruments.[2]:114–115 At the time, Technologie (German and French) referred either to the academic discipline studying the "methods of arts and crafts", or to the political discipline "intended to legislate on the functions of the arts and crafts."[2]:117 Since the distinction between Technik and Technologie is absent in English, both were translated as technology. The term was previously uncommon in English and mostly referred to the academic discipline, as in the Massachusetts Institute of Technology.[7]

In the 20th century, as a result of scientific progress and the Second Industrial Revolution, technology stopped being considered a distinct academic discipline and took on its current-day meaning: the systemic use of knowledge to practical ends.[2]:119

Tools were initially developed by hominids through observation and trial and error.[8] Around 2 Mya (million years ago), they learned to make the first stone tools by hammering flakes off a pebble, forming a sharp hand axe.[9] This practice was refined 75 kya (thousand years ago) into pressure flaking, enabling much finer work.[10]

The discovery of fire was described by Charles Darwin as "possibly the greatest ever made by man".[11] Archeological, dietary, and social evidence point to "continuous [human] fire-use" at least 1.5 Mya.[12] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[13] The cooking hypothesis proposes that the ability to cook promoted an increase in hominid brain size, though some researchers find the evidence inconclusive.[14] Archeological evidence of hearths was dated to 790 kya; researchers believe this is likely to have intensified human socialization and may have contributed to the emergence of language.[15][16]

Other technological advances made during the Paleolithic era include clothing and shelter.[17] No consensus exists on the approximate time of adoption of either technology, but archeologists have found evidence of clothing 90–120 kya[18] and shelter 450 kya.[17] As the Paleolithic era progressed, dwellings became more sophisticated and elaborate; as early as 380 kya, humans were constructing temporary wood huts.[19][20] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa around 200 kya, initially moving to Eurasia.[21][22][23]

The Neolithic Revolution (or First Agricultural Revolution) brought about an acceleration of technological innovation, and a consequent increase in social complexity.[24] The invention of the polished stone axe was a major advance that allowed large-scale forest clearance and farming.[25] This use of polished stone axes increased greatly in the Neolithic but was originally used in the preceding Mesolithic in some areas such as Ireland.[26] Agriculture fed larger populations, and the transition to sedentism allowed for the simultaneous raising of more children, as infants no longer needed to be carried around by nomads. Additionally, children could contribute labor to the raising of crops more readily than they could participate in hunter-gatherer activities.[27][28]

With this increase in population and availability of labor came an increase in labor specialization.[29] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[30]

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead native metals found in relatively pure form in nature.[31] The advantages of copper tools over stone, bone and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 kya).[32] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4,000 BCE). The first use of iron alloys such as steel dates to around 1,800 BCE.[33][34]

After harnessing fire, humans discovered other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to around 7,000 BCE.[35] From prehistoric times, Egyptians likely used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and "catch" basins.[36] The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[37]

Archaeologists estimate that the wheel was invented independently and concurrently in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture), and Central Europe.[38] Time estimates range from 5,500 to 3,000 BCE with most experts putting it closer to 4,000 BCE.[39] The oldest artifacts with drawings depicting wheeled carts date from about 3,500 BCE.[40] More recently, the oldest-known wooden wheel in the world was found in the Ljubljana Marsh of Slovenia.[41]

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used a potter's wheel and may have invented it.[42] A stone pottery wheel found in the city-state of Ur dates to around 3,429 BCE,[43] and even older fragments of wheel-thrown pottery have been found in the same area.[43] Fast (rotary) potters' wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[44] and were first used in Mesopotamia and Iran in around 3,000 BCE.[44]

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4,000 BCE,[45] and timber roads leading through the swamps of Glastonbury, England, dating to around the same period.[45] The first long-distance road, which came into use around 3,500 BCE,[45] spanned 2,400 km from the Persian Gulf to the Mediterranean Sea,[45] but was not paved and was only partially maintained.[45] In around 2,000 BCE, the Minoans on the Greek island of Crete built a 50 km road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[45] Unlike the earlier road, the Minoan road was completely paved.[45]

Ancient Minoan private homes had running water.[47] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[47][48] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[47] The ancient Romans had many public flush toilets,[48] which emptied into an extensive sewage system.[48] The primary sewer in Rome was the Cloaca Maxima;[48] construction began on it in the sixth century BCE and it is still in use today.[48]

The ancient Romans also had a complex system of aqueducts,[46] which were used to transport water across long distances.[46] The first Roman aqueduct was built in 312 BCE.[46] The eleventh and final ancient Roman aqueduct was built in 226 CE.[46] Put together, the Roman aqueducts extended over 450 km,[46] but less than 70 km of this was above ground and supported by arches.[46]

Innovations continued through the Middle Ages with the introduction of silk production (in Asia and later Europe), the horse collar, and horseshoes. Simple machines (such as the lever, the screw, and the pulley) were combined into more complicated tools, such as the wheelbarrow, windmills, and clocks.[49] A system of universities developed and spread scientific ideas and practices, including Oxford and Cambridge.[50]

The Renaissance era produced many innovations, including the introduction of the movable type printing press to Europe, which facilitated the communication of knowledge. Technology became increasingly influenced by science, beginning a cycle of mutual advancement.[51]

Starting in the United Kingdom in the 18th century, the discovery of steam power set off the Industrial Revolution, which saw wide-ranging technological discoveries, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, and the widespread application of the factory system.[52] This was followed a century later by the Second Industrial Revolution which led to rapid scientific discovery, standardization, and mass production. New technologies were developed, including sewage systems, electricity, light bulbs, electric motors, railroads, automobiles, and airplanes. These technological advances led to significant developments in medicine, chemistry, physics, and engineering.[53] They were accompanied by consequential social change, with the introduction of skyscrapers accompanied by rapid urbanization.[54] Communication improved with the invention of the telegraph, the telephone, the radio, and television.[55]

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission in the Atomic Age led to both nuclear weapons and nuclear power. Computers were invented and later shifted from analog to digital in the Digital Revolution. Information technology, particularly optical fiber and optical amplifiers, led to the birth of the Internet, which ushered in the Information Age. The Space Age began with the launch of Sputnik 1 in 1957, followed by crewed missions to the Moon in the 1960s. Organized efforts to search for extraterrestrial intelligence have used radio telescopes to detect signs of technology use, or technosignatures, given off by alien civilizations. In medicine, new technologies were developed for diagnosis (CT, PET, and MRI scanning), treatment (like the dialysis machine, defibrillator, pacemaker, and a wide array of new pharmaceutical drugs), and research (like interferon cloning and DNA microarrays).[56]

Complex manufacturing and construction techniques and organizations are needed to make and maintain more modern technologies, and entire industries have arisen to develop succeeding generations of increasingly complex tools. Modern technology increasingly relies on training and education; designers, builders, maintainers, and users often require sophisticated general and specific training.[57] Moreover, these technologies have become so complex that entire fields have developed to support them, including engineering, medicine, and computer science; and other fields have become more complex, such as construction, transportation, and architecture.

Technological change is the largest cause of long-term economic growth.[58][59] Throughout human history, energy production was the main constraint on economic development, and new technologies allowed humans to significantly increase the amount of available energy. First came fire, which made edible a wider variety of foods, and made it less physically demanding to digest them. Fire also enabled smelting, and the use of tin, copper, and iron tools, used for hunting or tradesmanship. Then came the agricultural revolution: humans no longer needed to hunt or gather to survive, and began to settle in towns and cities, forming more complex societies, with militaries and more organized forms of religion.[60]

Technologies have contributed to human welfare through increased prosperity, improved comfort and quality of life, and medical progress, but they can also disrupt existing social hierarchies, cause pollution, and harm individuals or groups.

Recent years have brought about a rise in social media's cultural prominence, with potential repercussions on democracy, and economic and social life. Early on, the internet was seen as a "liberation technology" that would democratize knowledge, improve access to education, and promote democracy. Modern research has turned to investigate the internet's downsides, including disinformation, polarization, hate speech, and propaganda.[61]

Since the 1970s, technology's impact on the environment has been criticized, leading to a surge in investment in solar, wind, and other forms of clean energy.

Since the invention of the wheel, technologies have helped increase humans' economic output. Past automation has both substituted for and complemented labor; machines replaced humans at some lower-paying jobs (for example in agriculture), but this was compensated by the creation of new, higher-paying jobs.[62] Studies have found that computers did not create significant net technological unemployment.[63] Because artificial intelligence is far more capable than computers and is still in its infancy, it is not known whether it will follow the same trend; the question has been debated at length among economists and policymakers. A 2017 survey found no clear consensus among economists on whether AI would increase long-term unemployment.[64] According to the World Economic Forum's "The Future of Jobs Report 2020", AI is predicted to replace 85 million jobs worldwide, and create 97 million new jobs, by 2025.[65][66] A study of the U.S. from 1990 to 2007 by MIT economist Daron Acemoglu showed that adding one robot for every 1,000 workers decreased the employment-to-population ratio by 0.2%, or about 3.3 workers, and lowered wages by 0.42%.[67][68] Concerns about technology replacing human labor, however, are long-standing. As US president Lyndon Johnson said in 1964 upon signing the National Commission on Technology, Automation, and Economic Progress bill: "Technology is creating both new opportunities and new obligations for us, opportunity for greater productivity and progress; obligation to be sure that no workingman, no family must pay an unjust price for progress."[69][70][71][72][73]
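To put the cited Acemoglu figures on a concrete scale, here is a back-of-the-envelope Python sketch; the size of the hypothetical labor market is invented, and the per-robot effects are simply the numbers quoted above:

```python
# Back-of-the-envelope use of the cited estimates: one robot per 1,000 workers
# displaces about 3.3 workers and lowers wages by about 0.42%.
# The labor-market size is a made-up illustration.

workers = 1_000_000
robots_added = workers // 1_000          # one robot per 1,000 workers
workers_displaced = robots_added * 3.3   # ~3.3 workers per robot

print(f"Robots added: {robots_added}")
print(f"Workers displaced (approx.): {workers_displaced:,.0f}")
print("Implied wage change: -0.42%")
```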

With the growing reliance on technology have come security and privacy concerns. Billions of people use online payment methods such as WeChat Pay, PayPal, and Alipay to transfer money. Although security measures are in place, some criminals are able to bypass them.[74] In March 2022, North Korean hackers stole over $600 million worth of cryptocurrency from the owner of the game Axie Infinity, and used Blender.io, a mixer that helped hide their cryptocurrency exchanges, to launder over $20.5 million of it. In response, the U.S. Treasury Department sanctioned Blender.io, the first time it had taken action against a mixer, in an effort to crack down on North Korean hackers.[75][76] The privacy of cryptocurrency has been debated. Although many customers like the privacy of cryptocurrency, many also argue that it needs more transparency and stability.[74]

Philosophy of technology is a branch of philosophy that studies the "practice of designing and creating artifacts", and the "nature of the things so created."[77] It emerged as a discipline over the past two centuries, and has grown "considerably" since the 1970s.[78] The humanities philosophy of technology is concerned with the "meaning of technology for, and its impact on, society and culture".[77]

Initially, technology was seen as an extension of the human organism that replicated or amplified bodily and mental faculties.[79] Marx framed it as a tool used by capitalists to oppress the proletariat, but believed technology would be a fundamentally liberating force once it was "freed from societal deformations". Second-wave philosophers like Ortega later shifted their focus from economics and politics to "daily life and living in a techno-material culture", arguing that technology could oppress "even the members of the bourgeoisie who were its ostensible masters and possessors." Third-stage philosophers like Don Ihde and Albert Borgmann represent a turn toward de-generalization and empiricism, and considered how humans can learn to live with technology.[78][page needed]

Early scholarship on technology was split between two arguments: technological determinism, and social construction. Technological determinism is the idea that technologies cause unavoidable social changes.[80]:95 It usually encompasses a related argument, technological autonomy, which asserts that technological progress follows a natural progression and cannot be prevented.[81] Social constructivists[who?] argue that technologies follow no natural progression, and are shaped by cultural values, laws, politics, and economic incentives. Modern scholarship has shifted towards an analysis of sociotechnical systems, "assemblages of things, people, practices, and meanings", looking at the value judgments that shape technology.[80][page needed]

Cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called "technopolies," societies that are dominated by an ideology of technological and scientific progress to the detriment of other cultural practices, values, and world views.[82] Herbert Marcuse and John Zerzan suggest that technological society will inevitably deprive us of our freedom and psychological health.[83]

The ethics of technology is an interdisciplinary subfield of ethics that analyzes technology's ethical implications and explores ways to mitigate the potential negative impacts of new technologies. There is a broad range of ethical issues revolving around technology, from specific areas of focus affecting professionals working with technology to broader social, ethical, and legal issues concerning the role of technology in society and everyday life.[84]

Prominent debates have surrounded genetically modified organisms, the use of robotic soldiers, algorithmic bias, and the issue of aligning AI behavior with human values.[85]

Technology ethics encompasses several key fields. Bioethics looks at ethical issues surrounding biotechnologies and modern medicine, including cloning, human genetic engineering, and stem cell research. Computer ethics focuses on issues related to computing. Cyberethics explores internet-related issues like intellectual property rights, privacy, and censorship. Nanoethics examines issues surrounding the alteration of matter at the atomic and molecular level in various disciplines including computer science, engineering, and biology. And engineering ethics deals with the professional standards of engineers, including software engineers and their moral responsibilities to the public.[86]

A wide branch of technology ethics is concerned with the ethics of artificial intelligence: it includes robot ethics, which deals with ethical issues involved in the design, construction, use, and treatment of robots,[87] as well as machine ethics, which is concerned with ensuring the ethical behavior of artificial intelligent agents.[88] Within the field of AI ethics, significant yet-unsolved research problems include AI alignment (ensuring that AI behaviors are aligned with their creators' intended goals and interests) and the reduction of algorithmic bias. Some researchers have warned against the hypothetical risk of an AI takeover, and have advocated for the use of AI capability control in addition to AI alignment methods.

Other fields of ethics have had to contend with technology-related issues, including military ethics, media ethics, and educational ethics.

Futures studies is the systematic and interdisciplinary study of social and technological progress. It aims to quantitatively and qualitatively explore the range of plausible futures and to incorporate human values in the development of new technologies.[89]:54 More generally, futures researchers are interested in improving "the freedom and welfare of humankind".[89]:73 It relies on a thorough quantitative and qualitative analysis of past and present technological trends, and attempts to rigorously extrapolate them into the future.[89] Science fiction is often used as a source of ideas.[89]:173 Futures research methodologies include survey research, modeling, statistical analysis, and computer simulations.[89]:187
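As a toy illustration of the quantitative side of futures research, the sketch below fits an exponential trend to invented historical data and extrapolates it forward; the data points, growth rate, and projection are purely illustrative:

```python
# Toy trend extrapolation: fit log(y) = a + b*year by least squares,
# then project forward. Data points are invented for illustration.
import math

history = [(2000, 1.0), (2005, 2.1), (2010, 4.2), (2015, 8.0), (2020, 16.5)]

n = len(history)
xs = [year for year, _ in history]
ys = [math.log(value) for _, value in history]
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum((x - x_bar) ** 2 for x in xs)
a = y_bar - b * x_bar

def project(year: int) -> float:
    """Extrapolate the fitted exponential trend; assumes the past trend continues."""
    return math.exp(a + b * year)

print(f"Projected index for 2030: {project(2030):.1f}")
```

The key caveat, as in real futures research, is that the extrapolation is only as good as the assumption that the historical trend persists.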

Existential risk researchers analyze risks that could lead to human extinction or civilizational collapse, and look for ways to build resilience against them.[90][91] Relevant research centers include the Cambridge Center for the Study of Existential Risk, and the Stanford Existential Risk Initiative.[92] Future technologies may contribute to the risks of artificial general intelligence, biological warfare, nuclear warfare, nanotechnology, anthropogenic climate change, global warming, or stable global totalitarianism, though technologies may also help us mitigate asteroid impacts and gamma-ray bursts.[93] In 2019 philosopher Nick Bostrom introduced the notion of a vulnerable world, "one in which there is some level of technological development at which civilization almost certainly gets devastated by default", citing the risks of a pandemic caused by bioterrorists, or an arms race triggered by the development of novel armaments and the loss of mutual assured destruction.[94] He invites policymakers to question the assumptions that technological progress is always beneficial, that scientific openness is always preferable, or that they can afford to wait until a dangerous technology has been invented before they prepare mitigations.[94]

Emerging technologies are novel technologies whose development or practical applications are still largely unrealized. They include nanotechnology, biotechnology, robotics, 3D printing, blockchains, and artificial intelligence.

In 2005, futurist Ray Kurzweil claimed the next technological revolution would rest upon advances in genetics, nanotechnology, and robotics, with robotics being the most impactful of the three.[95] Genetic engineering will allow far greater control over human biological nature through a process called directed evolution. Some thinkers believe that this may shatter our sense of self, and have urged for renewed public debate exploring the issue more thoroughly;[96] others fear that directed evolution could lead to eugenics or extreme social inequality. Nanotechnology will grant us the ability to manipulate matter "at the molecular and atomic scale",[97] which could allow us to reshape ourselves and our environment in fundamental ways.[98] Nanobots could be used within the human body to destroy cancer cells or form new body parts, blurring the line between biology and technology.[99] Autonomous robots have undergone rapid progress, and are expected to replace humans at many dangerous tasks, including search and rescue, bomb disposal, firefighting, and war.[100]

Estimates on the advent of artificial general intelligence vary, but half of machine learning experts surveyed in 2018 believe that AI will "accomplish every task better and more cheaply" than humans by 2063, and automate all human jobs by 2140.[101] This expected technological unemployment has led to calls for increased emphasis on computer science education and debates about UBI. Political science experts predict that this could lead to a rise in extremism, while others see it as an opportunity to usher in a post-scarcity economy.

Some segments of the 1960s hippie counterculture grew to dislike urban living and developed a preference for locally autonomous, sustainable, and decentralized technology, termed appropriate technology. This later influenced hacker culture and technopaganism.

Technological utopianism refers to the belief that technological development is a moral good, which can and should bring about a utopia, that is, a society in which laws, governments, and social conditions serve the needs of all its citizens.[102] Examples of techno-utopian goals include post-scarcity economics, life extension, mind uploading, cryonics, and the creation of artificial superintelligence. Major techno-utopian movements include transhumanism and singularitarianism.

The transhumanism movement is founded upon the "continued evolution of human life beyond its current human form" through science and technology, informed by "life-promoting principles and values."[103] The movement gained wider popularity in the early 21st century.[104]

Singularitarians believe that machine superintelligence will "accelerate technological progress" by orders of magnitude and "create even more intelligent entities ever faster", which may lead to a pace of societal and technological change that is "incomprehensible" to us. This event horizon is known as the technological singularity.[105]

Major figures of techno-utopianism include Ray Kurzweil and Nick Bostrom. Techno-utopianism has attracted both praise and criticism from progressive, religious, and conservative thinkers.[106]

Technology's central role in our lives has drawn concerns and backlash. The backlash against technology is not a uniform movement and encompasses many heterogeneous ideologies.[107]

The earliest known revolt against technology was Luddism, a pushback against early automation in textile production. Automation had resulted in a need for fewer workers, a process known as technological unemployment.

Between the 1970s and 1990s, American terrorist Ted Kaczynski carried out a series of bombings across America and published the Unabomber Manifesto denouncing technology's negative impacts on nature and human freedom. The essay resonated with a large part of the American public.[108] It was partly inspired by Jacques Ellul's The Technological Society.[109]

Some subcultures, like the off-the-grid movement, advocate a withdrawal from technology and a return to nature. The ecovillage movement seeks to reestablish harmony between technology and nature.[110]

Engineering is the process by which technology is developed. It often requires problem-solving under strict constraints.[111] Technological development is "action-oriented", while scientific knowledge is fundamentally explanatory.[112] Polish philosopher Henryk Skolimowski framed it like so: "science concerns itself with what is, technology with what is to be."[113]:375

The direction of causality between scientific discovery and technological innovation has been debated by scientists, philosophers and policymakers.[114] Because innovation is often undertaken at the edge of scientific knowledge, most technologies are not derived from scientific knowledge, but instead from engineering, tinkering and chance.[115]:217–240 For example, in the 1940s and 1950s, when knowledge of turbulent combustion or fluid dynamics was still crude, jet engines were invented through "running the device to destruction, analyzing what broke [...] and repeating the process".[111] Scientific explanations often follow technological developments rather than preceding them.[115]:217–240 Many discoveries also arose from pure chance, like the discovery of penicillin as a result of accidental lab contamination.[116] Since the 1960s, the assumption that government funding of basic research would lead to the discovery of marketable technologies has lost credibility.[117][118] Probabilist Nassim Taleb argues that national research programs that implement the notions of serendipity and convexity through frequent trial and error are more likely to lead to useful innovations than research that aims to reach specific outcomes.[115][119]
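Taleb's convexity point lends itself to a small Monte Carlo sketch: many cheap trials with capped losses and rare large payoffs have a positive expected return even though most individual trials fail. All payoff numbers below are invented for illustration:

```python
# Toy Monte Carlo illustration of convexity through trial and error:
# many cheap experiments, each with a small chance of a large payoff.
# All numbers are invented for illustration.
import random

random.seed(42)

def research_program(trials: int, cost: float, p_hit: float, payoff: float) -> float:
    """Net return of a program of independent cheap trials."""
    hits = sum(random.random() < p_hit for _ in range(trials))
    return hits * payoff - trials * cost

# 100 trials costing 1 each, with a 2% chance each of a payoff of 100.
results = [research_program(100, 1.0, 0.02, 100.0) for _ in range(10_000)]
mean_return = sum(results) / len(results)
loss_share = sum(r < 0 for r in results) / len(results)

print(f"Average net return: {mean_return:+.1f} on a spend of 100")  # about +100
print(f"Simulations that lose money: {loss_share:.0%}")             # about 13%
```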

Despite this, modern technology is increasingly reliant on deep, domain-specific scientific knowledge. In 1979, an average of one in three patents granted in the U.S. cited the scientific literature; by 1989, this increased to an average of one citation per patent. The average was skewed upwards by patents related to the pharmaceutical industry, chemistry, and electronics.[120] A 2021 analysis shows that patents that are based on scientific discoveries are on average 26% more valuable than equivalent non-science-based patents.[121]

The use of basic technology is also a feature of non-human animal species. Tool use was once considered a defining characteristic of the genus Homo.[122] This view was supplanted after discovering evidence of tool use among chimpanzees and other primates,[123] dolphins,[124] and crows.[125][126] For example, researchers have observed wild chimpanzees using basic foraging tools, pestles, levers, leaves as sponges, and tree bark or vines as probes to fish for termites.[127] West African chimpanzees use stone hammers and anvils for cracking nuts,[128] as do capuchin monkeys of Boa Vista, Brazil.[129] Tool use is not the only form of animal technology use; for example, beaver dams, built with wooden sticks or large stones, are a technology with "dramatic" impacts on river habitats and ecosystems.[130]

Man's relationship with technology has been explored in science-fiction literature, for example in Brave New World, A Clockwork Orange, Nineteen Eighty-Four, Isaac Asimov's essays, and movies like Minority Report, Total Recall, Gattaca, and Inception. It has spawned the dystopian and futuristic cyberpunk genre, which juxtaposes futuristic technology with societal collapse, dystopia, or decay.[131] Notable cyberpunk works include William Gibson's novel Neuromancer, and movies like Blade Runner and The Matrix.

Startup Says It’s Building a Giant CO2 Battery in the United States

Italian startup Energy Dome has designed an ingenious battery that uses CO2 to store energy, and it only needs non-exotic materials like steel and water.

Italian Import

Carbon dioxide has a bad rep for its role in driving climate change, but in an unexpected twist, it could also play a key role in storing renewable energy.

The world's first CO2 battery, built by Italian startup Energy Dome, promises to store renewables on an industrial scale, which could help green energy rival fossil fuels in terms of cost and practicality.

After successfully testing the battery at a small scale plant in Sardinia, the company is now bringing its technology to the United States.

"The US market is a primary market for Energy Dome and we are working to become a market leader in the US," an Energy Dome spokesperson told Electrek. "The huge demand of [long duration energy storage] and incentive mechanisms like the Inflation Reduction Act will be key drivers for the industry in the short term."

Storage Solution

As renewables like wind and solar grow, one of the biggest infrastructural obstacles is the storage of the power they produce. Since wind and solar sources aren't always going to be available, engineers need a way to save excess power for days when it's less sunny and windy out, or when there's simply more demand.

One obvious solution is to use conventional battery technology like lithium batteries, to store the energy. The problem is that building giant batteries from rare earth minerals — which can be prone to degradation over time — is expensive, not to mention wasteful.

Energy Dome's CO2 batteries, on the other hand, use mostly "readily available materials" like steel, water, and of course CO2.

In Charge

As its name suggests, the battery works by taking CO2, stored in a giant dome, and compressing it into a liquid by using the excess energy generated from a renewable source. That process generates heat, which is stored alongside the now liquefied CO2, "charging" the battery.

To discharge power, the stored heat is used to vaporize the liquid CO2 back into a gas, powering a turbine that feeds back into the power grid. Crucially, the whole process is self-contained, so no CO2 leaks back into the atmosphere.
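To make the charge/discharge cycle concrete, here is a back-of-the-envelope Python sketch of the energy bookkeeping; the efficiency figures are assumptions chosen for illustration, not Energy Dome specifications:

```python
# Rough energy bookkeeping for a closed CO2 battery cycle.
# Efficiency figures are assumed for illustration, not vendor specs.

charge_energy_mwh = 100.0    # surplus renewable energy used to compress CO2
store_efficiency = 0.90      # fraction kept as liquid CO2 plus stored heat (assumed)
discharge_efficiency = 0.83  # turbine recovery when the CO2 re-expands (assumed)

stored_mwh = charge_energy_mwh * store_efficiency
delivered_mwh = stored_mwh * discharge_efficiency
round_trip = delivered_mwh / charge_energy_mwh

print(f"Stored: {stored_mwh:.1f} MWh, delivered back to grid: {delivered_mwh:.1f} MWh")
print(f"Round-trip efficiency: {round_trip:.0%}")  # ~75% with these assumptions
```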

The battery could be a game-changer for renewables. As of now, Energy Dome plans to build batteries that can store up to 200 MWh of energy. But we'll have to see how it performs as it gains traction.

More on batteries: Scientists Propose Turning Skyscrapers Into Massive Gravity Batteries

Former Facebook Exec Says Zuckerberg Has Surrounded Himself With Sycophants

Conviction is easy if you're surrounded by a bunch of yes men — which Mark Zuckerberg just might be. And $15 billion down the line, that may not bode well.

In just about a year, Facebook-turned-Meta CEO Mark Zuckerberg's metaverse vision has cost his company upwards of $15 billion, cratering value and — at least in part — triggering mass company layoffs. That's a high price tag, especially when the Facebook creator has shockingly little to show for it, both in actual technology and public interest.

Indeed, it seems that every time Zuckerberg excitedly explains what his currently-legless metaverse will one day hold, he's met with crickets — and a fair share of ridicule — at the town square. Most everyone finds themselves looking around and asking themselves the same question: who could this possibly be for, other than Zucko himself?

That question, however, doesn't really seem to matter to the swashzuckling CEO, who's either convinced that the public wants and needs his metaverse just as much as he does, or is simply committed to the belief that one day people will finally get it. After all, he's bet his company on this thing and needs the public to engage to stay financially viable long-term.

And sure, points for conviction. But conviction is easy if you're surrounded by a bunch of yes men — which, according to Vanity Fair, the founder unfortunately is. And with $15 billion down the line, that may not bode well for the Silicon Valley giant.

"The problem now is that Mark has surrounded himself with sycophants, and for some reason he's fallen for their vision of the future, which no one else is interested in," one former Facebook exec told Vanity Fair. "In a previous era, someone would have been able to reason with Mark about the company's direction, but that is no longer the case."

Given that previous reports have revealed that some Meta employees have taken to marking metaverse documents with the label "MMA" — "Make Mark Happy" — the revelation that he's limited his close circle to people who only agree with him isn't all that shocking. He wants the metaverse, he wants it bad, and he's put a mind-boggling amount of social and financial capital into his AR-driven dream.

While the majority of his many thousands of employees might disagree with him — Vanity Fair reports that current and former metamates have written things like "the metaverse will be our slow death" and "Mark Zuckerberg will single-handedly kill a company with the metaverse" on the Silicon Valley-loved Blind app — it's not exactly easy, or even that possible, to wrestle with the fact that you may have made a dire miscalculation this financially far down the road.

And if you just keep a close circle of people who just agree with you, you may not really have to confront that potential for failure. At least not for a while.

The truth is that Zuckerberg successfully created a thing that has impacted nearly every single person on this Earth. Few people can say that. And while it can be argued that the thing he built has, at its best, created some real avenues for connection, that same creation also seems to have led to his own isolation, in life and at work.

How ironic it is that he's marketed his metaverse on that same promise of connection, only to become more disconnected than ever.

READ MORE: "Mark Has Surrounded Himself with Sycophants": Zuckerberg's Big Bet on the Metaverse Is Backfiring [Vanity Fair]

More on the Meta value: Stock Analyst Cries on Tv Because He Recommended Facebook Stock

This Deepfake AI Singing Dolly Parton’s "Jolene" Is Worryingly Good

Holly Herndon uses her AI twin Holly+ to sing a cover of Dolly Parton's "Jolene."

AI-lands in the Stream

Sorry, but not even Dolly Parton is sacred amid the encroachment of AI into art.

Holly Herndon, an avant garde pop musician, has released a cover of Dolly Parton's beloved and frequently covered hit single, "Jolene." Except it's not really Herndon singing, but her digital deepfake twin known as Holly+.

The music video features a 3D avatar of Holly+ frolicking in what looks like a decaying digital world.

And honestly, it's not bad — dare we say, almost kind of good? Herndon's rendition croons with a big, round sound, soaked in reverb and backed by a bouncy, acoustic riff and a chorus of plaintive wailing. And she has a nice voice. Or, well, Holly+ does. Maybe predictably indie-folk, but it's certainly an effective demonstration of AI with a hint of creative flair, or at least effective curation.

Checking the Boxes

But the performance is also a little unsettling. For one, the giant inhales between verses are too long to be real and are almost cajolingly dramatic. The vocals themselves are strangely even and, despite the somber tone affected by the AI, lack Parton's iconic vulnerability.

Overall, it feels like the AI is simply checking the boxes of what makes a good, swooning cover after listening to Jeff Buckley's "Hallelujah" a million times — which, to be fair, is a pretty good starting point.

Still, it'd be remiss to downplay what Herndon has managed to pull off here, and the criticisms mostly reflect the AI's limited capabilities more than her chops as a musician. The AI's seams are likely intentional, if her previous work is anything to go off of.

Either way, if you didn't know you were listening to an AI from the get-go, you'd probably be fooled. And that alone is striking.

The Digital Self

Despite AI's usually ominous implications for art, Herndon views her experiment as a "way for artists to take control of their digital selves," according to a statement on her website.

"Vocal deepfakes are here to stay," Herndon was quoted saying. "A balance needs to be found between protecting artists, and encouraging people to experiment with a new and exciting technology."

Whether Herndon's views are fatalistic or prudently pragmatic remains to be seen. But even if her intentions are meant to be good for artists, it's still worrying that an AI could pull off such a convincing performance.

More on AI music: AI That Generates Music from Prompts Should Probably Scare Musicians

US Gov to Crack Down on "Bossware" That Spies On Employees’ Computers

In the era of remote work, employers have turned to invasive monitoring software known as "bossware."

Spying @ Home

Ever since the COVID-19 pandemic drove a wave of working from home, companies have been relentless in their efforts to digitally police and spy on remote employees by using what's known as "bossware." That's the pejorative name for software that tracks the websites an employee visits, screenshots their computer screens, and even records their faces and voices.

And now, the National Labor Relations Board (NLRB), an agency of the federal government, is looking to intervene.

"Close, constant surveillance and management through electronic means threaten employees' basic ability to exercise their rights," said NLRB general counsel Jennifer Abruzzo, in a Monday memo. "I plan to urge the Board to apply the Act to protect employees, to the greatest extent possible, from intrusive or abusive electronic monitoring and automated management practices."

Undoing Unions

In particular, Abruzzo is worried about how bossware could infringe on workers' rights to unionize. It's not hard to imagine how such invasive surveillance could be used to bust unionization. Even if the technology isn't explicitly deployed to impede organization efforts, the ominous presence of the surveillance on its own can be a looming deterrent, which Abruzzo argues is illegal.

And now is the perfect moment for the NLRB to step in. The use and abuse of worker surveillance tech in general — not just bossware — has been "growing by the minute," Mark Gaston Pearce, executive director of the Workers' Rights Institute at Georgetown Law School, told CBS.

"Employers are embracing technology because technology helps them run a more efficient business," Gaston explained. "… What comes with that is monitoring a lot of things that employers have no business doing."

Overbearing Overlord

In some ways, surveillance tech like bossware can be worse than having a nosy, actual human boss. Generally speaking, in a physical workplace employees have an understanding of how much privacy they have (unless they work at a place like Amazon or Walmart, that is).

But when bossware spies on you, who knows how much information an employer could be gathering — or even when they're looking in. And if it surveils an employee's personal computer, which more often than not contains plenty of personal information that a boss has no business seeing, that's especially invasive.

Which is why Abruzzo is pushing to require employers to disclose exactly how much they're tracking.

It's a stern message from the NLRB, but at the end of the day, it's just a memo. We'll have to wait and see how enforcing it pans out.

More on surveillance: Casinos to Use Facial Recognition to Keep "Problem Gamblers" Away

That "Research" About How Smartphones Are Causing Deformed Human Bodies Is SEO Spam, You Idiots

You know that "research" going around saying humans are going to evolve to have hunchbacks and claws because of the way we use our smartphones? Though our posture could certainly use some work, you'll be glad to know that it's just lazy spam intended to juice search engine results.

Let's back up. Today the Daily Mail published a viral story about "how humans may look in the year 3000." Among its predictions: hunched backs, clawed hands, a second eyelid, a thicker skull and a smaller brain.

Sure, that's fascinating! The only problem? The Mail's only source is a post published a year ago by the renowned scientists at... uh... TollFreeForwarding.com, a site that sells, as its name suggests, virtual phone numbers.

If the idea that phone salespeople are purporting to be making predictions about human evolution didn't tip you off, this "research" doesn't seem very scientific at all. Instead, it more closely resembles what it actually is — a blog post written by some poor grunt, intended to get backlinks from sites like the Mail that'll juice TollFreeForwarding's position in search engine results.

To get those delicious backlinks, the top minds at TollFreeForwarding leveraged renders of a "future human" by a 3D model artist. The result of these efforts is "Mindy," a creepy-looking hunchback in black skinny jeans (which is how you can tell she's from a different era).

Grotesque model reveals what humans could look like in the year 3000 due to our reliance on technology

Full story: https://t.co/vQzyMZPNBv pic.twitter.com/vqBuYOBrcg

— Daily Mail Online (@MailOnline) November 3, 2022

"To fully realize the impact everyday tech has on us, we sourced scientific research and expert opinion on the subject," the TollFreeForwarding post reads, "before working with a 3D designer to create a future human whose body has physically changed due to consistent use of smartphones, laptops, and other tech."

Its sources, though, are dubious. Its authority on spinal development, for instance, is a "health and wellness expert" at a site that sells massage lotion. His highest academic achievement? A business degree.

We could go on and on about TollFreeForwarding's dismal sourcing — some of which looks suspiciously like even more SEO spam for entirely different clients — but you get the idea.

It's probably not surprising that this gambit for clicks took off among dingbats on Twitter. What is somewhat disappointing is that it ended up on StudyFinds, a generally reliable blog about academic research. This time, though, for inscrutable reasons, it treated this egregious SEO spam as a legitimate scientific study.

The site's readers, though, were quick to call it out, leading to a comically enormous editor's note appended to the story.

"Our content is intended to stir debate and conversation, and we always encourage our readers to discuss why or why not they agree with the findings," it reads in part. "If you heavily disagree with a report — please debunk to your delight in the comments below."

You heard them! Get debunking, people.

More conspiracy theories: If You Think Joe Rogan Is Credible, This Bizarre Clip of Him Yelling at a Scientist Will Probably Change Your Mind

View original post here:

That "Research" About How Smartphones Are Causing Deformed Human Bodies Is SEO Spam, You Idiots

Advocates: Speed limit technology could keep women safer on the road – WSB Atlanta

WASHINGTON, D.C. – There's a new push to require car companies to put speed-limiting technology in newly manufactured cars. Advocates say it will specifically keep women safer on the road.

Research shows women drivers involved in a crash are more likely to be seriously injured than men. Some believe crash test dummies that simulate the female body will address the gender disparity, but others think the fix is not so simple.

"Our crash test dummies have inherent limitations," said Jessica Jermakian, vice president of vehicle research at the Insurance Institute for Highway Safety. "We like to think of them as people, but they're really not people."

"If we really want to have equity in crash safety, the answer is really going to extend beyond thinking about crash test dummies," she explained.

IIHS is now looking at data from real-world crashes to learn how to make a car less deadly when it hits another car.

One way to do that, IIHS said, is with something called Intelligent Speed Assistance, or ISA. Essentially, it can either prevent speeding or issue warnings if the car is going faster than the speed limit. It is required on new vehicles in Europe, but not here in the U.S.
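As a rough illustration of the two behaviors described above, here is a minimal sketch in Python. The function, its parameters, and the advisory-versus-active distinction are hypothetical simplifications for illustration, not any automaker's actual implementation; real ISA systems determine the posted limit from GPS map data and camera-based sign recognition.

```python
# Minimal sketch of Intelligent Speed Assistance (ISA) logic.
# All names and thresholds are hypothetical illustrations,
# not any manufacturer's actual implementation.

from dataclasses import dataclass

@dataclass
class IsaAction:
    warn: bool            # issue an audible/visual speeding warning
    limit_throttle: bool  # actively hold the car at the posted limit

def isa_check(current_speed_mph: float,
              posted_limit_mph: float,
              active_mode: bool = False,
              tolerance_mph: float = 1.0) -> IsaAction:
    """Compare vehicle speed to the posted limit.

    In advisory mode the system only warns the driver; in active
    mode it also restricts the throttle, matching the two behaviors
    the article describes.
    """
    speeding = current_speed_mph > posted_limit_mph + tolerance_mph
    return IsaAction(warn=speeding, limit_throttle=speeding and active_mode)

# Example: doing 50 mph in a 20 mph zone trips both responses.
print(isa_check(50.0, 20.0, active_mode=True))
# IsaAction(warn=True, limit_throttle=True)
```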

"It is something they could put in the vehicles today that would help bring speeds down and reduce the severity of crashes," Jermakian said.

We asked the National Highway Traffic Safety Administration whether it is, in fact, considering requiring ISA in new cars. Officials said they requested input on these systems earlier this year and will consider the public comments they received.

We obtained a document that shows the auto industry lobby told NHTSA it should take this system into consideration.

"Safety is the auto industry's top priority," a spokesperson for the Alliance for Automotive Innovation wrote. "Vehicles continue to get safer as automakers across the board test, develop and integrate promising new technologies, including intelligent speed assist, alcohol detection and rear seat child reminders, among others. These innovations make the driving experience safer and protect lives or prevent injuries."

Reactions from drivers are mixed.

"Everybody in California drives about 10 miles an hour over the speed limit, so I don't think that would go over very well in California," Paul Greenfield of Sacramento said.

"Just like the little side ones that go, 'Oh, you know, you're going 50 mph in a 20, slow down,' so I think it would be a good idea to have them in the cars," Debbie Colby of Oregon said.

Safety advocates are also trying out virtual crash testing to simulate a broader range of people in the car.

The group VERITY NOW, which fights for equity in vehicle safety, said the government needs to act on all of this.

"We keep researching endlessly," co-chair Beth Brooke explained. "Why? Why? Why? And my question would be, why not? It exists today. We're not going to make anybody less safe by using it. Why not? Let's keep advancing all of these technologies further, and we'll save women's lives in the meantime."

© 2022 Cox Media Group

Continue reading here:

Advocates: Speed limit technology could keep women safer on the road - WSB Atlanta

Startup’s New Technology Could Create Faster, Less Expensive, and Better Way to Identify Disease – UConn Today – University of Connecticut

Imagine waiting 36 hours for a lab report to determine if you have sepsis, a life-threatening infection that causes inflammation throughout the body.

The team of entrepreneurs at RiboDynamics, a UConn-affiliated startup, believes it can cut that wait time to two hours with its new medical technology, which detects pathogens in biological material based on the presence of specific RNA biomarkers.
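The article doesn't detail the assay itself, but the underlying idea, flagging a pathogen when its signature RNA sequence appears in a sample, can be sketched in a few lines of Python. The biomarker sequences and pathogen names below are made-up placeholders, and exact substring matching is a drastic simplification of real RNA detection chemistry.

```python
# Toy sketch of biomarker-based pathogen screening.
# Sequences and pathogen names are fabricated placeholders;
# real assays detect RNA chemically, not by string search.

HYPOTHETICAL_BIOMARKERS = {
    "pathogen_A": "AUGGCUACG",
    "pathogen_B": "CCGAUUGCA",
}

def screen_sample(sample_rna: str) -> list[str]:
    """Return the pathogens whose marker sequence appears in the sample."""
    return [name for name, marker in HYPOTHETICAL_BIOMARKERS.items()
            if marker in sample_rna]

print(screen_sample("GGAAUGGCUACGUU"))  # ['pathogen_A']
```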

"This is a life-changing technology for both patients and medical providers," says Professor Dan Fabris of the Chemistry Department. "It will allow patients to get proper treatment with no delay, thus increasing the chance of healing fast, and hopefully avoiding a stay in the ICU. For hospitals and physicians, this leads to a quicker diagnosis and considerable savings, up to $70,000 per patient, in medical expenses."

The technology, in development for the last 10 years, also holds promise for many other pathogens, including HIV, Hepatitis C, and COVID-19.

RiboDynamics identified as promising startup

RiboDynamics participated in the Summer Fellowship of the School of Business' Connecticut Center for Entrepreneurship & Innovation (CCEI) this year. That program helps UConn-affiliated companies grow and move closer to market readiness.

RiboDynamics impressed the judges at the Summer Fellowship finale, and the company was invited to participate in the Wolff New Venture Competition and vie for a $25,000 prize. The event, on Monday, Oct. 3, is the pinnacle entrepreneurship challenge hosted by CCEI.

Limin Deng, a post-doc in the College of Liberal Arts & Sciences, represented the company at the CCEI Summer Fellowship. She has played a key role in the development of the technology and the company as a whole.

"I had to learn to switch from being a scientist to being a business person," Deng says. "In the beginning, it was tough to explain our work. We are trained to speak to other scientists and end-users. We had to learn to explain our idea in very simple terms for everyone to understand."

"The CCEI Summer Fellowship really helps startups get into the marketplace. It is a source for everything you need," she says, adding that the camaraderie among all the entrepreneurs was strong and that they were always happy to share information and ideas.

Exploring RNA before it was popular

Fabris says he has been interested in RNA technology since the early 2000s, before it was popular in science and well-known to the public through mRNA vaccines. He began this specific work as a faculty member at SUNY Albany and applied for a patent in 2006, which he later received.

"We have already demonstrated the ability to detect salmonella, listeria, and E. coli in milk, as well as Zika virus in mosquitoes," he says. "We are now testing applications in human diagnostics of infectious diseases and other health conditions."

The entrepreneurs are working with the Lahey Clinic in Massachusetts to investigate how well the test works in patients who have recently had a hip replacement.

"We are very far along in the development of our technology and our company," Fabris says. "We are at a point where we need investors and then to focus on sales and marketing."

"As proven by the response to the COVID-19 pandemic, the healthcare sector severely lacks reliable, high-throughput diagnostic techniques for pathogens and infectious diseases," he says. "The earliest possible diagnostic results can improve the overall outcome of virtually any type of disease."

The 2022 Wolff New Venture Competition will be held on Oct. 3, from 5 to 7:30 p.m., on the Observation Deck at the Graduate Business Learning Center in Hartford. It will also be livestreamed at: https://ccei.uconn.edu/wolff-new-venture-competition/. This event is open to the public.

Read the original post:

Startup's New Technology Could Create Faster, Less Expensive, and Better Way to Identify Disease - UConn Today - University of Connecticut