

Nano

Researchers develop novel process to 3-D print one of the strongest materials on earth

Researchers from Virginia Tech and Lawrence Livermore National Laboratory have developed a novel way to 3-D print complex objects of one of the highest-performing materials used in the battery and aerospace industries.


Nanotechnology – Wikipedia

Nanotechnology (“nanotech”) is manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form “nanotechnologies” as well as “nanoscale technologies” to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. Through 2012, the USA had invested $3.7 billion through its National Nanotechnology Initiative, the European Union had invested $1.2 billion, and Japan had invested $750 million.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, energy storage,[4][5] microfabrication,[6] molecular engineering, etc.[7] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly,[8] from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. Since the popularity spike in the 1980s, most of nanotechnology has involved investigation of several approaches to making mechanical devices out of a small number of atoms.[10]

In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era. First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds, and the instrument was successfully used to manipulate individual atoms in 1989. The microscope’s developers Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory received a Nobel Prize in Physics in 1986.[11][12] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[13][14] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[15] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[16]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[17][18]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[19][20] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, with a kinetic diameter of approximately a quarter of a nanometer), since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[21] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[22]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[23] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[23]
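The marble-to-Earth comparison can be sanity-checked with a line of arithmetic. A minimal sketch, assuming a mean Earth diameter of about 1.27 × 10⁷ m:

```python
# Rough arithmetic behind the marble-to-Earth analogy: shrink the Earth by
# the same factor (1e9) that relates a meter to a nanometer and see what
# size results. The Earth diameter below is an assumed round figure.

nm = 1e-9                  # one nanometer, in meters
earth_diameter_m = 1.27e7  # approximate mean diameter of Earth, in meters

# A nanometer is to a meter as this scaled length is to the Earth:
scaled_earth = (nm / 1.0) * earth_diameter_m  # = 1.27e-2 m
print(f"Earth shrunk by a factor of 1e9: {scaled_earth * 100:.2f} cm")
# About 1.3 cm across, i.e. roughly the size of a marble.
```

So the analogy holds: dividing the Earth's diameter by a billion gives a centimeter-scale object.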

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[24] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[25]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
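The surface-area-to-volume effect mentioned here follows directly from geometry: for a sphere, the ratio is 3/r, so it grows without bound as the radius shrinks. A minimal numeric sketch:

```python
# For a sphere of radius r: area = 4*pi*r^2, volume = (4/3)*pi*r^3,
# so the surface-area-to-volume ratio simplifies to 3/r. Shrinking a
# particle from millimeter to nanometer scale raises SA/V a million-fold.
import math

def surface_to_volume(radius_m: float) -> float:
    """Return the SA/V ratio (in 1/m) for a sphere of the given radius."""
    area = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    return area / volume  # numerically equal to 3 / radius_m

for r in (1e-3, 1e-6, 1e-9):  # millimeter, micrometer, nanometer radii
    print(f"r = {r:.0e} m  ->  SA/V = {surface_to_volume(r):.1e} per m")
```

This is why surface-dominated behavior (catalysis, adsorption, heat exchange) becomes so prominent at the nanoscale.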

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[26]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to arrange components automatically into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base-pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular scale biological analogies of traditional machine components demonstrated molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[27] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[28] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general, it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[29] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[30] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[31] and a nanoelectromechanical relaxation oscillator.[32] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[35]

Bottom-up approaches: these seek to arrange smaller components into more complex assemblies.

Top-down approaches: these seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches: these seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative approaches: these subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. They often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified as 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological characteristics. With a decrease in dimensionality, an increase in surface-to-volume ratio is observed, which indicates that lower-dimensional nanomaterials have higher surface area compared to 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.

There are several important modern developments. The atomic force microscope (AFM) and the scanning tunneling microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[53][54] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography or nanoimprint lithography were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[55]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[53][54] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy or MBE. Researchers at Bell Telephone Laboratories such as John R. Arthur, Alfred Y. Cho, and Art C. Gossard developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

However, new therapeutic products, based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[56]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[18] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which includes titanium dioxide in sunscreen, cosmetics, surface coatings,[57] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[17]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[58] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[59] Nanotechnology may also make it possible to build structures for on-chip computing with light, for example on-chip optical quantum information processing and picosecond transmission of information.[60]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[61] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[62]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[63] Platinum is used in both the reduction and the oxidation catalysts.[64] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst’s surface area that is exposed to the exhaust fumes is maximized, efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[65]
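The surface-area goal described above can be checked with a little geometry. The sketch below (illustrative numbers only, not figures from the project) splits a fixed volume of catalyst into equal spheres and compares the exposed surface of millimeter-scale grains against 10 nm particles:

```python
# For a fixed total volume V split into spheres of radius r, the total
# exposed surface is N * 4*pi*r^2 with N = V / ((4/3)*pi*r^3), which
# simplifies to 3V/r: halving the particle radius doubles the surface.
import math

def total_surface(total_volume_m3: float, particle_radius_m: float) -> float:
    """Total surface area (m^2) when the volume is split into equal spheres."""
    v_particle = (4 / 3) * math.pi * particle_radius_m ** 3
    n_particles = total_volume_m3 / v_particle
    return n_particles * 4 * math.pi * particle_radius_m ** 2

V = 1e-6                          # 1 cm^3 of catalyst, expressed in m^3
coarse = total_surface(V, 1e-3)   # millimeter-scale grains
nano = total_surface(V, 1e-8)     # 10 nm particles
print(f"surface gain, mm grains -> 10 nm particles: {nano / coarse:.0e}x")
```

Since total surface scales as 3V/r, shrinking the radius by five orders of magnitude multiplies the exposed area by the same factor, which is exactly why keeping the nanoparticles from merging matters.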

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell’s microenvironment to direct its differentiation down a suitable lineage.[66] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[67]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[68]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[69][70]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[71] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[72]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[73]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[74] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[75] Cambridge, Massachusetts in 2008 considered enacting a similar law,[76] but ultimately rejected it.[77] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[78] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly. Over the next several decades, applications of nanotechnology will likely include much higher-capacity computers, active materials of various kinds, and cellular-scale biomedical devices.[79]

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[80] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[81] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[82][83]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[84]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the “nanotechnology revolution”, could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said “We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully.”[85] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[86] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[87][88][89][90]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[91] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by “bolting on” nanotechnology to existing regulations, but there are clear gaps in these regimes.[92] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[93]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[94] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[95] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[96][97]

The Royal Society report[15] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application, with participants in public deliberations more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[73]


Molecular nanotechnology – Wikipedia

Molecular nanotechnology (MNT) is a technology based on the ability to build structures to complex, atomic specifications by means of mechanosynthesis.[1] This is distinct from nanoscale materials. Based on Richard Feynman’s vision of miniature factories using nanomachines to build complex products (including additional nanomachines), this advanced form of nanotechnology (or molecular manufacturing[2]) would make use of positionally-controlled mechanosynthesis guided by molecular machine systems. MNT would involve combining physical principles demonstrated by biophysics, chemistry, other nanotechnologies, and the molecular machinery of life with the systems engineering principles found in modern macroscale factories.

While conventional chemistry uses inexact processes to obtain inexact results, and biology exploits inexact processes to obtain definitive results, molecular nanotechnology would employ original definitive processes to obtain definitive results. The aim of molecular nanotechnology would be to carry out molecular reactions in positionally-controlled locations and orientations to obtain desired chemical reactions, and then to build systems by further assembling the products of these reactions.

A roadmap for the development of MNT is an objective of a broadly based technology project led by Battelle (the manager of several U.S. National Laboratories) and the Foresight Institute.[3] The roadmap was originally scheduled for completion by late 2006, but was released in January 2008.[4] The Nanofactory Collaboration[5] is a more focused ongoing effort involving 23 researchers from 10 organizations and 4 countries that is developing a practical research agenda[6] specifically aimed at positionally-controlled diamond mechanosynthesis and diamondoid nanofactory development. In August 2005, a task force consisting of 50+ international experts from various fields was organized by the Center for Responsible Nanotechnology to study the societal implications of molecular nanotechnology.[7]

One proposed application of MNT is so-called smart materials. This term refers to any sort of material designed and engineered at the nanometer scale for a specific task. It encompasses a wide variety of possible commercial applications. One example would be materials designed to respond differently to various molecules; such a capability could lead, for example, to artificial drugs which would recognize and render inert specific viruses. Another is the idea of self-healing structures, which would repair small tears in a surface naturally in the same way as self-sealing tires or human skin.

An MNT nanosensor would resemble a smart material, involving a small component within a larger machine that would react to its environment and change in some fundamental, intentional way. A very simple example: a photosensor might passively measure the incident light and discharge its absorbed energy as electricity when the light passes above or below a specified threshold, sending a signal to a larger machine. Such a sensor would supposedly cost less and use less power than a conventional sensor, and yet function usefully in all the same applications; for example, turning on parking lot lights when it gets dark.

While smart materials and nanosensors both exemplify useful applications of MNT, they pale in comparison with the complexity of the technology most popularly associated with the term: the replicating nanorobot.

MNT-based nanomanufacturing is popularly linked with the idea of swarms of coordinated nanoscale robots working together, a popularization of an early proposal by K. Eric Drexler in his 1986 discussions of MNT, but one that was superseded in 1992. In this early proposal, sufficiently capable nanorobots would construct more nanorobots in an artificial environment containing special molecular building blocks.

Critics have doubted both the feasibility of self-replicating nanorobots and the feasibility of control if self-replicating nanorobots could be achieved: they cite the possibility of mutations removing any control and favoring reproduction of mutant pathogenic variations. Advocates address the first doubt by pointing out that the first macroscale autonomous machine replicator, made of Lego blocks, was built and operated experimentally in 2002.[8] While there are sensory advantages present at the macroscale compared to the limited sensorium available at the nanoscale, proposals for positionally controlled nanoscale mechanosynthetic fabrication systems employ dead reckoning of tooltips combined with reliable reaction sequence design to ensure reliable results, hence a limited sensorium is no handicap; similar considerations apply to the positional assembly of small nanoparts. Advocates address the second doubt by arguing that bacteria are (of necessity) evolved to evolve, while nanorobot mutation could be actively prevented by common error-correcting techniques. Similar ideas are advocated in the Foresight Guidelines on Molecular Nanotechnology,[9] and a map of the 137-dimensional replicator design space[10] recently published by Freitas and Merkle provides numerous proposed methods by which replicators could, in principle, be safely controlled by good design.
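The "common error-correcting techniques" invoked above to prevent replicator mutation can be illustrated with a checksum gate: a copy is activated only if its blueprint verifies against a stored checksum. This is a sketch under stated assumptions; the names and the CRC scheme are illustrative, not drawn from the MNT literature.

```python
# Illustrative sketch of the error-correction idea: each replicator
# carries a checksum of its "blueprint", and any copy whose checksum
# fails verification is rejected before it can operate. The CRC scheme
# and blueprint contents are assumptions for illustration only.
import zlib

def make_blueprint(data: bytes) -> tuple[bytes, int]:
    """Pair a blueprint with its integrity checksum."""
    return data, zlib.crc32(data)

def copy_ok(data: bytes, checksum: int) -> bool:
    """Accept a copy only if it verifies against the stored checksum."""
    return zlib.crc32(data) == checksum

blueprint, crc = make_blueprint(b"ASSEMBLE-DIAMONDOID-STRUT")
faithful = blueprint                              # exact copy
mutated = blueprint.replace(b"STRUT", b"SPORE")   # a copying error

print(copy_ok(faithful, crc))  # True
print(copy_ok(mutated, crc))   # False -> mutant copy is never activated
```

A mutated copy fails the check and is discarded, so heritable variation (the raw material of runaway evolution) is suppressed by design rather than by selection.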

However, the concept of suppressing mutation raises the question: How can design evolution occur at the nanoscale without a process of random mutation and deterministic selection? Critics argue that MNT advocates have not provided a substitute for such a process of evolution in this nanoscale arena where conventional sensory-based selection processes are lacking. The limits of the sensorium available at the nanoscale could make it difficult or impossible to winnow successes from failures. Advocates argue that design evolution should occur deterministically and strictly under human control, using the conventional engineering paradigm of modeling, design, prototyping, testing, analysis, and redesign.

In any event, since 1992 technical proposals for MNT do not include self-replicating nanorobots, and recent ethical guidelines put forth by MNT advocates prohibit unconstrained self-replication.[9][11]

One of the most important applications of MNT would be medical nanorobotics or nanomedicine, an area pioneered by Robert Freitas in numerous books[12] and papers.[13] The ability to design, build, and deploy large numbers of medical nanorobots would, at a minimum, make possible the rapid elimination of disease and the reliable and relatively painless recovery from physical trauma. Medical nanorobots might also make possible the convenient correction of genetic defects, and help to ensure a greatly expanded lifespan. More controversially, medical nanorobots might be used to augment natural human capabilities. One study has reported that conditions such as tumors, arteriosclerosis, blood clots leading to stroke, accumulation of scar tissue, and localized pockets of infection could possibly be addressed by employing medical nanorobots.[14][15]

Another proposed application of molecular nanotechnology is “utility fog”[16] in which a cloud of networked microscopic robots (simpler than assemblers) would change its shape and properties to form macroscopic objects and tools in accordance with software commands. Rather than modify the current practices of consuming material goods in different forms, utility fog would simply replace many physical objects.

Yet another proposed application of MNT would be phased-array optics (PAO),[17] though this appears to be a problem addressable by ordinary nanoscale technology. PAO would apply the principle of phased-array millimeter technology at optical wavelengths, permitting the virtual duplication of any sort of optical effect. Users could request holograms, sunrises and sunsets, or floating lasers as the mood strikes. PAO systems were described in BC Crandall’s Nanotechnology: Molecular Speculations on Global Abundance in the Brian Wowk article “Phased-Array Optics.”[18]

Molecular manufacturing is a potential future subfield of nanotechnology that would make it possible to build complex structures at atomic precision.[19] Molecular manufacturing requires significant advances in nanotechnology, but once achieved could produce highly advanced products at low costs and in large quantities in nanofactories weighing a kilogram or more.[19][20] When nanofactories gain the ability to produce other nanofactories production may only be limited by relatively abundant factors such as input materials, energy and software.[20]

The products of molecular manufacturing could range from cheaper, mass-produced versions of known high-tech products to novel products with added capabilities in many areas of application. Some applications that have been suggested are advanced smart materials, nanosensors, medical nanorobots and space travel.[19] Additionally, molecular manufacturing could be used to cheaply produce highly advanced, durable weapons, which is an area of special concern regarding the impact of nanotechnology.[20] Being equipped with compact computers and motors these could be increasingly autonomous and have a large range of capabilities.[20]

According to Chris Phoenix and Mike Treder from the Center for Responsible Nanotechnology, as well as Anders Sandberg from the Future of Humanity Institute, molecular manufacturing is the application of nanotechnology that poses the most significant global catastrophic risk.[20][21] Several nanotechnology researchers state that the bulk of risk from nanotechnology comes from its potential to lead to war, arms races and destructive global government.[20][21][22] Several reasons have been suggested why the availability of nanotech weaponry is significantly likely to lead to unstable arms races (compared with, e.g., nuclear arms races): (1) a large number of players may be tempted to enter the race since the threshold for doing so is low;[20] (2) the ability to make weapons with molecular manufacturing will be cheap and easy to hide;[20] (3) lack of insight into the other parties’ capabilities can therefore tempt players to arm out of caution or to launch preemptive strikes;[20][23] (4) molecular manufacturing may reduce dependency on international trade,[20] a potential peace-promoting factor;[24] (5) wars of aggression may pose a smaller economic threat to the aggressor since manufacturing is cheap and humans may not be needed on the battlefield.[20]

Since self-regulation by all state and non-state actors seems hard to achieve,[25] measures to mitigate war-related risks have mainly been proposed in the area of international cooperation.[20][26] International infrastructure may be expanded, giving more sovereignty to the international level. This could help coordinate efforts for arms control.[27] International institutions dedicated specifically to nanotechnology (perhaps analogous to the International Atomic Energy Agency, IAEA) or to general arms control may also be designed.[26] One may also jointly make differential technological progress on defensive technologies, a policy that players should usually favour.[20] The Center for Responsible Nanotechnology also suggests some technical restrictions.[28] Improved transparency regarding technological capabilities may be another important facilitator for arms control.[29]

Grey goo is another catastrophic scenario. Proposed by Eric Drexler in his 1986 book Engines of Creation,[30] it has been analyzed by Freitas in “Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations”[31] and has been a theme in mainstream media and fiction.[32][33] This scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nanotech experts including Drexler now discredit the scenario. According to Chris Phoenix, “So-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident”.[34] With the advent of nano-biotech, a different scenario called green goo has been put forward. Here, the malignant substance is not nanobots but rather self-replicating biological organisms engineered through nanotechnology.

Nanotechnology (or molecular nanotechnology to refer more specifically to the goals discussed here) will let us continue the historical trends in manufacturing right up to the fundamental limits imposed by physical law. It will let us make remarkably powerful molecular computers. It will let us make materials over fifty times lighter than steel or aluminium alloy but with the same strength. We’ll be able to make jets, rockets, cars or even chairs that, by today’s standards, would be remarkably light, strong, and inexpensive. Molecular surgical tools, guided by molecular computers and injected into the blood stream could find and destroy cancer cells or invading bacteria, unclog arteries, or provide oxygen when the circulation is impaired.

Nanotechnology will replace our entire manufacturing base with a new, radically more precise, radically less expensive, and radically more flexible way of making products. The aim is not simply to replace today’s computer chip making plants, but also to replace the assembly lines for cars, televisions, telephones, books, surgical tools, missiles, bookcases, airplanes, tractors, and all the rest. The objective is a pervasive change in manufacturing, a change that will leave virtually no product untouched. Economic progress and military readiness in the 21st Century will depend fundamentally on maintaining a competitive position in nanotechnology.

[35]

Despite the current early developmental status of nanotechnology and molecular nanotechnology, much concern surrounds MNT’s anticipated impact on economics[36][37] and on law. Whatever the exact effects, MNT, if achieved, would tend to reduce the scarcity of manufactured goods and make many more goods (such as food and health aids) manufacturable.

MNT should make possible nanomedical capabilities able to cure any medical condition not already cured by advances in other areas. Good health would be common, and poor health of any form would be as rare as smallpox and scurvy are today. Even cryonics would be feasible, as cryopreserved tissue could be fully repaired.

Molecular nanotechnology is one of the technologies that some analysts believe could lead to a technological singularity. Some feel that molecular nanotechnology would have daunting risks.[38] It conceivably could enable cheaper and more destructive conventional weapons. Also, molecular nanotechnology might permit weapons of mass destruction that could self-replicate, as viruses and cancer cells do when attacking the human body. Commentators generally agree that, in the event molecular nanotechnology were developed, its self-replication should be permitted only under very controlled or “inherently safe” conditions.

A fear exists that nanomechanical robots, if achieved, and if designed to self-replicate using naturally occurring materials (a difficult task), could consume the entire planet in their hunger for raw materials,[39] or simply crowd out natural life, out-competing it for energy (as happened historically when blue-green algae appeared and outcompeted earlier life forms). Some commentators have referred to this situation as the “grey goo” or “ecophagy” scenario. K. Eric Drexler considers an accidental “grey goo” scenario extremely unlikely and says so in later editions of Engines of Creation.
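The exponential character of this fear can be made explicit with back-of-envelope arithmetic. The numbers below are assumed round figures (a femtogram-scale replicator, an order-of-magnitude biosphere mass, an hourly doubling), not estimates from Freitas's ecophagy analysis.

```python
# Back-of-envelope arithmetic showing why exponential self-replication
# drives the "grey goo" concern. Replicator mass, biosphere mass, and
# doubling time are illustrative assumptions only.
import math

replicator_mass_kg = 1e-15   # assumed mass of one nanorobot
biosphere_mass_kg = 1e15     # rough order-of-magnitude total biomass
doubling_time_hours = 1.0    # assumed replication period

# Doublings needed for one replicator's lineage to match the biosphere's mass:
doublings = math.log2(biosphere_mass_kg / replicator_mass_kg)
days = doublings * doubling_time_hours / 24

print(round(doublings))   # 100
print(round(days, 1))     # 4.2
```

Under these toy assumptions the entire biosphere is matched in about a hundred doublings, i.e. a few days, which is why the debate centers on whether uncontrolled replication is physically and practically achievable at all rather than on how fast it would be.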

In light of this perception of potential danger, the Foresight Institute, founded by Drexler, has prepared a set of guidelines[40] for the ethical development of nanotechnology. These include the banning of free-foraging self-replicating pseudo-organisms on the Earth’s surface, at least, and possibly in other places.

The feasibility of the basic technologies analyzed in Nanosystems has been the subject of a formal scientific review by the U.S. National Academy of Sciences, and has also been the focus of extensive debate on the internet and in the popular press.

In 2006, the U.S. National Academy of Sciences released the report of a study of molecular manufacturing as part of a longer report, A Matter of Size: Triennial Review of the National Nanotechnology Initiative.[41] The study committee reviewed the technical content of Nanosystems, and in its conclusion states that no current theoretical analysis can be considered definitive regarding several questions of potential system performance, and that optimal paths for implementing high-performance systems cannot be predicted with confidence. It recommends experimental research to advance knowledge in this area.

A section heading in Drexler’s Engines of Creation reads[42] “Universal Assemblers”, and the following text speaks of multiple types of assemblers which, collectively, could hypothetically “build almost anything that the laws of nature allow to exist.” Drexler’s colleague Ralph Merkle has noted that, contrary to widespread legend,[43] Drexler never claimed that assembler systems could build absolutely any molecular structure. The endnotes in Drexler’s book explain the qualification “almost”: “For example, a delicate structure might be designed that, like a stone arch, would self-destruct unless all its pieces were already in place. If there were no room in the design for the placement and removal of a scaffolding, then the structure might be impossible to build. Few structures of practical interest seem likely to exhibit such a problem, however.”

In 1992, Drexler published Nanosystems: Molecular Machinery, Manufacturing, and Computation,[44] a detailed proposal for synthesizing stiff covalent structures using a table-top factory. Diamondoid structures and other stiff covalent structures, if achieved, would have a wide range of possible applications, going far beyond current MEMS technology. An outline of a path was put forward in 1992 for building a table-top factory in the absence of an assembler. Other researchers have begun advancing tentative, alternative proposed paths [5] for this in the years since Nanosystems was published.

In 2004 Richard Jones wrote Soft Machines (nanotechnology and life), a book for lay audiences published by Oxford University Press. In it he describes radical nanotechnology (as advocated by Drexler) as a deterministic/mechanistic idea of nano-engineered machines that does not take into account nanoscale challenges such as wetness, stickiness, Brownian motion, and high viscosity. He also explains what soft nanotechnology, or more appropriately biomimetic nanotechnology, is: in his view the way forward, if not the best way, to design functional nanodevices that can cope with all the problems at the nanoscale. One can think of soft nanotechnology as the development of nanomachines that uses the lessons learned from biology on how things work, chemistry to precisely engineer such devices, and stochastic physics to model the system and its natural processes in detail.

Several researchers, including Nobel Prize winner Dr. Richard Smalley (1943–2005),[45] attacked the notion of universal assemblers, leading to a rebuttal from Drexler and colleagues,[46] and eventually to an exchange of letters.[47] Smalley argued that chemistry is extremely complicated, reactions are hard to control, and that a universal assembler is science fiction. Drexler and colleagues, however, noted that Drexler never proposed universal assemblers able to make absolutely anything, but instead proposed more limited assemblers able to make a very wide variety of things. They challenged the relevance of Smalley’s arguments to the more specific proposals advanced in Nanosystems. Also, Smalley argued that nearly all of modern chemistry involves reactions that take place in a solvent (usually water), because the small molecules of a solvent contribute many things, such as lowering binding energies for transition states. Since nearly all known chemistry requires a solvent, Smalley felt that Drexler’s proposal to use a high vacuum environment was not feasible. However, Drexler addresses this in Nanosystems by showing mathematically that well-designed catalysts can provide the effects of a solvent and can fundamentally be made even more efficient than a solvent/enzyme reaction could ever be. It is noteworthy that, contrary to Smalley’s opinion that enzymes require water, “Not only do enzymes work vigorously in anhydrous organic media, but in this unnatural milieu they acquire remarkable properties such as greatly enhanced stability, radically altered substrate and enantiomeric specificities, molecular memory, and the ability to catalyse unusual reactions.”[48]

For the future, some means have to be found for MNT design evolution at the nanoscale which mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms combined with culling of the less-successful variants and reproduction of the more-successful variants, and macroscale engineering design also proceeds by a process of design evolution from simplicity to complexity as set forth somewhat satirically by John Gall: “A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and can not be patched up to make it work. You have to start over, beginning with a system that works.” [49] A breakthrough in MNT is needed which proceeds from the simple atomic ensembles which can be built with, e.g., an STM to complex MNT systems via a process of design evolution. A handicap in this process is the difficulty of seeing and manipulation at the nanoscale compared to the macroscale which makes deterministic selection of successful trials difficult; in contrast biological evolution proceeds via action of what Richard Dawkins has called the “blind watchmaker”[50] comprising random molecular variation and deterministic reproduction/extinction.
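The "random variation plus deterministic culling" loop described above can be sketched as a toy optimization. This is a sketch under stated assumptions: the fitness function and design vector are placeholders for illustration, not a real MNT design metric.

```python
# Toy sketch of design evolution as "random variation + deterministic
# selection", applied to an abstract design parameter vector. The
# fitness function and target are placeholders, not a real MNT metric.
import random

def fitness(design: list[float]) -> float:
    # Placeholder objective: designs closer to the target score higher.
    target = [1.0, 2.0, 3.0]
    return -sum((d - t) ** 2 for d, t in zip(design, target))

def evolve(design: list[float], generations: int = 200, step: float = 0.1) -> list[float]:
    rng = random.Random(0)  # fixed seed; only the variation is random
    for _ in range(generations):
        variant = [d + rng.uniform(-step, step) for d in design]  # random variation
        if fitness(variant) > fitness(design):                    # deterministic culling
            design = variant
    return design

start = [0.0, 0.0, 0.0]
best = evolve(start)
print(fitness(best) > fitness(start))  # True: variation + culling improved the design
```

The analogy to the text: "variant" plays the role of random molecular variation, while the deterministic acceptance test plays the role of culling; the engineering paradigm of modeling, testing, and redesign replaces the fitness function with human evaluation.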

At present in 2007, the practice of nanotechnology embraces both stochastic approaches (in which, for example, supramolecular chemistry creates waterproof pants) and deterministic approaches, wherein single molecules (created by stochastic chemistry) are manipulated on substrate surfaces (created by stochastic deposition methods) by deterministic methods comprising nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive. Since the mid-1990s, thousands of surface scientists and thin film technocrats have latched on to the nanotechnology bandwagon and redefined their disciplines as nanotechnology. This has caused much confusion in the field and has spawned thousands of “nano”-papers in the peer-reviewed literature. Most of these reports are extensions of the more ordinary research done in the parent fields.

The feasibility of Drexler’s proposals largely depends, therefore, on whether designs like those in Nanosystems could be built in the absence of a universal assembler to build them and would work as described. Supporters of molecular nanotechnology frequently claim that no significant errors have been discovered in Nanosystems since 1992. Even some critics concede[51] that “Drexler has carefully considered a number of physical principles underlying the ‘high level’ aspects of the nanosystems he proposes and, indeed, has thought in some detail” about some issues.

Other critics claim, however, that Nanosystems omits important chemical details about the low-level ‘machine language’ of molecular nanotechnology.[52][53][54][55] They also claim that much of the other low-level chemistry in Nanosystems requires extensive further work, and that Drexler’s higher-level designs therefore rest on speculative foundations. Recent such further work by Freitas and Merkle [56] is aimed at strengthening these foundations by filling the existing gaps in the low-level chemistry.

Drexler argues that we may need to wait until our conventional nanotechnology improves before solving these issues: “Molecular manufacturing will result from a series of advances in molecular machine systems, much as the first Moon landing resulted from a series of advances in liquid-fuel rocket systems. We are now in a position like that of the British Interplanetary Society of the 1930s which described how multistage liquid-fueled rockets could reach the Moon and pointed to early rockets as illustrations of the basic principle.”[57] However, Freitas and Merkle argue [58] that a focused effort to achieve diamond mechanosynthesis (DMS) can begin now, using existing technology, and might achieve success in less than a decade if their “direct-to-DMS approach is pursued rather than a more circuitous development approach that seeks to implement less efficacious nondiamondoid molecular manufacturing technologies before progressing to diamondoid”.

To summarize the arguments against feasibility: First, critics argue that a primary barrier to achieving molecular nanotechnology is the lack of an efficient way to create machines on a molecular/atomic scale, especially in the absence of a well-defined path toward a self-replicating assembler or diamondoid nanofactory. Advocates respond that a preliminary research path leading to a diamondoid nanofactory is being developed.[6]

A second difficulty in reaching molecular nanotechnology is design. Hand design of a gear or bearing at the level of atoms might take a few to several weeks. While Drexler, Merkle and others have created designs of simple parts, no comprehensive design effort for anything approaching the complexity of a Model T Ford has been attempted. Advocates respond that it is difficult to undertake a comprehensive design effort in the absence of significant funding for such efforts, and that despite this handicap much useful design-ahead has nevertheless been accomplished with new software tools that have been developed, e.g., at Nanorex.[59]

In the latest report, A Matter of Size: Triennial Review of the National Nanotechnology Initiative,[41] put out by the National Academies Press in December 2006 (roughly twenty years after Engines of Creation was published), no clear way forward toward molecular nanotechnology could yet be seen, as per the conclusion on page 108 of that report: “Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths that might lead to systems which greatly exceed the thermodynamic efficiencies and other capabilities of biological systems cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal.” This call for research leading to demonstrations is welcomed by groups such as the Nanofactory Collaboration, who are specifically seeking experimental successes in diamond mechanosynthesis.[60] The “Technology Roadmap for Productive Nanosystems”[61] aims to offer additional constructive insights.

It is perhaps interesting to ask whether or not most structures consistent with physical law can in fact be manufactured. Advocates assert that to achieve most of the vision of molecular manufacturing it is not necessary to be able to build “any structure that is compatible with natural law.” Rather, it is necessary to be able to build only a sufficient (possibly modest) subset of such structures, as is true, in fact, of any practical manufacturing process used in the world today, and even in biology. In any event, as Richard Feynman once said, “It is scientific only to say what’s more likely or less likely, and not to be proving all the time what’s possible or impossible.”[62]

There is a growing body of peer-reviewed theoretical work on synthesizing diamond by mechanically removing/adding hydrogen atoms [63] and depositing carbon atoms [64][65][66][67][68][69] (a process known as mechanosynthesis). This work is slowly permeating the broader nanoscience community and is being critiqued. For instance, Peng et al. (2006)[70] (in the continuing research effort by Freitas, Merkle and their collaborators) reports that the most-studied mechanosynthesis tooltip motif (DCB6Ge) successfully places a C2 carbon dimer on a C(110) diamond surface at both 300K (room temperature) and 80K (liquid nitrogen temperature), and that the silicon variant (DCB6Si) also works at 80K but not at 300K. Over 100,000 CPU hours were invested in this latest study. The DCB6 tooltip motif, initially described by Merkle and Freitas at a Foresight Conference in 2002, was the first complete tooltip ever proposed for diamond mechanosynthesis and remains the only tooltip motif that has been successfully simulated for its intended function on a full 200-atom diamond surface.

The tooltips modeled in this work are intended to be used only in carefully controlled environments (e.g., vacuum). Maximum acceptable limits for tooltip translational and rotational misplacement errors are reported in Peng et al. (2006): tooltips must be positioned with great accuracy to avoid bonding the dimer incorrectly. Peng et al. (2006) reports that increasing the handle thickness from 4 support planes of C atoms above the tooltip to 5 planes decreases the resonance frequency of the entire structure from 2.0 THz to 1.8 THz. More importantly, the vibrational footprints of a DCB6Ge tooltip mounted on a 384-atom handle and of the same tooltip mounted on a similarly constrained but much larger 636-atom “crossbar” handle are virtually identical in the non-crossbar directions. Additional computational studies modeling still bigger handle structures are welcome, but the ability to precisely position SPM tips to the requisite atomic accuracy has been repeatedly demonstrated experimentally at low temperature,[71][72] or even at room temperature,[73][74] constituting a basic existence proof for this capability.

Further research[75] to consider additional tooltips will require time-consuming computational chemistry and difficult laboratory work.

A working nanofactory would require a variety of well-designed tips for different reactions, and detailed analyses of placing atoms on more complicated surfaces. Although this appears a challenging problem given current resources, many tools will be available to help future researchers: Moore’s law predicts further increases in computer power, semiconductor fabrication techniques continue to approach the nanoscale, and researchers grow ever more skilled at using proteins, ribosomes and DNA to perform novel chemistry.
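The Moore's-law point above can be made concrete with hedged arithmetic: if available compute doubles roughly every two years (the usual rule of thumb, not a guarantee), the roughly 100,000 CPU hours invested in the Peng et al. (2006) study shrink rapidly in equivalent cost. The figures are illustrative.

```python
# Hedged projection of the Moore's-law point: assuming compute doubles
# roughly every two years, a study costing ~100,000 CPU hours today
# would cost the equivalent of far fewer hours a decade later.
# The doubling period and horizon are illustrative assumptions.
cpu_hours_today = 100_000
years = 10
doublings = years / 2                       # one doubling per two years
equivalent_hours = cpu_hours_today / (2 ** doublings)
print(equivalent_hours)  # 3125.0
```

That is, under these assumptions the same simulation campaign would cost about 3% of today's compute budget in ten years, which is the sense in which future researchers will have "many tools" available.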


Nanotechnology – Wikipedia

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. Since the popularity spike in the 1980s, most of nanotechnology has involved investigation of several approaches to making mechanical devices out of a small number of atoms.[10]

In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era. First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds; it was successfully used to manipulate individual atoms in 1989. The microscope's developers, Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, received the Nobel Prize in Physics in 1986.[11][12] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[13][14] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes), which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[15] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[16]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[17][18]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[19][20] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high-performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which are approximately a quarter of a nm in kinetic diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[21] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[22]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[23] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[23]
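The marble-to-Earth analogy can be checked with one line of arithmetic. A minimal sketch, assuming a marble diameter of about 1.3 cm and Earth's mean diameter of about 1.274 × 10⁷ m (both illustrative figures, not taken from the text):

```python
# Check the marble-to-Earth scale analogy: if 1 nm is to 1 m as a
# marble is to the Earth, the marble/Earth ratio should come out
# close to 1e-9.
earth_diameter_m = 1.274e7    # mean diameter of the Earth
marble_diameter_m = 0.013     # a typical glass marble (assumed value)

nm_to_m = 1e-9                                       # ratio of a nanometer to a meter
marble_to_earth = marble_diameter_m / earth_diameter_m

# The two ratios agree to within a few percent, so the analogy holds.
print(f"{nm_to_m:.1e} vs {marble_to_earth:.1e}")
```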

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[24] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[25]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the "quantum size effect" where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, which alters the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. The mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
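The "quantum size effect" can be illustrated with the textbook particle-in-a-box model, in which an electron's energy levels scale as 1/L² with box width L. A toy-model sketch, not a description of any specific material:

```python
import math

# Particle-in-a-box estimate of how electron energy-level spacing
# grows as a structure shrinks -- a standard illustration of the
# quantum size effect.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def level_spacing_ev(box_nm: float) -> float:
    """Gap between the ground and first excited state of an
    electron in a 1-D box of the given width, in eV."""
    L = box_nm * 1e-9
    e1 = (math.pi * HBAR) ** 2 / (2 * M_E * L ** 2)  # ground-state energy
    return 3 * e1 / EV                               # E2 - E1 = 3 * E1

print(level_spacing_ev(1000.0))  # ~1 micron box: spacing ~1e-6 eV, negligible
print(level_spacing_ev(5.0))     # 5 nm box: spacing ~0.045 eV, larger than kT at room temperature
```

Below roughly 100 nm the spacing becomes comparable to thermal and optical energy scales, which is why electronic properties start to depend on size.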

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[26]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term "nanotechnology" was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogies of traditional machine components demonstrated that molecular machines were possible: from the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[27] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[28] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[29] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[30] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop by changing voltage: a nanotube nanomotor, a molecular actuator,[31] and a nanoelectromechanical relaxation oscillator.[32] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[35]

These seek to arrange smaller components into more complex assemblies.

These seek to create smaller devices by using larger ones to direct their assembly.

These seek to develop components of a desired functionality without regard to how they might be assembled.

These subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified as 0D, 1D, 2D, and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical, and biological properties. As dimensionality decreases, an increase in surface-to-volume ratio is observed, meaning that lower-dimensional nanomaterials have a higher surface area than 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug-delivery, and biosensor applications.

There are several important modern developments. The atomic force microscope (AFM) and the scanning tunneling microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[53][54] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography and nanoimprint lithography were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[55]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[53][54] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

However, new therapeutic products, based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[56]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[18] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of "first generation" passive nanomaterials, which includes titanium dioxide in sunscreen, cosmetics, surface coatings,[57] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[17]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[58] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[59] Nanotechnology may also be used to build structures for on-chip computing with light, for example on-chip optical quantum information processing and picosecond transmission of information.[60]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[61] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[62]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[63] Platinum is used in both the reduction and the oxidation catalysts.[64] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish fund InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst's surface area that is exposed to the exhaust fumes is maximized, efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[65]
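The surface-area reasoning behind such projects can be made concrete with idealized spheres: dividing a fixed volume of catalyst into equal particles of radius r gives a total surface area of 3V/r, so shrinking the particles by a factor of k multiplies the exposed area by k. A rough sketch with illustrative numbers (not figures from the project):

```python
import math

def total_surface_area(total_volume_m3: float, particle_radius_m: float) -> float:
    """Total surface area when a fixed volume is divided into equal
    spheres of the given radius (idealized, non-merging particles)."""
    particle_volume = (4.0 / 3.0) * math.pi * particle_radius_m ** 3
    n_particles = total_volume_m3 / particle_volume
    particle_area = 4.0 * math.pi * particle_radius_m ** 2
    return n_particles * particle_area          # equals 3 * V / r

volume = 1e-6  # one cubic centimeter of catalyst material, in m^3

area_bulk = total_surface_area(volume, 1e-3)   # 1 mm grains
area_nano = total_surface_area(volume, 10e-9)  # 10 nm particles

# Surface area scales as 1/r: 10 nm particles expose 100,000x more
# surface than 1 mm grains of the same total volume.
print(area_bulk, area_nano, area_nano / area_bulk)
```

This is why keeping nanoparticles from merging matters: any coalescence grows r and directly reduces the working surface.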

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell's microenvironment to direct its differentiation down a suitable lineage.[66] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[67]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[68]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[69][70]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[71] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[72]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[73]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[74] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[75] Cambridge, Massachusetts in 2008 considered enacting a similar law,[76] but ultimately rejected it.[77] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[78] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly. Over the next several decades, applications of nanotechnology will likely include much higher-capacity computers, active materials of various kinds, and cellular-scale biomedical devices.[79]

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[80] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[81] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[82][83]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[84]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the "nanotechnology revolution", could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully."[85] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[86] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[87][88][89][90]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[91] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by "bolting on" nanotechnology to existing regulations, but there are clear gaps in these regimes.[92] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[93]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[94] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[95] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[96][97]

The Royal Society report[15] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application, with participants in public deliberations more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[73]


Grey goo – Wikipedia

Grey goo (also spelled gray goo) is a hypothetical end-of-the-world scenario involving molecular nanotechnology in which out-of-control self-replicating robots consume all biomass on Earth while building more of themselves,[1][2] a scenario that has been called ecophagy (“eating the environment”, more literally “eating the habitation”).[3] The original idea assumed machines were designed to have this capability, while popularizations have assumed that machines might somehow gain this capability by accident.

Self-replicating machines of the macroscopic variety were originally described by mathematician John von Neumann, and are sometimes referred to as von Neumann machines or clanking replicators. The term gray goo was coined by nanotechnology pioneer Eric Drexler in his 1986 book Engines of Creation.[4] In 2004 he stated, "I wish I had never used the term 'gray goo'."[5] Engines of Creation mentions "gray goo" in two paragraphs and a note, while the popularized idea of gray goo was first publicized in a mass-circulation magazine, Omni, in November 1986.[6]

The term was first used by molecular nanotechnology pioneer Eric Drexler in his book Engines of Creation (1986). In Chapter 4, "Engines of Abundance", Drexler illustrates both exponential growth and inherent limits (not gray goo) by describing nanomachines that can function only if given special raw materials:

Imagine such a replicator floating in a bottle of chemicals, making copies of itself… the first replicator assembles a copy in one thousand seconds, the two replicators then build two more in the next thousand seconds, the four build another four, and the eight build another eight. At the end of ten hours, there are not thirty-six new replicators, but over 68 billion. In less than a day, they would weigh a ton; in less than two days, they would outweigh the Earth; in another four hours, they would exceed the mass of the Sun and all the planets combined – if the bottle of chemicals hadn't run dry long before.
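Drexler's arithmetic can be checked directly: doubling every thousand seconds gives 2³⁶ ≈ 68.7 billion replicators after ten hours, and with an assumed per-replicator mass of about a femtogram (an assumption for illustration, not a figure from the text) the ton and Earth-mass milestones land roughly where he says. A quick sketch:

```python
import math

DOUBLING_TIME_S = 1000.0     # one copy per thousand seconds (from the text)
REPLICATOR_MASS_KG = 1e-15   # ~1 femtogram per replicator (assumed value)

# After ten hours, i.e. 36 thousand-second periods, the population is 2^36:
count_10h = 2 ** (10 * 3600 // 1000)
print(count_10h)  # 68719476736 -> "over 68 billion"

def hours_to_reach_mass(target_kg: float) -> float:
    """Hours until the colony's total mass reaches the target,
    assuming uninterrupted doubling."""
    doublings = math.log2(target_kg / REPLICATOR_MASS_KG)
    return doublings * DOUBLING_TIME_S / 3600.0

print(hours_to_reach_mass(1000.0))   # ~16.6 h: under a day to weigh a ton
print(hours_to_reach_mass(5.97e24))  # ~36.7 h: under two days to outweigh the Earth
```

The point of the passage survives the check: unconstrained exponential replication hits planetary scales in days, which is exactly why the raw-material limit matters.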

According to Drexler, the term was popularized by an article in science fiction magazine Omni, which also popularized the term nanotechnology in the same issue. Drexler says arms control is a far greater issue than grey goo “nanobugs”.[7]

In a History Channel broadcast, a contrasting idea (a kind of gray goo) is referred to in a futuristic doomsday scenario: "In a common practice, billions of nanobots are released to clean up an oil spill off the coast of Louisiana. However, due to a programming error, the nanobots devour all carbon-based objects, instead of just the hydrocarbons of the oil. The nanobots destroy everything, all the while replicating themselves. Within days, the planet is turned to dust."[8]

Drexler describes gray goo in Chapter 11 of Engines of Creation:

Early assembler-based replicators could beat the most advanced modern organisms. 'Plants' with 'leaves' no more efficient than today's solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough, omnivorous 'bacteria' could out-compete real bacteria: they could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop – at least if we made no preparation. We have trouble enough controlling viruses and fruit flies.

Drexler notes that the geometric growth made possible by self-replication is inherently limited by the availability of suitable raw materials.
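That limit is easy to make concrete: doubling halts as soon as the next generation would exceed the available feedstock. In this sketch the feedstock budget (a trillion replicators' worth) is an assumed illustrative figure, not from the source:

```python
# Doubling growth capped by raw materials ("the bottle runs dry").
# The feedstock figure is invented for illustration.
feedstock_units = 10**12   # raw material for at most a trillion replicators
population = 1
periods = 0
while population * 2 <= feedstock_units:
    population *= 2
    periods += 1
print(periods, population)  # 39 549755813888 -- then growth stops
```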

Drexler used the term “gray goo” not to indicate color or texture, but to emphasize the difference between “superiority” in terms of human values and “superiority” in terms of competitive success:

Though masses of uncontrolled replicators need not be grey or gooey, the term “grey goo” emphasizes that replicators able to obliterate life might be less inspiring than a single species of crabgrass. They might be “superior” in an evolutionary sense, but this need not make them valuable.

Bill Joy, one of the founders of Sun Microsystems, discussed some of the problems with pursuing this technology in his now-famous 2000 article in Wired magazine, titled “Why the Future Doesn’t Need Us”. In direct response to Joy’s concerns, the first quantitative technical analysis of the ecophagy scenario was published in 2000 by nanomedicine pioneer Robert Freitas.[3]

Drexler more recently conceded that there is no need to build anything that even resembles a potential runaway replicator. This would avoid the problem entirely. In a paper in the journal Nanotechnology, he argues that self-replicating machines are needlessly complex and inefficient. His 1992 technical book on advanced nanotechnologies Nanosystems: Molecular Machinery, Manufacturing, and Computation[9] describes manufacturing systems that are desktop-scale factories with specialized machines in fixed locations and conveyor belts to move parts from place to place. None of these measures would prevent a party from creating a weaponized grey goo, were such a thing possible.

Prince Charles called upon the British Royal Society to investigate the “enormous environmental and social risks” of nanotechnology in a planned report, leading to much media commentary on grey goo. The Royal Society’s report on nanoscience was released on 29 July 2004, and declared the possibility of self-replicating machines to lie too far in the future to be of concern to regulators.[10]

More recent analysis in the paper titled Safe Exponential Manufacturing from the Institute of Physics (co-written by Chris Phoenix, Director of Research of the Center for Responsible Nanotechnology, and Eric Drexler), shows that the danger of grey goo is far less likely than originally thought.[11] However, other long-term major risks to society and the environment from nanotechnology have been identified.[12] Drexler has made a somewhat public effort to retract his grey goo hypothesis, in an effort to focus the debate on more realistic threats associated with knowledge-enabled nanoterrorism and other misuses.[13]

In Safe Exponential Manufacturing, which was published in a 2004 issue of Nanotechnology, it was suggested that creating manufacturing systems with the ability to self-replicate by the use of their own energy sources would not be needed.[14] The Foresight Institute also recommended embedding controls in the molecular machines. These controls would be able to prevent anyone from purposely abusing nanotechnology, and therefore avoid the grey goo scenario.[15]

Grey goo is a useful construct for considering low-probability, high-impact outcomes from emerging technologies, and is thus a useful tool in the ethics of technology. Daniel A. Vallero[16] applied it as a worst-case scenario thought experiment for technologists contemplating possible risks from advancing a technology. This requires that a decision tree or event tree include even extremely low-probability events if such events may have an extremely negative and irreversible consequence, i.e. application of the precautionary principle. Dianne Irving[17] admonishes that “any error in science will have a rippling effect…”. Vallero adapted this reference to chaos theory to emerging technologies, wherein slight perturbations of initial conditions can lead to unforeseen and profoundly negative downstream effects, for which the technologist and the new technology’s proponents must be held accountable.
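Vallero's point can be sketched as a toy event tree. All branch names, probabilities, and harm scores below are invented for illustration: under a plain expected-value screen the runaway branch contributes almost nothing, but a precautionary screen flags it anyway because its consequence is irreversible.

```python
# Toy event tree; every number here is hypothetical.
outcomes = [
    {"name": "normal operation",    "p": 0.99, "harm": 0.0,  "irreversible": False},
    {"name": "contained failure",   "p": 0.01, "harm": 1e3,  "irreversible": False},
    {"name": "runaway replication", "p": 1e-9, "harm": 1e9,  "irreversible": True},
]

# Expected-value screen: dominated by the reversible contained failure.
expected_harm = sum(o["p"] * o["harm"] for o in outcomes)

# Precautionary screen: keep any irreversible branch regardless of probability.
flagged = [o["name"] for o in outcomes if o["irreversible"]]
print(expected_harm, flagged)
```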

See the original post here:

Grey goo – Wikipedia

Nanotechnology – Wikipedia

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in modern era.

First came the invention of the scanning tunneling microscope in 1981, which provided unprecedented visualization of individual atoms and bonds and was successfully used to manipulate individual atoms in 1989. The microscope’s developers, Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, received the Nobel Prize in Physics in 1986.[10][11] Binnig, Quate, and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[12][13] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes), which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[14] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[15]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[16][17]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[18][19] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon–carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which are approximately a quarter of a nm in kinetic diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[20] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[21]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the Earth.[22] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[22]
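The marble comparison is easy to check. Taking a marble diameter of roughly 1.3 cm (an assumed figure) against Earth's mean diameter of about 12,742 km gives a ratio of about one billionth, the same as a nanometer to a meter:

```python
marble_d_m = 0.013      # assumed ~1.3 cm marble
earth_d_m = 1.2742e7    # Earth's mean diameter, ~12,742 km
ratio = marble_d_m / earth_d_m
print(ratio)            # ~1.0e-9, matching 1 nm / 1 m
```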

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[23] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[24]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. The mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
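The surface-area-to-volume effect can be made concrete for a spherical particle, where the ratio works out to 3/r and so grows tenfold for every tenfold reduction in radius:

```python
import math

def surface_to_volume(r_m):
    """Surface-area-to-volume ratio of a sphere of radius r, in 1/m."""
    return (4 * math.pi * r_m**2) / ((4 / 3) * math.pi * r_m**3)  # simplifies to 3/r

# Shrinking a particle from 1 mm to 10 nm raises SA/V by five orders of magnitude.
print(surface_to_volume(1e-3))  # ~3,000 per metre
print(surface_to_volume(1e-8))  # ~300,000,000 per metre
```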

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[25]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.
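As a minimal illustration of molecular recognition, Watson–Crick pairing amounts to a strict one-to-one complement rule, which is why one strand fully determines its partner:

```python
# Watson-Crick pairing as a lookup table: each base has exactly one
# complementary partner, which is what makes the assembly specific.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Reverse complement of a DNA strand given as a string of A/T/G/C."""
    return "".join(PAIR[base] for base in reversed(strand))

print(complement("ATGC"))  # GCAT
```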

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi), it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogies of traditional machine components demonstrated that molecular machines were possible: from the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[26] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[27] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[28] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[29] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[30] and a nanoelectromechanical relaxation oscillator.[31] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[34]

Bottom-up approaches seek to arrange smaller components into more complex assemblies.

Top-down approaches seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified as 0D, 1D, 2D, and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical, and biological properties. With a decrease in dimensionality, an increase in surface-to-volume ratio is observed, meaning that lower-dimensional nanomaterials have a higher surface area compared to 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery, and biosensor applications.
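The dimensionality argument can be sketched with a simple counting model: for a cube of n×n×n atoms, the fraction of atoms sitting on the surface rises sharply as n shrinks, so small particles are dominated by surface atoms.

```python
def surface_fraction(n):
    """Fraction of atoms on the surface of an n x n x n atomic cube."""
    interior = max(n - 2, 0) ** 3   # atoms not touching any face
    return 1 - interior / n**3

for n in (3, 10, 100, 1000):
    print(n, round(surface_fraction(n), 4))  # ~0.963, 0.488, 0.0588, 0.006
```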

There are several important modern developments. The atomic force microscope (AFM) and the Scanning Tunneling Microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[52][53] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip-pen nanolithography, electron beam lithography and nanoimprint lithography were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[54]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[52][53] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

However, new therapeutic products, based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[55]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[17] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[56] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[16]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[57] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[58] Nanotechnology may also be used to build structures for on-chip computing with light, for example on-chip optical quantum information processing and picosecond transmission of information.[59]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[60] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[61]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[62] Platinum is used in both the reduction and the oxidation catalysts.[63] Using platinum, though, is inefficient in that it is expensive and unsustainable. Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst’s surface area that is exposed to the exhaust fumes is maximized, efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst – in turn leading to cleaner exhaust fumes – and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[64]
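The surface-area logic behind the project is simple geometry: splitting a fixed volume of catalyst into n equal, non-merging spheres multiplies the exposed area by n^(1/3). A sketch (the volume figure is an assumed illustration, not from the source):

```python
import math

def total_area(volume_m3, n_particles):
    """Total surface area when a fixed volume is split into n equal spheres."""
    r = (3 * volume_m3 / (4 * math.pi * n_particles)) ** (1 / 3)
    return n_particles * 4 * math.pi * r**2  # grows as n**(1/3)

v = 1e-6  # assumed 1 cm^3 of catalyst material
print(total_area(v, 1))
print(total_area(v, 10**9))  # a billion particles: ~1000x the area of one lump
```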

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell’s microenvironment to direct its differentiation down a suitable lineage.[65] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[66]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[67]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[68][69]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[70] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[71]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[72]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[73] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[74] Cambridge, Massachusetts in 2008 considered enacting a similar law,[75] but ultimately rejected it.[76] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[77] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly.

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[78] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[79] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[80][81]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[82]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes – a poster child for the “nanotechnology revolution” – could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said “We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully.”[83] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[84] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[85][86][87][88]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[89] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by “bolting on” nanotechnology to existing regulations, but there are clear gaps in these regimes.[90] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[91]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[92] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[93] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[94][95]

The Royal Society report[14] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes (such as end-of-life regulations) publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application: participants in public deliberations were more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[72]

Read the original post:

Nanotechnology – Wikipedia

Nanotechnology : Dallas County Community College District

Nanotechnology and nanoscience refer to the behavior and properties of materials at the nanoscale: about 1,000 times smaller than is visible to the human eye. The technology allows for the fabrication of devices with molecular dimensions, as well as producing entirely new properties that emerge at that size.

Applications can be found in areas as diverse as semiconductors, electronics, medicine, robotics, energy production and other fields. Learn more about nanotechnology.

Nanotechnology has been identified by the U.S. Department of Labor as one of the country's top three emerging technologies over the next decade. Still in its relative infancy, it has the potential to revolutionize science.

The ability to earn a degree in nanotechnology is relatively new, with Richland College offering one of the few associate degrees in the area. Several Texas universities and colleges offer bachelor's, master's or doctoral degrees with an emphasis in nanotechnology.

If you are already in or considering a career path in a science- or manufacturing-related field (including chemistry, biology, physics, medicine, engineering, electronics, telecommunications or semiconductor manufacturing), you should look at nanotechnology.

There is no one job described as a nanotechnician, but a number of career fields incorporate nanotechnology into their research, development, manufacturing and production processes.

It's the wide range of potential products and applications that gives nanotechnology its enormous job-growth prospects. According to a study by market researcher Global Information Inc., the annual worldwide market for products incorporating nanotechnology is expected to reach $3.3 trillion by 2018.

Though many career paths incorporate nanotechnology, engineering positions in particular are projected for high growth. Workforce Solutions of Greater Dallas estimates that more than 30,000 engineering positions (including electronic, environmental, mechanical, civil and petroleum engineers) will be available locally this year. CareerOneStop, sponsored by the U.S. Department of Labor, estimates 20 to 42 percent growth in all engineering fields (high growth is considered to be more than 10 percent annually) through 2024 in Texas.

The U.S. Bureau of Labor Statistics projects that the fastest-growing engineering specialty will be biomedical engineering. Jobs in this field, which centers on developing and testing health-care innovations such as artificial organs or imaging systems, are expected to grow by an astounding 72 percent.

See more about careers in Nanotechnology.

Richland College

Richland College is the only college of DCCCD to offer a program in Nanotechnology. See more about the associate degree in Nanotechnology.

The rest is here:

Nanotechnology : Dallas County Community College District

What is Nanotechnology? | Nano

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers.

Physicist Richard Feynman, the father of nanotechnology.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled “There's Plenty of Room at the Bottom” by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology (CalTech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn't until 1981, with the development of the scanning tunneling microscope that could “see” individual atoms, that modern nanotechnology began.

It's hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ meters.

Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms: the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

But something as small as an atom is impossible to see with the naked eye. In fact, it's impossible to see with the microscopes typically used in high school science classes. The microscopes needed to see things at the nanoscale were invented relatively recently, about 30 years ago.

Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born.

Although modern nanoscience and nanotechnology are quite new, nanoscale materials were used for centuries. Alternate-sized gold and silver particles created colors in the stained glass windows of medieval churches hundreds of years ago. The artists back then just didn't know that the process they used to create these beautiful works of art actually led to changes in the composition of the materials they were working with.

Today's scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of their enhanced properties, such as higher strength, lighter weight, increased control of the light spectrum, and greater chemical reactivity compared with their larger-scale counterparts.

Read more from the original source:

What is Nanotechnology? | Nano

Nanotechnology – Wikipedia


Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, the emergence of nanotechnology as a field in the 1980s occurred through the convergence of Drexler's theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era.

First came the invention of the scanning tunneling microscope in 1981, which provided unprecedented visualization of individual atoms and bonds, and was successfully used to manipulate individual atoms in 1989. The microscope's developers, Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, received the Nobel Prize in Physics in 1986.[10][11] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[12][13] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes, and sometimes Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[14] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[15]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[16][17]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[18][19] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which are approximately a quarter of a nm kinetic diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[20] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[21]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[22] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[22]
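The scale comparisons above amount to simple ratios, which can be sanity-checked in a few lines. This is a sketch in Python; the marble and Earth diameters are rough assumed figures, not taken from the source:

```python
# Rough check of the nanometer-to-meter vs. marble-to-Earth comparison.
# The marble and Earth diameters below are approximate assumptions.

NANOMETER = 1e-9   # meters in one nanometer
MARBLE = 1.3e-2    # m, typical marble diameter (assumed)
EARTH = 1.274e7    # m, Earth's mean diameter (approximate)

# The two ratios should land in the same order of magnitude (~1e-9).
print(f"nanometer / meter : {NANOMETER:.2e}")
print(f"marble / Earth    : {MARBLE / EARTH:.2e}")
```

Running it shows both ratios are about 1e-9, which is all the comparison claims.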

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[23] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[24]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface-area-to-volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[25]
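The surface-area-to-volume effect mentioned above is easy to quantify for an idealized spherical particle, where the ratio works out to 3/r and therefore grows as the particle shrinks. A minimal sketch under that idealization:

```python
import math

def surface_to_volume(radius_m: float) -> float:
    """Surface-area-to-volume ratio of a sphere (1/m); simplifies to 3/r."""
    area = 4 * math.pi * radius_m**2
    volume = (4 / 3) * math.pi * radius_m**3
    return area / volume

# Shrinking a particle from 1 mm to 10 nm raises the ratio 100,000-fold,
# which is why surface-dominated properties take over at the nanoscale.
for r in (1e-3, 1e-6, 1e-8):
    print(f"r = {r:g} m  ->  A/V = {surface_to_volume(r):.3g} per m")
```

The same 3/r scaling is why a gram of nanoparticles exposes vastly more reactive surface than the same gram of bulk material.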

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base-pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.
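As a loose programming analogy for the molecular-recognition idea (not a chemical model), Watson–Crick complementarity can be expressed as a simple pairing rule: two strands "fit" only if every base matches its partner:

```python
# Toy illustration of Watson-Crick complementarity: A pairs with T, G with C.
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the complementary strand, read in the same direction."""
    return "".join(PAIRS[base] for base in strand)

def is_complementary(a: str, b: str) -> bool:
    """Two strands are complementary only if every base pairs correctly."""
    return len(a) == len(b) and complement(a) == b

print(is_complementary("ATGC", "TACG"))  # True: every base pairs
print(is_complementary("ATGC", "TACC"))  # False: one mismatch breaks the fit
```

The all-or-nothing match mirrors why designed complementary components can self-assemble selectively: a single mismatch makes the arrangement unfavorable.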

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular scale biological analogies of traditional machine components demonstrated molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[26] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[27] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[28] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[29] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[30] and a nanoelectromechanical relaxation oscillator.[31] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[34]

Bottom-up approaches seek to arrange smaller components into more complex assemblies.

Top-down approaches seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified into 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological properties. With a decrease in dimensionality, an increase in surface-to-volume ratio is observed, meaning that lower-dimensional nanomaterials have a higher surface area than 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.

There are several important modern developments. The atomic force microscope (AFM) and the scanning tunneling microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[52][53] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography, such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography and nanoimprint lithography, were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[54]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[52][53] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

New therapeutic products based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[55]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[17] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[56] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[16]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[57] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[58] Nanotechnology may also be used to build structures for on-chip computing with light, for example on-chip optical quantum information processing, and picosecond transmission of information.[59]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[60] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[61]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[62] Platinum is used in both the reduction and the oxidation catalysts.[63] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish fund InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst's surface area that is exposed to the exhaust fumes is maximized, the efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst (in turn leading to cleaner exhaust fumes) and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[64]
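The surface-area reasoning behind such catalyst projects can be made concrete: if a fixed mass of platinum is divided into ideal non-merging spheres, the exposed area per kilogram scales as 3/(ρr), so halving the particle radius doubles the catalytic surface for the same amount of platinum. A sketch with illustrative numbers (the idealized-sphere assumption is ours, not from the project):

```python
import math

PT_DENSITY = 21450.0  # kg/m^3, density of platinum

def area_per_kg(radius_m: float) -> float:
    """Total surface area (m^2) per kg of platinum split into spheres of radius r.

    area / mass = (4*pi*r^2) / (rho * (4/3)*pi*r^3) = 3 / (rho * r)
    """
    mass_per_particle = PT_DENSITY * (4 / 3) * math.pi * radius_m**3
    area_per_particle = 4 * math.pi * radius_m**2
    return area_per_particle / mass_per_particle

# Halving particle radius doubles exposed area for the same platinum mass.
for r in (1e-6, 1e-8, 5e-9):
    print(f"r = {r:g} m -> {area_per_kg(r):.3g} m^2/kg")
```

This is why keeping nanoparticles from merging matters: coalescence increases the effective radius and directly reduces exposed area per unit mass.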

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell's microenvironment to direct its differentiation down a suitable lineage.[65] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[66]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[67]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[68][69]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[70] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[71]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[72]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[73] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[74] Cambridge, Massachusetts in 2008 considered enacting a similar law,[75] but ultimately rejected it.[76] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[77] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly.

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[78] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[79] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[80][81]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[82]

A major study published more recently in Nature Nanotechnology suggests that some forms of carbon nanotubes, a poster child for the "nanotechnology revolution", could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said, "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully."[83] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[84] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[85][86][87][88]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[89] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by "bolting on" nanotechnology to existing regulations, but there are clear gaps in these regimes.[90] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[91]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[92] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[93] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[94][95]

The Royal Society report[14] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application: participants in public deliberations were more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[72]

Nanotechnology – Simple English Wikipedia, the free …

Nanotechnology is a part of science and technology about the control of matter on the atomic and molecular scale – this means things that are about 100 nanometres across

Nanotechnology includes making products that use parts this small, such as electronic devices, catalysts, sensors, etc. To give you an idea of how small that is, there are more nanometres in an inch than there are inches in 400 miles.[1]

To give a metric idea of how small that is: there are as many nanometres in a centimetre as there are centimetres in 100 kilometres.
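Both size comparisons check out with simple arithmetic:

```python
# Imperial comparison: nanometres per inch vs. inches in 400 miles.
nm_per_inch = 2.54e7                     # 1 inch = 2.54 cm = 25,400,000 nm
inches_per_400_miles = 400 * 5280 * 12   # 25,344,000 inches

# Metric comparison: nanometres per centimetre vs. centimetres in 100 km.
nm_per_cm = 1e7                          # 10,000,000 nm
cm_per_100_km = 100 * 1000 * 100         # 10,000,000 cm

print(nm_per_inch > inches_per_400_miles)  # True (25.4 million vs. 25.3 million)
print(nm_per_cm == cm_per_100_km)          # True (both exactly 10 million)
```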

Nanotechnology brings together scientists and engineers from many different subjects, such as applied physics, materials science, interface and colloid science, device physics, chemistry, supramolecular chemistry (which refers to the area of chemistry that focuses on the non-covalent bonding interactions of molecules), self-replicating machines and robotics, chemical engineering, mechanical engineering, biology, biological engineering, and electrical engineering.

Generally, when people talk about nanotechnology, they mean structures of the size 100 nanometers or smaller. There are one million nanometers in a millimeter. Nanotechnology tries to make materials or machines of that size.

People are doing many different types of work in the field of nanotechnology. Most current work looks at making nanoparticles (particles with nanometer size) that have special properties, such as the way they scatter light, absorb X-rays, transport electrical currents or heat, etc. At the more “science fiction” end of the field are attempts to make small copies of bigger machines or really new ideas for structures that make themselves. New materials are possible with nano size structures. It is even possible to work with single atoms.

There has been a lot of discussion about the future of nanotechnology and its dangers. Nanotechnology may make it possible to invent new materials and instruments which would be very useful, such as in medicine, computers, and making clean electricity (nanoelectromechanical systems are helping to design the next generation of solar panels and efficient low-energy lighting). On the other hand, nanotechnology is new, and there could be unknown problems, for example if the materials turn out to be bad for people's health or for nature. They may have a bad effect on the economy or even on big natural systems like the Earth itself. Some groups argue that there should be rules about the use of nanotechnology.

The ideas of nanotechnology were first set out in "There's Plenty of Room at the Bottom", a talk given by the physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959. Feynman described a way to move individual atoms to build smaller instruments and operate at that scale. At that size, properties such as surface tension and Van der Waals forces become very important.

Feynman’s simple idea seemed possible. The word "nanotechnology" was defined by Tokyo Science University professor Norio Taniguchi in a 1974 paper, in which he said that nanotechnology was the work of changing materials atom by atom or molecule by molecule. In the 1980s this idea was developed by Dr. K. Eric Drexler, who spoke and wrote about the importance of nano-scale phenomena. His "Engines of Creation: The Coming Era of Nanotechnology" (1986) is thought to be the first book on nanotechnology. Nanotechnology and nanoscience started with two key developments: the start of cluster science and the invention of the scanning tunneling microscope (STM). Soon afterwards, new molecules of carbon were discovered – first fullerenes in 1985 and carbon nanotubes a few years later. In another development, people studied how to make semiconductor nanocrystals. Many metal oxide nanoparticles are now used as quantum dots (nanoparticles where the behaviour of single electrons becomes important). In 2000, the United States National Nanotechnology Initiative began to develop science in this field.

Nanomaterials can be classified as one-, two-, or three-dimensional nanoparticles. This classification is based upon the different properties they hold, such as the scattering of light, the absorption of X-rays, and the transport of electric current or heat. Nanotechnology has a multidisciplinary character, affecting multiple traditional technologies and different scientific disciplines. New materials, structured even at the atomic scale, can be manufactured.

At the nanoscale, the physical properties of systems and particles change substantially. Quantum size effects appear, in which electrons move differently in very small particles. Mechanical, electrical, and optical properties change when a macroscopic system shrinks to a microscopic one, which is of utmost importance.

Nanomaterials and nanoparticles can act as catalysts that increase reaction rates and produce better yields than other catalysts. Some of the most interesting changes when a material is converted to the nanoscale: substances which usually stop light become transparent (copper); some materials become combustible (aluminum); solids turn into liquids at room temperature (gold); insulators become conductors (silicon). A material such as gold, which does not react with other chemicals at normal scales, can be a powerful chemical catalyst at nanoscales. These special properties, which we can only see at the nanoscale, are one of the most interesting things about nanotechnology.

3 Reasons Why We Might Return to The Moon

we may see manned missions to the moon. Science, politics, and celestial cash grabs are at the forefront of why people want to go back.

Friday marks the 49th anniversary of the first time any human set foot on solid, extraterrestrial ground. The details are probably familiar: on July 20, 1969, Neil Armstrong and Buzz Aldrin became the first people to walk on the Moon. It’s a rare privilege, even now: only ten other people have landed on the Moon and gone out for a stroll.

Just over three years later, humans walked on the Moon for the last time. Changing political and economic priorities meant NASA would no longer focus on sending people to the Moon. After all, we had already planted a flag, confirmed that the Moon wasn’t made of cheese, and played some golf. What else is left?

Well, it just so turns out that we might be heading back out there — and soon. President Trump has insisted on resuming manned Moon missions, despite the fact that this doesn't match the public's or the scientific community's desires for a space program (no one is quite sure where his determination stems from, but it doesn't seem to have much more substance than a whim).

But there are some other, real reasons that we might want to send someone to the Moon. There’s science to be done, and money to be made. Let’s dig a little deeper and see what might be bringing us back to our lunar neighbor.

1) Trump really wants it to happen.

Last December, President Trump signed a directive indicating that NASA would prioritize human exploration to the Moon and beyond. Just imagine: a human setting foot on the Moon! Accomplishing such an impossible feat would show the rest of the world that America is capable of great things, which would really assert our dominance on the international stage!

So, assuming that President Trump knows we won the space race 49 years ago (he knows, right? right?) there might be other reasons why Trump wants more people to go visit. Maybe it’s a display of national achievement, maybe it’s to develop economic or military advantages. Either way, the White House is pushing hard for that giant leap.

2) Cash money.

A rare isotope called helium-3 could help us produce clean and safe nuclear energy without giving off any hazardous or radioactive waste. And it just so happens that the Moon has loads of the stuff (so does Jupiter, but that’s a bit harder to reach).

While a helium nuclear fusion reactor does not yet exist, many expect that helium-3 could be the missing piece — and whoever secures the supply would unlock riches to rival Scrooge McDuck.

Two years ago, the federal government gave a private company its blessing to land on the Moon for the first time. Moon Express, which also plans to dump human ashes on the Moon (read: litter) for customers who want an unconventional cremation, has the ultimate goal of establishing a lunar mining colony. According to the company’s website, Expedition “Harvest Moon” plans to have a permanent research station up and running by 2021. At that point, it will begin extracting samples and raw materials to send back to Earth.

This could lead to more and (maybe) better research into the Moon’s history and makeup, especially since our supply of samples from the Apollo missions is so limited. But helium-3 is what Moon Express is really after. And they’re not the only ones: the Chinese government also has its eyes set on the Moon’s helium-3 supply.

In addition to opening space up to private mining operations, Trump has reached out to NASA in hopes that the agency’s technology could be used to launch mining rigs to the Moon and to asteroids.

But there’s a lot that needs to happen before the spacefaring equivalents of coal barons start selling space rocks. For instance, we need to figure out how to approach and land on an asteroid, and how to set up at least semi-permanent bases and mining operations. But still, some companies are forging ahead.

3) Science! slash, practice for Mars.

The government, along with multiple space-interested billionaires, has some well-publicized plans to colonize Mars. Their reasons range from furthering scientific research, to exploring the cosmos for funsies, to saving humanity from, uh, something.

The Moon could play a vital role in those plans — as a practice off-world destination, and as a celestial truck stop along the way.

In February, Commerce Secretary Wilbur Ross said that setting up a colony on the Moon will be essential for future space exploration. Especially, he mentioned, so that it can serve as a refueling station. His logic seems to be based on the fact that the Moon exerts less gravitational force than the Earth, so landing and relaunching a refueled rocket would let that rocket explore farther into space.

Some have also proposed using a Moon base as practice for a Martian settlement, since they would be much closer to Earth — Moon-dwellers would only be three days from Earth, while human Martians would be eight months from home.

NASA’s Gateway mission, as Time reported, could give rise to lunar settlements within the next ten years. Gateway would function as a space station in orbit around the Moon, but would be capable of traveling to and from the surface. The expected Gateway timeline is controversial even within NASA, however, as some feel that it’s far too optimistic about when we might actually see results.

There are still too many unknowns and hazards for people in space settlements for such a program to succeed today. Even trying to simulate a Mars colony on Earth led to several unforeseen mental strains and complications.

But either way, ongoing exploration and research missions continue to radically change our understanding of the Moon.

“Ten years ago we would have said that the Moon was completely dry,” Ryan Zeigler, NASA’s curator of lunar samples from the Apollo missions, told Futurism. “Over the past ten years, new instruments and new scientists have shown this to not be the case, and that has had profound effects on the models that predict how the Earth-Moon system has formed,” he added.

Of course, there are financial reasons at the forefront of the recent push for lunar exploration. But even if it’s just a pleasant side effect, we may get valuable new science out of these missions, too.

Read more about complications with NASA’s lunar plans: NASA Just Canceled Its Only Moon Rover Project. That’s Bad News for Trump’s Lunar Plans.

The post 3 Reasons Why We Might Return to The Moon appeared first on Futurism.

Most Of NASA’s Moon Rocks Remain Untouched By Scientists

we have only studied about 16 percent of the moon rocks taken during the Apollo missions. NASA's Apollo curator keeps them for future generations.

Forty-nine years ago this Friday, Neil Armstrong and Buzz Aldrin became the first humans to set foot on the Moon. That day, they also became the first people to harvest samples from another celestial body and bring them back to Earth.

Over the course of the Apollo missions, astronauts collected about 2,200 individual samples weighing a total of 842 pounds (382 kg) for scientific study that continues today, NASA curator Ryan Zeigler told Futurism. Zeigler, who also conducts geochemical research, is responsible for overseeing NASA’s collection of space rocks from the Apollo missions, as well as those from Mars, asteroids, stars, and anywhere else other than Earth.

Scientists have only studied about 16 percent of all the Apollo samples by mass, Zeigler told Futurism. Within that 16 percent, just under one-third has been put on display, which Zeigler noted largely keeps the samples pristine. Another quarter were at least partially destroyed (on purpose) during NASA-approved research, and the rest have been analyzed in less destructive ways.

“Trying not to deplete the samples so that future scientists will still have the opportunity to work with them is definitely something we are considering,” says Zeigler. “Also, while I would consider the Apollo samples primarily a scientific resource (though as a scientist am obviously biased), it is undeniable that these samples also have significant historic and cultural importance as well, and thus need to be preserved on those grounds, too.”

The cultural reasons to preserve moon rocks, Zeigler says, are harder to define. But it’s still important to make sure future scientists have enough space rocks left to work with, especially since we can’t fully predict the sorts of questions they’ll try to answer using the Apollo samples, or the technology that will be at their disposal.

“Every decade since the Apollo samples came back has seen significant advances in instrumentation that have allowed samples to be analyzed at higher levels of precision, or smaller spatial resolution,” Zeigler says. “Our understanding of the Moon, and really the whole solar system, has evolved considerably by continuing studies of the Apollo samples.”

“Our understanding of the Moon, and really the whole solar system, has evolved considerably by continuing studies of the Apollo samples.”

In the last six years, Zeigler says that his curation team saw 351 requests for Apollo samples, which comes out to about 60 each year. Within those requests, the scientists have asked for about 692 individual samples per year, most of which weigh one to two grams each. Even if the researchers don’t get everything that they ask for, Zeigler says, most of the studies are at least partially approved, and he’s been loaning out about 525 samples every year. That comes out to just over 75 percent of what the scientists requested.
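Zeigler's figures are internally consistent, as a few lines of arithmetic confirm:

```python
requests_over_six_years = 351
requests_per_year = requests_over_six_years / 6        # "about 60 each year"

samples_requested_per_year = 692
samples_loaned_per_year = 525
loan_fraction = samples_loaned_per_year / samples_requested_per_year

print(round(requests_per_year, 1))   # 58.5, which rounds up to "about 60"
print(f"{loan_fraction:.1%}")        # 75.9% -- "just over 75 percent"
```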

“So while it is true that significant scientific justification is required to get Apollo samples, and we (NASA, with the support of the planetary scientific community) are intentionally reserving a portion of the Apollo samples for future generations of scientists and scientific instruments to study, the samples are available to scientists around the world to study, and we are slowly lowering the percentage of material that is left,” Zeigler says.

Thankfully, about 84 percent of the Apollo samples are still untouched. That pretty much guarantees that the next generation of geologists and astronomers who try to decipher the Moon’s remaining secrets will have enough samples to fiddle with.

To read more on future lunar research, click here: Three Reasons Why We Might Return To The Moon

The post Most Of NASA’s Moon Rocks Remain Untouched By Scientists appeared first on Futurism.

This New Startup Is Making Chatbots Dumber So You Can Actually Talk to Them

A Spanish tech startup decided to ditch artificial intelligence to make its chatbot platform more approachable

Tech giants have been trying to one-up each other to make the most intelligent chatbot out there. They can help you simply fill in forms, or take the form of fleshed-out digital personalities that can have meaningful conversations with you. Those that have voice functions have come insanely close to mimicking human speech — inflections, and even the occasional “uhm’s” and “ah’s” — perfectly.

And they’re much more common than you might think. In 2016, Facebook introduced Messenger Bots that businesses worldwide now use for simple tasks like ordering flowers, getting news updates in chat form, or getting information on flights from an airline. Millions of users are filling waiting lists to talk to an “emotional chatbot” on an app called Replika.

But there’s no getting around AI’s shortcomings. And for chatbots in particular, the frustration arises from a disconnect between the user’s intent or expectations, and the chatbot’s programmed abilities.

Take Facebook’s Project M. Sources believe Facebook’s (long forgotten) attempt at developing a truly intelligent chatbot never surpassed a 30 percent success rate, according to Wired — the remaining 70 percent of the time, human employees had to step in to solve tasks. Facebook billed the bot as all-knowing, but the reality was far less promising. It simply couldn’t handle pretty much any task it was asked to do by Facebook’s numerous users.

Admittedly, it takes a lot of resources to develop complex AI chatbots. Even Google Duplex, arguably the most advanced chatbot around today, is still limited to verifying business hours and making simple appointments. Still, users simply expect far more than what AI chatbots can actually do, which tends to enrage them.

The tech industry isn’t giving up. Market researchers predict that chatbots will grow to become a $1 billion market by 2025.

But maybe they’re going about this all wrong. Maybe, instead of making more sophisticated chatbots, businesses should focus on what users really need in a chatbot, stripped down to its very essence.

Landbot, a one-year-old Spanish tech startup, is taking a different approach: it’s making a chatbot-builder for businesses that does the bare minimum, and nothing more. The small company landed $2.2 million in a single round of funding (it plans to use those funds primarily to expand its operations and cover the costs of relocating to tech innovation hub Barcelona).

“We started our chatbot journey using Artificial Intelligence technology but found out that there was a huge gap between user expectations and reality,” co-founder Jiaqi Pan tells TechCrunch. “No matter how well trained our chatbots were, users were constantly dropped off the desired flow, which ended up in 20 different ways of saying ‘TALK WITH A HUMAN’.”

Instead of creating advanced tech that could predict and analyze user prompts, Landbot decided to work on a simple user interface that allows businesses to create chat flows that link prompt and action, question and answer. It’s kind of like a chatbot flowchart builder. And the results are pretty positive: the company has seen healthy revenue growth, and the tool is used by hundreds of businesses in more than 50 countries, according to TechCrunch.
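The "chatbot flowchart" idea is simple enough to sketch: each node pairs a prompt with a fixed set of choices, and each choice names the next node. There is no NLP and no intent guessing. (The flow and names below are invented for illustration and have no connection to Landbot's actual product.)

```python
# A minimal "chat flow" engine: each node is a prompt plus fixed choices
# mapping to the next node. Unknown replies simply re-prompt the same node.
FLOW = {
    "start":   {"prompt": "Hi! What do you need?",
                "choices": {"order flowers": "flowers", "talk to a human": "human"}},
    "flowers": {"prompt": "Roses or tulips?",
                "choices": {"roses": "done", "tulips": "done"}},
    "human":   {"prompt": "Connecting you to a human...", "choices": {}},
    "done":    {"prompt": "Order placed!", "choices": {}},
}

def step(node: str, user_reply: str) -> str:
    """Advance the flow; unrecognized replies stay on the same node."""
    return FLOW[node]["choices"].get(user_reply, node)

node = step("start", "order flowers")   # -> "flowers"
node = step(node, "roses")              # -> "done"
print(FLOW[node]["prompt"])             # Order placed!
```

The design tradeoff is exactly the one Landbot describes: the bot can never misunderstand a user, because it only ever offers choices it knows how to handle.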

The world is obsessed with achieving perfect artificial intelligence, and the growing AI chatbot market is no different. So obsessed, in fact, that it’s driving users away — growing disillusionment, frustration, and rage are undermining tech companies’ efforts. And this obsession might be doing far more harm than good. It’s simple: people are happiest when they get the results they expect. Added complexity or lofty promises of “true AI” will end up pushing users away if they don’t actually help them.

After all, sometimes less is more. Landbot and its customers are making it work with less.

Besides, listening to your customers can go a long way.

Now can you please connect me to a human?

The post This New Startup Is Making Chatbots Dumber So You Can Actually Talk to Them appeared first on Futurism.

This Wearable Controller Lets You Pilot a Drone With Your Body

PUT DOWN THE JOYSTICK. If you’ve ever tried to pilot a drone, it’s probably taken a little while to do it well; each drone is a little different, and figuring out how to use its manual controller can take time. There seems to be no shortcut other than to suffer a crash landing or two.

Now, a team of researchers from the Swiss Federal Institute of Technology in Lausanne (EPFL) have created a wearable drone controller that makes the process of navigation so intuitive, it requires almost no thought at all. They published their research in the journal PNAS on Monday.

NOW, PRETEND YOU’RE A DRONE. To create their wearable drone controller, the researchers first needed to figure out how people wanted to move their bodies to control a drone. So they placed 19 motion-capture markers and various electrodes all across the upper bodies of 17 volunteers. Then, they asked each volunteer to watch simulated drone footage through virtual reality goggles. This let the volunteer feel like they were seeing through the eyes of a drone.

The researchers then asked the volunteers to move their bodies however they liked to mimic the drone as it completed five specific movements (for example, turning right or flying toward the ground). The markers and electrodes allowed the researchers to monitor those movements, and they found that most volunteers moved their torsos in a way simple enough to track using just four motion-capture markers.

With this information, the researchers created a wearable drone controller that could relay the user’s movements to an actual drone — essentially, they built a wearable joystick.
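The article doesn't spell out the exact mapping from torso posture to flight commands, but the core idea can be sketched as thresholding two lean angles into discrete commands. The function below is a hypothetical illustration, not the EPFL team's calibration:

```python
def torso_to_command(pitch_deg: float, roll_deg: float, deadzone: float = 5.0) -> str:
    """Map torso lean angles to a discrete drone command.

    Hypothetical mapping for illustration: lean sideways to turn, lean
    forward/back to descend/climb, and hover inside a small deadzone.
    """
    if abs(roll_deg) >= abs(pitch_deg) and abs(roll_deg) > deadzone:
        return "turn_right" if roll_deg > 0 else "turn_left"
    if abs(pitch_deg) > deadzone:
        return "descend" if pitch_deg > 0 else "climb"
    return "hover"

print(torso_to_command(2.0, 1.0))    # hover (inside the deadzone)
print(torso_to_command(-12.0, 3.0))  # climb
print(torso_to_command(4.0, 9.0))    # turn_right
```

A real controller would output continuous velocity setpoints rather than discrete commands, but the deadzone idea is the same: small, unintentional torso movements should not move the drone.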

PUTTING IT TO THE TEST. To test their wearable drone controller, the researchers asked 39 volunteers to complete a real (not virtual) drone course using either the wearable or a standard joystick. They found that volunteers wearing the suit outperformed those using the joystick in both learning time and steering abilities.

“Using your torso really gives you the feeling that you are actually flying,” lead author Jenifer Miehlbradt said in a press release. “Joysticks, on the other hand, are of simple design but mastering their use to precisely control distant objects can be challenging.”

IN THE FIELD. Miehlbradt envisions search and rescue crews using her team’s wearable drone controller. “These tasks require you to control the drone and analyze the environment simultaneously, so the cognitive load is much higher,” she told Inverse. “I think having control over the drone with your body will allow you to focus more on what’s around you.”

However, this greater sense of immersion in the drone’s environment might not be beneficial in all scenarios. Previous research has shown that piloting strike drones for the military can cause soldiers to experience significant levels of trauma, and a wearable like the EPFL team’s has the potential to exacerbate the problem.

While Miehlbradt told Futurism her team did not consider drone strikes while developing their drone suit, she speculates that such applications wouldn’t be a good fit.

“I think that, in this case, the ‘distance’ created between the operator and the drone by the use of a third-party control device is beneficial regarding posterior emotional trauma,” she said. “With great caution, I would speculate that our control approach — should it be used in such a case —  may therefore increase the risk of experiencing such symptoms.”

READ MORE: Drone Researchers Develop Genius Method for Piloting Using Body Movements [Inverse]

More on rescue drones: A Rescue Drone Saved Two Teen Swimmers on Its First Day of Deployment

The post This Wearable Controller Lets You Pilot a Drone With Your Body appeared first on Futurism.

Google and The UN Team Up To Study The Effects of Climate Change

Google agreed to work with UN Environment to create a platform that gives the world access to valuable environmental data.

WITH OUR POWERS COMBINED… The United Nations’ environmental agency has landed itself a powerful partner in the fight against climate change: Google. The tech company has agreed to partner with UN Environment to increase the world’s access to valuable environmental data. Specifically, the two plan to create a user-friendly platform that lets anyone, anywhere, access environmental data collected by Google’s vast network of satellites. The organizations announced their partnership at a UN forum focused on sustainable development on Monday.

FRESHWATER FIRST. The partnership will first focus on freshwater ecosystems, such as mountains, wetlands, and rivers. These ecosystems provide homes for an estimated 10 percent of our planet’s known species, and research has shown that climate change is causing a rapid loss in biodiversity. Google will use satellite imagery to produce maps and data on these ecosystems in real-time, making that information freely available to anyone via the in-development online platform. According to a UN Environment press release, this will allow nations and other organizations to track changes and take action to prevent or reverse ecosystem loss.

LOST FUNDING. Since President Trump took office, the United States has consistently decreased its contributions to global climate research funds. Collecting and analyzing satellite data is neither cheap nor easy, but Google is already doing it to power platforms such as Google Maps and Google Earth. Now, thanks to this partnership, people all over the world will have a way to access information to help combat the impacts of climate change. It seems the same data that lets you virtually visit the Eiffel Tower could help save our planet.

READ MORE: UN Environment and Google Announce Ground-Breaking Partnership to Protect Our Planet [UN Environment]

More on freshwater: Climate Change Is Acidifying Our Lakes and Rivers the Same Way It Does With Oceans



Alphabet Will Bring Its Balloon-Powered Internet to Kenya

Alphabet has inked a deal with a Kenyan telecom to bring its balloon-powered internet to rural and suburban parts of Kenya.

BADASS BALLOONS. In 2013, Google unveiled Project Loon, a plan to send a fleet of balloons into the stratosphere that could then beam internet service back down to people on Earth.

And it worked! Just last year, the project provided more than 250,000 Puerto Ricans with internet service in the wake of the devastation of Hurricane Maria. The company, now simply called Loon, was the work of X, an innovation lab originally nestled under Google but now a subsidiary of Google’s parent company, Alphabet. And it’s planning to bring its balloon-powered internet to Kenya.

EYES ON AFRICA. On Thursday, Loon announced a partnership with Telkom Kenya, Kenya’s third largest telecommunications provider. Starting next year, Loon balloons will soar high above the East African nation, sending 4G internet coverage down to its rural and suburban populations. This marks the first time Loon has inked a commercial deal with an African nation.

“Loon’s mission is to connect people everywhere by inventing and integrating audacious technologies,” Loon CEO Alastair Westgarth told Reuters. Telkom CEO Aldo Mareuse added, “We will work very hard with Loon, to deliver the first commercial mobile service, as quickly as possible, using Loon’s balloon-powered internet in Africa.”

INTERNET EVERYWHERE. The internet is such an important part of modern life that, back in 2016, the United Nations declared access to it a human right. And while you might have a hard time thinking about going even a day without internet access, more than half of the world’s population still can’t log on. In Kenya, about one-third of the population still lacks access.

Thankfully, Alphabet isn’t the only company working to get the world connected. SpaceX, Facebook, and SoftBank-backed startup Altaeros have their own plans involving satellites, drones, and blimps, respectively. Between those projects and Loon, the world wide web may finally be available to the entire world.

READ MORE: Alphabet to Deploy Balloon Internet in Kenya With Telkom in 2019 [Reuters]

More on Loon: Alphabet Has Officially Launched Balloons that Deliver Internet In Puerto Rico



China Is Investing In Its Own Hyperloop To Clear Its Crowded Highways

Chinese state-backed companies just made huge investments in U.S.-based hyperloop startups. But will it solve China's stifling traffic problems?

GRIDLOCK. China’s largest cities are choking in traffic. Millions of cars on the road means stifling levels of air pollution and astronomical commute times, especially during rush hours.

The latest move to address this urban traffic nightmare: Chinese state-backed companies are making heavy investments in U.S. hyperloop startups Arrivo and Hyperloop Transportation Technologies, lining up $1 billion and $300 million in credit respectively. It’s substantial financing that could put China ahead in the race to open the first full-scale hyperloop track.

MAG-LEV SLEDS. Both companies are planning something big, although their approaches differ in some key ways. Transport company Arrivo is focusing on relieving highway traffic by creating a separate track that allows cars to zip along at 200 miles per hour (320 km/h) on magnetically levitated sleds inside vacuum-sealed tubes (it’s not yet clear if this will be above ground or underground).

Arrivo’s exact plans to build a Chinese hyperloop system have not yet been announced, but co-founder Andrew Liu told Bloomberg that $1 billion in funding could be enough to build “as many as three legs of a commercial, citywide hyperloop system of 6 miles to 9 miles [9.5 to 14.4 km] per section.” The company hasn’t yet announced in which city it’ll be built.

Meanwhile, Hyperloop Transportation Technologies has already made up its mind as to where it will plop down its first Chinese loop. It’s the old familiar maglev train design inside a vacuum tube, but instead it’s passengers, not their cars, that will ride along at speeds of up to 750 mph (1200 km/h). Most of the $300 million will go towards building a 6.2 mile (10 km) test track in Guizhou province. According to a press release, this marks the third commercial agreement for HyperloopTT after Abu Dhabi and Ukraine from earlier this year.

A PRICEY SOLUTION. Building a hyperloop is expensive. This latest investment hints at just how expensive a single system could be in the end. But providing high-speed alternatives to car-based transport is only one of many ways to deal with the gridlock and traffic jams that plague urban centers. China, for instance, has attempted to tackle the problem by restricting driving times based on license plates, expanding bike-sharing networks, and even meshing ride-sharing data with smart traffic lights.

And according to a recent report by Chinese location-based services provider AutoNavi, those solutions seem to be working: a Quartz analysis of the data found that traffic declined by 12.5 and 9 percent in Hangzhou and Shenzhen respectively, even though the population grew by 3 and 5 percent.

MO’ MONEY, MO’ PROBLEMS. There are more hurdles to overcome before hyperloop can have a significant impact in China. There is the cost of using the hyperloop system — if admission is priced too high (perhaps to cover astronomical infrastructure costs), adoption rates may remain too low to have a significant effect.

The capacity of a maglev train system would also have to accommodate China’s growing population centers. That’s no easy feat: HyperloopTT’s capsules have to squeeze through a four-meter (13-foot) diameter tube and hold only 28 to 40 people at a time, and there are 3 million cars in Shenzhen alone.
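The capacity gap is easy to see with a rough back-of-envelope calculation. Note the assumptions: only the 28-to-40-passenger capsule figure comes from the article; the departure interval (headway) and average car occupancy below are hypothetical values chosen for illustration.

```python
# Rough hyperloop throughput estimate. CAPSULE_CAPACITY is from the article's
# upper bound; HEADWAY_SECONDS and CAR_OCCUPANCY are illustrative assumptions.
CAPSULE_CAPACITY = 40     # passengers per capsule (article's upper bound)
HEADWAY_SECONDS = 120     # assumed time between capsule departures
CAR_OCCUPANCY = 1.5       # assumed average occupants per car

capsules_per_hour = 3600 // HEADWAY_SECONDS
passengers_per_hour = capsules_per_hour * CAPSULE_CAPACITY
cars_replaced_per_hour = passengers_per_hour / CAR_OCCUPANCY

print(f"{passengers_per_hour} passengers/hour, "
      f"replacing roughly {cars_replaced_per_hour:.0f} cars/hour")
```

Even under these generous assumptions, a single tube moves on the order of a thousand people per hour — a small dent in a city with millions of registered cars, which is why pricing and adoption rates matter so much.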

We don’t know yet whether China’s hyperloop investments will pay off and significantly reduce traffic in China’s urban centers. But bringing new innovations to transportation in massive and growing cities — especially when those new innovations are more environmentally friendly — is rarely a bad idea.



