

Gene therapy

In the field of medicine, gene therapy (also called human gene transfer) is the therapeutic delivery of nucleic acid into a patient’s cells as a drug to treat disease.[1][2] The first attempt at modifying human DNA was performed in 1980 by Martin Cline, but the first successful nuclear gene transfer in humans, approved by the National Institutes of Health, was performed in May 1989.[3] The first therapeutic use of gene transfer, as well as the first direct insertion of human DNA into the nuclear genome, was performed by French Anderson in a trial starting in September 1990.

Between 1989 and February 2016, over 2,300 clinical trials had been conducted, more than half of them in phase I.[4]

Not all medical procedures that introduce alterations to a patient’s genetic makeup can be considered gene therapy. Bone marrow transplantation and organ transplants in general have been found to introduce foreign DNA into patients.[5] Gene therapy is defined by the precision of the procedure and the intention of direct therapeutic effects.

Gene therapy was conceptualized in 1972, by authors who urged caution before commencing human gene therapy studies.

The first attempt, an unsuccessful one, at gene therapy (as well as the first case of medical transfer of foreign genes into humans not counting organ transplantation) was performed by Martin Cline on 10 July 1980.[6][7] Cline claimed that one of the genes in his patients was active six months later, though he never published this data or had it verified,[8] and even if he was correct, it is unlikely that it produced any significant beneficial effect in treating beta-thalassemia.

After extensive research on animals throughout the 1980s and a 1989 bacterial gene tagging trial on humans, the first gene therapy widely accepted as a success was demonstrated in a trial that started on 14 September 1990, when Ashi DeSilva was treated for ADA-SCID.[9]

The first somatic treatment that produced a permanent genetic change was performed in 1993.[citation needed]

Gene therapy is a way to fix a genetic problem at its source. The nucleic acid polymers are either translated into proteins, interfere with target gene expression, or possibly correct genetic mutations.

The most common form uses DNA that encodes a functional, therapeutic gene to replace a mutated gene. The polymer molecule is packaged within a “vector”, which carries the molecule inside cells.

Early clinical failures led to dismissals of gene therapy. Clinical successes since 2006 regained researchers’ attention, although as of 2014, it was still largely an experimental technique.[10] These include treatment of retinal diseases Leber’s congenital amaurosis[11][12][13][14] and choroideremia,[15] X-linked SCID,[16] ADA-SCID,[17][18] adrenoleukodystrophy,[19] chronic lymphocytic leukemia (CLL),[20] acute lymphocytic leukemia (ALL),[21] multiple myeloma,[22] haemophilia,[18] and Parkinson’s disease.[23] Between 2013 and April 2014, US companies invested over $600 million in the field.[24]

The first commercial gene therapy, Gendicine, was approved in China in 2003 for the treatment of certain cancers.[25] In 2011 Neovasculgen was registered in Russia as the first-in-class gene-therapy drug for treatment of peripheral artery disease, including critical limb ischemia.[26] In 2012 Glybera, a treatment for a rare inherited disorder, became the first treatment to be approved for clinical use in either Europe or the United States after its endorsement by the European Commission.[10][27]

Following early advances in genetic engineering of bacteria, cells, and small animals, scientists started considering how to apply it to medicine. Two main approaches were considered: replacing or disrupting defective genes.[28] Scientists focused on diseases caused by single-gene defects, such as cystic fibrosis, haemophilia, muscular dystrophy, thalassemia, and sickle cell anemia. Glybera treats one such disease, caused by a defect in lipoprotein lipase.[27]

DNA must be administered, reach the damaged cells, enter the cell, and either express a protein or disrupt its expression.[29] Multiple delivery techniques have been explored. The initial approach incorporated DNA into an engineered virus to deliver the DNA into a chromosome.[30][31] Naked DNA approaches have also been explored, especially in the context of vaccine development.[32]

Generally, efforts focused on administering a gene that causes a needed protein to be expressed. More recently, increased understanding of nuclease function has led to more direct DNA editing, using techniques such as zinc finger nucleases and CRISPR. The vector incorporates genes into chromosomes. The expressed nucleases then knock out and replace genes in the chromosome. As of 2014 these approaches involve removing cells from patients, editing a chromosome and returning the transformed cells to patients.[33]

Gene editing is a potential approach to alter the human genome to treat genetic diseases,[34] viral diseases,[35] and cancer.[36] As of 2016 these approaches were still years from being medicine.[37][38]

Gene therapy may be classified into two types:

In somatic cell gene therapy (SCGT), the therapeutic genes are transferred into any cell other than a gamete, germ cell, gametocyte, or undifferentiated stem cell. Any such modifications affect the individual patient only, and are not inherited by offspring. Somatic gene therapy represents mainstream basic and clinical research, in which therapeutic DNA (either integrated in the genome or as an external episome or plasmid) is used to treat disease.

Over 600 clinical trials utilizing SCGT are underway in the US. Most focus on severe genetic disorders, including immunodeficiencies, haemophilia, thalassaemia, and cystic fibrosis. Such single gene disorders are good candidates for somatic cell therapy. The complete correction of a genetic disorder or the replacement of multiple genes is not yet possible. Only a few of the trials are in the advanced stages.[39]

In germline gene therapy (GGT), germ cells (sperm or egg cells) are modified by the introduction of functional genes into their genomes. Modifying a germ cell causes all the organism’s cells to contain the modified gene. The change is therefore heritable and passed on to later generations. Australia, Canada, Germany, Israel, Switzerland, and the Netherlands[40] prohibit GGT for application in human beings, for technical and ethical reasons, including insufficient knowledge about possible risks to future generations[40] and higher risks versus SCGT.[41] The US has no federal controls specifically addressing human genetic modification (beyond FDA regulations for therapies in general).[40][42][43][44]

The delivery of DNA into cells can be accomplished by multiple methods. The two major classes are recombinant viruses (sometimes called biological nanoparticles or viral vectors) and naked DNA or DNA complexes (non-viral methods).

In order to replicate, viruses introduce their genetic material into the host cell, tricking the host’s cellular machinery into using it as blueprints for viral proteins. Retroviruses go a stage further by having their genetic material copied into the genome of the host cell. Scientists exploit this by substituting a virus’s genetic material with therapeutic DNA. (The term ‘DNA’ may be an oversimplification, as some viruses contain RNA, and gene therapy could take this form as well.) A number of viruses have been used for human gene therapy, including retroviruses, adenoviruses, herpes simplex, vaccinia, and adeno-associated virus.[4] Like the genetic material (DNA or RNA) in viruses, therapeutic DNA can be designed to simply serve as a temporary blueprint that is degraded naturally or (at least theoretically) to enter the host’s genome, becoming a permanent part of the host’s DNA in infected cells.

Non-viral methods present certain advantages over viral methods, such as large scale production and low host immunogenicity. However, non-viral methods initially produced lower levels of transfection and gene expression, and thus lower therapeutic efficacy. Later technology remedied this deficiency.[citation needed]

Methods for non-viral gene therapy include the injection of naked DNA, electroporation, the gene gun, sonoporation, magnetofection, the use of oligonucleotides, lipoplexes, dendrimers, and inorganic nanoparticles.

Some of the unsolved problems include:

Three patients’ deaths have been reported in gene therapy trials, putting the field under close scrutiny. The first was that of Jesse Gelsinger, who died in 1999 of an immune rejection response.[51] One X-SCID patient died of leukemia in 2003.[9] In 2007, a rheumatoid arthritis patient died from an infection; the subsequent investigation concluded that the death was not related to gene therapy.[52]

In 1972 Friedmann and Roblin authored a paper in Science titled “Gene therapy for human genetic disease?”[53] Rogers (1970) was cited for proposing that exogenous “good” DNA be used to replace the defective DNA in those who suffer from genetic defects.[54]

In 1984 a retrovirus vector system was designed that could efficiently insert foreign genes into mammalian chromosomes.[55]

The first approved gene therapy clinical research in the US took place on 14 September 1990, at the National Institutes of Health (NIH), under the direction of William French Anderson.[56] Four-year-old Ashanti DeSilva received treatment for a genetic defect that left her with ADA-SCID, a severe immune system deficiency. The defective gene in the patient’s blood cells was replaced by the functional variant. Ashanti’s immune system was partially restored by the therapy. Production of the missing enzyme was temporarily stimulated, but new cells with functional genes were not generated. She led a normal life only with regular injections performed every two months. The effects were successful, but temporary.[57]

Cancer gene therapy was introduced in 1992/93 (Trojan et al. 1993).[58] The treatment of glioblastoma multiforme, the malignant brain tumor whose outcome is always fatal, was done using a vector expressing antisense IGF-I RNA (clinical trial approved by NIH protocol no. 1602 on 24 November 1993,[59] and by the FDA in 1994). This therapy also represents the beginning of cancer immunogene therapy, a treatment which proved effective due to the anti-tumor mechanism of IGF-I antisense, which is related to strong immune and apoptotic phenomena.

In 1992 Claudio Bordignon, working at the Vita-Salute San Raffaele University, performed the first gene therapy procedure using hematopoietic stem cells as vectors to deliver genes intended to correct hereditary diseases.[60] In 2002 this work led to the publication of the first successful gene therapy treatment for adenosine deaminase deficiency (ADA-SCID). The success of a multi-center trial for treating children with SCID (severe combined immune deficiency or “bubble boy” disease) between 2000 and 2002 was questioned when two of the ten children treated at the trial’s Paris center developed a leukemia-like condition. Clinical trials were halted temporarily in 2002, but resumed after regulatory review of the protocol in the US, the United Kingdom, France, Italy, and Germany.[61]

In 1993 Andrew Gobea was born with SCID following prenatal genetic screening. Blood was removed from his mother’s placenta and umbilical cord immediately after birth, to acquire stem cells. The allele that codes for adenosine deaminase (ADA) was obtained and inserted into a retrovirus. Retroviruses and stem cells were mixed, after which the viruses inserted the gene into the stem cell chromosomes. Stem cells containing the working ADA gene were injected into Andrew’s blood. Injections of the ADA enzyme were also given weekly. For four years T cells (white blood cells), produced by stem cells, made ADA enzymes using the ADA gene. After four years more treatment was needed.[62]

Jesse Gelsinger’s death in 1999 impeded gene therapy research in the US.[63][64] As a result, the FDA suspended several clinical trials pending the reevaluation of ethical and procedural practices.[65]

The modified cancer gene therapy strategy of antisense IGF-I RNA (NIH no. 1602)[59] using the antisense/triple-helix anti-IGF-I approach was registered in 2002 in the Wiley gene therapy clinical trial registry (nos. 635 and 636). The approach has shown promising results in the treatment of six different malignant tumors: glioblastoma and cancers of the liver, colon, prostate, uterus, and ovary (Collaborative NATO Science Programme on Gene Therapy, USA, France, Poland, no. LST 980517, conducted by J. Trojan) (Trojan et al., 2012). This anti-gene antisense/triple-helix therapy has proven efficient due to a mechanism that simultaneously stops IGF-I expression at the translational and transcriptional levels, strengthening anti-tumor immune and apoptotic phenomena.

Sickle-cell disease can be treated in mice.[66] In mice, which have essentially the same defect that causes human cases, researchers used a viral vector to induce production of fetal hemoglobin (HbF), which normally ceases to be produced shortly after birth. In humans, the use of hydroxyurea to stimulate production of HbF temporarily alleviates sickle cell symptoms. The researchers demonstrated this treatment to be a more permanent means of increasing therapeutic HbF production.[67]

A new gene therapy approach repaired errors in messenger RNA derived from defective genes. This technique has the potential to treat thalassaemia, cystic fibrosis and some cancers.[68]

Researchers created liposomes 25 nanometers across that can carry therapeutic DNA through pores in the nuclear membrane.[69]

In 2003 a research team inserted genes into the brain for the first time. They used liposomes coated in a polymer called polyethylene glycol, which, unlike viral vectors, are small enough to cross the blood–brain barrier.[70]

Short pieces of double-stranded RNA (short, interfering RNAs or siRNAs) are used by cells to degrade RNA of a particular sequence. If a siRNA is designed to match the RNA copied from a faulty gene, then the abnormal protein product of that gene will not be produced.[71]
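The sequence matching described above can be sketched computationally: an siRNA guide strand works by being complementary to the target mRNA, so designing one amounts to taking the reverse complement of a window of the target. The following is a minimal illustration assuming standard Watson–Crick base pairing; the `sirna_guide` helper, the example sequence, and the 21-nucleotide window length are illustrative assumptions, not taken from any published siRNA design tool (real design also weighs GC content, overhangs, and off-target effects).

```python
# Minimal sketch of siRNA guide-strand selection by Watson-Crick complementarity.
# Helper names and the example sequence are hypothetical, for illustration only.

RNA_COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence (5'->3')."""
    return "".join(RNA_COMPLEMENT[base] for base in reversed(rna))

def sirna_guide(target_mrna: str, start: int, length: int = 21) -> str:
    """Take a window of the target mRNA and return the antisense (guide)
    strand that would base-pair with it."""
    window = target_mrna[start:start + length]
    return reverse_complement(window)

# Example: a made-up 21-nt stretch of a faulty gene's mRNA.
target = "AUGGCUAAGCUGACCGAUGAA"
guide = sirna_guide(target, 0)
print(guide)  # antisense strand, complementary to the target
```

Because complementation is its own inverse, the reverse complement of the guide recovers the original target window, which is exactly the pairing the RNA-induced silencing complex exploits to recognize and degrade the faulty gene’s mRNA.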

Gendicine is a cancer gene therapy that delivers the tumor suppressor gene p53 using an engineered adenovirus. In 2003, it was approved in China for the treatment of head and neck squamous cell carcinoma.[25]

In March researchers announced the successful use of gene therapy to treat two adult patients for X-linked chronic granulomatous disease, a disease which affects myeloid cells and damages the immune system. The study is the first to show that gene therapy can treat the myeloid system.[72]

In May a team reported a way to prevent the immune system from rejecting a newly delivered gene.[73] Similar to organ transplantation, gene therapy has been plagued by this problem. The immune system normally recognizes the new gene as foreign and rejects the cells carrying it. The research utilized a newly uncovered network of genes regulated by molecules known as microRNAs. This natural function selectively obscured their therapeutic gene in immune system cells and protected it from discovery. Mice infected with the gene containing an immune-cell microRNA target sequence did not reject the gene.

In August scientists successfully treated metastatic melanoma in two patients using killer T cells genetically retargeted to attack the cancer cells.[74]

In November researchers reported on the use of VRX496, a gene-based immunotherapy for the treatment of HIV that uses a lentiviral vector to deliver an antisense gene against the HIV envelope. In a phase I clinical trial, five subjects with chronic HIV infection who had failed to respond to at least two antiretroviral regimens were treated. A single intravenous infusion of autologous CD4 T cells genetically modified with VRX496 was well tolerated. All patients had stable or decreased viral load; four of the five patients had stable or increased CD4 T cell counts. All five patients had stable or increased immune response to HIV antigens and other pathogens. This was the first evaluation of a lentiviral vector administered in a US human clinical trial.[75][76]

In May researchers announced the first gene therapy trial for inherited retinal disease. The first operation was carried out on a 23-year-old British male, Robert Johnson, in early 2007.[77]

Leber’s congenital amaurosis is an inherited blinding disease caused by mutations in the RPE65 gene. The results of a small clinical trial in children were published in April.[11] Delivery of recombinant adeno-associated virus (AAV) carrying RPE65 yielded positive results. In May two more groups reported positive results in independent clinical trials using gene therapy to treat the condition. In all three clinical trials, patients recovered functional vision without apparent side-effects.[11][12][13][14]

In September researchers were able to give trichromatic vision to squirrel monkeys.[78] In November 2009, researchers halted a fatal genetic disorder called adrenoleukodystrophy in two children using a lentivirus vector to deliver a functioning version of ABCD1, the gene that is mutated in the disorder.[79]

An April paper reported that gene therapy addressed achromatopsia (color blindness) in dogs by targeting cone photoreceptors. Cone function and day vision were restored for at least 33 months in two young specimens. The therapy was less efficient for older dogs.[80]

In September it was announced that an 18-year-old male patient in France with beta-thalassemia major had been successfully treated.[81] Beta-thalassemia major is an inherited blood disease in which beta haemoglobin is missing and patients are dependent on regular lifelong blood transfusions.[82] The technique used a lentiviral vector to transduce the human β-globin gene into purified blood and marrow cells obtained from the patient in June 2007.[83] The patient’s haemoglobin levels were stable at 9 to 10 g/dL. About a third of the hemoglobin contained the form introduced by the viral vector and blood transfusions were not needed.[83][84] Further clinical trials were planned.[85] Bone marrow transplants are the only cure for thalassemia, but 75% of patients do not find a matching donor.[84]

Cancer immunogene therapy using a modified antigene, antisense/triple-helix approach was introduced in South America in 2010/11 at La Sabana University, Bogota (Ethical Committee, 14 December 2010, no. P-004-10). After the ethical aspects of gene diagnosis and gene therapy targeting IGF-I were considered, IGF-I-expressing tumors, i.e. lung and epidermis cancers, were treated (Trojan et al. 2016).[86][87]

In 2007 and 2008, a man (Timothy Ray Brown) was cured of HIV by repeated hematopoietic stem cell transplantation (see also allogeneic stem cell transplantation, allogeneic bone marrow transplantation, allotransplantation) from a donor homozygous for the CCR5-Δ32 mutation, which disables the CCR5 receptor. This cure was accepted by the medical community in 2011.[88] It required complete ablation of existing bone marrow, which is very debilitating.

In August two of three subjects of a pilot study were confirmed to have been cured from chronic lymphocytic leukemia (CLL). The therapy used genetically modified T cells to attack cells that expressed the CD19 protein to fight the disease.[20] In 2013, the researchers announced that 26 of 59 patients had achieved complete remission and the original patient had remained tumor-free.[89]

Human HGF plasmid DNA therapy of cardiomyocytes is being examined as a potential treatment for coronary artery disease as well as treatment for the damage that occurs to the heart after myocardial infarction.[90][91]

In 2011 Neovasculgen was registered in Russia as the first-in-class gene-therapy drug for treatment of peripheral artery disease, including critical limb ischemia; it delivers the gene encoding for VEGF.[92][26] Neovasculogen is a plasmid encoding the CMV promoter and the 165 amino acid form of VEGF.[93][94]

The FDA approved Phase 1 clinical trials on thalassemia major patients in the US for 10 participants in July.[95] The study was expected to continue until 2015.[85]

In July 2012, the European Medicines Agency recommended approval of a gene therapy treatment for the first time in either Europe or the United States. The treatment used alipogene tiparvovec (Glybera) to compensate for lipoprotein lipase deficiency, which can cause severe pancreatitis.[96] The recommendation was endorsed by the European Commission in November 2012[10][27][97][98] and commercial rollout began in late 2014.[99] Alipogene tiparvovec was expected to cost around $1.6 million per treatment in 2012,[100] revised to $1 million in 2015,[101] making it the most expensive medicine in the world at the time.[102] As of 2016, only one person had been treated with the drug.[103]

In December 2012, it was reported that 10 of 13 patients with multiple myeloma were in remission “or very close to it” three months after being injected with a treatment involving genetically engineered T cells to target proteins NY-ESO-1 and LAGE-1, which exist only on cancerous myeloma cells.[22]

In March researchers reported that three of five adult subjects who had acute lymphocytic leukemia (ALL) had been in remission for five months to two years after being treated with genetically modified T cells which attacked cells with the CD19 protein on their surface, i.e. all B cells, cancerous or not. The researchers believed that the patients’ immune systems would make normal T cells and B cells after a couple of months. They were also given bone marrow. One patient relapsed and died and one died of a blood clot unrelated to the disease.[21]

Following encouraging Phase 1 trials, in April, researchers announced they were starting Phase 2 clinical trials (called CUPID2 and SERCA-LVAD) on 250 patients[104] at several hospitals to combat heart disease. The therapy was designed to increase the levels of SERCA2, a protein in heart muscles, improving muscle function.[105] The FDA granted this a Breakthrough Therapy Designation to accelerate the trial and approval process.[106] In 2016 it was reported that no improvement was found from the CUPID 2 trial.[107]

In July researchers reported promising results for six children with two severe hereditary diseases who had been treated with a partially deactivated lentivirus to replace a faulty gene, with follow-up of 7 to 32 months. Three of the children had metachromatic leukodystrophy, which causes children to lose cognitive and motor skills.[108] The other children had Wiskott–Aldrich syndrome, which leaves them open to infection, autoimmune diseases, and cancer.[109] Follow-up trials with gene therapy on another six children with Wiskott–Aldrich syndrome were also reported as promising.[110][111]

In October researchers reported that two children born with adenosine deaminase severe combined immunodeficiency disease (ADA-SCID) had been treated with genetically engineered stem cells 18 months previously and that their immune systems were showing signs of full recovery. Another three children were making progress.[18] In 2014 a further 18 children with ADA-SCID were cured by gene therapy.[112] ADA-SCID children have no functioning immune system and are sometimes known as “bubble children.”[18]

Also in October researchers reported that they had treated six hemophilia sufferers in early 2011 using an adeno-associated virus. Over two years later all six were producing clotting factor.[18][113]

In January researchers reported that six choroideremia patients had been treated with adeno-associated virus with a copy of REP1. Over a six-month to two-year period all had improved their sight.[114][115] By 2016, 32 patients had been treated with positive results and researchers were hopeful the treatment would be long-lasting.[15] Choroideremia is an inherited genetic eye disease with no approved treatment, leading to loss of sight.

In March researchers reported that 12 HIV patients had been treated since 2009 in a trial with a genetically engineered virus with a rare mutation (CCR5 deficiency) known to protect against HIV with promising results.[116][117]

Clinical trials of gene therapy for sickle cell disease were started in 2014.[118][119] There is a need for high quality randomised controlled trials assessing the risks and benefits involved with gene therapy for people with sickle cell disease.[120]

In February LentiGlobin BB305, a gene therapy treatment undergoing clinical trials for treatment of beta thalassemia gained FDA “breakthrough” status after several patients were able to forgo the frequent blood transfusions usually required to treat the disease.[121]

In March researchers delivered a recombinant gene encoding a broadly neutralizing antibody into monkeys infected with simian HIV; the monkeys’ cells produced the antibody, which cleared them of HIV. The technique is named immunoprophylaxis by gene transfer (IGT). Animal tests for antibodies to Ebola, malaria, influenza, and hepatitis were underway.[122][123]

In March, scientists, including an inventor of CRISPR, Jennifer Doudna, urged a worldwide moratorium on germline gene therapy, writing “scientists should avoid even attempting, in lax jurisdictions, germline genome modification for clinical application in humans” until the full implications “are discussed among scientific and governmental organizations”.[124][125][126][127]

In October, researchers announced that they had treated a baby girl, Layla Richards, with an experimental treatment using donor T-cells genetically engineered using TALEN to attack cancer cells. One year after the treatment she was still free of her cancer (a highly aggressive form of acute lymphoblastic leukaemia [ALL]).[128] Children with highly aggressive ALL normally have a very poor prognosis and Layla’s disease had been regarded as terminal before the treatment.[129]

In December, scientists of major world academies called for a moratorium on inheritable human genome edits, including those related to CRISPR-Cas9 technologies,[130] but said that basic research, including embryo gene editing, should continue.[131]

In April the Committee for Medicinal Products for Human Use of the European Medicines Agency endorsed a gene therapy treatment called Strimvelis[132][133] and the European Commission approved it in June.[134] This treats children born with adenosine deaminase deficiency and who have no functioning immune system. This was the second gene therapy treatment to be approved in Europe.[135]

In October, Chinese scientists reported they had started a trial to genetically modify T-cells from 10 adult patients with lung cancer and reinject the modified T-cells back into their bodies to attack the cancer cells. The T-cells had the PD-1 protein (which stops or slows the immune response) removed using CRISPR-Cas9.[136][137]

A 2016 Cochrane systematic review looking at data from four trials on topical cystic fibrosis transmembrane conductance regulator (CFTR) gene therapy does not support its clinical use as a mist inhaled into the lungs to treat cystic fibrosis patients with lung infections. One of the four trials did find weak evidence that liposome-based CFTR gene transfer therapy may lead to a small respiratory improvement for people with CF. This weak evidence is not enough to make a clinical recommendation for routine CFTR gene therapy.[138]

In February Kite Pharma announced results from a clinical trial of CAR-T cells in around a hundred people with advanced Non-Hodgkin lymphoma.[139]

In March, French scientists reported on clinical research of gene therapy to treat sickle-cell disease.[140]

In August, the FDA approved tisagenlecleucel for acute lymphoblastic leukemia.[141] Tisagenlecleucel is an adoptive cell transfer therapy for B-cell acute lymphoblastic leukemia; T cells from a person with cancer are removed, genetically engineered to make a specific T-cell receptor (a chimeric T cell receptor, or “CAR-T”) that reacts to the cancer, and are administered back to the person. The T cells are engineered to target a protein called CD19 that is common on B cells. This is the first form of gene therapy to be approved in the United States. In October, a similar therapy called axicabtagene ciloleucel was approved for non-Hodgkin lymphoma.[142]

In December the results of using an adeno-associated virus with blood clotting factor VIII to treat nine haemophilia A patients were published. Six of the seven patients on the high-dose regime increased their level of blood clotting factor VIII to normal levels. The low- and medium-dose regimes had no effect on the patients’ blood clotting levels.[143][144]

In December, the FDA approved Luxturna, the first in vivo gene therapy, for the treatment of blindness due to Leber’s congenital amaurosis.[145] The price of this treatment was 850,000 US dollars for both eyes.[146][147]

Speculated uses for gene therapy include:

Gene therapy techniques have the potential to provide alternative treatments for those with infertility. Recently, successful experimentation on mice has shown that fertility can be restored by using the gene therapy method CRISPR.[148] Spermatogonial stem cells from another organism were transplanted into the testes of an infertile male mouse. The stem cells re-established spermatogenesis and fertility.[149]

Athletes might adopt gene therapy technologies to improve their performance.[150] Gene doping is not known to occur, but multiple gene therapies may have such effects. Kayser et al. argue that gene doping could level the playing field if all athletes receive equal access. Critics claim that any therapeutic intervention for non-therapeutic/enhancement purposes compromises the ethical foundations of medicine and sports.[151]

Genetic engineering could be used to cure diseases, but also to change physical appearance, metabolism, and even improve physical capabilities and mental faculties such as memory and intelligence. Ethical claims about germline engineering include beliefs that every fetus has a right to remain genetically unmodified, that parents hold the right to genetically modify their offspring, and that every child has the right to be born free of preventable diseases.[152][153][154] For parents, genetic engineering could be seen as another child enhancement technique to add to diet, exercise, education, training, cosmetics, and plastic surgery.[155][156] Another theorist claims that moral concerns limit but do not prohibit germline engineering.[157]

Possible regulatory schemes include a complete ban, provision to everyone, or professional self-regulation. The American Medical Association’s Council on Ethical and Judicial Affairs stated that “genetic interventions to enhance traits should be considered permissible only in severely restricted situations: (1) clear and meaningful benefits to the fetus or child; (2) no trade-off with other characteristics or traits; and (3) equal access to the genetic technology, irrespective of income or other socioeconomic characteristics.”[158]

As early in the history of biotechnology as 1990, scientists opposed attempts to modify the human germline using these new tools,[159] and such concerns have continued as the technology progressed.[160][161] With the advent of new techniques like CRISPR, in March 2015 a group of scientists urged a worldwide moratorium on clinical use of gene editing technologies to edit the human genome in a way that can be inherited.[124][125][126][127] In April 2015, researchers sparked controversy when they reported results of basic research to edit the DNA of non-viable human embryos using CRISPR.[148][162] A committee of the American National Academy of Sciences and National Academy of Medicine gave qualified support to human genome editing in 2017,[163][164] once answers have been found to safety and efficiency problems, “but only for serious conditions under stringent oversight.”[165]

Regulations covering genetic modification are part of general guidelines about human-involved biomedical research. There are no international treaties which are legally binding in this area, but there are recommendations for national laws from various bodies.

The Helsinki Declaration (Ethical Principles for Medical Research Involving Human Subjects) was amended by the World Medical Association’s General Assembly in 2008. This document provides principles physicians and researchers must consider when involving humans as research subjects. The Statement on Gene Therapy Research initiated by the Human Genome Organization (HUGO) in 2001 provides a legal baseline for all countries. HUGO’s document emphasizes human freedom and adherence to human rights, and offers recommendations for somatic gene therapy, including the importance of recognizing public concerns about such research.[166]

In the United States, no federal legislation lays out protocols or restrictions for human genetic engineering. The subject is instead governed by overlapping regulations from local and federal agencies, including the Department of Health and Human Services, the FDA and the NIH’s Recombinant DNA Advisory Committee. Researchers seeking federal funds for an investigational new drug application (commonly the case for somatic human genetic engineering) must obey international and federal guidelines for the protection of human subjects.[167]

The NIH serves as the main gene therapy regulator for federally funded research; privately funded research is advised to follow the same regulations. The NIH provides funding for research that develops or enhances genetic engineering techniques and evaluates the ethics and quality of current research. It also maintains a mandatory registry of human genetic engineering research protocols that includes all federally funded projects.

An NIH advisory committee published a set of guidelines on gene manipulation.[168] The guidelines discuss lab safety as well as human test subjects and various experimental types that involve genetic changes. Several sections specifically pertain to human genetic engineering, including Section III-C-1, which describes the required review processes and other requirements for seeking approval to begin clinical research involving genetic transfer into a human patient.[169] The protocol for a gene therapy clinical trial must be approved by the NIH’s Recombinant DNA Advisory Committee before any trial begins, a requirement unique to this kind of clinical trial.[168]

As with other kinds of drugs, the FDA regulates the quality and safety of gene therapy products and supervises how these products are used clinically. Therapeutic alteration of the human genome falls under the same regulatory requirements as any other medical treatment. Research involving human subjects, such as clinical trials, must be reviewed and approved by the FDA and an Institutional Review Board.[170][171]

Gene therapy is the basis for the plotline of the film I Am Legend[172] and the TV show Will Gene Therapy Change the Human Race?.[173] In 1994, gene therapy was a plot element in The Erlenmeyer Flask, The X-Files’ first season finale. It is also used in Stargate as a means of allowing humans to use Ancient technology.[174]


Genetic engineering – Wikipedia

Genetic engineering, also called genetic modification or genetic manipulation, is the direct manipulation of an organism’s genes using biotechnology. It is a set of technologies used to change the genetic makeup of cells, including the transfer of genes within and across species boundaries to produce improved or novel organisms. New DNA is obtained by either isolating and copying the genetic material of interest using recombinant DNA methods or by artificially synthesising the DNA. A construct is usually created and used to insert this DNA into the host organism. The first recombinant DNA molecule was made by Paul Berg in 1972 by combining DNA from the monkey virus SV40 with the lambda virus. As well as inserting genes, the process can be used to remove, or “knock out”, genes. The new DNA can be inserted randomly, or targeted to a specific part of the genome.

An organism that is generated through genetic engineering is considered to be genetically modified (GM) and the resulting entity is a genetically modified organism (GMO). The first GMO was a bacterium generated by Herbert Boyer and Stanley Cohen in 1973. Rudolf Jaenisch created the first GM animal when he inserted foreign DNA into a mouse in 1974. The first company to focus on genetic engineering, Genentech, was founded in 1976 and started the production of human proteins. Genetically engineered human insulin was produced in 1978 and insulin-producing bacteria were commercialised in 1982. Genetically modified food has been sold since 1994, with the release of the Flavr Savr tomato. The Flavr Savr was engineered to have a longer shelf life, but most current GM crops are modified to increase resistance to insects and herbicides. GloFish, the first GMO designed as a pet, was sold in the United States in December 2003. In 2016 salmon modified with a growth hormone were sold.

Genetic engineering has been applied in numerous fields including research, medicine, industrial biotechnology and agriculture. In research GMOs are used to study gene function and expression through loss of function, gain of function, tracking and expression experiments. By knocking out genes responsible for certain conditions it is possible to create animal model organisms of human diseases. As well as producing hormones, vaccines and other drugs genetic engineering has the potential to cure genetic diseases through gene therapy. The same techniques that are used to produce drugs can also have industrial applications such as producing enzymes for laundry detergent, cheeses and other products.

The rise of commercialised genetically modified crops has provided economic benefit to farmers in many different countries, but has also been the source of most of the controversy surrounding the technology. This has been present since its early use, the first field trials were destroyed by anti-GM activists. Although there is a scientific consensus that currently available food derived from GM crops poses no greater risk to human health than conventional food, GM food safety is a leading concern with critics. Gene flow, impact on non-target organisms, control of the food supply and intellectual property rights have also been raised as potential issues. These concerns have led to the development of a regulatory framework, which started in 1975. It has led to an international treaty, the Cartagena Protocol on Biosafety, that was adopted in 2000. Individual countries have developed their own regulatory systems regarding GMOs, with the most marked differences occurring between the USA and Europe.

Genetic engineering is a process that alters the genetic structure of an organism by either removing or introducing DNA. Unlike traditional animal and plant breeding, which involves making multiple crosses and then selecting the organism with the desired phenotype, genetic engineering takes the gene directly from one organism and inserts it into another. This is much faster, can be used to insert any genes from any organism (even ones from different domains) and prevents other undesirable genes from also being added.[3]

Genetic engineering could potentially fix severe genetic disorders in humans by replacing the defective gene with a functioning one.[4] It is an important tool in research that allows the function of specific genes to be studied.[5] Drugs, vaccines and other products have been harvested from organisms engineered to produce them.[6] Crops have been developed that aid food security by increasing yield, nutritional value and tolerance to environmental stresses.[7]

The DNA can be introduced directly into the host organism or into a cell that is then fused or hybridised with the host.[8] This relies on recombinant nucleic acid techniques to form new combinations of heritable genetic material followed by the incorporation of that material either indirectly through a vector system or directly through micro-injection, macro-injection or micro-encapsulation.[9]

Genetic engineering does not normally include traditional breeding, in vitro fertilisation, induction of polyploidy, mutagenesis and cell fusion techniques that do not use recombinant nucleic acids or a genetically modified organism in the process.[8] However, some broad definitions of genetic engineering include selective breeding.[9] Cloning and stem cell research, although not considered genetic engineering,[10] are closely related and genetic engineering can be used within them.[11] Synthetic biology is an emerging discipline that takes genetic engineering a step further by introducing artificially synthesised material into an organism.[12]

Plants, animals or microorganisms that have been changed through genetic engineering are termed genetically modified organisms or GMOs.[13] If genetic material from another species is added to the host, the resulting organism is called transgenic. If genetic material from the same species or a species that can naturally breed with the host is used the resulting organism is called cisgenic.[14] If genetic engineering is used to remove genetic material from the target organism the resulting organism is termed a knockout organism.[15] In Europe genetic modification is synonymous with genetic engineering while within the United States of America and Canada genetic modification can also be used to refer to more conventional breeding methods.[16][17][18]

Humans have altered the genomes of species for thousands of years through selective breeding, or artificial selection,[19]:1[20]:1 as contrasted with natural selection, and more recently through mutagenesis. Genetic engineering as the direct manipulation of DNA by humans outside breeding and mutation has only existed since the 1970s. The term “genetic engineering” was first coined by Jack Williamson in his science fiction novel Dragon’s Island, published in 1951,[21] one year before DNA’s role in heredity was confirmed by Alfred Hershey and Martha Chase,[22] and two years before James Watson and Francis Crick showed that the DNA molecule has a double-helix structure. The general concept of direct genetic manipulation had been explored in rudimentary form in Stanley G. Weinbaum’s 1936 science fiction story Proteus Island.[23][24]

In 1972, Paul Berg created the first recombinant DNA molecules by combining DNA from the monkey virus SV40 with that of the lambda virus.[25] In 1973 Herbert Boyer and Stanley Cohen created the first transgenic organism by inserting antibiotic resistance genes into the plasmid of an Escherichia coli bacterium.[26][27] A year later Rudolf Jaenisch created a transgenic mouse by introducing foreign DNA into its embryo, making it the world’s first transgenic animal.[28] These achievements led to concerns in the scientific community about potential risks from genetic engineering, which were first discussed in depth at the Asilomar Conference in 1975. One of the main recommendations from this meeting was that government oversight of recombinant DNA research should be established until the technology was deemed safe.[29][30]

In 1976 Genentech, the first genetic engineering company, was founded by Herbert Boyer and Robert Swanson, and a year later the company produced a human protein (somatostatin) in E. coli. Genentech announced the production of genetically engineered human insulin in 1978.[31] In 1980, the U.S. Supreme Court in the Diamond v. Chakrabarty case ruled that genetically altered life could be patented.[32] The insulin produced by bacteria was approved for release by the Food and Drug Administration (FDA) in 1982.[33]

In 1983, a biotech company, Advanced Genetic Sciences (AGS) applied for U.S. government authorisation to perform field tests with the ice-minus strain of Pseudomonas syringae to protect crops from frost, but environmental groups and protestors delayed the field tests for four years with legal challenges.[34] In 1987, the ice-minus strain of P. syringae became the first genetically modified organism (GMO) to be released into the environment[35] when a strawberry field and a potato field in California were sprayed with it.[36] Both test fields were attacked by activist groups the night before the tests occurred: “The world’s first trial site attracted the world’s first field trasher”.[35]

The first field trials of genetically engineered plants occurred in France and the USA in 1986, when tobacco plants were engineered to be resistant to herbicides.[37] The People’s Republic of China was the first country to commercialise transgenic plants, introducing a virus-resistant tobacco in 1992.[38] In 1994 Calgene attained approval to commercially release the first genetically modified food, the Flavr Savr, a tomato engineered to have a longer shelf life.[39] In 1994, the European Union approved tobacco engineered to be resistant to the herbicide bromoxynil, making it the first genetically engineered crop commercialised in Europe.[40] In 1995, Bt Potato was approved safe by the Environmental Protection Agency, after having been approved by the FDA, making it the first pesticide-producing crop to be approved in the USA.[41] In 2009, eleven transgenic crops were grown commercially in 25 countries, the largest of which by area grown were the USA, Brazil, Argentina, India, Canada, China, Paraguay and South Africa.[42]

In 2010, scientists at the J. Craig Venter Institute created the first synthetic genome and inserted it into an empty bacterial cell. The resulting bacterium, named Mycoplasma laboratorium, could replicate and produce proteins.[43][44] Four years later this was taken a step further when a bacterium was developed that replicated a plasmid containing a unique base pair, creating the first organism engineered to use an expanded genetic alphabet.[45][46] In 2012, Jennifer Doudna and Emmanuelle Charpentier collaborated to develop the CRISPR/Cas9 system,[47][48] a technique which can be used to easily and specifically alter the genome of almost any organism.[49]

Creating a GMO is a multi-step process. Genetic engineers must first choose what gene they wish to insert into the organism. This is driven by what the aim is for the resultant organism and is built on earlier research. Genetic screens can be carried out to determine potential genes and further tests then used to identify the best candidates. The development of microarrays, transcriptomes and genome sequencing has made it much easier to find suitable genes.[50] Luck also plays its part; the Roundup Ready gene was discovered after scientists noticed a bacterium thriving in the presence of the herbicide.[51]

The next step is to isolate the candidate gene. The cell containing the gene is opened and the DNA is purified.[52] The gene is separated by using restriction enzymes to cut the DNA into fragments[53] or polymerase chain reaction (PCR) to amplify the gene segment.[54] These segments can then be extracted through gel electrophoresis. If the chosen gene or the donor organism’s genome has been well studied it may already be accessible from a genetic library. If the DNA sequence is known, but no copies of the gene are available, it can also be artificially synthesised.[55] Once isolated, the gene is ligated into a plasmid that is then inserted into a bacterium. The plasmid is replicated when the bacteria divide, ensuring unlimited copies of the gene are available.[56]
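The site-specific cutting behaviour of a restriction enzyme can be sketched in silico. The snippet below is an illustrative Python toy, not lab software: it splits a made-up sequence wherever the EcoRI recognition site GAATTC occurs, cutting after the leading G as the enzyme does on each strand.

```python
# Illustrative in-silico restriction digest. EcoRI recognises the
# palindromic site GAATTC and cuts between G and A; the DNA sequence
# below is invented for demonstration, not a real gene.

def ecori_digest(seq: str) -> list[str]:
    """Split a DNA sequence at every EcoRI site, cutting after the G."""
    fragments = []
    start = 0
    pos = seq.find("GAATTC")
    while pos != -1:
        fragments.append(seq[start:pos + 1])  # cut leaves ...G | AATTC...
        start = pos + 1
        pos = seq.find("GAATTC", pos + 1)
    fragments.append(seq[start:])
    return fragments

dna = "ATGGAATTCCCGTTAGAATTCAAT"
print(ecori_digest(dna))  # → ['ATGG', 'AATTCCCGTTAG', 'AATTCAAT']
```

The staggered cut modelled here is what produces the “sticky ends” that let a fragment be ligated into a plasmid opened with the same enzyme.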

Before the gene is inserted into the target organism it must be combined with other genetic elements. These include a promoter and terminator region, which initiate and end transcription. A selectable marker gene is added, which in most cases confers antibiotic resistance, so researchers can easily determine which cells have been successfully transformed. The gene can also be modified at this stage for better expression or effectiveness. These manipulations are carried out using recombinant DNA techniques, such as restriction digests, ligations and molecular cloning.[57]
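The ordering of these genetic elements can be illustrated with a small Python sketch that assembles a “construct” by concatenation. All names and sequences below are invented placeholders; real construct design also handles reading frames, restriction sites and vector backbone features.

```python
# Hypothetical sketch of an expression construct assembled in order:
# promoter -> gene of interest -> terminator -> selectable marker.
# Every "sequence" here is a placeholder, not a real biological part.

def build_construct(promoter: str, gene: str, terminator: str, marker: str) -> str:
    """Concatenate the elements in the order they would sit on the insert."""
    return promoter + gene + terminator + marker

construct = build_construct(
    promoter="TTGACA",    # initiates transcription (placeholder)
    gene="ATGAAATAA",     # coding sequence of interest (placeholder)
    terminator="AATAAA",  # ends transcription (placeholder)
    marker="ATGCGCTAA",   # e.g. an antibiotic-resistance gene (placeholder)
)
print(construct)  # → TTGACAATGAAATAAAATAAAATGCGCTAA
```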

There are a number of techniques available for inserting the gene into the host genome. Some bacteria can naturally take up foreign DNA. This ability can be induced in other bacteria via stress (e.g. thermal or electric shock), which increases the cell membrane’s permeability to DNA; up-taken DNA can either integrate with the genome or exist as extrachromosomal DNA. DNA is generally inserted into animal cells using microinjection, where it can be injected through the cell’s nuclear envelope directly into the nucleus, or through the use of viral vectors.[58]

In plants the DNA is often inserted using Agrobacterium-mediated recombination,[59] taking advantage of Agrobacterium’s T-DNA sequence, which allows natural insertion of genetic material into plant cells.[60] Other methods include biolistics, where particles of gold or tungsten are coated with DNA and then shot into young plant cells,[61] and electroporation, which involves using an electric shock to make the cell membrane permeable to plasmid DNA. Because of the damage caused to the cells and DNA, the transformation efficiency of biolistics and electroporation is lower than that of agrobacterial transformation and microinjection.[62]

As only a single cell is transformed with genetic material, the organism must be regenerated from that single cell. In plants this is accomplished through the use of tissue culture.[63][64] In animals it is necessary to ensure that the inserted DNA is present in the embryonic stem cells.[65] Bacteria consist of a single cell and reproduce clonally so regeneration is not necessary. Selectable markers are used to easily differentiate transformed from untransformed cells. These markers are usually present in the transgenic organism, although a number of strategies have been developed that can remove the selectable marker from the mature transgenic plant.[66]

Further testing using PCR, Southern hybridization, and DNA sequencing is conducted to confirm that an organism contains the new gene.[67] These tests can also confirm the chromosomal location and copy number of the inserted gene. The presence of the gene does not guarantee it will be expressed at appropriate levels in the target tissue so methods that look for and measure the gene products (RNA and protein) are also used. These include northern hybridisation, quantitative RT-PCR, Western blot, immunofluorescence, ELISA and phenotypic analysis.[68]

The new genetic material can be inserted randomly within the host genome or targeted to a specific location. The technique of gene targeting uses homologous recombination to make desired changes to a specific endogenous gene. This tends to occur at a relatively low frequency in plants and animals and generally requires the use of selectable markers. The frequency of gene targeting can be greatly enhanced through genome editing. Genome editing uses artificially engineered nucleases that create specific double-stranded breaks at desired locations in the genome, and uses the cell’s endogenous mechanisms to repair the induced break by the natural processes of homologous recombination and nonhomologous end-joining. There are four families of engineered nucleases: meganucleases,[69][70] zinc finger nucleases,[71][72] transcription activator-like effector nucleases (TALENs),[73][74] and the Cas9-guideRNA system (adapted from CRISPR).[75][76] TALEN and CRISPR are the two most commonly used and each has its own advantages.[77] TALENs have greater target specificity, while CRISPR is easier to design and more efficient.[77] In addition to enhancing gene targeting, engineered nucleases can be used to introduce mutations at endogenous genes that generate a gene knockout.[78][79]
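Part of why CRISPR guides are easy to design is that the targeting rule amounts to a string search: the commonly used SpCas9 cuts where a 20-nucleotide protospacer is immediately followed by an NGG PAM. The Python toy below scans one strand for such candidate sites; real guide design would also scan the reverse complement and screen candidates for off-targets and GC content.

```python
# Hedged sketch: find candidate SpCas9 target sites on one DNA strand,
# i.e. every 20-nt protospacer immediately followed by an NGG PAM.
# The lookahead lets overlapping candidates all be reported.
import re

def find_cas9_sites(seq: str) -> list[tuple[int, str]]:
    """Return (position, protospacer) pairs that have an NGG PAM downstream."""
    sites = []
    for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq):
        sites.append((m.start(), m.group(1)))
    return sites

demo = "GACT" * 5 + "AGG"  # invented 20-nt target followed by an AGG PAM
print(find_cas9_sites(demo))  # → [(0, 'GACTGACTGACTGACTGACT')]
```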

Genetic engineering has applications in medicine, research, industry and agriculture and can be used on a wide range of plants, animals and microorganisms. Bacteria, the first organisms to be genetically modified, can have plasmid DNA inserted containing new genes that code for medicines or enzymes that process food and other substrates.[80][81] Plants have been modified for insect protection, herbicide resistance, virus resistance, enhanced nutrition, tolerance to environmental pressures and the production of edible vaccines.[82] Most commercialised GMOs are insect-resistant or herbicide-tolerant crop plants.[83] Genetically modified animals have been used for research, as model animals and for the production of agricultural or pharmaceutical products. They include animals with genes knocked out, increased susceptibility to disease, hormones for extra growth and the ability to express proteins in their milk.[84]

Genetic engineering has many applications in medicine, including the manufacture of drugs, the creation of model animals that mimic human conditions, and gene therapy. One of the earliest uses of genetic engineering was to mass-produce human insulin in bacteria.[31] This application has since been extended to human growth hormone, follicle-stimulating hormone (for treating infertility), human albumin, monoclonal antibodies, antihemophilic factors, vaccines and many other drugs.[85][86] Mouse hybridomas, cells fused together to create monoclonal antibodies, have been adapted through genetic engineering to create human monoclonal antibodies.[87] In 2017, genetic engineering of chimeric antigen receptors on a patient’s own T-cells was approved by the U.S. FDA as a treatment for the cancer acute lymphoblastic leukemia. Genetically engineered viruses are being developed that can still confer immunity, but lack the infectious sequences.[88]

Genetic engineering is also used to create animal models of human diseases. Genetically modified mice are the most common genetically engineered animal model.[89] They have been used to study and model cancer (the oncomouse), obesity, heart disease, diabetes, arthritis, substance abuse, anxiety, aging and Parkinson’s disease.[90] Potential cures can be tested against these mouse models. Genetically modified pigs have also been bred with the aim of increasing the success of pig-to-human organ transplantation.[91]

Gene therapy is the genetic engineering of humans, generally by replacing defective genes with effective ones. Clinical research using somatic gene therapy has been conducted with several diseases, including X-linked SCID,[92] chronic lymphocytic leukemia (CLL),[93][94] and Parkinson’s disease.[95] In 2012, Alipogene tiparvovec became the first gene therapy treatment to be approved for clinical use.[96][97] In 2015 a virus was used to insert a healthy gene into the skin cells of a boy suffering from a rare skin disease, epidermolysis bullosa, in order to grow, and then graft, healthy skin onto the 80 percent of the boy’s body affected by the illness.[98] Germline gene therapy would result in any change being inheritable, which has raised concerns within the scientific community.[99][100] In 2015, CRISPR was used to edit the DNA of non-viable human embryos,[101][102] leading scientists of major world academies to call for a moratorium on inheritable human genome edits.[103] There are also concerns that the technology could be used not just for treatment, but for enhancement, modification or alteration of a human being’s appearance, adaptability, intelligence, character or behavior.[104] The distinction between cure and enhancement can also be difficult to establish.[105]

Researchers are altering the genome of pigs to induce the growth of human organs to be used in transplants. Scientists are creating “gene drives”, changing the genomes of mosquitoes to make them immune to malaria, and then spreading the genetically altered mosquitoes throughout the mosquito population in the hopes of eliminating the disease.[106]

Genetic engineering is an important tool for natural scientists. Genes and other genetic information from a wide range of organisms can be inserted into bacteria for storage and modification, creating genetically modified bacteria in the process. Bacteria are cheap, easy to grow, clonal, multiply quickly, are relatively easy to transform and can be stored at −80 °C almost indefinitely. Once a gene is isolated it can be stored inside the bacteria, providing an unlimited supply for research.[107]

Organisms are genetically engineered to discover the functions of certain genes. This could be the effect on the phenotype of the organism, where the gene is expressed or what other genes it interacts with. These experiments generally involve loss of function, gain of function, tracking and expression.

Organisms can have their cells transformed with a gene coding for a useful protein, such as an enzyme, so that they will overexpress the desired protein. Mass quantities of the protein can then be manufactured by growing the transformed organism in bioreactor equipment using industrial fermentation, and then purifying the protein.[111] Some genes do not work well in bacteria, so yeast, insect cells or mammalian cells can also be used.[112] These techniques are used to produce medicines such as insulin, human growth hormone, and vaccines, supplements such as tryptophan, aids in the production of food (chymosin in cheese making) and fuels.[113] Other applications with genetically engineered bacteria could involve making them perform tasks outside their natural cycle, such as making biofuels,[114] cleaning up oil spills, carbon and other toxic waste[115] and detecting arsenic in drinking water.[116] Certain genetically modified microbes can also be used in biomining and bioremediation, due to their ability to extract heavy metals from their environment and incorporate them into compounds that are more easily recoverable.[117]

In materials science, a genetically modified virus has been used in a research laboratory as a scaffold for assembling a more environmentally friendly lithium-ion battery.[118][119] Bacteria have also been engineered to function as sensors by expressing a fluorescent protein under certain environmental conditions.[120]

One of the best-known and most controversial applications of genetic engineering is the creation and use of genetically modified crops or genetically modified livestock to produce genetically modified food. Crops have been developed to increase production, increase tolerance to abiotic stresses, alter the composition of the food, or to produce novel products.[122]

The first crops to be released commercially on a large scale provided protection from insect pests or tolerance to herbicides. Fungal- and virus-resistant crops have also been developed or are in development.[123][124] This makes the insect and weed management of crops easier and can indirectly increase crop yield.[125][126] GM crops that directly improve yield by accelerating growth or making the plant more hardy (by improving salt, cold or drought tolerance) are also under development.[127] In 2016, salmon genetically modified with growth hormones to reach normal adult size much faster went on sale.[128]

GMOs have been developed that modify the quality of produce by increasing the nutritional value or providing more industrially useful qualities or quantities.[127] The Amflora potato produces a more industrially useful blend of starches. Soybeans and canola have been genetically modified to produce healthier oils.[129][130] The first commercialised GM food was a tomato that had delayed ripening, increasing its shelf life.[131]

Plants and animals have been engineered to produce materials they do not normally make. Pharming uses crops and animals as bioreactors to produce vaccines, drug intermediates, or the drugs themselves; the useful product is purified from the harvest and then used in the standard pharmaceutical production process.[132] Cows and goats have been engineered to express drugs and other proteins in their milk, and in 2009 the FDA approved a drug produced in goat milk.[133][134]

Genetic engineering has potential applications in conservation and natural area management. Gene transfer through viral vectors has been proposed as a means of controlling invasive species as well as vaccinating threatened fauna from disease.[135] Transgenic trees have been suggested as a way to confer resistance to pathogens in wild populations.[136] With the increasing risks of maladaptation in organisms as a result of climate change and other perturbations, facilitated adaptation through gene tweaking could be one solution to reducing extinction risks.[137] Applications of genetic engineering in conservation are thus far mostly theoretical and have yet to be put into practice.

Genetic engineering is also being used to create microbial art.[138] Some bacteria have been genetically engineered to create black and white photographs.[139] Novelty items such as lavender-colored carnations,[140] blue roses,[141] and glowing fish[142][143] have also been produced through genetic engineering.

The regulation of genetic engineering concerns the approaches taken by governments to assess and manage the risks associated with the development and release of GMOs. The development of a regulatory framework began in 1975, at Asilomar, California.[144] The Asilomar meeting recommended a set of voluntary guidelines regarding the use of recombinant technology.[145] As the technology improved, the USA established a committee at the Office of Science and Technology,[146] which assigned regulatory approval of GM plants to the USDA, FDA and EPA.[147] The Cartagena Protocol on Biosafety, an international treaty that governs the transfer, handling, and use of GMOs,[148] was adopted on 29 January 2000.[149] One hundred and fifty-seven countries are members of the Protocol and many use it as a reference point for their own regulations.[150]

The legal and regulatory status of GM foods varies by country, with some nations banning or restricting them, and others permitting them with widely differing degrees of regulation.[151][152][153][154] Some countries allow the import of GM food with authorisation, but either do not allow its cultivation (Russia, Norway, Israel) or have provisions for cultivation, but no GM products are yet produced (Japan, South Korea). Most countries that do not allow for GMO cultivation do permit research.[155] Some of the most marked differences occur between the USA and Europe. US policy focuses on the product (not the process), only looks at verifiable scientific risks and uses the concept of substantial equivalence.[156] The European Union by contrast has possibly the most stringent GMO regulations in the world.[157] All GMOs, along with irradiated food, are considered “new food” and subject to extensive, case-by-case, science-based food evaluation by the European Food Safety Authority. The criteria for authorisation fall into four broad categories: “safety,” “freedom of choice,” “labelling,” and “traceability.”[158] The level of regulation in other countries that cultivate GMOs lies between that of Europe and the United States.

One of the key issues concerning regulators is whether GM products should be labeled. The European Commission says that mandatory labeling and traceability are needed to allow for informed choice, avoid potential false advertising[169] and facilitate the withdrawal of products if adverse effects on health or the environment are discovered.[170] The American Medical Association[171] and the American Association for the Advancement of Science[172] say that absent scientific evidence of harm, even voluntary labeling is misleading and will falsely alarm consumers. Labeling of GMO products in the marketplace is required in 64 countries.[173] Labeling can be mandatory up to a threshold GM content level (which varies between countries) or voluntary. In Canada and the USA labeling of GM food is voluntary,[174] while in Europe all food (including processed food) or feed which contains greater than 0.9% of approved GMOs must be labelled.[157]

Critics have objected to the use of genetic engineering on several grounds, including ethical, ecological and economic concerns. Many of these concerns involve GM crops and whether food produced from them is safe, whether it should be labeled and what impact growing them will have on the environment. These controversies have led to litigation, international trade disputes, and protests, and to restrictive regulation of commercial products in some countries.[175]

Accusations that scientists are “playing God” and other religious issues have been ascribed to the technology from the beginning.[176] Other ethical issues raised include the patenting of life,[177] the use of intellectual property rights,[178] the level of labeling on products,[179][180] control of the food supply[181] and the objectivity of the regulatory process.[182] Although doubts have been raised,[183] economically most studies have found growing GM crops to be beneficial to farmers.[184][185][186]

Gene flow between GM crops and compatible plants, along with increased use of selective herbicides, can increase the risk of “superweeds” developing.[187] Other environmental concerns involve potential impacts on non-target organisms, including soil microbes,[188] and an increase in secondary and resistant insect pests.[189][190] Many of the environmental impacts of GM crops may take many years to be understood, and are also evident in conventional agricultural practices.[188][191] With the commercialisation of genetically modified fish there are concerns over what the environmental consequences will be if they escape.[192]

There are three main concerns over the safety of genetically modified food: whether it may provoke an allergic reaction; whether genes could transfer from the food into human cells; and whether genes not approved for human consumption could outcross to other crops.[193] There is a scientific consensus[194][195][196][197] that currently available food derived from GM crops poses no greater risk to human health than conventional food,[198][199][200][201][202] but that each GM food needs to be tested on a case-by-case basis before introduction.[203][204][205] Nonetheless, members of the public are much less likely than scientists to perceive GM foods as safe.[206][207][208][209]

The literature on biodiversity and GE food/feed consumption has sometimes resulted in animated debate regarding the suitability of the experimental designs, the choice of statistical methods or the public accessibility of data. Such debate, even if positive and part of the natural process of review by the scientific community, has frequently been distorted by the media and often used politically and inappropriately in anti-GE crops campaigns.

Panchin, Alexander Y.; Tuzhikov, Alexander I. (14 January 2016). “Published GMO studies find no evidence of harm when corrected for multiple comparisons”. Critical Reviews in Biotechnology: 1–5. doi:10.3109/07388551.2015.1130684. ISSN 0738-8551. PMID 26767435. Here, we show that a number of articles, some of which have strongly and negatively influenced the public opinion on GM crops and even provoked political actions, such as GMO embargo, share common flaws in the statistical evaluation of the data. Having accounted for these flaws, we conclude that the data presented in these articles does not provide any substantial evidence of GMO harm.

The presented articles suggesting possible harm of GMOs received high public attention. However, despite their claims, they actually weaken the evidence for the harm and lack of substantial equivalency of studied GMOs. We emphasize that with over 1783 published articles on GMOs over the last 10 years it is expected that some of them should have reported undesired differences between GMOs and conventional crops even if no such differences exist in reality.

Yang, Y.T.; Chen, B. (2016). “Governing GMOs in the USA: science, law and public health”. Journal of the Science of Food and Agriculture. 96: 1851–1855. doi:10.1002/jsfa.7523. PMID 26536836. It is therefore not surprising that efforts to require labeling and to ban GMOs have been a growing political issue in the USA (citing Domingo and Bordonaba, 2011).

Overall, a broad scientific consensus holds that currently marketed GM food poses no greater risk than conventional food… Major national and international science and medical associations have stated that no adverse human health effects related to GMO food have been reported or substantiated in peer-reviewed literature to date.

Despite various concerns, today, the American Association for the Advancement of Science, the World Health Organization, and many independent international science organizations agree that GMOs are just as safe as other foods. Compared with conventional breeding techniques, genetic engineering is far more precise and, in most cases, less likely to create an unexpected outcome.

GM foods currently available on the international market have passed safety assessments and are not likely to present risks for human health. In addition, no effects on human health have been shown as a result of the consumption of such foods by the general population in the countries where they have been approved. Continuous application of safety assessments based on the Codex Alimentarius principles and, where appropriate, adequate post market monitoring, should form the basis for ensuring the safety of GM foods.

“Genetically modified foods and health: a second interim statement” (PDF). British Medical Association. March 2004. Retrieved 21 March 2016. In our view, the potential for GM foods to cause harmful health effects is very small and many of the concerns expressed apply with equal vigour to conventionally derived foods. However, safety concerns cannot, as yet, be dismissed completely on the basis of information currently available.

When seeking to optimise the balance between benefits and risks, it is prudent to err on the side of caution and, above all, learn from accumulating knowledge and experience. Any new technology such as genetic modification must be examined for possible benefits and risks to human health and the environment. As with all novel foods, safety assessments in relation to GM foods must be made on a case-by-case basis.

Members of the GM jury project were briefed on various aspects of genetic modification by a diverse group of acknowledged experts in the relevant subjects. The GM jury reached the conclusion that the sale of GM foods currently available should be halted and the moratorium on commercial growth of GM crops should be continued. These conclusions were based on the precautionary principle and lack of evidence of any benefit. The Jury expressed concern over the impact of GM crops on farming, the environment, food safety and other potential health effects.

The Royal Society review (2002) concluded that the risks to human health associated with the use of specific viral DNA sequences in GM plants are negligible, and while calling for caution in the introduction of potential allergens into food crops, stressed the absence of evidence that commercially available GM foods cause clinical allergic manifestations. The BMA shares the view that there is no robust evidence to prove that GM foods are unsafe, but we endorse the call for further research and surveillance to provide convincing evidence of safety and benefit.

Genetic Engineering | Definition of Genetic Engineering by …

: the group of applied techniques of genetics and biotechnology used to cut up and join together genetic material and especially DNA from one or more species of organism and to introduce the result into an organism in order to change one or more of its characteristics

genetically engineered adjective

genetic engineer noun

Libertarianism – Wikipedia

“Libertarians” redirects here. For political parties that may go by this name, see Libertarian Party.

Libertarianism (from Latin: libertas, meaning “freedom”) is a collection of political philosophies and movements that uphold liberty as a core principle.[1] Libertarians seek to maximize political freedom and autonomy, emphasizing freedom of choice, voluntary association, and individual judgment.[2][3][4] Libertarians share a skepticism of authority and state power, but they diverge on the scope of their opposition to existing political and economic systems. Various schools of libertarian thought offer a range of views regarding the legitimate functions of state and private power, often calling for the restriction or dissolution of coercive social institutions.[5]

Left-libertarian ideologies seek to abolish capitalism and private ownership of the means of production, or else to restrict their purview or effects, in favor of common or cooperative ownership and management, viewing private property as a barrier to freedom and liberty.[6][7][8][9] In contrast, modern right-libertarian ideologies, such as minarchism and anarcho-capitalism, instead advocate laissez-faire capitalism and strong private property rights,[10] such as in land, infrastructure, and natural resources.

The first recorded use of the term “libertarian” was in 1789, when William Belsham wrote about libertarianism in the context of metaphysics.[11]

“Libertarian” came to mean an advocate or defender of liberty, especially in the political and social spheres, as early as 1796, when the London Packet printed on 12 February: “Lately marched out of the Prison at Bristol, 450 of the French Libertarians”.[12] The word was again used in a political sense in 1802 in a short piece critiquing a poem by “the author of Gebir” and has since been used with this meaning.[13][14][15]

The use of the word “libertarian” to describe a new set of political positions has been traced to the French cognate, libertaire, coined in a letter French libertarian communist Joseph Déjacque wrote to mutualist Pierre-Joseph Proudhon in 1857.[16][17][18] Déjacque also used the term for his anarchist publication Le Libertaire: Journal du Mouvement Social, which was printed from 9 June 1858 to 4 February 1861 in New York City.[19][20] In the mid-1890s, Sébastien Faure began publishing a new Le Libertaire while France’s Third Republic enacted the lois scélérates (“villainous laws”), which banned anarchist publications in France. Libertarianism has frequently been used as a synonym for anarchism since this time.[21][22][23]

The term “libertarianism” was first used in the United States as a synonym for classical liberalism in May 1955 by writer Dean Russell, a colleague of Leonard Read and a classical liberal himself. He justified the choice of the word as follows: “Many of us call ourselves ‘liberals.’ And it is true that the word ‘liberal’ once described persons who respected the individual and feared the use of mass compulsions. But the leftists have now corrupted that once-proud term to identify themselves and their program of more government ownership of property and more controls over persons. As a result, those of us who believe in freedom must explain that when we call ourselves liberals, we mean liberals in the uncorrupted classical sense. At best, this is awkward and subject to misunderstanding. Here is a suggestion: Let those of us who love liberty trade-mark and reserve for our own use the good and honorable word ‘libertarian'”.[24]

Subsequently, a growing number of Americans with classical liberal beliefs in the United States began to describe themselves as “libertarian”. The person most responsible for popularizing the term “libertarian” was Murray Rothbard,[25] who started publishing libertarian works in the 1960s.

Libertarianism in the United States has been described as conservative on economic issues and liberal on personal freedom[26] (for common meanings of conservative and liberal in the United States) and it is also often associated with a foreign policy of non-interventionism.[27][28]

Although the word “libertarian” has been used to refer to socialists internationally, its meaning in the United States has deviated from its political origins.[29][30]

There is contention about whether left and right libertarianism “represent distinct ideologies as opposed to variations on a theme”.[31] All libertarians begin with a conception of personal autonomy from which they argue in favor of civil liberties and a reduction or elimination of the state.

Left-libertarianism encompasses those libertarian beliefs that claim the Earth’s natural resources belong to everyone in an egalitarian manner, either unowned or owned collectively. Contemporary left-libertarians such as Hillel Steiner, Peter Vallentyne, Philippe Van Parijs, Michael Otsuka and David Ellerman believe the appropriation of land must leave “enough and as good” for others or be taxed by society to compensate for the exclusionary effects of private property. Libertarian socialists (social and individualist anarchists, libertarian Marxists, council communists, Luxemburgists and DeLeonists) promote usufruct and socialist economic theories, including communism, collectivism, syndicalism and mutualism. They criticize the state for being the defender of private property and believe capitalism entails wage slavery.

Right-libertarianism[32] developed in the United States in the mid-20th century and is the most popular conception of libertarianism in that region.[33] It is commonly referred to as a continuation or radicalization of classical liberalism.[34][35] Right-libertarians, while often sharing left-libertarians’ advocacy for social freedom, also value the social institutions that enforce conditions of capitalism, while rejecting institutions that function in opposition to these on the grounds that such interventions represent unnecessary coercion of individuals and abrogation of their economic freedom.[36] Anarcho-capitalists[37][38] seek complete elimination of the state in favor of privately funded security services while minarchists defend “night-watchman states”, which maintain only those functions of government necessary to maintain conditions of capitalism and personal security.

Anarchism envisages freedom as a form of autonomy,[39] which Paul Goodman describes as “the ability to initiate a task and do it one’s own way, without orders from authorities who do not know the actual problem and the available means”.[40] All anarchists oppose political and legal authority, but collectivist strains also oppose the economic authority of private property.[41] These social anarchists emphasize mutual aid, whereas individualist anarchists extoll individual sovereignty.[42]

Some right-libertarians consider the non-aggression principle (NAP) to be a core part of their beliefs.[43][44]

Libertarians have been advocates and activists of civil liberties, including free love and free thought.[45][46] Advocates of free love viewed sexual freedom as a clear, direct expression of individual sovereignty and they particularly stressed women’s rights as most sexual laws discriminated against women: for example, marriage laws and anti-birth control measures.[47]

Free love appeared alongside anarcha-feminism and advocacy of LGBT rights. Anarcha-feminism developed as a synthesis of radical feminism and anarchism and views patriarchy as a fundamental manifestation of compulsory government. It was inspired by the late-19th-century writings of early feminist anarchists such as Lucy Parsons, Emma Goldman, Voltairine de Cleyre and Virginia Bolten. Anarcha-feminists, like other radical feminists, criticise and advocate the abolition of traditional conceptions of family, education and gender roles. Free Society (1895–1897 as The Firebrand, 1897–1904 as Free Society) was an anarchist newspaper in the United States that staunchly advocated free love and women’s rights, while criticizing “comstockery”, the censorship of sexual information.[48] In recent times, anarchism has also voiced opinions and taken action around certain sex-related subjects such as pornography,[49] BDSM[50] and the sex industry.[50]

Free thought is a philosophical viewpoint that holds opinions should be formed on the basis of science, logic and reason in contrast with authority, tradition or other dogmas.[51][52] In the United States, free thought was an anti-Christian, anti-clerical movement whose purpose was to make the individual politically and spiritually free to decide on religious matters. A number of contributors to Liberty were prominent figures in both free thought and anarchism. In 1901, Catalan anarchist and free-thinker Francesc Ferrer i Guàrdia established “modern” or progressive schools in Barcelona in defiance of an educational system controlled by the Catholic Church.[53] Fiercely anti-clerical, Ferrer believed in “freedom in education”, i.e. education free from the authority of the church and state.[54] The schools’ stated goal was to “educate the working class in a rational, secular and non-coercive setting”. Later in the 20th century, Austrian Freudo-Marxist Wilhelm Reich became a consistent propagandist for sexual freedom going as far as opening free sex-counselling clinics in Vienna for working-class patients[55] as well as coining the phrase “sexual revolution” in one of his books from the 1940s.[56] During the early 1970s, the English anarchist and pacifist Alex Comfort achieved international celebrity for writing the sex manuals The Joy of Sex and More Joy of Sex.

Most left-libertarians are anarchists and believe the state inherently violates personal autonomy: “As Robert Paul Wolff has argued, since ‘the state is authority, the right to rule’, anarchism which rejects the State is the only political doctrine consistent with autonomy in which the individual alone is the judge of his moral constraints”.[41] Social anarchists believe the state defends private property, which they view as intrinsically harmful, while market-oriented left-libertarians argue that so-called free markets actually consist of economic privileges granted by the state. These latter libertarians advocate instead for freed markets, which are freed from these privileges.[57]

There is a debate amongst right-libertarians as to whether or not the state is legitimate: while anarcho-capitalists advocate its abolition, minarchists support minimal states, often referred to as night-watchman states. Libertarians take a skeptical view of government authority.[58][unreliable source?] Minarchists maintain that the state is necessary for the protection of individuals from aggression, theft, breach of contract and fraud. They believe the only legitimate governmental institutions are the military, police and courts, though some expand this list to include fire departments, prisons and the executive and legislative branches.[59] They justify the state on the grounds that it is the logical consequence of adhering to the non-aggression principle and argue that anarchism is immoral because it implies that the non-aggression principle is optional, that the enforcement of laws under anarchism is open to competition.[citation needed] Another common justification is that private defense agencies and court firms would tend to represent the interests of those who pay them enough.[60]

Anarcho-capitalists argue that the state violates the non-aggression principle (NAP) by its nature because governments use force against those who have not stolen or vandalized private property, assaulted anyone or committed fraud.[61][62] Linda & Morris Tannehill argue that no coercive monopoly of force can arise on a truly free market and that a government’s citizenry can not desert them in favor of a competent protection and defense agency.[63]

Left-libertarians believe that neither claiming nor mixing one’s labor with natural resources is enough to generate full private property rights[64][65] and maintain that natural resources ought to be held in an egalitarian manner, either unowned or owned collectively.[66]

Right-libertarians maintain that unowned natural resources “may be appropriated by the first person who discovers them, mixes his labor with them, or merely claims them, without the consent of others, and with little or no payment to them”. They believe that natural resources are originally unowned and therefore private parties may appropriate them at will without the consent of, or owing to, others.[67]

Left-libertarians (social and individualist anarchists, libertarian Marxists and left-wing market anarchists) argue in favor of socialist theories such as communism, syndicalism and mutualism (anarchist economics). Daniel Guérin writes that “anarchism is really a synonym for socialism. The anarchist is primarily a socialist whose aim is to abolish the exploitation of man by man. Anarchism is only one of the streams of socialist thought, that stream whose main components are concern for liberty and haste to abolish the State”.[68]

Right-libertarians are economic liberals of either the Austrian School or Chicago school and support laissez-faire capitalism.[69]

Wage labour has long been compared by socialists and anarcho-syndicalists to slavery.[70][71][72][73] As a result, the term “wage slavery” is often utilised as a pejorative for wage labor.[74] Advocates of slavery looked upon the “comparative evils of Slave Society and of Free Society, of slavery to human Masters and slavery to Capital”[75] and proceeded to argue that wage slavery was actually worse than chattel slavery.[76] Slavery apologists like George Fitzhugh contended that workers only accepted wage labour with the passage of time, as they became “familiarized and inattentive to the infected social atmosphere they continually inhale[d]”.[75]

According to Noam Chomsky, analysis of the psychological implications of wage slavery goes back to the Enlightenment era. In his 1791 book On the Limits of State Action, classical liberal thinker Wilhelm von Humboldt explained how “whatever does not spring from a man’s free choice, or is only the result of instruction and guidance, does not enter into his very nature; he does not perform it with truly human energies, but merely with mechanical exactness” and so when the labourer works under external control “we may admire what he does, but we despise what he is”.[77] For Marxists, labour-as-commodity, which is how they regard wage labour,[78] provides an absolutely fundamental point of attack against capitalism.[79] “It can be persuasively argued”, noted philosopher John Nelson, “that the conception of the worker’s labour as a commodity confirms Marx’s stigmatization of the wage system of private capitalism as ‘wage-slavery;’ that is, as an instrument of the capitalist’s for reducing the worker’s condition to that of a slave, if not below it”.[80] That this objection is fundamental follows immediately from Marx’s conclusion that wage labour is the very foundation of capitalism: “Without a class dependent on wages, the moment individuals confront each other as free persons, there can be no production of surplus value; without the production of surplus-value there can be no capitalist production, and hence no capital and no capitalist!”.[81]

Left-libertarianism (or left-wing libertarianism) names several related, but distinct approaches to political and social theory which stress both individual freedom and social equality. In its classical usage, left-libertarianism is a synonym for anti-authoritarian varieties of left-wing politics, i.e. libertarian socialism, which includes anarchism and libertarian Marxism, among others.[82][83] Left-libertarianism can also refer to political positions associated with academic philosophers Hillel Steiner, Philippe Van Parijs and Peter Vallentyne that combine self-ownership with an egalitarian approach to natural resources.[84]

While maintaining full respect for personal property, left-libertarians are skeptical of or fully against private property, arguing that neither claiming nor mixing one’s labor with natural resources is enough to generate full private property rights[85][86] and maintain that natural resources (land, oil, gold and vegetation) should be held in an egalitarian manner, either unowned or owned collectively. Those left-libertarians who support private property do so under the condition that recompense is offered to the local community.[86] Many left-libertarian schools of thought are communist, advocating the eventual replacement of money with labor vouchers or decentralized planning.

On the other hand, left-wing market anarchism, which includes Pierre-Joseph Proudhon’s mutualism and Samuel Edward Konkin III’s agorism, appeals to left-wing concerns such as egalitarianism, gender and sexuality, class, immigration and environmentalism within the paradigm of a socialist free market.[82]

Right-libertarianism (or right-wing libertarianism) refers to libertarian political philosophies that advocate negative rights, natural law and a major reversal of the modern welfare state.[87] Right-libertarians strongly support private property rights and defend market distribution of natural resources and private property.[88] This position is contrasted with that of some versions of left-libertarianism, which maintain that natural resources belong to everyone in an egalitarian manner, either unowned or owned collectively.[89] Right-libertarianism includes anarcho-capitalism and laissez-faire, minarchist liberalism.[note 1]

Elements of libertarianism can be traced as far back as the ancient Chinese philosopher Lao-Tzu and the higher-law concepts of the Greeks and the Israelites.[90][91] In 17th-century England, libertarian ideas began to take modern form in the writings of the Levellers and John Locke. In the middle of that century, opponents of royal power began to be called Whigs, or sometimes simply “opposition” or “country” (as opposed to Court) writers.[92]

During the 18th century, classical liberal ideas flourished in Europe and North America.[93][94] Libertarians of various schools were influenced by classical liberal ideas.[95] For libertarian philosopher Roderick T. Long, both libertarian socialists and libertarian capitalists “share a common, or at least an overlapping, intellectual ancestry… both claim the seventeenth century English Levellers and the eighteenth century French encyclopedists among their ideological forebears; and (also)… usually share an admiration for Thomas Jefferson[96][97][98] and Thomas Paine”.[99]

John Locke greatly influenced both libertarianism and the modern world in his writings published before and after the English Revolution of 1688, especially A Letter Concerning Toleration (1667), Two Treatises of Government (1689) and An Essay Concerning Human Understanding (1690). In the text of 1689, he established the basis of liberal political theory: that people’s rights existed before government; that the purpose of government is to protect personal and property rights; that people may dissolve governments that do not do so; and that representative government is the best form to protect rights.[100] The United States Declaration of Independence was inspired by Locke in its statement: “[T]o secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed. That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it”.[101] Nevertheless, scholar Ellen Meiksins Wood says that “there are doctrines of individualism that are opposed to Lockean individualism… and non-Lockean individualism may encompass socialism”.[102]

According to Murray Rothbard, the libertarian creed emerged from the classical liberal challenges to an “absolute central State and a king ruling by divine right on top of an older, restrictive web of feudal land monopolies and urban guild controls and restrictions”, the mercantilism of a bureaucratic warfaring state allied with privileged merchants. The object of classical liberals was individual liberty in the economy, in personal freedoms and civil liberty, separation of state and religion, and peace as an alternative to imperial aggrandizement. He cites Locke’s contemporaries, the Levellers, who held similar views. Also influential were the English “Cato’s Letters” during the early 1700s, reprinted eagerly by American colonists who already were free of European aristocracy and feudal land monopolies.[101]

In January of 1776, only two years after coming to America from England, Thomas Paine published his pamphlet Common Sense calling for independence for the colonies.[103] Paine promoted classical liberal ideas in clear, concise language that allowed the general public to understand the debates among the political elites.[104] Common Sense was immensely popular in disseminating these ideas,[105] selling hundreds of thousands of copies.[106] Paine later would write the Rights of Man and The Age of Reason and participate in the French Revolution.[103] Paine’s theory of property showed a “libertarian concern” with the redistribution of resources.[107]

In 1793, William Godwin wrote a libertarian philosophical treatise, Enquiry Concerning Political Justice and its Influence on Morals and Happiness, which criticized ideas of human rights and of society by contract based on vague promises. He took classical liberalism to its logical anarchic conclusion by rejecting all political institutions, law, government and apparatus of coercion as well as all political protest and insurrection. Instead of institutionalized justice, Godwin proposed that people influence one another to moral goodness through informal reasoned persuasion, including in the associations they joined as this would facilitate happiness.[108][109]

Modern anarchism sprang from the secular or religious thought of the Enlightenment, particularly Jean-Jacques Rousseau’s arguments for the moral centrality of freedom.[110]

As part of the political turmoil of the 1790s in the wake of the French Revolution, William Godwin developed the first expression of modern anarchist thought.[111][112] According to Peter Kropotkin, Godwin was “the first to formulate the political and economical conceptions of anarchism, even though he did not give that name to the ideas developed in his work”,[113] while Godwin attached his anarchist ideas to an early Edmund Burke.[114]

Godwin is generally regarded as the founder of the school of thought known as philosophical anarchism. He argued in Political Justice (1793)[112][115] that government has an inherently malevolent influence on society and that it perpetuates dependency and ignorance. He thought that the spread of the use of reason to the masses would eventually cause government to wither away as an unnecessary force. Although he did not accord the state with moral legitimacy, he was against the use of revolutionary tactics for removing the government from power. Rather, Godwin advocated for its replacement through a process of peaceful evolution.[112][116]

His aversion to the imposition of a rules-based society led him to denounce, as a manifestation of the people’s “mental enslavement”, the foundations of law, property rights and even the institution of marriage. Godwin considered the basic foundations of society as constraining the natural development of individuals to use their powers of reasoning to arrive at a mutually beneficial method of social organization. In each case, government and its institutions are shown to constrain the development of our capacity to live wholly in accordance with the full and free exercise of private judgment.

In France, various anarchist currents were present during the Revolutionary period, with some revolutionaries using the term anarchiste in a positive light as early as September 1793.[117] The enragés opposed revolutionary government as a contradiction in terms. Denouncing the Jacobin dictatorship, Jean Varlet wrote in 1794 that “government and revolution are incompatible, unless the people wishes to set its constituted authorities in permanent insurrection against itself”.[118] In his “Manifesto of the Equals”, Sylvain Maréchal looked forward to the disappearance, once and for all, of “the revolting distinction between rich and poor, of great and small, of masters and valets, of governors and governed”.[118]

Libertarian socialism, libertarian communism and libertarian Marxism are all phrases which activists with a variety of perspectives have applied to their views.[119] Anarchist communist philosopher Joseph Déjacque was the first person to describe himself as a libertarian.[120] Unlike mutualist anarchist philosopher Pierre-Joseph Proudhon, he argued that “it is not the product of his or her labor that the worker has a right to, but to the satisfaction of his or her needs, whatever may be their nature”.[121][122] According to anarchist historian Max Nettlau, the first use of the term “libertarian communism” was in November 1880, when a French anarchist congress employed it to more clearly identify its doctrines.[123] The French anarchist journalist Sébastien Faure started the weekly paper Le Libertaire (The Libertarian) in 1895.[124]

Individualist anarchism refers to several traditions of thought within the anarchist movement that emphasize the individual and their will over any kinds of external determinants such as groups, society, traditions, and ideological systems.[125][126] An influential form of individualist anarchism called egoism[127] or egoist anarchism was expounded by one of the earliest and best-known proponents of individualist anarchism, the German Max Stirner.[128] Stirner’s The Ego and Its Own, published in 1844, is a founding text of the philosophy.[128] According to Stirner, the only limitation on the rights of the individual is their power to obtain what they desire,[129] without regard for God, state or morality.[130] Stirner advocated self-assertion and foresaw unions of egoists, non-systematic associations continually renewed by all parties’ support through an act of will,[131] which Stirner proposed as a form of organisation in place of the state.[132] Egoist anarchists argue that egoism will foster genuine and spontaneous union between individuals.[133] Egoism has inspired many interpretations of Stirner’s philosophy. It was re-discovered and promoted by German philosophical anarchist and LGBT activist John Henry Mackay. Josiah Warren is widely regarded as the first American anarchist,[134] and the four-page weekly paper he edited during 1833, The Peaceful Revolutionist, was the first anarchist periodical published.[135] For American anarchist historian Eunice Minette Schuster, “[i]t is apparent… that Proudhonian Anarchism was to be found in the United States at least as early as 1848 and that it was not conscious of its affinity to the Individualist Anarchism of Josiah Warren and Stephen Pearl Andrews… William B. Greene presented this Proudhonian Mutualism in its purest and most systematic form”.[136] Later, Benjamin Tucker fused Stirner’s egoism with the economics of Warren and Proudhon in his eclectic, influential publication Liberty.
From these early influences, individualist anarchism in different countries attracted a small yet diverse following of bohemian artists and intellectuals,[137] free love and birth control advocates (anarchism and issues related to love and sex),[138][139] individualist naturists and nudists (anarcho-naturism),[140][141][142] free thought and anti-clerical activists[143][144] as well as young anarchist outlaws in what became known as illegalism and individual reclamation[145][146] (European individualist anarchism and individualist anarchism in France). These authors and activists included Émile Armand, Han Ryner, Henri Zisly, Renzo Novatore, Miguel Giménez Igualada, Adolf Brand and Lev Chernyi.

In 1873, the follower and translator of Proudhon, the Catalan Francesc Pi i Margall, became President of Spain with a program that sought “to establish a decentralized, or ‘cantonalist,’ political system on Proudhonian lines”.[147] According to Rudolf Rocker, he had “political ideas…much in common with those of Richard Price, Joseph Priestly [sic], Thomas Paine, Jefferson, and other representatives of the Anglo-American liberalism of the first period. He wanted to limit the power of the state to a minimum and gradually replace it by a Socialist economic order”.[148] Fermín Salvochea, meanwhile, was a mayor of the city of Cádiz and a president of the province of Cádiz. He was one of the main propagators of anarchist thought in that area in the late 19th century and is considered to be “perhaps the most beloved figure in the Spanish Anarchist movement of the 19th century”.[149][150] Ideologically, he was influenced by Bradlaugh, Owen and Paine, whose works he had studied during his stay in England, and by Kropotkin, whom he read later.[149] The revolutionary wave of 1917–1923 saw the active participation of anarchists in Russia and Europe. Russian anarchists participated alongside the Bolsheviks in both the February and October 1917 revolutions. However, Bolsheviks in central Russia quickly began to imprison or drive underground the libertarian anarchists. Many fled to the Ukraine.[151] There, in the Ukrainian Free Territory, they fought in the Russian Civil War against the White movement, monarchists and other opponents of revolution, and then against the Bolsheviks as part of the Revolutionary Insurrectionary Army of Ukraine led by Nestor Makhno, who established an anarchist society in the region for a number of months. The expelled American anarchists Emma Goldman and Alexander Berkman protested Bolshevik policy before they left Russia.[152]

The victory of the Bolsheviks damaged anarchist movements internationally as workers and activists joined Communist parties. In France and the United States, for example, members of the major syndicalist movements of the CGT and IWW joined the Communist International.[153] In Paris, the Dielo Truda group of Russian anarchist exiles, which included Nestor Makhno, issued a 1926 manifesto, the Organizational Platform of the General Union of Anarchists (Draft), calling for new anarchist organizing structures.[154][155]

The Bavarian Soviet Republic of 1918–1919 had libertarian socialist characteristics.[156][157] In Italy, from 1918 to 1921, the anarcho-syndicalist trade union Unione Sindacale Italiana grew to 800,000 members.[158]

In the 1920s and 1930s, with the rise of fascism in Europe, anarchists began to fight fascists in Italy,[159] in France during the February 1934 riots[160] and in Spain, where the CNT (Confederación Nacional del Trabajo) boycott of elections led to a right-wing victory and its later participation in voting in 1936 helped bring the popular front back to power. This led to an attempted coup by the ruling class and the Spanish Civil War (1936–1939).[161] The Gruppo Comunista Anarchico di Firenze held that during the early twentieth century, the terms libertarian communism and anarchist communism became synonymous within the international anarchist movement as a result of the close connection they had in Spain (anarchism in Spain), with libertarian communism becoming the prevalent term.[162]

Murray Bookchin wrote that the Spanish libertarian movement of the mid-1930s was unique because its workers’ control and collectives, which came out of a three-generation “massive libertarian movement”, divided the republican camp and challenged the Marxists. “Urban anarchists” created libertarian communist forms of organization which evolved into the CNT, a syndicalist union providing the infrastructure for a libertarian society. Also formed were local bodies to administer social and economic life on a decentralized libertarian basis. Much of the infrastructure was destroyed during the Spanish Civil War of the 1930s against authoritarian and fascist forces.[163] The Iberian Federation of Libertarian Youth[164] (FIJL, Spanish: Federación Ibérica de Juventudes Libertarias), sometimes abbreviated as Libertarian Youth (Juventudes Libertarias), was a libertarian socialist[165] organisation created in 1932 in Madrid.[166] In February 1937, the FIJL organised a plenum of regional organisations (the second congress of the FIJL). From 16 to 30 October 1938, in Barcelona, the FIJL participated in a national plenum of the libertarian movement, also attended by members of the CNT and the Iberian Anarchist Federation (FAI).[167] The FIJL still exists today. When the republican forces lost the Spanish Civil War, the city of Madrid was turned over to the Francoist forces in 1939 by the last non-Francoist mayor of the city, the anarchist Melchor Rodríguez García.[168] In the autumn of 1931, the “Manifesto of the 30” was published by militants of the anarchist trade union CNT; its signatories included the CNT General Secretary (1922–1923) Joan Peiró, Ángel Pestaña (CNT General Secretary in 1929) and Juan López Sánchez.
This faction came to be known as treintismo and called for “libertarian possibilism”, which advocated achieving libertarian socialist ends through participation within the structures of contemporary parliamentary democracy.[169] In 1932, they established the Syndicalist Party, which participated in the 1936 Spanish general election as part of the leftist coalition of parties known as the Popular Front, obtaining two congressmen (Pestaña and Benito Pabón). In 1938, Horacio Prieto, general secretary of the CNT, proposed that the Iberian Anarchist Federation transform itself into a “Libertarian Socialist Party” and participate in the national elections.[170]

The Manifesto of Libertarian Communism was written in 1953 by Georges Fontenis for the Fédération Communiste Libertaire of France. It is one of the key texts of the anarchist-communist current known as platformism.[171] In 1968, in Carrara, Italy, the International of Anarchist Federations was founded during an international anarchist conference to advance libertarian solidarity. It wanted to form “a strong and organised workers movement, agreeing with the libertarian ideas”.[172][173] In the United States, the Libertarian League was founded in New York City in 1954 as a left-libertarian political organisation building on the Libertarian Book Club.[174][175] Members included Sam Dolgoff,[176] Russell Blackwell, Dave Van Ronk, Enrico Arrigoni[177] and Murray Bookchin.

In Australia, the Sydney Push was a predominantly left-wing intellectual subculture in Sydney from the late 1940s to the early 1970s which became associated with the label “Sydney libertarianism”. Well known associates of the Push include Jim Baker, John Flaus, Harry Hooton, Margaret Fink, Sasha Soldatow,[178] Lex Banning, Eva Cox, Richard Appleton, Paddy McGuinness, David Makinson, Germaine Greer, Clive James, Robert Hughes, Frank Moorhouse and Lillian Roxon. Amongst the key intellectual figures in Push debates were philosophers David J. Ivison, George Molnar, Roelof Smilde, Darcy Waters and Jim Baker, as recorded in Baker’s memoir Sydney Libertarians and the Push, published in the libertarian Broadsheet in 1975.[179] An understanding of libertarian values and social theory can be obtained from their publications, a few of which are available online.[180][181]

In 1969, the French platformist anarcho-communist Daniel Guérin published an essay called “Libertarian Marxism?” in which he dealt with the debate between Karl Marx and Mikhail Bakunin at the First International and afterwards suggested that “[l]ibertarian marxism rejects determinism and fatalism, giving the greater place to individual will, intuition, imagination, reflex speeds, and to the deep instincts of the masses, which are more far-seeing in hours of crisis than the reasonings of the ‘elites’; libertarian marxism thinks of the effects of surprise, provocation and boldness, refuses to be cluttered and paralysed by a heavy ‘scientific’ apparatus, doesn’t equivocate or bluff, and guards itself from adventurism as much as from fear of the unknown”.[182] Libertarian Marxist currents often draw from Marx and Engels’ later works, specifically the Grundrisse and The Civil War in France.[183] They emphasize the Marxist belief in the ability of the working class to forge its own destiny without the need for a revolutionary party or state.[184] Libertarian Marxism includes such currents as council communism, left communism, Socialisme ou Barbarie, Lettrism/Situationism, operaismo/autonomism and the New Left.[185][unreliable source?] In the United States, from 1970 to 1981 there existed the publication Root & Branch,[186] which had as a subtitle “A Libertarian Marxist Journal”.[187] In 1974, the Libertarian Communism journal was started in the United Kingdom by a group inside the Socialist Party of Great Britain.[188] In 1986, the anarcho-syndicalist Sam Dolgoff started and led the publication Libertarian Labor Review in the United States,[189] which later renamed itself Anarcho-Syndicalist Review in order to avoid confusion with right-libertarian views.[190]

The indigenous anarchist tradition in the United States was largely individualist.[191] In 1825, Josiah Warren became aware of the social system of utopian socialist Robert Owen and began to talk with others in Cincinnati about founding a communist colony.[192] When this group failed to come to an agreement about the form and goals of their proposed community, Warren “sold his factory after only two years of operation, packed up his young family, and took his place as one of 900 or so Owenites who had decided to become part of the founding population of New Harmony, Indiana”.[193] Warren coined the phrase “cost the limit of price”[194] and “proposed a system to pay people with certificates indicating how many hours of work they did. They could exchange the notes at local time stores for goods that took the same amount of time to produce”.[195] He put his theories to the test by establishing an experimental labor-for-labor store called the Cincinnati Time Store where trade was facilitated by labor notes. The store proved successful and operated for three years, after which it was closed so that Warren could pursue establishing colonies based on mutualism, including Utopia and Modern Times. “After New Harmony failed, Warren shifted his ideological loyalties from socialism to anarchism (which was no great leap, given that Owen’s socialism had been predicated on Godwin’s anarchism)”.[196] Warren is widely regarded as the first American anarchist[195] and the four-page weekly paper The Peaceful Revolutionist he edited during 1833 was the first anarchist periodical published,[135] an enterprise for which he built his own printing press, cast his own type and made his own printing plates.[135]

Catalan historian Xavier Diez reports that the intentional communal experiments pioneered by Warren were influential among European individualist anarchists of the late 19th and early 20th centuries, such as Émile Armand, and the intentional communities they started.[197] Warren said that Stephen Pearl Andrews, individualist anarchist and close associate, wrote the most lucid and complete exposition of Warren’s own theories in The Science of Society, published in 1852.[198] Andrews was formerly associated with the Fourierist movement, but converted to radical individualism after becoming acquainted with the work of Warren. Like Warren, he held the principle of “individual sovereignty” as being of paramount importance. Contemporary American anarchist Hakim Bey reports:

Steven Pearl Andrews… was not a fourierist, but he lived through the brief craze for phalansteries in America and adopted a lot of fourierist principles and practices… a maker of worlds out of words. He syncretized abolitionism in the United States, free love, spiritual universalism, Warren, and Fourier into a grand utopian scheme he called the Universal Pantarchy… He was instrumental in founding several ‘intentional communities,’ including the ‘Brownstone Utopia’ on 14th St. in New York, and ‘Modern Times’ in Brentwood, Long Island. The latter became as famous as the best-known fourierist communes (Brook Farm in Massachusetts & the North American Phalanx in New Jersey); in fact, Modern Times became downright notorious (for ‘Free Love’) and finally foundered under a wave of scandalous publicity. Andrews (and Victoria Woodhull) were members of the infamous Section 12 of the 1st International, expelled by Marx for its anarchist, feminist, and spiritualist tendencies.[199]

For American anarchist historian Eunice Minette Schuster, “[i]t is apparent… that Proudhonian Anarchism was to be found in the United States at least as early as 1848 and that it was not conscious of its affinity to the Individualist Anarchism of Josiah Warren and Stephen Pearl Andrews. William B. Greene presented this Proudhonian Mutualism in its purest and most systematic form”.[200] William Batchelder Greene was a 19th-century mutualist individualist anarchist, Unitarian minister, soldier and promoter of free banking in the United States. Greene is best known for the works Mutual Banking, which proposed an interest-free banking system; and Transcendentalism, a critique of the New England philosophical school. After 1850, he became active in labor reform.[200] “He was elected vice-president of the New England Labor Reform League, the majority of the members holding to Proudhon’s scheme of mutual banking, and in 1869 president of the Massachusetts Labor Union”.[200] Greene then published Socialistic, Mutualistic, and Financial Fragments (1875).[200] He saw mutualism as the synthesis of “liberty and order”.[200] His “associationism… is checked by individualism… ‘Mind your own business,’ ‘Judge not that ye be not judged.’ Over matters which are purely personal, as for example, moral conduct, the individual is sovereign, as well as over that which he himself produces. For this reason he demands ‘mutuality’ in marriage, the equal right of a woman to her own personal freedom and property”.[200]

Poet, naturalist and transcendentalist Henry David Thoreau was an important early influence in individualist anarchist thought in the United States and Europe. He is best known for his book Walden, a reflection upon simple living in natural surroundings; and his essay Civil Disobedience (Resistance to Civil Government), an argument for individual resistance to civil government in moral opposition to an unjust state. In Walden, Thoreau advocates simple living and self-sufficiency among natural surroundings in resistance to the advancement of industrial civilization.[201] Civil Disobedience, first published in 1849, argues that people should not permit governments to overrule or atrophy their consciences and that people have a duty to avoid allowing such acquiescence to enable the government to make them the agents of injustice. These works influenced green anarchism, anarcho-primitivism and anarcho-pacifism,[202] as well as figures including Mohandas Gandhi, Martin Luther King, Jr., Martin Buber and Leo Tolstoy.[202] “Many have seen in Thoreau one of the precursors of ecologism and anarcho-primitivism represented today in John Zerzan. For George Woodcock this attitude can be also motivated by certain idea of resistance to progress and of rejection of the growing materialism which is the nature of American society in the mid-19th century”.[201] Zerzan included Thoreau’s “Excursions” in his edited compilation of anti-civilization writings, Against Civilization: Readings and Reflections.[203] Individualist anarchists such as Thoreau[204][205] do not speak of economics, but simply the right of disunion from the state and foresee the gradual elimination of the state through social evolution. Agorist author J. Neil Schulman cites Thoreau as a primary inspiration.[206]

Many economists since Adam Smith have argued that, unlike other taxes, a land value tax would not cause economic inefficiency.[207] It would be a progressive tax,[208] primarily paid by the wealthy, and would increase wages, reduce economic inequality, remove incentives to misuse real estate and reduce the vulnerability that economies face from credit and property bubbles.[209][210] Early proponents of this view include Thomas Paine, Herbert Spencer, and Hugo Grotius,[84] but the concept was widely popularized by the economist and social reformer Henry George.[211] George believed that people ought to own the fruits of their labor and the value of the improvements they make; thus he was opposed to income taxes, sales taxes, taxes on improvements and all other taxes on production, labor, trade or commerce. George was among the staunchest defenders of free markets and his book Protection or Free Trade was read into the U.S. Congressional Record.[212] Yet he did support direct management of natural monopolies as a last resort, such as right-of-way monopolies necessary for railroads. George advocated for the elimination of intellectual property arrangements in favor of government-sponsored prizes for inventors.[213][not in citation given] Early followers of George’s philosophy called themselves single taxers because they believed that the only legitimate, broad-based tax was land rent. The term Georgism was coined later, though some modern proponents prefer the term geoism instead,[214] leaving the meaning of “geo” (Earth in Greek) deliberately ambiguous. The terms “Earth Sharing”,[215] “geonomics”[216] and “geolibertarianism”[217] are used by some Georgists to represent a difference of emphasis, or real differences about how land rent should be spent, but all agree that land rent should be recovered from its private owners.

Individualist anarchism found in the United States an important space for discussion and development within the group known as the “Boston anarchists”.[218] Even among the 19th-century American individualists there was no monolithic doctrine and they disagreed amongst each other on various issues, including intellectual property rights and possession versus property in land.[219][220][221] Some Boston anarchists, including Benjamin Tucker, identified as socialists, which in the 19th century was often used in the sense of a commitment to improving conditions of the working class (i.e. “the labor problem”).[222] Lysander Spooner, besides his individualist anarchist activism, was also an anti-slavery activist and member of the First International.[223] Tucker argued that the elimination of what he called “the four monopolies” (the land monopoly; the money and banking monopoly; the monopoly powers conferred by patents; and the quasi-monopolistic effects of tariffs) would undermine the power of the wealthy and big business, making possible widespread property ownership and higher incomes for ordinary people, while minimizing the power of would-be bosses and achieving socialist goals without state action. Tucker’s anarchist periodical, Liberty, was published from August 1881 to April 1908. The publication, emblazoned with Proudhon’s quote that liberty is “Not the Daughter But the Mother of Order”, was instrumental in developing and formalizing the individualist anarchist philosophy through publishing essays and serving as a forum for debate. Contributors included Benjamin Tucker, Lysander Spooner, Auberon Herbert, Dyer Lum, Joshua K. Ingalls, John Henry Mackay, Victor Yarros, Wordsworth Donisthorpe, James L. Walker, J. William Lloyd, Florence Finch Kelly, Voltairine de Cleyre, Steven T.
Byington, John Beverley Robinson, Jo Labadie, Lillian Harman and Henry Appleton.[224] Later, Tucker and others abandoned their traditional support of natural rights and converted to an egoism modeled upon the philosophy of Max Stirner.[220] A number of natural rights proponents stopped contributing in protest and “[t]hereafter, Liberty championed egoism, although its general content did not change significantly”.[225] Several publications “were undoubtedly influenced by Liberty’s presentation of egoism. They included: I published by C.L. Swartz, edited by W.E. Gordak and J.W. Lloyd (all associates of Liberty); The Ego and The Egoist, both of which were edited by Edward H. Fulton. Among the egoist papers that Tucker followed were the German Der Eigene, edited by Adolf Brand, and The Eagle and The Serpent, issued from London. The latter, the most prominent English-language egoist journal, was published from 1898 to 1900 with the subtitle ‘A Journal of Egoistic Philosophy and Sociology'”.[225]

By around the start of the 20th century, the heyday of individualist anarchism had passed.[226] H. L. Mencken and Albert Jay Nock were the first prominent figures in the United States to describe themselves as libertarians;[227] they believed Franklin D. Roosevelt had co-opted the word “liberal” for his New Deal policies which they opposed and used “libertarian” to signify their allegiance to individualism.[citation needed] In 1914, Nock joined the staff of The Nation magazine, which at the time was supportive of liberal capitalism. A lifelong admirer of Henry George, Nock went on to become co-editor of The Freeman from 1920 to 1924, a publication initially conceived as a vehicle for the single tax movement, financed by the wealthy wife of the magazine’s other editor, Francis Neilson.[228] Critic H.L. Mencken wrote that “[h]is editorials during the three brief years of the Freeman set a mark that no other man of his trade has ever quite managed to reach. They were well-informed and sometimes even learned, but there was never the slightest trace of pedantry in them”.[229]

Executive Vice President of the Cato Institute, David Boaz, writes: “In 1943, at one of the lowest points for liberty and humanity in history, three remarkable women published books that could be said to have given birth to the modern libertarian movement”.[230] Isabel Paterson’s The God of the Machine, Rose Wilder Lane’s The Discovery of Freedom and Ayn Rand’s The Fountainhead each promoted individualism and capitalism. None of the three used the term libertarianism to describe their beliefs and Rand specifically rejected the label, criticizing the burgeoning American libertarian movement as the “hippies of the right”.[231] Rand’s own philosophy, Objectivism, is notably similar to libertarianism and she accused libertarians of plagiarizing her ideas.[231] Rand stated:

All kinds of people today call themselves “libertarians,” especially something calling itself the New Right, which consists of hippies who are anarchists instead of leftist collectivists; but anarchists are collectivists. Capitalism is the one system that requires absolute objective law, yet libertarians combine capitalism and anarchism. That’s worse than anything the New Left has proposed. It’s a mockery of philosophy and ideology. They sling slogans and try to ride on two bandwagons. They want to be hippies, but don’t want to preach collectivism because those jobs are already taken. But anarchism is a logical outgrowth of the anti-intellectual side of collectivism. I could deal with a Marxist with a greater chance of reaching some kind of understanding, and with much greater respect. Anarchists are the scum of the intellectual world of the Left, which has given them up. So the Right picks up another leftist discard. That’s the libertarian movement.[232]

In 1946, Leonard E. Read founded the Foundation for Economic Education (FEE), an American nonprofit educational organization which promotes the principles of laissez-faire economics, private property, and limited government.[233] According to Gary North, former FEE director of seminars and a current Ludwig von Mises Institute scholar, FEE is the “granddaddy of all libertarian organizations”.[234] The initial officers of FEE were Leonard E. Read as President, Austrian School economist Henry Hazlitt as Vice-President and Chairman David Goodrich of B. F. Goodrich. Other trustees on the FEE board have included wealthy industrialist Jasper Crane of DuPont, H. W. Luhnow of William Volker & Co. and Robert Welch, founder of the John Birch Society.[236][237]

Austrian school economist Murray Rothbard was initially an enthusiastic partisan of the Old Right, particularly because of its general opposition to war and imperialism,[238] but long embraced a reading of American history that emphasized the role of elite privilege in shaping legal and political institutions. He was part of Ayn Rand’s circle for a brief period, but later harshly criticized Objectivism.[239] He praised Rand’s Atlas Shrugged and wrote that she “introduced me to the whole field of natural rights and natural law philosophy”, prompting him to learn “the glorious natural rights tradition”.[240](pp121, 132134) He soon broke with Rand over various differences, including his defense of anarchism. Rothbard was influenced by the work of the 19th-century American individualist anarchists[241] and sought to meld their advocacy of free markets and private defense with the principles of Austrian economics.[242] This new philosophy he called anarcho-capitalism.

Karl Hess, a speechwriter for Barry Goldwater and primary author of the Republican Party’s 1960 and 1964 platforms, became disillusioned with traditional politics following the 1964 presidential campaign in which Goldwater lost to Lyndon B. Johnson. He parted with the Republicans altogether after being rejected for employment with the party, and began work as a heavy-duty welder. Hess began reading American anarchists largely due to the recommendations of his friend Murray Rothbard and said that upon reading the works of communist anarchist Emma Goldman, he discovered that anarchists believed everything he had hoped the Republican Party would represent. For Hess, Goldman was the source for the best and most essential theories of Ayn Rand without any of the “crazy solipsism that Rand was so fond of”.[243] Hess and Rothbard founded the journal Left and Right: A Journal of Libertarian Thought, which was published from 1965 to 1968, with George Resch and Leonard P. Liggio. In 1969, they edited The Libertarian Forum, which Hess left in 1971. Hess eventually put his focus on the small scale, stating that “Society is: people together making culture”. He deemed two of his cardinal social principles to be “opposition to central political authority” and “concern for people as individuals”. His rejection of standard American party politics was reflected in a lecture he gave during which he said: “The Democrats or liberals think that everybody is stupid and therefore they need somebody… to tell them how to behave themselves. The Republicans think everybody is lazy”.[244]

The Vietnam War split the uneasy alliance between growing numbers of American libertarians and conservatives who believed in limiting liberty to uphold moral virtues. Libertarians opposed to the war joined the draft resistance and peace movements, as well as organizations such as Students for a Democratic Society (SDS). In 1969 and 1970, Hess joined with others, including Murray Rothbard, Robert LeFevre, Dana Rohrabacher, Samuel Edward Konkin III and former SDS leader Carl Oglesby to speak at two “left-right” conferences which brought together activists from both the Old Right and the New Left in what was emerging as a nascent libertarian movement.[245] As part of his effort to unite right and left-libertarianism, Hess would join the SDS as well as the Industrial Workers of the World (IWW), of which he explained: “We used to have a labor movement in this country, until I.W.W. leaders were killed or imprisoned. You could tell labor unions had become captive when business and government began to praise them. They’re destroying the militant black leaders the same way now. If the slaughter continues, before long liberals will be asking, ‘What happened to the blacks? Why aren’t they militant anymore?'”.[246] Rothbard ultimately broke with the left, allying himself instead with the burgeoning paleoconservative movement.[247] He criticized the tendency of these left-libertarians to appeal to “‘free spirits,’ to people who don’t want to push other people around, and who don’t want to be pushed around themselves” in contrast to “the bulk of Americans,” who “might well be tight-assed conformists, who want to stamp out drugs in their vicinity, kick out people with strange dress habits, etc”.[248] This left-libertarian tradition has been carried to the present day by Samuel Edward Konkin III’s agorists, contemporary mutualists such as Kevin Carson and Roderick T. Long and other left-wing market anarchists.[249]

In 1971, a small group of Americans led by David Nolan formed the Libertarian Party,[250] which has run a presidential candidate every election year since 1972. Other libertarian organizations, such as the Center for Libertarian Studies and the Cato Institute, were also formed in the 1970s.[251] Philosopher John Hospers, a one-time member of Rand’s inner circle, proposed a non-initiation of force principle to unite both groups, but this statement later became a required “pledge” for candidates of the Libertarian Party and Hospers became its first presidential candidate in 1972.[citation needed] In the 1980s, Hess joined the Libertarian Party and served as editor of its newspaper from 1986 to 1990.

Modern libertarianism gained significant recognition in academia with the publication of Harvard University professor Robert Nozick’s Anarchy, State, and Utopia in 1974, for which he received a National Book Award in 1975.[252] In response to John Rawls’s A Theory of Justice, Nozick’s book supported a nightwatchman state on the grounds that it was an inevitable phenomenon which could arise without violating individual rights.[253]

In the early 1970s, Rothbard wrote that “[o]ne gratifying aspect of our rise to some prominence is that, for the first time in my memory, we, ‘our side,’ had captured a crucial word from the enemy… ‘Libertarians’… had long been simply a polite word for left-wing anarchists, that is for anti-private property anarchists, either of the communist or syndicalist variety. But now we had taken it over”.[254] Since the resurgence of neoliberalism in the 1970s, this modern American libertarianism has spread beyond North America via think tanks and political parties.[255][256]

A surge of popular interest in libertarian socialism occurred in western nations during the 1960s and 1970s.[257] Anarchism was influential in the Counterculture of the 1960s[258][259][260] and anarchists actively participated in the late-sixties student and worker revolts.[261] In 1968, the International of Anarchist Federations was founded in Carrara, Italy, during an international anarchist conference attended by the three existing European federations of France, Italy and Iberia as well as the Bulgarian federation in French exile.[173][262] The uprisings of May 1968 also led to a small resurgence of interest in left communist ideas. Various small left communist groups emerged around the world, predominantly in the leading capitalist countries. A series of conferences of the communist left began in 1976, with the aim of promoting international and cross-tendency discussion, but these petered out in the 1980s without having increased the profile of the movement or its unity of ideas.[263] Left communist groups existing today include the International Communist Party, International Communist Current and the Internationalist Communist Tendency. The housing and employment crisis in most of Western Europe led to the formation of communes and squatter movements like that of Barcelona, Spain. In Denmark, squatters occupied a disused military base and declared the Freetown Christiania, an autonomous haven in central Copenhagen.

Around the turn of the 21st century, libertarian socialism grew in popularity and influence as part of the anti-war, anti-capitalist and anti-globalisation movements.[264] Anarchists became known for their involvement in protests against the meetings of the World Trade Organization (WTO), Group of Eight and the World Economic Forum. Some anarchist factions at these protests engaged in rioting, property destruction and violent confrontations with police. These actions were precipitated by ad hoc, leaderless, anonymous cadres known as black blocs; other organisational tactics pioneered in this time include security culture, affinity groups and the use of decentralised technologies such as the internet.[264] A significant event of this period was the confrontations at the WTO conference in Seattle in 1999.[264] For English anarchist scholar Simon Critchley, "contemporary anarchism can be seen as a powerful critique of the pseudo-libertarianism of contemporary neo-liberalism… One might say that contemporary anarchism is about responsibility, whether sexual, ecological or socio-economic; it flows from an experience of conscience about the manifold ways in which the West ravages the rest; it is an ethical outrage at the yawning inequality, impoverishment and disenfranchisement that is so palpable locally and globally".[265] This might also have been motivated by "the collapse of 'really existing socialism' and the capitulation to neo-liberalism of Western social democracy".[266]

Libertarian socialists in the early 21st century have been involved in the alter-globalization movement, squatter movement; social centers; infoshops; anti-poverty groups such as Ontario Coalition Against Poverty and Food Not Bombs; tenants’ unions; housing cooperatives; intentional communities generally and egalitarian communities; anti-sexist organizing; grassroots media initiatives; digital media and computer activism; experiments in participatory economics; anti-racist and anti-fascist groups like Anti-Racist Action and Anti-Fascist Action; activist groups protecting the rights of immigrants and promoting the free movement of people, such as the No Border network; worker co-operatives, countercultural and artist groups; and the peace movement.

In the United States, polls (circa 2006) find that the views and voting habits of between 10 and 20 percent (and increasing) of voting-age Americans may be classified as "fiscally conservative and socially liberal, or libertarian".[267][268] This is based on pollsters and researchers defining libertarian views as fiscally conservative and socially liberal (based on the common United States meanings of the terms) and against government intervention in economic affairs and for expansion of personal freedoms.[267] Through 20 polls on this topic spanning 13 years, Gallup found that voters who are libertarian on the political spectrum ranged from 17 to 23% of the United States electorate.[269] However, a 2014 Pew Poll found that 23% of Americans who identify as libertarians have no idea what the word means.[270]

2009 saw the rise of the Tea Party movement, an American political movement known for advocating a reduction in the United States national debt and federal budget deficit by reducing government spending and taxes, which had a significant libertarian component[271] despite having contrasts with libertarian values and views in some areas, such as nationalism, free trade, social issues and immigration.[272] A 2011 Reason-Rupe poll found that among those who self-identified as Tea Party supporters, 41 percent leaned libertarian and 59 percent socially conservative.[273] The movement, named after the Boston Tea Party, also contains conservative[274] and populist elements[275] and has sponsored multiple protests and supported various political candidates since 2009. Tea Party activities have declined since 2010 with the number of chapters across the country slipping from about 1,000 to 600.[276][277] Mostly, Tea Party organizations are said to have shifted away from national demonstrations to local issues.[276] Following the selection of Paul Ryan as Mitt Romney’s 2012 vice presidential running mate, The New York Times declared that Tea Party lawmakers are no longer a fringe of the conservative coalition, but now “indisputably at the core of the modern Republican Party”.[278]

In 2012, anti-war presidential candidates (Libertarian Republican Ron Paul and Libertarian Party candidate Gary Johnson) raised millions of dollars and garnered millions of votes despite opposition to their obtaining ballot access by Democrats and Republicans.[279] The 2012 Libertarian National Convention, which saw Gary Johnson and James P. Gray nominated as the 2012 presidential ticket for the Libertarian Party, resulted in the most successful result for a third-party presidential candidacy since 2000 and the best in the Libertarian Party’s history by vote number. Johnson received 1% of the popular vote, amounting to more than 1.2 million votes.[280][281] Johnson has expressed a desire to win at least 5 percent of the vote so that the Libertarian Party candidates could get equal ballot access and federal funding, thus subsequently ending the two-party system.[282][283][284]

Since the 1950s, many American libertarian organizations have adopted a free-market stance, as well as supporting civil liberties and non-interventionist foreign policies. These include the Ludwig von Mises Institute, Francisco Marroquín University, the Foundation for Economic Education, Center for Libertarian Studies, the Cato Institute and Liberty International. The activist Free State Project, formed in 2001, works to bring 20,000 libertarians to New Hampshire to influence state policy.[285] Active student organizations include Students for Liberty and Young Americans for Liberty.

A number of countries have libertarian parties that run candidates for political office. In the United States, the Libertarian Party was formed in 1972 and is the third largest[286][287] American political party, with over 370,000 registered voters in the 35 states that allow registration as a Libertarian[288] and has hundreds of party candidates elected or appointed to public office.[289]

Current international anarchist federations which sometimes identify themselves as libertarian include the International of Anarchist Federations, the International Workers' Association, and International Libertarian Solidarity. The largest organised anarchist movement today is in Spain, in the form of the Confederación General del Trabajo (CGT) and the CNT. CGT membership was estimated to be around 100,000 for 2003.[290] Other active syndicalist movements include the Central Organisation of the Workers of Sweden and the Swedish Anarcho-syndicalist Youth Federation in Sweden; the Unione Sindacale Italiana in Italy; Workers Solidarity Alliance in the United States; and Solidarity Federation in the United Kingdom. The revolutionary industrial unionist Industrial Workers of the World, claiming 2,000 paying members, as well as the International Workers Association, an anarcho-syndicalist successor to the First International, also remain active. In the United States, there exists the Common Struggle Libertarian Communist Federation.

Criticism of libertarianism includes ethical, economic, environmental, pragmatic, and philosophical concerns.[291] It has also been argued that laissez-faire capitalism does not necessarily produce the best or most efficient outcome,[292] nor does its policy of deregulation prevent the abuse of natural resources. Furthermore, libertarianism has been criticized as utopian due to the lack of any such societies today.

Critics such as Corey Robin describe right-libertarianism as fundamentally a reactionary conservative ideology, united with more traditional conservative thought and goals by a desire to enforce hierarchical power and social relations:[293]

Conservatism, then, is not a commitment to limited government and liberty, or a wariness of change, a belief in evolutionary reform, or a politics of virtue. These may be the byproducts of conservatism, one or more of its historically specific and ever-changing modes of expression. But they are not its animating purpose. Neither is conservatism a makeshift fusion of capitalists, Christians, and warriors, for that fusion is impelled by a more elemental force: the opposition to the liberation of men and women from the fetters of their superiors, particularly in the private sphere. Such a view might seem miles away from the libertarian defense of the free market, with its celebration of the atomistic and autonomous individual. But it is not. When the libertarian looks out upon society, he does not see isolated individuals; he sees private, often hierarchical, groups, where a father governs his family and an owner his employees.

John Donahue argues that if political power were radically shifted to local authorities, parochial local interests would predominate at the expense of the whole and that this would exacerbate current problems with collective action.[294]

Michael Lind has observed that of the 195 countries in the world today, none have fully actualized a libertarian society:

If libertarianism was a good idea, wouldn’t at least one country have tried it? Wouldn’t there be at least one country, out of nearly two hundred, with minimal government, free trade, open borders, decriminalized drugs, no welfare state and no public education system?[295]

Lind has also criticised libertarianism, particularly the right-wing and free market variant of the ideology, as being incompatible with democracy and apologetic towards autocracy.[296]

Can Libertarianism Be a Governing Philosophy?

The discussion we are about to have naturally divides itself into two aspects:

First: Could libertarianism, if implemented, sustain a state apparatus and not devolve into autocracy or anarchy? By that I mean the lawless versions of autocracy and anarchy, not stable monarchy or emergent rule of law without a state. Second: even if the answer were "Yes" or "Yes, if . . .", we would still need to know whether enough citizens desired a libertarian order that it could feasibly be voluntarily chosen. That is, I am ruling out involuntary imposition by force of libertarianism as a governing philosophy.

I will address both questions, but want to assert at the outset that the first is the more important and more fundamental one. If the answer to it is No, there is no point in moving on to the second question. If the answer is Yes, it may be possible to change people's minds about accepting a libertarian order.

The Destinationalists

As I have argued elsewhere[1], there are two main paths to deriving libertarian principles: destinations and directions. The destinationist approach shares the method of most other ethical paradigms: the enunciation of timeless moral and ethical precepts that describe the ideal libertarian society.

What makes a set of principles distinctly libertarian is two precepts: the self-ownership principle and the non-aggression principle.

The extreme forms of these principles, for destinationists, can be hard for outsiders to accept. One example is noted by Matt Zwolinski, who cites opinion data gathered from libertarians by Liberty magazine and presented in its periodic Liberty Poll. One question frequently included in the poll was:

Suppose that you are on a friend's balcony on the 50th floor of a condominium complex. You trip, stumble and fall over the edge. You catch a flagpole on the next floor down. The owner opens his window and demands that you stop trespassing.

Zwolinski writes that in 1988, 84 percent of respondents to the flagpole question

said they believed that in such circumstances they should enter the owner's residence against the owner's wishes. 2% (one respondent) said that they should let go and fall to their death, and 15% said they should hang on and wait for somebody to throw them a rope. In 1999, the numbers were 86%, 1%, and 13%. In 2008, they were 89.2%, 0.9%, and 9.9%.

The interesting thing is that, while the answers to the flagpole question were almost unchanged over time, with a slight upward drift in those who would aggress by trespassing, support for the non-aggression principle itself plummeted. Writes Zwolinski:

Respondents were asked to say whether they agreed or disagreed with [the non-aggression principle]. In 1988, a full 90% of respondents said that they agreed. By 1999, however, the percentage expressing agreement had dropped by almost half to 50%. And by 2008, it was down to 39.7%.

If we take support for the non-aggression principle as a Rorschach test, it does not appear that most people, maybe not even everyone who identifies as a libertarian, are fully convinced that the principle is an absolute categorical moral principle.

The Directionalists

Of course, it could be true that many who identify now as libertarians, and those who might be attracted to libertarianism in the future, are directionalists. A directional approach holds that any policy action that increases the liberty and welfare of individuals is an improvement, and should be supported by libertarians, even if the policy itself violates either the self-ownership principle or the non-aggression principle.

A useful example here might be school vouchers. Instead of being a monopoly provider of public school education, the state might specialize in funding but leave the provision of education at least partly to private-sector actors. The destinationist would object, correctly, that the policy still involves the initiation of violence in collecting taxes, which are involuntarily imposed on at least those individuals who would not pay without the threat of coercion. In contrast, the directionalist might support vouchers, since parents would at least be afforded more liberty in choosing schools for their children, and the system would be subject to more competition, thus holding providers responsible for the quality of education being delivered.

Here, then, is a slightly modified take on the central question: Would a hybrid version of libertarianism, one that advocated for the destination but accepted directional improvements, be a viable governing philosophy? Even with this amendment, allowing for directional improvements as part of the core governing philosophy, is libertarianism, to use a trope of the moment, "sustainable"? The reason this approach could be useful is that it corresponds to one of the great divisions within the libertarian movement: the split between political anarchists, who believe that any coercive state apparatus is ultimately incompatible with liberty, and the minarchists, who believe that a limited government is desirable, even necessary, and that it is also possible.

Limiting Leviathan: Getting Power to Stay Where You Put It

For a state to be consistent with both the self-ownership principle and the non-aggression principle, there must be certain core rights to property, expression, and action that are inviolable. This inviolability extends even to situations where initiating force would greatly benefit most people, meaning that consequentialist considerations cannot outweigh the rights of individuals.

Where might such a state originate, and how could it be continually limited to only those functions for which it was originally justified? One common answer is a form of contractarianism. (Another is convention, which is beyond the scope of this essay. See Robert Sugden[2] and Gerald Gaus[3] for a review of some of the issues.) This is not to say that actual states are the results of explicitly contractual arrangements; rather, there is an "as if" element: rational citizens in a state of nature would have voluntarily consented to the limited coercion of a minarchist state, given the substantial and universal improvement in welfare that results from having a provider of public goods and a neutral enforcer of contracts. Without a state, claims the minarchist, these two functions, public goods provision and contract enforcement, are either impossible or so difficult as to make the move to create a coercive state universally welcome for all citizens.

Contractarianism is of course an enormous body of work in philosophy, ranging from Thomas Hobbes and Jean-Jacques Rousseau to David Gauthier and John Rawls. Our contractarians, the libertarian versions, start with James Buchanan and Jan Narveson. Buchanan's contractarianism is stark: Rules start with us, and the justification for coercion is, but can only be, our consent to being coerced. It is not clear that Buchanan would accept the full justification of political authority by tacit contract, but Buchanan also claims that each group in society should "start from where we are now," meaning that changes in the rules require something as close to unanimous consent as possible.[4]

Narveson's view is closer to the "necessary evil" claim for justifying government. We need a way to be secure from violence, and to be able to enter into binding agreements that are enforceable. He wrote in The Libertarian Idea (1988) that there is no alternative that can provide reasons to everyone for accepting it, no matter what their personal values or philosophy of life may be, thereby motivating this informal, yet society-wide institution. He goes on to say:

Without resort to obfuscating intuitions, of self-evident rights and the like, the contractarian view offers an intelligible account both of why it is rational to want a morality and of what, broadly speaking, the essentials of that morality must consist in: namely, those general rules that are universally advantageous to rational agents. We each need morality, first because we are vulnerable to the depredations of others, and second because we can all benefit from cooperation with others. So we need protection, in the form of the ability to rely on our fellows not to engage in activities harmful to us; and we need to be able to rely on those with whom we deal. We each need this regardless of what else we need or value.

The problem, or so the principled political anarchist would answer, is that Leviathan cannot be limited unless for some reason Leviathan wants to limit itself.

One of the most interesting proponents of this view is Anthony de Jasay, an independent philosopher of political economy. Jasay would not dispute the value of credible commitments for contracts. His quarrel comes when contractarians invoke a founding myth. When I think of the Social Contract (the capitals signify how important it is!), I am reminded of that scene from Monty Python where King Arthur is talking to the peasants:

King Arthur: I am your king.

Woman: Well, I didn't vote for you.

King Arthur: You don't vote for kings.

Woman: Well how'd you become king then?

[holy music . . . ]

King Arthur: The Lady of the Lake, her arm clad in the purest shimmering samite, held aloft Excalibur from the bosom of the water, signifying by divine providence that I, Arthur, was to carry Excalibur. That is why I am your king.

Dennis: [interrupting] Listen, strange women lyin' in ponds distributin' swords is no basis for a system of government. Supreme executive power derives from a mandate from the masses, not from some farcical aquatic ceremony.

According to Jasay, there are two distinct problems with contractarian justifications for the state. Each, separately and independently, is fatal for the project, in his view. Together they put paid to the notion that a libertarian could favor minarchism.

The first problem is the enforceable contracts justification. The second is the limiting Leviathan problem.

The usual statement of the first comes from Hobbes: Covenants, without the sword, are but words. That means that individuals cannot enter into binding agreements without some third party to enforce the agreement. Since entering into binding agreements is a central precondition for mutually beneficial exchange and broad-scale market cooperation, we need a powerful, neutral enforcer. So, we all agree on that; the enforcer collects the taxes that we all agreed on and, in exchange, enforces all our contracts for us. (See John Thrasher[5] for some caveats.)

But wait. Jasay compares this to jumping over your own shadow. If contracts cannot be enforced save by coercion from a third party, how can the contract between citizens and the state be enforced? "[I]t takes courage to affirm that rational people could unanimously wish to have a sovereign contract enforcer bound by no contract," wrote Jasay in his book Against Politics (1997). By "courage" he does not intend a compliment. Either those who make this claim are contradicting themselves (since we can't have contracts, we'll use a contract to solve the problem) or the argument is circular (cooperation requires enforceable contracts, but these require a norm of cooperation).

Jasay put the question this way in On Treating Like Cases Alike: Review of Politics by Principle Not Interest, his 1999 essay in the Independent Review:

If man can no more bind himself by contract than he can jump over his own shadow, how can he jump over his own shadow and bind himself in a social contract? He cannot be both incapable of collective action and capable of it when creating the coercive agency needed to enforce his commitment. One can, without resorting to a bootstrap theory, accept the idea of an exogenous coercive agent, a conqueror whose regime is better than anything the conquered people could organize for themselves. Consenting to such an accomplished fact, however, can hardly be represented as entering into a contract, complete with a contract's ethical implications of an act of free will. [Emphasis in original]

In sum, the former claim, that contracts cannot be enforced, cannot then be used to conjure enforceable contracts out of a shadow. The latter claim, that people will cooperate on their own, means that no state is necessary in the first place. The conclusion Jasay reaches is that states, if they exist, may well be able to compel people to obey. The usual argument goes like this:

The state exists and enjoys the monopoly of the use of force for some reason, probably a historical one, that we need not inquire into. What matters is that without the state, society could not function tolerably, if at all. Therefore all rational persons would choose to enter into a social contract to create it. Indeed, we should regard the state as if it were the result of our social contract, hence indisputably legitimate.[6]

Jasay concludes that this argument must be false. As Robert Nozick famously put it in Anarchy, State, and Utopia (1974), tacit consent "isn't worth the paper it's not written on." We cannot confect a claim that states deserve our obedience based on consent. For consent is what true political authority requires: not that our compliance can be compelled, but that the state deserves our compliance. Ordered anarchy with no formal state is therefore a better solution, in Jasay's view, because consent is either not real or is not enough.

Of course, this is simply an extension of a long tradition in libertarian thought, dating at least to Lysander Spooner. As Spooner said:

If the majority, however large, of the people of a country, enter into a contract of government, called a constitution, by which they agree to aid, abet or accomplish any kind of injustice, or to destroy or invade the natural rights of any person or persons whatsoever, whether such persons be parties to the compact or not, this contract of government is unlawful and void, and for the same reason that a treaty between two nations for a similar purpose, or a contract of the same nature between two individuals, is unlawful and void. Such a contract of government has no moral sanction. It confers no rightful authority upon those appointed to administer it. It confers no legal or moral rights, and imposes no legal or moral obligation upon the people who are parties to it. The only duties, which any one can owe to it, or to the government established under color of its authority, are disobedience, resistance, destruction.[7]

Now for the other problem highlighted by Jasay, that of limiting Leviathan. Let us assume the best of state officials: that they genuinely intend to do good. We might make the standard Public Choice assumption that officials want to use power to benefit themselves, but let us put that aside; instead, officials genuinely want to improve the lives of their citizens.

This means a minarchist state is not sustainable. Officials, thinking of the society as a collective rather than as individuals with inviolable rights, will immediately discover opportunities to raise taxes, and create new programs and new powers that benefit those in need. In fact, it is precisely the failure of the Public Choice assumptions of narrow self-interest that ensure this outcome. It might be possible in theory to design a principal-agent system of bureaucratic contract that constrains selfish officials. But if state power attracts those who are willing to sacrifice the lives or welfare of some for the greater good, then minarchy is quickly breached and Leviathan swells without the possibility of constraint.

I hasten to add that it need not be true, for Jasay's claim to go through, that the concept of the "greater good" have any empirical content. It is enough that a few people believe, and can brandish the "greater good" like a truncheon, smashing rules and laws designed to stop the expansion of state power. No one who wants to do good will pass up a chance to do good, even if it means changing the rules. This process is much like that described by F.A. Hayek in "Why the Worst Get on Top" (see Chapter 10 of The Road to Serfdom) or Bertrand de Jouvenel's Power (1945).

So, again, we reach a contradiction: Either 1) minarchy is not possible, because it is overwhelmed by the desire to do good, or minarchy is not legitimate because it is based on a mythical tacit consent; or 2) no state, minarchist or otherwise, is necessary because people can limit their actions on their own. Citizens might conclude that such self-imposed limits on their own actions are morally required, and that reputation and competition can limit the extent of depredation and reward cooperation in settings with repeated interaction. Jasay would argue, then, that constitutions and parchment barriers are either unnecessary (if people are self-governing) or ineffective (if they are not). Leviathan either cannot exist or else it is illimitable.

But That's Not Enough

What I have argued so far is that destinationist libertarianism that is fully faithful to the self-ownership principle and the non-aggression principle could not be an effective governing philosophy. The only exception to this claim would be if libertarianism were universally believed, and people all agreed to govern themselves in the absence of a coercive state apparatus of any kind. Of course, one could object that even then something like a state would emerge, because of the economies of scale in the provision of defense, leading to a dominant protection network as described by Nozick. Whether that structure of service-delivery is necessarily a state is an interesting question, but not central to our current inquiry.

My own view is that libertarianism is, and in fact should be, a philosophy of governing that is robust and useful. But then I am a thoroughgoing directionalist. The state and its deputized coercive instruments have expanded the scope and intensity of their activities far beyond what people need to achieve cooperative goals, and beyond what they want in terms of immanent intrusions into our private lives.

Given the constant push and pull of politics, and the desire of groups to create and maintain rents for themselves, the task of leaning into the prevailing winds of statism will never be done. But it is a coherent and useful governing philosophy. When someone asks how big the state should be, there aren't many people who think the answer is zero. But that's not on the table, anyway. My answer is "smaller than it is now." Any policy change that grants greater autonomy (but also responsibility) to individual citizens, or that lessens government control over private action, is desirable; and libertarians are crucial for providing compelling intellectual justifications for why this is so.

In short, I don't advocate abandoning destinationist debates. The positing of an ideal is an important device for recruitment and discussion. But at this point we have been going in the wrong direction for decades. It should be possible to find allies and fellow travelers. They may want to get off the train long before we arrive at the end of the line, but for many miles our paths toward smaller government follow the same track.


Originally published as "Can Libertarianism Be a Governing Philosophy?"

6 Reasons Why I Gave Up On Libertarianism Return Of Kings

These days, libertarianism tends to be quite discredited. It is now associated with the goofy candidacy of Gary Johnson, with a rather narrow range of issues ("legalize weed! less taxes!"), with cucking one's way through politics by sweeping all the embarrassing problems under the carpet, and with surrendering to liberal virtue-signaling and endorsing anti-white diversity.

Now, everyone on the Alt-Right, in the manosphere, and so on is laughing at those whose adhesion to a bunch of abstract premises leads them to endorse globalist capital; and now that Trump officially heads the State, we'd be better off if some private companies were nationalized rather than left to shadowy overlords.

To Americans, libertarianism has been a constant background presence. Its main icons, be they Ayn Rand, Murray Rothbard, or Friedrich Hayek, were always read and discussed here and there, and never fell into oblivion, although they barely had media attention. The academic and political standing of libertarianism may be marginal, but it has always been granted small platforms and resurrected from time to time in the public landscape, one of the most conspicuous examples being the Tea Party demonstrations.

To a frog like yours truly (Kek being now praised by thousands of well-meaning memers, I can embrace the frog moniker gladly), libertarianism does not have the same standing at all. In French universities, libertarian thinkers are barely discussed, even in classes that are supposed to tackle economics: for one hour spent talking about Hayek, Keynes easily enjoys ten, and the same goes when comparing the attention given to, respectively, Adam Smith and Karl Marx.

From a wider perspective, a lot of contemporary French identity is built on Jacobinism, i.e. on crushing underfoot organic regional sociability in the name of a bureaucratized and Masonic republic. The artificial construction of France is exactly the kind of endeavour libertarianism loathes. No wonder the public choice school, for example, is barely studied here: pompous leftist teachers and mediocre civil servants are too busy gushing about themselves, sometimes hiding the emptiness of their lives behind a ridiculous epic narrative that turns social achievements into heroic feats, to give a fair hearing to pertinent criticism.

When I found out about libertarianism, I was already sick of the dominant fifty-shades-of-leftism political culture. The gloomy mediocrity of small bureaucrats, including most school teachers, combined with their petty political righteousness, had always repelled me. Thus, the discovery of laissez-faire advocates felt like stumbling on an entirely new scene of thought, and my initial feeling was vindicated when I found out about the naturalism often associated with it, something refreshing and intuitively more satisfying than the mainstream culture-obsessed, biology-denying view.

Libertarianism looked like it could solve everything. More entrepreneurship, more rights for those who actually create wealth and live by the good values of personal responsibility and work ethic, fewer parasites (be they bureaucrats or immigrants), no more repressive speech laws. Coincidentally, a new translation of Ayn Rand's Atlas Shrugged was published at this time: I devoured it, loving the sense of life, the heroism, the epic, the generally great and achieving ethos contained in it. Aren't John Galt and Hank Rearden more appealing than any corrupt politician or beta bureaucrat who pretends to be altruistic while backstabbing his own colleagues and parasitizing the country?

Now, although I still support small-scale entrepreneurship wholeheartedly, I would never defend naked libertarianism, and here is why.

Part of the Rothschild family, where nepotism and consanguinity keep the money in

Unity makes strength, and trust is much easier to cultivate in a small group where everyone truly belongs than in an anonymous great society. Some ethnic groups, especially whites, tend to be instinctively individualistic, with a lot of people favouring personal liberty over belonging, while others, especially Jews, tend to favor extended family business and nepotism.

In the short term, mobile individuals can do better than those who are bound by many social obligations. In the long run, however, extended families manage to create an environment of trust and to concentrate capital. And whereas individuals, lacking a proper economic network, may cheat each other or scatter their wealth away, families and tribes will be able to invest heavily in some of their members and keep their wealth inside. This has been true for Jewish families, whether their members work as moneylenders or diamond dealers; for Asians investing in new restaurants or other business projects of their own; and for North Africans taking over pubs and small shops in France.

The latter example is especially telling. White bartenders, butchers, grocers and the like have been chased out of the French suburbs by daily North African and black violence. No one helped them, everyone being afraid of getting harassed as well and busy with their own affairs. (Yep, just like what happened and still happens in Rotherham.) As a result, these isolated, unprotected shop-owners sold their outlets for a cheap price and fled. North Africans always covered each other's violence and responded in groups to any obstacle, whereas whites lowered their heads and hoped not to be next on the list.

Atlas Shrugged was wrong. Loners get wrecked by groups. Packs of hyenas corner and eat the lone dog.

Libertarianism is not good for individuals in the long run: it turns them into asocial weaklings, soon to be legally enslaved by global companies or beaten by groups, be they made up of nepotistic family members or thugs.

How the middle classes end up after jobs have been sent overseas and wages lowered

People often believe, thanks to leftist media and cuckservative posturing, that libertarians are big bosses. This is mostly, if not entirely, false. Most libertarians are middle-class guys who want more opportunities and less taxation, and believe that libertarianism will help them become successful entrepreneurs. They may be right in very specific circumstances: during the 2000s, small companies overturned the electronics market, benefiting both their independent founders and society as a whole; but ultimately they got bought by giants like Apple and Google, who are much better off when backed by a corrupt State than on a truly free market.

Libertarianism is a fake alternative, just as impossible to realize as communism: far from putting everyone in his place, it leaves ample room for mafias, monopolies, and unemployment caused by mechanization and global competition. If one wants the middle classes to survive, one must protect the employment and relative independence of their members, bankers and billionaires be damned.

Spontaneous order helped by a weak government. I hope they at least smoke weed.

A good feature of libertarianism is that it usually goes along with a positive stance on biology and human nature, in contrast with the "everything is cultural and ought to be deconstructed" left. However, this stance often leads to an exaggerated optimism about human nature. In a society of laissez-faire, the libertarians say, people flourish and order appears spontaneously.

Well, this is plainly false. As all of the great religions say, after what Christians call the Fall, man is a sinner. If you let children flourish without moral standards and role models, they become spoiled, entitled, manipulative, emotionally fragile, and deprived of self-control. If you let women flourish without suspicion, you give free rein to their propensities for hypergamy, hysteria, self-entitlement, and everything we can witness in them today. If you let men do as they please, they become greedy, envious, and turn into bullies. As a Muslim proverb says, people must be flogged to enter paradise; and as Aristotle put it, virtues are trained dispositions, no matter the magnitude of innate talents and propensities.

Michelle The Man Obama and Lying Crooked at a Democrat meeting

When laissez-faire rules, some will succeed on the market more than others, due to differences in investment, work, and natural abilities. Some will succeed enough to be able to buy someone else's business: this is the natural consequence of differences in wealth and of greed. When corrupt politicians enter the game, things get worse, as they will usually help some large business owners shield their positions against competitors, at the expense of most people, who then lose their independence and live off a wage.

In the end, what we get is a handful of very wealthy individuals who have managed to concentrate most capital and power levers in their hands, a big crowd of low-wage employees ready to cut each other's throats for a small promotion, and females waiting in line to get notched by the one percent while finding the other ninety-nine percent boring.

Censorship by massive social pressure, monopoly over the institutions and crybullying is perfectly legal. What could go wrong?

On the surface, libertarianism looks good here, because it protects the individual's rights against left-leaning statism and cuts off the welfare programs that have attracted dozens of millions of immigrants. Beneath the surface, however, things are quite dire. Libertarianism enshrines the right to free speech that leftists abuse, allows the pressure tactics used by radicals, and lets freethinking individuals get singled out by SJWs as long as the latter do not resort to overt stealing or overt physical violence. As for the immigrants, libertarianism tends to oppose the very notion of non-private boundaries, thus leaving local cultures and identities defenseless against both greedy capitalists and subproletarian masses.

Supporting an ideology that allows the leftists to destroy society more or less legally equates to cucking, plain and simple. Desiring an ephemeral cohabitation with rabid ideological warriors is stupid. We should aim at a lasting victory, not at pretending to constrain them through useless means.

Am I the only one to find that Gary Johnson looks like a snail (Spongebob notwithstanding)?

In 2013, Jean-Louis Caccomo, one of the rare libertarian academics in France, was forced into a mental ward at the request of his university president. He then spent more than a year being drugged. Mr. Caccomo had no real psychological problem: his confinement was part of a vicious strategy of pathologization and career destruction of a kind already used by the Soviets. French libertarians could have widely denounced the abuse. Nonetheless, most of them freaked out, and almost no one dared to actually defend him publicly.

Why should rational egoists team up and risk their careers to defend one of their own, after all? They would rather posture at confidential social events, rail at organic solidarity and protectionism, or troll the shit out of individuals of their own social milieu because "I've got the right to mock X, it's my right to free speech!" The few libertarians I knew firsthand, and the few events I witnessed in that small milieu, were enough to give me serious doubts about libertarianism: how can a good political ideology breed such an unhealthy mindset?

Political ideologies are tools. They are not ends in themselves. Not all forms of government are fit for every people or every era. Political actors must know at least the most important ones to get some inspiration, but ultimately, said actors win on the ground, not in philosophical debates.

Individualism, mindless consumerism, careerism, and hedonism are part of the problem. Individual rights granted regardless of one's abilities, situation, and identity are a disaster. The time has come to overcome modernity, not to stall in one of its false alternatives. The merchant caste must be regulated, though neither micromanaged nor hampered by a parasitic bureaucracy, nor denied its members' right to small-scale independence. Individual rights must be conditional, boundaries must be restored, minority identities based on anti-white-male resentment must be crushed so they cannot devour sociability from the inside again, and the pater familias must assert himself anew.

Long live the State and protectionism, as long as they defend the backbone of society and healthy relationships between the sexes; and no quarter for those who think they have a right to wage grievance-mongering against us, whether they want to use the State or private companies. In the end, the socialism-libertarianism dichotomy is quite secondary.


Gene therapy – Wikipedia


Gene therapy is a way to fix a genetic problem at its source. The polymers are either translated into proteins, interfere with target gene expression, or possibly correct genetic mutations.

The most common form uses DNA that encodes a functional, therapeutic gene to replace a mutated gene. The polymer molecule is packaged within a “vector”, which carries the molecule inside cells.

Early clinical failures led to dismissals of gene therapy. Clinical successes since 2006 regained researchers’ attention, although as of 2014, it was still largely an experimental technique.[10] These include treatment of retinal diseases Leber’s congenital amaurosis[11][12][13][14] and choroideremia,[15] X-linked SCID,[16] ADA-SCID,[17][18] adrenoleukodystrophy,[19] chronic lymphocytic leukemia (CLL),[20] acute lymphocytic leukemia (ALL),[21] multiple myeloma,[22] haemophilia,[18] and Parkinson’s disease.[23] Between 2013 and April 2014, US companies invested over $600 million in the field.[24]

The first commercial gene therapy, Gendicine, was approved in China in 2003 for the treatment of certain cancers.[25] In 2011 Neovasculgen was registered in Russia as the first-in-class gene-therapy drug for treatment of peripheral artery disease, including critical limb ischemia.[26] In 2012 Glybera, a treatment for a rare inherited disorder, became the first treatment to be approved for clinical use in either Europe or the United States after its endorsement by the European Commission.[10][27]

Following early advances in genetic engineering of bacteria, cells, and small animals, scientists started considering how to apply it to medicine. Two main approaches were considered: replacing or disrupting defective genes.[28] Scientists focused on diseases caused by single-gene defects, such as cystic fibrosis, haemophilia, muscular dystrophy, thalassemia, and sickle cell anemia. Glybera treats one such disease, caused by a defect in lipoprotein lipase.[27]

DNA must be administered, reach the damaged cells, enter the cell and either express or disrupt a protein.[29] Multiple delivery techniques have been explored. The initial approach incorporated DNA into an engineered virus to deliver the DNA into a chromosome.[30][31] Naked DNA approaches have also been explored, especially in the context of vaccine development.[32]

Generally, efforts focused on administering a gene that causes a needed protein to be expressed. More recently, increased understanding of nuclease function has led to more direct DNA editing, using techniques such as zinc finger nucleases and CRISPR. The vector incorporates genes into chromosomes. The expressed nucleases then knock out and replace genes in the chromosome. As of 2014 these approaches involve removing cells from patients, editing a chromosome and returning the transformed cells to patients.[33]

Gene editing is a potential approach to alter the human genome to treat genetic diseases,[34] viral diseases,[35] and cancer.[36] As of 2016 these approaches were still years from being medicine.[37][38]

Gene therapy may be classified into two types:

In somatic cell gene therapy (SCGT), the therapeutic genes are transferred into any cell other than a gamete, germ cell, gametocyte, or undifferentiated stem cell. Any such modifications affect the individual patient only, and are not inherited by offspring. Somatic gene therapy represents mainstream basic and clinical research, in which therapeutic DNA (either integrated in the genome or as an external episome or plasmid) is used to treat disease.

Over 600 clinical trials utilizing SCGT are underway in the US. Most focus on severe genetic disorders, including immunodeficiencies, haemophilia, thalassaemia, and cystic fibrosis. Such single gene disorders are good candidates for somatic cell therapy. The complete correction of a genetic disorder or the replacement of multiple genes is not yet possible. Only a few of the trials are in the advanced stages.[39]

In germline gene therapy (GGT), germ cells (sperm or egg cells) are modified by the introduction of functional genes into their genomes. Modifying a germ cell causes all the organism’s cells to contain the modified gene. The change is therefore heritable and passed on to later generations. Australia, Canada, Germany, Israel, Switzerland, and the Netherlands[40] prohibit GGT for application in human beings, for technical and ethical reasons, including insufficient knowledge about possible risks to future generations[40] and higher risks versus SCGT.[41] The US has no federal controls specifically addressing human genetic modification (beyond FDA regulations for therapies in general).[40][42][43][44]

The delivery of DNA into cells can be accomplished by multiple methods. The two major classes are recombinant viruses (sometimes called biological nanoparticles or viral vectors) and naked DNA or DNA complexes (non-viral methods).

In order to replicate, viruses introduce their genetic material into the host cell, tricking the host’s cellular machinery into using it as blueprints for viral proteins. Retroviruses go a stage further by having their genetic material copied into the genome of the host cell. Scientists exploit this by substituting a virus’s genetic material with therapeutic DNA. (The term ‘DNA’ may be an oversimplification, as some viruses contain RNA, and gene therapy could take this form as well.) A number of viruses have been used for human gene therapy, including retroviruses, adenoviruses, herpes simplex, vaccinia, and adeno-associated virus.[4] Like the genetic material (DNA or RNA) in viruses, therapeutic DNA can be designed to simply serve as a temporary blueprint that is degraded naturally or (at least theoretically) to enter the host’s genome, becoming a permanent part of the host’s DNA in infected cells.

Non-viral methods present certain advantages over viral methods, such as large scale production and low host immunogenicity. However, non-viral methods initially produced lower levels of transfection and gene expression, and thus lower therapeutic efficacy. Later technology remedied this deficiency.[citation needed]

Methods for non-viral gene therapy include the injection of naked DNA, electroporation, the gene gun, sonoporation, magnetofection, the use of oligonucleotides, lipoplexes, dendrimers, and inorganic nanoparticles.

Some of the unsolved problems include the short-lived nature of therapeutic effects, immune responses against the vector, problems with viral vectors, and the difficulty of treating multigene disorders.

Three patients’ deaths have been reported in gene therapy trials, putting the field under close scrutiny. The first was Jesse Gelsinger, who died in 1999 of an immune rejection response.[51] One X-SCID patient died of leukemia in 2003.[9] In 2007, a rheumatoid arthritis patient died from an infection; the subsequent investigation concluded that the death was not related to gene therapy.[52]

In 1972 Friedmann and Roblin authored a paper in Science titled “Gene therapy for human genetic disease?”[53] Rogers (1970) was cited for proposing that exogenous good DNA be used to replace the defective DNA in those who suffer from genetic defects.[54]

In 1984 a retrovirus vector system was designed that could efficiently insert foreign genes into mammalian chromosomes.[55]

The first approved gene therapy clinical research in the US took place on 14 September 1990, at the National Institutes of Health (NIH), under the direction of William French Anderson.[56] Four-year-old Ashanti DeSilva received treatment for a genetic defect that left her with ADA-SCID, a severe immune system deficiency. The defective gene in the patient’s blood cells was replaced by the functional variant. Ashanti’s immune system was partially restored by the therapy. Production of the missing enzyme was temporarily stimulated, but new cells with functional genes were not generated. She led a normal life only with regular injections every two months. The effects were successful, but temporary.[57]

Cancer gene therapy was introduced in 1992/93 (Trojan et al. 1993).[58] The treatment of glioblastoma multiforme, a malignant brain tumor whose outcome is always fatal, was carried out using a vector expressing antisense IGF-I RNA (clinical trial approved by NIH protocol no. 1602 on 24 November 1993,[59] and by the FDA in 1994). This therapy also represents the beginning of cancer immunogene therapy, a treatment which has proved effective due to the anti-tumor mechanism of IGF-I antisense, which is related to strong immune and apoptotic phenomena.

In 1992 Claudio Bordignon, working at the Vita-Salute San Raffaele University, performed the first gene therapy procedure using hematopoietic stem cells as vectors to deliver genes intended to correct hereditary diseases.[60] In 2002 this work led to the publication of the first successful gene therapy treatment for adenosine deaminase deficiency (ADA-SCID). The success of a multi-center trial for treating children with SCID (severe combined immune deficiency or “bubble boy” disease) between 2000 and 2002 was questioned when two of the ten children treated at the trial’s Paris center developed a leukemia-like condition. Clinical trials were halted temporarily in 2002, but resumed after regulatory review of the protocol in the US, the United Kingdom, France, Italy, and Germany.[61]

In 1993 Andrew Gobea was born with SCID following prenatal genetic screening. Blood was removed from his mother’s placenta and umbilical cord immediately after birth, to acquire stem cells. The allele that codes for adenosine deaminase (ADA) was obtained and inserted into a retrovirus. Retroviruses and stem cells were mixed, after which the viruses inserted the gene into the stem cell chromosomes. Stem cells containing the working ADA gene were injected into Andrew’s blood. Injections of the ADA enzyme were also given weekly. For four years T cells (white blood cells), produced by stem cells, made ADA enzymes using the ADA gene. After four years more treatment was needed.[62]

Jesse Gelsinger’s death in 1999 impeded gene therapy research in the US.[63][64] As a result, the FDA suspended several clinical trials pending the reevaluation of ethical and procedural practices.[65]

The modified cancer gene therapy strategy of antisense IGF-I RNA (NIH no. 1602)[59] using the antisense/triple-helix anti-IGF-I approach was registered in 2002 in the Wiley gene therapy clinical trial registry (nos. 635 and 636). The approach has shown promising results in the treatment of six different malignant tumors: glioblastoma and cancers of the liver, colon, prostate, uterus, and ovary (Collaborative NATO Science Programme on Gene Therapy, USA, France, Poland, no. LST 980517, conducted by J. Trojan) (Trojan et al., 2012). This anti-gene antisense/triple-helix therapy has proven efficient due to its mechanism of simultaneously stopping IGF-I expression at the translation and transcription levels, strengthening anti-tumor immune and apoptotic phenomena.

Sickle-cell disease can be treated in mice.[66] In mice, which have essentially the same defect that causes human cases, researchers used a viral vector to induce production of fetal hemoglobin (HbF), which normally ceases to be produced shortly after birth. In humans, the use of hydroxyurea to stimulate the production of HbF temporarily alleviates sickle cell symptoms. The researchers demonstrated this treatment to be a more permanent means of increasing therapeutic HbF production.[67]

A new gene therapy approach repaired errors in messenger RNA derived from defective genes. This technique has the potential to treat thalassaemia, cystic fibrosis and some cancers.[68]

Researchers created liposomes 25 nanometers across that can carry therapeutic DNA through pores in the nuclear membrane.[69]

In 2003 a research team inserted genes into the brain for the first time. They used liposomes coated in a polymer called polyethylene glycol, which, unlike viral vectors, are small enough to cross the blood-brain barrier.[70]

Short pieces of double-stranded RNA (short, interfering RNAs or siRNAs) are used by cells to degrade RNA of a particular sequence. If a siRNA is designed to match the RNA copied from a faulty gene, then the abnormal protein product of that gene will not be produced.[71]
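The matching rule described above (the siRNA guide strand being complementary to a stretch of the target mRNA) can be sketched in a few lines. This is an illustrative toy only, not a real siRNA design tool; the sequences and the `matches_target` helper are hypothetical, and actual siRNA design also weighs strand thermodynamics, off-target matches, and overhangs.

```python
# Toy sketch of siRNA/target complementarity (hypothetical sequences).
# RNA base pairing: A-U and G-C.
RNA_COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence."""
    return "".join(RNA_COMPLEMENT[base] for base in reversed(rna))

def matches_target(guide: str, mrna: str) -> bool:
    """True if the guide strand is perfectly complementary to some
    window of the mRNA -- the condition for RNAi-mediated degradation."""
    return reverse_complement(guide) in mrna

# Hypothetical faulty-gene transcript and a guide against one window
# (real guides are ~21 nt; a 9-nt window is used here for readability).
mrna = "AUGGCUUACGGAUCCAAGUUU"
guide = reverse_complement(mrna[3:12])
print(guide, matches_target(guide, mrna))
```

A guide whose reverse complement appears nowhere in the transcript (e.g. `"AAAAAAAAA"` against the sequence above) would return `False`, modeling why an siRNA designed against one gene leaves unrelated transcripts alone.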

Gendicine is a cancer gene therapy that delivers the tumor suppressor gene p53 using an engineered adenovirus. In 2003, it was approved in China for the treatment of head and neck squamous cell carcinoma.[25]

In March researchers announced the successful use of gene therapy to treat two adult patients for X-linked chronic granulomatous disease, a disease which affects myeloid cells and damages the immune system. The study is the first to show that gene therapy can treat the myeloid system.[72]

In May a team reported a way to prevent the immune system from rejecting a newly delivered gene.[73] Similar to organ transplantation, gene therapy has been plagued by this problem. The immune system normally recognizes the new gene as foreign and rejects the cells carrying it. The research utilized a newly uncovered network of genes regulated by molecules known as microRNAs. This natural function selectively obscured their therapeutic gene in immune system cells and protected it from discovery. Mice infected with the gene containing an immune-cell microRNA target sequence did not reject the gene.

In August scientists successfully treated metastatic melanoma in two patients using killer T cells genetically retargeted to attack the cancer cells.[74]

In November researchers reported on the use of VRX496, a gene-based immunotherapy for the treatment of HIV that uses a lentiviral vector to deliver an antisense gene against the HIV envelope. In a phase I clinical trial, five subjects with chronic HIV infection who had failed to respond to at least two antiretroviral regimens were treated. A single intravenous infusion of autologous CD4 T cells genetically modified with VRX496 was well tolerated. All patients had stable or decreased viral load; four of the five patients had stable or increased CD4 T cell counts. All five patients had stable or increased immune response to HIV antigens and other pathogens. This was the first evaluation of a lentiviral vector administered in a US human clinical trial.[75][76]

In May researchers announced the first gene therapy trial for inherited retinal disease. The first operation was carried out on a 23-year-old British male, Robert Johnson, in early 2007.[77]

Leber’s congenital amaurosis is an inherited blinding disease caused by mutations in the RPE65 gene. The results of a small clinical trial in children were published in April.[11] Delivery of recombinant adeno-associated virus (AAV) carrying RPE65 yielded positive results. In May two more groups reported positive results in independent clinical trials using gene therapy to treat the condition. In all three clinical trials, patients recovered functional vision without apparent side-effects.[11][12][13][14]

In September researchers were able to give trichromatic vision to squirrel monkeys.[78] In November 2009, researchers halted a fatal genetic disorder called adrenoleukodystrophy in two children using a lentivirus vector to deliver a functioning version of ABCD1, the gene that is mutated in the disorder.[79]

An April paper reported that gene therapy addressed achromatopsia (color blindness) in dogs by targeting cone photoreceptors. Cone function and day vision were restored for at least 33 months in two young specimens. The therapy was less efficient for older dogs.[80]

In September it was announced that an 18-year-old male patient in France with beta-thalassemia major had been successfully treated.[81] Beta-thalassemia major is an inherited blood disease in which beta haemoglobin is missing and patients are dependent on regular lifelong blood transfusions.[82] The technique used a lentiviral vector to transduce the human β-globin gene into purified blood and marrow cells obtained from the patient in June 2007.[83] The patient’s haemoglobin levels were stable at 9 to 10 g/dL. About a third of the hemoglobin contained the form introduced by the viral vector and blood transfusions were not needed.[83][84] Further clinical trials were planned.[85] Bone marrow transplants are the only cure for thalassemia, but 75% of patients do not find a matching donor.[84]

Cancer immunogene therapy using a modified antigene, antisense/triple helix approach was introduced in South America in 2010/11 at La Sabana University, Bogotá (Ethical Committee, 14 December 2010, no. P-004-10). In view of the ethical aspects of gene diagnostics and gene therapy targeting IGF-I, tumors expressing IGF-I, i.e. lung and epidermis cancers, were treated (Trojan et al. 2016).[86][87]

In 2007 and 2008, a man (Timothy Ray Brown) was cured of HIV by repeated hematopoietic stem cell transplantation (see also allogeneic stem cell transplantation, allogeneic bone marrow transplantation, allotransplantation) from a donor homozygous for the CCR5-Δ32 mutation, which disables the CCR5 receptor. This cure was accepted by the medical community in 2011.[88] It required complete ablation of the existing bone marrow, which is very debilitating.

In August two of three subjects of a pilot study were confirmed to have been cured of chronic lymphocytic leukemia (CLL). The therapy used genetically modified T cells to attack cells that expressed the CD19 protein to fight the disease.[20] In 2013, the researchers announced that 26 of 59 patients had achieved complete remission and the original patient had remained tumor-free.[89]

Human HGF plasmid DNA therapy of cardiomyocytes is being examined as a potential treatment for coronary artery disease as well as treatment for the damage that occurs to the heart after myocardial infarction.[90][91]

In 2011 Neovasculgen was registered in Russia as the first-in-class gene-therapy drug for treatment of peripheral artery disease, including critical limb ischemia; it delivers the gene encoding VEGF.[92][26] Neovasculgen is a plasmid encoding the CMV promoter and the 165-amino-acid form of VEGF.[93][94]

The FDA approved Phase 1 clinical trials on thalassemia major patients in the US for 10 participants in July.[95] The study was expected to continue until 2015.[85]

In July 2012, the European Medicines Agency recommended approval of a gene therapy treatment for the first time in either Europe or the United States. The treatment used Alipogene tiparvovec (Glybera) to compensate for lipoprotein lipase deficiency, which can cause severe pancreatitis.[96] The recommendation was endorsed by the European Commission in November 2012[10][27][97][98] and commercial rollout began in late 2014.[99] Alipogene tiparvovec was expected to cost around $1.6 million per treatment in 2012,[100] revised to $1 million in 2015,[101] making it the most expensive medicine in the world at the time.[102] As of 2016, only one person had been treated with the drug.[103]

In December 2012, it was reported that 10 of 13 patients with multiple myeloma were in remission “or very close to it” three months after being injected with a treatment involving genetically engineered T cells to target proteins NY-ESO-1 and LAGE-1, which exist only on cancerous myeloma cells.[22]

In March researchers reported that three of five adult subjects who had acute lymphocytic leukemia (ALL) had been in remission for five months to two years after being treated with genetically modified T cells which attacked cells bearing the CD19 protein on their surface, i.e. all B-cells, cancerous or not. The researchers believed that the patients’ immune systems would make normal T-cells and B-cells after a couple of months. They were also given bone marrow transplants. One patient relapsed and died and one died of a blood clot unrelated to the disease.[21]

Following encouraging Phase 1 trials, in April, researchers announced they were starting Phase 2 clinical trials (called CUPID2 and SERCA-LVAD) on 250 patients[104] at several hospitals to combat heart disease. The therapy was designed to increase the levels of SERCA2, a protein in heart muscle, improving muscle function.[105] The FDA granted this a Breakthrough Therapy Designation to accelerate the trial and approval process.[106] In 2016 it was reported that no improvement was found in the CUPID2 trial.[107]

In July researchers reported that six children with two severe hereditary diseases had been treated with a partially deactivated lentivirus to replace a faulty gene, with promising results after 7–32 months. Three of the children had metachromatic leukodystrophy, which causes children to lose cognitive and motor skills.[108] The other children had Wiskott-Aldrich syndrome, which leaves them open to infection, autoimmune diseases, and cancer.[109] Follow-up trials with gene therapy on another six children with Wiskott-Aldrich syndrome were also reported as promising.[110][111]

In October researchers reported that two children born with adenosine deaminase severe combined immunodeficiency disease (ADA-SCID) had been treated with genetically engineered stem cells 18 months previously and that their immune systems were showing signs of full recovery. Another three children were making progress.[18] In 2014 a further 18 children with ADA-SCID were cured by gene therapy.[112] ADA-SCID children have no functioning immune system and are sometimes known as “bubble children.”[18]

Also in October researchers reported that they had treated six hemophilia sufferers in early 2011 using an adeno-associated virus. Over two years later all six were producing clotting factor.[18][113]

In January researchers reported that six choroideremia patients had been treated with adeno-associated virus with a copy of REP1. Over a six-month to two-year period all had improved their sight.[114][115] By 2016, 32 patients had been treated with positive results and researchers were hopeful the treatment would be long-lasting.[15] Choroideremia is an inherited genetic eye disease with no approved treatment, leading to loss of sight.

In March researchers reported that 12 HIV patients had been treated since 2009 in a trial with a genetically engineered virus with a rare mutation (CCR5 deficiency) known to protect against HIV with promising results.[116][117]

Clinical trials of gene therapy for sickle cell disease were started in 2014.[118][119] There is a need for high quality randomised controlled trials assessing the risks and benefits involved with gene therapy for people with sickle cell disease.[120]

In February LentiGlobin BB305, a gene therapy treatment undergoing clinical trials for treatment of beta thalassemia, gained FDA “breakthrough” status after several patients were able to forgo the frequent blood transfusions usually required to treat the disease.[121]

In March researchers delivered a recombinant gene encoding a broadly neutralizing antibody into monkeys infected with simian HIV; the monkeys’ cells produced the antibody, which cleared them of HIV. The technique is named immunoprophylaxis by gene transfer (IGT). Animal tests for antibodies to Ebola, malaria, influenza, and hepatitis were underway.[122][123]

In March, scientists, including an inventor of CRISPR, Jennifer Doudna, urged a worldwide moratorium on germline gene therapy, writing “scientists should avoid even attempting, in lax jurisdictions, germline genome modification for clinical application in humans” until the full implications “are discussed among scientific and governmental organizations”.[124][125][126][127]

In October, researchers announced that they had treated a baby girl, Layla Richards, with an experimental treatment using donor T-cells genetically engineered using TALEN to attack cancer cells. One year after the treatment she was still free of her cancer (a highly aggressive form of acute lymphoblastic leukaemia [ALL]).[128] Children with highly aggressive ALL normally have a very poor prognosis and Layla’s disease had been regarded as terminal before the treatment.[129]

In December, scientists of major world academies called for a moratorium on inheritable human genome edits, including those made with CRISPR-Cas9 technologies,[130] while agreeing that basic research, including embryo gene editing, should continue.[131]

In April the Committee for Medicinal Products for Human Use of the European Medicines Agency endorsed a gene therapy treatment called Strimvelis[132][133] and the European Commission approved it in June.[134] The treatment is for children born with adenosine deaminase deficiency who have no functioning immune system. It was the second gene therapy treatment to be approved in Europe.[135]

In October, Chinese scientists reported they had started a trial to genetically modify T-cells from 10 adult patients with lung cancer and reinject the modified T-cells back into their bodies to attack the cancer cells. The T-cells had the PD-1 protein (which stops or slows the immune response) removed using CRISPR-Cas9.[136][137]

A 2016 Cochrane systematic review looking at data from four trials on topical cystic fibrosis transmembrane conductance regulator (CFTR) gene therapy does not support its clinical use as a mist inhaled into the lungs to treat cystic fibrosis patients with lung infections. One of the four trials did find weak evidence that liposome-based CFTR gene transfer therapy may lead to a small respiratory improvement for people with CF. This weak evidence is not enough to make a clinical recommendation for routine CFTR gene therapy.[138]

In February Kite Pharma announced results from a clinical trial of CAR-T cells in around a hundred people with advanced non-Hodgkin lymphoma.[139]

In March, French scientists reported on clinical research of gene therapy to treat sickle-cell disease.[140]

In August, the FDA approved tisagenlecleucel for acute lymphoblastic leukemia.[141] Tisagenlecleucel is an adoptive cell transfer therapy for B-cell acute lymphoblastic leukemia; T cells from a person with cancer are removed, genetically engineered to express a chimeric antigen receptor (“CAR”) that reacts to the cancer, and are administered back to the person. The T cells are engineered to target a protein called CD19 that is common on B cells. This is the first form of gene therapy to be approved in the United States. In October, a similar therapy called axicabtagene ciloleucel was approved for non-Hodgkin lymphoma.[142]

In December the results of using an adeno-associated virus carrying the gene for blood clotting factor VIII to treat nine haemophilia A patients were published. Six of the seven patients on the high-dose regimen increased their factor VIII levels to normal. The low- and medium-dose regimens had no effect on the patients’ factor VIII levels.[143][144]

In December, the FDA approved Luxturna, the first in vivo gene therapy, for the treatment of blindness due to Leber’s congenital amaurosis.[145] The price of this treatment was 850,000 US dollars for both eyes.[146][147] CRISPR gene editing technology has also been used on mice to treat deafness due to the DFNA36 mutation, which also affects humans.[148]

Speculated uses for gene therapy include:

Gene therapy techniques have the potential to provide alternative treatments for those with infertility. Recent experiments on mice have shown that fertility can be restored using gene therapy methods such as CRISPR.[149] Spermatogonial stem cells from another organism were transplanted into the testes of an infertile male mouse. The stem cells re-established spermatogenesis and fertility.[150]

Athletes might adopt gene therapy technologies to improve their performance.[151] Gene doping is not known to occur, but multiple gene therapies may have such effects. Kayser et al. argue that gene doping could level the playing field if all athletes receive equal access. Critics claim that any therapeutic intervention for non-therapeutic/enhancement purposes compromises the ethical foundations of medicine and sports.[152]

Genetic engineering could be used to cure diseases, but also to change physical appearance, metabolism, and even improve physical capabilities and mental faculties such as memory and intelligence. Ethical claims about germline engineering include beliefs that every fetus has a right to remain genetically unmodified, that parents hold the right to genetically modify their offspring, and that every child has the right to be born free of preventable diseases.[153][154][155] For parents, genetic engineering could be seen as another child enhancement technique to add to diet, exercise, education, training, cosmetics, and plastic surgery.[156][157] Another theorist claims that moral concerns limit but do not prohibit germline engineering.[158]

Possible regulatory schemes include a complete ban, provision to everyone, or professional self-regulation. The American Medical Association’s Council on Ethical and Judicial Affairs stated that “genetic interventions to enhance traits should be considered permissible only in severely restricted situations: (1) clear and meaningful benefits to the fetus or child; (2) no trade-off with other characteristics or traits; and (3) equal access to the genetic technology, irrespective of income or other socioeconomic characteristics.”[159]

As early in the history of biotechnology as 1990, there have been scientists opposed to attempts to modify the human germline using these new tools,[160] and such concerns have continued as technology progressed.[161][162] With the advent of new techniques like CRISPR, in March 2015 a group of scientists urged a worldwide moratorium on clinical use of gene editing technologies to edit the human genome in a way that can be inherited.[124][125][126][127] In April 2015, researchers sparked controversy when they reported results of basic research to edit the DNA of non-viable human embryos using CRISPR.[149][163] A committee of the American National Academy of Sciences and National Academy of Medicine gave qualified support to human genome editing in 2017[164][165] once answers have been found to safety and efficiency problems “but only for serious conditions under stringent oversight.”[166]

Regulations covering genetic modification are part of general guidelines about human-involved biomedical research. There are no international treaties which are legally binding in this area, but there are recommendations for national laws from various bodies.

The Helsinki Declaration (Ethical Principles for Medical Research Involving Human Subjects) was amended by the World Medical Association’s General Assembly in 2008. This document provides principles physicians and researchers must consider when involving humans as research subjects. The Statement on Gene Therapy Research initiated by the Human Genome Organization (HUGO) in 2001 provides a legal baseline for all countries. HUGO’s document emphasizes human freedom and adherence to human rights, and offers recommendations for somatic gene therapy, including the importance of recognizing public concerns about such research.[167]

No federal legislation lays out protocols or restrictions about human genetic engineering. This subject is governed by overlapping regulations from local and federal agencies, including the Department of Health and Human Services, the FDA and NIH’s Recombinant DNA Advisory Committee. Researchers seeking federal funds for an investigational new drug application (commonly the case for somatic human genetic engineering) must obey international and federal guidelines for the protection of human subjects.[168]

NIH serves as the main gene therapy regulator for federally funded research. Privately funded research is advised to follow these regulations. NIH provides funding for research that develops or enhances genetic engineering techniques and to evaluate the ethics and quality in current research. The NIH maintains a mandatory registry of human genetic engineering research protocols that includes all federally funded projects.

An NIH advisory committee published a set of guidelines on gene manipulation.[169] The guidelines discuss lab safety as well as human test subjects and various experimental types that involve genetic changes. Several sections specifically pertain to human genetic engineering, including Section III-C-1. This section describes required review processes and other aspects when seeking approval to begin clinical research involving genetic transfer into a human patient.[170] The protocol for a gene therapy clinical trial must be approved by the NIH’s Recombinant DNA Advisory Committee prior to any clinical trial beginning; this is different from any other kind of clinical trial.[169]

As with other kinds of drugs, the FDA regulates the quality and safety of gene therapy products and supervises how these products are used clinically. Therapeutic alteration of the human genome falls under the same regulatory requirements as any other medical treatment. Research involving human subjects, such as clinical trials, must be reviewed and approved by the FDA and an Institutional Review Board.[171][172]

Gene therapy is the basis for the plotline of the film I Am Legend[173] and the TV show Will Gene Therapy Change the Human Race?.[174] In 1994, gene therapy was a plot element in The Erlenmeyer Flask, The X-Files’ first season finale. It is also used in Stargate as a means of allowing humans to use Ancient technology.[175]

Genetic engineering in science fiction – Wikipedia

In literature and especially in science fiction, genetic engineering has been used as a theme or a plot device in many stories.[1][2]

In his 1924 essay Daedalus, or Science and the Future, J. B. S. Haldane predicted a day when biologists would invent new algae to feed the world and ectogenetic children would be created and modified using eugenic selection. Aldous Huxley developed these ideas in a satirical direction for his 1932 novel Brave New World, in which ectogenetic embryos were developed in selected environments to create children of an ‘Alpha’, ‘Beta’, or ‘Gamma’ type.[3]

The advent of large-scale genetic engineering has increased its presence in fiction.[4][5] Genetics research consortia, such as the Wellcome Trust Sanger Institute, have felt the need to distinguish genetic engineering fact from fiction in explaining their work to the public,[1] and have explored the role that genetic engineering has played in the public perception of programs, such as the Human Genome Project.[6]

Beyond the usual library catalog classifications,[7] the Wellcome Trust Sanger Institute[1] and the NHGRI[6] have compiled catalogs of literature in various media with genetics and genetic engineering as a theme or plot device. Such compilations are also available at fan sites.[8]

In the 2000 television series Andromeda, the Nietzscheans (Homo sapiens invictus in Latin) are a race of genetically engineered humans who religiously follow the works of Friedrich Nietzsche, social Darwinism and Dawkinite genetic competitiveness. They claim to be physically perfect and are distinguished by bone blades protruding outwards from the wrist area.

In the book 2312 by Kim Stanley Robinson, genetic engineering of humans, plants and animals and how that affects a society spread over the solar system is explored.

In the Animorphs book series, a race of aliens known as the Hork-Bajir was engineered by a race known as the Arns. Another race, the Iskhoots, is a further example of genetic engineering: the outer body, the Isk, was created by the Yoort, who also modified themselves to be symbiotic with the Isk. A being known as the Ellimist has also created species, such as the Pemalites, by this method.

In the 1983 film Anna to the Infinite Power, the main character was one of seven genetically cloned humans created by Anna Zimmerman as a way to groom a perfect person in her image. After her death, her work was carried on by her successor Dr. Henry Jelliff, who had other plans for the project. But in the end we learn that her original genetic creation, Michaela Dupont, has already acquired her creator’s abilities, including how to build a genetic replicator from scratch.

The 1996 video game series Resident Evil involves the creation of genetically engineered viruses which turn humans and animals into organisms such as zombies, the Tyrants or Hunters by a worldwide pharmaceutical company called the Umbrella Corporation.

In the video game series BioShock, most of the enemies in both BioShock and BioShock 2, referred to as “splicers”, as well as the player, gain superpowers and enhance their physical and mental capabilities by means of genetically engineered plasmids, created by use of ADAM stem cells secreted by a species of sea slug.[9]

The novel Beggars in Spain by Nancy Kress and its sequels are widely recognized by science fiction critics as among the most sophisticated fictional treatments of genetic engineering. They portray genetically-engineered characters whose abilities are far greater than those of ordinary humans (e.g. they are effectively immortal and they function without needing to sleep). At issue is what responsibility they have to use their abilities to help “normal” human beings. Kress explores libertarian and more collectivist philosophies, attempting to define the extent of people’s mutual responsibility for each other’s welfare.

In the Battletech science fiction series, the Clans have developed a genetic engineering program for their warriors, consisting of eugenics and the use of artificial wombs.

In The Champion Maker, a novel by Kevin Joseph, a track coach and a teenage phenom stumble upon a dark conspiracy involving genetic engineering while pursuing Olympic gold.

In the CoDominium series, the planet Sauron develops a supersoldier program. The results were the Sauron Cyborgs and the Sauron soldiers. The Cyborgs, who made up only a very small part of the population of Sauron, were part highly genetically engineered human and part machine. Cyborgs held very high status in Sauron society.

Sauron soldiers, who made up the balance of the population, were the result of generations of genetic engineering. They had a variety of physical characteristics and abilities that made them the best at combat and survival in many hostile environments. For instance, their bones were stronger than those of unmodified humans, and their lungs extracted oxygen more efficiently, allowing them to exert themselves without getting short of breath and to function at high altitudes. Sauron soldiers could also change the focal length of their eyes to “zoom” in on a distant object, much like an eagle.

The alien Moties also have used genetic engineering.

In the science fiction series Crest of the Stars, the Abh are a race of genetically engineered humans who continue to practice the technology. All Abh have been adapted to live in zero-gravity environments and share features such as beauty, long life, a lifelong youthful appearance, blue hair, and a “space sensory organ”.

In the 2000 TV series Dark Angel, the main character Max is one of a group of genetically engineered supersoldiers spliced with feline DNA.

In the 1993 military science fiction television series Exosquad, the plot revolves around the conflict between Terrans (baseline humans) and Neosapiens, a race of genetically engineered sentient (and sterile) humanoids who were originally bred for slave labour but revolted under the leadership of Phaeton and captured the Homeworlds (Earth, Venus and Mars). During the war, various sub-broods of Neosapiens were created, such as Neo Megas (intellectually superior to almost any being in the Solar System), Neo Warriors (cross-breeds with various animals) and Neo Lords (the ultimate supersoldiers).

Genetic modification is also found in the 2002 anime series Gundam SEED. It features enhanced humans called Coordinators who were created from ordinary humans through genetic modification.

In Marvel Comics, the 31st century adventurers called the Guardians of the Galaxy are genetically engineered residents of Mercury, Jupiter, and Pluto.

The 1997 film Gattaca deals with the idea of genetic engineering and eugenics as it projects what class relations would look like in a future society after a few generations of the possibility of genetic engineering.

In Marvel Comics, the Inhumans are the result of genetic engineering of early humans by the Kree alien race.

Rather than deliberate engineering, this 2017 novel by British author Steve Turnbull features a plague that carries genetic material across species, causing a wide variety of mutations. Human attempts to control this plague have resulted in a fascist dystopia.

In the Leviathan universe, a group known as the Darwinists use genetically engineered animals as weapons.

The 2000 AD strip Lobster Random features a former soldier turned torturer who has been modified to feel no pain and need no sleep, and who has a pair of lobster claws grafted to his hips. This state has left him somewhat grouchy.

In Metal Gear Solid, the Genome Army were given gene therapy enhancements.

Also in the series, the Les Enfants Terribles project involved genetic engineering.

The Moreau series by S. Andrew Swann has as the central premise the proliferation of humanoid genetically-engineered animals. The name of the series (and of the creatures themselves) comes from the H. G. Wells novel The Island of Dr. Moreau. In the Wells novel, humanoid animals were created surgically, though this detail has been changed to be genetic manipulation in most film adaptations.

The Neanderthal Parallax novels by Robert J. Sawyer depict a eugenic society that has benefitted immensely from sterilizing dangerous criminals and from preventing the least intelligent 5% from procreating for ten generations.

In the Neon Genesis Evangelion anime series, the character Rei Ayanami is implied to be a lab-created being combining human and angelic DNA. (compare to the Biblical Nephilim)

Genetic engineering (or something very like it) features prominently in Last and First Men, a 1930 novel by Olaf Stapledon.

Genetic engineering is depicted as widespread in the civilized world of Oryx and Crake. Prior to the apocalypse, though, its use among humans is not mentioned. Author Margaret Atwood describes many transgenic creatures such as Pigoons (though originally designed to be harvested for organs, post-apocalyptic-plague, they become more intelligent and vicious, traveling in packs), Snats (snake-rat hybrids who may or may not be extinct), wolvogs (wolf-dog hybrids), and the relatively harmless “rakunks” (skunk-raccoon hybrids, originally designed as pets with no scent glands).

In Plague, a 1978 film, a bacterium in an agricultural experiment accidentally escapes from a research laboratory in Canada, reaching the American Northeast and Great Britain.

Using a method similar to the DNA Resequencer from Stargate SG-1, and even called DNA Resequencing, the Operation Overdrive Power Rangers were given powers of superhuman strength, enhanced hearing, enhanced eyesight, super bouncing, super speed, and invisibility.

Quake II and Quake 4, released in 1997 and 2005, contain genetically-engineered Stroggs.

In the long-running 2006 series Rogue Trooper, the eponymous hero is a Genetic Infantryman, one of an elite group of supersoldiers genetically modified to resist the poisons left in the Nu-Earth atmosphere by decades of war. The original concept comes from the pages of the 1980s cult sci-fi comic 2000 AD (of Judge Dredd fame).

James Blish’s The Seedling Stars (1956) is the classic story of controlled mutation for adaptability. In this novel (originally a series of short stories) the Adapted Men are reshaped human beings, designed for life on a variety of other planets. This is one of science fiction’s most unreservedly optimistic accounts to date of technological efforts to reshape human beings.

In the 2014 episode of The Simpsons “The Man Who Grew Too Much”, Sideshow Bob steals DNA from a GMO company, thus making himself the very first genetically engineered human, and attempts to combine his DNA with that of the smartest people ever to exist on Earth.

In Sleeper, a 1973 parody of many science fiction tropes, genetically modified crops are shown to grow gigantic.

The short-lived 1990s television series Space: Above and Beyond includes a race of genetically engineered and artificially gestated humans who are born at the physical age of 18, and are collectively known as In Vitros or sometimes, derogatorily, “tanks” or “nipple-necks”. At the time of the series storyline, this artificial human race was integrated with the parent species, but significant discrimination still occurred.

The Ultimate Life Form project that produced Shadow the Hedgehog and Biolizard in the Sonic the Hedgehog series was a genetic engineering project.

In the Star Trek universe, genetic engineering has featured in a couple of films, and a number of television episodes.

The Breen, the Dominion, Species 8472, the Xindi, and the Federation use technology with organic components.

Khan Noonien Singh, who appeared in Space Seed and Star Trek II: The Wrath of Khan, was a product of genetic engineering. His physical structure was modified to make him stronger and to give him greater stamina than a regular human, and his mind was also enhanced. However, the creation of Khan had serious consequences, because the superior abilities given to him created superior ambition. He and other enhanced individuals tried to take over the planet. When they were reawakened by the Enterprise, Khan set himself to taking over the universe. Later, consumed by grief and rage, he set himself on the goal of destroying Kirk.

Others of these genetically enhanced augments wreaked havoc in the 22nd century, and eventually some of their enhanced DNA was blended with Klingon DNA, creating the human-looking Klingons of the early 23rd century (See Star Trek: Enterprise episodes “Affliction” and “Divergence”).

Because of the experiences with genetic engineering, the Federation had banned it except to correct genetic birth defects, but a number of parents still illegally subjected their children to genetic engineering for a variety of reasons. This often created brilliant but unstable individuals. Such children are not allowed to serve in Starfleet or practice medicine, though Julian Bashir is a notable exception to this. Despite the ban, the Federation allowed the Darwin station to conduct human genetic engineering, which resulted in telepathic, telekinetic humans with very effective immune systems.

In Attack of the Clones, the Kamino cloners who created the clone army for the Galactic Republic used genetic engineering to enhance their clones. They modified the genetic structure of all but one to accelerate their growth rate, make them less independent, and make them better suited to combat operations.

Later, the Yuuzhan Vong are a race who exclusively use organic technology and regard mechanical technology as heresy. Everything from starships to communications devices to weapons are bred and grown to suit their needs.

In the show Stargate SG-1, the DNA Resequencer was a device built by the Ancients, designed to make extreme upgrades to humans by realigning their DNA and upgrading their brain activity. The machine gave them superhuman abilities, such as telekinesis, telepathy, precognition, superhuman senses, strength, and intellect, the power to heal at an incredible rate, and the power to heal others by touch.

In the futuristic tabletop and video game series, Warhammer 40,000, the Imperium of Man uses genetic engineering to enhance the abilities of various militant factions such as the Space Marines, the Thunder Warriors, and the Adeptus Custodes. In the case of Space Marines, a series of synthesized, metamorphosis-inducing organs, known as gene seed, is made from the genome of the twenty original Primarchs and used to start the transformation of these superhuman warriors.

At the same time, the Tau Empire uses a form of eugenic breeding to improve the physical and mental condition of its various castes.

In the e-book, Methuselah’s Virus, an ageing pharmaceutical billionaire accidentally creates a contagious virus capable of infecting people with extreme longevity when his genetic engineering experiment goes wrong. The novel then examines the problem of what happens if Methuselah’s Virus is at risk of spreading to everyone on the entire planet.

In World Hunger, author Brian Kenneth Swain paints the harrowing picture of a life sciences company that field tests a new strain of genetically modified crop, the unexpected side effect of which is the creation of several new species of large and very aggressive insects.

Genetic engineering is an essential theme of the illustrated book Man After Man: An Anthropology of the Future by Dougal Dixon, where it is used to colonize other star systems and save the humans of Earth from extinction.

In the e-book The Survival Gene, author Artsun Akopyan explores the idea that people cannot preserve nature as it is forever, so in the future they will have to change their own genetics or die. In the novel, wave genetics is used to save humankind and all life on Earth.

The Uplift series by David Brin depicts humans encountering the Five Galaxies, a multitude of sentient species which all practice Uplift: raising other species to sapience through genetic engineering. Humans, believing they rose to sapience through evolution alone, are seen as heretics, but they hold some status because, at the time of contact, they had already Uplifted two species: chimpanzees and bottlenose dolphins.

Eugenics is a recurrent theme in science fiction, often with both dystopian and utopian elements. A towering contribution in this field is the novel Brave New World (1932) by Aldous Huxley, which describes a society in which state control of human biology results in permanent social stratification.

There tends to be a eugenic undercurrent in the science fiction concept of the supersoldier. Several depictions of these supersoldiers usually have them bred for combat or genetically selected for attributes that are beneficial to modern or future combat.

The Brave New World theme also plays a role in the 1997 film Gattaca, whose plot revolves around reprogenetics, genetic testing, and the social consequences of eugenics. Boris Vian (under the pseudonym Vernon Sullivan) takes a more light-hearted approach in his novel Et on tuera tous les affreux (“And we’ll kill all the ugly ones”).

Other novels touching upon the subject include The Gate to Women’s Country by Sheri S. Tepper and That Hideous Strength by C. S. Lewis. The Eugenics Wars are a significant part of the background story of the Star Trek universe (the episodes “Space Seed”, “Borderland”, “Cold Station 12” and “The Augments”, and the film Star Trek II: The Wrath of Khan). Eugenics also plays a significant role in the Neanderthal Parallax trilogy, in which eugenics-practicing Neanderthals from a near-utopian parallel world create a gateway to Earth. Cowl by Neal Asher describes the collapse of Western civilization due to dysgenics. Eugenics is also the name of the medical company in the book La Foire aux immortels by Enki Bilal and in the film Immortel (Ad Vitam) by the same author.

In Frank Herbert’s Dune series of novels, selective breeding programs form a significant theme. Early in the series, the Bene Gesserit religious order manipulates breeding patterns over many generations in order to create the Kwisatz Haderach. In God Emperor of Dune, the emperor Leto II again manipulates human breeding in order to achieve his own ends. The Bene Tleilaxu also employed genetic engineering to create human beings with specific genetic attributes. The Dune series ended with causal determinism playing a large role in the development of behavior, but the eugenics theme remained a crucial part of the story.

In Orson Scott Card’s novel Ender’s Game, Ender is only allowed to be conceived because of a special government exception due to his parents’ high intelligence and the extraordinary performance of his siblings. In Ender’s Shadow, Bean is a test-tube baby and the result of a failed eugenics experiment aimed at creating child geniuses.

In the novels Methuselah’s Children and Time Enough for Love by Robert A. Heinlein, a large trust fund is created to give financial encouragement to marriage among people (the Howard Families) whose parents and grandparents were long-lived. The result is a subset of Earth’s population with significantly above-average life spans. Members of this group appear in many of the author’s other works.

In the 1982 Robert Heinlein novel Friday, the main character has been genetically engineered from multiple sets of donors, including, as she later discovers, her boss. These enhancements give her superior strength, speed, and eyesight, in addition to rapid healing and other advanced attributes. Creations like her are known as APs (Artificial Persons).

In Eoin Colfer’s book The Supernaturalist, Ditto is a Bartoli Baby, which is the name for a failed experiment of the famed Dr. Bartoli. Bartoli tried to create a superior race of humans, but they ended in arrested development, with mutations including extrasensory perception and healing hands.

In Larry Niven’s Ringworld series, the character Teela Brown is a result of several generations of winners of the “Birthright Lottery”, a system which attempts to encourage lucky people to breed, treating good luck as a genetic trait.

In season 2 of Dark Angel, the main ‘bad guy’ Ames White is a member of a cult known as the Conclave which has infiltrated various levels of society to breed super-humans. They are trying to exterminate all the Transgenics, including the main character Max Guevara, whom they view as being genetically unclean for having some animal DNA spliced with human.

In the film Immortel (Ad Vitam), writer-director Enki Bilal names the evil, corrupt organization specializing in genetic manipulation, and in some very disturbing genetic “enhancement”, Eugenics. Eugenics has grown into a powerful organization and uses people and mutants of “lesser” genetic stock as guinea pigs. The film is based on the Nikopol trilogy from the Heavy Metal comic books.

In the video game Grand Theft Auto: Vice City, a fictional character called Pastor Richards, a caricature of an extreme and insane televangelist, is featured as a guest on a discussion radio show about morality. On this show, he describes shooting people who do not agree with him and who are not “morally correct”, which the show’s host describes as “amateur eugenics”.

In the 2006 Mike Judge film Idiocracy, a fictional character, Pvt. Joe Bauers, a.k.a. “Not Sure” (played by Luke Wilson), awakens from cryogenic stasis in the year 2505 into a world devastated by dysgenic degeneration. Bauers, who was chosen for his averageness, is discovered to be the smartest human alive and eventually becomes president of the United States.

The manga series Battle Angel Alita and its sequel Battle Angel Alita: Last Order (Gunnm and Gunnm: Last Order, as they are known in Japan) by Yukito Kishiro contain multiple references to the theme of eugenics. The most obvious is the sky city Tiphares (Salem in the Japanese edition). Dr. Desty Nova, in Volume 9 of the first series, reveals the eugenic nature of the city to Alita (Gally or Yoko), and it is further explored in the sequel series. A James Cameron movie based on the series is due for release in 2018.[10]

In the 2000 French police drama The Crimson Rivers, inspector Pierre Niemans (played by Jean Reno) and his colleague Max Kerkerian (Vincent Cassel) attempt to solve a series of murders triggered by a eugenics experiment that had been running for years in the university town of Guernon.

In the Cosmic Era universe of the Gundam anime series (Mobile Suit Gundam SEED), war is fought between normal human beings without genetic enhancements, known as the Naturals, and the genetically enhanced Coordinators. It explores the pros and cons, as well as the possible repercussions, of eugenics.

In the Star Wars expanded universe, the Khommites of the planet Khomm practice eugenics through self-cloning, believing themselves to be perfect.

The book Uglies, part of a four-book series by Scott Westerfeld, revolves around a girl named Tally who lives in a world where everyone at the age of sixteen receives extensive cosmetic surgery to turn into “Pretties” and join society. Although it deals with extreme cosmetic surgery, the utopian (or dystopian, depending on one’s interpretation) ideals in the book are similar to those present in the books mentioned above.

Genetic engineering in science fiction – Wikipedia

Nineteen Eighty-Four – Wikipedia

Nineteen Eighty-Four, often published as 1984, is a dystopian novel published in 1949 by English author George Orwell.[2][3] The novel is set in the year 1984, when most of the world’s population has fallen victim to perpetual war, omnipresent government surveillance and public manipulation.

In the novel, Great Britain (“Airstrip One”) has become a province of a superstate named Oceania. Oceania is ruled by the “Party”, who employ the “Thought Police” to persecute individualism and independent thinking.[4] The Party’s leader is Big Brother, who enjoys an intense cult of personality but may not even exist. The protagonist of the novel, Winston Smith, is a rank-and-file Party member. Smith is an outwardly diligent and skillful worker, but he secretly hates the Party and dreams of rebellion against Big Brother. Smith rebels by entering a forbidden relationship with fellow employee Julia.

As literary political fiction and dystopian science-fiction, Nineteen Eighty-Four is a classic novel in content, plot, and style. Many of its terms and concepts, such as Big Brother, doublethink, thoughtcrime, Newspeak, Room 101, telescreen, 2 + 2 = 5, and memory hole, have entered into common usage since its publication in 1949. Nineteen Eighty-Four popularised the adjective Orwellian, which describes official deception, secret surveillance, brazenly misleading terminology, and manipulation of recorded history by a totalitarian or authoritarian state.[5] In 2005, the novel was chosen by Time magazine as one of the 100 best English-language novels from 1923 to 2005.[6] It was awarded a place on both lists of Modern Library 100 Best Novels, reaching number 13 on the editor’s list, and 6 on the readers’ list.[7] In 2003, the novel was listed at number 8 on the BBC’s survey The Big Read.[8]

Orwell “encapsulate[d] the thesis at the heart of his unforgiving novel” in 1944: the implications of dividing the world up into zones of influence, which had been conjured at the Tehran Conference. Three years later, he wrote most of the novel on the Scottish island of Jura, from 1947 to 1948, despite being seriously ill with tuberculosis.[9][10] On 4 December 1948, he sent the final manuscript to the publisher Secker and Warburg, and Nineteen Eighty-Four was published on 8 June 1949.[11][12] By 1989, it had been translated into 65 languages, more than any other novel in English up to that time.[13] The title of the novel, its themes, the Newspeak language and the author’s surname are often invoked against control and intrusion by the state, and the adjective Orwellian describes a totalitarian dystopia characterised by government control and subjugation of the people.

Orwell’s invented language, Newspeak, satirises hypocrisy and evasion by the state: the Ministry of Love (Miniluv) oversees torture and brainwashing, the Ministry of Plenty (Miniplenty) oversees shortage and rationing, the Ministry of Peace (Minipax) oversees war and atrocity and the Ministry of Truth (Minitrue) oversees propaganda and historical revisionism.

The Last Man in Europe was an early title for the novel, but in a letter dated 22 October 1948 to his publisher Fredric Warburg, eight months before publication, Orwell wrote about hesitating between that title and Nineteen Eighty-Four.[14] Warburg suggested choosing the main title to be the latter, a more commercial one.[15]

In the novel 1985 (1978), Anthony Burgess suggests that Orwell, disillusioned by the onset of the Cold War (1945–91), intended to call the book 1948. The introduction to the Penguin Books Modern Classics edition of Nineteen Eighty-Four reports that Orwell originally set the novel in 1980 but that he later shifted the date to 1982 and then to 1984. The introduction to the Houghton Mifflin Harcourt edition of Animal Farm and 1984 (2003) reports that the title 1984 was chosen simply as an inversion of the year 1948, the year in which it was being completed, and that the date was meant to give an immediacy and urgency to the menace of totalitarian rule.[16]

Throughout its publication history, Nineteen Eighty-Four has been either banned or legally challenged, as subversive or ideologically corrupting, like Aldous Huxley’s Brave New World (1932), We (1924) by Yevgeny Zamyatin, Darkness at Noon (1940) by Arthur Koestler, Kallocain (1940) by Karin Boye and Fahrenheit 451 (1953) by Ray Bradbury.[17] Some writers consider the Russian dystopian novel We by Zamyatin to have influenced Nineteen Eighty-Four,[18][19] and the novel bears significant similarities in its plot and characters to Darkness at Noon, written years before by Arthur Koestler, who was a personal friend of Orwell.[20]

The novel is in the public domain in Canada,[21] South Africa,[22] Argentina,[23] Australia,[24] and Oman.[25] It will be in the public domain in the United Kingdom, the EU,[26] and Brazil in 2021[27] (70 years after the author’s death), and in the United States in 2044.[28]

Nineteen Eighty-Four is set in Oceania, one of three inter-continental superstates that divided the world after a global war.

Smith’s memories and his reading of the proscribed book, The Theory and Practice of Oligarchical Collectivism by Emmanuel Goldstein, reveal that after the Second World War, the United Kingdom became involved in a war fought in Europe, western Russia, and North America during the early 1950s. Nuclear weapons were used during the war, leading to the destruction of Colchester. London would also suffer widespread aerial raids, leading Winston’s family to take refuge in a London Underground station. Britain fell to civil war, with street fighting in London, before the English Socialist Party, abbreviated as Ingsoc, emerged victorious and formed a totalitarian government in Britain. The British Commonwealth was absorbed by the United States to become Oceania.

Simultaneously, the Soviet Union conquered continental Europe and established the second superstate of Eurasia. The third superstate of Eastasia would emerge in the Far East after several decades of fighting. The three superstates wage perpetual war for the remaining unconquered lands of the world in “a rough quadrilateral with its corners at Tangier, Brazzaville, Darwin, and Hong Kong” through constantly shifting alliances. Although each of the three states is said to have sufficient natural resources, the war continues in order to maintain ideological control over the people.

However, because Winston barely remembers these events and because of the Party’s manipulation of history, their continuity and accuracy are unclear. Winston himself notes that the Party has claimed credit for inventing helicopters, airplanes and trains, while Julia theorizes that the perpetual bombing of London is merely a false-flag operation designed to convince the populace that a war is occurring. If the official account is accurate, Smith’s strengthening memories and the story of his family’s dissolution suggest that the atomic bombings occurred first, followed by civil war featuring “confused street fighting in London itself” and the societal postwar reorganisation, which the Party retrospectively calls “the Revolution”.

Most of the plot takes place in London, the “chief city of Airstrip One”, the Oceanic province that “had once been called England or Britain”.[29][30] Posters of the Party leader, Big Brother, bearing the caption “BIG BROTHER IS WATCHING YOU”, dominate the city (Winston states it can be found on nearly every house), while the ubiquitous telescreen (transceiving television set) monitors the private and public lives of the populace. Military parades, propaganda films, and public executions are said to be commonplace.

The class hierarchy of Oceania has three levels: the elite Inner Party, the middle-class Outer Party, and the working-class proles.

As the government, the Party controls the population with four ministries: the Ministry of Peace, the Ministry of Plenty, the Ministry of Truth and the Ministry of Love.

The protagonist Winston Smith, a member of the Outer Party, works in the Records Department of the Ministry of Truth as an editor, revising historical records, to make the past conform to the ever-changing party line and deleting references to unpersons, people who have been “vaporised”, i.e., not only killed by the state but denied existence even in history or memory.

The story of Winston Smith begins on 4 April 1984: “It was a bright cold day in April, and the clocks were striking thirteen.” Yet he is uncertain of the true date, given the regime’s continual rewriting and manipulation of history.[31]

In the year 1984, civilization has been damaged by war, civil conflict, and revolution. Airstrip One (formerly Britain) is a province of Oceania, one of the three totalitarian super-states that rule the world. It is ruled by the “Party” under the ideology of “Ingsoc” and the mysterious leader Big Brother, who has an intense cult of personality. The Party stamps out anyone who does not fully conform to their regime using the Thought Police and constant surveillance, through devices such as Telescreens (two-way televisions).

Winston Smith is a member of the middle class Outer Party. He works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history. Those who fall out of favour with the Party become “unpersons”, disappearing with all evidence of their existence removed. Winston revises past editions of The Times, while the original documents are destroyed by fire in a “memory hole”. He secretly opposes the Party’s rule and dreams of rebellion. He realizes that he is already a “thoughtcriminal” and likely to be caught one day.

While in a proletarian neighbourhood, he meets an antique shop owner called Mr. Charrington and buys a diary. He uses an alcove to hide it from the Telescreen in his room, and writes thoughts criticising the Party and Big Brother. In the journal, he records his sexual frustration over Julia, a young woman who maintains the novel-writing machines at the ministry, to whom Winston is attracted but whom he suspects of being an informant. He also suspects that his superior, an Inner Party official named O’Brien, is a secret agent for an enigmatic underground resistance movement known as the Brotherhood, a group formed by Big Brother’s reviled political rival Emmanuel Goldstein.

The next day, Julia secretly hands Winston a note confessing her love for him. Winston and Julia begin an affair, an act of rebellion, as the Party insists that sex may only be used for reproduction. Winston realizes that she shares his loathing of the Party. They first meet in the country, and later in a rented room above Mr. Charrington’s shop. During his affair with Julia, Winston remembers the disappearance of his family during the civil war of the 1950s and his terse relationship with his ex-wife Katharine. Winston also interacts with his colleague Syme, who is writing a dictionary for a revised version of the English language called Newspeak. After Syme admits that the true purpose of Newspeak is to reduce the capacity of human thought, Winston speculates that Syme will disappear. Not long after, Syme disappears and no one acknowledges his absence.

Weeks later, Winston is approached by O’Brien, who offers Winston a chance to join the Brotherhood. They arrange a meeting at O’Brien’s luxurious flat, where both Winston and Julia swear allegiance to the Brotherhood. O’Brien sends Winston a copy of The Theory and Practice of Oligarchical Collectivism by Emmanuel Goldstein. Winston and Julia read parts of the book, which explains more about how the Party maintains power, the true meanings of its slogans and the concept of perpetual war. It argues that the Party can be overthrown if proles (proletarians) rise up against it.

Mr. Charrington is revealed to be an agent of the Thought Police. Winston and Julia are captured in the shop and imprisoned in the Ministry of Love. O’Brien reveals that he is loyal to the party, and part of a special sting operation to catch “thoughtcriminals”. Over many months, Winston is tortured and forced to “cure” himself of his “insanity” by changing his own perception to fit the Party line, even if it requires saying that “2 + 2 = 5”. O’Brien openly admits that the Party “is not interested in the good of others; it is interested solely in power.” He says that once Winston is brainwashed into loyalty, he will be released back into society for a period of time, before they execute him. Winston points out that the Party has not managed to make him betray Julia.

O’Brien then takes Winston to Room 101 for the final stage of re-education. The room contains each prisoner’s worst fear, in Winston’s case rats. As a wire cage holding hungry rats is fitted onto his face, Winston shouts “Do it to Julia!”, thus betraying her. After being released, Winston meets Julia in a park. She says that she was also tortured, and each reveals having betrayed the other. Later, Winston sits alone in a café as Oceania celebrates a supposed victory over Eurasian armies in Africa, and realizes that “He loved Big Brother.”

Ingsoc (English Socialism) is the predominant ideology and pseudophilosophy of Oceania, and Newspeak is the official language of official documents.

In London, the capital city of Airstrip One, Oceania’s four government ministries are in pyramids (300 m high), the façades of which display the Party’s three slogans. The ministries’ names are the opposite (doublethink) of their true functions: “The Ministry of Peace concerns itself with war, the Ministry of Truth with lies, the Ministry of Love with torture and the Ministry of Plenty with starvation.” (Part II, Chapter IX, The Theory and Practice of Oligarchical Collectivism)

The Ministry of Peace supports Oceania’s perpetual war against either of the two other superstates:

The primary aim of modern warfare (in accordance with the principles of doublethink, this aim is simultaneously recognized and not recognized by the directing brains of the Inner Party) is to use up the products of the machine without raising the general standard of living. Ever since the end of the nineteenth century, the problem of what to do with the surplus of consumption goods has been latent in industrial society. At present, when few human beings even have enough to eat, this problem is obviously not urgent, and it might not have become so, even if no artificial processes of destruction had been at work.

The Ministry of Plenty rations and controls food, goods, and domestic production; every fiscal quarter, it publishes false claims of having raised the standard of living, when it has, in fact, reduced rations, availability, and production. The Ministry of Truth substantiates the Ministry of Plenty’s claims by revising historical records to report numbers supporting the current “increased rations”.

The Ministry of Truth controls information: news, entertainment, education, and the arts. Winston Smith works in the Minitrue RecDep (Records Department), “rectifying” historical records to concord with Big Brother’s current pronouncements so that everything the Party says is true.

The Ministry of Love identifies, monitors, arrests, and converts real and imagined dissidents. In Winston’s experience, the dissident is beaten and tortured, and, when near-broken, he is sent to Room 101 to face “the worst thing in the world”, until love for Big Brother and the Party replaces dissension.

The keyword here is blackwhite. Like so many Newspeak words, this word has two mutually contradictory meanings. Applied to an opponent, it means the habit of impudently claiming that black is white, in contradiction of the plain facts. Applied to a Party member, it means a loyal willingness to say that black is white when Party discipline demands this. But it means also the ability to believe that black is white, and more, to know that black is white, and to forget that one has ever believed the contrary. This demands a continuous alteration of the past, made possible by the system of thought which really embraces all the rest, and which is known in Newspeak as doublethink. Doublethink is basically the power of holding two contradictory beliefs in one’s mind simultaneously, and accepting both of them.

Three perpetually warring totalitarian super-states control the world:[34] Oceania, Eurasia and Eastasia.

The perpetual war is fought for control of the “disputed area” lying “between the frontiers of the super-states”, which forms “a rough quadrilateral with its corners at Tangier, Brazzaville, Darwin and Hong Kong”,[34] and Northern Africa, the Middle East, India and Indonesia are where the superstates capture and use slave labour. Fighting also takes place between Eurasia and Eastasia in Manchuria, Mongolia and Central Asia, and all three powers battle one another over various Atlantic and Pacific islands.

Goldstein’s book, The Theory and Practice of Oligarchical Collectivism, explains that the superstates’ ideologies are alike and that the public’s ignorance of this fact is imperative so that they might continue believing in the detestability of the opposing ideologies. The only references to the exterior world for the Oceanian citizenry (the Outer Party and the Proles) are Ministry of Truth maps and propaganda to ensure their belief in “the war”.

Winston Smith’s memory and Emmanuel Goldstein’s book communicate some of the history that precipitated the Revolution. Eurasia was formed when the Soviet Union conquered Continental Europe, creating a single state stretching from Portugal to the Bering Strait. Eurasia does not include the British Isles because the United States annexed them along with the rest of the British Empire and Latin America, thus establishing Oceania and gaining control over a quarter of the planet. Eastasia, the last superstate established, emerged only after “a decade of confused fighting”. It includes the Asian lands conquered by China and Japan. Although Eastasia is prevented from matching Eurasia’s size, its larger populace compensates for that handicap.

The annexation of Britain occurred about the same time as the atomic war that provoked civil war, but who fought whom in the war is left unclear. Nuclear weapons fell on Britain; an atomic bombing of Colchester is referenced in the text. Exactly how Ingsoc and its rival systems (Neo-Bolshevism and Death Worship) gained power in their respective countries is also unclear.

While the precise chronology cannot be traced, most of the global societal reorganization occurred between 1945 and the early 1960s. Winston and Julia once meet in the ruins of a church that was destroyed in a nuclear attack “thirty years” earlier, which suggests 1954 as the year of the atomic war that destabilised society and allowed the Party to seize power. It is stated in the novel that the “fourth quarter of 1983” was “also the sixth quarter of the Ninth Three-Year Plan”, which implies that the first quarter of the first three-year plan began in July 1958. By then, the Party was apparently in control of Oceania.

In 1984, there is a perpetual war between Oceania, Eurasia and Eastasia, the superstates that emerged from the global atomic war. The Theory and Practice of Oligarchical Collectivism, by Emmanuel Goldstein, explains that each state is so strong it cannot be defeated, even with the combined forces of two superstates, despite changing alliances. To hide such contradictions, history is rewritten to explain that the (new) alliance always was so; the populaces are accustomed to doublethink and accept it. The war is not fought in Oceanian, Eurasian or Eastasian territory but in the Arctic wastes and in a disputed zone comprising the sea and land from Tangiers (Northern Africa) to Darwin (Australia). At the start, Oceania and Eastasia are allies fighting Eurasia in northern Africa and the Malabar Coast.

That alliance ends and Oceania, allied with Eurasia, fights Eastasia, a change occurring on Hate Week, dedicated to creating patriotic fervour for the Party’s perpetual war. The public are blind to the change; in mid-sentence, an orator changes the name of the enemy from “Eurasia” to “Eastasia” without pause. When the public are enraged at noticing that the wrong flags and posters are displayed, they tear them down; the Party later claims to have captured Africa.

Goldstein’s book explains that the purpose of the unwinnable, perpetual war is to consume human labour and commodities so that the economy of a superstate cannot support economic equality, with a high standard of life for every citizen. By using up most of the produced objects like boots and rations, the proles are kept poor and uneducated and will neither realise what the government is doing nor rebel. Goldstein also details an Oceanian strategy of attacking enemy cities with atomic rockets before invasion but dismisses it as unfeasible and contrary to the war’s purpose; despite the atomic bombing of cities in the 1950s, the superstates stopped it for fear that it would imbalance the powers. The military technology in the novel differs little from that of World War II, but strategic bomber aeroplanes have been replaced with rocket bombs, helicopters are heavily used as weapons of war (they did not figure in World War II in any form but prototypes) and surface combat units have been all but replaced by immense and unsinkable Floating Fortresses, island-like contraptions concentrating the firepower of a whole naval task force in a single, semi-mobile platform (in the novel, one is said to have been anchored between Iceland and the Faroe Islands, suggesting a preference for sea-lane interdiction and denial).

The society of Airstrip One and, according to “The Book”, almost the whole world, lives in poverty: hunger, disease and filth are the norms. Ruined cities and towns are common: the consequence of the civil war, the atomic wars and the purportedly enemy (but possibly false flag) rockets. Social decay and wrecked buildings surround Winston; aside from the ministerial pyramids, little of London was rebuilt. Members of the Outer Party consume synthetic foodstuffs and poor-quality “luxuries” such as oily gin and loosely-packed cigarettes, distributed under the “Victory” brand. (That is a parody of the low-quality Indian-made “Victory” cigarettes, widely smoked in Britain and by British soldiers during World War II. They were smoked because it was easier to import them from India than to import American cigarettes from across the Atlantic because of the Battle of the Atlantic.)

Winston describes something as simple as the repair of a broken pane of glass as requiring committee approval that can take several years and so most of those living in one of the blocks usually do the repairs themselves (Winston himself is called in by Mrs. Parsons to repair her blocked sink). All Outer Party residences include telescreens that serve both as outlets for propaganda and to monitor the Party members; they can be turned down, but they cannot be turned off.

In contrast to their subordinates, the Inner Party upper class of Oceanian society reside in clean and comfortable flats in their own quarter of the city, with pantries well-stocked with foodstuffs such as wine, coffee and sugar, all denied to the general populace.[35] Winston is astonished that the lifts in O’Brien’s building work, the telescreens can be switched off and O’Brien has an Asian manservant, Martin. All members of the Inner Party are attended to by slaves captured in the disputed zone, and “The Book” suggests that many have their own motorcars or even helicopters. Nonetheless, “The Book” makes clear that even the conditions enjoyed by the Inner Party are only “relatively” comfortable, and standards would be regarded as austere by those of the pre-revolutionary élite.[36]

The proles live in poverty and are kept sedated with alcohol, pornography and a national lottery whose winnings are never actually paid out; that is obscured by propaganda and the lack of communication within Oceania. At the same time, the proles are freer and less intimidated than the middle-class Outer Party: they are subject to certain levels of monitoring but are not expected to be particularly patriotic. They lack telescreens in their own homes and often jeer at the telescreens that they see. “The Book” indicates that is because the middle class, not the lower class, traditionally starts revolutions. The model demands tight control of the middle class, with ambitious Outer-Party members neutralised via promotion to the Inner Party or “reintegration” by the Ministry of Love, and proles can be allowed intellectual freedom because they lack intellect. Winston nonetheless believes that “the future belonged to the proles”.[37]

The standard of living of the populace is low overall. Consumer goods are scarce, and all those available through official channels are of low quality; for instance, despite the Party regularly reporting increased boot production, more than half of the Oceanian populace goes barefoot. The Party claims that poverty is a necessary sacrifice for the war effort, and “The Book” confirms that to be partially correct since the purpose of perpetual war consumes surplus industrial production. Outer Party members and proles occasionally gain access to better items in the market, which deals in goods that were pilfered from the residences of the Inner Party.[citation needed]

Nineteen Eighty-Four expands upon the subjects summarised in Orwell’s essay “Notes on Nationalism”[38] about the lack of vocabulary needed to explain the unrecognised phenomena behind certain political forces. In Nineteen Eighty-Four, the Party’s artificial, minimalist language ‘Newspeak’ addresses the matter.

O’Brien concludes: “The object of persecution is persecution. The object of torture is torture. The object of power is power.”

In the book, Inner Party member O’Brien describes the Party’s vision of the future:

There will be no curiosity, no enjoyment of the process of life. All competing pleasures will be destroyed. But always—do not forget this, Winston—always there will be the intoxication of power, constantly increasing and constantly growing subtler. Always, at every moment, there will be the thrill of victory, the sensation of trampling on an enemy who is helpless. If you want a picture of the future, imagine a boot stamping on a human face—forever.

Part III, Chapter III, Nineteen Eighty-Four

A major theme of Nineteen Eighty-Four is censorship, especially in the Ministry of Truth, where photographs are modified and public archives rewritten to rid them of “unpersons” (persons who are erased from history by the Party). On the telescreens, figures for all types of production are grossly exaggerated or simply invented to indicate an ever-growing economy, when the reality is the opposite. One small example of the endless censorship is Winston being charged with the task of eliminating a reference to an unperson in a newspaper article. He proceeds to write an article about Comrade Ogilvy, a made-up party member who displayed great heroism by leaping into the sea from a helicopter so that the dispatches he was carrying would not fall into enemy hands.

The inhabitants of Oceania, particularly the Outer Party members, have no real privacy. Many of them live in apartments equipped with two-way telescreens so that they may be watched or listened to at any time. Similar telescreens are found at workstations and in public places, along with hidden microphones. Written correspondence is routinely opened and read by the government before it is delivered. The Thought Police employ undercover agents, who pose as normal citizens and report any person with subversive tendencies. Children are encouraged to report suspicious persons to the government, and some denounce their parents. Citizens are controlled, and the smallest sign of rebellion, even something so small as a facial expression, can result in immediate arrest and imprisonment. Thus, citizens, particularly party members, are compelled to obedience.

“The Principles of Newspeak” is an academic essay appended to the novel. It describes the development of Newspeak, the Party’s minimalist artificial language meant to ideologically align thought and action with the principles of Ingsoc by making “all other modes of thought impossible”. (A linguistic theory about how language may direct thought is the Sapir–Whorf hypothesis.)

Whether or not the Newspeak appendix implies a hopeful end to Nineteen Eighty-Four remains a critical debate, as it is in Standard English and refers to Newspeak, Ingsoc, the Party, etc., in the past tense: “Relative to our own, the Newspeak vocabulary was tiny, and new ways of reducing it were constantly being devised” (p. 422). Some critics (Atwood,[39] Benstead,[40] Milner,[41] Pynchon[42]) claim that for the essay’s author, both Newspeak and the totalitarian government are in the past.

Nineteen Eighty-Four uses themes from life in the Soviet Union and wartime life in Great Britain as sources for many of its motifs. At an unspecified date after the first American publication of the book, producer Sidney Sheldon wrote to Orwell, interested in adapting the novel to the Broadway stage. Orwell sold the American stage rights to Sheldon, explaining that his basic goal with Nineteen Eighty-Four was imagining the consequences of Stalinist government ruling British society:

[Nineteen Eighty-Four] was based chiefly on communism, because that is the dominant form of totalitarianism, but I was trying chiefly to imagine what communism would be like if it were firmly rooted in the English speaking countries, and was no longer a mere extension of the Russian Foreign Office.[43]

The statement “2 + 2 = 5”, used to torment Winston Smith during his interrogation, was a communist party slogan from the second five-year plan, which encouraged fulfillment of the five-year plan in four years. The slogan was seen in electric lights on Moscow house-fronts, billboards and elsewhere.[44]

The switch of Oceania’s allegiance from Eastasia to Eurasia and the subsequent rewriting of history (“Oceania was at war with Eastasia: Oceania had always been at war with Eastasia. A large part of the political literature of five years was now completely obsolete”; ch 9) is evocative of the Soviet Union’s changing relations with Nazi Germany. The two nations were open and frequently vehement critics of each other until the signing of the 1939 Treaty of Non-Aggression. Thereafter, and continuing until the Nazi invasion of the Soviet Union in 1941, no criticism of Germany was allowed in the Soviet press, and all references to prior party lines stopped, including in the majority of non-Russian communist parties, who tended to follow the Russian line. Orwell had criticised the Communist Party of Great Britain for supporting the Treaty in his essays for Betrayal of the Left (1941). “The Hitler-Stalin pact of August 1939 reversed the Soviet Union’s stated foreign policy. It was too much for many of the fellow-travellers like Gollancz [Orwell’s sometime publisher] who had put their faith in a strategy of constructing Popular Front governments and the peace bloc between Russia, Britain and France.”[45]

The description of Emmanuel Goldstein, with a “small, goatee beard”, evokes the image of Leon Trotsky. The film of Goldstein during the Two Minutes Hate is described as showing him being transformed into a bleating sheep. This image was used in a propaganda film during the Kino-eye period of Soviet film, which showed Trotsky transforming into a goat.[46] Goldstein’s book is similar to Trotsky’s highly critical analysis of the USSR, The Revolution Betrayed, published in 1936.

The omnipresent images of Big Brother, a man described as having a moustache, bear resemblance to the cult of personality built up around Joseph Stalin.

The news in Oceania emphasised production figures, just as it did in the Soviet Union, where record-setting in factories (by “Heroes of Socialist Labor”) was especially glorified. The best known of these was Alexey Stakhanov, who purportedly set a record for coal mining in 1935.

The tortures of the Ministry of Love evoke the procedures used by the NKVD in their interrogations,[47] including the use of rubber truncheons, being forbidden to put one’s hands in one’s pockets, remaining in brightly lit rooms for days, torture through the use of provoked rodents, and the victim being shown a mirror after their physical collapse.

The random bombing of Airstrip One is based on the Buzz bombs and the V-2 rocket, which struck England at random in 1944–1945.

The Thought Police is based on the NKVD, which arrested people for random “anti-Soviet” remarks.[48] The thoughtcrime motif is drawn from the Kempeitai, the Japanese wartime secret police, who arrested people for “unpatriotic” thoughts.

The confessions of the “Thought Criminals” Rutherford, Aaronson and Jones are based on the show trials of the 1930s, which included fabricated confessions by prominent Bolsheviks Nikolai Bukharin, Grigory Zinoviev and Lev Kamenev to the effect that they were being paid by the Nazi government to undermine the Soviet regime under Leon Trotsky’s direction.

The song “Under the Spreading Chestnut Tree” (“Under the spreading chestnut tree, I sold you, and you sold me”) was based on an old English song called “Go no more a-rushing” (“Under the spreading chestnut tree, Where I knelt upon my knee, We were as happy as could be, ‘Neath the spreading chestnut tree.”). The song was published as early as 1891. The song was a popular camp song in the 1920s, sung with corresponding movements (like touching your chest when you sing “chest”, and touching your head when you sing “nut”). Glenn Miller recorded the song in 1939.[49]

The “Hates” (Two Minutes Hate and Hate Week) were inspired by the constant rallies sponsored by party organs throughout the Stalinist period. These were often short pep-talks given to workers before their shifts began (Two Minutes Hate), but could also last for days, as in the annual celebrations of the anniversary of the October revolution (Hate Week).

Orwell fictionalized “newspeak”, “doublethink”, and “Ministry of Truth” as evinced by both the Soviet press and that of Nazi Germany.[50] In particular, he adapted Soviet ideological discourse constructed to ensure that public statements could not be questioned.[51]

Winston Smith’s job of “revising history” (and the “unperson” motif) is based on the Stalinist habit of airbrushing images of ‘fallen’ people from group photographs and removing references to them in books and newspapers.[53] In one well-known example, the Soviet encyclopaedia had an article about Lavrentiy Beria. When he fell in 1953 and was subsequently executed, institutes that had the encyclopaedia were sent an article about the Bering Strait, with instructions to paste it over the article about Beria.[54]

Big Brother’s “Orders of the Day” were inspired by Stalin’s regular wartime orders, called by the same name. A small collection of the more political of these has been published (together with his wartime speeches) in English as “On the Great Patriotic War of the Soviet Union” by Joseph Stalin.[55][56] Like Big Brother’s Orders of the Day, Stalin’s frequently lauded heroic individuals,[57] much like Comrade Ogilvy, the fictitious hero Winston Smith invented to ‘rectify’ (fabricate) a Big Brother Order of the Day.

The Ingsoc slogan “Our new, happy life”, repeated from telescreens, evokes Stalin’s 1935 statement, which became a CPSU slogan, “Life has become better, Comrades; life has become more cheerful.”[48]

In 1940 Argentine writer Jorge Luis Borges published “Tlön, Uqbar, Orbis Tertius”, which described the invention by a “benevolent secret society” of a world that would seek to remake human language and reality along human-invented lines. The story concludes with an appendix describing the success of the project. Borges’ story addresses themes of epistemology, language and history similar to those of Nineteen Eighty-Four.[58]

During World War II, Orwell believed that British democracy as it existed before 1939 would not survive the war, the question being whether it would end via Fascist coup d’état from above or via Socialist revolution from below.[citation needed] Later, he admitted that events proved him wrong: “What really matters is that I fell into the trap of assuming that ‘the war and the revolution are inseparable’.”[59]

Nineteen Eighty-Four (1949) and Animal Farm (1945) share themes of the betrayed revolution, the person’s subordination to the collective, rigorously enforced class distinctions (Inner Party, Outer Party, Proles), the cult of personality, concentration camps, Thought Police, compulsory regimented daily exercise, and youth leagues. Oceania resulted from the US annexation of the British Empire to counter the Asian peril to Australia and New Zealand. It is a naval power whose militarism venerates the sailors of the floating fortresses, from which battle is given to recapturing India, the “Jewel in the Crown” of the British Empire. Much of Oceanic society is based upon the USSR under Joseph Stalin (Big Brother). The televised Two Minutes Hate is ritual demonisation of the enemies of the State, especially Emmanuel Goldstein (viz Leon Trotsky). Altered photographs and newspaper articles create unpersons deleted from the national historical record, including even founding members of the regime (Jones, Aaronson and Rutherford) in the 1960s purges (viz the Soviet Purges of the 1930s, in which leaders of the Bolshevik Revolution were similarly treated). Something similar occurred during the French Revolution, in which many of its original leaders were later put to death: Danton was executed by Robespierre, who then met the same fate himself.

In his 1946 essay “Why I Write”, Orwell explains that the serious works he wrote since the Spanish Civil War (1936–39) were “written, directly or indirectly, against totalitarianism and for democratic socialism”.[3][60] Nineteen Eighty-Four is a cautionary tale about revolution betrayed by totalitarian defenders previously proposed in Homage to Catalonia (1938) and Animal Farm (1945), while Coming Up for Air (1939) celebrates the personal and political freedoms lost in Nineteen Eighty-Four (1949). Biographer Michael Shelden notes Orwell’s Edwardian childhood at Henley-on-Thames as the golden country; being bullied at St Cyprian’s School as his empathy with victims; his life in the Indian Imperial Police in Burma and the techniques of violence and censorship in the BBC as capricious authority.[61]

Other influences include Darkness at Noon (1940) and The Yogi and the Commissar (1945) by Arthur Koestler; The Iron Heel (1908) by Jack London; 1920: Dips into the Near Future[62] by John A. Hobson; Brave New World (1932) by Aldous Huxley; We (1921) by Yevgeny Zamyatin which he reviewed in 1946;[63] and The Managerial Revolution (1940) by James Burnham predicting perpetual war among three totalitarian superstates. Orwell told Jacintha Buddicom that he would write a novel stylistically like A Modern Utopia (1905) by H. G. Wells.[citation needed]

Extrapolating from World War II, the novel’s pastiche parallels the politics and rhetoric at war’s end: the changed alliances at the beginning of the Cold War (1945–91); the Ministry of Truth derives from the BBC’s overseas service, controlled by the Ministry of Information; Room 101 derives from a conference room at BBC Broadcasting House;[64] the Senate House of the University of London, containing the Ministry of Information, is the architectural inspiration for the Minitrue; the post-war decrepitude derives from the socio-political life of the UK and the US, i.e., the impoverished Britain of 1948 losing its Empire despite newspaper-reported imperial triumph; and war ally but peace-time foe, Soviet Russia, became Eurasia.

The term “English Socialism” has precedents in his wartime writings; in the essay “The Lion and the Unicorn: Socialism and the English Genius” (1941), he said that “the war and the revolution are inseparable…the fact that we are at war has turned Socialism from a textbook word into a realisable policy” because Britain’s superannuated social class system hindered the war effort and only a socialist economy would defeat Adolf Hitler. Since the middle class would grasp this too, they would abide socialist revolution, and only reactionary Britons would oppose it, thus limiting the force revolutionaries would need to take power. An English Socialism would come about which “will never lose touch with the tradition of compromise and the belief in a law that is above the State. It will shoot traitors, but it will give them a solemn trial beforehand and occasionally it will acquit them. It will crush any open revolt promptly and cruelly, but it will interfere very little with the spoken and written word.”[65]

In the world of Nineteen Eighty-Four, “English Socialism” (or “Ingsoc” in Newspeak) is a totalitarian ideology unlike the English revolution he foresaw. Comparison of the wartime essay “The Lion and the Unicorn” with Nineteen Eighty-Four shows that he perceived a Big Brother regime as a perversion of his cherished socialist ideals and English Socialism. Thus Oceania is a corruption of the British Empire he believed would evolve “into a federation of Socialist states, like a looser and freer version of the Union of Soviet Republics”.[66][verification needed]

When first published, Nineteen Eighty-Four was generally well received by reviewers. V. S. Pritchett, reviewing the novel for the New Statesman stated: “I do not think I have ever read a novel more frightening and depressing; and yet, such are the originality, the suspense, the speed of writing and withering indignation that it is impossible to put the book down.”[67] P. H. Newby, reviewing Nineteen Eighty-Four for The Listener magazine, described it as “the most arresting political novel written by an Englishman since Rex Warner’s The Aerodrome.”[68] Nineteen Eighty-Four was also praised by Bertrand Russell, E. M. Forster and Harold Nicolson.[68] On the other hand, Edward Shanks, reviewing Nineteen Eighty-Four for The Sunday Times, was dismissive; Shanks claimed Nineteen Eighty-Four “breaks all records for gloomy vaticination”.[68] C. S. Lewis was also critical of the novel, claiming that the relationship of Julia and Winston, and especially the Party’s view on sex, lacked credibility, and that the setting was “odious rather than tragic”.[69]

Nineteen Eighty-Four has been adapted for the cinema, radio, television and theatre at least twice each, as well as for other art media, such as ballet and opera.

The effect of Nineteen Eighty-Four on the English language is extensive; the concepts of Big Brother, Room 101, the Thought Police, thoughtcrime, unperson, memory hole (oblivion), doublethink (simultaneously holding and believing contradictory beliefs) and Newspeak (ideological language) have become common phrases for denoting totalitarian authority. Doublespeak and groupthink are both deliberate elaborations of doublethink, and the adjective “Orwellian” means similar to Orwell’s writings, especially Nineteen Eighty-Four. The practice of ending words with “-speak” (such as mediaspeak) is drawn from the novel.[70] Orwell is perpetually associated with 1984; in July 1984, an asteroid was discovered by Antonín Mrkos and named after Orwell.

References to the themes, concepts and plot of Nineteen Eighty-Four have appeared frequently in other works, especially in popular music and video entertainment. An example is the worldwide hit reality television show Big Brother, in which a group of people live together in a large house, isolated from the outside world but continuously watched by television cameras.

The book touches on the invasion of privacy and ubiquitous surveillance. From mid-2013 it was publicized that the NSA has been secretly monitoring and storing global internet traffic, including the bulk data collection of email and phone call data. Sales of Nineteen Eighty-Four increased by up to seven times within the first week of the 2013 mass surveillance leaks.[79][80][81] The book again topped the Amazon.com sales charts in 2017 after a controversy involving Kellyanne Conway using the phrase “alternative facts” to explain discrepancies with the media.[82][83][84][85]

The book also shows mass media as a catalyst for the intensification of destructive emotions and violence. Since the 20th century, news and other forms of media have been publicizing violence more often.[86][87] In 2013, the Almeida Theatre and Headlong staged a successful new adaptation (by Robert Icke and Duncan Macmillan), which twice toured the UK and played an extended run in London’s West End. The play opened on Broadway in 2017.

In the decades since the publication of Nineteen Eighty-Four, there have been numerous comparisons to Aldous Huxley’s novel Brave New World, which had been published 17 years earlier, in 1932.[88][89][90][91] They are both predictions of societies dominated by a central government and are both based on extensions of the trends of their times. However, members of the ruling class of Nineteen Eighty-Four use brutal force, torture and mind control to keep individuals in line, but rulers in Brave New World keep the citizens in line by addictive drugs and pleasurable distractions.

In October 1949, after reading Nineteen Eighty-Four, Huxley sent a letter to Orwell arguing that it would be more efficient for rulers to stay in power by a softer touch: allowing citizens to seek out pleasure, and granting them a false sense of freedom, rather than controlling them by brute force:

Within the next generation I believe that the world’s rulers will discover that infant conditioning and narco-hypnosis are more efficient, as instruments of government, than clubs and prisons, and that the lust for power can be just as completely satisfied by suggesting people into loving their servitude as by flogging and kicking them into obedience.[92]

Elements of both novels can be seen in modern-day societies, with Huxley’s vision being more dominant in the West and Orwell’s vision more prevalent with dictators in ex-communist countries, as is pointed out in essays that compare the two novels, including Huxley’s own Brave New World Revisited.[93][94][95][85]

Comparisons with other dystopian novels like The Handmaid’s Tale, Virtual Light, The Private Eye and Children of Men have also been drawn.[96][97]

Genetic engineering | HHMI BioInteractive

A new gene can be inserted into a loop of bacterial DNA called a plasmid. This is done by cutting the plasmid DNA with a restriction enzyme, which allows a new piece of DNA to be inserted. The ends of the new piece of DNA are stitched together by an enzyme called DNA ligase. The genetically engineered bacteria will now manufacture any protein coded by genes on the newly inserted DNA.
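The cut-and-splice procedure described above can be sketched as a toy string manipulation (a minimal illustration, not part of the HHMI material; the plasmid and gene sequences here are invented, and only EcoRI’s recognition sequence, GAATTC, is a real restriction-enzyme site):

```python
# Toy model of inserting a gene into a plasmid: a restriction enzyme
# cuts the DNA at its recognition site, and ligation (modelled here as
# string concatenation) joins the new fragment into the gap.
ECORI_SITE = "GAATTC"  # EcoRI's real recognition sequence

def insert_gene(plasmid: str, gene: str) -> str:
    """Cut the plasmid at its first EcoRI site and splice in the gene."""
    cut = plasmid.find(ECORI_SITE)
    if cut == -1:
        raise ValueError("plasmid has no EcoRI site to cut")
    cut += 1  # EcoRI cuts between the G and the AATTC
    # Concatenation stands in for DNA ligase joining the fragment ends.
    return plasmid[:cut] + gene + plasmid[cut:]

recombinant = insert_gene("ATGCGAATTCGGTA", "TTTAAA")
print(recombinant)  # ATGCGTTTAAAAATTCGGTA
```

A single string cannot capture double-stranded DNA, sticky-end base pairing, or insert orientation; the sketch only shows the cut-then-ligate logic of the paragraph.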

Genetic engineering in science fiction – Wikipedia

In literature and especially in science fiction, genetic engineering has been used as a theme or a plot device in many stories.[1][2]

In his 1924 essay Daedalus, or Science and the Future, J. B. S. Haldane predicted a day when biologists would invent new algae to feed the world and ectogenetic children would be created and modified using eugenic selection. Aldous Huxley developed these ideas in a satirical direction for his 1932 novel Brave New World, in which ectogenetic embryos were developed in selected environments to create children of an ‘Alpha’, ‘Beta’, or ‘Gamma’ type.[3]

The advent of large-scale genetic engineering has increased its presence in fiction.[4][5] Genetics research consortia, such as the Wellcome Trust Sanger Institute, have felt the need to distinguish genetic engineering fact from fiction in explaining their work to the public,[1] and have explored the role that genetic engineering has played in the public perception of programs, such as the Human Genome Project.[6]

Beyond the usual library catalog classifications,[7] the Wellcome Trust Sanger Institute[1] and the NHGRI[6] have compiled catalogs of literature in various media with genetics and genetic engineering as a theme or plot device. Such compilations are also available at fan sites.[8]

In the 2000 television series Andromeda, the Nietzscheans (Homo sapiens invictus in Latin) are a race of genetically engineered humans who religiously follow the works of Friedrich Nietzsche, social Darwinism and Dawkinite genetic competitiveness. They claim to be physically perfect and are distinguished by bone blades protruding outwards from the wrist area.

In the book 2312 by Kim Stanley Robinson, genetic engineering of humans, plants and animals and how that affects a society spread over the solar system is explored.

In the Animorphs book series, a race of aliens known as the Hork-Bajir were engineered by a race known as the Arns. The Iskhoots are another example of genetic engineering: the outer body, the Isk, was created by the Yoort, who also modified themselves to be symbiotic with the Isk. A being known as the Ellimist also created species such as the Pemalites by this method.

In the 1983 film Anna to the Infinite Power, the main character was one of seven genetically cloned humans created by Anna Zimmerman as a way to groom a perfect person in her image. After her death, her work was carried on by her successor Dr. Henry Jelliff, who had other plans for the project. But in the end we learn that her original genetic creation, Michaela Dupont, has already acquired her creator’s abilities, including how to build a genetic replicator from scratch.

The 1996 video game series Resident Evil involves the creation of genetically engineered viruses which turn humans and animals into organisms such as zombies, the Tyrants or Hunters by a worldwide pharmaceutical company called the Umbrella Corporation.

In the video game series BioShock, most of the enemies in both BioShock and BioShock 2, referred to as “splicers”, as well as the player, gain superpowers and enhance their physical and mental capabilities by means of genetically engineered plasmids, created by use of ADAM stem cells secreted by a species of sea slug.[9]

The novel Beggars in Spain by Nancy Kress and its sequels are widely recognized by science fiction critics as among the most sophisticated fictional treatments of genetic engineering. They portray genetically-engineered characters whose abilities are far greater than those of ordinary humans (e.g. they are effectively immortal and they function without needing to sleep). At issue is what responsibility they have to use their abilities to help “normal” human beings. Kress explores libertarian and more collectivist philosophies, attempting to define the extent of people’s mutual responsibility for each other’s welfare.

In the Battletech science fiction series, the Clans have developed a genetic engineering program for their warriors, consisting of eugenics and the use of artificial wombs.

In The Champion Maker, a novel by Kevin Joseph, a track coach and a teenage phenom stumble upon a dark conspiracy involving genetic engineering while pursuing Olympic gold.

In the CoDominium series, the planet Sauron develops a supersoldier program. The results were the Sauron Cyborgs and the Sauron soldiers. The Cyborgs, who made up only a very small part of the population of Sauron, were part highly genetically engineered human and part machine. Cyborgs held very high status in Sauron society.

Sauron soldiers, who made up the balance of the population, were the result of generations of genetic engineering. They had a variety of physical characteristics and abilities that made them the best in combat and survival in many hostile environments. For instance, their bones were stronger than those of unmodified humans, and their lungs extracted oxygen more efficiently, allowing them to exert themselves without getting short of breath and to function at high altitudes. Sauron soldiers could also change the focal length of their eyes, so that they could “zoom” in on a distant object, much like an eagle.

The alien Moties also have used genetic engineering.

In the science fiction series Crest of the Stars, the Abh are a race of genetically engineered humans, who continue to practice the technology. All Abh have been adapted to live in zero-gravity environments, with the same features such as beauty, long life, lifelong youthful appearance, blue hair, and a “space sensory organ”.

In the 2000 TV series Dark Angel, the main character Max is one of a group of genetically engineered supersoldiers spliced with feline DNA.

In the 1993 military science fiction television series Exosquad, the plot revolves around the conflict between Terrans (baseline humans) and Neosapiens, a race of genetically engineered sentient (and sterile) humanoids who were originally bred for slave labour but revolted under the leadership of Phaeton and captured the Homeworlds (Earth, Venus and Mars). During the war, various sub-broods of Neosapiens were invented, such as Neo Megas (intellectually superior to almost any being in the Solar System), Neo Warriors (cross-breeds with various animals) and Neo Lords (the ultimate supersoldiers).

Genetic modification is also found in the 2002 anime series Gundam SEED. It features enhanced humans called Coordinators who were created from ordinary humans through genetic modification.

In Marvel Comics, the 31st century adventurers called the Guardians of the Galaxy are genetically engineered residents of Mercury, Jupiter, and Pluto.

The 1997 film Gattaca deals with the idea of genetic engineering and eugenics as it projects what class relations would look like in a future society after a few generations of the possibility of genetic engineering.

In Marvel Comics, the Inhumans are the result of genetic engineering of early humans by the Kree alien race.

Rather than deliberate engineering, this 2017 novel by British author Steve Turnbull features a plague that carries genetic material across species, causing a wide variety of mutations. Human attempts to control this plague have resulted in a fascist dystopia.

In the Leviathan universe, a group known as the Darwinists use genetically engineered animals as weapons.

The 2000 AD strip Lobster Random features a former soldier turned torturer who has been modified to feel no pain and need no sleep, and who has a pair of lobster claws grafted to his hips. This state has left him somewhat grouchy.

In Metal Gear Solid, the Genome Army were given gene therapy enhancements.

Also in the series, the Les Enfants Terribles project involved genetic engineering.

The Moreau series by S. Andrew Swann has as the central premise the proliferation of humanoid genetically-engineered animals. The name of the series (and of the creatures themselves) comes from the H. G. Wells novel The Island of Dr. Moreau. In the Wells novel, humanoid animals were created surgically, though this detail has been changed to be genetic manipulation in most film adaptations.

The Neanderthal Parallax novel by Robert J. Sawyer depicts a eugenic society that has benefitted immensely from the sterilization of dangerous criminals as well as preventing the 5% least intelligent from procreating for ten generations.

In the Neon Genesis Evangelion anime series, the character Rei Ayanami is implied to be a lab-created being combining human and angelic DNA. (compare to the Biblical Nephilim)

Genetic engineering (or something very like it) features prominently in Last and First Men, a 1930 novel by Olaf Stapledon.

Genetic engineering is depicted as widespread in the civilized world of Oryx and Crake. Prior to the apocalypse, though, its use among humans is not mentioned. Author Margaret Atwood describes many transgenic creatures such as Pigoons (though originally designed to be harvested for organs, post-apocalyptic-plague, they become more intelligent and vicious, traveling in packs), Snats (snake-rat hybrids who may or may not be extinct), wolvogs (wolf-dog hybrids), and the relatively harmless “rakunks” (skunk-raccoon hybrids, originally designed as pets with no scent glands).

In Plague, a 1978 film, a bacterium in an agricultural experiment accidentally escapes from a research laboratory in Canada, reaching the American Northeast and Great Britain.

Using a method similar to the DNA Resequencer from Stargate SG-1, and even called DNA Resequencing, the Operation Overdrive Power Rangers were given powers of superhuman strength, enhanced hearing, enhanced eyesight, super bouncing, super speed, and invisibility.

Quake II and Quake 4, released in 1997 and 2005, contain genetically-engineered Stroggs.

In the long-running series Rogue Trooper, the eponymous hero is a Genetic Infantryman, one of an elite group of supersoldiers genetically modified to resist the poisons left in the Nu-Earth atmosphere by decades of war. The concept originated in the pages of the 1980s cult sci-fi comic 2000 AD (of Judge Dredd fame).

James Blish’s The Seedling Stars (1956) is the classic story of controlled mutation for adaptability. In this novel (originally a series of short stories) the Adapted Men are reshaped human beings, designed for life on a variety of other planets. This is one of science fiction’s most unreservedly optimistic accounts to date of technological efforts to reshape human beings.

In The Simpsons episode “The Man Who Grew Too Much” (2014), Sideshow Bob steals DNA from a GMO company, thus making himself the very first genetically engineered human, and attempts to combine his DNA with that of the smartest people ever to exist on Earth.

In Sleeper, a 1973 parody of many science fiction tropes, genetically modified crops are shown to grow gigantic.

The short-lived 1990s television series Space: Above and Beyond includes a race of genetically engineered and artificially gestated humans who are born at the physical age of 18, and are collectively known as In Vitros or sometimes, derogatorily, “tanks” or “nipple-necks”. At the time of the series storyline, this artificial human race was integrated with the parent species, but significant discrimination still occurred.

The Ultimate Life Form project that produced Shadow the Hedgehog and Biolizard in the Sonic the Hedgehog series was a genetic engineering project.

In the Star Trek universe, genetic engineering has featured in a couple of films, and a number of television episodes.

The Breen, the Dominion, Species 8472, the Xindi, and the Federation use technology with organic components.

Khan Noonien Singh, who appeared in Space Seed and Star Trek II: The Wrath of Khan, was a product of genetic engineering. His physical structure was modified to make him stronger and to give him greater stamina than a regular human. His mind was also enhanced. However, the creation of Khan had serious consequences, because the superior abilities given to him created superior ambition. Along with other enhanced individuals, he tried to take over the planet. When they were reawakened by the Enterprise, Khan set himself to taking over the universe. Later, he became consumed by grief and rage, and set himself on the goal of destroying Kirk.

Others of these genetically enhanced augments wreaked havoc in the 22nd century, and eventually some of their enhanced DNA was blended with Klingon DNA, creating the human-looking Klingons of the early 23rd century (See Star Trek: Enterprise episodes “Affliction” and “Divergence”).

Because of the experiences with genetic engineering, the Federation had banned it except to correct genetic birth defects, but a number of parents still illegally subjected their children to genetic engineering for a variety of reasons. This often created brilliant but unstable individuals. Such children are not allowed to serve in Starfleet or practice medicine, though Julian Bashir is a notable exception to this. Despite the ban, the Federation allowed the Darwin station to conduct human genetic engineering, which resulted in telepathic, telekinetic humans with very effective immune systems.

In Attack of the Clones, the Kamino cloners who created the clone army for the Galactic Republic had used engineering to enhance their clones. They modified the genetic structure of all but one to accelerate their growth rate, make them less independent, and make them better suited to combat operations.

Later, the Yuuzhan Vong are a race who exclusively use organic technology and regard mechanical technology as heresy. Everything from starships to communications devices to weapons is bred and grown to suit their needs.

In the show Stargate SG-1, the DNA Resequencer was a device built by the Ancients, designed to make extreme upgrades to humans by realigning their DNA and upgrading their brain activity. The machine gave them superhuman abilities, such as telekinesis, telepathy, precognition, superhuman senses, strength, and intellect, the power to heal at an incredible rate, and the power to heal others by touch.

In the futuristic tabletop and video game series, Warhammer 40,000, the Imperium of Man uses genetic engineering to enhance the abilities of various militant factions such as the Space Marines, the Thunder Warriors, and the Adeptus Custodes. In the case of Space Marines, a series of synthesized, metamorphosis-inducing organs, known as gene seed, is made from the genome of the twenty original Primarchs and used to start the transformation of these superhuman warriors.

At the same time, the Tau Empire uses a form of eugenic breeding to improve the physical and mental condition of its various castes.

In the e-book, Methuselah’s Virus, an ageing pharmaceutical billionaire accidentally creates a contagious virus capable of infecting people with extreme longevity when his genetic engineering experiment goes wrong. The novel then examines the problem of what happens if Methuselah’s Virus is at risk of spreading to everyone on the entire planet.

In World Hunger, author Brian Kenneth Swain paints the harrowing picture of a life sciences company that field tests a new strain of genetically modified crop, the unexpected side effect of which is the creation of several new species of large and very aggressive insects.

Genetic engineering is an essential theme of the illustrated book Man After Man: An Anthropology of the Future by Dougal Dixon, where it is used to colonize other star systems and save the humans of Earth from extinction.

The Survival Gene e-book presents author Artsun Akopyan’s idea that people cannot preserve nature as it is forever, so in the future they will have to change their own genetics or die. In the novel, wave genetics is used to save humankind and all life on Earth.

The Uplift series of books by David Brin depicts humans encountering the Five Galaxies, a multitude of sentient species which all practice Uplift: raising species to sapience through genetic engineering. Humans, believing they rose to sapience through evolution alone, are seen as heretics, but they have some status because at the time of contact they had already Uplifted two species: chimpanzees and bottlenose dolphins.

Eugenics is a recurrent theme in science fiction, often with both dystopian and utopian elements. Among the giant contributions in this field is the novel Brave New World (1932) by Aldous Huxley, which describes a society where control of human biology by the state results in permanent social stratification.

There tends to be a eugenic undercurrent in the science fiction concept of the supersoldier. Several depictions of these supersoldiers usually have them bred for combat or genetically selected for attributes that are beneficial to modern or future combat.

The Brave New World theme also plays a role in the 1997 film Gattaca, whose plot turns around reprogenetics, genetic testing, and the social consequences of eugenics. Boris Vian (under the pseudonym Vernon Sullivan) takes a more light-hearted approach in his novel Et on tuera tous les affreux (“And we’ll kill all the ugly ones”).

Other novels touching upon the subject include The Gate to Women’s Country by Sheri S. Tepper and That Hideous Strength by C. S. Lewis. The Eugenics Wars are a significant part of the background story of the Star Trek universe (episodes “Space Seed”, “Borderland”, “Cold Station 12”, “The Augments” and the film Star Trek II: The Wrath of Khan). Eugenics also plays a significant role in the Neanderthal Parallax trilogy, where eugenics-practicing Neanderthals from a near-utopian parallel world create a gateway to earth. Cowl by Neal Asher describes the collapse of western civilization due to dysgenics. Eugenics is also the name of the medical company in Enki Bilal’s book La Foire aux immortels and in his film Immortel (Ad Vitam).

In Frank Herbert’s Dune series of novels, selective breeding programs form a significant theme. Early in the series, the Bene Gesserit religious order manipulates breeding patterns over many generations in order to create the Kwisatz Haderach. In God Emperor of Dune, the emperor Leto II again manipulates human breeding in order to achieve his own ends. The Bene Tleilaxu also employed genetic engineering to create human beings with specific genetic attributes. The Dune series ended with causal determinism playing a large role in the development of behavior, but the eugenics theme remained a crucial part of the story.

In Orson Scott Card’s novel Ender’s Game, Ender is only allowed to be conceived because of a special government exception due to his parents’ high intelligence and the extraordinary performance of his siblings. In Ender’s Shadow, Bean is a test-tube baby and the result of a failed eugenics experiment aimed at creating child geniuses.

In the novels Methuselah’s Children and Time Enough for Love by Robert A. Heinlein, a large trust fund is created to give financial encouragement to marriage among people (the Howard Families) whose parents and grandparents were long lived. The result is a subset of Earth’s population who has significantly above-average life spans. Members of this group appear in many of the works by the same author.

In the 1982 Robert Heinlein novel Friday, the main character has been genetically engineered from multiple sets of donors, including, as she finds out later, her boss. These enhancements give her superior strength, speed, and eyesight, in addition to rapid healing and other advanced attributes. Creations like her are known as APs (Artificial Persons).

In Eoin Colfer’s book The Supernaturalist, Ditto is a Bartoli Baby, which is the name for a failed experiment of the famed Dr. Bartoli. Bartoli tried to create a superior race of humans, but they ended in arrested development, with mutations including extrasensory perception and healing hands.

In Larry Niven’s Ringworld series, the character Teela Brown is a result of several generations of winners of the “Birthright Lottery”, a system which attempts to encourage lucky people to breed, treating good luck as a genetic trait.

In season 2 of Dark Angel, the main ‘bad guy’ Ames White is a member of a cult known as the Conclave which has infiltrated various levels of society to breed super-humans. They are trying to exterminate all the Transgenics, including the main character Max Guevara, whom they view as being genetically unclean for having some animal DNA spliced with human.

In the movie Immortel (Ad Vitam), director and writer Enki Bilal named the evil, corrupt organization specializing in genetic manipulation and some very disturbing genetic “enhancement” Eugenics. Eugenics has become a powerful organization and uses people and mutants of “lesser” genetic stock as guinea pigs. The movie is based on the Nikopol trilogy in Heavy Metal comic books.

In the video game Grand Theft Auto: Vice City, a fictional character called Pastor Richards, a caricature of an extreme and insane televangelist, is featured as a guest on a discussion radio show about morality. On this show, he describes shooting people who do not agree with him and who are not “morally correct”, which the show’s host describes as “amateur eugenics”.

In the 2006 Mike Judge film Idiocracy, a fictional character, Pvt. Joe Bauers, aka Not Sure (played by Luke Wilson), awakens from cryogenic stasis in the year 2505 into a world devastated by dysgenic degeneration. Bauers, who was chosen for his averageness, is discovered to be the smartest human alive and eventually becomes president of the United States.

The manga series Battle Angel Alita and its sequel Battle Angel Alita: Last Order (Gunnm and Gunnm: Last Order as it is known in Japan) by Yukito Kishiro contains multiple references to the theme of eugenics. The most obvious is the sky city Tiphares (Salem in the Japanese edition). Dr. Desty Nova, in Volume 9 of the first series, reveals the eugenic nature of the city to Alita (Gally or Yoko), and it is further explored in the sequel series. A James Cameron movie based on the series is due for release in 2018.[10]

In the 2000 French police drama Crimson Rivers, inspector Pierre Niemans (played by Jean Reno) and his colleague Max Kerkerian (Vincent Cassel) attempt to solve a series of murders triggered by a eugenics experiment that had been going on for years in the university town of Guernon.

In the Cosmic Era universe of the Gundam anime series (Mobile Suit Gundam SEED), war is fought between normal human beings without genetic enhancements, known as the Naturals, and the Coordinators, who are genetically enhanced. It explores the pros and cons, as well as possible repercussions, of eugenics.

The Khommites of planet Khomm practice this through the method of self-cloning, believing they are perfect.

The book Uglies, part of a four-book series by Scott Westerfeld, revolves around a girl named Tally who lives in a world where everyone at the age of sixteen receives extensive cosmetic surgery to turn into “Pretties” and join society. Although it deals with extreme cosmetic surgery, the utopian (or dystopian, depending on one’s interpretation) ideals in the book are similar to those present in the books mentioned above.

Genetic engineering in science fiction – Wikipedia

War on drugs – Wikipedia

War on Drugs is an American term[6][7] usually applied to the U.S. federal government’s campaign of prohibition of drugs, military aid, and military intervention, with the stated aim being to reduce the illegal drug trade.[8][9] The initiative includes a set of drug policies that are intended to discourage the production, distribution, and consumption of psychoactive drugs that the participating governments and the UN have made illegal. The term was popularized by the media shortly after a press conference given on June 18, 1971, by President Richard Nixon, the day after publication of a special message from President Nixon to the Congress on Drug Abuse Prevention and Control, during which he declared drug abuse “public enemy number one”. That message to the Congress included text about devoting more federal resources to the “prevention of new addicts, and the rehabilitation of those who are addicted”, but that part did not receive the same public attention as the term “war on drugs”.[10][11][12] However, two years prior to this, Nixon had formally declared a “war on drugs” that would be directed toward eradication, interdiction, and incarceration.[13] Today, the Drug Policy Alliance, which advocates for an end to the War on Drugs, estimates that the United States spends $51 billion annually on these initiatives.[14]

On May 13, 2009, Gil Kerlikowske, the Director of the Office of National Drug Control Policy (ONDCP), signaled that the Obama administration did not plan to significantly alter drug enforcement policy, but also that the administration would not use the term “War on Drugs”, because Kerlikowske considers the term to be “counter-productive”.[15] ONDCP’s view is that “drug addiction is a disease that can be successfully prevented and treated… making drugs more available will make it harder to keep our communities healthy and safe”.[16] One of the alternatives that Kerlikowske has showcased is the drug policy of Sweden, which seeks to balance public health concerns with opposition to drug legalization. The prevalence rates for cocaine use in Sweden are barely one-fifth of those in Spain, the biggest consumer of the drug.[17]

In June 2011, the Global Commission on Drug Policy released a critical report on the War on Drugs, declaring: “The global war on drugs has failed, with devastating consequences for individuals and societies around the world. Fifty years after the initiation of the UN Single Convention on Narcotic Drugs, and years after President Nixon launched the US government’s war on drugs, fundamental reforms in national and global drug control policies are urgently needed.”[18] The report was criticized by organizations that oppose a general legalization of drugs.[16]

The first U.S. law that restricted the distribution and use of certain drugs was the Harrison Narcotics Tax Act of 1914. The first local laws came as early as 1860.[19] In 1919, the United States passed the 18th Amendment, prohibiting the sale, manufacture, and transportation of alcohol, with exceptions for religious and medical use. In 1920, the United States passed the National Prohibition Act (Volstead Act), enacted to carry out the provisions in law of the 18th Amendment.

The Federal Bureau of Narcotics was established in the United States Department of the Treasury by an act of June 14, 1930 (46 Stat. 585).[20] In 1933, the federal prohibition for alcohol was repealed by passage of the 21st Amendment. In 1935, President Franklin D. Roosevelt publicly supported the adoption of the Uniform State Narcotic Drug Act. The New York Times used the headline “Roosevelt Asks Narcotic War Aid”.[21][22]

In 1937, the Marihuana Tax Act of 1937 was passed. Several scholars have claimed that the goal was to destroy the hemp industry,[23][24][25] largely as an effort of businessmen Andrew Mellon, William Randolph Hearst, and the Du Pont family.[23][25] These scholars argue that with the invention of the decorticator, hemp became a very cheap substitute for the paper pulp that was used in the newspaper industry.[23][26] These scholars believe that Hearst felt[dubious – discuss] that this was a threat to his extensive timber holdings. Mellon, United States Secretary of the Treasury and the wealthiest man in America, had invested heavily in DuPont’s new synthetic fiber, nylon, and considered[dubious – discuss] its success to depend on its replacement of the traditional resource, hemp.[23][27][28][29][30][31][32][33] However, there were circumstances that contradict these claims. One reason for doubt is that the new decorticators did not perform fully satisfactorily in commercial production.[34] Producing fiber from hemp was a labor-intensive process when harvest, transport, and processing are included. Technological developments reduced the labor involved, but not enough to eliminate this disadvantage.[35][36]

On October 27, 1970, Congress passed the Comprehensive Drug Abuse Prevention and Control Act of 1970, which, among other things, categorized controlled substances based on their medicinal use and potential for addiction.[37] In 1971, two congressmen released an explosive report on the growing heroin epidemic among U.S. servicemen in Vietnam; ten to fifteen percent of the servicemen were addicted to heroin, and President Nixon declared drug abuse to be “public enemy number one”.[37][38]

Although Nixon declared “drug abuse” to be public enemy number one in 1971,[39] the policies that his administration implemented as part of the Comprehensive Drug Abuse Prevention and Control Act of 1970 were a continuation of drug prohibition policies in the U.S., which started in 1914.[37][40]

“The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. You understand what I’m saying? We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.” – John Ehrlichman, to Dan Baum[41][42][43] for Harper’s Magazine[44] in 1994, about President Richard Nixon’s war on drugs, declared in 1971.[45][46]

In 1973, the Drug Enforcement Administration was created to replace the Bureau of Narcotics and Dangerous Drugs.[37]

The Nixon Administration also repealed the federal 2–10-year mandatory minimum sentences for possession of marijuana and started federal demand reduction programs and drug-treatment programs. Robert DuPont, the “Drug czar” in the Nixon Administration, stated it would be more accurate to say that Nixon ended, rather than launched, the “war on drugs”. DuPont also argued that it was the proponents of drug legalization that popularized the term “war on drugs”.[16][unreliable source?]

In 1982, Vice President George H. W. Bush and his aides began pushing for the involvement of the CIA and U.S. military in drug interdiction efforts.[47]

The Office of National Drug Control Policy (ONDCP) was originally established by the National Narcotics Leadership Act of 1988,[48][49] which mandated a national anti-drug media campaign for youth, which would later become the National Youth Anti-Drug Media Campaign.[50] The director of ONDCP is commonly known as the Drug czar;[37] the position was first filled in 1989 under President George H. W. Bush[51] and raised to cabinet-level status by Bill Clinton in 1993.[52] These activities were subsequently funded by the Treasury and General Government Appropriations Act of 1998.[53][54] The Drug-Free Media Campaign Act of 1998 codified the campaign at 21 U.S.C. § 1708.[55]

The Global Commission on Drug Policy released a report on June 2, 2011, alleging that “The War On Drugs Has Failed.” The commission was made up of 22 self-appointed members, including a number of prominent international politicians and writers. U.S. Surgeon General Regina Benjamin also released the first-ever National Prevention Strategy.[56]

On May 21, 2012, the U.S. Government published an updated version of its Drug Policy.[57] The director of ONDCP stated simultaneously that this policy is something different from the “War on Drugs”:

At the same meeting, a declaration along the same lines was signed by the representatives of Italy, the Russian Federation, Sweden, the United Kingdom, and the United States: “Our approach must be a balanced one, combining effective enforcement to restrict the supply of drugs, with efforts to reduce demand and build recovery; supporting people to live a life free of addiction.”[59]

In March 2016 the International Narcotics Control Board stated that the International Drug Control treaties do not mandate a “war on drugs.”[60]

According to Human Rights Watch, the War on Drugs caused soaring arrest rates that disproportionately targeted African Americans due to various factors.[62] John Ehrlichman, an aide to Nixon, said that Nixon used the war on drugs to criminalize and disrupt black and hippie communities and their leaders.[63]

The present state of incarceration in the U.S. as a result of the war on drugs arrived in several stages. By 1971, various drug prohibitions had been in place for more than 50 years (e.g., since 1914 and 1937) with only a very small increase in inmates per 100,000 citizens. During the first 9 years after Nixon coined the expression “War on Drugs”, statistics showed only a minor increase in the total number of people imprisoned.

After 1980, the situation began to change. In the 1980s, while the number of arrests for all crimes had risen by 28%, the number of arrests for drug offenses rose 126%.[64] The result of increased demand was the development of privatization and the for-profit prison industry.[65] The US Department of Justice, reporting on the effects of state initiatives, has stated that, from 1990 through 2000, “the increasing number of drug offenses accounted for 27% of the total growth among black inmates, 7% of the total growth among Hispanic inmates, and 15% of the growth among white inmates.” In addition to prison or jail, the United States provides for the deportation of many non-citizens convicted of drug offenses.[66]

In 1994, the New England Journal of Medicine reported that the “War on Drugs” resulted in the incarceration of one million Americans each year.[67] In 2008, the Washington Post reported that of 1.5 million Americans arrested each year for drug offenses, half a million would be incarcerated.[68] In addition, one in five black Americans would spend time behind bars due to drug laws.[68]

Federal and state policies also impose collateral consequences on those convicted of drug offenses, such as denial of public benefits or licenses, that are not applicable to those convicted of other types of crime.[69] In particular, the passage of the 1990 Solomon–Lautenberg amendment led many states to impose mandatory driver’s license suspensions (of at least 6 months) for persons committing a drug offense, regardless of whether any motor vehicle was involved.[70][71] Approximately 191,000 licenses were suspended in this manner in 2016, according to a Prison Policy Initiative report.[72]

In 1986, the U.S. Congress passed laws that created a 100 to 1 sentencing disparity for the trafficking or possession of crack when compared to penalties for trafficking of powder cocaine,[73][74][75][76] which had been widely criticized as discriminatory against minorities, mostly blacks, who were more likely to use crack than powder cocaine.[77] This 100:1 ratio had been required under federal law since 1986.[78] Persons convicted in federal court of possession of 5 grams of crack cocaine received a minimum mandatory sentence of 5 years in federal prison, while possession of 500 grams of powder cocaine carried the same sentence.[74][75] In 2010, the Fair Sentencing Act cut the sentencing disparity to 18:1.[77]
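The 100:1 and 18:1 figures follow directly from the quantity thresholds that trigger the same five-year mandatory minimum. A minimal sketch of the arithmetic (the post-2010 crack threshold of 28 grams is an added detail from the Fair Sentencing Act, not stated above):

```python
# Sentencing-disparity ratios implied by the quantity thresholds.

def disparity(powder_grams: float, crack_grams: float) -> float:
    """Ratio of powder to crack quantities that trigger the same mandatory minimum."""
    return powder_grams / crack_grams

# Pre-2010: 5 g of crack vs 500 g of powder carried the same 5-year minimum.
pre_2010 = disparity(500, 5)    # 100.0, the "100:1" ratio

# Post-2010: the Fair Sentencing Act raised the crack threshold to 28 g.
post_2010 = disparity(500, 28)  # ~17.9, described as "18:1"

print(pre_2010, round(post_2010))
```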

According to Human Rights Watch, crime statistics show that, in the United States in 1999, compared to non-minorities, African Americans were far more likely to be arrested for drug crimes, and received much stiffer penalties and sentences.[79]

Statistics from 1998 show that there were wide racial disparities in arrests, prosecutions, sentencing, and deaths. African-American drug users made up 35% of drug arrests, 55% of convictions, and 74% of people sent to prison for drug possession crimes.[74] Nationwide, African-Americans were sent to state prisons for drug offenses 13 times more often than other races,[80] even though they comprised only an estimated 13% of regular drug users.[74]

Anti-drug legislation over time has also displayed an apparent racial bias. University of Minnesota Professor and social justice author Michael Tonry writes, “The War on Drugs foreseeably and unnecessarily blighted the lives of hundreds of thousands of young disadvantaged black Americans and undermined decades of effort to improve the life chances of members of the urban black underclass.”[81]

In 1968, President Lyndon B. Johnson decided that the government needed to make an effort to curtail the social unrest that blanketed the country at the time. He decided to focus his efforts on illegal drug use, an approach which was in line with expert opinion on the subject at the time. In the 1960s, it was believed that at least half of the crime in the U.S. was drug related, and this number grew as high as 90 percent in the next decade.[82] He created the Reorganization Plan of 1968 which merged the Bureau of Narcotics and the Bureau of Drug Abuse to form the Bureau of Narcotics and Dangerous Drugs within the Department of Justice.[83] The belief during this time about drug use was summarized by journalist Max Lerner in his celebrated[citation needed] work America as a Civilization (1957):

As a case in point we may take the known fact of the prevalence of reefer and dope addiction in Negro areas. This is essentially explained in terms of poverty, slum living, and broken families, yet it would be easy to show the lack of drug addiction among other ethnic groups where the same conditions apply.[84]

Richard Nixon became president in 1969, and did not back away from the anti-drug precedent set by Johnson. Nixon began orchestrating drug raids nationwide to improve his “watchdog” reputation. Lois B. Defleur, a social historian who studied drug arrests during this period in Chicago, stated that “police administrators indicated they were making the kind of arrests the public wanted”. Additionally, some of Nixon’s newly created drug enforcement agencies would resort to illegal practices to make arrests as they tried to meet public demand for arrest numbers. From 1972 to 1973, the Office of Drug Abuse and Law Enforcement performed 6,000 drug arrests in 18 months, the majority of those arrested being black.[85]

The next two Presidents, Gerald Ford and Jimmy Carter, responded with programs that were essentially a continuation of their predecessors’. Shortly after Ronald Reagan became President in 1981, he delivered a speech on the topic. Reagan announced, “We’re taking down the surrender flag that has flown over so many drug efforts; we’re running up a battle flag.”[86] For his first five years in office, Reagan slowly strengthened drug enforcement by creating mandatory minimum sentencing and forfeiture of cash and real estate for drug offenses, policies far more detrimental to poor blacks than any other sector affected by the new laws.[citation needed]

Then, driven by the 1986 cocaine overdose of black basketball star Len Bias,[dubious – discuss] Reagan was able to pass the Anti-Drug Abuse Act through Congress. This legislation appropriated an additional $1.7 billion to fund the War on Drugs. More importantly, it established 29 new mandatory minimum sentences for drug offenses. In the entire history of the country up until that point, the legal system had seen only 55 minimum sentences in total.[87] A major stipulation of the new sentencing rules included different mandatory minimums for powder and crack cocaine. At the time of the bill, there was public debate as to the difference in potency and effect of powder cocaine, generally used by whites, and crack cocaine, generally used by blacks, with many believing that “crack” was substantially more powerful and addictive. Crack and powder cocaine are closely related chemicals, crack being a smokeable, freebase form of powdered cocaine hydrochloride which produces a shorter, more intense high while using less of the drug. This method is more cost-effective, and therefore more prevalent on the inner-city streets, while powder cocaine remains more popular in white suburbia. The Reagan administration began shoring up public opinion against “crack”, encouraging DEA official Robert Putnam to play up the harmful effects of the drug. Stories of “crack whores” and “crack babies” became commonplace; by 1986, Time had declared “crack” the issue of the year.[88] Riding the wave of public fervor, Reagan established much harsher sentencing for crack cocaine, handing down stiffer felony penalties for much smaller amounts of the drug.[89]

Reagan protégé and former Vice President George H. W. Bush was next to occupy the Oval Office, and the drug policy under his watch held true to his political background. Bush maintained the hard line drawn by his predecessor and former boss, increasing narcotics regulation when the First National Drug Control Strategy was issued by the Office of National Drug Control Policy in 1989.[90]

The next three presidents, Clinton, Bush, and Obama, continued this trend, maintaining the War on Drugs as they inherited it upon taking office.[91] During this time of passivity by the federal government, it was the states that initiated controversial legislation in the War on Drugs. Racial bias manifested itself in the states through such controversial policies as the “stop and frisk” police practices in New York City and the “three strikes” felony laws begun in California in 1994.[92]

In August 2010, President Obama signed the Fair Sentencing Act into law that dramatically reduced the 100-to-1 sentencing disparity between powder and crack cocaine, which disproportionately affected minorities.[93]

Commonly used illegal drugs include heroin, cocaine, methamphetamine, and marijuana.

Heroin is an opiate that is highly addictive. If caught selling or possessing heroin, a perpetrator can be charged with a felony, face two to four years in prison, and be fined up to $20,000.[94]

Crystal meth is composed of methamphetamine hydrochloride. It is marketed as either a white powder or in a solid (rock) form. The possession of crystal meth can result in a punishment varying from a fine to a jail sentence. As with other drug crimes, sentencing length may increase depending on the amount of the drug found in the possession of the defendant.[95]

Cocaine possession is illegal across the U.S., with the cheaper crack cocaine incurring even greater penalties. Possession means the accused knowingly has the drug on their person or in a bag such as a backpack or purse. For a first offense of cocaine possession with no prior conviction, the person will be sentenced to a maximum of one year in prison, fined $1,000, or both. If the person has a prior conviction, whether for a narcotic or cocaine, they will be sentenced to two years in prison, a $2,500 fine, or both. With two or more possession convictions prior to the present offense, they can be sentenced to 90 days in prison along with a $5,000 fine.[96]

Marijuana is the most popular illegal drug worldwide. The punishment for possessing it is less than for possession of cocaine or heroin. In some U.S. states, the drug is legal. Over 80 million Americans have tried marijuana. The Criminal Defense Lawyer article claims that, depending on the person's age and how much the person has been caught possessing, they will be fined and may be able to plea bargain into a treatment program rather than going to prison. The penalties upon conviction differ from state to state, and depend on how much marijuana the person has on them.[97]

Some scholars have claimed that the phrase “War on Drugs” is propaganda cloaking an extension of earlier military or paramilitary operations.[9] Others have argued that large amounts of “drug war” foreign aid money, training, and equipment actually goes to fighting leftist insurgencies and is often provided to groups who themselves are involved in large-scale narco-trafficking, such as corrupt members of the Colombian military.[8]

From 1963 to the end of the Vietnam War in 1975, marijuana usage became common among U.S. soldiers in non-combat situations. Some servicemen also used heroin, and many came home addicted, though most ended their heroin use after returning to the United States. In 1971, the U.S. military conducted a study of drug use among American servicemen and women. It found that daily usage rates for drugs on a worldwide basis were as low as two percent.[98] However, in the spring of 1971, two congressmen released an alarming report alleging that 15% of the servicemen in Vietnam were addicted to heroin. Marijuana use was also common in Vietnam. Soldiers who used drugs had more disciplinary problems. The frequent drug use had become an issue for the commanders in Vietnam; in 1971 it was estimated that 30,000 servicemen were addicted to drugs, most of them to heroin.[11]

From 1971 on, therefore, returning servicemen were required to take a mandatory heroin test. Servicemen who tested positive upon returning from Vietnam were not allowed to return home until they had passed the test with a negative result. The program also offered a treatment for heroin addicts.[99]

Elliot Borin’s article “The U.S. Military Needs Its Speed”, published in Wired on February 10, 2003, reports:

But the Defense Department, which distributed millions of amphetamine tablets to troops during World War II, Vietnam and the Gulf War, soldiers on, insisting that they are not only harmless but beneficial.

In a news conference held in connection with Schmidt and Umbach’s Article 32 hearing, Dr. Pete Demitry, an Air Force physician and a pilot, claimed that the “Air Force has used (Dexedrine) safely for 60 years” with “no known speed-related mishaps.”

The need for speed, Demitry added “is a life-and-death issue for our military.”[100]

One of the first anti-drug efforts in the realm of foreign policy was President Nixon’s Operation Intercept, announced in September 1969, targeted at reducing the amount of cannabis entering the United States from Mexico. The effort began with an intense inspection crackdown that resulted in a near-shutdown of cross-border traffic.[101] Because the burden on border crossings was controversial in border states, the effort lasted only twenty days.[102]

On December 20, 1989, the United States invaded Panama as part of Operation Just Cause, which involved 25,000 American troops. Gen. Manuel Noriega, head of the government of Panama, had been giving military assistance to Contra groups in Nicaragua at the request of the U.S., which, in exchange, tolerated his drug trafficking activities, which they had known about since the 1960s.[103][104] When the Drug Enforcement Administration (DEA) tried to indict Noriega in 1971, the CIA prevented them from doing so.[103] The CIA, which was then directed by future president George H. W. Bush, provided Noriega with hundreds of thousands of dollars per year as payment for his work in Latin America.[103] When CIA pilot Eugene Hasenfus was shot down over Nicaragua by the Sandinistas, documents aboard the plane revealed many of the CIA’s activities in Latin America, and the CIA’s connections with Noriega became a public relations “liability” for the U.S. government, which finally allowed the DEA to indict him for drug trafficking, after decades of tolerating his drug operations.[103] The purpose of Operation Just Cause was to capture Noriega and overthrow his government; Noriega found temporary asylum in the Papal Nunciature, and surrendered to U.S. soldiers on January 3, 1990.[105] He was sentenced by a court in Miami to 45 years in prison.[103]

As part of its Plan Colombia program, the United States government currently provides hundreds of millions of dollars per year of military aid, training, and equipment to Colombia,[106] to fight left-wing guerrillas such as the Revolutionary Armed Forces of Colombia (FARC-EP), which has been accused of being involved in drug trafficking.[107]

Private U.S. corporations have signed contracts to carry out anti-drug activities as part of Plan Colombia. DynCorp, the largest private company involved, was among those contracted by the State Department, while others signed contracts with the Defense Department.[108]

Colombian military personnel have received extensive counterinsurgency training from U.S. military and law enforcement agencies, including the School of the Americas (SOA). Author Grace Livingstone has stated that more Colombian SOA graduates have been implicated in human rights abuses than currently known SOA graduates from any other country. All of the commanders of the brigades highlighted in a 2001 Human Rights Watch report on Colombia were graduates of the SOA, including the III Brigade in Valle del Cauca, where the 2001 Alto Naya Massacre occurred. US-trained officers have been accused of being directly or indirectly involved in many atrocities during the 1990s, including the Massacre of Trujillo and the 1997 Mapiripán Massacre.

In 2000, the Clinton administration initially waived all but one of the human rights conditions attached to Plan Colombia, considering such aid as crucial to national security at the time.[109]

The efforts of U.S. and Colombian governments have been criticized for focusing on fighting leftist guerrillas in southern regions without applying enough pressure on right-wing paramilitaries and continuing drug smuggling operations in the north of the country.[110][111] Human Rights Watch, congressional committees and other entities have documented the existence of connections between members of the Colombian military and the AUC, which the U.S. government has listed as a terrorist group, and that Colombian military personnel have committed human rights abuses which would make them ineligible for U.S. aid under current laws.[citation needed]

In 2010, the Washington Office on Latin America concluded that both Plan Colombia and the Colombian government’s security strategy “came at a high cost in lives and resources, only did part of the job, are yielding diminishing returns and have left important institutions weaker.”[112]

A 2014 report by the RAND Corporation, which was issued to analyze viable strategies for the Mexican drug war considering successes experienced in Colombia, noted:

Between 1999 and 2002, the United States gave Colombia $2.04 billion in aid, 81 percent of which was for military purposes, placing Colombia just below Israel and Egypt among the largest recipients of U.S. military assistance. Colombia increased its defense spending from 3.2 percent of gross domestic product (GDP) in 2000 to 4.19 percent in 2005. Overall, the results were extremely positive. Greater spending on infrastructure and social programs helped the Colombian government increase its political legitimacy, while improved security forces were better able to consolidate control over large swaths of the country previously overrun by insurgents and drug cartels.

It also notes that, “Plan Colombia has been widely hailed as a success, and some analysts believe that, by 2010, Colombian security forces had finally gained the upper hand once and for all.”[113]

The Mérida Initiative is a security cooperation agreement between the United States, the government of Mexico, and the countries of Central America. It was approved on June 30, 2008, and its stated aim is combating the threats of drug trafficking and transnational crime. The Mérida Initiative appropriated $1.4 billion in a three-year commitment (2008–2010) to the Mexican government for military and law enforcement training and equipment, as well as technical advice and training to strengthen the national justice systems. The Mérida Initiative targeted many high-level government officials, but it failed to address the thousands of Central Americans who had to flee their countries due to the danger they faced every day because of the war on drugs. There is still not any type of plan that addresses these people. No weapons are included in the plan.[114][115]

The United States regularly sponsors the spraying of large amounts of herbicides such as glyphosate over the jungles of Central and South America as part of its drug eradication programs. Environmental consequences resulting from aerial fumigation have been criticized as detrimental to some of the world’s most fragile ecosystems;[116] the same aerial fumigation practices are further credited with causing health problems in local populations.[117]

In 2012, the U.S. sent DEA agents to Honduras to assist security forces in counternarcotics operations. Honduras has been a major stop for drug traffickers, who use small planes and landing strips hidden throughout the country to transport drugs. The U.S. government made agreements with several Latin American countries to share intelligence and resources to counter the drug trade. DEA agents, working with other U.S. agencies such as the State Department, the CBP, and Joint Task Force-Bravo, assisted Honduras troops in conducting raids on traffickers’ sites of operation.[118]

The War on Drugs has been a highly contentious issue since its inception. A poll on October 2, 2008, found that three in four Americans believed that the War On Drugs was failing.[119]

At a meeting in Guatemala in 2012, three former presidents from Guatemala, Mexico and Colombia said that the war on drugs had failed and that they would propose a discussion on alternatives, including decriminalization, at the Summit of the Americas in April of that year.[120] Guatemalan President Otto Pérez Molina said that the war on drugs was exacting too high a price on the lives of Central Americans and that it was time to “end the taboo on discussing decriminalization”.[121] At the summit, the government of Colombia pushed for the most far-reaching change to drugs policy since the war on narcotics was declared by Nixon four decades prior, citing the catastrophic effects it had had in Colombia.[122]

Several critics have compared the wholesale incarceration of the dissenting minority of drug users to the wholesale incarceration of other minorities in history. Psychiatrist Thomas Szasz, for example, wrote in 1997: “Over the past thirty years, we have replaced the medical-political persecution of illegal sex users (‘perverts’ and ‘psychopaths’) with the even more ferocious medical-political persecution of illegal drug users.”[123]

Penalties for drug crimes among American youth almost always involve permanent or semi-permanent removal from opportunities for education, the stripping of voting rights, and the creation of criminal records which make employment more difficult.[124] Thus, some authors maintain that the War on Drugs has resulted in the creation of a permanent underclass of people who have few educational or job opportunities, often as a result of being punished for drug offenses which in turn have resulted from attempts to earn a living in spite of having no education or job opportunities.[124]

According to a 2008 study published by Harvard economist Jeffrey A. Miron, the annual savings on enforcement and incarceration costs from the legalization of drugs would amount to roughly $41.3 billion, with $25.7 billion being saved among the states and over $15.6 billion accrued for the federal government. Miron further estimated at least $46.7 billion in tax revenue based on rates comparable to those on tobacco and alcohol ($8.7 billion from marijuana, $32.6 billion from cocaine and heroin, remainder from other drugs).[125]

Low taxation in Central American countries has been credited with weakening the region’s response in dealing with drug traffickers. Many cartels, especially Los Zetas, have taken advantage of the limited resources of these nations. In 2010, tax revenue in El Salvador, Guatemala, and Honduras composed just 13.53% of GDP. By comparison, in Chile and the U.S., taxes were 18.6% and 26.9% of GDP respectively. However, direct taxes on income are very hard to enforce, and in some cases tax evasion is seen as a national pastime.[126]

The status of coca and coca growers has become an intense political issue in several countries, including Colombia and particularly Bolivia, where the president, Evo Morales, a former coca growers’ union leader, has promised to legalise the traditional cultivation and use of coca.[127] Indeed, legalization efforts have yielded some successes under the Morales administration when combined with aggressive and targeted eradication efforts. The country saw a 12–13% decline in coca cultivation[127] in 2011 under Morales, who has used coca growers’ federations to ensure compliance with the law rather than providing a primary role for security forces.[127]

The coca eradication policy has been criticised for its negative impact on the livelihood of coca growers in South America. In many areas of South America the coca leaf has traditionally been chewed and used in tea and for religious, medicinal and nutritional purposes by locals.[128] For this reason many insist that the illegality of traditional coca cultivation is unjust. In many areas the U.S. government and military has forced the eradication of coca without providing for any meaningful alternative crop for farmers, and has additionally destroyed many of their food or market crops, leaving them starving and destitute.[128]

The CIA, DEA, State Department, and several other U.S. government agencies have been alleged to have relations with various groups which are involved in drug trafficking.

Senator John Kerry’s 1988 U.S. Senate Committee on Foreign Relations report on Contra drug links concludes that members of the U.S. State Department “who provided support for the Contras are involved in drug trafficking… and elements of the Contras themselves knowingly receive financial and material assistance from drug traffickers.”[129] The report further states that “the Contra drug links include… payments to drug traffickers by the U.S. State Department of funds authorized by the Congress for humanitarian assistance to the Contras, in some cases after the traffickers had been indicted by federal law enforcement agencies on drug charges, in others while traffickers were under active investigation by these same agencies.”

In 1996, journalist Gary Webb published reports in the San Jose Mercury News, and later in his book Dark Alliance, detailing how Contras had been involved in distributing crack cocaine into Los Angeles while receiving money from the CIA.[citation needed] Contras used money from drug trafficking to buy weapons.[citation needed]

Webb’s premise regarding the U.S. Government connection was initially attacked at the time by the media. It is now widely accepted that Webb’s main assertion of government “knowledge of drug operations, and collaboration with and protection of known drug traffickers” was correct.[130][not in citation given] In 1998, CIA Inspector General Frederick Hitz published a two-volume report[131] that, while seemingly refuting Webb’s claims of knowledge and collaboration in its conclusions, did not deny them in its body.[citation needed] Hitz went on to admit CIA improprieties in the affair in testimony to a House congressional committee. There has been a reversal amongst mainstream media of its position on Webb’s work, with acknowledgement made of his contribution to exposing a scandal it had ignored.

According to Rodney Campbell, an editorial assistant to Nelson Rockefeller, during World War II, the United States Navy, concerned that strikes and labor disputes in U.S. eastern shipping ports would disrupt wartime logistics, released the mobster Lucky Luciano from prison, and collaborated with him to help the mafia take control of those ports. Labor union members were terrorized and murdered by mafia members as a means of preventing labor unrest and ensuring smooth shipping of supplies to Europe.[132]

According to Alexander Cockburn and Jeffrey St. Clair, in order to prevent Communist party members from being elected in Italy following World War II, the CIA worked closely with the Sicilian Mafia, protecting them and assisting in their worldwide heroin smuggling operations. The mafia was in conflict with leftist groups and was involved in assassinating, torturing, and beating leftist political organizers.[133]

In 1986, the US Defense Department funded a two-year study by the RAND Corporation, which found that the use of the armed forces to interdict drugs coming into the United States would have little or no effect on cocaine traffic and might, in fact, raise the profits of cocaine cartels and manufacturers. The 175-page study, “Sealing the Borders: The Effects of Increased Military Participation in Drug Interdiction”, was prepared by seven researchers, mathematicians and economists at the National Defense Research Institute, a branch of RAND, and was released in 1988. The study noted that seven prior studies in the past nine years, including one by the Center for Naval Research and the Office of Technology Assessment, had come to similar conclusions. Interdiction efforts, using current armed forces resources, would have almost no effect on cocaine importation into the United States, the report concluded.[135]

During the early-to-mid-1990s, the Clinton administration ordered and funded a major cocaine policy study, again by RAND. The RAND Drug Policy Research Center study concluded that $3 billion should be switched from federal and local law enforcement to treatment. The report said that treatment is the cheapest way to cut drug use, stating that drug treatment is twenty-three times more effective than the supply-side “war on drugs”.[136]

The National Research Council Committee on Data and Research for Policy on Illegal Drugs published its findings in 2001 on the efficacy of the drug war. The NRC Committee found that existing studies on efforts to address drug usage and smuggling, from U.S. military operations to eradicate coca fields in Colombia, to domestic drug treatment centers, have all been inconclusive, if the programs have been evaluated at all: “The existing drug-use monitoring systems are strikingly inadequate to support the full range of policy decisions that the nation must make…. It is unconscionable for this country to continue to carry out a public policy of this magnitude and cost without any way of knowing whether and to what extent it is having the desired effect.”[137] The study, though not ignored by the press, was ignored by top-level policymakers, leading Committee Chair Charles Manski to conclude, as one observer notes, that “the drug war has no interest in its own results”.[138]

In mid-1995, the US government tried to reduce the supply of methamphetamine precursors to disrupt the market of this drug. According to a 2009 study, this effort was successful, but its effects were largely temporary.[139]

During alcohol prohibition, the period from 1920 to 1933, alcohol use initially fell but began to increase as early as 1922. It has been extrapolated that even if prohibition had not been repealed in 1933, alcohol consumption would have quickly surpassed pre-prohibition levels.[140] One argument against the War on Drugs is that it uses similar measures as Prohibition and is no more effective.

In the six years from 2000 to 2006, the U.S. spent $4.7 billion on Plan Colombia, an effort to eradicate coca production in Colombia. The main result of this effort was to shift coca production into more remote areas and force other forms of adaptation. The overall acreage cultivated for coca in Colombia at the end of the six years was found to be the same, after the U.S. Drug Czar’s office announced a change in measuring methodology in 2005 and included new areas in its surveys.[141] Cultivation in the neighboring countries of Peru and Bolivia increased, an effect some describe as akin to squeezing a balloon.[142]

Richard Davenport-Hines, in his book The Pursuit of Oblivion,[143] criticized the efficacy of the War on Drugs by pointing out that

10–15% of illicit heroin and 30% of illicit cocaine is intercepted. Drug traffickers have gross profit margins of up to 300%. At least 75% of illicit drug shipments would have to be intercepted before the traffickers’ profits were hurt.
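The quoted 75% threshold follows directly from the 300% margin by simple arithmetic. A minimal sketch of that calculation (the function name and the cost-unit model are illustrative, not from Davenport-Hines):

```python
def breakeven_interception(gross_margin_pct: float) -> float:
    """Fraction of shipments that must be seized before sales revenue
    on the surviving shipments no longer covers the cost of all goods shipped."""
    markup = 1 + gross_margin_pct / 100  # revenue per delivered unit, in cost units
    return 1 - 1 / markup

# A 300% gross margin means each delivered unit sells for 4x its cost,
# so profit disappears only once 3 of every 4 shipments are intercepted.
print(breakeven_interception(300))  # 0.75
```

Against that threshold, the 10–15% (heroin) and 30% (cocaine) interception rates cited above fall far short of hurting traffickers' profits.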

Alberto Fujimori, president of Peru from 1990 to 2000, described U.S. foreign drug policy as “failed” on grounds that “for 10 years, there has been a considerable sum invested by the Peruvian government and another sum on the part of the American government, and this has not led to a reduction in the supply of coca leaf offered for sale. Rather, in the 10 years from 1980 to 1990, it grew 10-fold.”[144]

At least 500 economists, including Nobel Laureates Milton Friedman,[145] George Akerlof and Vernon L. Smith, have noted that reducing the supply of marijuana without reducing the demand causes the price, and hence the profits of marijuana sellers, to go up, according to the laws of supply and demand.[146] The increased profits encourage the producers to produce more drugs despite the risks, providing a theoretical explanation for why attacks on drug supply have failed to have any lasting effect. The aforementioned economists published an open letter to President George W. Bush stating “We urge…the country to commence an open and honest debate about marijuana prohibition… At a minimum, this debate will force advocates of current policy to show that prohibition has benefits sufficient to justify the cost to taxpayers, foregone tax revenues and numerous ancillary consequences that result from marijuana prohibition.”
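The economists' supply-and-demand argument can be illustrated with a toy model. Assuming constant-elasticity demand Q = k·P^(−e), with e < 1 for an inelastic (addictive) good, a supply crackdown that raises the price also raises sellers' total revenue. All parameter values below are illustrative, not taken from the economists' letter:

```python
def seller_revenue(price: float, k: float = 100.0, elasticity: float = 0.5) -> float:
    """Total revenue P*Q under constant-elasticity demand Q = k * P**(-e)."""
    quantity = k * price ** (-elasticity)
    return price * quantity

# Doubling the price (e.g. after a supply crackdown) under inelastic
# demand (e = 0.5) increases sellers' total revenue instead of cutting it.
assert seller_revenue(2.0) > seller_revenue(1.0)
```

This is the mechanism behind the letter's claim: as long as demand is untouched and inelastic, attacks on supply raise prices and, with them, the rewards for producing more.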

The declaration from the 2008 World Forum Against Drugs states that a balanced policy of drug abuse prevention, education, treatment, law enforcement, research, and supply reduction provides the most effective platform to reduce drug abuse and its associated harms, and calls on governments to consider demand reduction as one of their first priorities in the fight against drug abuse.[147]

Despite over $7 billion spent annually towards arresting[148] and prosecuting nearly 800,000 people across the country for marijuana offenses in 2005[citation needed] (FBI Uniform Crime Reports), the federally funded Monitoring the Future Survey reports about 85% of high school seniors find marijuana “easy to obtain”. That figure has remained virtually unchanged since 1975, never dropping below 82.7% in three decades of national surveys.[149] The Drug Enforcement Administration states that the number of users of marijuana in the U.S. declined between 2000 and 2005 even with many states passing new medical marijuana laws making access easier,[150] though usage rates remain higher than they were in the 1990s according to the National Survey on Drug Use and Health.[151]

ONDCP stated in April 2011 that there has been a 46 percent drop in cocaine use among young adults over the past five years, and a 65 percent drop in the rate of people testing positive for cocaine in the workplace since 2006.[152] At the same time, a 2007 study found that up to 35% of college undergraduates used stimulants not prescribed to them.[153]

A 2013 study found that prices of heroin, cocaine and cannabis had decreased from 1990 to 2007, but the purity of these drugs had increased during the same time.[154]

The War on Drugs is often called a policy failure.[155][156][157][158][159]

The legality of the War on Drugs has been challenged on four main grounds in the U.S.

Several authors believe that the United States’ federal and state governments have chosen wrong methods for combating the distribution of illicit substances. Aggressive, heavy-handed enforcement funnels individuals through courts and prisons; instead of treating the cause of the addiction, the focus of government efforts has been on punishment. By making drugs illegal rather than regulating them, the War on Drugs creates a highly profitable black market. Jefferson Fish has edited scholarly collections of articles offering a wide variety of public-health-based and rights-based alternative drug policies.[160][161][162]

In the year 2000, the United States drug-control budget reached 18.4 billion dollars,[163] nearly half of which was spent financing law enforcement while only one sixth was spent on treatment. In the year 2003, 53 percent of the requested drug control budget was for enforcement, 29 percent for treatment, and 18 percent for prevention.[164] The state of New York, in particular, designated 17 percent of its budget towards substance-abuse-related spending. Of that, a mere one percent was put towards prevention, treatment, and research.

In a survey taken by Substance Abuse and Mental Health Services Administration (SAMHSA), it was found that substance abusers that remain in treatment longer are less likely to resume their former drug habits. Of the people that were studied, 66 percent were cocaine users. After experiencing long-term in-patient treatment, only 22 percent returned to the use of cocaine. Treatment had reduced the number of cocaine abusers by two-thirds.[163] By spending the majority of its money on law enforcement, the federal government had underestimated the true value of drug-treatment facilities and their benefit towards reducing the number of addicts in the U.S.

In 2004 the federal government issued the National Drug Control Strategy. It supported programs designed to expand treatment options, enhance treatment delivery, and improve treatment outcomes. For example, the Strategy provided SAMHSA with a $100.6 million grant to put towards their Access to Recovery (ATR) initiative. ATR is a program that provides vouchers to addicts to provide them with the means to acquire clinical treatment or recovery support. The project’s goals are to expand capacity, support client choice, and increase the array of faith-based and community-based providers for clinical treatment and recovery support services.[165] The ATR program will also provide a more flexible array of services based on the individual’s treatment needs.

The 2004 Strategy additionally declared a significant $32 million increase in the Drug Courts Program, which provides drug offenders with alternatives to incarceration. As a substitute for imprisonment, drug courts identify substance-abusing offenders and place them under strict court monitoring and community supervision, as well as provide them with long-term treatment services.[166] According to a report issued by the National Drug Court Institute, drug courts have a wide array of benefits, with only 16.4 percent of the nation’s drug court graduates rearrested and charged with a felony within one year of completing the program (versus the 44.1% of released prisoners who end up back in prison within one year). Additionally, enrolling an addict in a drug court program costs much less than incarcerating one in prison.[167] According to the Bureau of Prisons, the fee to cover the average cost of incarceration for Federal inmates in 2006 was $24,440.[168] The annual cost of receiving treatment in a drug court program ranges from $900 to $3,500. Drug courts in New York State alone saved $2.54 million in incarceration costs.[167]
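Using the figures above, the annual per-offender saving from diverting a federal inmate into a drug court program works out to roughly $21,000–$23,500. A back-of-the-envelope check (ignoring court administration overhead, which the cited figures do not break out):

```python
incarceration_cost = 24_440      # average annual cost per federal inmate, 2006 (BOP)
drug_court_costs = (900, 3_500)  # cited annual range for drug court treatment

# Annual saving per offender diverted, at the cheap and expensive ends of the range.
savings_range = tuple(incarceration_cost - c for c in drug_court_costs)
print(savings_range)  # (23540, 20940)
```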

Describing the failure of the War on Drugs, New York Times columnist Eduardo Porter noted:

Jeffrey Miron, an economist at Harvard who studies drug policy closely, has suggested that legalizing all illicit drugs would produce net benefits to the United States of some $65 billion a year, mostly by cutting public spending on enforcement as well as through reduced crime and corruption. A study by analysts at the RAND Corporation, a California research organization, suggested that if marijuana were legalized in California and the drug spilled from there to other states, Mexican drug cartels would lose about a fifth of their annual income of some $6.5 billion from illegal exports to the United States.[169]

Many believe that the War on Drugs has been costly and ineffective largely because inadequate emphasis is placed on treatment of addiction. The United States leads the world in both recreational drug usage and incarceration rates. 70% of men arrested in metropolitan areas test positive for an illicit substance,[170] and 54% of all men incarcerated will be repeat offenders.[171]

There are also programs in the United States to combat the public health risks faced by injecting drug users, such as needle exchange programs. A needle exchange program provides injecting drug users with new needles in exchange for used needles, to prevent needle sharing.

Covert activities and foreign policy

More here:

War on drugs – Wikipedia

The War on Drugs (band) – Wikipedia

The War on Drugs is an American indie rock band from Philadelphia, Pennsylvania, formed in 2005. The band consists of Adam Granduciel (lyrics, vocals, guitar), David Hartley (bass), Robbie Bennett (keyboards), Charlie Hall (drums), Jon Natchez (saxophone, keyboards) and Anthony LaMarca (guitar).

Founded by close collaborators Granduciel and Kurt Vile, The War on Drugs released their debut studio album, Wagonwheel Blues, in 2008. Vile departed shortly after its release to focus on his solo career. The band’s second studio album, Slave Ambient, was released in 2011 to favorable reviews and extensive touring.

The band’s third album, Lost in the Dream, was released in 2014 following extensive touring and a period of loneliness and depression for primary songwriter Granduciel. The album was released to widespread critical acclaim and increased exposure. Previous collaborator Hall joined the band as its full-time drummer during the recording process, with saxophonist Natchez and additional guitarist LaMarca accompanying the band for its world tour. Signing to Atlantic Records, the six-piece band released their fourth album, A Deeper Understanding, in 2017, which won the Grammy Award for Best Rock Album at the 60th Annual Grammy Awards.

In 2003, frontman Adam Granduciel moved from Oakland, California to Philadelphia, where he met Kurt Vile, who had also recently moved back to Philadelphia after living in Boston for two years.[2] The duo subsequently began writing, recording and performing music together.[3] Vile stated, “Adam was the first dude I met when I moved back to Philadelphia in 2003. We saw eye-to-eye on a lot of things. I was obsessed with Bob Dylan at the time, and we totally geeked-out on that. We started playing together in the early days and he would be in my band, The Violators. Then, eventually I played in The War On Drugs.”[4]

Granduciel and Vile began playing together as The War on Drugs in 2005. Regarding the band’s name, Granduciel noted, “My friend Julian and I came up with it a few years ago over a couple bottles of red wine and a few typewriters when we were living in Oakland. We were writing a lot back then, working on a dictionary, and it just came out and we were like ‘hey, good band name’ so eventually when I moved to Philadelphia and got a band together I used it. It was either that or The Rigatoni Danzas. I think we made the right choice. I always felt though that it was the kind of name I could record all sorts of different music under without any sort of predictability inherent in the name.”[5]

While Vile and Granduciel formed the backbone of the band, they had a number of accompanists early in the group’s career, before finally settling on a lineup that added Charlie Hall as drummer/organist, Kyle Lloyd as drummer and Dave Hartley on bass.[6] Granduciel had previously toured and recorded with The Capitol Years, and Vile had released several solo albums.[7] The group gave away its Barrel of Batteries EP for free early in 2008.[8] Their debut LP for Secretly Canadian, Wagonwheel Blues, was released in 2008.[9]

Following the album’s release, and subsequent European tour, Vile departed from the band to focus on his solo career, stating, “I only went on the first European tour when their album came out, and then I basically left the band. I knew if I stuck with that, it would be all my time and my goal was to have my own musical career.”[4] Fellow Kurt Vile & the Violators bandmate Mike Zanghi joined the band at this time, with Vile noting, “Mike was my drummer first and then when The War On Drugs’ first record came out I thought I was lending Mike to Adam for the European tour but then he just played with them all the time so I kind of had to like, while they were touring a lot, figure out my own thing.”[10]

The lineup underwent several changes, and by the end of 2008, Kurt Vile, Charlie Hall, and Kyle Lloyd had all exited the group. At that time Granduciel and Hartley were joined by drummer Mike Zanghi, whom Granduciel also played with in Kurt Vile’s backing band, the Violators.

After recording much of the band’s forthcoming studio album, Slave Ambient, Zanghi departed from the band in 2010. Drummer Steven Urgo subsequently joined the band, with keyboardist Robbie Bennett also joining at around this time. Regarding Zanghi’s exit, Granduciel noted: “I loved Mike, and I loved the sound of The Violators, but then he wasn’t really the sound of my band. But you have things like friendship, and he’s down to tour and he’s a great guy, but it wasn’t the sound of what this band was.”[11]

Slave Ambient was released to favorable reviews in 2011.[citation needed]

In 2012, Patrick Berkery replaced Urgo as the band’s drummer.[12]

On December 4, 2013, the band announced the upcoming release of its third studio album, Lost in the Dream (March 18, 2014). The band streamed the album in its entirety on NPR’s First Listen site for a week before its release.[13]

Lost in the Dream was featured as the Vinyl Me, Please record of the month in August 2014, in a limited-edition pressing on mint-green vinyl.

In June 2015, The War on Drugs signed with Atlantic Records for a two-album deal.[14]

On Record Store Day, April 22, 2017, The War on Drugs released a new single, “Thinking of a Place.”[15] The single was produced by frontman Granduciel and Shawn Everett.[16] On April 28, 2017, The War on Drugs announced a fall 2017 tour of North America and Europe and indicated that a new album was imminent.[17] On June 1, 2017, a new song, “Holding On”, was released, and it was announced that the album, titled A Deeper Understanding, would be released on August 25, 2017.[18]

The 2017 tour began in September, opening in the band’s hometown of Philadelphia, and concluded in November in Sweden.[19]

A Deeper Understanding was nominated for the International Album of the Year award at the 2018 UK Americana Awards.[20]

At the 60th Annual Grammy Awards on January 28, 2018, A Deeper Understanding won the Grammy for Best Rock Album.[21]

Granduciel and Zanghi are both former members of founding guitarist Vile’s backing band The Violators, with Granduciel noting, “There was never, despite what lazy journalists have assumed, any sort of falling out, or resentment”[22] following Vile’s departure from The War on Drugs. In 2011, Vile stated, “When my record came out, I assumed Adam would want to focus on The War On Drugs but he came with us in The Violators when we toured the States. The Violators became a unit, and although the cast does rotate, we’ve developed an even tighter unity and sound. Adam is an incredible guitar player these days and there is a certain feeling [between us] that nobody else can tap into. We don’t really have to tell each other what to play, it just happens.”

Both Hartley and Granduciel contributed to singer-songwriter Sharon Van Etten’s fourth studio album, Are We There (2014). Hartley performs bass guitar on the entire album, with Granduciel contributing guitar on two tracks.

Granduciel is also producing the new Sore Eros album, which they have been recording in Philadelphia and Los Angeles on and off for several years.[4]

In 2016, The War on Drugs contributed a cover of “Touch of Grey” for a Grateful Dead tribute album called Day of the Dead. The album was curated by The National’s Aaron and Bryce Dessner.[19]

Current members

Former members

Read the original here:

The War on Drugs (band) – Wikipedia

A Brief History of the Drug War | Drug Policy Alliance

This video from hip hop legend Jay Z and acclaimed artist Molly Crabapple depicts the drug war’s devastating impact on the Black community from decades of biased law enforcement.

The video traces the drug war from President Nixon to the draconian Rockefeller Drug Laws to the emerging aboveground marijuana market that is poised to make legal millions for wealthy investors doing the same thing that generations of people of color have been arrested and locked up for. After you watch the video, read on to learn more about the discriminatory history of the war on drugs.

Many currently illegal drugs, such as marijuana, opium, coca, and psychedelics, have been used for thousands of years for both medical and spiritual purposes. So why are some drugs legal and others illegal today? It’s not based on any scientific assessment of the relative risks of these drugs; it has everything to do with who is associated with them.

The first anti-opium laws in the 1870s were directed at Chinese immigrants. The first anti-cocaine laws in the early 1900s were directed at black men in the South. The first anti-marijuana laws, in the Midwest and the Southwest in the 1910s and 20s, were directed at Mexican migrants and Mexican Americans. Today, Latino and especially black communities are still subject to wildly disproportionate drug enforcement and sentencing practices.

In the 1960s, as drugs became symbols of youthful rebellion, social upheaval, and political dissent, the government halted scientific research to evaluate their medical safety and efficacy.

In June 1971, President Nixon declared a war on drugs. He dramatically increased the size and presence of federal drug control agencies, and pushed through measures such as mandatory sentencing and no-knock warrants.

A top Nixon aide, John Ehrlichman, later admitted: “You want to know what this was really all about? The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. You understand what I’m saying? We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.”

Nixon temporarily placed marijuana in Schedule One, the most restrictive category of drugs, pending review by a commission he appointed, led by Republican Pennsylvania Governor Raymond Shafer.

In 1972, the commission unanimously recommended decriminalizing the possession and distribution of marijuana for personal use. Nixon ignored the report and rejected its recommendations.

Between 1973 and 1977, however, eleven states decriminalized marijuana possession. In January 1977, President Jimmy Carter was inaugurated on a campaign platform that included marijuana decriminalization. In October 1977, the Senate Judiciary Committee voted to decriminalize possession of up to an ounce of marijuana for personal use.

Within just a few years, though, the tide had shifted. Proposals to decriminalize marijuana were abandoned as parents became increasingly concerned about high rates of teen marijuana use. Marijuana was ultimately caught up in a broader cultural backlash against the perceived permissiveness of the 1970s.

The presidency of Ronald Reagan marked the start of a long period of skyrocketing rates of incarceration, largely thanks to his unprecedented expansion of the drug war. The number of people behind bars for nonviolent drug law offenses increased from 50,000 in 1980 to over 400,000 by 1997.

Public concern about illicit drug use built throughout the 1980s, largely due to media portrayals of people addicted to the smokeable form of cocaine dubbed crack. Soon after Ronald Reagan took office in 1981, his wife, Nancy Reagan, began a highly-publicized anti-drug campaign, coining the slogan “Just Say No.”

This set the stage for the zero tolerance policies implemented in the mid-to-late 1980s. Los Angeles Police Chief Daryl Gates, who believed that casual drug users should be taken out and shot, founded the DARE drug education program, which was quickly adopted nationwide despite the lack of evidence of its effectiveness. The increasingly harsh drug policies also blocked the expansion of syringe access programs and other harm reduction policies to reduce the rapid spread of HIV/AIDS.

In the late 1980s, a political hysteria about drugs led to the passage of draconian penalties in Congress and state legislatures that rapidly increased the prison population. In 1985, the proportion of Americans polled who saw drug abuse as the nation’s “number one problem” was just 2-6 percent. The figure grew through the remainder of the 1980s until, in September 1989, it reached a remarkable 64 percent, one of the most intense fixations by the American public on any issue in polling history. Within less than a year, however, the figure plummeted to less than 10 percent, as the media lost interest. The draconian policies enacted during the hysteria remained, however, and continued to result in escalating levels of arrests and incarceration.

Although Bill Clinton advocated for treatment instead of incarceration during his 1992 presidential campaign, after his first few months in the White House he reverted to the drug war strategies of his Republican predecessors by continuing to escalate the drug war. Notoriously, Clinton rejected a U.S. Sentencing Commission recommendation to eliminate the disparity between crack and powder cocaine sentences.

He also rejected, with the encouragement of drug czar General Barry McCaffrey, Health Secretary Donna Shalala’s advice to end the federal ban on funding for syringe access programs. Yet, a month before leaving office, Clinton asserted in a Rolling Stone interview that “we really need a re-examination of our entire policy on imprisonment” of people who use drugs, and said that marijuana use “should be decriminalized.”

At the height of the drug war hysteria in the late 1980s and early 1990s, a movement emerged seeking a new approach to drug policy. In 1987, Arnold Trebach and Kevin Zeese founded the Drug Policy Foundation, describing it as “the loyal opposition to the war on drugs.” Prominent conservatives such as William Buckley and Milton Friedman had long advocated for ending drug prohibition, as had civil libertarians such as longtime ACLU Executive Director Ira Glasser. In the late 1980s they were joined by Baltimore Mayor Kurt Schmoke, Federal Judge Robert Sweet, Princeton professor Ethan Nadelmann, and other activists, scholars and policymakers.

In 1994, Nadelmann founded The Lindesmith Center as the first U.S. project of George Soros’s Open Society Institute. In 2000, the growing Center merged with the Drug Policy Foundation to create the Drug Policy Alliance.

George W. Bush arrived in the White House as the drug war was running out of steam, yet he allocated more money than ever to it. His drug czar, John Walters, zealously focused on marijuana and launched a major campaign to promote student drug testing. While rates of illicit drug use remained constant, overdose fatalities rose rapidly.

The era of George W. Bush also witnessed the rapid escalation of the militarization of domestic drug law enforcement. By the end of Bush’s term, there were about 40,000 paramilitary-style SWAT raids on Americans every year, mostly for nonviolent drug law offenses, often misdemeanors. While federal reform mostly stalled under Bush, state-level reforms finally began to slow the growth of the drug war.

Politicians now routinely admit to having used marijuana, and even cocaine, when they were younger. When Michael Bloomberg was questioned during his 2001 mayoral campaign about whether he had ever used marijuana, he said, “You bet I did and I enjoyed it.” Barack Obama also candidly discussed his prior cocaine and marijuana use: “When I was a kid, I inhaled frequently. That was the point.”

Public opinion has shifted dramatically in favor of sensible reforms that expand health-based approaches while reducing the role of criminalization in drug policy.

Marijuana reform has gained unprecedented momentum throughout the Americas. Alaska, California, Colorado, Nevada, Oregon, Maine, Massachusetts, Washington State, and Washington D.C. have legalized marijuana for adults. In December 2013, Uruguay became the first country in the world to legally regulate marijuana. In Canada, Prime Minister Justin Trudeau plans to legalize marijuana for adults by 2018.

In response to a worsening overdose epidemic, dozens of U.S. states passed laws to increase access to the overdose antidote, naloxone, as well as 911 Good Samaritan laws to encourage people to seek medical help in the event of an overdose.

Yet the assault on American citizens and others continues, with 700,000 people still arrested for marijuana offenses each year and almost 500,000 people still behind bars for nothing more than a drug law violation.

President Obama, despite supporting several successful policy changes (reducing the crack/powder sentencing disparity, ending the ban on federal funding for syringe access programs, and ending federal interference with state medical marijuana laws), did not shift the majority of drug policy funding to a health-based approach.

Now, the new administration is threatening to take us backward toward a 1980s-style drug war. President Trump is calling for a wall to keep drugs out of the country, and Attorney General Jeff Sessions has made it clear that he does not support the sovereignty of states to legalize marijuana, and believes that “good people don’t smoke marijuana.”

Progress is inevitably slow, and even with an administration hostile to reform there is still unprecedented momentum behind drug policy reform in states and localities across the country. The Drug Policy Alliance and its allies will continue to advocate for health-based reforms such as marijuana legalization, drug decriminalization, safe consumption sites, naloxone access, bail reform, and more.

We look forward to a future where drug policies are shaped by science and compassion rather than political hysteria.

Read more from the original source:

A Brief History of the Drug War | Drug Policy Alliance

War on Drugs | United States history | Britannica.com

War on Drugs, the effort in the United States since the 1970s to combat illegal drug use by greatly increasing penalties, enforcement, and incarceration for drug offenders.

The War on Drugs began in June 1971 when U.S. Pres. Richard Nixon declared drug abuse to be “public enemy number one” and increased federal funding for drug-control agencies and drug-treatment efforts. In 1973 the Drug Enforcement Administration was created out of the merger of the Office for Drug Abuse Law Enforcement, the Bureau of Narcotics and Dangerous Drugs, and the Office of Narcotics Intelligence to consolidate federal efforts to control drug abuse.

The War on Drugs was a relatively small component of federal law-enforcement efforts until the presidency of Ronald Reagan, which began in 1981. Reagan greatly expanded the reach of the drug war, and his focus on criminal punishment over treatment led to a massive increase in incarcerations for nonviolent drug offenses, from 50,000 in 1980 to 400,000 in 1997. In 1984 his wife, Nancy, spearheaded another facet of the War on Drugs with her Just Say No campaign, which was a privately funded effort to educate schoolchildren on the dangers of drug use. The expansion of the War on Drugs was in many ways driven by increased media coverage of, and resulting public nervousness over, the crack epidemic that arose in the early 1980s. This heightened concern over illicit drug use helped drive political support for Reagan’s hard-line stance on drugs. The U.S. Congress passed the Anti-Drug Abuse Act of 1986, which allocated $1.7 billion to the War on Drugs and established a series of mandatory minimum prison sentences for various drug offenses. A notable feature of mandatory minimums was the massive gap between the amounts of crack and of powder cocaine that resulted in the same minimum sentence: possession of five grams of crack led to an automatic five-year sentence, while it took the possession of 500 grams of powder cocaine to trigger that sentence. Since approximately 80% of crack users were African American, mandatory minimums led to an unequal increase of incarceration rates for nonviolent black drug offenders, as well as claims that the War on Drugs was a racist institution.

Concerns over the effectiveness of the War on Drugs and increased awareness of the racial disparity of the punishments meted out by it led to decreased public support of the most draconian aspects of the drug war during the early 21st century. Consequently, reforms were enacted during that time, such as the legalization of recreational marijuana in a number of states and the passage of the Fair Sentencing Act of 2010, which reduced the discrepancy of crack-to-powder possession thresholds for minimum sentences from 100-to-1 to 18-to-1. While the War on Drugs is still technically being waged, it is fought at a much less intense level than during its peak in the 1980s.

Go here to see the original:

War on Drugs | United States history | Britannica.com

Philippines War on Drugs | Human Rights Watch

Tilted election playing field in Turkey; European Court of Justice confirms rights of same-sex couples; Philippine police promoting abusers; Vietnam’s cyber security law; Nigerian military trying to smear Amnesty International; Paris names imprisoned Bahrain rights activist Nabeel Rajab an honorary citizen; intimidation of journalists in the US; brutal US treatment of refugees; and Russia’s World Cup amid Syria slaughter.

See original here:

Philippines War on Drugs | Human Rights Watch

