Gene Therapy – Biotechnology – Science and Research

Gene therapy uses "genes as medicine". It is an experimental approach to treating genetic disease in which a faulty gene is fixed, replaced, or supplemented with a healthy copy so that the cell can function normally. Most genetic diseases cannot currently be treated, but gene therapy research offers patients and their families hope of a possible cure. The technology does not come without risks, however, and many clinical trials to evaluate its effectiveness must be completed before gene therapy can be put to regular medical use.

To get a new gene into a cell's genome, it must be carried in a molecule called a vector. The most common vectors currently being used are viruses, which naturally invade cells and insert their genetic material into that cell's genome. To use a virus as a vector, the virus' own genes are removed and replaced with the new gene destined for the cell. When the virus attacks the cell, it will insert the genetic material it carries. A successful transfer will result in the target cell now carrying the new gene that will correct the problem caused by the faulty gene.

Viruses that can be used as vectors include retroviruses like HIV, adenoviruses (one of which causes the common cold), adeno-associated viruses and herpes simplex viruses. There are also many non-viral vectors being tested for gene therapy uses. These include artificial lipid spheres called liposomes, DNA attached to a molecule that will bind to a receptor on the target cell, artificial chromosomes and naked DNA that is not attached to another molecule at all and can be directly inserted into the cell.

The actual transfer of the new gene into the target cell can happen in two ways: ex vivo and in vivo. The ex vivo approach involves transferring the new gene into cells that have been removed from the patient and grown in the laboratory. Once the transfer is complete, the cells are returned to the patient, where they will continue to grow and produce the new gene product. The in vivo approach delivers the vector directly to the patient, where transfer of the new gene will occur in the target cells within the body.

Conditions or disorders that result from mutations in a single gene are potentially the best candidates for gene therapy. However, the many challenges met by researchers working on gene therapy mean that its application is still limited while the procedure is being perfected.

Before gene therapy can be used to treat a given genetic condition or disorder, certain requirements need to be met.

Clinical trials for gene therapy in other countries (for example, France and the United Kingdom) have shown that several major factors still prevent gene therapy from becoming a routine way to treat genetic conditions and disorders. While the transfer of the new gene into the target cells has worked, the effect does not seem to be long-lasting, suggesting that patients would have to be treated multiple times to control the condition or disorder. There is also always a risk of a severe immune response, since immune cells are trained to attack any foreign molecule in the body. Working with viral vectors has proven challenging because they are difficult to control and the body immediately recognizes and attacks common viruses. Recent work has therefore focused on non-viral vectors to avoid the complications associated with viral vectors. Finally, while there are thousands of single-gene disorders, the more common genetic disorders are actually caused by multiple genes, which makes them poor candidates for gene therapy.

One promising application of gene therapy is in treating type I diabetes. Researchers in the United States used an adenovirus as a vector to deliver the gene for hepatocyte growth factor (HGF) to pancreatic islet cells removed from rats. They injected the altered cells into diabetic rats and, within a day, the rats were controlling their blood glucose levels better than the control rats. This model mimics the transplantation of islet cells in humans and shows that the addition of the HGF gene greatly enhances the islet cells' function and survival.

In Canada, researchers in Edmonton, Alberta also developed a protocol to treat type I diabetes. Doctors use ultrasound to guide a small catheter through the upper abdomen and into the liver. Pancreatic islet cells are then injected through the catheter into the liver. In time, islets are established in the liver and begin releasing insulin.

Another application for gene therapy is in treating X-linked severe combined immunodeficiency (X-SCID), a disease in which a baby lacks both the T and B cells of the immune system and is vulnerable to infections. The current treatment is a bone marrow transplant from a matched sibling, which is not always possible or effective in the long term. Researchers in France and the United Kingdom, knowing the disease was caused by a faulty gene on the X chromosome, treated 14 children by replacing the faulty gene ex vivo. Upon receiving the altered cells, the patients showed great improvements in immune system function. Unfortunately, two of the children developed a form of leukemia several years after the treatment. Further investigation showed that the vector had inserted the gene near a proto-oncogene, which led to uncontrolled growth of the T cells. The clinical trials were put on hold until a safer method could be designed and tested.


Gene Therapy for Diseases | ASGCT – American Society of Gene …

Gene therapy has made important medical advances in less than two decades. Within this short time span, it has moved from the conceptual stage through technology development and laboratory research to clinical translational trials for a variety of deadly diseases. Among the most notable advancements are the following:

Severe Combined Immune Deficiency (ADA-SCID) ADA-SCID is also known as "bubble boy" disease. Affected children are born without an effective immune system and, without bone marrow transplantation from a matched donor, will succumb to infections outside of a protective bubble. A landmark study by investigators in Italy represents the first case of a gene therapy "cure," or at least a long-term correction, for patients with a deadly genetic disorder. The therapeutic gene, called ADA, was introduced into the bone marrow cells of such patients in the laboratory, and the genetically corrected cells were then transplanted back into the same patients. The immune system was reconstituted in all six treated patients without noticeable side effects, and they now live normal lives with their families without the need for further treatment.

Chronic Granulomatous Disorder (CGD) CGD is a genetic disease of the immune system that leaves patients unable to fight off bacterial and fungal infections, which can be fatal. Using technologies similar to those in the ADA-SCID trial, investigators in Germany treated two patients with this disease; their reconstituted immune systems have since provided them with full protection against microbial infections for at least two years.

Hemophilia Patients born with hemophilia are unable to form blood clots and suffer from external and internal bleeding that can be life-threatening. In a clinical trial conducted in the United States, the therapeutic gene was introduced into the liver of patients, who then acquired normal blood-clotting times. The therapeutic effect, however, was transient because the genetically corrected liver cells were recognized as foreign and rejected by the patients' healthy immune systems. This is the same problem faced by patients after organ transplantation, and a curative outcome by gene therapy might be achievable with immune suppression or with alternative gene delivery strategies currently being tested in preclinical animal models of this disease.

Other genetic disorders After many years of laboratory and preclinical research in appropriate animal models of disease, a number of clinical trials will soon be launched for various genetic disorders that include congenital blindness, lysosomal storage disease and muscular dystrophy, among others.

Cancer Multiple gene therapy strategies have been developed to treat a wide variety of cancers, including suicide gene therapy, oncolytic virotherapy, anti-angiogenesis and therapeutic gene vaccines. Two-thirds of all gene therapy trials are for cancer, and many of these are entering the advanced stage, including a Phase III trial of Ad.p53 for head and neck cancer and two different Phase III gene vaccine trials for prostate and pancreatic cancer. Additionally, numerous Phase I and Phase II clinical trials for cancers of the brain, skin, liver, colon, breast and kidney, among others, are being conducted in academic medical centers and biotechnology companies, using novel technologies and therapeutics developed on-site.

Neurodegenerative Diseases Recent progress in gene therapy has allowed for novel treatments of neurodegenerative diseases such as Parkinson's Disease and Huntington's Disease, for which exciting treatment results have been obtained in appropriate animal models of the corresponding human diseases. Phase I clinical trials for these neurodegenerative disorders have been, or will soon be, launched.

Other acquired diseases The same gene therapy techniques have been applied to treat other acquired disorders such as viral infections (e.g. influenza, HIV, hepatitis), heart disease and diabetes, among others. Some of these have entered, or will soon enter, early-phase clinical trials.


Gene Therapy – Cancer Treatments – Moores Cancer Center, UC …

Gene therapy is an experimental treatment that involves inserting genetic material into your cells to give them a new function or restore a missing one. Because cancer may be caused by damaged or missing genes, also known as gene mutations, gene therapy may be one way to overcome these changes and treat or prevent cancer; it is currently available only through clinical trials.

Cancer is caused by changes in our genes. Genes are inherited from our parents and determine our traits and characteristics. They are made of biological molecules called deoxyribonucleic acid (DNA) and ribonucleic acid (RNA). DNA and RNA are responsible for making proteins, which have many functions, such as helping a cell maintain its shape or controlling its growth and division. Changes, or mutations, in genes can affect these proteins and may sometimes lead to diseases such as cancer.

Gene therapy is designed to modify cancer cells at the molecular level and replace a missing or bad gene with a healthy one. The new gene is delivered to the target cell via a vector, which is usually an inactive virus or liposome, a tiny fat bubble.

Gene therapy can be done in two ways: outside (ex vivo) or inside (in vivo) your body. Ex-vivo techniques involve taking some of the cancer cells out of your body, injecting them with good genes, and then putting them back into your body. The in-vivo process requires that good genes be put directly into a tumor, which may be difficult depending on its location or if the cancer has spread. Scientists generally use two types of cells in gene therapy: the tumor cells themselves and immune system cells that attack the tumors.

Researchers from Moores Cancer Center at UC San Diego Health System are studying several gene therapy techniques for breast cancer, melanoma, leukemia and pancreatic cancer.

For example, they have been integrally involved in the development of Herceptin, a targeted therapy that is proving to be effective in curing localized human epidermal growth factor receptor-2 (HER2) breast cancer. HER2 controls how cells grow, divide and repair themselves.

Researchers have also been injecting a modified herpes virus into melanoma tumors, with the intention of improving the body's immune defenses against the disease.

A gene therapy called TNFerade Biologic involves a DNA carrier containing the gene for tumor necrosis factor-alpha, an immune system protein with potent and well-documented anti-cancer effects. TNFerade is being studied in combination with radiation therapy for first-line treatment of inoperable pancreatic cancer.

Both the TNFerade and herpes strategies use gene therapy to enhance the killing effect of the primary mechanism: radiation in the case of TNFerade, and virus-induced cell lysis, or splitting, in the case of the herpes virus.

When will gene therapy be available? Gene therapy is only available as a cancer treatment through clinical trials.

Are there any risks associated with gene therapy clinical trials? Yes. Viral vectors might infect healthy cells as well as cancer cells, a new gene might be inserted in the wrong location in the DNA, or the transferred genes could be overexpressed and produce too much of the missing protein, causing harm. All risks for any procedure should be discussed with your doctor.


Gene Therapy and Cell Therapy Defined | ASGCT – American …

Gene therapy and cell therapy are overlapping fields of biomedical research with the goals of repairing the direct cause of genetic diseases in the DNA or cellular population, respectively. These powerful strategies are also being focused on modulating specific genes and cell subpopulations in acquired diseases in order to reestablish the normal equilibrium. In many diseases, gene and cell therapy are combined in the development of promising therapies.

In addition, these two fields have helped provide reagents, concepts, and techniques that are elucidating the finer points of gene regulation, stem cell lineage, cell-cell interactions, feedback loops, amplification loops, regenerative capacity, and remodeling.

Gene therapy is defined as a set of strategies that modify the expression of an individual's genes or that correct abnormal genes. Each strategy involves the administration of a specific DNA (or RNA).

Cell therapy is defined as the administration of live whole cells or maturation of a specific cell population in a patient for the treatment of a disease.

Gene therapy: Historically, the discovery of recombinant DNA technology in the 1970s provided the tools to develop gene therapy efficiently. Scientists used these techniques to manipulate viral genomes, isolate genes, identify mutations involved in human diseases, characterize and regulate gene expression, and engineer various viral and non-viral vectors. Many vectors, regulatory elements, and means of transfer into animals have been tried. Taken together, the data show that each vector and set of regulatory elements provides specific expression levels and duration of expression. Vectors exhibit an inherent tendency to bind and enter specific types of cells and to spread into adjacent cells; they can also affect adjacent genes, and their persistence in the host is relatively predictable. Although the route of administration modulates the immune response to the vector, each vector has a relatively inherent ability, whether low, medium or high, to induce an immune response to the transduced cells and the new gene products.

The development of suitable gene therapy treatments for many genetic diseases and some acquired diseases has encountered many challenges and uncovered new insights into gene interactions and regulation. Further development often involves uncovering basic scientific knowledge of the affected tissues, cells, and genes, as well as redesigning vectors, formulations, and regulatory cassettes for the genes.

While effective long-term treatments for anemias, hemophilia, cystic fibrosis, muscular dystrophy, Gaucher's disease, lysosomal storage diseases, cardiovascular diseases, diabetes, and diseases of the bones and joints remain elusive today, some success is being observed in the treatment of several types of immunodeficiency diseases, cancer, and eye disorders. Further details on the status of gene therapy development for specific diseases are summarized here.

Cell therapy: Historically, blood transfusions were the first type of cell therapy and are now considered routine. Bone marrow transplantation has also become a well-established protocol and is the treatment of choice for many kinds of blood disorders, including anemias, leukemias, lymphomas, and rare immunodeficiency diseases. The key to successful bone marrow transplantation is the identification of a good immunologically matched donor, who is usually a close relative such as a sibling. After finding a good match between the donor's and recipient's cells, the bone marrow cells of the patient (recipient) are destroyed by chemotherapy or radiation to provide room in the bone marrow for the new cells to reside. After the bone marrow cells from the matched donor are infused, the self-renewing stem cells find their way to the bone marrow and begin to replicate. They also begin to produce cells that mature into the various types of blood cells, and normal numbers of donor-derived blood cells usually appear in the patient's circulation within a few weeks. Unfortunately, not all patients have a good immunologically matched donor. Furthermore, bone marrow grafts may fail to fully repopulate the bone marrow in as many as one-third of patients, and the destruction of the host bone marrow can be lethal, particularly in very ill patients. These requirements and risks restrict the utility of bone marrow transplantation to some patients.

Cell therapy is expanding its repertoire of cell types for administration. Cell therapy treatment strategies include isolation and transfer of specific stem cell populations, administration of effector cells, induction of mature cells to become pluripotent cells, and reprogramming of mature cells. Administration of large numbers of effector cells has benefited cancer patients, transplant patients with unresolved infections, and patients with chemically destroyed stem cells in the eye. For example, a few transplant patients can't resolve adenovirus and cytomegalovirus infections. A recent phase I trial administered to these patients a large number of T cells that could kill virally infected cells; many of the patients resolved their infections and retained immunity against these viruses. As a second example, chemical exposure can damage or cause atrophy of the limbal epithelial stem cells of the eye, and their loss causes pain, light sensitivity, and cloudy vision. Transplantation of limbal epithelial stem cells for treatment of this deficiency is the first cell therapy for ocular diseases in clinical practice.

Several diseases benefit most from treatments that combine the technologies of gene and cell therapy. For example, some patients have severe combined immunodeficiency disease (SCID) but, unfortunately, do not have a suitable bone marrow donor. Scientists have identified that patients with SCID are deficient in the adenosine deaminase gene (ADA-SCID) or in the common gamma chain gene located on the X chromosome (X-linked SCID). Several dozen patients have been treated with a combined gene and cell therapy approach. Each individual's hematopoietic stem cells were treated with a viral vector that expressed a copy of the relevant normal gene. After selection and expansion, these corrected stem cells were returned to the patients. Many patients improved and required less exogenous enzyme. However, some serious adverse events did occur, and their incidence is prompting the development of theoretically safer vectors and protocols. The combined approach is also being pursued in several cancer therapies.

Further information on the progress and status of gene therapy and cell therapy on various diseases is listed here.


Articles about Gene Therapy – latimes

NEWS

October 24, 2012 | By Karen Kaplan, Los Angeles Times

Scientists have demonstrated a new type of gene therapy that would - in principle - allow mothers to avoid saddling their children with rare diseases that could result in heart problems, dementia, diabetes, deafness and other significant health issues. The disorders in question are all due to mutations in one of the 37 genes in our mitochondrial DNA. Mitochondria are structures within cells that convert the energy from food into a form that cells can use, according to this explainer from the NIH's National Library of Medicine.

HEALTH

September 13, 2012 | By Elaine Herscher

Genes make us who we are - in sickness and in health. We get our genetic makeup from our parents, of course, but in the future, we might be getting genes from our doctors too. Imagine your doctor promising to cure your cancer or heart disease by prescribing some new snippets of DNA. For some diseases, gene therapy is already a reality. In other cases, genetic cures are still years away. Despite many challenges and setbacks - including some that are surely yet to come - experts predict that gene therapy will eventually become a crucial and even common part of healthcare.

SCIENCE

August 15, 2012 | By Rosie Mestel, Los Angeles Times

Dog lovers may be interested in an article published this week in the New England Journal of Medicine: It highlights the discoveries scientists are making about diseases that various dog breeds are prone to -- and how those findings can benefit human health as well as that of canines. It's written by longtime dog genetics researcher Elaine Ostrander of the National Human Genome Research Institute. The discoveries are possible because of several things: First off, both the human genome and dog genomes have been sequenced.

SCIENCE

July 20, 2012 | By Thomas H. Maugh II, Los Angeles Times

The long-frustrated field of gene therapy is about to reach a major milestone: the first regulatory approval of a gene therapy treatment for disease in the West. The European Medicines Agency's Committee for Medicinal Products for Human Use said Friday that it is recommending approval of Glybera, a treatment for lipoprotein lipase deficiency manufactured by uniQure of Amsterdam. The European Commission generally follows the recommendations of the agency, and if it does so this time, the product could be available in all 27 members of the European Union by the end of the year.

SCIENCE

July 18, 2012 | By Jon Bardin, Los Angeles Times

We like to think of the Olympics as a level playing field - that's why doping is banned. But scientific research complicates this view: There are numerous genetic factors known to confer advantages in athletic contests, from mutations that increase the oxygen carrying capacity of blood to gene variants that confer an incredible increase in endurance, and these mutations appear to be especially common in Olympic athletes. In other words, we may want an egalitarian Olympic games, but it probably isn't in the cards.

NEWS

June 29, 2012 | By Jon Bardin, Los Angeles Times / For the Booster Shots blog

Can't kick cigarettes? A vaccine may one day help by preventing nicotine from reaching its target in the brain, according to research published this week. Most smoking therapies do a poor job of stopping the habit - 70% to 80% of smokers who use an approved drug therapy to quit relapse. Scientists say this is because the targets of existing therapies are imperfect, only slightly weakening nicotine's ability to find its target in the brain. So some scientists have been trying a different approach - creation of a vaccine.


University of Pennsylvania || Gene Therapy Program

Providing a foundation for basic research necessary to assure the success of gene therapy.

Given all the developments in molecular genetics, the isolation and cloning of genes is now a relatively common procedure. Research now centers on somatic gene therapy, referring to the techniques used to insert a functioning gene into the somatic (non-reproductive) cells of a patient to correct an inborn genetic error or to provide a new function to the cell. Having individual genes available opens the way for gene therapy to take place. And yet, after an initial period of about six years of preclinical work and another thirteen years involving clinical trials, effective gene delivery still remains one of the central challenges in the field.

The Gene Therapy Program of the University of Pennsylvania comprises basic scientific research and core lab research services. Our focus is on developing effective gene vectors derived from recombinant viruses. Much of our current effort is in the development of new adeno-associated virus (AAV) vectors, although some of our research involves both adenoviruses and lentiviruses. Several basic science core laboratories work together to support the development of new vectors.

Contact: Gene Therapy Program, Suite 2000, Translational Research Laboratories (TRL), 125 S. 31st Street, Philadelphia, PA 19104-3403. Phone: 215-898-0226. Fax: 215-494-5444. Email: GTP@mail.med.upenn.edu


Project ATLANTA – Urban Heat Island Study

High Spatial Resolution Airborne Multispectral Thermal Infrared Data to Support Analysis and Modeling Tasks in EOS IDS Project ATLANTA

Dale A. Quattrochi (dale.quattrochi@msfc.nasa.gov), NASA, Global Hydrology and Climate Center, Huntsville, AL

Jeffrey C. Luvall (jeff.luvall@msfc.nasa.gov), NASA, Global Hydrology and Climate Center, Huntsville, AL

Background

Project ATLANTA (ATlanta Land-use ANalysis: Temperature and Air-quality), a NASA EOS Interdisciplinary Science (IDS) investigation newly funded in 1996, seeks to observe, measure, model, and analyze how the rapid growth of the Atlanta, Georgia metropolitan area since the early 1970s has impacted the region's climate and air quality. The primary objectives of this research effort are: 1) to investigate and model the relationship between Atlanta's urban growth, land cover change, and the development of the urban heat island phenomenon through time at nested spatial scales from local to regional; 2) to investigate and model the relationship between Atlanta's urban growth and land cover change and air quality through time at nested spatial scales from local to regional; and 3) to model the overall effects of urban development on surface energy budget characteristics across the Atlanta urban landscape through time at nested spatial scales from local to regional. Our key goal is to derive a better scientific understanding of how land cover changes associated with urbanization in the Atlanta area, principally the transformation of forest lands to urban land covers through time, have affected, and will affect, local and regional climate, surface energy flux, and air quality characteristics. Allied with this goal is the prospect that the results of this research can be applied by urban planners, environmental managers, and other decision-makers to determine how urbanization has impacted the climate and overall environment of the Atlanta area. It is our intent to make the results of this investigation available to help facilitate measures that mitigate climatological or air quality degradation, or to design alternative measures to sustain or improve the overall urban environment in the future.
Project ATLANTA is a multidisciplinary research endeavor and enlists the expertise of 8 investigators: Dale Quattrochi (PI) (NASA/Global Hydrology Center); Jeffrey Luvall (NASA/Global Hydrology and Climate Center); C.P. Lo (University of Georgia); Stanley Kidder (Colorado State University); Haider Taha (Lawrence Berkeley National Laboratory); Robert Bornstein (San Jose State University); Kevin Gallo (NOAA/NESDIS); and Robert Gillies (Utah State University).

Atlanta Urban Growth and Effects on Climate and Air Quality

In the last half of the 20th century, Atlanta, Georgia rose to become the premier commercial, industrial, and transportation urban area of the southeastern United States. The rapid growth of the Atlanta area, particularly within the last 25 years, has made Atlanta one of the fastest growing metropolitan areas in the United States. The population of the Atlanta metropolitan area increased 27% between 1970 and 1980, and 33% between 1980 and 1990 (Research Atlanta, Inc., 1993). Concomitant with this high rate of population growth has been an explosive growth in retail, industrial, commercial, and transportation services within the Atlanta region. This has produced tremendous land cover change dynamics within the metropolitan region, wherein urbanization has consumed vast acreages of land adjacent to the city proper and pushed the rural/urban fringe farther and farther from the original Atlanta urban core. An enormous transition of land from forest and agriculture to urban land uses has occurred in the Atlanta area in the last 25 years, along with subsequent changes in land-atmosphere energy balance relationships.
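The two decade-over-decade rates quoted above compound rather than add. A quick sketch of the arithmetic (the 27% and 33% figures are from the text; the combined figure is derived):

```python
# Compound the reported decade growth rates for the metro Atlanta
# population (Research Atlanta, Inc., 1993): +27% for 1970-1980
# and +33% for 1980-1990.
growth_70s = 0.27
growth_80s = 0.33

# Growth factors multiply across consecutive decades.
combined = (1 + growth_70s) * (1 + growth_80s) - 1
print(f"Cumulative 1970-1990 growth: {combined:.1%}")  # about 68.9%
```

Over the full twenty years, then, the metropolitan population grew by roughly two-thirds, noticeably more than the 60% a naive sum of the two rates would suggest.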

Air quality has degenerated over the Atlanta area, particularly in regard to elevations in ozone and emissions of volatile organic compounds (VOCs), as indicated by results from the Southern Oxidants Study (SOS) which has focused a major effort on measuring and quantifying the air quality over the Atlanta metropolitan region. SOS modeling simulations for Atlanta using U.S. Environmental Protection Agency (EPA) State Implementation Plan guidelines suggest that a 90% decrease in nitrogen oxide emissions, one of the key elements in ozone production, will be required to bring Atlanta into attainment with the present ozone standard (SOS, 1995).

Project ATLANTA Science Approach

The scientific approach we are using to relate land cover changes to modifications in the local and regional climate and in air quality is predicated on the analysis of remote sensing data in conjunction with in situ data (e.g., meteorological measurements) that are employed to initialize local- and regional-level numerical models of land-atmosphere interactions. Remote sensing data form the basis for quantifying how land covers have changed within the Atlanta metropolitan area through time, from the mid-1970s, when Atlanta's dramatic growth began in earnest, to the present. These remotely sensed data will be used to provide input to numerical models that relate land cover change through time to surface energy flux and meteorological parameters, to derive temporal models of how land cover changes have impacted both the climatology and the air quality over the Atlanta region. Current remote sensing data (i.e., data obtained during 1997) will be used to calibrate the models and as baseline data for extending the models to predict how prospective future land cover changes will affect the local and regional climate and air quality over the Atlanta-north Georgia region. Additionally, remote sensing data will be used as an indirect modeling method to describe urbanization and deforestation parameters that can be used to assess, as well as predict, the effects of land use changes on the local microclimate.

In concert with the remote sensing-based analysis and modeling of land cover changes is an extensive numerically-based modeling effort to better understand the cause-and-effect relationships between urbanization and trends in climatology and air quality. Sophisticated numerical meteorological models can complement extensive field monitoring projects and help improve our understanding of these relationships and the evolution of the urban climate on a location-specific basis. Measured data alone cannot resolve the relationships between the many causes of urban heat islands/urban climates and observations. For example, measured data cannot directly attribute a certain fraction of temperature rise to a certain modification in land use patterns, change in energy consumption, or release of anthropogenic heat into the atmosphere. These are aspects that numerical modeling can help resolve. Similarly, monitored air quality data cannot be used to establish a direct cause-and-effect relationship between emission sources, activities, or urbanization and observed air quality (e.g., smog). In this sense, photochemical models can be used in testing the sensitivity of ozone concentrations to changes in various land-use components, emission modifications and control, or other strategies. Thus, we are incorporating an assessment of land cover/land use change as measured from remote sensing data, with temporal numerical modeling simulations to better understand the effects that the growth of Atlanta has had on local and regional climate characteristics and air quality.

ATLAS Data: Role and Characteristics

To augment the quantitative measurements of land cover change and land surface thermal characteristics derived from satellite data (i.e., Landsat MSS and TM data for assessment of land cover change; Landsat TM thermal, and AVHRR and GOES data for land surface thermal characteristics), we are employing high spatial resolution airborne multispectral thermal data to provide detailed measurements of the thermal energy fluxes that occur for specific surfaces (e.g., pavements, buildings) across the Atlanta urban landscape, and of the changes in thermal energy response for these surfaces between day and night. This information is critical to resolving the underlying surface responses that lead to the development of local and regional-scale urban climate processes, such as the urban heat island phenomenon and related characteristics (Quattrochi and Ridd, 1994, 1997). These aircraft data will also be used to develop a functional classification of the thermal attributes of the Atlanta metropolitan area to better understand the energy budget linkages between the urban surface and the boundary layer atmosphere. This will be performed using the Thermal Response Number (TRN) (Luvall and Holbo, 1989; Luvall, 1997), which is expressed as

TRN = (Rn Δt) / ΔT

where Rn is the total net radiation and ΔT is the change in surface temperature over the time period t1 to t2.
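As a concrete illustration, the TRN calculation can be sketched in a few lines of Python. The radiation and temperature numbers below are hypothetical placeholders, not ATLAS measurements:

```python
# Sketch of a Thermal Response Number (TRN) calculation, assuming the
# simple form TRN = (Rn * dt) / dT. All input values below are
# hypothetical, not measured ATLAS data.

def thermal_response_number(rn_w_m2, dt_seconds, delta_t_c):
    """TRN in J m^-2 per degree C: net radiation integrated over the
    interval, divided by the change in surface temperature."""
    if delta_t_c == 0:
        raise ValueError("surface temperature change must be nonzero")
    return (rn_w_m2 * dt_seconds) / delta_t_c

# Example: two daytime overpasses about 2 hours (7200 s) apart, average
# net radiation 500 W/m^2, surface warmed by 4 degrees C between passes.
trn = thermal_response_number(500.0, 7200.0, 4.0)
print(trn)  # 900000.0 J m^-2 per degree C
```

A surface with a high TRN (e.g., water or dense canopy) changes temperature little despite large radiation input, while pavements and rooftops show low TRN values.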

Because urban landscapes are very complex in composition, the partitioning of energy budget terms depends on surface type. In natural landscapes, the partitioning is dependent on canopy biomass, leaf area index, aerodynamic roughness, and moisture status, all of which are influenced by the development stage of the ecosystem. In urban landscapes, however, the distribution of artificial or altered surfaces substantially modifies the surface energy budget. Thus, one key component of Project ATLANTA is to measure and model surface energy responses in both space and time, to better understand the process-response interactions of urban climate and air quality across the Atlanta metropolitan area.

The airborne sensor used to acquire high spatial resolution multispectral thermal infrared data over Atlanta is the Advanced Thermal and Land Applications Sensor (ATLAS), which is flown onboard a Lear 23 jet aircraft operated by the NASA Stennis Space Center. The ATLAS is a 15-channel multispectral scanner that incorporates the bandwidths of the Landsat TM (along with several additional channels) and 6 thermal IR channels similar to those available on the airborne Thermal Infrared Multispectral Scanner (TIMS) (Table 1). Of particular importance to the Atlanta study is the multispectral thermal IR capability of the ATLAS instrument. ATLAS thermal IR data, collected at very high spatial resolution, were used with excellent results in a previous study of urban surface energy responses over the Huntsville, Alabama metropolitan area (Lo et al., 1997).

ATLAS Data Collection

ATLAS data were collected over a 48 × 48 km area centered on the Atlanta Central Business District (CBD) on May 11 and 12, 1997. An early May acquisition window was selected so that ATLAS data could be collected in spring, when the vegetation canopy was filled out, surface temperatures were high enough to permit substantial heating of the urban landscape, and there was a high probability that cool fronts would still be moving through the Atlanta area to bring clear skies, as opposed to later in the spring or summer, when increased cloud cover or convective storms become limiting factors in obtaining aircraft data. ATLAS data were collected at a 10m pixel spatial resolution during the daytime, between approximately 11:00 a.m. and 3:00 p.m. local time (Eastern Daylight Time), to capture the highest incidence of solar radiation across the city landscape around solar noon. ATLAS 10m data were also obtained the following morning (May 12) between 2:00 and 4:00 a.m. local time to measure the Atlanta urban surface during the coolest part of the diurnal energy cycle. Eleven flight lines were required to cover the 48 × 48 km area at 10m spatial resolution. To permit the derivation of TRN values, all 11 daytime flight lines were flown and then repeated at about a two-hour interval. Nighttime overflights were not repeated because thermal energy fluxes are relatively invariant at night, which obviated the need to calculate TRNs.

Sky conditions at the time of the daytime overflights were mostly clear, with some cirrus clouds present. The Lear jet aircraft flew at an altitude of 5,063m above mean terrain, well below the cirrus clouds, to achieve a 10m pixel resolution. Cirrus clouds covered the entire Atlanta metropolitan area during the night flights. The presence of cirrus cloud cover at night did, to some extent, damp the cooling effect of thermal energy release to a clear sky, but air temperatures were still sufficiently cool to provide ample contrast with daytime heating. Maximum air temperatures during the daytime overflights were approximately 25°C, while air temperature during the nighttime flights was around 10°C. Sample surface temperatures for tree-shaded grass, tree canopy, and asphalt in full sunlight, recorded with a hand-held infrared thermometer (8-14 μm) during the afternoon, were 28°C, 21°C, and 50°C, respectively. Daytime temperatures for a commercial building roof comprised of rock/membrane coating ranged from 49°C to 52°C. This illustrates that although air temperatures were cooler than optimal for development of the urban heat island effect, there was still significant heating of artificial urban surfaces to permit good contrast with nighttime cooling.
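The altitude-to-pixel-size relationship implied here is simple scanner geometry: ground pixel size is roughly the flight altitude times the instantaneous field of view (IFOV). The 2.0 mrad IFOV below is an assumed nominal value for this class of scanner, not a figure taken from this report:

```python
# Quick geometric check of pixel size vs. flight altitude using the
# small-angle approximation: ground_pixel = altitude * IFOV.
# The 2.0 mrad IFOV is an assumed nominal value, not from this report.

ALTITUDE_M = 5063.0  # altitude above mean terrain stated in the text
IFOV_RAD = 0.002     # assumed instantaneous field of view (2.0 mrad)

pixel_m = ALTITUDE_M * IFOV_RAD
print(round(pixel_m, 1))  # 10.1, consistent with the stated 10m pixels
```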

Atmospheric radiance must be accounted for in order to obtain calibrated surface temperatures. Although the ATLAS thermal channels fall within the atmospheric window for longwave transmittance (8.0-13.0 μm), the maximum transmittance is only about 80%. The amount of atmospheric radiance in the atmospheric window depends mostly on the atmospheric water vapor content, although there is also an ozone absorption band around 9.5 μm. To assist in obtaining accurate thermal surface energy response measurements from the ATLAS data, radiosonde launches were made concurrently with both the daytime and nighttime overflights. The atmospheric profiles obtained from these radiosonde data were then incorporated into the MODTRAN3 model for calculation of atmospheric radiance (Berk et al., 1989). The output from MODTRAN3 is combined with calibrated ATLAS spectral response curves and blackbody information recorded during the flight, using the Earth Resources Laboratory Applications Software (ELAS) module TRADE (Thermal Radiant Temperature) (Graham et al., 1986), to produce a look-up table of pixel temperatures as a function of ATLAS values (Anderson, 1992).
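The final look-up-table step can be sketched as follows. The calibration pairs here are invented placeholders; in the actual workflow the table comes from MODTRAN3 radiances combined with ATLAS spectral response and onboard blackbody data via the ELAS TRADE module:

```python
# Sketch of the final lookup-table step: mapping sensor digital numbers
# (DNs) to surface temperatures by interpolating a calibration table.
# The table values below are invented placeholders.

from bisect import bisect_left

# (DN, temperature in C) pairs, ordered by DN -- hypothetical values.
CAL_TABLE = [(0, -10.0), (64, 5.0), (128, 20.0), (192, 35.0), (255, 52.0)]

def dn_to_temperature(dn):
    """Linearly interpolate a surface temperature for a given DN."""
    dns = [d for d, _ in CAL_TABLE]
    if dn <= dns[0]:
        return CAL_TABLE[0][1]
    if dn >= dns[-1]:
        return CAL_TABLE[-1][1]
    i = bisect_left(dns, dn)
    d0, t0 = CAL_TABLE[i - 1]
    d1, t1 = CAL_TABLE[i]
    return t0 + (t1 - t0) * (dn - d0) / (d1 - d0)

print(dn_to_temperature(128))  # 20.0
print(dn_to_temperature(160))  # 27.5, halfway between the 128 and 192 entries
```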

One pyranometer and one pyrgeometer were also stationed on a rooftop within one of the aircraft flight lines to measure incoming shortwave and longwave radiation within the study area. Additionally, two shadowband radiometers were placed at strategic locations within the flight path to measure shortwave visible radiation, from which visibility parameters were determined for input into MODTRAN3. The output from MODTRAN3 is combined with calibrated ATLAS spectral response curves and onboard calibration lamp information recorded during the flight in TRADE to produce calibrated at-sensor radiance for the visible wavelengths.

ATLAS Data: Some Examples

Approximately 5 GB of raw (unprocessed) ATLAS data were collected during the May 11-12 aircraft overflights. In addition to the digital ATLAS data, color infrared aerial photography at 1:32,000 scale was obtained during the daytime mission. Figure 1 illustrates daytime thermal (channel 13, 9.60-10.2 μm) ATLAS data collected over the Atlanta CBD area. Figure 2 provides an example of ATLAS data (channel 13) acquired during the night over the Atlanta CBD. Both images are oriented with north at the top. Setting aside the effects of the highly variable emissivities of urban building materials, visual inspection of the images presented in Figures 1 and 2 illustrates the wide range of thermal energy responses present across the Atlanta city landscape, as well as the detail that can be discerned from the 10m data. The Georgia Dome, an enclosed football stadium, appears as the large square-shaped structure due west of the Atlanta city center. Interstate Highways 75/85, which traverse north-south around the city center, are seen as a dark "ribbon" in the daytime data (Figure 1) just east of downtown Atlanta. Just south of the city center is the junction of Interstate Highways 75/85 and 20. Shadows from tall buildings located in the Atlanta city center can also be observed in the daytime data. In Figure 1, the intense thermal energy responses from buildings, pavements and other surfaces typical of the urban landscape, as well as the heterogeneous distribution of these responses, stand in significant contrast to the relative "flatness" of Atlanta's thermal landscape at night (Figure 2). Also evident is the damping effect that the urban forest has on upwelling thermal energy responses, particularly in the upper right side of the daytime image where residential tree canopy is extensive.
In Figure 2, there is still evidence, even in the very early morning, of elevated thermal energy responses from buildings and other surfaces in the Atlanta CBD and from streets and highways. Thermal energy responses for vegetation across the image appear relatively uniform at night, regardless of vegetation type (e.g., grass, trees).

ATLAS Data Analysis: The Next Steps

From the images in Figures 1 and 2, it is apparent that high resolution ATLAS data offer a unique opportunity to measure, analyze and model the state and dynamics of thermal energy responses across the Atlanta metropolitan landscape. In addition to supporting the derivation of day and night energy balance measurements, and of TRN values for specific urban surfaces, to better understand the thermal characteristics that drive the development of the urban heat island phenomenon and the overall Atlanta urban climate, these multispectral ATLAS data also serve as a database record of current land cover/land use conditions for the Atlanta metropolitan area. Along with the extensive meteorological data available via a network of mesonet stations currently operating across the Atlanta area, the ATLAS data will be used to initialize and calibrate the meteorological and air quality models that will be run for the time period when the airborne data were collected. Moreover, one of the key facets of Project ATLANTA is to work with local planning agencies, such as the Atlanta Regional Commission (ARC), to model how the continued growth of Atlanta will impact the climate and air quality of the north Georgia region. The ARC is currently developing a 20-year growth plan for a 10-county area around Atlanta. Using the ATLAS data obtained in May 1997 as a baseline for land cover/land use, our objective is to perform some "prospective" modeling of how meteorological conditions and air quality will change, predicated on the ARC's 20-year plan. By doing so, we hope to provide the ARC and other planning or decision-making bodies with model output that can be used to modify or revise growth plans for the Atlanta metropolitan area, and to help mitigate the expansion of the urban heat island effect or the further deterioration of air quality.

References

Anderson, J. E., 1992. Determination of water surface temperature based on the use of thermal infrared multispectral scanner data. Geocarto International 3:3-8.

Berk, A., L. S. Bernstein, and D. C. Robertson, 1989. MODTRAN: A Moderate Resolution Model for LOWTRAN 7. U.S. Air Force Geophysics Laboratory, Environmental Research Papers GL-TR-89-0122, Hanscom Air Force Base, MA, 37 pp.

Graham, M. H., B. G. Junkin, M. T. Kalcic, R. W. Pearson, and B. R. Seyfarth, 1986. ELAS - Earth Resources Laboratory Applications Software. Revised Jan. 1986. NASA/NSTL/ERL Report No. 183.

Lo, C. P., D. A. Quattrochi, and J. C. Luvall, 1997. Application of high-resolution thermal infrared remote sensing and GIS to assess the urban heat island effect. International Journal of Remote Sensing 18:287-304.

Luvall, J. C., and H. R. Holbo, 1989. Measurements of short-term thermal responses of coniferous forest canopies using thermal scanner data. Remote Sensing of Environment 27:1-10.

Luvall, J. C., 1997. The use of remotely sensed surface temperatures from an aircraft-based thermal infrared multispectral scanner (TIMS) to estimate the spatial and temporal variability of latent heat fluxes and thermal response numbers from a white pine (Pinus strobus L.) plantation. In Scale in Remote Sensing and GIS, D. A. Quattrochi and M. F. Goodchild, eds. CRC/Lewis Publishers, Boca Raton, FL, pp. 169-185.

Quattrochi, D. A., and M. K. Ridd, 1994. Measurement and analysis of thermal energy responses from discrete urban surfaces using remote sensing data. International Journal of Remote Sensing 15:1991-2022.

Quattrochi, D. A., and M. K. Ridd, 1997. Analysis of vegetation within a semi-arid urban environment using high spatial resolution airborne thermal infrared remote sensing data. Atmospheric Environment (in press).

Research Atlanta, Inc., 1993. The Dynamics of Change: An Analysis of Growth in Metropolitan Atlanta over the Past Two Decades. Policy Research Center, Georgia State University, Atlanta.

SOS, 1995. The State of the Southern Oxidants Study: Policy-Relevant Findings in Ozone Pollution Research 1988-1994. Southern Oxidants Study: Raleigh, NC, 94 pp.

Figure Captions

Figure 1. ATLAS daytime thermal image (channel 13, 9.60-10.2 μm) of the Atlanta central business district area. These data have not been geometrically or atmospherically corrected.

Figure 2. ATLAS nighttime thermal image (channel 13, 9.60-10.2 μm) of the Atlanta central business district area. These data have not been geometrically or atmospherically corrected.

Global Hydrology and Climate Center

Responsible Official: Dr. Steven J. Goodman (steven.goodman@nasa.gov) Page Curator: Paul J. Meyer (paul.meyer@msfc.nasa.gov)


Astronomy (magazine) – Wikipedia, the free encyclopedia

Astronomy (ISSN 0091-6358) is a monthly American magazine about astronomy. Targeting amateur astronomers, it contains columns on sky viewing, reader-submitted astrophotographs, and articles on astronomy and astrophysics that are readable by nonscientists.

Astronomy is a magazine about the science and hobby of astronomy. Based in Waukesha, Wisconsin, near Milwaukee, it is produced by Kalmbach Publishing. Astronomy's readers include those interested in astronomy and those who want to know about sky events, observing techniques, astrophotography, and amateur astronomy in general.

Astronomy was founded in 1973 by Stephen A. Walther, a graduate of the University of Wisconsin–Stevens Point and an amateur astronomer. The first issue, August 1973, consisted of 48 pages with five feature articles and information about what to see in the sky that month. Issues contained astrophotos and illustrations created by astronomical artists. Walther had worked part time as a planetarium lecturer at the University of Wisconsin–Milwaukee and developed an interest in photographing constellations at an early age. Although he had been obsessed with astronomy since childhood, he did so poorly in mathematics that his mother despaired he would ever be able to earn a living. However, he graduated in journalism from the University of Wisconsin–Stevens Point, and as a senior class project he created a business plan for a magazine for amateur astronomers. With the help of his brother David, he was able to bring the magazine to fruition. He died in 1977.

AstroMedia Corp., the company Walther had founded to publish Astronomy, brought in Richard Berry as editor. Berry also created the offshoot Odyssey, aimed at young readers, and the specialized Telescope Making. In 1985, Milwaukee hobby publisher Kalmbach bought Astronomy.

In 1992, Richard Berry left the magazine and Robert Burnham took over as chief editor. Kalmbach discontinued Deep Sky and Telescope Making magazines and sold Odyssey. In 1996 Bonnie Gordon, now a professor at Central Arizona College, assumed the editorship. David J. Eicher, the creator of "Deep Sky," became chief editor in 2002.

The Astronomy staff also produces other publications. These have included Explore the Universe; Beginner's Guide to Astronomy; Origin and Fate of the Universe; Mars: Explore the Red Planet's Past, Present, and Future; Atlas of the Stars; Cosmos; and 50 Greatest Mysteries of the Universe. There also was, for a time in the mid-2000s, a Brazilian edition published by Duetto Editora called Astronomy Brasil. However, due mainly to low circulation numbers, Duetto ceased its publication in September 2007.

Astronomy publishes articles about the hobby and science of astronomy. Generally, the front half of the magazine reports on professional science, while the back half of the magazine presents items of interest to hobbyists. Science articles cover such topics as cosmology, space exploration, exobiology, research conducted by professional-class observatories, and individual professional astronomers. Each issue of Astronomy contains a foldout star map showing the evening sky for the current month and the positions of planets, and some comets.

The magazine has regular columnists. They include science writer Bob Berman, who writes a column called Bob Berman's Strange Universe. Stephen James O'Meara writes Stephen James O'Meara's Secret Sky, which covers observing tips and stories relating to deep-sky objects, planets, and comets. Glenn Chaple writes "Glenn Chaple's Observing Basics", a beginner's column. Phil Harrington writes "Phil Harrington's Binocular Universe", about observing with binoculars. "Telescope Insider" interviews people who are part of the telescope-manufacturing industry.

In each issue of Astronomy, readers will find star and planet charts, telescope observing tips and techniques, and advice on photographing the night sky.[2] The magazine also publishes reader-submitted photos in a gallery, lists astronomy-related events, letters from readers, news, and announcements of new products.

Astronomy may include special sections bound into the magazine, such as booklets or posters. Recent examples have included a Messier Catalog booklet; a poster showing comet C/2006 P1 (McNaught) and historical comets; a Skyguide listing upcoming sky events; a Telescope Buyer's Guide; a poster titled "Atlas of Extrasolar Planets"; and a poster showing the life cycles of stars.

Astronomy is the largest circulation astronomy magazine, with monthly circulation of 114,080.[3] The majority of its readers are in the United States, but it is also circulated in Canada and internationally.[4]

Its major competitor is Sky & Telescope magazine with a circulation of 80,023.[3]


Astronomy – OpenLearn – Open University

Night sky puts on a meteor shower to celebrate Rosetta's closest approach to the sun

Introductory level Duration 5 mins Updated 11 Aug 2015

The Perseids coincide with Rosetta making its closest approach to the Sun, explains Monica Grady.

Introductory level Duration 30 mins Updated 10 Aug 2015

The Perseid meteor shower reaches its peak on 13th August 2015. Find out more about meteors, Perseus and what meteors have to do with ancient Egypt...

Introductory level Duration 5 mins Updated 14 Jul 2015

After a journey of more than four and a half billion kilometres, New Horizons gets some face time with Pluto. But not too much.

Introductory level Duration 5 mins Updated 13 Jul 2015

New Horizons is giving us the chance to see Pluto, close-up, for the first time. But familiarity won't restore Pluto's planet status.

Updated 15 Jun 2015

As part of the consortium that brought Philae to life, we're delighted that it's sending messages once again. Here's a quick round-up of reactions...

Introductory level Updated 23 Apr 2015

Free learning resources in astronomy, relating to light, as part of The Open University's International Year of Light celebrations.

Introductory level Duration 30 mins Updated 19 Mar 2015

On Friday 20 March 2015, the whole of the UK will be able to view a partial eclipse of the Sun.

Introductory level Duration 5 mins Updated 19 Mar 2015

Follow our safety advice when looking at the sun

Introductory level Duration 10 mins Updated 19 Mar 2015

Dr. Lucie Green has more safety advice when looking at the sun

Introductory level Duration 5 mins Updated 06 Mar 2015

With jokes, with panic, with searches for religious meaning: A collection of contemporary responses to eclipses from 18th Century publications.


age management (anti-aging medicine): Los Gatos Longevity …

Los Gatos Longevity Institute is the first and foremost medical provider of longevity/anti-aging/age management Medicine dedicated to the proposition that:

age is a state of mind ... aging is a treatable condition

As pioneers in the rapidly expanding age management / anti-aging field, we are increasingly imitated -- but never duplicated. Accept nothing but the best. We are the brand name in age management / anti-aging and Longevity Medicine proudly serving you since 1996.

Now relax ... take in a few deep breaths of air. Really deep ... Good. Now enjoy yourself.

Feel free to navigate our site in its entirety. There is a wealth of information here that should begin to answer many of your first anti-aging questions. You can see our specialty areas to the left.

We have a full spectrum of offerings including professional consultations. Bookmark this site and return often. New information is constantly being added and revised as recent developments are announced.

Your questions are welcomed. You can email us, click on our brief feedback page or call us at 408-358-8855 any time of the day.

And, don't forget to ask us about our Premiere Age Management Plans. It's like internal Plastic Surgery.

It's now up to you ...

It's About Time ...


How does gene therapy work? – Genetics Home Reference

Gene therapy is designed to introduce genetic material into cells to compensate for abnormal genes or to make a beneficial protein. If a mutated gene causes a necessary protein to be faulty or missing, gene therapy may be able to introduce a normal copy of the gene to restore the function of the protein.

A gene that is inserted directly into a cell usually does not function. Instead, a carrier called a vector is genetically engineered to deliver the gene. Certain viruses are often used as vectors because they can deliver the new gene by infecting the cell. The viruses are modified so they can't cause disease when used in people. Some types of virus, such as retroviruses, integrate their genetic material (including the new gene) into a chromosome in the human cell. Other viruses, such as adenoviruses, introduce their DNA into the nucleus of the cell, but the DNA is not integrated into a chromosome.

The vector can be injected or given intravenously (by IV) directly into a specific tissue in the body, where it is taken up by individual cells. Alternatively, a sample of the patient's cells can be removed and exposed to the vector in a laboratory setting. The cells containing the vector are then returned to the patient. If the treatment is successful, the new gene delivered by the vector will make a functioning protein.

Researchers must overcome many technical challenges before gene therapy will be a practical approach to treating disease. For example, scientists must find better ways to deliver genes and target them to particular cells. They must also ensure that new genes are precisely controlled by the body.

A new gene is injected into an adenovirus vector, which is used to introduce the modified DNA into a human cell. If the treatment is successful, the new gene will make a functional protein.

The Genetic Science Learning Center at the University of Utah provides information about various technical aspects of gene therapy in Gene Delivery: Tools of the Trade. They also discuss other approaches to gene therapy and offer a related learning activity called Space Doctor.

The Better Health Channel from the State Government of Victoria (Australia) provides a brief introduction to gene therapy, including the gene therapy process and delivery techniques.

Penn Medicine's OncoLink describes how gene therapy works and how it is administered to patients.



Astronomy – Introduction and History of the Study of Stars

Astronomy, derived from the Greek words for "star" (astron) and "law" (nomos), is the scientific study of all objects beyond our world. It is also the process by which we seek to understand the physical laws and origins of our universe.

Over the centuries there have been countless innovators that have contributed to the development and advancement of astronomy. Some of these key individuals include:

Nicolaus Copernicus (1473 - 1543): He was a Polish physician and lawyer by trade, but is now regarded as the father of the current heliocentric model of the solar system.

Tycho Brahe (1546 - 1601): A Danish nobleman, Tycho designed and built instruments of greater power and resolution than anything that had been developed previously. He used these instruments to chart the positions of planets and other celestial objects with such great precision that many commonly held notions of planetary and stellar motion were debunked.

Johannes Kepler (1571 - 1630): A student of Tycho's, Kepler continued his work, and from it discovered three laws of planetary motion: planets move in elliptical orbits with the Sun at one focus; a line joining a planet and the Sun sweeps out equal areas in equal times; and the square of a planet's orbital period is proportional to the cube of its semi-major axis.
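Kepler's third law (the square of the orbital period proportional to the cube of the semi-major axis) invites a quick numerical check. The orbital values below are standard textbook figures:

```python
# Quick check of Kepler's third law, T^2 = a^3, with T in years and a
# in astronomical units (this simple form holds for bodies orbiting
# the Sun). Orbital values are standard textbook figures.

planets = {
    "Earth": (1.000, 1.000),    # (a in AU, T in years)
    "Mars": (1.524, 1.881),
    "Jupiter": (5.203, 11.862),
}

for name, (a, period) in planets.items():
    # T^2 / a^3 should come out very close to 1 for every planet.
    ratio = period**2 / a**3
    print(f"{name}: T^2/a^3 = {ratio:.3f}")
```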

Galileo Galilei (1564 - 1642): While Galileo is sometimes credited (incorrectly) with being the inventor of the telescope, he was the first to use the telescope to make detailed studies of heavenly bodies. He was the first to conclude that the Moon was likely similar in composition to the Earth, and that the Sun's surface changed (i.e., the motion of sunspots on the Sun's surface). He was also the first to see four of Jupiter's moons, and the phases of Venus. Ultimately it was his observations of the Milky Way, specifically the detection of countless stars, that shook the scientific community.

Isaac Newton (1642 - 1727): Considered one of the greatest scientific minds of all time, Newton not only deduced the law of gravity, but realized the need for a new type of mathematics (calculus) to describe it.

His discoveries and theories dictated the direction of science for more than 200 years, and truly ushered in the era of modern astronomy.

Albert Einstein (1879 - 1955): Einstein is famous for his development of general relativity, a correction to Newton's law of gravity. But his relation of energy to mass (E = mc^2) is also important to astronomy, as it is the basis for understanding how the Sun and other stars fuse hydrogen into helium for energy.
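E = mc^2 lends itself to a small worked example. The 0.7% mass-to-energy conversion fraction for hydrogen fusion used below is a standard textbook figure, not a number from this article:

```python
# Worked example of E = mc^2: energy released when one kilogram of
# hydrogen fuses into helium. Roughly 0.7% of the fused mass is
# converted to energy (a standard textbook figure for the
# proton-proton chain, assumed here).

C = 2.998e8               # speed of light, m/s
mass_kg = 1.0
mass_converted = 0.007 * mass_kg

energy_j = mass_converted * C**2
print(f"{energy_j:.2e} J")  # about 6.3e14 joules per kg of hydrogen fused
```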

Edwin Hubble (1889 - 1953): During his career, Hubble answered two of the biggest questions plaguing astronomers at the time. He determined that so-called spiral nebulae were, in fact, other galaxies, proving that the Universe extends well beyond our own galaxy. Hubble then followed up that discovery by showing that these other galaxies were receding from us at speeds proportional to their distances.
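Hubble's proportionality is now written v = H0 × d. A minimal sketch, assuming H0 = 70 km/s per megaparsec (a commonly quoted value, not taken from this article):

```python
# Sketch of Hubble's law, v = H0 * d: recession velocity proportional
# to distance. H0 = 70 km/s per megaparsec is an assumed, commonly
# quoted value, not from this article.

H0 = 70.0  # km/s per Mpc

def recession_velocity_km_s(distance_mpc):
    """Recession velocity for a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

# A galaxy 100 Mpc away recedes at about 7,000 km/s.
print(recession_velocity_km_s(100.0))  # 7000.0
```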

Stephen Hawking (1942 - ): Very few scientists alive today have contributed more to the advancement of their fields than Stephen Hawking. His work has significantly increased our knowledge of black holes and other exotic celestial objects. Also, and perhaps more importantly, Hawking has made significant strides in advancing our understanding of the Universe and its creation.

There are really two main branches of astronomy: optical astronomy (the study of celestial objects in the visible band) and non-optical astronomy (the use of instruments to study objects in the radio through gamma-ray wavelengths).

Optical Astronomy: Today, when we think about optical astronomy, we almost instantly visualize the amazing images from the Hubble Space Telescope (HST), or close-up images of the planets taken by various space probes. What most people don't realize, though, is that these images also yield volumes of information about the structure, nature and evolution of objects in our Universe.

Non-optical Astronomy: While optical telescopes are sometimes considered the only pure instruments for doing astronomy research, there are other types of observatories that make significant contributions to our understanding of the Universe. These instruments have allowed us to create a picture of our universe that spans the entire electromagnetic spectrum, from low energy radio signals, to ultra high energy gamma-rays. They give us information about the evolution and physics of some of the Universe's most dynamic treasures, such as neutron stars and black holes. And it is because of these endeavors that we have learned about the structure of galaxies including our Milky Way.

There are so many types of objects that astronomers study, that it is convenient to break astronomy up into subfields of study.

Planetary Astronomy: Researchers in this subfield focus their studies on planets, both within and outside our solar system, as well as objects like asteroids and comets.

Solar Astronomy: While the sun has been studied for centuries, there is still a significant amount of active research conducted. Particularly, scientists are interested in learning how the Sun changes, and trying to understand how these changes affect the Earth.

Stellar Astronomy: Simply, stellar astronomy is the study of stars, including their creation, evolution and death. Astronomers use instruments to study different objects across all wavelengths, and use the information to create physical models of the stars.

Galactic Astronomy: The Milky Way Galaxy is a very complex system of stars, nebulae, and dust. Astronomers study the motion and evolution of the Milky Way in order to learn how galaxies are formed.

Extragalactic Astronomy: Astronomers study other galaxies in the Universe to learn how galaxies are grouped and interact on a large scale.

Cosmology: Cosmologists study the structure of the Universe in order to understand its creation. They typically focus on the big picture, and attempt to model what the Universe would have looked like only moments after the Big Bang.


What is astronomy? 10 astronomy facts – The Time Now: What …

Astronomy is a science that studies celestial objects such as the Sun, planets, moons, stars, nebulae & galaxies.

It is often referred to as the final frontier. Astronomy is an exciting field that most people have at least some interest in. There are literally hundreds of facts that we could go through; below, we narrow it down to 10 major facts, in no particular order:

1. The night sky appears to be moving when you look at it. All of the stars and points of interest seem to move from east to west. This apparent movement is due to the fact that the Earth is rotating; at the equator, its surface moves at approximately 1,000 miles per hour. This rotation is what causes objects to rise and set at night. You would be able to see the same motion during the day, against the stars the night sky will show six months from now, if you could turn the Sun off like a light bulb.
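The 1,000 mph figure is simple arithmetic: the Earth's equatorial circumference divided by the length of a day:

```python
# Arithmetic behind the ~1,000 mph rotation figure: Earth's equatorial
# circumference divided by the length of a day.

CIRCUMFERENCE_MI = 24901.0  # Earth's equatorial circumference in miles
HOURS_PER_DAY = 24.0

speed_mph = CIRCUMFERENCE_MI / HOURS_PER_DAY
print(round(speed_mph))  # 1038 mph at the equator
```

Away from the equator the speed drops with the cosine of the latitude, which is why "approximately 1,000 miles per hour" is the usual round figure.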

A calendar year is counted as 365 days, but the Earth actually takes about 365.25 days to orbit the Sun. This is why we add a leap day every 4 years, to account for the extra quarter day that is otherwise unaccounted for. Most people are aware of that, but did you know the true figure is actually slightly less than 365.25 days (about 365.2425)? To compensate, the calendar skips the leap day in century years unless the year is divisible by 400: 1900 was not a leap year, but 2000 was.
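
The Gregorian rule described above fits in a few lines of code. A minimal sketch in Python:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian calendar rule: every 4th year leaps, except century
    years, which leap only when divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(2016))  # True: divisible by 4
print(is_leap_year(1900))  # False: century year not divisible by 400
print(is_leap_year(2000))  # True: century year divisible by 400
```

(Python's standard library already provides this as `calendar.isleap`, so in real code you would use that instead.)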

2. The sky is divided into regions marked by constellations; the charts you see these days break it into 88 of them. A constellation is a group of stars that appears to form a pattern, usually named after a mythological figure or the shape it suggests. Many of the classical constellations were recorded by the Greeks over 2,500 years ago, and because the orientation of Earth's axis slowly changes (precession), the stars' positions relative to the calendar have shifted since then. This has caused quite the stir in the astrology world, and most practitioners have had to acknowledge the shift.

One of the most familiar star patterns is the Big Dipper. Strictly speaking it is an asterism, a pattern within the constellation Ursa Major, just as the Little Dipper lies within Ursa Minor. The smallest constellation is Crux, whereas among the largest are Hydra and Ursa Major. Most constellations were named because they supposedly look like something, although more often than not they don't look much like their namesakes.

3. Our solar system has eight planets: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune. Pluto was long counted as the ninth, but after years of back-and-forth among astronomers around the globe over whether it should be classified as a planet, it was reclassified as a dwarf planet in 2006, so it is no longer grouped with the other planets we have come to know.

The largest planet in our solar system is Jupiter, the closest planet to the Sun is Mercury, and Saturn is known for the magnificent rings that accompany it. A planet is defined as a celestial body that orbits a star and is massive enough for its own gravity to pull it into a roughly spherical shape (the full definition also requires it to have cleared its orbital neighborhood). If a body is so massive that thermonuclear fusion ignites in its core, it is a star, not a planet.

4. The exact number of stars is always changing, so an exact count is essentially impossible; it is extremely hard even to estimate. You could say there are trillions of them, and you would still be far too low. Many of the stars you can see without a telescope, just with your eyes at night, were named as far back as ancient times. The traditions and customs used back then to name stars have since changed dramatically; we now follow a very different process for choosing star names.

You will find that a lot of stars have Arabic names. This is because in medieval times the Islamic world had a highly developed interest in astronomy; the Big Dipper alone has 7 stars with Arabic names. Later, Latin became a popular choice as Europeans developed their own strong interest in astronomy. Polaris, otherwise known as the North Star, is one prime example.

5. It is impossible to name or memorize every star outside our solar system, though catalogs at many research centers document hundreds of thousands of them. Within a constellation, stars have traditionally been lettered with the Greek alphabet in order of brightness: the star designated Alpha is usually the brightest, Beta the next brightest, and so on. Alpha Centauri, the brightest star in Centaurus, is a good example (there are exceptions: in Libra, Beta Librae actually outshines Alpha Librae). The only downside to this methodology is that the Greek alphabet has only 24 letters, so if a constellation has more than 24 stars, it is impossible to letter them all in order of diminishing brightness.
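
The Greek-letter scheme can be sketched in a few lines of Python. The star names and magnitudes below are illustrative only, and real historical designations do not always follow brightness strictly:

```python
GREEK_LETTERS = ["Alpha", "Beta", "Gamma", "Delta", "Epsilon", "Zeta"]

def bayer_designations(constellation, stars):
    """stars: (name, apparent magnitude) pairs; lower magnitude = brighter."""
    by_brightness = sorted(stars, key=lambda s: s[1])  # brightest first
    return [f"{letter} {constellation} ({name})"
            for letter, (name, _) in zip(GREEK_LETTERS, by_brightness)]

# Illustrative magnitudes for four Big Dipper stars:
stars = [("Mizar", 2.23), ("Alioth", 1.76), ("Dubhe", 1.79), ("Alkaid", 1.86)]
designations = bayer_designations("Ursae Majoris", stars)
for d in designations:
    print(d)  # Alioth, the brightest, gets "Alpha Ursae Majoris", and so on
```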

Without the help of technology, the naked eye can pick out roughly 6,000 stars across the entire sky. From any one location you can only ever see half the sky at once, so the number visible at any moment falls to about 3,000. Some sources put the totals a little higher, up to 7,000 in the whole sky and 3,500 visible at one time. There is no exact number, but this should give you a good estimate of how many stars you can see.

6. Catalogs have played a large role in our ability to name stars and keep track of those already named. Since the Greek-letter method cannot cover every star, astronomers assign designations and record them in these catalogs. One of the most important was compiled by F.W. Argelander in Germany in the mid-1800s: the Bonner Durchmusterung, produced at the Bonn Observatory, which listed hundreds of thousands of stars.

In the U.S. around 1920, another very important catalog was created to help the field keep track of stars: the Henry Draper Catalogue. Its listings go by HD numbers, which you can loosely think of as library call numbers; the comparison is not exact, but it helps put the system in perspective. Henry Draper was a physician and amateur astronomer; after his death, his widow funded the catalog and named it in his memory.

Many people forget that there are also a few catalogs devoted to non-stellar objects: star clusters, nebulae, galaxies, and so on. The 100 or so brightest can be found in a catalog created by Charles Messier, a French astronomer, in the 18th century; the objects in this catalog are referred to by their M numbers.

7. Astronomy and timekeeping have always gone hand in hand: ever since there has been a need to track time, astronomy has been behind it. Even in ancient times people needed to track the seasons in order to plan religious events and anticipate dramatic weather changes. Today we still rely on this information to plan vacations, religious events, and anything else that depends on knowing the season ahead; we know what weather is likely to come our way depending on the current season.

Those who wanted finer time measurements looked to the position of the Sun in the sky. Just about everyone on Earth sleeps when it is dark and is active when it is light, and the Sun is what tells us which is which. One of the best inventions for tracking this was the sundial. For its time it was remarkably accurate, the rough equivalent of our phones and alarm clocks nowadays.

To get more precise measurements of time, astronomers turned to the meridian. For those who don't know, the meridian is the circle on the celestial sphere that passes through both celestial poles and the zenith. Noon occurs when the Sun crosses the meridian above the horizon; midnight occurs when it crosses on the opposite side, below the horizon.

8. People eventually found out that the Sun is not the most accurate way to keep time, for a couple of reasons. First, the Earth's orbit is an ellipse, not a circle, with the Sun sitting at one of the ellipse's two foci. Because of this, the Earth is closer to the Sun during part of its orbit and farther away during the other part, and by Kepler's second law it speeds up when it is closest to the Sun and slows down when it is farthest away.

The ecliptic is the second reason, as it is inclined by 23.5 degrees relative to the celestial equator. Near the equinoxes, a large portion of the Sun's apparent motion is in a north-south direction rather than east-west, so its easterly progress from day to day is slower than it is at the solstices. In other words, when the Sun is over the equator, its eastward drift is slow; when it stands over the Tropic of Cancer or the Tropic of Capricorn, it moves faster.

This problem was solved by the invention of the mean sun: a fictitious sun that moves along the celestial equator at a perfectly uniform rate, matching the real Sun's average motion. Because it moves at a constant rate, it keeps much more accurate time, letting people track time without accounting for all the variables that make the real Sun run fast or slow; building a timekeeping scheme around all those variables would be impractical.

You may hear the term mean solar day. This is the interval between successive meridian transits of the mean sun, and it is defined to be exactly 24 hours. It is meant to stand in for the ordinary solar day, and our everyday sense of time is tied to it.

9. Mean solar time and apparent solar time can differ by up to about 16 minutes over the course of the seasons. This difference is called the equation of time: apparent solar time minus mean solar time. Astronomers have produced graphs and tables of the equation of time, so if you read a sundial (which shows apparent time), you can apply the correction to recover clock time.
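
For readers who want numbers, a widely used approximation to the equation of time (the coefficients below come from a standard approximation formula good to about a minute, not from this article) can be sketched as:

```python
import math

def equation_of_time_minutes(day_of_year: int) -> float:
    """Apparent solar time minus mean solar time, in minutes
    (a common approximation, accurate to about a minute)."""
    b = 2 * math.pi * (day_of_year - 81) / 365
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

# Around early November (day ~307) a sundial runs furthest ahead of the
# clock; in mid-April (day ~106) the two nearly agree.
print(round(equation_of_time_minutes(307), 1))
print(round(equation_of_time_minutes(106), 1))
```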

Time zones were invented to simplify commerce, transportation, and communication. Clocks within a zone are all set to the mean solar time of a reference meridian running through that zone. The Earth is divided into 24 time zones, four of which span the contiguous United States, producing a 3-hour difference between the west coast and the east coast.
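
The idealized arithmetic behind the 24 zones is simple: 360 degrees divided by 24 gives 15 degrees per zone. A sketch (real zone boundaries follow political borders, and the longitudes below are rounded, so treat this only as an approximation):

```python
def nominal_utc_offset_hours(longitude_deg: float) -> int:
    """Idealized time zone: 15 degrees of longitude per zone,
    centered on multiples of 15 degrees from Greenwich."""
    return round(longitude_deg / 15)

print(nominal_utc_offset_hours(-74))   # New York, roughly 74 W -> UTC-5
print(nominal_utc_offset_hours(-122))  # San Francisco, roughly 122 W -> UTC-8
```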

Sidereal time bases time measurement on the stars rather than the Sun, and astronomers rely on it when measuring time at the telescope. You can use sidereal time to aim a telescope before gazing into the heavens, and just about every observer keeps a sidereal clock for star gazing. If you are looking for a technical definition, sidereal time is the right ascension of an object currently on the meridian. Besides astronomers, navigators have long benefited from sidereal time and have used it in their travels for quite some time. It might not be that useful to you and me, but it has a very distinct and useful purpose.
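
The gap between the two kinds of clock is easy to quantify. A sketch, assuming the commonly quoted rounded value of 23 h 56 m 4 s for the sidereal day:

```python
# One rotation relative to the stars takes about 23 h 56 m 4 s, so a
# sidereal clock gains on an ordinary (mean solar) clock every day:
solar_day_s = 24 * 3600
sidereal_day_s = 23 * 3600 + 56 * 60 + 4

drift_per_day_s = solar_day_s - sidereal_day_s
print(drift_per_day_s)  # 236 seconds, about 3 minutes 56 seconds per day
```

That daily drift adds up to one full extra rotation per year, which is why the night sky shifts with the seasons.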

10. The Earth's rotation throws up a large bulge near the equator: measured through the equator, the Earth's diameter is about 27 miles greater than it is from pole to pole. This is why you will hear the Earth referred to as an oblate spheroid instead of a sphere. Both the Moon and the Sun exert a gravitational pull on this bulge, and that pull slowly changes the direction of the Earth's axis of rotation, a motion called precession.
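
Plugging in commonly quoted rounded diameters shows how modest the bulge is relative to the whole Earth (with these rounded values the difference comes out to 26 rather than 27 miles):

```python
# Commonly quoted rounded diameters, in miles:
equatorial_diameter_mi = 7926
polar_diameter_mi = 7900

difference_mi = equatorial_diameter_mi - polar_diameter_mi
flattening = difference_mi / equatorial_diameter_mi
print(difference_mi)         # 26 miles with these rounded values
print(round(flattening, 4))  # about 0.0033, a very slight squash
```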

It is a lot easier to understand if you compare the Earth to a spinning top. A top has to keep spinning or it falls on its side; the Earth behaves similarly. Gravity tugging on the bulge would tend to pull the axis over, but because the Earth is always spinning, the axis does not topple; instead it slowly traces out a circle.


Astronomy – New World Encyclopedia

Astronomy (Greek: astronomia, from astron, "star," and nomos, "law"; literally, "law of the stars") is the science of celestial phenomena that originate outside Earth's atmosphere. It gives us the context for our existence in an evolving universe of untold numbers of galaxies and complex structures at all scales. It studies the origins, evolution, and physical and chemical properties of celestial objects. In short, astronomy is about finding out what is going on beyond Earth.

Astronomy is one of the oldest sciences, with a scientific methodology existing at the time of Ancient Greece and advanced observation techniques possibly much earlier as seen in the study of archaeoastronomy. In ancient cultures astronomical observations were often connected to religious thought, a remnant of which we find in astrology today.

The earliest observations of the heavens were by naked eye, but even this method allows the celestial objects to be cataloged and assigned to constellations. A knowledge of the constellations has been an important navigational tool since the earliest times. The emergence of astronomy as a science following the scientific method is very important to the development of science in general. It was through astronomy with the development of the heliocentric (sun-centered) view of the solar system that we find the early seeds of conflict between Christian thought and science (see Galileo Galilei).


Astronomy is one of the few sciences where amateurs can still play an active role, especially in the discovery and monitoring of transient phenomena.

In ancient Greece and other early civilizations, astronomy consisted largely of astrometry, measuring positions of stars and planets in the sky. Later, the work of Johannes Kepler and Isaac Newton led to the development of celestial mechanics: mathematically predicting the motions of celestial bodies interacting under gravity, solar system objects in particular. Much of the effort in these two areas, once done largely by hand, is highly automated nowadays, to the extent that they are rarely considered independent disciplines anymore. Motions and positions of objects are now more easily determined, and modern astronomy is more concerned with observing and understanding the actual physical nature of celestial objects.

Since the twentieth century, the field of professional astronomy has split into observational astronomy and theoretical astrophysics. Although most astronomers incorporate elements of both into their research, the different skills involved mean that most professionals tend to specialize in one or the other. Observational astronomy is concerned mostly with acquiring data, which involves building and maintaining instruments and processing the resulting information; this branch is at times referred to as "astrometry" or simply as "astronomy." Theoretical astrophysics is concerned mainly with ascertaining the observational implications of different models, and involves working with computer or analytic models.

The fields of study can also be categorized in other ways: by the region of space under study (for example, galactic astronomy, planetary sciences); by subject, such as star formation or cosmology; or by the method used for obtaining information.

Other disciplines that may be considered part of astronomy:

Main article: Observational astronomy

In astronomy, information is mainly received from the detection and analysis of electromagnetic radiation and photons, but information is also carried by cosmic rays, neutrinos, meteors, and, in the near future, gravitational waves (see LIGO and LISA).

A traditional division of astronomy is given by the region of the electromagnetic spectrum observed:

Optical and radio astronomy can be performed with ground-based observatories, because the atmosphere is transparent at the wavelengths being detected. Infrared light is heavily absorbed by water vapor, so infrared observatories have to be located in high, dry places or in space.

The atmosphere is opaque at the wavelengths used by X-ray astronomy, gamma-ray astronomy, UV astronomy and (except for a few wavelength "windows") far-infrared astronomy, so observations must be carried out mostly from balloons or space observatories. Powerful gamma rays can, however, be detected by the large air showers they produce, and the study of cosmic rays can also be regarded as a branch of astronomy.
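
The atmospheric "windows" described in the two paragraphs above can be summarized in a small lookup table (a simplification for illustration; real atmospheric transmission varies continuously with wavelength):

```python
# Where each wavelength band is typically observed from, per the text above:
OBSERVING_PLATFORM = {
    "radio":       "ground (atmosphere transparent)",
    "optical":     "ground (atmosphere transparent)",
    "infrared":    "high, dry sites or space (water vapor absorbs)",
    "ultraviolet": "balloons or space (atmosphere opaque)",
    "x-ray":       "balloons or space (atmosphere opaque)",
    "gamma-ray":   "space, or ground-based air-shower detectors",
}

for band, platform in OBSERVING_PLATFORM.items():
    print(f"{band:12s} -> {platform}")
```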

In early times, astronomy comprised only the observation and prediction of the motions of naked-eye objects. Aristotle held that the Earth was the center of the Universe and that everything rotated around it in orbits that were perfect circles. His view seemed obviously right to people at the time; they reasoned, for instance, that if the Earth moved, the wind would scatter leaves and birds would only be able to fly in one direction. For a long time Aristotle's authority went unchallenged, but many now think his influence accidentally did more to hinder our knowledge than to help it.

The Rigveda refers to the 27 constellations associated with the motions of the sun and also the 12 zodiacal divisions of the sky. The ancient Greeks made important contributions to astronomy, among them the definition of the magnitude system. The Bible contains a number of statements on the position of the earth in the universe and the nature of the stars and planets, most of which are poetic rather than literal; see Biblical cosmology. In 500 C.E., Aryabhata presented a mathematical system that described the earth as spinning on its axis and considered the motions of the planets with respect to the sun.

Observational astronomy was mostly stagnant in medieval Europe, but flourished in the Iranian world and other parts of the Islamic realm. In the late ninth century, the Persian astronomer al-Farghani wrote extensively on the motion of celestial bodies. His work was translated into Latin in the twelfth century. In the late tenth century, a huge observatory was built near Tehran, Persia (now Iran), by the Persian astronomer al-Khujandi, who observed a series of meridian transits of the Sun, which allowed him to calculate the obliquity of the ecliptic. Also in Persia, Omar Khayyám performed a reformation of the calendar that was more accurate than the Julian calendar and came close to the Gregorian. Abraham Zacuto was responsible in the fifteenth century for adapting astronomical theory to the practical needs of Portuguese caravel expeditions.

During the Renaissance, Copernicus proposed a heliocentric model of the Solar System. His work was defended, expanded upon, and corrected by Galileo Galilei and Johannes Kepler. Galileo added the innovation of using telescopes to enhance his observations. Kepler was the first to devise a system that described correctly the details of the motion of the planets with the Sun at the center. However, Kepler did not succeed in formulating a theory behind the laws he wrote down. It was left to Newton's invention of celestial dynamics and his law of universal gravitation to finally explain the motions of the planets. Newton also developed the reflecting telescope.

Stars were found to be faraway objects. With the advent of spectroscopy, it was proved that they were similar to our own sun but with a wide range of temperatures, masses, and sizes. The existence of our galaxy, the Milky Way, as a separate group of stars was only proven in the twentieth century, along with the existence of "external" galaxies, and soon after, the expansion of the universe, seen in the recession of most galaxies from us. Modern astronomy has also discovered many exotic objects such as quasars, pulsars, blazars and radio galaxies, and has used these observations to develop physical theories which describe some of these objects in terms of equally exotic objects such as black holes and neutron stars. Physical cosmology made huge advances during the twentieth century, with the model of the Big Bang heavily supported by the evidence provided by astronomy and physics, such as the cosmic microwave background radiation, Hubble's Law, and cosmological abundances of elements.


Astronomical Society

Inspiring people to look up and wonder for 125 years

On a chilly February evening in 1889 in San Francisco, astronomers from Lick Observatory and members of the Pacific Coast Amateur Photographic Association, fresh from viewing the New Year's Day total solar eclipse north of the City, met to share pictures and experiences. Edward Holden, Lick's first director, complimented the amateurs on their service to science, and proposed to continue the good fellowship through the founding of a Society "to advance the science of Astronomy, and to diffuse information concerning it." Thus the Astronomical Society of the Pacific was born.

Through more than a century of operation, as human understanding of the universe has advanced, so has the ASP, connecting scientists, educators, amateur astronomers, and the public to share astronomical research, conduct professional development in science education, and provide resources that engage students and adults alike in the adventure of scientific discovery.

As a non-profit membership organization, international in scope, the ASP's mission is to increase the understanding and appreciation of astronomy by engaging our many constituencies to advance science and science literacy. We invite you to explore our site to learn more about us, check out our resources and education section for the researcher, the educator, and the backyard enthusiast, get involved by becoming an ASP member, and consider supporting our work for the benefit of a science-literate world!


Bad Astronomy – Slate Magazine

Do you think global warming is something that will only affect us sometime in the future, decades or centuries from now?

Think again. Our planet heating up is affecting us now, and has been for decades. We're already seeing a lot of serious problems due to it: extreme weather, more devastating hurricanes, wildfires, and sea level rise.

Of all these, the last seems most like science fiction. Seriously, the levels of the ocean are going up? It can't be much, right?

Think again, again. NASA just released results from several satellite observations going back to 1992. Those 23 years of data show that the oceans of the planet have risen substantially in that time: more than 6 centimeters (2 inches) on average, with some places on Earth seeing more than 22 centimeters (9 inches)!

This animation shows where the levels are going, and by how much:

The global sea level rise is driven by two major factors: One is that as water warms, it expands, raising the sea level. The other is that Greenland and Antarctica are melting, dumping 450 billion tons of water into the oceans every year. Every year.
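
As a rough cross-check (my own back-of-envelope figures, not NASA's), spreading 450 billion tons of meltwater over the ocean's surface gives on the order of a millimeter of rise per year from melt alone:

```python
# Assumptions: 1 metric ton of water occupies ~1 cubic meter, and the
# ocean surface area is ~3.6e14 square meters (361 million km^2).
meltwater_m3_per_year = 450e9
ocean_area_m2 = 3.6e14

rise_mm_per_year = meltwater_m3_per_year / ocean_area_m2 * 1000
print(round(rise_mm_per_year, 2))  # on the order of 1.25 mm per year
```

Thermal expansion adds to that, which is how the total climbs past a millimeter per year.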

Photo by NASA Earth Science News Team

So overall sea level is rising, but in some places it's rising faster than others. For example, in the Pacific, heat travels east to west, so the eastern coasts of the Philippines and Japan have seen huge jumps in sea level over the past two decades. Interestingly, sea levels have dropped in some places. Off the northeastern shore of the U.S. you can see a drop, but in that case it's because the Gulf Stream, a major warm ocean current, has shifted north somewhat, so levels have risen in the north but dropped in its wake to the south.

But those drops are highly localized. Globally, levels are on the rise.

Drawing by Skeptical Science

The cause of all this is obvious and very real: global warming. As human activity, primarily the dumping of 40 billion tons of carbon dioxide into the atmosphere every year, causes the Earth's surface temperature to go up, a lot of that energy is absorbed by the oceans, causing them to expand. Some of it is absorbed at the poles, melting ice there.

Sea ice melting at the North Pole is bad enough, but the land ice melting is nothing short of catastrophic. Climatologists have already shown that the melting of the West Antarctic ice sheet may be unstoppable. We may be locked in (that is, inevitably going to suffer from) a full meter of sea level rise, 3 feet. This may take a century or more, but it's coming. And while that may seem like a long time, think of it this way: a meter per century is a centimeter every year, an inch every two and a half years.

Mind you, that's vertical rise. Look at the slope of a beach and you can see that a small vertical rise means the ocean reaches a lot farther horizontally, too. We'll see beaches disappear and coastlines change. More immediately, we'll see storm surges do far more damage, as it takes less rise in the water levels to inundate cities. Remember what the surge from Hurricane Sandy did to NYC? We'll be seeing more and more of that.

This is the new normal. And the scary thing is not so much that the new normal is bad; it's that with more warming, rising sea levels, and changing weather patterns, the new normal will continue to get worse. There may not be a normal any more.

Just as a reminder: With only a single exception, none of the GOP presidential candidates has a reality-based view on global warming (the exception is George Pataki, who has no chance of winning), and those views range from unsupportable by facts to unhinged in the extreme. Even those of them who admit it's real think it's not human-caused, or that we can't do anything about it without hurting the economy (and that is 100 percent ultra-grade fertilizer; it's worse to wait). Even this far out it seems certain the House will go GOP again in 2016, so having a climate-change-denying president will mean at least four more years of inaction bolstered by the smoke and mirrors of the noise machine.

And don't forget that the GOP in the House is still trying to eviscerate NASA's Earth science budget, which goes in large part to monitoring the effects of global warming. Why? Simply put, they deny the reality all around them.

And all that time, the temperatures will rise, the glaciers will melt, the sea levels will rise, and we'll be that much deeper into a catastrophe that is already well under way.


Medical School Requirements – StudentDoc

Written by Studentdoc Editor

Visit our Premed Forum for details and discussion of medical school requirements.

There are no set-in-stone requirements for every medical school. Many medical schools will make exceptions or emphasize different courses and topics in their admissions process. However, there is a basic set of courses and examinations that is commonly accepted as basic medical school requirements that will be considered by nearly every school.

With the development of the new MCAT, planned for release in 2015, there are additional courses that are recommended. These include psychology and social sciences, which will be tested in new sections on the longer MCAT.

Most often, an initial screen of applicants is done by computer to ensure that basic things like courses taken, GPA and MCAT scores meet a desired minimum. After that, it's all about the person and not the numbers. Consider what makes a strong medical school application, and adjust yours accordingly. The medical school admissions process is a mix of science and art. To get an idea of how competitive your MCAT scores and GPA are, try our Medical School Search tool.
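
That kind of automated first pass might look like the following sketch; the cutoff numbers and field names are illustrative assumptions, not any school's actual policy:

```python
def passes_initial_screen(applicant, min_gpa=3.0, min_mcat_total=500):
    """Hypothetical first-pass filter on the numeric minimums."""
    return (applicant["gpa"] >= min_gpa
            and applicant["mcat"] >= min_mcat_total
            and applicant["prereqs_complete"])

pool = [
    {"name": "A", "gpa": 3.7, "mcat": 514, "prereqs_complete": True},
    {"name": "B", "gpa": 2.9, "mcat": 512, "prereqs_complete": True},
    {"name": "C", "gpa": 3.6, "mcat": 508, "prereqs_complete": False},
]
screened = [a["name"] for a in pool if passes_initial_screen(a)]
print(screened)  # only "A" clears every minimum
```

The point of the sketch is that the computer only checks hard minimums; everything after that stage is judged by people.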

The commonly accepted coursework requirements for medical school include a minimum of 1 year of:

If you are planning to do your premedical coursework after you get your undergraduate degree, you can take these courses at nearly any four-year college.

Medical school admissions are competitive, so you need to have a strong GPA. A GPA above 3.5 is preferable. A GPA below 3.5 can sometimes raise a flag, especially if you attended a school famous for grade inflation, like Harvard. While things might have changed a little at Harvard, there is still the impression that everyone gets a minimum 3.3, so the GPA cutoff might be more strictly enforced.

Your MCAT scores are important. They say little about you as a person, but they are given substantial weight by medical schools. The sections of the MCAT are similar to the required coursework: physical sciences (physics and inorganic chemistry), biological sciences (biology and organic chemistry), verbal, and a writing sample.

It has been estimated that 70-80% of all medical school applicants have taken an MCAT test prep course.

You need a college degree, BUT it does not have to be in the sciences. In fact, for some schools a science degree is a negative - Johns Hopkins, for example. You need to show medical schools you are passionate about something: that you're willing to spend four years studying a topic you love, learn it, and be able to build on it. Selecting a college major should not be about getting into medical school; it should be about studying what you love to think about or do.

If you do enjoy science, then research is one way to show you're serious about it. If you're going to do a research project as an undergrad, start early. Freshman year is not too early. That gives you a year or two to learn the ropes, then a year and a half of serious work before you get to present it in your medical school interviews. Choose a respected faculty member doing research that interests you. Work hard. Read. Understand what you are doing and why you are doing it. You should be able to explain and defend your work to an educated scientist who doesn't work in your field.

I'm personally not a big fan of shadowing a physician. It doesn't show much commitment, and suggests you're just interested in getting into medical school. If you're truly not sure you want to get into medicine, then shadow a physician and find out what it's like. Just don't expect a "shadowing experience" to carry a lot of weight on your application.

The impact of volunteer service on your application will depend on the quality of the service and your commitment to it. Is this a one-month, twice-a-week activity organized by someone else, or is it a project you've been involved in for several years and are taking a leadership role in? How does this project affect you, and how have you made a meaningful contribution to it?

Remember, medical schools are looking for people who are willing to take the time and effort to make a serious contribution. That contribution can be in a volunteer program, an academic pursuit, research, or even sports. You just have to show that you are willing and able to work hard enough to accomplish an important goal.

Medical School Requirements - StudentDoc

Medical schools in California (United States)

University of California Los Angeles (David Geffen School of Medicine)
In just over 50 years - within the lifetimes of many of its original architects - the David Geffen School of Medicine at UCLA has joined the ranks of ...
Address: 10833 Le Conte Avenue

Stanford University (School of Medicine)
The School of Medicine is interested in identifying candidates who are committed to serving the needs of all members of society, and whose accomplishm...
Address: 300 Pasteur Drive

Loma Linda University (School of Medicine)
Since opening in 1909, Loma Linda University's School of Medicine has been training skilled medical professionals with a commitment to Christian ...
Address: 11175 Campus Street Coleman Pavilion

University of Southern California (Keck School of Medicine)
Located in Los Angeles, the Keck School of Medicine of the University of Southern California trains tomorrow's leaders in patient care and biomedical ...
Address: 1975 Zonal Avenue

University of California San Francisco (School of Medicine)
Ranked fourth among the nation's medical schools, the UCSF School of Medicine earns its greatest distinction from the outstanding faculty - includ...
Address: 513 Parnassus Avenue

University of California San Diego (School of Medicine)
The UCSD School of Medicine is uniquely positioned to provide a solid foundation for a successful career, whether your focus turns out to be primary c...
Address: 9500 Gilman Drive

Touro University (College of Osteopathic Medicine - California)
TUCOM prepares students to become outstanding osteopathic physicians who uphold the values, philosophy and practice of osteopathic medicine and who ar...
Address: 1310 Johnson Lane, Vallejo, California

Western University of Health Sciences (College of Osteopathic Medicine of the Pacific)
The Mission of the College of Osteopathic Medicine of the Pacific (COMP) is to prepare students to become technically competent, culturally sensitive,...
Address: 309 East Second Street/College Plaza, Pomona, California

University of California Davis (School of Medicine)
UC Davis School of Medicine is one of five University of California medical schools in the State of California. Founded in 1966, the school graduated ...
Address: One Shields Avenue, Med. Sci. 1C, Rm. 104

University of California, Irvine (College of Medicine)
Since its founding in 1965, the University of California, Irvine has been set apart by its spirit of innovation, with strengths in research and educat...
Address: Medical Education Building 802


Liberty House | A Landmark Hospitality venue

The Liberty House Restaurant, located on the waterfront of Jersey City, boasts a perfect, unobstructed view of the breathtaking New York City skyline. The full-service, multi-star restaurant, as well as the two large event spaces, faces the New York skyline for all guests to enjoy. Located adjacent to Liberty Landing Marina, the Liberty House Restaurant offers a menu of seasonal fare, fresh sushi and seafood, incredible steaks, chops and daily specials. The same quality of service is evident in the catering menus and is the reason Liberty House has been voted one of the best wedding venues in NJ year after year.

Outdoor dining is offered spring through fall, making it a perfect weeknight setting for reconnecting with friends and family. From live music in the summer months to cocktails on the patio in front of the fire pits, Liberty House Restaurant has something for everyone. It even offers marshmallows for roasting at the fire pits so you can practice making the perfect s'more. Every month Liberty House Restaurant offers new dinner specials and exciting new dishes to tempt the palate, from the bestselling Bacon and Bourbon night to Liberty House's March Madness of Mac N Cheese month, where a new Mac N Cheese is offered every day. At Liberty House Restaurant there is something exciting happening each month.

Combining the best views of NYC with the best wedding venue in NJ makes for an unforgettable wedding and event experience. Two completely separate event spaces and beautiful gardens face the famous skyline, making the Liberty House a top choice for engaged couples in the tri-state area. We have the top event planners, our memory makers, who have extensive wedding experience and know how to make your event a huge success. Over the past years, couples have voted Liberty House one of the best wedding venues in NJ. Amazing cuisine, perfect NYC views and our memory makers are the perfect ingredients for your next event!
