Medical School 101: What Medical School Is Really Like

Premedical students are, understandably, focused on getting into medical school. They shadow physicians and have an idea of what being a physician is like. However, many don't have a clear picture of what life in medical school is like.

Medical school is a place in which you will grow as a person and as a professional. You will be challenged to study more than you thought possible and to pick yourself up when you fall down. The massive amount of knowledge you need to learn in a short period of time makes medical school one of the most challenging professional schools out there.

I like to think of medical school as a roller coaster. Each medical student who enters is happy, even eager, to study, but as the months drag on, the studying gets old and you say to yourself, "I cannot wait until all this studying is over!"

As a new physician, I have experienced the sleeplessness, the long arduous hours of studying, the multiple stops at Starbucks and more. Here's my overview of the realities of attending medical school.

Two types of medical schools exist: allopathic and osteopathic. Allopathic medical schools confer an M.D. degree and osteopathic medical schools confer a D.O. degree. Both train their students to become fully licensed to practice medicine and prescribe medications. Both types of doctors see patients and become investigators of the body as they try to find out why their patients are sick.

What's the difference? Osteopathic physicians also learn osteopathic manipulative treatment, using their hands to help diagnose and treat different diseases.

The typical medical school curriculum combines lectures and problem-based learning modules. Imagine sitting in class, listening to lectures, taking notes and then taking Scantron or even computerized tests. This is the standard way in which medical school builds and tests your knowledge. Medical school feeds your brain first with basic science and then with clinical knowledge.

The problem-based learning method consists of a group of med students working together to solve a patient case. For example, you are presented with a hypothetical 45-year-old man with a history of heart disease and high cholesterol. He travels from New York to California on a business trip. Upon landing, he experiences excruciating right leg pain. Problem-based learning focuses on exploring this case and diagnosing the patient. A physician-moderator typically sits in to guide the discussion and shape the group dynamic.

Schools may have a traditional or a systems-based curriculum. A systems-based curriculum means that all your classes are divided up by body system. For example, month one may cover the cardiovascular system, month two the gastrointestinal system, month three the reproductive system, and so on.

YEAR 1 Your MS-1 (Medical Student 1) year will be your most difficult year of med school. Year one consists mostly of basic science courses, which means LOTS of memorization. I detail the major classes below, but the first year also includes medical ethics courses, OSCEs in which you learn the physical exam, and more. OSCEs are Objective Structured Clinical Examinations, in which you are presented with various hypothetical patient scenarios: an actor portrays a patient with a certain clinical disease, and you are expected to obtain a thorough medical history and physical examination in the allotted time.

GROSS ANATOMY In year one, you are presented with one of the most challenging medical school classes known to humankind: gross anatomy. For many of you, gross anatomy conjures up images of cadavers and the smell of formaldehyde. Gross anatomy has two components: lecture and lab. Lecture typically lasts an hour, while lab typically runs four to five hours.

Medical schools structure their gross anatomy courses differently: some hold gross anatomy every day, while others opt to hold the course three times a week. The course itself can last anywhere from three months to one year.

Here, you will learn the wonders of the human body, from the cranial nerves, brachial plexus and mediastinum to the femur, the humerus and the orbicularis oculi muscle that closes your eyelid. I'm not gonna lie, gross anatomy is a tough class. You have to keep up with the reading or you will fall behind. Study in groups if you like learning with other people.

HISTOLOGY Histology is the microscopic study of the cells and tissues of the human body. This course, too, consists of a lecture and a lab component. Oftentimes, you will take histology and gross anatomy together, especially if your medical school is systems-based. Lab consists of looking at slides under the microscope. I loved histology but didn't appreciate gross anatomy until I was done with it!

PATHOLOGY Ever watch Dr. G: Medical Examiner? Pathology class in medical school is similar to the pathology seen on that show. You look at slides of, for example, an infarcted heart (one damaged by a heart attack) and learn to recognize the damage by inspection. Pathology, like histology and gross anatomy, consists of lecture and lab.

BIOCHEMISTRY Biochemistry is similar to organic chemistry but better. Don't panic: you don't have to distill any liquids in lab or draw any funny structures, as this class is primarily lecture-based. You may have to memorize the Krebs cycle and glycolysis.

YEAR 2 Year two of medical school is typically clinically based. Here you will learn about disease after disease that you will encounter in the hospital, and the list goes on (and on and on).

This is when medical school turns to real medicine.

YEAR 3 Year three consists of clinical rotations. Here you will become part of the medical team. A medical team typically consists of an attending (senior doctor), residents (doctors-in-training) and interns (first-year residents). As a medical student, you are at the bottom of the totem pole. Some doctors will make that very clear, while others are very nice about it.

You will rotate through the many clinical specialties of medicine, such as internal medicine (adult medicine), pediatrics, ob/gyn and psychiatry. Here, you will get a taste of what kind of doctor you will become.

Your team will grade you on your performance during your rotation. As with any work environment, this can be a bit biased. However, national tests are administered at the end of your rotations. Some medical schools require you to pass this exam to receive a grade for the rotation, and sometimes the percentage score is even factored into your final rotation grade.

YEAR 4 Year four of medical school is much like year three but a bit more specialized. You can delve into the specialties of medicine even more. For example, if you liked internal medicine, you can elect to do a gastroenterology, cardiology or rheumatology rotation. Grading is the same as in year three.

I hope this piece gave you a good overview of the nuts and bolts of medical school. Congratulations on your recent admission, or good luck with your applications, and best wishes for your future plans!

Dr. Lisabetta Divita is a physician, medical writer/editor and premedical student mentor. Her company blog, MedicalInk911, can be found at LisabettaDivita.weebly.com.


Medical Education | HMS

The Program in Medical Education (PME) at Harvard Medical School is the organizational structure housing the educational programs leading to the MD degree. Under the leadership of the Dean for Medical Education and the Associate Dean for Medical Education Planning and Administration, the offices of the PME are responsible for all aspects of the educational plan and for development and review of educational policies.

Five academic units report to the Dean for Medical Education, who also serves as chair of the Curriculum Committee (see Curriculum Governance). These units include Admissions; the Academic Societies; the Academy/Center for Teaching and Learning; the Center for Evaluation; and Student Affairs.

A parallel administrative structure is overseen by a chief administrative officer reporting directly to the DME. The Associate Dean for Medical Education Planning and Administration has responsibility for all administrative functions in the PME, including Financial Administration, Curriculum Programs, Admissions, Financial Aid, the Registrar's Office, and the Academy. Reporting to the Associate Dean is a team of senior administrative staff who work together to facilitate communication throughout the PME, develop administrative policies and procedures, and plan events or programs. Each area of the PME has a representative on a Senior Administrators Group, chaired by the Associate Dean, including the Academy, the Academic Societies, Admissions, the Center for Evaluation, Curriculum Programs, Financial Aid, Financial Administration, the DME's Office, the Registrar's Office, Scholars in Medicine, and Student Affairs.


Medical School Admission

So you're interested in becoming a medical doctor? If you haven't yet applied to med school, then you will (hopefully) find Medical School Admission Dot Com very helpful.

The number of first-year students enrolling at the nation's medical schools in the fall of 2014 grew to a record 20,343. However, so too did the number of applicants, which grew to 49,480. That's a 13% increase in the number of applicants in just the last three years!

As recently as 2002, the number of students applying to and enrolling in American medical schools appeared to be in freefall, having dipped sharply from highs in the mid-1990s amid concerns about a glut of physicians. But with at least some experts now predicting a significant shortfall of doctors in the years ahead, medical schools are expanding their enrollments and students are flooding the institutions with applications to fill the seats, according to an annual look at medical school admissions by the Association of American Medical Colleges (AAMC).

The AAMC called for a 30 percent increase in medical school enrollments by 2015 through expanded enrollments at existing schools and the creation of new ones. That level of growth wasn't quite attained, however, and in May 2012 the AAMC issued a revised forecast of a 29.6% increase in enrollment by the 2016-2017 academic year.

AAMC officials have been gratified not just by the enrollment growth, but also that it has resulted in a medical student body that is both more academically accomplished and more ethnically and racially diverse. Fewer than 45 percent of 2014 applicants to American medical schools were admitted, a figure that has declined steadily. Admitted first-year students in 2014 had an average MCAT score of 31.4 and average college grade point average of 3.69. That translates into a 0.3 MCAT score increase and 0.02 GPA increase in just the past three years. And it is a meaningful increase over the respective averages of 29.7 and 3.60 for the entering class of 2000.

As a result, an already competitive admissions process appears to be becoming even more competitive. But don't fear. We have information on many different aspects of the admission process including both the AMCAS and the MCAT that can help you along the way.

What We Think You Should Know

Med school is not easy to get through, and the profession itself is definitely not for everyone. Think long and hard about becoming a doctor before you commit yourself to this career track. Being a doctor means making huge sacrifices, first in medical school and then later in your internship and residency. Long shifts will be the norm, not just during your training but throughout your career. Even when you're not working directly with patients, you will be spending a significant amount of time as a doctor reading and staying current in new medical techniques and research. The people who excel in medicine are those who are happy spending every waking moment thinking about it, and those are precisely the kind of people medical schools are looking for.

People are only half-joking when they describe a medical career as a "jealous lover" who takes over the practitioner's life. Studies have found that doctors tend to have a very high divorce rate and, ironically, a relatively short life expectancy. That situation seems to be changing somewhat, precisely because the younger doctors coming into the profession now insist on more work-life balance than their elders were willing to accept. Still, medicine is far more than a job. It takes as much dedication to thrive in a medical career as it does to get into medical school and to excel in medical training.

This website will give you assistance in your application process as you apply to med programs, but we want to make sure you are aware of the downsides and have given proper thought and consideration to these points. If you have not, you could end up unhappy with your choice of career path, and you will have taken a med school seat from someone else who has the devotion to the "jealous lover" and the dedication to become a good doctor.

Still Want to Be a Doctor?

Good! Read on and we wish you the best of luck with your medical school applications!


What You Need to Know about Medical School

By Tara Kuther, Ph.D.

Many college students wonder whether they should apply to medical school. If you're considering a career in medicine, start preparing now. Even if you aren't sure medical school is for you but there is a small possibility that you might apply, begin preparing. Med school is very competitive. Start planning and gaining the experiences you need to construct a successful application even if you haven't entirely decided to apply.

Delay preparing until you are one hundred percent certain that you will apply, and you'll hurt your chances of admission, as the best applicants begin early.

Planning for Medical School You do not have to be a premed major to be accepted to medical school. In fact, many universities do not offer a premed major. Instead, you must satisfy some basic academic prerequisites, including lots of science and math courses. Think carefully about whether medical school is for you. Consider the pros and cons of a career in medicine, the cost of med school, and what your years in med school might be like. If you decide to apply to medical school, you must determine what type of medicine is for you: allopathic or osteopathic.

Applying to Medical School If you plan to apply to med school right after graduating college, you must begin the application process towards the end of your junior year. First, you must take the Medical College Admission Test (MCAT). This challenging exam tests your knowledge of science as well as your reasoning and writing abilities. Give yourself time to retake it, if needed. Take time to prepare by reviewing MCAT prep books and taking sample exams.

The MCAT is administered by computer from January through August each year. Register early as seats fill quickly.

Review the American Medical College Application Service (AMCAS) Application. Note the assigned essays regarding your background and experience. You will also submit your transcript and MCAT scores. Another critical part of your application is your set of letters of evaluation. These are written by professors and discuss your competencies as well as your promise for a career in medicine.

If you make it past the initial review, you may be asked to interview. Do not rest easy: most interviewed candidates are not admitted to medical school.

The interview is your chance to become more than a paper application and a set of MCAT scores. Preparation is essential. The interview may take several forms; a new type, the Multiple Mini Interview (MMI), is becoming increasingly popular. Consider the kinds of questions that you are likely to be asked. Plan questions of your own, as you will be judged on your interest and the quality of your questions.

Attending Medical School You will find that attending medical school is not just a full-time job - it is two. As a medical student you will attend lectures and labs. The first year of medical school consists of science courses that pertain to the human body. The second year consists of courses on disease and treatment as well as some clinical work. During the second year, students are also required to take Step 1 of the United States Medical Licensing Examination (USMLE, given by the NBME) to determine whether they are competent to continue. In the third year, students begin their clinical rotations, which continue through the fourth year, working directly with patients.

During the fourth year, students focus on specific subfields and apply for residency. Residencies are assigned through the Match: applicants and programs each rank their preferences without seeing the other side's lists, and the National Resident Matching Program (NRMP) pairs those who match. Residents spend several years in training, varying by specialization. Surgeons, for example, may not complete training until up to a decade after graduating from medical school.
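
For the curious, the Match is at heart a stable-matching problem. The sketch below is only a toy illustration of the deferred-acceptance idea behind it, not the NRMP's actual algorithm (which handles couples, multi-slot programs and other real-world constraints); every applicant, program and preference list here is hypothetical.

```python
# Toy deferred-acceptance matching, in the spirit of the residency Match.
# This is a simplified illustration, not the NRMP's real algorithm.
# All names and preference lists below are hypothetical.

applicant_prefs = {
    "alice": ["mercy", "city", "general"],
    "bob":   ["city", "mercy", "general"],
    "carol": ["city", "general", "mercy"],
}
program_prefs = {           # each program here has a single slot
    "mercy":   ["bob", "alice", "carol"],
    "city":    ["alice", "bob", "carol"],
    "general": ["carol", "bob", "alice"],
}

def deferred_acceptance(applicant_prefs, program_prefs):
    # rank[p][a] = how highly program p ranks applicant a (lower is better)
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    free = list(applicant_prefs)                   # applicants without a tentative match
    next_choice = {a: 0 for a in applicant_prefs}  # index of next program to try
    match = {}                                     # program -> tentatively held applicant

    while free:
        a = free.pop()
        p = applicant_prefs[a][next_choice[a]]     # a's best program not yet tried
        next_choice[a] += 1
        held = match.get(p)
        if held is None:
            match[p] = a                           # program tentatively holds the offer
        elif rank[p][a] < rank[p][held]:
            match[p] = a                           # program trades up; old applicant freed
            free.append(held)
        else:
            free.append(a)                         # rejected; a tries the next program

    return {a: p for p, a in match.items()}

print(deferred_acceptance(applicant_prefs, program_prefs))
# -> {'bob': 'city', 'carol': 'general', 'alice': 'mercy'}
```

The useful property of this style of matching is stability: once it finishes, no applicant and program that prefer each other over their assigned partners are left unmatched.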


List of US Medical Schools – University of Alaska Anchorage

Albany Medical College (Albany, New York)

Albert Einstein College of Medicine of Yeshiva University (Bronx, New York)

Baylor College of Medicine (Houston, Texas)

Boston University School of Medicine (Boston, MA)

Brody School of Medicine at East Carolina University (Greenville, NC)

Case Western Reserve University School of Medicine (Cleveland, Ohio)

Chicago Medical School -- Rosalind Franklin University of Medicine & Science (North Chicago, IL)

Columbia University College of Physicians and Surgeons (New York, NY)

Creighton University School of Medicine (Omaha, NE)

Dartmouth Medical School (Hanover, NH)

David Geffen School of Medicine at UCLA (Los Angeles, CA)

Drexel University College of Medicine (Philadelphia, PA)

Duke University School of Medicine (Durham, NC)

East Tennessee State University - James H. Quillen College of Medicine (Johnson City, TN)

Eastern Virginia Medical School (Norfolk, VA)

Emory University School of Medicine (Atlanta, GA)

Florida International University College of Medicine (Miami, FL)

Florida State University College of Medicine (Tallahassee, FL)

George Washington University School of Medicine and Health Sciences (Washington, DC)

Georgetown University School of Medicine (Washington D.C.)

Harvard Medical School (Boston, Massachusetts)

Howard University College of Medicine (Washington, DC)

Indiana University School of Medicine (Indianapolis, IN)

Joan C. Edwards School of Medicine at Marshall University (Huntington, WV)

Johns Hopkins University School of Medicine (Baltimore, MD)

Keck School of Medicine of the University of Southern California (Los Angeles, CA)

Loma Linda University School of Medicine (Loma Linda, CA)

Louisiana State University HSC - School of Medicine in New Orleans (New Orleans, LA)

Louisiana State University HSC - School of Medicine in Shreveport (Shreveport, LA)

Loyola University Chicago - Stritch School of Medicine (Maywood, IL)

Mayo Medical School -- Mayo Clinic College of Medicine (Rochester, MN)

Medical College of Georgia - School of Medicine (Augusta, GA)

Medical College of Wisconsin (Milwaukee, WI)

Medical University of South Carolina - College of Medicine (Charleston, SC)

Meharry Medical College (Nashville, TN)

Mercer University School of Medicine (Macon, GA)

Michigan State University College of Human Medicine (East Lansing, MI)

Morehouse School of Medicine (Atlanta, GA)

Mount Sinai School of Medicine (New York, NY)

New York Medical College -- School of Medicine (Valhalla, NY)

New York University School of Medicine (New York, NY)

Northeastern Ohio Universities College of Medicine (Rootstown, OH)

Northwestern University - Feinberg School of Medicine (Chicago, IL)

Ohio State University College of Medicine (Columbus, OH)

Oregon Health & Science University School of Medicine (Portland, OR)

Pennsylvania State University College of Medicine (Hershey, PA)

Ponce School of Medicine (Ponce, Puerto Rico)

Rush Medical College of Rush University Medical Center (Chicago, IL)

Saint Louis University School of Medicine (St. Louis, MO)

San Juan Bautista School of Medicine (Caguas, Puerto Rico)

Sanford School of Medicine of the University of South Dakota (Sioux Falls, SD)

School of Medicine at Stony Brook University Medical Center (Stony Brook, NY)

Southern Illinois University School of Medicine (Springfield, IL)

Stanford University School of Medicine (Stanford, CA)

SUNY Downstate Medical Center - College of Medicine (Brooklyn, NY)

SUNY Upstate Medical University - College of Medicine (Syracuse, NY)

Temple University School of Medicine (Philadelphia, PA)

Texas A & M Health Science Center - College of Medicine (College Station, TX)

Texas Tech University HSC - Paul L. Foster School of Medicine (El Paso, TX)

Texas Tech University HSC School of Medicine (Lubbock, TX)

Thomas Jefferson University -- Jefferson Medical College (Philadelphia, PA)

Tufts University School of Medicine (Boston, MA)

Tulane University School of Medicine (New Orleans, LA)

Uniformed Services University of the Health Sciences - F. Edward Hebert School of Medicine (Bethesda, MD)

Universidad Central del Caribe - School of Medicine (Bayamon, Puerto Rico)

University at Buffalo - School of Medicine and Biomedical Sciences (The State University of New York)

University of Alabama School of Medicine (Birmingham, AL)

University of Arizona College of Medicine (Tucson, AZ)

University of Arkansas for Medical Sciences -- College of Medicine (Little Rock, AR)

University of California Davis - School of Medicine (Sacramento, CA)

University of California Irvine - School of Medicine (Irvine, CA)

University of California San Diego - School of Medicine (San Diego, CA)

University of California San Francisco - School of Medicine (San Francisco, CA)

University of Central Florida College of Medicine (Orlando, FL)

University of Chicago - Pritzker School of Medicine (Chicago, IL)

University of Cincinnati College of Medicine (Cincinnati, OH)

University of Colorado School of Medicine (Denver, CO)

University of Connecticut School of Medicine (Farmington, CT)

University of Florida College of Medicine (Gainesville, FL)

University of Hawaii at Manoa - John A. Burns School of Medicine (Honolulu, HI)

University of Illinois College of Medicine (Chicago, IL)

University of Iowa Roy J. and Lucille A. Carver College of Medicine (Iowa City, IA)

University of Kansas School of Medicine (Kansas City, KS)

University of Kentucky College of Medicine (Lexington, KY)

University of Louisville School of Medicine (Louisville, KY)

University of Maryland School of Medicine (Baltimore, MD)

University of Massachusetts School of Medicine (Worcester, MA)

University of Medicine and Dentistry of New Jersey - New Jersey Medical School (Newark, NJ)

University of Medicine and Dentistry of New Jersey - Robert Wood Johnson Medical School (Piscataway, NJ)

University of Miami - Leonard M. Miller School of Medicine (Miami, FL)

University of Michigan Medical School (Ann Arbor, MI)

University of Minnesota Medical School (Minneapolis)

University of Mississippi Medical Center School of Medicine (Jackson, MS)

University of Missouri-Kansas City School of Medicine (Kansas City, MO)

University of Missouri - Columbia School of Medicine (Columbia, MO)

University of Nebraska Medical Center - College of Medicine (Omaha, NE)

University of New Mexico School of Medicine (Albuquerque, NM)

University of Nevada School of Medicine (Reno, NV)

University of North Carolina - Chapel Hill School of Medicine (Chapel Hill, NC)

University of North Dakota - School of Medicine and Health Sciences (Grand Forks, ND)


Medical School Personal Statement & Application Essays

On average, medical schools accept around 8% of applicants; at the top schools, the acceptance rate can be as low as 2%.

The statistics are intimidating. While top business and law schools also boast incredibly low acceptance rates, the overall share of applicants accepted in those fields is between 35% and 50%. The reality for medical school applicants is that anywhere from 60% to 90% of applicants simply won't get in.

For the successful candidates, the road ahead is long. Medical school and residency programs require most students to complete around eight years of classroom and clinical education before practicing. This means that aspiring medical doctors and researchers are an exceptional breed. Medicine is not a field in which you simply stick a toe in the water; this journey is a feet-first leap.

Yet even those students bright and committed enough to tackle medical study can struggle with the written portion of the application process. Those with a science background might feel uncomfortable with their writing skills. Others may simply suffer from the same anxiety that plagues all hopeful students: conveying their thoughts effectively on paper.

Whether applying as a first-year medical student using the centralized American Medical College Application Service (AMCAS), or applying directly to the medical school as a transfer, advanced-standing, or residency applicant, you can rely on our experienced writers for solid assistance. For schools that don't offer an interview, the essay is a vital component of the application process. For those that do interview, the essay is a chance to supplement the in-person meeting, giving students a chance to carefully consider their answers before sharing them. It can help them demonstrate that they have more than good grades, test scores and ambition to offer a medical program.

Particularly for students applying to residency or specialty programs, a medical school personal statement can help admissions staff better understand the applicant's particular goals. For many students, the goal of practicing medicine is fueled by more than a love of science and research. It can be a very moving endeavor, grounded in a passion for helping, fixing and discovering.

Whether a student wants to craft a medical school application essay that is deeply personal or one that focuses more on scientific and clinical experience, our writers are poised to help. We recognize the importance of this field and how crucial the essay can be in helping admissions officers match hopeful students with the ideal medical program.


Top Medical Schools in United States & Medical School Rankings

Top medical schools provide the advanced education required to earn a Doctor of Medicine (M.D.) professional degree and prepare you to take the United States Medical Licensing Examination (USMLE). Many offer specialized training in a particular area of medicine, such as obstetrics or surgery. You can search the medical school rankings here at U.S. News University Directory. Other programs, like master's degrees in healthcare and doctorates in healthcare administration, can also be found within the directory.


#1

Founded in 1636, Harvard University is a private school in Cambridge, Mass. The medical school is located on the university's Boston campus. Research students also work across the city in 17 different research institutions and area hospitals. Within the program, there are basic science departments such as genetics and cell biology, and hospital-based departments like pediatrics and surgery.

#2

Stanford University is a private school that was founded in 1885 in the Bay Area, about 30 miles outside of San Francisco. The Stanford University School of Medicine is in Silicon Valley, Calif. The medical school has research-based study options across most of its clinical departments. There are research centers ranging from broad subjects like prevention to specific areas like Down syndrome.

#3

A private school founded in 1876, Johns Hopkins University is located in Baltimore. There are four main campuses, one that is mostly for undergraduates and three that are home to the graduate school programs. The Johns Hopkins School of Medicine gives students the option to work toward an M.D., a Ph.D. or both. The medical program works with Johns Hopkins Hospital and several research centers.

#3

The University of California San Francisco is spread across seven locations in San Francisco and Fresno, Calif. There are 28 departments for students to choose from, including family and community medicine; microbiology and immunology; and anesthesia and perioperative care. Students can earn a second degree through numerous dual and joint program options.

#5

The University of Pennsylvania was founded by Benjamin Franklin in 1740. Located in Philadelphia, the university is home to the Perelman School of Medicine. The medical school offers master's, M.D., Ph.D. and post-doctoral degree programs. Students can also pursue dual degrees within any of the university's programs. Departments include genetics, physiology and microbiology.

#6

Founded in 1853, Washington University in St. Louis is a private school located in a suburban setting. The medical school offers M.A., M.D. and Ph.D. programs in subjects such as clinical investigation, biostatistics and health science. Students can work with hospitals like St. Louis Children's Hospital and Barnes-Jewish Hospital. Research programs run in the summer or year-round.

#7

Yale University is a private school in New Haven, Conn., founded in 1701. Aside from the college and the graduate school of arts and sciences, Yale has 13 professional schools, including the School of Medicine. Medical students work with Yale-New Haven Hospital and must submit a thesis of original research to receive their diploma. Yale offers programs in areas such as immunobiology and pathology.

#8

Founded in 1754, Columbia University is a private school in Manhattan. The College of Physicians and Surgeons uses a unique curriculum set in three parts: a year and a half of fundamentals; a major clinical year; and a 14-month program of electives, clinical work and other projects. Students work with hospitals such as New York-Presbyterian University Hospital and the Harlem Hospital Center.

#8

A private school, Duke University was founded in 1838 in Durham, N.C. The School of Medicine allows students to earn an M.D. in four years: three for training and one for rotations in the student's area of interest. The final year can also be used to complete a dual master's degree, such as an M.P.H., M.B.A. or Master of Science. Students can also continue on to work toward a Ph.D.

#10

The University of Chicago was founded in 1892 in the Hyde Park area. A private school, the university offers a variety of graduate programs. The Pritzker School of Medicine at University of Chicago encourages research. Before graduating, students are required to complete a project in the area of scientific discovery, medical education, quality and safety, community health or global health.

#10

The University of Michigan-Ann Arbor is a public school that was founded in 1817. The medical school lets students get started right away, seeing patients within a semester of their start date. The school offers dual degree programs for medical students to earn an M.B.A. or a master's degree in public health, information science, public policy, arts, or clinical research.

#10

The University of Washington is a public school located in Seattle and founded in 1861. The School of Medicine offers M.D. and Ph.D. programs as well as a Medical Scientist Training Program that allows students to earn both. The medical school has a partnership with nearby states that lets students do six-week clerkships in Washington, Wyoming, Montana, Alaska and Idaho.

#13

The University of California- Los Angeles (UCLA) is a public school founded in 1919. UCLA offers numerous options for continuing education, including the David Geffen School of Medicine. The medical school has options for combined degrees such as an M.D./Ph.D., an M.D./M.B.A. or an M.D.-Oral Surgery Residency. Medical students work with the Ronald Reagan UCLA Medical Center and other facilities.

#14

New York University is a private school in Manhattan that was founded in 1831. The School of Medicine at New York University is located in the Langone Medical Center, near the East River. Students can study toward an M.D., Ph.D. or both through a dual degree. Students can apply for the School of Medicine Honors Program that includes at least 18 weeks of scientific research as well as a thesis.

#14

Located in Nashville, Vanderbilt University is a private school that was founded in 1873. The School of Medicine at Vanderbilt University offers several dual degree programs, a Medical Scientist Training Program and master's degree programs such as the Master of Science in Clinical Investigation, the Master of Education of the Deaf and the Master of Medical Physics.

#16

A public school founded in 1787, the University of Pittsburgh is located just outside of the downtown area of the city. The School of Medicine has students choose a concentration area like global health or disabilities medicine. Students participate in numerous research opportunities through facilities such as the University of Pittsburgh Medical Center.

#17

A public school founded in 1960, the University of California San Diego is located in the La Jolla area of the city. The UC San Diego School of Medicine uses an integrated curriculum that combines clinical medicine with medical science. Students can take classes or earn a degree in subjects like healthcare leadership through the UCSD College of Integrated Life Sciences.

#18

Cornell University is a private school founded in 1865 in Ithaca, N.Y. Cornell has 14 colleges and schools, each with its own faculty. The Weill Cornell Medical College works with New York-Presbyterian Hospital, Memorial Sloan-Kettering Cancer Center and the Hospital for Special Surgery. Students take once-a-week classes in doctors' offices while enrolled in the Medicine, Patients and Society course.

#19

A private school founded in 1851, Northwestern University is located in Evanston, Ill. The Feinberg School of Medicine at Northwestern University grades first and second year students with a pass/fail system, and third and fourth year students using honors, high pass, pass and fail. Programs include an M.D./M.M. program that awards both a doctor of medicine and master of management degree.

#20

The Icahn School of Medicine at Mount Sinai is located in Manhattan in New York City. The school is home to students ranging from M.D./Ph.D. and master's degree candidates to postdoctoral fellows. Students can choose among many research options, like dentistry research, vascular biology, and mood and anxiety disorders programs.

#21

The Baylor College of Medicine is a private medical school in Houston, Texas. The college is located within the Texas Medical Center. Students have the opportunity to work with eight different teaching hospitals in the area. The medical school has four dual degree programs as well as additional research options at other institutions. Baylor has over 90 research and patient-care facilities.

#22

The University of North Carolina-Chapel Hill is a public school that was founded in 1789. The School of Medicine runs over 30 facilities, such as the Bowles Center for Alcohol Studies and the Center for AIDS Research. Medical students are a part of the student government through the Whitehead Medical Society. Clinical departments at the medical school include subjects like surgery and pediatrics.

#23

A private school, Emory University was founded in 1836. Located near Atlanta, the Emory University School of Medicine assigns students to specific societies with their own mentors and clinician advisers. The curriculum within the medical school is split into phases: foundations of medicine; applications of medical sciences; and discovery and translation of medical sciences.

#24

Located in Cleveland, Ohio, Case Western Reserve University is a private school founded in 1826. The School of Medicine offers three options for becoming an M.D.: the University Program takes four years; the College Program takes five years and includes additional research and clinical opportunities; and the Medical Scientist Training Program takes seven years and awards both a medical degree and a Ph.D.

#25

The University of Texas Southwestern Medical Center at Dallas is a public school that consists of the Medical School, the Graduate School of Biomedical Sciences and the School of Health Professions. The university offers master's and doctoral degrees in subjects such as cancer biology and biomedical communications. Its researchers have won a Nobel Prize for research on cholesterol metabolism.


Gene therapy – PBS

A treatment for cystic fibrosis. A cure for AIDS. The end of cancer. That's what the newspapers promised us in the early 1990s. Gene therapy was the answer to what ailed us. Scientists had at last learned how to insert healthy genes into unhealthy people. And those healthy genes would either replace the bad genes causing diseases like CF, sickle-cell anemia and hemophilia, or stimulate the body's own immune system to rid itself of HIV and some forms of cancer. A decade later, none of these treatments has come to fruition, and research into gene therapy has become politically unpopular, making clinical trials hard to approve and research dollars hard to come by. But some researchers who are taking a different approach to gene therapy could be on the road to more success than ever before.

Early Promise

Almost as soon as Watson and Crick unwound the double helix in the 1950s, researchers began considering the possibility, and the ethics, of gene therapy. The goals were lofty: to fix inherited genetic diseases such as cystic fibrosis and hemophilia forever.

Gene therapists planned to isolate the relevant gene in question, prepare good copies of that gene, then deliver them to patients' cells. The hope was that the treated cells would give rise to new generations of healthy cells for the rest of the patient's life. The concept was elegant, but would require decades of research to locate the genes that cause illnesses.

By 1990, it was working in the lab. By inserting healthy genes into cells from CF patients, scientists were able to transmogrify the sick cells, as if by magic, into healthy cells.

That same year, four-year-old Ashanti DeSilva became the first person in history to receive gene therapy. Dr. W. French Anderson of the National Heart, Lung and Blood Institute and Dr. Michael Blaese and Dr. Kenneth Culver, both of the National Cancer Institute, performed the historic and controversial experiment.

DeSilva suffered from a rare immune disorder known as ADA deficiency that made her vulnerable to even the mildest infections. A single genetic defect, like a typo in a novel, left DeSilva unable to produce an important enzyme. Without that enzyme, DeSilva was likely to die a premature death.

Anderson, Blaese and Culver drew the girl's blood and treated her defective white blood cells with the gene she lacked. The altered cells were then injected back into the girl, where, the scientists hoped, they would produce the enzyme she needed as well as produce future generations of normal cells.

Though the treatment proved safe, its efficacy is still in question. The treated cells did produce the enzyme, but failed to give rise to healthy new cells. DeSilva, who is today relatively healthy, still receives periodic gene therapy to maintain the necessary levels of the enzyme in her blood. She also takes doses of the enzyme itself, in the form of a drug called PEG-ADA, which makes it difficult to tell how well the gene therapy would have worked alone.

"It was a very logical approach," says Dr. Jeffrey Isner, Chief of Vascular Medicine and Cardiovascular Research at St. Elizabeth's Medical Center in Boston as well as Professor of Medicine at Tufts University School of Medicine. "But in most cases the strategy failed, because the vectors we have today are not ready for prime time." - - - - - - - - - - - - 4 pages: | 1 | 2 | 3 | 4 |



What’s Next: Top Trends | Diary of an accidental futurist

Here's my re-write of the chapter on money again. I must stress that this is now going into an edit process, so it will come out looking slightly different. Quite a bit of material got dumped (which really hurts when you like it). Otherwise it was a matter of better signposting: starting with the economy, then moving into money, and constantly teasing out the digital. Sorry about the layout, it's gone rather haywire. (BTW, I've added a few links, which obviously won't appear in any paper versions of this chapter.)

The Economy and Money: Is digital money making us more careless?

"The idea of the future being different from the present is so repugnant to our conventional modes of thought and behaviour that we, most of us, offer a great resistance to acting on it in practice." (John Maynard Keynes, economist)

Networks of inequality

A few years ago I was walking down a street in West London when a white van glided to a halt opposite. Four men stepped out and slowly slid what looked like a giant glass coffin from the rear. Inside it was a large shark.

The sight of a live shark in London was slightly surreal, so I sauntered over to ask what was going on. It transpired that the creature in question was being installed in an underground aquarium in the basement of a house in Notting Hill. This secret subterranean lair should, I suppose, have belonged to Dr Evil. To local residents opposing deep basement developments it probably did. A more likely candidate might have been someone benefiting from the digitally networked nature of global finance. A partner at Goldman Sachs, perhaps. This is the investment bank immortalised by Rolling Stone magazine as "a great vampire squid wrapped around the face of humanity". Or possibly the owner was the trader known as the London Whale, who lost close to six billion dollars in 2012 for his employer, JP Morgan, by electronically betting on a series of highly risky and somewhat shady derivatives known as credit default swaps. London real estate had become a serious place to stash funny money, so maybe the house belonged to a slippery individual dipping their fingers into the bank accounts of a corrupt foreign government or international institution. In the words of William Gibson, the former sci-fi writer, London is "where you go if you successfully rip off your third world nation".

Whichever ruthless predator the house belonged to, something fishy was underfoot. My suspicion was that it had to do with unchecked financial liberalisation, but also with the way the digital revolution was turning the economy into a winner-takes-all online casino. The shift of power away from locally organised labour to globally organised capital has been occurring for a while, but the recent digital revolution has accelerated and accentuated it. Digitalisation hasn't directly enabled globalisation, but it certainly hasn't restrained it either, and one of its negative side effects has been a tendency toward polarisation, both in terms of individual incomes and market monopolies.

Throughout most of modern history, around two-thirds of the money made in developed countries was typically paid as wages. The remaining third was paid as interest, dividends or other forms of rent to the owners of capital. But since 2000, the amount paid to capital has increased substantially while that paid to labour has declined, meaning that real wages have remained flat or fallen for large numbers of people.

The shift toward capital could have an innocent analogue explanation. China, home to an abundant supply of low-cost labour, has pushed wages down globally. This situation could soon reverse as China runs out of people to move to its cities, its pool of labour shrinks due to ageing, and Chinese wages increase. Alternatively, low-cost labour may shift somewhere else, possibly Africa. But another explanation for the weakened position of human labour is that humans are no longer competing against each other, but against a range of largely unseen digital systems. And it is humans that are losing out. A future challenge for governments globally will therefore be the allocation of resources (and perhaps taxes) between people and machines, given that automated systems will take on an increasing number of previously human roles and responsibilities.

Same as it ever was? Ever since the invention of the wheel we've used machines to supplement our natural abilities. This has always displaced certain human skills, and for every increase in productivity and living standards there've been downsides. Fire cooks our food and keeps us warm, but it can burn down our houses and fuel our enemy's weapons. During the first industrial revolution, machines enhanced human muscle, and we went on to outsource further dirty and dangerous jobs to them. More recently we've used machines to supplement our thinking, handing them tedious or repetitive tasks. What's different now is that digital technologies, ranging from advanced robotics and sensor networks to basic forms of artificial intelligence and autonomous systems, are threatening areas where human activity or input was previously thought essential or unassailable. In particular, software and algorithms with near-zero marginal cost are now being used for higher-order cognitive tasks. This is not digital technology being used alongside humans, but as an alternative to them. This is not digital and human. This is digital instead of human. Losing an unskilled job to an expensive machine is one thing, but if highly skilled jobs are lost to cheap software, where does that leave us? What skills do the majority of humans have left to sell if machines and automated systems start to think? You might be feeling pretty smug because you believe that your job is somehow special or terribly difficult to do, but the chances are that you are wrong, especially when you take into account what's happening to the cost and processing power of computers. It's not so much what computers are capable of now, but what they could be capable of in ten or twenty years' time, that you should be worried about.

I remember reading, ten or so years ago, that if you index the cost of robots against the cost of humans with 1990 as the base year (1990 = 100), the cost of robots had fallen from 100 to 18.5, while the cost of people had risen to 151. Der Spiegel, a German magazine, recently reported that the cost of factory automation relative to human labour had fallen by 50 per cent since 1990. Over the shorter term there won't be much to worry about. Even over the longer term there'll still be jobs that idiot-savant software won't be able to do very well, or do at all. But unless we wake up to the fact that we're training people to compete head-on with machine intelligence, there's going to be trouble eventually. This is because we are filling people's heads with knowledge that's applied according to sets of rules, which is exactly what computers do. We should be teaching people to do things that these machines cannot. We should be teaching people to constantly ask questions, find fluid problems, think creatively and act empathetically. We should be teaching high abstract reasoning, lateral thinking and interpersonal skills. If we don't, a robot may one day come along with the same cognitive skills as us but costing just $999. That's not $999 a month, that's $999 in total. Forever. No lunch breaks, holidays, childcare, sick pay or strike action, either. How would you compete with that?
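
As a back-of-the-envelope check, the indexing in that robots-versus-people comparison works like the sketch below: a toy calculation using only the figures quoted above, nothing more.

```python
# Back-of-the-envelope check of the robot/labour cost indices quoted above.
# Both series are indexed to 100 in the base year, 1990.
robot_1990, robot_later = 100.0, 18.5      # robot costs fell ~81.5%
labour_1990, labour_later = 100.0, 151.0   # labour costs rose ~51%

# Cost of robots relative to labour, normalised to 1.0 in 1990.
relative_1990 = robot_1990 / labour_1990     # 1.00
relative_later = robot_later / labour_later  # ~0.12

print(f"Robots now cost {relative_later:.2f}x as much as labour, "
      f"versus {relative_1990:.2f}x in 1990.")
# -> Robots now cost 0.12x as much as labour, versus 1.00x in 1990.
```

In other words, relative to a human worker, a robot became roughly eight times cheaper over the period, which is the force the rest of this section worries about.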

If you think that's far-fetched, Foxconn, the Taiwanese electronics assembler, is designing a factory in Chengdu that's totally automated: no human workers involved whatsoever. I'm fairly sure we'll eventually have factories and machines that can replicate themselves too, including software that writes its own code and 3D printers that can print other 3D printers. Once we've invented machines that are smarter than us, there is no reason to suppose that these machines won't go on to invent their own machines, which we then won't be able to understand, and so on ad infinitum. Let's hope these machines are nice to us. It's funny that our obsessive-compulsive addiction to machines, especially mobile devices, is currently undermining our interpersonal skills and eroding our abstract reasoning and creativity, when these skills are exactly what we'll need to compete against the machines. But whoever said that the future couldn't be deeply ironic? There are more optimistic outcomes, of course. Perhaps the productivity gains created by these new technologies will eventually show up and the resulting wealth will be more fairly shared. Perhaps there'll be huge cost savings in healthcare or education. Technologically induced productivity gains may offset ageing populations and shrinking workforces too. It's highly unlikely that humans will stop having interpersonal and social needs, and even more unlikely that the delivery of all these needs will be within the reach of robots. In the shorter term it's also worth recalling an insightful comment attributed to NASA in 1965: "Man is the lowest-cost, all-purpose computer system which can be mass-produced by unskilled labour." But if the rewards of digitalisation are not equitable, or designers decide that human agency is dispensable or unprofitable, then a bleaker future may emerge, one characterised by polarisation, alienation and discomfort.

Money for nothing

Tim Cook, the CEO of Apple, once responded to demands that Apple raise its return to shareholders by saying that his aim was not to make more profit. His aim was to make better products, from which greater financial returns would flow. This makes perfect sense to anyone except speculators carelessly seeking short-term financial gains at the expense of broader measures of benefit or value. As Jack Welch, the former CEO of General Electric, once said: "Shareholder value is the dumbest idea in the world." It was Plato who pointed out that an appetite for more could be directly linked with bad human behaviour. This led Aristotle to draw a black-and-white distinction between the making of things and the making of money. Both philosophers would no doubt have been disillusioned by high-frequency trading algorithms (algorithms being computer programs that follow certain steps to solve a problem or react to an observed situation). In 2013, algorithms traded $28 million worth of shares in 15 milliseconds after Reuters released manufacturing data milliseconds early. Doubtless money was made here, but for doing what?

Charles Handy, the contemporary philosopher, makes a similar point in his book The Second Curve: when money becomes the point of something, something goes wrong. Money is merely a secure way to hold or transmit value (or "frozen desire", as someone more poetically put it). Money is inherently valueless unless exchanged for something else. But the aim of many digital companies appears to be to make money by selling themselves (or their users) to someone else. Beyond this, their ambition appears to be market disruption by delivering something faster or more conveniently than before. But to what end, ultimately? What is their great purpose? What are they for, beyond saving time and delivering customers to advertisers? In this context, high-frequency trading is certainly clever, but it's socially useless. It doesn't make anything other than money for a small number of individuals. Moreover, while the risks to the owners of the algorithms are almost non-existent, this is not generally the case for society as a whole. Huge profits are privatised, but huge losses tend to be socialised. Connectivity has multiple benefits, but linking things together means that any risks are linked too, with the result that systemic failure is a distinct possibility. So far we've been lucky. Flash crashes such as the one that occurred on 6 May 2010, when high-speed trading algorithms decided to sell vast amounts of stock in seconds and caused momentary panic, have been isolated events. Our blind faith in the power and infallibility of algorithms makes such failures more likely and more severe. As Christopher Steiner, author of Automate This: How Algorithms Came to Rule Our World, writes: "We're already halfway towards a world where algorithms run nearly everything. As their power intensifies, wealth will concentrate towards them." Similarly, Nicholas Carr has written that "miscalculations of risk, exacerbated by high-speed computerised trading programs, played a major role in the near meltdown of the world's financial system in 2008." Digitalisation helped to create the sub-prime mortgage market and expanded it at a reckless rate. But negative network effects meant that the market imploded with astonishing speed, partly because financial networks were able to spread panic as easily as they had been able to transmit debt. Network effects can create communities and markets very quickly, but they can destroy them with velocity and ferocity too. Given that the world's financial markets, which influence our savings and pensions, are increasingly driven by algorithms, this is a major cause for concern. After all, who is analysing the algorithms that are doing all of the analysing?

Out of sight and out of mind Interestingly, it's been shown that individuals spend more money when they use digital or electronic money rather than physical cash. Because digital money is somehow invisible, or out of sight, our spending is less considered and less careful. And when money belongs to someone else, a remote institution rather than a known individual for instance, any recklessness and impulsiveness is amplified. Susan Greenfield, the neuroscientist, has even gone as far as to link the 2008 financial crisis to digitalisation, because digitalisation creates a mindset of disposability. If, as a trader, you have grown up playing rapid-fire computer games in digital environments, you may decide that similar thrills can be achieved via trading screens without any direct real-world consequences. You can become desensitised. Looking at numbers on a screen, it's easy to forget that these numbers represent money and, ultimately, people. Having no contact with either can be consequential. Putting controls on computers can make matters worse, because we tend to take less notice of information when it's delivered on a screen amid a deluge of other digital distractions.

Carelessness can have other consequences too. Large basement developments such as the one I stumbled upon represent more than additional living space. They are symbolic of a gap that's opening up between narcissistic individuals who believe they can do anything they want if they can afford it and others who are attempting to hang on to some semblance of physical community. A wealthy few even take pleasure in seeing how many local residents they can upset, as though it were some kind of glorious computer game. Of course, in the midst of endless downward drilling and horizontal hammering, the many have one thing that the few will never have, which is enough. Across central London, where a large house can easily cost ten million pounds, it is not unusual for basement developments to include underground car parks, gyms, swimming pools and staff quarters, although the latter are technically illegal. It's fine to stick one of nature's most evolved killing creatures 50 feet underground, but local councils draw a line in the sand at Filipino nannies. The argument for downward development is centred on the primacy of the individual in modern society. It's their money (digital or otherwise) and they should be allowed to do whatever they like with it. There isn't even a need to apologise to neighbours about the extended noise, dirt and inconvenience. The argument against such developments concerns everyone else's sanity: neighbourhoods and social cohesion rely on shared interests and some level of civility and cooperation. If people start to build private cinemas with giant digital screens in basements, they aren't frequenting public spaces such as local cinemas, which in turn impacts the vitality of the area. In other words, an absence of reasonable restraint and humility by a handful of self-centred vulgarians limits the choices enjoyed by the broader community. This isn't totally the fault of digitalisation, far from it, but the idea that an individual can and should be left alone to do or say whatever they like is being amplified by digital technology. This is similar, in some respects, to the way in which being seated securely inside a car seems to bring out the worst in some drivers' behaviour toward other road users. Access to technology, especially technology that's personal and mobile, facilitates remoteness, which in turn reduces the need to interact physically or to consider the feelings of other human beings. Remote access, in particular, can destroy human intimacy and connection, although on the plus side such technology can be used to expose or shame individuals who do wrong in the eyes of the broader community.

In ancient Rome a law known as the Lex Sumptuaria restrained public displays of wealth and curbed the purchase of luxury goods. Similar sumptuary laws aimed at superficiality and excess have existed in ancient Greece, China, Japan and Britain. Perhaps it's time to bring these laws back, or at least to levy different rates of tax, or opprobrium, on immodest or socially divisive consumption, or on digital products that damage the cohesiveness of the broader physical community. Income polarisation and inequality aren't new. Emile Zola, the French writer, referred to rivers of money 'corrupting everyone in a fever of speculation' in mid-19th-century Paris. But could it be that a fever of digital activity is similarly corrupting? Could virtualisation and personalisation be fraying the physical bonds that make us human and ultimately hold society together? What's especially worrying here is that studies suggest that wealth beyond a certain level erodes empathy for other human beings. Perhaps the shift from physical to digital interaction and exchange is doing much the same thing. But it's not just the wealthy who are withdrawing physically. Various apps are leading to what some commentators are calling the 'shut-in economy'. This is a spin-off from the on-demand economy, whereby busy people, including those who work from home, are not burdened by household chores. But perhaps hardly ever venturing outside is as damaging as physically shutting others out. As one food delivery service, DoorDash, cryptically puts it: 'Never leave home again.'

Where have all the jobs gone?

I'd like to move on to consider some other aspects of digital exchange, but before I do I'd like to dig a little deeper into the question of whether computers and automated systems are creating or destroying wealth, and what happens to any humans who become irrelevant to the needs of the on-demand digital economy.

The digitally networked nature of markets is making some people rich, but it is also spreading wealth around far more than you might think. Globally, the level of inequality between nations is lessening, and so too is extreme poverty. In 1990, for example, 43 per cent of people in emerging markets lived in extreme poverty, defined as existing on less than $1 per day. By 2010, this figure had shrunk to 21 per cent. Or consider China. In 2000, around 4 per cent of Chinese households were defined as middle class. By 2012, this had increased to two-thirds, and by 2022 it's predicted that almost half (45 per cent) of the Chinese population will be middle class, defined as having an annual household income of between US $9,000 and $16,000. This has more to do with demographics and deregulation than digitalisation, but by accident or design global poverty has been halved in 20 years.

Nevertheless, the gap between the highest- and lowest-earning members of society is growing, and is set to continue growing with the onward march of digital networks. As the novelist Jonathan Franzen says: 'The internet itself is an incredibly elitist concentrator of wealth in the hands of the few while giving the appearance of voice and the appearance of democracy to people who are in fact being exploited by the technologies.' If you have something that the world feels it needs right now, it's now possible to make an awful lot of money very quickly, especially if the need can be transmitted digitally. However, the spoils of regulatory and technological change are largely being accrued by people who are highly educated and internationally minded. If you are neither of these things then you are potentially destined for low-paid, insecure work, although at least you'll have instant access to free music, movie downloads and computer games to pass the time until you die. There's been much discussion about new jobs being invented, including jobs we can't currently comprehend, but most current jobs are fairly routine and repetitive and therefore ripe for automation. Furthermore, it's unrealistic to expect that millions of people can be quickly retrained and reassigned to do jobs that are beyond the reach of robots, virtualisation and automation. Losing a few thousand jobs in car manufacturing to industrial robots, or Amazon wiping out a bookshop, is one thing, but what happens if automation removes vast swathes of employment across the globe? What if half of all jobs were to disappear? Moreover, if machines do most of the work, how will most people acquire enough money to pay for the things that the machines make, thereby keeping the machines in employment? Maybe we should tax robots instead of people.

In theory the internet should be creating jobs. In the US between 1996 and 2005 it looked like it might: productivity increased by around 3 per cent and unemployment fell. But by 2005 (i.e. before the global recession) this started to reverse. Why might this be so? According to McKinsey & Company, a firm of consultants, computers and related electronics, information industries and manufacturing contributed about 50 per cent of US productivity increases since 2000, but reduced US employment by 4,500,000 jobs. Perhaps the productivity gains will simply take time to come. This is a common claim of techno-optimists and of the authors of the book Race Against the Machine. Looking at the first industrial revolution, especially the upheavals brought about by the great inventions of the Victorian era, they could have a point. But it could be that new technology, for all its power, can't compete with simple demographics and sovereign debt. Perhaps, for all its glitz, computing just isn't as transformative as steam power, railroads, electricity, postage stamps, the telegraph or the automobile. Yes, we've got Facebook, Snapchat and Rich Cats of Instagram, but we haven't set foot on the moon since 1969, and traffic in many cities moves no faster today than it did 100 years ago. It is certainly difficult to argue against certain aspects of technological change. Between 1988 and 2003, for example, the effectiveness of computers increased a staggering 43,000,000-fold. Exponentials of this nature must be creating tectonic shifts somewhere, but where exactly?
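As a back-of-the-envelope check on that figure, the short sketch below computes the annual improvement rate implied by a 43,000,000-fold gain over those 15 years. It assumes nothing more than steady compound growth, which is of course a simplification:

    # Implied annual growth rate of computer effectiveness, 1988-2003,
    # assuming steady compound growth (an illustrative simplification).
    total_gain = 43_000_000
    years = 2003 - 1988  # 15 years

    annual_factor = total_gain ** (1 / years)
    print(f"implied improvement: x{annual_factor:.2f} per year")
    # prints roughly x3.23, i.e. effectiveness more than tripling every year

In other words, the claim amounts to computers becoming more than three times as effective every single year for a decade and a half, which is why the author's question about where the resulting tectonic shifts have gone is worth asking.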

Is efficiency a good measure of value?

In its heyday, in 1955, General Motors employed 600,000 people. Today Google, a similarly iconic American company, employs around 50,000. Facebook employs about 6,000. More dramatically, when Facebook bought Instagram for $1 billion in 2012, Instagram had 30,000,000 users but employed just 13 people full time. At the time of writing, WhatsApp had just 55 employees, but a market value exceeding that of the entire Sony Corporation. This forced Robert Reich, the former US Labor Secretary, to describe WhatsApp as 'everything that's wrong with the US economy'. This isn't because the company is bad; it's because it doesn't create jobs. Another example is Amazon: for each million dollars of revenue that Amazon makes, it employs roughly one person. This is undoubtedly efficient, but is it desirable? Is it progress? These are all examples of the dematerialisation of the global economy, where we don't need as many people to produce things, especially when digital products and services have a near-zero marginal cost and customers can be co-opted as free click-workers who don't appear on any balance sheet. A handful of people are making lots of money from this, and when regulatory frameworks are weak or almost non-existent and geography becomes irrelevant, these sums tend to multiply. For multinational firms making money is becoming easier too, not only because markets are growing, but because huge amounts of money can be saved by using information technology to coordinate production and people across geographies.

Technology vs. psychology

If a society can be judged by how it treats those with the least, then things are not looking good. Five minutes' walk from the solitary shark and its winner-takes-all mentality you can find families that haven't worked in three generations. Many of them have given up hope of ever doing so. They are irrelevant to a digital economy or, more specifically, to what Manuel Castells, a professor of sociology at the University of California at Berkeley, calls informational capitalism. Similarly, Japan is not far off a situation where some people will retire without ever having worked and without having moved out of the parental home. In some ways Japan is unique, for instance in its resistance to immigration. But in other ways Japan offers a glimpse of what can happen when a demographic double-whammy of rapid ageing and falling fertility means that workforces shrink, pensions become unaffordable and younger generations don't enjoy the same dreams, disposable incomes or standards of living as their parents. Economic uncertainty and geopolitical volatility, caused partly by a shift from analogue to digital platforms, can mean that careers are delayed, which delays marriage, which feeds through to low birth rates, which lowers GDP, which fuels more economic uncertainty. This is all deeply theoretical, but the results can be hugely human. If people don't enjoy secure employment, housing or relationships, what does this do to their physical and especially their psychological state? I expect that a negative psychological shift could be the next big thing we experience, unless a coherent 'we' emerges to challenge some of the more negative aspects of not only income inequality, but the lack of secure and meaningful work for the less talented, the less skilled and the less fortunate. A few decades ago people worked in a wide range of manufacturing and service industries and collected a secure salary and benefits.
But now, according to Yochai Benkler, a professor at Harvard Law School, the on-demand digital economy is efficiently connecting people selling certain skills with others looking to buy. This sounds good. It sounds entrepreneurial. It sounds efficient and flexible, and it is perhaps an example of labour starting to develop its own capital. But it's also, potentially, an example of mass consumption decoupling from large-scale employment, and of the fact that unrestrained free markets can be savagely uncompromising. Of course, unlike machines, people can vote, and they can revolt too, although I think that passive disaffection and disenfranchisement are more likely. One of the great benefits of the internet has been the ease with which ideas can be transmitted across the globe, but ideas don't always turn into actions. The transmission of too much data, or what might be termed too much truth, is also resulting in what Castells calls 'informed bewilderment'. This may sound mild, but if bewilderment turns into despair and isolation there's a chance this could feed into fundamentalism, especially when the internet is so efficient at hosting communities of anger and transmitting hatred. There is also evidence emerging that enduring physical hardship and mental anguish not only creates premature ageing, which compromises the immune and cardiovascular systems, but that this has a lasting legacy for those people's children. This is partly because poorer individuals are more attuned to injustice, which feeds through to ill health and premature ageing, and partly because many of the subsequent diseases can be passed on genetically. But perhaps it's not relative income levels per se that so offend, but the fact that it's now so easy to see what you haven't got. Social media spreads images of excess abundantly and exuberantly. Sites such as Instagram elicit envy and distribute depression by allowing selfies to scream: look at me! (and my oh-so-perfect life). It's all a lie, of course, but we don't really notice this, because the internet so easily becomes a prison of belief.

A narrowing of focus

In the Victorian era, when wealth was polarised, there was at least a shared moral code, a broad sense of civic duty and collective responsibility. People, you might say, remained human. Nowadays, increasingly, individuals are looking out for themselves, and digitalisation, true to form, is oiling the wheels of efficiency here too. Individualism has created a culture that's becoming increasingly venal, vindictive and avaricious. This isn't just true in the West. In China there is anguished discussion about individual callousness and an emergent culture of compensation. The debate was initiated back in 2011 when a toddler, Yue Yue, was hit by several vehicles in Foshan, a rapidly growing city in Guangdong province, and a video of the event was posted online. Despite the child being clearly hurt, no vehicles stopped and nobody bothered to help until a rubbish collector picked her up. Yue Yue later died in hospital. Another incident, also in China, saw two boys attempting to save two girls from drowning. The boys failed and were each made to pay compensation of around 50,000 yuan (about £5,000) to the parents for not saving them. Such incidents are rare, but they are not unknown, and they do perhaps point toward a world that is becoming more interested in money than mankind: a world that is grasping and litigious, where trust and the principle of moral reciprocity are under threat. You can argue that we are only aware of such events due to digital connectivity, which is probably true, and that both sharing and volunteering are in good health. But you can also argue that the transparency conjured up by connectivity and social media is making people more nervous about sticking their necks out. In a world with no secrets, ubiquitous monitoring and perfect remembering, people have a tendency to conform. Hence we click on petitions online rather than actually doing anything. I was innocently eating my breakfast recently when I noticed that Kellogg's was in partnership with Chime for Change, an organisation committed to raising funds and awareness for girls' education and empowerment through projects promoting education, health and justice. How were Kellogg's supporting this? By asking people to 'share a selfie to show your support'. To me this is an example of internet impatience and faux familiarity. It personifies the way that the internet encourages ephemeral acts of belonging that are actually nothing of the sort. As for philanthropy, there's a lot of it around, but much of it has become, as one museum director rather succinctly put it, 'money laundering for the soul'. Philanthropy is becoming an offshoot of personal branding. It is buildings as giant selfies, rather than the selfless or anonymous love of humanity. One pleasing development that may offset this trend is crowd-funding, whereby individuals fund specific ideas with micro-donations. At the moment this is largely confined to inventions and the odd artistic endeavour, but there's no reason why crowds of people with small donations can't fund political or altruistic ideas, or even interesting individuals with a promising future. I sometimes wonder why we haven't seen a new round of revolutions in the West. Thanks to digital media we all know about the haves and the have-yachts. It's even easy to find out where the yachts are moored thanks to free tracking apps. Then again, we barely know our own neighbours these days, living, as we increasingly do, in digital bubbles where friends and news stories are filtered according to pre-selected criteria.
The result is that we know more and more about the people and things we like, but less and less about anything, or anyone, outside of our existing preferences and prejudices. Putting aside cognitive biases such as inattentional blindness, which means we are often blissfully unaware of what's happening in front of our own eyes, there's also the thought that we've become so focussed on ourselves that focusing anger on a stranger five minutes up the road, or on a distant yacht, is a bit of a stretch. This is especially true if you are addicted to 140-character updates of your daily existence or to looking at photographs of cute cats online.

Mugged by reality

Is anyone out there thinking about how Marx's theory of alienation might be linked to social stratification and an erosion of humanity? I doubt it, but the fall of Communism can be connected with the dominance of individualism and the emergence of self-obsession. Before the fall of the Berlin Wall in 1989 there was an alternative ideology and economic system that acted as a counterweight to the excesses of capitalism, free markets and individualism. Similarly, in many countries, an agile and attentive left took the sting out of any political right hooks. Then in the 1990s there was a dream called the internet. But the internet is fast becoming another ad-riddled venue for capitalism where, according to an early Facebook engineer (quoted by Ashlee Vance in his biography of Elon Musk), 'the best minds of my generation are thinking about how to make people click ads'. The early dream of digital democracy has also soured, because it turns out that a complete democracy of expression attracts voices that are stupid, angry and have a lot of time on their hands. This is Jonathan Franzen again, although he reminds me of another writer, Terry Pratchett, who pointed out that 'real stupidity beats artificial intelligence every time'. To get back to the story in hand, the point here is that if you take away any balancing forces you not only end up with tax-shy billionaires, but with income polarisation and casino banking. You can also end up with systemic financial crashes, another of which will undoubtedly be along shortly, thanks to our stratospheric levels of debt, the globally connected nature of risk and the corruption and villainy endemic in emerging markets. It's possible that connectivity will create calm rather than continued volatility, but I doubt it. More likely a relatively insignificant event, such as a modest rise in US interest rates, will spread panic and emotional contagion, at which point anyone still living in a digital bubble will get mugged by reality. Coming back to some good news, a significant economic trend is the growth of global incomes. This sounds at odds with declining real wages, but I am talking about emerging, not developed, markets. According to Ernst & Young, the accountancy firm, an additional 3 billion individuals are being added to the global middle class. That's 3 billion more smartphone-using, FitBit-wearing, LinkedIn-profiled, Apple iCar-driving Instagram obsessives. In China, living standards have risen by an astonishing 10,000 per cent in a single generation. Per capita GDP in China and India has doubled in 16 and 12 years respectively; in the UK this took 153 years. This is pleasing, although the definition of middle class includes people earning as little as $10 a day.
Many of these people also live behind the Great Firewall of China, so we shouldn't get too carried away with trickle-down economics or the opening up of democracy. What globalisation giveth to jobs, automation may soon taketh away, and many may find themselves sinking toward working-class or neo-feudal status rather than effervescently rising upward. According to Pew Research, the percentage of people in the US who think of themselves as middle class fell from 53 per cent in 2008 to 44 per cent in 2014, with 40 per cent now defining themselves as lower class, compared to 25 per cent in 2008. Teachers, for example, who have studied hard, worked relentlessly and benefit society as a whole, find themselves priced out of real estate and various socio-economic classifications by the relentless rise of financial speculators. Deeper automation and virtualisation could make things worse. Martin Wolf, a Financial Times columnist, comments that intelligent machines could hollow out middle-class jobs and compound inequality. Even if the newfound global wealth isn't temporary, there's plenty of research to suggest that as people grow richer they focus more attention on their own needs at the expense of others. So a wealthier world may turn out to be one that's less caring. Of course, it's not the numbers that matter. What counts are feelings, especially feelings related to the direction of travel. The perception in the West, generally, is that we are mostly moving in the wrong direction. This can be seen in areas such as education and health, and it's not too hard to imagine a future world split into two halves: a thin, rich, well-educated, mobile elite and an overweight, poorly educated, anchored underclass. This is reminiscent of H.G. Wells's surface-dwelling Eloi and downtrodden, subterranean Morlocks in The Time Machine, and of Tolkien's Mines of Moria. The only difference this time might be that it's the global rich who end up living underground, cocooned from the outside world in deep basement developments.

An upside to the downside

It's obviously possible that this outcome will be re-written. It's entirely possible that we will experience a reversal in which honour, spiritual service or courage to country are valued far above commerce, a situation that existed in Britain and elsewhere not that long ago. It's possible that grace, humility, public-spiritedness and contempt for vulgar displays of wealth could become dominant social values. Or perhaps a modest desire to leave as small a footprint as possible could become a key driving force. On the other hand, perhaps a dark dose of gloom and doom is exactly what the world needs. Perhaps the era of cheap money is coming to an end, and an extended period of slow growth will do us all a world of good. A study led by Heejung Park at UCLA found that the trend towards greater materialism and reduced empathy had been partly reversed by the 2008-2010 economic downturn. In comparison with a similar study covering the period 2004-2006, US adolescents were less concerned with owning expensive items, while the importance of having a job that's worthwhile to society rose. Whether this is just cyclical or part of a permanent shift is currently impossible to say. These studies partly link with previous research suggesting that a decline in economic wealth promotes collectivism, and perhaps with the idea that we only truly appreciate things when we are faced with their loss.
There aren't too many upsides to global pandemics, rogue asteroids and financial meltdowns, but the threat of impending death or disaster does focus the long lens of perspective, as Steve Jobs pointed out in his commencement speech at Stanford University.

Digital vs. physical trust

That's enough about the economy. How might the digitalisation of money impact our everyday behaviour in the future? I think it is still too early to make any definitive statements about particular technologies or applications, but I do believe that the extinction of cash is inevitable, because digital transactions are faster and more convenient, especially for companies. Cash can be cumbersome too. It's also because governments and bureaucracies would like to reduce illegal economic activity and collect the largest amount of tax possible, thereby increasing their power. In the US, for instance, it's been estimated that cash costs the American economy $200 billion a year, not just due to tax evasion and theft, but also due to time-wasting. A study by Tufts University says that the average American spends 28 minutes per month travelling to ATMs, to which my reaction is: so what? What are people not doing by spending 28 minutes going to an ATM? Writing sonnets? Inventing a cure for cancer?

But a wholly cashless society, or a global e-currency, won't happen for a long time, partly because physical money, especially banknotes, is so tied up with notions of national identity (just look at the Euro to see how that can go wrong!). Physical money tells a rich story. It symbolises a nation's heritage in a way that digital payments cannot. People in recent years have also tended to trust cash more. The physical presence of cash is deeply reassuring, especially in times of economic turmoil. In the UK in 2012, more than half of all transactions were in cash, and the use of banknotes and coins rose slightly from the previous year. Why? The answer is probably that in 2012 the UK was still belt-tightening and people felt they could control their spending more easily using cash. Or perhaps people didn't trust the banks, or each other. Similarly, in most rich countries, more than 90 per cent of all retail still happens in physical rather than digital stores. We should also be careful not to assume that everyone is like ourselves. The people most likely to use cash are elderly, poor or vulnerable, so it would be a huge banking error, in my view, if everyone stopped accepting physical money. It's also a useful Plan B to have a stash of cash in case the economy melts down or your phone battery dies, leaving you with no way to pay for dinner. This is probably swimming against the tide, though, and I suspect there is huge pent-up demand for mobile and automated payments. Globally, cash is still king (85 per cent of all transactions are still in cash, according to one recent study), but in developed economies this tends not to be the case. In the US about 60 per cent of transactions are now digital, while in the UK non-cash payments have now overtaken physical cash. Money will clearly be made trying to get rid of physical money. According to the UK Payments Council, the use of cash is expected to fall by a third by 2022. Nevertheless, circumstances do change, and I suspect that any uptake of new payment technologies is scenario-dependent. I was on the Greek island of Hydra in 2014 and, much to my surprise, the entire economy had reverted to physical money.
This was slightly annoying, because I had just written a blog post about the death of cash based on my experience of visiting the island two years earlier, when almost everywhere accepted electronic payments. Things had dramatically changed. Again, why? I initially thought the reason was Greeks attempting to avoid tax: cash is anonymous. But it transpired that the real reason was trust. If you are a small business supplying meat to a taverna and you're worried about getting paid, you ask for cash. This is one reason why cash might endure longer than some e-evangelists tell us. Cash is a hugely convenient way to store and exchange value, and it has the distinct advantage of keeping our purchasing private. If we exchange physical cash for digital currency, it becomes easier for companies and governments to spy on what we're doing.

Countless types of cashless transactions

There are many varieties of digital money. We've had credit cards for a very long time; transactions using cards have been digital for ages and contactless for a while. We've grown used to private currencies, virtual currencies, micro-payments, embedded-value cards and contactless (NFC) payments. We've also learnt to trust PayPal and various peer-to-peer lending sites such as Zopa and Prosper, although one suspects that, as with ATMs, we are happier taking money out than putting money in. We're also slowly getting used to the idea of payments using mobile phones. There are even a few e-exhibitionists with currency chips embedded in their own bodies, and while this might take a while to catch on, I can see the value in carrying money in our bodies. A chip inserted in your jaw or arm is a bit extreme, but how about a tiny e-pill loaded with digital cash that, once swallowed, is good for $500 or about a week? There's even digital gold, but to be honest I can't get my head around that at all. The key point here is that all of these methods of transaction are more or less unseen. They are also fast and convenient, which, I would suggest, means that spending will be more impulsive and less considered. We will have regular statements detailing our digital transactions, of course, but these will also be digital, delivered to our screens amid a deluge of other digital distractions, and therefore widely ignored or not properly read.

Really thinking or mindlessly consuming?

What interests me most here is whether or not attitudes and behaviours change in the presence of invisible money. There is surprisingly little research on this subject, but what does exist, along with my own experience, suggests that once we shift from physical to digital money, things do change. With physical money (paper money, metal coins and cheques) we are more likely to buy into the illusion that money has inherent value. We are therefore more vigilant. In many cases, certainly in my own, we are more careful. In short, we think. Physical money feels real, so our purchasing (and debt) is more considered. With digital money (everything from credit and debit cards to PayPal, Apple Pay, iTunes vouchers, loyalty points and so forth) our spending is more impulsive. And, as I said earlier, when money is digital and belongs to someone else, any careless behaviour is amplified.

Quantitative Easing (QE) is perhaps a similar story. If, instead of pressing a key on a computer and sending digital money to a secondary market to buy financial assets, including bonds, we saw fleets of trucks outside central banks being loaded with piles of real money to do the same, I suspect that our reaction would be wholly different. We might even question whether a government monetising (buying) its own debt is a sensible idea, given that the 2008 financial meltdown was caused by the transmission and obfuscation of debt. Of course, pumping money into assets via QE circles back to create inequality. If you own hard assets, such as real estate, then any price increases created by QE can be a good thing, because they increase the value of your assets (often bought with debt, which is reduced via inflation). In contrast, savers holding cash, or anyone without assets, are penalised. It's a bit of a stretch to link QE to the Arab Spring, but some people have, pointing out that food-price inflation was a contributory factor, which can be indirectly linked to QE's effects on commodities. If one were a conspiracy theorist, one might even suggest that QE's real aim was to drive down the value of the dollar, the pound and the Euro at the expense of spiralling hard-currency debt and emerging-economy currencies. I'm getting back into macro-economics, which I don't want to do, but it's worth pointing out that in The Downfall of Money the author Frederick Taylor notes that Germany's hyperinflation not only destroyed the middle class, but democracy itself. As he writes, by the time inflation reached its zenith, 'everyone wanted a dictatorship'. The cause of Germany's hyperinflation was initially Germany failing to keep up with payments due to France after WWI. But it was also caused by too much money chasing too few goods, which has shades of the asset bubbles created by QE. It was depression, not inflation per se, that pushed voters toward Hitler, but this has a familiar ring. Across Europe we are seeing a significant rightward shift, and one of the main reasons why Germany won't boost the EU economy is the lasting trauma caused by inflation ninety years ago. If a lasting legacy of QE, debt, networked risk and a lack of financial restraint by individuals and institutions, all accentuated by digitalisation, is either high inflation or continued depression, things could get nasty, in which case we might all long for the return of cash as a relatively safe and private way to endure the storm.
Crypto-currency accounts

The idea of a global digital economy that's free from dishonest banks, avaricious speculators and regulation-fixated governments is becoming increasingly popular, especially, as you'd expect, online. Currencies around the world are still largely anchored to the idea of geographical boundaries and economies in which physical goods and services are exchanged. But what if someone invented a decentralised digital currency that operated independently of central banks? And what if that currency were to use encryption techniques, not only to ensure security and avoid confiscation or taxation, but to control the production of the currency itself? A crypto-currency like Bitcoin, perhaps?
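For readers wondering what 'using encryption to control production' means in practice, here is a minimal sketch of the proof-of-work idea that underpins Bitcoin-style currencies: new entries can only be added to the ledger by finding a number whose hash meets a difficulty target, so issuance is rationed by computation rather than by a central bank. This is a toy illustration of the principle, not Bitcoin's actual implementation:

    import hashlib

    def mine(block_data: str, difficulty: int = 4) -> int:
        # Find a nonce such that SHA-256(block_data + nonce) begins with
        # `difficulty` zero hex digits: a toy version of proof-of-work.
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                # Finding this nonce costs real computation, which is
                # what makes new currency scarce without a central bank.
                return nonce
            nonce += 1

    print("valid nonce:", mine("alice pays bob 5 units"))

Raising the difficulty by one hex digit makes the search roughly sixteen times more expensive, which is how such systems keep the rate of production steady even as computers get faster.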

In one scenario, Bitcoin could become not only an alternative currency, but a stateless alternative payments infrastructure, competing against the likes of Apple Pay and PayPal and against alternative currencies like airline miles. But there's a more radical possibility. What if a country got into trouble (Greece? Italy? Argentina?) and trust in the national currency collapsed? People might seek alternative ways to make payments or keep their money safe. If enough people flocked to something like Bitcoin, a government might be forced to follow suit, and we'd end up with a crypto-currency being used for exports, with its value tied to a particular economy or set of economies. More radically, how about a currency that rewarded certain kinds of behaviour? We have this already, in a sense, with loyalty cards, but I'm thinking of something more consequential. What if the underlying infrastructure of Bitcoin were used to create a currency that was distributed to people behaving in a virtuous manner? What if, for instance, money could be earned by putting more energy or water into a local network than was taken out? Or how about earning money by abstaining from the development of triple sub-basements, or by visiting an elderly person who lives alone and asking them how they are? We could even pay people who smiled at strangers, using eye-tracking and facial-recognition technology on Apple smart glasses or Google eye-contact lenses. Given what governments would potentially be able to see and do if cash does disappear, such alternative currencies, along with old-fashioned bartering, could prove popular.

At the moment, central banks use interest rates as their main weapon to control or stimulate the economy. But if people hoard cash because interest rates are low, or because they don't trust banks, then the economy is stunted. With a cashless society, though, the government gains another weapon for its arsenal. What if banks not only charged people for holding money (negative interest rates), but governments imposed an additional levy for not spending it? (A toy illustration of how that would eat into an idle balance follows below.) This is making my head spin, so we should move on to explore the brave new world of healthcare and medicine, of which money is an enabler. But before we do, I'd like to take a brief look at pensions and taxation, and then end by considering whether the likes of Mark Zuckerberg might actually be OK, really. If economic conditions are good, I'd imagine that money and payments will continue to migrate toward digital formats. Alternatives to banks will spring up and governments will loosen their tax-take. However, if austerity persists, or returns, then governments will do everything they can to get hold of more of your money, but they will be less inclined to spend it, especially on services. Taxation based upon income and expenditure will continue, but I expect that it will also shift towards assets and wealth and, to a very real extent, individual behaviour. One of the effects of moving toward digital payments and connectivity is transparency. Governments will, in theory, be able to see what you're spending your money on, but also how you're living in a broader sense. Hence stealth taxation. Have you put the wrong type of plastic in the recycling bin again? That's a fine (tax). Kids late for school again? Fine (tax). Burger and large fries again? You get the idea. Governments will seek not only to maximise revenue, but to nudge people toward certain allegedly virtuous behaviours, and people will be forced to pay for the tiniest transgressions.
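Here, as promised, is that toy illustration of a negative deposit rate combined with a hoarding levy. Every figure in it is invented purely for the sake of the example; no such levy actually exists:

    # Toy model: erosion of an unspent digital balance under a negative
    # deposit rate plus a hypothetical government levy on hoarding.
    # All rates below are invented for illustration only.
    balance = 10_000.0
    negative_rate = 0.005   # 0.5% a year charged by the bank for holding money
    hoarding_levy = 0.01    # 1% a year charged by the state for not spending it

    for year in range(1, 11):
        balance *= (1 - negative_rate) * (1 - hoarding_levy)
        print(f"year {year:2d}: balance = {balance:,.2f}")

    # After ten years the saver has lost roughly 14 per cent of the
    # balance without spending a penny: a strong nudge to spend, not save.

Even at these modest-sounding rates, an idle balance shrinks by about a seventh in a decade, which is precisely the kind of quiet, automated pressure that only becomes possible once money is wholly digital.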
Such micro-taxation will, no doubt, spark rage and rebellion, but there'll be a tax for that too. As for pensions, there are several plausible scenarios, but business as usual doesn't appear to be one of them. The system is a pyramid-selling scheme that's largely bust and needs to be reinvented in many countries. One in seven people in the UK has no retirement savings whatsoever, for instance, and the culture of instant digital gratification suggests that trying to get people to save a little for later won't meet with much success. What comes next largely depends on whether the culture of now persists and whether responsibility for the future is shouldered individually or collectively. If the culture of individualism and instant rewards holds firm, we'll end up with a very low safety net, or a situation where people never fully retire. If we are able to delay gratification, we'll end up either with a return to a savings culture or with a state that provides significant support in return for significant contributions. The bottom line here is that pensions sit firmly in the future, and while we like thinking about the future, we don't like paying for it. So what might happen that could change the world for the better and make things slightly more sustainable?

An economy as if people still matter

In 1973, the economist E.F. Schumacher's book Small Is Beautiful warned against the dangers of gigantism. On one level the book was a pessimistic polemic about modernity in general and globalisation in particular. On another it was possibly prescient and predictive. Schumacher foresaw the problem of resource constraints and foreshadowed the issue of human happiness, which he believed could not be sated by material possessions. He also argued for human satisfaction and pleasure to be central to all work, mirroring the thoughts of William Morris and the Arts and Crafts Movement, who argued that since consumer demand is such a central driver of the economy, one way to change the world for the better would be to change what the majority of people want. This links directly back to money and our current voracious appetite for material possessions. On one level Schumacher's book is still an idealistic hippy homily. On another it manages to describe our enduring desire for human scale, human relationships and technology that is appropriate, controllable and, above all, understandable. Physical money encourages the physical interaction of people, whereas digital cash is more hands-off and remote. Digital transactions require energy, and while any desire for green computing won't exactly stop the idea of a cashless society in its tracks, it may yet restrain it. There are already some weak signals around this, of which Schumacher might have approved. Our desire for neo-Victorian computing (steampunk), craft sites like Etsy, the popularity of live music events and literature festivals, and digital detoxing all point to a desire for balance and a world where humans are allowed to focus on what they do best. The partly generational shift toward temporary digital access rather than full physical ownership is also an encouraging development against what might be termed 'stuffocation'. Schumacher also warned against the concentration of economic and political power, which he believed would lead to dehumanisation. Decisions should therefore be made on the basis of human needs rather than the revenue requirements of distantly accountable corporations and governments. In this respect the internet could go either way. It could bring people together and enable a more locally focussed and sustainable way of living, or it could facilitate the growth of autocratic governments and monopolistic transnational corporations. But remember that the dematerialisation of the global economy, the analogue-to-digital switch if you will, is largely unseen and therefore mostly out of mind, so very few people are discussing it at the moment. To some extent digital payments are a technology in search of a problem. Cash is easy to carry, easy to use and doesn't require a power source, except to retrieve it from an ATM. Meanwhile, credit and debit cards are widely accepted worldwide and online, so why do we need additional channels or formats? Maybe we don't. Maybe we don't even need money as much as we think. One of the problems with the digital economy, from an economics standpoint, is that digital companies don't produce many jobs. But maybe this isn't a problem. Once we've achieved shelter and security and managed to feed ourselves, the things that make us happy tend to be invisible to economists. The things that fulfil our deepest human needs tend not to be physical things, but nebulous notions like love, belonging and compassion.
This is reminiscent of Abraham Maslow's hierarchy of needs, but unfortunately self-esteem, altruism, purpose and spirituality don't directly contribute to GDP or mass employment. Perhaps they should. It pains me to say it, but maybe the digital dreamers are onto something after all. Maybe the digital economy will change our frame of reference and focus our attention on non-monetary value and human exchange, even if this goes a little crazy at times. What would Schumacher make of our current economic situation? Maybe he'd see the present day as the start of something nasty. Maybe he'd see it as the start of something beautiful. What I suspect he would point out is that many people feel they have lost control of their lives, especially financially. Job insecurity, austerity, debt and a lack of secure savings and pensions make people anxious. This can have physical effects. According to a study published in the medical journal The Lancet, economic conditions can make people, and their descendants, sick. Putting to one side the increased risk of suicide, mental health is a major casualty of volatile economic and geopolitical conditions. Psychological stress floods our bodies with stress hormones, and these can make us ill. Turbulent economic conditions can also make long-lasting changes to our genes, which can be a catalyst for heart disease, cancer and depression in later generations. Another study, co-led by George Slavich at the University of California at Los Angeles, says that there is historical evidence for such claims and cites the fact that generations born during recessions tend to have unusually short lifespans. Research by Jenny Tung at Duke University in North Carolina also suggests that the lower the social rank animals perceive themselves to have, the more active their pro-inflammatory genes become. This may be applicable to humans who perceive that they are becoming digital serfs. Even the anticipation of bad news or negative events may trigger such changes, which might explain why I recently heard that the shark in Notting Hill is now on medication.


What Is An Ecosystem?

Plants and animals depend upon one another in this African savannah ecosystem.

By Jenn Savedge

You have probably heard a lot about ecosystems and how important they are to the health of our planet. But what exactly is an ecosystem anyhow?

An ecosystem, or ecological system, is the interaction between the living organisms in an environment, including plants, animals, fish, birds, microorganisms, and people, and their relationship with the non-living components of that environment, such as water, soil, air, climate, and weather.

In other words - an ecosystem is the complex way that living things interact with and depend upon one another and their environment.

Each component of an ecosystem has a vital role to play in ensuring that all of the components stay healthy. If a seemingly unimportant bug species starts to decline in population, there will be less food for the spiders and birds in that environment. That in turn translates into less food for the animals that eat those spiders and bugs. And that means fewer animals returning nutrients to the soil via their waste and decay. So the plants will begin to die, as will any animals that feed on those plants.

It's a complex, complicated cycle in which balance is maintained only when all of the components are thriving in healthy populations. Too much of one organism will lead to a decline in another. Ecologists refer to healthy ecosystems as sustainable, meaning that they will remain in balance unless disturbed.

Natural incidents such as floods, fires, earthquakes, hurricanes, storms, and even volcanic eruptions can significantly damage the health of an ecosystem.

As can human-caused factors such as climate change, pollution, habitat destruction, overharvesting, and the introduction of invasive species.

Ecosystems vary in size, from the complex interactions between microorganisms and a decaying leaf in the forest to the African Serengeti, in which hundreds of plants and animals coexist with and rely upon one another to survive. An ecosystem can be as small as a puddle or as large as the Atlantic Ocean.

Ecologists define very small ecosystems as 'micro' ecosystems. This could be the size of a puddle, a rock, or a leaf in the forest. These microsystems might seem too small to be significant.

But a single puddle could contain hundreds of microorganisms that depend upon one another to thrive. That's an ecosystem!

Larger ecosystem units are sometimes referred to as biomes. Classic examples of biomes include forests, wetlands, oceans, rivers, savannah, deserts, tundra, and grasslands.

Basically, if an ecosystem is primarily in a water body such as an ocean or lake, it is considered an aquatic ecosystem. If it is on land, it is called a terrestrial ecosystem.

Here is a quick breakdown of ecosystem classification:

Aquatic Ecosystems

Terrestrial Ecosystems

You may also have heard the term 'ecotone' and wondered what it means. An ecotone is the transition zone between two ecosystems. Ecosystems don't just stop and start abruptly. Even in a puddle ecosystem, there may be an area surrounding the puddle in which the ground holds more moisture than it does in the surrounding area. This moist ground may be home to different microorganisms than those that live further away. This area is not quite a puddle, and not quite the surrounding forest. It is a blend between the two ecosystems. It is an ecotone.


List of islands of California – Wikipedia, the free encyclopedia

This list of islands of California is organized into sections, generally arranged from north to south. The islands within each section are listed in alphabetical order.

All three islands in Humboldt Bay are located in the narrow midsection of the bay. This portion of the bay is located within the City of Eureka, California, which lies entirely within Humboldt County.

The Farallon Islands are a group of rugged small islands over 20 miles (32 km) offshore from the mainland of the City and County of San Francisco, within which they formally lie. They consist of over twenty small islets divided into north, south and middle sections, as well as a major bank, Fanny Shoal. The surrounding waters were once used as a disposal site for radioactive waste.[8]

The Sacramento-San Joaquin River Delta is an inverted delta at the confluence of the Sacramento and San Joaquin rivers. There are about 57 named islands in the Delta.


Channel Islands of California – Wikipedia, the free encyclopedia

The Channel Islands of California are a chain of eight islands located in the Pacific Ocean off the coast of Southern California, along the Santa Barbara Channel, in the United States of America. Five of the islands are part of Channel Islands National Park, and the waters surrounding these islands make up Channel Islands National Marine Sanctuary. The islands were first colonized by the Chumash and Tongva Native Americans 13,000 years ago; they were later displaced by European settlers, who used the islands for fishing and agriculture. The U.S. military uses the islands as training grounds, weapons test sites, and a strategic defensive location. The Channel Islands and the surrounding waters house a diverse ecosystem with many endemic species and subspecies.

The eight islands are split among the jurisdictions of three separate California counties: Santa Barbara County (four), Ventura County (two), and Los Angeles County (two). The islands are divided into two groups: the Northern Channel Islands and the Southern Channel Islands. The four Northern Islands once formed a single landmass known as Santa Rosae.

The archipelago extends for 160 miles (257 km) between San Miguel Island in the north and San Clemente Island in the south. Together, the islands' land area totals 221,331 acres (89,569 ha), or about 346 square miles (900 km²).

Five of the islands (San Miguel, Santa Rosa, Santa Cruz, Anacapa, and Santa Barbara) were made into the Channel Islands National Park in 1980. The Channel Islands National Marine Sanctuary encompasses the waters six nautical miles (11 kilometers) off Anacapa, Santa Cruz, San Miguel, Santa Rosa, and Santa Barbara islands.

Santa Catalina Island is the only one of the eight islands with a significant permanent civilian settlement: the resort city of Avalon, California, and the unincorporated town of Two Harbors.

Natural seepage of oil occurs at several places in the Santa Barbara Channel.[1] Tar balls or pieces of tar are found in small numbers in the kelp and on the beaches. Native Americans used the naturally occurring tar, or bitumen, for a variety of purposes, including roofing, waterproofing, paving and certain ceremonies.[2]

The Channel Islands at low elevations are virtually frost-free and constitute one of the few such areas in the 48 contiguous US states. It snows only rarely, on higher mountain peaks.

The eight Channel Islands of California, off the west coast of North America

Separated from the California mainland throughout recent geological history, the Channel Islands provide the earliest evidence for human seafaring in the Americas. They are also the site of the earliest paleontological evidence of humans in North America.[3] The Northern Channel Islands are now known to have been settled by maritime Paleo-Indian peoples at least 13,000 years ago. Archaeological sites on the islands provide a unique and invaluable record of human interaction with Channel Island marine and terrestrial ecosystems from the late Pleistocene to historic times. Historically, the northern islands were occupied by the island Chumash, while the southern islands were occupied by the Tongva. Scott O'Dell wrote a book, Island of the Blue Dolphins, about the indigenous people living on the islands. Aleut hunters visited the islands to hunt otters in the early 1800s and purportedly clashed with the native Chumash, killing many over trading disputes. Aleut interactions with the natives are also detailed in O'Dell's book.[4]

The Chumash and Tongva were removed from the islands in the early 19th century and taken to Spanish missions and pueblos on the adjacent mainland. For a century, the Channel Islands were used primarily for ranching and fishing activities, which had significant impacts on island ecosystems, including the local extinction of sea otters, bald eagles, and other species. With most of the Channel Islands now managed by federal agencies or conservation groups, the restoration of the island ecosystems has made significant progress. Several of the islands were used by whalers in the 1930s to hunt for sperm whales.[5]

In 1972, the Brown Berets seized and claimed the islands for Mexico, citing the Treaty of Guadalupe Hidalgo, the treaty between Mexico and the USA by which Mexico lost more than half of its territory, and arguing that the treaty does not specifically mention the Channel Islands or the Farallon Islands. Though the United States had occupied the islands since 1852, the group speculated that Mexico could claim them and seek their return through litigation before the International Court of Justice. However, a detailed analysis of the situation puts in doubt the likelihood of Mexico winning such a case.[6] The Channel Islands National Park's mainland visitor center received 342,000 visitors in 2014. The islands themselves attract around 70,000 tourists a year, most during the summer.[7] Visitors can travel to the islands via public boat or airplane transportation. Camping grounds are available on Anacapa, Santa Rosa, Santa Cruz, San Miguel, and Santa Barbara Islands in the Channel Islands National Park. Attractions include whale watching, hiking, snorkeling, kayaking and camping.[8]

The United States Navy controls San Nicolas Island and San Clemente Island, and has installations elsewhere in the chain. During World War II all of Southern California's Channel Islands were put under military control, including the civilian-populated Santa Catalina, where tourism was halted and established residents needed permits to travel to and from the mainland.[9] San Miguel Island was used as a bombing range[10] and Santa Barbara Island as an early-warning outpost under the presumed threat of a Japanese attack on California.[11] San Clemente Island was used to train the Navy's first amphibious force in preparation for Pacific combat against the Japanese in World War II.[12] San Nicolas Island has been used since 1957 as a launch pad for research rockets, and was one of eight possible locations considered as the site of the Trinity nuclear test.[13] Santa Rosa Island was used in 1952 as a base for the USAF 669th AC&W Squadron, which operated two Distant Early Warning FPS-10 radars from the hilltops there. In 1955 another FPS-3 search radar was added, and in 1956 a GPS-3 search radar was installed. A new MPS-14 long-range height-finder radar was installed in 1958. The base was shut down in March 1963, when the 669th was moved to Vandenberg AFB in Lompoc, California. The islands still house US Navy SEAL training facilities, and the Navy continues to use the Naval Auxiliary Landing Field located on San Clemente Island.[12]

The Channel Islands are part of one of the richest marine ecosystems of the world. Many unique species of plants and animals are endemic to the Channel Islands, including fauna such as the Channel Islands spotted skunk, ashy storm-petrel, Santa Cruz sheep, and flora including a unique subspecies of Torrey pine.

Flora on the Channel Islands include unique subspecies of pine and oak, as well as the island tree mallow. Santa Rosa Island holds two groves of the Torrey pine subspecies Pinus torreyana var. insularis, which is endemic to the island. Torrey pines are the United States' rarest pine species.[14] The islands also house many rare and endangered species of plants, including the island barberry, the island rushrose, and the Santa Cruz Island lace pod. Giant kelp forests surround the islands and act as a source of nutrition and protection for other animals.[15]

Invasive species, such as the Australian blue gum tree, olive tree, sweet fennel and Harding grass, threaten native species through competition for light, nutrients, and water. The Australian blue gum, for example, releases toxins in its leaf litter which prevent other species of plants from growing in the soil surrounding it. The blue gum, as well as other species including the Harding grass, is also much more flammable and better adapted to wildfires than native species.[16]

The Channel Islands and their surrounding waters hold many endemic animal species, including the Channel Islands spotted skunk, island scrub jay, ashy storm-petrel, Santa Cruz sheep, San Clemente loggerhead shrike, and San Clemente sage sparrow. Many species of large marine mammals, including Pacific gray whales, blue whales, and California sea lions, breed or feed close to the Channel Islands. Seabirds, including western gulls, bald eagles, pigeon guillemots, and Scripps's murrelets, use the islands for shelter and breeding grounds. The endemic island fox is California's smallest natural canine and has rebounded from near extinction in the late 1990s. Several endemic reptile and amphibian species, including the island fence lizard, island night lizard, and Channel Islands slender salamander, live on the islands.[17]

Conservation efforts are being made to maintain the islands' endemic species. Feral livestock, including pigs, goats, and sheep, pose a threat to many of the species, including the San Clemente loggerhead shrike and the Channel Islands spotted skunk. The National Park Service eradicated the feral pigs on Santa Rosa and Santa Cruz islands during the 1990s, and on Santa Catalina Island in 2007.[18][4] Introduced pathogens have devastated island species, which are vulnerable due to their isolation from the mainland. In 1998, an outbreak of canine distemper swept through Santa Catalina Island, severely reducing the island's skunk and fox populations. Rabies and distemper vaccination programs were initiated to protect the island's wildlife. Canine distemper is thought to have been brought to the islands by a stowaway raccoon or a domestic dog.[19]

In the 1950s, bald eagles and peregrine falcons on the Channel Islands became locally extinct following the widespread use of pesticides such as DDT.[20] The birds ingested contaminated fish and seabirds, which poisoned the adults and weakened their eggshells. Golden eagles, natural competitors of other birds of prey, do not feed primarily on these animals and so were able to colonize the islands in the early 1990s. In the early 2000s, the golden eagles were live-trapped and relocated.[21] Breeding pairs of bald eagles were reintroduced to the northern islands in 2002 and 2006,[22] and later in 2006 the reintroduced adults hatched chicks on the islands for the first time since the local extinction. The Channel Islands National Park established a bald eagle webcam on its website in 2007.[4]

Coordinates: 34°00′58″N 119°48′14″W (34.01611, −119.80389)

Here is the original post:

Channel Islands of California - Wikipedia, the free encyclopedia

Islands Restaurant – Rancho Park – Los Angeles, CA – Yelp

This was one of the first places I went with my roommates during my third year of college. We wanted something that wasn't necessarily Asian (here's looking at you, Sawtelle), so we decided on the safest option ever: BURGERS! I mean, you just can't go wrong. My friend and local LA native suggested Islands because it would be uncontroversial and good, so we all went with it.

We hopped into his jalopy (sorry, I just wanted to use that word; it's really just a Saturn) and took the short drive to Islands. We parked in a nearby neighborhood and walked over, and I snapped a picture of the sign on the way in. The atmosphere of the place is pretty cool: lots of surfing stuff and other island-related paraphernalia. A bar sits in the middle and is a good place to watch sporting events, such as UCLA improbably making March Madness and angering sports fans across the nation by somehow winning its way to the Sweet 16. I didn't have them advancing in my bracket either, and I'm from UCLA... Well then...

Anyways, our server was quite attentive and helpful. I eventually decided on the Kilauea, an insanely pepper-crusted burger with pepper jack cheese, chipotle aioli, lettuce, tomato, and island reds. I'm famous for not liking tomato, so I took mine out--not that you needed to know that, but I pride myself on accurate reviews... or boring ones.

The burger was actually really good, but if I had to nitpick, it was way too peppery. After a few bites, the pepper completely overwhelmed the patty to the point of overshadowing the entire meal. It was still dang good, but I had to un-crust some of the pepper to finish it. In any case, I would simply order a different burger next time, since the amount of pepper (and I really, really love pepper) was a bit much.

Aside from that, the bottomless fries were pretty awesome and--yes--uncontroversial. Overall, a really safe place to eat without any major gripes, and that's exactly what you should expect.

Originally posted here:

Islands Restaurant - Rancho Park - Los Angeles, CA - Yelp

Collection Online | Browse By Movement | Futurism …

In a stylistic idiom that integrated some of the techniques of Cubism and Divisionism, the Futurists glorified the energy and speed of modern life together with the dynamism and violence of the new technological society. In their manifestos, art, poetry, and theatrical events, they celebrated automobiles, airplanes, machine guns, and other phenomena that they associated with modernity; they denounced moralism and feminism, as well as museums and libraries, which they considered static institutions of an obsolete culture. The Futurists sought to represent the experience of the modern metropolis, namely the overstimulation of the individual's sensorium, by portraying multiple phases of motion simultaneously and by showing the interpenetration of objects and their environment through the superimposition of different chromatic planes. Artists and poets affiliated with Futurism include Giacomo Balla, Umberto Boccioni, Carlo Carrà, Filippo Tommaso Marinetti (the movement's founder), Luigi Russolo, and Gino Severini. Balla led a second generation of Italian Futurists, including Fortunato Depero, Gerardo Dottori, and Enrico Prampolini, in the 1920s and 1930s.

Almost concomitantly with Italian Futurism, a Russian version of Futurism developed under the leadership of Kazimir Malevich, who described most of his work from 1912 to 1915 as Cubo-Futurist. This Cubist fragmentation of space allied to the Futurist simultaneity of shifting forms was also taken up briefly by Liubov Popova and other Russian artists. Futurism, however, was more prevalent among Russia's poets than its painters.

Here is the original post:

Collection Online | Browse By Movement | Futurism ...

Astrophysics – Postgraduate taught degree programmes …

We ask that you apply online for a postgraduate taught degree. Our system allows you to fill out the standard application form online and submit it to the University within 42 days of starting your application.

You need to read the guide to applying online before starting your application. It will ensure you are ready to proceed, as well as answer many common questions about the process.

Do I have to apply online for a postgraduate taught degree?

Yes. To apply for a postgraduate taught degree you must apply online; we cannot accept applications by any other means.

Do I need to complete and submit the application in a single session?

No. You have 42 days to submit your application once you begin the process. You may save and return to your application as many times as you wish to update information, complete sections or upload additional documents such as your final transcript or your language test.
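As a quick illustration of how that 42-day window works, here is a minimal Python sketch; the function name and the example date are my own for illustration and are not part of the University's application system:

    from datetime import date, timedelta

    # Hypothetical helper, not part of the University's system:
    # the submission deadline is simply 42 days after the
    # application is started.
    def application_deadline(start: date, window_days: int = 42) -> date:
        return start + timedelta(days=window_days)

    # An application started on 1 March 2016 must be
    # submitted by 12 April 2016.
    print(application_deadline(date(2016, 3, 1)))  # 2016-04-12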

What documents do I need to provide to make an application?

As well as completing your online application fully, it is essential that you submit the following documents:

If you do not have all of these documents at the time of submitting your application then it is still possible to make an application and provide any further documents at a later date, as long as you include a full current transcript (and an English translation if required) with your application. See the Your References, Transcripts and English Qualification sections of our Frequently Asked Questions for more information.

Do my supporting documents need to be submitted online?

Yes, where possible, please upload the supporting documents with your application.

How do I provide my references?

You must either upload the required references to your online application or ask your referees to send them to the University, as we do not contact referees directly. There are two main ways to provide references: you can upload references on headed paper when making your application using the Online Application (or through Applicant Self-Service after you have submitted your application), or you can ask your referee to email the reference directly to pgadmissions@glasgow.ac.uk. See the 'Your References, Transcripts and English Qualifications' section of the Frequently Asked Questions for more information.

What if I am unable to submit all of my supporting documents online?

If you cannot upload an electronic copy of a document and need to send it in by post, please attach a cover sheet to it that includes your name, the programme you are applying for, and your application reference number.

You may send them to:

Recruitment & International Office
71 Southpark Avenue
Glasgow G12 8QQ
Fax: +44 141 330 4045

Can I email my supporting documents?

No. We cannot accept email submissions of your supporting documents.

What entry requirements should I have met before applying? Where can I find them?

You should check that you have met (or are likely to have met prior to the start of the programme) the individual entry requirements for the degree programme you are applying for. This information can be found on the entry requirements tab on each individual programme page, such as the one you are viewing now.

What English Language requirements should I have met before applying? Where can I find them?

If you are an international student, you should also check that you have met the English Language requirements specific to the programme you are applying for. These can also be found on the entry requirements tab for each specific programme.

Further Information

Please see the Frequently Asked Questions for more information on applying to a postgraduate taught programme.

These notes are intended to help you complete the online application form accurately; they are also available within the help section of the online application form. If you experience any difficulties accessing the online application, you should visit the Application Troubleshooting/FAQs page.

Classes start in September 2016, and you may be expected to attend induction sessions the week before.

Read more:

Astrophysics - Postgraduate taught degree programmes ...

Artificial intelligence: Should we be as terrified as Elon …

Elon Musk (left) and Bill Gates (right) have both raised concerns about artificial intelligence. Images: CNET

Elon Musk and Bill Gates have been as fearless as any entrepreneurs and innovators of the past half century. They have eaten big risks for breakfast and burped out billions of dollars afterward.

But today, both are terrified of the same thing: Artificial intelligence.

In a February 2015 Reddit AMA, Gates said, "First the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that though the intelligence is strong enough to be a concern ... and [I] don't understand why some people are not concerned."

In a September 2015 CNN interview, Musk went even further. He said, "AI is much more advanced than people realize. It would be fairly obvious if you saw a robot walking around talking and behaving like a person... What's not obvious is a huge server bank in a vault somewhere with an intelligence that's potentially vastly greater than what a human mind can do. And its eyes and ears will be everywhere, every camera, every device that's network accessible... Humanity's position on this planet depends on its intelligence, so if our intelligence is exceeded, it's unlikely that we will remain in charge of the planet."

Gates and Musk are two of humanity's most credible thinkers, who have not only put forward powerful new ideas about how technology can benefit humanity, but have also put them into practice with products that make things better.

And still, their comments about AI tend to sound a bit fanciful and paranoid.

Are they ahead of the curve and able to understand things that the rest of us haven't caught up with yet? Or, are they simply getting older and unable to fit new innovations into the old tech paradigms that they grew up with?

To be fair, others such as Stephen Hawking and Steve Wozniak have expressed similar fears, which lends credibility to the position that Gates and Musk have staked out.

What this really boils down to is that it's time for the tech industry to put guidelines in place to govern the development of AI. The reason it's needed is that the technology could be developed with altruistic intentions, but could eventually be co-opted for destructive purposes--in the same way that nuclear technology became weaponized and spread rapidly before it could be properly checked.

In fact, Musk has made a direct correlation there. In 2014, he tweeted, "We need to be super careful with AI. [It's] potentially more dangerous than nukes."

AI is already creeping into military use with the rise of armed drone aircraft, which carry out attacks against enemy targets with no pilot on board. For now, they are remotely controlled by soldiers. But the question has been raised of how long it will be until the machines are given specific humans or groups of humans--enemies in uniform--to target, and the autonomy to shoot to kill once they acquire that target. Should it ever be ethical for a machine to make a judgment call in taking a human life?

These are the kinds of conversations that need to happen more broadly before AI technology continues its rapid development. Governments will certainly want to get involved with laws and regulations, but the tech industry itself can pre-empt and shape that by putting together its own standards of conduct and ethical guidelines before nations and regulatory bodies harden the lines.

Stuart Russell, computer science professor at the University of California, Berkeley, has also compared the development of AI to nuclear weapons. Russell spoke to the United Nations in Geneva in April about these concerns. Russell said, "The basic scenario is explicit or implicit value misalignment--AI systems [that are] given objectives that don't take into account all the elements that humans care about. The routes could be varied and complex--corporations seeking a supertechnological advantage, countries trying to build [AI systems] before their enemies."

Russell recommended putting guidelines in place for students and researchers to keep human values at the center of all AI research.

Private sector giant Google--which has long explored AI and dove even deeper with its 2014 acquisition of DeepMind--set up an ethics review board to oversee the safety of the technologies that it develops with AI.

All of this calls for a public-private partnership to turn up the volume on these conversations and put well-thought-out frameworks in place.

Let's do it before AI has its Hiroshima.

For more on how businesses are going to use AI, see our ZDNet-TechRepublic special feature AI and the Future of Business.

Continue reading here:

Artificial intelligence: Should we be as terrified as Elon ...

Applied Behavioral Science – Ashford University

Direct your education toward success with your Bachelor of Arts in Applied Behavioral Science degree from Ashford University.

Dr. Maura Pilotti is Chair of Ashford University's Applied Behavioral Science program. She holds a Doctor of Philosophy in Experimental Cognition from the Graduate Center of CUNY and a Laurea in Clinical Psychology from l'Università degli Studi di Padova in Italy. Read her full bio.

The Bachelor of Arts in Applied Behavioral Science degree program allows you to study ways to build community and maintain relationships. Learn about individual, family, and community problems and their solutions. Build a broad foundation of skills from the disciplines of logic, law, psychology, and sociology. This online degree program demonstrates your ability to understand behavior and solve social problems.

Successful completion of the Bachelor of Arts in Applied Behavioral Science degree by itself does not lead to licensure or certification in any state, regardless of concentration or specialization. Further, Ashford University does not guarantee that any professional organization will accept a graduate's application to sit for any exam for the purpose of professional certification. Students seeking licensure or certification in a particular profession are strongly encouraged to carefully research the requirements prior to enrollment. Requirements may vary by state. Further, a criminal record may prevent an applicant from obtaining licensure, certification, or employment in this field of study.

If a behavioral science degree fits your personal goals, contact Ashford University at 866.711.1700 to learn more, or request additional information.

See the original post here:
Applied Behavioral Science - Ashford University

Behavioral Science Degree | Schools.com

Behavioral science is related to psychology in that it draws conclusions from observations of human behavior, but it takes a slightly different approach than the inward-looking psychological modeling of mind and personality. The field focuses more on the effects that human actions have on relationships, decision-making, and other aspects of human choice.

Expertise in behavioral science can be valuable in several different industries, and trained professionals are in demand all over the country. Degrees in the subject can be earned in a variety of different ways, depending on the goals and available resources of a given institution. Some schools group them with degrees in psychology, sociology and other social science disciplines, while others classify behavioral science degrees under the same umbrella as business school training, thus preparing students to apply their skills in a commercial context.

There are also different sets of expectations for behavioral science study at each academic level, with higher-level degrees often applying deeper inquiry to a narrower, more specialized subject focus. Let's take a quick look at what you might expect at each level of behavioral science training.

At the associate level, behavioral science degree programs tend to concentrate on the basics, including introductory courses in social science, research methods and scientific inquiry. One advantage of an associate degree in behavioral science is how well it can prepare you to enter a bachelor's program in the discipline, but it may also demonstrate dedication to employers and thereby help you land internships or entry-level jobs.

Professionals with behavioral science training can translate their knowledge into success in a variety of different careers. Here are just a few of the paths you might take in the workforce after graduating with a behavioral science degree:

Interested in one of these careers? Check out behavioral science schools near you, and start identifying your next career steps.

Community colleges are great places to look for behavioral science degrees. Some institutions, like Granite State College in New Hampshire, offer behavioral science associate degrees online.

Bachelor's degrees in behavioral science tend to delve deeper into concepts that are only touched on in associate degree programs. Hands-on application of research methods is often a part of the upper-division study at this level, and bachelor's degree coursework tends to address these more complex subjects in behavioral science, such as:

A behavioral science bachelor's degree can potentially open many doors career-wise, particularly in the business world. Market research, human resources, public relations, and customer service management are only a few of the fields where behavioral science training can be valuable. Many universities, including Bellevue University, Ashford University, and Wilmington University, offer an online bachelor's degree in behavioral science.

If your goal in understanding human behavior is to construct beneficial social policy based on your knowledge, or to eventually teach others what you know, a master's degree in behavioral science might be the right choice for you. You'll typically need a master's degree if you plan to go all the way and earn your Ph.D. The degree can also lead to opportunities in industries you may not expect, such as criminal justice, public health and government.

The coursework content and admission prerequisites of a behavioral science master's degree program tend to vary based on the student's chosen concentration. Those focusing on general psychology may need experimental design courses, counseling-oriented degrees may require study of counseling theories and methods, and students of behavioral science for criminal justice may need criminology and ethics courses. Online master's degrees and post-master's certificates in behavioral science can be found at many institutions, including Capella University and Saint Joseph's University.

Because it has applications in so many professional disciplines, certain institutions may combine the knowledge and skills of behavioral science with one of their other departments. Here are some of the related disciplines where behavioral science courses might be hiding at your school:

It's possible that behavioral science courses offered through departments other than psychology and sociology may focus on concepts of behavior as they relate to that specific department, so talk to an adviser or a departmental representative if you want to make your best-informed decision about which courses to take. The fact remains that if you're interested in behavioral science but the school you've chosen doesn't have a degree with that exact name, you can still find a way to keep yourself on track.

Sources:

Online Bachelor of Arts in Applied Behavioral Science, Ashford University, http://www.ashford.edu/degrees/online/ba-applied-behavioral-science.htm

Behavioral Science Degree - Bachelor of Science, Bellevue University, http://www.bellevue.edu/degrees/undergraduate/behavioral-science-bs/major-requirements.aspx

Master of Science - Behavioral Science, Cameron University, http://cameron.edu/graduate/programs/ms/

Master of Public Health, Social and Behavioral Sciences Specialization, Capella University, http://www.capella.edu/online-degrees/mph-social-behavioral-sciences/courses

Associate of Science in Behavioral Science, Granite State College, http://www.granite.edu/pdf/curriculummaps/undergraduate/AssocSciBehSci.pdf

Post Master's in Behavior Analysis, Saint Joseph's University, http://www.sju.edu/int/academics/cas/grad/pmba/overview.html

Human Resource Specialists and Labor Relations Specialists, "Occupational Outlook Handbook, 2014-15 Edition," Bureau of Labor Statistics, U.S. Department of Labor, Jan. 8, 2014, http://www.bls.gov/ooh/business-and-financial/human-resources-specialists-and-labor-relations-specialists.htm

Market Research Analysts, "Occupational Outlook Handbook, 2014-15 Edition," Bureau of Labor Statistics, U.S. Department of Labor, Jan. 8, 2014, http://www.bls.gov/ooh/business-and-financial/market-research-analysts.htm

Rehabilitation Counselors, "Occupational Outlook Handbook, 2014-15 Edition," Bureau of Labor Statistics, U.S. Department of Labor, Jan. 8, 2014, http://www.bls.gov/ooh/community-and-social-service/rehabilitation-counselors.htm

Social Workers, "Occupational Outlook Handbook, 2014-15 Edition," Bureau of Labor Statistics, U.S. Department of Labor, Jan. 8, 2014, http://www.bls.gov/ooh/Community-and-Social-Service/Social-workers.htm

Substance Abuse and Behavioral Disorder Counselors, "Occupational Outlook Handbook, 2014-15 Edition," Bureau of Labor Statistics, U.S. Department of Labor, Jan. 8, 2014, http://www.bls.gov/ooh/community-and-social-service/substance-abuse-and-behavioral-disorder-counselors.htm

Behavioral Science, Bachelor of Science, Wilmington University, http://www.wilmu.edu/behavioralscience/behavsci_curr.aspx

View original post here:
Behavioral Science Degree | Schools.com

School of Social and Behavioral Sciences | New College of …

School of Social and Behavioral Sciences

Welcome to the New College's School of Social and Behavioral Sciences, a vibrant school that explores the intersection of human behavior and science.

Through challenging coursework that explores how the human mind works, students and faculty seek to understand how we communicate with one another, how and why we organize into cultural and political groups around vital causes and issues, and ultimately how we make sense of our world and the many places we occupy within it.

Students work with accomplished faculty who conduct cutting-edge research and are adept at translating knowledge to both undergraduate and graduate classrooms. Students work together to inspire and motivate one another along their journeys as they pursue their academic and career goals.

Graduates are prepared for work in marketing, public relations, behavioral health, politics, psychology, research and data analysis, consulting, and sociology, or in any field where excellent communication and critical thinking skills are highly valued. Our graduates are also well prepared to seek advanced degrees in a variety of disciplines, including communication, psychology, and the law.

Led by Director Jeffrey Kassing, the School of Social and Behavioral Sciences offers five majors and one certificate. With an academically rigorous, career-focused, student-centered approach, the School of Social and Behavioral Sciences at the New College offers an unmatched educational experience. Programs offered include:

Bachelor's degree programs:

Minors:

Certificate:

Labs / Spaces:

See original here:
School of Social and Behavioral Sciences | New College of ...

Behavioral Science Bachelor of Science – Wilmington University

Bachelor of Science

About This Program: Purpose

The purpose of the Bachelor of Science degree program in Behavioral Science is to provide students with an in-depth understanding of how social issues, social environments, and cultural influences impact individual and group behaviors. With a Wilmington University Behavioral Science degree, students gain the knowledge and skills they need to succeed in careers working with adolescents and teens, families, the elderly, the homeless, the court systems, government agencies, addictions, crisis interventions, and more. Upon completion of the program, students pursue careers in human services, government, business, and industry. Students seeking graduate degree options may consider Master's degrees in social work, sociology, psychology, human services, public administration, criminology, counseling, or human resource management.

The program includes courses in psychology, sociology, and anthropology. Course work emphasizes normal and abnormal individual development, as well as family, group, and cultural dimensions of behavior. Ethical and professional issues are also addressed. Skill development in interpersonal relations, problem solving, and evaluation of programs and research is stressed. In addition, General Education courses required of all Wilmington University undergraduates provide a well-rounded academic foundation.

Classroom courses provide a blend of theory and application. Students also have the option to explore internship opportunities throughout the community in a variety of settings that provide experience in applying knowledge and skills. The program is offered statewide, with day and evening classes at New Castle, Dover, Brandywine, and Georgetown; in New Jersey at Mt. Laurel and Cumberland; and online.

PSY 309 Interpersonal Communication Skills OR PSY 315 Group Dynamics

SOC 304 Ethnic Groups and Minorities

The Co-op option allows students to complete the core courses PSY 309 - Interpersonal Communication Skills or PSY 315 - Group Dynamics, and SOC 304 - Ethnic Groups and Minorities, in a supervised educational work setting related to the student's major field of study. If students select the Co-op option, both PSY 309 or PSY 315 and SOC 304 must be taken in Co-op format. Alternative core courses may also be available for Co-op depending on the field placement. Each Co-op assignment lasts one semester, and the two assignments normally span consecutive semesters with the same employer. To be eligible, students must have at least 60 credits and a GPA of 2.5 or higher. Students must inform the Director of the CAP/Co-op program and the Behavioral Science Program Chair one semester before they would like to begin a Co-op assignment.
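To make the eligibility rule above concrete, here is a minimal Python sketch; the function and its parameter names are hypothetical illustrations, not part of Wilmington University's actual systems:

    # Hypothetical illustration of the Co-op eligibility rule stated
    # above: at least 60 credits earned and a GPA of 2.5 or higher.
    def coop_eligible(credits_earned: int, gpa: float) -> bool:
        return credits_earned >= 60 and gpa >= 2.5

    print(coop_eligible(72, 3.1))  # True: meets both criteria
    print(coop_eligible(45, 3.8))  # False: too few credits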

For additional information on the CAP/Co-op program option, please see the website: http://www.wilmu.edu/coop/

The College of Social and Behavioral Sciences recommends that students who transfer in six or more core courses and all 18 credits of core electives use any remaining electives to increase their subject knowledge by taking upper level electives in their field. These will include the interdisciplinary electives identified from the other academic colleges.

The Behavioral Science program has set a minimum passing grade of "C-" for program core courses. Students receiving a grade lower than "C-" in any required core course must retake that course.

This information applies to students who enter this degree program during the 2015-2016 Academic Year. If you entered this degree program before the Fall 2015 semester, please refer to the academic catalog for the year you began your degree program.

Read the original post:
Behavioral Science Bachelor of Science - Wilmington University