Scientists: Next Black Hole Image Will Be Way Clearer

The first-ever black hole image is a blurry orange ring. During the announcement, scientists described how they're working to improve the resolution.

Pale Orange Ring

The image of a black hole shared by scientists on Wednesday represents many things. It’s the first-ever direct observation of a black hole’s event horizon, it’s evidence supporting Einstein’s theory of general relativity — and, if we’re being perfectly honest, it’s just straight-up awesome.

But the picture — a glowing orange ring — is also kind of fuzzy, like an optometrist forgot to calibrate her equipment before photographing someone’s retina. That’s why scientists from the Event Horizon Telescope (EHT), the international network that captured the image, are promising that the next one will be way crisper.

Enhance!

The EHT is a network of radio telescopes around the world. By combining their data, scientists can essentially treat the EHT as though it’s a single planet-sized dish. That lets them spot small, faraway objects like the black hole at the center of the M87 galaxy — but because there’s only a handful of telescopes in the network, the images are inherently fuzzy.

Since the M87 data was collected, a number of telescopes have joined the ranks of the EHT, meaning that any future images will already be sharper, EHT Director Shep Doeleman explained during a livestream on Wednesday. He also mentioned that algorithms could be used to clean up the current image.

The current image was taken with a network of telescopes that could capture a wavelength as small as one millimeter — Doeleman’s goal is to get that down to 0.87 millimeters. That would sharpen future images by 13 percent.
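
To see where that 13 percent figure comes from, here is a rough back-of-the-envelope sketch in Python (a simplification, not the EHT's actual imaging pipeline). An interferometer's diffraction-limited resolution scales roughly as the observing wavelength divided by its longest baseline; with an Earth-sized baseline assumed purely for illustration, cutting the wavelength from 1 millimeter to 0.87 millimeters shrinks the smallest resolvable angle by about 13 percent.

# Rough diffraction-limit sketch; the Earth-diameter baseline is an
# illustrative assumption, not the EHT's exact configuration.
EARTH_DIAMETER_M = 12_742_000  # approximate longest Earth-bound baseline, in meters

def angular_resolution_rad(wavelength_m, baseline_m):
    # Diffraction-limited angular resolution ~ wavelength / baseline (radians)
    return wavelength_m / baseline_m

current = angular_resolution_rad(1.0e-3, EARTH_DIAMETER_M)   # 1 mm observations
planned = angular_resolution_rad(0.87e-3, EARTH_DIAMETER_M)  # 0.87 mm observations
print(f"resolution improves by about {1 - planned / current:.0%}")  # ~13%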

Branching Out

In coming years, the EHT may even grow larger than the Earth.

“World domination isn’t enough — we also want to get into space,” Doeleman said, explaining that he hopes to introduce orbital telescopes into the mix.

Doing so would mean an even higher resolution for future black hole images, and it could help the EHT finally capture Sagittarius A*, the supermassive black hole at the core of the Milky Way.

More on the EHT: Scientists Just Released the First-Ever Image of a Black Hole


Undersea Robots Are Helping Save the Great Barrier Reef

Australian scientists are preparing to deliver millions of coral larvae to the Great Barrier Reef using an autonomous drone called LarvalBot.

RoboStork

A team of Australian scientists built an underwater robot that can deliver larval coral to the Great Barrier Reef, where they hope it will help restore the reef to some of the glory it had before it was ravaged by climate change.

The delivery drone, LarvalBot, is a more hospitable version of the underwater drone that has previously been used to hunt and kill off the coral’s predators — yet another experiment in using robotics to protect and help recover the world’s coral reefs.

Fertilizing The Lawn

The scientists behind the project consider their work similar to fertilizing a lawn, according to Particle, except that instead of grass, they’re working on a beautiful and complex underwater ecosystem.

To re-seed the coral reef with larvae, scientists first need to gather that seed. Back in November, the researchers gathered millions of coral sperm and egg cells for what they called at the time “IVF for coral.”

Planning In Advance

LarvalBot made its first delivery back in December. Now the researchers are planning a second expedition to coincide with the reef’s natural mass spawning period, which will happen from October into November.

When that happens, LarvalBot will dive down, dropping millions of larvae that the researchers hope will be able to take root as brand new coral.

READ MORE: ROBOTS TO THE RESCUE OF THE GREAT BARRIER REEF [Particle]

More on the coral reef: To Protect Endangered Coral Reefs, Researchers Need Legal Recourse


Walmart Is Rolling Out Floor-Cleaning Robots in 1,500 Stores

Walmart is sending autonomous custodial robots to 1,500 stores in a play to cut down on the tasks human employees have to face.

Clean Many Robots

Walmart is about to bring worker robots to a third of its stores.

Of the corporation’s 4,600 U.S. locations, 1,500 are about to start using floor-cleaning custodial robots and 300 will use the bots to spot empty shelves, according to The Wall Street Journal. It’s a move that could save human employees a lot of time, but also one that signals that Walmart sees human employees and their salaries as circumventable expenses.

Time To Pivot

“With automation, we are able to take away some of the tasks that associates don’t enjoy doing,” Mark Propes, a Walmart operations director, told the WSJ. “At the same time, we continue to open up new jobs in other things in the store.”

Those other jobs are likely related to e-commerce, WSJ reports, as Walmart plans to pivot to more online sales in an attempt to challenge Amazon.

READ MORE: Walmart Is Rolling Out the Robots [The Wall Street Journal]

More on Walmart: Walmart Is About to Deploy Hundreds of Robot Janitors


Zapping Elderly People’s Brains Supercharges Their Working Memory

Electrically stimulating the brains of people in their 60s and 70s allowed them to perform as well on working memory tasks as 20-somethings.

Memory Games

Stimulating the brains of elderly people with electrical currents allowed them to perform just as well on a memory test as people in their 20s — a sign that researchers may have found a noninvasive way to turn back the hands of time when it comes to human memory.

“It’s opening up a whole new avenue of potential research and treatment options,” researcher Rob Reinhart said in a press release regarding the study, “and we’re super excited about it.”

All Ages

In a study published in the journal Nature Neuroscience on Monday, researchers from Boston University detail how they asked a group of 20-somethings and a group of people in their 60s and 70s to complete a task designed to test their working memory, which is the part of our short-term memory that we use for reasoning and decision-making.

Working memory typically begins declining around the time we hit 30 years old, so as expected, the people in their 20s outperformed the older group on the memory task.

Remembrall

However, after the members of the older group received 25 minutes of mild stimulation via scalp electrodes, they performed just as well as the younger participants — and the memory boost still hadn’t subsided by the time the experiment ended 50 minutes later.

According to the researchers, the benefits of this noninvasive treatment could extend beyond those whose working memory has started to succumb to age, too. They found that stimulating the brains of the younger people who performed poorly on the task boosted their memories as well.

READ MORE: As Memories Fade, Can We Supercharge Them Back to Life? [Boston University]

More on memory: Can a Brain Zap Really Boost Your Memory?


The Israeli Moon Lander Is About to Touch Down

SpaceIL's Moon lander, Beresheet, is expected to touch down on the lunar surface on Thursday, landing Israel a place in the history books.

Lunar Lander

If all goes according to plan, Israel will earn a place in history on Thursday as the fourth nation ever to land a spacecraft on the Moon — and unlike any craft that came before it, this Moon lander was privately funded.

Beresheet is the work of SpaceIL, a nonprofit Israeli space company. On Feb. 21, the company launched its $100 million spacecraft on a journey to the Moon aboard a SpaceX Falcon 9 rocket, and on April 4, it settled into the Moon’s orbit.

The next step in the mission is for Beresheet to attempt to land on the surface of the Moon sometime between 3 and 4 p.m. ET on Thursday.

Watch Along

Beresheet’s target landing site is in the northeastern part of Mare Serenitatis, also known as the Sea of Serenity.

“On the basis of our experience with Apollo, the Serenitatis sites favor both landing safety and scientific reward,” SpaceIL team member Jim Head said in a press release.

SpaceIL and Israel Aerospace Industries, the company that built Beresheet, will live-stream Thursday’s touch-down attempt, so the world will have a chance to watch along as Israel tries to land itself a spot in the history books.

READ MORE: Israel’s Beresheet space probe prepares for historic moon landing [NBC News]

More on Beresheet: Israel’s Moon Lander Just Got Photobombed by the Earth


Some People Are Exceptionally Good at Predicting the Future

Some people are adept at forecasting, predicting the likelihood of future events, and a new contest aims to suss them out.

Super-Forecasters

Some people have a knack for accurately predicting the likelihood of future events. You might even be one of these “super-forecasters” and not know it — but now there’s an easy way to find out.

BBC Future has teamed up with UK-based charity Nesta and forecasting services organization Good Judgement on the “You Predict the Future” challenge. The purpose is to study how individuals and teams predict the likelihood of certain events, ranging from the technological to the geopolitical.

All Winners

Anyone interested in testing their own forecasting skills can sign up for the challenge to answer a series of multiple-choice questions and assign a percentage to how likely each answer is to come true.

“When you’re part of the challenge, you’ll get feedback on how accurate your forecasts are,” Kathy Peach, who leads Nesta’s Centre for Collective Intelligence Design, told BBC Future. “You’ll be able to see how well you do compared to other forecasters. And there’s a leader board, which shows who the best performing forecasters are.”

Collective Intelligence

You’ll also be helping advance research on collective intelligence, which focuses on the intellectual abilities of groups of people acting as one.

Additionally, as Peach told BBC Future, “New research shows that forecasting increases open-mindedness, the ability to consider alternative scenarios, and reduces political polarisation,” meaning even if you don’t find out you’re a “super-forecaster,” you might just end up a better person after making your predictions.

READ MORE: Could you be a super-forecaster? [BBC Future]

More on forecasting: Forecasting the Future: Can the Hive Mind Let Us Predict the Future?


Amazon Is Fighting South American Govs to Control “.Amazon” Domains

Amazon and a coalition of nations in South America are duking it out over who gets the coveted “.amazon” top-level domain.

Ongoing Battle

The deadline has passed for Amazon and a coalition of eight governments in South America to settle a seven-year dispute over the coveted “.amazon” top-level domain.

Both groups want dibs, and neither Amazon nor the countries through which the iconic river runs have agreed to various compromises, according to BBC News. Above all else, the dispute highlights how Amazon has become so powerful that it’s becoming embroiled in geopolitical disputes.

Back And Forth

The eight nations, which together form the Amazon Cooperation Treaty Organization (ACTO), blocked Amazon’s attempt to claim the domain outright. In ACTO’s proposed deal, Amazon would be allowed to use relevant sites like “kindle.amazon,” but most addresses would be reserved for member nations.

Amazon essentially proposed the opposite, in which each country would get a modified version of the “.amazon” domain.

Not Budging

The Internet Corporation for Assigned Names and Numbers (ICANN) gave Amazon and ACTO until April 7 to settle the dispute. But Business Insider reports that neither group submitted a deal to ICANN by the deadline, which has now been pushed back to April 21.

In 2018, Amazon tried to garner favor by offering $5 million worth of Kindles and web hosting, which ACTO declined.

“We are not looking for financial compensation,” Ecuadorian ambassador Francisco Carrión wrote to ICANN. “Nor are we after ex-gratia concessions to use one or a few second-level domains.”

READ MORE: The nations of the Amazon want the name back [BBC News]

More on Amazon: Lawmakers Don’t Know How to Regulate Amazon’s Delivery Robots


China Is Trying to Scrub Bikinis and Smoking From the Internet

A new story reveals how Chinese live-streaming company Inke uses a combination of human moderators and AI to facilitate government censorship.

Cleaning Cyberspace

On Monday, the South China Morning Post published a story about the content moderation operations at Inke, one of China’s largest live-streaming companies.

The piece offers a rare glimpse at how China’s private sector helps facilitate government censorship. In some cases, that means flagging streams of people smoking or wearing bikinis — content that would likely seem fairly innocuous to an American audience — but in others, it means preventing internet viewers from seeing streams of people committing acts of terrorism or violence.

That’s the same kind of content multinational corporations such as Facebook have had trouble moderating — raising questions about what these Chinese companies have figured out that American ones haven’t.

Evolving Censorship

Inke tasks a team of 1,200 moderators with policing the streams of its 25 million users, according to SCMP.

The moderators watch streams 10 to 15 seconds before they actually go live, and in that time, they’re expected to catch anything “that is against the law and regulations, against mainstream values, and against the company’s values,” Zhi Heng, Inke’s content safety team leader, told the SCMP.

Inke defers to guidelines published by the China Association of Performing Arts to know what content falls under that umbrella, and according to the SCMP story, it ranges from politically sensitive speech and violence to people smoking or wearing bikinis.

The document is updated weekly, however, meaning content that might be acceptable one week could be censored the next, or vice versa.

To make this massive task of censoring content a little more manageable for its human moderators, Inke also employs algorithms and recognition software capable of filtering content into different risk categories.

The company sometimes dedicates just one human reviewer to watching streams considered “low-risk,” such as cooking shows, according to SCMP, while higher-risk streams receive closer scrutiny.

Learning Opportunity

The idea of censoring streams of people smoking cigarettes or wearing bikinis might seem ridiculous to a Western audience.

However, if Inke’s combination of human and AI moderators is effective at flagging the content deemed objectionable in China, it’s worth considering what it’s doing that others, such as Facebook, aren’t. Are Inke’s algorithms better in some discernible way? Has it stumbled upon the optimum human moderator-to-user ratio?

You might not agree with the content China is censoring, but content moderation isn’t by default objectionable — even Facebook’s own execs believe the company should have prevented the horrific livestream of the Christchurch shooting from reaching its audience, for example.

So perhaps there’s something Facebook and others could learn from how Inke is managing the job of filtering out undesirable online content, even if we don’t agree with China’s definition of undesirable.

READ MORE: No smoking, no tattoos, no bikinis: inside China’s war to ‘clean up’ the internet [South China Morning Post]

More on censorship: China Is Censoring “Genetically Edited Babies” on Social Media


Here’s How Big the M87 Black Hole Is Compared to the Earth

The black hole that scientists imaged is a stellar giant. It would take millions of Earths lined up side-by-side to span its length.

Pale Black Dot

On Wednesday, a team of scientists from around the world released the first ever directly-observed image of the event horizon of a black hole.

The black hole, M87*, is found within the constellation Virgo — and as the webcomic XKCD illustrated, it’s as big as our entire solar system.

Stellar Giant

The gigantic black hole, not counting the giant rings of trapped light orbiting it, is about 23.6 billion miles (38 billion kilometers) across, according to Science News.

Meanwhile, the Earth is just 7,917 miles in diameter — meaning our planet wouldn’t even be a drop in the bucket of the giant, black void. Based on Futurism’s calculations, it would take just over 2.98 million Earths lined up in a row to span the length of M87*. For a sense of scale, that’s about how many adult giraffes it would take to span the diameter of Earth.
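
That figure is easy to check with the numbers quoted above; here is a quick Python sanity check (using the article's figures, not independent measurements).

# Quick check of the "millions of Earths" comparison using the quoted figures.
m87_diameter_miles = 23.6e9     # event-horizon-scale diameter reported by Science News
earth_diameter_miles = 7_917

earths_across = m87_diameter_miles / earth_diameter_miles
print(f"about {earths_across / 1e6:.2f} million Earths side by side")  # ~2.98 million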

Paging Pluto

Our entire solar system is just about 2.27 billion miles wide, meaning we could just barely fit the whole thing into the newly-imaged black hole’s event horizon.

Thankfully, M87* is about 55 million light years away — so while we could readily fit inside its gaping maw, we’re way too far to get sucked in.

READ MORE: Revealed: a black hole the size of the solar system [Cosmos]

More on M87*: Scientists: Next Black Hole Image Will Be Way Clearer


Amazon Workers Listen to Your Alexa Conversations, Then Mock Them

A new Bloomberg piece shared the experiences of Amazon workers tasked with listening to Alexa recordings, and what they hear isn't always mundane.

I Hear You

Amazon pays thousands of workers across the globe to review audio picked up by its Echo speakers — and their behavior raises serious concerns about both privacy and safety.

Bloomberg recently spoke with seven people who participated in Amazon’s audio review process. Each worker was tasked with listening to, transcribing, and annotating voice recordings with the goal of improving the ability of Amazon’s Alexa smart assistant to understand and respond to human speech.

But sometimes, according to Bloomberg, they share private recordings in a disrespectful way.

“I think we’ve been conditioned to the [assumption] that these machines are just doing magic machine learning,” University of Michigan professor Florian Schaub told Bloomberg. “But the fact is there is still manual processing involved.”

Listen to This

The job is usually boring, according to Bloomberg’s sources. But if they heard something out of the ordinary, they said, sometimes they’d share the Alexa recordings with other workers via internal chat rooms.

Occasionally, it was just because they found the audio amusing — a person singing off-key, for example — but other times, the sharing was “a way of relieving stress” after hearing something disturbing, such as when two of Bloomberg’s sources heard what sounded like a sexual assault.

When they asked Amazon how to handle cases like the latter, the workers said they were told “it wasn’t Amazon’s job to interfere.” Amazon, meanwhile, said it had procedures in place for when workers hear something “distressing” in Alexa recordings.

READ MORE: Amazon Workers Are Listening to What You Tell Alexa [Bloomberg]

More on Echo: Thanks, Amazon! Echo Recorded and Sent Audio to Random Contacts Without Warning


Scientists Say New Quantum Material Could “‘Download’ Your Brain”

A new type of quantum material can directly measure neural activity and translate it into electrical signals for a computer.

Computer Brain

Scientists say they’ve developed a new “quantum material” that could one day transfer information directly from human brains to a computer.

The research is in early stages, but it invokes ideas like uploading brains to the cloud or hooking people up to a computer to track deep health metrics — concepts that until now existed solely in science fiction.

Quantum Interface

The new quantum material, described in research published Wednesday in the journal Nature Communications, is a “nickelate lattice” that the scientists say could directly translate the brain’s electrochemical signals into electrical activity that could be interpreted by a computer.

“We can confidently say that this material is a potential pathway to building a computing device that would store and transfer memories,” Purdue University engineer Shriram Ramanathan told ScienceBlog.

Running Diagnostics

Right now, the new material can only detect the activity of some neurotransmitters — so we can’t yet upload a whole brain or anything like that. But if the tech progresses, the researchers hypothesize that it could be used to detect neurological diseases, or perhaps even store memories.

“Imagine putting an electronic device in the brain, so that when natural brain functions start deteriorating, a person could still retrieve memories from that device,” Ramanathan said.

READ MORE: New Quantum Material Could Warn Of Neurological Disease [ScienceBlog]

More on brain-computer interface: This Neural Implant Accesses Your Brain Through the Jugular Vein


Scientists Find a New Way to Kickstart Stable Fusion Reactions

A new technique for nuclear fusion can generate plasma without requiring as much space-consuming equipment within a reactor.

Warm Fusion

Scientists from the Princeton Plasma Physics Laboratory say that they’ve found a new way to start up nuclear fusion reactions.

The new technique, described in research published last month in the journal Physics of Plasmas, provides an alternate means for reactors to convert gas into the superhot plasma that gets fusion reactions going with less equipment taking up valuable lab space — another step in the long road to practical fusion power.

Out With The Old

Right in the center of a tokamak, a common type of experimental nuclear fusion reactor, there’s a large central magnet that helps generate plasma. The new technique, called “transient coaxial helical injection,” does away with the magnet but still generates a stable reaction, freeing up the space taken up by the magnet for other equipment.

“The good news from this study,” Max Planck Institute researcher Kenneth Hammond said in a press release, “is that the projections for startup in large-scale devices look promising.”

READ MORE: Ready, set, go: Scientists evaluate novel technique for firing up fusion-reaction fuel [Princeton Plasma Physics Laboratory newsroom via ScienceDaily]

More on nuclear fusion: Scientists Found a New Way to Make Fusion Reactors More Efficient


NASA Is Funding the Development of 18 Bizarre New Projects

Through the NASA Innovative Advanced Concepts (NIAC) program, NASA funds projects that go

Nurturing the Bizarre

NASA isn’t afraid to take a chance on the weird. In fact, it has a program designed for that specific purpose, called NASA Innovative Advanced Concepts (NIAC) — and on Wednesday, the agency announced 18 bizarre new projects receiving funding through the program.

“Our NIAC program nurtures visionary ideas that could transform future NASA missions by investing in revolutionary technologies,” NASA exec Jim Reuter said in a press release. “We look to America’s innovators to help us push the boundaries of space exploration with new technology.”

Sci-Fi to Sci-Fact

The 18 newly funded projects are divided into two groups: Phase I and Phase II.

The 12 recipients of the Phase I awards will each receive approximately $125,000 to fund nine months’ worth of feasibility studies for their concepts. These include a project to beam power through Venus’ atmosphere to support long-term missions, a spacesuit with self-healing skin, and floating microprobes inspired by spiders.

The six Phase II recipients, meanwhile, will each receive up to $500,000 to support two-year studies dedicated to fine-tuning their concepts and investigating potential ways to implement the technologies, which include a flexible telescope, a neutrino detector, and materials for solar surfing.

“NIAC is about going to the edge of science fiction, but not over,” Jason Derleth, NIAC program executive, said in the press release. “We are supporting high impact technology concepts that could change how we explore within the solar system and beyond.”

READ MORE: NASA Invests in Potentially Revolutionary Tech Concepts [Jet Propulsion Laboratory]

More on bizarre NASA plans: New NASA Plan for Mars Is Moderately-Terrifying-Sounding, Also, Completely-Awesome: Robotic. Bees.


Report: Tesla Doc Is Playing Down Injuries to Block Workers’ Comp

Former Tesla and clinic employees share how doctors blocked workers' compensation claims and put injured people back to work to avoid payouts.

Here’s A Band-Aid

Tesla’s on-site clinic, Access Omnicare, has allegedly been downplaying workers’ injuries to keep the electric automaker off the hook for workers’ compensation.

Several former Tesla employees, all of whom got hurt on the job, and former employees of Access Omnicare, told Reveal News that the clinic was minimizing worker injuries so that the automaker wouldn’t have to pay workers’ comp — suggesting that the barely-profitable car company is willing to do whatever it takes to stay out of the red and avoid negative press.

Back To Work

Reveal, which is a project by the Center for Investigative Reporting, described cases in which employees suffered electrocution, broken bones, and mold-related rashes while working in a Tesla factory — only for Omnicare to deny that the injuries warranted time off work.

The clinic’s top doctor “wanted to make certain that we were doing what Tesla wanted so badly,” former Omnicare operations manager Yvette Bonnet told Reveal. “He got the priorities messed up. It’s supposed to be patients first.”

Missing Paperwork

Meanwhile, employees who requested the paperwork to file for workers’ comp were repeatedly ignored, according to Reveal.

“I just knew after the third or fourth time that they weren’t going to do anything about it,” a former employee whose back was crushed under a falling Model X hatchback told Reveal. “I was very frustrated. I was upset.”

The automaker is on the hook for up to $750,000 in medical payments per workers’ comp claim, according to Reveal‘s reporting.

Meanwhile, both Tesla CEO Elon Musk and Laurie Shelby, the company’s VP of safety, have publicly praised Access Omnicare, Reveal found. Musk even recently announced plans to extend it to other plants, “so that we have really immediate first-class health care available right on the spot when people need it.”

READ MORE: How Tesla and its doctor made sure injured employees didn’t get workers’ comp [Reveal News]

More on Tesla: Video Shows Tesla Autopilot Steering Toward Highway Barriers


Infertile Couple Gives Birth to “Three-Parent Baby”

A Greek couple just gave birth to a three-parent baby, the first conceived as part of a clinical trial to treat infertility.

Happy Birthday

On Tuesday, a couple gave birth to what researchers are calling a “three-parent baby” — giving new hope to infertile couples across the globe.

After four cycles of in vitro fertilization failed to result in a pregnancy, the Greek couple enrolled in a clinical trial for mitochondrial replacement therapy (MRT) — meaning doctors placed the nucleus from the mother’s egg into a donor egg that had its nucleus removed. Then they fertilized the egg with sperm from the father and implanted it into the mother.

Due to this procedure, the six-pound baby boy has DNA from both his mother and father, as well as a tiny bit from the woman who donated the egg.

Greek Life

The Greek baby wasn’t the first “three-parent baby” born after his parents underwent MRT — that honor goes to the offspring of a Jordanian woman who gave birth in 2016.

However, in her case and others that followed it, doctors used the technique to prevent a baby from inheriting a parent’s genetic defect. This marked the first time a couple used MRT as part of a clinical trial to treat infertility.

“Our excellent collaboration and this exceptional result will help countless women to realise their dream of becoming mothers with their own genetic material,” Nuno Costa-Borges, co-founder of Embryotools, one of the companies behind the trial, said in a statement.

READ MORE: Baby with DNA from three people born in Greece [The Guardian]

More on three-parent babies: An Infertile Couple Is Now Pregnant With a “Three-Parent Baby”


MIT Prof: If We Live in a Simulation, Are We Players or NPCs?

An MIT scientist asks whether we're protagonists in a simulated reality or so-called NPCs who exist to round out a player character's experience. 

Simulation Hypothesis

Futurism readers may recognize Rizwan Virk as the MIT researcher touting a new book arguing that we’re likely living in a game-like computer simulation.

Now, in a new interview with Vox, Virk goes even further — by probing whether we’re protagonists in the simulation or so-called “non-player characters” who are presumably included to round out a player character’s experience.

Great Simulation

Virk speculated about whether we’re players or side characters when Vox writer Sean Illing asked a question likely pondered by anyone who’s seen “The Matrix”: If you were living in a simulation, would you actually want to know?

“Probably the most important question related to this is whether we are NPCs (non-player characters) or PCs (player characters) in the video game,” Virk told Vox. “If we are PCs, then that means we are just playing a character inside the video game of life, which I call the Great Simulation.”

More Frightening

It’s a line of inquiry that cuts to the core of the simulation hypothesis: If the universe is essentially a video game, who built it — and why?

“The question is, are all of us NPCs in a simulation, and what is the purpose of that simulation?” Virk asked. “A knowledge of the fact that we’re in a simulation, and the goals of the simulation and the goals of our character, I think, would still be interesting to many people.”

READ MORE: Are we living in a computer simulation? I don’t know. Probably. [Vox]

More on the simulation hypothesis: Famous Hacker Thinks We’re Living in Simulation, Wants to Escape


Astronomy | Britannica.com

Since the late 19th century astronomy has expanded to include astrophysics, the application of physical and chemical knowledge to an understanding of the nature of celestial objects and the physical processes that control their formation, evolution, and emission of radiation. In addition, the gases and dust particles around and between the stars have become the subjects of much research. Study of the nuclear reactions that provide the energy radiated by stars has shown how the diversity of atoms found in nature can be derived from a universe that, following the first few minutes of its existence, consisted only of hydrogen, helium, and a trace of lithium. Concerned with phenomena on the largest scale is cosmology, the study of the evolution of the universe. Astrophysics has transformed cosmology from a purely speculative activity to a modern science capable of predictions that can be tested.

Its great advances notwithstanding, astronomy is still subject to a major constraint: it is inherently an observational rather than an experimental science. Almost all measurements must be performed at great distances from the objects of interest, with no control over such quantities as their temperature, pressure, or chemical composition. There are a few exceptions to this limitation: namely, meteorites (most of which are from the asteroid belt, though some are from the Moon or Mars), rock and soil samples brought back from the Moon, samples of comet and asteroid dust returned by robotic spacecraft, and interplanetary dust particles collected in or above the stratosphere. These can be examined with laboratory techniques to provide information that cannot be obtained in any other way. In the future, space missions may return surface materials from Mars, or other objects, but much of astronomy appears otherwise confined to Earth-based observations augmented by observations from orbiting satellites and long-range space probes and supplemented by theory.

The solar system took shape 4.57 billion years ago, when it condensed within a large cloud of gas and dust. Gravitational attraction holds the planets in their elliptical orbits around the Sun. In addition to Earth, five major planets (Mercury, Venus, Mars, Jupiter, and Saturn) have been known from ancient times. Since then only two more have been discovered: Uranus by accident in 1781 and Neptune in 1846 after a deliberate search following a theoretical prediction based on observed irregularities in the orbit of Uranus. Pluto, discovered in 1930 after a search for a planet predicted to lie beyond Neptune, was considered a major planet until 2006, when it was redesignated a dwarf planet by the International Astronomical Union.

The average Earth-Sun distance, which originally defined the astronomical unit (AU), provides a convenient measure for distances within the solar system. The astronomical unit was originally defined by observations of the mean radius of Earth's orbit but is now defined as 149,597,870.7 km (about 93 million miles). Mercury, at 0.4 AU, is the closest planet to the Sun, while Neptune, at 30.1 AU, is the farthest. Pluto's orbit, with a mean radius of 39.5 AU, is sufficiently eccentric that at times it is closer to the Sun than is Neptune. The planes of the planetary orbits are all within a few degrees of the ecliptic, the plane that contains Earth's orbit around the Sun. As viewed from far above Earth's North Pole, all planets move in the same (counterclockwise) direction in their orbits.

Most of the mass of the solar system is concentrated in the Sun, with its 1.99 × 10³³ grams. Together, all of the planets amount to 2.7 × 10³⁰ grams (i.e., about one-thousandth of the Sun's mass), and Jupiter alone accounts for 71 percent of this amount. The solar system also contains five known objects of intermediate size classified as dwarf planets and a very large number of much smaller objects collectively called small bodies. The small bodies, roughly in order of decreasing size, are the asteroids, or minor planets; comets, including Kuiper belt, Centaur, and Oort cloud objects; meteoroids; and interplanetary dust particles. Because of their starlike appearance when discovered, the largest of these bodies were termed asteroids, and that name is widely used, but, now that the rocky nature of these bodies is understood, their more descriptive name is minor planets.

The four inner, terrestrial planets (Mercury, Venus, Earth, and Mars), along with the Moon, have average densities in the range of 3.9–5.5 grams per cubic cm, setting them apart from the four outer, giant planets (Jupiter, Saturn, Uranus, and Neptune), whose densities are all close to 1 gram per cubic cm, the density of water. The compositions of these two groups of planets must therefore be significantly different. This dissimilarity is thought to be attributable to conditions that prevailed during the early development of the solar system (see below Theories of origin). Planetary temperatures now range from around 170 °C (330 °F, 440 K) on Mercury's surface through the typical 15 °C (60 °F, 290 K) on Earth to −135 °C (−210 °F, 140 K) on Jupiter near its cloud tops and down to −210 °C (−350 °F, 60 K) near Neptune's cloud tops. These are average temperatures; large variations exist between dayside and nightside for planets closest to the Sun, except for Venus with its thick atmosphere.

The surfaces of the terrestrial planets and many satellites show extensive cratering, produced by high-speed impacts (see meteorite crater). On Earth, with its large quantities of water and an active atmosphere, many of these cosmic footprints have eroded, but remnants of very large craters can be seen in aerial and spacecraft photographs of the terrestrial surface. On Mercury, Mars, and the Moon, the absence of water and any significant atmosphere has left the craters unchanged for billions of years, apart from disturbances produced by infrequent later impacts. Volcanic activity has been an important force in the shaping of the surfaces of the Moon and the terrestrial planets. Seismic activity on the Moon has been monitored by means of seismometers left on its surface by Apollo astronauts and by Lunokhod robotic rovers. Cratering on the largest scale seems to have ceased about three billion years ago, although on the Moon there is clear evidence for a continued cosmic drizzle of small particles, with the larger objects churning (gardening) the lunar surface and the smallest producing microscopic impact pits in crystals in the lunar rocks.

All of the planets apart from the two closest to the Sun (Mercury and Venus) have natural satellites (moons) that are very diverse in appearance, size, and structure, as revealed in close-up observations from long-range space probes. The four outer dwarf planets have moons; Pluto has at least five moons, including one, Charon, fully half the size of Pluto itself. Over 200 asteroids and 80 Kuiper belt objects also have moons. Four planets (Jupiter, Saturn, Uranus, and Neptune), one dwarf planet (Haumea), and one Centaur object (Chariklo) have rings, disklike systems of small rocks and particles that orbit their parent bodies.

During the U.S. Apollo missions a total weight of 381.7 kg (841.5 pounds) of lunar material was collected; an additional 300 grams (0.66 pounds) was brought back by unmanned Soviet Luna vehicles. About 15 percent of the Apollo samples have been distributed for analysis, with the remainder stored at the NASA Johnson Space Center, Houston, Texas. The opportunity to employ a wide range of laboratory techniques on these lunar samples has revolutionized planetary science. The results of the analyses have enabled investigators to determine the composition and age of the lunar surface. Seismic observations have made it possible to probe the lunar interior. In addition, retroreflectors left on the Moon's surface by Apollo astronauts have allowed high-power laser beams to be sent from Earth to the Moon and back, permitting scientists to monitor the Earth-Moon distance to an accuracy of a few centimetres. This experiment, which has provided data used in calculations of the dynamics of the Earth-Moon system, has shown that the separation of the two bodies is increasing by 4.4 cm (1.7 inches) each year. (For additional information on lunar studies, see Moon.)

Mercury is too hot to retain an atmosphere, but Venus's brilliant white appearance is the result of its being completely enveloped in thick clouds of carbon dioxide, impenetrable at visible wavelengths. Below the upper clouds, Venus has a hostile atmosphere containing clouds of sulfuric acid droplets. The cloud cover shields the planet's surface from direct sunlight, but the energy that does filter through warms the surface, which then radiates at infrared wavelengths. The long-wavelength infrared radiation is trapped by the dense clouds such that an efficient greenhouse effect keeps the surface temperature near 465 °C (870 °F, 740 K). Radar, which can penetrate the thick Venusian clouds, has been used to map the planet's surface. In contrast, the atmosphere of Mars is very thin and is composed mostly of carbon dioxide (95 percent), with very little water vapour; the planet's surface pressure is only about 0.006 that of Earth. The outer planets have atmospheres composed largely of light gases, mainly hydrogen and helium.

Each planet rotates on its axis, and nearly all of them rotate in the same direction, counterclockwise as viewed from above the ecliptic. The two exceptions are Venus, which rotates in the clockwise direction beneath its cloud cover, and Uranus, which has its rotation axis very nearly in the plane of the ecliptic.

Some of the planets have magnetic fields. Earth's field extends outward until it is disturbed by the solar wind (an outward flow of protons and electrons from the Sun), which carries a magnetic field along with it. Through processes not yet fully understood, particles from the solar wind and galactic cosmic rays (high-speed particles from outside the solar system) populate two doughnut-shaped regions called the Van Allen radiation belts. The inner belt extends from about 1,000 to 5,000 km (600 to 3,000 miles) above Earth's surface, and the outer from roughly 15,000 to 25,000 km (9,300 to 15,500 miles). In these belts, trapped particles spiral along paths that take them around Earth while bouncing back and forth between the Northern and Southern hemispheres, with their orbits controlled by Earth's magnetic field. During periods of increased solar activity, these regions of trapped particles are disturbed, and some of the particles move down into Earth's atmosphere, where they collide with atoms and molecules to produce auroras.

Jupiter has a magnetic field far stronger than Earth's and many more trapped electrons, whose synchrotron radiation (electromagnetic radiation emitted by high-speed charged particles that are forced to move in curved paths, as under the influence of a magnetic field) is detectable from Earth. Bursts of increased radio emission are correlated with the position of Io, the innermost of the four Galilean moons of Jupiter. Saturn has a magnetic field that is much weaker than Jupiter's, but it too has a region of trapped particles. Mercury has a weak magnetic field that is only about 1 percent as strong as Earth's and shows no evidence of trapped particles. Uranus and Neptune have fields that are less than one-tenth the strength of Saturn's and appear much more complex than that of Earth. No field has been detected around Venus or Mars.

More than 500,000 asteroids with well-established orbits are known, and thousands of additional objects are discovered each year. Hundreds of thousands more have been seen, but their orbits have not been as well determined. It is estimated that several million asteroids exist, but most are small, and their combined mass is estimated to be less than a thousandth that of Earth. Most of the asteroids have orbits close to the ecliptic and move in the asteroid belt, between 2.3 and 3.3 AU from the Sun. Because some asteroids travel in orbits that can bring them close to Earth, there is a possibility of a collision that could have devastating results (see Earth impact hazard).

Comets are considered to come from a vast reservoir, the Oort cloud, orbiting the Sun at distances of 20,000–50,000 AU or more and containing trillions of icy objects (latent comet nuclei) with the potential to become active comets. Many comets have been observed over the centuries. Most make only a single pass through the inner solar system, but some are deflected by Jupiter or Saturn into orbits that allow them to return at predictable times. Halley's Comet is the best known of these periodic comets; its next return into the inner solar system is predicted for 2061. Many short-period comets are thought to come from the Kuiper belt, a region lying mainly between 30 AU and 50 AU from the Sun (beyond Neptune's orbit but including part of Pluto's) and housing perhaps hundreds of millions of comet nuclei. Very few comet masses have been well determined, but most are probably less than 10¹⁸ grams, one-billionth the mass of Earth.

Since the 1990s more than a thousand comet nuclei in the Kuiper belt have been observed with large telescopes; a few are about half the size of Pluto, and Pluto is the largest Kuiper belt object. Pluto's orbital and physical characteristics had long caused it to be regarded as an anomaly among the planets. However, after the discovery of numerous other Pluto-like objects beyond Neptune, Pluto was seen to be no longer unique in its neighbourhood but rather a giant member of the local population. Consequently, in 2006 astronomers at the general assembly of the International Astronomical Union elected to create the new category of dwarf planets for objects with such qualifications. Pluto, Eris, and Ceres, the latter being the largest member of the asteroid belt, were given this distinction. Two other Kuiper belt objects, Makemake and Haumea, were also designated as dwarf planets.

Smaller than the observed asteroids and comets are the meteoroids, lumps of stony or metallic material believed to be mostly fragments of asteroids. Meteoroids vary from small rocks to boulders weighing a ton or more. A relative few have orbits that bring them into Earth's atmosphere and down to the surface as meteorites. Most meteorites that have been collected on Earth are probably from asteroids. A few have been identified as being from the Moon, Mars, or the asteroid Vesta.

Meteorites are classified into three broad groups: stony (chondrites and achondrites; about 94 percent), iron (5 percent), and stony-iron (1 percent). Most meteoroids that enter the atmosphere heat up sufficiently to glow and appear as meteors, and the great majority of these vaporize completely or break up before they reach the surface. Many, perhaps most, meteors occur in showers (see meteor shower) and follow orbits that seem to be identical with those of certain comets, thus pointing to a cometary origin. For example, each May, when Earth crosses the orbit of Halley's Comet, the Eta Aquarid meteor shower occurs. Micrometeorites (interplanetary dust particles), the smallest meteoroidal particles, can be detected from Earth-orbiting satellites or collected by specially equipped aircraft flying in the stratosphere and returned for laboratory inspection. Since the late 1960s numerous meteorites have been found in the Antarctic on the surface of stranded ice flows (see Antarctic meteorites). Some meteorites contain microscopic crystals whose isotopic proportions are unique and appear to be dust grains that formed in the atmospheres of different stars.

The age of the solar system, taken to be close to 4.6 billion years, has been derived from measurements of radioactivity in meteorites, lunar samples, and Earth's crust. Abundances of isotopes of uranium, thorium, and rubidium and their decay products, lead and strontium, are the measured quantities.

Assessment of the chemical composition of the solar system is based on data from Earth, the Moon, and meteorites as well as on the spectral analysis of light from the Sun and planets. In broad outline, the solar system abundances of the chemical elements decrease with increasing atomic weight. Hydrogen atoms are by far the most abundant, constituting 91 percent; helium is next, with 8.9 percent; and all other types of atoms together amount to only 0.1 percent.

The origin of Earth, the Moon, and the solar system as a whole is a problem that has not yet been settled in detail. The Sun probably formed by condensation of the central region of a large cloud of gas and dust, with the planets and other bodies of the solar system forming soon after, their composition strongly influenced by the temperature and pressure gradients in the evolving solar nebula. Less-volatile materials could condense into solids relatively close to the Sun to form the terrestrial planets. The abundant, volatile lighter elements could condense only at much greater distances to form the giant gas planets.

In the 1990s astronomers confirmed that other stars have one or more planets revolving around them. Studies of these planetary systems have both supported and challenged astronomers' theoretical models of how Earth's solar system formed. Unlike the solar system, many extrasolar planetary systems have large gas giants like Jupiter orbiting very close to their stars, and in some cases these "hot Jupiters" are closer to their star than Mercury is to the Sun.

That so many gas giants, which form in the outer regions of their system, end up so close to their stars suggests that gas giants migrate and that such migration may have happened in the solar system's history. According to the Grand Tack hypothesis, Jupiter may have done so within a few million years of the solar system's formation. In this scenario, Jupiter is the first giant planet to form, at about 3 AU from the Sun. Drag from the protoplanetary disk causes it to fall inward to about 1.5 AU. However, by this time, Saturn begins to form at about 3 AU and captures Jupiter in a 3:2 resonance. (That is, for every three revolutions Jupiter makes, Saturn makes two.) The two planets migrate outward and clear away any material that would have gone to making Mars bigger. Mars should be bigger than Venus or Earth, but it is only half their size. The Grand Tack, in which Jupiter moves inward and then outward, explains Mars's small size.

About 500 million years after the Grand Tack, according to the Nice Model (named after the French city where it was first proposed), after the four giant planets (Jupiter, Saturn, Uranus, and Neptune) formed, they orbited 5–17 AU from the Sun. These planets were in a disk of smaller bodies called planetesimals and in orbital resonances with each other. About four billion years ago, gravitational interactions with the planetesimals increased the eccentricity of the planets' orbits, driving them out of resonance. Saturn, Uranus and Neptune migrated outward, and Jupiter migrated slightly inward. (Uranus and Neptune may even have switched places.) This migration scattered the disk, causing the Late Heavy Bombardment. The final remnant of the disk became the Kuiper belt.

The origin of the planetary satellites is not entirely settled. As to the origin of the Moon, the opinion of astronomers long oscillated between theories that saw its origin and condensation as simultaneous with the formation of Earth and those that posited a separate origin for the Moon and its later capture by Earth's gravitational field. Similarities and differences in abundances of the chemical elements and their isotopes on Earth and the Moon challenged each group of theories. Finally, in the 1980s a model emerged that gained the support of most lunar scientists: that of a large impact on Earth and the expulsion of material that subsequently formed the Moon. (See Moon: Origin and evolution.) For the outer planets, with their multiple satellites, many very small and quite unlike one another, the picture is less clear. Some of these moons have relatively smooth icy surfaces, whereas others are heavily cratered; at least one, Jupiter's Io, is volcanic. Some of the moons may have formed along with their parent planets, and others may have formed elsewhere and been captured.

The measurable quantities in stellar astrophysics include the externally observable features of the stars: distance, temperature, radiation spectrum and luminosity, composition (of the outer layers), diameter, mass, and variability in any of these. Theoretical astrophysicists use these observations to model the structure of stars and to devise theories for their formation and evolution. Positional information can be used for dynamical analysis, which yields estimates of stellar masses.

In a system dating back at least to the Greek astronomer-mathematician Hipparchus in the 2nd century BCE, apparent stellar brightness (m) is measured in magnitudes. Magnitudes are now defined such that a first-magnitude star is 100 times brighter than a star of sixth magnitude. The human eye cannot see stars fainter than about sixth magnitude, but modern instruments used with large telescopes can record stars as faint as about 30th magnitude. By convention, the absolute magnitude (M) is defined as the magnitude that a star would appear to have if it were located at a standard distance of 10 parsecs. These quantities are related through the expression m − M = 5 log₁₀ r − 5, in which r is the star's distance in parsecs.
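
As a minimal illustration of that relation (with made-up example values, not data from the text), the absolute magnitude follows directly from the apparent magnitude and the distance in parsecs:

import math

def absolute_magnitude(apparent_m, distance_pc):
    # Distance modulus: m - M = 5*log10(r) - 5, solved for M
    return apparent_m - 5 * math.log10(distance_pc) + 5

# Hypothetical star: apparent magnitude 8.5 at 250 parsecs
print(absolute_magnitude(8.5, 250.0))  # roughly 1.5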

The magnitude scale is anchored on a group of standard stars. An absolute measure of radiant power is luminosity, which is related to the absolute magnitude and usually expressed in ergs per second (ergs/sec). (Sometimes the luminosity is stated in terms of the solar luminosity, 3.86 × 10³³ ergs/sec.) Luminosity can be calculated when m and r are known. Correction might be necessary for the interstellar absorption of starlight.

There are several methods for measuring a star's diameter. From the brightness and distance, the luminosity (L) can be calculated, and, from observations of the brightness at different wavelengths, the temperature (T) can be calculated. Because the radiation from many stars can be well approximated by a Planck blackbody spectrum (see Planck's radiation law), these measured quantities can be related through the expression L = 4πR²σT⁴, thus providing a means of calculating R, the star's radius. In this expression, σ is the Stefan-Boltzmann constant, 5.67 × 10⁻⁵ ergs/cm²·K⁴·sec, in which K is the temperature in kelvins. (The radius R refers to the star's photosphere, the region where the star becomes effectively opaque to outside observation.) Stellar angular diameters can be measured through interferometry, that is, the combining of several telescopes together to form a larger instrument that can resolve sizes smaller than those that an individual telescope can resolve. Alternatively, the intensity of the starlight can be monitored during occultation by the Moon, which produces diffraction fringes whose pattern depends on the angular diameter of the star. Stellar angular diameters of several milliarcseconds can be measured.
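
As a sketch of the first method described above (the input values are illustrative, roughly solar), the radius follows from the luminosity and temperature via the same expression:

import math

SIGMA = 5.67e-5  # Stefan-Boltzmann constant in ergs/cm^2/K^4/sec (cgs, as in the text)

def stellar_radius_cm(luminosity_erg_s, temperature_k):
    # Solve L = 4*pi*R^2*sigma*T^4 for R
    return math.sqrt(luminosity_erg_s / (4 * math.pi * SIGMA * temperature_k**4))

# Roughly solar values: L ~ 3.86e33 erg/s, T ~ 5,800 K -> about 7e10 cm
print(f"{stellar_radius_cm(3.86e33, 5800):.2e} cm")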

Many stars occur in binary systems (see binary star), in which the two partners orbit their mutual centre of mass. Such a system provides the best measurement of stellar masses. The period (P) of a binary system is related to the masses of the two stars (m₁ and m₂) and the orbital semimajor axis (mean radius; a) via Kepler's third law: P² = 4π²a³/G(m₁ + m₂). (G is the universal gravitational constant.) From diameters and masses, average values of the stellar density can be calculated and thence the central pressure. With the assumption of an equation of state, the central temperature can then be calculated. For example, in the Sun the central density is 158 grams per cubic cm; the pressure is calculated to be more than one billion times the pressure of Earth's atmosphere at sea level and the temperature around 15 million K (27 million °F). At this temperature, all atoms are ionized, and so the solar interior consists of a plasma, an ionized gas with hydrogen nuclei (i.e., protons), helium nuclei, and electrons as major constituents. A small fraction of the hydrogen nuclei possess sufficiently high speeds that, on colliding, their electrostatic repulsion is overcome, resulting in the formation, by means of a set of fusion reactions, of helium nuclei and a release of energy (see proton-proton cycle). Some of this energy is carried away by neutrinos, but most of it is carried by photons to the surface of the Sun to maintain its luminosity.
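
A worked example of that Kepler's-third-law relation, using the Earth-Sun system as a stand-in "binary" purely for scale (cgs units, values approximate):

import math

G = 6.674e-8  # gravitational constant in cgs units (cm^3 g^-1 s^-2)

def total_mass_g(period_s, semimajor_axis_cm):
    # Kepler's third law: P^2 = 4*pi^2*a^3 / (G*(m1 + m2)), solved for m1 + m2
    return 4 * math.pi**2 * semimajor_axis_cm**3 / (G * period_s**2)

# A one-year period and a 1 AU orbit should recover roughly one solar mass (~2e33 g)
YEAR_S = 3.156e7
AU_CM = 1.496e13
print(f"{total_mass_g(YEAR_S, AU_CM):.2e} g")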

Other stars, both more and less massive than the Sun, have broadly similar structures, but the size, central pressure and temperature, and fusion rate are functions of the star's mass and composition. The stars and their internal fusion (and resulting luminosity) are held stable against collapse through a delicate balance between the inward pressure produced by gravitational attraction and the outward pressure supplied by the photons produced in the fusion reactions.

Stars that are in this condition of hydrostatic equilibrium are termed main-sequence stars, and they occupy a well-defined band on the Hertzsprung-Russell (H-R) diagram, in which luminosity is plotted against colour index or temperature. Spectral classification, based initially on the colour index, includes the major spectral types O, B, A, F, G, K, and M, each subdivided into 10 parts (see star: Stellar spectra). Temperature is deduced from broadband spectral measurements in several standard wavelength intervals. Measurement of apparent magnitudes in two spectral regions, the B and V bands (centred on 4350 and 5550 angstroms, respectively), permits calculation of the colour index, CI = mB − mV, from which the temperature can be calculated.

For a given temperature, there are stars that are much more luminous than main-sequence stars. Given the dependence of luminosity on the square of the radius and the fourth power of the temperature (the R²T⁴ of the luminosity expression above), greater luminosity implies larger radius, and such stars are termed giant stars or supergiant stars. Conversely, stars with luminosities much less than those of main-sequence stars of the same temperature must be smaller and are termed white dwarf stars. Surface temperatures of white dwarfs typically range from 10,000 to 12,000 K (18,000 to 21,000 °F), and they appear visually as white or blue-white.

The strength of spectral lines of the more abundant elements in a star's atmosphere allows additional subdivisions within a class. Thus, the Sun, a main-sequence star, is classified as G2 V, in which the V denotes main sequence. Betelgeuse, a red supergiant with a surface temperature about half that of the Sun but with a luminosity of about 10,000 solar units, is classified as M2 Iab. In this classification, the spectral type is M2, and the Iab indicates a supergiant, well above the main sequence on the H-R diagram.

The range of physically allowable masses for stars is very narrow. If the star's mass is too small, the central temperature will be too low to sustain fusion reactions. The theoretical minimum stellar mass is about 0.08 solar mass. An upper theoretical bound called the Eddington limit, of several hundred solar masses, has been suggested, but this value is not firmly defined. Stars as massive as this will have luminosities about one million times greater than that of the Sun.

A general model of star formation and evolution has been developed, and the major features seem to be established. A large cloud of gas and dust can contract under its own gravitational attraction if its temperature is sufficiently low. As gravitational energy is released, the contracting central material heats up until a point is reached at which the outward radiation pressure balances the inward gravitational pressure, and contraction ceases. Fusion reactions take over as the star's primary source of energy, and the star is then on the main sequence. The time to pass through these formative stages and onto the main sequence is less than 100 million years for a star with as much mass as the Sun. It takes longer for less massive stars and a much shorter time for those much more massive.

Once a star has reached its main-sequence stage, it evolves relatively slowly, fusing hydrogen nuclei in its core to form helium nuclei. Continued fusion not only releases the energy that is radiated but also results in nucleosynthesis, the production of heavier nuclei.

Stellar evolution has of necessity been followed through computer modeling, because the timescales for most stages are generally too extended for measurable changes to be observed, even over a period of many years. One exception is the supernova, the violently explosive finale of certain stars. Different types of supernovas can be distinguished by their spectral lines and by changes in luminosity during and after the outburst. In Type Ia, a white dwarf star attracts matter from a nearby companion; when the white dwarf's mass exceeds about 1.4 solar masses, the star implodes and is completely destroyed. Type II supernovas are not as luminous as Type Ia and are the final evolutionary stage of stars more massive than about eight solar masses. Type Ib and Ic supernovas are like Type II in that they are from the collapse of a massive star, but they do not retain their hydrogen envelope.

The nature of the final products of stellar evolution depends on stellar mass. Some stars pass through an unstable stage in which their dimensions, temperature, and luminosity change cyclically over periods of hours or days. These so-called Cepheid variables serve as standard candles for distance measurements (see above Determining astronomical distances). Some stars blow off their outer layers to produce planetary nebulas. The expanding material can be seen glowing in a thin shell as it disperses into the interstellar medium while the remnant core, initially with a surface temperature as high as 100,000 K (180,000 °F), cools to become a white dwarf. The maximum stellar mass that can exist as a white dwarf is about 1.4 solar masses and is known as the Chandrasekhar limit. More-massive stars may end up as either neutron stars or black holes.

The average density of a white dwarf is calculated to exceed one million grams per cubic cm. Further compression is limited by a quantum condition called degeneracy (see degenerate gas), in which only certain energies are allowed for the electrons in the star's interior. Under sufficiently great pressure, the electrons are forced to combine with protons to form neutrons. The resulting neutron star will have a density in the range of 10¹⁴ to 10¹⁵ grams per cubic cm, comparable to the density within atomic nuclei. The behaviour of large masses having nuclear densities is not yet sufficiently understood to be able to set a limit on the maximum size of a neutron star, but it is thought to be less than three solar masses.

Still more-massive remnants of stellar evolution would have smaller dimensions and would be even denser than neutron stars. Such remnants are conceived to be black holes, objects so compact that no radiation can escape from within a characteristic distance called the Schwarzschild radius. This critical dimension is defined by Rs = 2GM/c². (Rs is the Schwarzschild radius, G is the gravitational constant, M is the object's mass, and c is the speed of light.) For an object of three solar masses, the Schwarzschild radius would be about nine kilometres. Radiation emitted from beyond the Schwarzschild radius can still escape and be detected.
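The Schwarzschild radius is a one-line calculation. The sketch below uses rounded SI constants; it gives roughly three kilometres per solar mass, hence about nine kilometres for a three-solar-mass object.

```python
G_SI = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C_SI = 2.998e8          # speed of light, m/s
SOLAR_MASS_KG = 1.989e30

def schwarzschild_radius_km(mass_in_solar_masses):
    # Rs = 2GM/c^2, converted from metres to kilometres
    return 2 * G_SI * mass_in_solar_masses * SOLAR_MASS_KG / C_SI ** 2 / 1000

print(schwarzschild_radius_km(1))  # ~3 km
print(schwarzschild_radius_km(3))  # ~9 km
```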

Although no light can be detected coming from within a black hole, the presence of a black hole may be manifested through the effects of its gravitational field, as, for example, in a binary star system. If a black hole is paired with a normal visible star, it may pull matter from its companion toward itself. This matter is accelerated as it approaches the black hole and becomes so intensely heated that it radiates large amounts of X-rays from the periphery of the black hole before reaching the Schwarzschild radius. Some candidates for stellar black holes have been found, e.g., the X-ray source Cygnus X-1. Each of them has an estimated mass clearly exceeding that allowable for a neutron star, a factor crucial in the identification of possible black holes. Supermassive black holes that do not originate as individual stars exist at the centre of active galaxies (see below Study of other galaxies and related phenomena). One such black hole, that at the centre of the galaxy M87, has a mass 6.5 billion times that of the Sun and has been directly observed.

Whereas the existence of stellar black holes has been strongly indicated, the existence of neutron stars was confirmed in 1968 when they were identified with the then newly discovered pulsars, objects characterized by the emission of radiation at short and extremely regular intervals, generally between 1 and 1,000 pulses per second and stable to better than a part per billion. Pulsars are considered to be rotating neutron stars, remnants of some supernovas.

Stars are not distributed randomly throughout space. Many stars are in systems consisting of two or three members separated by less than 1,000 AU. On a larger scale, star clusters may contain many thousands of stars. Galaxies are much larger systems of stars and usually include clouds of gas and dust.

The solar system is located within the Milky Way Galaxy, close to its equatorial plane and about 8 kiloparsecs from the galactic centre. The galactic diameter is about 30 kiloparsecs, as indicated by luminous matter. There is evidence, however, for nonluminous matter (so-called dark matter) extending out nearly twice this distance. The entire system is rotating such that, at the position of the Sun, the orbital speed is about 220 km per second (almost 500,000 miles per hour) and a complete circuit takes roughly 240 million years. Application of Kepler's third law leads to an estimate for the galactic mass of about 100 billion solar masses. The rotational velocity can be measured from the Doppler shifts observed in the 21-cm emission line of neutral hydrogen and the lines of millimetre wavelengths from various molecules, especially carbon monoxide. At great distances from the galactic centre, the rotational velocity does not drop off as expected but rather increases slightly. This behaviour appears to require a much larger galactic mass than can be accounted for by the known (luminous) matter. Additional evidence for the presence of dark matter comes from a variety of other observations. The nature and extent of the dark matter (or missing mass) constitutes one of today's major astronomical puzzles.
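The mass estimate quoted here follows from treating the Sun's orbit as Keplerian, M ≈ v²r/G. The sketch below, with rounded constants, recovers a value near 100 billion solar masses; it illustrates the back-of-the-envelope estimate, not the full rotation-curve analysis.

```python
G_SI = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
SOLAR_MASS_KG = 1.989e30
KPC_M = 3.086e19        # metres per kiloparsec

def enclosed_mass_solar(orbital_speed_km_per_s, radius_kpc):
    # Keplerian estimate of the mass interior to a circular orbit: M = v^2 * r / G
    v = orbital_speed_km_per_s * 1e3
    r = radius_kpc * KPC_M
    return v ** 2 * r / G_SI / SOLAR_MASS_KG

print(enclosed_mass_solar(220, 8))  # ~9e10, i.e. roughly 100 billion solar masses
```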

There are about 100 billion stars in the Milky Way Galaxy. Star concentrations within the galaxy fall into three types: open clusters, globular clusters, and associations (see star cluster). Open clusters lie primarily in the disk of the galaxy; most contain between 50 and 1,000 stars within a region no more than 10 parsecs in diameter. Stellar associations tend to have somewhat fewer stars; moreover, the constituent stars are not as closely grouped as those in the clusters and are for the most part hotter. Globular clusters, which are widely scattered around the galaxy, may extend up to about 100 parsecs in diameter and may have as many as a million stars. The importance to astronomers of globular clusters lies in their use as indicators of the age of the galaxy. Because massive stars evolve more rapidly than do smaller stars, the age of a cluster can be estimated from its H-R diagram. In a young cluster the main sequence will be well populated, but in an old cluster the heavier stars will have evolved away from the main sequence. The extent of the depopulation of the main sequence provides an index of age. In this way, the oldest globular clusters have been found to be about 12.5 billion years old, which should therefore be the minimum age for the galaxy.

The interstellar medium, composed primarily of gas and dust, occupies the regions between the stars. On average, it contains less than one atom in each cubic centimetre, with about 1 percent of its mass in the form of minute dust grains. The gas, mostly hydrogen, has been mapped by means of its 21-cm emission line. The gas also contains numerous molecules. Some of these have been detected by the visible-wavelength absorption lines that they impose on the spectra of more-distant stars, while others have been identified by their own emission lines at millimetre wavelengths. Many of the interstellar molecules are found in giant molecular clouds, wherein complex organic molecules have been discovered.

In the vicinity of a very hot O- or B-type star, the intensity of ultraviolet radiation is sufficiently high to ionize the surrounding hydrogen out to a distance as great as 100 parsecs to produce an H II region, known as a Strömgren sphere. Such regions are strong and characteristic emitters of radiation at radio wavelengths, and their dimensions are well calibrated in terms of the luminosity of the central star. Using radio interferometers, astronomers are able to measure the angular diameters of H II regions even in some external galaxies and can thereby deduce the great distances to those remote systems. This method can be used for distances up to about 30 megaparsecs. (For additional information on H II regions, see nebula: Diffuse nebulae (H II regions).)

Interstellar dust grains scatter and absorb starlight, the effect being roughly inversely proportional to wavelength from the infrared to the near ultraviolet. As a result, stellar spectra tend to be reddened. Absorption typically amounts to about one magnitude per kiloparsec but varies considerably in different directions. Some dusty regions contain silicate materials, identified by a broad absorption feature around a wavelength of 10 μm. Other prominent spectral features in the infrared range have sometimes, though not conclusively, been attributed to graphite grains and polycyclic aromatic hydrocarbons (PAHs).

Starlight often shows a small degree of polarization (a few percent), with the effect increasing with stellar distance. This is attributed to the scattering of the starlight from dust grains that have been partially aligned in a weak interstellar magnetic field. The strength of this field is estimated to be a few microgauss, very close to the strength inferred from observations of nonthermal cosmic radio noise. This radio background has been identified as synchrotron radiation, emitted by cosmic-ray electrons traveling at nearly the speed of light and moving along curved paths in the interstellar magnetic field. The spectrum of the cosmic radio noise is close to what is calculated on the basis of measurements of the cosmic rays near Earth.

Cosmic rays constitute another component of the interstellar medium. Cosmic rays that are detected in the vicinity of Earth comprise high-speed nuclei and electrons. Individual particle energies, expressed in electron volts (eV; 1 eV = 1.6 × 10⁻¹² erg), range, with decreasing numbers, from about 10⁶ eV to more than 10²⁰ eV. Among the nuclei, hydrogen nuclei are the most plentiful at 86 percent, helium nuclei next at 13 percent, and all other nuclei together at about 1 percent. Electrons are about 2 percent as abundant as the nuclear component. (The relative numbers of different nuclei vary somewhat with kinetic energy, while the electron proportion is strongly energy-dependent.)

A minority of cosmic rays detected in Earth's vicinity are produced in the Sun, especially at times of increased solar activity (as indicated by sunspots and solar flares). The origin of galactic cosmic rays has not yet been conclusively identified, but they are thought to be produced in stellar processes such as supernova explosions, perhaps with additional acceleration occurring in the interstellar regions. (For additional information on interstellar matter, see Milky Way Galaxy: The general interstellar medium.)

The central region of the Milky Way Galaxy is so heavily obscured by dust that direct observation has become possible only with the development of astronomy at nonvisual wavelengths, namely radio, infrared, and, more recently, X-ray and gamma-ray wavelengths. Together, these observations have revealed a nuclear region of intense activity, with a large number of separate sources of emission and a great deal of dust. Detection of gamma-ray emission at a line energy of 511,000 eV, which corresponds to the annihilation of electrons and positrons (the antimatter counterpart of electrons), along with radio mapping of a region no more than 20 AU across, points to a very compact and energetic source, designated Sagittarius A*, at the centre of the galaxy. Sagittarius A* is a supermassive black hole with a mass equivalent to 4,310,000 Suns.

Galaxies are normally classified into three principal types according to their appearance: spiral, elliptical, and irregular. Galactic diameters are typically in the tens of kiloparsecs and the distances between galaxies typically in megaparsecs.

Spiral galaxies, of which the Milky Way system is a characteristic example, tend to be flattened, roughly circular systems with their constituent stars strongly concentrated along spiral arms. These arms are thought to be produced by traveling density waves, which compress and expand the galactic material. Between the spiral arms exists a diffuse interstellar medium of gas and dust, mostly at very low temperatures (below 100 K [−280 °F, −170 °C]). Spiral galaxies are typically a few kiloparsecs in thickness; they have a central bulge and taper gradually toward the outer edges.

Ellipticals show none of the spiral features but are more densely packed stellar systems. They range in shape from nearly spherical to very flattened and contain little interstellar matter. Irregular galaxies number only a few percent of all stellar systems and exhibit none of the regular features associated with spirals or ellipticals.

Properties vary considerably among the different types of galaxies. Spirals typically have masses in the range of a billion to a trillion solar masses, with ellipticals having values from 10 times smaller to 10 times larger and the irregulars generally 10 to 100 times smaller. Visual galactic luminosities show similar spreads among the three types, but the irregulars tend to be less luminous. In contrast, at radio wavelengths the maximum luminosity for spirals is usually 100,000 times less than for ellipticals or irregulars.

Quasars are objects whose spectra display very large redshifts, thus implying (in accordance with the Hubble law) that they lie at the greatest distances (see above Determining astronomical distances). They were discovered in 1963 but remained enigmatic for many years. They appear as starlike (i.e., very compact) sources of radio waves, hence their initial designation as quasi-stellar radio sources, a term later shortened to quasars. They are now considered to be the exceedingly luminous cores of distant galaxies. These energetic cores, which emit copious quantities of X-rays and gamma rays, are termed active galactic nuclei (AGN) and include the object Cygnus A and the nuclei of a class of galaxies called Seyfert galaxies. They are powered by the infall of matter into supermassive black holes.

The Milky Way Galaxy is one of the Local Group of galaxies, which contains about four dozen members and extends over a volume about two megaparsecs in diameter. Two of the closest members are the Magellanic Clouds, irregular galaxies about 50 kiloparsecs away. At about 740 kiloparsecs, the Andromeda Galaxy is one of the most distant in the Local Group. Some members of the group are moving toward the Milky Way system while others are traveling away from it. At greater distances, all galaxies are moving away from the Milky Way Galaxy. Their speeds (as determined from the redshifted wavelengths in their spectra) are generally proportional to their distances. The Hubble law relates these two quantities (see above Determining astronomical distances). In the absence of any other method, the Hubble law continues to be used for distance determinations to the farthest objects, that is, galaxies and quasars for which redshifts can be measured.

Cosmology is the scientific study of the universe as a unified whole, from its earliest moments through its evolution to its ultimate fate. The currently accepted cosmological model is the big bang. In this picture, the expansion of the universe started in an intense explosion 13.8 billion years ago. In this primordial fireball, the temperature exceeded one trillion K, and most of the energy was in the form of radiation. As the expansion proceeded (accompanied by cooling), the role of the radiation diminished, and other physical processes dominated in turn. Thus, after about three minutes, the temperature had dropped to the one-billion-K range, making it possible for nuclear reactions of protons to take place and produce nuclei of deuterium and helium. (At the higher temperatures that prevailed earlier, these nuclei would have been promptly disrupted by high-energy photons.) With further expansion, the time between nuclear collisions had increased and the proportion of deuterium and helium nuclei had stabilized. After a few hundred thousand years, the temperature must have dropped sufficiently for electrons to remain attached to nuclei to constitute atoms. Galaxies are thought to have begun forming after a few million years, but this stage is very poorly understood. Star formation probably started much later, after at least a billion years, and the process continues today.

Observational support for this general model comes from several independent directions. The expansion has been documented by the redshifts observed in the spectra of galaxies. Furthermore, the radiation left over from the original fireball would have cooled with the expansion. Confirmation of this relic energy came in 1965 with one of the most striking cosmic discoveries of the 20th century: the observation, at short radio wavelengths, of a widespread cosmic radiation corresponding to a temperature of almost 3 K (about −270 °C [−454 °F]). The shape of the observed spectrum is an excellent fit with the theoretical Planck blackbody spectrum. (The present best value for this temperature is 2.735 K, but it is still called three-degree radiation or the cosmic microwave background.) The spectrum of this cosmic radio noise peaks at approximately a one-millimetre wavelength, which is in the far infrared, a difficult region to observe from Earth; however, the spectrum has been well mapped by the Cosmic Background Explorer (COBE), Wilkinson Microwave Anisotropy Probe, and Planck satellites. Additional support for the big bang theory comes from the observed cosmic abundances of deuterium and helium. Normal stellar nucleosynthesis cannot produce their measured quantities, which fit well with calculations of production during the early stages of the big bang.

Early surveys of the cosmic background radiation indicated that it is extremely uniform in all directions (isotropic). Calculations have shown that it is difficult to achieve this degree of isotropy unless there was a very early and rapid inflationary period before the expansion settled into its present mode. Nevertheless, the isotropy posed problems for models of galaxy formation. Galaxies originate from turbulent conditions that produce local fluctuations of density, toward which more matter would then be gravitationally attracted. Such density variations were difficult to reconcile with the isotropy required by observations of the 3 K radiation. This problem was solved when the COBE satellite was able to detect the minute fluctuations in the cosmic background from which the galaxies formed.

The very earliest stages of the big bang are less well understood. The conditions of temperature and pressure that prevailed prior to the first microsecond require the introduction of theoretical ideas of subatomic particle physics. Subatomic particles are usually studied in laboratories with giant accelerators, but the region of particle energies of potential significance to the question at hand lies beyond the range of accelerators currently available. Fortunately, some important conclusions can be drawn from the observed cosmic helium abundance, which is dependent on conditions in the early big bang. The observed helium abundance sets a limit on the number of families of certain types of subatomic particles that can exist.

The age of the universe can be calculated in several ways. Assuming the validity of the big bang model, one attempts to answer the question: How long has the universe been expanding in order to have reached its present size? The numbers relevant to calculating an answer are Hubble's constant (i.e., the current expansion rate), the density of matter in the universe, and the cosmological constant, which allows for change in the expansion rate. In 2003 a calculation based on a fresh determination of Hubble's constant yielded an age of 13.7 billion ± 200 million years, although the precise value depends on certain assumed details of the model used. Independent estimates of stellar ages have yielded values less than this, as would be expected, but other estimates, based on supernova distance measurements, have arrived at values of about 15 billion years, still consistent within the errors. In the big bang model the age is proportional to the reciprocal of Hubble's constant, hence the importance of determining H as reliably as possible. For example, a value for H of 100 km/sec/Mpc would lead to an age less than that of many stars, a physically unacceptable result.
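Since the age scales with the reciprocal of Hubble's constant, the simplest estimate is the Hubble time 1/H. The sketch below shows why a value of H near 100 km/sec/Mpc is problematic: the resulting age falls below that of the oldest globular clusters. (The true age also depends on the matter density and the cosmological constant, which the sketch ignores.)

```python
MPC_KM = 3.086e19       # kilometres per megaparsec
YEAR_S = 3.156e7        # seconds per year

def hubble_time_gyr(h0_km_per_s_per_mpc):
    # Hubble time 1/H0 in billions of years (ignores density and the cosmological constant)
    h0_per_s = h0_km_per_s_per_mpc / MPC_KM
    return 1.0 / h0_per_s / YEAR_S / 1e9

print(hubble_time_gyr(70))   # ~14 billion years
print(hubble_time_gyr(100))  # ~9.8 billion years, younger than the oldest globular clusters
```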

A small minority of astronomers have developed alternative cosmological theories that are seriously pursued. The overwhelming professional opinion, however, continues to support the big bang model.

Finally, there is the question of the future behaviour of the universe: Is it open? That is to say, will the expansion continue indefinitely? Or is it closed, such that the expansion will slow down and eventually reverse, resulting in contraction? (The final collapse of such a contracting universe is sometimes termed the big crunch.) The density of the universe seems to be at the critical density; that is, the universe is neither open nor closed but flat. So-called dark energy, a kind of repulsive force that is now believed to be a major component of the universe, appears to be the decisive factor in predictions of the long-term fate of the cosmos. If this energy is a cosmological constant (as proposed in 1917 by Albert Einstein to correct certain problems in his model of the universe), then the result would be a big chill. In this scenario, the universe would continue to expand, but its density would decrease. While old stars would burn out, new stars would no longer form. The universe would become cold and dark. The dark (nonluminous) matter component of the universe, whose composition remains unknown, is not considered sufficient to close the universe and cause it to collapse; it now appears to contribute only a fourth of the density needed for closure.

An additional factor in deciding the fate of the universe might be the mass of neutrinos. For decades the neutrino had been postulated to have zero mass, although there was no compelling theoretical reason for this to be so. From the observation of neutrinos generated in the Sun and other celestial sources such as supernovas, in cosmic-ray interactions with Earths atmosphere, and in particle accelerators, investigators have concluded that neutrinos have some mass, though only an extremely small fraction of the mass of an electron. Although there are vast numbers of neutrinos in the universe, the sum of such small neutrino masses appears insufficient to close the universe.

Read more here:

astronomy | Definition & Facts | Britannica.com

How To Calculate Your Financial Independence Number

If you are pursuing financial independence and early retirement, you should have a financial independence number. If you have been on an FI journey for a while, this is nothing new to you.

The financial independence number is the amount of money you need in order to live off the returns on your net worth without depleting it. Once you have savings equal to your financial independence number, you can call yourself financially independent for life, because your spending no longer eats into your net worth.

Before we get to how you calculate your financial independence number, we need to understand a few things first.

Before we dive into the calculations, you need to find out how much money you will spend each month once you reach financial independence.

A good starting point is to find out how much money you spend per month at the moment. If you don't have a budget that shows this, estimate how much money you spend every month, everything included (e.g., housing, transport, clothes, etc.).

Keep in mind that some costs go up once you achieve financial independence and perhaps retire early.

Multiply your monthly spending by 12 to find out what your required yearly spending is when you become financially independent.

As an example, I spend roughly $2,500 per month, which makes my yearly spending requirement $30,000.

Next up is your safe withdrawal rate. This is used in combination with your yearly spending to calculate your financial independence number.

Much has been written about the safe withdrawal rate. People usually don't disagree with the concept of safe withdrawal rates, but they do like to debate the exact value it should have.

The safe withdrawal rate is the percentage of your net worth that you can withdraw each year without running out of money before you die. It originates from a 1998 study known as the Trinity study, which is where the 4% rule comes from, if you have ever heard of that.

The Trinity study argued for a safe withdrawal rate of 4% by looking at returns from 1926 to 1995. Over that time period, it would have been highly unlikely that you would have depleted your net worth if you had withdrawn only 4% every year. This assumes that your net worth stays invested in a stock-dominated portfolio.

I personally use a 4% safe withdrawal rate, but others argue that you should be even more conservative, for example by using a 3% safe withdrawal rate.

Why do I use 4% then? Well, if it turns out that a 4% withdrawal depletes my net worth, I will be fine with spending less or earning some more money for a while. Keep in mind that the safe withdrawal rate assumes that you don't make any additional income apart from investment returns.

Which safe withdrawal rate should you use? I would suggest something between 3 and 4 percent, and you'll be fine.

Now to the grand finale! Using the two financial ingredients from this post, you'll be able to calculate your financial independence number.

You can calculate your FI number using this equation:

Financial independence number = Yearly spending / Safe withdrawal rate

As an example, my financial independence number is:

$750,000 = $30,000 / 4%

My financial independence number is $750,000. Once my net worth reaches that amount, I am financially independent and can stop working for the rest of my life. Easy as that!
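For readers who prefer code to formulas, here is the same calculation as a small function. The numbers in the example are the ones used in this post; the 3% call just shows how a more conservative rate raises the target.

```python
def fi_number(monthly_spending, safe_withdrawal_rate=0.04):
    # Financial independence number = yearly spending / safe withdrawal rate
    yearly_spending = monthly_spending * 12
    return yearly_spending / safe_withdrawal_rate

print(fi_number(2500))        # 750000.0 with the 4% rule
print(fi_number(2500, 0.03))  # 1000000.0 with a more conservative 3% rate
```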

Having a number makes financial independence much more tangible for most people. For me it is a great motivation to have a clear goal, and I religiously track my progress every month.

If you are curious, you can also use your savings rate to calculate time to retirement using my financial independence calculator.
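The calculator itself isn't reproduced here, but the usual back-of-the-envelope version compounds your current net worth plus monthly savings at an assumed constant real return until the FI number is reached. The starting values and the 5% return in the sketch below are purely illustrative assumptions, not recommendations.

```python
def years_to_fi(net_worth, monthly_savings, monthly_spending,
                safe_withdrawal_rate=0.04, annual_return=0.05):
    # Compound net worth and contributions monthly until the FI number is reached
    target = monthly_spending * 12 / safe_withdrawal_rate
    months = 0
    while net_worth < target and months < 1200:  # cap at 100 years
        net_worth = net_worth * (1 + annual_return / 12) + monthly_savings
        months += 1
    return months / 12

# Illustrative numbers only (assumed starting point, savings, and return)
print(years_to_fi(net_worth=50_000, monthly_savings=2_000, monthly_spending=2_500))
```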

Your turn: What is your financial independence number?

See the article here:

How To Calculate Your Financial Independence Number

What does posthumanism mean? – definitions.net

Posthumanism

Posthumanism or post-humanism (meaning "after humanism" or "beyond humanism") is a term with at least seven definitions according to philosopher Francesca Ferrando:

- Antihumanism: any theory that is critical of traditional humanism and traditional ideas about humanity and the human condition.
- Cultural posthumanism: a branch of cultural theory critical of the foundational assumptions of humanism and its legacy that examines and questions the historical notions of "human" and "human nature", often challenging typical notions of human subjectivity and embodiment, and strives to move beyond archaic concepts of "human nature" to develop ones which constantly adapt to contemporary technoscientific knowledge.
- Philosophical posthumanism: a philosophical direction which draws on cultural posthumanism; this philosophical strand examines the ethical implications of expanding the circle of moral concern and extending subjectivities beyond the human species.
- Posthuman condition: the deconstruction of the human condition by critical theorists.
- Transhumanism: an ideology and movement which seeks to develop and make available technologies that eliminate aging and greatly enhance human intellectual, physical, and psychological capacities, in order to achieve a "posthuman future".
- AI takeover: a more pessimistic alternative to transhumanism in which humans will not be enhanced but rather eventually replaced by artificial intelligences. Some philosophers, including Nick Land, promote the view that humans should embrace and accept their eventual demise. This is related to the view of "cosmism", which supports the building of strong artificial intelligence even if it may entail the end of humanity, as in their view it "would be a cosmic tragedy if humanity freezes evolution at the puny human level".
- Voluntary Human Extinction, which seeks a "posthuman future" that in this case is a future without humans.

More:

What does posthumanism mean? - definitions.net

Posthumanism | Literature in a Wired World Wiki | FANDOM …

What is Posthumanism?

According to the Oxford English Dictionary:

1. post-humanism: A system of thought formulated in reaction to the basic tenets of humanism, esp. its focus on humanity rather than the divine or supernatural

2. posthumanism: The idea that humanity can be transformed, transcended, or eliminated either by technological advances or the evolutionary process; also, artistic, scientific, or philosophical practice which reflects this belief


N. Katherine Hayles was born in St. Louis, Missouri, on December 16, 1943. She attended the Rochester Institute of Technology, where she earned a B.S. in Chemistry. She then attended the California Institute of Technology and earned an M.S. in Chemistry as well. In 1977, she went to the University of Rochester and earned a Ph.D. in English Literature.

N. Katherine Hayles is a prominent critic of posthumanism. She is best known as the author of "How We Became Posthuman". She believes that although we can put our intellect into another machine, we still need to keep in mind who we are and that our information is not completely transferable; we still need our own bodies. She has become a critic of many believers in posthumanism who hold that the body acts as a piece of hardware just like any other computer.

[Video: Interview with N. Katherine Hayles by Stacey Cochran]

Do Androids Dream of Electric Sheep and Hayles' paper on posthumanism intertwine with one another, as Hayles believes that the "separation between body and mind is a consequence of historical change rather than what must inevitably happen as part of their materialized life." As we progress further into a new age in which humans slowly develop into an android-like state (people getting prostheses to help them function better), we are not going against humanity but simply flowing with the tides of history. With this kind of change, we are brought to the question: what makes us human? In DADES the only way to distinguish a human from an android is one concept: empathy. Some of the humans follow a religion known as Mercerism, which is based on empathy. By using an empathy box, they link themselves to other humans as they take on the obstacles that Mercer faces as a cohesive unit. We are shown how humans identify themselves as individuals and as members of a group through Mercerism, by being able to feel empathy toward each other. The novel toys with the idea of expanding this group to the few existing animals on Earth, and even to androids. These androids are advanced to the point where it is only possible to determine whether one is human or android by a test involving empathy. When the bounty hunter in DADES, Deckard, has to retire these androids, he begins to wonder whether he is in fact human: if being human is the ability to feel empathy, how can he truly be human if he feels no empathy when he retires the androids? In order to expand the definition of human to androids, Hayles and Dick both believe that a new mixture of man and machine must occur to fill this expanded category. A mixture of machine and man is already among us (as shown in one group's presentation of a man with a robotic eyeball), and many people already have robotic arms, legs, and other prostheses.

Blade Runner is a movie based on the novel Do Androids Dream of Electric Sheep. The film did not fare well at the box office, but it has since become a classic. Some may say the film needed time to catch on; it is now used in classrooms all around the United States to teach about posthumanism.


Shelley Jackson was born in the Philippines in 1963. Jackson attended Stanford as an undergraduate and earned her M.F.A. in creative writing at Brown. While at Brown, Jackson was inspired to create her first hypertext fiction, titled Patchwork Girl. At the time, this work was the best-selling CD of electronic literature, and it is considered a cornerstone of the electronic literature movement. Jackson currently teaches at The New School in New York City.

Similar to These Waves of Girls, "My Body" is a hypertext fiction that explores a young girl's memories of childhood and growing up. Many of the memories involve stories about growing up, sexuality, and body development. This hypertext fiction maps out different parts of a woman's body for readers to click on and discover the author's inner thoughts.


More:

Posthumanism | Literature in a Wired World Wiki | FANDOM ...