How babies change the dynamics of friendships – ABC News

I found the early days of motherhood incredibly isolating.

It never occurred to me that my childfree friends might feel isolated too.

But then I read an ABC Life story about childfree women, which explained how women without kids often feel excluded from society and forgotten by their parent friends.

The "forgotten" accusation left me wounded. Motherhood can be all-encompassing and it was a blow to feel like I might be letting my friends down.

So I reached out to my childfree colleague Kellie Scott.

We are both in our 30s and have seen how babies change the dynamics of friendships (from different sides of the fence).

We've written letters to our friends about our thoughts.

Since having my daughter almost two years ago, it's been hectic with some soaring highs, new mother worries and tears, and endless nights of broken sleep.

And a lot of time spent alone with the baby.

Before bub arrived I had no idea what this "mother" thing was about.

The load certainly isn't what I expected. It's so much more physically and emotionally demanding that sometimes planning beyond today is too much.

But here's a news flash from a new mother: I want to talk to you. I need to get out of the house. But I don't know how I can make that happen as regularly as I once did.


There's no doubt my world has changed: it's a lot smaller, while yours has continued on.

I've pondered the idea that you might feel excluded by me again and again. Maybe it stems from my fear of oversharing or boring you with baby photos and stories.

I didn't mean for my lack of sharing about my child's bowel movements and broken English to make you feel excluded.

I remember what it was like when a friend, high on hormones, told me that she never knew the type of love and how deep it could be until she held her newborn. I wanted to vomit on the spot.

I don't want to be that person.

I do, however, need some sort of sign that you're interested. Maybe even being OK with mess, children's music in the background and bath time.

I'd love to have a gin and tonic down at the funky bar on the corner once a week (I'd even settle for once every few months).

But even when all the planets (and shift work rosters) align, my partner is home to care for the tot and dinner is almost on the table, there's a little sense of guilt that I'm leaving at the most crucial and full-on time of the day: the dinner, bath, bed routine.


So please, come to me. Realise the demands of a little person are hard to work around, but that I do want to see you.

Join us for a meal because the free one-toddler show is always evolving; it's maturing. Pitching food is over and saying cheers and clinking milk cups is in.

After all, in our 20s a little drama at dinner or on a night out was entertaining.

If that's not your thing, maybe you could do the dishes and bring in the clothes from the line when I'm tied up.

I'd love the company of more strong women and I need everyone I can get to be part of my village to help raise an independent girl to become a good-natured, compassionate, friendly human being.

Despite our best attempts to break down some of the barriers facing women, I know my little girl will have to grapple with the same issues we have.

You are a person I have chosen as a friend. You of all people have the kind of traits I would like to instil in her.


We've been through a lot: we debated children, good and bad boyfriends, and career options over dinner many times. We can do this again; please don't be put off by the interruptions.

I miss you, I need you and I want to be there for you too.

I don't know if I'll ever have a child, but in my mid-30s, I'm one of the last in our friendship group to still be wondering.

My sister-in-law recently told me the best support she had with her newborns was someone just coming around to take washing off the line or cook dinner; funnily enough, the same things Sarah talks about above.

It got me thinking about what kind of friend I've been to mums in my circle.

Maybe you're too kind to say anything, but let's face it, when I visit, I just sit around chatting while sipping my wine, sometimes as chaos ensues around me.

Yes, I come to you, but no, I'm not very useful once there.

I don't really understand how hard it is to simply do washing or cook dinner, because I haven't experienced parenthood for myself.

And you rarely complain. (Maybe that's saved for mums' group, where like-minded parents can sympathise.)

I also feel incapable of helping with a small person.


I visited one friend and her two babies not long ago. She got really sick and my partner and I had to step up.

We were so proud we'd survived a few hours alone with the kids, until I realised we hadn't checked if they needed a nappy change.

Then I had to make a bottle and needed the neighbour to come and explain how.

If you haven't been around babies much, something as simple as changing and feeding them can be scary.

You really don't want to f*ck it up.

As for being left out, I've been one of the lucky ones.

We still see each other enough that we haven't become strangers.

Yes, there's some baby chat, but it's not all PG. We still talk about how weird bodies are, how scary Trump is, and that hilarious time we did X, Y and Z.


At this stage, the kids are still little. Perhaps if I never have children, the divide will become harder to bridge. But if that's the case I hope you are as mindful of that as I will be.

I don't want to be left behind, least of all by you.


There are three things I would say to the wonderful mothers in my life.

One, please don't feel bad about asking for help. I don't get what it's like and maybe never will. I'm sorry I haven't stepped up before, but I'm very open to being bossed around. Ask me to cook your dinner, and I'll do it in a heartbeat.

Two, understand I don't always feel confident around your child. I'm constantly worried I'm going to drop them (it's happened).

Three, know that your little people are important to me. And I'll never get sick of hearing about them as long as you never get sick of hearing me talk about my dog.

Oh, and there is one more thing: please find a good babysitter, because just sometimes, I want you all to myself.


How nanotechnology can improve access to cleaner, safer water – The Sociable

Many people take their access to water for granted. They turn on the faucet and get clean water in seconds, usually not thinking about the people who do not have such resources.

In many parts of the world, individuals have access to water, but it's unclean. When people drink that water, the consequences can be illness or death.

Statistics from the Centers for Disease Control and Prevention (CDC) indicate that unsafe drinking water, inadequate access to water for hygiene purposes and a lack of sanitation access collectively account for about 88% of diarrheal disease fatalities.

Even the less-severe cases of diarrhea or related issues can require kids to take time off from school or for adults to miss days of work, both of which could cause substantial disruptions.

Moreover, the CDC reports that 780 million people globally don't have so-called improved water sources available to them. Those should provide access to clean, safe water but don't in every case.

Addressing the lack of clean water will not be easy. Sometimes the issue is infrastructural at heart, with communities lacking the upgrades necessary to maintain clean water. The problem persists despite a poll finding that 85% of Americans support increasing federal investment to rebuild the water infrastructure.

Nanotechnology is increasingly a feasible option for dealing with water purification needs. It involves controlling matter on the atomic and molecular scale.

Let's look at some current and potential opportunities for applying nanotechnology to water purification and investigate how those possibilities could impact societies.

People have treated water with coagulants for centuries. The conventional process involves adding a substance such as aluminum sulfate to water, which causes large contaminants to group together and settle. However, this kind of coagulant-based purification does not get rid of smaller contaminants, and it requires going through multiple treatment processes to make the water safe.

Scientists improved on the traditional coagulant method by taking inspiration from the sea creature Actinia, which grabs prey with its tentacles. They designed a nanocoagulant that captures both large and small contaminants in water, including many not removed by other methods.

Since this is a single-step process, researchers believe it could be a cost-effective way to bring clean water to more people. Moreover, the researchers confirmed that the nanocoagulant removes nitrate, a cause of the often-fatal condition known as blue baby syndrome.

Practical water purification requires getting the job done quickly, but not so fast that quality control goes down. Scientists developed a new nanofilter that they say cleans water more than 100 times more efficiently than current methods. The team demonstrated how the filter tackles lead and oil-contaminated water in the lab.

However, they feel confident that the possible applications span far beyond those. For example, some of the components of the filter effectively purified water from things like phosphates and mercury in previous studies.

The technology utilizes nanostructures that grow on liquid metals. The group involved in its development says the achievement is scalable. In addition to being extremely efficient, the filter could be cheap to produce.

If all those characteristics remain once the filter is commercialized, the product will offer new, low-cost ways to make water drinkable, potentially bringing clean water to underserved populations.

Making water potable is exceptionally challenging without consistent access to electricity. However, scientists may have overcome that barrier with a solar-based nanotechnology method that depends on nanocellulose and graphene oxide to make a double-layered biofoam.

The researchers say the graphene oxide conducts heat and electricity during a process that evaporates dirty water and allows purified water to be collected.

The scientists envisioned eventually creating huge sheets of their biofoam and bringing them to areas that require clean water and have ample sunlight.

In that instance, this purification method could work well in rural areas that do not have a reliable electrical infrastructure, or when such infrastructure gets damaged, such as after a storm.

Microplastics are plastic fragments less than five millimeters across. They can harm the environment by getting into waterways, and scientists are concerned that microplastics may harm humans who unknowingly eat them.

Scientists devised a way to one day apply nanotechnology to wastewater treatment and stop microplastics from getting into hydration sources.

The group mixed manganese with carbon used to make nanotubes. This approach broke down the microplastics, eventually turning them into water and carbon dioxide. The scientists eliminated about half of the microplastics this way, and think they could get rid of them all if they allowed the associated chemical reaction to happen for a more extended period.
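The researchers' expectation that a longer reaction time would remove the remaining microplastics is consistent with simple decay arithmetic. As a rough illustration only (the article does not state the reaction kinetics; constant proportional removal per period is an assumption here), if each reaction period breaks down about half of what remains, the leftover fraction shrinks geometrically:

```python
def remaining_fraction(periods: float, removal_per_period: float = 0.5) -> float:
    """Fraction of microplastics left after a number of reaction periods,
    assuming a constant proportional removal rate (a hypothetical model)."""
    return (1 - removal_per_period) ** periods

print(remaining_fraction(1))  # 0.5 -> the roughly 50% removal observed in the study
print(remaining_fraction(4))  # 0.0625 -> under 7% left after four such periods
```

Under this model the fraction never reaches exactly zero, but it falls below any practical threshold given enough time, which matches the researchers' intuition that extending the reaction could eliminate essentially all of the microplastics.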

Scientists are not yet sure of the long-term effects of microplastics on humans and the environment. But this proactive method could avoid any catastrophes they might cause and make microplastic pollution less prevalent.

Water purification methods that use nanotechnology seem poised for growth throughout the foreseeable future.

They could address known challenges and promote the greater good by improving access to clean and trustworthy water sources.

Although some of the efforts are in the early stages, any progress could facilitate valuable research.


Nanoparticles may have bigger impact on the environment than previously thought – National Science Foundation

Non-antibacterial nanoparticles can cause resistance in bacteria

Chemist Erin Carlson led research showing that nanoparticles can cause resistance in bacteria.

October 15, 2019

Over the last two decades, nanotechnology has improved many everyday products, from microelectronics to sunscreens. Nanoparticles (particles just a few hundred atoms in size) are ending up in the environment by the ton, but scientists are still unclear about the long-term effects of these super-small particles.

In a first-of-its-kind study, published in Chemical Science, researchers have shown that nanoparticles may have a bigger impact on the environment than previously thought.

Researchers at the University of Minnesota, through the National Science Foundation Center for Sustainable Nanotechnology, found that a common, non-disease-causing bacterium in the environment, Shewanella oneidensis MR-1, developed rapid resistance when repeatedly exposed to nanoparticles used in making lithium ion batteries, the rechargeable batteries used in portable electronics and electric vehicles. The resistance means that the fundamental biochemistry and biology of the bacteria are changing.

The results of the study are unusual, the researchers say. Bacterial resistance usually occurs because bacteria become resistant to attempts to kill them. In this case, the nanoparticles used in lithium ion batteries were not intended to kill bacteria. This is the first report of non-antibacterial nanoparticles causing resistance in bacteria.

Bacteria are prevalent in lakes and soil where there is a delicate balance of organisms. Other organisms feed on the microbes, and the resistant bacteria could have effects scientists can't yet predict.

"Research that advances technology and sustains our environment is a priority for the Division of Chemistry," said Michelle Bushey, program director for the NSF Chemical Centers for Innovation Program. "This work reveals the unexplored and long-term impacts some nanoparticles have on the living organisms around us. This discovery at the chemistry-biology interface is a first step toward developing new sustainable materials and practices and providing the groundwork for possible remediation approaches."


Nanotechnology Market 2019| Global Industry Overview, Latest Trends, Business Boosting Strategies, CAGR Status, Growth Opportunities and Forecast 2026…

Verified Market Research has recently published a report, titled "Nanotechnology Market Size, Trends and Forecast to 2026". The research report provides an in-depth explanation of the various factors that are likely to drive the market. It discusses the market's future by examining its historical details. Analysts have studied the ever-changing market dynamics to evaluate their impact on the overall market. In addition, the report discusses the segments present in the market. Primary and secondary research methodologies have been used to provide readers with an accurate and precise understanding of the overall Nanotechnology market. Analysts have also given readers an unbiased opinion about the direction companies will take during the forecast period.

The research report also includes the global market figures that provide historical data as well as estimated figures. It gives a clear picture of the growth rate of the market during the forecast period. The report aims to give the readers quantifiable data that is collected from verified data. The report attempts to answer all the difficult questions such as market sizes and company strategies.

Global Nanotechnology Market was valued at USD 1.03 Billion in 2018 and is projected to reach USD 2.29 Billion by 2026, growing at a CAGR of 10.40% from 2019 to 2026.
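Those headline figures imply a standard compound annual growth rate calculation. As a sanity check (a sketch, not taken from the report; the assumption of 8 compounding years from the 2018 base to 2026 is ours), the stated values give a rate close to the quoted 10.40%:

```python
# Implied CAGR: value grows from USD 1.03B (2018 base) to USD 2.29B (2026).
# Assuming 8 compounding years, 2019 through 2026 inclusive (the report's
# exact convention is not stated).
start_usd_bn = 1.03
end_usd_bn = 2.29
years = 8

cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.2%}")  # roughly 10.5%, close to the stated 10.40%
```

The small gap between the computed value and the quoted 10.40% likely comes down to rounding in the reported figures or a slightly different base-year convention.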

Get | Download Sample Copy @https://www.verifiedmarketresearch.com/download-sample/?rid=15416&utm_source=UKN&utm_medium=AK

Key Players Mentioned in the Nanotechnology Market Research Report: Nanosys, QD Vision, Arkema, 10Angstroms, 10x MicroStructures, 10x Technology Inc, 3M, 3rd Millennium Inc, 3rdTech Inc, Bayer Material Science and Cortex

Nanotechnology Market: Drivers and Restraints

The report explains the drivers shaping the future of the Nanotechnology market. It evaluates the various forces that are expected to create a positive influence on the overall market. Analysts have studied the investments in research and development of products and technologies that are expected to give the players a definite boost. Furthermore, researchers have also included an analysis of the changing consumer behaviour that is projected to impact the supply and demand cycles present in the Nanotechnology market. Evolving per capita earnings, improving economic statuses, and emerging trends have all been studied in this research report.

The research report also explains the potential restraints present in the global Nanotechnology market. It evaluates the aspects that are likely to hamper the market growth in the near future. In addition to this assessment, it also provides a list of opportunities that could prove lucrative to the overall market. Analysts provide solutions for turning threats and restraints into successful opportunities in the coming years.

Nanotechnology Market: Regional Segmentation

In the successive chapters, analysts have studied the regional segments present in the Nanotechnology market. This gives the readers a narrowed-view of the global market enabling a closer look at the elements that could define its progress. It highlights myriad regional aspects such as the impact of culture, environment, and government policies that influence the regional markets.

Ask for Discount @https://www.verifiedmarketresearch.com/ask-for-discount/?rid=15416&utm_source=UKN&utm_medium=AK

Table of Content

1 Introduction of Nanotechnology Market

1.1 Overview of the Market
1.2 Scope of Report
1.3 Assumptions

2 Executive Summary

3 Research Methodology of Verified Market Research

3.1 Data Mining
3.2 Validation
3.3 Primary Interviews
3.4 List of Data Sources

4 Nanotechnology Market Outlook

4.1 Overview
4.2 Market Dynamics
4.2.1 Drivers
4.2.2 Restraints
4.2.3 Opportunities
4.3 Porter's Five Force Model
4.4 Value Chain Analysis

5 Nanotechnology Market, By Deployment Model

5.1 Overview

6 Nanotechnology Market, By Solution

6.1 Overview

7 Nanotechnology Market, By Vertical

7.1 Overview

8 Nanotechnology Market, By Geography

8.1 Overview
8.2 North America
8.2.1 U.S.
8.2.2 Canada
8.2.3 Mexico
8.3 Europe
8.3.1 Germany
8.3.2 U.K.
8.3.3 France
8.3.4 Rest of Europe
8.4 Asia Pacific
8.4.1 China
8.4.2 Japan
8.4.3 India
8.4.4 Rest of Asia Pacific
8.5 Rest of the World
8.5.1 Latin America
8.5.2 Middle East

9 Nanotechnology Market Competitive Landscape

9.1 Overview
9.2 Company Market Ranking
9.3 Key Development Strategies

10 Company Profiles

10.1.1 Overview
10.1.2 Financial Performance
10.1.3 Product Outlook
10.1.4 Key Developments

11 Appendix

11.1 Related Research

Complete Report is Available @ https://www.verifiedmarketresearch.com/product/nanotechnology-market/?utm_source=UKN&utm_medium=AK

We also offer customization on reports based on specific client requirement:

1. Free country-level analysis for any 5 countries of your choice.

2. Free competitive analysis of any market players.

3. Free 40 analyst hours to cover any other data points.

About Us:

Verified Market Research has been providing Research Reports, with up to date information, and in-depth analysis, for several years now, to individuals and companies alike that are looking for accurate Research Data. Our aim is to save your Time and Resources, providing you with the required Research Data, so you can only concentrate on Progress and Growth. Our Data includes research from various industries, along with all necessary statistics like Market Trends, or Forecasts from reliable sources.

Contact Us:

Mr. Edwyne Fernandes
Call: +1 (650) 781 4080
Email: sales@verifiedmarketresearch.com


Research Brief: Nanoparticles may have bigger impact on the environment than previously thought – UMN News

Over the last two decades, nanotechnology has improved many of the products we use every day, from microelectronics to sunscreens. Nanoparticles (particles that are just a few hundred atoms in size) are ending up in the environment by the ton, but scientists are still unclear about the long-term effects of these super-small particles.

In a first-of-its-kind study, researchers have shown that nanoparticles may have a bigger impact on the environment than previously thought. The research is published in Chemical Science, a peer-reviewed journal of the Royal Society of Chemistry.

Researchers from the National Science Foundation Center for Sustainable Nanotechnology, led by scientists at the University of Minnesota, found that a common, non-disease-causing bacterium in the environment, called Shewanella oneidensis MR-1, developed rapid resistance when repeatedly exposed to nanoparticles used in making lithium ion batteries, the rechargeable batteries used in portable electronics and electric vehicles. Resistance means the bacteria can survive higher and higher quantities of the materials, which in turn means that the fundamental biochemistry and biology of the bacteria are changing.

"At many times throughout history, materials and chemicals like asbestos or DDT have not been tested thoroughly and have caused big problems in our environment," said Erin Carlson, a University of Minnesota chemistry associate professor in the University's College of Science and Engineering and the lead author of the study. "We don't know that these results are that dire, but this study is a warning sign that we need to be careful with all of these new materials, and that they could dramatically change what's happening in our environment."

Carlson said the results of this study are unusual because typically when we talk about bacterial resistance it is because we've been treating the bacteria with antibiotics. "The bacteria become resistant because we are trying to kill them," she said. "In this case, the nanoparticles used in lithium ion batteries were never made to kill bacteria."

This is the first report of non-antibacterial nanoparticles causing resistance in bacteria.

In the past, many studies in the field exposed bacteria to a large dose of nanoparticles and observed if the bacteria died. This study was different because it looked at what happens over a more extended period of time to test how the bacteria might adapt over multiple generations when continually exposed to the nanoparticles. The bacteria were clearly able to take higher and higher doses of these materials over time without dying.

"Even though a nanoparticle may not be toxic to a microbe, it can still be dangerous," said Stephanie Mitchell, a University of Minnesota chemistry graduate student and lead graduate student on this study.

Carlson warns that the results of this study go far beyond just bacteria.

"This research is very important to humans because bacteria are prevalent in our lakes and soil where there is a delicate balance of organisms. Other organisms feed on these microbes and there could be a major effect up the food chain, or these resistant bacteria could have other effects we can't even predict right now."

Carlson said the researchers will continue follow-up studies to determine the effects of other human-made nanomaterials on other organisms in the environment and the long-term effects.

"Research that both advances technology and sustains our environment is a priority for the Division of Chemistry," said Michelle Bushey, program director for the Chemical Centers for Innovation Program at the National Science Foundation. "This work reveals unexplored and long-term impacts that some nanoparticles have on the living organisms around us. This discovery at the chemistry-biology interface is a first step toward developing new sustainable materials and practices, as well as providing the groundwork for possible remediation approaches."

In addition to Carlson and Mitchell, other lead researchers on the study include University of Minnesota Chemistry Professor Christy Haynes, Augsburg University Chemistry Associate Professor Z. Vivian Feng, and University of Wisconsin-Madison Chemistry Professor Robert Hamers, the director of the Center for Sustainable Nanotechnology. Others on the research team include University of Minnesota researchers Natalie Hudson-Smith, Meghan Cahill, and Benjamin Reynolds; Augsburg University researchers Seth Frand and Rodrigo Tapia Hernandez; and University of Wisconsin-Madison researchers Curtis Green and Chenyu Wang.

This research was funded by the National Science Foundation through the Center for Sustainable Nanotechnology, an NSF Center for Chemical Innovation.

To read the full research paper, visit the Chemical Science website.


Cause of Harmful Dendrites and Whiskers in Lithium Batteries Uncovered [Video] – SciTechDaily

PNNL scientists Chongmin Wang, Wu Xu and Yang He with the specially modified environmental transmission electron microscope they used to capture images and video of growing whiskers inside a lithium battery. Credit: Photo by Andrea Starr/PNNL

Scientists have uncovered a root cause of the growth of needle-like structures, known as dendrites and whiskers, that plague lithium batteries, sometimes causing a short circuit, failure, or even a fire.

The team, led by Chongmin Wang at the Department of Energy's Pacific Northwest National Laboratory, has shown that the presence of certain compounds in the electrolyte (the liquid material that makes a battery's critical chemistry possible) prompts the growth of dendrites and whiskers. The team hopes the discovery will lead to new ways to prevent their growth by manipulating the battery's ingredients. The results were published online October 14, 2019, in Nature Nanotechnology.

Researchers at PNNL have captured on video the growth of a harmful structure known as a whisker inside a nanosized lithium metal battery. Lithium ions begin to clump together, forming a particle; the structure grows slowly as more and more lithium atoms glom on, growing the same way that a stalagmite grows from the floor of a cave. Then, suddenly, a whisker shoots forth.

Video courtesy of He et al., Nature Nanotechnology

Dendrites are tiny, rigid tree-like structures that can grow inside a lithium battery; their needle-like projections are called whiskers. Both cause tremendous harm; notably, they can pierce a structure known as the separator inside a battery, much like a weed can poke through a concrete patio or a paved road. They also increase unwanted reactions between the electrolyte and the lithium, speeding up battery failure. Dendrites and whiskers are holding back the widespread use of lithium metal batteries, which have higher energy density than their commonly used lithium-ion counterparts.

The PNNL team found that the origin of whiskers in a lithium metal battery lies in a structure known as the SEI or solid-electrolyte interphase, a film where the solid lithium surface of the anode meets the liquid electrolyte. Further, the scientists pinpointed a culprit in the growth process: ethylene carbonate, an indispensable solvent added to electrolyte to enhance battery performance.

It turns out that ethylene carbonate leaves the battery vulnerable to damage.

The team's findings include videos that show the step-by-step growth of a whisker inside a nanosized lithium metal battery specially designed for the study.

A dendrite begins when lithium ions start to clump, or nucleate, on the surface of the anode, forming a particle that signifies the birth of a dendrite. The structure grows slowly as more and more lithium atoms glom on, growing the same way that a stalagmite grows from the floor of a cave. The team found that the energy dynamics on the surface of the SEI push more lithium ions into the slowly growing column. Then, suddenly, a whisker shoots forth.

It wasn't easy for the team to capture the action. To do so, scientists integrated an atomic force microscope (AFM) and an environmental transmission electron microscope (ETEM), a highly prized instrument that allows scientists to study an operating battery under real conditions.

Researcher Yang He adds a sample to the environmental transmission electron microscope. Credit: Photo by Andrea Starr/PNNL

The team used the AFM to measure the tiny force of the whisker as it grew. Much like a physician measures a patient's hand strength by asking the patient to push upward against the doctor's outstretched hands, the PNNL team measured the force of the growing whisker by pushing down on its tip with the cantilever of the AFM and measuring the force the dendrite exerted during its growth.

The team found that the level of ethylene carbonate directly correlates with dendrite and whisker growth. The more of the material the team put in the electrolyte, the more the whiskers grew. The scientists experimented with the electrolyte mix, changing ingredients in an effort to reduce dendrites. Some changes, such as the addition of cyclohexanone, prevented the growth of dendrites and whiskers.

"We don't want to simply suppress the growth of dendrites; we want to get to the root cause and eliminate them," said Wang, a corresponding author of the paper along with Wu Xu. "We drew upon the expertise of our colleagues who have expertise in electrochemistry. My hope is that our findings will spur the community to look at this problem in new ways. Clearly, more research is needed."

"Understanding what causes whiskers to start and grow will lead to new ideas for eliminating them or at least controlling them to minimize damage," added first author Yang He. He and the team tracked how whiskers respond to an obstacle, either buckling, yielding, kinking, or stopping. A greater understanding could help clear the path for the broad use of lithium metal batteries in electric cars, laptops, mobile phones, and other areas.

###

Authors of the paper, from PNNL and EMSL, include Wang, Xu, and He, as well as Xiaodi Ren, Yaobin Xu, Mark Engelhard, Xiaolin Li, Jie Xiao, Jun Liu, and Ji-Guang (Jason) Zhang. The work was funded by DOE's Office of Energy Efficiency and Renewable Energy's Vehicle Technologies Office. The work was made possible thanks to the unique combination of capabilities available at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science user facility located at PNNL.

Reference: "Origin of lithium whisker formation and growth under stress" by Yang He, Xiaodi Ren, Yaobin Xu, Mark H. Engelhard, Xiaolin Li, Jie Xiao, Jun Liu, Ji-Guang Zhang, Wu Xu and Chongmin Wang, 14 October 2019, Nature Nanotechnology. DOI: 10.1038/s41565-019-0558-z

About PNNL

Pacific Northwest National Laboratory draws on signature capabilities in chemistry, Earth sciences, and data analytics to advance scientific discovery and create solutions to the nation's toughest challenges in energy resiliency and national security. Founded in 1965, PNNL is operated by Battelle for the U.S. Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.

Originally posted here:

Cause of Harmful Dendrites and Whiskers in Lithium Batteries Uncovered [Video] - SciTechDaily

The Concern for the Secretive Bio-Geopolitics – FXStreet

Recently, many countries in Asia have suffered from deadly and costly epidemics. While globalization and climate change may play a causal role, germ geopolitics cannot be excluded any longer.

Insect-borne, animal and crop-based diseases typically occur for natural reasons. All three have also been aggravated by globalization and climate change. However, evidence suggests that some of these outbreaks may also involve prior deployment in "biological programs" and "research."

Take anthrax, for instance. Despite the post-9/11 concerns, the bacteria continue to be researched. In May 2015, the Pentagon confirmed that its lab in Utah had inadvertently sent live anthrax samples to one of its military bases in South Korea. Last April, civic groups and residents took to the streets to protest against biological agent experiments, which the US was reportedly conducting at Busan's Port Pier 8. The Pentagon's budget estimates suggest the project was ongoing, with funds set aside for live agent tests.

These issues remain sensitive in East Asia, in light of the US biowarfare against North Koreans and Chinese in the 1950s and contemporary geopolitics. Biological agents have dual-use functions. Like new technologies, they can save but also incapacitate and destroy human lives.

African Swine Fever: Epidemics vs. Geopolitics

African swine fever (ASF) is a hemorrhagic fever of pigs with mortality rates close to 100 percent and major economic losses. Historically, the first ASF outbreak took place in Kenya in 1907 and the first outside Africa in Portugal in 1957. That's the official story.

In reality, by the early '50s, several viruses, including ASF, were available at Fort Terry, a US bio-warfare facility on Plum Island, New York. Between the 1960s and the late '90s, Cuba accused Washington of ten biological warfare attacks following serious infectious disease outbreaks. None were proven conclusively, but several most likely occurred. In 1971, pigs in a Havana hog farm were diagnosed with the ASF virus, which caused half a million pigs to be slaughtered. As Cuba suffered a food shortage, the UN labeled the outbreak the "most alarming event" of 1971.

The debacle remained a mystery until 1977, when Long Island's Newsday reported the virus had been delivered from a US army base, the site of joint Army-CIA covert operations in the Panama Canal Zone. The US Central Intelligence Agency (CIA) denied involvement. Yet bio-warfare historian Norman Covert has affirmed that the CIA had access to the laboratories.

CIA Denies Link to Cuban Swine Fever

Following the Cold War, the ASF threat seemed to have been defused. But as a series of color revolutions took off in Eastern Europe, in countries targeted for NATO enlargement, the ASF in 2007 spread to Georgia in the Caucasus and thereafter widely to neighboring countries, including Armenia, Azerbaijan and several territories in Russia.

After a decade of relative quiet, the first ASF outbreak in China was reported in Shenyang in August 2018. It was thought to have come to China via Russia or Eastern Europe; that is, through the "color revolution" countries.

The timing is intriguing. In China, the spread of ASF began with the US trade war after mid-2018. As a result, by last spring US pork sales to China were already over three times pricier than a year before, despite the US retaliatory tariffs. China's over 400 million pigs account for half of the world total. The ASF is a major threat to global food security.

US Trade War, China's ASF Epidemic, Soaring Import Prices

China's Breeding Sow Population, 2016-19

Source: FAO (UN), USDA (US), MARA (China)

Ethnic Bio-Bombs, Non-Endemic Outbreaks

After the Cold War, the Nunn-Lugar Cooperative Threat Reduction Program (CTRP) was created, presumably to keep the former Soviet Union's nuclear and chemical infrastructure out of the hands of rogue nations and terrorists. But as Congress in 1996 began to expand the program internationally, so did efforts to capitalize on its offensive uses.

In particular, the neoconservative Project for the New American Century (PNAC), the ideological force behind the subsequent Bush administration's foreign policy, declared in its manifesto Rebuilding America's Defenses (2000) that "advanced forms of biological warfare that can target specific genotypes may transform biological warfare from the realm of terror to a politically useful tool."

Previously, such efforts at biological "ethnic bombs" had occurred mainly in apartheid-era South Africa and Rhodesia; the PNAC built on the Israeli ethno-bomb idea of targeting specific genetic traits among target populations.

In May 2007, Russia banned all exports of human bio samples, due to concern over "genetic bio-weapons" targeting the Russian population. Reportedly, foreign institutions, including Harvard Public Health and USAID, have collected biological material in China as well. In October 2018, the Russian Defense Ministry claimed the spread of viral diseases from Georgia, including African swine fever since 2007, could be connected to a US lab network in the area, where more than 70 Georgians had died under odd circumstances.

The lab network, a branch of the Nunn-Lugar bio-initiative, belongs to the multimillion-dollar Cooperative Biological Engagement Program (CBEP) funded by the Pentagon's Defense Threat Reduction Agency (DTRA). The CBEP labs are located in 25 countries, including in Eastern Europe (e.g., Georgia and Ukraine), the Middle East, Africa and Southeast Asia. In several locations, there have been reported outbreaks of tropical diseases that are not endemic to the area.

Despite high-level Russian calls for a comprehensive evaluation and joint inspections, pleas for multilateral cooperation have been ignored. In its 2020 multimillion-dollar budget, the DTRA characterizes the "bio-security" program in Asia as the partner of choice in a region competing against Chinese influence.

Pressing Need for Multipolar Cooperation

Even the discoverer of the cause of the devastating Lyme disease, Willy Burgdorfer, participated in US bio-warfare research, according to the science bestseller Bitten (2019). That has triggered New Jersey Rep. Chris Smith's investigation into whether the Pentagon has experimented with ticks and other insects as biological weapons.

International concern is rising over the role of potential covert goals in viral outbreaks. By September, the fall armyworm, a pest that can damage a wide variety of crops, had spread to 25 Chinese provinces, posing a severe threat to food security. First described in 1797, it used to be endemic only to the Americas. After Trump's 2016 win, it globalized faster than Facebook. Only crisis measures permitted China to contain the threat this year.

A new Pentagon program called Insect Allies, funded by the Defense Advanced Research Projects Agency (DARPA), relies on gene editing and hopes to infect insects with modified viruses, presumably to make US crops more resilient. In contrast, international scientists suggest such programs represent not agricultural research but new bio-weapon programs, which would violate the Biological Weapons Convention.

During the Cold War, the threat of mutually assured destruction constrained nuclear and bio-warfare risks. The contemporary era is devoid of such constraints and thus far more dangerous. The most effective way to resolve the contested bio-warfare challenges would be to build on international multilateral biological arms control, particularly the 1925 Geneva Protocol, the 1972 Biological and Toxin Weapons Convention (BWC) and the 1993 Chemical Weapons Convention (CWC).

With rising climate risks and geopolitical tensions, no single country should have a monopoly over biological agents in the 21st century. What's desperately needed is multipolar cooperation among the major advanced economies and large emerging powers.

Dr. Dan Steinbock is the founder of Difference Group and has served at the India, China and America Institute (US), Shanghai Institute for International Studies (China) and the EU Center (Singapore).

A shorter version of this commentary was released by China-US Focus on October 10, 2019.


Fermilab and University of Chicago scientist Josh Frieman awarded $1 million by DOE Office of Science – Fermi National Accelerator Laboratory

The Department of Energy has awarded Fermilab and University of Chicago scientist Josh Frieman $1 million over three years as part of the inaugural Office of Science Distinguished Scientist Fellowship program.

Office of Science distinguished scientist fellows were chosen from nominations submitted by nine U.S. national laboratories. Frieman is one of only five scientists selected, recognized for his scientific leadership, engagement with the academic research community, scientific excellence and significant scientific achievement.

The Distinguished Scientist Fellowship was established to develop, sustain and promote excellence in Office of Science research through collaborations between institutions of higher education and national laboratories.

Frieman says he will use the funding to support his cosmic research program and to foster tighter connections in cosmic frontier research between Fermilab and the University of Chicago.

While a significant number of University of Chicago graduate students and postdoctoral researchers have conducted research at Fermilab in a variety of areas of high-energy physics, very few currently carry out cosmology or theoretical astrophysics work at the lab.

Frieman aims to change that by building more active collaboration between Fermilab and the University of Chicago in research with cosmic surveys.

"There are many very talented students at the university and many very talented scientists at Fermilab," Frieman said. "I've mentored students and postdocs at the University of Chicago, but few of them have spent time at Fermilab. And there are postdocs in the astrophysics groups at Fermilab who spend a small fraction of their time at the university. I'm looking to bridge that gap, to help make the whole greater than the sum of its parts."

Frieman's current research centers on the Fermilab-hosted Dark Energy Survey, a project he led from 2010 to 2018, and will transition in coming years to the Large Synoptic Survey Telescope, whose construction is managed by SLAC National Accelerator Laboratory.

"With its full data set accumulated, the Dark Energy Survey is at a very exciting phase of its science analysis, and both the university and Fermilab will play significant roles in LSST. I'd like to get more students and postdocs engaged in both projects and to stimulate synergies between the lab and the university in the process," he said. "Collaboration drives science forward, and this award recognizes that the more closely the labs and universities work together, the further we can take our research. It's an honor to be among the first recipients of this fellowship."

With a long list of leadership roles and academic distinctions to his credit, Frieman has the experience needed to bring these two research groups together. Currently the president of the Aspen Center for Physics, Frieman is a fellow of the American Physical Society, of the American Association for the Advancement of Science, and of the American Academy of Arts and Sciences. He is also chair of the American Physical Society Division of Astrophysics. He previously served on the Particle Physics Project Prioritization Panel of the High Energy Physics Advisory Panel, on the Astro 2010 Decadal Survey Committee, and on the Astronomy and Astrophysics Advisory Committee.

Frieman is head of the Fermilab Particle Physics Division and is a professor of astronomy and astrophysics and a member of the Kavli Institute for Cosmological Physics at the University of Chicago.


Humans Will Never Live on Another Planet, Nobel Laureate Says. Here’s Why. – Livescience.com

Here's the reality: We're messing up the Earth and any far-out ideas of colonizing another orb when we're done with our own are wishful thinking. That's according to Michel Mayor, an astrophysicist who was a co-recipient of the Nobel Prize in physics this year for discovering the first planet orbiting a sun-like star outside of our solar system.

"If we are talking about exoplanets, things should be clear: We will not migrate there," he told Agence France-Presse (AFP). He said he felt the need to "kill all the statements that say, 'OK, we will go to a livable planet if one day life is not possible on Earth.'"

All of the known exoplanets, or planets outside of our solar system, are too far away to feasibly travel to, he said. "Even in the very optimistic case of a livable planet that is not too far, say a few dozen light years, which is not a lot, it's in the neighbourhood, the time to go there is considerable," he added.


Mayor shared half of the Nobel Prize this year along with Didier Queloz for discovering the first exoplanet in October 1995. Using novel instruments at the Haute-Provence Observatory in southern France, they detected a gas giant similar to Jupiter, which they named 51 Pegasi b. (The other half of the prize was awarded to James Peebles of Princeton University for his work in dark matter and dark energy).

Since then, over 4,000 other exoplanets have been found in the Milky Way, but apparently, none of them can be feasibly reached.

Stephen Kane, a professor of planetary astrophysics at the University of California, Riverside, agrees with Mayor. "The sad reality is that, at this point in human history, all stars are effectively at a distance of infinity," Kane told Live Science. "We struggle very hard as a species to reach the Earth's moon."

We might be able to send people to Mars in the next 50 years, but "I would be very surprised if humanity made it to the orbit of Jupiter within the next few centuries," he said. Since the distance to the nearest star outside of our solar system is about 70,000 times greater than the distance to Jupiter, "all stars are effectively out of reach."
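Kane's "70,000 times" figure can be sanity-checked with round numbers. This back-of-envelope sketch assumes Proxima Centauri at about 4.24 light-years and Jupiter at roughly its closest approach to Earth (about 588 million km); both values are illustrative, not from the article.

```python
# Rough check of the distance ratio between the nearest star and Jupiter.
LIGHT_YEAR_M = 9.461e15          # meters in one light-year
proxima_m = 4.24 * LIGHT_YEAR_M  # distance to Proxima Centauri, the nearest star
jupiter_m = 5.88e11              # Earth-Jupiter distance at closest approach, meters

ratio = proxima_m / jupiter_m
print(f"Nearest star is roughly {ratio:,.0f} times farther than Jupiter")
```

The result lands near 68,000, consistent with the "about 70,000 times greater" quoted above.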

Well, you might say, plenty of things seemed out of reach until we reached them, such as sending aircraft on intercontinental flights. But "in this case, the required physics to reach the stars, if it exists, is not known to us and it would require a fundamental change in our understanding of the relationship between mass, acceleration and energy."

"So that's where we stand, firmly on the Earth, and unlikely to change for a very, very long time," he said.

Mayor told the AFP: "We must take care of our planet, it is very beautiful and still absolutely livable."

Andrew Fraknoi, emeritus chair of the astronomy department at Foothill College in California, agreed that we won't be able to travel to these stars in the near future. But "I would never say we can never reach the stars and possible habitable planets," he said. "Who knows how our technology will evolve after another million years of evolution."

Originally published on Live Science.



Exotic ‘Fuzzy’ Dark Matter May Have Created Giant Filaments Across the Early Universe – Livescience.com

Dark matter, the mysterious substance making up a quarter of the mass and energy of the universe, might be made from extremely tiny and light particles, new research suggests. This fuzzy form of dark matter, so called because these minuscule particles' wavelengths would be smeared out over a colossally huge area, would have altered the course of cosmic history and created long and wispy filaments instead of clumpy galaxies in the early universe, according to simulations.

The findings have observational consequences: upcoming telescopes will be able to peer back to this early time period and potentially distinguish between different types of dark matter, allowing physicists to better understand its properties.


Dark matter is an unknown massive substance found throughout the cosmos. It gives off no light, hence the name, but its gravitational effects help bind together galactic clusters and cause stars at the edges of galaxies to spin faster than they otherwise would. Many scientists believe that most dark matter is cold, meaning it moves relatively slowly. But there are entirely different ideas, such as the possibility that it's tiny and fuzzy, meaning it would move quickly because it's so light.

"Our simulations show that the first galaxies and stars that form look very different in a universe with fuzzy dark matter than a universe that has cold dark matter," Lachlan Lancaster, an astrophysics graduate student at Princeton University and co-author of a new paper in the journal Physical Review Letters, told Live Science.

Lancaster explained that the most common speculations about dark matter suggest it is composed of weakly interacting massive particles (WIMPs), which would have a few tens or hundreds of times the mass of a proton. Simulations that use this type of dark matter are extremely good at re-creating the large-scale structure of the universe, including vast voids of empty space surrounded by long, spidery filaments of gas and dust, a formation known as the cosmic web. But on smaller scales, such models contain a number of discrepancies from what astronomers observe with their telescopes. In this standard view, dark matter should pile up in the centers of galaxies, but nobody has seen it doing so.

Fuzzy dark matter, in contrast, would be mind-bogglingly light, perhaps a billionth of a billionth of a billionth the mass of an electron, according to a statement from MIT. Quantum mechanics states that particles can also be thought of as waves, with wavelengths inversely proportional to their mass, Lancaster said. So the wavelength of such a light particle would be thousands of light-years long.
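The "thousands of light-years" claim follows directly from the de Broglie relation λ = h / (m·v). This sketch plugs in a commonly cited fuzzy dark matter mass of about 1e-22 eV/c² and a typical galactic velocity of about 200 km/s; both numbers are illustrative assumptions, not figures from the paper itself.

```python
# de Broglie wavelength of an ultralight "fuzzy" dark matter particle.
H_PLANCK = 6.626e-34     # Planck constant, J*s
EV_TO_KG = 1.783e-36     # kilograms per eV/c^2
LIGHT_YEAR_M = 9.461e15  # meters in one light-year

m = 1e-22 * EV_TO_KG     # assumed particle mass, kg
v = 2.0e5                # assumed galactic velocity, m/s (~200 km/s)

wavelength_ly = H_PLANCK / (m * v) / LIGHT_YEAR_M
print(f"de Broglie wavelength ~ {wavelength_ly:,.0f} light-years")
```

With these inputs the wavelength comes out on the order of a couple of thousand light-years, matching the scale described in the text.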

Fuzzy dark matter would therefore have a harder time clumping together than cold, WIMP dark matter. In simulations, Lancaster and his co-authors showed that a cold dark-matter universe would have galaxies that formed relatively quickly out of spherical halos.

But fuzzy dark matter would instead coalesce into long, wispy strings of material, "more giant filaments than clumpy galaxies," Lancaster said, and galaxies would then be born larger and later. Dark matter would also have a harder time piling up in the centers of galaxies, potentially explaining why astronomers don't observe this clumpiness when they look at galaxies.

Instruments like the Large Synoptic Survey Telescope (LSST) in Chile and 30-meter-class telescopes being built around the world will soon be able to peer back to some of the universe's earliest days. They are expected to start taking data in the next decade, which means "we'll either start seeing the effects of fuzzy dark matter, or start ruling them out," Lancaster said.

Though other researchers have speculated about fuzzy dark matter, the new simulations do a more careful job of working out its cosmological effects, said Jeremiah Ostriker, an astrophysicist at Columbia University who was not involved in the work.

"This helps outline the details of what the formation of structure would be in this variant theory," Ostriker added. "And it's one of the most interesting variant theories around."

Lancaster said his team's future simulations might focus on capturing more details of the fuzzy dark matter's effects, potentially giving astronomers a better idea of what they might expect to see through their telescopes.

Originally published on Live Science.



How Mere Humans Manage to Comprehend the Vastness of the Universe – Scientific American

Astrophysics is not typically considered to be part of the humanities. Yet one class I took as a senior at university suggested otherwise. It left me in awe of the human mind.

With my own background rooted in the humanities, I found myself focusing on the way my professors described the cosmos. While the fantastical environments of black holes, white dwarfs and dark matter often took center stage, at the heart of each discovery was the human mind seeking to understand the unfamiliar.

Their tales of discovery made it clear that we often take our knowledge of the universe for granted. After all, the universe was not built for the human mind to understand. When we look up at the night sky, we see only a tiny fraction of what is out there. It is the task of the astrophysicist to develop a picture of the universe despite our overwhelming blindness.

I wanted to better understand how being human shapes our understanding of the universe. After talking to some of Princeton's leading astrophysicists, one thing became clear: the discipline requires the human mind to be conscious not only of the universe but of itself (unless otherwise identified, all quotes are from these scientists).

Only 5 percent of the universe is normal, observable matter. Within this small fraction, the human eye can only perceive matter that emits light within a certain frequency range on the electromagnetic spectrum. While birds can perceive magnetic fields and snakes can image in the infrared, we can detect only visible light. "This range determines our picture of space," Adam Burrows explains. Our picture of space is, in that sense, a direct product of the human mind.

Rather than assume our picture wholly captured the universe, Jo Dunkley says that astrophysicists started wondering whether there might be other things filling our galaxies and universe that we cannot see. They designed telescopes to detect frequencies of light that lie beyond human perception, such as those of X-rays and radio waves. With these instruments, our picture of the universe became 5 percent complete.

The astrophysicist's task then became one of using the visible to detect the remaining 95 percent. Einstein's laws of gravity provided a means of navigating the obscure. Because gravity depends solely upon mass, its effects can be seen irrespective of light production. As Dunkley explains, a massive, invisible object, such as a black hole, will attract a visible object, like a star.

While the Event Horizon Telescope's image of a black hole is one recent example, the strategy dates back as early as 1933. It was Swiss astronomer Fritz Zwicky who unwittingly first employed the technique when examining the behavior of galaxy clusters. He found the clusters to be far more massive than anticipated based on what was visible. He called the missing mass dark matter. Nearly 40 years later, American astronomer Vera Rubin confirmed its existence. While measuring the radial velocity of galaxies, she observed velocities incompatible with those predicted by the laws of gravity. The expectation had been that objects farther from the center of the galaxy orbited more slowly than those near the center. Rubin instead observed a constant velocity, meaning that there was no decrease at the fringe of the galaxies. "In order for this to be possible within the laws of physics, there must be more to space than meets the eye," Dunkley explains. The mass existed; it just had yet to be detected.
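Rubin's anomaly can be illustrated with Newtonian gravity: if nearly all of a galaxy's mass sat at its center, orbital speed would fall as 1/√r, yet the measured curves stay flat. This sketch computes the Keplerian expectation for an illustrative central mass of 1e11 solar masses (an assumed round number, not a measurement from the article).

```python
import math

G = 6.674e-11    # gravitational constant, SI units
M_SUN = 1.989e30 # solar mass, kg
KPC = 3.086e19   # meters per kiloparsec

# Assumed visible mass concentrated at the galactic center.
M_visible = 1e11 * M_SUN

def keplerian_speed(r_kpc):
    """Circular orbital speed (m/s) predicted if all mass sits at the center."""
    return math.sqrt(G * M_visible / (r_kpc * KPC))

v_inner = keplerian_speed(5)   # star 5 kpc from the center
v_outer = keplerian_speed(20)  # star 20 kpc from the center
print(f"Predicted: {v_inner/1000:.0f} km/s at 5 kpc vs {v_outer/1000:.0f} km/s at 20 kpc")
# Rubin instead measured roughly constant speeds at all radii, implying
# the enclosed mass keeps growing outward: the dark matter signature.
```

Quadrupling the radius should halve the speed; the fact that it doesn't is the rotation-curve evidence described above.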

Neta Bahcall explains that it's the laws of gravity that render this dark matter indirectly observable. They allow astrophysicists to determine how much of the universe is invisible without knowing exactly what the darkness is. James Jeans once likened the situation to Plato's well-known allegory, where, "imprisoned in our cave, with our backs to the light, we can only watch the shadows on the wall." The comparison is apt. Counterintuitively, the shadows here represent what is visible, and the light represents what we cannot see or even imagine. With this technique, dark matter came to contribute 27 percent to our cave drawing of the universe.

The 68 percent of the universe absent from our drawing is still unknown. But in 1998, that unknown was given a name: dark energy. It emerged as a means of explaining the universe's anomalous expansion. In the 1990s, astrophysicists thought that the universe's rate of expansion would gradually decrease. The laws of gravity predicted that the matter filling the universe would begin to pull itself together as time went on, thus slowing the universe's expansion. Yet this turned out not to be the case. The expansion was accelerating. Very little is known about dark energy, and so our picture of the universe remains far from complete.

The problems facing our picture of the universe are not limited to what we can perceive. As Ed Turner explains, our mind and the culture in which it was formed condition the way we explore the universe. Because of this particular conditioning, we have mental blind spots for the cosmic phenomena that run counter to human intuition and understanding. For instance, Turner claims that the mind is predisposed to see things as statistically significant when they might not be. We erroneously perceive patterns in the spacing of stars and of the planets in the solar system, seeing them as though they were arranged.

There are other properties of the mind that get in the way of seeing the truth, according to Turner. Consider, for instance, our belief that massive objects must take up space. It is not a direct relationship: we accept that a piece of lead is more massive than a pillow, even though the latter is larger. At the extremes, however, we expect some positive correlation between the two. The extreme physical environment of a neutron star then poses problems. As Michael Strauss suggests, the star is so dense that a thimbleful of neutron star material has the mass of 70 million elephants. We cannot help but wonder: where is all the mass?
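Strauss's elephant comparison checks out with round numbers. This sketch assumes a neutron star density of about 3.5e17 kg/m³, a one-cubic-centimeter thimble, and a 5-tonne elephant; all three figures are illustrative choices, not values stated in the article.

```python
# Back-of-envelope check of the "70 million elephants" thimbleful claim.
density = 3.5e17      # assumed neutron star density, kg/m^3
thimble_m3 = 1e-6     # 1 cubic centimeter in cubic meters
elephant_kg = 5000.0  # assumed mass of one elephant, kg

thimble_mass = density * thimble_m3   # mass of a thimbleful of star matter
elephants = thimble_mass / elephant_kg
print(f"~{elephants/1e6:.0f} million elephants per thimbleful")
```

With these inputs the thimbleful weighs about 350 billion kilograms, or roughly 70 million elephants.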

"We are blinded by being human when we look at something larger than the human experience," Robert Lupton explains. It becomes further apparent when we are confronted with counterintuitive phenomena like white dwarfs and black holes. White dwarfs decrease in size as they become more massive, says Joshua Winn, and for black holes, all mass is compressed to zero size. While we cannot see the black hole, giving the phenomenon a name allows us to imagine it. The same could be said of dark matter and dark energy, explains Dunkley. As with the previous analogy, language provides a means of overcoming our initial blindness to interact with these cosmic phenomena.

Astrophysicists encounter another blinding property of the mind when considering the nature of space: we can only visualize in three dimensions. In order to imagine the geometry of space, namely whether it is flat or curved, we would need to be able to think in four dimensions, says Dunkley. For instance, to determine the curvature of a ball, we first picture the ball in three dimensions. Therefore, to determine a three-dimensional curve, the mind would need to picture the four-dimensional object.

This need arises when astrophysicists contemplate the expanding universe and relativity. For the former, the task is to conceptualize a three-dimensional universe that exists in a loop, an impossible visualization, for connecting every dimension would create a four-dimensional object. For the latter, in order to explore the relativistic behavior of spacetime, the task is to imagine a three-dimensional space deformed by gravity, another impossibility.

In both cases, two-dimensional analogies facilitate understanding. Dunkley likens the universe to a piece of string attached at both ends to create a loop, and then relies upon language to bridge the dimensional gap. "We would connect every side of space, such that no matter the direction we traveled in, we would always return to our starting point," she explains. Similarly, in his 1915 paper on general relativity, Einstein used a trampoline as a two-dimensional analogue for space. He then turned to language to illustrate how placing a massive object upon the stretchy surface creates a third, vertical dimension. The same principle applies in more dimensions, he argued: massive objects bend space. While we are still unable to visualize the four-dimensional phenomena, Dunkley says that through these linguistic analogies, we can imagine the consequences.

In this manner, astrophysicists stretch the mind to see the universe from an external perspective, says Turner. Burrows speaks of retraining the brain by developing a new language better suited for the conversation between the cosmos and the individual. The environment of the universe is so different from our daily environment that often we cannot imagine it, according to Joel Hartman. Take, for instance, the size of the universe and the number of stars within it. The language of mathematics, grounded in scientific notation, logarithms and orders of magnitude, allows us to grapple with the cosmos where words fall short, explains Burrows.

Similarly, when considering the four-dimensional universe, mathematical measurements provide astrophysicists with an invaluable means of navigating the obscure. Just like in two dimensions, explains Dunkley, if the geometry of space is flat, then parallel lines, like light rays, stay parallel always. If the space is curved, then they will either come toward each other in a positively curved universe or splay apart in a negatively curved one. To return to the language of Plato's cave, it seems that by measuring the shadows before us, we are able to conceptualize, in part, the nature of what remains out of sight and out of mind.

Even with this universal language of mathematics, astrophysicists still resort to biological terms to describe certain cosmic phenomena. Turner describes how astrophysicists speak of the birth and death of stars, as though they were alive. More extreme is the twin paradox, devised to facilitate a correct conception of time. We are accustomed to thinking of time as strictly linear and independent, but Einstein's theory of relativity says that is probably not the case. Time passes more slowly when close to massive objects.

To overcome our intuition, astrophysicists imagine taking two twins and somehow sending one of them "to spend time near a black hole, [so that] she would actually age more slowly than [her] Earth-dwelling partner," explains Dunkley. The physical manifestation of aging allows the mind to grapple with the nonuniformity of time, for we are able to envision two differently aged twins despite the semblance of a paradox.
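The slowdown Dunkley describes can be quantified with the standard Schwarzschild clock-rate formula, √(1 − r_s/r), where r_s is the Schwarzschild radius. The setup here, a twin hovering at twice the Schwarzschild radius of a 10-solar-mass black hole, is an illustrative assumption, not a scenario from the essay.

```python
import math

G = 6.674e-11     # gravitational constant, SI units
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

M = 10 * M_SUN             # assumed black hole mass
r_s = 2 * G * M / C**2     # Schwarzschild radius of the black hole
r = 2 * r_s                # assumed hovering distance of the near-hole twin

# Gravitational time dilation: fraction of far-away (Earth) time
# that passes for the twin hovering at radius r.
clock_rate = math.sqrt(1 - r_s / r)
print(f"Near-hole twin's clock runs at {clock_rate:.3f} of the distant twin's rate")
```

At twice the Schwarzschild radius the factor is √0.5 ≈ 0.707, so the hovering twin ages about 29 percent more slowly, which is the asymmetry the twin picture is meant to make vivid.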

While there are certainly properties of the mind that get in the way of seeing the truth, as Turner says, the fact that it is human allows us to engage with the universe. The lives of stars and the twin paradox are just two examples of astrophysicists making sense of the unfamiliar through our own biology. After all, it is the mind of the astrophysicist that must first identify its blind spots and then devise techniques to overcome them. In that sense, astrophysics and humanism go together in a wonderfully unexpected way. As the literary critic Leo Spitzer once wrote, the humanist believes in "the power of the human mind of investigating the human mind."

So often the predominant reaction to astrophysics focuses on how vast the universe is and how insignificant a place we hold in it. It would be far better to flip the narrative to see the marvel of the mind exploring the cosmos, human lens and all.

Read more here:

How Mere Humans Manage to Comprehend the Vastness of the Universe - Scientific American

The astrophysicist whose polling aggregator is projecting the election – The Hill Times

Mired in a growing frustration with how political polls were being reported on, a Quebec astrophysicist tried his hand at aggregating polls and projecting the 2018 Quebec election. Three provincial elections later, Philippe Fournier is hoping to correctly predict 90 per cent of the winning candidates of the Oct. 21 vote.

From coast to coast to coast, from Nunavut to Skeena-Bulkley Valley, B.C., to Avalon, N.L., 338Canada dives into individual races, as the website's name suggests, across all of Canada's 338 ridings.

"I was looking at some Quebec polling before the [2018] election, and I noticed that many articles in newspapers were just badly written. Some journalists are told to write about polls when they don't know much about polls and statistics," Prof. Fournier told The Hill Times in a phone interview, a little more than a week before the Oct. 21 election.

"I told my students I could do much better than that," he said.

Prof. Fournier teaches astrophysics at Cégep de Saint-Laurent in Montreal, where he is currently teaching only part time, as he spends 70 to 80 hours per week on 338Canada.

He first became involved in polling aggregation in the Quebec provincial election in 2018, when he started writing about polling projections on his website qc125.com.

"It took about three months and then La Presse in Montreal contacted me asking me questions and political parties contacted me asking me questions. So it kind of became viral," Prof. Fournier said. "It went so well that after the Quebec election, I figured, well, why not do a Canada-wide system."

The distinctive feature of 338Canada, said Paul Adams, a journalism professor at Carleton University and former EKOS pollster, is the individual riding projections.

To get insight on the individual ridings, an aggregator takes the regional and subregional polling results and applies them to historical patterns, Prof. Adams said.
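What Prof. Adams describes amounts to a swing calculation: shift each riding's last-election result by how much the region has moved in current polling. A minimal sketch of that idea follows; the party labels, vote shares and function are hypothetical illustrations, not 338Canada's actual model:

```python
def project_riding(riding_last_result, region_last_result, region_poll):
    """Project each party's riding-level vote share by adding the
    regional swing (current poll minus last election's regional result)
    to the riding's historical result. All inputs are dicts mapping
    party -> vote share in percent."""
    return {
        party: riding_last_result[party]
               + (region_poll[party] - region_last_result[party])
        for party in riding_last_result
    }

# Hypothetical numbers for illustration only.
riding_2015 = {"LPC": 45.0, "CPC": 35.0, "NDP": 20.0}
region_2015 = {"LPC": 40.0, "CPC": 32.0, "NDP": 28.0}
region_poll = {"LPC": 34.0, "CPC": 36.0, "NDP": 30.0}

# Regionally the LPC has swung -6, the CPC +4, the NDP +2.
projection = project_riding(riding_2015, region_2015, region_poll)
```

A real model would layer riding demographics, star-candidate effects and uncertainty on top of this, but the swing step is the core of how regional polls reach down to individual seats.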

As of Oct. 15, 338Canada is projecting a close race between the Conservatives and Liberals. The Hill Times photograph by Andrew Meade

Where aggregators can miss in their projections is where there is no historical baseline.

"In this election, the obvious one would be Maxime Bernier's party [the People's Party]," Prof. Adams said. "We don't actually know where we would expect them to run stronger."

How the People's Party's 2.8 per cent national support, according to 338Canada, will be distributed in certain individual ridings remains an unknown, Prof. Adams said.

So far, Prof. Fournier has worked on three provincial elections, where he correctly projected more than 90 per cent of the winning candidates across the three votes. In the 2018 Ontario election, which resulted in a Progressive Conservative government, he identified 111 of the 124 winning candidates; 11 of the 13 misses were within the margin of error. In the 2018 Quebec election, he correctly identified 112 of 125 candidates; of the 13 misses, four were within the margin of error. His most recent projections, during the 2019 Alberta provincial election, were his most successful, identifying 94 per cent of the successful candidates: 82 of the 87 winning politicians. Of the five that he missed, three were within the margin of error.

"I don't aim for perfection because it's statistics and I know it's impossible," Prof. Fournier said about projecting the Oct. 21 federal vote. "I have this threshold that I want to reach: 90 per cent. But 90 per cent means I will miss about 35 [ridings]."

"The election right now is so close that I might miss more. So it might be 85 per cent. But my desire would be 90 per cent. Above that is unrealistic," he said.

As of Oct. 14, 338Canada was projecting the Conservatives to win 135.8 seats and the Liberals to take 134.9 seats, both with massive margins of error. As he explained in Maclean's, Prof. Fournier said his model has no fewer than 139 of 338 electoral districts labeled as either toss-up or leaning, meaning more than a third of all ridings remain too close to call. A week before the election, the Bloc was projected to win 32.4 seats, the NDP 30.1 seats, the Green Party 3.8 seats, and Independents and the People's Party less than one seat apiece.

Prof. Fournier's model takes into account the demographics of ridings, as well as historical performance and the effect of star candidates.

Pollster Greg Lyle, president of Innovative Research, said in cases where a strong candidate is challenging an incumbent, it's hard to quantify what will tip the scales.

Mr. Lyle pointed to the current campaign in Kamloops-Thompson-Cariboo, B.C., which has been held by Conservative MP Cathy McLeod since 2006. But her current Liberal challenger is Terry Lake, a former Kamloops mayor and B.C. health minister.

"A lot of Conservatives have been used to voting for Terry Lake, because provincially they vote B.C. Liberal," he said. "And Terry Lake was very well regarded, so he has a strong personal brand."

"In that seat, would you put your finger on the scale for the Liberals if it was close?" Mr. Lyle said. "But once you do that, now you are not betting on the methodology anymore, you're betting on the grey area."

As of Oct. 15, 338Canada is projecting that Ms. McLeod is likely to win re-election, with Mr. Lake finishing a distant second.

Prof. Fournier said some pollsters weren't initially receptive to his modelling.

"There are amateurs online and I'm sure those amateurs really grind the gears of many pollsters. I know some pollsters really don't like aggregators," he said.

When he first started Qc125.com, Prof. Fournier said pollster Jean-Marc Léger was initially not pleased with him.

"[Mr. Léger said,] 'I didn't know you were a scientist, I thought you were just some guy on the internet pulling numbers.' Like everything on the internet, there are amateurs that just do anything and there are the serious people," Prof. Fournier said.

In addition to the website, Prof. Fournier also writes about his polling aggregate projections for L'actualité and Maclean's.

When looking at historical examples, Prof. Fournier takes into account the 2011 and 2015 elections.

"I look further in the past for some districts," he said, "but the thing is the demographics in some parts of the country, especially the urban areas, changed so fast [that] it would not be very useful to use the 2004 [or] 2006 numbers."

Prof. Adams said, generally speaking, aggregators are better than individual pollsters in predicting outcomes.

Pollster Frank Graves, pictured at the Green Party's 2016 convention in Ottawa, says aggregators can have a corrosive effect, leading some voters not to cast a ballot. The Hill Times photograph by Sam Garcia

"That's not to say that in any given election there may be an individual pollster that comes out better than the aggregations, but the problem is you can't predict in advance which pollster that is going to be," he said. "You're safer to stick with the aggregators."

Two key decisions influence the success of an aggregation model, Prof. Adams said: how each individual poll will be weighted, and when old polls should be discarded from the model.

Prof. Fournier will weight a poll down in his model if its results are too far out of line with the overall average. But, he said, if subsequent polling shows the poll wasn't an outlier but a precursor, it will regain its weight.

Outside of campaigns, Prof. Fournier said he can keep a poll in the model for around two months, given how slowly the numbers move. The older a poll gets, the less weight it carries in the model.

During the election, Prof. Fournier said, he will only keep a poll in the model for a week.

Prof. Adams said if a poll is kept in the model for too long, it can miss the rapidly changing mood of an electorate.

"If you take a week's worth of results, you have a larger total number of cases, like you have tens of thousands of cases potentially adding up all the different polls, in a more stable environment that would give you a better result than just one or two polls at the end of the campaign. But if there's movement at the end of the campaign then a failure to decay older polls quickly enough will not serve you as well," he said.
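The weighting-and-decay trade-off the two men describe can be sketched as a simple age-based weight: full weight when a poll is fresh, fading as it ages, dropped entirely once it is too old. The half-life and cutoff values below are illustrative assumptions; the article only says Prof. Fournier keeps campaign polls for about a week:

```python
from datetime import date

def poll_weight(field_date, today, max_age_days=7, half_life_days=3.0):
    """Down-weight a poll as it ages: weight 1.0 when fresh, halving
    every `half_life_days`, and 0.0 once older than `max_age_days`.
    The half-life is an illustrative assumption, not 338Canada's
    actual parameter."""
    age = (today - field_date).days
    if age < 0 or age > max_age_days:
        return 0.0
    return 0.5 ** (age / half_life_days)

today = date(2019, 10, 15)
w_fresh = poll_weight(date(2019, 10, 15), today)  # age 0: full weight
w_mid   = poll_weight(date(2019, 10, 12), today)  # age 3: half weight
w_old   = poll_weight(date(2019, 10, 1), today)   # age 14: discarded
```

A faster decay (shorter half-life, smaller cutoff) tracks late campaign movement better; a slower one gives the larger, more stable sample Prof. Adams describes.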

EKOS Research president Frank Graves said he didn't think the aggregators are adding a service that can't be done elsewhere, as many pollsters put out their own seat projections.

Over the years, Mr. Graves said EKOS has performed better than the aggregators in projecting seats. In the 2006 federal election, EKOS projected the Conservatives would win 125 seats, plus or minus five seats. In the end, the party won 124 seats.

Mr. Graves said a seat projection should be within 20 seats of the winning party.

He added that if the aggregators are inaccurate or overstating their precision, there can be a corrosive impact on voter decision-making, as some people may decide not to vote if they look at a projection and see one candidate projected to win easily.

"I would not rule out the fact that the aggregators' contributions in the U.S. presidential election could've been the victory for Donald Trump because a lot of disaffected, weakly engaged Hillary [Clinton] voters were told, 'This is mailed in. It's done. Don't worry.' And they stayed home [on election day]," Mr. Graves said.

nmoss@hilltimes.com

The Hill Times

Neil Moss is a reporter at The Hill Times covering federal politics, foreign policy, and defence.

View original post here:

The astrophysicist whose polling aggregator is projecting the election - The Hill Times

Norwich Science Festival launches with physics and astronomy events | What’s on and things to do in Norfolk – Eastern Daily Press

PUBLISHED: 16:03 15 October 2019

Rebecca MacNaughton

Immerse yourself at Norwich Arts Centre on Tuesday October 22 and Wednesday October 23. Picture: [UNIT]

Archant

Have you ever wondered what a star looks like when it is born, how you measure the speed of light or why dogs are so important to spaceflight? Head to Norwich Science Festival, October 8-26, to get the answers.


Ever since man walked on the moon 50 years ago, we've been fascinated by space. It has populated our favourite books, films and TV shows for decades and now, with the advent of commercial space tourism, more of us might get the chance to explore it.

Head to The Forum on Friday, October 25 to find out how spaceflight has changed over the years as journalist and broadcaster, Richard Hollingham, returns to his hometown to chair a panel called The Future of Human Spaceflight: The Moon, Mars and Beyond.

He will be joined by a diverse panel of experts, including space engineering consultant and founder of Rocket Women Vinita Marwaha Madil, journalist and broadcaster Sue Nelson, medical doctor and ESA researcher Beth Healey and University of Southampton researcher Christopher Ogunlesi. Together, they will discuss how spaceflight has changed since the Apollo missions and how commercial space tourism is set to bring a wider cross-section of people to orbit - and, he says, the panel will also share some important insights on how women have shaped the course of the 21st century space race.

Richard will also return to The Forum on Saturday, October 26 for Space Dogs, an exploration of how canine cosmonauts have paved the way for human exploration. "Dogs are the great unsung heroes of space," says Richard. "Every astronaut from Yuri Gagarin to Tim Peake owes their experiences to these pioneering space dogs."

He'll detail how stray dogs from the Soviet Union paved the way for human spaceflight and look at some of the key canine cosmonauts, from Laika - the first dog sent to orbit, in 1957 - to Belka and Strelka who, Richard says, "flew, orbited and returned to Earth to be hailed as Soviet celebrities."

If you've ever wondered how science and art can work together, then head to Norwich Puppet Theatre on Friday, October 25 from 7-8.35pm, as The Cosmic Shambles Network presents Signals, a comedy play that follows two astronomers as they hunt for alien life.

The show asks some pretty hefty questions about the search for meaning - "what if we did find aliens?" asks producer Trent Burton, "how would humanity react, how would those two individuals react?" - as well as the role of art in science.

"The Cosmic Shambles Network blurs the line between science and art - once people get over the initial idea of it, they realise it's a much more natural fit than they thought," says Trent.

The performance will also be followed by a talk from Stargazing Live's Professor Lucie Green as she unpicks some of the science behind the show. "There's some really nice science in the show so the talk afterwards will expand on that," says Trent. "It's a great juxtaposition between these two people, stuck at a desk, and what these questions, at the edge of the universe, could mean."

There will also be a night of music at The Octagon Chapel on Saturday, October 26 as The Sky at Night's Prof Chris Lintott presents The Crowd and the Cosmos before being joined by acclaimed musician Steve Pretty - expect a unique, out of this world evening as they perform a special version of their acclaimed show, Universe of Music.

DON'T MISS OUT

October 22-23

[UNIT]: [REACH THE MOON]

Norwich Arts Centre, 2-2.45pm, 6-6.45pm and 8-8.45pm

Cost: £4/£6/£9/£12

Age: 5+

Immerse yourself in these audio/visual performances which celebrate the 50th anniversary of the first moon landing. It will be an all-encompassing audio/visual feast which innovatively marries science with art.

October 26

Radio Blips and Blasts: Pulsars and our Understanding of the Cosmos

The Forum, Millennium Plain, 3.30-4.30pm

Cost: Free

Age: 12+

Since their discovery in 1967, observations of pulsars - the incredibly dense, highly magnetic, rapidly rotating remnants of supernova explosions - have been used to increase our understanding of fundamental physics. In this talk, UEA's Dr Robert Ferdman will discuss the state of astrophysics leading up to, and including, this momentous discovery.

Introduction to the Universe: Sleep Not Essential!

The Forum, Millennium Plain, 5-6pm

Cost: £5

Age: 12+

TV astronomer and author Mark Thompson will expound the wonders of the Universe in a warm-up to his 2020 record-breaking attempt to lecture for five days straight with no sleep! Anything could happen.

Buy tickets and find out more at http://www.norwichsciencefestival.co.uk

Read the rest here:

Norwich Science Festival launches with physics and astronomy events | What's on and things to do in Norfolk - Eastern Daily Press

What the Women of the First All Female Spacewalk Will Do at the ISS – Newsweek

Two NASA astronauts are due to make history this week by taking part in the first ever all-female spacewalk.

Astronauts Christina Koch and Jessica Meir will work to fix the power system of the International Space Station (ISS), the habitable satellite which is in orbit 220 miles above the Earth's surface.

The momentous spacewalk was due to take place on October 21. But NASA announced on Tuesday it would be pushed forward to Thursday or Friday.

In a change to their brief, the pair will replace a faulty battery charge-discharge unit, which is currently preventing a new lithium-ion battery installed earlier this month from powering the ISS.

The ISS is powered by a collection of thousands of solar cells known as arrays. Battery charge-discharge units control how much charge the batteries that collect energy from these arrays receive.

NASA officials explained during a press conference attended by Space.com that the issue stems from a battery pack which was swapped in April.

Despite the broken unit, NASA gave assurances in a statement that the crew is safe and their laboratory experiments onboard the ISS have not been disrupted.

The postponed spacewalk, due to take place on October 21, would have been the fourth of 10 planned between October and December as part of what is known as Expedition 61.

Commanded by the European Space Agency's Luca Parmitano, the mission includes NASA astronauts Koch, Meir, and Andrew Morgan, as well as Russia's Aleksandr Skvortsov and Oleg Skripochka.

Meir and Koch had originally planned to help replace nickel-hydrogen batteries with "newer, more powerful" lithium-ion batteries on the far side of the ISS's port truss, according to NASA. The work is a continuation of an upgrade of the station's power system which started in January 2017.

The second half of the sequence, expected to start in November, will see crew members fix the space station's alpha magnetic spectrometer.

Meir joined Koch, who holds the record for the longest single spaceflight by a woman, at the station in late September, and will spend over six months on the ISS.


The first all-woman spacewalk was controversially postponed in March as there were not enough medium-sized space suits on the ISS to fit both women. NASA astronaut Anne McClain was scheduled to join Koch in updating the ISS's power sources. But McClain arrived back on Earth in June, completing a 204-day mission.

In March, McClain became the 13th woman to complete a spacewalk, followed by Koch a few days later.

McClain tweeted back in March: "This decision was based on my recommendation. Leaders must make tough calls, and I am fortunate to work with a team who trusts my judgement. We must never accept a risk that can instead be mitigated. Safety of the crew and execution of the mission come first."

Dr. Scott G. Gregory, lecturer in astrophysics at the University of Dundee, told Newsweek the all-female spacewalk is "both historic and inspirational."

"Despite many talented female scientists and engineers, spacewalks have been a male-dominated activity," he said.

"Although there have been female spacewalkers before, it has always been with a male counterpart. This first all-female spacewalk is long overdue. Jessica Meir and Christina Koch are following their childhood dreams of being astronauts and 'walking' in space. It represents years of dedication, study, hard work, and pushing limits and that is truly inspirational."

Gregory said he'll be watching the live stream of the spacewalk with his 5-year-old daughter.

"We'll be following the news as Jessica Meir and Christina Koch write their own history."

"All over the world lots of little girlsand little boyswill be watching and they'll be dreaming that they'll be walking in space when they're grown up," he added. "If that inspires them to dedicate their lives to scientific or technological endeavours, this can only be a positive for all of society."

Read the original post:

What the Women of the First All Female Spacewalk Will Do at the ISS - Newsweek

Nobel Prize in Physics: James Peebles, master of the universe, shares award – Firstpost

The ConversationOct 14, 2019 16:04:31 IST

During the press conference in which he was revealed as one of the winners of the 2019 Nobel Prize in Physics, James (Jim) Peebles was asked to point to a single discovery or breakthrough from his long career that would put the award in context. Peebles demurred, replying instead: "It's a life's work."

That's a perfect description of his contribution to our understanding of the universe. His is a career so influential that he is widely recognised as one of the key architects of the field of physical cosmology, the study of the universe's origin, structure and evolution. I am sure I am not alone in regarding Peebles as the greatest living cosmologist.

Nobel Physics Prize winner James (Jim) Peebles. Image credit: Princeton University/EPA

Peebles's research career started in the early 1960s. The Canadian-born scientist earned his undergraduate degree at the University of Manitoba and gained his PhD in the group of Robert Dicke at Princeton University in New Jersey in 1962. He has remained there ever since. Peebles now holds the title of Albert Einstein Professor of Science at Princeton.

In the 1960s, Dicke's group was working on theoretical predictions, and the corresponding observational consequences, for the state of the primordial universe: the phase immediately following the Big Bang, lasting for a few hundred thousand years. At that time the Big Bang theory for the formation of the universe was not yet fully accepted, despite observational evidence that galaxies were moving away from each other.

Dicke's group was working on the theory that if the universe was expanding, then it must have been much smaller, hotter and denser in the past. The prediction was that the thermal radiation from this epoch might still be observable today as background radiation pervading the universe. The Princeton group was also designing instruments to try to detect it.

Meanwhile, Arno Penzias and Robert Wilson, working for Bell Labs (also in New Jersey), had detected an unusual persistent background noise in their experiment. They were investigating the use of high altitude echo balloons, a kind of early satellite communication.

When Penzias and Wilson approached Dicke's group for advice, it became clear that they had actually detected the relic background radiation. We call it the cosmic microwave background (CMB) because the radiation peaks in the microwave part of the electromagnetic spectrum.

A map of the universe's cosmic microwave background radiation. Image credit: NASA

The resulting papers were arguably the birth of the field of observational cosmology, a branch of physics that has revolutionised our view of the cosmos and our place within it. Peebles played a pivotal role in our theoretical understanding of the primordial universe and its evolution, but he also recognised that the CMB was a treasure trove of information that could be plundered. In particular, it holds clues about the formation of cosmic structures (the galaxies), and indeed clues about the fundamental nature of the universe itself.

Much of Peebles's work has focused on understanding the emergence and growth of structure in the universe from the relatively smooth primordial conditions encoded in the CMB. In the process he has helped define an entire field of study.

For example, in the early 1970s, he was one of the first to run computer simulations of cosmic structure formation, a practice that is an entire branch of research today, where cosmologists explore toy universes.

Peebles helped usher in the dark sector to our model of the universe, becoming a pioneer of (what is now called) the standard cosmological model. In this model, the universe is dominated by mysterious forms of matter and energy that we are yet to fully understand, but whose existence is supported by observational evidence. Normal matter now has an almost negligible cosmic relevance compared to this dark matter and dark energy.

Peebles has produced such an immense body of work it is impossible to do it all justice in this short article. In one of his most influential papers, he linked the subtle fluctuations in the temperature of the CMB (which reflect ripples in the density of matter shortly after the Big Bang) with the way in which matter is distributed on a large scale throughout the present-day universe. The link exists because all the structure we see around us today must have grown through the evolution of those primordial seeds.

Peebles advanced the concept of a dark matter component to the universe and its implications for the evolution of structure. Through this, and other work, he helped establish the theoretical framework for our picture of how galaxies have formed and evolved. And he demonstrated how observations of the CMB and the distribution of galaxies could be used as evidence to help measure key cosmological parameters, the numbers that feature in the equations we use to describe the nature of the universe.

The influence of Peebles doesn't end there. Aside from his monumental contributions to fundamental research, spanning the CMB, dark matter, dark energy, inflation, nucleosynthesis, structure formation and galaxy evolution, his textbooks have educated generations of cosmologists. They will do for years to come. His Principles of Physical Cosmology is on my desk right now.

In the Nobel press conference, Peebles was keen to highlight that he didn't work alone. But to say that he has been largely responsible for shaping our understanding of the universe is a cosmic understatement.

James Geach, Professor of Astrophysics and Royal Society University Research Fellow, University of Hertfordshire

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Visit link:

Nobel Prize in Physics: James Peebles, master of the universe, shares award - Firstpost

Can you pivot from studying galaxy evolution to working in data science? – Siliconrepublic.com

Brian O'Halloran of Liberty IT discusses his work as a data scientist and the unusual route that led him to where he is now.

Brian O'Halloran is a data scientist at Liberty IT, working on natural language processing projects and managing stakeholders. But before joining the company, he was a researcher in astrophysics and went on to work at the Daily Telegraph, among other roles.

Here, he tells Siliconrepublic.com about the world of data science, and the transferable skills that made his colourful career path possible.

Working as a data scientist is not too far removed from my academic roles. You get to do R&D, after all - BRIAN O'HALLORAN

Prior to joining Liberty IT, I was lead data scientist at the Daily Telegraph in London, working on things like recommendation systems for users and building election models for Westminster elections. Before that, I was in a similar role at eFinancialCareers, again in London, which I joined after leaving academia.

I used to be an astrophysicist. My area of interest was galaxy evolution, particularly focused on nearby dwarf galaxies, as they're excellent proxies for understanding how galaxies evolved in the early universe.

I spent four years as a postdoc in Washington DC, working on projects focused on this type of research, followed by another six in London. The latter role was as part of the European Space Agency's Herschel Space Telescope project, working on the SPIRE instrument team.

I graduated from NUI Maynooth with a BSc in experimental physics and mathematics in 1998, followed by a PhD in experimental physics from UCD in 2003.

Obviously, I picked up the hard skills for analysing, breaking down and solving problems during this time. What was invaluable though, in terms of my current role, were the soft skills that you pick up by accident and through stealth.

I spent quite a lot of time teaching physics and astronomy courses, and learned invaluable soft skills in terms of communication of ideas and people and stakeholder management, something particularly of value when dealing with C-suite and non-technical stakeholders!

Well, that depends on the problem, of course. I've spent quite a lot of time working on natural language processing projects during my data science career. Most of my actual development time is spent knee-deep in Python, TensorFlow, Keras and spaCy.

At Liberty IT, we've migrated our DS frameworks to the cloud. We're increasingly using Amazon SageMaker and its competitor from Microsoft, and both loom large in our future.

I've been lucky to have had not just one but a number of hugely influential people throughout both my academic and data science careers. My PhD supervisor at UCD, Brian McBreen, played a huge role in the development of my academic career.

In terms of my data science career, my bosses and colleagues at the Telegraph were greatly influential in where I am today, including Magda Piatkowska, Hervé Schnegg and Dimitris Pertsinis.

In some ways, working as a data scientist is not too far removed from my academic roles. You get to do R&D, after all, and so are given a lot of leeway in that regard, which is great as it allows you to be very creative.

I really enjoy working with stakeholders, as it is very much a two-way street in terms of education and evangelising. If you do it right, you get to iron out what they are looking for in a final product, and everyone gets excited and commits to the project. Stakeholder buy-in and proper communication back and forth are such crucial components of success in the data science field. Without either, projects are doomed to failure.

The data science function at Liberty IT is very new, so there's huge potential for projects across the Liberty Mutual Insurance Group, with us at the heart of that. It's a really exciting time and place to be in.

Data science functions that work well and that add real business value. If more and more firms crack that problem, it's a really exciting trend. The rest, in terms of trends, is really just window dressing.

Brush up on those soft skills. Learn to network, learn how to listen to your stakeholders. There's no point in building technically wonderful solutions if there's no customer willing to use them.

Data scientists need to stay away from ivory towers at all costs. Make sure you do too.

Read the original:

Can you pivot from studying galaxy evolution to working in data science? - Siliconrepublic.com

A Soviet Satellite Falls to Earth in ‘The Walking Dead’ Season 10. How Realistic Is It? – Space.com

AMC's "The Walking Dead" launched its 10th season last week to the delight of zombie fans everywhere, but the premiere also contained a space junk Easter egg that just might be a major plot point for the series: a Soviet satellite crashing to Earth.

The episode "Lines We Cross" ends with an old Soviet satellite crashing to Earth as a brilliant daytime fireball. It unleashes unmistakable sonic booms and sparks a wildfire in enemy territory (watch out for Whisperers!) that the show's heroes must battle to save their hunting grounds.

"We'd been talking in the writers' room about what are different things that happen the more time goes on," executive producer Angela Kang said in AMC's "Talking Dead" recap of the Oct. 6 premiere. "We thought that it would just be like an interesting thing that we haven't dealt with before, and then see what things they can get from the satellite in terms of technology or to help them."

Related: The Biggest Spacecraft to Fall Uncontrolled from Space

More: Failed 1970s Venus Probe Could Crash to Earth This Year

A Soviet satellite falls to Earth in a brilliant fireball in AMC's "The Walking Dead" Season 10 premiere "Lines We Cross."

(Image credit: AMC)

According to Kang, "The Walking Dead" showrunners sought advice on satellites from a NASA scientist at the Jet Propulsion Laboratory in Pasadena, California.

That got us wondering how accurate the satellite crash depiction was, so we reached out to astrophysicist Jonathan McDowell at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., who tracks satellites and space junk in Earth orbit.

Our first question: How accurate is the fireball, sonic boom and crash, which leaves much of the satellite intact?

"The visual of the reentry ... is good, although it looks too high at that point to have audible sonic booms, I would guess. Overall, not bad as a depiction," McDowell told Space.com in an email. "The almost-intact satellite found on the ground ... is not plausible."

The appearance of "The Walking Dead" satellite resembles a type of old Soviet surveillance satellite known as a Tselina-R, which was used for electronic intelligence, McDowell said. That might just be a coincidence, though.

According to Russian spaceflight expert Anatoly Zak, who runs Russianspaceweb.com, Tselina-R launched in 1990 (before the end of the Soviet Union) and was designed to last about six months. But Tselina-class satellites were launched throughout the late 1960s, 70s, 80s and 90s, with the last to fly in the early 2000s.

Next question: Is 10 years in "The Walking Dead" (that's my estimate based on the show's seasons) enough time for satellites to fall from space?

"The 10 years is enough if the satellite were in a relatively low orbit of say 500 km," McDowell said. "It'd be unlikely to have a jet-engine-size bit surviving unless it was actually designed for reentry (like a camera/film capsule)."
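McDowell's 10-year figure can be sanity-checked with a very crude drag-decay sketch. Every number here is an illustrative assumption, not a value from the article: the density at 500 km (which swings by more than an order of magnitude with solar activity), the 60 km scale height, the drag coefficient, and the satellite's area-to-mass ratio.

```python
import math

MU = 3.986e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6  # mean Earth radius, m

def density(h):
    """Crude exponential atmosphere anchored at 500 km (assumed values)."""
    RHO_500 = 1e-13  # kg/m^3 at 500 km, mid solar activity (assumed)
    H_SCALE = 60e3   # scale height, m (assumed)
    return RHO_500 * math.exp(-(h - 500e3) / H_SCALE)

def decay_time_years(h0=500e3, area_to_mass=0.01, cd=2.2, h_reentry=150e3):
    """Integrate the per-orbit semi-major-axis decay da = 2*pi*Cd*(A/m)*rho*a^2
    for a circular orbit until the satellite reaches reentry altitude."""
    h, t = h0, 0.0
    while h > h_reentry:
        a = R_EARTH + h
        period = 2 * math.pi * math.sqrt(a**3 / MU)  # orbital period, s
        da = 2 * math.pi * cd * area_to_mass * density(h) * a**2
        h -= da          # orbit shrinks a little each revolution
        t += period
    return t / (365.25 * 86400)

print(f"decay time: about {decay_time_years():.0f} years")
```

With these assumed inputs the lifetime comes out on the order of a decade or two, consistent with McDowell's "10 years is enough" for a 500 km orbit; change the density or area-to-mass ratio and the answer moves accordingly.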

Some Soviet satellites like that depicted in "The Walking Dead" have had small engine parts, like a meter-sized plate or quarter-meter spherical pressure tank, survive, but nothing the size of what appears in the show, McDowell added.

"Maybe you'd get a piece that big from a space station module."

Related: Skylab's Remains: NASA Space Station Debris in Australia (Photos)

The remains of a Soviet satellite in AMC's The Walking Dead. Such a large piece of space junk from a satellite would be unlikely in reality, unless it was designed to survive reentry.

(Image credit: AMC)

Here's a follow-up: The folks in "The Walking Dead" rush to the Soviet satellite to put out a wildfire, then rush to salvage any technology they can. Wouldn't there be toxic hydrazine or other chemicals to worry about? And would anything be salvageable at all?

"It's unlikely there'd be anything usable surviving; I don't think anything from Skylab survived in repairable condition, for example," McDowell said. "Again, if [it were] part of a system designed for reentry, that's a different story, so you could imagine a cargo ship ([SpaceX's] Dragon, for example) that boosted its recovery module in the wrong direction and was stranded in orbit for reentry 10 years later, but it wouldn't look like that."

And McDowell suggests there might be more to worry about than just toxic hydrazine, a fuel used for spacecraft thrusters.

"The concern with hydrazine is valid but maybe brief; it would probably dissipate pretty quickly. I would certainly be hesitant about approaching the thing when not wearing a protective hazmat suit," McDowell said. "There might be potentially explosive hypergolic propellant on board too. And on an old Soviet sat there might be high-explosive self-destruct devices."

Okay, last question: Do you watch "The Walking Dead"?

"I'm not a big zombie fan, 'Walking Dead' is too gross for me," said McDowell, though he did enjoy the TV series "iZombie."

McDowell also enjoyed another show about the dead walking the Earth, Showtime's "Dead Like Me," which began with Russia's Mir space station falling to Earth and its toilet seat killing the show's lead character.

"I was a fan of 'Dead Like Me,' indeed," McDowell said, "and appreciated the (again, implausible) Mir toilet seat."

Episode 2 of "The Walking Dead" Season 10 airs tonight on AMC at 9 p.m. EDT/8 p.m. CDT.

Email Tariq Malik at tmalik@space.com or follow him @tariqjmalik. Follow us @Spacedotcom and Facebook.


See original here:

A Soviet Satellite Falls to Earth in 'The Walking Dead' Season 10. How Realistic Is It? - Space.com

This Is The One Way The Moon Outshines Our Sun – Forbes

Typically, even the full Moon is approximately 400,000 times less bright than the Sun, making it appear about 12-14 visual magnitudes dimmer to human eyes. In visible light the Sun always outshines the Moon, which merely reflects the Sun's light, but there is one part of the spectrum where the Moon can outshine the Sun after all.
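The brightness ratio and the magnitude difference above are two statements of the same fact, since the astronomical magnitude scale is logarithmic: a difference of one magnitude is a brightness factor of 100^(1/5). A quick check with the article's round figure of 400,000:

```python
import math

# Magnitude difference for a given brightness ratio: dm = 2.5 * log10(ratio).
# 400,000 is the article's round figure; the true Moon/Sun ratio varies.
ratio = 400_000
delta_m = 2.5 * math.log10(ratio)
print(f"{delta_m:.1f} magnitudes")  # → 14.0 magnitudes
```

So a factor of 400,000 corresponds to almost exactly 14 magnitudes, the upper end of the article's 12-14 range.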

To human eyes, the Moon is the second brightest visible object, trailing only the Sun.

As seen in X-rays against the cosmic background, the Moon's illuminated (bright) and non-illuminated portions (dark) are clearly visible in this early X-ray image taken by ROSAT. The X-rays, like almost all wavelengths of light, arise mostly from reflected emission from the Sun.

Moonlight is just reflected light generated from other sources; it's not self-luminous.

The size, wavelength and temperature/energy scales that correspond to various parts of the electromagnetic spectrum. You have to go to higher energies, and shorter wavelengths, to probe the smallest scales. Although the Moon reflects sunlight, the most energetic photons from the Sun normally top out at X-ray energies.

Across the whole electromagnetic spectrum, the Sun always appears much brighter than the Moon.

This 1991 photo shows the Compton Gamma-Ray Observatory being deployed in space on April 7, 1991 from the Space Shuttle Atlantis. This observatory was humanity's first space-based gamma-ray satellite, and was part of NASA's original great observatories program, which included Hubble, Compton, Chandra and Spitzer.

Until, that is, we launched the Compton Gamma-Ray Observatory, capable of measuring the highest-energy radiation.

A diagram of the EGRET instrument, which was used for observing the highest-energy photons aboard the Compton Gamma-Ray Observatory. The EGRET instrument is the only one capable of measuring photons with energies from about 20 MeV up to around 30 GeV: higher-energy photons than the Sun typically emits.

The Sun, in gamma-rays, is very quiet, as its emitted radiation tops out at X-ray energies.

The Sun's light across the electromagnetic spectrum is due to nuclear fusion, which primarily converts hydrogen into helium. The nuclear reactions produce neutrinos and radiation that extends from the radio all the way up into the X-ray, but gamma-rays are only produced rarely: during flaring events.

The Moon, on the other hand, emits very little light relative to the Sun, but outshines it in gamma-rays.

Between 1991 and 1994, the Moon passed into the Compton Gamma-Ray Observatory's field-of-view multiple times, where the instrument was capable of observing it. Compton detected high-energy gamma-rays from the Moon with its EGRET instrument, and the energy spectrum of the lunar gamma radiation is consistent with a model of gamma-ray production by cosmic ray interactions with the lunar surface. The Moon is brighter than the (non-flaring) Sun at these high energies.

Across the full electromagnetic spectrum, only in the highest-energy gamma-rays does the Moon outshine the Sun.

A thin crescent moon, just one day after the new moon, sets in the west. The remaining disk is still illuminated by the light reflected from Earth that's then incident upon the lunar surface. The fact that the Moon always appears full in gamma-rays, even when just a thin crescent is illuminated by the Sun, teaches us that it isn't reflected sunlight that's causing these lunar gamma-rays.

This observation alone teaches us that the Moon isn't generating its gamma-rays by reflecting sunlight.

Using data from NASA's Lunar Reconnaissance Orbiter (LRO) and its narrow angle camera (LROC), we can now construct 3D models of the surface of the Moon and simulate any potential landing sites for missions. Our current understanding teaches us that the Moon's surface is made of many heavier elements, is surrounded by practically no atmosphere at all, and has a negligible magnetic field. This combination of factors basically creates 'the perfect storm' for generating gamma-rays from high-energy nuclear recoils.

The Moon's surface is made mostly of heavier elements, while the Sun is mostly hydrogen and helium.

The only time the Sun produces gamma-rays is during flaring events, when accelerated, high-energy protons can collide with heavier nuclei, producing an excited-state nucleus that emits gamma-rays. During quiet conditions, these fast protons will only interact with hydrogen or helium nuclei, which do not produce these gamma-rays. On the Moon's surface, however, heavy nuclei abound, and the creation of excited-state nuclei that then emit gamma-rays is ubiquitous.

When cosmic rays (high-energy particles) from throughout the Universe collide with heavy atoms, nuclear recoil causes gamma-ray emission.

Cosmic rays produced by high-energy astrophysics sources can reach any object in the Solar System, and appear to permeate our local region of space omnidirectionally. When they collide with Earth, they strike atoms in the atmosphere, creating particle and radiation showers at the surface. When they strike the heavy elements present on the Moon's surface, they can induce a nuclear recoil/reaction that winds up producing the high-energy gamma-rays we observe.

With no atmosphere or magnetic field, and a lithosphere rich in heavy elements, cosmic rays produce gamma-rays upon impacting the Moon.

Although the Sun doesn't typically generate either gamma-rays or cosmic-rays that account for what we see on the Moon, its complex magnetic field undergoes cyclical changes on an 11-year timescale. These changes can alter the gamma-ray flux from the Moon, over time, by up to about 20%.

If we had gamma-ray eyes, the Moon would always look "full" from any perspective.

With 7 panels of ever-increasing observing time, from 2 months up through 128 months, we can see how a gamma-ray image of the Moon becomes sharper and sharper over time. This image was taken by NASA's flagship gamma-ray observatory, Fermi, in energies of 31 MeV or higher. In these high-energy gamma-rays, the Moon indeed outshines the Sun.

Continue reading here:

This Is The One Way The Moon Outshines Our Sun - Forbes

Kostic named a thermodynamic topical collection editor of Entropy journal – NIU Today

Milivoje M. Kostic, professor emeritus in the Department of Mechanical Engineering and editor-in-chief of the Thermodynamics Section of the Entropy journal, was recently named a topical collection editor of Foundations and Ubiquity of Classical Thermodynamics, after serving as a guest editor for three Entropy special issues in 2013, 2015 and 2018.

In 2018, Kostic also published a feature paper, Nature of Heat and Thermal Energy, and attended the 16th International Heat Transfer Conference (IHTC-16), where he was a panelist on the development of a new entransy theory. His inclusion on the panel was influenced by Kostic's publications on the entransy concept and its controversies, as well as his prior collaboration with Chinese universities, starting with keynote lectures at the prestigious Tsinghua University.

Entropy is a monthly, peer-reviewed, open access, scientific journal covering research on all aspects of entropy and information theory. The journal publishes original research articles, communications, review articles, concept papers and more. Since Entropy became a mainstream, interdisciplinary journal at the end of 2015, it has diversified in several sections, which also include statistical mechanics, information theory, astrophysics and cosmology, quantum information and complexity, as well as in diverse special issues and more recently in topical collections.

Thermodynamics is a science of energy and entropy, considered by many to be among the most fundamental sciences. The phenomenological laws of thermodynamics have much wider significance and implications, including philosophical ones, than their simple expressions based on experimental observations; they are the fundamental laws of nature. Through its fundamental principles, classical thermodynamics distills a diverse and complex reality into a ubiquitous cause-and-effect simplicity. That is why it is hard to understand thermodynamics the first time, or the second time, through.

Kostic's research and scholarly interests are in fundamental laws of nature; thermodynamics and heat transfer fundamentals; the second law of thermodynamics and entropy; energy efficiency; conservation and sustainability; fluids-thermal-energy components and systems; and nanotechnology and nanofluids.

Kostic retired from his regular NIU duties in 2014 to pursue his scholarly work and research.

Read more from the original source:

Kostic named a thermodynamic topical collection editor of Entropy journal - NIU Today

Some Quasars Shine With the Light of Over a Trillion Stars – Universe Today

Quasars are some of the brightest objects in the Universe. The brightest ones are so luminous they outshine a trillion stars. But why? And what does their brightness tell us about the galaxies that host them?

To try to answer that question, a group of astronomers took another look at 28 of the brightest and nearest quasars. But to understand their work, we have to backtrack a little, starting with supermassive black holes.

A supermassive black hole (SMBH) is a black hole with more than a million solar masses. They can be much larger than that, too; up to billions of solar masses. One of these entities resides at the center of most galaxies, excluding dwarf galaxies and the like.

While stellar-mass black holes are the result of the gravitational collapse of a massive star, how SMBHs reach their enormous masses is still an open question. Either way, they occupy a spheroidal region of space from which nothing, not even light, can escape.

The Milky Way has one of these SMBHs. It's called Sagittarius A-star (Sgr A*) and it's about 2.6 million solar masses. But Sgr A* is rather sedate for a SMBH. Other SMBHs are much more active, and the galactic centers they power are called active galactic nuclei (AGN).

In an AGN, the black hole is actively accreting matter, forming a torus of gas that heats up. As it does so, the gas emits electromagnetic radiation, which we can see. AGNs can emit radiation all across the electromagnetic spectrum.

There are sub-classes of AGNs, and a new study focused on one of those sub-classes: quasars. A quasar is the most powerful type of AGN, and they can shine with the light of a trillion Suns. But some of these quasars are hidden behind their own torus, which blocks our line of sight. In studies of quasars, these are often ignored or omitted because they're difficult to see.

But that creates a problem, because omitting them from the population of quasars means we might be missing something. It also means that one of the central questions around quasars might not be addressed properly.

The question is really multi-pronged: are these extremely bright AGN powered by moderate accretion onto extremely massive black holes? Or are they powered by extreme accretion onto more moderate-mass black holes? Or maybe something else is going on: are they powered by a host galaxy transitioning from a star-forming galaxy to something more sedate like an elliptical galaxy? Ignoring or omitting the quasars that are difficult to see makes finding any answers difficult.
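One way to frame the "moderate accretion vs. extreme accretion" question is the Eddington limit: the maximum luminosity at which radiation pressure still allows steady accretion, which scales linearly with black hole mass. A rough back-of-the-envelope sketch (standard physical constants; the calculation is not from the paper):

```python
import math

# Standard constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_P = 1.673e-27      # proton mass, kg
SIGMA_T = 6.652e-29  # Thomson cross-section, m^2
M_SUN = 1.989e30     # solar mass, kg
L_SUN = 3.828e26     # solar luminosity, W

def eddington_luminosity(m_solar):
    """Eddington limit L_Edd = 4*pi*G*M*m_p*c / sigma_T, in watts."""
    return 4 * math.pi * G * (m_solar * M_SUN) * M_P * C / SIGMA_T

# Black hole mass whose Eddington limit equals one trillion Suns:
m = (1e12 * L_SUN) / eddington_luminosity(1.0)  # in solar masses
print(f"{m:.1e} solar masses")  # → about 3e7 solar masses
```

In other words, a black hole of only about 30 million solar masses accreting at its Eddington limit already shines with a trillion Suns, so trillion-Sun quasars do not by themselves require billion-solar-mass black holes; that degeneracy is exactly what the study probes.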

A team of astronomers looked at 28 AGN that were both nearby and among the most luminous. Most of them happened to be in elliptical galaxies. The only criteria for choosing them was the intense activity in their nuclei. Their radio emissions span factors of tens of thousands, and their masses also cover a wide range. The astronomers wanted to find out if these bright AGN had any other distinctive qualities which would set them apart from lower luminosity obscured AGN.

There are some intriguing and surprising results in this study. Some of the results seem to agree with other studies, while some go against the grain.

In the conclusion of their paper, the authors summarize their findings, and it seems that for now, at least, there is no clear explanation for these most luminous of quasars that shine with the light of a trillion stars.

We find that, as a group, our sample of some of the most luminous obscured AGN in BASS/DR1 does not exhibit any distinctive properties with respect to their black hole masses, Eddington ratios, and/or stellar masses of their host galaxies.

They also point out that the host galaxies are mostly all ellipticals, a surprising find. If this finding can be corroborated by other researchers, it may lend some indirect evidence in support of the popular idea that epochs of intense SMBH growth are linked to the transformation of galaxies from (star-forming) disks to (quenched) ellipticals (i.e., through major mergers).

There are 21 researchers behind this study, at institutions including the Harvard and Smithsonian Center for Astrophysics, Tel-Aviv University, Kyoto University, JPL, the Naval Observatory, the ESO, and many others. The data for their study come from the 70-month Swift/BAT all-sky survey, along with observations using the Keck, VLT, and Palomar observatories. The study is titled BAT AGN Spectroscopic Survey XIII. The nature of the most luminous obscured AGN in the low-redshift universe. It's published in the Monthly Notices of the Royal Astronomical Society.


See original here:

Some Quasars Shine With the Light of Over a Trillion Stars - Universe Today