Classes Online And In Space – Cardinal & Cream

Jeremy Blaschke, Union University assistant professor of biology, responded to the university's transition to online classes in a unique and engaging way: by taking his students into space.

Like many institutions across the nation, Union University made the decision to move classes online beginning March 16 due to the COVID-19 pandemic, abruptly uprooting the structured educational boundaries students and faculty have known their whole lives.

In a matter of days, professors had to convert class material to an online format. For many courses, this meant restructuring the entire semester, and for many Union professors the move online threatened the classroom structure they were comfortable teaching in.

Blaschke decided to counter this by building a new kind of structure of his own through storytelling.

It's 1,000 years in the future, and the humans who inhabit both Earth and Mars are almost extinct. Blaschke is a member of a 10-person crew exploring another planet that humanity might settle, reporting back to a group of biologists on the species they discover. The biologists in this scenario are the students in Blaschke's zoology class.

"There are some really crazy animals that we talk about that people have never heard of and have never imagined being real animals," Blaschke said in a news release from Union University. "For a lot of these, they seem alien already."

Jaime May, a senior psychology and sociology major, is serving as one of Blaschke's biologists for the mission. She feels the fictional world Blaschke has created makes online school a little easier.

"I'm grateful that he cares so much about our education and achievement," May said, "that he is willing to sacrifice more of his own time to see us succeed."

Blaschke's space exploration backstory is complete with a spaceship background and costume for his YouTube video lectures, and his students, divided into small groups, were tasked with creating names, backstories and personalities for their crew members. All of these crew members and their stories are being woven into the narrative that will unfold as the semester continues.


A Cosmic Phenomenon: Discovery Of Connection of Energy Within A Galaxy Sparks Excitement To What Other Space S – iTech Post

Science | By Renz, Apr 09, 2020 09:37 AM EDT


An unprecedented feature has been discovered in the radio galaxy ESO 137-006: bizarre new thread-like appendages that connect the galaxy's lobes of plasma.

On Wednesday, astronomer Dr. Mpati Ramatsoku of Rhodes University announced that she and her colleagues had found something new in a galaxy whose usual appearance astronomers had long grown accustomed to.

Dr. Ramatsoku said the galaxy consists of a core, home to a supermassive black hole, that shoots out two jets of plasma racing at near the speed of light. The energy within the jets ultimately slows down and disperses, producing large radio lobes.

The astronomer also stated that what makes this particular galaxy different is the appearance of several supplementary filaments linking the lobes.

Ramatsoku is the lead author of the international team of astronomers' study of the discovery, conducted with help from the state-of-the-art MeerKAT radio telescope in South Africa's Northern Cape Karoo region.

The MeerKAT radio telescope, which was inaugurated in 2018, was built to answer fundamental astrophysical questions about the universe.

Ramatsoku said this galaxy, which belongs to the Norma cluster of galaxies, is quite captivating and one of the brightest in the southern sky. It is characterized by two bright lobes of radio emission that are bent in one direction.

With this discovery, she added, they now know of new features in the form of multiple collimated synchrotron threads connecting the lobes.

Professor Oleg Smirnov, head of the South African Research Chairs Initiative for radio astronomy at Rhodes, expressed the team's excitement over the discovery. He said such beautiful revelations are of great importance for MeerKAT because they are proof of its capability for finding the 'unknown unknowns' within our universe.

The unexpected finding boosts the team's morale and reminds them of the very reason they undertook their profession in the first place.


Further study of the phenomenon is required to understand its nature, Ramatsoku added. She also said the features may be unique to the observed galaxy because of its extreme environment. It is equally possible that the phenomenon is common in other radio galaxies but has so far gone undetected for lack of sufficiently powerful instruments.

She said that if the phenomenon occurs in more than one galaxy, it brings new challenges and unknowns to figuring out the true nature of these cosmic bodies.

Insight into the nature and physics of these filaments may also strengthen the case for sensitive radio interferometers such as MeerKAT and future facilities like the Square Kilometre Array (SKA).

Smirnov credited Professor Justin Jonas as the person responsible for MeerKAT's creation and for bringing part of the SKA to South Africa. Having been with Rhodes since his student days, Jonas received the vice chancellor's distinguished achievement award in 2019.

A reporter for the Herald noted that even smaller institutions, such as Rhodes University, are capable of producing discoveries of cosmic importance to the field of space exploration.



E. Margaret Burbidge, Astronomer Who Blazed Trails on Earth, Dies at 100 – The New York Times

She joined the University of California, San Diego, in the early 1960s and went on to become the first director of its Center for Astrophysics and Space Sciences. At her death, she was university professor emeritus there.

With her husband, the astronomer Geoffrey Burbidge, the American physicist William Fowler and the English astronomer Fred Hoyle, Dr. Burbidge wrote a 1957 article that is considered one of the most influential scientific papers of its era. Titled "Synthesis of the Elements in Stars," but known in astronomical circles simply as B2FH, it was published in the journal Reviews of Modern Physics.

In it, the authors argued that nearly all of the chemical elements, from aluminum to zinc, are forged in the bodies of stars, a process now called stellar nucleosynthesis.

It was already known that the lightest elements, like hydrogen and helium, had been created amid the Big Bang. But the origin of the heavier elements, including the carbon that makes up plants and animals, the oxygen in the atmosphere and the gold and silver mined from the ground (in sum, the very matter of the universe) was the subject of longstanding debate.

The thesis of B2FH, now widely accepted, is that the heavier elements are synthesized from the lighter ones by thermonuclear reactions within stars. Loosed into space, these elements can also recombine to form new stars, beginning the cycle once more.

As the article describes it, we are all, in essence, made from stars.

"That work laid the foundations for all of modern nuclear astrophysics, and particle astrophysics as well," Dr. Fowler said. "It gave a blueprint for how the elements were formed in the cosmos."

(For work on the evolution of stars in general, Dr. Fowler shared the 1983 Nobel Prize in Physics with the Indian-American astrophysicist Subrahmanyan Chandrasekhar.)


Blowtorch of the Gods Captured by Black Hole Image Makers – The New York Times

Dr. Kim's group has now reprocessed the observations from those four nights. In addition, the group used two other sets of radio telescopes at different frequencies and different resolutions on other days. They did this to study the structure of the quasar's jet and zoom in on its source, "like opening a set of Matryoshka dolls," in Dr. Kim's words.

The results can be seen in the movie above. As viewed from afar at the lowest magnification, the jet bends down from a bright spot at the top of the frame, which corresponds to the center of the quasar, where the black hole is presumably working its grinding magic. Seen closer up, the jet decomposes into a series of blobs or hot spots shooting out. They form a line that bends slightly.

Under the highest magnification, the viewer is left with two blobs: one at the top of the image, which is the source of the jet, and a lower feature, which is one of the jet's outbursts of energy. The source of the jet looks like a bar turned sideways, nearly perpendicular to the direction of the blowtorch.

That, Dr. Kim said in a statement, was a surprise, because the team found this unexpected, perpendicular structure where they expected to find only the source of the jet.

"This is like finding a very different shape by opening the smallest Matryoshka doll," he said.

The perpendicular structure, the astronomers said in their paper, could be the accretion disk itself, the doughnut of fiery doomed material that circles the black hole. Enormous pressures and magnetic fields in that realm squeeze energy out the top and bottom of the doughnut at nearly the speed of light.

Dr. Doeleman ventured, however, that it could just be the beam twisting again to make life difficult for the observers.

In the second half of the movie, the astronomers compared images from the Event Horizon Telescope at a single wavelength over the course of a week to see how the knots in the jet were moving.


Astronomers are hoping to see the very first stars and galaxies in the Universe – Universe Today

Sometimes it's easy being an astronomer. When your celestial target is something simple and bright, the game can be pretty straightforward: point your telescope at the thing and just wait for all the juicy photons to pour on in.

But sometimes being an astronomer is tough, like when you're trying to study the first stars to appear in the universe. They're much too far away and too faint to see directly with telescopes (even the much-hyped James Webb Space Telescope will only be able to see the first galaxies, an accumulation of light from hundreds of billions of stars). To date, we don't have any observations of the first stars, which is a major bummer.

So, astronomers engage in a little bit of cosmic peek-a-boo. Before the first stars formed (the exact date is uncertain, because we haven't observed it yet, but we suspect it happened about thirteen billion years ago), the universe was composed almost entirely of pure, unadulterated neutral hydrogen: single electrons bound to single protons in perfect harmony.

But then the first stars appeared, and poured their high-energy radiation throughout the cosmos, flooding the universe with copious X-rays and gamma rays. That intense radiation ripped apart the neutral hydrogen, converting it into the thin but hot plasma that we see in the present-day universe. This process, known as the Epoch of Reionization, started in little patches that eventually grew to engulf the cosmos, like a bunch of weird bubbles.

All this is fascinating, but how can astronomers actually detect this process? They can do it through a little trick of neutral hydrogen: it emits radiation at a very specific frequency, 1420 MHz, which corresponds to a wavelength of 21 centimeters. Before the first stars came online, the neutral gas pumped out this 21-cm radiation by the bucketload, with the signal gradually diminishing as the universe became a plasma.
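The frequency-wavelength correspondence quoted above is just the relation wavelength = c / frequency; a quick sketch confirms that 1420 MHz lands at the famous 21-centimeter mark:

```python
# Wavelength of the neutral-hydrogen hyperfine line, from its rest frequency.
C = 299_792_458.0      # speed of light, m/s
F_HI = 1420.405751e6   # 21-cm line rest frequency, Hz

wavelength_cm = C / F_HI * 100  # convert metres to centimetres
print(f"{wavelength_cm:.1f} cm")  # prints "21.1 cm"
```

(Observed from Earth, the signal from the early universe is redshifted to much lower frequencies, but the rest-frame line is what gives the "21 cm" name.)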

Sounds like a plan, except a) this signal is incredibly weak, and b) a bajillion other things in the universe emit radiation at similar frequencies, including our radios on Earth.

Disentangling the annoying noise from the juicy cosmological signal requires sifting through mountains of data for the 21-cm needle in the astronomical haystack. We currently don't have the capability to make the detection (that will have to wait for next-generation radio telescopes like the Square Kilometre Array), but current observatories like the Murchison Widefield Array in Western Australia are laying all the necessary groundwork.

That groundwork includes delivering 200 TB of data in the array's first pass, which is currently under analysis by some of the most powerful supercomputers in the world.



Marvel at the universe with the free Northeast Astronomy Forum Virtual Experience today! – Space.com

Each year around this time, thousands of skywatchers, scientists and telescope manufacturers flock to Suffern, New York, for a weekend of reveling in the stars: the Northeast Astronomy Forum. This year, due to the coronavirus pandemic, the event has gone virtual, and you can watch it live for free today (April 6), no tickets needed.

The Northeast Astronomy Forum, or NEAF, is organized by the Rockland Astronomy Club and has been held for nearly three decades at SUNY Rockland Community College. NEAF 2020 was originally scheduled for this weekend, April 4-5, but the coronavirus pandemic forced organizers to postpone the live event to help curb the spread of COVID-19, the disease caused by the virus.

Instead, NEAF 2020 will hold a one-day free event from 10 a.m. to 8 p.m. EDT (7 a.m. to 5 p.m. PDT). You can tune in to the livestream directly from NEAF; it is also being streamed live on YouTube.


The event promises to be packed, "featuring product demonstrations, fantastic vendor discounts, door prizes, and amazing speakers that have made the Northeast Astronomy Forum legendary," organizers said in a statement.

Among the speakers in today's forum will be Thomas Zurbuchen, NASA Associate Administrator for science missions; C. Alex Young, the agency's associate director for science, heliophysics division; Samuel Hale, executive director of the Mount Wilson Observatory in California; Dianna Colman, chair of the Yerkes Foundation to Save Yerkes; and planetary scientist Janni Radebaugh, who will discuss Dragonfly, a mission to send a helicopter to Saturn's moon Titan.

Today's one-day livestream is not the end for NEAF 2020. Organizers and SUNY Rockland Community College have rescheduled the in-person event for Sept. 12 and 13.



Here’s the best way to enjoy the ‘Super Pink Moon,’ according to a NASA astronomer – Space.com

Tonight (April 7), the moon will be at its brightest and largest for the whole year during the "Super Pink Moon."

This extraordinary astronomical event is surely not one to miss. Space.com spoke with NASA astronomer Michelle Thaller, the assistant director of science communications at NASA's Goddard Space Flight Center in Greenbelt, Maryland, about tonight's highly anticipated skywatching event to get a better idea of what to expect and how people can best observe this special supermoon.

"It's just kind of a fun astronomical thing," Thaller said about tonight's full moon. Supermoons, or full moons that appear bigger than usual, occur because our moon does not orbit in a perfect circle around Earth. Rather, it circles our planet in an elliptical-shaped orbit. This means that sometimes the moon is closer to Earth and sometimes it is farther away, causing it to appear bigger or smaller from our perspective on Earth.


"Tonight the moon is 17,000 miles [27,000 kilometers] closer than average," Thaller said, adding that not only will the moon look bigger in the sky tonight, it will also be about 30% brighter than the average full moon.
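The brightness boost follows from the inverse-square law: a full moon's apparent brightness scales as 1 over the square of its distance. As a rough sketch (the perigee and apogee distances below are approximate Earth-Moon extremes I've supplied for illustration, not figures from the article), comparing a closest full moon with a farthest one gives a difference of roughly 30%:

```python
# Inverse-square brightness comparison of a perigee vs. an apogee full moon.
# Distances are approximate Earth-Moon extremes, in kilometres.
PERIGEE_KM = 356_500
APOGEE_KM = 406_700

# Apparent brightness scales as 1 / distance**2, so the ratio of the two
# brightnesses is the inverse ratio of the distances, squared.
ratio = (APOGEE_KM / PERIGEE_KM) ** 2
print(f"A perigee full moon is ~{(ratio - 1) * 100:.0f}% brighter than an apogee one")
```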

Luckily, because it will be so big, bright and obvious in the sky "it's a very simple and easy thing to observe," Thaller said.

This is good news for people who are self-isolating to "flatten the curve" and reduce the spread of the novel coronavirus. Even if you are inside, there is a high likelihood that you'll be able to spot the supermoon through a window.

"All you really need is to be able to look to the eastern horizon at sunset," Thaller said, adding that you have a huge window of time to look at the moon as well. "The wonderful thing about a full moon is that full moons are up all night long, they rise at sunset then they cross over the sky and set at sunrise, so at any point in the night you can go outside and actually see this wonderful big, bright moon."

However, despite how easy it will be for people around the world to check out tonight's supermoon, Thaller added that she has a favorite time to watch: moonrise. "Sometimes it just knocks my socks off to see a full moon rising in the sky," she said.

"When you can see it against the horizon, it looks gigantic. It looks like it's coming in for a landing," she added. "To me, that's the best part."

So, whether you're inside looking out a window or out in your backyard at sunset waiting for a "giant" supermoon to beam up over the eastern horizon, make sure that tonight you look up!



Astronomers spot never-before-seen gravitational wave source from binary white dwarf stars – Space.com

Astronomers have detected two stellar corpses whirling around each other, and they might be producing gravitational waves.

White dwarf stars are what become of stars like our sun after they run out of fuel and turn into leftover hot cores. For many years, researchers have predicted that there should be binary, or two-object, systems made up of white dwarf stars. According to general relativity, two such masses orbiting each other should emit energy in the form of gravitational waves, which are ripples or disturbances in the fabric of spacetime.

To be clear, this is not a detection of gravitational waves; rather, it is the discovery of a binary that may be a source of gravitational waves. But not only will this study advance our understanding of these systems and gravitational-wave sources, it will also be important in validating an instrument that is scheduled to launch in 2034.


That instrument, the LISA (Laser Interferometer Space Antenna) gravitational-wave observatory, will essentially use the J2322+0509 system as a training target. Because astronomers already know the system exists, it is a good test of whether the instrument can correctly spot it.

"Verification binaries are important because we know that LISA will see them within a few weeks of turning on the telescopes," Mukuemin Kilic, a co-author on this study from the University of Oklahoma, said in the statement. "There's only a handful of LISA sources that we know of today. The discovery of the first prototype of a new class of verification binary puts us well ahead of where anyone could have anticipated."

In a new study identifying and exploring this binary, researchers at the Center for Astrophysics (CfA) at Harvard have detected, for the first time, a binary system made up of two white dwarf stars (with helium cores) that are clearly separate stars. This system, known as J2322+0509, has a short orbital period of 1,201 seconds (just over 20 minutes) and is the first gravitational-wave source of its kind ever identified.
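For a circular binary, the dominant gravitational-wave frequency is twice the orbital frequency, f = 2/P. A back-of-the-envelope sketch (the millihertz result is my own arithmetic, not a figure from the study) shows why the 1,201-second period puts J2322+0509 in LISA's millihertz band:

```python
# Dominant gravitational-wave frequency of a circular binary: f_gw = 2 / P_orb.
P_ORB_S = 1201.0       # orbital period of J2322+0509, in seconds
f_gw = 2.0 / P_ORB_S   # gravitational-wave frequency, in Hz

# LISA is most sensitive at roughly 0.1 mHz to 1 Hz, so ~1.67 mHz sits
# comfortably inside its band.
print(f"f_gw = {f_gw * 1e3:.2f} mHz")
```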

"Theories predict that there are many double helium-core white dwarf binaries out there," Warren Brown, CfA astronomer and lead author on the study, said in a statement. "This detection provides an anchor for those models, and for doing future experiments so that we can find more of these stars and determine their true numbers."

This system, whose orbital period is the third shortest of all detached binaries ever found, was fairly tough to spot. "This binary had no light curve," Brown said in the statement. "We couldn't detect a photometric signal because there isn't one." So instead of using a photometric study, which looks at light itself, the team used spectroscopic studies, which observe how matter interacts with electromagnetic radiation like visible light, to identify the stars' orbital motion.

But while the system was tricky to spot, the team found through theoretical calculations that this type of binary is an extremely strong source of gravitational waves, according to the statement and the study. The researchers determined that, because of the system's orientation with respect to Earth, instruments should pick up a signal 2.5 times stronger than from the same system oriented differently.

This binary won't be a binary forever, though, as a consequence of the very gravitational waves the scientists hope to someday detect. "The orbit of this pair of objects is decaying," Brown said. "The gravitational waves that are being emitted are causing the pair to lose energy; in six or seven million years they will merge into a single, more massive white dwarf."

This work is described in a paper posted to the preprint server arXiv.org on April 3 that has been accepted by The Astrophysical Journal Letters.


Interview: Jim Lovell relives the successful failure of Apollo 13 – Astronomy Magazine

Lovell: Well, it did become more famous in the beginning, at least in the eyes of NASA. I have to tell you an interesting story. We came back. It's a failure. So the spacecraft, the command module, which was the only thing left of Apollo 13, really, was in a warehouse down in Florida for about six months. Then, they tried to forget about it. They wanted to go on to Apollo 14 and everything like that.

Then France called up, Paris called up, [the] museum at Le Bourget, which was where Lindbergh landed. They asked the Smithsonian, "Do you have any space artifacts that we could have in this museum?" Then the lights came on in the Smithsonian and also NASA: "Well, we can get rid of this spacecraft." So they exiled Apollo 13 to Le Bourget, and it stayed there for 20 years.

About 18 years after that, I had a classmate that went out there and he saw it and he wrote me a letter. He said, "Do you know where your spacecraft is?" I didn't at that time. No one told me it was in Le Bourget. Then, later on, a year or so later, my wife [Marilyn] and I were in Paris and we went out to this museum, which was at the airfield there, and there we saw it. We walked up to it. It was still on the cradle that they had rolled it in on. It was all by itself, just about, nothing else around it. The hatch was missing. The instrument panel was missing. The seats were missing. The only thing I saw was a piece of paper stuck on the side that said "Apollo 13" and gave the names of the three crew members. And then Ron Howard made the movie. Of course they made the movie that was shown in France, and all those French people said, "Oh, it's out there in Le Bourget. Let's go see it."

Meanwhile, NASA and the Smithsonian were so embarrassed that a museum out of Hutchinson, Kansas, called the Cosmosphere, offered to go get [it] and bring it back and pay for it, and they did. And all those Frenchmen were mad because they had kept it for 20 years, and now it came back here. [Laughs.]

Astronomy: Do you recall the first thing you and Marilyn talked about once you returned after Apollo 13? How did that conversation go? Did she encourage you to find a different career path, maybe?

Lovell: Well, I have to tell you another interesting story along those lines. About a week or two weeks after we got picked up in Hawaii and then we came back, we had a big press conference of course. All the NASA people came in and all the reporters came in, and TV people and stuff like that, and a lot of the families came in to listen to the whole thing. We were in the auditorium down in the Johnson Space Center. So we started talking about that.

At the beginning of the conference, a reporter asked, "Jim, are you gonna ask for another flight? Obviously, this was not successful." Before that, on Apollo 11 [and] 12, management had said, "Look, if there's a problem with this flight, we'll get you back and we'll give you the very next one."

So when that question came up from the reporter, I thought to myself, because management was right behind us, here was the perfect opportunity to put them on the wall and say yes, because they had not talked to us, the 13 crew, just 11 and 12. I was about ready to say something like that when, out in the audience, I saw a hand go up. Then I saw it go down like this. [Jim gives a thumbs-down gesture.] It was my wife. [Laughs.] I could tell. I said, "No. I think this is the last flight I'm gonna make." [Laughs.]


View On Astronomy: Want to see Betelgeuse supernova? You’ll have to wait a bit longer – The Independent

As 2019 came to a close, the news media sensationalized a story about Orion's bright star Betelgeuse. The headlines were certainly designed to get one's attention: Betelgeuse was about to go supernova. However, the star's behavior was really old news that was recently enhanced by new observational data. You see, Betelgeuse is a red supergiant star (about 20 times more massive than our Sun and approximately 1,000 times larger) that is indeed nearing the end of its life cycle. And with a star this massive, the result will someday be a supernova event.

Betelgeuse is a known variable star, which pulsates by about one full magnitude (on the brightness scale) over a 425-day period. What happened more recently is that the star dimmed a little more than usual, by about 0.2 magnitude. An imaging technique using radio waves revealed that Betelgeuse appeared to be lopsided, but this discovery turned out to be a huge dust cloud blocking some of the star's light from reaching us. In fact, Betelgeuse has shed great shells of its outer surface several times in the past, typical activity for these stars as they burn through their supply of nuclear fuel. Speculation arose that Betelgeuse's grand finale was soon at hand.

However, every article I read succinctly stated the event could happen soon, or 100,000 years from now. While it is inevitable that Betelgeuse will go supernova in the future, we needn't worry. Fortunately, at its distance of about 700 light-years from Earth, we will not suffer any hard radiation effects. The supernova will be at least as bright as a full moon and will be visible in broad daylight. About a day before we see the visible light from the supernova event, Earth will be bombarded by a harmless hail of neutrinos and gamma rays. That onslaught will be our advance warning that Betelgeuse has met its demise.

Just as I began to write about Betelgeuse's potentially imminent demise, new data revealed that the star began to brighten once again during mid-to-late February (much as it has in the past). Astronomers will certainly keep monitoring Betelgeuse with their instruments in the hope of capturing the death of a star. If it happens within our lifetime, I hope it occurs when the constellation of Orion is above our horizon. The sight will be spectacular.

Easter Observance Determination

Many religious celebrations are determined by astronomical circumstances. Easter is no exception. But because our secular calendar is not in sync with the motion of the heavens, Easter can occur as early as March 22 or as late as April 25. The general rule is: Easter falls on the first Sunday after the full moon occurring on or next after the vernal equinox (spring: March 19, 20 or 21). However, if that full moon occurs on a Sunday, Easter is celebrated on the following Sunday. This scenario happened in 2001.

However, there is a caveat to that rule that I only learned about in 2018. Because the date of the vernal equinox varies from year to year, the determination of the Easter date depends on the ecclesiastical approximation of March 21 for the vernal equinox, according to https://www.timeanddate.com. This stipulation holds true even if the vernal equinox falls on March 19 or 20.

Therefore, for 2020, using March 21 as the date of the vernal equinox, the next full moon after March 21 falls on April 7 at 10:35 p.m. EDT this year. Easter will therefore be celebrated on the following Sunday, April 12.
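The ecclesiastical rule described above can be computed directly. A minimal sketch using the well-known anonymous Gregorian computus (a standard published algorithm, not anything specific to this article) reproduces the April 12, 2020 date:

```python
def gregorian_easter(year):
    """Anonymous Gregorian computus: returns (month, day) of Easter Sunday."""
    a = year % 19                        # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # ecclesiastical full-moon offset
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

print(gregorian_easter(2020))  # (4, 12) -> Sunday, April 12
```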

April Lyrids Meteor Shower

It's been a while since Mother Nature has afforded us a decent display of shooting stars. Clouds or bright moonlight have often conspired to prevent us from watching burning rocks fall from the sky. However, on the night of April 22-23, between midnight and dawn, the annual April Lyrid meteor shower will reach its peak of activity. The Lyrids are the oldest known shooting-star display, having been observed by Chinese astronomers on March 16, 687 BCE. Being such an old display, the number of meteors populating this stream of particles has greatly diminished. However, with good sky conditions and no interfering moonlight, perhaps up to 20 meteors per hour can be counted from dark-sky locations.

These swift and bright meteors disintegrate after hitting our atmosphere at a moderate speed of 29.8 miles per second. They often produce luminous trains of dust that can be observed for several seconds. The Moon will be new on the 23rd, so it will not interfere whatsoever with this year's display.

The Lyrids appear to radiate outward from an area of sky on the Lyra-Hercules border near the bright star Vega, which will be about 45 degrees (halfway between the horizon and zenith) above the eastern horizon at midnight and well placed for observing. Let your eyes roam the heavens while facing this general direction. Remember, even though you can trace the dust train left by a Lyrid meteor back to the radiant point, members of this shower can appear anywhere in the sky. The Lyrids are a fairly narrow stream of particles, so don't expect many to be seen before or after the peak night. The shower is produced by dust particles left behind by Comet C/1861 G1 (Thatcher).

Keep your eyes to the skies!

The author has been involved in the field of observational astronomy in Rhode Island for more than 35 years. He serves as historian of Skyscrapers Inc., the second oldest continuously operating amateur astronomical society in the United States.

Read the original here:

View On Astronomy: Want to see Betelgeuse supernova? You'll have to wait a bit longer - The Independent

See the waning crescent Moon meet the dawn planets, 1516 April 2020 – Astronomy Now Online

At civil dawn (approximately 40 minutes before sunrise in the British Isles) on the mornings of 15 and 16 April 2020, let the old crescent Moon be your guide to three naked-eye planets: Jupiter, Saturn and Mars. Both the red and the ringed planet lie in the constellation of Capricornus, while Jupiter lies in Sagittarius. This looping animation depicts the view very low to the horizon between southeast and south-southeast around 5:30am BST on the mornings in question. Note that the Moon's apparent size is enlarged for clarity. Dabih, otherwise known as Beta Capricorni, is a third-magnitude multiple star. AN animation by Ade Ashford. If you're an early riser in the British Isles fortunate enough to experience clear skies at the start of civil twilight on 15 and 16 April, why not venture out at 5:30am BST to see the waning crescent Moon guide you to not just one, but three naked-eye planets: Jupiter, Saturn and Mars. Typical 7×50 or 10×50 binoculars will enable you to better appreciate these attractive conjunctions, while the smallest of telescopes also reveal some of Jupiter's bright Galilean moons.

What to look for on 15 April 2020 at 5:30am BST. At the onset of civil twilight, some 40 minutes before sunrise in the UK, the waning last quarter lunar crescent lies in Sagittarius just 8 degrees (slightly less than the span of a fist at arm's length) above the south-southeast horizon for an observer in the heart of the British Isles. AN graphic by Ade Ashford. At 5:30am BST on 15 April, magnitude +0.6 Saturn lies 4 degrees to the Moon's upper left, while magnitude -2.2 Jupiter (13 times brighter than the ringed planet) is 4 degrees to the upper right of the Moon. What's more, this attractive celestial triumvirate comfortably fits within the field of view of typical 7×50 binoculars. Owners of small telescopes can also see the Jovian moons Callisto, Europa and Ganymede at this time, but Io is transiting the face of its parent planet.

If your skies are particularly clear, can you glimpse the third-magnitude star Beta (β) Capricorni, better known as Dabih, some 5 degrees (slightly more than a 10×50 binocular field of view, but easily encompassed by 7×50 instruments) to the upper left of Saturn? If so, can you see that it's a double star?

What to look out for on 16 April 2020 at 5:30am BST. The almost 23-day-old Moon lies in the constellation of Capricornus at UK civil dawn, some 3 degrees to the lower right of magnitude +0.6 planet Mars. The lunar crescent is just 5 degrees high in the southeast, so can you glimpse the Red Planet and Moon in the same field of view of 10×50 binoculars this morning?
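All of these "will they fit in one field?" questions reduce to comparing an angular separation with a binocular's true field of view. A trivial sketch; the field widths below are typical assumed values chosen to be roughly consistent with the article's comparisons, not figures it states:

```python
def fits_in_field(separation_deg, field_deg):
    """True if two objects separated by separation_deg fit in one field of view."""
    return separation_deg <= field_deg

# Assumed typical true fields of view, in degrees (not from the article):
FIELD_7X50 = 7.0
FIELD_10X50 = 4.8

# Separations quoted in the text:
print(fits_in_field(4.0, FIELD_7X50))   # True: Moon and Saturn in a 7x50 field
print(fits_in_field(5.0, FIELD_10X50))  # False: Dabih-Saturn gap exceeds a 10x50 field
print(fits_in_field(3.0, FIELD_10X50))  # True: Moon and Mars in a 10x50 field
```

Note that higher magnification generally means a narrower true field, which is why the 10×50s fail where the 7×50s succeed.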

Caution: never sweep with binoculars close to the horizon near sunrise, lest you accidentally view the Sun with disastrous consequences for your eyesight. Consult our interactive online Almanac to find the precise time of sunrise for your location. (Click here for a user's guide to the Almanac.)

See the article here:

See the waning crescent Moon meet the dawn planets, 1516 April 2020 - Astronomy Now Online

Comets Are Breaking Apart in Our Cosmic Backyard, and Astronomers Are Stoked – Popular Mechanics

At the end of August 2019, an amateur astronomer named Gennadiy Borisov made a remarkable discovery. He'd spotted an interstellar comet zipping through our solar system. In December, that comet, newly named 2I/Borisov, made its closest approach to the sun, perhaps its first close encounter with any star.

Astronomers have since used Earth's many telescopes, both terrestrial and orbital, to observe the comet. Last week, astronomers reported in a series of posts to the website Astronomer's Telegram that 2I/Borisov showed signs of breaking up. On March 28 and March 30, the Hubble Space Telescope snapped pictures of the interstellar comet, and it seemed to have split apart, astronomers reported April 2 in a statement.

"Continuing Hubble Space Telescope images of interstellar object 2I/Borisov...show a distinct change in appearance," read the statement, composed by astronomers David Jewitt of UCLA, Max Mutchler of STScI, Yoonyoung Kim of the Max Planck Institute for Solar System Research, Hal Weaver of Johns Hopkins University's Applied Physics Laboratory and Man-To Hui of the University of Hawaii.

So far, Jewitt tells Popular Mechanics, only a small fragment, maybe about a tenth of a percent of the total mass, has come off of the roughly 1,600-foot-wide body. One of the pieces, according to an Astronomer's Telegram update posted on April 6, has already disappeared.
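A tenth of a percent of the mass is smaller than it sounds in linear terms: for spheres of equal density, diameter scales with the cube root of mass. A back-of-the-envelope sketch (the spherical, uniform-density assumptions are mine, not the researchers'):

```python
def fragment_diameter(parent_diameter, mass_fraction):
    """Diameter of a piece carrying mass_fraction of the parent's mass,
    assuming both bodies are spheres of the same density."""
    return parent_diameter * mass_fraction ** (1 / 3)

# ~0.1% of the mass of a ~1,600-foot nucleus (figures from the text):
print(round(fragment_diameter(1600, 0.001)))  # 160 feet, i.e. roughly 50 meters
```

So under these assumptions the lost piece would still be a house-sized chunk, around a tenth of the parent's width.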

"Instead, a diffuse, blob-like feature is visible in its place, extending from the remaining component," Qicheng Zhang of Caltech and Quanzhi Ye and Ludmilla Kolokolova of the University of Maryland reported. Ye tells Popular Mechanics that this blob is likely just bits of dust, ice, and rock which have spun off of the comet.

In early March, astronomers recorded several outbursts, in which Comet Borisov shed a bunch of material, a tell-tale sign that a break-up may be imminent. "It takes a while for that heat to permeate through the comet," Jewitt says. Heat from the sun creeps into pockets of ice inside the comet. That ice vaporizes, forming pressure cooker-like conditions, and poof.

These outbursts may have spurred the fast-moving body to shed even more material. They might have another peculiar effect on the comet: they may cause it to spin faster. More observations are needed to confirm Jewitt's hypothesis and measure the speed at which it's spinning.

NASA, ESA, and D. Jewitt (UCLA)

Borisov is just the second interstellar object to slide across our solar system. In 2017, astronomers discovered an elongated interstellar object they called 1I/'Oumuamua. While both objects came from distant corners of the universe, 1I/'Oumuamua looked and acted more like a lumpy rock. Comet Borisov, however, has all of the typical characteristics of a comet.

"We know it's spent a really long time out there in the interstellar medium at nearly absolute zero temperature," Jewitt says. "So the question is: Have either of those things affected it in some way and made it measurably different from the comets in our solar system?"

Astronomers hope that the splintering comet might spill secrets about its interstellar journey and the solar system from which it came. In October, a preprint posted to the website arXiv reported traces of water in the comet's tail. Astronomers have also spotted traces of cyanide in the comet's wake. Unfortunately, due to the spread of the novel coronavirus, many Earth-based telescopes that would have otherwise made additional observations about its composition have been shuttered.

So far, it seems to behave in a very similar way to comets that originated much closer to home. "Borisov's behavior is remarkably similar to its solar system siblings," Ye says. "It has a similar composition to solar system comets; we know that solar system comets with similar characteristics are prone to fragment, and Borisov also did."

Fortunately, we've still got a bit of time with Borisov before it disappears from view completely. Ye estimates it will remain visible to ground-based telescopes for another year. Space telescopes like Hubble will likely be able to see the comet for even longer, perhaps a few years, before it slides out of sight.

We've already learned quite a bit about the nature of interstellar comets from 2I/Borisov, but it's taught us something about our own place in the universe, too. "In my opinion, it tells us that our solar system may not be that unique after all," Ye says. "There's something universal across the stars."

Quanzhi Ye (University of Maryland) and Qicheng Zhang (Caltech)/ Ningbo Education Xinjiang Telescope.

Additional observations made by Ye and his colleague Qicheng Zhang of Caltech using the Ningbo Education Xinjiang Telescope, show that Comet C/2019 Y4 (ATLAS) may be breaking up, too.

The comet was discovered on December 28, 2019, by a group of astronomers at Asteroid Terrestrial-impact Last Alert System (ATLAS) in Hawaii. Astronomers had high hopes for Comet C/2019 Y4 (ATLAS), which was expected to become bright enough to be seen with a decent pair of binoculars (or the naked eye, in dark sky areas) this month, but it might meet its demise before that's possible.

These images, taken this weekend, "showed an elongated pseudo-nucleus measuring about 3 arcsec in length and aligned with the axis of the tail, a morphology consistent with a sudden decline or cessation of dust production, as would be expected from a major disruption of the nucleus," according to an Astronomer's Telegram update. Translation: Comet C/2019 Y4 (ATLAS) may be headed for Splitsville.

The comet hasn't appeared as bright the past few nights, further suggesting a break-up might be on the horizon. But comets are unpredictable, and that's what makes them worth watching.

More here:

Comets Are Breaking Apart in Our Cosmic Backyard, and Astronomers Are Stoked - Popular Mechanics

South African Radio Astronomy Observatory Mandated To Manage The Production Of Respiratory Ventilators – Space in Africa

The South African Department of Trade, Industry and Competition has mandated the South African Radio Astronomy Observatory (SARAO) to manage the national effort required for the local design, development, production and procurement of respiratory ventilators to support the government's response to combat the COVID-19 (coronavirus) pandemic.

SARAO has been mandated to manage the National Ventilator Project based on the experience it gained in the development of complex systems for the MeerKAT radio telescope, a precursor to the Square Kilometre Array, which will be the world's largest radio telescope.

With the number of people who have tested positive for COVID-19 steadily increasing in South Africa, the government has called on companies and experts, particularly engineers and scientists, to come up with innovative solutions to help combat the pandemic.

As of 3 April 2020, South Africa had 1,505 confirmed cases of COVID-19. The health ministry expects the number to increase exponentially in the next few weeks as more people get tested.

In an effort to meet the anticipated demand for critical medical equipment such as ventilators, the Department of Trade, Industry and Competition is inviting companies and experts to express their interest in the design, development, production and procurement of ventilators in South Africa.

The invitation provides an opportunity for experts and companies to register their interest regarding the goods and services they offer that may be relevant to the National Ventilator Project. Once the specifications are finalised, interested parties will be invited to make a representation of their proposed solutions and the extent to which they would meet the specification.

Interested parties can register here

Experts interested in providing technical support can register here

Submissions have to be in the standard templates, listed below:


Read the original here:

South African Radio Astronomy Observatory Mandated To Manage The Production Of Respiratory Ventilators - Space in Africa

COVID-19 Forces Earth’s Largest Telescopes to Close. But a Few Isolated Astronomers Are Still Watching Over the Cosmos – Discover Magazine

The alarm sounded at around 3 a.m. on April 3. An electrical malfunction had stalled the behemoth South Pole Telescope as it mapped radiation left over from the Big Bang. Astronomers Allen Foster and Geoffrey Chen crawled out of bed and got dressed to shield themselves from the minus-70-degree Fahrenheit temperatures outside. They then trekked a few thousand feet across the ice to restart the telescope.

The sun set weeks ago in Antarctica. Daylight won't return for six months. And yet, life at the bottom of the planet hasn't changed much even as the rest of the world has been turned upside-down. The last flight from the region left on Feb. 15, so there's no need for social distancing. The 42 winterovers still work together. They still eat together. They still share the gym. They even play roller hockey most nights.

And thats why the South Pole Telescope is one of the last large observatories still monitoring the night sky.

Astronomer Allen Foster controls the $20 million South Pole Telescope from inside the comfort of the South Pole Science Station office. (Credit: Jeff Derosa)

An Astronomy magazine tally has found that more than 100 of Earth's biggest research telescopes have closed in recent weeks due to the COVID-19 pandemic. What started as a trickle of closures in February and early March has become an almost complete shutdown of observational astronomy. And the closures are unlikely to end soon.

Observatory directors say they could be offline for three to six months or longer. In many cases, resuming operations will mean inventing new ways of working during a pandemic. And that might not be possible for some instruments that require teams of technicians to maintain and operate. As a result, new astronomical discoveries are expected to come to a crawl.

"If everybody in the world stops observing, then we have a gap in our data that you can't recover," says astronomer Steven Janowiecki of the McDonald Observatory in Texas. "This will be a period that we in the astronomy community have no data on what happened."

Yet these short-term losses aren't astronomers' main concern.

Theyre accustomed to losing telescope time to bad weather, and they're just as concerned as everyone else about the risks of coronavirus to their loved ones. So, for now, all that most astronomers can do is sit at home and wait for the storm to clear.

"If we have our first bright supernova in hundreds of years, that would be terrible," says astronomer John Mulchaey, director of the Carnegie Observatories. "But except for really rare events like that, most of the science will be done next year. The universe is 13.7 billion years old. We can wait a few months."

The prospects get darker when considering the pandemic's long-term impacts on astronomy. Experts are already worried that lingering damage to the global economy could derail plans for the next decade of cutting-edge astronomical research.

"Yes, there will be a loss of data for six months or so, but the economic impact may be more substantial in the long run," says Tony Beasley, director of the National Radio Astronomy Observatory. "It's going to be hard to build new telescopes as millions of people are out of work. I suspect the largest impact will be the financial nuclear winter that we're about to live through."

The world's largest optical telescopes, shown here, have shut down in droves in recent weeks (open sites are in green). The Hobby-Eberly Telescope at McDonald Observatory in Texas is the largest optical telescope left observing. Construction has also halted at the Vera C. Rubin Observatory site in Chile. (Credit: Astronomy/Roen Kelly)

Through interviews and email exchanges with dozens of researchers, administrators, press officers and observatory directors, as well as reviewing a private list circulating among scientists, Astronomy magazine has confirmed more than 120 of Earth's largest telescopes are now closed as a result of COVID-19.

Many of the shutdowns happened in late March, as astronomy-rich states like Arizona, Hawaii and California issued stay-at-home orders. Nine of the 10 largest optical telescopes in North America are now closed. In Chile, an epicenter of observing, the government placed the entire country under a strict lockdown, shuttering dozens of telescopes. Spain and Italy, two European nations with rich astronomical communities and a large number of COVID-19 infections, closed their observatories weeks ago.

Even many small telescopes have now closed, as all-out shutdowns were ordered on mountaintops ranging from Hawaii's Mauna Kea to the Chilean Atacama to the Spanish Canary Islands. Science historians say nothing like this has happened in the modern era of astronomy. Even during the chaos of World War II, telescopes kept observing.

As wartime fears gripped Americans in the 1940s, German-born astronomer Walter Baade was placed under virtual house arrest. As a result, he famously declared Mount Wilson Observatory in California to be his official residence. With the lights of Los Angeles dimmed to avoid enemy bombs, Baade operated the world's largest telescope in isolation, making groundbreaking discoveries about the cosmos. Among them, Baade's work revealed multiple populations of stars, which led him to realize that the universe was twice as big as previously thought.

In the decades since, astronomers have built ever-larger telescopes to see fainter and farther-off objects. Instruments have become increasingly complex and specialized, often requiring them to be swapped out multiple times in a single night. Enormous telescope mirrors need regular maintenance. All of this means observatory crews sometimes require dozens of people, ranging from engineers and technicians to observers and astronomers. Most researchers also still physically travel to a telescope to observe, taking them to far-flung places. As a result, major observatories can be like small villages, complete with hotel-style accommodations, cooks and medics.

But although observatories might be remote, few can safely operate during a pandemic.

"Most of our telescopes still work in classical mode. We do have some remote options, but the large fraction of our astronomers still go to the telescopes," says Mulchaey, who also oversees Las Campanas Observatory in Chile and its Magellan Telescopes. "It's not as automated as you might think."

Some of the most complicated scientific instruments on Earth are the gravitational-wave detectors, which pick up almost imperceptible ripples in space-time created when two massive objects merge. In 2015, the first gravitational-wave detection opened up an entirely new way for astronomers to study the universe. And since then, astronomers have confirmed dozens of these events.

The most well-known facilities, the twin Laser Interferometer Gravitational-wave Observatory (LIGO) detectors located in Washington state and Louisiana, both pandemic hot spots, closed on March 27. Virgo, their Italian partner observatory, shut down the same day. (It's also located near the epicenter of that country's COVID-19 pandemic.)

More than 1,200 scientists from 18 countries are involved with LIGO. And no other instruments are sensitive enough to detect gravitational waves from colliding black holes and neutron stars like LIGO and Virgo can. Fortunately, the observatories were already near the end of the third observing run, which was set to end April 30.

You don't know what you missed, says LIGO spokesperson Patrick Brady, an astrophysicist at the University of Wisconsin-Milwaukee. We were detecting a binary black hole collision once a week. So, on average, we missed four. But we don't know how special they would have been.

The gravitational-wave detectors will now undergo upgrades that will take them offline through at least late 2021 or early 2022. But the pandemic has already delayed preliminary testing for their planned fourth run. And it could prevent future work or even disrupt supply chains, Brady says. So, although its still too early to know for sure, astronomy will likely have to wait a couple of years for new gravitational-wave discoveries.

Then there's the Event Horizon Telescope (EHT). Last year, the EHT collaboration released the first-ever image of a black hole. And on April 7, they published another unprecedented image that stares down a black hole's jet in a galaxy located some 5 billion light-years away. But now, EHT has cancelled its entire observing run for the year (it can only collect data in March and April) due to closures at its partner instruments.

Around the world, only a handful of large optical telescopes remain open.

The Green Bank Observatory, home to Earth's largest steerable radio telescope, is still searching for extraterrestrial intelligence and observing everything from galaxies to gas clouds.

The twin Pan-STARRS telescopes on the summit of Hawaii's Haleakala volcano are still scouting the sky for dangerous incoming asteroids. Both instruments can run without having multiple humans in the same building.

"We are an essential service, funded by NASA, to help protect the Earth from (an) asteroid impact," says Ken Chambers, director of the Pan-STARRS Observatories in Hawaii. "We will continue that mission as long as we can do so without putting people or equipment at risk."

The 10-meter Hobby-Eberly Telescope at McDonald Observatory in Texas is now operating with just one person in the building. (Credit: Marty Harris/McDonald Observatory)

With observatory domes closed at the world's newest and best telescopes, a smattering of older, less high-tech instruments are now Earth's largest operating observatories.

Sporting a relatively modest 6-meter mirror, the biggest optical telescope still working in the Eastern Hemisphere is Russia's 45-year-old Bolshoi Azimuthal Telescope in the Caucasus Mountains, a spokesperson there confirmed.

And, for the foreseeable future, the largest optical telescope on the planet is now the 10-meter Hobby-Eberly Telescope (HET) at McDonald Observatory in rural West Texas. Astronomers managed to keep the nearly-25-year-old telescope open thanks to a special research exemption and drastic changes to their operating procedures.

To reduce exposure, just one observer sits in HET's control room. One person turns things on. And one person swaps instruments multiple times each night, as the telescope switches from observing exoplanets with its Habitable Zone Finder to studying dark energy using its now-poorly-named VIRUS spectrograph. Anyone who doesn't have to be on site now works from home.

"We don't have the world's best observatory site. We're not on Mauna Kea or anything as spectacular," says Janowiecki, the HET's science operations manager. "We don't have any of the expensive adaptive optics. We don't even have a 2-axis telescope. That was [intended as] a massive cost savings."

But, he added, "In this one rare instance, it's a strength."

The supervising astronomer of HET now manages Earth's current largest telescope from a few old computer monitors he found in storage and set up on a foldout card table in his West Texas guest bedroom.

Like the Hobby-Eberly Telescope, the handful of remaining observatories run on skeleton crews or are entirely robotic. And all of the telescope managers interviewed for this story emphasized that even if they're open now, they won't be able to perform repairs if something breaks, making it unclear how long they could continue operating in the current environment.

The 48-inch Zwicky Transient Facility telescope at Palomar Observatory in Southern California. (Credit: Palomar/Caltech)

The Zwicky Transient Facility (ZTF) is a medium-sized, robotic telescope at Palomar Observatory in Southern California that's still producing nightly maps of the northern sky. And, thanks to automation, it remains open.

The so-called discovery engine searches for new supernovas and other momentary events thanks to computers back at Caltech that compare each new map with the old ones. When the software finds something, it triggers an automatic alert to telescopes around the world. Last week, it sent out notifications on multiple potentially new supernovas.
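The map-comparison step described above is, at heart, difference imaging: subtract a reference image from the new frame and flag pixels that brightened significantly. A toy numpy sketch of that idea (the arrays, threshold, and injected source are illustrative, not ZTF's actual pipeline):

```python
import numpy as np

def find_transients(new_image, reference, n_sigma=5.0):
    """Flag pixel positions that brightened significantly vs. the reference.

    Real survey pipelines also align frames, match point-spread functions,
    and reject artifacts; this keeps only the core subtraction step.
    """
    diff = new_image - reference
    threshold = n_sigma * diff.std()
    return np.argwhere(diff > threshold)

rng = np.random.default_rng(0)
reference = rng.normal(100.0, 1.0, size=(64, 64))    # old map: sky + noise
new_image = reference + rng.normal(0.0, 1.0, size=(64, 64))
new_image[40, 12] += 50.0                            # inject a "new supernova"

print(find_transients(new_image, reference))  # [[40 12]]
```

A real system would then cross-match each flagged position against known asteroids and variable stars before sending the kind of automatic alert the article describes.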

Similarly, the telescopes that make up the Catalina Sky Survey, based at Arizona's Mount Lemmon, are still searching the heavens for asteroids. In just the past week, they found more than 50 near-Earth asteroids, none of them dangerous.

Another small group of robotic telescopes, the international Las Cumbres Observatory network, has likewise managed to stay open, albeit with fewer sites than before. In recent weeks, their telescopes have followed up on unexpected astronomical events ranging from asteroids to supernovas.

"We are fortunate to still be keeping an eye on potential new discoveries," says Las Cumbres Observatory director Lisa Storrie-Lombardi.

But, overall, there are just fewer telescopes available to catch and confirm new objects that appear in our night sky, which means fewer discoveries will be made.

Chambers, the Pan-STARRS telescope director, says his team has been forced to do their own follow-ups as they find new asteroids and supernovas. "This will mean we make fewer discoveries, and that we will miss some objects that we would have found in normal times," he says.

NASA's DART spacecraft is scheduled to launch in 2021 on a mission to visit the binary asteroid Didymos. Astronomers need additional observations to help plot the course. (Credit: NASA/JHUAPL)

Astronomer Cristina Thomas of Northern Arizona University studies asteroids. She was the last observer to use the 4.3-meter Lowell Discovery Telescope before it closed March 31 under Arizona's stay-at-home order.

Thomas warns that, in the short term, graduate students could bear the brunt of the lost science. Veteran astronomers typically have a backlog of data just waiting for them to analyze. But Ph.D. students are often starved for data they need to collect in order to graduate on time.

"It's stressing them out in a way that it doesn't for me. We're used to building in a night or so for clouds," Thomas says. "If this goes on for months, this could put [graduate students] pretty far behind."

One of Thomas' students was set to have observations collected for their dissertation by SOFIA, NASA's airborne observatory. But the flying telescope is currently grounded in California, leaving it unclear when the student will be able to complete their research. And even when astronomy picks back up, everyone will be reapplying for telescope time at once.

But the damage isn't only limited to graduate students. An extended period of observatory downtime could also have an impact on Thomas' own research. Later this year, shes scheduled to observe Didymos, a binary asteroid that NASA plans to visit in 2021. Those observations are supposed to help chart the course of the mission.

"The big question for us is: When are we going to be able to observe again?" Thomas says. "If it's a few months, we'll be able to get back to normal. If it ends up being much longer, we're going to start missing major opportunities."

The Keck Observatory telescopes in Hawaii use high-tech adaptive optics equipment that changes their mirrors' shape 1,000 times per second to counter the twinkling caused by Earth's atmosphere. Keck instruments also need to be chilled below freezing to reduce noise. If they warm up, cooling them down can take days or weeks. (Credit: W. M. Keck Observatory/Andrew Richard Hara)

The same qualities that brought observational astronomy to a standstill in the era of social distancing will also make it tough to turn the telescopes back on until the pandemic has completely passed. So, even after the stay-at-home orders lift, some observatories may not find it safe to resume regular operations. They'll have to find new ways to work as a team in tight spaces.

"We are just starting to think about these problems now ourselves," says Caltech Optical Observatories deputy director Andy Boden, who also helps allocate observing time on the Keck Observatory telescopes in Hawaii. "There are aspects of telescope operations that really do put people in shared spaces, and that's going to be a difficult problem to deal with as we come out of our current orders."

Astronomers say they're confident they can find solutions. But it will take time. Tony Beasley, the NRAO director, says his team is already working around a long list of what they're now calling VSDs, or "violation of social distancing" problems. Their workarounds typically involve finding ways to have one person do something that an entire team used to do.

Beasley's research center operates the Green Bank Telescope in West Virginia, as well as the Very Large Array in New Mexico and the global Very Long Baseline Array, all of which are still observing thanks to remote operations and a reimagined workflow.

Although the new workflow is not as efficient as it was in the past, so far there haven't been any problems that couldn't be solved. However, Beasley says some work eventually may require the use of personal protective equipment for people who must work in the same room. And he says they can't ethically use such gear while hospitals are in short supply.

But Beasley and others think interesting and valuable lessons could still come out of the catastrophe.

"There's always been kind of a sense that you had to be in the building, and you've got to stare the other people down in the meeting," he says. "In the space of a month, I think everyone is surprised at how effective they can be remotely. As we get better at this over the next six months or something, I think there will be parts where we won't go back to some of the work processes from before."

Despite best efforts and optimistic outlooks, some things will remain outside astronomers' control.

Right now, researchers are completing the 2020 Astronomy and Astrophysics Decadal Survey, a kind of scientific census. The guiding document sets priorities and recommends where money should be spent over the next 10 years. NASA and Congress take its recommendations to heart when deciding which projects get funded. Until recent weeks, the economy had been strong and astronomers had hoped for a decade of new robotic explorers, larger telescopes, and getting serious about defending Earth from asteroids.

Engineers prep NASA's Mars InSight lander for launch to the Red Planet. It is currently stationed on Mars investigating the planet's deep interior. (Credit: NASA)

"Many of NASA's most important activities, from Mars exploration to studying extrasolar planets to understanding the cosmos, are centuries-long projects, the modern version of the construction of the great medieval cathedrals," Princeton University astrophysicist David Spergel told the website SpaceNews.com last year as the process got underway. "The decadal surveys provide blueprints for constructing these cathedrals, and NASA science has thrived by being guided by these plans."

However, many experts are predicting the COVID-19 pandemic will send the U.S. into a recession; some economists say job losses could rival those seen during the Great Depression.

If that happens, policymakers could cut the funding needed to construct these cathedrals of modern science even after a crisis has us calling on scientists to save society.

Original post:

COVID-19 Forces Earth's Largest Telescopes to Close. But a Few Isolated Astronomers Are Still Watching Over the Cosmos - Discover Magazine

Photos: Venus and the Pleiades prepare to meet | Astronomy Essentials – EarthSky

Submit your photo to EarthSky Community Photos here

Don't miss Venus and the Pleiades! Their conjunction was April 3.

Stefano De Rosa in Turin, Italy, captured Venus and the Pleiades on April 2, 2020.

Clouded out? Gianluca Masi at the Virtual Telescope Project is also gearing up to present the Venus-Pleiades conjunction to you online. He wrote to EarthSky this weekend:

"In the coming week, the sky will offer us something unique that comes back only every 8 years: a stunning conjunction involving the planet Venus, the brightest object up there these evenings, and the wonderful Pleiades, a spectacular star cluster and one of the best gems of the deep sky. To bring some joy from this cosmic show to people worldwide, often quarantined to limit the spread of COVID-19, the Virtual Telescope will share this celestial treasure with everyone, offering a live view covering the climax of this cosmic hug between Venus and the Pleiades."

Visit the Virtual Telescope's site to learn more.

Larry Ilardo caught the Pleiades and Venus from Buffalo, New York, on April 1.

View larger at EarthSky Community Photos. | Pradnya Gharpure caught Venus and the Pleiades on April 1 from Nagpur, India, and wrote: "Dazzling Venus and the pretty cluster Pleiades make a beautiful sight this evening as they draw closer!"

View larger at EarthSky Community Photos. | Kevin Saragozza captured this striking view of Venus and the Pleiades from Siracusa Plemmeiro on April 1. He wrote: "Not having the possibility to catch the alignment together with interesting terrestrial elements because of the COVID-19 quarantine, I positioned myself outside in my garden and preferred a view only of the sky: the Pleiades and Venus aligned in a vertical position."

View larger at EarthSky Community Photos. | Radu Anghel captured many Pleiades stars and a brilliant Venus in this photo from April 1, taken in Bacau, Romania. Radu wrote: "Venus and the Pleiades cluster. Two more days before the 8-year meeting. From isolation, but with a great western view."

Piotr Wieczorek shared this beautiful view of the Pleiades and Venus that he took on March 31. Thank you, Piotr!

View at EarthSky Community Photos. | Marek Nikodem caught these stargazers near Szubin, Poland, looking at the moon, Venus and the Pleiades on March 28, 2020. Thank you, Marek.

The moon, Venus and the Pleiades on March 28, 2020, via Fred Espenak.

View at EarthSky Community Photos. | Dennis Schoenfelder saw this glorious view of Venus, the moon, and the Pleiades from his front door in Alamosa, Colorado, on March 28. Thank you, Dennis!

Astronomer Alessandro Marchini, director of the Astronomical Observatory at the University of Siena in Italy, wrote on March 28, 2020: "Stargazing from my backyard this evening, with the wonderful triangle of the crescent moon, Venus and the Pleiades (1.3 light-seconds, 5.5 light-minutes and 445 light-years from Earth, respectively). Photographed with my Canon camera and a 100 mm lens on a tripod." Thank you, Alessandro! Venus is the bright object next to the moon. The Pleiades is the tiny, dipper-shaped star cluster at the top of the photo.

View at EarthSky Community Photos. | Stephen Thurston captured this view of the moon, Venus and the Pleiades on March 27. He wrote: "Moon and Venus setting over Lake Champlain from Ferrisburgh, VT."

Tom Wildoner of the Dark Side Observatory wrote: "I was lucky on the evening of March 27, 2020, to capture this nice view of the planet Venus approaching the Pleiades star cluster in the constellation Taurus. Think this is close? Wait until the evening of April 3, when the planet Venus will be inside this cluster!" Thank you, Tom!

Bottom line: This week, Venus, the brightest planet and dazzling "evening star" in the west after sunset, will pass the beautiful Pleiades star cluster, also known as the Seven Sisters. We're already getting photos; submit yours here. Look west after sunset!


Supermoon and a meteor shower: 2 astronomical events in April to watch from your backyard – LancasterOnline

With the recent pandemic, stay-at-home orders and business closures, remember one thing: the sky will always be there.

There are two big astronomical events happening in April. Provided the weather is clear and light pollution is low, hopeful viewers should be able to see the cosmos in action this month.

April 8 is when the moon will be at its fullest. This year's April full moon, also known as the full pink moon, will also be a supermoon.

Supermoons happen when the moon is at its closest point to the Earth in its orbit, known as its perigee.

In total, four supermoons will be visible in 2020. Two have already happened: one each in February and March.

After April's supermoon, there will be one more in May, known as the full flower supermoon.

From April 22 to 23, the Lyrids meteor shower will be visible.

It's one of the smaller visible meteor showers, producing around 20 or so visible meteors an hour at its peak.

Meteors are best viewed when the skies are especially dark, so watching from a location with little light pollution will be key to watching the showers later this month.

The Lyrids are the precursor to the more lively Eta Aquarids meteor shower, which will happen from May 6 to 7.

Viewers can expect to see 60 or more visible meteors an hour at its peak.



AI and the coronavirus fight: How artificial intelligence is taking on COVID-19 – ZDNet

As the COVID-19 coronavirus outbreak continues to spread across the globe, companies and researchers are looking to use artificial intelligence as a way of addressing the challenges of the virus. Here are just some of the projects using AI to address the coronavirus outbreak.

Using AI to find drugs that target the virus

A number of research projects are using AI to identify drugs that were developed to fight other diseases but which could now be repurposed to take on coronavirus. By studying the molecular setup of existing drugs with AI, companies want to identify which ones might disrupt the way COVID-19 works.

BenevolentAI, a London-based drug-discovery company, began turning its attentions towards the coronavirus problem in late January. The company's AI-powered knowledge graph can digest large volumes of scientific literature and biomedical research to find links between the genetic and biological properties of diseases and the composition and action of drugs.
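The core idea of such a knowledge graph can be sketched with a toy example. Everything below is a hypothetical illustration, not BenevolentAI's actual system: the edge schema, relation names, and `candidate_drugs_for` helper are invented for the sketch, though the baricitinib-to-endocytosis chain mirrors the finding reported later in this article.

```python
# Toy knowledge graph: edges link drugs to the proteins they inhibit,
# and proteins to the biological processes they regulate. Traversing
# drug -> protein -> process surfaces candidate repurposings.
edges = {
    ("baricitinib", "inhibits"): ["AAK1", "JAK1"],
    ("AAK1", "regulates"): ["viral endocytosis"],
    ("JAK1", "regulates"): ["inflammation"],
    ("viral endocytosis", "involved_in"): ["COVID-19 cell entry"],
}

def processes_of(protein):
    """Biological processes a protein regulates."""
    return edges.get((protein, "regulates"), [])

def candidate_drugs_for(process_keyword):
    """Walk drug -> protein -> process, keeping chains that touch the keyword."""
    hits = []
    for (subject, relation), objects in edges.items():
        if relation != "inhibits":
            continue
        for protein in objects:
            for process in processes_of(protein):
                linked = edges.get((process, "involved_in"), [])
                if any(process_keyword in x for x in [process] + linked):
                    hits.append((subject, protein, process))
    return hits

print(candidate_drugs_for("COVID-19"))
```

A production graph holds millions of machine-read relations rather than four hand-written ones, but the query pattern (follow typed edges from disease biology back to licensed drugs) is the same.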


The company had previously been focused on chronic disease, rather than infections, but was able to retool the system to work on COVID-19 by feeding it the latest research on the virus. "Because of the amount of data that's being produced about COVID-19 and the capabilities we have in being able to machine-read large amounts of documents at scale, we were able to adapt [the knowledge graph] so as to take into account the kinds of concepts that are more important in biology, as well as the latest information about COVID-19 itself," says Olly Oechsle, lead software engineer at BenevolentAI.

While a large body of biomedical research has built up around chronic diseases over decades, COVID-19 only has a few months' worth of studies attached to it. But researchers can use the information that they have to track down other viruses with similar elements, see how they function, and then work out which drugs could be used to inhibit the virus.

"The infection process of COVID-19 was identified relatively early on. It was found that the virus binds to a particular protein on the surface of cells called ACE2. And what we could with do with our knowledge graph is to look at the processes surrounding that entry of the virus and its replication, rather than anything specific in COVID-19 itself. That allows us to look back a lot more at the literature that concerns different coronaviruses, including SARS, etc. and all of the kinds of biology that goes on in that process of viruses being taken in cells," Oechsle says.

The system suggested a number of compounds that could potentially have an effect on COVID-19 including, most promisingly, a drug called Baricitinib. The drug is already licensed to treat rheumatoid arthritis. The properties of Baricitinib mean that it could potentially slow down the process of the virus being taken up into cells and reduce its ability to infect lung cells. More research and human trials will be needed to see whether the drug has the effects AI predicts.

Shedding light on the structure of COVID-19

DeepMind, the AI arm of Google's parent company Alphabet, is using data on genomes to predict organisms' protein structure, potentially shedding light on which drugs could work against COVID-19.

DeepMind has released a deep-learning library called AlphaFold, which uses neural networks to predict how the proteins that make up an organism curve or crinkle, based on their genome. Protein structures determine the shape of receptors in an organism's cells. Once you know what shape the receptor is, it becomes possible to work out which drugs could bind to them and disrupt vital processes within the cells: in the case of COVID-19, disrupting how it binds to human cells or slowing the rate it reproduces, for example.

After training AlphaFold on large genomic datasets, which demonstrate the links between an organism's genome and how its proteins are shaped, DeepMind set AlphaFold to work on COVID-19's genome.

"We emphasise that these structure predictions have not been experimentally verified, but hope they may contribute to the scientific community's interrogation of how the virus functions, and serve as a hypothesis generation platform for future experimental work in developing therapeutics," DeepMind said. Or, to put it another way, DeepMind hasn't tested out AlphaFold's predictions outside of a computer, but it's putting the results out there in case researchers can use them to develop treatments for COVID-19.

Detecting the outbreak and spread of new diseases

Artificial-intelligence systems were thought to be among the first to detect that the coronavirus outbreak, back when it was still localised to the Chinese city of Wuhan, could become a full-on global pandemic.

It's thought that AI-driven HealthMap, which is affiliated with the Boston Children's Hospital,picked up the growing clusterof unexplained pneumonia cases shortly before human researchers, although it only ranked the outbreak's seriousness as 'medium'.

"We identified the earliest signs of the outbreak by mining in Chinese language and local news media -- WeChat, Weibo -- to highlight the fact that you could use these tools to basically uncover what's happening in a population," John Brownstein, professor of Harvard Medical School and chief innovation officer at Boston Children's Hospital, told the Stanford Institute for Human-Centered Artificial Intelligence's COVID-19 and AI virtual conference.

Human epidemiologists at ProMed, an infectious-disease-reporting group, published their own alert just half an hour after HealthMap, and Brownstein also acknowledged the importance of human virologists in studying the spread of the outbreak.

"What we quickly realised was that as much it's easy to scrape the web to create a really detailed line list of cases around the world, you need an army of people, it can't just be done through machine learning and webscraping," he said. HealthMap also drew on the expertise of researchers from universities across the world, using "official and unofficial sources" to feed into theline list.

The data generated by HealthMap has been made public, to be combed through by scientists and researchers looking for links between the disease and certain populations, as well as containment measures. The data has already been combined with data on human movements, gleaned from Baidu, to see how population mobility and control measures affected the spread of the virus in China.

HealthMap has continued to track the spread of coronavirus throughout the outbreak, visualising its spread across the world by time and location.

Spotting signs of a COVID-19 infection in medical images

Canadian startup DarwinAI has developed a neural network that can screen X-rays for signs of COVID-19 infection. While using swabs from patients is the default for testing for coronavirus, analysing chest X-rays could offer an alternative to hospitals that don't have enough staff or testing kits to process all their patients quickly.

DarwinAI released COVID-Net as an open-source system, and "the response has just been overwhelming", says DarwinAI CEO Sheldon Fernandez. More datasets of X-rays were contributed to train the system, which has now learnt from over 17,000 images, while researchers from Indonesia, Turkey, India and other countries are all now working on COVID-19. "Once you put it out there, you have 100 eyes on it very quickly, and they'll very quickly give you some low-hanging fruit on ways to make it better," Fernandez said.

The company is now working on turning COVID-Net from a technical implementation to a system that can be used by healthcare workers. It's also now developing a neural network for risk-stratifying patients that have contracted COVID-19 as a way of separating those with the virus who might be better suited to recovering at home in self-isolation, and those who would be better coming into hospital.

Monitoring how the virus and lockdown is affecting mental health

Johannes Eichstaedt, assistant professor in Stanford University's department of psychology, has been examining Twitter posts to estimate how COVID-19, and the changes that it's brought to the way we live our lives, is affecting our mental health.

Using AI-driven text analysis, Eichstaedt queried over two million tweets hashtagged with COVID-related terms during February and March, and combined it with other datasets on relevant factors including the number of cases, deaths, demographics and more, to illuminate the virus' effects on mental health.

The analysis showed that much of the COVID-19-related chat in urban areas was centred on adapting to living with, and preventing the spread of, the infection. Rural areas discussed adapting far less, which the psychologist attributed to the relative prevalence of the disease in urban areas compared to rural, meaning those in the country have had less exposure to the disease and its consequences.


There are also differences in how the young and old are discussing COVID-19. "In older counties across the US, there's talk about Trump and the economic impact, whereas in young counties, it's much more problem-focused coping; the one language cluster that stands out there is that in counties that are younger, people talk about washing their hands," Eichstaedt said.

"We really need to measure the wellbeing impact of COVID-19, and we very quickly need to think about scalable mental healthcare and now is the time to mobilise resources to make that happen," Eichstaedt told the Stanford virtual conference.

Forecasting how coronavirus cases and deaths will spread across cities and why

Google-owned machine-learning community Kaggle is setting a number of COVID-19-related challenges for its members, including forecasting the number of cases and fatalities by city as a way of identifying exactly why some places are hit worse than others.

"The goal here isn't to build another epidemiological model there are lots of good epidemiological models out there. Actually, the reason we have launched this challenge is to encourage our community to play with the data and try and pick apart the factors that are driving difference in transmission rates across cities," Kaggle's CEO Anthony Goldbloom told the Stanford conference.

Currently, the community is working on a dataset of infections in 163 countries from two months of this year to develop models and interrogate the data for factors that predict spread.

Most of the community's models have been producing feature-importance plots to show which elements may be contributing to the differences in cases and fatalities. So far, said Goldbloom, latitude and longitude are showing up as having a bearing on COVID-19 spread. The next generation of machine-learning-driven feature-importance plots will tease out the real reasons for geographical variances.

"It's not the country that is the reason that transmission rates are different in different countries; rather, it's the policies in that country, or it's the cultural norms around hugging and kissing, or it's the temperature. We expect that as people iterate on their models, they'll bring in more granular datasets and we'll start to see these variable-importance plots becoming much more interesting and starting to pick apart the most important factors driving differences in transmission rates across different cities. This is one to watch," Goldbloom added.


Will product designers survive the AI revolution? – The Next Web


"Our intelligence is what makes us human, and AI is an extension of that quality." – Yann LeCun

The human species has performed incredible feats of ingenuity. We have created beautiful sculptures from a single block of marble, written enchanting sonnets that have stood for centuries and landed a craft on the face of a distant rock orbiting our planet. It is sobering, then, to think that what separates us from our close, albeit far less accomplished cousin, the chimpanzee, is a roughly 4% difference in our genomes.

I propose to you, however, that nature's insatiable thirst for balance has ultimately led us to create a potential rival to our dominance as a species on this planet: Artificial Intelligence. The pertinent question then becomes: what aspects of our infamous ingenuity will AI augment, and perhaps ultimately surpass?

What is AI & Machine Learning?

Essentially, what some really smart people out there are trying to achieve is a computer system that emulates human intelligence: the ability to make decisions that maximize the chance of achieving its goals. Even more important is the system's ability to learn and evolve.

To achieve this, every system needs a starting point: massive amounts of data. For example, in order to train a computer system to tell the difference between a cat and a dog, you would have to feed it thousands of images of cats and dogs.
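The train-then-predict loop behind that example can be shown in miniature. A real cat-vs-dog system trains a convolutional network on thousands of labeled photos; in this sketch the "images" are synthetic 2-D feature vectors and the "model" is a nearest-centroid classifier, purely to illustrate learning from labeled examples.

```python
import numpy as np

# Two labeled clusters stand in for cat photos and dog photos.
rng = np.random.default_rng(42)
cats = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))  # labeled "cat"
dogs = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(200, 2))  # labeled "dog"

# "Training": summarise each class by the mean of its examples.
centroids = {"cat": cats.mean(axis=0), "dog": dogs.mean(axis=0)}

def predict(x):
    """Assign the label whose centroid is closest to the new example."""
    return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))

print(predict(np.array([0.2, -0.1])))  # near the cat cluster -> "cat"
print(predict(np.array([2.8, 3.1])))   # near the dog cluster -> "dog"
```

The more (and more varied) labeled examples the system sees, the better its summary of each class, which is why data volume matters so much in the article's framing.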


What is creativity?

"Creativity is seeing what everyone else saw, and thinking what no one else thought" – Albert Einstein

I've heard many people say a computer system could never be creative, and that to create art, music, or an ad campaign, one needs to feel, have a soul, and have a lifetime of experiences to draw from.

Having spent over a decade in the advertising industry, I can confidently say that the best creatives I have seen were usually the ones with the most exposure. The more you have seen, traveled or experienced, the more creative you tend to be.

Creativity is about challenging the norm, thinking differently, being the square pegs in the round holes, and evoking specific emotions in your audience. So how difficult can that be for AI to achieve? It certainly seems that in today's world, creativity is actually very arbitrary. Why? Because two wildly different pieces [images of two contrasting artworks appeared here] are both considered valuable works of art.

The current state of AI vs Creatives


AI In The Enterprise: Reality Or Myth? – Forbes

Artificial intelligence (AI) is one of the most talked-about new technologies in the business world today.

It's estimated that enterprise AI usage has increased 270% since 2015. This has coincided with a massive spike in investment, with the enterprise AI industry expected to grow to $6.1 billion by 2022.

Along with the technology's very real ability to transform the job market, the hype surrounding it has also given rise to a number of exaggerated myths:

Myth No. 1: More Data Is The Key To AI's Success

While it's true that AI needs data in order to learn and operate efficiently, the idea that more data equals better outcomes is misleading. Not all data is created equal.

If the information fed to an AI program is labeled incorrectly or isn't relevant, it poisons the data pool. More information can make a model's predictions more precise, but if the data itself is of poor quality, the outcome will be precise without necessarily reflecting business reality. This can result in poor decision-making.

The truth is that the data fed to an AI solution needs to be curated and analyzed beforehand. Prioritize quality over quantity.
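That curation step can be as simple as validating records before they ever reach a model. The sketch below is a hypothetical example, not any particular vendor's pipeline: the schema (a `label` field and a `revenue` field) and the rules are invented, and real pipelines add richer checks such as deduplication, outlier detection, and label audits.

```python
# "Quality over quantity": drop records that would poison the data pool.
VALID_LABELS = {"churned", "retained"}

def curate(records):
    """Keep only records that are complete, in-range, and correctly labeled."""
    clean = []
    for r in records:
        if r.get("label") not in VALID_LABELS:
            continue  # mislabeled: teaches the model the wrong thing
        if r.get("revenue") is None or r["revenue"] < 0:
            continue  # incomplete or impossible value
        clean.append(r)
    return clean

raw = [
    {"label": "churned", "revenue": 120.0},
    {"label": "CHRN??", "revenue": 80.0},    # bad label
    {"label": "retained", "revenue": -5.0},  # impossible value
    {"label": "retained", "revenue": None},  # missing field
    {"label": "retained", "revenue": 40.0},
]
print(len(curate(raw)))  # only the trustworthy records survive
```

Note that curation shrinks the training set, which is exactly the trade-off Myth No. 2 below picks up: less raw volume, but each record the model sees can be trusted.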

Myth No. 2: Companies See Immediate Value From AI Investments

The integration of AI into standard operating procedures doesn't happen overnight. As seen in Myth No. 1, the data the AI uses needs to be curated and checked for relevance beforehand. This may significantly reduce the amount of information the AI has access to.

To obtain truly valuable returns, it's essential to continuously provide relevant data. Like humans, AI solutions need to be given time to learn. There may be a significant lag between when an AI-based initiative begins and when you see a return on investment.

Myth No. 3: AI Will Render Humans Obsolete

The purpose of AI is not to replace all human workers. AI is a tool businesses can use to achieve their goals. It can automate mundane processes and pull interesting insights from large data sets. When used correctly, it augments and aids human decision-making. AI provides recommendations based on trends gleaned from mountains of information. It may even pose new questions that have never been considered. A human still needs to weigh the information provided and make a final decision based on risk analysis.

Pointing out these myths in no way indicates that AI won't deliver on its transformational promise. It's easy to forget that enterprise AI adoption is still in its infancy. Even so, a 2018 Deloitte survey reported that 82% of executives said their AI projects had already led to a positive ROI. Those implementing AI projects now will be the case studies of the near future.

While there are sure to be growing pains, being on the cutting edge of this exciting technology should be beneficial. There's little doubt about how important it will be for the businesses of tomorrow. Getting a head start now, ironing out the wrinkles and locking down efficient processes will pay dividends.


How Microsoft Teams will use AI to filter out typing, barking, and other noise from video calls – VentureBeat

Last month, Microsoft announced that Teams, its competitor to Slack, Facebook's Workplace, and Google's Hangouts Chat, had passed 44 million daily active users. The milestone overshadowed its unveiling of a few new features coming later this year. Most were straightforward: a hand-raising feature to indicate you have something to say, offline and low-bandwidth support to read chat messages and write responses even if you have a poor or no internet connection, and an option to pop chats out into a separate window. But one feature, real-time noise suppression, stood out: Microsoft demoed how the AI minimized distracting background noise during a call.

We've all been there. How many times have you asked someone to mute themselves or to relocate from a noisy area? Real-time noise suppression will filter out someone typing on their keyboard while in a meeting, the rustling of a bag of chips (as you can see in the video above), and a vacuum cleaner running in the background. AI will remove the background noise in real time so you can hear only speech on the call. But how exactly does it work? We talked to Robert Aichner, Microsoft Teams group program manager, to find out.

The use of collaboration and video conferencing tools is exploding as the coronavirus crisis forces millions to learn and work from home. Microsoft is pushing Teams as the solution for businesses and consumers as part of its Microsoft 365 subscription suite. The company is leaning on its machine learning expertise to ensure AI features are one of its big differentiators. When it finally arrives, real-time background noise suppression will be a boon for businesses and households full of distracting noises. Additionally, how Microsoft built the feature is also instructive to other companies tapping machine learning.

Of course, noise suppression has existed in Microsoft Teams, Skype, and Skype for Business for years. Other communication tools and video conferencing apps have some form of noise suppression as well. But that noise suppression covers stationary noise, such as a computer fan or air conditioner running in the background. The traditional noise suppression method is to look for speech pauses, estimate the baseline of noise, assume that the continuous background noise doesn't change over time, and filter it out.
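The traditional method described above is commonly implemented as spectral subtraction. The sketch below is illustrative only, not Microsoft's implementation: a pure tone stands in for speech, white noise for the stationary background, and the "speech pause" used to estimate the noise baseline is simulated directly from the noise signal.

```python
import numpy as np

# Synthetic one-second signal: a tone (stand-in for a voice) plus hiss.
rng = np.random.default_rng(1)
sr = 16000
t = np.arange(sr) / sr
speech = 0.5 * np.sin(2 * np.pi * 220 * t)
noise = 0.05 * rng.normal(size=sr)  # stationary background noise
noisy = speech + noise

frame = 512
# Noise baseline: average magnitude spectrum over frames of pure noise,
# standing in for frames captured during a speech pause.
pause = noise[:frame * 4].reshape(4, frame)
noise_mag = np.abs(np.fft.rfft(pause, axis=1)).mean(axis=0)

def suppress(samples):
    """Subtract the noise magnitude estimate, keeping the noisy phase."""
    spec = np.fft.rfft(samples)
    mag = np.maximum(np.abs(spec) - noise_mag, 0.0)  # floor at zero
    return np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n=len(samples))

cleaned = np.concatenate([suppress(noisy[i:i + frame])
                          for i in range(0, sr - frame + 1, frame)])
err_before = np.mean((noisy[:len(cleaned)] - speech[:len(cleaned)]) ** 2)
err_after = np.mean((cleaned - speech[:len(cleaned)]) ** 2)
print(err_after < err_before)  # suppression reduced the residual noise
```

The weakness the article goes on to describe follows directly from the `noise_mag` line: the baseline is fixed, so a dog bark or a slammed door, which was never present during the "pause," sails straight through the subtraction.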

Going forward, Microsoft Teams will suppress non-stationary noises like a dog barking or somebody shutting a door. "That is not stationary," Aichner explained. "You cannot estimate that in speech pauses. What machine learning now allows you to do is to create this big training set, with a lot of representative noises."

In fact, Microsoft open-sourced its training set earlier this year on GitHub to advance the research community in that field. While the first version is publicly available, Microsoft is actively working on extending the data sets. A company spokesperson confirmed that as part of the real-time noise suppression feature, certain categories of noises in the data sets will not be filtered out on calls, including musical instruments, laughter, and singing.

Microsoft can't simply isolate the sound of human voices because other noises also happen at the same frequencies. On a spectrogram of a speech signal, unwanted noise appears both in the gaps between speech and overlapping with the speech. It's thus next to impossible to filter out: if your speech and noise overlap, you can't distinguish the two. Instead, you need to train a neural network beforehand on what noise looks like and what speech looks like.

To get his points across, Aichner compared machine learning models for noise suppression to machine learning models for speech recognition. For speech recognition, you need to record a large corpus of users talking into the microphone and then have humans label that speech data by writing down what was said. Instead of mapping microphone input to written words, in noise suppression you're trying to get from noisy speech to clean speech.

"We train a model to understand the difference between noise and speech, and then the model is trying to just keep the speech," Aichner said. "We have training data sets. We took thousands of diverse speakers and more than 100 noise types. And then what we do is we mix the clean speech without noise with the noise. So we simulate a microphone signal. And then you also give the model the clean speech as the ground truth. So you're asking the model, 'From this noisy data, please extract this clean signal, and this is how it should look.' That's how you train neural networks [in] supervised learning, where you basically have some ground truth."

For speech recognition, the ground truth is what was said into the microphone. For real-time noise suppression, the ground truth is the speech without noise. By feeding a large enough data set (in this case, hundreds of hours of data), Microsoft can effectively train its model. "It's able to generalize and reduce the noise with my voice even though my voice wasn't part of the training data," Aichner said. "In real time, when I speak, there is noise that the model would be able to extract the clean speech [from] and just send that to the remote person."

Comparing the functionality to speech recognition makes noise suppression sound much more achievable, even though it's happening in real time. So why has it not been done before? Can Microsoft's competitors quickly recreate it? Aichner listed challenges for building real-time noise suppression, including finding representative data sets, building and shrinking the model, and leveraging machine learning expertise.

We already touched on the first challenge: representative data sets. The team spent a lot of time figuring out how to produce sound files that exemplify what happens on a typical call.

They used audiobooks to represent male and female voices, since speech characteristics do differ between the two. They used YouTube data sets with labeled data specifying that a recording includes, say, typing or music. Aichner's team then combined the speech data and noise data using a synthesizer script at different signal-to-noise ratios. By amplifying the noise, they could imitate different realistic situations that can happen on a call.
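The synthesizer step described above, mixing clean speech with noise at a chosen signal-to-noise ratio, can be sketched in a few lines. This is a generic SNR-mixing formula, not Microsoft's actual script, and the tone and white noise below are stand-ins for real speech and noise clips.

```python
import numpy as np

def mix_at_snr(clean, noise, snr_db):
    """Return clean + scaled noise so the mix has the requested SNR in dB."""
    clean_power = np.mean(clean ** 2)
    noise_power = np.mean(noise ** 2)
    # Solve for gain g in: clean_power / (g^2 * noise_power) = 10^(snr_db/10)
    g = np.sqrt(clean_power / (noise_power * 10 ** (snr_db / 10)))
    return clean + g * noise

rng = np.random.default_rng(7)
clean = np.sin(2 * np.pi * 300 * np.arange(8000) / 8000)  # stand-in "speech"
noise = rng.normal(size=8000)                             # stand-in "noise"

noisy = mix_at_snr(clean, noise, snr_db=10)
achieved = 10 * np.log10(np.mean(clean ** 2) / np.mean((noisy - clean) ** 2))
print(round(achieved, 1))
```

Sweeping `snr_db` from easy (high) to hard (low) values is how a synthesizer script turns one speech clip and one noise clip into many training pairs, each with the clean clip as ground truth.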

But audiobooks are drastically different from conference calls. Would that not affect the model, and thus the noise suppression?

"That is a good point," Aichner conceded. "Our team did make some recordings as well to make sure that we are not just training on synthetic data we generate ourselves, but that it also works on actual data. But it's definitely harder to get those real recordings."

Aichner's team is not allowed to look at any customer data. Additionally, Microsoft has strict privacy guidelines internally. "I can't just simply say, 'Now I record every meeting.'"

So the team couldn't use Microsoft Teams calls. Even if they could (say, if some Microsoft employees opted in to have their meetings recorded), someone would still have to mark down when exactly distracting noises occurred.

"And so that's why we right now have some smaller-scale effort of making sure that we collect some of these real recordings with a variety of devices and speakers and so on," said Aichner. "What we then do is we make that part of the test set. So we have a test set which we believe is even more representative of real meetings. And then we see, if we use a certain training set, how well does that do on the test set? So ideally, yes, I would love to have a training set which is all Teams recordings and has all types of noises people are listening to. It's just that I can't easily get the same volume of data that I can by grabbing some other open source data set."

I pushed the point once more: How would an opt-in program to record Microsoft employees using Teams impact the feature?

"You could argue that it gets better," Aichner said. "If you have more representative data, it could get even better. So I think that's a good idea, to potentially in the future see if we can improve even further. But I think what we are seeing so far is even with just taking public data, it works really well."

The next challenge is to figure out how to build the neural network, what the model architecture should be, and iterate. The machine learning model went through a lot of tuning. That required a lot of compute. Aichner's team was of course relying on Azure, using many GPUs. Even with all that compute, however, training a large model with a large data set could take multiple days.

"A lot of the machine learning happens in the cloud," Aichner said. "So, for speech recognition for example, you speak into the microphone, that's sent to the cloud. The cloud has huge compute, and then you run these large models to recognize your speech. For us, since it's real-time communication, I need to process every frame. Let's say it's 10- or 20-millisecond frames. I need to process that within that time, so that I can send it immediately to you. I can't send it to the cloud, wait for some noise suppression, and send it back."

For speech recognition, leveraging the cloud may make sense. For real-time noise suppression, it's a nonstarter. Once you have the machine learning model, you then have to shrink it to fit on the client. You need to be able to run it on a typical phone or computer. A machine learning model only for people with high-end machines is useless.

There's another reason why the machine learning model should live on the edge rather than the cloud. Microsoft wants to limit server use. Sometimes, there isn't even a server in the equation to begin with. For one-to-one calls in Microsoft Teams, the call setup goes through a server, but the actual audio and video signal packets are sent directly between the two participants. For group calls or scheduled meetings, there is a server in the picture, but Microsoft minimizes the load on that server. Doing a lot of server processing for each call increases costs, and every additional network hop adds latency. It's more efficient from a cost and latency perspective to do the processing on the edge.

"You want to make sure that you push as much of the compute to the endpoint of the user because there isn't really any cost involved in that. You already have your laptop or your PC or your mobile phone, so now let's do some additional processing. As long as you're not overloading the CPU, that should be fine," Aichner said.

I pointed out there is a cost, especially on devices that aren't plugged in: battery life. "Yeah, battery life, we are obviously paying attention to that too," he said. "We don't want you now to have much lower battery life just because we added some noise suppression. That's definitely another requirement we have when we are shipping. We need to make sure that we are not regressing there."

It's not just regression that the team has to consider, but progression in the future as well. Because we're talking about a machine learning model, the work never ends.

"We are trying to build something which is flexible in the future because we are not going to stop investing in noise suppression after we release the first feature," Aichner said. "We want to make it better and better. Maybe for some noise tests we are not doing as good as we should. We definitely want to have the ability to improve that. The Teams client will be able to download new models and improve the quality over time whenever we think we have something better."

The model itself will clock in at a few megabytes, but it won't affect the size of the client itself. He said, "That's also another requirement we have. When users download the app on the phone or on the desktop or laptop, you want to minimize the download size. You want to help the people get going as fast as possible."

"Adding megabytes to that download just for some model isn't going to fly," Aichner said. "After you install Microsoft Teams, later in the background it will download that model. That's what also allows us to be flexible in the future, that we could do even more, have different models."
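The ship-without-the-model, fetch-in-the-background pattern reduces to a small version check on the client: download when no model is installed yet, or when a newer one is published. A hypothetical sketch; none of these names correspond to Teams' real update mechanism:

```python
# Illustrative background-update decision for a downloadable model.
# local_version is None before the first background fetch completes.

def should_download(local_version, remote_version):
    """Download when no model is installed or a newer one exists."""
    return local_version is None or remote_version > local_version

print(should_download(None, 1))  # True  -> first background fetch
print(should_download(1, 2))     # True  -> improved model available
print(should_download(2, 2))     # False -> already up to date
```

Decoupling the model from the installer is what makes the "better and better" promise in the previous quote cheap to keep: publishing a new model version triggers the fetch without touching the app itself.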

All the above requires one final component: talent.

"You also need to have the machine learning expertise to know what you want to do with that data," Aichner said. "That's why we created this machine learning team in this intelligent communications group. You need experts to know what they should do with that data. What are the right models? Deep learning has a very broad meaning. There are many different types of models you can create. We have several centers around the world in Microsoft Research, and we have a lot of audio experts there too. We are working very closely with them because they have a lot of expertise in this deep learning space."

The data is open source and can be improved upon. A lot of compute is required, but any company can simply leverage a public cloud, including the leaders Amazon Web Services, Microsoft Azure, and Google Cloud. So if another company with a video chat tool had the right machine learners, could they pull this off?

"The answer is probably yes, similar to how several companies are getting speech recognition," Aichner said. "They have a speech recognizer where there's also lots of data involved. There's also lots of expertise needed to build a model. So the large companies are doing that."

Aichner believes Microsoft still has a heavy advantage because of its scale. "I think that the value is the data," he said. "What we want to do in the future is, like what you said, have a program where Microsoft employees can give us more than enough real Teams calls so that we have an even better analysis of what our customers are really doing, what problems they are facing, and customize it more towards that."

Read the rest here:

How Microsoft Teams will use AI to filter out typing, barking, and other noise from video calls - VentureBeat