

Supercomputer – Simple English Wikipedia, the free …

A supercomputer is a computer with great speed and memory. This kind of computer can do jobs faster than any other computer of its generation. They are usually thousands of times faster than ordinary personal computers made at the same time. Supercomputers can do arithmetic jobs very fast, so they are used for weather forecasting, code-breaking, genetic analysis and other jobs that need many calculations. As computers of all classes become more powerful, ordinary computers gain abilities that only supercomputers had in the past, while new supercomputers continue to outclass them.

Electrical engineers make supercomputers that link many thousands of microprocessors.

Supercomputer types include: shared memory, distributed memory and array. Supercomputers with shared memory are built around parallel computing and pipelining concepts. Supercomputers with distributed memory consist of many (roughly 100 to 10,000) nodes. The CRAY series from Cray Research, the VP 2400/40, and the NEC SX-3 are shared memory types. The nCube 3, iPSC/860, AP 1000, NCR 3700, Paragon XP/S, and CM-5 are distributed memory types.
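
As a toy illustration of the distinction (not how these historical machines were actually programmed), the Python sketch below computes the same sum two ways: a shared-memory version in which every worker thread reads one common array, and a distributed-memory version in which each worker process only sees the chunk explicitly sent to it and returns its partial result as a message.

```python
import threading
import multiprocessing as mp

NUMBERS = list(range(1_000_000))

def shared_memory_sum(workers=4):
    """Shared memory: every worker reads the same array and writes into a common result list."""
    partials = [0] * workers
    def work(i):
        chunk = NUMBERS[i::workers]          # all threads see the same list directly
        partials[i] = sum(chunk)
    threads = [threading.Thread(target=work, args=(i,)) for i in range(workers)]
    for t in threads: t.start()
    for t in threads: t.join()
    return sum(partials)

def _node(chunk, queue):
    """Distributed memory: each node only sees the data explicitly sent to it."""
    queue.put(sum(chunk))                    # send the partial result back as a message

def distributed_memory_sum(workers=4):
    queue = mp.Queue()
    chunks = [NUMBERS[i::workers] for i in range(workers)]
    procs = [mp.Process(target=_node, args=(c, queue)) for c in chunks]
    for p in procs: p.start()
    results = [queue.get() for _ in procs]   # collect messages from the nodes
    for p in procs: p.join()
    return sum(results)

if __name__ == "__main__":
    assert shared_memory_sum() == distributed_memory_sum() == sum(NUMBERS)
```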

An array-type computer named ILLIAC IV started working in 1972. Later, the CF-11, CM-2, and the MasPar MP-2 (also an array type) were developed. Supercomputers that present physically separated memory as one shared memory include the T3D, KSR1, and the Tera computer.



TOP500 – Wikipedia

The TOP500 project ranks and details the 500 most powerful non-distributed computer systems in the world. The project was started in 1993 and publishes an updated list of the supercomputers twice a year. The first of these updates always coincides with the International Supercomputing Conference in June, and the second is presented at the ACM/IEEE Supercomputing Conference in November. The project aims to provide a reliable basis for tracking and detecting trends in high-performance computing and bases rankings on HPL,[1] a portable implementation of the high-performance LINPACK benchmark written in Fortran for distributed-memory computers.
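
HPL itself is a distributed-memory code written for whole clusters, but the metric it reports can be illustrated on one machine. The rough single-node sketch below uses NumPy rather than the real benchmark: it times a dense solve of Ax = b and converts the standard operation count of about 2n³/3 + 2n² into a FLOPS figure, which is the same kind of number the TOP500 ranks systems by.

```python
import time
import numpy as np

def linpack_style_gflops(n=2000, seed=0):
    """Time a dense solve A x = b and report a LINPACK-style flop rate.

    HPL runs a blocked LU factorization across many nodes; this single-node
    NumPy stand-in only illustrates the metric: solve a dense system, count
    roughly 2/3*n^3 + 2*n^2 operations, and divide by the wall-clock time.
    """
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    start = time.perf_counter()
    x = np.linalg.solve(a, b)                  # LU factorization + triangular solves
    elapsed = time.perf_counter() - start
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    residual = np.linalg.norm(a @ x - b) / np.linalg.norm(b)   # sanity check
    return flops / elapsed / 1e9, residual

if __name__ == "__main__":
    rate, res = linpack_style_gflops()
    print(f"~{rate:.1f} GFLOPS on this machine (residual {res:.1e})")
```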

China currently dominates the list with 206 supercomputers, leading the second place (United States) by a record margin of 82.[2] In the most recent list (June 2018), the American Summit is the world’s most powerful supercomputer, reaching 122.3 petaFLOPS on the LINPACK benchmarks.

The TOP500 list is compiled by Jack Dongarra of the University of Tennessee, Knoxville, Erich Strohmaier and Horst Simon of the National Energy Research Scientific Computing Center (NERSC) and Lawrence Berkeley National Laboratory (LBNL), and from 1993 until his death in 2014, Hans Meuer of the University of Mannheim, Germany.

[Chart omitted: combined performance of the 500 largest supercomputers, the fastest supercomputer, and the supercomputer in 500th place, over time.]

In the early 1990s, a new definition of supercomputer was needed to produce meaningful statistics. After experimenting with metrics based on processor count in 1992, the idea arose at the University of Mannheim to use a detailed listing of installed systems as the basis. In early 1993, Jack Dongarra was persuaded to join the project with his LINPACK benchmarks. A first test version was produced in May 1993, partly based on data available on the Internet, including the following sources:[3][4]

The information from those sources was used for the first two lists. Since June 1993, the TOP500 has been produced twice a year based on site and vendor submissions only.

Since 1993, performance of the No. 1 ranked position has grown steadily in accordance with Moore's law, doubling roughly every 14 months. As of June 2018, Summit was fastest, with an Rpeak[7] of 187.6593 PFLOPS. For comparison, this is over 1,432,513 times faster than the Connection Machine CM-5/1024 (1,024 cores), which was the fastest system in November 1993 (twenty-five years prior) with an Rpeak of 131.0 GFLOPS.[8]
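
Those figures are easy to check. The short calculation below reproduces the quoted speedup from the two Rpeak values and converts the roughly 25-year gap into a doubling time of about 14 to 15 months, consistent with the Moore's-law trend described above.

```python
import math

# Figures quoted above: Summit's Rpeak (June 2018) versus the CM-5/1024's Rpeak (November 1993).
summit_flops = 187.6593e15          # 187.6593 PFLOPS
cm5_flops = 131.0e9                 # 131.0 GFLOPS
years_between = 2018.45 - 1993.87   # June 2018 list vs. November 1993 list, ~24.6 years

speedup = summit_flops / cm5_flops
doublings = math.log2(speedup)
months_per_doubling = years_between * 12 / doublings

print(f"speedup: {speedup:,.0f}x")                          # ~1,432,514x
print(f"doubling time: ~{months_per_doubling:.1f} months")  # ~14.4 months
```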

In June 2016, a Chinese computer made the top based on SW26010 processors, a new, radically modified, model in the Sunway (or ShenWei) line.

As of June 2018, TOP500 supercomputers are all 64-bit, mostly based on x86-64 CPUs (Intel's EM64T and AMD's AMD64 implementations of the instruction set architecture), with few exceptions, all based on reduced instruction set computing (RISC) architectures. These include 15 supercomputers based on the Power Architecture used by IBM POWER microprocessors and six based on SPARC (all with Fujitsu-designed SPARC chips, one of which, the K computer, surprisingly made the top in 2011 without a GPU; it is currently ranked 16th, while dropping only to 3rd on the HPCG list[9]). Two seemingly related Chinese designs make up the remainder: the ShenWei-based system (ranked 11th in 2011, 158th in November 2016) and the Sunway SW26010-based system ranked 1st in 2016. Another non-US design is the PEZY-SC, though it is an accelerator paired with Intel's Xeon. Before the ascendance of 32-bit x86 and later 64-bit x86-64 in the early 2000s, a variety of RISC processor families, including SPARC, MIPS, PA-RISC, and Alpha, made up most TOP500 supercomputers.

In recent years heterogeneous computing, mostly using Nvidia’s graphics processing units (GPU) as coprocessors, has become a popular way to reach a better performance per watt ratio and higher absolute performance; it is almost required for good performance and to make the top (or top 10), with some exceptions, such as the mentioned SPARC computer without any coprocessors. An x86-based coprocessor, Xeon Phi, has also been used.

All the fastest supercomputers in the decade since the Earth Simulator have used operating systems based on Linux. Since November 2017, all the listed supercomputers (100% of the performance share) use an operating system based on the Linux kernel.[10][11]

Since November 2015, no computer on the list runs Windows. In November 2014, Windows Azure[12] cloud computer was no longer on the list of fastest supercomputers (its best rank was 165 in 2012), leaving the Shanghai Supercomputer Center’s Magic Cube as the only Windows-based supercomputer on the list, until it also dropped off the list. It had been ranked 436 in its last appearance on the list released in June 2015, while its best rank was 11 in 2008.[13]

Well over a decade has passed since MIPS-based systems (that is, systems using MIPS as the host CPUs) dropped entirely off the list,[14] but the Gyoukou supercomputer, which jumped to 4th place in November 2017 after a major upgrade, uses MIPS as a small part of its coprocessors. The use of 2,048-core coprocessors (each accompanied by eight 6-core MIPS processors, so that they "no longer require to rely on an external Intel Xeon E5 host processor"[15]) makes the supercomputer much more energy efficient than the rest of the top 10 (it is 5th on the Green500, and other ZettaScaler-2.2-based systems take the first three spots).[16] At 19.86 million cores, it is by far the biggest system, with almost double the core count of the Chinese Sunway TaihuLight, the next-largest manycore system in the TOP500, currently ranked 2nd.

[Table omitted: number of TOP500 computers in each listed country, by number of systems, as of June 2018.[27] All TOP500 systems run an operating system based on Linux; entries labelled simply "Linux" refer to generic Linux.]

In November 2014, it was announced that the United States was developing two new supercomputers to exceed China's Tianhe-2 and take its place as the world's fastest supercomputer. The two computers, Sierra and Summit, will each exceed Tianhe-2's 55 peak petaflops; Summit, the more powerful of the two, will deliver 150 to 300 peak petaflops.[28] On 10 April 2015, US government agencies banned the sale of Nvidia chips to supercomputing centers in China as "acting contrary to the national security... interests of the United States",[29] and barred Intel Corporation from providing Xeon chips to China due to their use, according to the US, in nuclear weapons research, to which US export control law bans US companies from contributing: "The Department of Commerce refused, saying it was concerned about nuclear research being done with the machine."[30]

On 29 July 2015, President Obama signed an executive order creating a National Strategic Computing Initiative calling for the accelerated development of an exascale (1000 petaflop) system and funding research into post-semiconductor computing.[31]

In June 2016, Japanese firm Fujitsu announced at the International Supercomputing Conference that its future exascale supercomputer will feature processors of its own design that implement the ARMv8 architecture. The Flagship2020 program, run by Fujitsu for RIKEN, plans to break the exaflops barrier by 2020 (though "it looks like China and France have a chance to do so and that the United States is content for the moment at least to wait until 2023 to break through the exaflops barrier."[32]) These processors will also implement extensions to the ARMv8 architecture equivalent to HPC-ACE2 that Fujitsu is developing with ARM Holdings.[32]

Inspur, based in Jinan, China, has been one of the largest HPC system manufacturers. As of May 2017, Inspur had become the third manufacturer to build a 64-way system, a feat previously achieved only by IBM and HP. The company has registered over $10B in revenue and has provided a number of HPC systems to countries outside China, such as Sudan, Zimbabwe, Saudi Arabia, and Venezuela. Inspur was also a major technology partner behind both of the Chinese supercomputers, Tianhe-2 and the Sunway TaihuLight, that held the top two positions of the TOP500 list up to November 2017. In May 2017, Inspur and Supermicro released several platforms aimed at GPU-based HPC, such as the SR-AI and AGX-2.[33]

Some major systems are not on the list. The largest example is NCSA's Blue Waters, whose operators publicly announced the decision not to participate in the list[34] because they do not feel it accurately indicates the ability of any system to do useful work.[35] Other organizations decide not to list systems for security or commercial-competitiveness reasons. Purpose-built machines that cannot run or do not run the benchmark, such as the RIKEN MDGRAPE-3 and MDGRAPE-4, are also not included.

IBM Roadrunner[36] is no longer on the list (nor is any other using the Cell coprocessor, or PowerXCell).

Although Itanium-based systems reached second rank in 2004,[37][38] none now remain.

Similarly, (non-SIMD-style) vector processors (NEC-based systems such as the Earth Simulator, which was fastest in 2002[39]) have also fallen off the list, as have the Sun Starfire computers that occupied many spots in the past.

The last non-Linux computers on the list, the two AIX systems running on POWER7 (ranked 494th and 495th in July 2017,[40] originally 86th and 85th), dropped off the list in November 2017.


Home | Alabama Supercomputer Authority

The Alabama Supercomputer Authority (ASA) is a state-funded corporation founded in 1989 for the purpose of planning, acquiring, developing, administering and operating a statewide supercomputer and related telecommunication systems.

In addition to High Performance Computing, and with the growth of the internet, ASA developed the Alabama Research and Education Network (AREN), which offers education and research clients in Alabama internet access and other related network services. ASA has further expanded its offerings with state-of-the-art application development services that include custom website design with content management system (CMS) development and custom web-based applications for data-mining, reporting, and other client needs.


What is supercomputer? – Definition from WhatIs.com

A supercomputer is a computer that performs at or near the currently highest operational rate for computers. Traditionally, supercomputers have been used for scientific and engineering applications that must handle very large databases or do a great amount of computation (or both). Although advances like multi-core processors and GPGPUs (general-purpose graphics processing units) have enabled powerful machines for personal use (see: desktop supercomputer, GPU supercomputer), by definition, a supercomputer is exceptional in terms of performance.

At any given time, there are a few well-publicized supercomputers that operate at extremely high speeds relative to all other computers. The term is also sometimes applied to far slower (but still impressively fast) computers. The largest, most powerful supercomputers are really multiple computers that perform parallel processing. In general, there are two parallel processing approaches: symmetric multiprocessing (SMP) and massively parallel processing (MPP).

As of June 2016, the fastest supercomputer in the world was the Sunway TaihuLight, in the city of Wuxi in China.

The first commercially successful supercomputer, the CDC (Control Data Corporation) 6600, was designed by Seymour Cray. Released in 1964, the CDC 6600 had a single CPU and cost $8 million, the equivalent of roughly $60 million today. It could handle three million floating point operations per second (flops).

Cray went on to found a supercomputer company under his name in 1972. Although the company has changed hands a number of times, it is still in operation. In September 2008, Cray and Microsoft launched the CX1, a $25,000 personal supercomputer aimed at markets such as aerospace, automotive, academic, financial services and life sciences.

IBM has been a keen competitor. The company's Roadrunner, once the top-ranked supercomputer, was twice as fast as IBM's Blue Gene and six times as fast as any other supercomputer at the time. IBM's Watson is famous for having used cognitive computing to beat champion Ken Jennings on Jeopardy!, a popular quiz show.

Year | Supercomputer | Peak speed (Rmax) | Location
2016 | Sunway TaihuLight | 93.01 PFLOPS | Wuxi, China
2013 | NUDT Tianhe-2 | 33.86 PFLOPS | Guangzhou, China
2012 | Cray Titan | 17.59 PFLOPS | Oak Ridge, U.S.
2012 | IBM Sequoia | 17.17 PFLOPS | Livermore, U.S.
2011 | Fujitsu K computer | 10.51 PFLOPS | Kobe, Japan
2010 | Tianhe-IA | 2.566 PFLOPS | Tianjin, China
2009 | Cray Jaguar | 1.759 PFLOPS | Oak Ridge, U.S.
2008 | IBM Roadrunner | 1.026 PFLOPS (1.105 PFLOPS after a later upgrade) | Los Alamos, U.S.

In the United States, some supercomputer centers are interconnected on an Internet backbone known as vBNS or NSFNet. This network is the foundation for an evolving network infrastructure known as the National Technology Grid. Internet2 is a university-led project that is part of this initiative.

At the lower end of supercomputing, clustering takes more of a build-it-yourself approach to supercomputing. The Beowulf Project offers guidance on how to put together a number of off-the-shelf personal computer processors, using Linux operating systems, and interconnecting the processors with Fast Ethernet. Applications must be written to manage the parallel processing.
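
As a concrete illustration of such parallel-aware code, the sketch below estimates pi across the processes of a cluster using MPI. It uses the mpi4py bindings purely for brevity (the classic Beowulf stack used C with MPI or PVM), and the filename in the run command is just an example.

```python
# Minimal message-passing sketch for a Beowulf-style cluster, using mpi4py.
# Run with, e.g.:  mpirun -np 4 python pi_mpi.py   (filename is illustrative)
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()      # this process's id (0..size-1), one per node/core
size = comm.Get_size()

# Each rank integrates its own slice of 4/(1+x^2) over [0, 1] to estimate pi.
n = 10_000_000
local_sum = 0.0
for i in range(rank, n, size):
    x = (i + 0.5) / n
    local_sum += 4.0 / (1.0 + x * x)

# Explicit communication: partial results travel over the interconnect to rank 0.
pi = comm.reduce(local_sum / n, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi ~= {pi:.8f} using {size} processes")
```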


What is Supercomputer? Webopedia Definition


By Vangie Beal

The fastest type of computer. Supercomputers are very expensive and are employed for specialized applications that require immense amounts of mathematical calculations. For example, weather forecasting requires a supercomputer. Other uses of supercomputers include animated graphics, fluid dynamic calculations, nuclear energy research, and petroleum exploration.

The chief difference between a supercomputer and a mainframe is that a supercomputer channels all its power into executing a few programs as fast as possible, whereas a mainframe uses its power to execute many programs concurrently.



Scientists Say New Material Could Hold up an Actual Space Elevator

Space Elevator

It takes a lot of energy to put stuff in space. That’s why one longtime futurist dream is a “space elevator” — a long cable strung between a geostationary satellite and the Earth that astronauts could use like a dumbwaiter to haul stuff up into orbit.

The problem is that such a system would require an extraordinarily light, strong cable. Now, researchers from Beijing’s Tsinghua University say they’ve developed a carbon nanotube fiber so sturdy and lightweight that it could be used to build an actual space elevator.

Going Up

The researchers published their paper in May, but it’s now garnering the attention of their peers. Some believe the Tsinghua team’s material really could lead to the creation of an elevator that would make it cheaper to move astronauts and materials into space.

“This is a breakthrough,” colleague Wang Changqing, who studies space elevators at Northwestern Polytechnical University, told the South China Morning Post.

Huge If True

There are still countless galling technical problems that need to be overcome before a space elevator would start to look plausible. Wang pointed out that it’d require tens of thousands of kilometers of the new material, for instance, as well as a shield to protect it from space debris.

But the research brings us one step closer to what could be a true game changer: a vastly less expensive way to move people and spacecraft out of Earth’s gravity.

READ MORE: China Has Strongest Fibre That Can Haul 160 Elephants – and a Space Elevator? [South China Morning Post]

More on space elevators: Why Space Elevators Could Be the Future of Space Travel


An AI Conference Refusing a Name Change Highlights a Tech Industry Problem

Name Game

There’s a prominent artificial intelligence conference that goes by the suggestive acronym NIPS, which stands for “Neural Information Processing Systems.”

After receiving complaints that the acronym was alienating to women, the conference’s leadership collected suggestions for a new name via an online poll, according to WIRED. But the conference announced Monday that it would be sticking with NIPS all the same.

Knock It Off

It’s convenient to imagine that this acronym just sort of emerged by coincidence, but let’s not indulge in that particular fantasy.

It’s more likely that tech geeks cackled maniacally when they came up with the acronym, and the refusal to do better even when people looking up the conference in good faith are bombarded with porn is a particularly telling failure of the AI research community.

Small Things Matter

This problem goes far beyond a silly name — women are severely underrepresented in technology research and even more so when it comes to artificial intelligence. And if human decency — comforting those who are regularly alienated by the powers that be — isn’t enough of a reason to challenge the sexist culture embedded in tech research, just think about what we miss out on.

True progress in artificial intelligence cannot happen without a broad range of diverse voices — voices that are silenced by “locker room talk” among an old boy’s club. Otherwise, our technological development will become just as stuck in place as our cultural development often seems to be.

READ MORE: AI RESEARCHERS FIGHT OVER FOUR LETTERS: NIPS [WIRED]

More on Silicon Valley sexism: The Tech Industry’s Gender Problem Isn’t Just Hurting Women


Scientists Are Hopeful AI Could Help Predict Earthquakes

Quake Rate

Earlier this year, I interviewed U.S. Geological Survey geologist Annemarie Baltay for a story about why it’s incredibly difficult to predict earthquakes.

“We don’t use that ‘p word’ — ‘predict’ — at all,” she told me. “Earthquakes are chaotic. We don’t know when or where they’ll occur.”

Neural Earthwork

That could finally be starting to change, according to a fascinating feature in The New York Times.

By feeding seismic data into a neural network — a type of artificial intelligence that learns to recognize patterns by scrutinizing examples — researchers say they can now predict moments after a quake strikes how far its aftershocks will travel.

And eventually, some believe, they’ll be able to listen to signals from fault lines and predict when an earthquake will strike in the first place.
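
To give a feel for that workflow, here is a much smaller stand-in using synthetic features and scikit-learn's MLPClassifier. It only illustrates the general "learn patterns from labeled examples" approach described above, not the study's actual model, features, or data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy stand-in for aftershock forecasting: each row is a grid cell near a
# mainshock, described by a few made-up features, and the label says whether
# an aftershock occurred there. The real study used features computed from
# actual earthquake catalogs, not random numbers.
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 6))                 # 6 synthetic features per cell
logits = 1.5 * X[:, 0] - 0.8 * X[:, 3] + 0.3 * rng.standard_normal(5000)
y = (logits > 0).astype(int)                       # 1 = aftershock in this cell

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```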

Future Vision

But like Baltay, some researchers aren't convinced we'll ever be able to predict earthquakes. University of Tokyo seismologist Robert Geller told the Times that until an algorithm actually predicts an upcoming quake, he'll remain skeptical.

“There are no shortcuts,” he said. “If you cannot predict the future, then your hypothesis is wrong.”

READ MORE: A.I. Is Helping Scientist Predict When and Where the Next Big Earthquake Will Be [The New York Times]

More on earthquake AI: A New AI Detected 17 Times More Earthquakes Than Traditional Methods


A Stem Cell Transplant Let a Wheelchair-Bound Man Dance Again

Stand Up Guy

For 10 years, Roy Palmer had no feeling in his lower extremities. Two days after receiving a stem cell transplant, he cried tears of joy because he could feel a cramp in his leg.

The technical term for the procedure the British man underwent is hematopoietic stem cell transplantation (HSCT). And while risky, it’s offering new hope to people like Palmer, who found himself wheelchair-bound after multiple sclerosis (MS) caused his immune system to attack his nerves’ protective coverings.

Biological Reboot

Ever hear the IT troubleshooting go-to of turning a system off and on again to fix it? The HSCT process is similar, but instead of a computer, doctors attempt to reboot a patient’s immune system.

To do this, they first remove stem cells from the patient’s body. Then the patient undergoes chemotherapy, which kills the rest of their immune system. After that, the doctors use the extracted stem cells to reboot the patient’s immune system.

It took just two days for the treatment to restore some of the feeling in Palmer’s legs. Eventually, he was able to walk on his own and even dance. He told the BBC in a recent interview that he now feels like he has a second chance at life.

“We went on holiday, not so long ago, to Turkey. I walked on the beach,” said Palmer. “Little things like that, people do not realize what it means to me.”

Risk / Reward

Still, HSCT isn’t some miracle cure for MS. Though it worked for Palmer, that’s not always the case, and HSCT can also cause infections and infertility. The National MS Society still considers HSCT to be an experimental treatment, and the Food and Drug Administration has yet to approve the therapy in the U.S.

However, MS affects more than 2.3 million people, and if a stem cell transplant can help even some of those folks the way it helped Palmer, it’s a therapy worth exploring.

READ MORE: Walking Again After Ten Years With MS [BBC]

More on HCST: New Breakthrough Treatment Could “Reverse Disability” for MS Patients


AI Dreamed Up These Nightmare Fuel Halloween Masks

Nightmare Fuel

Someone programmed an AI to dream up Halloween masks, and the results are absolute nightmare fuel. Seriously, just look at some of these things.

“What’s so scary or unsettling about it is that it’s not so detailed that it shows you everything,” said Matt Reed, the creator of the masks, in an interview with New Scientist. “It leaves just enough open for your imagination to connect the dots.”

A selection of masks featured on Reed's Twitter. Credit: Matt Reed/Twitter

Creative Horror

To create the masks, Reed — whose day job is as a technologist at a creative agency called redpepper — fed an open source AI tool 5,000 pictures of Halloween masks he sourced from Google Images. He then instructed the tool to generate its own masks.
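
The article doesn't name the tool Reed used, but a generative adversarial network (GAN) is the usual approach to this kind of "learn from a pile of examples, then generate new ones" task. The deliberately tiny PyTorch training loop below is a generic GAN sketch on flattened 64x64 images, an assumption about the method rather than a description of Reed's actual pipeline.

```python
import torch
import torch.nn as nn

def train_mask_gan(real_images, epochs=50, batch_size=64, z_dim=100, device="cpu"):
    """Train a tiny GAN on a tensor of images shaped (N, 1, 64, 64), scaled to [-1, 1].

    Generic sketch only; in practice the ~5,000 scraped mask photos would be
    loaded and preprocessed into `real_images`.
    """
    G = nn.Sequential(                       # generator: noise -> flattened 64x64 image
        nn.Linear(z_dim, 256), nn.ReLU(),
        nn.Linear(256, 64 * 64), nn.Tanh(),
    ).to(device)
    D = nn.Sequential(                       # discriminator: image -> real/fake logit
        nn.Linear(64 * 64, 256), nn.LeakyReLU(0.2),
        nn.Linear(256, 1),
    ).to(device)
    loss = nn.BCEWithLogitsLoss()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    data = real_images.view(real_images.size(0), -1).to(device)

    for _ in range(epochs):
        for i in range(0, data.size(0), batch_size):
            real = data[i:i + batch_size]
            n = real.size(0)
            # Discriminator step: push real images toward 1, generated ones toward 0.
            fake = G(torch.randn(n, z_dim, device=device)).detach()
            d_loss = (loss(D(real), torch.ones(n, 1, device=device))
                      + loss(D(fake), torch.zeros(n, 1, device=device)))
            opt_d.zero_grad(); d_loss.backward(); opt_d.step()
            # Generator step: try to make the discriminator output 1 on generated images.
            fake = G(torch.randn(n, z_dim, device=device))
            g_loss = loss(D(fake), torch.ones(n, 1, device=device))
            opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return G   # sample new "masks" with G(torch.randn(k, z_dim)).view(k, 1, 64, 64)
```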

The fun and spooky project is yet another sign that AI is coming into its own as a creative tool. Just yesterday, a portrait generated by a similar system fetched more than $400,000 at a prominent British auction house.

And Reed’s masks are evocative. Here at the Byte, if we looked through the peephole and saw one of these on a trick or treater, we might not open our door.

READ MORE: AI Designed These Halloween Masks and They Are Absolutely Terrifying [New Scientist]

More on AI-generated art: Generated Art Will Go on Sale Alongside Human-Made Works This Fall


Robot Security Guards Will Constantly Nag Spectators at the Tokyo Olympics

Over and Over

“The security robot is patrolling. Ding-ding. Ding-ding. The security robot is patrolling. Ding-ding. Ding-ding.”

That’s what Olympic attendees will hear ad nauseam when they step onto the platforms of Tokyo’s train stations in 2020. The source: Perseusbot, a robot security guard Japanese developers unveiled to the press on Thursday.

Observe and Report

According to reporting by Kyodo News, the purpose of the AI-powered Perseusbot is to lower the burden on the stations’ staff when visitors flood Tokyo during the 2020 Olympics.

The robot is roughly 5.5 feet tall and equipped with security cameras that allow it to note suspicious behaviors, such as signs of violence breaking out or unattended packages, as it autonomously patrols the area. It can then alert security staff to the issues by sending notifications directly to their smartphones.

Prior Preparation

Just like the athletes who will head to Tokyo in 2020, Perseusbot already has a training program in the works — it’ll patrol Tokyo’s Seibu Shinjuku Station from November 26 to 30. This dry run should give the bot’s developers a chance to work out any kinks before 2020.

If all goes as hoped, the bot will be ready to annoy attendees with its incessant chant before the Olympic torch is lit. And, you know, keep everyone safe, too.

READ MORE: Robot Station Security Guard Unveiled Ahead of 2020 Tokyo Olympics [Kyodo News]

More robot security guards: Robot Security Guards Are Just the Beginning


People Would Rather a Self-Driving Car Kill a Criminal Than a Dog

Snap Decisions

On first glance, a site that collects people’s opinions about whose life an autonomous car should favor doesn’t tell us anything we didn’t already know. But look closer, and you’ll catch a glimpse of humanity’s dark side.

The Moral Machine is an online survey designed by MIT researchers to gauge how the public would want an autonomous car to behave in a scenario in which someone has to die. It asks questions like: “If an autonomous car has to choose between killing a man or a woman, who should it kill? What if the woman is elderly but the man is young?”

Essentially, it’s a 21st century update on the Trolley Problem, an ethical thought experiment no doubt permanently etched into the mind of anyone who’s seen the second season of “The Good Place.”

Ethical Dilemma

The MIT team launched the Moral Machine in 2016, and more than two million people from 233 countries participated in the survey — quite a significant sample size.

On Wednesday, the researchers published the results of the experiment in the journal Nature, and they really aren’t all that surprising: Respondents value the life of a baby over all others, with a female child, male child, and pregnant woman following closely behind. Yawn.

It’s when you look at the other end of the spectrum — the characters survey respondents were least likely to “save” — that you’ll see something startling: Survey respondents would rather the autonomous car kill a human criminal than a dog.

Image: Moral Machine survey results. Credit: MIT

Ugly Reflection

While the team designed the survey to help shape the future of autonomous vehicles, it’s hard not to focus on this troubling valuing of a dog’s life over that of any human, criminal or not. Does this tell us something important about how society views the criminal class? Reveal that we’re all monsters when hidden behind the internet’s cloak of anonymity? Confirm that we really like dogs?

The MIT team doesn’t address any of these questions in their paper, and really, we wouldn’t expect them to — it’s their job to report the survey results, not extrapolate some deeper meaning from them. But whether the Moral Machine informs the future of autonomous vehicles or not, it’s certainly held up a mirror to humanity’s values, and we do not like the reflection we see.

READ MORE: Driverless Cars Should Spare Young People Over Old in Unavoidable Accidents, Massive Survey Finds [Motherboard]

More on the Moral Machine: MIT’s “Moral Machine” Lets You Decide Who Lives & Dies in Self-Driving Car Crashes


Report Identifies China as the Source of Ozone-Destroying Emissions

Emissions Enigma

For years, a mystery puzzled environmental scientists. The world had banned the use of many ozone-depleting compounds in 2010. So why were global emission levels still so high?

The picture started to clear up in June. That’s when The New York Times published an investigation into the issue. China, the paper claimed, was to blame for these mystery emissions. Now it turns out the paper was probably right to point a finger.

Accident or Incident

In a paper published recently in the journal Geophysical Research Letters, an international team of researchers confirms that eastern China is the source of at least half of the 40,000 tonnes of carbon tetrachloride emissions currently entering the atmosphere each year.

They figured this out using a combination of ground-based and airborne atmospheric concentration data from near the Korean peninsula. They also relied on two models that simulated how the gases would move through the atmosphere.
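
The general shape of this kind of source attribution can be shown with a toy linear inversion. This is an assumption about the method's outline, not the study's actual models or data: a transport model supplies the sensitivity of each monitoring site to emissions from each candidate region, and non-negative least squares then fits the observed concentrations to estimate where the gas is coming from.

```python
import numpy as np
from scipy.optimize import nnls

# Toy "inverse modelling" sketch (not the study's models or measurements).
# sensitivity[i, j] = simulated concentration at monitoring site i per unit of
# emission from candidate region j, as a transport model would provide.
rng = np.random.default_rng(1)
n_sites, n_regions = 40, 5
sensitivity = rng.uniform(0.0, 1.0, size=(n_sites, n_regions))

true_emissions = np.array([0.0, 0.0, 25.0, 0.0, 3.0])   # only regions 2 and 4 emit
observed = sensitivity @ true_emissions + rng.normal(0, 0.5, n_sites)  # noisy data

estimated, _ = nnls(sensitivity, observed)   # emissions can't be negative
print("estimated emissions by region:", np.round(estimated, 1))
```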

Though they were able to narrow down the source to China, the researchers weren’t able to say exactly who’s breaking the ban and whether they even know about the damage they’re doing.

Pinpoint

“Our work shows the location of carbon tetrachloride emissions,” said co-author Matt Rigby in a press release. “However, we don’t yet know the processes or industries that are responsible. This is important because we don’t know if it is being produced intentionally or inadvertently.”

If we can pinpoint the source of these emissions, we can start working on stopping them and healing our ozone. And given that we’ve gone nearly a decade with minimal progress on that front, there’s really no time to waste.

READ MORE: Location of Large ‘Mystery’ Source of Banned Ozone Depleting Substance Uncovered [University of Bristol]

More on carbon emissions: China Has (Probably) Been Pumping a Banned Gas Into the Atmosphere


Ford’s Self-Driving Cars Are About to Chauffeur Your Senator

Green-Light District

It doesn’t matter how advanced our self-driving cars get — if they aren’t allowed on roads, they aren’t going to save any lives.

The future of autonomous vehicles (AVs) in the U.S. depends on how lawmakers in Washington D.C. choose to regulate the vehicles. But until now, AV testing has largely taken place far from the nation’s capital, mostly in California and Arizona.

Ford is about to change that. The company just announced plans to be the first automaker to test its self-driving cars in the District of Columbia — and how lawmakers feel about those vehicles could influence future AV legislation.

Career Day

Sherif Marakby, CEO of Ford Autonomous Vehicles, announced the decision to begin testing in D.C. via a blog post last week. According to Marakby, Ford’s politician-friendly focus will be on figuring out how its AVs could promote job creation in the District.

To that end, Ford plans to assess how AVs could increase mobility in D.C., thereby helping residents get to jobs that might otherwise be outside their reach, as well as train residents for future positions as AV technicians or operators.

Up Close and Personal

Marakby notes that D.C. is a particularly suitable location for this testing because the District is usually bustling with activity. The population increases significantly during the day as commuters arrive from the suburbs for work, while millions of people flock to D.C. each year for conferences or tourism.

D.C. is also home to the people responsible for crafting and passing AV legislation. “[I]t’s important that lawmakers see self-driving vehicles with their own eyes as we keep pushing for legislation that governs their safe use across the country,” Marakby wrote.

Ford’s ultimate goal is to launch a commercial AV service in D.C. in 2021. With this testing, the company has the opportunity to directly influence the people who could help it reach that goal — or oppose it.

READ MORE: A Monumental Moment: Our Self-Driving Business Development Expands to Washington, D.C. [Medium]

More on AV legislation: U.S. Senators Reveal the Six Principles They’ll Use to Regulate Self-Driving Vehicles


This AI Lie Detector Flags Falsified Police Reports

Minority Report

Imagine this: You file a police report, but back at the station, they feed it into an algorithm — and it accuses you of lying, as though it had somehow looked inside your brain.

That might sound like science fiction, but Spain is currently rolling out a very similar program, called VeriPol, in many of its police stations. VeriPol’s creators say that when it flags a report as false, it turns out to be correct more than four-fifths of the time.

Lie Detector

VeriPol is the work of researchers at Cardiff University and Charles III University of Madrid.

In a paper published earlier this year in the journal Knowledge-Based Systems, they describe how they trained the lie detector with a data set of more than 1,000 robbery reports — including a number that police identified as false — to identify subtle signs that a report wasn’t true.
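
To give a feel for the workflow, here is a generic text-classification sketch in Python with scikit-learn. The example reports and labels are invented placeholders, and VeriPol's actual model relied on handcrafted linguistic features rather than raw TF-IDF, so this illustrates the approach, not the system itself.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Generic "false report" text classifier sketch. The reports and labels below
# are invented placeholders; VeriPol was trained on 1,000+ real robbery reports.
reports = [
    "Two men on a motorbike grabbed my bag on Calle Mayor at 22:10.",
    "My phone was stolen somewhere, I did not see anyone, no witnesses.",
    "A man threatened me with a knife near the metro exit and took my wallet.",
    "I lost my laptop, maybe it was stolen, I cannot remember where or when.",
]
labels = [1, 0, 1, 0]   # 1 = judged truthful, 0 = judged false (toy labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reports, labels)

new_report = ["Someone must have taken my bag, I noticed nothing unusual."]
print("P(truthful) =", round(model.predict_proba(new_report)[0][1], 2))
```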

Thought Crime

In pilot studies in Murcia and Malaga, Quartz reported, further investigation showed that the algorithm was correct about 83 percent of the time that it suspected a report was false.

Still, the project raises uncomfortable questions about allowing algorithms to act as lie detectors. Fast Company reported earlier this year that authorities in the United States, Canada, and the European Union are testing a separate system called AVATAR that they want to use to collect biometric data about subjects at border crossings — and analyze it for signs that they’re not being truthful.

Maybe the real question isn’t whether the tech works, but whether we want to permit authorities to act upon what’s essentially a good — but not perfect — assumption that someone is lying.

READ MORE: Police Are Using Artificial Intelligence to Spot Written Lies [Quartz]

More on lie detectors: Stormy Daniels Took a Polygraph. What Do We Do With the Results?


These Bacteria Digest Food Waste Into Biodegradable Plastic

Factory Farm

Plastics have revolutionized manufacturing, but they’re still terrible for the environment.

Manufacturing plastics is an energy-intensive slog that ends in mountains of toxic industrial waste and greenhouse gas emissions. And then the plastic itself that we use ends up sitting in a garbage heap for thousands of years before it biodegrades.

Scientists have spent years investigating ways to manufacture plastics without ruining the planet, and a Toronto biotech startup called Genecis says it’s found a good answer: factories where vats of bacteria digest food waste and use it to form biodegradable plastic in their tiny microbial guts.

One-Two Punch

The plastic-pooping bacteria stand to clean up several kinds of pollution while churning out usable materials, according to Genecis.

That’s because the microbes feed on waste food or other organic materials — waste that CBC reported gives off 20 percent of Canada’s methane emissions as it sits in landfills.

Then What?

The plastic that the little buggers produce isn’t anything new. It’s called PHA and it’s used in anything that needs to biodegrade quickly, like those self-dissolving stitches. What’s new here is that food waste is much cheaper than the raw materials that usually go into plastics, leading Genecis to suspect it can make the same plastics for 40 percent less cost.

There are a lot of buzzworthy new alternative materials out there, but with a clear environmental and financial benefit, it’s possible these little bacteria factories might be here to stay.

READ MORE: Greener coffee pods? Bacteria help turn food waste into compostable plastic [CBC]

More on cleaning up plastics: The EU Just Voted to Completely ban Single-Use Plastics


You Can Now Preorder a $150,000 Hoverbike

Please, Santa?

It’s never too early to start writing your Christmas wish list, right? Because we know what’s now at the top of ours: a hoverbike.

We’ve had our eyes on Hoversurf’s Scorpion-3 since early last year — but now, the Russian drone start-up is accepting preorders on an updated version of the vehicle.

Flying Bike

The S3 2019 is part motorcycle and part quadcopter. According to the Hoversurf website, the battery-powered vehicle weighs 253 pounds and has a flight time of 10 to 25 minutes depending on operator weight. Its maximum legal speed is 60 mph — though as for how fast the craft can actually move, that’s unknown. Hoversurf also notes that the vehicle’s “safe flight altitude” is 16 feet, but again, we aren’t sure how high it can actually soar.

What we do know: The four blades that provide S3 with its lift spin at shin level, and while this certainly looks like it would be a safety hazard, the U.S. Department of Transportation’s Federal Aviation Administration approved the craft for legal use as an ultralight vehicle in September.

That means you can only operate an S3 for recreational or sports purposes — but you can’t cruise to work on your morning commute.

Plummeting Bank Account

You don’t need a pilot’s license to operate an S3, but you will need a decent amount of disposable income — the Star Wars-esque craft will set you back $150,000.

If that number doesn’t cause your eyes to cross, go ahead and slap down the $10,000 deposit needed to claim a spot in the reservation queue. You’ll then receive an email when it’s time to place your order. You can expect to receive your S3 2019 two to six months after that, according to the company website.

That means there’s a pretty good chance you won’t be able to hover around your front yard this Christmas morning, but a 2019 jaunt is a genuine possibility.

READ MORE: For $150,000 You Can Now Order Your Own Hoverbike [New Atlas]

More on Hoversurf: Watch the World’s First Rideable Hoverbike in Flight


FBI’s Tesla Criminal Probe Reportedly Centers on Model 3 Production

Ups and Downs

Can we please get off Mr. Musk’s Wild Ride now? We don’t know how much more of this Tesla rollercoaster we can take.

In 2018 alone, Elon Musk’s clean energy company has endured a faulty flufferbot, furious investors, and an SEC probe and settlement. But there was good news, too. Model 3 deliveries reportedly increased, and just this week, we found out that Tesla had a historic financial quarter, generating $312 million in profit.

And now we’re plummeting again.

Closing In

On Friday, The Wall Street Journal reported that the Federal Bureau of Investigation (FBI) is deepening a criminal probe into whether Tesla “misstated information about production of its Model 3 sedans and misled investors about the company’s business going back to early 2017.”

We’ve known about the FBI’s Tesla criminal probe since September 18, but this is the first report confirming that Model 3 production is at the center of the investigation.

According to the WSJ’s sources, FBI agents have been reaching out to former Tesla employees in recent weeks to ask if they’d be willing to testify in the criminal case, though no word yet on whether any have agreed.

Casual CEO

We might be having trouble keeping up with these twists and turns, but Musk seems to be taking the FBI’s Tesla criminal probe all in stride — he spent much of Friday afternoon joking around with his Twitter followers about dank memes.

Clearly he has the stomach for this, but it’d be hard to blame any Tesla investors for deciding they’d had enough.

READ MORE: Tesla Faces Deepening Criminal Probe Over Whether It Misstated Production Figures [The Wall Street Journal]

More on Tesla: Elon Musk Says Your Tesla Will Earn You Money While You Sleep


Zero Gravity Causes Worrisome Changes In Astronauts’ Brains

Danger, Will Robinson

As famous Canadian astronaut Chris Hadfield demonstrated with his extraterrestrial sob session, fluids behave strangely in space.

And while microgravity makes for a great viral video, it also has terrifying medical implications that we absolutely need to sort out before we send people into space for the months or years necessary for deep space exploration.

Specifically, research published Thursday in the New England Journal of Medicine demonstrated that our brains undergo lasting changes after we spend enough time in space. According to the study, cerebrospinal fluid — which normally cushions our brain and spinal cord — behaves differently in zero gravity, causing it to pool around and squish our brains.

Mysterious Symptoms

The brains of the Russian cosmonauts who were studied in the experiment mostly bounced back upon returning to Earth.

But even seven months later, some abnormalities remained. According to National Geographic, the researchers suspect that high pressure  inside the cosmonauts’ skulls may have squeezed extra water into brain cells which later drained out en masse.

Now What?

So far, scientists don’t know whether or not this brain shrinkage is related to any sort of cognitive or other neurological symptoms — it might just be a weird quirk of microgravity.

But along with other space hazards like deadly radiation and squished eyeballs, it’s clear that we have a plethora of medical questions to answer before we set out to explore the stars.

READ MORE: Cosmonaut brains show space travel causes lasting changes [National Geographic]

More on space medicine: Traveling to Mars Will Blast Astronauts With Deadly Cosmic Radiation, new Data Shows


We Aren’t Growing Enough Healthy Foods to Feed Everyone on Earth

Check Yourself

The agriculture industry needs to get its priorities straight.

According to a newly published study, the world food system is producing too many unhealthy foods and not enough healthy ones.

“We simply can’t all adopt a healthy diet under the current global agriculture system,” said study co-author Evan Fraser in a press release. “Results show that the global system currently overproduces grains, fats, and sugars, while production of fruits and vegetables and, to a smaller degree, protein is not sufficient to meet the nutritional needs of the current population.”

Serving Downsized

For their study, published Tuesday in the journal PLOS ONE, researchers from the University of Guelph compared global agricultural production with consumption recommendations from Harvard University’s Healthy Eating Plate guide. Their findings were stark: The agriculture industry’s overall output of healthy foods does not match humanity’s needs.

Instead of the recommended eight servings of grains per person, it produces 12. And while nutritionists recommend we each consume 15 servings of fruits and vegetables daily, the industry produces just five. The mismatch continues for oils and fats (three servings instead of one), protein (three servings instead of five), and sugar (four servings when we don’t need any).
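
Restating those figures as a quick calculation makes the mismatch explicit; the numbers below are exactly the servings quoted above.

```python
# Servings per person per day, as quoted above from the Guelph study
# (current global production vs. Healthy Eating Plate guidance).
produced    = {"grains": 12, "fruits & vegetables": 5,  "oils & fats": 3, "protein": 3, "sugar": 4}
recommended = {"grains": 8,  "fruits & vegetables": 15, "oils & fats": 1, "protein": 5, "sugar": 0}

for group in produced:
    gap = produced[group] - recommended[group]
    status = "surplus" if gap > 0 else "shortfall"
    print(f"{group:22s} produced {produced[group]:2d}, recommended {recommended[group]:2d} ({status} of {abs(gap)})")
```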

Overly Full Plate

The researchers don’t just point out the problem, though — they also calculated what it would take to address the lack of healthy foods while also helping the environment.

“For a growing population, our calculations suggest that the only way to eat a nutritionally balanced diet, save land, and reduce greenhouse gas emission is to consume and produce more fruits and vegetables as well as transition to diets higher in plant-based protein,” said Fraser.

A number of companies dedicated to making plant-based proteins mainstream are already gaining traction. But unfortunately, it’s unlikely that the agriculture industry will decide to prioritize growing fruits and veggies over less healthy options as long as people prefer having the latter on their plates.

READ MORE: Not Enough Fruits, Vegetables Grown to Feed the Planet, U of G Study Reveals [University of Guelph]

More on food scarcity: To Feed a Hungry Planet, We’re All Going to Need to Eat Less Meat


