Ascension school district hits ground running in plans for new high school construction – The Advocate

GONZALES - With a successful bond election behind them, Ascension Parish public school officials are taking the next steps to build a new high school at Prairieville.

Seventy-one percent of voters who turned out for the Aug. 15 election approved extending an existing property tax for 20 years to fund $140 million in bonds to build the high school and complete 13 other construction and improvement projects.

"We're thankful to be in a community that partners with us to provide state-of-the-art facilities for our students; we're fired up," Superintendent David Alexander said after a meeting of a board committee that approved a timeline for completing the $79.5 million high school in time for the start of school in 2023.

At its next meeting, set for Tuesday, the board is expected to hire two Baton Rouge architectural firms, RHH Architects and Domain Architecture, which bid as a joint venture on the high school project.

A committee of school district personnel is recommending the joint-venture bid over five others to design the 280,000-square-foot school that will go up on Parker Road, next to Prairieville Primary.

The timeline of important milestones approved in committee last week for the project includes:

"It will be an aggressive schedule and we'll do our best," Jeff Parent, the district's supervisor of planning and construction, told board members.

The new high school will be the fourth on the east bank of the parish and is designed to relieve overcrowding at the other three (Dutchtown, East Ascension and St. Amant), where student enrollment at each is nearing or past 2,000.

Other projects that will now be going forward after the recent bond election are:

Read more:

Ascension school district hits ground running in plans for new high school construction - The Advocate

PerimeterX Platform Named Best Application Security Solution by the 2020 Tech Ascension Awards – Security Boulevard

The 2020 Tech Ascension Awards have named the PerimeterX Platform winner of Best Application Security Solution. This recognition comes a mere two months after the formal launch of the consolidated Platform, validating its status as the premier suite of solutions for bot mitigation and client-side protection that preserve a user's web app experience.

Digital business and security leaders recognize that innovating at a rapid pace and adopting new application technologies is required to stay competitive, to grow revenue and to increase brand awareness. However, many organizations compromise their digital transformation work by using multiple point solutions to manage the constantly growing number of application security threats that range from classic code vulnerabilities to business logic threats. To truly secure today's digital businesses, a platform approach is essential. Using a single cloud-native platform can simplify and future-proof application security and help businesses bring new applications to market faster and more efficiently. A platform that provides unified visibility into web analytics and enriches data with threat analysis truly empowers businesses to make accurate decisions quickly.

Comprised of PerimeterX Bot Defender, PerimeterX Code Defender and PerimeterX Page Defender, the Platform provides essential features for digital businesses, including:

The cloud-native Platform seamlessly integrates into an enterprise's existing infrastructure and automatically scales to meet demand, with no changes or (Read more...)

*** This is a Security Bloggers Network syndicated blog from PerimeterX Blog authored by PerimeterX Blog. Read the original post at: https://www.perimeterx.com/resources/blog/2020/perimeterx-platform-named-best-application-security-solution-by-the-2020-tech-ascension-awards/

Go here to read the rest:

PerimeterX Platform Named Best Application Security Solution by the 2020 Tech Ascension Awards - Security Boulevard

UAB and Ascension St. Vincent’s hospitals named to third-annual hospital rankings for exceptional consumer loyalty – UAB News

The UAB Health System/Ascension St. Vincent's Alliance has seen both organizations named to the third-annual NRC Health Top 100 Consumer Loyalty list, the first and only loyalty-based hospital rankings that recognize the top U.S. health care organizations for earning exceptional loyalty ratings from their patient populations.

Both St. Vincent's Birmingham and UAB Hospital were named Best in Class and designated as two of the top 10 hospitals in the country on the Consumer Loyalty list, based on results from NRC Health's Market Insights survey, the largest database of health care consumer responses in the country. From April 2019 to March 2020, NRC Health surveyed more than 310,000 households in the contiguous United States to measure consumer engagement with community health care brands. The winning organizations on the 2020 Consumer Loyalty list achieved remarkably high scores on NRC Health's Loyalty Index, a composite of seven different critical aspects of consumer loyalty, including access, engagement, experience and net promoter score.

"Ascension St. Vincent's and UAB, along with the other hospitals recognized in this year's Consumer Loyalty Awards, are at the forefront of delivering patient-centric care, which is more important than ever in this new normal in health care," said Helen Hrdy, chief growth officer at NRC Health. "We are proud to recognize these industry-leading organizations and the commitments they hold to their patients and improving the complete care journey now and moving forward."

For nearly four decades, NRC Health has helped health care organizations illuminate and improve the moments that matter most to patients, residents, physicians, nurses and staff.

"What an honor it is to receive such high recognition from the patients we serve," said Jason Alexander, CEO of Ascension St. Vincent's and senior vice president of Ascension. "Every day, it is our promise and desire to deliver compassionate and personalized care to all who come through our doors. Knowing that our patients recognize our efforts demonstrates that we are delivering on that promise, and we could not be more pleased."

"It comes as no surprise that Ascension St. Vincent's and UAB should be among the top 10 hospitals honored on this list," said Will Ferniany, Ph.D., CEO of the UABHS/St. Vincent's Alliance. "Both organizations are committed to providing outstanding medical care. It is this approach, ingrained within the cultures of both health systems, that inspires patient loyalty. We are honored with this recognition."

Winning organizations were publicly announced Monday, Aug. 24, during the virtual 26th Annual NRC Health Symposium. A complete list of winners is available from NRC Health.

Original post:

UAB and Ascension St. Vincent's hospitals named to third-annual hospital rankings for exceptional consumer loyalty - UAB News

Supermicro Details Its Hardware for MN-3, the Most Efficient Supercomputer in the World – HPCwire

In June, HPCwire highlighted the new MN-3 supercomputer: a 1.6 Linpack petaflops system delivering 21.1 gigaflops per watt of power, making it the most energy-efficient supercomputer in the world, at least according to the latest Green500 list, the Top500's energy-conscious cousin. The system was built by Preferred Networks, a Japanese AI startup that used its in-house MN-Core accelerator to help deliver the MN-3's record-breaking efficiency. Collaborating with Preferred Networks was modular system manufacturer Supermicro, which detailed the hardware and processes behind the chart-topping green giant in a recent report.

As Supermicro tells it, Preferred Networks was facing challenges on two fronts: first, the need for a much more powerful system to solve its clients' deep learning problems; and second, the exorbitant operating costs of the system they were envisioning. "With increasing power costs, a large system of the size PFN was going to need, the operating costs of both the power and associated cooling would exceed the budget that was allocated," Supermicro wrote. "Therefore, the energy efficiency of the new solution would have to be designed into the system, and not become an afterthought."

Preferred Networks turned to partnerships to help resolve these problems. First, they worked with researchers at Kobe University to develop the MN-Core accelerator, specializing it for deep learning training processes and optimizing it for energy efficiency. After successfully benchmarking the MN-Core above one teraflop per watt in testing, the developers turned to the rest of the system, and that's where Supermicro entered the picture.

On a visit to Japan, Clay Chen, general manager of global business development at Supermicro, sat down with Preferred Networks to hear what they needed.

"At first I was asking them, you know, what type of GPU they are using," Chen said in an interview with HPCwire. "They say, oh, no, we're not using any type, we're going to develop our own GPU. And that was quite fascinating to me."

Preferred Networks selected Supermicro for the daunting task: fitting four MN-Core boards, two Intel Xeon Platinum CPUs, up to 6TB of DDR4 memory and Intel Optane persistent memory modules in a single box without sacrificing the energy efficiency of the system.

Supermicro based its design on one of its preexisting GPU server models that was designed to house multiple GPUs (or other accelerators) and high-speed interconnects. Working with Preferred Networks engineers, Supermicro ran simulations to determine the optimal chassis design and component arrangement to ensure that the MN-Core accelerators would be sufficiently cooled and efficiency could be retained.

Somewhat surprisingly, the custom server is entirely fan-cooled. "Our concept is: if we can design something with fan cooling, why would we want to use liquid cooling?" Chen said. "Because essentially, all the heat being pulled out from the liquid is going to cool somewhere. When you take the heat outside the box, you still need to cool the liquid with a fan."

The end result, a customized Supermicro server just for Preferred Networks, is pictured below.

The server's four MN-Core boards are connected to PCIe x16 slots on a Supermicro motherboard and to the MN-Core Direct Connect board that enables high-speed communication between the MN-Core boards.

These custom servers, each 7U high, were then rack-mounted into what would become the MN-3 supercomputer: 48 servers, four interconnect nodes and five 100GbE switches. In total, the system, with its 2,080 CPU cores, delivered 1,621 Linpack teraflops of performance while requiring just 77 kW of power for the Top500 benchmarking run. That efficiency level is only about 15 percent short of what a machine would need to deliver an exaflops within the 40-megawatt power envelope targeted by planned exascale systems like Aurora, Frontier and El Capitan.
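The efficiency figure can be checked from the numbers above. A minimal Python sketch, using the reported Linpack result and power draw and comparing against the roughly 25 gigaflops per watt an exaflops machine would need inside a 40 MW envelope (the target value is derived here, not quoted from the article):

```python
# Back-of-the-envelope check of MN-3's reported Green500 efficiency.
# The teraflops and kilowatt figures are quoted in the article; the
# 25 GF/W exascale target is derived here: 1 exaflops / 40 MW = 25 GF/W.

linpack_teraflops = 1_621       # reported Linpack result
power_kw = 77                   # reported power draw during the benchmark run

efficiency_gf_per_watt = (linpack_teraflops * 1_000) / (power_kw * 1_000)
print(f"MN-3 efficiency: {efficiency_gf_per_watt:.1f} GF/W")          # ~21.1

exascale_target_gf_per_watt = 1e9 / 40e6   # 10^9 gigaflops over 40 million watts
shortfall = 1 - efficiency_gf_per_watt / exascale_target_gf_per_watt
print(f"Target for an exaflops in 40 MW: {exascale_target_gf_per_watt:.0f} GF/W")
print(f"Shortfall: {shortfall:.0%}")                                   # ~16%
```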

"We are very pleased to have partnered with Supermicro, who worked with us very closely to build MN-3, which was recognized as the world's most energy-efficient supercomputer," said Yusuke Doi, VP of computing infrastructure at Preferred Networks. "We can deliver outstanding performance while using a fraction of the power that was previously required for such a large supercomputer."

Go here to read the rest:

Supermicro Details Its Hardware for MN-3, the Most Efficient Supercomputer in the World - HPCwire

I confess, I’m scared of the next generation of supercomputers – TechRadar

Earlier this year, a Japanese supercomputer built on Arm-based Fujitsu A64FX processors snatched the crown of world's fastest machine, blowing incumbent leader IBM Summit out of the water.

Fugaku, as the machine is known, achieved 415.5 petaFLOPS on the popular High Performance Linpack (HPL) benchmark, which is almost three times the score of the IBM machine (148.5 petaFLOPS).

It also topped the rankings for Graph 500, HPL-AI and HPCG workloads - a feat never before achieved in the world of high performance computing (HPC).

Modern supercomputers are now edging ever-closer to the landmark figure of one exaFLOPS (equal to 1,000 petaFLOPS), commonly described as the exascale barrier. In fact, Fugaku itself can already achieve one exaFLOPS, but only in lower precision modes.

The consensus among the experts we spoke to is that a single machine will breach the exascale barrier within the next 6 - 24 months, unlocking a wealth of possibilities in the fields of medical research, climate forecasting, cybersecurity and more.

But what is an exaFLOPS? And what will it mean to break the exascale milestone, pursued doggedly for more than a decade?

To understand what it means to achieve exascale computing, it's important to first understand what is meant by FLOPS, which stands for floating point operations per second.

A floating point operation is any mathematical calculation (i.e. addition, subtraction, multiplication or division) that involves a number containing a decimal (e.g. 3.0 - a floating point number), as opposed to a number without a decimal (e.g. 3 - a binary integer). Calculations involving decimals are typically more complex and therefore take longer to solve.

An exascale computer can perform 10^18 (one quintillion, or 1,000,000,000,000,000,000) of these mathematical calculations every second.

For context, to equal the number of calculations an exascale computer can process in a single second, an individual would have to perform one sum every second for 31,688,765,000 years.

The PC I'm using right now, meanwhile, is able to reach 147 billion FLOPS (or 0.00000014723 exaFLOPS), outperforming the fastest supercomputer of 1993, the Intel Paragon (143.4 billion FLOPS).

This both underscores how far computing has come in the last three decades and puts into perspective the extreme performance levels attained by the leading supercomputers today.
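The conversions above are easy to reproduce. A small illustrative Python snippet (the 147 billion FLOPS figure for the author's PC is taken from the article; the rest is plain unit arithmetic):

```python
# Unit arithmetic behind the exascale comparisons above.

EXA = 10**18       # one exaFLOPS = 10^18 floating point operations per second

# One person doing one calculation per second, trying to match
# a single second of exascale output:
seconds_per_year = 365.25 * 24 * 3600
print(f"{EXA / seconds_per_year:.3e} years")   # ~3.169e+10, i.e. ~31.7 billion years

# The author's PC (147 billion FLOPS, per the article) expressed in exaFLOPS:
pc_flops = 147e9
print(f"{pc_flops / EXA:.2e} exaFLOPS")        # ~1.47e-07
```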

The key to building a machine capable of reaching one exaFLOPS is optimization at the processing, storage and software layers.

The hardware must be small and powerful enough to pack together and reach the necessary speeds, the storage capacious and fast enough to serve up the data and the software scalable and programmable enough to make full use of the hardware.

For example, there comes a point at which adding more processors to a supercomputer will no longer affect its speed, because the application is not sufficiently optimized. The only way governments and private businesses will realize a full return on HPC hardware investment is through an equivalent investment in software.
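The article does not name a scaling model here, but Amdahl's law is the textbook way to show why piling on processors eventually stops paying off when part of an application stays serial. A minimal sketch, with the 5 percent serial fraction chosen purely for illustration:

```python
# Amdahl's law: the best-case speedup on p processors when a fraction s
# of the work cannot be parallelized.
def amdahl_speedup(serial_fraction: float, processors: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

s = 0.05   # assume 5% of the application is serial (illustrative, not from the article)
for p in (10, 100, 1_000, 10_000, 100_000):
    print(f"{p:>7} processors -> {amdahl_speedup(s, p):6.1f}x speedup")

# The speedup plateaus near 1/s = 20x no matter how many processors are added,
# which is why software investment has to keep pace with hardware investment.
```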

Organizations such as the Exascale Computing Project (ECP) and the ExCALIBUR programme are interested in solving precisely this problem. Those involved claim a renewed focus on algorithm and application development is required in order to harness the full power and scope of exascale.

Achieving the delicate balance between software and hardware, in an energy efficient manner and avoiding an impractically low mean time between failures (MTBF) score (the time that elapses before a system breaks down under strain) is the challenge facing the HPC industry.

"15 years ago, as we started the discussion on exascale, we hypothesized that it would need to be done in 20 megawatts (MW); later that was changed to 40 MW. With Fugaku, we see that we are about halfway to a 64-bit exaFLOPS at the 40 MW power envelope, which shows that an exaFLOPS is in reach today," explained Brent Gorda, Senior Director of HPC at UK-based chip designer Arm.

"We could hit an exaFLOPS now with sufficient funding to build and run a system. [But] the size of the system is likely to be such that MTBF is measured in single-digit number-of-days based on today's technologies and the number of components necessary to reach these levels of performance."

When it comes to building a machine capable of breaching the exascale barrier, there are a number of other factors at play, beyond technological feasibility. An exascale computer can only come into being once an equilibrium has been reached at the intersection of technology, economics and politics.

"One could in theory build an exascale system today by packing in enough CPUs, GPUs and DPUs. But what about economic viability?" said Gilad Shainer of NVIDIA Mellanox, the firm behind the InfiniBand technology (the fabric that links the various hardware components) found in seven of the ten fastest supercomputers.

"Improvements in computing technologies, silicon processing, more efficient use of power and so on all help to increase efficiency and make exascale computing an economic objective as opposed to a sort of sporting achievement."

According to Paul Calleja, who heads up computing research at the University of Cambridge and is working with Dell on the Open Exascale Lab, Fugaku is an excellent example of what is theoretically possible today, but is also impractical by virtually any other metric.

"If you look back at Japanese supercomputers, historically there's only ever been one of them made. They have beautifully exquisite architectures, but they're so stupidly expensive and proprietary that no one else could afford one," he told TechRadar Pro.

"[Japanese organizations] like these really large technology demonstrators, which are very useful in industry because it shows the direction of travel and pushes advancements, but those kinds of advancements are very expensive and not sustainable, scalable or replicable."

So, in this sense, there are two separate exascale landmarks: the theoretical barrier, which will likely be met first by a machine of Fugaku's ilk (a technological demonstrator), and the practical barrier, which will see exascale computing deployed en masse.

Geopolitical factors will also play a role in how quickly the exascale barrier is breached. Researchers and engineers might focus exclusively on the technological feat, but the institutions and governments funding HPC research are likely motivated by different considerations.

"Exascale computing is not just about reaching theoretical targets, it is about creating the ability to tackle problems that have been previously intractable," said Andy Grant, Vice President of HPC & Big Data at IT services firm Atos, which is influential in the fields of HPC and quantum computing.

"Those that are developing exascale technologies are not doing it merely to have the fastest supercomputer in the world, but to maintain international competitiveness, security and defence."

"In Japan, their new machine is roughly 2.8x more powerful than the now-second-place system. In broad terms, that will enable Japanese researchers to address problems that are 2.8x more complex. In the context of international competitiveness, that creates a significant advantage."

In years gone by, rival nations fought it out in the trenches or competed to see who could place the first human on the moon. But computing may well become the frontier at which the next arms race takes place; supremacy in the field of HPC might prove just as politically important as military strength.

Once exascale computers become an established resource - available for businesses, scientists and academics to draw upon - a wealth of possibilities will be unlocked across a wide variety of sectors.

HPC could prove revelatory in the fields of clinical medicine and genomics, for example, which require vast amounts of compute power to conduct molecular modelling, simulate interactions between compounds and sequence genomes.

In fact, IBM Summit and a host of other modern supercomputers are being used to identify chemical compounds that could contribute to the fight against coronavirus. The Covid-19 High Performance Computing Consortium assembled 16 supercomputers, accounting for an aggregate of 330 petaFLOPS - but imagine how much more quickly research could be conducted using a fleet of machines capable of reaching 1,000 petaFLOPS on their own.

Artificial intelligence, meanwhile, is another cross-disciplinary domain that will be transformed with the arrival of exascale computing. The ability to analyze ever-larger datasets will improve the ability of AI models to make accurate forecasts (contingent on the quality of data fed into the system) that could be applied to virtually any industry, from cybersecurity to e-commerce, manufacturing, logistics, banking, education and many more.

As explained by Rashid Mansoor, CTO at UK supercomputing startup Hadean, the value of supercomputing lies in the ability to make an accurate projection (of any variety).

"The primary purpose of a supercomputer is to compute some real-world phenomenon to provide a prediction. The prediction could be the way proteins interact, the way a disease spreads through the population, how air moves over an aerofoil or electromagnetic fields interact with a spacecraft during re-entry," he told TechRadar Pro.

"Raw performance such as the HPL benchmark simply indicates that we can model bigger and more complex systems to a greater degree of accuracy. One thing that the history of computing has shown us is that the demand for computing power is insatiable."

Other commonly cited areas that will benefit significantly from the arrival of exascale include brain mapping, weather and climate forecasting, product design and astronomy, but it's also likely that brand new use cases will emerge as well.

"The desired workloads and the technology to perform them form a virtuous circle. The faster and more performant the computers, the more complex problems we can solve and the faster the discovery of new problems," explained Shainer.

"What we can be sure of is that we will see the continuous needs or ever-growing demands for more performance capabilities in order to solve the unsolvable. Once this is solved, we will find the new unsolvable."

By all accounts, the exascale barrier will likely fall within the next two years, but the HPC industry will then turn its attention to the next objective, because the work is never done.

Some might point to quantum computers, which approach problem solving in an entirely different way to classical machines (exploiting symmetries to speed up processing), allowing for far greater scale. However, there are also problems to which quantum computing cannot be applied.

"Mid-term (10 year) prospects for quantum computing are starting to shape up, as are other technologies. These will be more specialized, where a quantum computer will very likely show up as an application accelerator for problems that relate to logistics first. They won't completely replace the need for current architectures for IT/data processing," explained Gorda.

As Mansoor puts it, on certain problems even a small quantum computer can be exponentially faster than all of the classical computing power on earth combined. Yet on other problems, a quantum computer could be slower than a pocket calculator.

The next logical landmark for traditional computing, then, would be one zettaFLOPS, equal to 1,000 exaFLOPS or 1,000,000 petaFLOPS.

Chinese researchers predicted in 2018 that the first zettascale system will come online in 2035, paving the way for new computing paradigms. The paper itself reads like science fiction, at least for the layman:

"To realize these metrics, micro-architectures will evolve to consist of more diverse and heterogeneous components. Many forms of specialized accelerators are likely to co-exist to boost HPC in a joint effort. Enabled by new interconnect materials such as photonic crystal, fully optical interconnecting systems may come into use."

Assuming one exaFLOPS is reached by 2022, 14 years will have elapsed between the creation of the first petascale and first exascale systems. The first terascale machine, meanwhile, was constructed in 1996, 12 years before the petascale barrier was breached.

If this pattern were to continue, the Chinese researchers estimate would look relatively sensible, but there are firm question marks over the validity of zettascale projections.
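The pattern referred to above can be laid out explicitly. A quick sketch of the gaps between milestones (the 1996, 2008 and assumed 2022 dates come from the article; 2035 is the Chinese researchers' projection):

```python
# Gaps between successive thousand-fold supercomputing milestones.
milestones = [
    ("terascale", 1996),               # first terascale machine, per the article
    ("petascale", 2008),               # 12 years later
    ("exascale (assumed)", 2022),      # 14 years later, if the barrier falls on schedule
    ("zettascale (projected)", 2035),  # 2018 projection by Chinese researchers
]
for (name, year), (_, prev_year) in zip(milestones[1:], milestones):
    print(f"{name}: {year} (+{year - prev_year} years)")
```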

While experts are confident in their predicted exascale timelines, none would venture a guess at when zettascale might arrive without prefacing their estimate with a long list of caveats.

"Is that an interesting subject? Because to be honest with you, it's so not obtainable. To imagine how we could go 1,000x beyond [one exaFLOPS] is not a conversation anyone could have, unless they're just making it up," said Calleja, when asked about the concept of zettascale.

Others were more willing to theorize, but equally reticent to guess at a specific timeline. According to Grant, the way zettascale machines process information will be unlike any supercomputer in existence today.

"[Zettascale systems] will be data-centric, meaning components will move to the data rather than the other way around, as data volumes are likely to be so large that moving data will be too expensive. Regardless, predicting what they might look like is all guesswork for now," he said.

It is also possible that the decentralized model might be the fastest route to achieving zettascale, with millions of less powerful devices working in unison to form a collective supercomputer more powerful than any single machine (as put into practice by the SETI Institute).

As noted by Saurabh Vij, CEO of distributed supercomputing firm Q Blocks, decentralized systems address a number of problems facing the HPC industry today, namely surrounding building and maintenance costs. They are also accessible to a much wider range of users and therefore democratize access to supercomputing resources in a way that is not otherwise possible.

"There are benefits to a centralized architecture, but the cost and maintenance barrier overshadows them. [Centralized systems] also alienate a large base of customer groups that could benefit," he said.

"We think a better way is to connect distributed nodes together in a reliable and secure manner. It wouldn't be too aggressive to say that, 5 years from now, your smartphone could be part of a giant distributed supercomputer, making money for you while you sleep by solving computational problems for industry," he added.

However, incentivizing network nodes to remain active for a long period is challenging and a high rate of turnover can lead to reliability issues. Network latency and capacity problems would also need to be addressed before distributed supercomputing can rise to prominence.

Ultimately, the difficulty in making firm predictions about zettascale lies in the massive chasm that separates present day workloads and HPC architectures from those that might exist in the future. From a contemporary perspective, its fruitless to imagine what might be made possible by a computer so powerful.

We might imagine zettascale machines will be used to process workloads similar to those tackled by modern supercomputers, only more quickly. But its possible - even likely - the arrival of zettascale computing will open doors that do not and cannot exist today, so extraordinary is the leap.

In a future in which computers are 2,000+ times as fast as the most powerful machine today, philosophical and ethical debates surrounding the intelligence of man versus machine are bound to be played out in greater detail - and with greater consequence.

It is impossible to directly compare the workings of a human brain with that of a computer - i.e. to assign a FLOPS value to the human mind. However, it is not insensible to ask how many FLOPS must be achieved before a machine reaches a level of performance that might be loosely comparable to the brain.

Back in 2013, scientists used the K supercomputer to conduct a neuronal network simulation using open source simulation software NEST. The team simulated a network made up of 1.73 billion nerve cells connected by 10.4 trillion synapses.

While ginormous, the simulation represented only 1% of the human brain's neuronal network and took 40 minutes to replicate one second's worth of neuronal network activity.

However, the K computer reached a maximum computational power of only 10 petaFLOPS. A basic extrapolation (ignoring inevitable complexities), then, would suggest Fugaku could simulate circa 40% of the human brain, while a zettascale computer would be capable of performing a full simulation many times over.
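Spelling that extrapolation out, under the article's own simplifying assumption that the simulated brain fraction scales linearly with raw FLOPS (and ignoring memory, interconnect and the fact that the K run was also roughly 2,400 times slower than real time):

```python
# Naive linear extrapolation from the 2013 K-computer NEST experiment.
k_flops = 10e15            # K computer, ~10 petaFLOPS (as stated in the article)
k_brain_fraction = 0.01    # the simulation covered ~1% of the brain's network

brain_fraction_per_flops = k_brain_fraction / k_flops

fugaku_flops = 415.5e15    # Fugaku's Linpack result
zettascale_flops = 1e21    # a hypothetical zettascale machine

print(f"Fugaku:     ~{brain_fraction_per_flops * fugaku_flops:.0%} of one brain")        # ~42%
print(f"Zettascale: ~{brain_fraction_per_flops * zettascale_flops:,.0f} brains' worth")  # ~1,000
```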

Digital neuromorphic hardware (supercomputers created specifically to simulate the human brain) like SpiNNaker 1 and 2 will also continue to develop in the post-exascale future. Instead of sending information from point A to B, these machines will be designed to replicate the parallel communication architecture of the brain, sending information simultaneously to many different locations.

Modern iterations are already used to help neuroscientists better understand the mysteries of the brain and future versions, aided by advances in artificial intelligence, will inevitably be used to construct a faithful and fully-functional replica.

The ethical debates that will arise with the arrival of such a machine - surrounding the perception of consciousness, the definition of thought and what an artificial uber-brain could or should be used for - are manifold and could take generations to unpick.

The inability to foresee what a zettascale computer might be capable of is also an inability to plan for the moral quandaries that might come hand-in-hand.

Whether a future supercomputer might be powerful enough to simulate human-like thought is not in question, but whether researchers should aspire to bringing an artificial brain into existence is a subject worthy of discussion.

Continued here:

I confess, I'm scared of the next generation of supercomputers - TechRadar

Bradykinin Hypothesis of COVID-19 Offers Hope for Already-Approved Drugs – BioSpace

A group of researchers at Oak Ridge National Lab in Tennessee used the Summit supercomputer, the second-fastest in the world, to analyze data on more than 40,000 genes from 17,000 genetic samples related to COVID-19. The analysis took more than a week and analyzed 2.5 billion genetic combinations. And it came up with a new theory, dubbed the bradykinin hypothesis, on how COVID-19 affects the body.

Daniel Jacobson, a computational systems biologist at Oak Ridge, noted that the expression of genes for significant enzymes in the renin-angiotensin system (RAS), which is involved in blood pressure regulation and fluid balance, was abnormal. He then tracked the abnormal RAS in the lung fluid samples to the kinin cascade, which is an inflammatory pathway closely regulated by the RAS.

In the kinin system, bradykinin, which is a key peptide, causes blood vessels to leak, allowing fluid to accumulate in organs and tissue. And in COVID-19 patients, this system was unbalanced. People with the disease had increased gene expression for the bradykinin receptors and for enzymes known as kallikreins that activate the kinin pathway.

Jacobson and his team published the research in the journal eLife. They believe that this research explains many aspects of COVID-19 that were previously not understood, including why there is an abnormal accumulation of fluid in patients' lungs.

From the research, SARS-CoV-2 infection typically starts when the virus enters the body via ACE2 receptors in the nose, where they are common. The virus then moves through the body, integrating into cells that also have ACE2, including the intestines, kidneys and heart. This is consistent with some of COVID-19's cardiac and gastrointestinal symptoms.

But the virus does not appear to stop there. Instead, it takes over the body's systems, upregulating ACE2 receptors in cells and tissues where they're not common, including the lungs. Or as Thomas Smith writes in Medium: "COVID-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house. Once inside, though, they don't just take your stuff; they also throw open all your doors and windows so their accomplices can rush in and help pillage more efficiently."

The final result of all this is what is being called a bradykinin storm. When the virus affects the RAS, the way the body regulates bradykinin runs amok: bradykinin receptors are resensitized, and the body stops breaking down bradykinin, which is typically degraded by ACE. They believe it is this bradykinin storm that is responsible for many of COVID-19's deadliest symptoms.

The researchers wrote that the pathology of COVID-19 is likely the result of "bradykinin storms" rather than "cytokine storms," which have been observed in COVID-19 patients, but that the two may be intricately linked.

Another researcher, Frank van de Veerdonk, an infectious disease researcher at the Radboud University Medical Center in the Netherlands, had made similar observations in mid-March. In April, he and his research team theorized that a dysregulated bradykinin system was causing leaky blood vessels in the lungs, which was a potential cause of the excess fluid accumulation.

Josef Penninger, director of the Life Sciences Institute at the University of British Columbia in Vancouver, who identified that ACE2 is the essential in vivo receptor for SARS, told The Scientist that he believes bradykinin plays a role in COVID-19. "It does make a lot of sense. And Jacobson's study supports the hypothesis, but additional research is needed for confirmation. Gene expression signatures don't tell us the whole story. I think it is very important to actually measure the proteins."

Another aspect of Jacobson's study is that, via another pathway, COVID-19 increases production of hyaluronic acid (HLA) in the lungs. HLA is common in soaps and lotions because it absorbs more than 1,000 times its weight in fluid. Taking into consideration fluid leaking into the lungs and increased HLA, it creates a hydrogel in the lungs of some COVID-19 patients, which Jacobson describes as "like trying to breathe through Jell-O."

This provides a possible explanation for why ventilators have been less effective in severe COVID-19 than physicians originally expected. It reaches a point, Jacobson says, where "regardless of how much oxygen you pump in, it doesn't matter, because the alveoli in the lungs are filled with this hydrogel. The lungs become like a water balloon."

The bradykinin hypothesis also explains why about 20% of COVID-19 patients have heart damage, because RAS controls aspects of cardiac contractions and blood pressure. It also supports COVID-19's neurological effects, such as dizziness, seizures, delirium and stroke, which are seen in as many as 50% of hospitalized patients. French research identified leaky blood vessels in the brains of COVID-19 patients. And at high doses, bradykinin can break down the blood-brain barrier.

On the positive side, their research suggests that drugs that target components of RAS are already FDA approved for other diseases and might be effective in treating COVID-19. Some, such as danazol (used to treat endometriosis, fibrocystic breast disease and hereditary angioedema), stanozolol (an anabolic steroid derived from testosterone) and ecallantide (marketed as Kalbitor for hereditary angioedema (HAE) and the prevention of blood loss in cardiothoracic surgery), decrease bradykinin production. Icatibant, which is also used to treat HAE and is marketed as Firazyr, decreases bradykinin signaling and could minimize bradykinin's effects once it is in the body. Vitamin D may also be useful, because it is involved in the RAS system and may reduce levels of REN, another compound involved in the system.

The researchers note that the testing of any of these pharmaceutical interventions should be done in well-designed clinical trials.

More here:

Bradykinin Hypothesis of COVID-19 Offers Hope for Already-Approved Drugs - BioSpace

Stranger than fiction? Why we need supercomputers – TechHQ

In 2001: A Space Odyssey, the main villain is a supercomputer named HAL-9000 that was responsible for the death of Discovery One's crew.

Need some help remembering Douglas Rain's chilling voice as the sentient computer?

Even though HAL-9000 met with a slow, painful death by disconnection, it remains one of the most iconic supercomputers on screen and in fiction. The villainous system's display of humanity in its last moments, singing the lullaby "Daisy Bell", urges viewers to recognize the strong sense of self that the machine possesses. However, in the real world, supercomputers are far less sentimental, if not far off in terms of their data processing and problem-solving ability.

What truly separates supercomputers from your not-so-super-computers is the way they process the workload. Supercomputers, fundamentally, adopt a technique called parallel processing that uses multiple compute resources to solve a computational problem. In contrast, our regular computers run on serial computing that solves computational problems one at a time, following a sequence.
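To make the serial-versus-parallel distinction concrete, here is a minimal, purely illustrative Python sketch that runs the same toy workload both ways using the standard library's multiprocessing pool (the workload itself is arbitrary):

```python
# Serial computing vs. parallel processing of the same (toy) workload.
import math
from multiprocessing import Pool

def heavy_task(n: int) -> float:
    """Stand-in for a compute-bound job, e.g. one cell of a larger simulation."""
    return sum(math.sqrt(i) for i in range(n))

jobs = [2_000_000] * 8

if __name__ == "__main__":
    # Serial: one job at a time, in sequence.
    serial_results = [heavy_task(n) for n in jobs]

    # Parallel: the same jobs spread across multiple compute resources.
    with Pool() as pool:
        parallel_results = pool.map(heavy_task, jobs)

    assert serial_results == parallel_results   # same answers, different wall-clock time
```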

For a sense of just how powerful these systems are, supercomputers are frequently used for simulating reality, including astronomical events like two galaxies colliding or predicting how a nuclear attack would play out.

Supercomputers can simulate astronomical events. Source: Unsplash

Now, scaling it down from the fate of the universe, supercomputers are also used for enterprise-wide applications.

Over the years, the power of supercomputers in simulating reality has given humankind a better ability to make predictions and boost product designs. In manufacturing, this ability means users can test out countless product designs to discern which prototypes are best suited to the real world. In this sense, supercomputing significantly slashes the number of physical testing resources needed and helps organizations get products to market quicker, allowing them to seize opportunities to lead in their respective markets and gain extra profit.

Jack Dongarra, a leading supercomputer expert, noted that the industrial use of supercomputers is widespread: "Industry gets it. They are investing in high-performance computers to be more competitive and to gain an edge on their competition. And they feel that money is well spent. They are investing in these things to help drive their products and innovation, their bottom line, their productivity, and their profitability," Dongarra said.

Supercomputers are also helping scientists and researchers develop new life-saving medicines. Presently, supercomputers all over the world are united behind a single goal: the research and development of a COVID-19 vaccine.

Equipped with the capabilities of supercomputers, researchers gain unique opportunities to explore the structure and behavior of the infamous virus at the molecular level. Since a supercomputer can simulate a myriad of interactions between the virus and human body cells, researchers are able to forecast the spread of the disease and seek promising treatments or vaccine materials.

Japan's Fugaku supercomputer, located at the RIKEN Center for Computational Science in Kobe, was recently crowned the world's fastest. Around 3,000 researchers use it to search for and model new drugs, study weather and natural disaster scenarios, and even probe the fundamental laws of physics and nature. Recently, researchers have been experimenting with using Fugaku for COVID-19 research into diagnostics, therapeutics, and simulations that replicate the spread patterns of the virus.

"Fugaku was developed based on the idea of achieving high performance on a variety of applications of great public interest [...] and we are very happy that it has shown itself to be outstanding on all the major supercomputer benchmarks," Satoshi Matsuoka, director of the RIKEN Center, said. "I hope that the leading-edge IT developed for it will contribute to major advances on difficult social challenges such as COVID-19."

In IBM's company blog, the director of IBM Research, Dario Gil, writes: "The supercomputers will run myriad calculations in epidemiology, bioinformatics, and molecular modeling, in a bid to drastically cut the time of discovery of new molecules that could lead to a vaccine."

A supercomputer's parallel computing makes it uniquely suited to screen through a deluge of data and, at its core, solve complex problems that require a lot of number-crunching. Erik Lindahl, a professor of biophysics, shared that, to date, supercomputers enable scientists to see how liquids diffuse around proteins, and no other experimental method is capable of that.

"We could not do what we do without computers. The computers enable us to see things that we could never see in experiments otherwise."

While HAL's infamous line "I'm sorry Dave, I'm afraid I can't do that" left viewers to debate whether HAL was truly evil or just obeying orders, perhaps it's time we bring this conversation back to life and focus on the extraordinary capabilities of these supercomputers.

View post:

Stranger than fiction? Why we need supercomputers - TechHQ

Google Says It Just Ran The First-Ever Quantum Simulation of a Chemical Reaction – ScienceAlert

Of the many high expectations we have of quantum technology, one of the most exciting has to be the ability to simulate chemistry on an unprecedented level. Now we have our first glimpse of what that might look like.

Together with a team of collaborators, the Google AI Quantum team has used their 54 qubit quantum processor, Sycamore, to simulate changes in the configuration of a molecule called diazene.

As far as chemical reactions go, it's one of the simplest ones we know of. Diazene is little more than a couple of nitrogens linked in a double bond, each towing a hydrogen atom.

However, the quantum computer accurately described changes in the positions of hydrogen to form different diazene isomers. The team also used their system to arrive at an accurate description of the binding energy of hydrogen in increasingly bigger chains.

As straight-forward as these two models may sound, there's a lot going on under the hood. Forget the formulaic chemical reactions from your school textbooks - on a level of quantum mechanics, chemistry is a complicated mix of possibilities.

In some ways, it's the difference between knowing a casino will always make a profit, and predicting the outcomes of the individual games being played inside. Restricted to the predictable rules of classical computers, an ability to represent the infinite combinations of dice rolls and royal flushes of quantum physics has been just too hard.

Quantum computers, on the other hand, are constructed around these very same principles of quantum probability that govern chemistry on a fundamental level.

Logical units called qubits exist in a fuzzy state of 'either/or'. When combined with the 'maybe' states of other qubits in a system, it provides computer engineers with a unique way to carry out computations.

Algorithms specially formulated to take advantage of these quantum mechanics allow for shortcuts, reducing down to minutes that which would take a classical supercomputer thousands of years of grinding.
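One way to see why those "maybe" states add up to so much computational headroom is to count amplitudes: describing n qubits exactly on a classical machine takes 2^n complex numbers. A small illustrative sketch of that bookkeeping (not Google's method, just the scaling):

```python
# The classical cost of representing n qubits exactly: 2**n complex amplitudes.
for n_qubits in (12, 54):            # 12 qubits ran the diazene simulation; Sycamore has 54
    amplitudes = 2 ** n_qubits
    bytes_needed = amplitudes * 16   # one complex128 amplitude = 16 bytes
    print(f"{n_qubits} qubits: {amplitudes:.3e} amplitudes, ~{bytes_needed / 2**30:.3g} GiB")

# A 12-qubit state (4,096 amplitudes) fits in a few kilobytes; a full 54-qubit
# state would need roughly 256 pebibytes, which is why exact classical simulation
# of quantum systems hits a wall so quickly.
```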

If we're to have a hope of modelling chemistry on a quantum level, we're going to need that kind of power, and some.

Just calculating the sum of actions that determine the energy in a molecule of propane would hypothetically take a supercomputer more than a week. But there's a world of difference between a snapshot of a molecule's energy and calculating all the ways they might change.

The diazene simulation used 12 of the 54 qubits in the Sycamore processor to perform its calculations. This in itself was still twice the size of any previous attempts at chemistry simulations.

The team also pushed the limits on an algorithm designed to marry classical with quantum processes, one designed to iron out the errors that arise all too easily in the delicate world of quantum computing.

It all adds up to possibilities of increasingly bigger simulations in the future, helping us design more robust materials, sift out more effective pharmaceuticals, and even unlock more secrets of our Universe's quantum casino.

Diazene's wandering hydrogens are just the start of the kinds of chemistry we might soon be able to model in a quantum landscape.

This research was published in Science.

Go here to see the original:

Google Says It Just Ran The First-Ever Quantum Simulation of a Chemical Reaction - ScienceAlert

This Equation Calculates the Chances We Live in a Computer Simulation – Discover Magazine

The Drake equation is one of the more famous reckonings in science. It calculates the likelihood that we are not alone in the universe by estimating the number of other intelligent civilizations in our galaxy that might exist now.

Some of the terms in this equation are well known or becoming better understood, such as the number of stars in our galaxy and the proportion that have planets in the habitable zone. But others are unknown, such as the proportion of planets that develop intelligent life; and some may never be known, such as the proportion that destroy themselves before they can be discovered.

Nevertheless, the Drake equation allows scientists to place important bounds on the numbers of intelligent civilizations that might be out there.

However, there is another sense in which humanity could be linked with an alien intelligence: our world may just be a simulation inside a massively powerful supercomputer run by such a species. Indeed, various scientists, philosophers and visionaries have said that the probability of such a scenario could be close to one. In other words, we probably are living in a simulation.

The accuracy of these claims is somewhat controversial. So a better way to determine the probability that we live in a simulation would be much appreciated.

Enter Alexandre Bibeau-Delisle and Gilles Brassard at the University of Montreal in Canada. These researchers have derived a Drake-like equation that calculates the chances that we live in a computer simulation. And the results throw up some counterintuitive ideas that are likely to change the way we think about simulations, how we might determine whether we are in one and whether we could ever escape.

Bibeau-Delisle and Brassard begin with a fundamental estimate of the computing power available to create a simulation. They say, for example, that a kilogram of matter, fully exploited for computation, could perform 10^50 operations per second.

By comparison, the human brain, which is also kilogram-sized, performs up to 10^16 operations per second. "It may thus be possible for a single computer the mass of a human brain to simulate the real-time evolution of 1.4 × 10^25 virtual brains," they say.

In our society, a significant number of computers already simulate entire civilizations, in games such as Civilization VI, Hearts of Iron IV, Humankind and so on. So it may be reasonable to assume that in a sufficiently advanced civilization, individuals will be able to run games that simulate societies like ours, populated with sentient conscious beings.

So an interesting question is this: of all the sentient beings in existence, what fraction are likely to be simulations? To derive the answer, Bibeau-Delisle and Brassard start with the total number of real sentient beings NRe; multiply that by the fraction with access to the necessary computing power fCiv; multiply this by the fraction of that power that is devoted to simulating consciousness fDed (because these beings are likely to be using their computers for other purposes, too); and then multiply this by the number of brains they could simulate RCal.

The resulting equation is this, where fSim is the fraction of simulated brains:
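The equation itself appears to have been an embedded image that did not survive into this text. Reconstructed from the definitions in the surrounding paragraphs (so this is a reconstruction, not a quotation of the paper's exact notation), the fraction of simulated brains works out to:

```latex
% Reconstruction: simulated brains = N_Re * f_Civ * f_Ded * R_Cal, real brains = N_Re,
% and f_Sim is the simulated share of all sentient beings.
f_{\mathrm{Sim}}
  = \frac{N_{\mathrm{Re}}\, f_{\mathrm{Civ}}\, f_{\mathrm{Ded}}\, R_{\mathrm{Cal}}}
         {N_{\mathrm{Re}}\, f_{\mathrm{Civ}}\, f_{\mathrm{Ded}}\, R_{\mathrm{Cal}} + N_{\mathrm{Re}}}
  = \frac{f_{\mathrm{Civ}}\, f_{\mathrm{Ded}}\, R_{\mathrm{Cal}}}
         {f_{\mathrm{Civ}}\, f_{\mathrm{Ded}}\, R_{\mathrm{Cal}} + 1}
```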

Here RCal is the huge number of brains that fully exploited matter should be able to simulate.

The sheer size of this number, ~10^25, pushes Bibeau-Delisle and Brassard toward an inescapable conclusion. "It is mathematically inescapable from [the above] equation and the colossal scale of RCal that fSim ≈ 1 unless fCiv · fDed ≈ 0," they say.
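Plugging in numbers shows how lopsided that conclusion is. A small sketch under assumed values (RCal ~ 1.4 × 10^25 is the figure quoted earlier in the article; the fractions are arbitrary illustrations, not the authors' numbers):

```python
# Evaluating the reconstructed equation above with R_Cal ~ 1.4e25.
def f_sim(f_civ: float, f_ded: float, r_cal: float = 1.4e25) -> float:
    x = f_civ * f_ded * r_cal
    return x / (x + 1)

print(f_sim(1e-6, 1e-6))     # ~1 - 7e-14: essentially certain we are simulated
print(f_sim(1e-10, 1e-10))   # ~0.999993: still overwhelmingly likely
print(f_sim(0.0, 0.1))       # 0.0: the only way out is f_Civ * f_Ded ~ 0
```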

So there are two possible outcomes. Either we live in a simulation or a vanishingly small proportion of advanced computing power is devoted to simulating brains.

It's not hard to imagine why the second option might be true. "A society of beings similar to us (but with a much greater technological development) could indeed decide it is not very ethical to simulate beings with enough precision to make them conscious while fooling them and keeping them cut off from the real world," say Bibeau-Delisle and Brassard.

Another possibility is that advanced civilizations never get to the stage where their technology is powerful enough to perform these kinds of computations. Perhaps they destroy themselves through war or disease or climate change long before then. There is no way of knowing.

But suppose we are in a simulation. Bibeau-Delisle and Brassard ask whether we might escape while somehow hiding our intentions from our overlords. They assume that the simulating technology will be quantum in nature. "If quantum phenomena are as difficult to compute on classical systems as we believe them to be, a simulation containing our world would most probably run on quantum computing power," they say.

This raises the possibility that it may be possible to detect our alien overlords since they cannot measure the quantum nature of our world without revealing their presence. Quantum cryptography uses the same principle; indeed, Brassard is one of the pioneers of this technology.

That would seem to make it possible for us to make encrypted plans that are hidden from the overlords, such as secretly transferring ourselves into our own simulations.

However, the overlords have a way to foil this. All they need to do is rewire their simulation to make it look as if we are able to hide information, even though they are aware of it all the time. "If the simulators are particularly angry at our attempted escape, they could also send us to a simulated hell, in which case we would at least have the confirmation we were truly living inside a simulation and our paranoia was not unjustified...," conclude Bibeau-Delisle and Brassard, with their tongues firmly in their cheeks.

In that sense, we are the ultimate laboratory guinea pigs: forever trapped and forever fooled by the evil genius of our omnipotent masters.

Time for another game of Civilization VI.

Ref: Probability and Consequences of Living Inside a Computer Simulation. arxiv.org/abs/2008.09275

Read more:

This Equation Calculates the Chances We Live in a Computer Simulation - Discover Magazine

With Republicans like these, who needs Ds? – The Highland County Press

After working my usual day, I arrived home to find that three people in our house had received our "official absentee ballot applications" from Ohio Republican Secretary of State Frank LaRose, who does not like to be photographed with his eyes wide shut in public meetings in Hillsboro.

To be clear, not one of us had requested an "official absentee ballot application." For the vast majority of the last 41 years, I have voted in person in either Highland County or Adams County. You could, as I told a former mayor years ago, look it up.

I have not opened my "official absentee ballot application," nor do I plan to. I know where and how to vote, thank you very much.

Upon receiving the "official absentee ballot application," I couldn't help but recall a recent story by the alt-left NPR. (By the way, my MacBook Pro dictionary recognizes "alt-right" but not "alt-left." No surprise, there.)

In a June 25 story (https://www.npr.org/sections/coronavirus-live-updates/2020/06/25/883441640/nearly-1-4-billion-in-coronavirus-relief-payments-sent-to-dead-people), NPR reported that nearly $1.4 billion in coronavirus relief payments were sent to dead people.

The Government Accountability Office said the error involved almost 1.1 million checks and direct deposits sent to ineligible Americans. The payments were part of the COVID-19 package passed by Congress in March and known as the CARES Act.

Reasonable minds must wonder how many unrequested "official absentee ballot applications" have been sent to dead people. Without question, we will be assured that there's a distinction between these government mailings.

Signatures must match. (Not true. After decades of writing notes during public meetings, I am fortunate if I sign my name the same way more than 10 times in a row. That's why I always vote in person with a valid ID, including using my state fishing license just for fun one time in Tranquility.)

There will be checks and balances. Sure there will. Just like the $1.4 billion in coronavirus relief payments that were sent to dead people.

Unrequested official absentee ballot applications are the same as absentee voting and are not to be confused with universal vote by mail. Wrong.

The traditional definition of absentee voting involves a voter who is, for some reason, unable to make it to the polls on Election Day. The voter obtains a form to request a ballot, fills that form out with his/her excuse for not being able to vote in person and sends it to the state. The state sends the voter a ballot, and the voter finally returns the ballot with his or her vote.

The official absentee ballot application that we and, no doubt, millions of other Ohioans received, allows voters to receive a mail-in ballot without providing a reason why they need one, much less asking for one.

Voters ought to at least have some skin in the game when exercising their constitutional right to vote.

Universal vote by mail is a great opportunity for fraud.

* * *

Given that the Nov. 3 general election is closer than we may think, and given that letters to the editor of this newspaper in support of some and in opposition to others are inevitable, let's get this out of the way now.

For the record, I edited portions of a recent letter that disparaged someone's political opponent. As I told the author of that letter, I've done this job longer than most people in southern Ohio, and I've learned that political letters to the local newspaper are best when written in support of Candidate A, rather in opposition to Candidate B.

To the letter writer's credit, he understood. That policy is a two-way street, too. I really don't care about the letter after your name. That doesn't impress me.

* * *

In recent news, one Highland County public official has earned some well-deserved recognition. Lord knows, many public officials over the years have begged that I not mention them. No good can come from that, given my own track record of endorsements and other political musings for the last 30 years.

Nonetheless, Highland County Prosecuting Attorney Anneka Collins does deserve some credit for standing on principle.

First, her office worked diligently to have approximately $30,000 in taxes paid by a North Shore Drive business. That issue was addressed in half-arsed fashion a decade ago, to no avail. Counselor Collins corrected this. Thank you.

Last week, county commissioner Terry Britton pointed out that State Rep. Shane Wilkin said the State Controlling Board had approved the release of $175 million to help local communities with the cost of COVID-19 pandemic-related expenses, including $501,166.42 for Highland County. This was in support of unbudgeted administrative leave funds for the county engineer's office.

"I'm just too conservative to vote yes for this," Collins said. "If you guys want to vote, we can argue this all day long. I'm not going to change my position."

Collins said that as long as the prosecutor's office isn't at a shortfall, "I'm not asking for any more money, because that money can be used somewhere else."

Indeed.

Other counties have considered small businesses that are hurting before adding to the coffers of already bloated government budgets, whose salaries, wages and benefits far exceed the family per-capita incomes in Highland County.

Four Republicans voted to shake the federal "free-money tree": all three commissioners and Highland County Auditor Bill Fawley. (What is the federal deficit these days, anyway? Republicans used to care about that.)

Highland County has one public servant working hard to recoup $30,000 and four others who are spending twice that before the first check is cashed.

Good for Anneka Collins for speaking truth to power.

Rory Ryan is publisher and owner of The Highland County Press, Highland County's only locally owned and operated newspaper.

See the rest here:

With Republicans like these, who needs Ds? - The Highland County Press

Resident Evil survival horror TV series is coming to Netflix and fans think it's the next Stranger Things – The Sun

RESIDENT Evil fans will be pleased to hear a new TV series based on the video game franchise is coming to Netflix.

The streaming platform has revealed some details about the live-action adaptation.

The @NXOnNetflix account tweeted: "When the Wesker kids move to New Raccoon City, the secrets they uncover might just be the end of everything.

"Resident Evil, a new live action series based on Capcoms legendary survival horror franchise, is coming to Netflix."

For those who don't know, the Wesker kids referred to are the daughters of the main villain from the first five video games.

He's known as Albert Wesker.


Netflix added: "The 8 x 1 hour episode season will be helmed by Andrew Dabb (Supernatural), Bronwen Hughes (The Walking Dead, The Journey Is the Destination) will direct the first two episodes."

The first episode will be called "Welcome to Raccoon City".

According to Deadline, the series will feature two timelines.

The first timeline will reportedly focus on the two fourteen-year-old Wesker sisters who move into a manufactured, corporate town.

This is where they learn a potentially world destroying secret.

The second timeline will apparently feature a virus stricken world with a dwindling human population, where the sisters are separated.

With Resident Evil fans already tweeting excitedly, it is possible that the series could become as popular as Stranger Things.

We don't have a date for when the series will air but seeing as the first episode is written, we can hope they'll start filming soon.



In other news, Disney fans are fuming after the entertainment giant announced its live-action remake of Mulan will cost up to £26 to stream in the UK.

Gamers will have to pay as much as £449 for the PlayStation 5, a new leak suggests.

And, Disney recently released a trailer for its latest Star Wars TV series, "The Bad Batch".


Read more:

Resident Evil survival horror TV series is coming to Netflix and fans think it's the next Stranger Things - The Sun

Checking In With the Newsletter Economy – New York Magazine


Over the past few years, online-publishing platforms have made it easy for users to charge a subscription fee for newsletters. As Facebook, Google, and private equity have laid waste to print media nationwide, these platforms have given rise to a new publishing economy, in which any writer with a dedicated following might be able to make a living. Of all the platforms out there, Substack, launched in 2017, has become the preferred tool for writers striking out on their own. According to the company, more than 100,000 subscribers now pay for at least one newsletter, and the platform's top users collect hundreds of thousands of dollars in revenue, which, in some cases, amounts to more than they might earn as staff writers at legacy publications.

Substack collects a 10 percent fee from all subscriptions, which allows it to maintain its one sacred oath: no advertisements. Like traditional media, publishing platforms have been crushed under the pressure advertisers put on traffic expectations, and Substack sees its commitment to remaining ad-free as insurance against venture capitalists looking to bulk up or scrap publishers for parts. But that doesn't mean Substack hasn't gotten attention from investors. The company raised $15.3 million in funding from Andreessen Horowitz last year, some of which has been used to provide fellowships and sizable advances to writers, and the company is also considering using some of the money to provide its users with legal and editorial guidance.
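
For a rough sense of the arithmetic behind that fee, here is a minimal sketch in Python; the subscriber count and price are purely illustrative, not figures from the article, and payment-processing costs are ignored.

# Illustrative only: a newsletter's annual take under a flat 10 percent
# platform fee (the figure cited for Substack above).
SUBSTACK_FEE = 0.10  # the platform's cut of every subscription dollar
def annual_net(subscribers: int, monthly_price: float, fee: float = SUBSTACK_FEE) -> float:
    """Gross yearly subscription revenue minus the platform's fee."""
    return subscribers * monthly_price * 12 * (1 - fee)
# Hypothetical example: 1,000 readers paying $5 a month.
print(annual_net(1_000, 5.0))  # $60,000 gross -> 54000.0 after the 10% fee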

The newsletter trend is bigger than independent journalists. Print veterans like Graydon Carter and Jonah Goldberg have styled their new publications, staffed with editors and funded by investments from private equity, as newsletters. And some Substack users are beginning to join forces, bundling their subscriptions at a discount, to offer their readers something that resembles a traditional publication (and that, perhaps, can bring together writers across the ideological spectrum). The question may no longer be whether readers are willing to pay for hyper-focused newsletters, but how many are willing to do so. Substack CEO Chris Best thinks the appetite is great.

"We've developed this habit of outsourcing everything we're reading to our Facebook and Twitter feeds. Paying for writers that you trust is a way to take back control," Best said. "People are ready to take back their mind."

National Review editor Jonah Goldberg and Weekly Standard alum Stephen Hayes founded this conservative newsletter, which they describe as a center-right Atlantic.

For: Your uncle who rails against Trump but probably won't vote for Biden. Launch: October 2019. Cost: $10 a month. What you get: Regular columns from Goldberg, Hayes, David French, and a collection of editors and writers drawn from The Weekly Standard, The Bulwark, and various conservative think tanks.

Heather Cox Richardson, a professor at Boston College, contextualizes today's news with American history.

For: Ken Burns fans. Launch: November 2019. Cost: $5 a month. What you get: Richardson's newsletter started as a regular Facebook post in which she offered historical context for Trump's Ukraine scandal and subsequent impeachment. Now, readers wake to a roundup of the previous day's news delivered in the clear-eyed language of an email from your smartest friend, who happens to have a Ph.D. from Harvard and has spent decades researching American history and politics.

Matt Taibbi moves his alt-left blogging from Rolling Stone to a sort-of-weekly newsletter.

For: Gonzo nostalgics and Russiagate skeptics. Launch: April 2020. Cost: $5 a month. What you get: Taibbi's irreverent columns are just as critical of corporate greed and the barbarism of Trump's GOP as they are of anti-racism initiatives and cancel culture, which, according to Taibbi, are symptoms of the New Left's tendency to divide people between victimizers and victims (he adds an asterisk as a sarcastic scarlet letter to the name of anyone who has been canceled).

Bill Bishop, who formerly wrote newsletters for Axios and the New York Times, strikes out on his own with a daily newsletter on all things China.

For: Day traders and retired spooks. Launch: October 2017. Cost: $15 a month. What you get: Bishop breaks down the day's essential eight most-important stories, from a Xi Jinping inspection tour, to Huawei, to TikTok. Bishop knows when to be skeptical of China's propaganda and when to be skeptical of propaganda about China.

Former New York Magazine columnist Andrew Sullivan returns to blogging.

For: People who despise Trump but think wokeness is also a threat to the American experiment. Launch: July 2020. Cost: $5 a month. What you get: A cosmopolitan conservative's analysis of race, gender, sexuality, religion, and class, sprinkled with deconstructions of Trumpism.

A former staff writer at The New Republic, Emily Atkin writes an impassioned, deeply reported newsletter on climate change four times a week.

For: People who are the right amount of angry (outraged) about climate change. Launch: September 2019. Cost: $8 a month. What you get: Atkin breaks news, as she did recently when she discovered Democratic congressman Tim Ryan had taken $10,000 from a corrupt fossil-fuel company despite signing the No Fossil Fuel Money pledge. Atkin's work has earned her about 2,500 subscribers and, according to The New York Times, she expects to gross $175,000 this year.

Boston-based writer and reporter Luke O'Neil.

For: Your friend who laments the death of alt-weeklies. Cost: $6.65 a month. Launch: July 2018. What you get: O'Neil packages his columns in stream-of-consciousness reports that detail the many reasons reasonable people have to be angry right now. His reports are filled with accompanying exhibits and tweets and he frequently includes original interviews and guest posts. O'Neil has said he is on track to make $100,000 annually.

When G/O Media folded the news site Splinter, eight journalists started a WordPress site. In March, they switched to Substack and in July launched a paid subscription.

For: Your friend who laments the death of Splinter. Cost: $8 a month. Launch: March 2020. What you get: A leftist politics newsletter that's heavy on the labor beat.

ThinkProgress founder Judd Legum reinvents his liberal news and opinion blog as a newsletter.

For: Your cousin who canvasses for Dems. Cost: $6 a month. Launch: July 2018. What you get: Want a comprehensive guide to the Trump Administration's attack on the USPS? Or an investigation into Sarah Palin's Facebook grift? Popular Information has you covered.

Anonymously written analysis on investing, restructuring, and bankruptcies.

For: Anyone looking for a break from GMAT prep. Cost: $49 a month. Launch: November 2016. What you get: Two newsletters a week provide a skim of corporate bankruptcies and disruptions in the country's largest industries and offer analyses on everything from airlines and J. Crew in the time of Covid to WeWork's epic implosion.

Business, technology, and media analysis from Ben Thompson.

For: New York tech investors. Launch: March 2013. Cost: $12 a month. What you get: Thompson's daily columns cover everything from the Big Four, to venture capital, to the future of business and media. Thompson is considered by many to be the godfather of the modern paid-subscription newsletter.

Gender, politics, and whatever else is on her mind that day.

For: Anyone in search of a friend. Launch: March 2013. Cost: $5 a year, minimum. What you get: Like Thompson, Friedman is a newsletter pioneer and has spent the better part of the last decade winning readers over with insightful writing and reporting, reading recommendations, and doodled pie charts.

Magazine legend Graydon Carter's weekly newsletter picks up where he left off at Vanity Fair, albeit with a smaller platform and a tighter budget.

For: Globe-trotting boomers and bankers who buy art. Launch: July 2019. Cost: $9.99 a month. What you get: Culture, crime, travel, and politics from magazine luminaries (including many Vanity Fair alums). The newsletter is a weekly window into Carter's proclivities: vacations on Lake Como, designer suit recommendations. It all smacks of an era when magazine editors could afford such decadence.

Charlotte Ledger: North Carolina business news, by Tony Mecia. The Dog and Pony Show: Tennessee news and gossip, by Cari Wade Gervin. Street Justice: Washington, D.C., newsletter, by Gordon Chaffin. Importantville: Indiana politics, by Adam Wren.

ParentData: Evidence-based parenting advice, by Emily Oster. Hola Papi!: Witty LGBTQ advice column, by John Paul Brammer. BIG: Thinking on monopolies, by Matt Stoller. Margins: The intersection of business and technology, by Ranjan Roy and Can Duruk.

*This article appears in the August 31, 2020, issue of New York Magazine.


Continue reading here:

Checking In With the Newsletter Economy - New York Magazine

The Rise of DeFi, Potential and Risks – FinSMEs

The world was never the same after 2009, when Satoshi Nakamoto revealed his plans for a digital coin that would be based on everything the current currencies lacked.

Bitcoin today has fundamentally changed how financial systems work. Based on an innovative record-keeping technology, Bitcoin offered a cheap, swift and highly secure means of transferring monetary value from one person to another.

However great it is, Bitcoin was never intended to act as a wholesale replacement for the current economic and financial setup, just as a digital currency. Today, people all over the world have realized that blockchain, the technology behind all cryptocurrencies, has a lot more potential in day-to-day finances and money matters.

Among others, Decentralized Finance, or DeFi as it is known, is a phenomenon in which the power of decentralized networks is leveraged to bring traditional financial products into the distributed-data age. The networks create a trustless environment that does not require any middleman such as banks or other financial institutions.

In the last couple of years, DeFi has risen to prominence. The concept started during the heyday of the last crypto rush, when thousands of platforms were launched to serve a variety of purposes. One of these services was creating decentralized banks and financial firms. Today, people have the option of obtaining loans, creating savings accounts and trading using DeFi services.
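
To make the lending idea concrete, here is a deliberately simplified Python sketch of the kind of rule a DeFi lending contract enforces in place of a bank's loan officer; the 150 percent collateral requirement and every name in it are invented for illustration and describe no particular protocol.

# Toy model of over-collateralized DeFi lending: the "contract" is just code
# checking a collateral ratio, with no bank or other middleman involved.
COLLATERAL_RATIO = 1.5  # hypothetical: lock 150 in collateral per 100 borrowed
class ToyLendingPool:
    def __init__(self):
        self.deposits = {}  # saver -> amount supplied (a "savings account")
        self.loans = {}     # borrower -> (amount borrowed, collateral locked)
    def deposit(self, saver, amount):
        """Supply funds to the pool that borrowers can draw on."""
        self.deposits[saver] = self.deposits.get(saver, 0.0) + amount
    def borrow(self, borrower, amount, collateral):
        """Grant the loan only if enough collateral is locked; no credit check."""
        if collateral < amount * COLLATERAL_RATIO:
            return False
        self.loans[borrower] = (amount, collateral)
        return True
pool = ToyLendingPool()
pool.deposit("alice", 1_000.0)
print(pool.borrow("bob", 100.0, collateral=120.0))  # False: under-collateralized
print(pool.borrow("bob", 100.0, collateral=160.0))  # True: 160 >= 150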

DeFi has become extremely popular due to the advantages it offers: a trustless environment with no intermediaries, low transaction costs, and open access to lending, savings and trading.

DeFi seems to be the best thing that can happen to people in general. However, it is not without its risks. With hundreds of DeFi platforms, a person may not be able to find the right services on the platform he or she is registered on. The tokens used are volatile in price by nature, leading to the possibility that a token held, bought or lent will lose value.

In the end, while DeFi has risks, it also offers enormous benefits to the public.

Go here to read the rest:

The Rise of DeFi, Potential and Risks - FinSMEs

Bitcoin Mining Market 2019 Break Down by Top Companies, Countries, Applications, Challenges, Opportunities and Forecast 2026 BTC.com, Antpool, Slush…

CMFE Insights has published a new report, titled Global Bitcoin mining Market, adding to its extensive repository of reports. The report discusses the competitive drivers propelling the growth of the business and the challenges facing the market at large, and covers the crucial trends shaping the Global Bitcoin mining Market, which is expected to grow at a significant CAGR over the forecast period 2020-2027.

Bitcoin is a cryptocurrency invented in 2008 by an unknown person or group of people using the name Satoshi Nakamoto and started in 2009 when its implementation was released as open-source software.

The Global Bitcoin mining Market, an analytical study, was recently published by CMFE Insights. The statistical data has been scrutinized using exploratory techniques such as primary and secondary research. The study considers various applicable sales strategies that can help improve business performance, examines the demand structure that is fueling the industry's growth, and focuses on significant restraining factors to give a clear idea of the threats and challenges involved in running a business.

Request a Sample Copy of this Report At: https://www.cmfeinsights.com/request-sample.php?id=119729

Top Key Players Profiled: BTC.com, Antpool, Slush, F2pool

In the research study, North America, Europe, Asia-Pacific, Latin America and the Middle East & Africa have been identified as the notable regional markets for the Global Bitcoin mining Market. The regional analysis is based on vital market verticals such as industry volume, product pricing, manufacturing volume, demand and supply dynamics, revenue, and growth rate in each of these regions.

It provides futuristic market prospects for the upcoming years. The report contains all the necessary variables on the most recent innovations, such as Porter's five forces analysis and advanced profiles of elite industry participants. The report additionally surveys the micro and macro factors that will be challenging for new entrants to the Global Bitcoin mining Market and for those already in the market, along with a systematic value chain analysis.

Ask for Up to 40% Discount: https://www.cmfeinsights.com/ask-for-discount.php?id=119729

Table of Content:

Global Bitcoin mining Market Research Report 2020-2026

Chapter 1: Industry Overview

Chapter 2: Global Bitcoin mining Market International and China Market Analysis

Chapter 3: Environment Analysis of Global Bitcoin mining Market.

Chapter 4: Analysis of Revenue by Classifications

Chapter 5: Analysis of Revenue by Regions and Applications

Chapter 6: Analysis of Global Bitcoin mining Market Revenue Market Status.

Chapter 7: Analysis of Market Key Players

Chapter 8: Sales Price and Gross Margin Analysis of Global Bitcoin mining Market.

Chapter 9: Continue To TOC

To Get More Information, Enquire At: https://www.cmfeinsights.com/enquiry-before-buying.php?id=119729

See the article here:

Bitcoin Mining Market 2019 Break Down by Top Companies, Countries, Applications, Challenges, Opportunities and Forecast 2026 BTC.com, Antpool, Slush...

These are the best films and TV shows to watch on Netflix in September – Milton Keynes Citizen

There is a wide range of new titles set for release this month. (Shutterstock)

If the unseasonable weather kept you inside more than you anticipated in August, causing you to binge-watch the entirety of Netflix's collection, there's no need to fear.

Throughout the month of September the streaming service is releasing a whole new batch of films and series to keep you fully entertained as the summer creeps to a close.

Here are the top titles set for release this month.

The Duchess

UK Netflix release date: Friday 11 September

Stand-up comedian and occasional 8 out of 10 Cats panelist Katherine Ryan has an exciting new sitcom set for release this month. The Duchess follows Ryan, who plays an exaggerated version of herself as a flawed but loving single mum. Katherine decides to have a second child, but there is one issue: she's not in a relationship. The series follows her as she tries to find a way to make this dream a reality, from considering sperm donors to asking her ex.

The Devil All the Time

UK Netflix release date: Wednesday 16 September

The Devil All the Time is an American psychological thriller film based on the novel of the same name by Donald Ray Pollock. It follows Arvin Russell (played by Tom Holland) as he tries to protect his loved ones in a town filled with sinister characters, such as a suspicious preacher played by Robert Pattinson, an ominous couple played by Jason Clarke and Riley Keough, and a corrupt sheriff played by Sebastian Stan. Also starring Bill Skarsgård - known for his role as Pennywise in Stephen King's IT - and Mia Wasikowska (Jane Eyre).

Hope Frozen: A Quest to Live Twice (2020)

UK Netflix release date: Tuesday 15 September

This emotional documentary follows a Thai Buddhist family as they make the unconventional choice to have their terminally ill two-year-old daughter cryogenically frozen in the hope that she will be resurrected and restored to health in the future. The documentary provides rare insight not only into grief, but also into the largely undocumented scientific fringe field of cryonics, a subject that has been criticised by the wider scientific community.

Bookmarks

UK Netflix release date: Tuesday 1 September

This new Netflix kids' show tells children's stories with an angle on race and features several big names, such as Lupita Nyong'o (Us), Caleb McLaughlin (Stranger Things), and Tiffany Haddish (The Lego Movie 2). Bookmarks tells stories specifically from black points of view, covering themes of identity, respect, justice and action.

I'm Thinking of Ending Things

UK Netflix release date: Friday 4 September

Charlie Kaufman's new psychological horror film, based on the 2016 novel of the same name by Iain Reid, is quite an unnerving watch. The story captures the doubts and anxieties of a nervous woman meeting her boyfriend's rather strange parents for the first time. Starring Jessie Buckley, Toni Collette, Jesse Plemons and David Thewlis.

Enola Holmes

UK Netflix release date: Wednesday 23 September

Written by Jack Thorne (His Dark Materials) and directed by Harry Bradbeer (Fleabag), Enola Holmes puts a feminist spin on the classic Sherlock Holmes story by focusing on the tales of Sherlock and Mycroft's lesser-known sister Enola, played by Millie Bobby Brown (Stranger Things). The film follows the lively Enola and friends as she tries to find her newly missing mother (Helena Bonham Carter). Henry Cavill and Sam Claflin star as Sherlock and Mycroft.

Zodiac

UK Netflix release date: Tuesday 1 September

This crime thriller from director David Fincher is based on a true story and has a star-studded line-up, including Jake Gyllenhaal, Mark Ruffalo and Robert Downey Jr. Zodiac follows a crime reporter, a political cartoonist, and a couple of cops as they work to investigate San Francisco's infamous Zodiac Killer, a serial murderer operating in the late '60s and early '70s who is thought to have killed over 20 people and who remains unidentified.

The full list of releases coming to Netflix in September:

A Beautiful Mind (2001); Bookmarks; Borgen, seasons 1-3; Demolition Man (1993); Indecent Proposal (1993); The Sum of All Fears (2002); Willy Wonka and the Chocolate Factory (1971); Zodiac (2007)

Chef's Table: BBQ, season 1

Afonso Padilha: Classless; Call the Midwife, season 8; Young Wallander

Away; I'm Thinking of Ending Things

Get Organised With the Home Edit; La Linea: Shadow of Narco; The Social Dilemma; So Much Love to Give

The Duchess; Family Business, season 2

Hope Frozen: A Quest to Live Twice; Michael McIntyre: Showman; Misfits, seasons 1-5

Challenger: The Final Flight; Criminal, season 2; The Devil All the Time

GIMS: On the Record; The Last Word; The School Nurse Files

Jurassic World: Camp Cretaceous; Ratched; Whipped

The School Nurse Files; Sneakerheads

American Murder: The Family Next Door

Continued here:

These are the best films and TV shows to watch on Netflix in September - Milton Keynes Citizen

Beyond Fermi's Paradox VII: What is the Planetarium Hypothesis? – Universe Today

Welcome back to our Fermi Paradox series, where we take a look at possible resolutions to Enrico Fermi's famous question, "Where Is Everybody?" Today, we examine the possibility that we can't see them because they have us all inside a massive simulation!

In 1950, Italian-American physicist Enrico Fermi sat down to lunch with some of his colleagues at the Los Alamos National Laboratory, where he had worked five years prior as part of the Manhattan Project. According to various accounts, the conversation turned to aliens and the recent spate of UFOs. Into this, Fermi issued a statement that would go down in the annals of history: "Where is everybody?"

This became the basis of the Fermi Paradox, which refers to the disparity between high probability estimates for the existence of extraterrestrial intelligence (ETI) and the apparent lack of evidence. Seventy years later, we are still trying to answer that question, which has led to some interesting theories about why we haven't. A particularly mind-bending suggestion comes in the form of the Planetarium Hypothesis!

To break it down, this hypothesis states that the reason we are not seeing aliens is that humanity is in a simulation, and the aliens are the ones running it! In order to ensure that human beings do not become aware of this fact, they ensure that the simulation presents us with a Great Silence whenever we look out and listen to the depths of space.

Given the sheer size of the Universe and its age, the Search for Extraterrestrial Intelligence (SETI) seems like a valid enterprise. Consider the following: there are 200 to 400 billion stars in our galaxy and as many as 2 trillion galaxies in the Universe. Within our galaxy alone, there are an estimated 6 billion Earth-like planets, which, multiplied across those 2 trillion galaxies, means that there could be as many as 12 sextillion (1.2 x 10^22) Earth-like planets in the Universe.

Meanwhile, it took humanity about 4.5 billion years to emerge on Earth, and the Universe has been around for 13.8 billion years. As such, it's not farfetched at all to assume that intelligent life has had countless opportunities to emerge somewhere else in the Universe and plenty of time to evolve. In 1961, American physicist and SETI researcher Dr. Frank Drake illustrated this point during a meeting at the Green Bank Observatory.

In preparation for the meeting, Drake created an equation that summed up the probability of finding ETIs in our galaxy. Thereafter known as the Drake Equation, this probabilistic argument is expressed mathematically as:
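
In the usual notation, it reads:

N = R_{*} \times f_{p} \times n_{e} \times f_{l} \times f_{i} \times f_{c} \times L

Here N is the number of civilizations in our galaxy whose signals we might detect; R_{*} is the average rate of star formation; f_{p} the fraction of stars that host planets; n_{e} the number of potentially habitable planets per such star; f_{l} the fraction of those on which life actually emerges; f_{i} the fraction of life-bearing worlds that go on to develop intelligence; f_{c} the fraction of intelligent species that produce detectable technology; and L the length of time over which such civilizations release detectable signals.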

The purpose of this argument was to summarize the challenges of SETI (i.e. the sheer number of unknowns) and put them into context. At the same time, it demonstrated that the odds of finding ETIs are quite good. Even employing the most conservative estimates for every parameter, the Equation indicates that there should be at least a few ETIs in our galaxy that we could communicate with at any given time.
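
To see how strongly the answer depends on those unknowns, here is a minimal Python sketch; both parameter sets below are arbitrary illustrations, not estimates drawn from Drake, the Green Bank meeting, or this article.

# The Drake Equation is just a product of seven factors, so a few lines of code
# show how widely N swings as the guesses change.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Estimated number of detectable civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L
print(drake(3.0, 1.0, 0.5, 1.0, 0.5, 0.5, 10_000))  # generous guesses: 3750.0
print(drake(1.5, 1.0, 0.2, 0.5, 0.2, 0.2, 1_000))   # modest guesses: about 6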

Moreover, given the age of the Universe itself, there should be many species in our Universe that have evolved to the point where they could explore space and perform feats of engineering that would dwarf anything we can dream of. Which brings us to the Kardashev Scale.

In 1964, Soviet/Russian astrophysicist Nikolai Kardashev proposed that extraterrestrial civilizations could be classified based on the amount of energy they are able to harness. In an essay detailing this idea, titled "Transmission of Information by Extraterrestrial Civilizations," Kardashev proposed a three-tiered scheme (the Kardashev Scale) that stated the following: a Type I civilization is capable of harnessing all the energy available on its home planet, a Type II all the energy output of its host star, and a Type III all the energy of its host galaxy.
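
To make the scale a little more concrete, Carl Sagan later suggested interpolating between the types with a continuous index based on the power P (in watts) a civilization commands:

K = \frac{\log_{10} P - 6}{10}

On this version of the scale, roughly 10^{16} W corresponds to Type I, 10^{26} W to Type II and 10^{36} W to Type III, while humanity, at roughly 10^{13} W, currently rates at about 0.7.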

From the standpoint of SETI, civilizations that fall into any of these three categories could be identified in a number of ways. For example, a Type I civilization is likely to have grown to occupy its entire planet and colonize Low Earth Orbit (LEO) with satellites and space stations. This cloud of artificial objects (aka. Clarke Belts) could be visible from the way it reflects the star's light during planetary transits.

A Type II civilization, according to Kardashev, is one that would be capable of building a megastructure around their star (i.e. a Dyson Sphere). This would allow the civilization to harness all of the energy produced by its sun, as well as multiplying the amount of habitable space in its home system exponentially. As Dyson himself stated in his original paper, these megastructures could be spotted by looking for their infrared signatures.
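
A back-of-the-envelope estimate (not a figure taken from Dyson's paper) shows why that signature lands in the infrared: a shell of radius R of about 1 AU (1.5 x 10^11 m) must re-radiate the Sun's entire output, roughly 3.8 x 10^26 W, as waste heat, so treating its outer surface as a blackbody gives

T \approx \left( \frac{L_{\odot}}{4 \pi \sigma R^{2}} \right)^{1/4} \approx \text{a few hundred kelvin},

where \sigma is the Stefan-Boltzmann constant. By Wien's law, emission at a few hundred kelvin peaks near 7-10 \mu m, squarely in the mid-infrared.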

As for Type III civilizations, it is possible that a civilization capable of harnessing all the energy of its galaxy would do so by building an apparatus that encloses it. Or, it's possible they would choose to enclose just a part of it, around its core region perhaps, and the supermassive black hole (SMBH) at its center. Regardless, it stands to reason that such an advanced civilization would be impossible not to notice.

Hence why Fermi's famous question endures. To date, most attempts to resolve the Fermi Paradox focus on how aliens could exist but be unable to communicate with us. In contrast, the Planetarium Hypothesis suggests that they are deliberately not communicating with us, and even taking great pains to hide their existence. Their method of choice consists of keeping us in a simulated reality so that we are blind to their existence.

In 2001, famed science fiction author and mathematician/engineer Stephen Baxter wrote a seminal essay titled "The Planetarium Hypothesis: A Resolution of the Fermi Paradox." In response to Fermi's question, Baxter postulated that humanity's astronomical observations are actually an illusion created by a Type III civilization that is keeping humanity in a giant planetarium. Or as he put it:

"A possible resolution to the Fermi Paradox is that we are living in an artificial universe, perhaps a form of virtual-reality 'planetarium', designed to give us the illusion that the universe is empty. Quantum-physical and thermo-dynamic considerations inform estimates of the energy required to generate such simulations of varying sizes and quality."

This concept is similar to the Simulation Hypothesis, a theory originally put forth by Nick Bostrom of the Oxford Future of Humanity Institute (FHI). In a 2001 paper, titled "Are You Living In A Computer Simulation?", he addressed the idea that what humanity considers the observable Universe is actually a massive virtual environment. This idea, where the very nature of reality is questioned, has deep roots in many philosophical traditions.

In this case, however, it is suggested that the purpose of keeping humanity in a simulation is to protect us, our hosts, and perhaps other species from the dangers associated with contact. Using human history as a template, we see countless examples of how two cultures meeting for the first time can easily end in war, conquest, slavery, and genocide.

However, there are limits. According to Baxter's original paper, it would be well within the abilities of a Type III civilization to contain our present civilization within a perfect simulation. But a single culture that occupies a space measuring ~100 light-years in diameter would exceed the capacities of any conceivable simulated reality.

In this respect, it would be within the Type III civilization's best interests to create a simulation that would contain no evidence of ETIs while also placing limits on our ability to expand out into the Universe. This could be done by including physics models that limit humanity's ability to leave Earth (i.e. its high escape velocity) and our ability to explore and colonize space (the limits imposed by Special Relativity).

Naturally, the idea that we're living in a planetarium created by advanced aliens is difficult to test. However, multiple studies have been conducted on the Simulation Hypothesis that have implications for the Planetarium Hypothesis. For instance, Prof. David Kipping of Columbia University and the Flatiron Institute's Center for Computational Astrophysics recently published a study on the very subject.

In this study, titled "A Bayesian Approach to the Simulation Argument," Kipping conducted a series of statistical calculations designed to test the likelihood and the uncertainty associated with Bostrom's hypothesis. In sum, Kipping argued that a posthuman civilization with the ability to generate such simulations would create far more than just one, which indicates a high probability that we are not in one.

At the same time, he indicated that the odds that we could be in one of many are close to being even:

"Using Bayesian model averaging, it is shown that the probability that we are sims is in fact less than 50%, tending towards that value in the limit of an infinite number of simulations. This result is broadly indifferent as to whether one conditions upon the fact that humanity has not yet birthed such simulations, or ignore it. As argued elsewhere, it is found that if humanity does start producing such simulations, then this would radically shift the odds and make it very probably we are in fact simulated."
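
A toy version of that model averaging (a simplification for illustration, not Kipping's actual calculation) makes the result easy to see: give equal prior weight to a hypothesis under which such simulations are never run, so the probability of being simulated is zero, and to one under which a base reality eventually hosts N simulated realities, so a randomly chosen observer is simulated with probability N/(N+1). Averaging the two gives

P(\text{sim}) = \frac{1}{2} \cdot 0 + \frac{1}{2} \cdot \frac{N}{N+1} = \frac{N}{2(N+1)} < \frac{1}{2},

which stays below one half but creeps toward it as N grows, and which would jump above one half the moment the first hypothesis is ruled out, for instance by humanity running such simulations itself.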

Thanks to endorsements by public figures like Elon Musk, who once said there's a "billion to one chance" we're living in "base reality," the concept has gained mainstream attention and acceptance. At the same time, though, both the Simulation and Planetarium Hypotheses have their share of detractors and counter-studies that question the merits of this scenario.

For starters, multiple researchers have questioned whether a Universe-level simulation is even possible given our understanding of the laws of nature. In particular, some researchers have used our own failures with quantum Monte Carlo (QMC) simulations to argue that future humans (or an ETI) would not be able to generate a reality that is accurate right down to the quantum level.

Others have criticized the Simulation Hypothesis based on Ockham's Razor and what they see as the computational impossibility of simulating something as huge as our Universe down to the granular level. Then there are arguments that use recent advancements in lattice Quantum Chromodynamics (QCD) to show how a simulated environment will inevitably be finite and vulnerable to discovery.

Of course, these criticisms can be countered by arguing that it is impossible to disprove the simulation theory based on physical arguments when the very physics we are referencing could be nothing more than the result of the simulation. But this counter-argument only reinforces the issue of how the Simulation Hypothesis is not falsifiable. In short, it can neither be proven nor disproven, so what's the point of debating it?

However, there are arguments concerning the Planetarium Hypothesis that are testable and can therefore be treated separately. For example, there are those who have argued that the existence of a Type III Kardashev civilization rests on a fundamentally flawed assumption. In short, the scenario assumes that the evolutionary path of advanced civilizations is based on expansion rather than optimization.

In a 2008 study, "Against the Empire," Serbian astronomer, astrophysicist, and philosopher Milan Cirkovic argued the opposite. In short, he tested two models for determining the behaviors of a postbiological and technologically advanced civilization, the Empire-State and the City-State. In the end, he argued that advanced species would prefer to remain in spatially compact, optimized environments rather than spread outwards.

Some examples of this include the Dyson Swarm and the Matrioshka Brain, two variations on Dyson's famous sphere. Whereas the former consists of smaller objects interlinked in orbits around a star, the latter consists of layers of computing material (computronium) powered by the star itself. The civilization responsible for building it could live on the many islands in space, or live out their existence as simulations within the giant brain.

At the end of the day, a species choosing to live like this would have very little incentive to venture out into the Universe and attempt to colonize other worlds or interfere with the development of other species. Nor would they consider other species a threat since they would be inclined to believe the evolutionary pathway for other intelligent life would be similar to their own i.e. in favor of optimization.

Unfortunately, such arguments require that evidence of ETIs be found, such as the heat signatures produced by their megastructures, in order to be considered testable. At this time, we have a hard time constraining what would be considered a sign of intelligent life and its activity (aka. technosignatures) because we know of only one species capable of producing them (simply put, us!).

Nevertheless, theories like the Planetarium Hypothesis remain fascinating food for thought as we continue to probe the Universe looking for signs of intelligent life. They also help refine the search by suggesting things to be on the lookout for. In the meantime, all we can do is keep looking, listening, and wondering if anyone is out there.

We have written many interesting articles about the Fermi Paradox, the Drake Equation, and the Search for Extraterrestrial Intelligence (SETI) here at Universe Today.

Here's Where Are All the Aliens? The Fermi Paradox, Where Are The Aliens? How The Great Filter Could Affect Tech Advances In Space, Why Finding Alien Life Would Be Bad. The Great Filter, Where Are All The Alien Robots?, How Could We Find Aliens? The Search for Extraterrestrial Intelligence (SETI), and Fraser and John Michael Godier Debate the Fermi Paradox.

Want to calculate the number of extraterrestrial species in our galaxy? Head on over to the Alien Civilization Calculator!

And be sure to check out the rest of our Beyond Fermi's Paradox series:

Astronomy Cast has some interesting episodes on the subject. Here's Episode 24: The Fermi Paradox: Where Are All the Aliens?, Episode 110: The Search for Extraterrestrial Intelligence, Episode 168: Enrico Fermi, and Episode 273: Solutions to the Fermi Paradox.


More here:

Beyond Fermi's Paradox VII: What is the Planetarium Hypothesis? - Universe Today

Bayern and Inter involved in "Perisic Poker" – Bulinews.com

By Peter Vice

A reliable Italian footballing insider reports that Bayern München and Inter are currently locked in a standoff over Ivan Perisic's worth.

While FC Bayern München declined to exercise the buyout options on their three 2019/20 loanees, reports now surface that the FCB are in talks to re-acquire one of the trio sent back to their parent clubs on August 31st. In addition to allowing attacking midfielder Philippe Coutinho and right fullback Alvaro Odriozola to return to FC Barcelona and Real Madrid respectively, the FCB showed little interest in exercising a €20 million buyout clause for Inter Milan's Ivan Perisic.

Italian football insider Fabrizio Romano, as reported on German site Transfermarkt.de, revealed that Bayern chair Karl-Heinz Rummenigge has led an effort to permanently acquire the 31-year-old Croatian winger. Concrete details regarding the current negotiations are available. Apparently, Bayern initiated proceedings with a €12 million offer. Inter seeks to counter with a €15 million asking price.

Bayern's interest in retaining Perisic's services surely has much to do with his incredibly strong performances in the final stages of this year's Champions League. His stellar play on the left flank played an enormous role in opening the floodgates during the historic 8-2 romp of Barcelona in the quarterfinals. A solid evening of work in the semis also proved immensely helpful.

In perhaps a deliberate prelude to the poker one now sees playing out, Perisic was the one player dropped by manager Hans Dieter Flick in the UCL final. Kingsley Coman replaced him on the left, eventually scoring the decisive goal that would lead Bayern to its first treble in seven years.

The ongoing back-and-forth should prove interesting. Bayern remain deep enough at Perisic's position that they can resist attempts by Inter to bump the price up. It remains to be seen how high the German giants can be driven.

Original post:

Bayern and Inter involved in "Perisic Poker" - Bulinews.com

Ripstone Brings Poker Back to Video Game Consoles – Beat The Fish

Ripstone has announced the launch of its upcoming game, Poker Club, which will be available on current and next-gen consoles.

It's not often that you see poker made available on video game consoles. Tournaments and cash games are usually reserved for real-money poker sites and mobile apps, but indie developer Ripstone will be bringing poker back to the console gaming market with its upcoming release, Poker Club.

The studio, known for creating video games based on pool and chess, has announced its plans to launch the most immersive poker game ever made for the following consoles:

To create an authentic experience, the game will feature a first-person perspective but players will also be able to switch between camera modes including top-down viewing and cinematic angles.

While Poker Club will be made available on current consoles, it is on the next-gen machines where this game will really shine. The game will feature Ultra HD 4K visuals and make use of the PlayStation 5's advanced haptics to make for a more realistic feel.

"Whereas before in game development you had to pick and choose your battles: are we going to target high frame rates / top-end visual features / high rendering resolution / etc. With the power of these new consoles, we can choose everything, it's a no-compromise experience," said Phil Gaskell, Poker Club's director.

Poker Club players will be able to take part in online Texas Hold'em ring games and tournaments with plenty of variations available. Single table, multi-table, freezeouts, turbos and bounties are all available in the tournament selection, and private games are fully customizable.

In addition to playing in individual cash games and tournaments, you take on the challenge of building an entire career as a professional poker player. Similar to sports games like FIFA and NBA2K, you can enter Career Mode where you create your own character that makes their way from modest home games to worldwide tournaments.

There will be 7 different settings for the games in Poker Club, ranging from grungy basements to high-class venues. As players progress in their poker career, they will participate in games across these locations to signify their rise to fame.

Take a look at the teaser trailer below:

Since you're playing Poker Club on video game consoles, the money that you're playing with is no more valuable than Zelda's rupees or Animal Crossing's bells. You won't be able to wager with real money on Poker Club, but the game still offers a fun and exciting experience.

No official release date has been announced for Ripstone's Poker Club, but it's expected to be available later this year. If you're really eager to play poker on your console in the meantime, Pure Hold'em (2015) is highly recommended for PS4 and Xbox One players while PokerStars VR is a popular game among PC players (if you own an Oculus).

View original post here:

Ripstone Brings Poker Back to Video Game Consoles - Beat The Fish

ARE THERE CASINO OR POKER TOURNAMENTS ON THE ISLE OF WIGHT? – Island Echo

Are you planning on visiting the Isle of Wight? Are you a fan of playing poker or spending a night at the casino? There are lots of fun activities available for tourists venturing to the Isle of Wight. Indeed, it is a popular tourist destination. It offers everything from the famous Isle of Wight Festival to beautiful views from the coastal path around the island. But if you are more interested in winning some cash or just having fun with your friends, you will want to know if there is a casino on the Isle of Wight. In addition, you might be curious whether there are any tournaments you can join.

Clubs and Tournaments for Poker Players

For many years, people have believed that the Isle of Wight was going to get its own casino. Certainly, this island on the south coast of England offers many activities. But the area does not currently have a casino for residents and tourists to enjoy. Those who enjoy the casino experience often have to go to Portsmouth for the 24-hour casino there.

But all hope is not lost if you love card games and the casino experience. For example, there are local clubs you can join and meet new people. This includes the Wheatsheaf Hotel Poker Club. This is a regular club that meets and allows you to make friends and enjoy a game of poker in a relaxed setting. There are also poker nights held at the community club. Again, this allows for a friendly game and the opportunity to enjoy a drink and make new friends.

Times are Changing

It is important to note that times are changing and casinos on land are not as popular as they once were. Instead, more people are going online to enjoy their favourite games. After all, a lot of websites offer a poker bonus for joining and this can be an incentive for someone on the fence. Indeed, the changing environment in the world and the need for social distancing mean that fewer people are going to be attending clubs for the near future. Any tournaments that were being held in person are likely going to be cancelled, or already have been, due to government advice.

The advantage of online casinos is that you are able to play on the computer from the comfort of your own home. It is likely that tournaments will be held on the internet instead. This is an alternative way to enjoy playing poker and other popular casino games.

Due to current circumstances, it is unlikely that the Isle of Wight will see its first casino anytime soon. However, hopefully, the local clubs on the island will be able to restart sometime soon. This is likely to happen further down the line and with new rules regarding social distancing. In the meantime, online casinos can act as a way for gamblers to enjoy themselves and practice their skills.

View original post here:

ARE THERE CASINO OR POKER TOURNAMENTS ON THE ISLE OF WIGHT? - Island Echo

Benjamin Pollak Latest Player to Join Team partypoker – PokerNews.com

September 02, 2020 by Will Shillibier

partypoker have announced that yet another poker superstar will join their ranks, with Frenchman Benjamin Pollak the latest member of Team partypoker.

Pollak is the latest player to join Team partypoker after Kevin Hart in April, and will join a stellar line-up of Sponsored Pros including Roberto Romanello, Josip Simunic, Dzmitry Urbanovich, Ludovic Geilich and Day Kotoviezy.

Stay up to date with everything going on in the World Poker Tour World Online Championships at the PokerNews Reporting Hub

The former WSOP Main Event final tablist will join the site in the middle of not only the WPT World Online Championships, but also the partypoker EU network POWERFEST, which runs until Wednesday, September 23rd.

"I'm thrilled and very excited to join the partypoker team," said Pollak. "They've supported players from the beginning and see the game as I do. I couldn't be happier to support them and be part of this great adventure."

Pollak will play under the name 'Magicdeal88' on partypoker, and is looking forward to taking to the tables as soon as possible. The partypoker EU POWERFEST will pay out a minimum guarantee of €5 million in France and €4.5 million in Spain. Just like with the POWERFEST on .com, there will be High Roller, Medium and Low tiers for players to partake in.

Pollak burst onto the global poker scene with a third place finish at the 2017 WSOP Main Event, which landed him a $3.5 million score. Pollak began playing poker in 2006 and then confidently progressed through the stakes, stating: "I immediately felt that I was able to go far. With hindsight, it may be because I had spent 10 years in magic and I was therefore used to cards and bluffing. Poker was a revelation. I felt it was made for me."

With almost $13m in lifetime earnings, Pollak sits second on the French all-time money list behind former partypoker Ambassador Bertrand Grospellier. Pollak says he is keen to remind people that poker remains as fun as when he first started playing back in 2006, before swiftly progressing through the stakes.


Find out more about the HUGE partypoker POWERFEST in Spain or France by heading to their website

In order to play in the WPT WOC you need to sign up for a partypoker account. Download partypoker via PokerNews and receive up to $30 worth of SPINS jackpot sit & go tickets when you make your first deposit.

Deposit $10 to receive $10 worth of free play, made up of:

Up your initial deposit to $20 and $30 worth of free play is yours, made up of:

Excerpt from:

Benjamin Pollak Latest Player to Join Team partypoker - PokerNews.com