Category Archives: Singularity

Core i5-12600K Shows Strong Lead Over Ryzen 5 5600X In Ashes of the Singularity – Tom’s Hardware

Posted: October 17, 2021 at 5:30 pm

A user going by the call sign "foxed.in" has started to test Intel's Core i5-12600K "Alder Lake" processor with the Ashes of the Singularity benchmark, Benchleaks discovered. It's the same user who previously shared results for the Core i9-12900K with the same benchmark.

Thus far, the rumors for the Core i5-12600K point to a 10-core configuration with six Golden Cove cores and four Gracemont cores. Only the Golden Cove cores feature Hyper-Threading, resulting in an unorthodox setup with 10 cores and 16 threads. There's reportedly 20MB of L3 cache on the Core i5-12600K. The clock speeds are also harder to pin down since there are two different core types in an Alder Lake chip. The Core i5-12600K is rumored to feature a 4.9 GHz dual-core boost on the Golden Cove cores, while the all-core boost is allegedly fixed at 4.6 GHz. The Gracemont cores, on the other hand, may check in with a 3.6 GHz dual-core boost and a 3.4 GHz all-core boost.

At the time of the article, foxed.in had performed 13 Ashes of the Singularity runs on the Core i5-12600K; however, only one of them completed successfully. Judging by the huge variations between the results, it's safe to assume that the benchmark isn't optimized for Alder Lake's hybrid design yet and fails to utilize the correct cores. This falls in line with previous speculation that Alder Lake works best with Windows 11, and that games need to be optimized for the hybrid chips.

On the Crazy 1080p preset, the results range from 39 frames per second to 110 fps, whereas the chip was scoring between 37 fps and 40 fps on the Medium 1080p preset. It's improbable that the Core i5-12600K would put up similar scores at such different graphics presets. The low scores are probably the result of the benchmark tapping into the Gracemont cores instead of the Golden Cove cores. Therefore, the 110 fps submission (assuming it wasn't on exotic cooling) may be the only valid result where the software utilized the Golden Cove cores properly.

Unfortunately, foxed.in didn't run the Crazy 1080p preset on the Core i9-12900K so we couldn't compare the Core i5-12600K to the flagship SKU. As a result, we had to scour the Ashes of the Singularity database to find proper entries for comparison. Since the benchmark is light on details on the hardware used, we can't guarantee that the GeForce RTX 3080 in those submissions is the same as the one that foxed.in used. Furthermore, the software versions are different, which can also affect the scores. We suggest you look at the results with an open mind and a grain of salt.

The Core i5-12600K scored 10,800 points on the Crazy 1080p preset. The only Core i5-11600K entry with a GeForce RTX 3080 put up a score of 9,800 points. Alder Lake appears to usher in a 10.2% performance uplift. There weren't any entries for the Core i9-11900K (Rocket Lake) so we had to go as far back as the Core i9-10900K (Comet Lake). The Core i9-10900K had a score of 11,200 points, meaning it's only 3.7% faster than the Core i5-12600K.

Technically, the Ryzen 5 5600X (Vermeer) is the direct rival to the Core i5-12600K, although the latter does come with four small cores. We're unsure how they fit into the picture until we get a review sample in the lab, though. Neither the Ryzen 9 5900X nor the Ryzen 7 5800X showed up in the database. There were many Ryzen 9 5950X submissions, but none matched our criteria.

For comparison, the Ryzen 5 5600X scored 8,100 points on the Crazy 1080p preset, so the Core i5-12600K delivered up to 33.3% higher performance than the Zen 3 chip. The margin is similar to that of the Core i9-12900K's dominance over the Ryzen 9 5950X in the same benchmark, albeit with a different graphics preset. Is it a fluke, or does Alder Lake really pose a threat to Zen 3? Luckily, we won't have to wait long to find out whether the rumors of a November announcement are accurate.
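The relative-performance figures quoted in this comparison follow directly from the raw benchmark scores. As a quick, illustrative sketch of that arithmetic, using only the numbers reported above rather than any official tooling:

```python
# Relative performance from the Ashes of the Singularity scores quoted above
# (Crazy 1080p preset, GeForce RTX 3080 submissions).
scores = {
    "Core i5-12600K": 10_800,
    "Core i5-11600K": 9_800,
    "Core i9-10900K": 11_200,
    "Ryzen 5 5600X": 8_100,
}

def uplift(a: float, b: float) -> float:
    """Percentage by which score a exceeds score b."""
    return (a / b - 1) * 100

print(f"12600K over 11600K: {uplift(scores['Core i5-12600K'], scores['Core i5-11600K']):.1f}%")  # ~10.2%
print(f"10900K over 12600K: {uplift(scores['Core i9-10900K'], scores['Core i5-12600K']):.1f}%")  # ~3.7%
print(f"12600K over 5600X:  {uplift(scores['Core i5-12600K'], scores['Ryzen 5 5600X']):.1f}%")   # ~33.3%
```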

Originally posted here:

Core i5-12600K Shows Strong Lead Over Ryzen 5 5600X In Ashes of the Singularity - Tom's Hardware

This Week’s Awesome Tech Stories From Around the Web (Through October 16) – Singularity Hub

Posted: at 5:30 pm

ARTIFICIAL INTELLIGENCE

Facebook Is Researching AI Systems That See, Hear, and Remember Everything You Do
James Vincent | The Verge
"[Facebook's AI team] imagines AI systems that are constantly analyzing people's lives using first-person video; recording what they see, do, and hear in order to help them with everyday tasks. Facebook's researchers have outlined a series of skills it wants these systems to develop, including 'episodic memory' (answering questions like 'where did I leave my keys?') and 'audio-visual diarization' (remembering who said what when)."

Drone Delivers Lungs to Transplant Recipient, a Medical First
George Dvorsky | Gizmodo
"As the Canadian Press reports, some 80% of donated lungs cannot be used owing to problems having to do with insufficient oxygenation or a failure to meet minimal functional standards. And like any transplanted organ, time is of the essence; the quicker an organ can be brought to the patient, the better. Hence the desire to ship organs through the air, rather than through congested city traffic."

At 90, William Shatner Becomes the Oldest Person to Reach the Final Frontier
Daniel E. Slotnick | The New York Times
"The actor spoke of how the experience of seeing the blue earth from space and the immense blackness of outer space had profoundly moved him, demonstrating what he called the vulnerability of everything. 'The atmosphere keeping humanity alive is thinner than your skin,' he said."

Fraudsters Cloned Company Director's Voice in $35 Million Bank Heist, Police Find
Thomas Brewster | Forbes
"What [the bank manager] didn't know was that he'd been duped as part of an elaborate swindle, one in which fraudsters had used 'deep voice' technology to clone the director's speech, according to a court document unearthed by Forbes in which the U.A.E. has sought American investigators' help in tracing $400,000 of stolen funds that went into US-based accounts held by Centennial Bank."

This Is the True Scale of China's Bitcoin Exodus
Gian M. Volpicelli | Wired UK
"The figures, gathered by the Cambridge Centre for Alternative Finance (CCAF), found that by the end of August 2021, the percentage of bitcoin mining taking place in China had effectively dropped to zero. That is a staggering reversal for a country that, as late as September 2019, was believed to be home to 75.53 percent of global bitcoin mining operations."

90% of New Cars Sold in Norway Are Now Electric or Plug-in Hybrids
Adele Peters | Fast Company
"In 2012, electric and plug-in hybrid cars made up just 3% of new car sales in Norway. By 2019, that had jumped to 56%. Now, the country wants to get to 100% EV sales by 2025, and it might actually succeed. The Norwegian Automobile Federation recently reported that if past trends continue, it's possible that the last fossil fuel-powered vehicle in Norway might be sold as soon as next year."

Pentagon Wants AI to Predict Events Before They Occur
Natasha Bajema | IEEE Spectrum
"What if by leveraging today's artificial intelligence to predict events several days in advance, countries like the United States could simply avoid warfare in the first place? It sounds like the ultimate form of deterrence, a strategy that would save everyone all sorts of trouble, and it's the type of visionary thinking that is driving U.S. military commanders and senior defense policymakers toward the rapid adoption of artificial intelligence (AI)-enabled situational awareness platforms."

AI Fake-Face Generators Can Be Rewound to Reveal the Real Faces They Trained On
Will Douglas Heaven | MIT Technology Review
"In a paper titled 'This Person (Probably) Exists,' researchers show that many faces produced by GANs bear a striking resemblance to actual people who appear in the training data. The fake faces can effectively unmask the real faces the GAN was trained on, making it possible to expose the identity of those individuals."

Image Credit: Lance Anderson / Unsplash

The rest is here:

This Week's Awesome Tech Stories From Around the Web (Through October 16) - Singularity Hub

Microsoft’s Massive New Language AI Is Triple the Size of OpenAI’s GPT-3 – Singularity Hub

Posted: at 5:30 pm

Just under a year and a half ago, OpenAI announced completion of GPT-3, its natural language processing algorithm that was, at the time, the largest and most complex model of its type. This week, Microsoft and Nvidia introduced a new model they're calling "the world's largest and most powerful generative language model." The Megatron-Turing Natural Language Generation model (MT-NLG) is more than triple the size of GPT-3 at 530 billion parameters.

GPT-3's 175 billion parameters were already a lot; its predecessor, GPT-2, had a mere 1.5 billion parameters, and Microsoft's Turing Natural Language Generation model, released in February 2020, had 17 billion.

A parameter is an attribute a machine learning model defines based on its training data, and tuning more of them requires upping the amount of data the model is trained on. It's essentially learning to predict how likely it is that a given word will be preceded or followed by another word, and how much that likelihood changes based on other words in the sentence.
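To make the "predict the next word" idea concrete, here is a toy, hypothetical sketch of the simplest version of that task: a bigram frequency table. Models like GPT-3 and MT-NLG learn these conditional probabilities with billions of neural-network parameters rather than a lookup table, so this is only an illustration of the prediction problem, not of how the models work internally.

```python
from collections import Counter, defaultdict

# Toy stand-in for the "how likely is the next word" task described above.
corpus = "the cat sat on the mat the cat slept on the sofa".split()

following: dict[str, Counter] = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word_probs(word: str) -> dict[str, float]:
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'sofa': 0.25}
```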

As you can imagine, getting to 530 billion parameters required quite a lot of input data and just as much computing power. The algorithm was trained using an Nvidia supercomputer made up of 560 servers, each holding eight 80-gigabyte GPUs. That's 4,480 GPUs total, and an estimated cost of over $85 million.

For training data, Megatron-Turing's creators used The Pile, a dataset put together by the open-source language model research group EleutherAI. Composed of everything from PubMed to Wikipedia to GitHub, the dataset totals 825GB, broken down into 22 smaller datasets. Microsoft and Nvidia curated the dataset, selecting subsets they found to be of the highest relative quality. They added data from Common Crawl, a non-profit that scans the open web every month, downloads content from billions of HTML pages, and makes it available in a special format for large-scale data mining. GPT-3 was also trained using Common Crawl data.

Microsoft's blog post on Megatron-Turing says the algorithm is skilled at tasks like completion prediction, reading comprehension, commonsense reasoning, natural language inference, and word sense disambiguation. But stay tuned; there will likely be more skills added to that list once the model starts being widely utilized.

GPT-3 turned out to have capabilities beyond what its creators anticipated, like writing code, doing math, translating between languages, and autocompleting images (oh, and writing a short film with a twist ending). This led some to speculate that GPT-3 might be the gateway to artificial general intelligence. But the algorithm's variety of talents, while unexpected, still fell within the language domain (including programming languages), so that's a bit of a stretch.

However, given the tricks GPT-3 had up its sleeve based on its 175 billion parameters, it's intriguing to wonder what the Megatron-Turing model may surprise us with at 530 billion. The algorithm likely won't be commercially available for some time, so it'll be a while before we find out.

The new model's creators, though, are highly optimistic. "We look forward to how MT-NLG will shape tomorrow's products and motivate the community to push the boundaries of natural language processing even further," they wrote in the blog post. "The journey is long and far from complete, but we are excited by what is possible and what lies ahead."

Image Credit: Kranich17 from Pixabay

More here:

Microsoft's Massive New Language AI Is Triple the Size of OpenAI's GPT-3 - Singularity Hub

Scientists Find the First Known Planet to Have Survived the Death of Its Star – Singularity Hub

Posted: at 5:30 pm

How will the solar system die? It's a hugely important question that researchers have speculated a lot about, using our knowledge of physics to create complex theoretical models. We know that the sun will eventually become a white dwarf, a burnt stellar remnant whose dim light gradually fades into darkness. This transformation will involve a violent process that will destroy an unknown number of its planets.

So which planets will survive the death of the sun? One way to seek the answer is to look at the fates of other similar planetary systems. This has proven difficult, however. The feeble radiation from white dwarfs makes it difficult to spot exoplanets (planets around stars other than our sun) which have survived this stellar transformation; they are literally in the dark.

In fact, of the over 4,500 exoplanets that are currently known, just a handful have been found around white dwarfs, and the location of these planets suggests they arrived there after the death of the star.

This lack of data paints an incomplete picture of our own planetary fate. Fortunately, we are now filling in the gaps. In our new paper, published in Nature, we report the discovery of the first known exoplanet to survive the death of its star without having its orbit altered by other planets moving around, circling at a distance comparable to those between the sun and the solar system planets.

This new exoplanet, which we discovered with the Keck Observatory in Hawaii, is particularly similar to Jupiter in both mass and orbital separation, and provides us with a crucial snapshot into planetary survivors around dying stars. A star's transformation into a white dwarf involves a violent phase in which it becomes a bloated red giant, also known as a giant branch star, hundreds of times bigger than before. We believe that this exoplanet only just survived; if it had initially been closer to its parent star, it would have been engulfed by the star's expansion.

When the sun eventually becomes a red giant, its radius will actually reach outwards to Earth's current orbit. That means the sun will (probably) engulf Mercury and Venus, and possibly the Earth, but we are not sure.

Jupiter, and its moons, have been expected to survive, although we previously didn't know for sure. But with our discovery of this new exoplanet, we can now be more certain that Jupiter really will make it. Moreover, the margin of error in the position of this exoplanet could mean that it is almost half as close to the white dwarf as Jupiter currently is to the sun. If so, that is additional evidence for assuming that Jupiter and Mars will make it.

So could any life survive this transformation? A white dwarf could power life on moons or planets that end up being very close to it (about one-tenth the distance between the sun and Mercury) for the first few billion years. After that, there wouldn't be enough radiation to sustain anything.

Although planets orbiting white dwarfs have been difficult to find, what has been much easier to detect are asteroids breaking up close to the white dwarf's surface. For exoasteroids to get so close to a white dwarf, they need to have enough momentum imparted to them by surviving exoplanets. Hence, exoasteroids have long been assumed to be evidence that exoplanets are there too.

Our discovery finally provides confirmation of this. Although current technology does not allow us to see any exoasteroids in the system discussed in the paper, we can at least now piece together different parts of the puzzle of planetary fate by merging the evidence from different white dwarf systems.

The link between exoasteroids and exoplanets also applies to our own solar system. Individual objects in the asteroid main belt and Kuiper belt (a disc in the outer solar system) are likely to survive the sun's demise, but some will be nudged by the gravity of one of the surviving planets towards the white dwarf's surface.

The new white dwarf exoplanet was found with what is known as the microlensing detection method. This looks at how light bends due to a strong gravitational field, which happens when a star momentarily aligns with a more distant star, as seen from Earth.

The gravity from the foreground star magnifies the light from the star behind it. Any planets orbiting the star in the foreground will bend and warp this magnified light, which is how we can detect them. The white dwarf we investigated is one-quarter of the way towards the center of the Milky Way galaxy, or about 6,500 light years away from our solar system, and the more distant star is in the center of the galaxy.
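For reference, the standard point-lens microlensing relations behind this technique (textbook formulas, not quantities quoted in the article) are:

```latex
% Einstein angular radius for a lens of mass M at distance D_L,
% with the background source at distance D_S:
\theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_S - D_L}{D_S\,D_L}}

% Magnification of the background star as a function of the angular
% source-lens separation u, measured in units of \theta_E:
A(u) = \frac{u^2 + 2}{u\,\sqrt{u^2 + 4}}

% A planet orbiting the lens briefly perturbs A(u) as the alignment changes,
% and that short-lived anomaly is the detection signal.
```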

A key feature of the microlensing technique is that it is sensitive to planets that orbit stars at the Jupiter-sun distance. The other known planets which orbit white dwarfs have been found with different techniques which are sensitive to different star-planet separations. Two examples relate to planets which have survived a star's transformation into a white dwarf and have ended up closer to it than before. One was found by transit photometry (a method to detect planets as they pass in front of a white dwarf, which creates a dip in the light received by Earth) and the other was discovered through the detection of the planet's evaporating atmosphere.

One further detection technique, astrometry, which precisely measures the movement of white dwarfs in the sky, is also predicted to yield results. In a few years, astrometry from the Gaia mission is expected to find about a dozen planets orbiting white dwarfs. Perhaps these could offer better evidence as to exactly how the solar system will die.

This variety of discovery techniques bodes well for potential future detections, which may offer further insight into the fate of our own planet. But for now, the newly discovered Jupiter-like exoplanet provides the clearest glimpse into our future.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: W.M. Keck Observatory/Adam Makarenko

Read more from the original source:

Scientists Find the First Known Planet to Have Survived the Death of Its Star - Singularity Hub

New theory claims Einstein was wrong and the Big Bang is not the beginning of the universe – Sprout Wired

Posted: at 5:30 pm

A group of scientists is questioning Albert Einstein's theory of general relativity and challenging the idea that the universe's constant expansion implies a beginning of everything, or the Big Bang.

Currently, the idea of the Big Bang is accepted by the scientific community, but another theory seeks to replace it with a new understanding of space, time, and the beginning of the universe. Or rather, the lack of a beginning.

Many consider the theory of general relativity imperfect: although it is effective in explaining the universe on a large scale, the idea put forward by Einstein is inconsistent with quantum mechanics and black holes.

In other words, Einstein's work cannot explain how a microscopic point called a singularity, theoretically smaller than any known particle, sustains such an extreme gravitational field.

This brings us to the Big Bang, since its most classic formulation holds that the universe originated from a singularity. That has led some physicists to develop alternative frameworks, such as string theory and, more recently, causal set theory.

It is noteworthy that both frameworks, strings and causal sets, remain hypotheses, because it is not yet possible to test and observe their predictions through the scientific method.

causal set theory

A slightly older theory proposes that space and time have a fundamental unit, or quantum. According to this view, space-time is made up of discrete packets, much like quantized packets of energy.

If spacetime is quantized, a range of implications must be considered. Spacetime itself would come in discrete units, which would impose certain limits on what can happen in the universe.

If causal set theory is correct, then there is a limit to how close two points can be to each other, and that limit is set by the size of the spacetime unit itself.

In this way, not only does time become a physical manifestation; the singularity becomes impossible. With no singularity in the universe, there is no longer a conflict with gravity that has to be resolved.

However, if the universe had no singularities, then there was no Big Bang either. So how did the universe come to be?

new theory

A new article by Bruno Bento and Stav Zalel, of the University of Liverpool and Imperial College London respectively, tries to forge a new path. Basically, they claim that, based on everything we know, the universe may have always existed.

According to the scientists, there would be no Big Bang as a beginning, because the causal set would be infinite towards the past, so there is always something earlier.

Many scientists also entertain hypotheses about events that occurred before the Big Bang, including Roger Penrose, who won the Nobel Prize for his demonstration of black hole properties, building on earlier work carried out with Stephen Hawking.

Penrose and Hawking have both defended the idea that there was another universe before ours, specifically one that expanded and then contracted, until it returned to a singularity.

The difference between that hypothesis and causal set theory is that in the latter there is no singularity at all. Bento and Zalel's work is available as a preprint on arXiv and is awaiting peer review.

So, do you think both scientists are on the right track? Tell us in the comments below!

Originally posted here:

New theory claims Einstein was wrong and the Big Bang is not the beginning of the universe - Sprout Wired

The Big Bang and time as we know it might be nothing but illusions – SYFY WIRE

Posted: at 5:30 pm

There is something unnerving about hearing a somber voice intone "In the beginning..." But wait. What if the beginning of time is no more real than the sci-fi movies you hear it in?

Could the Big Bang have never really happened? Will there be no end to the universe? Is everything in between, even the passage of time, just an illusion? Physicist Bruno Bento is now proposing that the universe may have had no beginning at all, meaning it did not just blow up out of nothingness, expanding rapidly from a few atoms into an expanse too vast for the human brain to fathom. What we perceive as the past and future may be infinite.

Bento didn't just wake up one morning and decide that the universe didn't suddenly explode into being about 14 billion years ago. Turns out that general relativity does not hold up with singularities like black holes and the Big Bang. He and his colleagues recently posted a study on the preprint server arXiv, in which they used causal set theory to propose that space and time may not be what we think they are.

"Sometimes, general relativity gives us infinities that we do not consider to be physical," he told SYFY WIRE. "This is what we mean when we say it 'breaks down'; we need something else, something new, to describe regions of strong gravity where it does not provide a physical answer."

Most scientists believe Einstein is right about general relativity, or the idea that our perception of gravity arises from the curve of space and time. Some phenomena insist on bending that theory and could possibly break it in the future. Black holes are dangerous territory for general relativity because there are too many aspects of them we cannot see. Though there is not enough evidence to disprove it (yet), the inability of any instrument to observe gravity inside a black hole, from which light cannot escape, raises controversial questions.

The thing about black holes and other weird gravitational phenomena is that general relativity cannot fathom the extreme size and energies involved. There is a threshold it cannot cross when you are dealing with singularities, or parts of spacetime where everything we think we know about physics suddenly starts to fall apart. Gravity gains almost unfathomable strength at minuscule scales in a singularity. Even if there is something that can explain black hole innards or the hypothetical Big Bang, we have to find out what that is. Enter causal set theory.

"Spacetime is fundamentally discrete in causal set theory," said Bento. "It is a causal set. This means that there is a minimum possible distance between any two events, both in space and time. We don't know exactly what this minimum scale is; there are currently no experiments that can probe these scales."

Because you can't exactly go into a lab and test this out, theoretical physics may be able to offer some closure. Bento believes that the way general relativity breaks down could mean the Planck scale, which sets a minimum limit for the universe, may be able to pick up where it left off. A breakdown could still happen past that limit. However, that scale may be small enough to reveal things beyond the realm of human observation.
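The Planck scale referred to here is conventionally expressed through the Planck length and Planck time. The article gives no numbers, so the standard definitions and approximate values are shown below for context.

```latex
% Planck length and Planck time (standard definitions, not figures from the article):
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \mathrm{m},
\qquad
t_P = \sqrt{\frac{\hbar G}{c^{5}}} \approx 5.4 \times 10^{-44}\ \mathrm{s}
% The discreteness scale posited by causal set theory is expected to be of
% this order, far below anything current experiments can probe.
```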

What this means for the passage of time is that an element in a causal set is an event, a specific point in spacetime. Elements are created whenever corresponding events happen, and "now" is the emergence of such an event. A causal set is supposed to grow from the first element onward, adding new elements on top of the set, so the passage of time means that one element after another comes into being. The past is all the elements that have already emerged; the future is those that are still to come.
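As a purely illustrative sketch (not the specific dynamics Bento and his colleagues use), the picture of elements coming into being one after another can be modeled as a growing partial order, where each new element is born with some set of already-existing elements as its causal past:

```python
import random

# Illustrative sketch only: sequential growth of a causal set, where each new
# element ("event") is born with a causal past drawn from the elements that
# already exist.
def grow_causal_set(n_elements: int, link_prob: float = 0.3, seed: int = 0) -> list[set[int]]:
    random.seed(seed)
    pasts: list[set[int]] = []  # pasts[i] = indices of all elements in i's causal past
    for i in range(n_elements):
        # pick some existing elements as direct ancestors of the new element...
        ancestors = {j for j in range(i) if random.random() < link_prob}
        # ...and close under transitivity: the past of my past is also my past
        past = set(ancestors)
        for j in ancestors:
            past |= pasts[j]
        pasts.append(past)
    return pasts

for i, past in enumerate(grow_causal_set(6)):
    print(f"element {i}: causal past = {sorted(past)}")
```

Growing "towards the past," in the sense discussed below, would correspond to inserting new elements beneath existing ones instead of always on top.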

"In causal set theory, the passage of time was used as an input when constructing a dynamics for causal sets, or how a causal set (a universe) should behave," Bento said. One consequence of this is that the past is finite and the universe has a beginning.

But wait. How, then, can there be no end and no beginning? The answer lies in how Bento and his team see possibilities in causal set theory. The set could potentially grow in either direction, towards the past as well as the future, and if it can do that, there is no end or beginning. What we think of as time might just be a way of trying to understand something that would otherwise make our brains explode.

What is really surprising is that Bento thinks the universe would still look exactly the same without a Big Bang. It isn't that general relativity just vanishes. It can still explain everything that direct observations can be made on, whether by telescope, the naked eye or otherwise. So our solar system and everything observable in it is real. Earth is real. We ourselves are real.

"The problem appears when we cannot see," he said. "That being said, it's usually accepted that a Big Bang singularity does not exist (nor do black hole singularities). The debate is in what will replace them and how."

Now try to go to sleep at night thinking about that...

Continued here:

The Big Bang and time as we know it might be nothing but illusions - SYFY WIRE

Responding to Meyer Without Naming Him – Discovery Institute

Posted: at 5:30 pm

Photo source: Discovery Institute.

The website Big Think recently published an article by Ethan Siegel titled "Surprise: the Big Bang isn't the beginning of the universe anymore." Siegel is a theoretical astrophysicist and a science writer. He is also an atheist, so he understandably does not like the implications of the universe requiring a beginning. He does not mention Stephen Meyer by name, but he seems to directly engage Meyer's recent book Return of the God Hypothesis, arguing that modern theories of cosmology suggest the universe did not have a beginning but is eternal.

Contrary to Siegel's claims, Stephen Meyer demonstrates in his book that scientific evidence discovered over the past century decisively points to the universe having an absolute beginning. The beginning is often termed a singularity, since it would correspond in the standard Big Bang model to a state approaching an infinite temperature in an infinitesimally small volume. Meyer also argues that the evidence for such a beginning, combined with other data, strongly suggests that the universe was created by the God of the Jewish and Christian Scriptures.

Since the book's release, many stories have appeared in varied publications challenging his claims. The articles often do not mention Meyer by name, but their content suggests that they are attempting to disprove his design thesis. A similar phenomenon occurred after the publication of his previous books, especially Darwin's Doubt. The critics appear to see Meyer much like Voldemort in the Harry Potter series. He is ever present in their minds, but Meyer is "he who shall not be named."

Siegel attempts to find a loophole for the conclusion of a cosmic beginning by appealing to the theory known as eternal chaotic inflation. Inflationary theory was initially developed to explain the fine-tuning implied by the flatness of space and the near perfect uniformity of the cosmic microwave background radiation (CMBR). The flatness represents the lack of curvature of space that the theory of general relativity would normally predict. According to the standard Big Bang model, the lack of curvature required the mass density of the early universe to have been fine-tuned to greater than 1 part in 10^60 (a 1 with 60 zeros behind it).

Inflationary theory attempts to explain the flatness of space and the uniformity of the CMBR without the need for such extreme fine-tuning. It postulates a field permeating space that causes the universe to expand at a phenomenal rate. The earliest versions assumed that the expansion occurred a tiny fraction of a second after the Big Bang and only lasted for an exceedingly short period. This expansion purportedly flattened space and generated a CMBR with the observed uniformity.

The theory was later modified to posit an eternally inflating space. Different regions of space stop inflating due to a spontaneous drop in the inflationary field's energy, and they then expand according to standard Big Bang models. Our universe resulted from just such a region exiting inflation. The key point for Siegel is that most of space could have been expanding eternally, thus removing the necessity of a beginning. As Siegel comments,

Due to its exponential nature, even if you run the clock back an infinite amount of time, space will only approach infinitesimal sizes and infinite temperatures and densities; it will never reach it. This means, rather than inevitably leading to a singularity, inflation absolutely cannot get you to one by itself. The idea that the universe began from a singularity, and that's what the Big Bang was, needed to be jettisoned the moment we recognized that an inflationary phase preceded the hot, dense, and matter-and-radiation-filled one we inhabit today.
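Siegel's "exponential nature" argument, and the BGV result discussed next, can be stated compactly. The expressions below are standard formulations and are not drawn from either Siegel's article or Meyer's book.

```latex
% Inflationary (approximately de Sitter) expansion of the scale factor:
a(t) = a_0\, e^{H t}, \qquad H \approx \text{const.}
% For any finite past time t, a(t) > 0; the scale factor approaches zero only
% as t \to -\infty, which is Siegel's point that inflation alone never reaches
% a singular a = 0 state. The Borde-Guth-Vilenkin theorem responds by showing
% that any spacetime whose average expansion rate along past-directed geodesics
% is positive, H_{\mathrm{avg}} > 0, must be geodesically incomplete to the
% past, i.e. it cannot be extended indefinitely into the past.
```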

Unfortunately, Siegel's claim was completely discredited by the research of leading cosmologists Arvind Borde, Alan Guth, and Alexander Vilenkin. They developed the Borde-Guth-Vilenkin (BGV) theorem, which demonstrates that all universes that are on average expanding must have had a beginning. Our universe falls into this category, so it must have had a beginning even if eternal inflation were true. Meyer explicitly makes this point in his book:

In any case, by the early 1990s, many physicists had embraced eternal chaotic inflation as the best model for the origin of the universe. The popularity of the model led two physicists, Arvind Borde and Alexander Vilenkin, of Tufts University, to investigate what inflation implied about whether the universe had a beginning. They sought to investigate whether the inflaton field could have been operating for an infinitely long time back into the past; that is, whether it could have been "past eternal," as they phrased it. Within a decade, Borde, Vilenkin, and a third physicist, Alan Guth, one of the original proponents of inflation, had come to a startling conclusion: the universe must have had a beginning, even if inflationary cosmology is correct.

Siegel is following in the same path as other critics in presenting arguments that Meyer has already dismantled. He is so desperate to maintain his materialist philosophical framework that he is unconsciously suppressing key evidence that undermines his attempts to avoid the clear evidence of design throughout creation.

See the original post:

Responding to Meyer Without Naming Him - Discovery Institute

The World’s Electronic Waste This Year Will Weigh More Than the Great Wall of China – Singularity Hub

Posted: at 5:30 pm

It's widely known that the world has a plastics problem. From landfills to the ocean, the stuff is everywhere, and our conscientious efforts to recycle don't do nearly as much good as we think.

What's less widely known is that we have a similar problem with another kind of waste: electronics. A report published this week by the WEEE Forum revealed that the total waste electronic and electrical equipment from 2021 will weigh an estimated 57.4 million tons. That's heavier than China's Great Wall, which is the heaviest man-made object on Earth.

Not surprisingly, the amount of e-waste generated each year is steadily increasing. For one, as the global middle class grows, more people can afford to buy electronics (and to buy new ones when their old ones break, rather than getting the old ones repaired). Also, the prices of many electronic items tend to trend downwards as their manufacture is scaled up, their technology improves, supply chains are streamlined, etc. (given the global chip shortage, the next couple years may be an exception to this trend).

E-waste appears to be growing by three to four percent per year. In 2019 the total reached 53.6 million tons; that was 21 percent higher than 2014's total. If we stay on this trajectory, annual global e-waste will reach 74 million tons by 2030.
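As a quick sanity check on those figures, here is a sketch of the compound-growth arithmetic using only the totals and growth rates quoted above:

```python
# Sanity check of the trajectory quoted above: 53.6 million tons in 2019,
# growing roughly 3-4 percent per year, projected to ~74 million tons by 2030.
total_2019 = 53.6  # million metric tons

for rate in (0.03, 0.04):
    total_2030 = total_2019 * (1 + rate) ** (2030 - 2019)
    print(f"growth of {rate:.0%}/year -> {total_2030:.1f} million tons in 2030")
# ~74.2 million tons at 3 percent per year, ~82.5 million tons at 4 percent per year
```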

Product manufacturers aren't helping the situation; building products with shorter life cycles, making repairs too expensive or difficult to undertake, and continually releasing new iterations means people are likely to either cast aside their perfectly good iPhones/tablets/laptops for newer models, or decide that repairing a non-working device isn't worth the trouble and opt for buying a brand-new one. Do you have at least one working (or partially working) cell phone or laptop sitting in a drawer somewhere, untouched for months or years? Yeah, me too.

"When you buy an expensive product, whether it's a half-a-million-dollar tractor or a thousand-dollar phone, you are in a very real sense under the power of the manufacturer," said Tim Wu, special assistant to the president for technology and competition policy within the National Economic Council. "And when they have repair specifications that are unreasonable, there's not a lot you can do."

The Right to Repair movement thinks otherwise, or rather, it is trying to get consumers and manufacturers to think otherwise. The movement wants to make it easier for people to repair the devices they already own rather than having to buy new ones.

Europe is several steps ahead of the US in this arena. In March of this year the EU implemented a law requiring appliances to be repairable for at least 10 years; new devices have to come with repair manuals and be compatible with conventional tools when their life cycle ends (so that people are more likely to break them down and recycle them). In Sweden, people even get tax breaks for appliance repairs done by technicians in their homes.

Though there are no similar laws in place in the US yet, the Federal Trade Commission has been investigating repair restrictions as they relate to antitrust laws and consumer protection. Unsurprisingly, electronics manufacturers are largely against right to repair, claiming consumer safety could be jeopardized. But an FTC report from May of this year found there was limited evidence to support manufacturers' justifications for restricting repairs, and that people's device batteries aren't actually that likely to burst into flames, nor their personal data likely to be compromised by repairing their devices.

According to the WEEE Forum report, around 416,000 phones per day are thrown out in the US. That's 151 million a year, and guess where they end up? Here's a hint: 40 percent of heavy metals in landfills come from discarded electronics. Those metals could be recycled for use in new products, but there's no system or incentive in place to facilitate this.

While small electronics like phones and laptops may have the fastest turnover, they're not very heavy, and thus aren't the biggest contributors to the huge sum of total tons of e-waste. Those culprits are larger items like refrigerators and stoves. But whatever the item is, it comes down to the same principle: we shouldn't be throwing things out until they're really, truly done working, and then we should have a way to ensure the recyclable components get to a place where they can be re-used.

Pascal Leroy, director general of the WEEE Forum, said, "Many factors play a role in making the electrical and electronics sector resource efficient and circular. But as long as citizens don't return their used, broken gear, sell it, or donate it, we will need to continue mining all-new materials, causing great environmental damage." He added that every ton of waste electronic and electrical equipment that gets recycled saves around two tons of CO2 emissions.

Given that repairs directly conflict with their primary motive, profit, companies aren't likely to make pro-repair moves without some serious pressure from consumers or regulators. And it seems that pressure is already being applied, and responded to: Popular Mechanics reported this week that Microsoft is considering right-to-repair reform, and has hired an independent third party to research the impacts on customers and the environment of making more repairable products.

As the WEEE Forum's Magdalena Charytanowicz said, "Consumers want to do the right thing, but need to be adequately informed, and a convenient infrastructure should be easily available to them so that disposing of e-waste correctly becomes the social norm in communities."

Let's hope we move towards that vision before the weight of our electronic trash grows too much more.

Image Credit: Muntaka Chasant/Wikimedia Commons

Read the original:

The World's Electronic Waste This Year Will Weigh More Than the Great Wall of China - Singularity Hub

The science behind Destiny 2s Lorentz Driver weapon – Space.com

Posted: at 5:29 pm

In Destiny 2's fifteenth season, Season of the Lost, players can earn a new exotic-grade weapon by progressing through the Season Pass: the Lorentz Driver. This new Linear Fusion Rifle may look like yet another science fiction weapon powered by impossible energies and ridiculous technology. That's not necessarily the case this time, as the Lorentz Driver has its mechanics and name rooted in real science. Here's how the weapon breaks down its interpretation of the Lorentz force.

So, first off, we have to look at the force this weapon is named after, the Lorentz force. First derived in its complete form by Hendrik Lorentz in 1895, the Lorentz force is the combination of electric and magnetic forces on a charged particle due to electromagnetic fields. The charged particle only feels a force from the magnetic field if it is moving with a component of its velocity perpendicular to the field; if it moves parallel to the magnetic field, it experiences no magnetic force. A particle steered by this force gyrates around a point known as the guiding center, with the surrounding motion organized around that point in space.

The sum of these two forces is what we call the Lorentz force. This concept allows almost all modern electronics to function; speakers, computers, and even railguns all rely on the Lorentz force as the basis of how they handle electricity and magnetism. Particle accelerators and cyclotrons depend on it especially: their magnetic fields bend charged particles around circular paths while electric fields boost their energy, allowing them to collide and, in some cases, create new elements.
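For reference, the force itself is usually written as follows (the standard textbook expression; the article does not give the formula):

```latex
% Lorentz force on a particle of charge q moving with velocity \mathbf{v}
% through electric field \mathbf{E} and magnetic field \mathbf{B}:
\mathbf{F} = q\left(\mathbf{E} + \mathbf{v}\times\mathbf{B}\right)
% The magnetic term has magnitude |q|\,v\,B\,\sin\theta, which is why motion
% parallel to \mathbf{B} (\theta = 0) feels no magnetic force. In a uniform
% field the particle gyrates around its guiding center with radius
r_g = \frac{m\,v_\perp}{|q|\,B}
```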

Looking at how the Lorentz Driver weapon functions, we can break down the components of the mechanics and how they tie to the Lorentz force itself in a basic manner. The Linear Fusion Rifle in Destiny's universe is a weapon that projects a super-concentrated beam of elemental energy in a single shot, much like a sniper rifle but with much more piercing power.

The Lorentz Driver is a void element weapon; void is described in Destiny's lore as an energy of absence or vacuum where energies can be negated. When players score a Precision Kill, or a head shot, on a target, a small black hole forms that attracts nearby enemies and then erupts in an explosion of void energy. Targets at random will also be highlighted by the weapon's scope system and drop a small, golden tag called a piece of Telemetry Data. When players pick up three of these tags, they gain a buff known as Lagrangian Sight.

When a singularity forms from a Precision Kill, this is basically the game's version of the guiding center of the Lorentz force: a charged point that pulls everything nearby into a common alignment. Since the energy of the weapon is void and usually produces a vacuum-like effect when used, it makes sense that nearby objects tend to collapse in on the void energy.

While the Lagrangian Sight buff is active, the Lorentz Driver will cause more damage, and every kill, precision or not, will cause the singularities to form. According to the weapon's lore entry in the in-game documentation, the rifle was possibly built haphazardly by the alien race known as the Fallen, or Eliksni, from non-weapon parts.

Applying the basics of the Lorentz force to the weapon makes it easy to see how it all works. The weapon's most basic functions are similar to those of a real-life railgun, a type of cannon that employs magnetism and large amounts of electricity to propel a projectile at high speeds with the use of electromagnetic rails. The Lorentz force figures in how the weapon's projectile is propelled: by applying a charge to a beam of energy.

When the weapon's Lagrangian Sight kicks in, it draws on the basics of Lagrangian mechanics to add to the weapon's power and precision. When the Lagrangian is applied to the weapon's mechanisms, it interacts with the weapon's potential energy and bolsters its accuracy to find a vector within space, which explains why the singularities form on any kill rather than only a precision kill.

While in the Destiny universe weaponry and energy are dictated by the game's own lore and concepts of how everything functions, it's clear Bungie's writing team did their homework on this one. The weapon's name doesn't just serve as flavor; it is a simplified demonstration of a foundation of electricity and magnetism.

The brilliance of the weapon mechanics makes it not only an engaging weapon to use; the clever demonstrations of real-life electromagnetism show Bungie's attention to detail is second to none. Destiny is not a super-accurate world, but its roots in science and its use of worldbuilding to give structure to how everything fits together allow unique representations of real-life scientific concepts such as this one. As the saying goes, any sufficiently advanced technology is indistinguishable from magic.

If you're looking for more sci-fi gaming content, check out our best Star Wars games guide, and if you're looking to get immersed in the VR space battles then we've also got our best PSVR space games guide for you.

Read more:

The science behind Destiny 2s Lorentz Driver weapon - Space.com

Curious Nature: What is the world’s largest organism? – Vail Daily News

Posted: at 5:29 pm

What is the world's largest single organism? A quick Google search could leave you with more questions than you anticipated. Depending on our prior knowledge, we may think that the largest single organism would be a giant sequoia, a blue whale, a coral reef, or a pool of algae. However, the true answer is more complex than we may assume.

First, let's get specific and identify what an organism is. An organism is a form of life composed of parts that depend on one another to maintain various vital processes. When considering the largest single organism in the world, we look at genetically identical forms of life among plants, animals, or fungi.

There are two top contenders for largest organism, both of which can be found in our own backyard: Armillaria ostoyae and Populus tremuloides, commonly known as honey fungus and aspen trees. In short, aspen and fungi both claim the title. An aspen grove in Utah wins for greatest mass, while a fungal community in Oregon covers the largest area, but there is more to uncover and much to debate.

Aspen trees and honey fungus exist as single organisms because of their hidden underground connectivity. Aspen trees have a root system that links them to other genetically identical trees. In autumn, trees in the same organism will change the color of their leaves simultaneously. This singularity occurs because the aspen reproduces by sending clones from its roots through the forest floor.

One such organism in Utah, called Pando, meaning "I spread" in Latin, has done this enough to cover over 107 acres. Pando has been growing and dying for over 80,000 years and contains over 47,000 genetically identical trees, weighing more than an estimated 13 million pounds. The other contender, the honey fungus, invites further curiosities.

Fungi are in a kingdom all of their own, separate from plants and animals. Fungi are composed of a network of threads or strands called mycelium, which produces spores and feeds on organic matter. Just like the aspen connects via its root system, fungi connect beneath the earth with their mycelial structure.

Fungi create a network of these tiny mycelial roots spanning dozens of feet under the earth, hardly breaching the surface. Mycelial structures have been compared to the neurons in human brains and have even been dubbed "nature's internet" because they function in a similar way.

These fungal structures link the roots of trees and act as a messenger for forest flora with impeccable speed and accuracy. In Oregon, the largest stand of fungus is called the "Humongous Fungus," and it is 2,200 acres of Armillaria ostoyae, otherwise known as the honey fungus or shoestring fungus.

The parasitic structure thrives by stealing nutrients and water from the trees, and has been doing so for approximately 2,400 years. If this plot of fungus is truly connected, it could weigh between 13 million and 69 million pounds. However, we have no true answer for the depth and complexity of this structure because it lives hidden beneath our feet. One sprouting mushroom can bridge a 60-foot distance to the nearest mushroom through the mycelial connections.

When we consider the aspen grove and the honey fungus, both can stake a claim to the title of largest organism, depending on how we measure. Knowing we do not have a simple answer to the question reveals that nature holds many more secrets. Some of these secrets may even hold answers to some of humanity's numerous problems and mysteries. By protecting and studying nature, we protect ourselves for generations to come.

Taylor Branson is a naturalist at Walking Mountains Science Center. In her free time, you can find her hiking up mountains, reading in a hammock, tending to her plants, or working on a creative art project.

Follow this link:

Curious Nature: What is the world's largest organism? - Vail Daily News
