The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Charden
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Daily Archives: February 29, 2020
Criminal Minds actor Joe Mantegna starred in one of the worst transgender storylines ever aired – LGBTQ Nation
Posted: February 29, 2020 at 10:47 pm
Joe Mantegna arrives for the 'John Wick: Chapter 3 - Parabellum' L.A. Special Screening on May 15, 2019. Photo: Shutterstock
The TV series Criminal Minds recently ended its 15-year run on CBS. The final episode was acceptable, nowhere near as bad as that of Married... With Children (which was no finale at all but two random episodes stitched together with no apparent purpose), but it was certainly no M*A*S*H, Newhart or Sopranos.
I knew of Criminal Minds' existence early on but never watched an episode until my spouse and I began binging it last year. Early on its notable male lead was Mandy Patinkin, but he left midway through the third season. Taking his place was Joe Mantegna, and that leads us to a story about his connection to the denigration of transgender women.
Related: Steven Universe season finale brings a transgender story to the Cartoon Network
I first recall seeing Mantegna in The Godfather Part III. He played Joey Zasa, a rival to the aging Michael Corleone and, in general, a real piece of work. No one cried when Zasa died and, when I saw the movie on opening day 30 years ago, a few people in the theater actually cheered.
Mantegna did a good job hosting Saturday Night Live in connection with the opening of the flick, but the next thing I remember seeing him in was the series First Monday, which debuted in mid-January 2002. It was off the air by May.
It shouldn't have made it to February.
Mantegna played Joseph Novelli, newly-appointed to the U.S. Supreme Court as an associate justice. James Garner was Chief Justice Thomas Brankin, a pastiche of William Rehnquist. Charles Durning was an old conservative associate justice, something of a mutation of his Dolph Briscoe knock-off from The Best Little Whorehouse in Texas.
Great actors all (yes, including Garner; he rarely got the credit he deserved).
But the problem with First Monday was not the acting. It was the writing, and perhaps whatever the writers were smoking when they came up with the script for the premiere episode of a series that was being touted as a Supreme Court analogue to The West Wing's examination of the inner workings of the White House.
Night Court better served that role than did First Monday, which tried to be edgy out of the gate by dealing with the death penalty and trans issues, failing miserably at both.
The former I won't even discuss here, because that part of the episode was actually more preposterous than the trans portion, and that is really saying something given how horrid the trans portion was.
The legal aspect of the trans storyline was an insult to the entire notion of appellate court practice as well as to trans women. As Lori Buckwalter noted in her contemporary review of the episode, it was "a poorly padded beating by a blunt intellectual instrument."
The case being heard by CBS's fictional Supreme Court involved a trans character seeking asylum in the U.S.
Nothing unusual about that as a legal matter (albeit not necessarily as a SCOTUS case). In general, it's a real situation that, sadly, plays out every day, though in much uglier ways under the Trump Administration than even in the George W. Bush era.
But we learn that the attorney stepping up to argue the case in front of the full court had only learned about the case the night before.
So, immediately, the writers are conveying to America that a full U.S. Supreme Court argument on the merits of a case with constitutional implications differs from a typical Monday morning DWI arraignment in any local court in any county in any state only in that there are nine judges present instead of one.
It gets worse.
The arguments go badly and are written as if to purposely sow discord among trans people by hierarchically distinguishing between fully-transitioned (presumably post-op) transsexual women and an individual who is "just a transvestite." The episode doesn't disclose how the case ultimately ended, but the laughter from the justices tells any viewer all they really need to know on that point.
And it gets worse still.
Did I mention that the attorney who represented the "transvestite" was an incredibly attractive woman? She caught the eye of a conservative law clerk who, immediately after the oral arguments, hit on her for a date, which took place at some unnamed D.C. salsa dancing venue.
Did I mention that the attorney herself was trans? Granted, anyone with any familiarity with media misrepresentations of trans issues or just how television writers try to go overboard to gin up ratings in general (much less with obscenely over-touted premiere episodes) likely suspected this was where it was all going as soon as she began her argument to the court.
Of course, the conservative law clerk didn't know and, quicker than you can say The Crying Game, the salsa dancing gave way to the attorney telling him she was trans with a seemingly computer-generated multi-octave voice-drop.
"How can it be conscionable," Buckwalter asked, "to leave the largely uncontested impression that we trans folk are ridiculous and available for verbal abuse in public, even in the nation's highest judicial venue?"
It isn't. But think about the context. The atrocity that was the first of First Monday's thirteen episodes came on the heels of the wonderful character of Prof. Erica Bettis (whose very existence did even more than real-life trans academics to inspire me to pursue a Ph.D.) being disappeared sans explanation from The Education of Max Bickford.
Are things that much better 18 years later?
Yes, there is Orange Is the New Black. But then there's also Dallas Buyers Club.
The greater concern now is not dramas and comedies but the largely uncontested transphobia that people who claim they are being silenced are allowed to prominently position in newspapers. And then there is the matter of Der Stürmer-caliber cartoons followed by non-believable apologies that, even if genuine, can never un-ring the transphobia permission bell.
Currently, this is a much greater problem in England than it is here. But just wait.
Between the openly conservative propaganda outlets and the quasi-legitimate ones with young, desperate journalism grads looking to make their bones by defying political correctness by picking on all-but-defenseless identifiable groups of people...
By comparison, First Monday will seem like Inherit the Wind.
The U.S. Navys Future Fleet Will Run Aground In Heavy Weather – Forbes
Posted: at 10:46 pm
Small surface ships will struggle in high seas
The sea is a tough place, and, given that stormy seas often damage ships and endanger sailors, the Navy has habitually worked to keep vessels out of harm's way since 1944. But over the past thirty years the Navy has become so risk-averse that the U.S. surface Navy vacated several strategic-but-stormy seas.
That retreat, and the general loss of sustained heavy-weather experience by the cost-conscious post-Cold War U.S. Navy, has had real consequences. As the memory of sustained stormy-weather operations faded under the weight of a tough anti-terror operational tempo, the number of U.S. sailors and other naval tastemakers who understood that battle in high seas demanded ships with particular sea-keeping features dwindled away.
So the question remains: Do tastemakers like Secretary of Defense Mark T. Esper, who, in a February 27th letter to House Armed Services Committee Chairman Adam Smith, argued for "more smaller surface combatants" and "greater reliance on lightly and optionally-manned ships," really understand that they may be arguing for a fleet that will be more effective fighting from a pier than out in the contested seas the future Navy is meant to secure?
Sea States Mattered:
Up until a little more than 18 months ago, almost an entire generation of U.S. sailors lacked experience sailing in the rough seas north of the Arctic Circle. In late 2018, Carrier Strike Group Eight was the first U.S. aircraft carrier battle group to operate in the Norwegian Sea in 27 years. The experience, along with several others, showed that the Navy had lost a lot of old operational secrets and practices needed to project power in stormy weather.
The same can be said for design. Back in the Cold War, naval designers grew surface combatants to, in part, better prosecute combat in the high seas. The enormous displacement of an old Cold War mainstay, the Spruance-class destroyer, was controversial. At over 8,000 tons, the Spruance displaced twice as much as America's previous front-line destroyer, the Charles F. Adams class.
But back in the early 1980s, when the U.S. Navy was a bit more concerned about the impact of storms and high seas upon the operational capability of U.S. Navy ships, studies cautioned that even the Spruance-class destroyers were only fully operable 80 percent of the time at Sea State 5 and barely operable 20 percent of the time at Sea State 6.
Cold War naval designers had super-sized the destroyer to, in part, fight better in high sea states. But while America's giant nuclear carriers were barely affected by heavy seas, their escorts, even the big new Spruance-class destroyers, still struggled to remain effective.
Smaller ships have plenty of opportunities to struggle in high seas; in the open ocean of the Northern Hemisphere, Navy studies from 1982 estimated that the probability of seas of Sea State 6 or higher was almost 27 percent. The probability of Sea State 5 or higher was almost 50 percent. This was reflected in choices the Navy made as the Cold War wound down. The Navy shed frigates and other small ships at an enormous rate while retaining the Arleigh Burke class, a destroyer even larger than the Spruance class.
Sea States Still Matter:
Thanks to the end of the Cold War and comprehensive meteorological guidance, Navy ships could, and did, set their courses for the best weather possible. With no threat, such risk avoidance made sense. But as China and Russia reemerge, the Navy can no longer plan on operating in calm seas. The Navy must go to where the war is, and today, as storms are becoming stronger and more frequent, the chances of a fight in higher, rougher seas will only increase.
Meanwhile, Pentagon technologists like Secretary of Defense Mark Esper, an Army veteran who, as Secretary of the Army, urged the a-strategic dismantling of the Army's sea transport wing, are extolling the virtues of the low-cost, small-ship Navy. Does the Secretary of Defense, or the Deputy Secretary of Defense, David L. Norquist, who has been charged to lead a "comprehensive review and analysis" of the Navy's proposed future fleet force structure, actually understand the tradeoff between vessel size and high-seas effectiveness?
Certainly, frigates and small ships are useful; the Navy certainly needs a far wider variety of vessels. But fundamental systems engineering questions risk being overlooked in the rush to propose exciting and fundable small-ship concepts. Right now, Washington think tanks are proposing fun-sounding baubles like 2,000-ton minimally manned vessels to serve as floating arsenals for carrier strike groups without really digging into the nitty-gritty operational feasibility of such new schemes.
The question is simple. If an 8,000-ton destroyer is unable to operate fully in Sea State 5 or higher, how well will a far more sophisticated and delicate 2,000-ton optionally-manned missile boat be ready to fight? How will these small vessels keep up with the carrier strike groups they are charged to defend? Have the sensitive technologies necessary for these small ships to fight actually been optimized and tested in real-world small-ship sea conditions?
There's a reason why U.S. Navy surface combatants have gotten big: they need to do a lot of complex warfighting-oriented things. They must keep up with the carriers they defend, and they need to be operational in high seas. Small ships can do lots of similar things too, but they cannot do as well at keeping up with an aircraft carrier in high seas and will have a hard time staying operational in even ubiquitous mid-sized seas.
Small vessels are fine, but they are no panacea. When the seas are big, the lighter, smaller and cheaper fleets favored by budget-minded technocrats risk becoming ineffective. For challenged navies, high seas are an immutable fact of life. But, for the past thirty years, the U.S. Navy has avoided them, and forgotten a lot. And now that a former Army paratrooper and a Certified Government Financial Manager are poised to fundamentally reshape the U.S. Navy, the Navy itself is poorly positioned to even try to express the deep operational risks posed by dramatic changes in naval composition.
In this headlong rush to leverage new technologies and hot new concepts, the fancy PowerPoint slides that point the Pentagon towards a cheap, pint-sized and optionally-manned fleet still have a long way to go before being converted into operational reality. In particular, the Navy needs to explain these operational challenges to David Norquist. If they don't, David Norquist will do what his brother could not. While Grover Norquist has failed in his quest to reduce the U.S. Government to the size where he can "drag it into the bathroom and drown it in the bathtub," Grover's highly-regarded brother, if allowed to make decisions based largely on accounting principles and exciting PowerPoint concepts, may be set to do just that very thing to the U.S. Navy.
Hitler’s Super Warship: Was the Battleship Bismarck Really Supposed To Be Invincible? – The National Interest Online
Posted: at 10:46 pm
In 1960 Twentieth Century Fox released the film Sink the Bismarck! Based on C.S. Forester's bestselling book The Last Nine Days of the Bismarck, the documentary-style film tells a gripping and reasonably factual account of the most famous sea chase in history.
In an early scene, German Fleet Admiral Günther Lütjens addresses the crew of the battleship as they head out to the Atlantic. With the typically bellicose posturing usually portrayed in American war films, Lütjens proclaims, "Officers and men of the Bismarck! This is the fleet commander. I can now tell you that we are going out into the North Atlantic to attack the British convoys. We are going to sink their ships until they no longer dare to let them sail! It is true we are only two ships [Bismarck was sailing with the heavy cruiser Prinz Eugen]. But the world has never seen such ships! We are sailing in the largest, the most powerful battleship afloat, superior to anything in the British Navy! We are faster, we are unsinkable!"
From that point on, the viewer is left with little doubt of the German warship's invincibility and power. Yet this is not true. Bear in mind that the movie was made in 1959, a full 18 years after the Bismarck had been sunk. This has become the Bismarck legend. But most legends have no more validity than what one accepts at face value.
Like many other historical icons, Bismarck's power has been greatly magnified and distorted. What was once believed about Bismarck is pure fiction. In fact, rather than the most powerful battleship in the world, she was actually among the ranks of the less heavily armed capital warships of 1941. True, her engineering, fire control, engines and gunnery were superb. But those factors alone do not warrant top billing.
The development of heavy warships after 1906, when HMS Dreadnought, the first all-big-gun ship, was launched, was a steady climb in size and power. But it was most often a constant duel between size, weight of armor, speed, and gun caliber.
During World War I, the old tactic of battleships steaming in parallel lines battering away at one another ended with the epic Battle of Jutland. In four separate encounters on May 31, 1916, two huge fleets met off Danish Jutland in the North Sea. When it was over, three British battlecruisers had blown up, but the main force of the German High Seas Fleet and the British Grand Fleet had suffered little crippling damage. Even when the biggest guns were employed, it was armor protection that mattered most. Unfortunately, some naval design experts had yet to grasp this fact.
All Jutland proved was that the old way of ending wars with battleships was over.
When the Third Reich dawned in 1933, Germany had already begun a massive shipbuilding program. Destroyers, cruisers and, most effectively, U-boats were constructed in great numbers, but the queens of the sea would still be the mighty battleships. Senior Kriegsmarine officers believed they could be far more effective at hitting and sinking convoys, the lifeline of the United Kingdom, than in dangerous ship-to-ship duels.
Grand Admiral Erich Raeder, commander of the Kriegsmarine, first commissioned the building of three Deutschland-class cruisers, Deutschland, Admiral Scheer, and Admiral Graf Spee. While officially heavy cruisers, they were euphemistically called pocket battleships. Each panzerschiff, or armored ship, carried six 11-inch guns in two turrets as its main armament.
Three 14,500-ton Admiral Hipper-class cruisers, Hipper, Blücher, and Prinz Eugen, each carried eight 8-inch guns in four turrets. Formidable in themselves, they were soon superseded.
The powerful 32,000-ton Scharnhorst and Gneisenau were launched in 1936. They each carried nine 11-inch guns in three turrets. A certain hazy sense of purpose surrounds these two ships. They were referred to at various times as battlecruisers, heavy cruisers, and even battleships. Since the battlecruisers were traditionally meant to act as fast scouts rather than capital ships, this betrays an uncertainty in the Kriegsmarine as to what their role was meant to be.
Not so for Bismarck, laid down in 1936 and launched at the Blohm & Voss shipyard near Hamburg on St. Valentine's Day, 1939. A beaming Hitler attended the ceremonies.
The German battleship Bismarck has long been regarded as the most powerful capital ship ever to go to sea. However, closer examination reveals that such may not be the case.
The new battleship was to be armed with eight 15-inch guns in four turrets and a dozen 5.9-inch rifles in six turrets. At 42,000 tons and protected by 13 inches of armor, Bismarck was the biggest warship ever built in Germany. With both radar and advanced fire control systems to aim her guns, she was capable of doing great damage to other warships and totally destroying any unarmored merchant ship with ease.
The Royal Navy watched her progress with trepidation. When war broke out the primary targets of the German warships were the Atlantic convoys that provided Britain with vital supplies of food and raw materials. They carried munitions, planes, tanks, food, supplies, and troops to Great Britains armies. If the vulnerable transports and tankers could be sunk, it was only a matter of time before Britain would fall.
However, Bismarck was not feared for her firepower alone. The British Admiralty worried over what she could do to convoys, Britains lifeline. The Royal Navy needed to stop her.
In the spring of 1941, Bismarck was undergoing sea trials in the Baltic Sea. When she and her consort, the Prinz Eugen, finally left the Baltic and Norwegian waters to head out to the Atlantic, the fate of Great Britain was uncertain. Already Scharnhorst and Gneisenau had sunk 22 ships totaling 115,000 tons. And they had nowhere near Bismarck's firepower.
In May, 16 convoys were out in the Atlantic, headed for the Mediterranean or the British Isles. Even with Royal Navy destroyers, cruisers, and battleships providing escort, they were all vulnerable to Bismarck's huge guns.
Bismarck was the all-consuming obsession of the British Admiralty. For six days, through good and bad weather, good luck and tragedy, two fleets and nearly a dozen individual warships tried to find, engage, and sink the German behemoth.
On May 24, the pride of the Royal Navy, the huge battlecruiser HMS Hood, met up with Bismarck in the Denmark Strait. When Hood and the "terror of the seas" met for the first and last time, it really came down to the two biggest kids on the block slugging it out to see who was toughest. One was an old fighter with a heavier punch but a shorter reach, while the other was a young boxer who could hit faster.
Less than 10 minutes after they opened fire on each other, the mighty Hood received a hit that pierced her main ammunition magazines and exploded in a massive detonation that killed all but three of her 1,400 crewmen. What really mattered was not the size of the guns. It was range, armor protection, and accuracy. Hood and Bismarck carried almost identical main armament.
One of Bismarck's 15-inch gun turrets, this one named Bruno, looms above the deck as sailors go about their business. Under close scrutiny, the broadside punch of the battleship's heaviest weapons was only average among the warships of other nations.
Hood's loss was a deep blow to Great Britain, and it only served to steel British resolve. To the rest of the world watching the sea drama unfolding, it seemed to prove that Bismarck was invincible. Sinking the Hood was a propaganda bonanza for the Third Reich. Avenging the Hood was a rallying cry for the British nation. Neither side could back down.
The Royal Navy scraped together every available ship, and in the end, by the sheerest luck and steadfast determination, two Royal Navy battleships finally turned Bismarck into a flaming wreck.
For more than 70 years Bismarck's superiority has been taken for granted. The 1960 film added to the legend, and in time it was taken as fact. But how did it start? Who was the first to make the statement that Bismarck was incomparable? Careful research among German and British archives from the Imperial War Museum and the Naval Historical Center reveals not a single public pre-1941 proclamation of Bismarck as the most powerful and/or biggest battleship in the world. Not even the Nazi War Ministry or the Propaganda Ministry seems to have made such a claim. Josef Goebbels, Minister of Propaganda, certainly the master of deceit and spin control, would have been the logical one to say it, but he was too smart. Any naval expert would have challenged a boast of Bismarck's strength, and the Third Reich would have lost face.
The closest to such a claim came during her launching, when Hitler proudly stated that Bismarck and her sister Tirpitz were the most powerful warships ever built in Germany. That too is not fully accurate. Back in 1916, during the height of the Great War, SMS Bayern was launched. She was the first of Kaiser Wilhelm II's new super-dreadnoughts. She carried no less than eight 15-inch guns, the same as Bismarck would carry 23 years later. Hitler seems to have forgotten this minor point.
The most the Germans could "honestly" say, if such a word would ever be recognized by the German Propaganda Ministry, is that Bismarck was the newest and most advanced warship in the world. After a careful study of the major warships of the time, it appears the mighty Bismarck's bark was worse than her bite.
Naval guns, by the spring of 1941, were as good as they would ever get. Their size and range increased from the early 12-inch cannon used on the pioneering HMS Dreadnought in 1906, growing by leaps and bounds by the beginning of the Great War. Soon even 13.5-inch guns were overtaken by the massive 15-inch guns of the colossal Queen Elizabeth-class super-dreadnoughts, which set the standard in the Royal Navy that held sway for the next 20 years. But there were exceptions. The sister battleships HMS Nelson and HMS Rodney, launched in 1920 and 1922, respectively, were fitted with nine 16-inch guns in three triple-gun turrets, the largest guns ever cast by the British and the heaviest ever mounted on a British warship; they were later matched by the American Iowa-class battleships.
The battlecruisers built by the British Royal Navy prior to World War I sacrificed armor protection for speed. It was this tradeoff that proved fatal to the great Hood, pictured here, in her brief battle with Bismarck in May 1941.
The pendulum between more guns and bigger guns swung back and forth, partially due to cost and the configuration of the proposed vessels.
Thus, prior to World War II the newest battleship in the Royal Navy was the King George V, with ten 14-inch guns in three turrets. The forward and aft guns were set in two ponderous four-gun turrets, while the last two were set in a high-mounted twin turret.
This illustrates the capricious nature of battleship design in the interwar period and the early 1940s. The 14-inch gun was the standard in the U.S. Navy, appearing on nearly every battleship from the USS Nevada until the launching of USS Iowa in 1942. Nevada carried ten 14-inch guns, while the later USS Arizona boasted 12 guns in four turrets.
France's largest battleships, Jean Bart and Richelieu, each carried eight 15-inch guns. Italy's capital battleship Vittorio Veneto had nine guns of the same caliber and was rated at 40,000 tons and 780 feet long.
Of course, any examination of World War II battleships must include the Japanese super battleships Yamato and Musashi. Yamato had been launched by the time of the Bismarck chase but would not be commissioned until December 1941. At 65,000 tons, Yamato and Musashi carried nine immense 18-inch guns, the largest ever mounted on a ship. These were the apogee of battleship design, but both remained vulnerable to carrier-based aircraft and were sunk by U.S. Navy planes during the war.
To clearly illustrate how Bismarck's armament was less than equal to many if not most of the world's major warships, it will be necessary to look at certain criteria. Main armament, including caliber, weight of shell, and range, is the most important criterion for a battleship's guns, indeed its very reason for existence. Using a simple formula of the number of guns multiplied by their size provides a warship's Total Gun Caliber (TGC). This is only meant as a means of ranking a ship's gun size. Another formula, Total Weight of Broadside (TWB), is also used to help the ranking.
Many other factors need to be considered, such as range, rate of fire, fire control, and accuracy. Bismarck, as a new, highly advanced warship with state-of-the-art German engineering, was arguably technologically superior to anything in the Royal Navy in 1941.
After a careful look at the TGC and TWB ratings, some surprising results emerge. Japan's Yamato, with a TGC of 168, ranks second behind the Japanese battleship Nagato at 198. Yet as TWB is rated, the numbers are reversed: Nagato could fire a heavier broadside than her newer, bigger descendant. Interestingly, the U.S. Navy's Arizona and Tennessee had the same 168 TGC as Yamato, although their gun range and weight of broadside were inferior. Overall, Japan's battlewagons rank highest, while the United States and Great Britain hover above France and Italy. The mighty Bismarck, the "terror of the seas" as Johnny Horton's 1959 novelty song proclaimed, is dead last.
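The TGC and TWB arithmetic described above can be sketched in a few lines of Python. This is a minimal illustration of the article's own formulas (guns multiplied by caliber; guns multiplied by shell weight), not an official methodology: the long-ton conversion is an assumption, since the article never states its units or rounding, and some of its quoted figures (such as Yamato's and Nagato's TGC) evidently involve adjustments it does not spell out.

```python
# Sketch of the article's two ranking formulas:
#   TGC = number of main guns x bore diameter in inches
#   TWB = number of main guns x shell weight, here converted to long tons

LONG_TON_LB = 2240  # assumed conversion; the article does not name its ton


def total_gun_caliber(guns: int, caliber_in: float) -> float:
    """Rank a ship's gun size: guns multiplied by caliber in inches."""
    return guns * caliber_in


def total_weight_of_broadside(guns: int, shell_lb: float) -> float:
    """Weight of one full main-battery broadside, in long tons."""
    return guns * shell_lb / LONG_TON_LB


# Figures quoted in the article that the simple formula does reproduce:
print(total_gun_caliber(8, 15))    # Hood and Bismarck, eight 15-inch guns: 120
print(total_gun_caliber(12, 14))   # USS Arizona, twelve 14-inch guns: 168
print(round(total_weight_of_broadside(8, 1900), 3))  # Hood's 1,900-lb shells
```

Note that the article's broadside figures for Hood and Bismarck (7.238 and 6.857 tons) do not fall out of a bare pounds-to-long-tons conversion of the quoted shell weights, which is why that conversion is flagged above as an assumption.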
The Japanese battleship Yamato, shown in harbor during construction, along with her sister Musashi mounted the heaviest guns ever placed aboard a modern warship.
Hood and Bismarck were evenly matched. Both had a TGC of 120 and a nearly identical TWB of 7.238 tons and 6.857 tons, respectively. In fact, Hood's shells weighed 1,900 pounds while her opponent fired 1,800-pound projectiles. Even with heavier shells, Hood's 29,000-meter range was 6,000 meters shorter than Bismarck's. Only Bismarck's range and gunnery were superior. In the end, it was a lack of armor protection that doomed Hood.
So how did the world come to accept the boast? Bismarck was only considered the most powerful battleship in the world long after she had been sunk. It was part of the legend. And the Royal Navy, having lost the vaunted Hood and then destroying the German behemoth, looked better if Bismarck had been the superior vessel.
The truth is, for just nine short days, Bismarck was the newest and most advanced battleship in the world. Sooner or later she would have met her match, as all boastful bullies eventually do.
Author Mark Carlson has written on numerous topics related to World War II and the history of aviation. His book Flying on FilmA Century of Aviation in the Movies 1912-2012 was recently released. He resides in San Diego, California.
This article by Mark Carlson first appeared at the Warfare History Network in January 2019.
Image: Bismarck in port in Hamburg, 24 August 1940. Bundesarchiv.
New severe weather warnings issued in Ireland ahead of Storm Jorge – IrishCentral
Posted: at 10:46 pm
Met Éireann has posted new and updated weather warnings as Storm Jorge nears Ireland.
Storm Jorge, the third major storm set to hit Ireland this month, is expected to bring extreme winds and rain across the country this weekend.
Read More: Four weather warnings issued across Ireland ahead of Storm Jorge
Met Éireann meteorologists said that Storm Jorge is the seventh named storm of the season. It was originally named by AEMET, the Spanish national meteorological service, due to the impact of the storm's active cold front, which is forecast to bring severe gusts and strong waves to the northwest of Spain.
On Friday, Met Éireann issued new weather warnings, including a Status Red (the most severe) wind warning for two counties in the west of Ireland:
Very severe winds associated with Storm Jorge (Hor-hay) on Saturday.
Westerly winds will reach mean speeds of 85 to 100km/h in places on Saturday afternoon with gusts of 130 to 145km/h, with an elevated risk of coastal flooding.
Valid: 1 pm Saturday, February 29, 2020, to 4 pm Saturday, February 29, 2020
Issued: 4 pm Friday, February 28, 2020
Severe winds associated with Storm Jorge (Hor-hay) on Saturday.
Westerly winds will reach mean speeds of 65 to 80km/h for a time on Saturday afternoon and early evening with gusts of 110 to 120km/h, possibly higher in very exposed areas.
Valid: 1 pm - 7 pm Saturday, February 29, 2020
Issued: 4 pm Friday, February 28, 2020
Strong winds associated with Storm Jorge (Hor-hay) on Saturday.
Westerly winds of mean speeds 50 to 65km/h on Saturday evening and early Saturday night with gusts of 90 to 110km/h expected.
Valid: 7 pm to 11:59 pm Saturday, February 29, 2020
Issued: 4 pm Friday, February 28, 2020
1. Southwesterly gales or strong gales will develop overnight on Irish coastal waters from Roches Point to Slyne Head to Rossan Point, extending to all Irish coastal waters and to the Irish Sea tomorrow morning.
2. Winds will veer westerly during Saturday morning and afternoon, increasing gale force 8 to storm force 10 and reaching violent storm force 11 at times between Mizen Head and Erris Head.
Read More: Snow and ice warning issued for the whole of Ireland
The new weather warnings come in addition to several other warnings that Met Éireann issued on February 27. On February 28, some updates were made to the existing warnings:
UPDATE: Rainfall accumulations generally of 20 to 30mm expected during Friday and Saturday, but 40 to 50mm possible in mountainous areas, with a continuing risk of flooding due to already saturated ground and elevated river levels.
Valid: 12:01 am Friday, February 28, 2020, to 11:59 pm Saturday, February 29, 2020
Issued: 11 am Thursday, February 27, 2020
Updated: 7:48 pm Thursday, February 27, 2020
Strong winds associated with Storm Jorge (Hor-hay) on Saturday.
Southwesterly winds of mean speeds 50 to 65km/h on Saturday morning with gusts of 90 to 110km/h expected.
Valid: 9 am to 1 pm Saturday, February 29, 2020
Issued: 1:39 pm Thursday, February 27, 2020
Updated: 4:09 pm Friday, February 28, 2020
Read More: Storm Dennis lashes Ireland with 75 mph winds
On February 27, Joan Blackburn (Deputy Head of Forecasting Division), Sinéad Duffy (Meteorologist, Technology Division) and Eoin Sherlock (Head of Flood Forecast Division) offered these remarks regarding Storm Jorge in Ireland:
Storm Jorge (named by AEMET, the Spanish meteorological service) is the latest in a series of Atlantic storms this month and is due to affect Ireland from early Saturday. Rain will extend countrywide from the west tonight, before the storm arrives.
Storm Jorge (pronounced Hor-hay) is a storm centre which will undergo rapid cyclogenesis in the mid-Atlantic during Friday 28th February as it tracks northeastwards towards Ireland. It is then expected to fill slowly as it crosses over the north of the country during Saturday 29th February.
Storm Jorge is forecast to bring severe winds to western and northwestern coastal counties (orange wind warning) and less severe winds to the rest of the country (yellow wind warning) from Saturday morning into early Sunday morning.
Spells of heavy rain associated with Storm Jorge will worsen the flooding situation across the country. A yellow level rainfall warning will come into operation for Munster, Connacht and Donegal from tonight (Thursday night) to late on Saturday evening.
"Currently river levels are elevated across the country, particularly in the Midlands (Shannon catchment). Levels across the Northern half of the country are also high. Therefore, additional rainfall over the coming days will compound the flooding issues here.
"We are in a period of transition between Spring (High) Tides and Neap (Low) Tides. This means there will not be a large variation between high and low tides. The combination of high seas and strong winds or stormy conditions associated with Storm Jorge may increase the possibility of coastal flooding, especially in flood-prone areas along the Atlantic coast on Saturday (particularly when coincident with high tides)."
Read more from the original source:
New severe weather warnings issued in Ireland ahead of Storm Jorge - IrishCentral
Column: Socialism is not the American way | Opinion – Duncan Banner
Posted: at 10:45 pm
The Democratic Party's tightening embrace of socialist proposals and politicians is real cause for alarm, especially in a country whose very foundation is liberty. Indeed, the precious freedoms that exist here in America have always set us apart. Just think about how many people still want to come to the United States to have a share in the American dream. That should say a lot about just how precious our freedoms are and why we must never take them for granted.
Because of the freedoms secured by brave patriots long ago and preserved time and again for generations, our land is one of endless opportunity where hard work, determination and innovation are rewarded, and the American dream can be achieved regardless of where you've come from. All of it rests on the underlying foundation of freedom, opportunity and the rule of law. With socialism's popularity dangerously rising, don't be deceived. Our framers trusted free people, free speech and free markets for a reason, and we should continue to do so.
In the course of the current presidential campaign, several Democratic presidential candidates have either rolled out or voiced their support for socialist proposals. Even more disconcerting, self-proclaimed socialist Senator Bernie Sanders recently went so far as to offer a measure of praise for communist dictator Fidel Castro of Cuba, yet the regime Castro founded still mistreats and oppresses its own citizens and still associates with other oppressive and corrupt governments. It's worth remembering that many Cubans have embarked on journeys to flee this oppression and live freely in America, revealing the true state of their oppressed homeland and the dangers posed by communism.
But the Democratic embrace of socialism extends beyond the presidential debate stage. Throughout this Congress, House Democrats have put forward numerous legislative proposals that are alarmingly radical and, quite simply, socialist in nature. Widely talked about are bills like the so-called Medicare for All and Green New Deal, as well as budget-busting proposals for free college tuition and universal basic income.
While it might be appealing to pursue what sounds like the be-all and end-all with Medicare for All, Democrats promised that less than a decade ago with the so-called Affordable Care Act. Back then, Americans were promised they could keep their doctors, that they could stay on their current plans and that their premiums would go down. None of those things turned out to be true, yet Democrats have pushed for a total government takeover of the industry with their Medicare for All proposal. Aside from its staggering cost, estimated at more than $32 trillion over 10 years, Medicare for All would cause more than 158 million Americans to lose their current coverage. In fact, private health insurance would be completely banned. That means anyone with private, employer-based or union-based health insurance would lose their plan in favor of the government's one-size-fits-all coverage. Even if you like your plan, there's no question you really wouldn't be able to keep it. Moreover, the Medicare system, which millions of recipients have paid taxes into for a lifetime, would be flooded by people who have paid little or nothing into the system.
When it comes to the Green New Deal, Republican opposition does not mean Republicans don't care about the environment. Certainly, there is an abundance of ideas about how we can be better stewards of the earth and good stewards of taxpayer dollars as well. Unfortunately, the Green New Deal is really socialism masquerading as environmentalism. Even though the plan was presented as the means to save the earth from destruction, only a small part of it actually addresses environmental policy. In fact, much of the proposal's cost would go toward purely socialist policies like a federal job guarantee and economic security for all who are unable or unwilling to work.
Several Democratic presidential candidates have echoed the idea of a universal basic income program, providing a regular, taxpayer-funded paycheck to cover basic living expenses and eliminating the value and necessity of work. At a cost of at least $2.8 trillion annually, such a program would not only demolish the federal budget but also discourage people from working at all. The consequences would be disastrous for the nation's economy and for job creators who want to hire more, not fewer.
America is not a country that thrives because of big government. It thrives because its citizens know that hard work is rewarded, freedom is protected, and individuals can enjoy the fruits of their labor as they see fit. It is that dream of personal freedom and individual property that has defined America as the land of opportunity. It is that opportunity that has and will continue to draw people to our land while making America the envy of the world.
To contact Congressman Cole, call 202-225-6165 or 580-357-2131.
Here is the original post:
Column: Socialism is not the American way | Opinion - Duncan Banner
New Intel chip could accelerate the advent of quantum computing – RedShark News
Posted: at 10:44 pm
The marathon to achieve the promise of quantum computers has edged a few steps forward as Intel unveils a new chip capable, it believes, of accelerating the process.
Called Horse Ridge and named after one of the coldest places in Oregon, the system-on-chip can control a total of 128 qubits (quantum bits), more than double the number of qubits Intel heralded in its Tangle Lake test chip in early 2018.
While companies like IBM and Microsoft have been leapfrogging each other with systems capable of handling ever greater numbers of qubits, the breakthrough in this case appears to be an ability to lead to more efficient quantum computers by allowing one chip to handle more tasks. It is therefore a step toward moving quantum computing from the lab into real commercial viability.
Applying quantum computing to practical problems hinges on the ability to scale, and control, thousands of qubits at the same time with high levels of fidelity. Intel suggests Horse Ridge greatly simplifies current complex electronics required to operate a quantum system.
To recap why this is important, let's take it as read that quantum computing has the potential to tackle problems conventional computers can't by leveraging a phenomenon of quantum physics: qubits can exist in multiple states simultaneously. As a result, they are able to conduct a large number of calculations at the same time.
This can dramatically speed up complex problem-solving from years to a matter of minutes. But in order for these qubits to do their jobs, hundreds of connective wires have to be strung into and out of the cryogenic refrigerator where quantum computing occurs (at temperatures colder than deep space).
The extensive control cabling for each qubit drastically hinders the ability to control the hundreds or thousands of qubits that will be required to demonstrate quantum practicality in the lab not to mention the millions of qubits that will be required for a commercially viable quantum solution in the real world.
Researchers outlined the capability of Horse Ridge in a paper presented at the 2020 International Solid-State Circuits Conference in San Francisco and co-written by collaborators at Dutch institute QuTech.
The integrated SoC design is described as being implemented using Intel's 22nm FFL (FinFET Low Power) CMOS technology and integrates four radio frequency channels into a single device. Each channel is able to control up to 32 qubits by leveraging frequency multiplexing, a technique that divides the total available bandwidth into a series of non-overlapping frequency bands, each of which is used to carry a separate signal.
With these four channels, Horse Ridge can potentially control up to 128 qubits with a single device, substantially reducing the number of cables and rack instrumentations previously required.
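As a rough illustration of the arithmetic, here is a Python sketch of frequency-multiplexed channel allocation. The 4-8 GHz control band is an invented assumption for illustration, not Intel's published figure; only the 4-channel, 32-qubit-per-channel split comes from the article.

```python
# Illustrative sketch of frequency multiplexing: one control channel's
# bandwidth is divided into non-overlapping sub-bands, one per qubit.
# The 4-8 GHz band is an assumption, not Intel's published figure.
CHANNELS = 4
QUBITS_PER_CHANNEL = 32
BAND_LOW_GHZ, BAND_HIGH_GHZ = 4.0, 8.0

width = (BAND_HIGH_GHZ - BAND_LOW_GHZ) / QUBITS_PER_CHANNEL

def sub_band(qubit_index):
    """Return the (low, high) GHz edges of one qubit's sub-band."""
    low = BAND_LOW_GHZ + qubit_index * width
    return (low, low + width)

total_qubits = CHANNELS * QUBITS_PER_CHANNEL
print(total_qubits)               # 128 qubits from a single device
print(sub_band(0), sub_band(1))   # adjacent, non-overlapping bands
```

Because each qubit's signal occupies its own band, one physical cable per channel replaces one cable per qubit, which is the wiring reduction the article describes.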
The paper goes on to argue that increases in qubit count trigger other issues that challenge the capacity and operation of the quantum system. One such potential impact is a decline in qubit fidelity and performance. In developing Horse Ridge, Intel optimised the multiplexing technology that enables the system to scale and reduce errors from crosstalk among qubits.
"While developing control systems isn't, evidently, as hype-worthy as the increase in qubit count has been, it is a necessity," says Jim Clarke, director of quantum hardware, Intel Labs. "Horse Ridge could take quantum practicality to the finish line much faster than is currently possible. By systematically working to scale to the thousands of qubits required for quantum practicality, we're continuing to make steady progress toward making commercially viable quantum computing a reality in our future."
Intels own research suggests it will most likely take at least thousands of qubits working reliably together before the first practical problems can be solved via quantum computing. Other estimates suggest it will require at least one million qubits.
Intel is exploring silicon spin qubits, which have the potential to operate at temperatures as high as 1 kelvin. This research paves the way for integrating silicon spin qubit devices and the cryogenic controls of Horse Ridge to create a solution that delivers the qubits and controls in one package.
Quantum computer applications are thought to include drug development (high on the world's list of priorities just now), logistics optimisation (that is, finding the most efficient way from any number of possible travel routes) and natural disaster prediction.
More:
New Intel chip could accelerate the advent of quantum computing - RedShark News
Cracking the uncertainty around quantum computing – Information Age
Posted: at 10:44 pm
Aravind Ajad Yarra and Saji Thoppil, fellows at Wipro Limited, answer frequently asked questions about quantum computing
What should be kept in mind when implementing quantum technology?
Today's leaders are inundated with talk of the disruptive power of quantum computing and its potential applications in AI, machine learning and data science. Gartner data reveals that by 2023, 95% of organisations researching it will utilise quantum-computing-as-a-service (QCaaS) to minimise risk and contain costs. Also, 20% of organisations will be budgeting for quantum computing projects, compared to less than 1% today.
We, Aravind Ajad Yarra, fellow, Wipro Limited, and Saji Thoppil, fellow and chief technologist, cloud and infrastructure services, Wipro Limited, bring you the basics of quantum computing and demystify some of its lesser-known facets in today's evolving scenario.
Lets look at the commonly asked questions:
A: Most of us encountered quantum mechanics in high-school physics and were probably baffled by its strange characteristics. Quantum mechanics is the physics that applies at atomic and subatomic scales. The idea of applying the physics of quantum mechanics to computation is what has led to quantum computing.
Our present-day computing is largely based on Boolean logic, represented using binary bits, which assume the value of either 0 or 1. Quantum computing, on the other hand, uses quantum bits (qubits), which behave differently from classical bits and exploit the quantum superposition state, in which each qubit can assume both 0 and 1 at the same time.
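As a toy illustration in plain Python (not a real quantum device or any particular quantum library), a qubit can be modelled as a pair of amplitudes whose squares give the measurement probabilities:

```python
import math

# A classical bit is 0 or 1; a qubit holds two amplitudes (a, b) with
# a^2 + b^2 = 1, where a^2 is the probability of measuring 0 and b^2 of 1.
def hadamard(state):
    """Apply the Hadamard gate, which creates an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)            # the definite state |0>
superposed = hadamard(zero)  # amplitudes (1/sqrt(2), 1/sqrt(2))

p0, p1 = superposed[0] ** 2, superposed[1] ** 2
print(round(p0, 3), round(p1, 3))   # 0.5 0.5: both outcomes at once
```

A classical bit in this picture would always have one probability equal to 1; the superposed qubit carries both outcomes until it is measured.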
To get better clarity, I suggest reading this short article on quantum computing.
A: Quantum computing is one of the most exciting developments in recent computing history. For years, Moore's law has kept the innovation cycle in computing going and pushed the boundaries of what computing can offer to business, so much so that software is what is driving digital businesses. With Moore's law reaching its saturation point, everyone is eagerly looking for what's next in computing. Quantum computing is seen as something that can keep the computing innovation cycle going, hence the buzz.
If you listen to the general hype, you might believe quantum computing will replace classic computing soon. However, that is far from reality. The superposition property that we mentioned earlier gives quantum computing a unique capability that traditional computing doesn't have. Simply put, qubit superposition allows quantum computing to solve promptly certain classes of problems that might otherwise take classical computers years.
IBM has established a roadmap for reaching quantum advantage and concluded that, for significant improvement over classical systems, the power of quantum computers must double every year.
A: Quantum computers are not bigger or faster versions of existing computers. Quantum computing is fundamentally different from existing computing. The problems for which quantum computers are most useful are problems that classical computers are not good at.
Some of the classes of problems that quantum computers currently address are optimisation problems, for example the classic travelling salesman problem. As the number of cities increases, classical computers find it exponentially harder to find an optimum solution. Quantum computers have proved very useful for these classes of problems. Solving them makes quantum computers super useful in areas like gene analysis, drug discovery, chemical synthesis, weather simulations, newer types of encryption, unstructured search, and better deep neural networks, to name a few.
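A minimal brute-force sketch in Python (city coordinates invented for illustration) shows why the travelling salesman problem defeats exhaustive classical search: the number of candidate tours grows factorially with the city count.

```python
import math
from itertools import permutations

# Brute-force travelling salesman: evaluate every ordering of cities.
# 4 cities mean 24 permutations; 20 cities already mean roughly 6e16
# distinct tours, which is why classical computers struggle as the
# city count rises.
cities = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 4)}

def tour_length(order):
    """Total length of a closed tour visiting cities in the given order."""
    points = [cities[c] for c in order] + [cities[order[0]]]
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

best = min(permutations(cities), key=tour_length)
print(best, round(tour_length(best), 2))   # shortest closed tour, ~17.93
```

Heuristics tame this growth in practice, but finding a guaranteed optimum this way quickly becomes infeasible, which is the opening quantum approaches target.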
A: There are two major approaches to quantum computing that are currently in use: circuit-based computers (aka universal quantum computers), and adiabatic computers.
Universal quantum computers are based on logic gates and work similarly to the underlying logic foundations of classical computers. Hence, universal quantum computers are extremely useful for computing problems, building on our current knowledge base of solutions. However, the qubits required for universal quantum computers are extremely difficult to realise physically, because qubit instability makes such machines hard to produce.
Adiabatic computers are analog but easier to produce, as they are more relaxed with respect to qubit-state stability. Hence, it is easier to put thousands of qubits on adiabatic computers. However, adiabatic computers can be used only for limited use cases such as optimisation problems.
A: While most platform companies working to build quantum computers are placing bets on one approach or the other, enterprises can probably explore both models. While adiabatic computing is limited, there are production-ready adiabatic computers using real quantum bits (such as those from D-Wave), as well as digital annealers, which use digital qubits (from Atos and Fujitsu).
Circuit-based quantum computers are much more general purpose. While these have more utility for enterprises, no production-grade problems can be solved with the current state of these machines. I would suggest exploring both classes of computers, based on the problem one is trying to solve.
A: The best way to start identifying use cases for quantum computing is to explore areas where classical computers currently struggle. Optimisation problems are the best starting point for most enterprises. Based on the industry, different kinds of optimisation use cases can be considered for exploring quantum computers. These could be risk modelling, inventory or asset optimisation, among others.
Cryptography is another area where enterprises can identify robust use cases. Quantum computers, when production-ready, can potentially break current methods of encryption, leading to exposure of sensitive data. Identifying data that is very sensitive and has longer-term value, and adopting safe encryption methods using quantum key generation and distribution, are other ways in which the technology can be used.
Machine learning is also a very promising use case. Quantum machine learning, as it is called, can use special purpose quantum circuits that can significantly boost the efficiency of machine learning algorithms.
A: Industries that are process-centric, such as pharmaceuticals and oil & gas exploration, are the early adopters. These industries can benefit from quantum computing in the complex optimisation problems they need to solve from time to time.
Apart from these asset-heavy industries, the manufacturing industry is also actively exploring quantum computing. Banks and other financial services companies, which have risk modelling needs, also rely a lot on quantum computing.
A: It is probably too early to talk about real-world scenarios where quantum computers have made an impact. While research labs have demonstrated quantum communication methods for data transfer from satellites and for breaking various encryption methods, these results remain confined to labs.
The reason is the current state of reliability in quantum computers. Qubits are highly sensitive and prone to errors, and the error-correction methods we currently use reduce the number of effective working qubits. Early results, however, have been seen with digital annealers, which simulate adiabatic quantum computing using traditional digital computers.
Wipro's Topcoder, for example, is currently working with Fujitsu to run crowdsourced challenges using Fujitsu's digital annealer to solve real-world problems. Additionally, Airbus has been running open innovation challenges to solve some of its problems using quantum computing.
Quantum technologies also have appeal in the areas of communication, cryptography, sensors and measurements. Unlike quantum computing, where practical use cases are still in exploratory stages, these areas have industry-ready products that enterprises can put to use.
Quantum communication takes advantage of the nature of photons in flight and is able to detect if a photon has reached the recipient uninterrupted; this can ensure secure communications.
While quantum key generation (QKG) is used to generate truly random keys, quantum key distribution (QKD) is used for securely distributing keys. Both of these are essential for using a one-time pad cryptography technique, which is considered the holy grail in encryption.
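A minimal Python sketch of the one-time pad itself, with `secrets.token_bytes` standing in for a quantum random source (in the scheme above, QKG would generate the key and QKD would deliver it):

```python
import secrets

# One-time pad: XOR the message with a truly random, single-use key of
# equal length. With such a key the ciphertext reveals nothing about
# the plaintext, which is why QKG and QKD pair so naturally with it.
message = b"attack at dawn"
key = secrets.token_bytes(len(message))      # stand-in for a quantum RNG

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))

print(recovered)   # b'attack at dawn'
```

The scheme's security rests entirely on the key being truly random, as long as the message, and never reused; those delivery and freshness guarantees are exactly what QKG and QKD supply.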
Additionally, quantum sensors have niche applications where there is a need for highly accurate measurements of gravity, electric fields, time, position and magnetic field. In a fiercely competitive world, we can expect more enterprises wanting to leverage these to create unique offerings.
Given the nature of its evolution, it is hard to make an upfront business case for quantum computing. However, given the potential, I suggest that the business case be made in two parts.
The first part is to focus on near-term (1-2 years) use cases such as optimisation and encryption, using digital annealers for optimisation and photon-based ASICs for key generation. Digital annealers, or even simulators running on cloud, can solve several practical optimisation problems.
In parallel, centres of excellence can be set up to build expertise and solve relevant problems. Returns from these investments would set the stage for the second part: mid- and longer-term (2+ years) use cases, such as machine learning and unstructured data search, explored through centres of innovation and open innovation communities with small investments but a longer horizon on returns.
Written by Aravind Ajad Yarra, fellow at Wipro Limited, and Saji Thoppil, fellow and chief technologist, cloud and infrastructure services, at Wipro Limited
More here:
Cracking the uncertainty around quantum computing - Information Age
IC Breakthroughs: Energy Harvesting, Quantum Computing, and a 96-Core Processor in Six Chiplets – News – All About Circuits
Posted: at 10:44 pm
According to Moore's law, since the introduction of the first semiconductors, the number of transistors on an integrated circuit has doubled approximately once every 18 months.
However, now that transistors are approaching near-atomic sizes, shrinking them further is becoming increasingly problematic, and as such, this doubling effect is beginning to plateau.
One technology research institute, CEA-Leti, is developing techniques to increase the power of semiconductors.
But what are these new technologies and how will they affect modern electronics?
Developers are increasingly searching for efficient ways to replace portable power sources that require charging or replacement.
However, such a feat is only possible if power can be extracted from the local environment, like in the instance of a device from the University of Massachusetts Amherst that powers small electronics from moisture in the air.
A more conventional method for energy extraction uses the Peltier effect, which requires a heat differential (such as cold air on a warm wrist), but these devices are often cumbersome and require heat sinks.
Another method is the use of vibration energy from motion, whereby a cantilever vibrates a piezo element, converting the mechanical energy to electrical energy.
But these systems are problematic because they are often tuned to a single vibration frequency. This means their efficiency is maximized only when the external mechanical energy is at that same frequency.
This is where CEA-Leti's energy harvesting system comes in.
The energy harvesting system converts mechanical energy into electrical energy to power an IC. While similar to a cantilever system, which converts mechanical motion into electrical energy using the piezoelectric effect, this cantilever is electrically tunable, allowing it to match its resonant frequency to the peak frequency of the external mechanical force.
Using an adjustable resonant system increases the harvesting bandwidth by 446% over typical cantilever systems and increases energy efficiency by 94%. The energy needed to control the system is two orders of magnitude lower than what the system harvests: the controller requires around 1 µW while the energy harvested is between 100 µW and 1 mW.
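The benefit of retuning can be sketched with a toy resonance model in Python. The Q factor and frequencies below are invented for illustration and are not CEA-Leti's measurements:

```python
# Toy model: a damped resonator's relative power response falls off
# sharply when the drive frequency misses the resonant frequency.
# An electrically tunable cantilever can move its resonance to match.
def response(drive_hz, resonant_hz, q_factor=50.0):
    """Relative power of a damped harmonic resonator driven at drive_hz."""
    r = drive_hz / resonant_hz
    return 1.0 / ((1.0 - r * r) ** 2 + (r / q_factor) ** 2)

detuned = response(drive_hz=47.0, resonant_hz=50.0)  # fixed cantilever
tuned = response(drive_hz=47.0, resonant_hz=47.0)    # retuned to match

print(tuned > detuned)   # True: matching resonance captures far more power
```

Even a few percent of mistuning costs most of the captured power in this model, which is why an electrically adjustable resonance widens the usable harvesting bandwidth.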
While quantum computers will bring some major changes to the field of computation, they are far from becoming commercialized.
Many hurdles, such as low-temperature requirements, make them difficult to put into everyday applications. One area in particular that is problematic is their integration into standard circuitry.
In a study on energy-efficient quantum computing, researchers explain that qubits, which are bits in superposition states, must be kept well away from external sources of energy. This is because any exposure to external energy puts the qubits at risk of collapsing their wavefunction. Such sources of energy can include magnetic field fluctuations, electromagnetic energy, and heat (mechanical vibration).
To make things more complicated, quantum computer circuitry is at some point required to interface with traditional electronic circuitry, such as analog and digital circuits. If these circuits are external to the quantum circuitry, space and speed become issues: remote circuitry takes more room, and the distance reduces the speed at which information can be accessed.
To address these issues, CEA-Leti has developed a quantum computing technology that combines qubits with traditional digital and analog circuitry on the same piece of silicon using standard manufacturing techniques.
The 28 nm FD-SOI process combines nA current-sensing analog circuitry, buffers, multiplexers, oscillators, and signal amplifiers with an on-chip double quantum dot whose operation is not affected even when the traditional circuitry runs at digital frequencies up to 7 GHz and analog frequencies up to 3 GHz.
The IC, which operates at 110 mK, provides nA current-sensing while operating on a power budget 40 times lower than competing technologies, to prevent interference with the quantum dots.
As the number of transistors on a chip increases, the chance of one failing also increases, thus decreasing the yield of wafers. One workaround is to make chips smaller and include fewer transistors while also connecting multiple chips together, thus increasing the overall transistor count.
However, PCBs have issues with connecting multiple dies together. These issues may involve limited bandwidth and the inability to integrate other active circuitry required by the dies, such as power regulation.
CEA-Leti has made a breakthrough in IC technology with its active interposer layer and 3D stacked chips.
Namely, the team has developed a 96-core processor on six chiplets, 3D stacked on an active interposer.
Just like the PCB topology, CEA-Leti uses a layer with metal interconnects that connects different dies on a single base. But unlike a PCB, the interconnection layer is a piece of semiconductor only 100 µm thick.
What makes the interposer more impressive is that it is active: it has integrated circuitry, including transistors. Therefore, the interposer can integrate power regulators, multiplexers, and digital processors, meaning that the dies directly attached to the interposer operate at high speeds and have all their supporting circuitry next to them.
The use of the active interposer also means that smaller ICs with reduced transistor counts can be combined to produce complex circuitry. This improves wafer yields, reduces their overall cost, and expands their capabilities.
These three technologies coming out of CEA-Leti give us a glimpse into a future where ICs may generate their own power or even be able to integrate quantum circuitry.
The energy harvesting technology may struggle to find its way into modern designs because most portable applications require relatively large amounts of power (compared to 1 mW) and these devices are often stationary.
The use of quantum circuitry with traditional construction techniques means that quantum security (which may become essential) can be integrated into everyday devices such as smartphones, tablets, and computers. Until quantum computing becomes commercial, though, this technology will likely remain niche.
The active interposer may be the first of the three technologies discussed here to become widespread, as it directly addresses the yield and scaling issues of modern transistor counts.
Is there a specific functionality you can't seem to find in an IC? What limitations do you feel are keeping researchers from making your "dream" IC breakthrough? Share your thoughts in the comments below.
Posted in Quantum Computing
Comments Off on IC Breakthroughs: Energy Harvesting, Quantum Computing, and a 96-Core Processor in Six Chiplets – News – All About Circuits
MIT's Top 5 tech breakthroughs for 2020 – Big Think
Posted: at 10:44 pm
MIT is no stranger to technology. It's one of the world's most productive and forward-facing tech research organizations. So when MIT gets excited looking forward, it only makes sense to sneak a peek at what they're seeing. MIT recently published its top 10 technological breakthroughs for 2020 and just beyond. Below are the first five on the list. Each one is an advance that MIT sees as genuinely changing our lives.
Image source: Umberto/unsplash
MIT says: Later this year, Dutch researchers will complete a quantum internet between Delft and the Hague.
Think of a coin. Lay it flat on a table, and it's either heads or tails. This is more or less how things work in the world at larger scales. To see what things are like at a much smaller, quantum size, spin the coin on the table and observe it from above. From our perspective, the coin's state could then be described as being both heads and tails at the same time, since it's neither one exactly. Being in this rapidly changing condition is like being in "superposition" in quantum physics.
To see, or measure, the coin's heads/tails state at any given moment, you'd have to stop it spinning, perhaps by flattening it against the table, where it would settle as either heads or tails. Thus measured, the coin would be taken out of superposition, just like a measured quantum particle.
In a classical computing system, data objects are represented by bits: strings of zeros and ones, AKA heads or tails. In the quantum world, however, what needs to be represented is that "spinning coin" of superposition in its as-yet-unresolved state. So quantum computing uses "qubits" instead of bits.
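The bit/qubit distinction can be sketched in a few lines of Python. This is a minimal illustration of the standard textbook model, not any particular quantum SDK: a qubit is a pair of complex amplitudes, and measurement collapses it to 0 or 1 with probabilities given by the squared magnitudes.

```python
import random
from math import sqrt

random.seed(42)  # fixed seed so the tally below is reproducible

# A qubit is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2, collapsing the state.
def measure(alpha: complex, beta: complex) -> int:
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition, the "spinning coin": both outcomes equally likely.
alpha = beta = 1 / sqrt(2)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly half heads, half tails
```

A classical bit, by contrast, is just the degenerate case alpha = 1 (always 0) or beta = 1 (always 1); the in-between amplitudes are exactly what a plain string of zeros and ones cannot capture.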
Obviously, being able to represent data with qubits, objects that collapse out of superposition if they're intercepted or tampered with, is an attractive prospect for an increasingly security-conscious world, and a natural foundation on which to build a super-secure quantum internet.
Still, qubits are far more complex than bits, and thus harder to process and exchange. Even worse, just as our spinning coin will eventually stop spinning and resolve as heads or tails (Inception aside), qubits lose their superposition after a while, making retaining and exchanging them in a superposed state a serious challenge. While there are various combinations of classical and quantum internets and encryption keys under consideration and construction, they all share a need for the robust, accurate transmission of qubits over long distances.
Now scientists of the Quantum Internet Alliance initiative have announced that they're in the process of building the world's first purely quantum network. It incorporates new quantum repeaters that allow qubits to be passed along long distances without being corrupted or losing their superposition. The group published a paper last October laying out their vision for an Arpanet-type quantum prototype stretching between Delft and the Hague by the end of this decade. (Here's a great explainer.)
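Why repeaters matter is easy to see from fiber loss alone. The sketch below is a deliberately crude model: it assumes typical telecom attenuation of about 0.2 dB/km and ignores the entanglement swapping and purification that real quantum repeaters perform, but it shows how splitting a long link into shorter hops turns an astronomically unlikely direct transmission into a tractable one.

```python
def direct_success(km: float, loss_db_per_km: float = 0.2) -> float:
    """Probability a photon survives a direct fiber link.
    0.2 dB/km is a typical telecom-fiber attenuation figure."""
    return 10 ** (-loss_db_per_km * km / 10)

def attempts_per_hop(km: float, segments: int) -> float:
    """Simplified repeater model: split the link into equal hops and
    retry each hop independently; expected attempts per hop are the
    inverse of the per-hop success probability (geometric trials)."""
    return 1 / direct_success(km / segments)

link = 800  # an illustrative cross-Europe distance, in km
print(f"direct:     1 success per {1 / direct_success(link):.2e} photons")
print(f"8 segments: ~{attempts_per_hop(link, 8):.0f} attempts per hop")
```

At 800 km the direct link needs on the order of 10^16 photons per success, while 100 km hops between repeater stations each succeed about once per 100 attempts, which is why every long-distance design in the field leans on repeaters.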
Stephanie Wehner of QuTech, a quantum computing and internet center at Delft University of Technology, is coordinator of the project:
"With this very extensive simulation platform we've recently built, which is now running on a supercomputer, we can explore different quantum network configurations and gain an understanding of properties which are very difficult to predict analytically. This way we hope to find a scalable design that can enable quantum communication across all of Europe."
Image source: National Cancer Institute/unsplash
MIT says: Novel drugs are being designed to treat unique genetic mutations.
Developing treatments for any condition can be difficult and expensive, and it behooves researchers to get the most bang for their buck by concentrating on formulating solutions for diseases that afflict large groups of people. Hand in hand with this is a need for generalized remedies that address characteristics the whole group shares.
This is changing, says MIT, with gene editing offering the potential to transform medicine from the traditional "one size fits all" approach to a more effective, personalized, or "n-of-1," approach. This new form of medicine involves targeting and manipulating an individual patient's genes, applying rapidly maturing technologies for gene replacement, including gene editing and antisense therapy, which removes or corrects problem-causing genetic messages. "What the treatments have in common," says MIT, "is that they can be programmed, in digital fashion and with digital speed, to correct or compensate for inherited diseases, letter for DNA letter." Treatments may also be individually optimized to avoid contemporary medicine's often harsh side effects.
If gene editing lives up to its promise, medicine is about to become radically more successful and humane.
Image source: Artwell/Shutterstock
MIT says: The rise of digital currency has massive ramifications for financial privacy.
While Bitcoin is, as of this writing, collapsing, it's nonetheless clear that purely digital monetary systems have considerable appeal: No more germ-encrusted metal and paper money, and, perhaps more importantly, an opportunity for governments and their central banks to more closely control currency and to instantly execute monetary policy changes.
The truth is we've been halfway there for a long time, currencies such as Bitcoin and Libra notwithstanding. The money in our bank accounts is virtual: we personally possess no piles of physical cash at our local bank. Electronic purchasing with credit and debit cards is the norm for most of us, and when large movements of cash occur between banks, they do so in the digital domain. It's all been mostly bytes and bits for some time. What we currently have is a mish-mash of physical and digital money, and MIT predicts the imminent arrival of purely digital monetary systems. (Buh-bye, folding money and pocket change.)
In 2014, China began quietly exploring and building their Digital Currency/Electronic Payments system, or DC/EP. According to OZY, they've already applied for 84 patents for various innovations their new system requires.
One of China's goals is to construct an on-ramp making it easy for citizens to switch to an all-digital system. "Virtually all of these patent applications," Marc Kaufman of Rimon Law tells OZY, "relate to integrating a system of digital currency into the existing banking infrastructure." The country is developing systems that allow people to swap traditional money for digital currency, as well as chip cards and digital wallets from which the currency may be spent.
Clearly, an all-digital monetary system presents privacy issues, since all of one's money would presumably be visible to governmental agencies unless adequate privacy protections are implemented. Developing that protection is going to require a deeper exploration of privacy itself, a discussion that has been overdue since the dawn of the internet.
Image source: Halfpoint/Shutterstock
MIT says: Drugs that try to treat ailments by targeting a natural aging process in the body have shown promise.
Strides are being made toward the production of new drugs for conditions that commonly accompany getting older. They don't stop the aging process, but the hope is that in the next five years, scientists may be able to delay some of aging's effects.
Senolytics are a new class of drugs under development, designed to clear out the senescent cells that accumulate in us as we age. These worn-out cells can wind up as plaque on brain cells and as deposits that cause inflammation, inhibiting healthy cell maintenance and leaving toxins in our bodies.
While trials by San Francisco-based Unity Biotechnology are now underway for a senolytic medication targeting osteoarthritis of the knee, MIT notes that other aging-related ailments are getting a promising fresh look as well. For example, one company, Alkahest, specializing in Parkinson's and dementia, is investigating the extraction of certain components of young people's blood for injection into Alzheimer's patients in the hopes of arresting cognitive and functional decline (Oh, hi, Keith Richards.). And researchers at Drexel University College of Medicine are investigating the use of an existing drug, rapamycin, as an anti-aging skin cream.
Image source: Sharon Pittaway/unsplash
MIT says: Scientists have used AI to discover promising drug-like compounds.
Drugs are built from compounds, combinations of molecules that together produce some sort of medically useful effect. Scientists often find that known compounds can have surprising medical value: recent research found that 50 non-cancer drugs can fight cancer in addition to their previously known uses.
But what about new compounds? MIT notes there may be as many as 10^60 molecule combinations yet to be discovered, "more than all the atoms in the solar system."
AI can help. It can sift through molecule properties recorded in existing databases to identify combinations that may have promise as drugs. Operating much more quickly and inexpensively than humans can, machine learning techniques may revolutionize the search for new medicines.
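As a toy illustration of this kind of virtual screening (the descriptors, weights, and pool size here are invented for the example; real pipelines like Insilico's use learned molecular representations, not three random numbers), a trained scoring model can rank a large candidate pool and keep only the most promising few:

```python
import random

random.seed(0)  # reproducible toy data

# Each candidate "molecule" is a vector of hypothetical descriptors
# (say, normalized weight, lipophilicity, ring count), and a pre-trained
# linear model scores drug-likeness. Purely illustrative.
weights = [0.8, -0.5, 0.3]

def score(features):
    """Higher score = more promising under our toy model."""
    return sum(w * f for w, f in zip(weights, features))

# Screen 30,000 random candidates, mirroring the search scale in the article,
# and keep the six best for (hypothetical) synthesis and testing.
candidates = [[random.random() for _ in range(3)] for _ in range(30_000)]
top6 = sorted(candidates, key=score, reverse=True)[:6]

print(len(top6), "candidates kept for synthesis and testing")
```

The point of the sketch is the funnel shape: scoring 30,000 in-silico candidates is cheap, so even a mediocre model pays off if it concentrates expensive lab work on a handful of leads.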
Researchers at Hong Kong-based Insilico Medicine and the University of Toronto announced last September that AI algorithms had picked out about 30,000 unexplored molecule combinations, eventually winnowing that list down to six especially promising new medical compounds. Synthesis and subsequent animal testing revealed one of them to be especially interesting as a drug. One out of six out of 30,000 may not seem that impressive, but AI and machine learning are quickly evolving.
MIT predicts that in 3-5 years, such investigations will be regularly bearing fruit.
The other five items on MIT's list are:
6. Satellite mega-constellations
7. Quantum supremacy
8. Tiny AI
9. Differential privacy
10. Climate change attribution
Related Articles Around the Web
This Week’s Awesome Tech Stories From Around the Web (Through February 29) – Singularity Hub
Posted: at 10:44 pm
COMPUTING
Inside the Race to Build the Best Quantum Computer on Earth
Gideon Lichfield | MIT Technology Review
"Regardless of whether you agree with Google's position [on quantum supremacy] or IBM's, the next goal is clear, Oliver says: to build a quantum computer that can do something useful. The trouble is that it's nearly impossible to predict what the first useful task will be, or how big a computer will be needed to perform it."
We're Not Prepared for the End of Moore's Law
David Rotman | MIT Technology Review
"Quantum computing, carbon nanotube transistors, even spintronics, are enticing possibilities, but none are obvious replacements for the promise that Gordon Moore first saw in a simple integrated circuit. We need the research investments now to find out, though. Because one prediction is pretty much certain to come true: we're always going to want more computing power."
Flippy the Burger-Flipping Robot Is Changing the Face of Fast Food as We Know It
Luke Dormehl | Digital Trends
"Flippy is the result of the Miso team's robotics expertise, coupled with that industry-specific knowledge. It's a burger-flipping robot arm that's equipped with both thermal and regular vision, which grills burgers to order while also advising human collaborators in the kitchen when they need to add cheese or prep buns for serving."
The Next Generation of Batteries Could Be Built by Viruses
Daniel Oberhaus | Wired
"[MIT bioengineering professor Angela Belcher has] made viruses that can work with over 150 different materials and demonstrated that her technique can be used to manufacture other materials like solar cells. Belcher's dream of zipping around in a virus-powered car still hasn't come true, but after years of work she and her colleagues at MIT are on the cusp of taking the technology out of the lab and into the real world."
Biggest Cosmic Explosion Ever Detected Left Huge Dent in Space
Hannah Devlin | The Guardian
"The biggest cosmic explosion on record has been detected, an event so powerful that it punched a dent the size of 15 Milky Ways in the surrounding space. The eruption is thought to have originated at a supermassive black hole in the Ophiuchus galaxy cluster, which is about 390 million light years from Earth."
Star Trek's Warp Speed Would Have Tragic Consequences
Cassidy Ward | SyFy
"The various crews of Trek's slate of television shows and movies can get from here to there without much fanfare. Seeking out new worlds and new civilizations is no more difficult than gassing up the car and packing a cooler full of junk food. And they don't even need to do that! The replicators will crank out a bologna sandwich just like mom used to make. All that's left is to go, but what happens then?"
Image Credit: sergio souza / Pexels