The Prometheus League
Breaking News and Updates
The Evolutionary Perspective
Daily Archives: January 12, 2020
Posted: January 12, 2020 at 11:50 pm
https://radiofreehpc.com/audio/RF-HPC_Episodes/Episode260/RFHPC260_QuantumQuantum.mp3
In this podcast, the Radio Free HPC team looks at how quantum computing is overhyped and underestimated at the same time.
The episode starts out with Henry being cranky. It also ends with Henry being cranky. But between those two events, we discuss quantum computing and Shahin's trip to the Q2B quantum computing conference in San Jose.
Not surprisingly, there is a lot of activity in quantum, with nearly every country pushing the envelope outward. One of the big concerns is that existing cryptography could become vulnerable to quantum cracking. Shahin assures us that this isn't the case today and is probably a decade away, which is another way of saying nobody knows, so it could be next week, but probably not.
We also learn the term NISQ, a descriptive acronym for the current state of quantum systems: Noisy Intermediate Scale Quantum computing. The conversation touches on various ways quantum computing is used now and where it's heading, plus the main reason why everyone seems to be kicking the tires on quantum: the fear of missing out. It's a very exciting area, but to Shahin, it seems like AI was maybe 8-10 years ago, so still early days.
Posted: at 11:50 pm
What are some bleeding-edge information technology developments that a forward-thinking CIO should keep an eye on?
Here are a few emerging technologies that have caught my attention. These are likely to have an increasing impact on the world of business in the future. Consider which ones you should follow a little more closely.
A recent advance in quantum computing that a Google team achieved indicates that quantum computing technology is making progress out of the lab and closing in on practical business applications. Quantum computing is not likely to change routine business transaction processing or data analytics applications. However, quantum computing is likely to dramatically change computationally intense applications required for:
Since most businesses can benefit from at least a few of these applications, quantum computing is worth evaluating. For a more detailed discussion of specific applications in various topic areas, please read: Applying Paradigm-Shifting Quantum Computers to Real-World Issues.
Machine learning is the science of having computers act without software developers writing detailed code to handle every case the software will encounter. Machine learning software develops its own algorithms that discover knowledge from specific data and the software's prior experience. Machine learning is based on statistical concepts and computational principles.
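As a minimal sketch of that idea (all data below is invented for illustration), a classifier can derive its decision rule from labeled examples instead of hand-written case logic:

```python
# Minimal illustration of "learning from data" rather than hand-coded rules:
# a nearest-centroid classifier that derives its decision rule from examples.

def train(samples):
    """Compute one centroid (mean point) per label from labeled 2-D samples."""
    sums, counts = {}, {}
    for (x, y), label in samples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {label: (sx / counts[label], sy / counts[label])
            for label, (sx, sy) in sums.items()}

def predict(centroids, point):
    """Assign the label of the closest centroid (squared Euclidean distance)."""
    px, py = point
    return min(centroids,
               key=lambda lbl: (centroids[lbl][0] - px) ** 2 +
                               (centroids[lbl][1] - py) ** 2)

training = [((1, 1), "low risk"), ((2, 1), "low risk"),
            ((8, 9), "high risk"), ((9, 8), "high risk")]
model = train(training)
print(predict(model, (1.5, 1.2)))   # a point near the "low risk" cluster
```

No rule for "low risk" was ever written by hand; the rule falls out of the training data.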
The leading cloud computing infrastructure providers offer machine learning routines that are quite easy to integrate into machine learning applications. These routines greatly reduce the expertise barriers that have slowed machine learning adoption at many businesses.
Selected business applications of machine learning include:
For summary descriptions of specific applications, please read: 10 Companies Using Machine Learning in Cool Ways.
Distributed ledger technology is often called blockchain. It enables new business and trust models. A distributed ledger enables all parties in a business community to see agreed information about all transactions, not just their own. That visibility builds trust within the community.
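The core mechanism behind that shared visibility can be sketched in a few lines: a toy hash-chained ledger (a sketch of the tamper-evidence idea only, not a real distributed-consensus system) in which every block commits to its predecessor's hash:

```python
import hashlib
import json

# Toy hash-chained ledger: each block commits to the previous block's hash,
# so tampering with history invalidates every later link.

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    """True if every block correctly references its predecessor's hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, ["Alice pays Bob 5"])
append_block(ledger, ["Bob pays Carol 2"])
print(verify(ledger))                                # True
ledger[0]["transactions"] = ["Alice pays Bob 500"]   # tamper with history
print(verify(ledger))                                # False
```

Because every party can recompute the hashes, everyone can detect the tampering, which is the source of the trust described above.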
Bitcoin, a cryptocurrency, is the most widely known application of blockchain.
Distributed ledger technology has great potential to revolutionize the way governments, institutions, and corporations interact with each other and with their clients or customers. Selected business applications of distributed ledger technology include:
For descriptions of industry-specific distributed ledger applications, please read: 17 Blockchain Applications That Are Transforming Society.
The Industrial Internet of Things (IIoT) is a major advance on Supervisory Control and Data Acquisition (SCADA). SCADA, in many forms, has been used for decades to safely operate major industrial facilities including oil refineries, petrochemical plants, electrical power generation stations, and assembly lines of all kinds.
IIoT improves on relatively expensive SCADA by relying on dramatically cheaper components, including sensors, network bandwidth, storage, and computing resources. As a result, IIoT is feasible in many smaller facilities and offers a huge increase in data points for larger facilities. Business examples where IIoT delivers considerable value include production plants, trucks, cars, jet engines, elevators, and weather buoys.
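A hedged sketch of the kind of edge-side aggregation those cheap sensors enable (the sensor names and alert threshold are invented for illustration):

```python
# Sketch of an IIoT-style telemetry pipeline: many cheap sensors stream
# readings, and an edge aggregator summarizes them and flags anomalies.

def aggregate(readings, limit):
    """Summarize per-sensor readings and count values exceeding `limit`."""
    summary = {}
    for sensor, value in readings:
        stats = summary.setdefault(sensor, {"count": 0, "total": 0.0, "alerts": 0})
        stats["count"] += 1
        stats["total"] += value
        if value > limit:
            stats["alerts"] += 1
    for stats in summary.values():
        stats["mean"] = stats["total"] / stats["count"]
    return summary

stream = [("pump-1", 71.2), ("pump-1", 69.8), ("pump-2", 88.0), ("pump-2", 90.5)]
report = aggregate(stream, limit=85.0)
print(report["pump-2"]["alerts"])   # 2 readings above the 85.0 threshold
```

The same per-sensor summary, scaled to thousands of data points, is what turns cheap sensors into the production-scheduling signals described above.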
The aggressive implementation of IIoT can:
For summary descriptions of specific IIoT applications, please read: The Top 20 Industrial IoT Applications.
RISC-V is an open-source hardware instruction set architecture (ISA) for CPU microprocessors that is growing in importance. It's based on established reduced instruction set computer (RISC) principles. The open-source aspect of the RISC-V ISA is a significant change compared to the proprietary ISA designs of the dominant computer chip designers, Intel and Arm.
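One practical consequence of an open ISA is that anyone can build tools against the published specification. As a sketch, here is a minimal encoder for the RV32I ADDI instruction, following the published I-type format:

```python
# Minimal encoder for the RV32I I-type ADDI instruction (opcode 0010011),
# a sketch based on the publicly specified RISC-V base encoding.

def encode_addi(rd, rs1, imm):
    """Encode `addi rd, rs1, imm` as a 32-bit RV32I instruction word."""
    assert 0 <= rd < 32 and 0 <= rs1 < 32
    assert -2048 <= imm < 2048           # 12-bit signed immediate
    imm12 = imm & 0xFFF                  # two's-complement, 12 bits
    return (imm12 << 20) | (rs1 << 15) | (0b000 << 12) | (rd << 7) | 0b0010011

# addi x1, x0, 5  ->  0x00500093 (a standard reference encoding)
print(hex(encode_addi(1, 0, 5)))
```

With a proprietary ISA, distributing such tooling, let alone silicon, requires a license; with RISC-V it requires only the spec.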
RISC-V offers a way around paying ISA royalties for CPU microprocessors to either of the incumbents. The royalties may not be significant for chips used in expensive servers or smartphones, but they are significant for the cheap chips required in large numbers to implement the IIoT applications listed above.
For an expanded discussion of RISC-V, please read: A new blueprint for microprocessors challenges the industrys giants.
What bleeding edge information technology developments would you add to this list? Let us know in the comments below.
See the original post:
Posted: at 11:50 pm
Jan 12, 2020 at 10:20 // News
International Business Machines (IBM), an American multinational information technology firm, has managed to double the power of its quantum computer (QC), but this effort didn't break the encryption of Bitcoin (BTC), the original blockchain-based cryptocurrency.
During the CES 2020 conference on January 8, the company revealed that it had successfully achieved a Quantum Volume (QV) of 32 with its 28-qubit quantum computer, called Raleigh.
In a nutshell, Quantum Volume is a metric that captures the complexity of the problems a quantum computer can solve. It can be used to compare the performance of different quantum computers, and IBM has successfully doubled this value every year since 2016, four years running.
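A simplified sketch of the metric: IBM defines Quantum Volume as 2^n for the largest n at which "square" circuits, n qubits wide and n layers deep, still run reliably. The depth figures below are invented for illustration; note how noise, not qubit count, sets the result:

```python
# Simplified Quantum Volume sketch: QV = 2^n for the largest n such that
# a square n-qubit, depth-n circuit still runs reliably on the device.

def quantum_volume(achievable_depth):
    """achievable_depth[n] = max reliable circuit depth using n qubits."""
    best = 0
    for n, depth in achievable_depth.items():
        if depth >= n:                 # a square n x n circuit succeeds
            best = max(best, n)
    return 2 ** best

# Hypothetical device: 28 physical qubits, but noise limits reliable
# square circuits to 5 qubits -> QV 32, matching the figure IBM reported.
depths = {4: 9, 5: 6, 6: 4, 28: 1}
print(quantum_volume(depths))   # 32
```

This is why a 28-qubit machine can have a QV of only 32: the extra qubits are too noisy to use in deep circuits.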
Quantum computers have long been described as one of the most powerful innovations of this century, with emerging applications in nearly every sector, including healthcare, the internet of things, artificial intelligence (AI), blockchain, and financial modeling.
Even though IBM's state-of-the-art advancements represent real progress, quantum computers can currently be applied only to very specific tasks; for general-purpose computing they remain far behind the classical machines we use every day. As such, fears that these devices could one day be used to break the cryptography that safeguards digital assets such as Bitcoin remain speculative, at least for the time being.
Since Bitcoin is designed entirely around cryptographically protected transactions, a far more powerful quantum computer would be required to crack the encryption used to produce its private keys. In fact, according to a paper published in June 2017 by several authors including Martin Roetteler, a quantum computer would need processing power of about 2,500 qubits to break the 256-bit encryption used by Bitcoin.
Remember, the most powerful quantum computer today has only about a 72-qubit processor, which implies it will take years to reach encryption-threatening levels. But that doesn't diminish the rate at which IBM and Google are doubling computing power year after year, which could eventually make them real threats to Bitcoin and the wider cryptocurrency community.
Posted: at 11:50 pm
SpaceX CEO Elon Musk
Photo by Kevork Djansezian
For a long time, American space exploration was a closed circle: There was just one customer, the U.S. government (NASA) and a handful of giant defense contractors. Then in 2008 Elon Musk's SpaceX put the first privately-financed rocket into orbit, Jeff Bezos' Blue Origin promised private flights, and space was suddenly a lively market with companies vying to put satellites and humans into orbit.
A decade later hundreds of start-ups have flocked to the space sector, bringing sophisticated technologies that include artificial intelligence, quantum computing, phased array radar, space-based solar power, "tiny" satellites and services that could not be imagined just a few years ago.
Space Angels, an early stage investor that also tracks investments in the sector, reported that venture capitalists invested $5 billion into space technologies in the first three quarters of 2019, putting it on track to be the biggest year yet, with Blue Origin pulling in $1.4 billion from Bezos. Since 2009, said Chad Anderson, CEO of Space Angels, investors have poured nearly $24 billion into 509 companies.
Anderson said that SpaceX triggered the transformation not just by offering competition to NASA but publishing its prices for a launch. Before that revelation, space was really an opaque market, making it difficult for potential competitors to price their products. "It's been a really big decade for commercial space," said Anderson.
The largest amount of venture capital still goes into the most fundamental task: putting satellites into orbit. Anderson says 89 companies have received funding for so-called small-lift launch vehicles. These are companies promising to put payloads of up to 2,000 kilos (4,400 lbs) into low Earth orbit. Their focus is a new generation of small satellites such as those used by OneWeb and SpaceX's StarLink, which promise broadband internet access in even the most remote parts of the world by deploying "constellations" of hundreds or even thousands of tiny satellites.
Satellites have become so mainstream you can now buy a standard 4-in. by 4-in. "cubesat" kit online. All this activity could mean 20,000 to 40,000 satellites joining the 1,000 now in orbit over the next few years. "It's quickly becoming congested," Anderson said of the market for small-lift launch. Of the venture-backed rocket companies, SpaceX and Rocket Lab, with launch sites in New Zealand and Virginia, are making regular launches, although Richard Branson's Virgin Galactic is scheduled to begin flying its manned shuttle this year.
The sky is also getting crowded. Aside from the thousands of new satellites scheduled for launch, there is already a lot of clutter in space: as many as 250,000 pieces of junk and debris circle the Earth. Up to now the U.S. Air Force has taken the lead role in tracking debris and warning satellite operators about possible collisions. But the military's tracking radar, with some components dating back to the cold war, can only detect pieces 10 cm (4 in.) across or larger. LeoLabs, a start-up based in Menlo Park, California, has developed an advanced radar system that can detect objects in orbit as small as 2 cm (less than an inch) long.
LeoLabs' Kiwi Space Radar was set up in Central Otago, New Zealand, in 2019. It is the first in the world to track space debris smaller than 10 cm.
A tiny object traveling at several thousand miles an hour can cause severe damage to a satellite. LeoLabs enables customers to track their small satellites more easily and to safely move them to a new position. "That will take a lot of the collision risks off the table," says founder and CEO Dan Ceperley. His company has built phased array radars that steer the radar beam electronically faster than a traditional dish antenna in three locations: Alaska, Texas and New Zealand. To date, LeoLabs has raised $17 million from venture funds, including Marc Bell Capital Partners, Seraphim Capital and Space Angels.
Many of the 1,000 satellites now in orbit are engaged in observing Earth. They monitor the weather, humidity and temperature, among dozens of other phenomena, and capture millions of images. SkyWatch, based in Waterloo, Ontario, recently closed a $10 million round of funding led by San Francisco's Bullpen Capital to develop its service to make satellite data easily available to companies.
SkyWatch would handle licensing and payment for data through subscription fees, and companies could use its software to build their own apps for tasks such as tracking crops or assessing damage from natural disasters. SkyWatch CEO James Slifierz compares his timing to the aftermath of the creation of the global positioning system infrastructure. Once GPS was in place, civilian applications followed.
The growing flow of data from satellites has raised concerns about data security. SpeQtral, based in Singapore, plans to build encryption keys based on the laws of quantum physics to protect space-to-Earth communications. "The security of any communications is essential," says Chune Yang Lum, CEO of SpeQtral, which has raised a $1.9 million seed round led by Space Capital, the venture arm of Space Angels. Quantum encryption has been touted as practically unbreakable.
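The best-known quantum key distribution scheme of the kind SpeQtral's work builds on is BB84. The classical simulation below sketches only its "sifting" step, with an ideal noise-free channel and no eavesdropping check; it is an illustration of the protocol's logic, not of SpeQtral's actual system:

```python
import random

# BB84 sifting sketch: sender and receiver each pick a random measurement
# basis per bit; after public comparison, only bits measured in matching
# bases are kept as key material. Ideal channel, no eavesdropper modeled.

def bb84_sift(n, rng):
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]   # rectilinear/diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # When bases match, Bob recovers Alice's bit exactly on an ideal channel;
    # mismatched-basis results are random and are discarded during sifting.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

rng = random.Random(42)
key = bb84_sift(100, rng)
print(len(key))   # roughly half of the 100 raw bits survive sifting
```

The security argument comes from quantum mechanics: measuring a photon in the wrong basis disturbs it, so an eavesdropper reveals herself as errors in the sifted key.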
An illustration of the SPS-ALPHA (Solar Power Satellite by means of Arbitrarily Large Phased Array) transmitting energy to Australia. This approach, in concept phase, includes a series of enormous platforms positioned in space in high Earth orbit to continuously collect and convert solar energy into electricity.
SPS-ALPHA concept and illustration, courtesy John C. Mankins
Start-ups don't have a monopoly on developing new space applications. Tech giant Google has sought ways to commercialize its growing expertise in artificial intelligence and its vast computing power in the cloud. "We work with some of our largest and most transformative customers to do something epic," said Scott Penberthy, director of applied AI at Google Cloud.
He said Google Cloud has done a number of projects with NASA's Frontier Development Lab, including one that takes low-resolution photographs and combines them using AI to create a high-resolution image. Another proposal from Google would enable navigation on the moon's surface (which has no GPS) by having AI compare an astronaut's surroundings with photos of the moon taken from space.
NASA is itself trying to benefit from the innovations brought by start-ups. In December, NASA's Ames Research Center announced a deal with the Founders Institute, a renowned start-up accelerator, to make some of its technology available to start-up entrepreneurs. In September 2019 the space agency announced that the latest round of its Tipping Point Program, a public-private initiative, would distribute $43.2 million to 14 American companies whose technologies could contribute to NASA's plan for its Moon-to-Mars project. Participants include SpaceX, which will work on nozzles to refuel spaceships, and Blue Canyon Technologies, a Denver start-up developing autonomous navigation systems that enable small satellites to maneuver without communicating with Earth.
In the past five years, NASA has awarded five groups of Tipping Point Awards, worth more than $120 million combined. Broadly speaking, a company or project selected for a tipping point award receives NASA resources up to a fixed amount, with the private side paying for at least 25% of the program's total costs. This allows NASA to shepherd the development of important space technologies while trying to save the agency money.
Despite the surge of cash, not all space projects find funding easily. John Mankins, a former NASA physicist, has long been an advocate of space-based solar power. Satellites would capture solar energy, convert it to microwaves and beam it down to Earth, where it would be converted into electricity. Mankins believes such a system taking advantage of recent technological advances can deliver electricity at a competitive price to areas of the world where power is expensive.
Mankins' company, Solar Space Technologies, has formed a joint venture with an Australian company to seek funding to supply power to remote parts of Australia with minimum impact on the environment. While the cost for space-based solar power may have been prohibitive in the past, Dr. Michael Shara, an astrophysicist at New York's Museum of Natural History, said "it really gets interesting" as costs come down.
Anderson, the Space Angels CEO, said venture capitalists hesitate to invest in space solar power because these are large infrastructure projects. "They require a significant amount of capex, and their paybacks are much longer than the typical 10-year lifetime of a venture capital fund." But as concern about climate change increases and the cost of putting "stuff" in orbit drops, clean energy from space may become an attractive entrepreneurial proposition.
Posted: at 11:50 pm
Column Just before Christmas, Google claimed quantum supremacy. The company had configured a quantum computer to produce results that would take conventional computers some 10,000 years to replicate - a landmark event.
"Bollocks," said IBM - which also has big investments both in quantum computing and not letting Google get away with stuff. Using Summit, the world's largest conventional supercomputer at the Oak Ridge National Laboratories in Tennessee, IBM claimed it could do the same calculation in a smidge over two days.
As befits all things quantum, the truth is a bit of both. IBM's claim is fair enough - but it's right at the edge of Summit's capability and frankly a massive waste of its time. Google could, if it wished, tweak the quantum calculation to move it out of that range. And it might: the calculation was chosen precisely not because it was easy, but because it was hard. Harder is better.
Google's quantum CPU has 54 qubits, quantum bits that can stay in a state of being simultaneously one and zero. The active device itself is remarkably tiny: a silicon chip around a centimetre square, or four times the size of the Z80 die in your childhood ZX Spectrum. On top of the silicon, a nest of aluminium tickled by microwaves hosts the actual qubits. The aluminium becomes superconducting below around 100K, but the very coldest part of the circuit is just 15 millikelvins. At this temperature the qubits have low enough noise to survive long enough to be useful.
By configuring the qubits in a circuit, setting up data and analysing the patterns that emerge when the superpositions are observed and thus collapse to either one or zero, Google can determine the probable correct outcome for the problem the circuit represents. 54 qubits, if represented in conventional computer terms, would need 2^54 bits of RAM to represent each step of the calculation, or two petabytes' worth. Manipulating this much data many times over gives the 10 millennia figure Google claims.
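A quick back-of-the-envelope check of that figure: representing every step of a 54-qubit computation classically takes on the order of 2^54 units of storage. Counting 2^54 bits, as the article does:

```python
# Back-of-the-envelope check: 2^54 bits expressed in petabytes.

bits = 2 ** 54
petabytes = bits / 8 / 10 ** 15       # bits -> bytes -> petabytes
print(round(petabytes, 2))            # ~2.25 PB, i.e. "two petabytes' worth"
```
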
IBM, on the other hand, says that it has just enough disk space on Summit to store the complete calculation. However you do it, though, it's not very useful; the only application is in random number generation. That's a fun, important and curiously nuanced field, but you don't really need a refrigerator stuffed full of qubits to get there. You certainly don't need the 27,648 NVidia Tesla GPUs in Summit chewing through 16 megawatts of power.
What Google is actually doing is known in the trade as "pulling a Steve", from the marketing antics of the late Steve Jobs. In particular, his tour at NeXT Inc, the company he started in the late 1980s to annoy Apple and produce idiosyncratic workstations. Hugely expensive to make and even more so to buy, the NeXT systems were never in danger of achieving dominance - but you wouldn't know that from Jobs' pronouncements. He declared market supremacy at every opportunity, although in carefully crafted phrases that critics joked defined the market as "black cubic workstations running NeXTOS."
Much the same is true of Google's claim. The calculation is carefully crafted to do precisely the things that Google's quantum computer can do - the important thing isn't the result, but the journey. Perhaps the best analogy is with the Wright Brothers' first flight: of no practical use, but tremendous significance.
What happened to NeXT? It got out of hardware and concentrated on software, then Jobs sold it - and himself - to Apple, and folded in some of that software into MacOS development. Oh, and some cat called Berners-Lee built something called the World Wide Web on a Next Cube.
Nothing like this will happen with Google's technology. There's no new web waiting to be borne on the wings of supercooled qubits. Even some of the more plausible applications, like quantum decryption of internet traffic, are a very long way from reality - and, once they arrive, it's going to be relatively trivial to tweak conventional encryption to defeat them. But the raw demonstration, that a frozen lunchbox consuming virtually no power in its core can outperform a computer chewing through enough wattage to keep a small town going, is a powerful inducement for more work.
That's Google's big achievement. So many new and promising technologies have failed not because they could never live up to expectations but because they can't survive infancy. Existing, established technology has all the advantages: it generates money, it has distribution channels, it has an army of experts behind it, and it can adjust to close down challengers before they get going. To take just one company: Intel has tried for decades to break out of the x86 CPU prison. New wireless standards, new memory technologies, new chip architectures, new display systems, new storage and security ideas - year after year, the company casts about for something new that'll make money. It never gets there.
Google's "quantum supremacy" isn't there either, but it has done enough to protect its infant prince in its superconducting crib. That's worth a bit of hype.
Charles Hoskinson Predicts Economic Collapse, Rise of Quantum Computing, Space Travel and Cryptocurrency in the 2020s – The Daily Hodl
Posted: at 11:50 pm
The new decade will unfurl a bag of seismic shifts, predicts the creator of Cardano and Ethereum, Charles Hoskinson. And these changes will propel cryptocurrency and blockchain solutions to the forefront as legacy systems buckle, transform or dissolve.
In an ask-me-anything session uploaded on January 3rd, the 11th birthday of Bitcoin, Hoskinson acknowledges how the popular cryptocurrency gave him an eye-opening introduction to the world of global finance, and he recounts how dramatically official attitudes and perceptions have changed.
"Every central bank in the world is aware of cryptocurrencies and some are even taking positions in cryptocurrencies. There's really never been a time in human history where one piece of technology has obtained such enormous global relevance without any central coordinated effort, any central coordinated marketing. No company controls it and the revolution is just getting started."
And he expects its emergence to coalesce with other epic changes. In a big picture reveal, Hoskinson plots some of the major events he believes will shape the new decade.
Hoskinson says the consequences of these technologies will reach every government service and that cryptocurrencies will gain an opening once another economic collapse similar to 2008 shakes the markets this decade.
"I think that means it's a great opening for cryptocurrencies to be ready to start taking over the global economy."
Hoskinson adds that he's happy to be alive to witness all of the changes he anticipates, including a reorganization of the media.
"This is the last decade of traditional organized media, in my view. We're probably going to have less CNNs and Fox Newses and Bloombergs and Wall Street Journals and more Joe Rogans, especially as we enter the 2025s and beyond. And I think our space in particular is going to fundamentally change the incentives of journalism. And we'll actually move to a different way of paying for content, curating content."
Featured Image: Shutterstock/Liu zishan
Read the rest here:
Posted: at 11:50 pm
Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore's law, which for most people working in the computer industry (or at any rate those younger than 40) has provided the kind of bedrock certainty that Newton's laws of motion did for mechanical engineers.
There is, however, one difference. Moore's law is just a statement of an empirical correlation observed over a particular period in history, and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far. But that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit."
We've now reached 2020, and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units, called cores, in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.
But computing involves a combination of hardware and software, and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a BASIC interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart.
There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed. Programming became industrialised as software engineering. The construction of sprawling software ecosystems such as operating systems and commercial applications required large teams of developers; these then spawned associated bureaucracies of project managers and executives. Large software projects morphed into the kind of death march memorably chronicled in Fred Brooks's celebrated book, The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it's still relevant. And in the process, software became bloated and often inefficient.
But this didn't matter because the hardware was always delivering the computing power that concealed the bloatware problem. Conscientious programmers were often infuriated by this. "The only consequence of the powerful hardware I see," wrote one, "is that programmers write more and more bloated software on it. They become lazier; because the hardware is fast they do not try to learn algorithms nor to optimise their code. This is crazy!"
It is. In a lecture in 1997, Nathan Myhrvold, who was once Bill Gates's chief technology officer, set out his Four Laws of Software. 1: software is like a gas; it expands to fill its container. 2: software grows until it is limited by Moore's law. 3: software growth makes Moore's law possible; people buy new hardware because the software requires it. And, finally, 4: software is only limited by human ambition and expectation.
As Moore's law reaches the end of its dominion, Myhrvold's laws suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words, back to the future.
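The cost of abandoning lean code is easy to make concrete. The sketch below (data sizes invented) counts values shared by two sequences two ways: rescanning a list for every element versus building a set once. Both are "correct"; only one is lean:

```python
import timeit

# Bloated vs lean: the first version rescans a list for every element
# (O(n*m)); the second builds a set once for constant-time lookups (O(n+m)).

def shared_slow(a, b):
    return sum(1 for x in a if x in b)      # list membership: linear scan

def shared_fast(a, b):
    lookup = set(b)                         # constant-time membership tests
    return sum(1 for x in a if x in lookup)

a = list(range(2000))
b = list(range(1000, 3000))
assert shared_slow(a, b) == shared_fast(a, b) == 1000
slow_t = timeit.timeit(lambda: shared_slow(a, b), number=5)
fast_t = timeit.timeit(lambda: shared_fast(a, b), number=5)
print(fast_t < slow_t)   # True: the lean version wins by a wide margin
```

For decades, Moore's law paid for the slow version; when the hardware stops improving, only the algorithm can.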
What just happened?
Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.
Algorithm says no
There's a provocative essay by Cory Doctorow on the LA Review of Books blog on the innate conservatism of machine learning.
Fall of the big beasts
"How to lose a monopoly: Microsoft, IBM and antitrust" is a terrific long-view essay about company survival and change by Benedict Evans on his blog.
Posted: at 11:50 pm
From the emerge of cognitive intelligence, in-memory-computing, fault-tolerant quantum computing, new materials-based semiconductor devices, to faster growth of industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and AI technologies to protect data privacy, more technology advancements and breakthroughs are expected to gain momentum and generate big impacts on our daily life.
"We are in an era of rapid technology development. In particular, technologies such as cloud computing, artificial intelligence, blockchain, and data intelligence are expected to accelerate the pace of the digital economy," said Jeff Zhang, Head of Alibaba DAMO Academy and President of Alibaba Cloud Intelligence.
The following are highlights from the Alibaba DAMO Academy predictions for the top 10 trends in the tech community for this year:
Artificial intelligence has reached or surpassed humans in areas of perceptual intelligence such as speech-to-text, natural language processing, and video understanding, but in the field of cognitive intelligence, which requires external knowledge, logical reasoning, or domain transfer, it is still in its infancy. Cognitive intelligence will draw inspiration from cognitive psychology, brain science, and human social history, combined with techniques such as cross-domain knowledge graphs, causal inference, and continual learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These will enable machines to understand and utilize knowledge, achieving a key breakthrough from perceptual intelligence to cognitive intelligence.
In the von Neumann architecture, memory and processor are separate, and computation requires data to be moved back and forth between them. With the rapid development of data-driven AI algorithms in recent years, hardware has become the bottleneck in the exploration of more advanced algorithms. In Processing-in-Memory (PIM) architectures, by contrast, memory and processor are fused together, and computations are performed where the data is stored, with minimal data movement. As such, computational parallelism and power efficiency can be significantly improved. We believe innovations in PIM architecture are the tickets to next-generation AI.
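The data-movement bottleneck can be made concrete with a back-of-the-envelope model. The energy figures below are commonly cited approximations for older process nodes (hundreds of picojoules to fetch a word from DRAM versus a few picojoules for an arithmetic operation); they illustrate scale, not any specific chip:

```python
# Rough model of why data movement dominates in a von Neumann machine.
# Energy figures are ballpark approximations, not measurements of any chip.
DRAM_READ_PJ = 640.0   # fetch one 32-bit word from off-chip DRAM
FLOAT_MAC_PJ = 4.6     # one 32-bit multiply-accumulate on the processor

def dot_product_energy(n):
    """Energy (pJ) split for a length-n dot product when every operand
    must be streamed in from DRAM."""
    compute = n * FLOAT_MAC_PJ
    movement = 2 * n * DRAM_READ_PJ  # two operand fetches per MAC
    return compute, movement

compute, movement = dot_product_energy(1_000_000)
print(f"compute: {compute / 1e6:.1f} uJ, movement: {movement / 1e6:.1f} uJ")
print(f"data movement costs {movement / compute:.0f}x the arithmetic")
```

Under these assumptions the arithmetic is a rounding error next to the memory traffic, which is exactly the gap PIM architectures aim to close by computing where the data lives.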
In 2020, 5G, the rapid development of IoT devices, cloud computing, and edge computing will accelerate the fusion of information systems, communication systems, and industrial control systems. Through advanced Industrial IoT, manufacturing companies can automate machines, in-factory logistics, and production scheduling, realizing C2B smart manufacturing. In addition, interconnected industrial systems can adjust and coordinate the production capacity of both upstream and downstream vendors. Ultimately this will significantly increase manufacturers' productivity and profitability. For manufacturers whose production goods are valued at hundreds of trillions of RMB, a productivity increase of 5-10% would mean additional trillions of RMB.
Traditional single-agent intelligence cannot meet the real-time perception and decision-making needs of large-scale intelligent devices. The development of collaborative IoT sensing technology and 5G communication will enable collaboration among multiple agents: machines will cooperate and compete with one another to complete target tasks. The group intelligence that emerges from this cooperation will further amplify the value of intelligent systems. Large-scale intelligent traffic-light dispatching will allow dynamic, real-time adjustment; warehouse robots will work together to sort cargo more efficiently; driverless cars will perceive overall traffic conditions on the road; and unmanned aerial vehicle (UAV) swarms will handle last-mile delivery more efficiently.
The traditional model of chip design cannot efficiently respond to the fast-evolving, fragmented, and customized needs of chip production. Open-source SoC design based on RISC-V, high-level hardware description languages, and IP-based modular chip design methods have accelerated the development of agile design methods and the open-source chip ecosystem. In addition, the chiplet-based modular design method uses advanced packaging to combine chiplets with different functions, making it possible to quickly customize and deliver chips that meet the specific requirements of different applications.
BaaS (Blockchain-as-a-Service) will further lower the barriers to entry for enterprise blockchain applications. A variety of hardware chips embedded with core blockchain algorithms, designed for edge and cloud deployment, will also emerge, allowing assets in the physical world to be mapped to assets on the blockchain, further expanding the boundaries of the Internet of Value and realizing multi-chain interconnection. In the future, a large number of innovative blockchain application scenarios featuring multi-dimensional collaboration across industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.
In 2019, the race to reach Quantum Supremacy brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosts overall confidence in superconducting quantum computing as a path to a large-scale quantum computer. In 2020, the field of quantum computing will receive increasing investment, which comes with heightened competition. The field is also expected to see accelerating industrialization and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realization of fault-tolerant quantum computing and the demonstration of quantum advantage on real-world problems. Either is a great challenge given present knowledge. Quantum computing is entering a critical period.
Under the pressure of both Moore's law and the explosive demand for computing power and storage, it is difficult for classic silicon-based transistors to sustain the development of the semiconductor industry. To date, major semiconductor manufacturers have no clear answer for chips beyond 3nm. New materials will enable new logic, storage, and interconnect devices through new physical mechanisms, driving continuous innovation in the semiconductor industry. For example, topological insulators and two-dimensional superconducting materials that achieve lossless transport of electrons and spin can become the basis for new high-performance logic and interconnect devices, while new magnetic materials and resistive-switching materials can enable high-performance memories such as SOT-MRAM and resistive memory.
The compliance costs imposed by recent data protection laws and regulations on data transfer are higher than ever before. In light of this, there is growing interest in using AI technologies to protect data privacy. The essence is to enable a data user to compute a function over input data from different data providers while keeping that data private. Such technologies promise to solve the problems of data silos and the lack of trust in today's data-sharing practices, and will truly unleash the value of data in the foreseeable future.
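One family of such techniques is secure multi-party computation. Below is a minimal sketch of additive secret sharing, in which each provider splits its private value into random shares so a joint sum can be computed without anyone seeing another party's raw input. The function names and field size are illustrative; production MPC frameworks add authentication and protections against malicious parties that are omitted here:

```python
import random

MOD = 2**61 - 1  # work modulo a large prime so individual shares leak nothing

def share(value, n_parties):
    """Split a private value into n_parties random additive shares."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)  # shares sum to value mod MOD
    return shares

def reconstruct(shares):
    """Recombine shares to recover the (aggregate) value."""
    return sum(shares) % MOD

# Three data providers with private inputs:
inputs = [10, 20, 12]
all_shares = [share(x, 3) for x in inputs]
# Each party locally sums the one share it received from every provider,
# then the parties combine only those partial sums:
partials = [sum(col) % MOD for col in zip(*all_shares)]
print(reconstruct(partials))  # 42, without revealing any individual input
```

The parties learn the total (42) but never any single provider's input, which is exactly the "compute a function while keeping the data private" property described above.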
With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure and gradually evolved into the center of all IT innovation. The cloud is closely tied to almost all IT technologies, including new chips, new databases, self-driving adaptive networks, big data, AI, IoT, blockchain, and quantum computing. Meanwhile, it creates new technologies of its own, such as serverless computing, cloud-native software architecture, integrated software-hardware design, and intelligent automated operations. Cloud computing is redefining every aspect of IT, making new technologies more accessible to the public. The cloud has become the backbone of the entire digital economy.
Posted: at 11:50 pm
Steve Anderrson Saturday, 11 January 2020, 04:58 EST Modified date: Saturday, 11 January 2020, 04:58 EST
At a glance, quantum volume is a measure of the complexity of the problems a quantum computer can solve. Quantum volume can also be used to compare the performance of different quantum computers.
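As a rough sketch of the idea: IBM defines quantum volume as 2^n, where n is the size of the largest "square" random circuit (equal width and depth) the machine can run reliably under its heavy-output benchmark. The benchmark numbers below are invented for illustration:

```python
# Hedged sketch of the quantum-volume computation. achievable_depth maps
# circuit width -> deepest random circuit that still passes the benchmark
# (these results are hypothetical, not from any real device).

def quantum_volume(achievable_depth):
    # n is the largest square circuit: width w whose reliable depth >= w
    n = max(min(width, depth) for width, depth in achievable_depth.items())
    return 2 ** n

results = {2: 40, 3: 12, 4: 5, 5: 3}
print(quantum_volume(results))  # width 4 sustains depth 5, so n = 4 -> QV 16
```

Because the metric is exponential in n, "doubling the quantum volume" means managing just one more qubit's worth of reliable square-circuit depth, which is why annual doubling is a meaningful but attainable target.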
Since 2016, IBM has doubled this value every year. Quantum computers have been hailed as one of the most important innovations of the 21st century, with potential applications across almost all industries: healthcare, artificial intelligence, and financial modelling, to name a few.
Recently, quantum computers have also entered a new phase of development that can be described as practical. The first real quantum computer was launched in 2009 by Jonathan Holm. Since then, quantum computer development has come a long way. At the moment, the industry is driven by a handful of tech giants, including Google and IBM.
Even though IBM's latest advances are viewed as significant, quantum computers can currently be used only for particular tasks. This indicates that they are far from the general-purpose role that classic computers serve and to which we are accustomed.
Therefore, some people have started worrying that the encryption technology used to protect cryptocurrencies, for example Bitcoin, may be broken. This worry is, at least at present, unfounded.
As the network is built entirely around secure cryptographic transactions, a sufficiently powerful quantum computer could eventually crack the encryption technology used to generate Bitcoin's private keys.
However, according to an article published by Martin Roetteler and several co-authors in June 2017, such a machine would require approximately 2,500 qubits of processing power to crack the 256-bit encryption technology used by Bitcoin.
Since the most powerful quantum computer in the world today has only a 72-qubit processor, one thing is clear: it will take several years before a quantum computer reaches the level at which it threatens this encryption technology.
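A quick back-of-the-envelope projection supports that timeline. Assume qubit counts doubled annually (an assumption mirroring the quantum-volume doubling mentioned above, and ignoring the much larger physical-qubit overhead that fault-tolerant error correction would add):

```python
import math

# Years of annual doubling to grow from today's ~72-qubit processors to
# the ~2,500 qubits the article cites as a threat to Bitcoin's encryption.
current_qubits = 72
needed_qubits = 2500
years = math.log2(needed_qubits / current_qubits)
print(f"{years:.1f} years of annual doubling")  # roughly 5 years
```

Even under this generous assumption the gap is about five doublings, and real fault-tolerant machines would need far more physical qubits than this logical-qubit count suggests.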
With IBM's computing power doubling every year, and Google having achieved quantum supremacy, the Bitcoin community may already be working to ensure that Bitcoin can resist potential quantum computing attacks.
Global Quantum Computing Market: What it got next? Find out with the latest research available at PMI – Pro News Time
Posted: at 11:50 pm
In this Quantum Computing Market Global Industry Analysis & Forecast to 2030 research report, the central factors driving the industry's advancement are recorded, and its business partners and end users are examined. This market research report investigates and inspects the Quantum Computing industry and provides a comprehensive estimate of its development. Another aspect covered is a cost analysis of the prime products driving the Quantum Computing industry, taking into account the manufacturers' overall revenue.
The following key Quantum Computing Market insights and pointers are covered in this report:
Request a demo sample: https://www.prophecymarketinsights.com/market_insight/Insight/request-sample/571
The prime manufacturers covered in this report are:
Wave Systems Corp, 1QB Information Technologies Inc, QC Ware, Corp, Google Inc, QxBranch LLC, Microsoft Corporation, International Business Machines Corporation, Huawei Technologies Co., Ltd, ID Quantique SA, and Atos SE.
Download PDF Brochure @ https://www.prophecymarketinsights.com/market_insight/Insight/request-pdf/571
The report is a complete guide, providing Quantum Computing processes, cost structures, raw materials, investment feasibility, and investment-return analysis. SWOT analysis, market growth, production, profit, and supply-demand statistics are also offered.
Historical and future trends, prices, product demand, prospects, and Quantum Computing marketing channels are stated. Current business developments, future methodologies, and market entrants are explained. The consumers, distributors, manufacturers, traders, and dealers in the Quantum Computing market are covered. A comprehensive research methodology, market size estimation, market breakdown, and data triangulation are also included.
Checkout Complete Details Here: https://www.prophecymarketinsights.com/market_insight/Global-Quantum-Computing-Market-By-571
Mr. Alex (Sales Manager)
Prophecy Market Insights
Phone: +1 860 531 2701