"Moore’s Law" predicts $100,000 for Bitcoin in February 2021 … – Markets Morning

Bitcoin is still trading at record levels after the cryptocurrency reached a value of $3,500 per coin on Tuesday. The growth comes exactly one week before a big change to the Bitcoin network: on Tuesday, Bitcoin's blockchain will integrate a new piece of software, Segregated Witness, or SegWit.

"SegWit is a clever solution, significantly increasing the capacity for transactions," said Aaron Lasher, head of marketing at Breadwallet, a bitcoin wallet company.

The software was developed years ago and should solve the cryptocurrency's scaling problem, which divided the community for years and finally led to a split on 1 August.

The investors were unaffected by the split on 1 August, which resulted in the formation of a clone called Bitcoin Cash.

Sheba Jafari, head of technical strategy at Goldman Sachs, said in July that the cryptocurrency had the potential to rise to $3,691. Jafari's earlier calls on bitcoin had already proved accurate.

The value of the bitcoin cryptocurrency may exceed $100,000 by February 2021. That is the view of investor Dennis Porto, a Harvard academic, who believes bitcoin is the first digital currency to obey Moore's Law.

According to this law, formulated by Intel co-founder Gordon Moore, the number of transistors on new microprocessor models roughly doubles every 18-24 months.

"Moore's law is specifically applied to the number of transistors per circuit, but it can be applied to any digital technology," said Porto. "Any technology that grows exponentially (i.e. following Moore's law) has a doubling time," he adds.

The scientist's comments come in the wake of the split that spun off an alternative branch of bitcoin, bitcoin cash, with the new cryptocurrency quickly becoming the third largest. At the same time, the value of the original bitcoin continued to rise and on Aug. 7 exceeded $3,400 per coin for the first time.

On Tuesday, Bloomberg wrote that a Russian company is preparing to boost bitcoin mining in the country. The co-owner of the company in question is Dmitry Marinichev, one of Vladimir Putin's advisers on internet issues. Russia has the potential to reach a 30% share of the world's cryptocurrency mining in the future, he said.

The idea is to use Russia's cheap electricity to challenge China's position as the largest market for digital mining. The holding company, known as Russian Miner Coin (RMC), plans to use semiconductor chips designed in Russia for use in satellites to minimize power consumption in cryptocurrency mining computers.

RMC will conduct a sort of initial public offering, with investors in the holding able to use ethereum or bitcoin to buy a new digital currency type from RMC. Holders of the new currency will be entitled to 18% of the revenue generated by the company's mining equipment.

See the rest here:

"Moore's Law" predicts $100,000 for Bitcoin in February 2021 ... - Markets Morning

Bitcoin Gets Technology Theory Backing, Can Reach $100000 by 2021 – Bitcoinist

Darryn Pollock August 10, 2017 11:45 am

Moore's Law has been identified in Bitcoin by a Harvard scientist, and as such the belief is that the digital currency can reach $100,000 by February 2021, according to this theorem.

Having reached a big milestone in its scaling debate, an issue that has dogged the digital currency for some time, Bitcoin is now once again breaking records with little to slow it down.

Fear and speculation ran rampant leading up to the August 1 hard fork, which saw the creation of a new digital currency called Bitcoin Cash, a fork of the original Bitcoin. However, its creation, and its brief rise to third-largest digital currency by market cap, has not slowed Bitcoin's growth.

Moore's Law is a theorem and a formula created by Intel co-founder Gordon Moore. It states that the number of transistors on new microprocessor models roughly doubles every 18-24 months.

This law has been identified in Bitcoin by Dennis Porto, an investor as well as a Harvard scientist. It is his opinion that Bitcoin has become the first digital currency to show signs of following this law, even though the law is not specifically aimed at this form of technology.

In a recent interview with Markets Morning, Porto said:

Moore's law is specifically applied to the number of transistors per circuit, but it can be applied to any digital technology. [...] Any technology that grows exponentially (i.e. following Moore's law) has a doubling time.

In the wake of Bitcoin reaching its latest all-time high last week, optimism and confidence have skyrocketed once again for the original digital currency. There have been a number of factors that have pushed Bitcoin's growth, and those same factors have seen it follow the trajectory of Moore's Law as identified by Porto.

Bitcoin's ability to scale through SegWit, to suffer no ill effects from the hard fork, and instead to grow to new heights has set it on this path to $100,000 over the next four years.

There are other factors in the pipeline that could also help Bitcoin stick to this trajectory toward $100,000: on Tuesday it was reported that Russia is looking to take over the mantle as the king of Bitcoin mining.

Dmitry Marinichev, one of Russian President Vladimir Putin's advisors, is preparing to boost Russia to be a global power in Bitcoin mining, in an attempt to compete with China.

Are these predictions far too high? Can an asset really reach such prices, or is there a real threat of a bubble? Let us know your thoughts in the comments below!

Images courtesy of Pixabay, Texas Instruments, Cryptocompare

More:

Bitcoin Gets Technology Theory Backing, Can Reach $100000 by 2021 - Bitcoinist

This is not Trump’s economy – American Enterprise Institute

How much credit should President Trump get for a U.S. economy that's generating lots of jobs and for a stock market that keeps setting record highs?

Very little.

A trader wears a Donald Trump hat while working on the floor of the New York Stock Exchange (NYSE) shortly after the opening bell in New York, U.S., March 16, 2017. Reuters

Trump, of course, thinks MAGAnomics is doing the trick, as do his most ardent supporters. After the July jobs numbers came out last week, Fox News and other members of Team Trump were touting the more than 1 million private-sector jobs created since Inauguration Day. Over on Trump TV, former CNN pundit Kayleigh McEnany was even crediting the president with personally creating them. That Obama created the same number of jobs during his final six months was considered less newsworthy, apparently.

Presidents are always given too much credit or blame for economic performance on their watch. So many factors are outside their control. But beyond that, the idea that this is already Trump's economy is ridiculous. None of Trump's big agenda items (at least the ones corporate America and Wall Street really care about) have become law. No ObamaCare repeal. No massive tax cuts. No trillion-dollar infrastructure. Nothing.

To the extent that any president owns an economy's performance, we're still in the Obama era. Indeed, we really ought to credit the economic performance during the first year of any president's term to his predecessor; after all, it's mostly that other guy's budgets and policies directly influencing the economy. So, for instance, George W. Bush's economy wasn't 2001 through the end of 2008; it was 2002 through the end of 2009. And so on.

This change would make a big difference. During George W. Bush's two terms, GDP growth averaged 2.1 percent as 1.4 million new jobs were added. Pretty unimpressive. But recalculated (Bush gets Obama's first year and loses his own first year to Bill Clinton), the 43rd president's record looks even worse: just 1.5 percent growth and a loss of 1.1 million jobs.

This isn't to say Trump should get zero credit for anything good that's happened this year. It's certainly reasonable that the stock market's 10 percent rise since Trump took the oath of office at least partially reflects investor expectations about his growth plan. But how much? It's always tough to parse these things. Believe it or not, there's more going on in the world than Trump, such as a synchronized upswing in the global economy for the first time since the Great Recession.

There are more complications that don't easily fit the GOP's pro-Trump narrative: First, even when new policies or the expectations of policy changes matter, too much weight shouldn't be given to short-term performance. The very large Reagan tax cuts in the early 1980s were supposed to improve U.S. productivity growth, but there was no upturn until the mid-1990s. This time around, tax cuts might boost growth, but probably not by much if they are temporary and greatly worsen the deficit.

Second, don't forget the Federal Reserve. The Obama recovery is as much, if not more, the Bernanke-Yellen recovery, just as easing by the Volcker Fed helped ignite the Reagan boom. With the Yellen Fed thinking the economy is near or at full employment, the central bank could offset further fiscal stimulus with more rapid and continued tightening of monetary policy. Or monetary policy could stay loose and continue as a tailwind to growth.

Third, Washington isn't the whole ballgame. While one might partially credit the 1990s boom to Reaganomics tax and regulatory changes, one should also credit Moore's Law, the PC revolution, and the emergence of the internet. Whatever Trump does might pale compared to what's happening in Silicon Valley. As I wrote last week, the IT revolution is far from over and might still have a huge impact on productivity and growth.

This isn't the Trump economy. If anything, it's the Obama economy, or the Amazon-Apple-Facebook-Google economy.

Continue reading here:

This is not Trump's economy - American Enterprise Institute

Bitcoin can get to $100000 if it keeps following one of tech’s golden rules – Markets Insider

REUTERS/Bogdan Cristel

Bitcoin is trading at record highs on Monday, but the cryptocurrency may still be far from hitting its ceiling.

It has rallied 16.19% since July 31, despite last week's fork that split it in two, and is up 465% over the past year.

According to analysis by Dennis Porto, a bitcoin investor and Harvard academic, bitcoin's price could hit $100,000 per coin if it continues to follow one of tech's "golden rules" - Moore's law.

The rule, which was devised in 1965 by Intel cofounder Gordon Moore, describes the exponential improvements of digital technology.

"Moore's law specifically applied to the number of transistors on a circuit but can be applied to any digital technology," Porto wrote in an email to Business Insider. "Any technology that is growing exponentially (i.e., 'following Moore's law') has a doubling time."

Typically, however, the rule applies to a technology's computing power or capabilities. This is the first time Porto has noticed a technology's price following Moore's law.

Since bitcoin's inception, according to Porto, its price has doubled every eight months.

Dennis Porto

"This poses a unique opportunity for investors: Whereas it was difficult to invest in circuits or internet speeds, it is easy to buy a bitcoin," Porto said.

Porto expects that this doubling trend could continue until bitcoin reaches mass adoption. Of course, another cryptocurrency could usurp bitcoin in the meantime.

By February 2021, Porto believes, it could be worth over $100,000.
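For readers who want to check the arithmetic, here is a minimal sketch of the doubling-time projection. The $3,400 starting price, the eight-month doubling time, and the dates are the figures reported above; the function and variable names are illustrative, and this is a toy model, not investment analysis:

```python
from datetime import date

def project_price(start_price, start, end, doubling_months=8):
    """Project a price that doubles every `doubling_months` months."""
    months = (end.year - start.year) * 12 + (end.month - start.month)
    return start_price * 2 ** (months / doubling_months)

# ~$3,400 in early August 2017, doubling every eight months (Porto's claim):
# 42 months to February 2021 is 5.25 doublings, giving roughly $129,000.
print(round(project_price(3400, date(2017, 8, 7), date(2021, 2, 7))))
```

Under those assumptions the projected figure lands comfortably above the $100,000 mark Porto cites.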

See the article here:

Bitcoin can get to $100000 if it keeps following one of tech's golden rules - Markets Insider

The Five Principles Of Modern Marketing – AdExchanger

Managing the Data" is a column about customer and audience data strategy written by longtime AdExchanger contributor Chris O'Hara.

Every marketer and media company these days is trying to unlock the secret to personalization. Everyone wants to be the next Amazon, anticipating customer wants and desires and delivering real-time customization.

Actually, everyone might need to be an Amazon going forward; Harris Interactive and others tell us that getting customer experience wrong means up to 80% of customers will leave your brand and try another, and that it takes seven times more money to reacquire that customer than it did to acquire them in the first place.

How important is personalization? In a recent study, 75% of marketers said that there's no such thing as too much personalization for different audiences, and 94% know that delivering personalized content is important to reaching their audiences.

People want and expect personalization and convenience today, and brands and publishers that cannot deliver it will suffer similar fates. However, beyond advanced technology, what do you need to believe to make this transformation happen? What are the core principles a company needs to adhere to in order to have a shot at transforming itself into a customer-centric enterprise?

Here are five:

Put People First

It's a rusty old saw but, like any cliché, it's fundamentally true. For years, we have taken a very channel-specific view of engagement, thinking in terms of mobile, display, social and video. But those are channels, apps and browsers. Browsers don't buy anything; people do.

A people-centric viewpoint is critical to being a modern marketer. True people-based marketing needs to extend beyond advertising and start to include things like sales, service and ecommerce interactions: every touchpoint people have with brands.

People (customers and consumers) must reside at the center of everything, and the systems of engagement we use to touch them must be tertiary. This makes the challenge of identity resolution the new basis of competition going forward.

Collect Everything, Measure Everything

A true commitment to personalized marketing means that you have to understand people. For many years, we have assigned outsize importance to small scraps of digital exhaust, such as clicks, views and likes, as signals of brand engagement and intent. Mostly, they've lived in isolation, never informing a holistic view of people and their wants and desires.

Now we can collect more of this data and do so in real time. Modern enterprises need to become more obsessive about valuing data. Every scrap of data becomes a small stitch in a rich tapestry that forms a view of the customer.

We laughed at the "data is the new oil" hyperbole a few years back simply because nobody had a way to store and extract real value from the sea of digital ephemera. Today is vastly different because we have both the technology and the processes to ingest signals at scale and use artificial intelligence to refine them into gold. Businesses that let valuable data fall to the floor without measuring it might already be dead; they just don't know it yet.

Be A Retailer

A lot of brands aren't as lucky as popular hotel booking sites. To book a room, you need to sign up with your email. Once you become a user, the company collects data on where you like to go, how often you travel, how much you pay for a room and even what kind of mattress you prefer. Any brand would kill for that kind of one-to-one relationship with a customer.

Global CPG brands touch billions of lives every day, yet often have to pay other companies to learn how their marketing spend affected sales efforts. Brands must start to own customer relationships and create one-to-one experiences with buyers. We are seeing the first step with things like Dash buttons and voice ordering, though still through a partner, but we will see this extend even further as brands change their entire business models to start to own the retail relationship with people. The key pivot point will come when brands actually value people data as an asset on their balance sheets.

See The World Dynamically

The ubiquity of data has led to an explosion of microsegmentation. I know marketers and publishers that can define a potential customer down to 20 individual attributes. But people can go from a Long Island soccer mom on Monday to an EDM music lover on Friday night. Today's segmentation is very much static, and very ineffective for a dynamic world where things change all the time.

To get the "right message, right place, right time" dynamic right, we need to understand things like location, weather, time of day and context, and make those dynamic signals part of how we segment audiences. To be successful, marketers and media companies must commit to thinking of customers as the dynamic and vibrant people they are, and build the ability to collect and activate real-time data into their segmentation models.

Think Like A Technologist

Finally, creating the change described above requires a commitment to understanding technology. You can't do people data without truly understanding data management technology. You can't measure everything without technology that can parse every signal. To be a retailer, you have to give customers a reason to buy directly from you. Thinking about customers dynamically requires real-time systems of collection and activation.

But technology, and the people to run it, are expensive investments, often taking months or years to show ROI, and the technology changes at the velocity of Moore's Law. It's a big commitment to change from diaper manufacturer to marketing technologist, but we are starting to understand that it is the change required to survive an era where people are in control.

Some say that it wasn't streaming media technology that killed Blockbuster, but the fact that people hated its onerous late fees. It was probably both of those things. Tomorrow's Blockbusters will be the companies that cannot apply these principles of modern, personalized marketing, or do not want to make the large investments to do so.

Follow Chris O'Hara (@chrisohara) and AdExchanger (@adexchanger) on Twitter.

Go here to read the rest:

The Five Principles Of Modern Marketing - AdExchanger

New law takes aim at gangs – Rocky Mount Telegram

Local law enforcement officials are backing a new state law aimed at cracking down on gangs.

Gov. Roy Cooper signed House Bill 138 into law July 27. Formally known as the N.C. Criminal Gang Suppression Act, the law standardizes criteria for classification of criminal gang membership and makes punishments for certain gang-related crimes stricter.

The law states that any person convicted of a felony other than a Class A, B1 or B2 that was committed as part of criminal gang activity shall be sentenced at a felony class level one class higher than the principal felony for which the person was convicted. It also states that gang leaders or organizers convicted of gang-related felonies will be sentenced at a felony class level two classes higher than the principal felony for which the person was convicted.

Rocky Mount Police Chief James Moore, Nash County Sheriff Keith Stone and Edgecombe County Sheriff Clee Atkinson praised the new law.

"I support the enhanced sentencing components of HB138 as a method to alleviate our communities of the most violent gang members," Moore said. "There are a multitude of gangs operating in North Carolina and our region of the state. There are traditional street gangs, outlaw motorcycle gangs and white supremacist gangs with members residing in the Twin Counties area. Individuals joining and participating in gangs is not a new phenomenon, but it is still a problem that we have to address."

Moore said battling gang violence starts with addressing the issue of gun violence.

"I believe our local public health and medical officials should join the American Public Health Association and the American Medical Association in declaring gun violence a public health crisis that is in need of the public health disease management and prevention approach to mitigate its impact on our youth," Moore said.

Stone agreed, calling the law's wording precise. He added that battling the gang problem in the Twin Counties includes continuing the battle against the opioid epidemic.

"I agree with the law's wording 110 percent," Stone said. "Gang members and leaders need to be prosecuted to the fullest extent of the law. When you've got drugs, you've got gangs. Opioids are killing more people than car wrecks. There should be mandatory jail sentences for gang members and drug dealers, and no reduction of true deals. The community has to come together and the courts have to take a strong stance to combat this problem."

Atkinson echoed Stone's and Moore's thoughts on gangs being present in the area. He added that combating gangs starts at home.

"We've got to get the kids early to keep them from joining gangs," Atkinson said. "They need to be educated and taught discipline and responsibility."

With the bill signed into law, it will go into effect beginning Dec. 1.

Link:

New law takes aim at gangs - Rocky Mount Telegram

The Decline of the Laundromat and the Future of Higher Education – Inside Higher Ed (blog)

Who amongst us has wondered if the future of online education and physical classrooms will one day mimic that of online shopping and bricks-and-mortar retail? What about Moore's Law? (Or whatever the equivalent law is for durable appliances.)

Go here to read the rest:

The Decline of the Laundromat and the Future of Higher Education - Inside Higher Ed (blog)

Tech improvements are becoming so dramatic that charts are … – Financial Post

Nathaniel Bullard

Fifteen years ago, Japan's Earth Simulator was the most powerful supercomputer on Earth. It had more than 5,000 processors. It consumed 6,400 kilowatts of electricity. It cost nearly US$400 million to build.

Two weeks ago, a computer engineer built a deep learning box, using off-the-shelf processors and components, that handily exceeds the Earth Simulator's capabilities. It uses a maximum of 1 kilowatt of power. It cost US$3,122 to build.

For the first time in writing this, I'm stumped for a chart. It is difficult, perhaps impossible, to show a 99.98 per cent reduction in energy use and a 99.99992 per cent reduction in cost in any meaningful way. It is enough to say that information technology has decreased in cost and increased in computational and energy efficiency to striking degrees.

I would argue that this dramatic improvement has a flattening, or even depressing, economic influence on energy. Dramatically reduced inputs with dramatically increasing outputs are a boon for consumers and businesses, unless those businesses sell the energy that drives those inputs. We've already seen this: In 2007, U.S. data centres consumed 67 terawatt-hours of electricity. Today, with millions of times more computing power, they consume 72 terawatt-hours, with less than 1 per cent growth forecast by 2020. Not the greatest news if you're a power utility that has imagined that more and more information technology will mean more energy demand.

Information technology's improvement over time has been largely a function of Moore's Law (which is less a law than an observation). Now, with Moore's Law potentially coming to its end, it would seem like the extraordinary improvements that got us from a room-sized US$400 million supercomputer to a US$3,000 desktop box in 15 years could be coming to an end, too.

If technology companies are no longer able to jam more transistors into a chip, does that mean that improvements in energy consumption will also come to an end? If chip improvements plateau, and deployment increases, can information technology find a way to provide a boost to energy demand?

I doubt it, for both hardware and software reasons.

Even as Moore's Law is tapping out for general-purpose chips, hardware is becoming increasingly optimized for specific tasks. That optimization, for such things as graphics processing or neural network computations for machine learning, leads to greater energy efficiency, too. Google now has its own application-specific integrated circuit, called the Tensor Processing Unit, for machine learning. The TPU delivered "15-30x higher performance and 30-80x higher performance-per-watt" than central processing units and graphics processing units.

Then there is the software that runs on that custom hardware, which has direct applications for electricity in particular. Last year, Google unleashed its DeepMind machine learning on its own data centres and managed to reduce the energy used for cooling those data centres by 40 per cent.

So, new special-purpose chips are much more energy-efficient than older general-purpose chips and those efficient chips are now used to run algorithms that make their data centres much more energy-efficient, too.

In a famous 1987 paper, the economist Robert Solow said you can see "the computer age everywhere but in the productivity statistics." Today, we could say the same about the computer age and energy statistics.

Nathaniel Bullard is an energy analyst, covering technology and business model innovation and system-wide resource transitions.

Bloomberg View

Go here to see the original:

Tech improvements are becoming so dramatic that charts are ... - Financial Post

How the TSA Is Fighting a Tech War With Terrorists – Fortune

A Transportation Security Administration (TSA) officer checks a traveler's bag at a screening location at Salt Lake City International Airport (SLC) in Salt Lake City, Utah, U.S., on Friday, Dec. 23, 2016. George Frey/Bloomberg via Getty Images

The Transportation Security Administration's (TSA) new security regulations, requiring all electronics larger than a cell phone to be placed in bins for x-ray screening in standard passenger lanes, are the latest move in the technology war against terrorists.

The American public may balk at longer lines at airport security due to the inconvenience of unpacking tablets, laptops, e-readers, and other electronics from carry-on bags. This is possibly the first of many tighter regulations from the TSA, and travelers should brace for that.

The heart of the issue is technology's maxim of Moore's law, a 50-year-old prediction about the doubling of computer power every year (later revised to every few years). Today, Moore's law can be applied to faster, more sophisticated technology being packed into smaller electronic footprints; for example, smaller, thinner, and lighter laptop computers.

What benefits consumers (no more lugging around an eight-pound laptop) has been a challenge to terrorist groups that have to find ways to put explosives into smaller devices. But many terrorist groups are on the cutting edge of technology, using next-generation explosives that are smaller and more powerful, and potentially less detectable, which is a critical concern for the airlines and TSA.

The U.S. government responds to these technological capabilities at a more methodical pace than terrorist groups, which could put the traveling public at greater risk.

TSA's own technology allows for screening of electronics, which must be placed flat in bins (nothing on top or below them), by comparing the image of what is inside (much as a CT scan does for the human body) to what the typical iPad or Kindle reader should look like. While screening lines will no doubt be longer and slower as people unpack electronics from hand baggage and then pack them up again, it's far more efficient than having to power on all devices to prove they are legitimate.

In addition, TSA is also testing new luggage-scanning technology to look through luggage for bombs or weapons. The technology also creates 3-D images of contents to allow screeners to get a full and rotatable view of objects. The capability of screening luggage for electronics may become even more crucial as travelers, in an attempt to keep from hauling out several electronic devices at security, check them in their luggage instead.

By deploying more technology, TSA is trying to keep ahead of terrorists' capabilities to turn benign-looking electronics into weapons. This is good news for the public for the obvious reason that more efficient and thorough screening and detection makes traveling safer. Based on my conversations with airline executives, I expect that technology upgrades will continue, hopefully at a faster pace than terrorists' capabilities. Just as large corporations are learning how agility is critical to their survival, I would not be surprised if TSA policies also remain fluid and subject to sudden change, such as in reaction to a credible threat or a terrorist incident somewhere in the world. One such reaction might be to withdraw some privileges from TSA PreCheck passengers, who are not required to remove laptops or other electronics from their carry-on bags, which speeds their screening at airports.

Travelers who remember the former days of going to the airport at the last minute and speeding through light security might bemoan the added layers of what feels like inconvenience. But the fact is, the U.S. is comparatively more passenger-friendly when it comes to airport security, perhaps to its own potential detriment. Israel is known to have some of the toughest airport security and one of the best track records on travel safety.

The U.S. has been tightening security to target specific potential risks; for example, greater screening for inbound flights from 280 airports into the U.S. These new rules affect about 2,000 daily flights carrying 325,000 passengers, and reportedly are in response to terrorists developing new ways to hide bombs and infiltrate airport staff.

While tightening defenses at the international gateways and larger, busier airports makes sense, in any war the most vulnerable point is the weakest link. Here the nation's smaller, regional airports could be vulnerable, either to attack or as an entry point for terrorists trying to enter the air travel system. For TSA, this could be an opportunity to further tighten security.

TSA security changes may be as disconcerting for travelers as longer check-in lines are infuriating. But in the current tech war being waged by airlines and security on one side and terrorist groups on the other, passengers will need to be more agile, accommodating, and flexible than ever.

Dean DeBiase is adjunct lecturer of innovation and entrepreneurship at Kellogg School of Management at Northwestern University and a co-author of the best-selling book The Big Moo.

See original here:

How the TSA Is Fighting a Tech War With Terrorists - Fortune

DARPA Continues Investment in Post-Moore’s Technologies – HPCwire (blog)

The U.S. military long ago ceded dominance in electronics innovation to Silicon Valley, the DoD-backed powerhouse that has driven microelectronics generations for decades. With Moore's Law clearly running out of steam, the Defense Advanced Research Projects Agency (DARPA) is attempting to reinvigorate and leverage a vibrant domestic chip sector with a $200 million initiative designed, among other things, to push the boundaries of chip architectures like GPUs.

DARPA recently announced that its Electronics Resurgence Initiative seeks to move beyond Moore's Law chip scaling. Among the new fronts to be opened by the defense agency are extending the GPU frameworks that underlie machine-learning tools to develop reconfigurable physical structures that adjust to the needs of the software they support.

While it remains unclear how enterprises might benefit directly from the chip initiative overseen by DARPA's Microsystems Technology Office, the agency does have a reputation, dating back to the earliest days of the Internet, for funding high-risk technology R&D that eventually makes its way into the commercial sector.

The DARPA effort also attempts to lay the groundwork for a post-Moore's Law era where, according to the agency, research will focus on integrating different semiconductor materials on individual chips, "sticky logic" devices that combine processing and memory functions, and vertical rather than only planar integration of microsystem components.

As the focus of chip technology zeroes in on data-driven enterprise applications, DARPA said it would cast a wider net to harness semiconductor innovation that would lead to a post-Moore's Law generation of microelectronic systems benefiting military and commercial users.

The effort runs in parallel with recent attempts by DoD to tap into the sustained burst of technology and development innovation in Silicon Valley. As the technology entrepreneur Steve Blank has documented, the 20th century electronics explosion was initially funded by the U.S. military beginning as early as World War II, continuing throughout the Cold War confrontation with the former Soviet Union.

The DARPA effort primarily seeks to establish new development models that go beyond chip scaling. "We need to break away from tradition and embrace the kinds of innovations that the new initiative is all about," emphasized William Chappell, director of DARPA's Microsystems Technology Office. The program will embrace progress through circuit specialization and wrangle the complexity of the next phase of advances, which will have broad implications for both commercial and national defense interests, Chappell added.

The post-Moore's Law research effort will complement the recently created Joint University Microelectronics Program (JUMP), a research effort in basic electronics being co-funded by DARPA and Semiconductor Research Corporation (SRC), an industry consortium based in Durham, N.C. Among the chip makers contributing to JUMP are IBM, Intel Corp., Micron Technology and Taiwan Semiconductor Manufacturing Co.

SRC members and DARPA are expected to kick in more than $150 million for the five-year project. Focus areas include high-frequency sensor networks, distributed and cognitive computing along with intelligent memory and storage.

As DARPA continues to invest in device technology, it is also attempting to leverage what Chappell calls the "software-defined world." The agency sees virtualization and other software technologies as one way of addressing skyrocketing weapons costs. Hence, the agency is also investing more research funding in areas such as algorithm development and circuit design for applications such as dynamic spectrum sharing, a capability that would allow the military to squeeze more capacity out of crowded electromagnetic spectrum.

More:

DARPA Continues Investment in Post-Moore's Technologies - HPCwire (blog)

Is Moore’s Law coming to an end? – Blasting News

It all started when Gordon Moore, the co-founder of Intel, made an observation: the density of transistors in an integrated circuit doubles approximately every two years. Although more an observation than a law, Moore's Law has held up for almost half a century. But perhaps not for much longer -- or so it seems.

With technology advancing at an exponential rate, we seem to be approaching an invisible wall; in other words, we seem to be approaching the limits of our technology. The number of transistors in a circuit is inversely proportional to the size of a transistor, meaning the smaller the transistor, the higher the number of transistors that can be put onto a single die.

To put this exponential growth into perspective, in 2000 the density of transistors in an integrated circuit used for an average computer was 37.5 million. In 2009 the number of transistors grew to 904 million, meaning that in nine years the transistor count in an average computer increased 24.1 times. Most processors used in the latest computers average around 1 billion to 1.5 billion transistors; in other words, from 2009 to 2017 the number of transistors increased less than 1.5 times. Meanwhile, the size of a transistor went from more than 100 nanometers in 2000 to under 20 nanometers in 2017.
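As a rough check on those figures, the implied doubling time can be computed directly. This is a sketch: the transistor counts are the ones quoted above, with the 2017 count taken as roughly 1.3 billion, the midpoint of the range:

```python
import math

def doubling_time_years(n0, n1, years):
    """Implied doubling time for growth from n0 to n1 over `years` years."""
    return years / math.log2(n1 / n0)

# 2000 -> 2009: 37.5 million -> 904 million transistors
print(doubling_time_years(37.5e6, 904e6, 9))   # ~1.96 years: right on Moore's schedule
# 2009 -> 2017: 904 million -> ~1.3 billion transistors
print(doubling_time_years(904e6, 1.3e9, 8))    # ~15 years: far off the schedule
```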

Now, things get tricky as the size of a transistor becomes smaller and smaller. Once the size of a transistor reaches below 5nm or so, we will hit a breaking point: at such small distances, electrons act differently.

Electrons can tunnel freely through barriers at such distances, and since transistors rely on a barrier to stop the flow of electrons, this leaves a transistor virtually and literally useless.

We are fast approaching this limit, and once we hit it, there is no way out.

We can't be sure what comes next, but one thing is for sure: Moore's Law has bent but has not broken. Major companies are trying to use different materials with semiconductor properties to face this challenge. This will help to an extent, but it's definitely not a permanent solution. Meanwhile, several other companies and researchers are taking a completely different approach, trying to redefine what a transistor is by heading into the realm of quantum computing and carbon nanotubes.

Within the next decade or two, it will be the end of an era, the end of basic transistors as we know them. Instead, there will be a technological revolution that will change the way we define a computer. Be it quantum computing or carbon nanotubes, Moore's Law will still live on.

Read more from the original source:

Is Moore's Law coming to an end? - Blasting News

CPU architecture after Moore’s Law: What’s next? – Computerworld

By Lamont Wood

Contributing Writer, Computerworld | Jul 24, 2017 3:00 AM PT

When considering the future of CPU architecture, some industry watchers predict excitement, and some predict boredom. But no one predicts a return to the old days, when speed doubled at least every other year.

The upbeat prognosticators include David Patterson, a professor at the University of California, Berkeley, who literally wrote the textbook (with John Hennessy) on computer architecture. "This will be a renaissance era for computer architecture; these will be exciting times," he says.

Not so much, says microprocessor consultant Jim Turley, founder of Silicon Insider. "In five years we will be 10% ahead of where we are now," he predicts. "Every few years there is a university research project that thinks they are about to overturn the tried-and-true architecture that John von Neumann and Alan Turing would recognize, and unicorns will dance and butterflies will sing. It never really happens, and we just make the same computers go faster and everyone is satisfied. In terms of commercial value, steady, incremental improvement is the way to go."

They are both reacting to the same thing: the increasing irrelevance of Moore's Law, which observed that the number of transistors that could be put on a chip at the same price doubled every 18 to 24 months. For more to fit, they had to get smaller, which let them run faster, albeit hotter, so performance rose over the years, but so did expectations. Today, those expectations remain, but processor performance has plateaued.

"Power dissipation is the whole deal," says Tom Conte, a professor at the Georgia Institute of Technology and past president of the IEEE Computer Society. "Removing 150 watts per square centimeter is the best we can do without resorting to exotic cooling, which costs more. Since power is related to frequency, we can't increase the frequency, as the chip would get hotter. So we put in more cores and clock them at about the same speed. They can accelerate your computer when it has multiple programs running, but no one has more than a few trying to run at the same time."

The approach reaches the point of diminishing returns at about eight cores, says Linley Gwennap, an analyst at The Linley Group. "Eight things in parallel is about the limit, and hardly any programs use more than three or four cores. So we have run into a wall on getting speed from cores. The cores themselves are not getting much wider than 64 bits. Intel-style cores can do about five instructions at a time, and ARM cores are up to three, but beyond five is the point of diminishing returns, and we need new architecture to get beyond that. The bottom line is traditional software will not get much faster."
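Gwennap's point about diminishing returns is usually formalized as Amdahl's law, which is not named in the article but captures the same wall: if a fraction p of a program can run in parallel, n cores deliver a speedup of 1/((1-p) + p/n). A minimal sketch:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n cores when fraction p of the work is parallel."""
    return 1 / ((1 - p) + p / n)

# Even a well-parallelized program (90% parallel) gains little past eight cores;
# the speedup ceiling as n grows is 1 / (1 - p) = 10x.
for n in (2, 4, 8, 16, 64):
    print(f"{n} cores: {amdahl_speedup(0.9, n):.2f}x")
```

At 90% parallelism, eight cores yield only about a 4.7x speedup, and 64 cores only about 8.8x, which is why piling on cores stopped paying off.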

"Actually, we hit the wall back in the '90s," Conte adds. "Even though transistors were getting faster, CPU circuits were getting slower as wire length dominated the computation. We hid that fact using superscalar architecture [i.e., internal parallelism]. That gave us a speedup of 2x or 3x. Then we hit the power wall and had to stop playing that game."

Go here to see the original:

CPU architecture after Moore's Law: What's next? - Computerworld

ASML enabling Moore’s law scaling and cost reduction out to 1 to 2 nanometers in mid-2020s – Next Big Future

Today ASML is selling billions of dollars' worth of extreme ultraviolet lithography machines. These machines will help deliver chips at the 5 nanometer to 2 nanometer nodes.

ASML (Veldhoven, the Netherlands) sold eight more EUV (next-generation extreme ultraviolet lithography) systems in the second quarter. This brings its EUV backlog to 27 tools valued at about 2.8 billion euro (about $3.26 billion). The firm also announced that it demonstrated the key productivity metric of 125 wafers per hour on an EUV tool at its headquarters.

Second quarter sales increased to 2.1 billion euro (about $2.44 billion), up 8 percent compared to the year-ago quarter. The company said it is on track to grow sales about 25 percent this year. Intel, Samsung and TSMC are hoping to insert oft-delayed EUV lithography into volume production in the next two years.

ASML has a presentation that describes how it sees EUV enabling a scaling push, with falling costs, out to 2 nanometers.

Here is the original post:

ASML enabling Moore's law scaling and cost reduction out to 1 to 2 nanometers in mid-2020s - Next Big Future

Law firm says it stands up for the disabled. Now it’s accused of racketeering – Fresno Bee

In a tribute to her father, she filed a lawsuit last month in U.S. District Court in Fresno against Mission Law Firm and its predecessor, Moore Law Firm, alleging that the two firms violated the federal Racketeer Influenced and Corrupt Organizations ...

Go here to read the rest:

Law firm says it stands up for the disabled. Now it's accused of racketeering - Fresno Bee

GoPro Inc (GPRO): The Dark Side of Moore’s Law – Investorplace.com

Moore's Law has a dark side. The acceleration of technology means niche products quickly become commodities. In the case of GoPro Inc (NASDAQ:GPRO), it meant that a cheap, sturdy video camera that could be mounted on a cyclist's head became too cheap to remain profitable.

GoPro CEO and founder Nicholas Woodman saw this coming, but instead of seeking to build a market in other, more expensive cameras that might see waves other than visible light, he decided to mount his commodity cameras on more complex devices, namely drones.

In doing so, GPRO became a small, high-priced competitor to a company with lower costs, SZ DJI Technology Co. Now, GoPro is getting crushed on the one hand by its commodity nature, and on the other hand by a competitor with lower costs and more capital.

This is not a good place for GoPro stock to be.

Analysts saw this coming. The average rating on GoPro stock has been "underweight" for months, with more screaming "sell" than "buy," even as shares have plunged more than 50% since last October.

GoPro is expected to report another bad quarter on July 26: losses of approximately $50 million, or 35 cents per share, on revenue of $243.59 million. Bulls will note that represents 10% revenue growth over last year, with narrower losses, but the problems here are more fundamental.

Most InvestorPlace writers are telling readers to go away. James Brumley recently called the stock a one-hit wonder. Tom Taulli said the creative magic is long gone. Ryan Fuhrman called it a real buzz kill.

Only Richard Saintvilus sees any hope, mainly in the company's software. The drone market is growing rapidly, and content sharing through GPRO software holds promise. With a market cap now lower than its annual sales, albeit solid sales, he holds out hope of a second-half rebound.

Most of the opportunity in the drone market lies in heavy commercial and military drones, the kind that can drop packages and bombs, rather than in the lightweight, consumer, camera-focused drones that GoPro makes.

But that does bring up a promising path forward, should GoPro choose to take it. I don't buy stocks based on an assumption that management will change course, but if GPRO did shift its focus, stockholders would benefit.

Read the original:

GoPro Inc (GPRO): The Dark Side of Moore's Law - Investorplace.com

Look at the phone in your hand – you can thank the state for that – The Guardian

Who are the visionaries who drive human progress? The answer, as we all know, is the geeks, the free spirits and the crazy dreamers who thumb their noses at authority: the Peter Thiels and the Mark Zuckerbergs of the world; the likes of Steve Jobs and Travis Kalanick; the giants with an uncompromising vision and an iron will, as though they have stepped fresh from the pages of one of Ayn Rand's novels.

"Innovation," Steve Jobs once said, "distinguishes between a leader and a follower." Now, if ever there were a prototypical follower, it would have to be the government. After all, why else would nearly all the innovative companies of our times hail from the United States, where the state is much smaller than in Europe?

Media outlets including the Economist and the Financial Times never tire of telling us that government's role is to create the right preconditions: good education, solid infrastructure, attractive tax incentives for innovative businesses. But no more than that. The idea that the cogs in the government machine could divine the next big thing is, they insist, an illusion.

Take the driving force behind the digital revolution, also known as Moore's law. Back in 1965, the chip designer Gordon Moore was already predicting that processor speeds would accelerate exponentially. He foresaw such wonders as "home computers", as well as "portable communications equipment" and perhaps even "automatic controls for automobiles".

And just look at us now! Moore's law clearly is the golden rule of private innovation, unbridled capitalism, and the invisible hand driving us to ever loftier heights. There's no other explanation, right? Not quite.

For years, Moore's law has been almost single-handedly upheld by a Dutch company, one that made it big thanks to massive subsidisation by the Dutch government. No, this is not a joke: the fundamental force behind the internet, the modern computer and the driverless car is a government beneficiary from socialist Holland.

Our story begins on 1 April 1984 in a shed knocked together on an isolated lot in Veldhoven, a town in the south of the Netherlands. This is where a small startup called ASML first saw the light of day. Employing a couple of dozen techies, it was a collaborative venture between Philips and ASM International set up to produce hi-tech lithography systems: in plain English, machines that draw minuscule lines on chips.

Fast-forward 25 years, and ASML is a major corporation employing more than 13,000 engineers at 70 locations in 16 countries. With a turnover of over €5.9 billion (£5.2bn) and earnings of €1.2bn, it is one of the most successful Dutch companies ever. It controls over 80% of the chip machine market (the global market, mind you).

In point of fact, the company is the most powerful force upholding Moore's law. For them, this law is not a prediction: it's a target. The iPhone, Google's search engine, the kitty clips: it would all be unthinkable without those crazy Dutch dreamers from Veldhoven.

Naturally, you'll be wondering who was behind this paragon of innovation. The story told by the company itself fits the familiar mould of a handful of revolutionaries who got together and turned the world upside down. "It was a matter of hard work, sweat and pure determination against almost insurmountable odds," explains ASML in its corporate history. "It is a story of individuals who together achieved greatness."

Government isn't just there to administer life-support to failing markets. Without it, many would not even exist.

There's one protagonist you never find mentioned in these sorts of stories: government. But dive deep into the archives of newspapers and annual reports back to the early '90s and another side to this story emerges.

From the get-go, ASML was receiving government handouts. By the fistful. When in 1986 a crisis in the worldwide chip industry brought ASML to its knees, and while several big competitors toppled, the chip machine-maker from the south of Holland got a leg-up from its national government. "Competitors who had survived the crisis no longer had enough funds to develop the next big thing," explains the company's site. So while its rivals licked their wounds, ASML shot into the lead. Is ASML an anomaly in the history of innovation? Not quite.

A few years ago the economist Mariana Mazzucato published a fascinating book debunking a whole series of myths about innovation. Her thesis is summed up in the title The Entrepreneurial State.

Radical innovation, Mazzucato reveals, almost always starts with the government. Take the iPhone, the epitome of modern technological progress. Literally every single sliver of technology that makes the iPhone a smartphone instead of a stupidphone (internet, GPS, touchscreen, battery, hard drive, voice recognition) was developed by researchers on the government payroll.

Why, then, do nearly all the innovative companies of our times come from the US? The answer is simple. Because it is home to the biggest venture capitalist in the world: the government of the United States of America.

These days there is a widespread political belief that governments should only step in when markets fail. Yet, as Mazzucato convincingly demonstrates, government can actually generate whole new markets. Silicon Valley, if you look back, started out as subsidy central. "The true secret of the success of Silicon Valley, or of the bio- and nanotechnology sectors," Mazzucato points out, "is that venture investors surfed on a big wave of government investments."

True innovation takes at least 10 to 15 years, whereas the longest that private venture capitalists are routinely willing to wait is five years. They don't join the game until all the riskiest plays have already been made by governments. In the case of biotechnology, nanotechnology and the internet, venture investors didn't jump on the bandwagon until after 15 to 20 years. Venture capitalists are not willing to venture enough.

The relationship between government and the market is mutual and necessary. Apple may not have invented the internet, GPS, touchscreens, batteries, hard drives and voice recognition; but then again, Washington was never very likely to make iPhones. There's not much point to radical innovations if no one turns them into products.

To dismiss the government as a bumbling slowpoke, however, won't get us anywhere. Because it's not the invisible hand of the market but the conspicuous hand of the state that first points the way. Government isn't there just to administer life support to failing markets. Without the government, many of those markets would not even exist.

The most daunting challenges of our times, from climate change to the ageing population, demand an entrepreneurial state unafraid to take a gamble. Rather than wait around for the market, government needs to have vision and be decisive, taking to heart Steve Jobs's motto: stay hungry, stay foolish.

Utopia for Realists: And How We Can Get There is available from the Guardian bookshop

This article was translated from Dutch by Elizabeth Manton

See the rest here:

Look at the phone in your hand – you can thank the state for that - The Guardian

Ethernet Getting Back On The Moore’s Law Track – The Next Platform

July 10, 2017 Timothy Prickett Morgan

It would be ideal if we lived in a universe where it was possible to increase the capacity of compute, storage, and networking at the same pace so as to keep all three elements expanding in balance. The irony is that over the past two decades, when the industry needed networking to advance the most, Ethernet got a little stuck in the mud.

But Ethernet has pulled out of its boots, left them in the swamp, and is back to being barefoot again on much more solid ground where it can run faster. The move from 10 Gb/sec to 40 Gb/sec was slow and costly, and if it were not for the hyperscalers and their intense bandwidth hunger, we might not even be at 100 Gb/sec Layer 2 and Layer 3 switching, much less standing at the transition to 200 Gb/sec and looking ahead to the not-too-distant future when 400 Gb/sec will be available.

Bandwidth has come along at just the right moment, when advances in CPU throughput are stalling as raw core performance did a decade ago, and as new adjunct processing capabilities, embodied in GPUs, FPGAs, and various kinds of specialized processors, are coming to market to get compute back on the Moore's Law track. Storage, thanks to flash and persistent flash-like and DRAM-like memories such as 3D XPoint from Intel and Micron Technology, is also undergoing an evolution. It is a fun time to be a system architect, but perhaps only because we know that with these advanced networking options, bandwidth is not going to be a bottleneck.

The innovation that is allowing Ethernet not so much to leap ahead as to jump to where it should have already been is PAM-4 signaling. The typical non-return to zero, or NRZ, modulation used with Ethernet switching hardware, cabling, and server adapters encodes one bit per signal. With pulse amplitude modulation, or PAM, multiple signaling levels can be used, so multiple bits can be encoded in each symbol. With PAM-4, there are four signaling levels, which allow two bits of data to be encoded at the same time on the signal, doubling the effective bandwidth of a link without increasing the clock rate. And looking down the road, there is a possibility of stuffing even more bits onto the wire using higher PAM levels: the whiteboards of the networking world are sketching out how to do three bits per signal with PAM-8 encoding and four bits per signal with PAM-16 encoding.

With 40 Gb/sec Ethernet, we originally had 10 Gb/sec lanes aggregated. This was not a very energy efficient way to do 40 Gb/sec, and it was even worse for early 100 Gb/sec Ethernet aggregation gear, which ganged up ten 10 Gb/sec lanes. When the hyperscalers nudged the industry along in July 2014 to backcast this 25 GHz (well, really 28 GHz before encoding) to 25 Gb/sec and 50 Gb/sec Ethernet switching with backwards compatibility to run 10 Gb/sec and 40 Gb/sec, the industry did it. So we got to affordable 100 Gb/sec switching with four lanes running at 25 Gb/sec, and there were even cheaper 25 Gb/sec and 50 Gb/sec options for situations where bandwidth needs were not as high, and at a much better cost. (Generally, you got 2.5X the bandwidth for 1.5X to 1.8X the cost, depending on the switch configuration.)

With the 200 Gb/sec Spectrum-2 Ethernet switching that Mellanox Technologies is rolling out, and that other switch makers are going to adopt, the signaling is still running at an effective 25 GHz, but with the Spectrum-2 gear Mellanox has just unveiled, it is layering on PAM-4 modulation to double-pump the wires, so it delivers 50 Gb/sec per lane even though the lanes are still running at the same speed as 100 Gb/sec Ethernet lanes. And to reach 400 Gb/sec with Spectrum-2 gear, Mellanox is planning to widen out to eight lanes running at this effective 25 GHz while layering on PAM-4 modulation to get 50 Gb/sec effective per lane. At some point, the lane speed will have to increase to 50 GHz, which with PAM-4 would double eight-lane switching to 800 Gb/sec; layering PAM-8 modulation (three bits per symbol) on top would get to 1.2 Tb/sec, and PAM-16 (four bits per symbol) to 1.6 Tb/sec, something that still probably seems like a dream and that is probably also very far into the future.
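The per-lane arithmetic above is simple to sketch: a PAM-N symbol carries log2(N) bits, so the raw lane rate is the symbol rate times bits per symbol, and the aggregate rate is the lane rate times the lane count. This is a sketch using the figures above; real links also carry forward error correction and encoding overhead, which is ignored here:

```python
import math

def lane_gbps(baud_ghz, pam_levels):
    """Raw lane rate: symbol rate (Gbaud) times bits per PAM symbol."""
    return baud_ghz * math.log2(pam_levels)

def aggregate_gbps(baud_ghz, pam_levels, lanes):
    """Aggregate link rate across all lanes."""
    return lane_gbps(baud_ghz, pam_levels) * lanes

print(aggregate_gbps(25, 4, 4))    # 200.0: four 50 Gb/sec PAM-4 lanes
print(aggregate_gbps(25, 4, 8))    # 400.0: eight 50 Gb/sec PAM-4 lanes
print(aggregate_gbps(50, 16, 8))   # 1600.0: the 1.6 Tb/sec dream scenario
```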

"This all sounds a lot easier in theory than it will be to actually engineer," Kevin Deierling, vice president of marketing at Mellanox, tells The Next Platform. "You can go to PAM-8 and you can go to PAM-16, but when you do that, you are starting to shrink the signal and it gets harder and harder to discriminate between one level in the signal and the next. Your signal-to-noise ratio goes away because you are shrinking your signal. Some folks are saying let's go to PAM-8 modulation, and other folks are saying that they need to use faster signaling rates like 50 GHz. I think we will see a combination of both."

The sweet thing about using PAM-4 to get to 200 Gb/sec switching is that the same SFP28 and QSFP28 adapters and cables that were used for 100 Gb/sec switching (and that are used for the 200 Gb/sec Quantum HDR InfiniBand that Mellanox launched last year and that will start shipping later this year) are used for the doubled-up Ethernet speed bump. You need better copper cables for Spectrum-2 because the signal-to-noise margin is shrinking, and the optical transceivers similarly need to be tweaked for the same reason. But the form factors of the adapters and switch ports remain the same.

With the 400 Gb/sec Spectrum-2 switching, the adapters have new, wider form factors, with Mellanox supporting the QSFP-DD (short for double density) option rather than the OSFP (short for Octal Small Form Factor) option for optical ports. Deierling says Mellanox will let the market decide and will support whatever it wants (one, the other, or both), but it is starting with QSFP-DD.

The Spectrum-2 ASIC can deliver 6.4 Tb/sec of aggregate switching bandwidth, and it can be carved up in a bunch of ways, including 16 ports at 400 Gb/sec, 32 ports at 200 Gb/sec, 64 ports at 100 Gb/sec (using splitter cables), and 128 ports running at 25 Gb/sec or 50 Gb/sec (again, using splitter cables). The Spectrum-2 chip can handle up to 9.52 billion packets per second, and has enough on-chip SRAM to handle access control lists (ACLs) with up to 512,000 entries; with one of the 200 Gb/sec ports and a special FPGA accelerator designed to act as an interface to a chunk of external DRAM next to the chip, the Spectrum-2 can handle up to 2 million additional routes on the ACL, which Deierling says makes it the first internet-scale Ethernet switch based on a commodity ASIC that is suitable for hyperscaler-class customers who want to do Layer 3 routing on a box at datacenter scale.
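A quick sanity check on those carve-ups, using only the figures quoted above (the code and names are our own sketch), confirms that each configuration fits the 6.4 Tb/sec aggregate:

```python
ASIC_CAPACITY_GBPS = 6400  # Spectrum-2 aggregate switching bandwidth

carve_ups = [
    (16, 400),   # 16 ports at 400 Gb/sec
    (32, 200),   # 32 ports at 200 Gb/sec
    (64, 100),   # via splitter cables
    (128, 50),   # via splitter cables
    (128, 25),   # lower-speed option, undersubscribes the ASIC
]
for ports, speed in carve_ups:
    total = ports * speed
    note = ("fully used" if total == ASIC_CAPACITY_GBPS
            else f"{ASIC_CAPACITY_GBPS - total} Gb/sec spare")
    print(f"{ports} x {speed} Gb/sec = {total} Gb/sec ({note})")
```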

As for latency, which is something that everyone is always concerned with, the port-to-port hop on the Spectrum-2 switch is around 300 nanoseconds, and this is about as low as the Ethernet protocol, which imposes a lot of overhead, can go, according to Deierling. The Switch-IB 2 and Quantum InfiniBand ASICs from Mellanox can push latencies down to 100 nanoseconds or a tiny bit lower, but that is where InfiniBand hits a wall.

At any rate, Mellanox reckons that Spectrum-2 has the advantage in switching capacity, with somewhere between 1.6X and 1.8X the aggregate switching bandwidth of its competition (and without packet loss), and somewhere on the order of 1.5X to 1.7X lower latency, too.

At the moment, Mellanox is peddling four different configurations of its Spectrum-2 switches, described below.

The Spectrum-2 switches are being made available in two form factors: two full-width devices and two half-width devices. The SN3700 has a straight 32 ports running at 200 Gb/sec for flat, Clos-style networks, while the SN3410 has 48 ports running at 50 Gb/sec plus eight uplinks running at 200 Gb/sec for the more standard three-tiered networks used in the enterprise and sometimes on the edges of the datacenter at hyperscalers. The SN3100 is a half-width switch with 16 ports running at 200 Gb/sec, and the SN3200 has 16 ports running at 400 Gb/sec.

It is interesting that there is no full-width SN series switch with 400 Gb/sec ports. This is intentional, and based on the expected deployment scenarios. Where a very high bandwidth switch is needed to create a storage cluster or a hyperconverged storage platform, 16 ports in a rack is enough, and two switches at 16 ports provide redundant paths between compute and storage (or hyperconverged compute-storage) nodes to prevent outages.

There is even a scenario in which the VMS Wizard software for the Spectrum-2 line combines a quad of the 200 Gb/sec or 400 Gb/sec switches with 64 of the SN3410 devices to create a virtual modular switch supporting up to 3,072 ports (64 devices at 48 ports each) in a single management domain.
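The arithmetic behind that 3,072-port figure, and the leaf-level oversubscription such a design implies, is easy to check; the sketch below uses only the per-model specs given above (the names are hypothetical):

```python
LEAVES = 64
DOWNLINKS_PER_LEAF = 48      # SN3410: 48 x 50 Gb/sec ports
DOWNLINK_GBPS = 50
UPLINKS_PER_LEAF = 8         # SN3410: 8 x 200 Gb/sec uplinks
UPLINK_GBPS = 200

total_ports = LEAVES * DOWNLINKS_PER_LEAF
down_bw = DOWNLINKS_PER_LEAF * DOWNLINK_GBPS   # 2,400 Gb/sec per leaf
up_bw = UPLINKS_PER_LEAF * UPLINK_GBPS         # 1,600 Gb/sec per leaf

print(f"Ports in the domain: {total_ports}")          # 3,072
print(f"Leaf oversubscription: {down_bw / up_bw}:1")  # 1.5:1
```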

This virtual modular switch is about 25 percent less expensive than actual modular switches with the same port count, which also offer lower bandwidth and higher latency.

Programmability is a big issue in networking these days, and the Spectrum-2 devices will be fully programmable, supporting both a homegrown compiler and scripting stack created by Mellanox and the P4 compiler that was created by Barefoot Networks for its Tofino Ethernet switch ASICs and that some hyperscalers are standardizing on. Mellanox expects hyperscalers to want to do a lot of their own programming, but most enterprise customers will simply run the protocols and routines that Mellanox itself codes for the machines. The point is that when a new protocol or extension comes along, Spectrum-2 will be able to adopt it, and customers will not have to wait until new silicon comes out. The industry waited far too long for VXLAN to be supported in chips, and that will not happen again.

As for pricing, the more bandwidth you get, the more you pay, but the cost per bit keeps coming down, and it will keep doing so at the 200 Gb/sec and 400 Gb/sec speeds embodied in the Spectrum-2 lineup. Pricing depends on volumes and on the cabling, of course, but here is how it generally looks. With the jump from 40 Gb/sec to 100 Gb/sec switching (based on the 25G standard), customers got a 2.5X bandwidth boost for somewhere between 1.5X and 1.8X the price, somewhere around a 20 percent to 30 percent price/performance benefit. Today, almost two years later, 100 Gb/sec ports are at price parity with 40 Gb/sec ports back then, and Deierling says that a 100 Gb/sec port costs around $300 for a very high volume hyperscaler and something like $600 per port for a typical enterprise customer. The jump to 200 Gb/sec will follow a similar pattern. Customers moving from 100 Gb/sec to 200 Gb/sec switches (from Spectrum to Spectrum-2 devices in the Mellanox lineup) will get 2X the bandwidth for 1.5X the cost. Similarly, those jumping from 100 Gb/sec to 400 Gb/sec will get 4X the bandwidth per port for 3X the cost.
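To make the price/performance arithmetic explicit, the cost per bit is just the cost multiple divided by the bandwidth multiple; this sketch (our own, with hypothetical names) runs the transitions quoted above:

```python
def cost_per_bit_change(bandwidth_multiple: float, cost_multiple: float) -> float:
    # Fractional change in cost per bit; negative means cheaper per bit.
    return cost_multiple / bandwidth_multiple - 1.0

transitions = [
    ("40G -> 100G (best case)",  2.5, 1.5),
    ("40G -> 100G (worst case)", 2.5, 1.8),
    ("100G -> 200G",             2.0, 1.5),
    ("100G -> 400G",             4.0, 3.0),
]
for name, bw, cost in transitions:
    print(f"{name}: {cost_per_bit_change(bw, cost):+.0%} cost per bit")
```

On these numbers, the 200 Gb/sec and 400 Gb/sec jumps both work out to about 25 percent less cost per bit, while the earlier 25G transition landed between 28 percent and 40 percent.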

Over time, we expect that 200 Gb/sec pricing will reach parity with today's 100 Gb/sec pricing, perhaps two years hence, and that the premium for 400 Gb/sec will be more like 50 percent than 100 percent. But those are just guesses. A lot depends on what happens in the enterprise. What we do know is that enterprises are increasingly being forced by their applications, and by the latency demands of their end users, to deploy the kind of fat tree networks that are common at HPC centers and hyperscalers, moving away from the oversubscribed, tiered networks of the past, where they could skimp on switch devices and hope the latencies were not too bad.

See original here:

Ethernet Getting Back On The Moore's Law Track - The Next Platform

Chart in Focus: AMD’s Moore’s Law Plus Concept – Market Realist

In the past, Advanced Micro Devices (AMD) suffered from delayed product launches because it built its chips on highly specialized semiconductor nodes of its own. As a result, any process difficulties and yield issues were specific to the company.

In 2012, AMD completed the spin-off of its manufacturing unit, GlobalFoundries, and became a fabless company. This helped it sidestep process technology and yield difficulties. In 2016, the company launched its first product on the 14nm (nanometer) process node, bringing it on par with its competitors Intel (INTC) and NVIDIA (NVDA) in terms of process technology.

However, Moore's Law is slowing. Moore's Law holds that the number of transistors on a chip doubles roughly every two years as feature sizes shrink, improving performance while reducing cost and power consumption.
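Stated as a formula (a paraphrase of the law, not Market Realist's notation), with $N_0$ the starting transistor count and $T$ the doubling period:

```latex
% Moore's Law as a growth formula (a paraphrase, not Market Realist's notation):
% N_0 = starting transistor count, T = doubling period.
N(t) = N_0 \cdot 2^{\,t/T}, \qquad T \approx 18\text{ to }24\ \text{months}
% Example: over six years with T = 24 months, the count grows by 2^{72/24} = 2^3 = 8x.
```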

As Moore's Law slows, companies are looking for innovative ways to power the next generation of computing capability. AMD developed the concept of Moore's Law Plus to drive future innovation.

At its 2017 Financial Analyst Day, AMD's chief technology officer, Mark Papermaster, explained that semiconductor technology alone cannot address the company's future computing needs. As a result, AMD has adopted a three-pronged approach to drive future-generation chip development: integrated hardware, software support, and design from a system perspective.

AMD is improving its core architecture to integrate with other hardware. The company developed Infinity Fabric, which connects multiple chips efficiently and provides greater control with respect to power and security.

AMD is advancing its packaging from the current MCM (multichip module) and 2.5D packaging to 3D packaging.

The company has collaborated with industry participants like IBM (IBM) and Xilinx (XLNX) in developing an industry-standard interconnect, CCIX (Cache Coherent Interconnect for Accelerators). CCIX would provide high-performance, rack-scale connectivity between different accelerators and server processors.

AMD is also looking to optimize the physical design of its chips by making them denser and more power-efficient.

AMD is supporting its hardware with very high-performance software solutions. Instead of locking down its software, AMD has adopted an open computing platform, with software that users can freely download and contribute to.

The company is using C/C++ compilers and advanced frameworks like ROCm (the Radeon Open Compute platform) to support its hardware.

Semiconductor and software technology cannot deliver the desired computing power in isolation; all of these technologies have to be integrated in a system design. For instance, AMD's Radeon Instinct initiative integrates the Ryzen CPU (central processing unit), the Vega GPU (graphics processing unit), HBM2 (high-bandwidth memory), and ROCm to deliver machine learning and heterogeneous computing systems.

AMD has recently acquired wireless millimeter-wave interconnect technology that it plans to use in developing wireless VR (virtual reality) headsets.

Next, we'll look at AMD's new CPU architecture.

Here is the original post:

Chart in Focus: AMD's Moore's Law Plus Concept - Market Realist

Could this 2D materials innovation push Moore’s law into sub-5nm gate lengths? – Electropages (blog)

In a major technological development, researchers have presented a material-device-circuit level co-optimisation of field-effect transistors (FETs) based on 2D materials for high-performance logic applications scaled beyond the 10nm technology node.

It is the result of collaborative work between Imec, the nanoelectronics and digital technology innovation centre, and scientists from KU Leuven in Belgium and Pisa University in Italy. In addition, Imec has created designs thought to allow the use of monolayer 2D materials to push Moore's Law below a 5nm gate length.

Scientists believe that 2D materials, which are formed from two-dimensional crystals, could enable transistors with channel thicknesses down to a single atom and gate lengths of a few nanometers.
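The usual textbook way to see why an atomically thin channel permits such a short gate is the electrostatic scaling length; this rule of thumb is a standard device-physics estimate, not a formula quoted by the article:

```latex
% Natural (electrostatic) scaling length of a thin-body FET (textbook estimate):
\lambda = \sqrt{\frac{\varepsilon_{\mathrm{ch}}}{\varepsilon_{\mathrm{ox}}}\, t_{\mathrm{ch}}\, t_{\mathrm{ox}}}
% t_ch, t_ox: channel and gate-oxide thicknesses; eps_ch, eps_ox: their permittivities.
% Short-channel effects stay manageable for gate lengths of roughly 3*lambda to 5*lambda,
% so shrinking t_ch toward a single atomic layer shrinks the minimum workable gate length.
```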

A key technology driver that allowed the chip industry to progress Moore's Law and produce increasingly powerful devices was the continued scaling of gate lengths.

In order to counter the resulting negative short-channel effects, chip manufacturers have already moved from planar transistors to FinFETs. They are now introducing other transistor architectures such as nanowire FETs. This material breakthrough goes beyond existing practices.

In order to fit FETs based on 2D materials into the scaling roadmap, it is essential to understand how their characteristics relate to their behavior in digital circuits. In a recent paper published in Scientific Reports, the Imec scientists and their colleagues explained how to choose materials, design the devices, and optimise performance to create circuits meeting the requirements of sub-10nm high-performance logic chips. Their findings demonstrate the need to use 2D materials with anisotropic characteristics, meaning their electronic properties differ along different crystal directions, and in particular a small effective mass in the transport direction.

Using one such material, monolayer black phosphorus, the researchers presented device designs that they say could pave the way to extending Moore's Law into sub-5nm gate lengths.

These designs reveal that for sub-5nm gate lengths, 2D electrostatics arising from gate stack design become more of a challenge than direct source-to-drain tunneling.

These results are very encouraging because in the case of 3D semiconductors, such as Si, scaling gate length so aggressively is practically impossible.

Paul Whytock is European Editor for Electropages. He has reported extensively on the electronics industry in Europe, the United States and the Far East for over twenty years. Prior to entering journalism he worked as a design engineer with Ford Motor Company at locations in England, Germany, Holland and Belgium.

Here is the original post:

Could this 2D materials innovation push Moore's law into sub-5nm gate lengths? - Electropages (blog)

NVIDIA White Paper Projects MCM-GPU Future Will Outrun Moore’s … – Hot Hardware


It's not too often we get the feeling that some of the technology that we regularly use is reaching its upper limit, but there comes a time when new ideas need to ...

See the rest here:

NVIDIA White Paper Projects MCM-GPU Future Will Outrun Moore's ... - Hot Hardware