Moore’s Law and supply chain planning systems – The 21st Century Supply Chain – Perspectives on Innovative (blog)

It was in 1965 that Dr. Gordon Moore made a prediction that changed the pace of tech. His prediction, popularly known as Moore's Law, was that the number of transistors per square inch on an integrated circuit would double every 18 months or so. As a result of the innovations attributable to the endurance of Moore's Law over the last 50+ years, we have seen significant accelerations in processing power, storage, and connectivity. These advances continue to have major implications for how companies plan their supply chains. In my nearly two decades as a supply chain professional, I have seen quite a few changes.

Let's look at some of the big shifts that have taken place in the supply chain planning space.

Early on in my career, I remember working with a large global company that had to take its interconnected global supply chain model and slice it up into distinct, independent supply chain models. This was because the processing power at the time was simply not enough to plan their supply chain in a single instance. This surgical separation of supply chains required a high degree of ingenuity: identifying the portions of the supply network with the fewest interconnections and partitioning along them. This was not the optimal way to build a supply chain model, but they did what they could within the limitations of the technology of the day. With the advent of better processing power, they were able to consolidate these multiple instances into a single global instance, leading to a better model of their business. This is just one of many such examples.

As the hardware side of the solution benefited from Moore's Law, developers of supply chain applications in parallel continued to make conscious efforts to better utilize the storage, processing, and network resources available to them. This multi-pronged approach squeezed out further efficiencies and brought better scalability. Now companies are getting more adventurous with their planning and are taking planning down to the point of consumption. While there is plenty of debate within the supply chain community as to whether the data at more atomic levels is clean, trustworthy, and dense enough, and whether the extra effort needed to model down to such granular levels is worth it, the fact that we are seeing technology scale to such levels of granularity is illustrative of the power of Moore's Law.

In a traditional packaged planning software deployment, the vendor sells a perpetual license for the software, helps the customer size the hardware, waits for the hardware to be set up and configured at the customer's premises, then installs the software and the middleware components needed before the software configuration can begin. This whole process can take several weeks or, in many cases, months. With Moore's Law holding its power over the decades, and the resulting gains in processing, storage, and network speeds, newer delivery models have prevailed. Supply chain planning capability is now being provided in a Software as a Service (SaaS) model. Immediately upon executing the necessary contracts, customers can start accessing the software, so the project can begin in earnest. This is shifting the focus from technology enablement to business capability enablement. I remember the days when prospects approached the cloud with skepticism, specifically around the security of cloud-based systems. Now, while I still see a number of prospects asking questions around security as part of the RFP (Request for Proposal) process, it is fair to say that the security discussion in most instances turns out to be a set of quick conversations with the customer's IT teams. There is, in general, a growing acknowledgement that a SaaS vendor catering to many customers is better equipped to handle security vulnerabilities than any one company's IT organization.

One added advantage of the move to the cloud is accessibility. Until a few years ago, every RFP looking for global deployment of supply chain planning systems used to contain questions around accessibility over dial-up lines and the like in developing nations. Now it is not as often that I see questions around network speed and accessibility. With tech becoming accessible across the globe and with increasing availability of bandwidth, I am seeing fewer companies ask about accessibility from different geographies. Instead, the questions are more geared toward access from various mobile devices, which is becoming a core requirement. The SaaS model lends itself very well to such support across varied devices and form factors. SaaS is illustrative of the symbiotic progress between hardware and software delivery models powered by Moore's Law.

While there is plenty of talk about the rise of the machines and autonomous supply chains, the newer forms of planning technology are in fact helping bring out the best of humans and machines working together, rather than making humans redundant. The previous generation of planning technology was very much waterfall oriented, with demand planning, followed by supply planning, followed by capacity planning, and so on. It severely undermined the role of human intelligence in supply chain planning. The well-intentioned users of such systems spend more time gathering and preparing data, and piecing together information from outdated data using Excel macros and the like. Also, building an S&OP capability on such underlying technology is turning out to be an expensive band-aid for several organizations.

Such batch, waterfall-oriented planning is giving way to near real-time concurrent planning supported by what-if scenarios and social collaboration. Supported by technologies such as in-memory computing, concurrent planning can happen at a scale we have not seen before. Such advances in planning at the speed of business can also better leverage advances in IoT, machine learning, and data science. The batch-oriented supply chain planning capabilities of the previous generation are not fit to consume the real-time digital signals from smart, connected devices and course correct as needed. Having a system that can supplement human intelligence so planners can make decisions at the speed of business can be very empowering.

Now it is becoming very realistic and affordable to represent the model of an end-to-end network of a large corporation with all its assumptions and parameters, and to simulate the response strategies to the various stimuli the supply chain receives. Linear approximations of highly non-linear supply chains are giving way to more realistic modeling of supply networks.

All in all, Moore's Law has had a major impact on supply chain planning capabilities. Significant gaps still exist between the art of the possible with a new way of concurrent planning and how many organizations run their supply chain planning processes in a batch-oriented manner today. My advice to companies embarking on supply chain transformation: the future is here! Challenge yourself on whether the old ways of planning will meet the needs of the organizations of the present day. If Moore's Law helped put unprecedented computing power right in your pocket in the form of a smartphone, what can it do for your supply chain? The possibilities are limitless. You just need to be open to exploring!

As Vice President of Industry Strategy at Kinaxis, Madhav serves as a trusted advisor for our customers through sales and implementation, ensuring success. He also engages with our strategic customers and key industry leaders to drive thought leadership and innovation. Madhav joined Kinaxis in the summer of 2016, bringing many years of experience in supply chain management across various industries. Madhav started his professional career at i2 (which was later acquired by JDA). During his 17+ year tenure at i2/JDA, Madhav played numerous roles in Customer Support, Consulting, Presales, and Product Management. During his illustrious career, he was instrumental in enabling numerous large-scale transformational supply chain opportunities. He is very passionate about supply chain management and the role it plays in making the world a better place. He shares this passion with others through his engagements and writings. Madhav has a Ph.D. in Chemical Engineering from the University of Florida and a B.Tech. in Chemical Engineering from the Indian Institute of Technology (IIT), Madras.


Could going beyond Moore’s Law open trillion dollar markets? – Scoop.co.nz (press release)

Press Release #2 Multicore World 2017

Could going beyond Moore's Law open trillion dollar markets for New Zealand?

"Technology is advancing at a faster rate than society's expectations," says Paul Fenwick, keynote speaker at Multicore World 2017, Wellington, February 20 - 22.

"We can go from science fiction to consumer availability, with very little in the way of discussion in between. But the questions they raise are critically important," says the Australian, one of a number of global experts at a world-leading forum on what is possible with the vastly underutilised computing processing power now available.

"Not many look at critical questions such as what happens when self-driving vehicles cause unemployment, when medical expert systems work on behalf of insurance agencies rather than patients, and when weapon platforms make their own lethal decisions," he says.

Conference Director Nicolas Erdody says MW17 is much more than a talk-fest.

Erdody says that 90% of all the data in the world has been generated in the past two years, a pattern that will keep repeating. "How on earth will we process these massive amounts of data, and actually make meaningful sense and use of it?" he asks.

Among the industry, academic and research experts is Prof Michelle Simmons.

She is an Australian Research Council Laureate Fellow and Director at the Centre for Quantum Computation & Communication Technology, UNSW. She will describe the emerging field of quantum information, a response to the fact that device miniaturization will soon reach the atomic limit, set by the discreteness of matter, leading to intensified research in alternative approaches for creating logic devices of the future.

Prof Satoshi Matsuoka (Japan) will present his keynote "Flops to Bytes: Accelerating Beyond Moore's Law", and Dr John Gustafson (former Director of Intel Labs, now Visiting Professor at the National University of Singapore) will reveal a new data type called the posit, which provides a better solution for approximate computing.

In this context, New Zealander Prof Michael Kelly, Prince Philip Professor of Technology at the University of Cambridge (UK), will ask in his keynote "How Might the Manufacturability of the Hardware at Device Level Impact on Exascale Computing?"

Dr Nathan DeBardeleben from Los Alamos National Labs (US) will discuss how supercomputer resilience and fault tolerance are increasingly challenging areas of extreme-scale computing research as agencies and companies strive to solve the most critical problems. In his talk he will discuss how data analytics and machine learning techniques are being applied to influence the design, procurement, and operation of some of the world's largest supercomputers.

"The assemblage of big brains around multicore computing and parallel programming will pose questions and answers as the world moves towards exascale computing in the next decade. Being part of such discussions can position New Zealand technologists, entrepreneurs and scientists at the intersection of two massive global markets that will benefit this country's future growth: Decision-Making (estimated at $2 trillion) and Food and Agriculture (estimated at $5 trillion)," says Nicolas Erdody, Open Parallel CEO and MW17 Conference Organiser.

The 6th annual Multicore World, to be held at Shed 6, will discuss these and other Big Questions. MW17 will be three days of intensive talks, panels and discussion in a think-tank format that allows plenty of time for one-on-one meetings.

The conference is organised by Open Parallel Ltd (New Zealand) and sponsored by MBIE, Catalyst IT, NZRise and Oracle Labs.

ENDS


Large-Scale Quantum Computing Prototype on Horizon – The Next Platform

February 16, 2017 Jeffrey Burt

What supercomputers will look like in the future, post-Moore's Law, is still a bit hazy. As exascale computing comes into focus over the next several years, system vendors, universities and government agencies are all trying to get a gauge on what will come after that. Moore's Law, which has driven the development of computing systems for more than five decades, is coming to an end as making smaller chips loaded with more and more features becomes increasingly difficult.

While the rise of accelerators like GPUs, FPGAs and customized ASICs, silicon photonics and faster interconnects will help drive performance to meet many of the demands of such emerging applications as artificial intelligence and machine learning, data analytics, autonomous vehicles and the Internet of Things, down the road new computing paradigms will have to be developed to address future workload challenges. Quantum computing is among the possibilities being developed as vendors look to map out their pathways into the future.

Intel, which has driven Moore's Law forward more successfully than any other chip maker, is now turning some of its attention to the next step in computing. CEO Brian Krzanich said last week during the company's investor event that Intel is investing a lot of time, effort and money in both quantum computing and neuromorphic computing (developing systems that can mimic the human brain), and Mark Seager, Intel Fellow and CTO for the HPC ecosystem in the chip maker's Scalable Datacenter Solutions Group, told The Next Platform that "at Intel, we are serious about other aspects of AI like cognitive computing and neuromorphic computing. Our way of thinking about AI is more broad than just machine learning and deep learning, but having said that, the question is how the technologies required for these workloads are converging with HPC."

Quantum computing has been talked about for decades, and there have been projects pushing the idea for almost just as long. It holds out the promise of systems that are multiple times faster than current supercomputers. At the core of quantum computers are qubits, which are to quantum systems what bits are to traditional computers.

IBM last year made its quantum computing capabilities available on the IBM Cloud to give the public access to the technology and to drive innovation and new applications for it. Big Blue has been working on quantum computing technology for more than three decades. D-Wave currently is the only company to offer commercial quantum computing systems, and last month introduced its latest version, the D-Wave 2000Q, which has 2,000 qubits (twice the number of its predecessor) and has its first customer in Temporal Defense Systems, which will use the system to address cybersecurity threats. The systems are expensive (reportedly in the $15 million range) and the number of applications that can run on them is small, though D-Wave officials told The Next Platform that the number of applications will grow over the next decade and that the company is working to encourage that growth.

Other organizations also are pushing to expand the capabilities of quantum computing. Researchers led by Prof. Winfried Hensinger, head of the Ion Quantum Technology Group at the University of Sussex in England, this month unveiled a blueprint for building a modular, large-scale and highly scalable quantum computer, and plan to build a prototype of the system at the university. The modular model and a unique way of moving qubits between the modules are at the center of what the researchers (who also come from the United States, Denmark, Japan and Germany) are developing. Qubits take advantage of what in quantum mechanics is called superposition: the ability to hold values of 1 and 0 at the same time. That ability fuels much of the promise of quantum computers that are significantly faster than conventional systems.

"Quantum physics is a very strange theory, predicting things like an atom can be in two different places at the same time. We're harnessing these very strange effects in order to build a new type of computer. These quantum computers will change all of our lives, revolutionizing science, medicine and commerce."

The computer will be built from modules that contain an electronics layer, a cooling layer using liquid nitrogen, and piezo actuators. Each module will be lowered into a steel frame, and the modules will leverage connections created via electric fields that transmit ions from one module to the next. It's a step in another direction from the fiber optic technologies many scientists are advocating for in quantum computers.

The researchers in Sussex argue that using electric fields to transport the charged atoms will offer connection speeds between the modules that are 100,000 times faster than current fiber technologies and, according to Hensinger, will "allow us to build a quantum computer of any size [and] allow us to achieve phenomenal processing powers." Each module will hold about 2,500 qubits, enabling a complete system that can contain 2 billion or more qubits.
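A quick back-of-the-envelope check of those figures (the qubit counts come from the article; the arithmetic below is only an illustration) shows roughly how many modules a machine of that size would imply:

```python
# Illustrative arithmetic only; the per-module and system qubit counts are
# taken from the figures quoted above.
qubits_per_module = 2_500
target_qubits = 2_000_000_000

modules_needed = target_qubits / qubits_per_module
print(f"Modules required: {modules_needed:,.0f}")  # Modules required: 800,000
```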

The blueprint and prototype will be the latest step in what is sure to be an ongoing debate about what quantum computers will look like. However, creating a modular system that can scale quickly and offers very fast connectivity technology will help drive the discussion forward. Hensinger and his colleagues are making the blueprint public in hopes that other scientists will take in what they're developing and build off of it.


Intel Corporation (NASDAQ:INTC) Realizes There Will Be A Post-Moore’s Law Era And Is Already Investing In … – Seneca Globe

Intel Corporation (NASDAQ:INTC) stock was down around 0.34% in the early session on volume of 44.07 million shares. Intel (INTC) declared that it realizes there will be a post-Moore's Law era and is already investing in technologies to drive computing beyond today's PCs and servers. The chipmaker is investing heavily in quantum and neuromorphic computing, said Brian Krzanich, CEO of Intel, during a question-and-answer session at the company's investor day on Thursday.

"We are investing in those edge type things that are way out there," Krzanich said. To give an idea of how far out these technologies are, Krzanich said his daughter would perhaps be running the company by then.

Research into these technologies, which are still in their infancy, is something Intel has to do to survive for many more decades. Shrinking silicon chips and cramming more features into them is becoming difficult, and Intel is already having trouble manufacturing smaller chips.

The stock showed a weekly performance of -3.23% and a monthly performance of -3.67%. The performance for the quarter was recorded as 2.42% and for the year as 29.15%, while the YTD performance stood at -1.87%. INTC has a 14-day Average True Range of 0.57.

Cirrus Logic, Inc. (NASDAQ:CRUS) was also an active mover, with the stock down around 2.13% to trade at $54.19.

In its most recent quarterly results, the company reported a current ratio of 3.90, a debt-to-equity ratio of 0.09, and a long-term debt-to-equity ratio of 0.09. The company has a gross margin of 49.10% and a profit margin of 16.80% over the trailing twelve months, with a return on investment of 12.50%.


What's next? Going Beyond Moore's Law SAT Press Releases – Satellite PR News (press release)


Conditioning consumers to expect certain advances in speed, battery life, and capabilities, Moore's Law has led the way for the computing industry for decades.

Because Moore's Law suggests exponential growth, it is unlikely to continue indefinitely. Software and hardware innovations will likely keep the dream of Moore's Law alive for several years to come; however, there may come a time when Moore's Law is no longer applicable due to temperature constraints. As such, a revolutionary approach to computing is required. The IEEE Rebooting Computing Initiative is dedicated to studying next-generation alternatives for the computing industry. In Engadget's recent Public Access article, "Beyond Moore's Law," Tom Conte, IEEE Rebooting Computing Initiative Co-Chair and Professor in the Schools of Electrical & Computer Engineering and Computer Science at the Georgia Institute of Technology, provides an overview of next-generation alternatives that could meet the growing demand for advances in computing technology.

From cryogenic computing to quantum computing, there are a variety of alternatives that could meet the expectations of consumers. Change is coming to the computing industry. Are you interested in learning more? Tom will provide insight on this topic at the annual SXSW Conference and Festival, 10-19 March 2017. His session, "Going Beyond Moore's Law," is included in the IEEE Tech for Humanity Series at SXSW. For more information please see http://techforhumanity.ieee.org.


Moore’s Law And The History Of Comic Book Movies – Monkeys Fighting Robots (blog)

Back in 1965, Intel co-founder Gordon Moore made an observation that the number of transistors was doubling every year, thereby doubling the power of computers. Moore's Law, as it came to be known, would prove even more accurate than he imagined. Since Moore's observation, computing power has continued to grow at an incredible rate. All this growing technology directly led to the effects of Star Wars, Terminator 2, and the CG-heavy comic book movies of today.

No other genre benefits from computing power quite like superhero movies. Every year, Disney and Warner Brothers unleash a new effects-heavy punch-fest starring a beloved character from comic book lore. The superhero trend went into overdrive in 2008 with Iron Man, but before that, Raimi's Spider-Man conquered box offices with its dazzling use of CG; before that, Singer's first two X-Men movies were on top. However, things get a little murkier before the arrival of X1 in 2000, and that's where the debate begins.

Some in geekdom believe Blade is the father of modern comic book movies; others argue it's Tim Burton's Batman in 1989; still others look back at Superman: The Movie. I'm here to say that they're all wrong and right! I'll explain.

Comic books were a pulp mainstay for decades. But up through the 1970s, there were only two movies to mention.

Superman and the Mole Men (1951): There wasn't going to be anyone else who broke the mold first. Superman was the most popular comic book of the time and already had a hit TV show. Superman and the Mole Men was an extension of the show, featuring George Reeves as the last son of Krypton.

Batman: The Movie (1966): In the 60s, campy Batman was all the rage. Adam West filled the cape and cowl and through the course of three seasons fought the greatest hits of Batman's rogues gallery. In 1966, much like the Superman movie of the 50s, producers wisely created a feature-length episode. In it, Penguin and the United Underworld are turning people into cubes.

You will believe a man can fly. If I had to pick an actual starting point for comic book movies as mainstream money-makers, it would undoubtedly be here. Richard Donner's Superman was a mega-hit at the box office. The effects look dated now (40 years, hello!) but the innovations pioneered by Star Wars just a year before helped Donner create a dazzling comic book movie like never before.

In the 70s, anti-heroes like Batman and Wolverine weren't as big a thing as today. Heroes were still meant to be the best of us, not psychologically disturbed or ferocious. Superman was still king of the comic book mountain in the minds of the masses, and there was no one else who could lift the weight of the comic book universe into the mainstream like the Man of Steel.

Total Number of Comic Book Movies Up Until December 31st, 1979: 3

The 80s were slow-going for comic book films. Superman carried the torch with three sequels, each drastically worse than the one before it. But two movies made an impact. One film served as a subtle nudge, while the other became the standard bearer.

Not a hit by any stretch of the imagination, Swamp Thing from director Wes Craven holds an important place in comic book movie history. Craven, a master of horror films, even while trying to win the mainstream hearts of Hollywood execs and keep away from his usual style, still added his signature to Swamp Thing. That macabre touch created a distinction from what was the norm and played into the growing popularity of anti-heroes.

Tim Burton's Batman was a smash box office success, rocketing into the top earners of all time. Donner's Superman knocked down the door into the mainstream, but Burton's Batman went in and beat the crap out of everyone. Batman was a hype phenomenon in the days before the Internet and sites like Monkeys Fighting Robots existed. Warner Brothers unleashed a torrent of marketing that included an entire magazine devoted to the film before release. Similar to today's leaked photos, the magazine highlighted all things about the movie.

Number of Comic Book Feature Films: 9

It's in the 1990s when things take a radical leap. After the success of Batman, Hollywood was gearing up to turn every comic book it could get its hands on into a movie. There were four more Batman films, Dolph Lundgren played The Punisher, and the Teenage Mutant Ninja Turtles continued their transition from dark comic book to lighthearted multimedia franchise. Again, two films set the stage for things to come.

Many viewers had no idea that The Crow was a graphic novel by James O'Barr. Today, most remember the movie as the final film of Brandon Lee. The Crow is all 90s grunge-goth action movie awesome that holds up well today. Director Alex Proyas, who later created the sci-fi noir film Dark City, bathed The Crow in rain and darkness, with the dark atmosphere lifting only when it serves the story. The Crow continued to lengthen the path of the anti-hero.

By the late 90s, comic book movies were either Batman movies or obscure comics and graphic novels made on an average budget. Like The Crow, only the most ardent geeks even knew Blade was a comic book, but the Wesley Snipes action movie was a sleeper hit that sliced and diced its way to a strong box office performance. Blade softened the goth style of The Crow and made it sleek, with fitted leather armor and a killer electronica soundtrack. Blade's slick look, attitude, and sense of humor is something that continues to grow and evolve in the majority of mainstream comic book movies.

Number of Comic Book Feature Films: 22

The first X-Men movie was released in 2000, and Bryan Singer's origin story for Marvel's super-team was a wild success, breaking box office records like Burton's Batman 11 years earlier. It's here where I believe two things happened. Comic book movies as we knew them ended, and comic book movies as we will come to know them began.

X-Men ended the era of practical comic book movies, as in, practical effects. Blade used CG to accent practical effects, while X-Men was a mix of practical and CG. And that use of CG, plus the way Singer presented the material, evolved into Raimi's Spider-Man in 2002. Spidey, the next big hit, was a CG-heavy, joke-filled popcorn flick. Sound familiar? The borderline campy attitude of Sony's first Spider-Man created a new standard for comic book movies. Six years later, Marvel would begin its reign at the box office with a CG-heavy, joke-filled Iron Man who is arguably also an anti-hero.

Since 2000, 77 comic book movies have seen release! We don't need to get into the specifics because everyone knows what's come and what's to come. But here are the numbers.

Number of Comic Book Feature Films in the 2000s: 33

Number of Comic Book Feature Films in the 2010s: 44, so far

Like Moore's Law and transistors, the number of comic book movies we can fit into a year has increased. It's leveled off some, but continues to grow, and the comic book movie trend sees no end in sight. Now consider that we've only talked about American comic book movies. Ghost in the Shell, a Japanese manga (aka comic book), and Valerian, a French comic book, are on the way to the big screen. Oh, also don't forget that there's TV, but that's another article for another time. Moore's Law will hold steady for technology. Maybe for comic book movies we can call it Lee's Law.


Unwinding Moore’s Law from Genomics with Co-Design – The Next Platform

February 8, 2017 Nicole Hemsoth

More than almost any other market or research segment, genomics is vastly outpacing Moore's Law.

The continued march of new sequencing and other instruments has created a flood of data and development of the DNA analysis software stack has created a tsunami. For some, high performance genomic research can only move at the pace of innovation with custom hardware and software, co-designed and tuned for the task.

We have described efforts to build custom ASICs for sequence alignment, as well as using reprogrammable hardware for genomics research, but for centers that have defined workloads and are limited by performance constraints (with an eye on energy efficiency), the push is still on to find novel architectures to fit the bill. In most cases, efforts are focused on one aspect of DNA analysis. For instance, de novo assembly exclusively. Having hardware that is tuned (and tunable) that can match the needs of multiple genomics workloads (whole genome alignments, homology searches, etc.) is ideal.

With these requirements in mind, a research team at Stanford, led by computing pioneer Bill Dally, has taken aim at both the hardware and software inefficiencies inherent to genomics via the creation of a new hardware acceleration framework that they say can offer 125X and 15.6X speedups over state-of-the-art software counterparts for reference-guided and de novo assembly of third-generation (long) sequencing reads, respectively. The team also reports significant efficiency improvements on pairwise sequence alignments (39,000X more energy efficient than software alone).

"Over 1,300 CPU hours are required to align reads from a 54X coverage of the human genome to a reference and over 15,600 CPU hours to assemble the reads de novo. Today, it is possible to sequence genomes on rack-size, high-throughput machines at nearly 50 human genomes per day, or on portable USB-stick size sequencers that require several days per human genome."
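Those two figures together give a rough sense of the compute gap. As a back-of-the-envelope sketch (the numbers come from the quote above; perfect parallel scaling is assumed, which real pipelines will not achieve), keeping pace with a 50-genome-per-day sequencer using software-only alignment looks something like this:

```python
# Figures taken from the quote above; the arithmetic is illustrative only and
# assumes perfect parallel scaling across cores.
cpu_hours_per_genome = 1_300
genomes_per_day = 50

cpu_hours_per_day = cpu_hours_per_genome * genomes_per_day   # 65,000 CPU hours of work per day
cores_needed = cpu_hours_per_day / 24                        # cores that must run flat out, 24/7
print(f"~{cores_needed:,.0f} cores running continuously")    # ~2,708 cores
```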

The Stanford-based hardware-accelerated framework for genomic analysis, called Darwin, has several elements that go far beyond the creation or configuring of custom or reprogrammable hardware. At the heart of the effort is Genome Alignment using Constant Memory Trace-back (GACT), an algorithm focused on long reads (more data- and compute-intensive to handle, but providing more comprehensive results) that uses constant memory to make the compute-heavy part of the workload more efficient.

The use of this algorithmic approach has a profound hardware design implication, the team explains, because all previous hardware accelerators for genomic sequence alignment have either assumed an upper bound on the length of the sequences they align or left the trace-back step in alignment to software, thus undermining the benefits of hardware acceleration. Also critical to the effort is a filtering algorithm that cuts down on the search space for dynamic programming, called D-SOFT, which can be tuned for sensitivity.
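To make the constant-memory idea more concrete, here is a heavily simplified sketch of the tile-and-trace-back pattern described above: the dynamic-programming matrix and the trace-back are confined to a fixed-size tile, so memory stays bounded no matter how long the read is. This is an illustration of the general idea only, written in Python; it is not the Darwin team's GACT algorithm (among other simplifications, it ignores D-SOFT filtering and the tile overlap GACT uses to preserve alignment quality across tile boundaries).

```python
import numpy as np

def align_tile(a, b, match=1, mismatch=-1, gap=-1):
    """Global alignment of two short tiles (Needleman-Wunsch style).
    Memory is O(tile * tile), independent of the full read length."""
    n, m = len(a), len(b)
    score = np.zeros((n + 1, m + 1), dtype=np.int32)
    score[:, 0] = np.arange(n + 1) * gap
    score[0, :] = np.arange(m + 1) * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            score[i, j] = max(score[i - 1, j - 1] + sub,
                              score[i - 1, j] + gap,
                              score[i, j - 1] + gap)
    # Trace back within the tile (the step earlier accelerators left to software).
    i, j, ops = n, m, []
    while i > 0 or j > 0:
        sub = match if (i > 0 and j > 0 and a[i - 1] == b[j - 1]) else mismatch
        if i > 0 and j > 0 and score[i, j] == score[i - 1, j - 1] + sub:
            ops.append("M"); i -= 1; j -= 1
        elif i > 0 and score[i, j] == score[i - 1, j] + gap:
            ops.append("D"); i -= 1          # gap in b
        else:
            ops.append("I"); j -= 1          # gap in a
    return ops[::-1]

def tiled_align(read, ref, tile=32):
    """Walk two long sequences tile by tile; DP memory never exceeds tile x tile."""
    ops, i, j = [], 0, 0
    while i < len(read) and j < len(ref):
        tile_ops = align_tile(read[i:i + tile], ref[j:j + tile])
        ops.extend(tile_ops)
        i += sum(op in ("M", "D") for op in tile_ops)   # characters of read consumed
        j += sum(op in ("M", "I") for op in tile_ops)   # characters of ref consumed
    return ops

print("".join(tiled_align("ACGTACGTTTGA" * 8, "ACGTACGATTGA" * 8, tile=16)))
```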

To put this in context, keep in mind that long sequence reads improve the quality of genome assembly and can be very useful in personalized medicine because they make it possible to identify variances and mutations. However, this capability comes at a price: the team notes that mean error rates can be as high as 40% in some cases, and while this error can be corrected, it takes time to do so, thus cutting down on the performance and efficiency of the process. The tunable nature of Darwin helps correct for this and is fitted to the hardware to deliver more accuracy, faster, and with less power consumption.

Layout of one of the GACT processing elements. A 64 processing element array (minus the TB of memory) requires 0.27 square mm of area, with additional space for control, trace-back logic, and storage blocks. A single GACT array consumes 137mW of power.

On the hardware side, the team has already fully prototyped the concept on FPGA and performed ASIC synthesis for the GACT framework on a 45nm TSMC process. In that prototyping effort, they found pairwise alignment for sequences had a 763X speedup over software-only approaches and was over 39,000X more energy efficient. The parameters of D-SOFT can be set to make it very specific, even for noisy sequences, at high sensitivity, and the hardware acceleration of GACT results in a 762X speedup over software.

Although D-SOFT is one of the critical elements that creates the tunability required for both accuracy and efficiency, it is also the bottleneck in the hardware/software design, eating up 80% of the overall runtime. The problem is not memory capacity but access patterns, which the team expects to address by speeding up random memory access using an approach like eDRAM. Removing this barrier would allow the team to scale Darwin's performance. Unlike other custom designs, for once memory capacity is not a bottleneck, as the design uses only 120 MB for two arrays, which means far more can fit on a single chip.

"Darwin handles and provides high speedup versus hand-optimized software for two distinct applications: reference-guided and de novo assembly of reads, and can work with reads with very different error rates," the team concludes, noting that Darwin is the first hardware-accelerated framework to demonstrate speedup in more than one class of applications and that, in the future, it can extend to alignment applications even beyond read assembly.


Moore’s Law is running out but don’t panic – ComputerWeekly.com

Intel kicked off CES 2017 in Las Vegas with the declaration that Moore's Law is still relevant as it slated its first 10nm (nanometre) processor chips for release later this year.


Despite this, engineers are facing real issues in how to continue to push system performance to cope with the growing demands of new and emerging datacentre workloads.

This isn't the first time the end of Moore's Law has been proclaimed, but Intel and other chip makers have so far found new tricks for shrinking transistors to meet the goal of doubling density every two years, with a knock-on boost for compute performance.

Intel chief executive Brian Krzanich said at CES: "I've been in this industry for 34 years and I've heard the death of Moore's Law more times than anything else in my career. And I'm here today to really show you and tell you that Moore's Law is alive and well and flourishing. I believe Moore's Law will be alive well beyond my career, alive and well and kicking."

Yet the pace is slowing as Intel works at developing 7nm and 5nm technologies to follow on from 10nm. The introduction of 10nm itself has already been delayed by a year because of difficulties with the manufacturing process, and these difficulties are likely to increase as the size approaches physical limits on how small the on-chip circuitry can be made.

"I can't see them getting much beyond 5nm, and Moore's Law will then run out because we will have reached the end of the silicon era," says Ovum principal analyst Roy Illsley. Some industry observers think this will happen in the next 10 years or so.

As to what will ultimately replace silicon, such as optical processing or quantum computing, there appears no consensus so far. However, this does not mean that compute power will cease to expand, as both hardware and software in the datacentre have evolved since the days of single-chip servers and monolithic applications.

"The way apps are written has changed," says Illsley. "They are now distributed and scalable, so Moore's Law is a rather pointless metric for what a computer can do, anyway."

In fact, the industry hit a similar crisis some time ago, when Intel discovered that its single-core chips simply overheated when ever-increasing clock speeds started to approach 4GHz. The solution then was to change tack and deliver greater processing power by using the extra transistors to put multiple processor cores onto the same chip, and comparable architectural shifts will enable the industry to continue to boost processing power.

Such an approach can be seen in the growing interest in complementing conventional central processing units (CPUs) with specialised accelerators that may be better suited to handling specific tasks or workloads. A good example of this is the graphics processing unit (GPU), which has long been used to accelerate 3D graphics, but which has also found its way into high-performance compute (HPC) clusters thanks to the massively parallel architecture of a GPU which makes it excellent for performing complex calculations on large datasets.

In 2016, Nvidia launched its DGX-1 server, which sports eight of its latest Tesla GPUs with 16GB of memory apiece and is aimed at applications involving deep learning and artificial intelligence (AI) accelerated analytics. "Nvidia's system can do what would have taken a whole datacentre of servers a few years ago, at a pretty competitive price," says Illsley.

Another example is the field programmable gate array (FPGA), which is essentially a chip full of logic blocks that can be configured to perform specific functions. It provides a hardware circuit that can perform those functions much faster than can be done in software, but which can be reconfigured under software control, if necessary.
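As a loose software analogy for what those configurable logic blocks do, each one behaves like a small lookup table whose contents are set at configuration time, so the same fabric can be repurposed for different functions. The sketch below is purely illustrative and has nothing to do with any vendor's actual FPGA toolchain:

```python
# A 2-input lookup table (LUT), the kind of building block the paragraph above
# alludes to. "Configuring" the block means loading a truth table; no new
# hardware is needed to change its function. Illustrative sketch only.
class Lut2:
    def __init__(self, truth_table):
        assert len(truth_table) == 4   # one output bit per input combination
        self.table = truth_table

    def __call__(self, a: int, b: int) -> int:
        return self.table[(a << 1) | b]

and_gate = Lut2([0, 0, 0, 1])   # configured as AND
xor_gate = Lut2([0, 1, 1, 0])   # the same block type, reconfigured as XOR

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={and_gate(a, b)}  XOR={xor_gate(a, b)}")
```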

One notable adopter of FPGAs is Microsoft, which uses the technology in its Azure datacentre servers to speed up Bing searches and accelerate software-defined networking (SDN).

Intel is also working on integrating FPGA circuitry into some of its Xeon server chips, which could lead to broader adoption. In 2016, the firm showed off a Xeon coupled with a discrete FPGA inside a chip package, but its goal is to get both onto a single piece of silicon.

Meanwhile, Intel prefers to push its Xeon Phi platform rather than GPU acceleration for demanding workloads. These many integrated core chips combine a large number of CPU cores (up to 72 in the latest Knights Landing silicon) which are essentially x86 cores with 512-bit vector processing extensions, so they can run much of the same code as a standard Intel processor.

However, one issue with having so many cores on one chip is getting access to data in system memory for all those cores. Intel has addressed this by integrating 16GB of high-speed memory inside each Xeon Phi chip package, close to the CPU cores.

HPE has shown a different approach with The Machine, its experimental prototype for a next-generation architecture. This has been described as memory-driven computing, and is based around the notion of a massive, global memory pool that is shared between all the processors in a system, enabling large datasets to be processed in memory.

A working version, demonstrated at HPE Discover in December 2016, saw each processor directly controlling eight dual inline memory modules (DIMMs) as a local memory pool, with a much larger global pool of memory comprising clusters of eight DIMMs connected via a memory fabric interface that also links to the processors. In the demo, all the memory was standard DRAM, but HPE intended The Machine to have a non-volatile global memory pool.

In fact, focusing on processors overlooks the fact that memory and storage are a bigger brake on performance, as even flash-based storage takes several microseconds to read a block of data, during which time the processor may execute millions of instructions. So anything that can speed up memory and storage access will deliver a welcome boost to system performance, and a number of technologies are being developed, such as Intel and Micron's 3D XPoint or IBM's phase-change memory, which promise to be faster than flash memory, although their cost is likely to see them used at first as a cache for a larger pool of slower storage.
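The scale of that gap is easy to sanity-check. The figures below are illustrative assumptions rather than vendor specifications, but they show how a many-core server can plausibly retire millions of instructions in the time one flash read takes:

```python
# Rough, assumed figures; none of these numbers come from the article.
flash_read_latency_s = 10e-6      # ~10 microseconds for a flash block read
cores = 24                        # a mid-range server socket
clock_hz = 3.0e9                  # 3 GHz
instructions_per_cycle = 2        # a conservative sustained IPC per core

instructions_while_waiting = flash_read_latency_s * cores * clock_hz * instructions_per_cycle
print(f"{instructions_while_waiting:,.0f} instructions")   # ~1.4 million
```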

Such memory technologies are being developed alongside new I/O interfaces that aim to make it quicker and easier to move data between memory and the processor or accelerator. Examples include Nvidia's NVLink 2.0 for accelerators and the Gen-Z standard, which aims to deliver a high-speed fabric for connecting both memory and new storage-class memory technologies.

One thing Illsley thinks we may see in the future is systems that are optimised for specific workloads. Currently, virtually all computers are general-purpose designs that perform different tasks by running the appropriate software. But some tasks may call for a more specialised application-specific architecture to deliver the required performance, especially if AI approaches such as deep learning become more prevalent.

Moore's Law, which started out as an observation and prediction on the exponential growth of transistors in integrated circuits by Intel co-founder Gordon Moore, has lasted five decades. We may be reaching the point where it no longer holds true for silicon chips, but whatever happens, engineers will ensure that compute power continues to expand to meet the demands thrown at it.


Call for Papers: Workshop on HPC in a post Moore’s Law World – insideHPC

The Workshop on HPC in a Post-Moore's Law World has issued its call for papers. Held in conjunction with ISC 2017, the all-day workshop takes place June 22 in Frankfurt, Germany.

The impending end of traditional MOSFET scaling has sparked research into preserving HPC performance improvements through alternative computational models. To better shape our strategy, we need to understand where each technology is headed and where it will be in a span of 20 years. This workshop brings together experts who develop or use promising technologies to present the state of their work, and spark a discussion on the promise and detriments of each approach. This includes technologies that adhere to the traditional digital computational model, as well as new models such as neuromorphic and quantum computing models. As part of the workshop, we are accepting paper submissions. Papers will be published in Springer's Lecture Notes in Computer Science (LNCS) series. You can find the call for papers with detailed instructions and a link to the submission site here. We will also hold short panels and keynote presentations from experts in the field.

In scope for this workshop are all topics relevant to improving performance for HPC applications after MOSFET scaling (currently driven by Moore's Law) stops.

Submissions are due March 6, 2017.


Moore’s Law is dead, long live Moore’s Law – ExtremeTech

Moore's Law turns 50 this coming week, making this an opportune time to revisit Gordon Moore's classic prediction, its elevation to near-divine pronouncement over the last 50 years, and the question of what, if anything, Moore's Law can teach us about the future of computing. My colleague David Cardinal has already discussed the law itself, as well as the early evolution of the integrated circuit. To get a sense of where Moore's Law might evolve in the future, we sat down with lithographer, instructor, and gentleman scientist Dr. Christopher Mack. It might seem odd to talk about the future of Moore's Law with a scientist who half-jokingly toasted its death just a year ago, but one of the hallmarks of the Law is the way it's been reinvented several times over the past fifty years.

IBM's System/360. Photo courtesy of Wikipedia.

In a recent article, Dr. Mack argues that what we call Moore's Law is actually at least three different laws. In the first era, dubbed Moore's Law 1.0, the focus was on scaling up the number of components on a single chip. One simple example can be found in the evolution of the microprocessor itself. In the early 1980s, the vast majority of CPUs could only perform integer math on-die. If you wanted to perform floating point calculations (meaning calculations done using a decimal point), you had to buy a standalone floating point unit with its own pinout and motherboard socket (on compatible motherboards).

Some of you may also recall that in the early days of CPU cache, the cache in question was mounted on the motherboard (and sometimes upgradeable), not integrated into the CPU die. The term front-side bus (which ran from the northbridge controller to main memory and various peripherals) was originally contrasted with the back-side bus, which ran from the CPU to the CPU cache. The integration of these components on-die didn't always cut costs; sometimes the final product was actually more expensive, but it vastly improved performance.

Digital's VAX 11/780. In many ways, the consummate CISC machine.

Moore's Law 2.0 really came into its own in the mid-1990s. Moore's Law always had a quieter partner, known as Dennard Scaling. Dennard Scaling stated that as transistors became smaller, their power density remained constant, meaning that smaller transistors required less voltage and lower current. If Moore's Law stated that we would be able to pack more transistors into the same area, Dennard Scaling ensured that those transistors would be cooler and draw less power. It was Dennard Scaling that broke in 2005, as Intel, AMD, and most other vendors turned away from emphasizing clock-based scaling in favor of adding more CPU cores and improving single-threaded CPU performance by other means.
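For readers who want the relations behind that statement, the classic Dennard scaling rules can be written out in a few lines. This is the textbook idealization, not data for any particular process node:

```python
# Idealized Dennard scaling: shrink linear dimensions by a factor k, and voltage,
# current and delay shrink by k as well, so power density stays roughly constant.
k = 1.4   # one "full node" shrink (linear dimensions reduced ~1.4x)

area_per_transistor = 1 / k**2        # transistors per unit area rise by k^2
voltage = 1 / k                       # supply voltage scales down by k
current = 1 / k                       # drive current scales down by k
delay = 1 / k                         # gate delay drops, so frequency can rise by k
power_per_transistor = voltage * current                      # ~1/k^2
power_density = power_per_transistor / area_per_transistor    # ~1.0, i.e. constant

print(f"power per transistor: {power_per_transistor:.2f}  "
      f"power density: {power_density:.2f} (unchanged)")
```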

From 2005 through 2014, Moore's Law continued, but the emphasis was on improving cost by driving down the expense of each additional transistor. Those transistors might not run more quickly than their predecessors, but they were often more power-efficient and less expensive to build. As Dr. Mack points out, much of this improvement was driven by developments in lithography tools. As silicon wafer yields soared and manufacturing output surged, the total cost of manufacturing (per transistor) fell, while the total cost per square millimeter fell slowly or stayed about the same.

Moore's Law scaling through the classic era.

Moore's Law 3.0, then, is far more diverse and involves integrating functions and capabilities that haven't historically been seen as part of CPU functions at all. Intel's on-die voltage regulator, or the further integration of power circuitry to better improve CPU idle and load characteristics, could be thought of as one application of Moore's Law 3.0, along with some of Nvidia's deep learning functions, or its push to move camera processing technology over to the same core silicon that powers other areas of the chip.

Dr. Mack points to ideas like nanorelays: tiny moving switches that may not flip as quickly as digital logic, but don't leak power at all once flipped. Whether such technologies will be integrated into future chip designs is anyone's guess, and the research being poured into them is more uncertain. It's entirely possible that a company might spend millions trying to better implement a design in digital logic, or adapt principles of semiconductors to other types of chip design, only to find the final product is just incrementally better than the previous part.

There's an argument against this shift in usage that goes something like this: Moore's Law, divorced from Gordon Moore's actual words, isn't Moore's Law at all. Changing the definition of Moore's Law changes it from a trustworthy scientific statement into a mealy-mouthed marketing term. Such criticisms aren't without merit. Like clock speed, core counts, transistor densities, and benchmark results, Moore's Law, in any form, is subject to distortion. I'm sympathetic to this argument; when I've called Moore's Law dead in the past, I've been referring to it.

One criticism of this perspective, however, is that the extra layers of fudge were added a long time ago. Gordon Moore's original paper wasn't published in The New York Times for public consumption; it was a technical document meant to predict the long-term trend of observed phenomena. Modern foundries remain focused on improving density and cutting the cost per transistor (as much as is possible). But the meaning of Moore's Law quickly shifted from a simple statement about costs and density trend lines and was presented as an overarching trend that governed nearly every aspect of computing.

Even this overarching trend began to change in 2005, without any undue help from marketing departments. At first, both Intel and AMD focused on adding more cores, but this required additional support from software vendors and performance tools. More recently, both companies have focused on improving power efficiency and cutting idle power to better fit into mobile power envelopes. Intel and AMD have done amazing work pulling down idle power consumption at the platform level, but full-load CPU power consumption has fallen much more slowly and maximum CPU temperatures have skyrocketed. We now tolerate full-load temperatures of 80-95C, compared to max temperatures of 60-70C less than a decade ago. CPU manufacturers and foundries deserve credit for building chips that can tolerate these higher temperatures, but those changes were made because the Dennard Scaling that underlay what Dr. Mack calls Moore's Law 2.0 had already failed.

Transistor scaling continued long after IPC and clock speed had essentially flatlined.

Even an engineering-minded person can appreciate that each shift in the definition of Moore's Law accompanied a profound shift in the nature of cutting-edge compute capability. Moore's Law 1.0 gave us the mainframe and the minicomputer. Moore's Law 2.0's emphasis on per-transistor performance and cost scaling ushered in the era of the microcomputer in both its desktop and laptop incarnations. Moore's Law 3.0, with its focus on platform-level costs and total system integration, has given us the smartphone, the tablet, and the nascent wearables industry.

Twenty years ago, the pace of Moore's Law stood for faster transistors and higher clock speeds. Now it serves as shorthand for better battery life, higher boost frequencies, quicker returns to idle (0W is, in some sense, the new 1GHz), sharper screens, thinner form factors, and, yes, higher overall performance in some cases, albeit not as quickly as most of us would like. It endures as a concept because it stands for something much larger than the performance of a transistor or the electrical characteristics of a gate.

After 50 years, Moore's Law has become cultural shorthand for innovation itself. When Intel, or Nvidia, or Samsung refer to Moore's Law in this context, they're referring to the continuous application of decades of knowledge and ingenuity across hundreds of products. It's a way of acknowledging the tremendous collaboration that continues to occur from the fab line to the living room, the result of painstaking research aimed at bringing a platform's capabilities a little more in line with what users want. Is that marketing? You bet. But it's not just marketing.

Moore's Law is dead. Long live Moore's Law.


02002-02052 (50 years): Moore’s Law, which has defined a …

Moore's Law, Gordon Moore's visionary prediction of continued exponential growth in semiconductor performance, has provided the engine for innovation: the constantly increasing (and accelerating) power and resources, at continually decreasing costs, provided by technology.

Moore admits that Moore's Law has turned out to be more accurate, longer lasting and deeper in impact than he ever imagined. In fact, it has been Intel engineers, frustrated by an inability to see clearly more than 8 to 10 years into the future of their own technology, who have been the most conservative in estimating the lifespan of Moore's Law, partly because they have been the most conservative in defining Moore's Law. They continue to focus on increasing the transistor count on silicon as the main driver of Moore's Law and thus announce that Moore's Law may slow or even stop by the end of the next decade, as transistors approach sub-atomic sizes.

Moore's Law, however, was never a physical law. It began as an observation, that became a prediction, that has now been dismissed as a "self-fulfilling prophecy".

However you choose to describe it, Moore's Law has always functioned as an expression of breathtaking (almost rash) optimism and as a pacesetting mechanism, informed by scientific observation, commercial competitiveness and human ingenuity, that we can and should have the ability to improve our power to provide capability and opportunity for humankind, continually and exponentially, thus continuing to provide better, more efficient and less costly technologies.

This continued (and in fact unstoppable) flow of increased performance, power and new value has transformed vast

The world has broadened its definition of Moore's Law as our understanding of physics, materials and complexity deepens and becomes more intimate. Recently Intel suggested that an "Expanded Moore's Law" is no longer driven solely by transistor count but by the combination of three factors. The first is the traditional increase in the count of components we can put on a chip. The second is increasing the complexity of the components we can put on a chip. The third is increasing the convergence of technologies we implement on a chip.

Intel and its competitors continue to leverage and balance these factors as needed to continue producing the by-now-expected-and-required doubling of performance with every new generation of technology.

(Those who go back and read Moore's original article, which appeared in the April 1965 issue of Electronics magazine, will notice that Moore always used the word components, and even today tends to talk about increasing the complexity of components, rather than focusing solely on the number of transistors on a chip.)

At a certain point, you can choose to define a chip as a network all on its own, and as such subject to Metcalfe's Law. Metcalfe's Law may in fact prove to be one of the most important enablers of the continued growth of semiconductor performance. (I use the term M², Moore times Metcalfe, to represent this additional factor.)

Many scientists, including those who attended a recent science summit at DARPA, believe the exponential increase in benefits defined by Moore's Law will neither cease nor slow in the foreseeable future.

The source of those benefits may change, but the value of Moore's Law has now, as Moore originally hoped when he first made his famous observation, begun an unstoppable expansion beyond traditional computational spaces. That expansion will eventually bring new capabilities, as well as increased performance, lower cost, and greater connectivity, to virtually every traditional device and service, and eventually the universal availability of transformative improvements.

It is Moore's Law (arguably in combination with Metcalfe's Law) which is helping us invent and extend our future. We need it to keep going. And for the reasons described above, I believe it will -- certainly for the next five decades. This is the basis and the passion behind my bet.

See the rest here:

02002-02052 (50 years): Moore's Law, which has defined a ...

What is Moore’s Law? – ExtremeTech

If you've been around the internet for longer than Jayden Smith, you're probably familiar with Moore's Law. It's often misquoted and often misunderstood, but its law status is rarely questioned. The most general possible way to state Moore's Law is this: computing power tends to approximately double every two years. It gained notoriety because people like laws that let them predict the future of one of the world's biggest industries, but the physical basis for this principle means it is slightly different, and less reliable, than many people believe.

Though he did not give it that name, Moore's Law was first proposed in a magazine article by Intel co-founder Gordon E. Moore. What it actually says is that the number of transistors that can be packed into a given unit of space will roughly double every two years. That prediction has remained impressively true, a fact that has enabled everything from pocket-sized smartphones to Crysis 3, and the continuing computerization of the economy.

Moore's Law scaling

Yet, stated as a prediction about human abilities in physical manufacturing, and divorced from rather airy ideas like computing power, it becomes clear why Moore's Law won't necessarily always hold true. Remember that when Moore made his original prediction, he predicted a doubling every year, but he quickly amended this to every two years. Physical limitations on the manufacturing of these chips could easily push that number back to five years or more, effectively invalidating Moore's Law forever, and revealing it to be nothing more than Moore's Very Good But Ultimately Limited Prediction (MVGBULP).

Gordon Moore, co-founder of Intel.

Today, all consumer processors are made out of silicon, the second most abundant element in the Earth's crust after oxygen. But silicon is not a perfect conductor, and limits to the mobility of the electrons it carries impose a hard limit on how densely you can pack silicon transistors. Not only does power consumption become a huge issue, but an effect called quantum tunneling can cause problems for keeping electrons contained beyond a certain thickness threshold.

Outside of research facilities, silicon transistors don't currently get smaller than 14 nanometers, and while some 10 nanometer chip designs might someday reach the market, it's seen as a foregone conclusion that to keep to Moore's Law over a long period of time, we'll have to come up with newer and better materials as the basis of next-generation computers.

One oft-cited example is graphene, or the rolled-up tubes of graphene called carbon nanotubes. Graphene is atomically thin, often called two-dimensional, and so it allows a huge improvement on the physical side of things. On the other hand, graphene does not have a useful bandgap, the energy difference we need to navigate to bump electrons back and forth between the conducting and non-conducting bands. That is how silicon transistors switch on and off, which is the entire basis for their method of computation.

If this problem can't be offset in some way, a graphene computer would have to pioneer a whole new logical method for computing. One graphene computer chip from IBM proved to be incredibly fast, 10,000 times faster than a silicon chip, but it was not a general-purpose processor. Since graphene can't be easily switched on and off in mass quantities, we can't simply swap in graphene for silicon and keep on with modern chip architectures.

Sebastian Anthony holding a wafer of graphene chips at IBM Research.

Other materials may offer more practical reductions in size and electrical resistance, and actually allow Moore's Law to continue unbroken, but only if they hit the market quickly enough. Silicon-germanium, or just germanium alone, has been talked about for some time, but has yet to really materialize in any sort of affordable form. It was recently discovered that a material called titanium trisulfide can provide many of the same physical advantages as graphene, and do so with an achievable bandgap; such a super-material might be what's needed, but graphene-like problems with manufacturing then rear their ugly heads.

Quantum computing could be another answer, but research is still so preliminary that it's doubtful. Some believe quantum computers will offer such a huge and immediate upgrade over modern processors that computer encryption will come tumbling down. However, quantum computing won't necessarily come in the form of a programmable digital computer right away; early quantum computers won't be able to run Windows, even if they are more than fast enough in a theoretical sense. Of all the possible solutions to looming problems with Moore's Law, quantum computing is probably the least realistic. It has a lot of potential for specific applications, but quantum PCs are still too far out to be worth considering.

Moore himself admitted in a 2005 interview that his Law can't continue forever. It's the nature of exponential functions, he said: they eventually hit a wall. Unbounded exponential growth makes perfect sense in the purely hypothetical world of mathematics, but it tends not to work out as well in the real world. It could be that Moore's Law will hold up when viewed on the century scale, zoomed out to diminish the importance of any small fluctuations between new technologies. But the fact remains that right now, we're entering a lull as we wait for the next great processing tech to arrive.

Check out our ExtremeTech Explains series for more in-depth coverage.

Here is the original post:

What is Moore's Law? - ExtremeTech

Moore’s law – Simple English Wikipedia, the free encyclopedia

Moore's law is that the number of transistors on integrated circuits doubles about every two years. The period often quoted as "18 months" is due to Intel executive David House, who predicted that period for a doubling in chip performance (being a combination of the effect of more transistors and their being faster).[1]
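The 24-month versus 18-month distinction above has a simple arithmetic reading. As a rough sketch (an illustration, not David House's actual calculation): if chip performance is modeled as transistor count times per-transistor speed, and each factor doubles on its own schedule, the combined doubling period comes out shorter than either one.

# Back-of-the-envelope sketch of the 18-month performance figure quoted above.
# Assumption (illustrative only): performance ~ transistor count x per-transistor
# speed, with each factor doubling on its own schedule.

def combined_doubling_period(count_period_months, speed_period_months):
    # Doubling period of the product of two exponentially growing quantities.
    return 1.0 / (1.0 / count_period_months + 1.0 / speed_period_months)

# A 24-month doubling in transistor count combined with a (hypothetical)
# 72-month doubling in per-transistor speed yields the oft-quoted 18 months.
print(combined_doubling_period(24, 72))  # -> 18.0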

The law is named after Intel co-founder Gordon Moore, who described the trend in his 1965 paper.[2][3][4] The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965 and predicted that the trend would continue "for at least ten years".[2] His prediction has proved very accurate. The law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development.[5]

The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras.[6] All of these are improving at (roughly) exponential rates as well.

This exponential improvement has greatly increased the effect of digital electronics in the world economy.[7] Moore's law describes a driving force of technological and social change in the late 20th and early 21st centuries.[8][9]

This trend has continued for more than half a century. Sources in 2005 expected it to continue until at least 2015 or 2020.[2][10] However, the 2010 update had growth slowing at the end of 2013,[11] after which transistor counts and densities were expected to double every three years.

Originally posted here:

Moore's law - Simple English Wikipedia, the free encyclopedia

Moores lag – Wikipedia

Moore's law ("Moores lag"), named after Intel co-founder Gordon E. Moore, refers to the phenomenon that the number of transistors that fit on a chip grows exponentially. The rate that has held for many years is a doubling every 24 months. Moore's law is often quoted as a doubling every 18 months, but according to Moore that is not correct.[1] Moore's law has proved accurate ever since it was formulated in 1965, albeit with the occasional adjustment of the doubling time.

In the 1980s the law was interpreted as the doubling of the number of transistors per chip, but the interpretation has shifted over time. In the early 1990s it was taken to mean the doubling of microprocessor power, and later in that decade the doubling of computing power per fixed cost.

Moore first described the law as a doubling after only one year, which he later revised to two years. He himself never claimed it would be every 18 months; that figure was added afterwards because it turned out to lie closer to reality. Nor was Moore the first to notice the idea; it was already known to those working in the field. It also took roughly a decade before the law acquired the name "Moore's law". Nowadays the term is applied to almost anything that changes exponentially.

Moore's law has been of great importance to the computer industry, which to a large extent lives on the fact that last year's models must be replaced once the computer's CPU has become outdated according to Moore's law. The law is also used when developing, for example, games, where one needs to know how powerful the machines on the market will be when the game is released.

Moore's law applies only to semiconductors. An extension of Moore's law covering all information-technology development was proposed in 2001 by Ray Kurzweil. That law is known as the law of accelerating returns.

In 2015, industry insiders announced that the industry will soon be forced to abandon Moore's law.

See more here:

Moores lag - Wikipedia

Moore’s Law – Cymer, Inc.

The Driving Force in the Semiconductor Industry

"Moore's Law" is well-known and widely used in the semiconductor industry term to describe the advancement in semiconductor device technology. First observed by Intel Corporation co-founder and former chairman Gordon E. Moore in 1965, the empirical theory predicts that the transistor density onintegrated circuits (ICs) increases exponentially, doubling approximately every two years with proportionate decreases in cost. This prediction has held true since then andis a driving force oftechnology advancements worldwide.

To continue to meet Moore's Law, the length and width of a transistor must shrink about 30% every 18 months. The ability to pattern smaller circuits depends on the wavelength of the light used in the photolithography process. A shorter wavelength of light can image circuitry with smaller critical dimensions (CDs) and pitch, which in turn allows the transistors to be smaller and transistor density to increase.
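The 30% figure is easy to verify with a couple of lines of arithmetic. The snippet below is a minimal sketch of that scaling relationship, not Cymer's own model: scaling both the length and the width of a transistor to about 70% of their previous values roughly halves the transistor's area, so transistor density roughly doubles.

# Minimal check of the scaling arithmetic described above.

linear_shrink = 0.70                # each dimension scaled to ~70% of its previous size
area_ratio = linear_shrink ** 2     # ~0.49: each transistor occupies about half the area
density_gain = 1 / area_ratio       # ~2.04x as many transistors fit in the same area
print(f"area ratio {area_ratio:.2f}, density gain {density_gain:.2f}x")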

Since the introduction of its first Deep Ultraviolet (DUV) light source, Cymer has played a significant role in the advance of integrated circuit manufacturing. Cymer has worked to continuously improve light source performance, enabling the application of its light sources to pattern ever smaller circuitry. As lithography continues to extend Moore's Law, extreme ultraviolet (EUV) lithography will succeed double-patterning ArF immersion lithography, allowing the scaling of feature sizes and half-pitch to 22nm and beyond.

Further information on Moore's Law can be found on the Intel Corporation website.

View post:

Moore's Law - Cymer, Inc.

50 Years On, Moore’s Law Still Pushes Tech to Double Down


Gordon E. Moore. Chuck Nacke/Alamy

On April 19, 1965, the 36-year-old head of R&D at seminal Silicon Valley firm Fairchild Semiconductor published a prediction in a trade magazine, Electronics. The researcher claimed that the number of components, that is, transistors, on a single computer chip would continue to double every year, while the cost per chip would remain constant.

"Integrated circuits will lead to such wonders as home computers, or at least terminals connected to a central computer, automatic controls for automobiles, and personal portable communications equipment," that researcher, Gordon Moore, wrote.


At the time, Moore thought the prediction would hold true for a decade, taking chips from 60 components on a single silicon chip to 65,000 by 1975. That year, he revised his forecast down to a doubling every two years. Moore went on to cofound a little company called Intel, which would become the number one semiconductor company in the world. Today, fifty years later, the dictum now famously known as Moore's Law has withstood the test of time.

"In the beginning, it was just a way of chronicling the progress," Moore, now 86 years old, said in an interview posted by Intel. "But gradually, it became something that the various industry participants recognized as something they had to stay on or fall behind technologically."

Over the past five decades, the surge in computing power predicted by Moore's Law has paralleled the trajectory of innovation in Silicon Valley. Computers were once the size of a room. Now smartphones with more processing power than NASA imagined it would need to send a man to the moon can easily fit in your pocket. When Moore first made his prediction, transistors were about the size of the eraser at the end of a pencil. Now, six million can fit into the period at the end of this sentence. The consistency with which more powerful chips have confirmed Moore's Law has given companies the confidence to invest in the development of complementary technologies, from displays, sensors, and memory to digital imaging devices, software, and the internet. All the while, prices per unit of power keep falling.

But the reliability of Moore's Law has also shaped expectations. Today, consumers all but demand that their gadgets get faster, cheaper, and more compact in step with Moore's Law. It's both the imperative that propels tech companies forward and the standard by which they must abide in order to stay afloat in the industry.

What's more, that expectation now extends, fairly or not, beyond gadgets to new innovations in cloud computing, the internet, social media, search, streaming video, and more. According to Dan Hutcheson, head of chip market research outfit VLSI Research, the market value of the companies across the spectrum of technologies beholden to Moore's Law amounted to a whopping $13 trillion in 2014, one-fifth of the asset value of the world's economy.

As a result, Moore's Law also means companies are in constant competition with their own progress, says Steve Brown, a strategist with Intel. Lucky for them, Brown says, Moore's Law is not a fact of nature. It's more of an aspiration and a belief system, he says. It's that belief that drives technology companies to outdo themselves year after year, Brown says, a belief held by both themselves and their customers.

Beyond the advance of computing technology itself, the surge in computing power predicted by Moore's Law has led to Moore's Law-like transformations in other industries, including healthcare, pharmaceuticals, and genetics. Many drugs have been tested in the minds of computers, as Brown puts it. Computer software can analyze the human genome in minutes.

And it's these advances, Brown believes, that might be the most important of all. Ultimately, it won't be about making a better, faster smartphone, he says. We may eventually discover how to make more food, create better living conditions, and connect more people together. Moore's Law could be key to unlocking that.

Go here to read the rest:

50 Years On, Moore's Law Still Pushes Tech to Double Down

Moore’s Law at 50: The past and future | Computerworld

Intel co-founder Gordon Moore.

When you're strapping on the latest smart watch or ogling an iPhone, you probably aren't thinking of Moore's Law, which for 50 years has been used as a blueprint to make computers smaller, cheaper and faster.

Without Moore's Law it's quite possible that new types of computers like Microsoft's HoloLens, a holographic wearable with which users can interact with floating images, would not have been developed. For decades, Moore's Law has been a guiding star for the development of modern electronics, though in recent years its relevance has been subject to debate.

Moore's Law isn't a scientific theory, but a set of observations and predictions made by Intel co-founder Gordon Moore in an article [click here to download] first published in Electronics Magazine on April 19, 1965, which were subsequently modified. His core prediction states that the density of transistors, or the number of transistors on a given die area, would double every two years, which leads to double the performance. Loosely translated, that means in 18 to 24 months you could buy a computer that is significantly faster than what you have today with the same amount of money.

The tech industry originally interpreted this to mean that making chips would get cheaper with scaling: as transistor density doubles, chips shrink in size, processing speeds up, and the cost per processor declines. For the past five decades, the tech world has based product plans and manufacturing strategies around this concept, leading to smaller, cheaper and faster devices.

Manufacturing advances have also made chips power-efficient, helping squeeze more battery life out of devices.

Without Moore's Law, "I don't think we could have a smartphone in the palm of our hand," said Randhir Thakur, executive vice president and general manager of the Silicon Systems Group at Applied Materials.

But engineers have predicted that Moore's Law will die in the next decade because of physical and economic challenges. Conventional computers could be replaced by quantum computers and systems with brain-like, or neural, chips, which function differently than current processors. Silicon could also be replaced by chips made using new materials, such as graphene or carbon nanotubes.

Intel applied Moore's observations first to memory products, with the benefit being lower cost per bit. Then it applied Moore's Law to integrated circuits, and Intel's first chip in 1971, the 4004, had 2,300 transistors. Intel's latest chips have billions of transistors, are 3,500 times faster, and 90,000 times more power efficient.
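Those figures sit close to a straight two-year doubling curve. The snippet below is a rough, idealized projection from the 4004's 2,300 transistors, an illustration of the trend rather than a record of actual product counts.

# Idealized Moore's Law projection starting from the 2,300-transistor 4004 (1971),
# assuming a clean doubling every two years. Real products only loosely track this.

transistors_1971 = 2_300
for year in (1991, 2011, 2015):
    doublings = (year - 1971) / 2
    print(year, f"{transistors_1971 * 2 ** doublings:,.0f}")
# By the 2010s the idealized curve is already in the billions, consistent with
# the transistor counts of recent chips.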

Since then, Moore's Law has been flexible enough to adapt to changes in computing. It was the force behind supercharging computer performance in the 1990s, and lowering power consumption in the last decade, said Mark Bohr, senior fellow at Intel.

"The type of performance we had on desktops 15 years ago is matched by laptops and smartphones in our hand today," Bohr said.

Moore's Law is being used as a guiding principle in the development of wearables, Internet of Things devices and even robots that can recognize objects and make decisions. It also affects a diverse range of products such as cars, health devices and home appliances, which are relying more on integrated circuits for functionality, Bohr said.

Intel innovations in manufacturing, Moore's Law presentation.

But engineers agree that Moore's Law could be on its last legs as chips scale down to atomic scale, and even Intel is having a tough time keeping pace. Gordon Moore has revisited Moore's Law over the last 50 years and at multiple times expressed doubts about its longevity. In a recent interview with IEEE Spectrum, Moore said keeping up was getting "more and more difficult."

Intel's innovations have kept Moore's Law chugging along, with the most recent technology advance being FinFET, a 3D transistor structure in which the conducting channel is raised into a thin fin so that more capable transistors can be packed onto chips. Intel has spent billions of dollars establishing new factories, and innovations such as strained silicon, high-k metal gate and FinFET have helped give Moore's Law a long lease on life.

"Because Intel works hard on it, new, computing-hungry applications are emerging every day," said Xian-He Sun, distinguished professor of computer science at the Illinois Institute of Technology in Chicago.

But it is becoming difficult to etch an increasing number of features on ever-smaller chips, which are increasingly susceptible to a wide range of errors and defects. More attention is required in designing and making chips, and additional processes and personnel need to be put in place to prevent errors.

In addition, with research under way into new materials and technologies, silicon may be on its way out, a change that could fundamentally transform Moore's Law. There's a lot of interest in a family of so-called III-V materials -- compounds based on elements from the third and fifth columns of the periodic table -- such as gallium arsenide or indium gallium arsenide.

"Moore's Law is morphing into something that is about new materials," said Alex Lidow, a semiconductor industry veteran and CEO of Efficient Power Conversion (EPC).

EPC is making a possible silicon replacement, gallium nitride (GaN), which is a better conductor of electrons, giving it performance and power-efficiency advantages over silicon, Lidow said. GaN is already being used for power conversion and wireless communications, and could make its way to digital chips someday, though Lidow couldn't provide a timeline.

"For the first time in 60 years there are valid candidates where it's about superior material rather than smaller feature size," Lidow said.

The economics of manufacturing smaller and faster chips are also tumbling. It's getting more expensive to make advanced factories, and the returns on making those chips are diminishing. Important tools like EUV (extreme ultraviolet) lithography, which transfers circuit patterns onto substrates, would make it possible to shrink chips to even smaller sizes but aren't yet available.

"The semiconductor has always faced challenges, which have been speed bumps. Now we're going up against a wall," said Jim McGregor, principal analyst at Tirias Research.

Experts can't predict where Moore's Law will be years from now, but it will eventually fall as the physics and economics of making smaller chips no longer make practical sense. Nevertheless, the legacy of Moore's Law will live on as a model for bringing down the price of components, which leads to cheaper devices and computers, McGregor said.

Moore's 1965 article ushered in an era of ever-increasing technological change. "We've taken servers the size of a room down to a mobile chip. It's amazing what we've done in that period of time," McGregor said.

More here:

Moore's Law at 50: The past and future | Computerworld

Moore’s Law is the reason your iPhone is so thin and cheap …

An aerial view of Intel's Ronler Acres campus in Hillsboro, Ore., with D1X, center, the site's newest factory for developing cutting-edge chips. Ben Fox Rubin/CNET

To get a sense of what society owes to Moore's Law, just ask what the world would look like if Intel co-founder Gordon Moore never made his famous 1965 observation that the processing power of computers would increase exponentially.


"It is almost unimaginable," said Genevieve Bell, a cultural anthropologist for Intel.

"The implications would be so dramatic, I struggle to put it in words," said Adrian Valenzuela, marketing director for processors for Texas Instruments.

Jeff Bokor, a professor of electrical engineering and computer science at the University of California, Berkeley, found at least one: "Cataclysmic."

The comments aren't wild hyperbole; they underscore just how significant an impact one little observation has had on the world. Moore's Law is more than a guideline for computer processor, or chip, manufacturing. It's become a shorthand definition for innovation at regular intervals, and has become a self-fulfilling prophecy driving the tech industry.

Are you happy about your sleeker iPhone 6 or cheaper Chromebook? You can thank Moore's Law.

With Sunday marking the 50th anniversary of Moore's observation, we decided to take stock of Moore's Law. CNET staff reporter Ben Fox Rubin offers an in-depth look at the work that semiconductor manufacturers are putting in to make sure the rate of improvement is sustainable. Tomorrow, CNET senior reporter Stephen Shankland explores alternative technologies and the future of Moore's Law while senior reporter Shara Tibken looks at Samsung's lesser known presence in the field.

But first, let's explore the effect of Moore's Law throughout history -- and start by dispelling some misconceptions. Most importantly, Moore's Law is not actually a law like Isaac Newton's Three Laws of Motion. In a paper titled, "Cramming More Components onto Integrated Circuits," published by the trade journal Electronics in 1965, Moore, who studied chemistry and physics, predicted that the number of components in an integrated circuit -- the brains of a computer -- would double every year, boosting performance.

A decade later, he slowed his prediction to a doubling of components every two years.

It wasn't until Carver Mead, a professor at the California Institute of Technology who worked with Moore at the Institute of Electrical and Electronics Engineers, coined the term "Moore's Law" in 1975 that it gained widespread recognition in the tech world. It became a goal for an entire industry to aspire to -- and hit -- for five decades.

"[It's] a name that has stuck beyond anything that I think could have been anticipated," Moore, now 86, said in an interview with Intel earlier this year.


Moore's Law specifically refers to transistors, which switch electrical signals on and off so that devices can process information and perform tasks. They serve as the building blocks for the brains inside all our smartphones, tablets and digital gadgets.

The more transistors on a chip, the faster that chip processes information.

To keep Moore's Law going, chip manufacturers have to keep shrinking the size of the transistors so more can be placed together with each subsequent generation of the technology. The original size of a transistor was half an inch long. Today's newest chips contain transistors that are smaller than a virus, an almost unimaginably small scale. Chipmakers including Intel and Samsung are pushing to shrink them even more.

But size doesn't really matter when it comes to appreciating Moore's Law. More important is the broader idea that things get better -- smarter -- over time.

The law has resulted in dramatic increases in performance in smaller packages. The Texas Instruments processor that powers the navigation system in a modern Ford vehicle is nearly 1.8 million times more powerful than the Launch Vehicle Digital Computer that helped astronauts navigate their way to the moon in 1969.

The iPhone 6 in your pocket is more powerful than computers from a decade ago. CNET

And Apple's iPhone 6 is roughly 1 million times more powerful than an IBM computer from 1975 -- which took up an entire room -- according to a rough estimate by UC Berkeley's Bokor. The iPhone, priced starting at $650, is also a lot cheaper than a full-fledged desktop computer selling anywhere between $1,000 and $4,000 a decade ago -- and it can do so much more.
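That "roughly 1 million times" estimate lines up with plain Moore's Law arithmetic. The check below is a simplified model, not Bokor's actual calculation: forty years of two-year doublings compound to about a million-fold gain.

# Simplified check: 40 years of doublings every two years.

years = 2015 - 1975
doublings = years / 2      # 20 doublings
print(2 ** doublings)      # -> 1048576.0, i.e. about a million-fold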

Just as critical is the time element of Moore's Law: the doubling of transistors every two years meant the entire tech industry -- from consumer electronics manufacturers to companies that make the equipment to manufacture chips and everything in between -- had a consistent rate that everyone could work at.

"It created a metronome," Bell said. "It's given us this incredible notion of constant progress that is constantly changing."

It also set a pace that companies need to keep, or else get left behind, according to Moore. "Rather than become something that chronicled the progress of the industry, Moore's Law became something that drove it," Moore said in an online interview with semiconductor industry supplier ASML in December.

While he didn't think his observation would hold true forever, chipmakers don't seem to be slowing down their efforts. "It's a self-fulfilling prophecy, so to the industry it seems like a law," said Tsu-Jae King Liu, a professor of microelectronics at UC Berkeley.

Nowadays, everyone assumes technology will just get better, faster and cheaper. If we don't have a sophisticated enough processor to power a self-driving car now, a faster one will emerge in a year or two.

Remove Moore's Law, and that assumption no longer holds true. Without a unifying observation to propel the industry forward, the state of integrated circuits and components might be decades behind.

"It's an exponential curve, and we would be much earlier on that curve," Valenzuela said. "I'm happy to say I don't have to carry my 1980s Zack Morris phone."

Intel's Bell imagines a more "horrifying" world without integrated circuits, one in which everything is mechanized, and common tropes of technology such as smartphones and even modern telephone service wouldn't exist. "The Internet would have been impossible," she said.

It's not a completely implausible alternate reality. Bell noted that many industries haven't moved as quickly to embrace new technology and ideas. The internal combustion engine hasn't changed much since Henry Ford's Model T more than a century ago, and it's only in the last several years that automakers have embraced batteries that power the engine.

Speaking of batteries, there's a reason why our smartphones lose their juice faster and faster -- battery technology hasn't kept pace with the advancement of the processor and its capabilities.

"Not too many industries have a clearly defined expectation in improvement of capability and cost benefits over such a long time," said H.S. Philip Wong, an engineering professor at Stanford.

It's a lot easier to document the progress achieved through Moore's Law. Increasingly sophisticated chips have resulted in not just more powerful standalone devices, but an ecosystem of gadgets that can talk to each other.

As Bell said, there would be no Internet without Moore's Law, which means Google or Facebook would never have existed, and Netflix would still be mailing DVDs (VHS tapes?) to you.

"It's a technology that's been much more open-ended than I would have thought in 1965 or 1975," Moore said. "And it's not obvious yet when it will come to the end."

Intel's button-sized Curie processor for wearables wouldn't be possible without Moore's Law. James Martin/CNET

Smaller processors have driven interest in the Internet of Things (IoT), or the idea that physical objects around us can be connected to the Internet and to each other. TI's Valenzuela said he remembers selling basic thermostats using rudimentary chips. Now smart thermostats built by Google's Nest have a processor powerful enough to run a smartphone.

Intel demonstrated the potential for the IoT idea in January at the Consumer Electronics Show with Curie, a button-size module designed to power smart wearable devices with a low-power processor. It's the reason why we're talking about self-driving cars, smart transportation systems, smart homes, smart watches and even clothes equipped with Internet-connected sensors.

"It's really like the water that we drink and air that we breathe," Wong said about society's dependence on the innovations brought on by Moore's Law. "We can't survive without it."

Read the original post:

Moore's Law is the reason your iPhone is so thin and cheap ...

Moore’s Law and The Secret World Of Ones And Zeroes

SciShow explains how SciShow exists -- and everything else that's ever been made or used on a computer -- by exploring how transistors work together in circuits to make all computing possible. Like all kinds of science, it has its limitations, but also awesome possibilities.


Go here to see the original:

Moore's Law and The Secret World Of Ones And Zeroes

Louisiana Premises Liability Law – Irwin Fritchie Urquhart …

On August 4, 2006, Chinita Weber filed a lawsuit against Metropolitan Hospice alleging wrongful death and survival claims on behalf of her aunt, Mary London, who died at the facility in the days following Hurricane Katrina. The hurricane impacted the New Orleans area on August 29, 2005. Ms. Weber asserted that Metropolitan Hospice was negligent in causing her aunt's death for two reasons. First, the facility was negligent in failing to evacuate in advance of Hurricane Katrina. Second, the facility was negligent in failing to provide adequate backup electrical power, thereby subjecting her aunt to extreme heat and unsanitary conditions, which she claimed ultimately caused her aunt's death.

Metropolitan Hospice filed an exception of no right of action, arguing that the Louisiana statutes governing wrongful death and survival claims did not allow Ms. Weber the right to bring such claims on behalf of her aunt. Louisiana law permits only limited classes of beneficiaries to bring such claims, and a niece does not qualify as such a beneficiary. The trial court granted the exception, but allowed Ms. Weber thirty days to amend her petition to properly state a claim.

Ms. Weber had herself appointed as representative of her aunt's succession, and filed an amended petition asserting wrongful death and survival claims as her aunt's succession representative. Metropolitan Hospice responded by filing two exceptions: (1) an exception of no right of action, arguing that as succession representative, Ms. Weber had no right to assert a wrongful death claim, and (2) an exception of prescription, arguing that Ms. Weber's survival claim was not timely asserted. The trial court granted both motions, and Ms. Weber appealed.

On appeal, the appellate court affirmed in part and reversed in part the trial court's decision. With regard to the exception of no right of action, the appellate court affirmed the trial court's dismissal of Ms. Weber's wrongful death claim because Louisiana law does not give a succession representative the right to bring a wrongful death claim. Nevertheless, the appellate court noted that a succession representative does have the right to bring a survival claim on behalf of the deceased person. Thus, whether Ms. Weber could continue pursuing the survival claim hinged on whether the appellate court agreed that the survival claim was untimely.

Louisiana law requires that survival claims be filed within one year of the date of the decedent's death. While Ms. Weber undoubtedly filed her original 2006 lawsuit within one year of her aunt's death, the key issue was whether the filing of her amended petition in 2011 could relate back to the date that she filed her original lawsuit, August 4, 2006.

In accordance with Louisiana's relation-back doctrine, four factors determine whether an amended petition that either adds or substitutes a plaintiff can be treated as if it were filed on the date the original petition was filed: (1) the amended claim arises out of the same conduct, transaction, or occurrence as the original claim; (2) the defendant knew or should have known of the involvement of the new plaintiff; (3) the new and old plaintiffs are sufficiently related so that the new party is not entirely new or unrelated; and (4) the defendant is not prejudiced in preparing its defense. The appellate court determined that Ms. Weber's amended lawsuit met these requirements.

The court's analysis did not end there, however. If Ms. Weber's claims against Metropolitan Hospice could be considered medical malpractice claims rather than negligence claims, then her claims would still be untimely, since Louisiana law requires that medical malpractice claims be filed within three years of the date of the decedent's death, without exception. Relying on other Louisiana decisions involving similar Katrina-related claims, the appellate court determined that Ms. Weber's claims were not, in fact, medical malpractice claims. Accordingly, the court held that Ms. Weber's survival claims were timely because her amended complaint related back to the date that she filed her original lawsuit.

Take-Away: In cases where someone has died as a result of the alleged negligence of a premises owner, the owner may be sued for damages sustained by the decedent prior to his death and damages sustained by surviving family members as a result of their loss.

This article was co-authored by Lizzi Richard, an associate at Irwin Fritchie Urquhart & Moore LLC.

See the original post here:

Louisiana Premises Liability Law - Irwin Fritchie Urquhart ...