{"id":212505,"date":"2017-03-02T10:51:47","date_gmt":"2017-03-02T15:51:47","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/expanding-the-scope-of-verification-ee-journal.php"},"modified":"2017-03-02T10:51:47","modified_gmt":"2017-03-02T15:51:47","slug":"expanding-the-scope-of-verification-ee-journal","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/moores-law\/expanding-the-scope-of-verification-ee-journal.php","title":{"rendered":"Expanding the Scope of Verification &#8211; EE Journal"},"content":{"rendered":"<p>\n<p>March 1, 2017<\/p>\n<p>by Kevin Morris<\/p>\n<p>Looking at the agenda for the 2017 edition of the annual DVCon - arguably the industry's premier verification conference - one sees precisely what one would expect: tutorials, keynotes, and technical sessions focused on the latest trends and techniques in the ever-sobering challenge of functional verification in the face of the relentless advance of Moore's Law.<\/p>\n<p>For five decades now, our designs have approximately doubled in complexity every two years. Our brains, however, have not. Our human engineering noggins can still process just about the same amount of stuff that we could back when we left college, assuming we haven't let ourselves get too stale. That means that the gap between what we as engineers can understand and what we can design has been growing at an exponential rate for over fifty years. This gap has always presented the primary challenge for verification engineers and verification technology. Thirty years ago, we needed to verify that a few thousand transistors were toggling the right ways at the right times. Today, that number is in the billions. To span that complexity gap, we need significant leverage.<\/p>\n<p>The fundamentals of verification have persisted.
Logic simulation has always been a mainstay, processing vectors of stimuli and expected results as quickly and accurately as possible - showing us where our logic or timing has gone awry. Along the way, we started to pick up formal methods - giving us a way to prove that our functionality is correct, rather than trying to exhaustively simulate the important or likely scenarios. Parallel to those two avenues of advancement, we have been constantly struggling to optimize and accelerate the verification process. We've proceduralized verification through standards-based approaches like UVM, and we've worked to accelerate the execution of our verification processes through technologies such as FPGA-based prototyping and emulation.<\/p>\n<p>Taking advantage of Moore's Law performance gains in order to accelerate the verification of our designs as they grow in complexity according to Moore's Law is, as today's kids would probably say, &#8220;kinda meta.&#8221; But Moore's Law alone is not enough to keep up with Moore's Law. It's the classic perpetual-motion conundrum. There are losses in the system that prevent the process from being perfectly self-sustaining. Each technology-driven doubling of the complexity of our designs does not yield a doubling of the computation that can be achieved. We gradually accrue a deficit.<\/p>\n<p>And the task of verification is constantly expanding in other dimensions as well. At first, it was enough to simply verify that our logic was correct - that the 1s, 0s, and Xs at the inputs would all propagate down to the correct results at the outputs. On top of that, we had to worry about timing and temporal effects on our logic. As time passed, it became important to verify that embedded software would function correctly on our new hardware, and that opened up an entire new world of verification complexity.
Then, people got cranky about manufacturing variation and how it would impact our verification results. And we started to get curious about how things like temperature, radiation, and other environmental effects would call our verification results into question.<\/p>\n<p>Today, our IoT applications span vast interconnected systems - from edge devices with sensors and local compute resources, through complex communication networks, to cloud-based computing and storage centers and back again. We need to verify not just the function of individual components in that chain, but of the application as a whole. We need to confirm not simply that the application will function as intended - from both a hardware and a software perspective - but that it is secure, robust, fault-tolerant, and stable. We need to assure that performance - throughput and latency - is within acceptable limits, and that power consumption is minimized. This problem far exceeds the scope of the current notion of verification in our industry.<\/p>\n<p>Our definition of correct behavior is growing increasingly fuzzy over time as well. For example, determining whether a processed video stream &#8220;looks good&#8221; is almost impossible from a programmatic perspective. The only reliable metric we have is human eyes subjectively staring at a screen. Many more metrics for system success suffer from similar subjectivity issues. As our digital applications interact more and more directly and intimately with our human, emotional, analog world, our ability to boil verification down to a known set of zeros and ones slips ever farther from our grasp.<\/p>\n<p>The increasing dominance of big data and AI-based algorithms further complicates the real-world verification picture. When the behavior of both hardware and software is too complex to model, it is far too complex to completely verify.
Until some radical breakthrough occurs in the science of verification itself, we will have to be content to verify components and subsystems along fairly narrow axes and hope that confirming the quality of the flour, sugar, eggs, butter, and baking soda somehow verifies the deliciousness of the cookie.<\/p>\n<p>There is no question that Moore's Law is slowly grinding to a halt. And, while that halt may give us a chance to catch our breath on the Moore's Law verification treadmill, it will by no means bring an end to our verification challenges. The fact is - even if Moore's Law ends today, we can already build systems far too complex to verify. If your career is in verification, and you are competent, your job security looks pretty rosy.<\/p>\n<p>But this may highlight a fundamental issue with our whole notion of verification. Verification somewhat tacitly assumes a waterfall development model. It presupposes that we design a new thing, then we verify our design, then we make and deploy the thing that we developed and verified. However, software development (and I'd argue the development of all complex hardware\/software applications, such as those currently being created for IoT) follows something much more akin to agile development - where verification is a continual, ongoing process as the applications and systems evolve over time after their initial deployment.<\/p>\n<p>So, let's challenge our notion of the scope and purpose of verification. Let's think about how verification serves our customers and our business interests. Let's re-evaluate our metrics for success. Let's consider how the development and deployment of products and services has changed the role of verification. Let's think about how our technological systems have begun to invert - where applications now span large numbers of diverse systems, rather than being contained within one.
Moore's Law may end, but our real work in verification has just begun.<\/p>\n<p>More here:<\/p>\n<p><a target=\"_blank\" href=\"http:\/\/www.eejournal.com\/archives\/articles\/20170301-verification\/\" title=\"Expanding the Scope of Verification - EE Journal\">Expanding the Scope of Verification - EE Journal<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> March 1, 2017 by Kevin Morris Looking at the agenda for the 2017 edition of the annual DVCon - arguably the industry's premier verification conference - one sees precisely what one would expect: tutorials, keynotes, and technical sessions focused on the latest trends and techniques in the ever-sobering challenge of functional verification in the face of the relentless advance of Moore's Law. For five decades now, our designs have approximately doubled in complexity every two years. Our brains, however, have not.  <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/moores-law\/expanding-the-scope-of-verification-ee-journal.php\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[14],"tags":[],"class_list":["post-212505","post","type-post","status-publish","format-standard","hentry","category-moores-law"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/212505"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=212505"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/212505\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=212505"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=212505"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=212505"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}