Expert: Don’t overlook security in rush to adopt AI – The Winchester Star

MIDDLETOWN - Lord Fairfax Community College hosted technologist Gary McGraw on Wednesday night. He spoke of the cutting-edge work being done at the Berryville Institute of Machine Learning, which he co-founded a year ago.

The talk was part of the college's Tech Bytes series of presentations by industry professionals connected to technology.

The Berryville Institute of Machine Learning is working to educate tech engineers and others about the risks they need to think about while building, adopting and designing machine learning systems. These systems involve computer programs called neural networks that learn to perform a task, such as facial recognition, by being trained on lots of data, such as pictures, McGraw said.
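To make that idea concrete, here is a minimal sketch, not from the talk, of a neural network learning a classification task purely from labeled examples, with synthetic data standing in for the pictures McGraw mentions (scikit-learn assumed):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Stand-in for "lots of data": 2,000 synthetic labeled examples.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small neural network; training fits its weights to the examples.
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# Evaluate on examples the network has never seen.
print("held-out accuracy:", model.score(X_test, y_test))
```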

"It's important that we don't take security for granted or overlook security in the rush to adopt AI everywhere," McGraw said.

One easily relatable application of this technology is in smartphones, which are using AI to analyze conversations, photos and web searches, all to process people's data, he said.

"There should be privacy by default. There is not. They are collecting your data; you are the product," he said.

The institute anticipates releasing, within a week or two, a report titled "An Architectural Risk Analysis of Machine Learning Systems," which identifies 78 risks in machine learning systems.

McGraw told the audience that, while not interchangeable terms, artificial intelligence and machine learning have been sold as magic technology that will miraculously solve problems. He said that is wrong. The raw data used in machine learning can be manipulated, which can open systems up to risks, such as attacks that could compromise information, even confidential information.

McGraw cited a few of those risks.

One risk is an attacker fooling a machine learning system by presenting malicious input that causes the system to make a false prediction or categorization. Another risk is data manipulation: if an attacker can intentionally tamper with the data being used by a machine learning system, the entire system can be compromised.
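A toy sketch can make the first risk concrete. This is an illustration under simplified assumptions (a linear model with made-up weights), not an example from the talk: a small input perturbation, aimed along the model's own gradient, swings the prediction toward the wrong class.

```python
import numpy as np

rng = np.random.default_rng(0)
w, b = rng.normal(size=20), 0.1            # stand-in weights of a trained linear model
x = w / np.linalg.norm(w)                  # an input the model confidently calls class 1
y = 1                                      # the true label

def predict(v):
    return 1 / (1 + np.exp(-(w @ v + b)))  # sigmoid probability of class 1

# The gradient of the cross-entropy loss w.r.t. the input is (p - y) * w;
# stepping along its sign maximally increases the loss (the FGSM-style attack).
grad = (predict(x) - y) * w
x_adv = x + 0.5 * np.sign(grad)

print("clean prediction:      ", round(predict(x), 3))      # near 1 (correct)
print("adversarial prediction:", round(predict(x_adv), 3))  # pushed toward 0 (wrong)
```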

One of the most often discussed risks is data confidentiality. McGraw said data protection is already difficult enough without machine learning. In machine learning, there is a unique challenge in protecting data because it is possible that through subtle means information contained in the machine learning model could be extracted.
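The extraction risk can also be sketched in a few lines. In this toy illustration (a generic model-stealing setup of my own, not an example from the report), the attacker never sees the confidential training data, yet builds a close functional copy of the model purely by querying it:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# The victim model is trained on confidential data the attacker never sees.
X_private, y_private = make_classification(n_samples=1000, n_features=10, random_state=1)
victim = LogisticRegression().fit(X_private, y_private)

# The attacker probes the model's public prediction API with synthetic inputs.
rng = np.random.default_rng(1)
X_query = rng.normal(size=(5000, 10))
y_query = victim.predict(X_query)

# Training on those query results yields an extracted copy of the model.
clone = LogisticRegression().fit(X_query, y_query)
agreement = (clone.predict(X_private) == victim.predict(X_private)).mean()
print(f"clone agrees with victim on {agreement:.1%} of the private inputs")
```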

LFCC student Myra Diaz, who is studying computer science at the college, attended the program.

"I like it. I am curious and so interested to see how we can get a computer to be judgmental in a positive way, such as judging what it is seeing," Diaz said.

Remaining speakers for this year's Tech Bytes programs are:

6 p.m. Feb. 19: Kay Connelly, Informatics.

1 p.m. March 11: Retired Secretary of the Navy Richard Danzig

6 p.m. April 8: Heather Wilson, Analytics, L Brands

Read more from the original source:
Expert: Don't overlook security in rush to adopt AI - The Winchester Star

Research report investigates the Global Machine Learning In Finance Market 2019-2025 – WhaTech Technology and Markets News

Machine Learning in Finance Market size was xx million US$ and it is expected to reach xx million US$ by the end of 2025, with a CAGR of xx% during 2019-2025.
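The report leaves its figures as placeholders, but the CAGR it quotes is a standard calculation; here is a quick sketch with hypothetical numbers (my assumptions, not the report's data):

```python
# CAGR = (end / start) ** (1 / years) - 1
start_value = 1_000.0   # hypothetical 2019 market size, million US$
end_value   = 2_000.0   # hypothetical 2025 market size, million US$
years       = 6         # 2019 through 2025

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")   # ~12.2% per year to double over six years
```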

The 2019 Machine Learning in Finance Market study provides detailed information on the market's primary players, including manufacturers, suppliers, distributors, traders and customers. The report offers a professional and in-depth evaluation of the current state of the market, covering the major types and applications along with data on capacity, production, market share, price, revenue, cost, gross margin, growth rate, consumption, imports, exports and more. The industry chain, manufacturing processes, cost structure and marketing channels are also analyzed in this report.

The growth trajectory of the global Machine Learning in Finance Market over the assessment period is shaped by several established and emerging regional and international trends, a granular assessment of which is offered in the report. The study of global Machine Learning in Finance Market dynamics takes a critical look at the business regulatory framework, technological advances in related industries, and strategic avenues.

The value of machine learning in finance is becoming more apparent by the day. As banks and other financial institutions strive to beef up security, streamline processes, and improve financial analysis, ML is becoming the technology of choice.

The key players covered in this study: Ignite Ltd, Yodlee, Trill A.I., MindTitan, Accenture, ZestFinance...

Request for Sample with TOC @ www.researchtrades.com/requestle/1678067

Market segment by Type, the product can be split into: Supervised Learning, Unsupervised Learning, Semi-Supervised Learning, Reinforcement Learning

Market segment by Application, split into: Banks, Securities Companies, Others

Market segment by Regions/Countries, this report covers: United States, Europe, China, Japan, Southeast Asia, India, Central & South America

The study objectives of this report are:

* To analyze the global Machine Learning in Finance status, future forecast, growth opportunities, key markets and key players.

* To present the Machine Learning in Finance development in the United States, Europe and China.


Visit link:
Research report investigates the Global Machine Learning In Finance Market 2019-2025 - WhaTech Technology and Markets News

BOYS SWIMMING: Tigersharks top Waconia in conference dual – Crow River Media

The Hutchinson boys swimming and diving team defeated Waconia 91-81 Thursday night.

Conner Hogan had a great meet for the 'Sharks. Hogan won the 100 freestyle and had the best time in the backstroke. He also helped both the 200 medley and 200 freestyle relay teams take first.

Same with Noah Tague. Tague won the 200 IM and helped the 200 medley relay team take first.

Matthew Olberg came in first in the 500 freestyle and helped the 200 freestyle relay team come in first as well.

Alex Oestreich just missed out on 200 points in the diving competition with 199.95.

The 'Sharks had a large enough lead going into the backstroke that they exhibitioned the final three races.

This was a nice warmup for them before they head down to Minneapolis to compete in the True Team State meet on Saturday at noon at the Jean K. Freeman Aquatic Center.

Hutchinson 91, Waconia 81 (Jan. 23)

Individual Results:

200 medley relay: 1. Hutch A (Conner Hogan, Noah Tague, Tristin Nelsen, Alex Oestreich) 1:45.86, 3. Hutch B (Riley Yerks, Devon Bode, Gabe Stassen, Dane Thovson) 1:53.49

200 freestyle: 1. Samuel Sinclair (W) 1:50.71, 2. Matthew Olberg 1:53.35, 3. Thovson 2:02.83, 4. Max Einck 2:04.19

200 IM: 1. Tague (H) 2:09.22, 3. Nelsen 2:13.65, 5. Stassen 2:24.32

50 freestyle: 1. David Sinclair (W) 22.14, 2. Bode 24.09, 3. Oestreich 25.24, 4. Charlie Jenum 25.87, Ben Becker 26.44

1 mtr diving: 1. Oestreich (H) 199.95, 2. Cameron Wagner 183.80

100 freestyle: 1. Hogan (H) 51.42, 2. Yerks 57.73, 4. Wagner 59.70, Becker 1:03.22

500 freestyle: 1. Olberg (H) 5:09.89, 3. Thovson 5:24.79, 5. Einck 5:40.79

200 freestyle relay: 1. Hutch A (Hogan, Oestreich, Nelsen, Olberg) 1:37.56, 2. Hutch B (Stassen, Anthony Witte, Jenum, Einck) 1:44.54, 4. Hutch C (Jackson Kramer, Grant Kropp, Ethan Field, Wagner) 1:51.50

100 backstroke: 1. Alex Kearney (W) 1:07.51, Hogan 1:01.78, Yerks 1:08.00, Witte 1:08.08

100 breaststroke: 1. David Sinclair (W) 1:03.70, Tague 1:06.91, Bode 1:07.44, Jenum 1:12.65

400 freestyle relay: 1. Wac A (Nathan Sannito, Lars Johnson, Samuel Sinclair, David Sinclair) 3:35.70, Hutch A (Olberg, Thovson, Tague, Bode) 3:40.62, Hutch B (Einck, Jenum, Field, Yerks) 4:00.59, Hutch C (Wagner, Carter Johnson, Kramer, Witte) 4:10.32

Read the original here:
BOYS SWIMMING: Tigersharks top Waconia in conference dual - Crow River Media

David Sinclair on NMN and Epigenetics | LEAF

Dr. David Sinclair, a Professor of Genetics at Harvard Medical School, is one of the most well-known researchers in the field of rejuvenation, and his lab is the beneficiary of a successful Lifespan.io campaign.

Today, Dr. Sinclair is releasing his book on Amazon, Lifespan: Why We Age and Why We Don't Have To, and on Wednesday, September 18, we hosted a special webinar with Dr. Sinclair as well.

David has recently appeared on shows such as Joe Rogan's (not once but twice), the David Pakman show, and Tom Bilyeu's. At International Perspectives in Geroscience, a conference hosted at the Weizmann Institute of Science (Israel) on September 4-5, we had the opportunity to interview Dr. Sinclair about his work and his thoughts on the current state of research.

Back in February, you and a group of 16 researchers in the aging field launched the Academy for Health and Lifespan Research with the aim of promoting aging research, fostering the sharing of knowledge between scientists, and helping to guide governments and other key players in the industry. Could you please tell us a little more about the Academy and its current activity?

The Academy is a founding group of scientists who seek to understand the fundamental causes of aging and how to combat it. We have come together to build a society, a group of leaders around the world who will act as one voice to help shape not just the research, but public policy, future economic effects of the research, and medicines that are going to come from this field.

Do you think that aging is a disease or a syndrome, or not? What's your opinion on that?

Well, first of all, there's no correct answer. There is no law that says something's a disease and something is not. Currently, the medical definition of a disease is something that causes a dysfunction or disability that happens to less than half of the population. Of course, aging happens to most of the population now, but I think that having a cut-off at 50% is arbitrary. Something that causes decline in functionality and eventual death should be worked on just as vigorously as something that only affects a minority of people.

Do you think that recognizing aging as a disease, in, say, the International Classification of Diseases makes sense in order to accelerate the development of new therapies addressing the root mechanisms of aging?

The World Health Organization's new definition of aging as a condition is helpful, but the real change will come when a leading country says that aging is a disease for which a medicine can be approved as treatment. Right now, because aging is not a condition that's agreed upon by any regulator, doctors are very hesitant to prescribe drugs that may slow or reverse aging and perhaps extend healthy lifespan for many years. They follow the rule book. Metformin is a good example of a drug that is relatively safe and cheap and could potentially have a big benefit. But, because aging is not a disease, doctors rarely provide it to their patients until they actually become diabetic.

Basically, that means that the position of the government has to change and then once the government declares sort of a war on aging, then there could be some regulation changes, and then it may come to the point when doctors will be mentally ready to prescribe these drugs, right?

That's right. Also, if aging is a prescriptable condition, then investment in aging-related drugs or longevity medicines will increase by orders of magnitude. The problem today is that, because aging isn't a prescriptable condition, drugs have to be developed for other diseases first, with the hope that they'll then be used more broadly.

Currently, medicine treats the symptoms, not the causes, of age-related diseases. Do you think that we might soon reach the point where therapies will be taken in a preventive manner to delay the onset of age-related diseases? What do you think might be the turning point for things to change? Basically, prevention is always a problem, even though it's one of the most effective strategies, but we seem to never get there.

Well, there's a subset of the population, particularly in the US, but increasingly around the world, who are using the internet to educate themselves and are trying to take action before they become sick. Sometimes with medical supervision, sometimes not. It's a grassroots movement right now; for it to become mainstream, the regulations would have to change so that doctors can feel comfortable prescribing medicines to prevent diseases. But, if we don't change, then we will continue to practice whack-a-mole medicine and only treat one disease at a time after it's already developed.

You travel the world a lot. Is there a country that you think is more forward-thinking in how aging is viewed and might take the first move to define aging as a directly treatable condition?

There are a few; the leading contenders right now are Australia, Singapore, and then the US and UK are also talking about it. The first country that does take this first bold step will reap the rewards of that with more investment and, of course, an increasingly productive and healthy population.

The countries you name actually seem to have among the highest life expectancies. Do you think that's related to the understanding that the problem of population aging is becoming severe enough?

That's exactly right. The countries that have a problem with the healthcare of the elderly have to do something, because the growing number of elderly people will only continue to raise the percentage of GDP those countries spend on healthcare; right now, the US already spends 17%. They're not getting any younger, and their life expectancy isn't changing. So, for the US to really make progress, they need a new approach to medicine.

Let's talk a little bit more about your work. You are very well known for your work with NAD+ and its precursors; we're often asked whether NR or NMN is better. However, the data seems to suggest that different precursors are more or less efficient in a tissue- or organ-dependent manner. Would it be fair to say that rather than asking which is better, we should instead consider these differences and that both may have their place?

They're very similar molecules, and both have been shown to provide a variety of health benefits in mice. That doesn't mean either of them will work to slow aging in humans, and that's why placebo-controlled clinical trials are required to know if one of them, or both of them, will work in certain conditions.

There has been a great deal of debate over the ability of NMN to pass through the plasma membrane to reach the interior of the cell. However, you and your team recently showed that under certain conditions, NMN can indeed enter the cell via a previously undocumented transporter without the need to change back to NR. Have there been any further developments with this? In particular, what does this mean for the efficiency of NMN, given its close proximity to NAD+ in the salvage pathway?

The NMN transporter was recently published by Shin Imai's group; I wrote a commentary about it. I'm aware of work that's not yet published by a few different labs, looking at how these molecules travel through the body of a mouse. The conclusion is that some tissues have transporters, some don't. It can even vary depending on where in the gut you're talking about. I think, in the end, what's going to happen, like in most areas of science, is that everybody's right; it just depends on what you're talking about.

There are a number of human trials in progress for NMN, including one at Brigham and Women's Hospital. Can you tell us anything about that, and when might we expect to see some results?

Those studies began over a year ago, and they are currently Phase 1 safety studies in healthy volunteers. Next year, the plan is to test the pharmaceutical product in a disease area, most likely a rare disease, but also in the elderly to see if we can recapitulate some of the results we've seen in mice, such as increased blood flow and endurance.

Another area that you are involved in is partial cellular reprogramming to reverse age-related epigenetic alterations in cells and tissues. This is a topic that we have written about in the last year or two. Given the success of Belmonte and his team, and the enthusiasm for the approach in general, it really seems to have great potential. Can you please tell us a little bit about this approach and the approach that you are taking and how youre progressing so far?

For 20 years, we've been working on epigenetic changes as a cause of aging, starting with work in yeast and now in mammals. We've developed viral vectors and combinations of reprogramming factors that appear to be much safer than the Belmonte work, and we've used them to reprogram the eye to restore vision in mice with glaucoma and in very old mice.

Some people argue that epigenetic alterations are similar to the hands of a clock and they only reflect aging, making them not an underlying cause but rather a consequence; do you consider them a cause or a consequence, and when partial reprogramming is initiated, should it be considered actual rejuvenation?

Currently, it is believed that the clock is just an indicator of age and not part of the actual aging process, but our recent work that we deposited on bioRxiv strongly suggests that the process of reversing the clock doesn't just change the apparent age of the body; it actually reverses aging itself by restoring the function of the old cells to behave as though they're young again. Therefore, the clock may not just be telling time; it may actually be controlling time.

That sounds fascinating. So, it's actual rejuvenation, right?

It's early days, but this appears to be closer to rewinding the clock and rejuvenating at least parts of the body than anything we've worked on before.

Back in 2016, when Belmonte and his colleagues demonstrated that partial cellular reprogramming in mice was possible, he estimated that such approaches might reach the public in the next decade. Do you think that we are on track for this to happen?

We're now more than on track. We're actually ahead of schedule. We found an apparently safe way to reprogram tissues, complex tissues, and there are at least two companies now expecting to start clinical trials in humans within the next two years.

Can you tell us a little bit more about that, or is that secret for a while?

One of the companies is called Iduna, and I formed this company with Steve Horvath, Belmonte, and Manuel Serrano in Spain. We have the funding to start a clinical trial next year.

Partial reprogramming is altering ourselves at the cellular level; how do you think the general public might react to such an idea, in your view? Is this going to be a real hurdle to getting people on board with using these treatments?

I found that everyone who hears these results of the Belmonte lab and of my lab is extremely excited, because it's a very simple but powerful concept of rewinding the clock, and I don't know of anybody who has said that we shouldn't go faster in trying to develop this technology.

In general, what's your usual way of overcoming the initial skepticism regarding the idea of healthy life extension? Because there is this problem with life extension, that people sometimes react weirdly to it.

I've faced that my whole career, since I started; there's always going to be a group of individuals who don't believe that humans are capable of certain things. It was the same with flying back in the early 20th century. I think we know enough now about how aging works, and how to slow it and possibly reverse it, that it's going to be possible in our lifetimes to have a big impact on our healthspan and probably lifespan. Anybody who thinks that it's not doesn't know how fast science is moving.

What's your usual way to deal with skepticism; do you have some favorite arguments?

Mostly, I just go back to the lab and do better research and let the data speak for itself. There are a lot of people who won't be convinced until they see the actual experiments redone many times. What I've done in the past two years is put all my ideas and the advances in the field into a book, so I'm hoping that this book will convince the skeptics or at least make them think hard about what's possible with their lives, what they can do now, and what soon should become possible.

Wonderful. Actually, my next question was about this book; could you please tell us a little bit more about it and what readers should look forward to?

Lifespan takes the reader on a journey through history, looking at the endeavor of humans to try to live longer, and using that historical perspective to look at today's situation and project into the future. The book also takes readers on a journey through the very cutting edge of aging research and things that the reader can do right now to take advantage of these new discoveries in their daily lives, with changes in their daily activity, what they eat, when they eat, but also medicines that are currently available on the market that may extend lifespan. The last chapter is about where we are headed, what medicines are in development, and then, when these drugs become available, what does the world look like? Is it a better place or a worse place, and how will our lives change?

Wow, that sounds like a book that I would really like to read. You look pretty amazing for being 50 years old. I'm 40, and I think you look better than me. Are you doing something to support your health, to feel better, to be more productive and to age slowly?

I'm doing an experiment right now on my body, and on my father, my wife, and my dogs. It's voluntary, of course; my brother recently complained that he was being treated as the negative control in the experiment. I believed in the research and knew the risks to be low, so, starting with resveratrol in 2003, I started taking that, and I'm still taking it, and I've added to that NMN and some metformin as well. I try not to eat too much. I should exercise more. What I do, and what I've learned works for me and for members of my family, is also written down in detail in my book. So, if people would like to know it, they can read it.

Finally, is there a question that no one ever asks you and that you would like us to ask?

Am I afraid of dying?

Are you?

No.

Why not?

I've been in situations where I thought I could die, planes that have lost control, that kind of thing. I don't get nervous; I'm not worried about that. The reason I'm doing what I'm doing is that I'd like to leave the world a better place than I found it. I'm also very curious; I'd like to see what we can discover and what the future holds for all of humanity, not just for longevity, but the future of the planet. See if we can point humanity in the right direction and away from the bad scenarios that we seem to be headed toward right now.

We would like to thank Dr. Sinclair for taking the time to do this interview with us and for answering our questions. If you would like to learn more about his work, you may be interested in watching the special webinar we did with Dr. Sinclair in September 2019, which you can see below.

Read this article:
David Sinclair on NMN and Epigenetics | LEAF

ASC20 Finals to be Held in Shenzhen, Tasks Include Quantum Computing Simulation and AI Language Exam – HPCwire

BEIJING, Jan. 21, 2020 - The 2020 ASC Student Supercomputer Challenge (ASC20) announced the tasks for the new season: using supercomputers to simulate quantum circuits and training AI models to take an English test. These tasks pose unprecedented challenges for the 300+ ASC teams from around the world. From April 25 to 29, 2020, the top 20 finalists will compete fiercely at SUSTech in Shenzhen, China.

ASC20 sets up a quantum computing task for the first time. Teams will use QuEST (Quantum Exact Simulation Toolkit) running on supercomputers to simulate 30 qubits in two cases: quantum random circuits (random.c) and quantum fast Fourier transform circuits (GHZ_QFT.c). Quantum computing is a disruptive technology, considered to be the next generation of high performance computing. However, the R&D of quantum computers is lagging due to the unique properties of quantum systems, which adds extra difficulty for scientists seeking to use real quantum computers to solve some of the most pressing problems, such as particle physics modeling, cryptography, genetic engineering, and quantum machine learning. From this perspective, the quantum computing task presented in the ASC20 challenge will hopefully inspire new algorithms and architectures in this field.
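For readers unfamiliar with the task, simulating a quantum circuit classically means tracking a state vector that doubles in size with every added qubit, which is why 30 qubits call for a supercomputer. The sketch below uses plain NumPy rather than the QuEST toolkit the teams will actually use, and builds the 3-qubit GHZ state behind the GHZ_QFT circuit's name:

```python
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a single-qubit gate to `target` in an n-qubit state vector."""
    state = state.reshape([2] * n)
    state = np.tensordot(gate, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

def cnot(state, control, target, n):
    """Apply a CNOT by permuting basis-state amplitudes directly."""
    new = state.copy()
    for i in range(2 ** n):
        if (i >> (n - 1 - control)) & 1:           # control qubit is 1
            new[i ^ (1 << (n - 1 - target))] = state[i]
    return new

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                     # |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate
state = apply_gate(state, H, 0, n)                 # (|000> + |100>) / sqrt(2)
state = cnot(state, 0, 1, n)                       # (|000> + |110>) / sqrt(2)
state = cnot(state, 1, 2, n)                       # (|000> + |111>) / sqrt(2): GHZ

print(np.round(np.abs(state) ** 2, 3))             # probability 0.5 on |000> and |111>
```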

The other task revealed is the Language Exam Challenge. Teams will take on the challenge of training AI models on an English cloze test dataset, vying to achieve the highest test scores. The dataset covers multiple levels of English language tests in China, including the college entrance examination, College English Test Band 4 and Band 6, and others. Teaching machines to understand human language is one of the most elusive and long-standing challenges in the field of AI. The ASC20 AI task represents such a challenge, using human-oriented problems to evaluate the performance of neural networks.
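The cloze-test idea itself is easy to demonstrate with a public pretrained masked language model. This sketch is illustrative only; the ASC20 dataset and scoring pipeline are defined by the organizing committee (Hugging Face transformers assumed):

```python
from transformers import pipeline

# A masked language model guesses the word hidden behind [MASK],
# which is exactly the skill a cloze test measures.
fill = pipeline("fill-mask", model="bert-base-uncased")
for candidate in fill("The students trained their model on a [MASK] cluster."):
    print(f'{candidate["token_str"]:>12}  score={candidate["score"]:.3f}')
```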

Wang Endong, ASC Challenge initiator, member of the Chinese Academy of Engineering and Chief Scientist at Inspur Group, said that through these tasks, students from all over the world get to access and learn the most cutting-edge computing technologies. ASC strives to foster supercomputing and AI talent with global vision, inspiring technical innovation.

Dr. Lu Chun, Vice President of SUSTech, host of the ASC20 Finals, commented that supercomputers are important infrastructure for scientific innovation and economic development. SUSTech is making focused efforts to develop supercomputing and is hosting ASC20 in hopes of driving the training of supercomputing talent, international exchange and cooperation, and interdisciplinary development at SUSTech.

Furthermore, during January 15-16, 2020, the ASC20 organizing committee held a competition training camp in Beijing to help student teams prepare for the ongoing competition. HPC and AI experts from the State Key Laboratory of High-end Server and Storage Technology, Inspur, Intel, NVIDIA, Mellanox, Peng Cheng Laboratory and the Institute of Acoustics of the Chinese Academy of Sciences gathered to provide on-site coaching and guidance. Previous ASC winning teams also shared their successful experiences.

About ASC

The ASC Student Supercomputer Challenge is the world's largest student supercomputer competition, sponsored and organized by the Asia Supercomputer Community in China and supported by Asian, European, and American experts and institutions. The main objectives of ASC are to encourage exchange and training of young supercomputing talent from different countries, improve supercomputing applications and R&D capacity, boost the development of supercomputing, and promote technical and industrial innovation. The annual ASC Supercomputer Challenge was first held in 2012 and has since attracted over 8,500 undergraduates from all over the world. Learn more about ASC at https://www.asc-events.org/.

Source: ASC

The rest is here:
ASC20 Finals to be Held in Shenzhen, Tasks Include Quantum Computing Simulation and AI Language Exam - HPCwire

New Centers Lead the Way towards a Quantum Future – Energy.gov

The world of quantum is the world of the very, very small. At sizes near those of atoms and smaller, the rules of physics start morphing into something unrecognizable, at least to us in the regular world. While quantum physics seems bizarre, it offers huge opportunities.

Quantum physics may hold the key to vast technological improvements in computing, sensing, and communication. Quantum computing may be able to solve problems in minutes that would take lifetimes on today's computers. Quantum sensors could act as extremely high-powered antennas for the military. Quantum communication systems could be nearly unhackable. But we don't have the knowledge or capacity to take advantage of these benefits yet.

The Department of Energy (DOE) recently announced that it will establish Quantum Information Science Centers to help lay the foundation for these technologies. As Congress put forth in the National Quantum Initiative Act, the DOEs Office of Science will make awards for at least two and up to five centers.

These centers will draw on both quantum physics and information theory to give us a soup-to-nuts understanding of quantum systems. Teams of researchers from universities, DOE national laboratories, and private companies will run them. Their expertise in quantum theory, technology development, and engineering will help each center undertake major, cross-cutting challenges. The centers' work will range from discovery research up to developing prototypes. They'll also address a number of different technical areas. Each center must tackle at least two of these subjects: quantum communication, quantum computing and emulation, quantum devices and sensors, materials and chemistry for quantum systems, and quantum foundries for synthesis, fabrication, and integration.

The impacts won't stop at the centers themselves. Each center will have a plan in place to transfer technologies to industry or other research partners. They'll also work to leverage DOE's existing facilities and collaborate with non-DOE projects.

As the nation's largest supporter of basic research in the physical sciences, the Office of Science is thrilled to head this initiative. Although quantum physics depends on the behavior of very small things, the Quantum Information Science Centers will be a very big deal.

The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://www.energy.gov/science.

Continue reading here:
New Centers Lead the Way towards a Quantum Future - Energy.gov

Jenkins Creator Launches Startup To Speed Software Testing with Machine Learning — ADTmag – ADT Magazine

Jenkins Creator Launches Startup To Speed Software Testing with Machine Learning

Kohsuke Kawaguchi, creator of the open source Jenkins continuous integration/continuous delivery (CI/CD) server, and Harpreet Singh, former head of the product group at Atlassian, have launched a startup that's using machine learning (ML) to speed up the software testing process.

Their new company, Launchable, which emerged from stealth mode on Thursday, is developing a software-as-a-service (SaaS) product with the ability to predict the likelihood of a failure for each test case, given a change in the source code. The service will use ML to extract insights from the massive and growing amount of data generated by the increasingly automated software development process to make its predictions.

"As a developer, I've seen this problem of slow feedback from tests first-hand," Kawaguchi told ADTmag. "And as the guy who drove automation in the industry with Jenkins, it seemed to me that we could make use of all that data the automation is generating by applying machine learning to the problem. I thought we should be able to train the machine on the model and apply quantifiable metrics, instead of relying on human experience and gut instinct. We believe we can predict, with meaningful accuracy, what tests are more likely to catch a regression, given what has changed, and that translates to faster feedback to developers."

The strategy here is to run only a meaningful subset of tests, in the order that minimizes the feedback delay.
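Launchable has not published its model, but the general technique can be sketched: learn from historical CI results how likely each test is to fail for a given change, then schedule the riskiest tests first. All features and numbers below are hypothetical stand-ins:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per (change, test) pair: lines changed, overlap between
# changed files and files the test touches, and recency of the test's last failure.
X = rng.random((500, 3))
y = (X @ np.array([0.8, 1.5, 1.2]) + rng.normal(0, 0.3, 500) > 1.9).astype(int)

# Train on historical CI outcomes (1 = the test failed for that change).
model = LogisticRegression().fit(X, y)

# For a new commit, rank the candidate tests by predicted failure risk.
candidates = rng.random((20, 3))
risk = model.predict_proba(candidates)[:, 1]
run_order = np.argsort(-risk)
print("suggested test order (riskiest first):", run_order[:5], "...")
```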

Kawaguchi (known as "KK") and Singh worked together at CloudBees, the chief commercial supporter of Jenkins. Singh left that company in 2018 to serve as GM of Atlassian's Bitbucket cloud group. Kawaguchi became an elite developer and architect at CloudBees, and he's been a part of the community throughout the evolution of this technology. His departure from the company was amicable: Its CEO and co-founder Sacha Labourey is an investor in the startup, and Kawaguchi will continue to be involved with the Jenkins community, he said.

Software testing has been a passion of Kawaguchi's since his days at Sun Microsystems, where he created the Hudson CI server; Jenkins emerged as a fork of Hudson in 2011. Singh also worked at Sun and served as the first product manager for Hudson before working on Jenkins. They will serve as co-CEOs of the new company. They reportedly snagged $3.2 million in seed funding to get the ball rolling.

"KK and I got to talking about how the way we test now impacts developer productivity, and how machine learning could be used to address the problem," Singh said. "And then we started talking about doing a startup. We sat next to each other at CloudBees for eight years; it was an opportunity I couldn't pass up."

An ML engine is at the heart of the Launchable SaaS, but it's really all about the data, Singh said.

"We saw all these sales and marketing guys making data-driven decisions -- even more than the engineers, which was kind of embarrassing," Singh said. "So it became a mission for us to change that. It's kind of our north star."

The co-execs are currently talking with potential partners and recruiting engineers and data scientists. They offered no hard release date, but they said they expect a version of the Launchable SaaS to become generally available later this year.

Posted by John K. Waters on 01/23/2020 at 8:41 AM

Read more from the original source:
Jenkins Creator Launches Startup To Speed Software Testing with Machine Learning -- ADTmag - ADT Magazine

Machine learning and eco-consciousness key business trends in 2020 – Finfeed

In 2020, small to medium sized businesses (SMBs) are likely to focus more on supporting workers to travel and collaborate in ways that suit them, while still facing a clear economic imperative to keep costs under control.

This will likely involve increased use of technologies such as machine learning and automation to: help determine and enforce spending policies; ensure people travelling for work can optimise, track, and analyse their spend; and prioritise travel options that meet goals around environmental responsibility and sustainability.

Businesses that recognise and respond to these trends will be better-placed to save money while improving employee engagement and performance, according to SAP Concur.

Fabian Calle, General Manager, Small to Medium Business, ANZ, SAP Concur, said, "As the new decade begins, the business environment will be subject to the same economic ups and downs seen in the previous decade. However, with new technologies and approaches, most businesses will be able to leverage automation and even artificial intelligence to smooth out those peaks and troughs."

SAP Concur has identified its top five 2020 predictions for SMBs, covering economics, technology, business, travel, the environment, diversity, and corporate social responsibility.

Calle said, "2020 will continue to drive significant developments as organisations of all sizes look to optimise efficiency and productivity through employee operations and satisfaction. Australian businesses need to be aware of these trends and adopt cutting-edge technology to facilitate their workers' need to travel and collaborate more effectively and with less effort."

Original post:
Machine learning and eco-consciousness key business trends in 2020 - Finfeed

Google claims to have invented a quantum computer, but IBM begs to differ – The Conversation CA

On Oct. 23, 2019, Google published a paper in the journal Nature entitled "Quantum supremacy using a programmable superconducting processor." The tech giant announced its achievement of a much-vaunted goal: quantum supremacy.

This perhaps ill-chosen term (coined by physicist John Preskill) is meant to convey the huge speedup that processors based on quantum-mechanical systems are predicted to exhibit, relative to even the fastest classical computers.

Google's benchmark was achieved on a new type of quantum processor, code-named Sycamore, consisting of 54 independently addressable superconducting junction devices (of which only 53 were working for the demonstration).

Each of these devices allows the storage of one bit of quantum information. In contrast to the bits in a classical computer, which can only store one of two states (0 or 1 in the digital language of binary code), a quantum bit, or qubit, can store information in a coherent superposition state, which can be considered to contain fractional amounts of both 0 and 1.
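That description can be made precise with a short worked example (an illustration, not from the article): a qubit's state is a unit vector of two complex amplitudes, and the squared magnitudes of those amplitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

state = np.array([1, 1j]) / np.sqrt(2)   # equal superposition of |0> and |1>
probs = np.abs(state) ** 2               # measurement probabilities
print(probs, probs.sum())                # [0.5 0.5] 1.0 -- a valid qubit state
```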

Sycamore uses technology developed by the superconductivity research group of physicist John Martinis at the University of California, Santa Barbara. The entire Sycamore system must be kept cold at cryogenic temperatures using special helium dilution refrigeration technology. Because of the immense challenge involved in keeping such a large system near the absolute zero of temperature, it is a technological tour de force.

The Google researchers demonstrated that the performance of their quantum processor in sampling the output of a pseudo-random quantum circuit was vastly better than a classical computer chip like the kind in our laptops could achieve. Just how vastly became a point of contention, and the story was not without intrigue.

An inadvertent leak of the Google group's paper on the NASA Technical Reports Server (NTRS) occurred a month prior to publication, during the blackout period when Nature prohibits discussion by the authors regarding as-yet-unpublished papers. The lapse was momentary, but long enough that The Financial Times, The Verge and other outlets picked up the story.

A well-known quantum computing blog by computer scientist Scott Aaronson contained some oblique references to the leak. The reason for this obliqueness became clear when the paper was finally published online and Aaronson could at last reveal himself to be one of the reviewers.

The story had a further controversial twist when the Google group's claims were immediately countered by IBM's quantum computing group. IBM shared a preprint posted on the arXiv (an online repository for academic papers that have yet to go through peer review) and a blog post dated Oct. 21, 2019 (note the date!).

While the Google group had claimed that a classical (super)computer would require 10,000 years to simulate the same 53-qubit random quantum circuit sampling task that their Sycamore processor could do in 200 seconds, the IBM researchers showed a method that could reduce the classical computation time to a mere matter of days.

However, the IBM classical computation would have to be carried out on the world's fastest supercomputer, the IBM-developed Summit OLCF-4 at Oak Ridge National Laboratory in Tennessee, with clever use of secondary storage, to achieve this benchmark.
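Reduced to arithmetic, the two claims differ by about six orders of magnitude (figures as quoted above; IBM's "matter of days" taken as the 2.5 days cited in its analysis):

```python
sycamore_seconds = 200                            # Google's quantum runtime
google_classical = 10_000 * 365 * 24 * 3600       # "10,000 years" in seconds
ibm_classical = 2.5 * 24 * 3600                   # IBM's classical estimate, in seconds

print(f"Google's implied speedup: {google_classical / sycamore_seconds:,.0f}x")  # ~1.6 billion
print(f"IBM's implied speedup:    {ibm_classical / sycamore_seconds:,.0f}x")     # ~1,000
```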

While of great interest to researchers like myself working on hardware technologies related to quantum information, and important in terms of establishing academic bragging rights, the IBM-versus-Google aspect of the story is probably less relevant to the general public interested in all things quantum.

For the average citizen, the mere fact that a 53-qubit device could beat the world's fastest supercomputer (containing more than 10,000 multi-core processors) is undoubtedly impressive. Now we must try to imagine what may come next.

The reality of quantum computing today is that very impressive strides have been made on the hardware front. A wide array of credible quantum computing hardware platforms now exist, including ion traps, superconducting device arrays similar to those in Google's Sycamore system, and isolated electrons trapped in NV-centres in diamond.

These and other systems are all now in play, each with benefits and drawbacks. So far researchers and engineers have been making steady technological progress in developing these different hardware platforms for quantum computing.

What has lagged quite a bit behind are custom-designed algorithms (computer programs) able to run on quantum computers and take full advantage of possible quantum speed-ups. While several notable quantum algorithms exist (Shor's algorithm for factorization, for example, which has applications in cryptography, and Grover's algorithm, which might prove useful in database search applications), the total set of quantum algorithms remains rather small.
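Grover's quadratic speedup can even be demonstrated by brute-force linear algebra on a laptop. This sketch is the textbook construction, not tied to any hardware platform mentioned above: it finds one marked item among 32 in about four "quantum queries," versus the ~16 a classical random search needs on average.

```python
import numpy as np

n_qubits = 5
N = 2 ** n_qubits
marked = 13                                          # the index the oracle "recognizes"

state = np.full(N, 1 / np.sqrt(N))                   # start in the uniform superposition
oracle = np.eye(N)
oracle[marked, marked] = -1                          # oracle flips the marked amplitude
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

for _ in range(int(np.pi / 4 * np.sqrt(N))):         # ~4 Grover iterations for N = 32
    state = diffusion @ (oracle @ state)

print("P(measure marked item):", round(abs(state[marked]) ** 2, 3))  # ~0.999
```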

Much of the early interest (and funding) in quantum computing was spurred by the possibility of quantum-enabled advances in cryptography and code-breaking. A huge number of online interactions ranging from confidential communications to financial transactions require secure and encrypted messages, and modern cryptography relies on the difficulty of factoring large numbers to achieve this encryption.

Quantum computing could be very disruptive in this space, as Shors algorithm could make code-breaking much faster, while quantum-based encryption methods would allow detection of any eavesdroppers.

The interest various agencies have in unbreakable codes for secure military and financial communications has been a major driver of research in quantum computing. It is worth noting that all these code-making and code-breaking applications of quantum computing ignore to some extent the fact that no system is perfectly secure; there will always be a backdoor, because there will always be a non-quantum human element that can be compromised.

More appealing for the non-espionage and non-hacker communities (in other words, the rest of us) are the possible applications of quantum computation to solve very difficult problems that are effectively unsolvable using classical computers.

Ironically, many of these problems emerge when we try to use classical computers to solve quantum-mechanical problems, such as quantum chemistry problems that could be relevant for drug design and various challenges in condensed matter physics including a number related to high-temperature superconductivity.

So where are we in the wonderful and wild world of quantum computation?

In recent years, we have had many convincing demonstrations that qubits can be created, stored, manipulated and read using a number of futuristic-sounding quantum hardware platforms. But the algorithms lag. So while the prospect of quantum computing is fascinating, it will likely be a long time before we have quantum equivalents of the silicon chips that power our versatile modern computing devices.


Visit link:
Google claims to have invented a quantum computer, but IBM begs to differ - The Conversation CA

What Is Quantum Computing, And How Can It Unlock Value For Businesses? – Computer Business Review


We are at an inflection point

Ever since Professor Alan Turing proposed the principle of the modern computer in 1936, computing has come a long way. While advancements to date have been promising, the future is even brighter, all thanks to quantum computing, which performs calculations based on the behaviour of particles at the sub-atomic level, writes Kalyan Kumar, CVP and CTO IT Services, HCL Technologies.

Quantum computing promises to unleash unimaginable computing power that's not only capable of addressing current computational limits, but also of unearthing new solutions to unsolved scientific and social mysteries. What's more, thanks to increasing advancement since the 1980s, quantum computing can now drive some incredible social and business transformations.

Quantum computing holds immense promise in defining a positive, inclusive and human-centric future, which is what the WEF Future Council on Quantum Computing envisages. The most anticipated uses of quantum computing are driven by its potential to simulate quantum structures and behaviours across chemicals and materials. This promise is viewed guardedly by current scientists, who claim quantum computing is still far from making a meaningful impact.

This said, quantum computing is expected to open amazing and much-needed possibilities in medical research. Drug development time, which usually runs more than 10 to 12 years with billions of dollars of investment, is expected to shrink considerably, alongside the potential to explore unique chemical compositions that may be beyond the limits of current classical computing. Quantum computing can also help with more accurate weather forecasting, providing information that can save tremendous amounts of agricultural production from damage.

Quantum computing promises a better and improved future, and while humans are poised to benefit greatly from this revolution, businesses too can expect unparalleled value.

When it comes to quantum computing, it can be said that much of the world is at the "they don't know what they don't know" stage. Proof points are appearing, and it is seemingly becoming clear that quantum computing solves problems that cannot be addressed by today's computers. Within transportation, for example, quantum computing is being used to develop battery and self-driving technologies, while Volkswagen has also been using quantum computing to match patterns and predict traffic conditions in advance, ensuring a smoother flow of traffic. In supply chains, logistics and trading are receiving a significant boost from the greater computing power and high-resolution modelling quantum computing provides, adding a huge amount of intelligence using new approaches to machine learning.

The possibilities for businesses are immense and go way beyond these examples mentioned above, in domains such as healthcare, financial services and IT. Yet a new approach is required. The companies that succeed in quantum computing will be those that create value chains to exploit the new insights, and form a management system to match the high-resolution view of the business that will emerge.

While there are some initial-stage quantum devices already available, these are still far from what the world has been envisaging. Top multinational technology companies have been investing considerably in this field, but they still have some way to go. There has recently been talk of prototype quantum computers performing, in just 200 seconds, computations that would previously have taken 10,000 years. Though of course impressive, this is just one of the many steps needed to achieve the highest success in quantum computing.

It is vital to understand how and when we are going to adopt quantum computing, so we know the right time to act. The aforementioned prototype should be a wake-up call to early adopters who are seeking ways to create a durable competitive advantage. We even recently saw a business announce its plans to make a prototype quantum computer available on its cloud, something we will all be able to buy or access some time from now. If organisations truly understand the value and applications of quantum computing, they will be able to create new products and services that nobody else has. However, productising and embedding quantum computing into products may take a little more time.

One important question arises from all this: are we witnessing the beginning of the end for classical computing? When looking at the facts, it seems not. With the advent of complete and practical quantum computers, we're seeing a hybrid computing model emerging in which digital binary computers will co-process and co-exist with quantum qubit computers. The processing and resource-sharing needs are expected to be optimised using real-time analysis, with quantum taking over exponential computational tasks. To say the least, quantum computing is not about replacing digital computing, but about coexistence, enabling composed computing that handles different tasks at the same time, similar to humans having left and right brains for analytical and artistic dominance.

If one thing's for sure, it's that we are at an inflection point, witnessing what could arguably be one of the most disruptive changes in human existence. Having a systematic and planned approach to the adoption of quantum computing will not only take some of its mystery away, but reveal its true strategic value, helping us to know when and how to become part of this once-in-a-lifetime revolution.

Continued here:
What Is Quantum Computing, And How Can It Unlock Value For Businesses? - Computer Business Review

Healthcare venture investment in 2020: Quantum computing gets a closer look – Healthcare IT News

Among the healthcare technologies venture firms will be looking at most closely in 2020, various artificial intelligence and machine learning applications are atop the list, of course. But so are more nuts-and-bolts tools like administrative process automation and patient engagement platforms, VCs say.

Other, more leading-edge technologies, such as genomics-focused data and analytics and even quantum computing, are among the areas attracting investor interest this year.

"We expect 2020 to mark the first year where health IT venture firms will start to look at quantum computing technology for upcoming solutions," Dr. Anis Uzzaman, CEO and general partner of Pegasus Tech Ventures, told Healthcare IT News.

"With the breakthrough supremacy announcement from Google validating the technology and the subsequent launch of the service Amazon Braket in 2019, there is sure to be a new wave of entrepreneurial activity starting in 2020."

He said quantum computing technology holds a lot of promise for the healthcare industry with potential breakthroughs possible throughout the health IT stack from operations and administration to security.

Among the promising companies, Uzzaman pointed to Palo Alto-based QC Ware, a startup pioneering a software solution that enables companies to use a variety of quantum hardware platforms such as Rigetti and IBM to solve a variety of enterprise problems, including those specifically related to healthcare.

He also predicted artificial intelligence would continue to be at the forefront for health IT venture firms in 2020 as it becomes more clear which startups may be winners in their initial target sectors.

"There has been consistent growth of investment activity over the past few years into healthcare startups using artificial intelligence to target a range of areas from imaging to diagnostics," he said.

However, Uzzaman also noted regulation and long enterprise sales cycles have largely slowed the ability for these companies to significantly scale their revenues.

"Therefore, we anticipate 2020 will be the year where it will become clearer to health IT venture firms who will be winners in applying artificial intelligence to imaging, pathology, genomics, operations, diagnostics, transcription, and more," he said. "We will also continue to see moderate growth in the overall investment amount in machine learning and AI companies, but will see a notable decrease in the number of companies receiving an investment."

Uzzaman explained there were already some signs in late 2019 that we could be late in a short-term innovation cycle for artificial intelligence, with many companies, particularly those applying machine learning and AI to robotics, shutting down.

"However, we anticipate many companies will reach greater scale with their solutions and separate themselves from the competition, which will translate into more mega funding rounds," he said.

Ezra Mehlman, managing partner with Health Enterprise Partners, explained that at the beginning of each year, the firm conducts a market-mapping exercise to determine which healthcare IT categories are rising to the top of the prioritization queue of its network of hospital and health plan limited partners.

"In the past year, we have seen budgets meaningfully open for automation solutions in administrative processing, genomics-focused data and analytics offerings, aging-in-place technologies and, in particular, patient engagement platforms rooted in proven clinical use cases," he said. "We are actively looking at all of these spaces."

He pointed out that in 2018, more than $2 billion was invested into artificial intelligence and machine learning healthcare IT companies, which represented a quarter of the total dollars invested into digital health companies that year.

"We view this as a recognition of two things: the meteoric aspirations that the market has assigned to AI and machine learning's potential, and a general sense that the underlying healthcare data infrastructure has reached the point of maturity, where it is possible to realize ROI from AI/machine learning initiatives," he said.

However, he said Health Enterprise Partners is still waiting for the "breakout" to occur in adoption.

"We believe we have now reached the point where category leaders will emerge in each major healthcare AI subsector and the usage will become more widespread; we have made one such investment in the clinical AI space in the last year," Mehlman said.

Heading into 2020, Mehlman said companies that cannot deliver high-six-figure, year-one ROI in the form of increased revenue or reduced cost will struggle, and companies that cannot crisply answer the question, "Who is the buyer and what is the budget?" will be challenged.

"If one applies these tests to some of the areas that have attracted the most healthcare VC investment (social determinants of health, blockchain and digital therapeutics, to name a few), the number of viable companies sharply drops off," he said.

Mehlman noted that while these sound like simple principles, the current environment of rapidly consolidating, budget-constrained hospitals, vertically integrating health plans, and big tech companies making inroads into healthcare has raised the bar on what is required for a healthcare startup to gain meaningful market traction.

View original post here:
Healthcare venture investment in 2020: Quantum computing gets a closer look - Healthcare IT News

IBM And University Of Tokyo Launch Quantum Computing Initiative For Japan – E3zine.com

IBM and the University of Tokyo announced an agreement to partner to advance quantum computing and make it practical for the benefit of industry, science and society.

IBM and the University of Tokyo will form the Japan IBM Quantum Partnership, a broad national partnership framework in which other universities, industry, and government can engage. The partnership will have three tracks of engagement: one focused on the development of quantum applications with industry; another on quantum computing system technology development; and the third focused on advancing the state of quantum science and education.

Under the agreement, an IBM Q System One, owned and operated by IBM, will be installed in an IBM facility in Japan. It will be the first installation of its kind in the region and only the third in the world, following the United States and Germany. The Q System One will be used to advance research in quantum algorithms, applications and software, with the goal of developing the first practical applications of quantum computing.

IBM and the University of Tokyo will also create a first-of-a-kind quantum system technology center for the development of hardware components and technologies that will be used in next-generation quantum computers. The center will include a laboratory facility to develop and test novel hardware components for quantum computing, including advanced cryogenic and microwave test capabilities.

IBM and the University of Tokyo will also directly collaborate on foundational research topics important to the advancement of quantum computing, and establish a collaboration space on the university campus to engage students, faculty, and industry researchers with seminars, workshops, and events.

Developed by researchers and engineers from IBM Research and Systems, the IBM Q System One is optimized for the quality, stability, reliability, and reproducibility of multi-qubit operations. IBM established the IBM Q Network, a community of Fortune 500 companies, startups, academic institutions and research labs working with IBM to advance quantum computing and explore practical applications for business and science.

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, improvements in the optimization of supply chains, and new ways to model financial data to better manage and reduce risk.

The University of Tokyo will lead the Japan IBM Quantum Partnership and bring academic excellence from universities and prominent research associations together with large-scale industry, small and medium enterprises, startups as well as industrial associations from diverse market sectors. A high priority will be placed on building quantum programming as well as application and technology development skills and expertise.

Follow this link:
IBM And University Of Tokyo Launch Quantum Computing Initiative For Japan - E3zine.com

Quantum networking projected to be $5.5 billion market in 2025 – TechRepublic

Several companies are working to advance the technology, according to a new report.

The market for quantum networking is projected to reach $5.5 billion by 2025, according to a new report from Inside Quantum Technology (IQT).

While all computing systems rely on the ability to store and manipulate information in individual bits, quantum computers "leverage quantum mechanical phenomena to manipulate information," and doing so requires the use of quantum bits, or qubits, according to IBM.
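
To make that concrete, here is a minimal numpy sketch (an illustration, not IBM's code) of what a qubit is mathematically: a two-component complex vector whose squared amplitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

# A classical bit is either 0 or 1; a qubit is a unit vector a|0> + b|1>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared amplitudes (Born rule).
print(np.abs(psi) ** 2)  # [0.5 0.5] -- equal chance of reading 0 or 1
```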

SEE: Quantum computing: An insider's guide (TechRepublic)

Quantum computing is seen as the panacea for solving the problems computers are not equipped to handle now.

"For problems above a certain size and complexity, we don't have enough computational power on earth to tackle them,'' IBM said. This requires a new kind of computing, and this is where quantum comes in.

IQT says that quantum networking revenue comes primarily from quantum key distribution (QKD), quantum cloud computing, and quantum sensor networks. Eventually, these strands will merge into a Quantum Internet, the report said.

Cloud access to quantum computers is core to the business models of many leading quantum computer companies -- such as IBM, Microsoft and Rigetti -- as well as several leading academic institutions, according to the report.

Microsoft, for instance, designed a special programming language for quantum computers, called Q#, and released a Quantum Development Kit to help programmers create new applications, according to CBInsights.

One of Google's quantum computing projects involves working with NASA to apply the tech's optimization abilities to space travel.

The Quantum Internet network will have the same "geographical breadth of coverage as today's internet," the IQT report stated.

It will provide a powerful platform for communications among quantum computers and other quantum devices, the report said.

It will also enable a quantum version of the Internet of Things. "Finally, quantum networks can be the most secure networks ever built -- completely invulnerable if constructed properly," the report said.

The report, "Quantum Networks: A Ten-Year Forecast and Opportunity Analysis," forecasts demand for quantum network equipment, software and services in both volume and value terms.

"The time has come when the rapidly developing quantum technology industry needs to quantify the opportunities coming out of quantum networking," said Lawrence Gasman, president of Inside Quantum Technology, in a statement.

Quantum Key Distribution (QKD) adds unbreakable distribution of encryption keys to public-key encryption, making it virtually invulnerable, according to the report.
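
The best-known QKD scheme is BB84; the toy simulation below is a sketch of its key-sifting step under idealized, noise-free assumptions (real systems add error correction and eavesdropper detection on top).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 32  # number of raw qubits Alice sends

# Alice encodes random bits in randomly chosen bases
# (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# Bob measures each qubit in his own random basis; when the bases match
# he recovers Alice's bit exactly, otherwise his outcome is random.
bob_bases = rng.integers(0, 2, n)
match = alice_bases == bob_bases
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# Both publicly compare bases (never bits) and keep matching positions.
# An eavesdropper measuring in the wrong basis would corrupt some of these
# bits and be caught when a sample of the sifted key is compared.
sifted_key = bob_bits[match]
print("sifted key:", "".join(str(b) for b in sifted_key))
```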

QKD is the first significant revenue source to come from the emerging Quantum Internet and will create almost $150 million in revenue in 2020, the report said.

QKD's early success is due to the fact that its potential users -- big financial and government organizations -- have an immediate need for 100% secure encryption, the IQT report stated.

By 2025, IQT projects that revenue from "quantum clouds" will exceed $2 billion.

Although some large research and government organizations are buying quantum computers for on-premise use, the high cost of the machines coupled with the immaturity of the technology means that the majority of quantum users are accessing quantum through clouds, the report explained.

Quantum sensor networks promise enhanced navigation and positioning and more sensitive medical imaging modalities, among other use cases, the report said.

"This is a very diverse area in terms of both the range of applications and the maturity of the technology."

However, by 2025 revenue from quantum sensors is expected to reach about $1.2 billion.


See original here:
Quantum networking projected to be $5.5 billion market in 2025 - TechRepublic

Toshiba says it created an algorithm that beats quantum computers using standard hardware – TechSpot

Something to look forward to: Some of the biggest problems that need solving in the enterprise world require sifting through vast amounts of data and finding the best possible solution given a number of factors and requirements, some of which are at times unknown. For years, quantum computing has been touted as the most promising jump in computational speed for certain kinds of problems, but Toshiba says revisiting classical algorithms helped it develop a new one that can leverage existing silicon-based hardware to get a faster result.

Toshiba's announcement this week claims a new algorithm it's been perfecting for years is capable of analyzing market data much more quickly and efficiently than the algorithms used in some of the world's fastest supercomputers.

The algorithm is called the "Simulated Bifurcation Algorithm," and is supposedly good enough to be used in finding accurate approximate solutions for large-scale combinatorial optimization problems. In simpler terms, it can come up with a solution out of many possible ones for a particularly complex problem.

According to its inventor, Hayato Goto, it draws inspiration from the way quantum computers can efficiently comb through many possibilities. Work on the SBA started in 2015, when Goto noticed that adding new inputs to a complex system with 100,000 variables made it possible to solve the system in a matter of seconds with a relatively small computational cost.

This essentially means that Toshiba's new algorithm could be used on standard desktop computers. To give you an idea of how important this development is, Toshiba demonstrated last year that the SBA can get highly accurate solutions for an optimization problem with 2,000 connected variables in 50 microseconds, or 10 times faster than laser-based quantum computers.
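
Toshiba has not released its production implementation, but the underlying dynamics are described in Goto's published work. The following numpy sketch of the "ballistic" simulated-bifurcation variant on a small random Ising instance is an illustrative reconstruction; the coupling scale c0 and the test problem are assumptions chosen for the example.

```python
import numpy as np

def ballistic_sb(J, steps=2000, dt=0.1, a0=1.0, c0=None, seed=0):
    """Minimal ballistic simulated-bifurcation sketch for an Ising problem.

    J is a symmetric coupling matrix with zero diagonal; the goal is to
    find spins s in {-1, +1} that minimize -0.5 * s^T J s.
    """
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    if c0 is None:
        # Heuristic coupling scale; a tunable assumption, not Toshiba's value.
        c0 = 0.5 / (np.sqrt(n) * np.std(J[np.triu_indices(n, 1)]) + 1e-12)
    x = rng.uniform(-0.1, 0.1, n)  # soft-spin positions
    y = np.zeros(n)                # conjugate momenta
    for k in range(steps):
        a_t = a0 * k / steps       # pump amplitude ramped from 0 to a0
        y += (-(a0 - a_t) * x + c0 * (J @ x)) * dt
        x += a0 * y * dt
        # Inelastic walls: clamp positions to [-1, 1] and stop motion there.
        hit = np.abs(x) > 1.0
        x[hit] = np.sign(x[hit])
        y[hit] = 0.0
    return np.where(x >= 0.0, 1.0, -1.0)

# Tiny random spin-glass instance, purely for demonstration.
rng = np.random.default_rng(1)
n = 32
J = rng.choice([-1.0, 1.0], size=(n, n))
J = np.triu(J, 1)
J = J + J.T  # symmetric with zero diagonal
s = ballistic_sb(J)
print("Ising energy:", -0.5 * s @ J @ s)
```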

The SBA is also highly scalable, meaning it can be made to work on clusters of CPUs or FPGAs, all thanks to the contributions of Kosuke Tatsumura, another of Toshiba's senior researchers, who specializes in semiconductors.

Companies like Microsoft, Google, IBM, and many others are racing to be the first with a truly viable quantum commercial system, but so far their approaches have produced limited results that live inside their labs.

Meanwhile, scientists like Goto and Tatsumura are going back to the roots by exploring ways to improve on classical algorithms. Toshiba hopes to use the SBA to optimize financial operations like currency trading and rapid-fire portfolio adjustments, but it could very well be used to calculate efficient routes for delivery services and for molecular-precision drug development.

Read the original post:
Toshiba says it created an algorithm that beats quantum computers using standard hardware - TechSpot

Deltec Bank, Bahamas says the Impact of Quantum Computing in Banking will be huge – Press Release – Digital Journal

Deltec Bank: "Quantum computing can help institutions speed up their transactional activities while making sense of assets that typically seem incongruent."

Technologies based on quantum theory are coming to the financial sector. It is not a matter of if, but when, banks will begin using this option to evolve current business practices.

Companies like JPMorgan Chase and Barclays have over two years of experience working with IBM's quantum computing technology. The goal of this work is to optimize portfolios for investors, but several additional benefits could come into the industry as banks learn more about it.

Benefits of Quantum Computing in Banking

Quantum computing stayed in the world of academia until recent years when technology developers opened trial opportunities. The banking sector was one of the first to start experimenting with what might be possible.

Their efforts have identified four positive outcomes that can arise from the faster processing power that quantum computing offers.

1. Big Data Analytics

The high-powered processing capabilities of this technology make it possible for banks to optimize their big data. According to Deltec Bank, "Quantum computing can help institutions speed up their transactional activities while making sense of assets that typically seem incongruent."

2. Portfolio Analysis

Quantum computing permits high-frequency trading activities because it can appraise assets and analyze portfolios to determine individual needs. The creation of algorithms built on the full capabilities of this technology can mine more information to find new pathways to analysis and implementation.

3. Customer Service Improvements

This technology gives banks more access to artificial intelligence and machine learning opportunities. The data collected by institutions can improve customer service by focusing on consumer engagement, risk analysis, and product development. There will be more information available to develop customized financial products that meet individual needs while staying connected to core utilities.

4. Improved Security

The results of quantum computing in banking will create the next generation of encryption and safeguarding efforts to protect data. Robust measures that include encrypted individual identification keys and instant detection of anomalies can work to remove fraudulent transactions.

Privately Funded Research is Changing the Banking Industry

Although some firms are working with IBM and other major tech developers to bring quantum computing to the banking sector, it is private money that funds most of the innovations.

An example of this effort comes from Rigetti Computing. This company offers a product called Forest, which is a downloadable SDK that is useful in the writing and testing of programs using quantum technologies.

1QB Information Technologies in Canada has an SDK that offers the necessary tools to develop and test applications on quantum computers.

How the world approaches banking and finance could be very different in the future because of quantum computing. This technology might not solve every problem the industry faces today, but it can certainly put a significant dent in those issues.

Disclaimer: The author of this text, Robin Trehan, has an undergraduate degree in economics, a master's in international business and finance, and an MBA in electronic business. Trehan is Senior VP at Deltec International, http://www.deltecbank.com. The views, thoughts, and opinions expressed in this text are solely the views of the author, and do not necessarily reflect the views of Deltec International Group, its subsidiaries and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd. and Long Cay Captive Management.

Media Contact
Company Name: Deltec International Group
Contact Person: Media Manager
Email: Send Email
Phone: 242 302 4100
Country: Bahamas
Website: https://www.deltecbank.com/

View original post here:
Deltec Bank, Bahamas says the Impact of Quantum Computing in Banking will be huge - Press Release - Digital Journal

University of Sheffield launches Quantum centre to develop the technologies of tomorrow – Quantaneo, the Quantum Computing Source

A new research centre with the potential to revolutionise computing, communication, sensing and imaging technologies is set to be launched by the University of Sheffield this week (22 January 2020).

The Sheffield Quantum Centre, which will be officially opened by Lord Jim O'Neill, Chair of Chatham House and University of Sheffield alumnus, is bringing together more than 70 of the University's leading scientists and engineers to develop new quantum technologies.

Quantum technologies are a broad range of new materials, devices and information technology protocols in physics and engineering. They promise unprecedented capabilities and performance by exploiting phenomena that cannot be explained by classical physics.

Quantum technologies could lead to the development of more secure communications technologies and computers that can solve problems far beyond the capabilities of existing computers.

Research into quantum technologies is a high priority for the UK and many countries around the world. The UK government has invested heavily in quantum research as part of a national programme and has committed £1 billion in funding over 10 years.

Led by the University's Department of Physics and Astronomy, Department of Electronic and Electrical Engineering and Department of Computer Science, the Sheffield Quantum Centre will join a group of northern universities that are playing a significant role in the development of quantum technologies.

The University of Sheffield has a strong presence in quantum research with world leading capabilities in crystal growth, nanometre scale device fabrication and device physics research. A spin-out company has already been formed to help commercialise research, with another in preparation.

Professor Maurice Skolnick, Director of the Sheffield Quantum Centre, said: "The University of Sheffield already has very considerable strengths in the highly topical area of quantum science and technology. I have strong expectation that the newly formed centre will bring together these diverse strengths to maximise their impact, both internally and more widely across UK universities and funding bodies."

During the opening ceremony, the Sheffield Quantum Centre will also launch its new £2.1 million Quantum Technology Capital equipment.

Funded by the Engineering and Physical Sciences Research Council (EPSRC), the equipment is a molecular beam epitaxy cluster tool designed to grow very high quality wafers of semiconductor materials -- the types of materials that have numerous everyday applications, such as in mobile phones and the lasers that drive the internet.

The semiconductor materials also have many new quantum applications which researchers are focusing on developing.

Professor Jon Heffernan from the University's Department of Electronic and Electrical Engineering, added: "The University of Sheffield has a 40-year history of pioneering developments in semiconductor science and technology and is host to the National Epitaxy Facility. With the addition of this new quantum technologies equipment I am confident our new research centre will lead to many new and exciting technological opportunities that can exploit the strange but powerful concepts from quantum science."

More:
University of Sheffield launches Quantum centre to develop the technologies of tomorrow - Quantaneo, the Quantum Computing Source

LIVE FROM DAVOS: Henry Blodget leads panel on the next decade of tech – Business Insider Nordic

The past decade saw technological advancements that transformed how we work, live, and learn. The next one will bring even greater change as quantum computing, cloud computing, 5G, and artificial intelligence mature and proliferate. These changes will happen rapidly, and the work to manage their impact will need to keep pace.

This session at the World Economic Forum, in Davos, Switzerland, brought together industry experts to discuss how these technologies will shape the next decade, followed by a panel discussion about the challenges and benefits this era will bring and if the world can control the technology it creates.

Henry Blodget, CEO, cofounder, and editorial director, Insider Inc.

This interview is part of a partnership between Business Insider and Microsoft at the 2020 World Economic Forum. Business Insider editors independently decided on the topics broached and questions asked.

Below, find each of the panelists' most memorable contributions:

Julie Love, senior director of quantum business development, Microsoft.

Julie Love believes global problems such as climate change can potentially be solved far more quickly and easily through developments in quantum computing.

She said: "We [Microsoft] think about problems that we're facing: problems that are caused by the destruction of the environment; by climate change, and [that require] optimization of our natural resources, [such as] global food production."

"It's quantum computing that really a lot of us scientists and technologists are looking for to solve these problems. We can have the promise of solving them exponentially faster, which is incredibly profound. And that the reason is this: [quantum] technology speaks the language of nature.

"By computing the way that nature computes, there's so much information contained in these atoms and molecules. Nature doesn't think about a chemical reaction; nature doesn't have to do some complex computation. It's inherent in the material itself.

Love claimed that, if harnessed in this way, quantum computing could allow scientists to design a compound that could remove carbon from the air. She added that researchers will need to be "really pragmatic and practical about how we take this from science fiction into the here-and-now."

Justine Cassell, a professor specializing in AI and linguistics.

"I believe the future of AI is actually interdependence, collaboration, and cooperation between people and systems, both at the macro [and micro] levels," said Cassell, who is also a faculty member of the Human-Computer Interaction Institute at Carnegie Mellon University.

"At the macro-level, [look], for example, at robots on the factory floor," she said. "Today, there's been a lot of fear about how autonomous they actually are. First of all, they're often dangerous. They're so autonomous, you have to get out of their way. And it would be nice if they were more interdependent if we could be there at the same time as they are. But also, there is no factory floor where any person is autonomous.

In Cassell's view, AI systems could also end up being built collaboratively with experts from non-tech domains, such as psychologists.

"Today, tools [for building AI systems] are mostly machine learning tools," she noted. "And they are, as you've heard a million times, black boxes. You give [the AI system] lots of examples. You say: 'This is somebody being polite. That is somebody being impolite. Learn about that.' But when they build a system that's polite, you don't know why they did that.

"What I'd like to see is systems that allow us to have these bottom-up, black-box approaches from machine learning, but also have, for example, psychologists in there, saying 'that's not actually really polite,' or 'it's polite in the way that you don't ever want to hear.'"

Microsoft president Brad Smith.

"One thing I constantly wish is that there was a more standardized measurement for everybody to report how much they're spending per employee on employee training because that really doesn't exist, when you think about it," said Smith, Microsoft's president and chief legal officer since 2015.

"I think, anecdotally, one can get a pretty strong sense that if you go back to the 1980s and 1990s employers invested a huge amount in employee training around technology. It was teaching you how to use MS-DOS, or Windows, or how to use Word or Excel interestingly, things that employers don't really feel obliged to teach employees today.

"Learning doesn't stop when you leave school. We're going to have to work a little bit harder. And that's true for everyone."

He added that this creates a further requirement: to make sure the skills people do pick up as they navigate life are easily recognizable by other employers.

"Ultimately, there's a wide variety of post-secondary credentials. The key is to have credentials that employers recognize as being valuable. It's why LinkedIn and others are so focused on new credentialing systems. Now, the good news is that should make things cheaper. It all should be more accessible.

"But I do think that to go back to where I started employers are going to have to invest more [in employee training]. And we're going to have to find some ways to do it in a manner that perhaps is a little more standardized."

Nokia president and CEO, Rajeev Suri.

Suri said 5G will be able to help develop industries that go far beyond entertainment and telecoms, and will impact physical or manual industries such as manufacturing.

"The thing about 5G is that it's built for machine-type communications. When we received the whole idea of 5G, it was 'how do we get not just human beings to interact with each other, but also large machines," he said.

"So we think that there is a large economic boost possible from 5G and 5G-enabled technologies because it would underpin many of these other technologies, especially in the physical industries."

Suri cited manufacturing, healthcare, and agriculture as just some of the industries 5G could help become far more productive within a decade.

He added: "Yes, we'll get movies and entertainment faster, but it is about a lot of physical industries that didn't quite digitize yet. Especially in the physical industries, we [Nokia] think that the [productivity] gains could be as much as 35% starting in the year 2028 starting with the US first, and then going out into other geographies, like India, China, the European Union, and so on.

More:
LIVE FROM DAVOS: Henry Blodget leads panel on the next decade of tech - Business Insider Nordic

The Need For Computing Power In 2020 And Beyond – Forbes

Having led a Bitcoin mining firm for over two years, I've come to realize the importance of computing power. Computing power connects the real (chip energy) and virtual (algorithm) dimensions of our world. Under the condition that the ownership of the assets remains unchanged, computing power is an intangible asset that can be used and circulated. It is a commercialized technical service and a consumption investment. This is a remarkable innovation for mankind, and it is an upgrade for the digital economy.

2020 marks the birth year of the computing power infrastructure. Our world is at the beginning of a new economic and technological cycle. We have entered the digital economic civilization. This wave of technology is driven by the combination of AI, 5G, quantum computing, big data and blockchain. People have started realizing that in the age of the digital economy, computing power is the most important and innovative form of productivity.

Computing power is not just technical but also economic innovation. It's a small breakthrough at the fundamental level with impact that will be immeasurable. And people have finally seen the value of the bottom layer through the 10 years of crypto mining evolution.

However, there are two major problems faced by the entire technological landscape: First is insufficient computing power. Second is the dominance of centralized computing power, which creates a monopoly and gives rise to manipulation problems and poor data security.

How does more computing power help?

Artificial Intelligence

Mining Bitcoin has allowed my company to build the foundation of computing infrastructure, so we are planning to eventually expand into AI computing. This experience has further shown me the importance of working toward developing more computing power if tech leaders want to continue creating innovative technologies.

Consider this: For an AI system to recognize someone's voice or identify an animal or a human being, it first needs to process millions of audio, video or image samples. It then learns to differentiate between two different pitches of voices or to differentiate faces based on various facial features. To reach that level of precision, an AI model needs to be fed a tremendous amount of data.

It is only possible to do that if we have powerful computers that can process millions of data points every single second. The more the computing power, the faster we can feed the data to train the AI system, resulting in a shorter span for the AI to reach near-perfection, i.e., human-level intelligence.

The computing power required by AI has been doubling roughly every three and a half months since 2012. The need to build better AI has made it mandatory to keep up with this requirement for more computing power. Tech companies are leaving no stone unturned to rise to this demand.
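
As a back-of-the-envelope check on what that doubling rate implies (the 2012-2020 window is an assumption chosen for the example):

```python
# Implied growth if AI training compute doubles every 3.5 months.
doubling_months = 3.5
per_year = 2 ** (12 / doubling_months)  # ~10.8x growth per year
years = 2020 - 2012                     # assumed window for the example
total = per_year ** years
print(f"~{per_year:.1f}x per year, ~{total:.1e}x over {years} years")
```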

It is almost as if computing power is now an asset into which investors and organizations are pouring millions of dollars. They are constantly testing and modifying their best chips to produce more productive versions of them. The results of this investment are regularly seen in the form of advanced, more compact chips capable of producing higher computing power while consuming less energy.

For new technological breakthroughs, computing power itself has become the new "production material" and "energy." Computing power is the fuel of our technologically advanced society. I've observed it is driving the development in various technological landscapes, such as AI, graphics computing, 5G and cryptocurrency.

Cryptocurrency Mining

Similar to AI, the decentralized digital economy sector also relies on high computing power. Transactions of cryptocurrencies, such as Bitcoin, are validated through a decentralized process called "mining." Mining requires miners across the world to deploy powerful computers to find the solution or the hash to a cryptographic puzzle that proves the legitimacy of each transaction requested on the blockchain.
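
The following toy Python sketch (a deliberately simplified stand-in for Bitcoin's real block and difficulty encoding) shows the core idea: keep trying nonces until the double-SHA256 digest falls below a difficulty target.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Try nonces until double-SHA256(header + nonce) is below the target."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        payload = header + nonce.to_bytes(8, "big")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# ~2**16 (about 65,000) hash attempts on average at 16 difficulty bits.
print("found nonce:", mine(b"example block header", difficulty_bits=16))
```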

The bad news, however, is that the reward to mine Bitcoin is halved roughly every four years. This means that following May 20, 2020 -- the next halving date -- miners who mine Bitcoin will receive half the reward per block compared to what they do now. Two primary factors that compensate for the halving of rewards are an increase in the price of Bitcoin and advanced chips with high computing power.
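
The halving schedule itself is simple arithmetic: the block subsidy drops by half every 210,000 blocks, roughly four years at ten minutes per block.

```python
# Bitcoin's block subsidy halves every 210,000 blocks (~4 years).
def subsidy(height: int) -> float:
    return 50.0 / (2 ** (height // 210_000))

# Genesis era, first halving, and the May 2020 halving at block 630,000.
print(subsidy(0), subsidy(210_000), subsidy(630_000))  # 50.0 25.0 6.25
```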

Miners run not one but multiple high-end graphics processing units to mine Bitcoin, which is an electricity-intensive process. The only way to keep mining profitably is to invest in better chips that produce more computing power with lower electricity consumption. This helps miners process more hashes per second (i.e., the hashrate) to get to the right hash and attain the mining reward.

So far, mining chip producers have delivered the promise of more efficient chips leading to an increase in the mining hashrate from 50 exahashes per second to 90 exahashes per second in the past six months. Per the reports, the efficiency of the latest chips combined with increased Bitcoin prices has helped keep the mining business highly profitable since the previous halving.

High computing power has become an addiction we humans are not getting rid of in the foreseeable future. With our growing fondness for faster computer applications and more humanlike AI, it's likely that we will demand faster and more refined versions of the systems we use today. A viable way to fulfill this demand would be to produce more computing power.

The two biggest challenges that lie in our way are producing clean electricity at lower costs and developing chips that have a lower electricity-consumption-to-computing-power-production ratio. The core of industrial production competition today lies in the cost of producing electricity. Low energy prices enable us to provide stable services. For example, there is an abundance of hydro-electric power in southwest China, and cooperative data centers are located there so they can harness the hydropower.

If we could make low-cost, clean energy available everywhere, we'd cut the cost of producing computing power. When this energy is used by power-efficient computing chips, the total cost drops even more and high computing power becomes highly affordable.
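
A small illustrative calculation shows how electricity price and chip efficiency combine into the cost of computing power (both figures below are assumptions for the example, not numbers from the article):

```python
# Illustrative cost of hashing power; both inputs are assumptions.
joules_per_terahash = 50   # assumed ASIC efficiency: 50 J/TH
price_per_kwh = 0.04       # assumed cheap hydropower: $0.04/kWh
seconds_per_day = 86_400

kwh_per_day = joules_per_terahash * seconds_per_day / 3.6e6  # J -> kWh
cost_per_day = kwh_per_day * price_per_kwh
print(f"{kwh_per_day:.2f} kWh/day per TH/s -> ${cost_per_day:.3f}/day per TH/s")
```

Halving either input halves the daily cost, which is why the article frames cheap clean energy and more efficient chips as the two levers that matter.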

Read more from the original source:
The Need For Computing Power In 2020 And Beyond - Forbes

5 Emerging Technologies That Will Shape this Decade – San Diego Entertainer Magazine

By John Breaux | January 22, 2020

Some say that we are in the midst of a new technological revolution, with emerging technologies taking shape to transform the world we live in. As we step into a new decade, expect to see a handful of amazing advancements in technology that will dramatically shape our society at large.

We've been told for years that self-driving cars are the future, but this decade will bring us the greatest advancements in this field as of yet. Companies have been researching and testing autonomous fleets of cars for years now, and some are finally gearing up to deploy them in the real world. Tesla has already released a self-driving feature in its popular electric vehicles, while Google-owned Waymo has completed a trial of autonomous taxi systems in California where it successfully transported more than 6,000 people.

This radically powerful form of computing will continue to reach more practical applications throughout the decade. Quantum computers are capable of performing exponentially more powerful calculations when compared to traditional computing, but the size and power required to run them makes them difficult to use in a more practical sense. Further research in quantum computing will allow greater application for solving real-world problems.

Augmenting our bodies with technology will become more common as wearable devices will allow us to improve everything from hearing to sight. Examples include devices and implants that will be able to enhance sensory capabilities, improve health, and contribute to a heightened quality of life and functional performance.

The advent of 5G will perhaps be one of the most impactful technologies for many, starting this year and proceeding onwards. 5G networks will have the capability of connecting us to the digital world in ways we've never had before, affording us blazing fast speeds of nearly 10 Gb/s. The speed of 5G will allow for seamless control of vast autonomous car fleets, precise robotic surgery, or streaming of 4K video with no buffering.

Drones are already a pivotal piece of technology in areas including transportation, surveillance, and logistics. Swarm robotics will be a new multi-robot system inspired by nature that will have major potential in completing tasks with unparalleled efficiency. Applications could include providing post-disaster relief, geological surveying, and even farming. Swarm robotics will be able to accomplish tasks through cooperative behavior while adapting to situations in ways that would not be possible with a single drone.

Follow this link:
5 Emerging Technologies That Will Shape this Decade - San Diego Entertainer Magazine

Meet the people who think soaking in a frozen Minneapolis lake is the secret to good health – Minneapolis Star Tribune

Ponce de León's search for the fountain of youth in Florida is just a legend.

But about 1,500 miles to the north, in the icy waters of Cedar Lake in Minneapolis, dozens of people think they've found the next best thing.

On a recent Sunday around 9:30 a.m., a diverse group of about 20 people dressed in swimsuits trekked to a spot near the shore on the west side of the lake and immersed themselves in an 8-by-12-foot rectangular hole cut in the ice. Later in the day, another group of people gathered to do the same thing.

This isn't a once-a-year, get-in, get-out, New Year's Day plunge for Instagram bragging rights.

This is something that happens every Sunday throughout the winter.

Some people come several times a week, and stay for a good, long soak of five, 10, 15 minutes or more. Except for the knit hats, they look like they could be relaxing in a hot tub as they stand in water that ranges from waist- to neck-deep.

Called cold therapy or cold thermogenesis, ice-water bathing is a practice that biohackers and assorted others believe makes them healthier.

The Twin Cities Cold Thermogenesis Facebook group, which was created in 2016, claims the frigid dips do everything from increasing testosterone in men to boosting brown adipose tissue. (The so-called "brown fat," or "good fat," may be helpful in combating obesity because it burns calories to create heat.)

Cold-water immersion also strengthens the immune system, according to Svetlana Vold, a part-time firefighter and ultramarathon winter bike racer from St. Louis Park, who organizes the Sunday morning cold-immersion session.

Vold and others say chilling out in the water combats inflammation, helps them sleep better and improves their focus and endurance. Some said they're inspired by Wim "The Iceman" Hof, a Dutchman famous for his breathing and cold exposure technique called the Wim Hof Method.

The Cedar Lake group would probably meet the approval of David Sinclair, a Harvard genetics professor and longevity expert who thinks that cold exposure may help slow the aging process.

Maria O'Connell, the organizer of the afternoon session, has been immersing herself in an ice-filled horse trough in her backyard since 2011. "Initially it's a little uncomfortable," she said. "You end up getting better the more you do it."

But many say the frigid dunks are a mood-altering, even pleasurable experience.

"It hurts so damn good," said Stephen McLaughlin, a 61-year-old Minneapolis resident. "You are just completely present."

"It makes me happy. I think it's adrenaline," said Allison Kuznia, 42, of Minneapolis.

"It's kind of a treat to go out and get really cold," said Nick White, 46, of Minneapolis. "It gives you a feeling of euphoria."

Go here to read the rest:
Meet the people who think soaking in a frozen Minneapolis lake is the secret to good health - Minneapolis Star Tribune