Daily Archives: February 7, 2022

Global Food Waste Management Market To Progress Geometrically With $53.10 Billion Earnings By 2028 | Exclusive Report by Esticast Research The Grundy…

Posted: February 7, 2022 at 6:22 am


Esticast Research has published a new report titled "Food Waste Management Market - By Waste Type (Cereals, Dairy Products, Fruits & Vegetables, Meat, Fish & Seafood, Oilseeds & Pulses, Processed Foods, and Coffee Grounds & Tea), By Process (Aerobic Digestion, Incineration/Combustion, and Anaerobic Digestion), and By Application (Animal Feed, Fertilizers, and Biofuel & Power Generation): Global Industry Perspective, Comprehensive Analysis and Forecast, 2021-2028." According to the report, the food waste management market is set to progress geometrically, reaching earnings of USD 53.10 billion by 2028.

The top leading market players covered in this report are: Remondis SE & Co. KG, FCC Environment Ltd., Advanced Disposable Services Inc., Veolia Environnement S.A., SUEZ, Covanta Holding Corporation, Waste Management Inc., Stericycle Inc., Clean Harbors Inc., and Waste Connections Inc.

Download a free PDF sample report with complete TOC, figures, and graphs (with COVID-19 impact analysis): https://www.esticastresearch.com/request-for-sample/?utm_source=PR&utm_medium=Food-Waste-Management-Market

The Food Waste Management Market analysis report presents the research methodologies and market tactics behind the study. Understanding the future market scenario and product sales is important for fueling business growth and is the basis for forecasting the overall market. The report also covers marketing strategy, dividing the market into segments and target customers. Customer demands are depicted to make industries aware of them and to increase the productivity of products and services. The market is segmented into categories such as behavioral and demographic segmentation for the key regions: North America, Europe, the Middle East, Africa, Asia Pacific, and Latin America.

Segments:

The primary market segments and their sub-segments covered in this study provide information on the overall business climate. The categories in this research are built by analyzing supply and demand scenarios, providing a thorough view of the market. The segment study gives a detailed view of the fastest-growing market sector as well as the factors influencing the fast or slow growth of the other segments. This section includes a comprehensive market share and revenue analysis.

Geography

Esticast Research's most recent market report is divided into North America, Europe, Asia Pacific, Latin America, and the Middle East & Africa. The report includes thorough financials for every region based on its segments, along with detailed revenue and market share analysis for the major countries within each region.

Get an exclusive discount on this report: https://www.esticastresearch.com/inquiry-before-buying/?utm_source=PR&utm_medium=Food-Waste-Management-Market

To take meaningful actions that lead a business down a successful path, it is important to obtain all market-growth data, and the Food Waste Management Market survey report provides it. The report covers the current market scenario and a forecast for the period 2022-2028, and it shows how trends will affect overall business development and growth. It also identifies profitable business opportunities and examines the framework of the market within the economy. Trade statistics, key players and their adopted strategies, market share, leading suppliers, and trend analysis are among the key factors provided to help business participants stay profitable and identify the best investment options.

Exact details regarding market performance are also provided in the Food Waste Management Market analysis, along with a complete overview of the competitive landscape, pricing structure, trend convergence, digital transformation, sales effectiveness, latest innovations, and customer buying behavior. All these factors contribute to enhancing business growth and understanding market performance, and the report makes it easy for central participants to track business performance regularly. It discusses successful market strategies implemented by company players, such as mergers, acquisitions, collaborations, and new product releases, and it examines perspectives ranging from major players and manufacturers through to end buyers. Annual revenue, industry growth factors, and market tactics are among the significant factors covered. The report also includes the latest data on the effects of COVID-19 on the world economy, helping industry players survive in the competitive market.

What Does This Report Provide?

This report provides a detailed understanding of the global Food Waste Management Market from qualitative and quantitative perspectives during the forecast period. The major market drivers, challenges, and opportunities for the global Food Waste Management Market have been covered in the report. This report further includes the market shares of the leading companies operating in the global market, along with their production capacities and the growth strategies adopted by them.

Inquiry before buying: https://www.esticastresearch.com/get-discount/?utm_source=PR&utm_medium=Food-Waste-Management-Market

Objectives of this Report:

Key Questions Answered in the Report:

Customization of the market analysis:

You can buy the complete report here: https://www.esticastresearch.com/cart-share/nFzkB/?utm_source=PR&utm_medium=Food-Waste-Management-Market

The report includes a competitive landscape:

- Major trends and growth projections by region and country
- Key winning strategies followed by the competitors
- Who are the key competitors in this industry?
- What is the potential of this industry over the forecast period?
- What factors are propelling demand in the Food Waste Management Market?
- What opportunities could significantly accelerate market growth?
- Which regional and country-wise regulations could either hamper or boost demand for the Food Waste Management Market?
- How has COVID-19 impacted the growth of the market?
- Has supply chain disruption caused changes in the entire value chain?

About Esticast Research:

Esticast Research is a research firm providing research reports on various industries with a unique combination of authenticity, extensive research, and infallibility. We provide syndicated reports, customization services, and consulting services to help businesses across the world achieve their goals and overcome complex challenges. We specialize in providing a 360-degree view of markets to help clients identify new opportunities and develop business strategies for the future, backed by data and statistics on changing market dynamics. Esticast Research & Consulting has expert analysts and consultants able to work in collaboration with clients to meet their business needs and create opportunities to thrive in a competitive world. We cover hundreds of industry segments, with comprehensive analysis of industries ranging from healthcare to consumer goods and from ICT to BFSI. Our research reports, offering market forecasts, market entry strategies, and customer intelligence, help clients across the world harness maximum value from their investments and realize their optimum potential.

Contact:

Mr. Ashish Gedamkar
Esticast Research & Consulting LLP.
Office No. 407, Navale Icon IT Park, Narhe, Pune 411041
USA: +1-213-262-0704
APAC: +91-959-503-5024
Email: sales@esticastresearch.com
Website: https://www.esticastresearch.com/


Zenith: The Last City Review In Progress Part 2 – MMORPG.com

Posted: at 6:22 am

Zenith: The Last City, a VR MMORPG, has been out for a week, and the team at RamenVR is keeping up so far. They continue to implement patches rapidly while communicating on the official Discord, and they released a brief launch post mortem/roadmap for 2022. But how are the servers now after a week? What issues stand out currently with the game? Is the game fun to play? How does it compare to other MMORPGs? Let's have a look.

The game's servers have stabilized to a point. Restarts, fixes, and maintenance are still happening daily. While playing on a Quest 2, Zenith has not crashed or randomly disconnected for me once in the last few days. On medium- to low-population servers, I have run into only a few noticeable issues.

Server lag bubbles have been the main problem so far. In combat, enemies can stop responding for a few seconds, and spells will not take effect. Then suddenly, everything catches up and takes effect all at once. Thankfully, while playing as an Essence Mage Tank, bubbles like this are usually not deadly for me. But I can imagine it being more of an issue for those playing as DPS or Heals.

I am curious to see whether the lag bubbles are present on PCVR. They may also contribute to the enemy AI being unresponsive at times at the start of a fight. When enemies do respond to my attacks, their tactics are somewhat simple: they cycle through visibly repeated attack rotations and use limited movements. Once you have the animations down, most combat is not a challenge unless you get overwhelmed by numbers.

Even with the limited AI, the game is fun if you like MMORPGs and the typical trappings of the genre. Quests are your standard fare and repetitive with a side of grind. It's the other players that make combat in the game shine, and I always enjoy jumping in to help with a public event. The fights become more dynamic as I toss spells to slow the enemies and tag a target as it runs by after the DPS. Fighting in VR adds its own unique challenges as I move to block attacks and react to unexpected enemies, and certain moments felt more exciting than most traditional flatscreen MMORPGs.

So far soloing has been quite viable, but I do find myself going back to grind out hunts and public events as I fall behind by a level or two. If you try to move into a new zone and are three levels below the enemies, it can be a rough experience. Whereas if I am participating in events and doing the group quests, the leveling seems to keep up with the main quest storyline so far.

If you're a player who likes high-end graphics, Zenith may not be the MMORPG for you. The visuals are very pretty at times, but the graphics are, well, basic in many ways. This is especially true when it comes to your character's appearance. The game's core systems are, for the most part, currently barebones. Everything needed for an MMORPG is present, such as forming groups and guilds, but more advanced systems such as in-game achievements, housing, guild features/leveling, and tutorials/tooltips for more nuanced gameplay elements are missing. Some kind of practice room in the city of Zenith itself might go a long way toward teaching players topics such as blocking attacks effectively and navigating the skill system.

The class skills in Zenith look rather robust. As players level, Godstones (skills) unlock, and there are three different Godstones to choose from for each of the gesture-based movement, spell, and ultimate slots. You can then loot enhancers of various rarities, which let players add major and minor traits as they gain experience with a Godstone. These traits include a variety of benefits, from reduced spell cost to increased damage. But there is no in-depth introduction to the skill system, how the random drops work, or how points are gained to be spent on upgrades. I am still exploring the system and look forward to diving into it more in the final review.
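Purely as an illustration of the Godstone/enhancer structure described above, here is a minimal data model sketch. Every name, slot label, and number in it is hypothetical, not taken from the game's actual data:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Trait:
    name: str      # e.g. "spell_cost" (hypothetical trait name)
    value: float   # e.g. -0.10 for "spell cost reduced by 10%"

@dataclass
class Godstone:
    name: str
    slot: str      # "movement", "spell", or "ultimate"
    major_traits: List[Trait] = field(default_factory=list)
    minor_traits: List[Trait] = field(default_factory=list)

    def apply_enhancer(self, trait: Trait, major: bool = False) -> None:
        # Looted enhancers add a major or minor trait to the Godstone.
        (self.major_traits if major else self.minor_traits).append(trait)

stone = Godstone("Flame Bolt", "spell")
stone.apply_enhancer(Trait("spell_cost", -0.10))
stone.apply_enhancer(Trait("damage", 0.05), major=True)
```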

As a fan of MMORPGs, I find Zenith fun and engaging, but it has issues. It is also a step forward in what can be done in VR for an MMO title. Players comparing it to existing flatscreen MMORPGs, however, may end up somewhat disappointed. While I am still working my way up to max level, I have personally already gotten my money's worth out of the title and am looking forward to exploring more of the game.


Quantum Computing and Cybersecurity: A Fusion that Cannot be Ignored – Analytics Insight

Posted: at 6:21 am

Companies must be aware of the fusion between quantum computing and cybersecurity

New technological innovations are transforming economies and enhancing living standards through increased productivity and reduced production costs. Alongside this technological evolution, hackers and cyber scammers are devising new methods to break into the systems of individuals and companies and steal the large amounts of data being generated with the help of data analytics and AI tools. Hence, cybersecurity has become an integral part of business strategy and a means to protect data from intruders; it enables professionals to protect information on devices and assess future risks. One of the key technologies in this space is quantum computing. To move beyond classical computing and deliver long-standing results, researchers and scientists are exploring quantum computing as a robust tool to enhance the effectiveness of cybersecurity platforms.

Even though there are various ways in which quantum computing can harm cybersecurity, it also has qualities that deliver exponential advantages for certain classes of problems, for example factoring very large numbers, with profound implications for cybersecurity.

One of the biggest worries cybersecurity analysts currently face is the emergence of new devices based on quantum physics that are superior to standard computers for certain tasks. These devices could enable cyber attackers to break secure cryptography methods. Classical digital ciphers rely on hard mathematical problems to convert data into encrypted messages for storage and transmission; if those problems become easy to solve, attackers can break the cryptography and steal confidential information.
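To make this concrete, here is a toy RSA example (with deliberately tiny, insecure numbers) showing why efficiently factoring the public modulus, which Shor's algorithm would let a quantum computer do, immediately breaks the cipher:

```python
# Toy RSA with tiny primes, purely illustrative (real keys use 2048+ bit moduli).
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)        # anyone can encrypt with the public key (n, e)

def crack(n, e, p_found, q_found):
    # An attacker who factors n -- exactly what Shor's algorithm would do
    # efficiently on a quantum computer -- recovers the private key.
    phi_attacker = (p_found - 1) * (q_found - 1)
    return pow(e, -1, phi_attacker)

recovered = pow(cipher, crack(n, e, 61, 53), n)   # attacker reads the message
```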

Cybersecurity experts believe that, eventually, quantum computers will pose threats to national security due to their ability to break modern cryptography systems and reveal encrypted messages and stored data. Besides, hackers can adopt advanced technologies such as machine learning to develop and distribute deadlier forms of malware.

Fraudsters and cybercriminals can also take advantage of the power of quantum computing to create novel approaches for breaching cybersecurity firewalls. Even though such attacks can be computationally demanding on classical computers, with quantum computing technology hackers could craft sophisticated attacks on larger networks of devices.

Quantum computing is not just about the doom of cybersecurity applications. The technology can also help create robust encryption methods. With privacy-enhancing computation (PEC) techniques, professionals can keep data encrypted while in use, as well as in transit and at rest. Since data privacy is a hot topic for individuals and business leaders, PEC can be deployed to create stronger encryption models. Homomorphic encryption can also be deployed, enabling third parties to process encrypted data and return results without ever seeing the underlying data. This type of encryption can use lattices, multi-dimensional algebraic constructs that would be impractical for intruders to crack. Such quantum-resistant approaches are a potential solution for many types of encryption issues, and security-conscious companies must understand the importance of quantum flexibility.
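As a minimal sketch of the compute-on-encrypted-data idea, here is the classic Paillier cryptosystem, which is additively homomorphic. Note this is not the lattice-based construction mentioned above, and the parameters are toy-sized for illustration only:

```python
from math import gcd

# Minimal Paillier cryptosystem with toy parameters. Paillier is additively
# homomorphic: multiplying two ciphertexts adds the underlying plaintexts.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)

def encrypt(m, r):
    # r must be coprime with n; in practice it is chosen at random.
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(5, 7), encrypt(12, 11)
total = decrypt((c1 * c2) % n2)   # 5 + 12, computed on ciphertexts only
```

A third party holding only c1 and c2 can produce the encrypted sum without ever learning 5 or 12, which is the property the article's "process encrypted data" claim refers to.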

Quantum computing is a fast-approaching technology in the cybersecurity domain. Companies need to leap into action and analyze the different ways to deploy quantum techniques to enhance security and block intruders from stealing confidential data. The industry is currently witnessing a substantial increase in investment aimed at solving the core problems around scaling, error correction, and algorithms. Enterprises should think strategically about the long-term risks and benefits of quantum computing technology and engage seriously in deploying cybersecurity best practices.



Analytics Insight is an influential platform dedicated to insights, trends, and opinions from the world of data-driven technologies. It monitors developments, recognition, and achievements made by Artificial Intelligence, Big Data and Analytics companies across the globe.


In review: Top five Danish scientific discoveries of the year – The Post – The Copenhagen Post – Danish news in english

Posted: at 6:21 am

The year that went was one of general precarity, as fragile attempts at economic recovery were steamrolled by fresh waves of COVID.

Meanwhile, the arts industries squared up to asphyxiating restrictions, while meeting friends to soothe the nerves was marred by anxiety and awkward distancing.

Science thriving
But necessity is the mother of invention and, in the face of all the challenges, science has thrived.

In Denmark, the world's first wind-energy island was proposed (and South Korea quickly followed suit), a new species of whale was identified, and a dodgy GPS signal led to the accidental discovery of the world's most northerly island.

Biggest impact
The biggest revelations hit the headlines far beyond Denmark's borders.

Selected for their impact, innovation and socio-cultural significance, here are Denmarks top five most exciting scientific discoveries of 2021.

1 Innovative chip resolves quantum computing headache
In October, two young physicists at the University of Copenhagen put us a step closer to building the first large, functional quantum computer. A little background: the brain of a quantum computer is made up of memory devices called qubits. While an ordinary bit stores data in a state of either 1 or 0, a qubit can reside in both states simultaneously, a phenomenon known as quantum superposition. Until now, researchers had managed to build only small circuits in which just one qubit could be operated at a time. So it was hailed as a global milestone when Professor Federico Fedele and Assistant Professor Anasua Chatterjee simultaneously operated and measured multiple spin qubits on the same quantum chip, the entirety of which was no larger than a single bacterium.

2 Broad horizons for broad beans
Broad beans are rich in protein, easy to grow, and Jamie Oliver, the standard marker of populist cuisine, makes a dip out of them. But they also produce vicin, which is poisonous to the over 400 million people in the world predisposed to the hereditary disorder favism. For these people, primarily in Asia, Africa, and the Mediterranean countries, vicin can trigger acute anemia and liver disorders. In July, researchers in Copenhagen and Aarhus isolated the gene responsible for forming vicin in broad beans, paving the way for a vicin-free bean that could become a major future protein source or a globally appealing appetiser.

3 Metal detector rookie finds 1kg trove of ancient gold
In an absurd stroke of beginner's luck, a man called Ole Ginnerup Schytz found fame in September when he picked up a metal detector for the first time and discovered a stunning cache of 6th-century gold jewellery near the town of Jelling in Denmark. Experts dubbed it one of the most valuable archaeological finds in Denmark's history, on a par with the Golden Horns of Gallehus. The hoard's enormous wealth points to a prolific European trade network and suggests Jelling was a major seat of power. The jewellery will go on display at Vejlemuseerne in southern Jutland on 3 February 2022, before being rehomed in the National Museum.

4 Hotrocks could spell the end for lithium batteries
In May, construction began on GridScale, a remarkable new energy plant on the island of Lolland that can store renewable energy in stone. The method involves super-heating and super-cooling crushed pea-sized basalt in insulated steel tanks. The stone can store heat for many days and supply energy for up to a week, outperforming lithium batteries on both cost and efficiency. The 35 million kroner plant was quickly dubbed "hotrocks" by the international community and will remain a hot topic into 2022, when moves will be made to integrate it into the Danish national grid.

5 Malaria medicine found to combat COVID-19
In December, researchers at Aarhus University discovered that the malaria medicine Atovaquone can prevent COVID-19 infection. The medicine has a protective effect both before and after infection across different viral variants, meaning it could be used for both prevention and treatment of COVID-19. It's a big find: Atovaquone is inexpensive, widely available, and already approved by the US Food and Drug Administration. It's yet to be tested on Omicron, however, and has only been studied using lab-grown human cells.
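The quantum superposition behind discovery 1 can be sketched with a few lines of statevector arithmetic. This is a generic textbook illustration, not a model of the Copenhagen team's actual spin-qubit experiment:

```python
import numpy as np

# A qubit is a 2-component complex vector; a Hadamard gate sends |0>
# to the equal superposition (|0> + |1>) / sqrt(2).
ket0 = np.array([1.0, 0.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2            # Born rule: both outcomes at 50%

# Two independent qubits live in the tensor product of the two spaces,
# giving four simultaneous basis states (00, 01, 10, 11):
two_qubits = np.kron(state, state)
two_probs = np.abs(two_qubits) ** 2   # all four outcomes at 25%
```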


Aggelos Kiayias interview: Will blockchain be good for the planet? – VentureBeat

Posted: at 6:21 am


Will blockchain technology be good for the planet? I've been asking people about that, and the first impression is that it will be incredibly wasteful, using networks of computers to redundantly verify transactions for cryptocurrencies, nonfungible tokens (NFTs), and ultimately the metaverse. The second impression is that shifting to cryptocurrency could be a lot less wasteful than relying on our current financial system. I talked about this with one of the experts in the field, and we delved into these myths on a more factual level.

Aggelos Kiayias is chief scientist at Input Output, a blockchain engineering company. It's the driving force behind Cardano, a third-generation cryptocurrency that tries to improve on second-generation cryptocurrencies such as Ethereum and first-generation cryptocurrencies like Bitcoin.

A decentralized cryptocurrency keeps track of all transactions by all addresses on a peer-to-peer shared ledger. One of Cardano's innovations is supporting high transaction capacity, fast transaction times, and low transaction fees through a system of proof of stake. Cardano is not minable: the blockchain is still a record of all transactions, but rather than being validated by anyone who performs proof of work, transactions are validated by consensus through proof of stake. This is also far more environmentally friendly than mining other cryptocurrencies, as it does not use enormous amounts of electricity.
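As a rough illustration of stake-weighted validation, here is a toy slot-leader election. Real Ouroboros uses verifiable random functions and is far more involved; the names and stake amounts below are invented:

```python
import hashlib

# Toy stake-weighted slot-leader election. The only point being illustrated
# is that the chance of validating a slot is proportional to stake held.
stakes = {"alice": 60, "bob": 30, "carol": 10}   # hypothetical stake amounts

def slot_leader(slot: int, stakes: dict) -> str:
    total = sum(stakes.values())
    # Deterministic pseudo-random draw derived from the slot number:
    h = int.from_bytes(hashlib.sha256(str(slot).encode()).digest(), "big")
    point = h % total
    for holder, stake in sorted(stakes.items()):
        if point < stake:
            return holder
        point -= stake

# Over many slots, how often each holder leads tracks their stake share:
wins = {name: 0 for name in stakes}
for slot in range(10_000):
    wins[slot_leader(slot, stakes)] += 1
```

No hash-grinding race takes place: electing a leader costs one hash per slot, which is why proof of stake runs at ordinary-server energy levels.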

Input Output builds blockchain-based products for governments, corporations, and academic institutions, upskills people across the world, and is focused on making blockchain technology that is environmentally friendly. That's going to be necessary as we move toward the metaverse, the universe of interconnected virtual worlds, as in novels such as Snow Crash and Ready Player One. (Kyle Wiggers wrote a piece on the environmental impact of the metaverse.)


Input Output has contributed $500,000 to Stanford University to allow blockchains to process many transactions even with limited connectivity, providing support for the Tse Lab at the Department of Electrical Engineering of Stanford University. The aim is to enable even smartphones to handle blockchain transactions, while offline, to lessen the amount of electricity used in transactions.

Here's our edited transcript with Kiayias, who is also chair in cybersecurity and privacy at the University of Edinburgh.

VentureBeat: I wanted to hear more about the work you've been doing. Could you start there? I'd like to hear about the background and your interest in the energy usage of blockchain.

Aggelos Kiayias: I've been working in cryptography since about the mid-90s. I was a mathematician originally; cryptography was my passion when I was an undergraduate. It was a very different time, as you know. But we're in an exciting time now, especially for those of us working in this area. A lot of the ideas we had on a blackboard, the time has now come to see them implemented, deployed, and used by people. It's a great time.

With respect to energy efficiency, that was one of the very early open questions in the space. Bitcoin was doing something amazing. It was capable of providing an IT service without any centralized entity providing it. It's a self-registered service: you could be part of the service provision just by registering yourself as a node. That's pretty remarkable; it was unheard of to deploy an IT service like that before. It's great to focus on Bitcoin from this angle, because it moves away from the cryptocurrency, so to speak. There's a lot to learn from Bitcoin just from studying how it's possible to deploy a global-scale IT service like that, automatically.

At the same time, the observation early on, very soon after Bitcoin was deployed, was that you had this huge energy expenditure, and it was only getting worse. Another thing that makes the situation even worse than the energy expenditure itself is that the protocol is agnostic about the source of energy. It motivates you to find the cheapest possible energy, and the cheapest energy you can find might not be energy we want to harness; it might be based on non-renewable carbon-based fuels and so forth.
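The profitability-driven energy expenditure Kiayias describes comes from proof of work's hash puzzle, which can be sketched in a few lines. The header bytes and difficulty below are arbitrary illustration values:

```python
import hashlib
import itertools

# Toy proof of work: find a nonce whose SHA-256 hash falls below a target.
# Each extra difficulty bit doubles the expected number of attempts, which
# is why mining hardware burns energy in proportion to difficulty.
def mine(header: bytes, difficulty_bits: int) -> int:
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

nonce = mine(b"block-header", 12)   # roughly 2**12 = 4096 attempts on average
```

Because the network adjusts difficulty upward whenever mining is profitable, total hashing (and therefore electricity use) grows with the coin's utility, exactly the dynamic discussed above.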

Early on, then, there was an open question about whether it was possible to create a system like Bitcoin without the same energy expenditure. That was something that motivated me; it was more than seven years ago when I started working on this particular question. Soon we were looking at various flavors of proof of stake. The problem was that none of these protocols were designed in a way that could convince you they would actually work. A lot of my effort was to apply the mathematical rigor that we apply in traditional cryptographic protocol design. It's not so traditional, because cryptography is a very young discipline, but let's say traditional with respect to what you would get in a system like Bitcoin, which is very new, the new kid on the block, so to speak.

The question is: is it possible to apply that same mathematical rigor and extract a protocol that gives you the benefits of Bitcoin, this decentralized, automatic service that can emerge out of the self-interest of the nodes that join the network? It turned out that it is possible. We were very good at not only designing the Ouroboros protocol but getting it deployed, and in general influencing a lot of deployments. The protocol I designed not only became the backbone of Cardano, which I'm sure you're familiar with, but also influenced the design of other systems; Polkadot used elements of Ouroboros in its design, as have other efforts.

This is also characteristic of a lot of the work we do. We make our work publicly available, and we have a tradition of scientific peer review. We try to put it in the context of the scientific development of cybersecurity and computer science at large. We validate it not just with cryptocurrency experts, but in a scientific way with people who have experience designing computer science protocols, specifically cryptographic protocols. That gives you a bit of the story behind some of that work.

Above: Aggelos Kiayias is chief scientist at Input Output, which is behind the Cardano blockchain. (Image Credit: Input Output)

VentureBeat: What do you think about the energy usage of the different protocols that are out there? If you look at Bitcoin or Ethereum, or at what some of the Layer-2 solutions do, taking energy consumption down quite dramatically, to the point where it's maybe not as big a concern anymore, are there still things to be concerned about?

Kiayias: I wouldn't say I'm optimistic that Layer-2 is going to help much. As long as Layer-2 fundamentally relies on Layer-1, and that Layer-1 is proof of work-based, then the only thing that makes it work is energy expenditure. Energy expenditure is going to go up as long as engaging in the mining operation is profitable. If Layer-1 has high utility, and supporting a very successful Layer-2 is also part of its utility, then even if Layer-1 is just the mediator of all the various things that happen at Layer-2, it's still going to be a critical part of the infrastructure. It's still going to be profitable to mine with it. It's still going to consume vast amounts of energy. The design of the protocol itself is such that it consumes this amount of energy.

Now, this is not to say that the protocol is flawed; it's designed to do exactly that. The question is whether it's worth it. We do a lot of things on planet Earth that consume vast amounts of energy, and Bitcoin is one of them. The question we have to constantly ask ourselves is whether this is the best use of the resources and the technology we have at any given time. Technology, after all, is always about finding and optimizing designs, proving them, understanding the problem you want to solve, and finding the most cost-efficient way to solve it. In many ways Bitcoin showed the way. However, the way it spends its resources does not seem, at the moment, to be the best possible use: what we spend and what we get out of it are not aligned.

VentureBeat: As far as something like Ethereum forking over to proof of stake, do you believe that solves a lot of the problem? Or do we still have problems around that?

Kiayias: It certainly removes the energy expenditure issue. There are plenty of differences between proof of stake protocols, but in terms of energy expenditure they are similar, in the sense that they use relatively small amounts of energy, on par with running traditional servers. The difference with proof of work is tremendous. Any proof of stake protocol essentially makes the energy expenditure problem disappear, at least at the level we're seeing right now with Bitcoin. That's only one dimension, though; energy expenditure is not the only issue that blockchain systems have.

VentureBeat: If they do that, do they trade off something like security?

Kiayias: At the abstract level, you can think of it as a different security assumption. They're both secure. But if you look at the theory of security, security is always achieved under certain assumptions. Or to put it differently, it's very rare to have a system that has unconditional security, as we say. There do exist some very simple systems in the history of cryptography that are unconditionally secure, but they're extremely limited in applicability. Typically what you have in security is security proofs or security arguments that are conditional upon certain assumptions or certain behaviors or certain things that you'd consider plausible.

Proof of stake and proof of work come with different assumptions. They're both conditional, I should emphasize. But the sets of conditions are different. This doesn't simply mean that one is less secure than the other. Of course, I'm not talking about Ethereum specifically now; I'm speaking very broadly about an abstract, ideal proof of stake design. Both of these systems can be argued convincingly to be secure under a plausible set of conditions that are different in the two cases.

VentureBeat: I've also heard a very different argument about how the global financial system as it is can be very wasteful. Physical banks use a lot of energy, and paper-based systems use a lot of resources. The comparison between that system and a system based on cryptocurrency seems like there is a big difference there. I don't know what you'd think of that.

Kiayias: I've seen this argument as well, but I don't buy it. Bitcoin, just Bitcoin, cannot substitute for the world financial system. A bank is not just a ledger. It has a whole infrastructure that includes customer service, backup systems for when things go wrong. It's endless. There are a lot of other items there. It's possible to make such a comparison, but in the places I've seen this comparison being made, it's too simplistic -- comparing a brick and mortar world that includes things like customer service with just a ledger. That's not realistic. Bitcoin, the ledger, is by itself insufficient as a drop-in replacement for the whole banking system.

VentureBeat: I guess it's more of a comparison between centralized financial services and decentralized finance.

Kiayias: Yes, that's true. A comparison like that could be made. But I haven't seen it being made in a convincing way. And I should say, even if someone makes this comparison convincingly, why not use less energy? Even if Bitcoin is more competitive energy-wise, we should still ask: what's the absolute minimum of energy we need to get a service like that? Just because Bitcoin is better than System X, that doesn't settle it. The question here is not just which system is better, but if you want to do the job that System X does, what's the absolute minimum amount of energy you need?

This is the way we ask questions in computer science. The problem, let's say, is sorting an array. The question is, what's the minimum number of comparisons you need? It's not about finding one algorithm and proving it correct. It's finding the best possible algorithm that does it. I'm keen on doing this. This is what motivates my research and the research we do at Input Output. Certainly I think this is the right question to ask.
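To make the sorting example concrete, here is a minimal sketch (all names are illustrative) that counts the comparisons merge sort actually performs and contrasts them with the information-theoretic lower bound: no comparison-based sort can guarantee fewer than ceil(log2(n!)) comparisons in the worst case.

```python
import math
import random

def merge_sort(a):
    """Sort a list while counting element-to-element comparisons."""
    comparisons = 0

    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comparisons += 1  # one comparison between two elements
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:]); merged.extend(right[j:])
        return merged

    return sort(a), comparisons

n = 64
data = random.sample(range(10_000), n)
result, comparisons = merge_sort(data)
# Information-theoretic lower bound: any comparison sort needs >= log2(n!) comparisons
lower_bound = math.ceil(math.log2(math.factorial(n)))
print(comparisons, lower_bound)  # the count is close to, but never below, the bound
```

This is the sense in which the field asks not "does this algorithm work?" but "how close is it to the best any algorithm could do?"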

Above: Meld is an interesting new way to borrow money through the blockchain.

Image Credit: Meld

VentureBeat: I wrote about Ubisoft's effort to bring NFTs into their games recently. They said that they were using Tezos, and that it was a lot more energy efficient. A transaction in Tezos was using a millionth of the energy of a Bitcoin transaction, so you don't have the gas fees. It was more like the equivalent of a couple of Google searches. It sounded good, but it seemed like the problem is that not many people are going to use that protocol.

Kiayias: Tezos is a proof of stake system, so these numbers are not surprising. Bitcoin is immensely more energy-hungry than any proof of stake system out there right now. In terms of the infrastructure, from the point of view of the end user, these things shouldn't matter too much. The end user is trying to play a game or exchange some artifacts they have, some art. Which infrastructure they do it on is something that many end users are not too concerned about themselves.

But what we are concerned about here is the basic infrastructure question: what is a sound infrastructure to base the security of these transactions upon? This is the question that motivates the work we're doing. This is the right question to ask. There are plenty of systems out there, but the real questions to ask are: are they secure? Have they been analyzed? What are their credentials? We're still at the very beginning of assessing system security in the blockchain space.

VentureBeat: Do you worry that we're already wed to Bitcoin and Ethereum to the point where we might not be able to change?

Kiayias: No, I don't think so, to be honest. We're still very early. We're exploring so many different ways of doing this. There are a lot of things that everyone in the space is learning. It's still way too early to say that somehow the first-mover advantage is going to eliminate possibilities for other systems beyond Bitcoin and Ethereum to be successful. There's certainly a lot of room for that.

You could even imagine a setting where there's a multitude of systems that are all interoperable, with trustless bridges connecting them and the ability for people to seamlessly transfer assets between systems, without actually caring about how it works. If you look at the internet itself, it's a connection of quite diverse networking backbones, but we don't care now. The packets coming from my machine reach yours and vice versa. Everything happens seamlessly in the background because the internet infrastructure has been optimized to work like this.

This is a viable future for the blockchain space. You have a lot of systems that interoperate. Eventually you expect that some of the systems will go away and some of the other systems will grow, depending on their ability to scale, have sound economics, and interoperate with existing systems and legacy infrastructure. That's very important. You can't imagine this technology in complete isolation. It must interoperate with existing infrastructure and financial systems and so forth.

VentureBeat: Do you think, for things like NFTs and games, that there's an obvious solution yet, an ideal protocol?

Kiayias: I should say there's a lot of research to do even in the NFT space. The basic backbone of the NFT concept is well-understood, and there are systems that can give you that. I have to point out, however, that I've also seen a lot of subpar implementations of the NFT concept. In other words, even though the expertise is there around how to provide the basic NFT, it's a bit unfortunate that I also see a lot of insecure implementations of that concept on different platforms. However, at this point in time, there are systems, Cardano included, that can give you a sound implementation of the NFT concept. They can be readily used by those that want to use such systems.

VentureBeat: If blockchain overcomes the energy challenge, do you think that's the main challenge it faces in adoption? Or do you think there are other challenges, other problems that have to be overcome?

Kiayias: No, there are more. There are definitely more. There are a few different ones that require expertise from different areas. Energy efficiency is obviously one of them, but then we also have scalability. The work with Stanford that we hinted at in the beginning of this conversation has to do with scalability. Some of the research by the Stanford team that we're funding through this gift is working toward this next generation of scalability for Layer-1 operations. Layer-1 is an ever-expanding database. The question is, is it possible to have small, finite nodes support that infrastructure without sacrificing the security of the system?

So far, blockchains like Bitcoin, Ethereum, and others are quite monolithic in what they consider the concept of a full node: something that knows everything and checks everything. Obviously, by itself, this is not scalable into the future. We're talking about a never-ending, ever-expanding database. We definitely need to solve that problem, and that takes a lot of work, a lot of research, both within Input Output and with partners like Stanford. There's a lot of work we've done with others confronting the scalability problem.

Above: The metaverse could use a lot of energy.

Image Credit: Fold

Meanwhile, in all of these systems, you self-register because you're incentivized to do so. There's a lot of research that still needs to be done to validate the soundness of the mechanisms that these systems use to incentivize participants. For example, you can see in recent times all the debate that exists around pricing transactions. There are big issues in Bitcoin, and Ethereum as well; Bitcoin has transactions which are extremely expensive. You can start asking the same principled questions. If you want to auction off this desirable space for transactions on the blockchain, what's the best way to price it without breaking the incentives of the system participants?
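To illustrate the pricing question (the names and numbers here are hypothetical, not any real chain's mechanism), the simplest scheme is a first-price auction: users attach a fee to each transaction, and block producers greedily fill the limited block space with the highest fee-per-byte offers. This is roughly how Bitcoin prices block space, and its incentive problems -- users must guess others' bids and routinely overpay -- are what the debate is about.

```python
from dataclasses import dataclass

@dataclass
class Tx:
    txid: str
    size: int  # bytes of block space consumed
    fee: int   # fee offered by the sender (first-price: you pay what you bid)

def fill_block(mempool, capacity):
    """Greedily pack the block with the highest fee-per-byte transactions first."""
    ordered = sorted(mempool, key=lambda t: t.fee / t.size, reverse=True)
    block, used = [], 0
    for tx in ordered:
        if used + tx.size <= capacity:
            block.append(tx)
            used += tx.size
    return block

mempool = [Tx("a", 250, 5000), Tx("b", 500, 6000), Tx("c", 250, 2000)]
block = fill_block(mempool, capacity=600)
print([t.txid for t in block])  # picks "a" then "c"; "b" bids the most in total but loses on fee rate
```

Even this toy version shows the tension: greedy packing is only an approximation (block filling is a knapsack problem), and first-price bidding pushes users to strategize over fees -- one motivation behind redesigns such as Ethereum's EIP-1559 base-fee mechanism.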

These are just two examples where we do active research. There are lots of open questions to solve.

VentureBeat: Every now and then I also hear that quantum computing is going to be a threat to blockchain. It seems like it could also be used to defend blockchains. I don't know how serious a problem that is.

Kiayias: We take it seriously, very seriously. It's definitely not feasible at the moment with the quantum computing capabilities that exist right now. It's still an open question as to how well these techniques are going to scale. But it's a very serious concern, and we have to understand how to develop protocols that, as we sometimes call it in security, are post-quantum secure. How can you develop a protocol that, after a quantum computer exists, still retains its security?

The good news, and perhaps something that people sometimes misunderstand, is that you do not need a quantum computer to be protected from a quantum adversary. It's possible to develop classical algorithms, classical cryptography techniques, that are secure even against quantum attackers. That's something we've demonstrated in other areas of security, for example secure communications between websites, clients and servers. It's something that we understand how to do in a way that's post-quantum secure. Bringing this technology over and identifying the right techniques for the blockchain space is an ongoing research effort. I'm quite active in this myself right now, together with my colleagues, but there's a general effort toward making blockchains post-quantum secure. It's something we take seriously, and it's definitely within the realm of feasibility for the next few years to have blockchains which are completely quantum safe, so to speak.
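One classical construction of the kind Kiayias describes is the Lamport one-time signature: it relies only on the preimage resistance of a hash function, a property that Grover's algorithm is believed to weaken but not break. A minimal sketch, for illustration only (not production use), with the caveat that each key pair must sign at most one message:

```python
import hashlib
import secrets

def H(b):
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is the pair of their hashes
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg):
    # the 256 bits of the message digest, most significant first
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # reveal one secret per digest bit -- this is why the key is one-time only
    return [sk[i][bit] for i, bit in enumerate(bits(msg))]

def verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")
print(verify(pk, b"hello", sig))    # True
print(verify(pk, b"goodbye", sig))  # False: revealed secrets don't match the new digest
```

Real post-quantum deployments use descendants of this idea (hash-based schemes such as SPHINCS+) or lattice-based schemes; the point is simply that the defender needs no quantum hardware at all.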

VentureBeat: What's your own feeling about the potential for the decentralized internet, the decentralized web, versus what we have right now that's centralized around big tech companies?

Kiayias: I'm very enthusiastic about this. This is going to be one of the best applications of blockchain technology. Right now, what we observe is all this centralization around the big corporations that basically silo a lot of information -- in many ways it can be argued that it should be more of a public resource and less of something that one specific company should be able to capitalize on. There are a lot of issues of basic human rights. Do I have a right to see what information is collected about me? Do I have the right to transfer my information? Do I have the right to erase some of the information that's collected about me? These are basic questions that concern people in the IT law space, the legal aspects of information technology.

Blockchain systems do provide a lot of tools that can be used to make this situation better from the point of view of regulation. These ideas are sometimes called reg tech, regulatory technology. These are techniques that could be very useful in the future, and could upgrade the way we do regulatory compliance. Let's say regulatory compliance could catch up with the times. I'm very optimistic about this. Some of the regulations we use right now are very antiquated. We're in a world where big corporations can just do regulatory arbitrage. They have a department in some jurisdiction where they operate because it's advantageous for them to be there. It's a perversion of how the system should work on a global scale.

I do think that building these applications on top of a substrate of blockchain technology could somehow elevate the internet into something that has memory, something that gives users the ability to engage in a neutral space, without being locked into a particular company. This is a very promising direction for blockchain technology, and I think that's going to have a big impact on what we do in the next few years.

VentureBeat: Do you think there's some limited form of decentralization that's more ideal than complete decentralization? Do we need to form lots of DAOs in order to replace companies, things like that?

Kiayias: It's a great question. What's the right level of decentralization for a particular type of application or system? I'm very confident that there's going to be a wide spectrum. We cannot decentralize everything. Not everything makes sense to decentralize. We still want to have services that operate as centralized entities in one way or another, just because there are needs for peak performance or agility.

Decentralized systems, no matter how you design them, are highly distributed. Their responsiveness will always be beaten by a super-efficient, optimized centralized system. As the saying goes, a benevolent dictatorship is the best system you could ever have: there's only one dictator, he acts in everyone's best interests, and he can act immediately and solve every problem without any delay. The only problem is that there's a scarcity of benevolent dictators.

That's something we can solve with decentralization. We've solved it historically with democracy. As we can see in the way human societies have evolved over thousands of years, they've moved to settings where you have some centralization. You have a president or a prime minister. But you also have decentralized components of operation. You have elections. You have hierarchies of management. You have separation of duties, separation of parts of the government. This is not new. If you study political systems you can see a diverse landscape of some decentralized processes, some centralized, and a lot of checks and balances that glue everything together. I think exactly the same thing is going to happen in information technology.

Above: Illuvium is creating a DAO.

Image Credit: Illuvium

VentureBeat: What are the areas of research that you're most excited about? What do you think is worth a lot of your time right now?

Kiayias: At this particular point in time, I'll mention scalability, which I'm working on very actively with the team at Input Output. We have a lot of questions about how to optimize these functions. Beyond making the system scalable, it also has to be agile, because different use cases on top of a blockchain require different types of operations. You want to create a system that somehow shapes itself to a particular use case. Scaling for the occasion is always an important concern. There's a lot of research going on about this right now.

I mentioned economics and game theory. Another topic I'm working on actively right now with the team is governance. I can't emphasize this enough. Sound governance of a decentralized project on blockchain systems that are live is extremely important for their long-term success. One of the biggest problems we've seen in the space so far is systems getting into trouble because they can't properly manage the way that the system evolves. It's impossible, if we look at the history of software systems, to have the perfect system that stays there forever. What we know instead is that, first of all, bugs do happen. They have to be patched and corrected expediently. At the same time, circumstances do change. You have to be able to adapt. Governance and software updates in the decentralized setting are an extremely important question. It's something we're very actively working on. So those are three examples of the top research streams we have going on right now.

VentureBeat: Are there any subjects where you're worried about the state of things, the direction that we're going?

Kiayias: In the long term, nothing specifically. The whole space is maturing rapidly. We will be able to solve the main questions. I'm very optimistic about the future, about the whole direction we're going. I see a lot of bright people in the industry. I see a lot of good expertise participating. I believe all of the major questions will be solved, from a technological point of view.

About the social aspects, one issue that worries me is that a lot of the research and development in the blockchain space is driven collectively by computer scientists and software engineers. We know from the history of recent information technology services like Facebook that if you don't have an interdisciplinary approach to understanding the questions you're trying to solve, it's possible to create systems that can do a lot of harm. We've seen this, in the case of Facebook, with the spread of misinformation that has hurt public health in some cases during the pandemic, as well as in the case of Myanmar. There are a lot of cases where well-intentioned technology development has led to adverse effects.

What I'm trying to do, and something we're striving toward at Input Output, is to promote interdisciplinary work and research. That's something I look forward to being more active with in 2022 and further on, so we develop solutions that really solve the important problems we want to solve, without creating new problems, as happened with the hasty development of information technology services that we've seen in other cases, like Facebook as I mentioned.

VentureBeat: Do you have some confidence that, say, within five years we'll be able to solve the energy usage problem? Can we get to an ideal protocol?

Kiayias: First of all, I think energy usage is something that we've taken care of already. Cardano and other systems are going in this direction. We have a sound infrastructure that is not energy-hungry. But there are many other questions: governance, as I mentioned, is a key question. Scalability in all use cases is a key question. The next five-year window is enough time to develop systems that are resilient and reasonably capable of evolving so that they are exceptionally long-lived. I'm confident that the research and development we've done so far, and that's going to happen in the next five years, will take us to that point.


See the article here:

Aggelos Kiayias interview: Will blockchain be good for the planet? - VentureBeat

Posted in Quantum Computing | Comments Off on Aggelos Kiayias interview: Will blockchain be good for the planet? – VentureBeat

3 AI Trends to Watch in 2022 | Inc.com – Inc.

Posted: at 6:21 am

My mission for more than two decades has been to help artificial intelligence (AI) work for the masses, and I truly believe in AI's potential to make our lives healthier, happier and more productive.

Of course, AI comes with certain challenges (like any emerging technology), especially as companies more fully operationalize AI. In fact, there are at least three significant AI trends already on the move this year -- and I'm closely watching how these shifts will help businesses continue to navigate AI hurdles like removing bias and building trust.

Trend 1: Actioning AI ethics and governance

For years, there's been discussion about eliminating bias from AI models. These concerns are not only prominent among industry professionals -- they're increasingly arising in mainstream media outlets as well. For example, NBC recently aired an episode of "American Auto" that centered on a self-driving car's failure to recognize and brake for people of color.

In 2022, we're seeing conversations about AI ethics and bias mitigation transition from abstract frameworks into real-world practices. This evolution is largely powered by emerging startups that provide AI monitoring and governance solutions for businesses. Now, a big question mark for AI-driven companies is whether to outsource machine learning performance monitoring to companies like Credo, Fiddler and Arize AI, or build out internal capabilities to validate, monitor and analyze machine learning models.

My advice: Don't overthink this decision. At Smart Eye, we have largely implemented checks and balances in house, but I look forward to exploring potential partnerships with emerging companies in the governance space. However, if your organization currently lacks the in-house expertise to properly operationalize AI ethics and governance, go ahead and bring on a partner who can. Many third-party solutions can implement systems that can train, validate and analyze the efficacy of your AI systems, and at levels that are difficult to achieve with data offered by your customer base alone. Start simple as well, perhaps by monitoring the diversity of your training and test data, or biases present in your inference results. Then you can work off of this information to intervene as required.
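As a sketch of what "start simple" monitoring can look like (the data and group labels below are hypothetical), one common first check is demographic parity: compare the rate of positive predictions your model emits for each group in the inference results, and track the largest gap over time.

```python
from collections import defaultdict

def selection_rates(predictions, groups):
    """Positive-prediction rate per demographic group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(predictions, groups):
    """Largest difference in selection rate between any two groups."""
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values())

# Hypothetical inference results: 1 = positive outcome (e.g. application approved)
preds = [1, 0, 1, 1, 0, 1, 0, 0, 0, 1]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
print(round(demographic_parity_gap(preds, groups), 3))  # 0.2: group A at 60%, group B at 40%
```

Demographic parity is only one of several fairness notions (equalized odds and calibration are others), and which one is appropriate depends on the use case; the value of wiring in even a check this simple is that drift in the gap becomes visible early enough to intervene.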

Over time, you can increase efforts and adopt more tools and capabilities to help eliminate bias and add model explainability. From my perspective, the important thing is that your organization is bringing AI ethics and governance into action now.

Trend 2: Increasing AI's role in hybrid workplaces

According to recent research from Microsoft, more than 70% of workers globally want flexible remote work options to continue. I know that's how I personally feel. Hybrid work environments are here to stay -- even at my company, we're figuring out what hybrid looks like in practice, and how to get ahead of employee burnout. But one thing I'm sure of is that AI will continue to drive innovation in the future of work.

We've seen a recent rise in organizations embracing collaboration and workspace tools designed to boost engagement and happiness levels. As a next step, layering in AI can help you learn so much more about how your team is doing. For example, I'm a big fan of startups like Read AI, which monitor meetings in real time to gauge who speaks the most, the overall sentiment in the "room" and other nuanced behaviors. Over time, gathering this information helps leaders improve future experiences for employees, surface helpful coaching insights and uncover team members' skills.

The pandemic has accelerated the adoption of virtual and/or hybrid settings, and it's great to see more tools coming to market that can quantify and support social and emotional intelligence in organizations. Upgrading how you engage with employees has powerful mental health implications, too. Rightfully so, building out more helpful mental health resources remains a top priority for organizations in 2022 -- nearly 40% of employers expanded mental health benefits during the pandemic. I'm eager for more large-scale deployment of AI systems that better quantify an individual's mental health needs and can provide just-in-time support -- stay tuned for more updates on that front!

Trend 3: Exploring AI and Web3

A third trend I'm keeping my eye on is the intersection of AI and the emerging world of Web3, crypto and NFTs (non-fungible tokens).

One obvious area where AI is being applied is in synthetic data -- otherwise known as artificially created data. In my world, we turn to synthetic data all of the time to power deep learning models and train generative models -- without having to dedicate massive amounts of time and money to producing labeled diverse data sets. We're slowly seeing the application of generative adversarial networks in Web3, where one can create thousands of unique, synthetic characters to populate the metaverse. This opens the door to engineering new user experiences, and even exploring new monetization and branding opportunities (think influencers in the metaverse!).

The same goes for NFTs. While NFTs themselves still feel quite new, there are a lot of opportunities to embed AI and make these digital assets more interactive. Imagine intelligent NFTs (iNFTs) that have natural language understanding, perceptual capabilities and computer vision, and therefore can engage audiences in conversations -- an iNFT telling you about its "origin story," for instance. This is definitely a space I'm watching.

No matter the trend, keep AI human

Amid divergent trends, there's a consistent element in 2022: keeping humans central to the AI equation.

This goal is giving way to the emergence of a new technology category: human insight AI -- AI technologies designed to understand, support and predict human behaviors within complex environments. This year, we're already seeing human insight AI strategies leading to improvements, from the interiors of our cars to hybrid work environments ... and maybe even in the metaverse. But no matter where human insight AI is applied, we will all be better off for it, and I can't wait for these human elements to power all sorts of AI experiences in 2022 and beyond.

The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.

Original post:

3 AI Trends to Watch in 2022 | Inc.com - Inc.

Posted in Ai | Comments Off on 3 AI Trends to Watch in 2022 | Inc.com – Inc.

Can AI Save Humanity From Climate Change? That’s the Wrong Question – Interesting Engineering

Posted: at 6:21 am

Artificial intelligence is among the most poorly understood technologies of the modern era. To many, AI exists as both a tangible but ill-defined reality of the here and now and an unrealized dream of the future, a marvel of human ingenuity, as exciting as it is opaque.

It's this indistinct picture of both what the technology is and what it can do that might engender a look of uncertainty on someone's face when asked the question, "Can AI solve climate change?" Well, we think, it must be able to do something, while entirely unsure of just how algorithms are meant to pull us back from the ecological brink.

Such ambivalence is understandable. The question is loaded, faulty in its assumptions, and more than a little misleading. It is a vital one, however, and the basic premise of utilizing one of the most powerful tools humanity has ever built to address the most existential threat it has ever faced is one that warrants our genuine attention.

Machine learning -- the subset of AI that allows machines to learn from data without explicit programming -- and climate change advocacy and action are relatively new bedfellows. Historically, a lack of collaboration between experts in the climate and computer sciences has resulted in a field of exploration that is still very much in its infancy.

Happily, recent years have seen the beginnings of a shift in that paradigm, with groups like Climate Informatics and the Computational Sustainability Network focusing on how computational techniques can be leveraged to advance sustainability goals.

Taking this notion a step further, a group of young experts in machine learning and public policy founded Climate Change AI in 2019, a non-profit that aims to improve community-building, facilitate research and impactful work, and advance the machine learning-climate change discourse.

"There have been different communities working on different aspects of this topic, but no one community unifying the discourse on AI and the many different approaches to climate action," explained Priya Donti, co-founder and power and energy lead of CCAI, in an interview with Interesting Engineering.

Climate Change AI has, in no uncertain terms, altered that landscape. In 2019, the group published a paper entitled "Tackling Climate Change with Machine Learning," a call to arms for the machine learning community that presented 13 areas -- ranging from electricity systems and transportation to climate prediction and agriculture -- where the technology might be best utilized. Dozens of experts in the machine learning, climate change, and policy communities contributed sections to the paper, and well-known figures like Andrew Ng and Yoshua Bengio provided expert advice on the project as well.

In the years since its publication, the organization has helped foster communication through workshops and other activities, ensuring that the people joining these events are a blend of computer scientists and those from other disciplines.

Encouraging this communication is neither easy nor without its difficulties, however, something that David Rolnick, one of the paper's authors and co-founder and biodiversity lead of CCAI, readily acknowledges.

"The machine learning and AI community is very vulnerable to hubris," explained Rolnick in an interview with Interesting Engineering. "Thinking we can solve the problems of other fields without [...] working with people in those fields, without having to leave our algorithmic tower." As in other areas of applied machine learning, meaningful work on climate change requires collaboration.

The interdisciplinary mingling the group promotes is beginning to bear fruit. Many of the professionals who engage in these events help facilitate dialogue between experts of varying fields who would otherwise have a hard time understanding each other, a prerequisite of any collaborative effort.

"We're starting to see a lot more people who [...] are not 100 percent machine learning experts, they're not 100 percent experts in the climate-change-related domain, [but] they've done a really good job of doing work at the bridge between those two things, and as a result, are able to bring people together," Donti notes enthusiastically.

The team at CCAI believes that researchers and policymakers alike are beginning to alter the focus of their efforts as a direct result of the group's 2019 paper and its broader efforts. Along with healthcare, climate change is now widely viewed as a key application of AI for the greater good, something that wasn't the case just a few years ago.

"I think one thing that's inspiring is the number of people who have risen up to take on [the climate change] challenge," says Donti.

Crucially, though, that inspiration needs to translate to results, and that mentality underpins the team's efforts.

"Whether I'm optimistic or pessimistic, fundamentally, I'm action-oriented, and I think it's important to do what we can," she underscores.

Ultimately, doing what we can to address climate change through AI (or any other technology) is going to be approached via two basic principles: limiting greenhouse gas emissions going into the future and responding to the effects of what levels of climate change we have, unfortunately, already locked in.

Research bodies, governmental institutions, and private companies around the world are beginning to take up the challenge on both fronts. Brainbox AI, for example, is a Montreal-based company that uses machine learning to optimize HVAC systems in office buildings and other kinds of real estate. This is a key area to focus on when dealing with potential GHG reduction, as the energy consumed by buildings accounts for a quarter of global energy-related emissions alone.

"Given that real estate is a major contributor to greenhouse gas emissions, the decision-makers in the industry have a major opportunity to lead the charge," explained Jean-Simon Venne, CTO and co-founder of Brainbox AI, in an email exchange with Interesting Engineering.

An AI-driven HVAC system can allow a building to operate itself, proactively, without any human intervention. It can continuously evaluate the optimal HVAC configuration for energy efficiency, saving money while also reducing the load on the power grid and keeping the building's footprint low.
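As a rough illustration of the kind of decision such a controller makes, the sketch below picks the cheapest setpoint within a comfort band. It is entirely hypothetical, not Brainbox AI's actual software; the energy model, comfort band, and function names are placeholders of our own.

```python
# Hypothetical sketch of a proactive HVAC decision: choose the setpoint
# for the next hour that minimizes predicted energy use while staying
# inside a comfort band. The prediction function is a made-up placeholder.

COMFORT_BAND_C = (20.0, 24.0)  # acceptable indoor temperatures, Celsius

def predicted_energy_kwh(setpoint_c: float, outdoor_c: float) -> float:
    """Placeholder model: energy grows with distance from outdoor temp."""
    return 1.2 * abs(setpoint_c - outdoor_c)

def choose_setpoint(outdoor_c: float) -> float:
    """Evaluate candidate setpoints in the comfort band, pick the cheapest."""
    candidates = [20.0 + 0.5 * i for i in range(9)]  # 20.0, 20.5, ..., 24.0
    return min(candidates, key=lambda s: predicted_energy_kwh(s, outdoor_c))

# On a hot day the cheapest comfortable setpoint is the top of the band.
print(choose_setpoint(outdoor_c=30.0))  # 24.0
```

A real system would replace the placeholder model with one learned from the building's own sensor data, and would plan over many hours rather than one.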

Adaptation will be just as crucial an effort, as extreme weather events driven by rising temperatures rapidly increase in frequency. Disaster response is one area already seeing the application of AI technologies, with machine learning being used to help people recover from natural catastrophes far quicker than in the past.

Such was the case during the 2021 typhoon season in Japan, when the U.K.-based company Tractable used its AI in partnership with a major Japanese insurer to assess external property damage caused by Typhoon Mindulle, helping homeowners recover more quickly. The company claims it can reduce the time needed for damage assessment from several months to a single day.

Just as neither climate change mitigation nor adaptation will be easy, neither can be accomplished using AI alone. While the technology lends itself to flashy news headlines and compelling sci-fi narratives in literature and film, it's far from the silver-bullet solution it's often made out to be.

Rolnick stresses that the practicality of what machine learning can and can't accomplish must be a primary consideration when entertaining the idea of applying the technology to any particular problem. Climate change isn't a binary issue, and we must mould our attitudes accordingly.

"[AI] is not the most powerful tool," he emphasizes. "It's not the best tool. It's one tool, and it's a tool that I had at my disposal. I'm not optimistic because of AI specifically, I'm optimistic because climate change isn't an on-off switch. We get to decide just how bad it is. Any difference that we can make is a meaningful difference that will save lives."

The applications of machine learning are manifold, and both the group's 2019 paper and its recently published policy report for the Global Partnership on AI are well worth an in-depth read.

The team at CCAI underscores that one basic use of machine learning in this space is gathering data, as when the technology was recently used to create a map of the world's solar energy facilities, an inventory that will be of great value going forward. Such datasets will help scientists better guide their research and policymakers make informed decisions.

Another area where it can make a substantial difference is in improving forecasting, scheduling, and control technologies that pertain to electricity grids.

The energy output of electricity sources like solar panels and wind turbines is variable, meaning it fluctuates depending on external factors like how much the sun is or isn't shining on any particular day.

To ensure consistent power output independent of weather conditions, back-ups like natural gas plants run in a constant CO2-emitting state, ready to fill in those gaps. Improving energy-storage technologies like batteries could reduce the need for such high-emission practices, and machine learning can greatly accelerate the process of materials development and discovery.
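As a toy illustration of the forecasting side of this problem, the sketch below fits a simple least-squares model that predicts solar output from irradiance and cloud cover. The data is synthetic and the model deliberately simple; production grid forecasters use far richer features and models.

```python
import numpy as np

# Toy example: predict hourly solar output (MW) from weather features.
# All data is synthetic; a real forecaster would train on historical
# irradiance, cloud-cover, and temperature measurements.
rng = np.random.default_rng(0)

n = 200
irradiance = rng.uniform(0, 1000, n)   # W/m^2
cloud_cover = rng.uniform(0, 1, n)     # fraction of sky obscured
# Synthetic ground truth plus noise: output rises with irradiance,
# falls as cloud cover increases.
output = 0.05 * irradiance * (1 - 0.7 * cloud_cover) + rng.normal(0, 1, n)

# Fit a linear model by ordinary least squares.
X = np.column_stack([irradiance, irradiance * cloud_cover, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, output, rcond=None)

# Forecast output for a mostly clear hour (10% cloud cover).
x_new = np.array([800.0, 800.0 * 0.1, 1.0])
forecast = x_new @ coef
print(f"forecast: {forecast:.1f} MW")
```

Better forecasts of this kind let grid operators schedule less stand-by fossil generation, which is exactly the efficiency gain the text describes.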

"We're seeing huge advancements in batteries in terms of cost and energy density," Donti says. "Batteries are going to be a critical piece of the puzzle, and there are some companies using AI to speed up the discovery of next-generation batteries. One example is Aionics."

Aionics is a U.S.-based startup using machine learning to expedite battery design, which could, in addition to improving electricity systems, unclog one of the bottlenecks standing in the way of electric vehicle adoption on a large scale.

Using machine learning to help decarbonize the transportation sector on a larger scale is more difficult, however. Passenger and freight transport are notoriously difficult to decarbonize. If fossil fuels are to be replaced with batteries, for example, those batteries will in many cases need to be extremely energy-dense. But that's only part of the picture; the bigger issue is the convoluted nature of the transportation sector itself.

"In the electricity sector, you have relatively few, large players, and it's rather centralized. What happens in terms of innovations is happening in fewer companies with more aggregate datasets," explained Lynn Kaack, assistant professor of computer science and public policy at the Hertie School in Berlin and co-founder and public sector lead at CCAI, in an interview with Interesting Engineering.

"In transportation, there are many more and smaller companies [...] often there is much less means, much less data to exploit. Where one can take the system perspective, trying to optimize routing, charging station placement, machine learning has interesting things to add, but it's not always straightforward."

Kaack points to the example of how German passenger rail operator Deutsche Bahn is looking at maintenance optimization through machine learning. Technological failures result in delays, and delays have a big influence on whether or not passengers perceive rail as a viable alternative to driving.

Technical challenges are far from the only thing that needs to be overcome in the service of doing right by the planet. How these issues and their potential solutions are framed and perceived matters greatly.

The public sphere is prone to putting a spotlight on glitzy techno-cures, which can divert attention away from simpler but potentially more actionable projects and technologies. Nor are research bodies and governmental agencies immune to such frenzy. Awareness here is crucial, as the lens through which AI is seen can play a role in dictating the direction research leans and where funding ends up.

"AI can make certain kinds of action easier, but it can also lead to greenwashing," Rolnick warns. "Techno-solutionism can lead people to think they are having a much bigger impact than they are, and even divert people's attention away from lower-tech, but more impactful, courses of action."

Working on unsexy problems is important. Integrating even the most exciting technologies into the workflows where they will be applied is, quite simply, boring but essential work. Persuading the relevant parties involved in funding and finding a new solution often requires the right rhetorical touch.

"For different innovations and solutions, we should think about who the audiences are who need to be convinced, who are the people who might be financing things, how do you make [the incentives] clear to private and governmental funding sources," Donti says.

By the looks of things, many appear to find the group and its goals compelling. Climate Change AI has had a direct impact on funding for programs like the U.S. government's DIFFERENTIATE program and Sweden's "AI in the service of the climate" program, for example, and they've just finished the first round of an innovation grants program that is allocating two million dollars to projects that will promote new work by creating publicly available datasets.

On a broader scale, how we leverage and manage AI is a topic that is increasingly being given the attention it deserves. Last April, the European Commission introduced the Artificial Intelligence Act, the first large-scale regulatory framework for artificial intelligence in the European Union.

While some claim the framework doesn't do enough to protect civil rights and liberties, it is a step in the right direction, and the more central and common these high-profile discussions become, the better. Everyone involved in machine learning applications needs to embed the ethical considerations of relevant stakeholders, not just investors, into the foundations of the technology as much as possible.

Taking all of this together, it's not a stretch to say that AI can be used to address climate change. But the issue is an extraordinarily complex one, and even those directly involved admit that the conversation about when and how to do so is ever-evolving, and the most effective path forward is never exactly clear.

"Are you going to spend your time on practical applications and policymaking, helping people who are supposed to make decisions shape funding programs and inform legislation, or do you go back to fundamental research? It's difficult to balance them and understand which has the greatest impact," Kaack says.

While a difficult question to navigate, that it's even being asked is nothing short of inspiring. Doing what is within one's reach stands out as an evergreen principle for achieving real, tangible action, even when dealing with something like climate change. The overall message is less "Do it with AI" and simply more "Do, period." In the face of a problem of this scale, one that often feels paralyzing in its insurmountability, that message is refreshingly galvanizing.

"I'm not here to say that AI should be our priority," reiterates Rolnick. "AI is a powerful tool, but climate action will require all the tools. The moral of the story for me is that it is important for people to think about how they can use the tools they have to make a difference on problems that they care about."

See the rest here:

Can AI Save Humanity From Climate Change? That's the Wrong Question - Interesting Engineering


Pecan AI Raises $66M To Advance AI Automation And Predictive Analytics – NoCamels – Israeli Innovation News

Posted: at 6:21 am

Israeli predictive analytics platform Pecan AI has secured $66 million in a Series C funding round led by global private equity and venture capital firm Insight Partners, along with support from prior investors including GV, S-Capital, GGV Capital, Dell Technologies Capital, and others.

This brings the company's total funding to over $100 million.

Pecan AI said the funding will be used to scale its global footprint and accelerate research and development of the low-code predictive modeling and data science platform.

Founded in 2018, Pecan helps business intelligence, operations, and revenue teams predict revenue-impacting risks and outcomes without the need for data scientists. Pecan enables its users to transform substantial amounts of raw transactional data into accurate predictions of key performance indicators such as customer lifetime value, retention, conversion rates, and demand.

To date, Pecan's predictive algorithms impact billions of dollars in revenue for consumer goods, fintech, insurance, mobile application, and wellness and beauty companies of various sizes, and the company has tripled its annual revenue over the past year.

"We believe that any company should be able to deploy AI-based predictive analytics, even without data science resources on staff," said Zohar Bronfman, CEO and co-founder of Pecan AI. "This new funding will help us scale Pecan further to overcome the data science scarcity gap, enabling our customers to move beyond outdated data-mining techniques that offer little value in predicting future outcomes."

View post:

Pecan AI Raises $66M To Advance AI Automation And Predictive Analytics - NoCamels - Israeli Innovation News


Improving AI-enabled Healthcare in the U.S. – OpenGov Asia

Posted: at 6:21 am

Magnetic resonance imaging (MRI) technology is a widely used albeit costly tool for diagnosing brain injuries and strokes. Its high procurement, installation and operating costs, however, mean much of the developing world has no access to it.

Researchers from the University of Hong Kong (HKU) have successfully developed a new MRI technology, the ultra-low-field (ULF) 0.055 Tesla brain MRI, which can operate from a standard AC wall power outlet and requires neither radiofrequency nor magnetic shielding. Further, while a conventional MRI machine can cost up to US$3 million, the ULF-MRI scanner costs only a fraction of that price.

The research team was led by Professor Ed X. Wu, Chair of Biomedical Engineering and Lam Woo Professorship in Biomedical Engineering of the Department of Electrical and Electronic Engineering, HKU. The research output was published in Nature Communications and was also highlighted in Nature Asia and Scientific American.

The HKU team is one of the three leading ULF-MRI academic research groups worldwide (another is based at Harvard/MGH) dedicated to developing novel ULF-MRI technology. Their goal, as shared by researchers like Professor Wu, is to popularise and broaden the use of MRI.

As an MRI researcher for over 30 years, Professor Wu is delighted and derives a strong sense of fulfilment from the development of what he calls a scaled-down MRI scanner that is far more affordable than what is on offer in hospitals. "The human body is mostly made of water molecules, on which MRI thrives," said Professor Wu. "MRI is a gift from nature and we must use it more. Currently, it is underutilised as a diagnostic tool."

It is estimated that currently more than 90% of MRI scanners are located in high-income countries, and two-thirds of the worlds population do not have access to them. The total number of clinical scanners is estimated at only about 50,000 worldwide.

The HKU team has made the design and algorithms of the ULF 0.055 Tesla brain MRI open-source knowledge, available to all interested in developing the technology further or applying it in diverse areas. This virtually opens the door to advancements in many aspects of healthcare provision involving MRI. "This will be a big field," Professor Wu said. "The team has demonstrated the concept and shown the feasibility of a simplified version of MRI. There are many ways to move forward."

With the use of a deep learning algorithm, the team has removed a key constraint of conventional MRI: the need to be shielded from outside radiofrequency signals, which results in a bulky, non-mobile set-up. Existing MRI scanners are essentially giant magnets and need a purpose-built room to shield them from outside signals and to contain the powerful magnetic fields generated by their superconducting magnets, which require costly liquid-helium cooling systems. The team's new computing and hardware concept made the latest development possible.
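The underlying idea can be illustrated with a much simpler stand-in for the team's deep learning model: predict the electromagnetic interference (EMI) seen by the MRI coil from signals picked up by external reference coils, then subtract that prediction. The sketch below uses synthetic signals and a linear least-squares fit; it is our own illustration of the principle, not the published method.

```python
import numpy as np

# Illustrative sketch (not the HKU team's actual model): reference coils
# pick up only ambient EMI, while the MRI coil sees signal + EMI. Fitting
# the coupling from reference coils to the MRI coil lets us subtract the
# interference without a shielded room.
rng = np.random.default_rng(1)

t = np.linspace(0, 1, 2000)
emi_source = np.sin(2 * np.pi * 50 * t)        # e.g. mains interference
mri_signal = 0.3 * np.sin(2 * np.pi * 5 * t)   # "true" signal of interest

# Reference coils see EMI through different couplings, plus sensor noise.
ref1 = 1.2 * emi_source + 0.01 * rng.normal(size=t.size)
ref2 = -0.8 * emi_source + 0.01 * rng.normal(size=t.size)
mri_coil = mri_signal + 0.9 * emi_source

# Fit coupling coefficients mapping reference coils -> EMI in the MRI coil.
R = np.column_stack([ref1, ref2])
w, *_ = np.linalg.lstsq(R, mri_coil, rcond=None)

# Subtract the predicted interference; what remains is the clean signal.
cleaned = mri_coil - R @ w
```

The published system replaces the linear fit with a deep network, which can also capture nonlinear and time-varying interference that a fixed linear coupling cannot.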

Professor Wu is confident that a critical mass of researchers could push the frontiers of knowledge, and he noted that the open-source approach is the quickest way to spread it. It is hoped that MRI can be used in fields beyond radiology, for example in paediatrics, neurosurgery, or the emergency room. "The team welcomes more people from the scientific, clinical and industrial sectors to join the research to benefit healthcare," he said.

In collaboration with Professor Gilberto Leung of Neurosurgery and other clinicians at Queen Mary Hospital, his team validated the results of ULF-MRI by comparing them with images obtained from a standard 3 Tesla MRI machine. The team could identify most of the same pathologies, including strokes and tumours, despite lacking the clarity and resolution required for precision diagnostics.

Professor Wu said, "I believe computing and big data will be an integral as well as inevitable part of future MRI technology. Given the inherent nature of MRI, I believe widely deployed MRI technologies will lead to immense opportunities in the future through data-driven MRI image formation and diagnosis in healthcare. This will lead to low-cost, effective, and more intelligent clinical MRI applications, ultimately benefiting more patients."

See more here:

Improving AI-enabled Healthcare in the U.S. - OpenGov Asia


Virtual Reality – NAS Home

Posted: at 6:21 am

Definition: Virtual reality has been notoriously difficult to define over the years. Many people take "virtual" to mean fake or unreal, and "reality" to refer to the real world, which results in an oxymoron. The actual definition of virtual, however, is "to have the effect of being such without actually being such". The definition of "reality" is "the property of being real", and one of the definitions of "real" is "to have concrete existence". Using these definitions, "virtual reality" means "to have the effect of concrete existence without actually having concrete existence", which is exactly the effect achieved in a good virtual reality system. There is no requirement that the virtual environment match the real world. Inspired by these considerations, for the virtual windtunnel we adopt the following definition:

Virtual reality is the use of computer technology to create the effect of an interactive three-dimensional world in which the objects have a sense of spatial presence.

In this definition, "spatial presence" means that the objects in the environment effectively have a location in three-dimensional space relative to and independent of your position. Note that this is an effect, not an illusion. The basic idea is to present the correct cues to your perceptual and cognitive system so that your brain interprets those cues as objects "out there" in the three-dimensional world. These cues have been surprisingly simple to provide using computer graphics: simply render a three-dimensional object (in stereo) from a point of view which matches the positions of your eyes as you move about. If the objects in the environment interact with you, the effect of spatial presence is greatly heightened.

Note also that we do not require that the virtual reality experience be "immersive". While for some applications the sense of immersion is highly desirable, we do not feel that it is required for virtual reality. The main point of virtual reality, and the primary difference between conventional three-dimensional computer graphics and virtual reality, is that in virtual reality you are working with Things as opposed to Pictures of Things.

Requirements: The primary requirement of virtual reality is that the scene be re-rendered from your current point of view as you move about. The frame rate at which the scene must be re-rendered depends on the application. For applications like the virtual windtunnel, it turns out that a minimum frame rate of 10 frames per second is enough to support the sense of spatial presence. While motion at this frame rate is clearly discontinuous, if properly done our cognitive systems will interpret the resulting images as three-dimensional objects "out there".

The other requirement is that interactive objects in the environment continuously respond to your commands after only a small delay. Just how long a delay can be tolerated depends on the application, but for applications like the virtual windtunnel, delays of up to about a tenth of a second can be allowed. Longer delays result in a significantly degraded ability to control objects in the virtual environment.
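The two requirements above can be expressed as a simple budget check. The thresholds (10 frames per second, a tenth of a second of interaction delay) come from the text; the function and constant names are our own illustration.

```python
# A minimal sketch of the two VR performance requirements as a budget check.
MIN_FRAME_RATE_HZ = 10.0       # re-render from the current viewpoint
MAX_INTERACTION_DELAY_S = 0.1  # response to user commands

def meets_vr_requirements(frame_time_s: float, interaction_delay_s: float) -> bool:
    """Return True if a system satisfies both requirements."""
    frame_rate = 1.0 / frame_time_s
    return (frame_rate >= MIN_FRAME_RATE_HZ
            and interaction_delay_s <= MAX_INTERACTION_DELAY_S)

# 60 ms per frame (~16.7 fps) with 50 ms command latency passes;
# 150 ms per frame (~6.7 fps) does not.
print(meets_vr_requirements(0.060, 0.050))  # True
print(meets_vr_requirements(0.150, 0.050))  # False
```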

We summarize the Virtual Reality Performance Requirements: the scene must be re-rendered from the user's current point of view at a minimum of about 10 frames per second, and interactive objects must respond to commands with a delay of no more than about a tenth of a second.

For more information on VR see the papers found on Steve Bryson's home page.

See more here:

Virtual Reality - NAS Home
