Elon Musk dings Bill Gates and says their conversations were underwhelming, after the Microsoft billionaire buys an electric Porsche – Pulse Nigeria

No one is safe from Elon Musk's barbs, it seems, not even Bill Gates.

Elon Musk dissed Bill Gates in a tweet sent Tuesday, claiming his conversations with the Microsoft co-founder had been "underwhelming."

Musk made the remark after an unofficial Tesla news account expressed disappointment with Gates' recent decision to buy a Porsche Taycan instead of a Tesla.

The Porsche Taycan is the German automaker's first all-electric vehicle and represents a direct rival to many of Tesla's models. Its starting price is $103,800.

Gates said he'd ordered the "very, very cool" vehicle during an interview with YouTuber Marques Brownlee, published Friday.

"That's my first electric car, and I'm enjoying it a lot," he said.

During the interview, the 64-year-old tech grandee discussed the state of electric cars in general, noting that their range still falls below that of traditional gasoline vehicles. Consumers may experience "anxiety" about this when buying one, he said.

Still, Gates and Musk have more insights in common than the Tesla CEO might like to admit.

They have both, for example, spoken about the dangers posed by artificial intelligence.

Both men have endorsed a book by Oxford philosophy professor Nick Bostrom, "Superintelligence," which warns of the risks to human life posed by AI.

Musk said the book was "worth reading" in a 2014 tweet, while Gates endorsed the book in a 2015 interview with Baidu CEO Robin Li.



Thinking Beyond Flesh and Bones with AI – Ghana Latest Football News, Live Scores, Results – Ghanasoccernet.com

"The best way to predict the future is to invent it," goes the quote. If you are someone who is interested in discovering and inventing things, then Artificial Intelligence is the right domain for you. It will not only make your life interesting, but you will also be able to make others' lives simpler and easier!

What does thinking beyond bones and flesh mean? Artificial intelligence is not just about inventing robots and replacing humans; it is also about replacing the slog of every hard activity. For example, AI can be used in different areas of the medical field, civil engineering, military services, machine learning, and other fields. Simply put, artificial intelligence enables computers or software to think intelligently, the way a person does. As a result, the field is vast, and you can try your hand at whichever lane seems alluring to you.

The ultimate goal of AI is to achieve human goals through computer programming! AI is about mimicking human intelligence with a computer program and a little help from data: the way humans think, act, and respond to problems.

One of the most significant examples of AI is the recent invention of Israeli military robots, designed to stand in for human soldiers. This is not only effective, but it also reduces the loss of life in war, and its design minimizes damage to the robot itself. A sensitive subject, but a knowledgeable and useful invention! The future of the world depends on how easily any work can be done, and that future is nothing short of Artificial Intelligence!

Now, let us see how many types of AI are there!

Artificial Narrow Intelligence (ANI)

The concept of ANI generally means the flow of designing a computer or machine to perform a single task with high intelligence. It understands the individual tasks that must be performed efficiently. It is considered the most rudimentary concept of AI.

E.g.:

Artificial Super Intelligence (ASI)

Artificial superintelligence refers to intelligence more powerful and sophisticated than human intelligence. While human intelligence is considered the most capable and developed, superintelligence can surpass it.

It will be able to perform abstractions that are impossible for human minds even to conceive, since the human brain is constrained to some billions of neurons.

Artificial intelligence has the ability to mimic human thought; ASI goes a step beyond and acquires cognitive abilities superior to humans'.

Artificial General Intelligence (AGI)

As the name suggests, it is designed for general purposes. Its smartness can be applied to a variety of tasks, and it can learn and improve itself. It is as intelligent as a human brain. Unlike ANI, it can improve its own performance.

E.g.: AlphaGo. It is currently used only to play the game of Go, but its intelligence could be applied at various levels and in various fields.

Scope of AI

The global demand for experts with relevant AI knowledge has doubled in the past three years and will continue to increase in the future. There are growing opportunities in voice recognition, expert systems, AI-enabled equipment, and more.

Artificial intelligence is the future. So why not contribute to the future of the planet? In recent years, AI jobs have increased by almost 129%. In the United States alone, the number of open AI-related jobs is as high as 4,000!

Well, to seize the opportunity present in AI, you need a bachelor's degree in computer science, data science, information science, math, etc. Even as an undergraduate, you can get a job in the AI domain with a reputed online certification course in AI. Doing this, you can earn anywhere between INR 600,000 and INR 1,000,000 in India; in the United States, you can get US$50,000 - US$100,000.

In this smart world, it's easy to find online certification courses. Some may focus only on the basic foundations of AI, while others offer professional-level courses. All you have to do is choose the lane you want to follow and start your route.

You would be glad to know that Intellipaat offers an industry-leading AI course program, meticulously designed to industry standards and conducted by SMEs. This will not only enhance your knowledge but also help you apply what you have learned in the field.

You need to master certain skills to shine in this field, such as programming, robotics, autonomous cars, space research, etc. You will also be required to gain special skills in mathematics, statistics, analytics, and engineering. Good communication skills are always appreciated if you aspire to the business side, so that you can explain things and get the right message to the people out there.

Learners fascinated by the profession of artificial intelligence will discover numerous options in the field. Up-and-coming careers in AI can be pursued in a variety of environments, such as finance, government, private agencies, healthcare, arts, research, agriculture, and more. The range of jobs and opportunities in AI is very wide.


Herring River project will revive long-afflicted estuary ecosystem – Cape Cod Times

Tuesday, Feb 18, 2020 at 3:01 AM

My Wellfleet neighborhood abuts the Herring River, which was degraded in 1909 after a dike restricted tidal flow to the once-healthy estuary. Because tidal flow was blocked, water quality deteriorated, resulting in the death of fish and other sea life.

For more than a decade, a dedicated group of people has been working to restore this precious natural resource. Benefits include protecting and enhancing shellfishing and restoring once-abundant river herring and eels. Diamondback terrapins, winter flounder and striped bass should return to spawn, as well. Birds and other wildlife also will benefit from a healthy coastal environment.

A healthy tidal marsh will buffer storm surges resulting from climate change. Carbon stored by coastal wetlands reduces global warming. Free-flowing tides will help distribute sediment, allowing the marsh to gain elevation to help mitigate the effects of rising seas. There also will be a reduction in biting mosquitoes, which breed in stagnant waters.

I ask all to join me in supporting this project. Once restored, a healthy Herring River Estuary will provide recreational opportunities and seafood to be enjoyed by residents and visitors for generations.

Mary Ellen Manning, Wellfleet


Strategic Analysis of Electric Vehicle Ecosystem in the United Kingdom, 2018-2025 – ResearchAndMarkets.com – Business Wire

DUBLIN--(BUSINESS WIRE)--The "Strategic Analysis of Electric Vehicle (EV) Ecosystem in the United Kingdom, 2018 - 2025" report has been added to ResearchAndMarkets.com's offering.

The automotive industry is rapidly evolving in terms of technology, as well as tackling environmental issues. Electric vehicles (EVs) have been introduced as a clean energy initiative, as they have low or zero emissions and have come a long way to becoming an integral part of OEMs' business strategies. Automakers are creating separate EV business units to be prepared for the expected EV boom in the short term. However, the surge in demand will create a need for huge charging infrastructure and safety regulations and standards.

The United Kingdom is aggressively pushing towards electrification, especially in the automotive and transportation sector. Stringent emission regulations, liberal incentives/subsidies for consumers and manufacturers, a high level of localisation, concrete safety standards, and an established technology roadmap are some of the key steps taken by the government to ensure the success of electric vehicles in the near future.

Major OEMs such as Tata-JLR, Volkswagen, and Daimler have announced ambitious sales targets, and are expected to launch a large number of new and constructive electric vehicle models (from city-suited to long-range and powerful ones). Charging infrastructure, which is one of the major factors driving electric vehicle adoption, is also picking up pace, with many new companies entering the market. It has opened up new business models, enabling companies to position themselves either as manufacturers or operators or as a payment gateway.

The study gives a detailed analysis of the current and future prospects of electric vehicle sales in the United Kingdom - by model, by OEM, and by type of vehicle, until 2025. It provides insights into how charging stations have evolved over time and how companies have designed their strategies to establish a profitable supply chain. It also lists the various kinds of investments made in the electric vehicle space, promoting adoption.

Key Issues Addressed

Key Topics Covered:

1. Executive Summary

2. Research Scope, Objectives, and Methodology

3. EV Market Scenario

4. Government Efforts

5. Charging Station Infrastructure

6. Growth Opportunities and Companies to Action

7. Key Conclusions

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/vnbo36


Governor’s grizzly council makes tracks to Libby City Hall this month – The Western News



Balance Your Own Ecosystem in the Great Board Game Ecosystem – Paste Magazine

Science games are trendy right now, with Wingspan, my #1 game of last year, building on real characteristics of over 100 species of North American birds to create a richly textured game that gets the science right. Genius Games is a small publisher that focuses specifically on games with science or math themes, including Periodic (played on the periodic table) and Cytosis (a game of cell division, endorsed by the Journal of Cell Science). One of their latest titles, Ecosystem, asks players to build tableaux of cards that score enough points to win yet are balanced like a real ecosystem, so the player doesn't lose points for a lack of biodiversity.

Ecosystem is a very straightforward game to play; all the complexity is in the scoring. The game itself is a card-drafting tableau builder, and if you know what those two terms mean, you have a pretty good idea of how the game works. You'll build your ecosystem in a grid of four rows of five cards each, and you place those cards over two rounds, starting each round with ten cards. You'll place a card, pass your remaining hand to the player to your left or right, place another card, and so on until all players have placed 10 cards in the round. The only restrictions on card placement are that you must place each card orthogonally adjacent to a card you've already placed, and you can't go beyond the limits of four rows by five columns.

The deck includes 130 cards in total, so no matter your player count you won't use all the cards in any game (the maximum is 120, or 20 cards each with six players). The most common cards are meadows and streams, of which there are 20 apiece, while the remainder of the deck includes eight to 12 each of nine different types of animals, birds, fish, or insects, each of which has a unique function in the game.

Two card types, streams and wolves, score competitively: the player with the most gets the most points, the player with the next-most gets a smaller point total, and so on. Most cards score based on what's nearby, either immediately adjacent or within two cards, including trout, bear, dragonflies, and bees, as well as foxes, which score only if there are no bears or wolves adjacent to them. Meadows score in clusters: you have to have at least two meadow cards adjacent to each other to score at all. One card, rabbits, scores just a single point each (up from nothing in the original edition), but when played gives you the only opportunity to move cards you've already played, either switching two in your tableau or replacing one with the rabbit and relocating the displaced card.

Most of these scoring rules at least mimic something from the real world. Dragonflies score (in a slightly convoluted way) if they're next to streams, as do trout. Bears score if they're next to trout and bees. Bees score if they're next to meadows. Eagles score if they're within two cards (you know, because they fly) of trout or rabbits. Deer are the most solitary animal in the scoring, gaining two points for each row with a deer and two points for each column with one, so you can score 18 points for deer if you place one in each column and one in each row.
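The deer rule above is simple arithmetic, and a short sketch makes the 18-point ceiling concrete. This is an illustration of the rule as described in the review, not code from the publisher; the card names and function name are hypothetical.

```python
# Hypothetical sketch of the deer scoring rule described above:
# two points for each row containing a deer, plus two points for
# each column containing a deer.

def deer_score(grid):
    """grid: a list of rows, each a list of card-name strings."""
    rows_with_deer = sum(1 for row in grid if "deer" in row)
    cols_with_deer = sum(
        1 for c in range(len(grid[0]))
        if any(row[c] == "deer" for row in grid)
    )
    return 2 * (rows_with_deer + cols_with_deer)

# A deer in every one of the 4 rows and 5 columns hits the maximum:
grid = [
    ["deer",   "meadow", "stream", "meadow", "stream"],
    ["meadow", "deer",   "stream", "meadow", "stream"],
    ["meadow", "stream", "deer",   "meadow", "stream"],
    ["meadow", "stream", "meadow", "deer",   "deer"],
]
print(deer_score(grid))  # 18
```

Note that covering all nine lines takes at least five deer, which is why the maximum is so rarely reached in practice.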

The game does reward you for building your ecosystem with sufficient biodiversity. If you have at least six categories in which you didn't score any points, you lose five points; if you have four or fewer categories with zeroes, however, you get a bonus. In my various plays with player counts from four to six, I don't think anyone has actually taken the penalty; you'd have to be actively trying to focus on a few categories to the exclusion of others for that to happen.

Ecosystem plays two to six, but it's definitely best with at least four, so you get more cards in play and the effects of the card-drafting mechanic are more pronounced. With two players, the rules include a neutral player that takes 10 cards in each round and randomly takes one card whenever a hand is passed to its position; the neutral player also counts in the stream and wolf scoring. It's much better with more players, though, and still moves quickly because of the simultaneous play; even with five players, two of them new to the game, it only took us about 35-40 minutes, and I think it'd be under half an hour if we'd played again with the same group.

The scoring is not that intuitive, and you'll probably check your player reference card frequently even after a few plays, but when I introduce people to the game, they get caught up in the tableau-building part and learn the scoring as we go, probably the best way to learn most games. It's a great lightweight game and, at $15, a great value for a highly portable title.

Keith Law is the author of Smart Baseball and a senior baseball writer for The Athletic. His latest book, The Inside Game, is due out in April 2020. You can find his personal blog the dish, covering games, literature, and more, at meadowparty.com/blog.


Explained: Mobile App Architecture – The Basis of App Ecosystem – Appinventiv

What Do We Mean by App Architecture?

The technical definition: It is a combination of structural elements and their individual set of interfaces using which a system is composed in addition to the framework behavior of all the structural elements.

In layman's terms: Mobile application architecture is the set of techniques and models/designs to be followed when building a structured mobile app. It can also be thought of as an app's skeleton, upon which its working and quality are based.

So, everything that defines an app (how the data would move, the UI/UX, the choice of platform, the tech stack, etc.) is a part of mobile app architecture patterns.

Selecting the right architecture should be a default first step in the planning and design phase of a software development project. However, more often than not, this stage is skipped on the grounds that it makes the development process slower and thus more expensive. Ironically, the lack of an architecture is a major reason behind mobile application failures.

The lack of an enterprise application architecture, though, introduces a number of issues in the software:

The glaring issues attached to the lack of an architecture, or to its being deemed unimportant, give rise to the need to build a mobile application architecture that is right and accounts for all the necessary considerations.

At this stage, you will have to take the device type into consideration. This means studying the screen size, resolution, CPU characteristics, memory, and storage space, plus the availability of the development tool environment.

The app's features will depend on software or hardware, which is why it is important to have the details of the devices on which the app will run.

Throughout its lifecycle, your application will face many situations where internet connectivity is dwindling or absent altogether. Your app architecture must be built with the worst network conditions in mind: design the data access mechanism, caching, and state management according to worst-case scenarios.

The importance of UI/UX within an application is unquestionable. Ensuring that your UI is devised to keep users engaged and give them an uncluttered experience is an important part of your mobile app infrastructure, one that defines how well it is designed.

While this element mostly concerns the app architecture design front, it calls for expertise in both backend and frontend. On the basis of your understanding of who the customers are and what their app requirements are, you should analyze which of these would be good for your app:

Knowing the elements will only take you halfway when dissecting what mobile app architecture is in its entirety.

All mobile app architectures are divided into three layers. Understanding what these three layers are helps mobile app development companies understand what architectures are made of.

The aim of this layer is to look into how to present the application to end users. When designing this layer, mobile app developers must identify the correct client type for the intended infrastructure. Additionally, the client's deployment restrictions must be kept in mind. Another necessity is selecting the correct data format and using robust data-validation mechanisms to protect the app from invalid entries.

This layer looks into elements on the business front. In layman's terms, it looks into the way business logic is presented to the end users. It consists of business components, workflow, and entities under two sub-layers: the domain model and the service layer.

The service layer defines the common set of application functions available to end users, while the domain-model layer holds the knowledge and expertise linked to the specific problem area.

The data access layer must meet the application's requirements and should offer efficient and secure data transactions. Mobile app developers should also consider the maintenance side of the data, ensuring that the data layer can be modified easily as business requirements change.

This layer consists of data-specific components such as access components, utilities, helpers, and service agents.

These elements are placed under two subheads: the persistence layer and the network layer. The former offers simplified access to data stored in the backend; the latter is needed for making networking calls.
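The three-layer split described above can be sketched in miniature. This is a minimal illustration of the pattern, not code from the article; every class and method name here is hypothetical.

```python
# Minimal sketch of the presentation / business / data layer split
# described above. All names are illustrative.

class DataLayer:
    """Data access layer: persistence (a dict stands in for a database)."""
    def __init__(self):
        self._store = {}

    def save_user(self, user_id, name):
        self._store[user_id] = name

    def fetch_user(self, user_id):
        return self._store.get(user_id)


class BusinessLayer:
    """Business layer: workflow and domain rules, no UI or storage details."""
    def __init__(self, data):
        self.data = data

    def register_user(self, user_id, name):
        if not name.strip():                 # domain rule: validate input
            raise ValueError("name must not be empty")
        self.data.save_user(user_id, name.strip())

    def greeting_for(self, user_id):
        name = self.data.fetch_user(user_id)
        return f"Hello, {name}!" if name else "Hello, guest!"


class PresentationLayer:
    """Presentation layer: formats what the end user actually sees."""
    def __init__(self, business):
        self.business = business

    def show_greeting(self, user_id):
        return self.business.greeting_for(user_id)


# Wiring the layers together, each depending only on the layer below it:
app = PresentationLayer(BusinessLayer(DataLayer()))
app.business.register_user(1, " Ada ")
print(app.show_greeting(1))  # Hello, Ada!
```

The point of the split is that each layer depends only on the one below it, so the persistence mechanism or the UI can be swapped without touching the business rules.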

The intent of everything you have read till now is to understand not just what mobile app architecture is, but what a good mobile app architecture is. Now, what makes an architecture good is the set of principles it is based on.

Question: What are the foundations of a good mobile app architecture?

Answer: A good app architecture (both Android mobile app architecture and iOS application architecture) is one which enforces good programming patterns and assumptions.

Meeting all these conditions speeds up the development process while making maintenance much easier. Additionally, a well-devised architecture, paired with platform-centric technology, is best for solving complicated business issues effectively in app projects.

Establishing an architecture as good requires it to follow several principles. These principles also hold the answer to how to choose the right architecture for your mobile app.

Portability is the system's ability to react to a changing environment. In the case of mobile apps, environment changes may be far more frequent, given market and technological shifts. A good mobile app architecture ensures that the system is portable enough to respond to changes while keeping their impact to a minimum.

As requirements change with the environment, the system must be modified to correct faults, improve performance, and so on. In such a scenario, there is always a need for constant app maintenance. A good mobile architecture must therefore ensure high maintainability while reducing the effort needed to keep the system up and running.

A good mobile app architecture recognizes that, for a faster mobile app development process, components and protocols must be reusable during updates or redesigns. Noting this, it is important that the architecture leaves room for reusability in the structured development approach.

Data security is the most critical non-functional requirement of an application. The architecture must be robust enough to secure the data consumed by the app. It should also be in sync with the organization's security ecosystem, and all data stored on the device must be properly encrypted.

Users expect applications to be quick and issue-free. If the app takes a long time to fetch details, the probability of users abandoning it increases manifold. A good mobile app architecture should ensure that every one of the user's expectations is met in full.

This is the stage which would set the basis of your deep diving further into the types of app architecture and having a conversation with the engineering team.

Sudeep Srivastav

CEO, Appinventiv



What are Bhutan’s sacred forests worth? – Forests News, Center for International Forestry Research

There are many ways to assess mountain ecosystem services, but because of these challenges the researchers decided to study the mountain communities' social perceptions to put together an initial picture of Bhutan's ecosystem services.

"We felt this perception study would be a good way to kickstart our understanding of the forests' ecosystem services," said Jigme Wangchuk, a researcher at UWICER. "It's a quick and affordable method and it helps the communities better understand the connection between ecosystem services and their livelihood."

The researchers used participatory research methods, including focus group discussions, interviews and household surveys, and focused on three forest types: high-altitude oak forests, forest management units, and community plantations.

Like most studies on ecosystem services, the CIFOR and UWICER study divided services into four categories: provisioning services, which are products obtained from the ecosystem, such as fresh water or food; regulating services, which are benefits obtained from the regulation of ecosystem processes such as climate regulation or avalanche mitigation; habitat services, which highlight the importance of ecosystems to provide habitat for migratory species and to maintain the viability of gene-pools; and cultural services, which are non-material benefits that people obtain from ecosystems such as spiritual enrichment, or recreation.

The idea was to learn what components and services of the forest are important to the local people, so that, down the line, the government and other potential buyers know which services are worth paying to preserve. The end goal, Baral said, was for Bhutan to explore a Payment for Ecosystem Services (PES) scheme, in which whoever preserves or maintains an ecosystem service is paid for doing so. An example of this is when companies buy carbon offset credits and the payment goes to communities taking care of a forest that sequesters carbon from the atmosphere.

"For this to work in Bhutan, two things need to be clear. First, the buyer needs to know what they are paying for, and second, the seller needs to know what they are providing to the buyer," Baral said. "For this reason, it's important and critical to determine what services Bhutan's forests are providing to whom. And identifying and quantifying these services is the primary step."

After interviews and discussions with 396 villagers in villages spread out across 17 forests in Bhutan, the researchers reported 17 ecosystem services perceived overall as important to the communities.

The villagers recognized fresh water, timber, fuel wood, non-wood forest products, fodder and leaf litter as the forests provisioning services, with fresh water and timber as the most important. For regulating services, they thought ground water recharge, fresh air, carbon sequestration and soil erosion protection equally ranked as the top services.

They named soil productivity, wildlife habitat, biodiversity, and pollination as supporting services, with soil productivity and biodiversity as most important. And for cultural services, they chose recreation and aesthetic services as most important, followed closely by cultural spiritual sites.

The researchers compared this list to ecosystem services identified by forestry experts and found common priorities, namely the provisioning services of fresh water, non-wood forest products, fodder, food, fuel wood, and grazing; and the regulating services of soil erosion protection, natural hazard reduction, and water purification.

Researchers noted priority services chosen by residents in the three different study areas. Among the ecosystem services identified by communities in the oak forest, water regulation, provision of fodder and fuel wood were the priorities. Access to timber was the priority for communities in the planted forest, and those in the forest management units named land productivity, freshwater, timber, and fresh air as the top ecosystem services.

Overall, the interviews and discussions revealed that villagers ascribe their good health to a healthy forest. And the researchers are happy they now have baseline knowledge of the forests. But the study also uncovered a perception that there is a general decline in the provision of ecosystem services, particularly from forest management units. Villagers attributed the decline to a country-wide increase in the demand for timber, which they have linked to socio-economic development.

The researchers plan on validating these findings through scientific field measurements and similar studies in other forest management regimes.

Baral, Sears, Wangchuk and Choden noted that among the best things to have come out of the project was a synergy between the communities and the researchers, and between the international researchers and the local scientists.

Researchers learned how communities value the forest and came to appreciate local knowledge, and villagers learned about the ecosystem-service conceptual framework. "Prior to the study, we thought the local people were aware of the ecosystem services obtained from their forest, but not of all the regulating and supporting services," Choden said. "After the study we learned that people are aware of, and value, their ecosystem services beyond provisioning services."

"My sense is that the local people would like to know more about their forest. And if they have a different narrative about the relationship between trees and water, I think they wouldn't mind hearing about the scientific narrative," Sears said. "At the same time, we scientists recognized local people as the experts. They have been living near these forests for generations. We learn from them."

UWICER researchers feel that the training they received in research design and social science methods as part of the project will go a long way. "It substantially enhanced our capacity in project planning, implementation in the field, interpretation of results and publication of results," Wangchuk said. "Many local researchers got to author or co-author studies, a big opportunity that enhanced our scientific paper writing and presentation skills."

"Our Bhutanese colleagues took the initiative once the team defined what the three projects would be," Sears said. "They took it and ran with it."

What enhances these new capacities is that local researchers now have a stronger link to the local knowledge that farmers and villagers have used to care for their forests for hundreds of years. "It's a whole other knowledge system," Sears said. "It now informs future research and empowers the Bhutanese people in decision making. That has great value."


The rest is here:

What are Bhutan's sacred forests worth? - Forests News, Center for International Forestry Research

ConsenSys Grants funds third cohort of projects to benefit the Ethereum ecosystem – CryptoNinjas

Initially announced at Devcon 4 in 2018, the program was created by Joseph Lubin, co-founder of Ethereum and founder of ConsenSys, one of the world's largest blockchain companies, which builds the tools, infrastructure, and apps that power the Ethereum network. Lubin established a $550,000 grant fund to be distributed to projects building out the Ethereum ecosystem.

ConsenSys Grants funds open-source projects that benefit the greater Ethereum ecosystem. Projects supported include critical areas such as core infrastructure, improved developer tooling and UX, security, and access to knowledge for developers, users, and social impact projects.

Since 2019, the ConsenSys Grants program has funded 25 projects in total.

Funded projects fall into categories including Infrastructure, Ecosystem Growth, Usability + Dev Tooling, Security, Social Impact, and Education + Technical Knowledge.

"Sigma Prime is extremely grateful for the support received from ConsenSys. This grant helped us scale the Lighthouse development team, allowing us to hire and retain amazing talent. We're proud of all the progress achieved over the past few months, with Lighthouse becoming a leading Eth2 implementation. The funding received from ConsenSys is instrumental to reaching our goals." – Mehdi Zerouali, Director, Sigma Prime

"ConsenSys Grants is more than the money; it helped the team understand it's on the right track, and gave community validation. A great tool for showing the community that this is a step in the right direction." – Yoav Weiss, CTO, TabooKey (Gas Station Network)

"Girls are coming out of the program so well-rounded, educated, and able to compete in the blockchain space. They are telling their story through the communities that they serve. It has been awesome. And it has provided them a platform to be invited into the blockchain ecosystem." – Ewurabena Ashun, Curriculum Management & Development, Black Girls Code

The rest is here:

ConsenSys Grants funds third cohort of projects to benefit the Ethereum ecosystem - CryptoNinjas

Ecosystem Services Assessment of the Prins Hendrik Zanddijk – Dredging Today

The latest edition of the International Association of Dredging Companies' (IADC) Terra et Aqua magazine featured a very interesting article titled "Ecosystem services assessment of the Prins Hendrik Zanddijk."

This study examines which ecosystem services are provided by the most recent nature-inspired coastal protection project, the Prins Hendrik Zanddijk, and, where possible, quantifies how many more it provides in comparison with a traditional concrete and asphalt construction.

During the last decade, reclamation of a sandbody as a coastal protection measure has evolved into a viable and attractive alternative to handbook-engineering using concrete and asphalt.

The latter, considered traditional coastal protection methods, offers the reassurance of multiple generations of engineering experience and reliability.

Nature-based solutions (NBS) are defined as "solutions that are inspired and supported by nature, which are cost-effective, simultaneously provide environmental, social and economic benefits and help build resilience" (European Commission, 2019).

NBS such as sandbody designs must cope with the dynamic behaviour and variability of the building material, as well as with uncertainty in maintenance costs. The quantification of the cost-effective part of this definition remains a difficult task.

Authors

Jan Fordeyn (Jan De Nul Group) | Katrien Van der Biest (University of Antwerp) | Emile Lemey (Jan De Nul Group) | Annelies Boerema (University of Antwerp) | Patrick Meire (University of Antwerp)

Posted on February 13, 2020 with tags IADC, Prins Hendrik, Terra et Aqua.

See the article here:

Ecosystem Services Assessment of the Prins Hendrik Zanddijk - Dredging Today

JPMorgan Dipping its Toes into the Ethereum Ecosystem Could Be Bullish for ETH – newsBTC

Banking giant JPMorgan has offered mixed signals when it comes to its thoughts on cryptocurrencies, with the bank's CEO frequently bashing Bitcoin and other cryptocurrencies, despite the bank offering its own intra-bank digital asset, dubbed JPM Coin, that is built upon the Ethereum (ETH) blockchain.

Now, it appears JPMorgan could be dipping its toes into the Ethereum ecosystem, as news recently broke that the financial institution is looking to merge its blockchain unit with ConsenSys.

Analysts believe that this potential merger would be highly bullish for Ethereum's price action.

The potential merger, first reported by Reuters, is expected to be formally announced within the next six months, and the details surrounding it remain foggy.

It is important to note that JPMorgan's blockchain unit, called Quorum, is built upon the Ethereum network, which makes ConsenSys, led by Ethereum co-founder Joe Lubin, an obvious merger target due to its heavy involvement within the Ethereum ecosystem.

Ethereum is currently trading up just under 2% at its current price of $226, although its intraday climb likely has less to do with investors' excitement surrounding this possibility and more to do with Bitcoin's rise to $10,000.

If this partnership does materialize, however, it will likely generate some buzz surrounding Ethereum, and possibly lead retail investors to fomo into fresh positions.

Although it is unlikely that the cryptocurrencys current price action will be influenced by this news due to the lack of details surrounding it, analysts do believe it is firmly bullish for ETH in the long-term.

Satoshi Flipper, a prominent cryptocurrency analyst on Twitter, explained in a recent tweet that this merger could be JPMorgan's attempt to increase its presence within the enterprise blockchain arena prior to the launch of ETH 2.0.

"So why is this so bullish for #Ethereum? Because cash is king and JPMorgan has much of it. With the pending release of 2.0, JPMorgan could desire an increased presence in the enterprise blockchain arena. And #Ethereum is a quick ticket to get there," he explained while referencing the news report.

Because it could be another six months before there's a formal announcement regarding this partnership, there is still a lot of time for things to change, which is likely leading Ethereum investors to express some caution when it comes to trading this news.

Read the original:

JPMorgan Dipping its Toes into the Ethereum Ecosystem Could Be Bullish for ETH - newsBTC

Regenerative farming: how farmers can make the transition – The Poultry Site

The FAI Farm Gate podcast is interviewing Claire Hill, farm manager at FAI Farms, and Caroline Grindrod, regenerative farmer and consultant from Roots of Nature, to discuss their experiences with regenerative agriculture and its potential to make farming an environmental and economic boon in the coming decades.

Though FAI Farm in Oxford is already an organic livestock farm, the team recently decided to transition its breeder/finisher operation to regenerative farming. Hill and her team made the decision after reviewing existing evidence on grazing practices and environmental sustainability. When thinking of her business strategy, she felt that developing the market and supply chain for regeneratively raised animals could counteract the current narrative of agriculture's role in the climate crisis.

She began transitioning her ruminants off grain and onto FAI's pastureland to get more savvy with grazing and soil management. She also wanted to improve her water infrastructure to make it more sustainable. Hill refers to the moves as an "evolution." These changes, and the growing salience of climate change, led her to embrace regenerative farming instead of remaining an organic operation.

According to Grindrod, there isn't a single definition of regenerative farming. However, she stresses that the term shouldn't be diluted or green-washed as it gains traction. When explaining regenerative agriculture to Costain, she cited the summary published on Wikipedia. That definition lists the priorities of regenerative agriculture: increasing biodiversity, improving soil quality, restoring watersheds and enhancing the ecosystem of the farm.

When explaining how she advises her clients, she says that a healthy ecosystem will also be a productive one. Regenerative farming requires the farmer to completely rethink their role in the ecosystem. Instead of the farmer acting as a manager, she tells farmers that they are in the ecosystem with their animals. By viewing the farm and surrounding ecology holistically, farmers will be better placed to sequester carbon, replenish the watershed and foster biodiversity.

Regenerative farming also takes a different view of daily farming practices. Instead of focusing on inputs and extracting profitable outputs, regenerative agriculture is more focused on fostering complexity in the surrounding environment. It isn't about damaging things less while farming; Grindrod wants her clients to be at the right end of ecosystem management.

Holistic management, the framework Grindrod uses when transitioning farms to regenerative methods, involves looking at the farm context and surrounding environment. The goal for the farmer is to do the best they can with the farm they have. To begin, Grindrod evaluates the physical landscape of the farm and its ecosystem processes. This includes the water and mineral cycles. It also covers more nuanced indicators like community dynamics and energy flow.

Community dynamics is a way of indicating the level of complexity in the farm's ecosystem. After initially evaluating the surrounding environment, Grindrod might consider introducing functional species to counteract issues that are emerging on the farm. This could mean introducing pest-eating insects to reduce a farm's reliance on pesticides or adding methanotrophs to the farm system to ensure wastes are broken down quickly.

Energy flow focuses on ways farmers can create or store energy on-farm. Grindrod might think of ways for farmers to sequester carbon in plant biomass or where to place solar panels to capture the most energy.

The holistic framework means that any regenerative steps are highly individualised and focused on long-term farm goals like improving soil quality or eliminating pesticide use. There's also an emphasis on identifying the root causes for using conventional farm inputs that can harm the environment, like fertiliser or medical treatments. The idea is that decisions made within the framework will evaluate the welfare and health consequences for the animals, input costs and ecosystem costs, and make sure solutions are economically feasible.

Both Grindrod and Hill concede that changing farmers' mindsets was one of the biggest obstacles to implementing regenerative agriculture. Farmers tend to be risk-averse and know their land very well, meaning that new methods can be met with resistance. Since regenerative solutions are farm-specific and not universal, Hill and Grindrod have had farmers immediately say, "you can't do that on this farm," to many of their proposals.

As Hill was explaining her own transition from organic production to regenerative agriculture, she said that it requires the mentality of "how can we do that on this farm?" as opposed to "you can't do that on this farm."

Hill emphasised that her move wasn't a formula that could easily be adopted by all farms; the transition to regenerative agriculture is difficult to replicate on a wide scale. In addition, many of Hill's steps didn't work immediately out of the gate. It took time and investment for the solutions to work.

Another component of farmers' mindsets that had to change was the perception that adopting regenerative techniques would lead to drops in yield or profitability. In Hill's experience, this perception isn't borne out by the facts. When discussing FAI Farm's transition with Costain, she predicts that her profit margins will increase because she isn't spending as much money on inputs and is saving time on her daily tasks.

Since making the transition, Hill isn't spending as much on straw or silage, and she doesn't have to do as much tractor work (so less is spent on diesel). She is also saving money on overall running costs since she is using fewer machines. Treatment costs for the animals are down as well, meaning that fewer chemicals are being released into the ecosystem. For her farm, these benefits have made the regenerative investments worthwhile; she's hoping to make further investments in the future.

Though FAI Farm is early in their regenerative journey, Grindrod is confident that it will become more successful over time. Her consultancy, Roots of Nature, has shown that regenerative farms can have high productivity measures while regenerating soil quality and the ecosystem. In her view, regeneration could be the key strategy for the agricultural sector to remain sustainable and resilient for the future.

Listen to the full Farm Gate podcast here.

Original post:

Regenerative farming: how farmers can make the transition - The Poultry Site

OIF Members to Showcase Innovation and Interoperability Solutions for the Industry's Most Critical Challenges at OFC 2020 – Yahoo Finance

Interoperability demos on 400ZR, CEI-112G, FlexE and IC-TROSA; OIF experts to lead panel discussions and OIF to Host "Cu (see you) Beyond 112 Gbps" Workshop

OIF will host one of the largest interoperability demos in its history, reflecting the ongoing significance of OIF's work in addressing global network challenges. Twenty industry-leading system vendors, component vendors and test equipment vendors will demonstrate critical insight into how key technologies, 400ZR, Common Electrical I/O (CEI)-112G, Flex Ethernet (FlexE) and Integrated Coherent Transmit-Receive Optical Sub Assembly (IC-TROSA), interoperate within the industry's ecosystem at OFC 2020 in San Diego, March 10-12, 2020 (booth #6221).

Participating companies include Acacia Communications, Amphenol, Arista Networks, Cadence Design Systems, Inc., Cisco Systems, Inc., Credo Semiconductor (HK) LTD, Fujitsu Optical Components, II-VI, Inphi Corporation, Juniper Networks, Keysight Technologies, Marvell, Microchip, Molex, NeoPhotonics, Samtec, Inc., Spirent Communications, TE Connectivity, VIAVI Solutions and Yamaichi Electronics.

"The participation level by 20 companies in this year's interoperability demos for 400ZR, CEI-112G, FlexE and IC-TROSA is a true reflection of our enduring leadership and evidence that these technologies continue to be the preeminent focus areas for our member companies," explained Steve Sekel of Keysight, OIF's Physical and Link Layer (PLL) Interoperability Working Group (WG) Chair. "We are eager to showcase the advancements that these technologies and our work have made over the past year."

400ZR Demo

OIF has defined the 400ZR interface, which provides interoperability of coherent optical interfaces for data center interconnect applications. The demo will show the first-ever operation of 400ZR equipment from multiple system and module vendors in multiple pluggable form factors.

CEI-112G Demo

The CEI-112G demo will feature interoperating channels, components and silicon that demonstrate the CEI-112G-XSR, CEI-112G-VSR, CEI-112G-MR and CEI-112G-LR draft implementation agreements. Demonstrating interoperability for extra short reach channels is important to support the co-packaging developments that are expected to be discussed throughout the week at OFC. Interoperable very short reach, medium reach and long reach channels and silicon are also critical to support the developing 112 Gbps equipment that is expected to come to market soon. These updated demos surpass what OIF has demonstrated in the past with additional member contributions and continue to build on the developing ecosystem of products coming to market to support CEI-112G.

FlexE Demo

The FlexE demo that will be on display incorporates FlexE silicon operating with multiple test equipment and interoperating over a 400 Gbps fiber network that will be deployed on the show floor between the Ethernet Alliance booth and the OIF booth. This reflects further developments that are taking place in the FlexE market since the previous OIF FlexE demos.

IC-TROSA Demo

The IC-TROSA features all the optical building blocks for a coherent module in a single package. This demo will highlight important aspects of IC-TROSA integration as well as real-time EVM measurements with the updated script for 400ZR.

An additional point of interest will be a static display of coherent optical components which will emphasize the role of OIF in the coherent optical marketplace over the past 10 years.

OIF @ OFC 2020 Activities

OIF experts will participate in two panels at OFC 2020 that feature the latest updates on critical technologies that work to enable a more efficient and reliable network.

"400ZR Specification Update" Tuesday, 10 March, 13:30-14:30, Theater III

Moderator: Karl Gass, OIF PLL WG Vice Chair Optical

Speakers include: Josef Berger, Inphi Corporation; Masahiro Mogi, Fujitsu Optical Components; Gert Sarlet, II-VI; Marc Stiller, NeoPhotonics and Markus Weber, Acacia Communications

Industry experts from OIF will lead a panel discussion with representatives from the DSP, optics, equipment and end user communities on the conflicting demands for a near-term, high-volume, interoperable, moderate reach, coherent 400G optical link. The session will also include an update on OIF's project to define a 400ZR link specification.


"112 Gbps Electrical Interfaces: An OIF update on CEI-112G" Wednesday, 11 March, 16:15-17:00, Theater II

Moderator: Nathan Tracy, OIF President, TE Connectivity

Speakers include: Ed Frlan, OIF Technical Committee Chair, Semtech, Corp.; Mike Li, OIF Board, Intel; Cathy Liu, OIF Board, Broadcom, Inc.; Gary Nicholl, OIF Board, Cisco; Steve Sekel, OIF PLL Interoperability WG Chair, Keysight Technologies

OIF experts will lead a panel discussion on the ongoing CEI-112G electrical interface development projects and the new architectures they will enable, including chiplet packaging, co-packaged optics and internal cable-based solutions. The panel will provide an update on the multiple interfaces being defined by OIF, including CEI-112G MCM, XSR, VSR, MR and LR, for 112 Gbps applications of die-to-die, chip-to-module, chip-to-chip and long reach over backplane and cables.

"Cu (see you) Beyond 112 Gbps" Workshop Thursday, 12 March, 12:00-17:30, Hilton San Diego Gaslamp, 401 K St (across from the SD Convention Center), San Diego, Ca.

OIF will hold a half-day workshop "Cu (see you) Beyond 112 Gbps" featuring experts from Arista, Broadcom, Inc., Cisco, Facebook, Google, Innovium, Intel, MACOM and TE Connectivity discussing the needs and challenges for electrical interfaces beyond 112 Gbps. Registration required:

Check the status of OIFs current work here.

About OIF

OIF is where the optical networking industry's interoperability work gets done. Building on 20 years of effecting forward change in the industry, OIF represents a dynamic ecosystem of 100+ industry-leading network operators, system vendors, component vendors and test equipment vendors collaborating to develop interoperable electrical, optical and control solutions that directly impact the industry's ecosystem and facilitate global connectivity in the open network world. Connect with OIF at @OIForum, on LinkedIn and at http://www.oiforum.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200218005625/en/

Contacts

Leah WilkinsonWilkinson + Associates for OIFEmail: leah@wilkinson.associates Office: 703-907-0010

Original post:

OIF Members to Showcase Innovation and Interoperability Solutions for the Industry's Most Critical Challenges at OFC 2020 - Yahoo Finance

Announced the Addition of New Statistical Data Titled as Embedded Computing Ecosystem Market 2020 by Growth, Demand & Opportunities & Forecast…

Global Embedded Computing Ecosystem market reports provide an in-depth and detailed analysis of the information you need. This report provides an extensive overview of market-based factors that are expected to have a substantial and decisive impact on the market's growth prospects over the forecast period.

Demand for the global Embedded Computing Ecosystem market is growing significantly, driving a larger market with a better-quality experience. The rapid increase in technological progress is expected to be practical in the next few years.

Avail Sample Report @https://www.researchnreports.com/request_sample.php?id=222794

Top Key players:

AMD, ANalog Devices, ARM, Apple, Broadcom, Cypress, Fujitsu, IBM, Infineon, Microchip/Atmel, Microsoft, Nvidia, Qualcomm, Renesas, STMicroelectronics, Samsung, Texas Instruments

Geographically, the global Embedded Computing Ecosystem market can be classified into regions such as North America, Europe, Asia Pacific (APAC), the Middle East and Africa, and Latin America. The region with the largest distribution network is dominant and is expected to record the highest CAGR by the end of the forecast period.

In summary, the report includes business and finance information, company profiles, and recent growth. It also presents the challenges faced by each company and the business strategies implemented to generate high profits in the market.


Get maximum discount: https://www.researchnreports.com/ask_for_discount.php?id=222794

Table of Content:

Chapter 1 Global Embedded Computing Ecosystem Industry Overview

Chapter 2 Global Economic Impact on Industry

Chapter 3 Global Market Competition by Manufacturers

Chapter 4 Global Production, Revenue (Value) by Region (2014-2020)

Chapter 5 Global Supply (Production), Consumption, Export, Import by Regions (2014-2020)

Chapter 6 Global Production, Revenue (Value), Price Trend by Type

Chapter 7 Global Market Analysis by Application

Chapter 8 Manufacturing Cost Analysis

Chapter 9 Industrial Chain, Sourcing Strategy and Downstream Buyers

Continue for TOC

For more information, please visit @https://www.researchnreports.com/enquiry_before_buying.php?id=222794

Company Overview:

Research N Reports is a new-age market research firm focused on providing information that can be effectively applied. In today's consumer-driven market, companies require information to deal with a complex and dynamic world of choices, where relying on a sounding-board firm for your decisions becomes crucial. Research N Reports specializes in industry analysis and market forecasts, producing quality reports covering all verticals, whether it be gaining perspective on current market conditions or staying ahead of cutthroat global competition. Since we excel at business research to help businesses grow, we also offer consulting as an extended arm to our services, which helps us gain more insight into current trends and problems. Consequently, we keep evolving as an all-round provider of viable information under one roof.

Contact Us:

Address: 10916 Gold Point Dr

Houston, TX 77064, USA

Call Us: USA: +1 510-402-1213

UK: +44 753-712-1342

APAC & Malta: +356 2792 2019

Email: [emailprotected]

http://www.researchnreports.com

Go here to see the original:

Announced the Addition of New Statistical Data Titled as Embedded Computing Ecosystem Market 2020 by Growth, Demand & Opportunities & Forecast...

Here’s how a visiting venture capitalist explores Houston’s startup ecosystem for the first time – InnovationMap

When Houston Exponential established the HX Venture Fund, the goal was to bring out-of-town capital and investors into the city of Houston. The fund of funds invests in a portfolio of venture capital funds with the hope that those funds find a way back into the Houston startup ecosystem.

After a little over a year, HXVF has invested in five funds: Boston-based .406 Ventures, Austin-based Next Coast Ventures, Boston-based OpenView Venture Partners, Washington D.C.-based Updata Partners, and Austin-based LiveOak Venture Partners.

The fund of funds is also regularly hosting those five funds as well as a mix of potential portfolio fund members in Houston for what the HXVF calls "immersion days" where the venture capitalists can meet local startups, innovation leaders, and even fellow investors that they could eventually co-invest with.

"The goals of these days are to have venture capitalists travel to Houston, meet with our entrepreneurs (and the startup development organizations like Station, Cannon and WeWork that support them), and provide both capital and expertise in company building to the tech companies," says Sandy Guitar Wallis, managing partner at HXVF. "The venture capitalists also meet with HX Venture Fund corporate LPs, who can be customers or acquirers of their portfolio companies."

Just this month, HXVF is hosting four funds: two from its portfolio and two that it hasn't yet invested in. San Antonio-based Active Capital, which has raised a $21 million fund, is among the visiting VCs this month. The fund's founder, Pat Matthews, an entrepreneur turned venture capitalist, has shared his busiest day, February 5, as well as his perspective on Houston innovation with InnovationMap.

After waking up at the Hotel Derek, Matthews starts his second day in Houston by taking a Lyft to the Greater Houston Partnership for what he believes to be a breakfast meeting with Wallis and Guillermo Borda of HXVF, but the group has so much to discuss that the meal falls by the wayside.

Before this trip, Matthews hadn't visited Houston in a professional capacity. While Active Capital is based just down I-10 in San Antonio, the firm's investments are split almost evenly between deals done in Texas and the rest of the world. Active Capital focuses on B2B SaaS investments, usually leading seed or series A rounds.

Matthews has called Texas home for around a decade. He founded an email marketing startup in Virginia, which was acquired by San Antonio-based Rackspace. He relocated to join Rackspace and worked on growing the organization for six years before creating Active Capital.

Following the meeting, still unfed, Matthews meets up with Serafina Lalany from Houston Exponential to carpool to The Cannon on the west side of town.

Matthews forgoes his usual carb aversion to eat slices of Domino's pizza at The Cannon before beginning the first of his three fireside chats with Houston innovators. Patrick Schneidau, CEO of Truss, leads the conversation at The Cannon. (Schneidau is a board member of InnovationMap.) After the chat, Matthews has a meeting with a startup before heading back into town.

With one fireside chat down, Matthews heads into his second of the day at Station Houston with Joe Alapat, founder of Liongard. Matthews observes that each of the entrepreneurs who interviewed him had great questions and seemed to be far along with their companies. Meanwhile, many of the people he met before or after the chats seemed to be at a much earlier stage in their startup journeys.

The last fireside chat was hosted by Rakesh Agrawal of Snapstream at WeWork's Jones Building location. Matthews and Agrawal attempted to set up a Facebook livestream for the conversation, but an issue with the technology wouldn't allow for the stream.

With meetings and fireside chats done, Matthews heads straight to a dinner with Blair Garrou, founder and managing director of Mercury Fund. The two venture capitalists dine at Eunice and split several appetizers and a bottle of wine while discussing their own recent investments and interests. Matthews, who met Garrou in 2014, thinks of him as a great mentor in venture capital.

Matthews heads back to the hotel after dinner and crashes hard after the long day. The next day, he would head back to San Antonio on a Vonlane bus; he gets a lot of work done on his trips.

Matthews says he left Houston with an overall positive opinion of the city, and says it's similar to other Texas cities, aside from Austin, in its startup presence and capacity. While he assumed he'd meet energy and space startups, he realized Houston had a lot more going on than that.

"It definitely seemed like there was a lot of passion and a lot of hustle," Matthews says. "And it seems like the city is really working to support and cultivate that and keep it in Houston. I was inspired."

Throughout the visit, Matthews handed out his business card and some conversations have developed from those connections, he says. Another representative from Active Capital who is focused on sourcing deals with startups will visit next, and Matthews says he also thinks that he'll return to Houston to continue conversations he's been having, including some with other investors.

"I could definitely see doing deals in Houston," Matthews tells InnovationMap.

The rest is here:

Here's how a visiting venture capitalist explores Houston's startup ecosystem for the first time - InnovationMap

On Apple’s revenue warning: Jim Cramer and others weigh in – CNBC

Here's what five experts say investors should watch regarding Apple's expectations of a financial hit because of the coronavirus outbreak in China.

A day after the announcement, Apple stock was down 3% at midday Tuesday.

Jim Cramer, host of CNBC's "Mad Money," says long-term believers of Apple's bull case should buy in here.

"I am surprised it's not down more because it's not just supply. It's demand. Demand has to come down because they don't do much shopping in China and with supply, it's not them. It's supply chain and some of the companies that need to give them the supplies, they can switch their six or seven plants in that Hubei area that are halted. I think there's just not a lot of people that want to sell anything. I think a lot of people thought this would happen and we're just waiting for that shoe to fall. I can't come up with a reason short term to buy it. I do think that if you believe the coronavirus is going to get better, you buy it right here."

Mike Volpi, general partner at Index Ventures, says a stock reaction is to be expected as Apple gets squeezed on the supply and demand side.

"Not entirely surprising. If you look at what's happening in China right now, there's over 150 million people that have some kind of a restriction can't leave their home, can't travel. That's over 10% of the population and Apple faces a bit of a double whammy in that when people can't travel and leave their homes, they can't buy phones, they can't buy iPads and so forth and at the same time, all of their production is in China, so you get hit on the supply side and on the demand side. So in some sense, not totally surprising."

Daniel Flax, senior research analyst at Neuberger Berman, says overall demand still remains healthy.

"I think what's important here is that people are demanding the devices which are innovative, the ecosystem remains healthy, newer growth drivers like wearables are doing very, very well for the company and so this is of course tragic for China and the people globally that are being impacted, but we see it largely as supply related and of course a little bit of demand impacted in China for the company."

William Power, senior research analyst at Robert W. Baird, says demand outside of China should offset the impact.

"This has been an ecosystem story for some time. We think that remains very much intact. You still got a services business that's growing mid-teens, you got a wearables business that grew north of 30%. I think one of the key comments out of the release last night was that they're still seeing very strong demand outside of China. Expectations are still in line on that front. So this really hasn't extended beyond China obviously a terrible human toll and we think ultimately is more of a temporary impact."

Gene Munster, founder of Loup Ventures, says this stock reaction is less than other warnings in the past.

"I think it's important to point out in the context this is the second time in six quarters that Apple's missed numbers. China was part of it back in December of 2018. At that point, if you look at the stock, that was a well-telegraphed miss but the trajectory downward was 35%. And so we're talking about one-10th of the reaction versus six quarters ago. I think that context is important because, on large, investors are viewing this as something that is transitional, this black swan type of event, and we'll move through it."


View post:

On Apple's revenue warning: Jim Cramer and others weigh in - CNBC

Floating Farms May Help Reinvent the World’s Food Ecosystems – WIRED

Karma, Courage, and Sustainabetty are special heifers. They have uninterrupted views of Rotterdam harbor, poop on a poop deck, and walk that gangplank to a pasture. They and 31 other Meuse-Rhine-Issel cows clomped aboard the world's first floating dairy farm last May.

"It's the best milk in the world," says Peter van Wingerden, founder of the Dutch property development company Beladon, which built the barge. In 2012, hearing that floods from Hurricane Sandy had crippled New York City's food distribution system, he imagined that waterborne urban farms could boost food security. Why Rotterdam? A quarter of the Netherlands is below sea level. Why 1,500-pound bovines? "If we could put big animals inside the city on a floating barge, we could do anything."

Getting a green light required years of answering questions from local officials: Crucially, do cows get seasick? On a steady platform, their research concluded, heifers likely won't spew their cud. The 4,843-square-foot stable floats on concrete pontoons anchored by two steel beams driven 65 feet into the seabed. The structure rises and falls with the 8-foot tides and never tilts more than 11 inches, even in winds topping 70 mph or if the herd crowds the stern to watch passing crustaceans.

Each day aboard this largely self-sustaining ecosystem, cows eat potato peels and grass clippings, then set free 5,700-plus pounds of dung, which a Roomba-like robot sucks up and dumps down a shaft to a deck below. There it's turned into fertilizer for the soccer fields and parks that grow the grass feed. A milking robot pulls around 5 gallons from each heifer, which is bottled or made into yogurt and then trucked to local grocery stores.

Van Wingerden has talked to food companies and developers seeking to bring buoyant dairies to Singapore, Dubai, and New York. Alas, experts say large-scale floating farms would be prohibitively expensive and rely on too many resources to remain sustainable. But van Wingerden hopes the sight of cows grazing on a boat sparks creative thinking for future food production. Humans must produce 56 percent more food to feed a global population of 9.8 billion by 2050. Sure, seems like that'll happen when pigs fly. Or, when cows float.

Laura Mallonee (@LauraMallonee) writes about photography for WIRED.

This article appears in the March issue. Subscribe now.


The rest is here:

Floating Farms May Help Reinvent the World's Food Ecosystems - WIRED

Italian carmaker Maserati to invest USD 1.29 billion to build an electric ecosystem – The New Indian Express

By Express News Service

Italian luxury automaker Maserati has revealed details of its global product strategy. Under the new plan, the company will shift its focus to electrification, with pure electric vehicles (EVs) and hybrid vehicles, while expanding its product line-up with an all-new SUV.

The company said its models will be 100 per cent developed, engineered and built in Italy. Its electrification programme starts this year, and the first hybrid car to be built will be the new Maserati Ghibli. Production of the new Maserati GranTurismo and GranCabrio, the brand's first cars to adopt 100 per cent electric solutions, will commence in 2021, it said.

Maserati has also decided to build the GranTurismo and GranCabrio at the Mirafiori production hub, with an investment of 800 million euros ($1.29 billion). The latest generation of the GranTurismo and GranCabrio, two iconic cars for the Trident brand, have totalled more than 40,000 units sold from 2007 to 2019.

"During 2020, Mirafiori will be strengthening its position as a world hub dedicated to the electrification and mobility of the future, with a large proportion of its capacity allocated to the production of the brand's new electrified cars," Maserati said in a statement. It added that its heart is still in Modena (Italy), where it has its headquarters. In 2020, the first of the new Modena-built Maserati models will be the super sports car. The company said that major modernisation work is in progress on the production line at the Modena plant, partly to accommodate the electric version of the new super sports car. The carmaker is also coming up with a utility vehicle, which will be built at Cassino and is intended to play a leading role for the brand.

About 800 million euros will be invested in construction of the new production line, scheduled to begin at the end of the first quarter of 2020. The first pre-production cars are expected to come off the line by 2021, the carmaker said. In its concluding remarks, Maserati said that, as one of the brand's recent claims puts it, "the music is changing", and this will be even more apparent in May this year, when past and future will meet to place Maserati firmly on the world stage for the future of mobility.

Maserati's 2020 plan

As part of its 2020 range, Maserati had launched the V6 petrol variants of the Ghibli, Quattroporte and Levante for the very first time in the Indian market in December. The petrol variants will be available with both twin-turbo V6 engines, of 350 hp and 430 hp.

See original here:

Italian carmaker Maserati to invest USD 1.29 billion to build an electric ecosystem - The New Indian Express

Quantum computing – Wikipedia


Quantum computing is the use of quantum-mechanical phenomena such as superposition and entanglement to perform computation. A quantum computer is a device used to perform such computation, which can be implemented theoretically or physically.[1]:I-5 There are currently two main approaches to physically implementing a quantum computer: analog and digital. Analog approaches are further divided into quantum simulation, quantum annealing, and adiabatic quantum computation. Digital quantum computers use quantum logic gates to do computation. Both approaches use quantum bits, or qubits.[1]:213

Qubits are fundamental to quantum computing and are somewhat analogous to bits in a classical computer. A qubit can be in the 1 or 0 quantum state, but it can also be in a superposition of the 1 and 0 states. However, when qubits are measured, the result is always either a 0 or a 1; the probabilities of the two outcomes depend on the quantum state they were in.
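To make this concrete, here is a minimal sketch in plain Python (not from the article; the amplitude-pair representation and the `probabilities` helper are illustrative conveniences): a qubit is a pair of complex amplitudes for the 0 and 1 states, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for the 0 and 1
# outcomes, normalized so that |a|^2 + |b|^2 = 1.
def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

# Equal superposition: measurement yields 0 or 1 with probability 1/2 each.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))
p0, p1 = probabilities(plus)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5

# A pure |0> state always measures 0.
print(probabilities((1, 0)))  # (1, 0)
```

The key point the code mirrors is that the superposition itself is never observed directly: only the 0-or-1 outcome, with the stated probabilities.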

Quantum computing began in the early 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine.[2] Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things that a classical computer could not.[3][4] In 1994, Peter Shor developed a quantum algorithm for factoring integers that had the potential to decrypt all secured communications.[5]

Despite ongoing experimental progress since the late 1990s, most researchers believe that "fault-tolerant quantum computing [is] still a rather distant dream".[6] On 23 October 2019, Google AI, in partnership with the U.S. National Aeronautics and Space Administration (NASA), published a paper in which they claimed to have achieved quantum supremacy.[7] While some have disputed this claim, it is still a significant milestone in the history of quantum computing.[8]

The field of quantum computing is a subfield of quantum information science, which includes quantum cryptography and quantum communication.

The prevailing model of quantum computation describes the computation in terms of a network of quantum logic gates. What follows is a brief treatment of the subject based upon Chapter 4 of Nielsen and Chuang.[9]

A memory consisting of n bits of information has 2^n possible states. A vector representing all memory states thus has 2^n entries (one for each state). This vector should be viewed as a probability vector and represents the fact that the memory is to be found in a particular state.

In the classical view, one entry would have a value of 1 (i.e. a 100% probability of being in this state) and all other entries would be zero. In quantum mechanics, probability vectors are generalized to density operators. This is the technically rigorous mathematical foundation for quantum logic gates, but the intermediate quantum state vector formalism is usually introduced first because it is conceptually simpler. This article focuses on the quantum state vector formalism for simplicity.

We begin by considering a simple memory consisting of only one bit. This memory may be found in one of two states: the zero state or the one state. We may represent the state of this memory using Dirac notation, writing the zero state as |0⟩ and the one state as |1⟩.

The state of this one-qubit quantum memory can be manipulated by applying quantum logic gates, analogous to how classical memory can be manipulated with classical logic gates. One important gate for both classical and quantum computation is the NOT gate, represented by a matrix (often written X) that swaps the amplitudes of |0⟩ and |1⟩.
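A small sketch of that gate action (illustrative code, not from the source; the `apply` helper is an assumption for readability): the NOT gate is the 2x2 matrix with rows (0 1) and (1 0), and applying it is an ordinary matrix-vector product over the qubit's amplitudes.

```python
# The NOT (X) gate as a 2x2 matrix acting on a qubit's amplitude vector.
X = [[0, 1],
     [1, 0]]

def apply(gate, state):
    # Matrix-vector product: new_amp[i] = sum_j gate[i][j] * state[j]
    return tuple(sum(gate[i][j] * state[j] for j in range(2)) for i in range(2))

zero = (1, 0)          # |0>
one = (0, 1)           # |1>
print(apply(X, zero))  # (0, 1), i.e. |1>
print(apply(X, one))   # (1, 0), i.e. |0>
```

Because the gate acts linearly, the same matrix also maps any superposition of |0⟩ and |1⟩ to the corresponding swapped superposition, which is exactly what distinguishes a quantum gate from a classical bit-flip on a definite value.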

The mathematics of single qubit gates can be extended to operate on multiqubit quantum memories in two important ways. One way is simply to select a qubit and apply that gate to the target qubit whilst leaving the remainder of the memory unaffected. Another way is to apply the gate to its target only if another part of the memory is in a desired state. These two choices can be illustrated using another example. The possible states of a two-qubit quantum memory are |00⟩, |01⟩, |10⟩, and |11⟩, together with their superpositions.
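The second, conditional kind of gate can be sketched with the controlled-NOT (CNOT) gate, which the text returns to below. This is illustrative code, not from the source: a two-qubit state is four amplitudes over |00⟩, |01⟩, |10⟩, |11⟩, and CNOT flips the target (second) qubit only when the control (first) qubit is 1, which amounts to swapping the last two amplitudes.

```python
# A two-qubit state is four amplitudes over the basis |00>, |01>, |10>, |11>.
# CNOT flips the target (second) qubit only when the control (first) is 1,
# i.e. it swaps the |10> and |11> amplitudes and leaves the rest alone.
def cnot(state):
    a00, a01, a10, a11 = state
    return (a00, a01, a11, a10)

print(cnot((0, 0, 1, 0)))  # |10> -> |11>: (0, 0, 0, 1)
print(cnot((1, 0, 0, 0)))  # control is 0, so |00> is unchanged: (1, 0, 0, 0)
```

Applying a single-qubit gate to only one qubit of the pair, the first option described above, corresponds in the same representation to a 4x4 matrix built as a tensor product of the gate with the identity.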

In summary, a quantum computation can be described as a network of quantum logic gates and measurements. Any measurement can be deferred to the end of a quantum computation, though this deferment may come at a computational cost. Because of this possibility of deferring a measurement, most quantum circuits depict a network consisting only of quantum logic gates and no measurements. More information can be found in the following articles: universal quantum computer, Shor's algorithm, Grover's algorithm, Deutsch–Jozsa algorithm, amplitude amplification, quantum Fourier transform, quantum gate, quantum adiabatic algorithm and quantum error correction.

Any quantum computation can be represented as a network of quantum logic gates from a fairly small family of gates. A choice of gate family that enables this construction is known as a universal gate set. One common such set includes all single-qubit gates as well as the CNOT gate from above. This means any quantum computation can be performed by executing a sequence of single-qubit gates together with CNOT gates. Though this gate set is infinite, it can be replaced with a finite gate set by appealing to the Solovay-Kitaev theorem.

Integer factorization, which underpins the security of public key cryptographic systems, is believed to be computationally infeasible with an ordinary computer for large integers if they are the product of few prime numbers (e.g., products of two 300-digit primes).[10] By comparison, a quantum computer could efficiently solve this problem using Shor's algorithm to find its factors. This ability would allow a quantum computer to break many of the cryptographic systems in use today, in the sense that there would be a polynomial time (in the number of digits of the integer) algorithm for solving the problem. In particular, most of the popular public key ciphers are based on the difficulty of factoring integers or the discrete logarithm problem, both of which can be solved by Shor's algorithm. In particular, the RSA, Diffie–Hellman, and elliptic curve Diffie–Hellman algorithms could be broken. These are used to protect secure Web pages, encrypted email, and many other types of data. Breaking these would have significant ramifications for electronic privacy and security.

However, other cryptographic algorithms do not appear to be broken by those algorithms.[11][12] Some public-key algorithms are based on problems other than the integer factorization and discrete logarithm problems to which Shor's algorithm applies, like the McEliece cryptosystem based on a problem in coding theory.[11][13] Lattice-based cryptosystems are also not known to be broken by quantum computers, and finding a polynomial time algorithm for solving the dihedral hidden subgroup problem, which would break many lattice based cryptosystems, is a well-studied open problem.[14] It has been proven that applying Grover's algorithm to break a symmetric (secret key) algorithm by brute force requires time equal to roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case,[15] meaning that symmetric key lengths are effectively halved: AES-256 would have the same security against an attack using Grover's algorithm that AES-128 has against classical brute-force search (see Key size).
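The key-halving claim is easy to sanity-check with plain integer arithmetic (an illustration, not from the source; the two helper names are invented for clarity):

```python
# Rough brute-force costs for an n-bit symmetric key:
# classical exhaustive search tries ~2^n keys, while Grover's algorithm
# needs ~2^(n/2) invocations of the cipher.
def classical_cost(n):
    return 2 ** n

def grover_cost(n):
    return 2 ** (n // 2)

# AES-256 attacked with Grover costs about as much as AES-128 attacked
# with classical brute force: both are ~2^128 operations.
print(grover_cost(256) == classical_cost(128))  # True
```

This is why post-quantum guidance for symmetric ciphers is usually just "double the key length", in contrast to the public-key schemes above, which Shor's algorithm breaks outright.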

Quantum cryptography could potentially fulfill some of the functions of public key cryptography. Quantum-based cryptographic systems could, therefore, be more secure than traditional systems against quantum hacking.[16]

Besides factorization and discrete logarithms, quantum algorithms offering a more than polynomial speedup over the best known classical algorithm have been found for several problems,[17] including the simulation of quantum physical processes from chemistry and solid state physics, the approximation of Jones polynomials, and solving Pell's equation. No mathematical proof has been found to show that an equally fast classical algorithm cannot be discovered, although this is considered unlikely.[18] However, quantum computers offer polynomial speedup for some problems. The most well-known example of this is quantum database search, which can be solved by Grover's algorithm using quadratically fewer queries to the database than are required by classical algorithms. In this case, the advantage is not only provable but also optimal: it has been shown that Grover's algorithm gives the maximal possible probability of finding the desired element for any number of oracle lookups. Several other examples of provable quantum speedups for query problems have subsequently been discovered, such as finding collisions in two-to-one functions and evaluating NAND trees.

Problems that can be addressed with Grover's algorithm have the following properties:

For problems with all these properties, the running time of Grover's algorithm on a quantum computer scales as the square root of the number of inputs (or elements in the database), as opposed to the linear scaling of classical algorithms. A general class of problems to which Grover's algorithm can be applied[19] is the Boolean satisfiability problem. In this instance, the database through which the algorithm iterates is that of all possible answers. An example (and possible) application of this is a password cracker that attempts to guess the password or secret key for an encrypted file or system. Symmetric ciphers such as Triple DES and AES are particularly vulnerable to this kind of attack.[citation needed] This application of quantum computing is a major interest of government agencies.[20]
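The square-root scaling can be seen numerically with a quick sketch (illustrative code, not from the source; the function names are invented). The counts are order-of-magnitude query estimates, ignoring constant factors:

```python
import math

# Order-of-magnitude query counts for unstructured search over N items:
# a classical scan checks ~N candidates, Grover's algorithm needs ~sqrt(N)
# oracle queries (constants omitted).
def classical_queries(n):
    return n

def grover_queries(n):
    return math.isqrt(n)  # integer square root

n = 1_000_000
print(classical_queries(n), grover_queries(n))  # 1000000 1000
```

So for a million-entry search space the quantum speedup is roughly a factor of a thousand, and the gap widens as N grows, which is exactly why brute-force-style problems such as password cracking are the headline application.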

Since chemistry and nanotechnology rely on understanding quantum systems, and such systems are impossible to simulate in an efficient manner classically, many believe quantum simulation will be one of the most important applications of quantum computing.[21] Quantum simulation could also be used to simulate the behavior of atoms and particles at unusual conditions such as the reactions inside a collider.[22]

Quantum annealing, or adiabatic quantum computation, relies on the adiabatic theorem to undertake calculations. A system is placed in the ground state for a simple Hamiltonian, which is slowly evolved to a more complicated Hamiltonian whose ground state represents the solution to the problem in question. The adiabatic theorem states that if the evolution is slow enough the system will stay in its ground state at all times through the process.

The quantum algorithm for linear systems of equations, or "HHL algorithm", named after its discoverers Harrow, Hassidim, and Lloyd, is expected to provide a speedup over its classical counterparts.[23]

John Preskill has introduced the term quantum supremacy to refer to the hypothetical speedup advantage that a quantum computer would have over a classical computer in a certain field.[24] Google announced in 2017 that it expected to achieve quantum supremacy by the end of the year, though that did not happen. IBM said in 2018 that the best classical computers will be beaten on some practical task within about five years and views the quantum supremacy test only as a potential future benchmark.[25] Although skeptics like Gil Kalai doubt that quantum supremacy will ever be achieved,[26][27] in October 2019, a Sycamore processor created in conjunction with Google AI Quantum was reported to have achieved quantum supremacy,[28] with calculations more than 3,000,000 times as fast as those of Summit, generally considered the world's fastest computer.[29] Bill Unruh doubted the practicality of quantum computers in a paper published back in 1994.[30] Paul Davies argued that a 400-qubit computer would even come into conflict with the cosmological information bound implied by the holographic principle.[31]

There are a number of technical challenges in building a large-scale quantum computer.[32] David DiVincenzo listed the following requirements for a practical quantum computer:[33]

Sourcing parts for quantum computers is very difficult: Quantum computers need Helium-3, a nuclear research byproduct, and special cables that are only made by a single company in Japan.[34]

One of the greatest challenges is controlling or removing quantum decoherence. This usually means isolating the system from its environment as interactions with the external world cause the system to decohere. However, other sources of decoherence also exist. Examples include the quantum gates, and the lattice vibrations and background thermonuclear spin of the physical system used to implement the qubits. Decoherence is irreversible, as it is effectively non-unitary, and is usually something that should be highly controlled, if not avoided. Decoherence times for candidate systems in particular, the transverse relaxation time T2 (for NMR and MRI technology, also called the dephasing time), typically range between nanoseconds and seconds at low temperature.[35] Currently, some quantum computers require their qubits to be cooled to 20 millikelvins in order to prevent significant decoherence.[36]

As a result, time-consuming tasks may render some quantum algorithms inoperable, as maintaining the state of qubits for a long enough duration will eventually corrupt the superpositions.[37]

These issues are more difficult for optical approaches as the timescales are orders of magnitude shorter and an often-cited approach to overcoming them is optical pulse shaping. Error rates are typically proportional to the ratio of operating time to decoherence time, hence any operation must be completed much more quickly than the decoherence time.

As described in the quantum threshold theorem, if the error rate is small enough, it is thought to be possible to use quantum error correction to suppress errors and decoherence. This allows the total calculation time to be longer than the decoherence time, if the error correction scheme can correct errors faster than decoherence introduces them. An often cited figure for the required error rate in each gate for fault-tolerant computation is 10^-3, assuming the noise is depolarizing.

Meeting this scalability condition is possible for a wide range of systems. However, the use of error correction brings with it the cost of a greatly increased number of required qubits. The number required to factor integers using Shor's algorithm is still polynomial, and thought to be between L and L^2, where L is the number of qubits in the number to be factored; error correction algorithms would inflate this figure by an additional factor of L. For a 1000-bit number, this implies a need for about 10^4 bits without error correction.[38] With error correction, the figure would rise to about 10^7 bits. Computation time is about L^2, or about 10^7 steps; at 1 MHz, about 10 seconds.

A very different approach to the stability-decoherence problem is to create a topological quantum computer with anyons, quasi-particles used as threads and relying on braid theory to form stable logic gates.[39][40]

Physicist Mikhail Dyakonov has expressed skepticism of quantum computing as follows:

There are a number of quantum computing models, distinguished by the basic elements in which the computation is decomposed. The four main models of practical importance are:

The quantum Turing machine is theoretically important but the direct implementation of this model is not pursued. All four models of computation have been shown to be equivalent; each can simulate the other with no more than polynomial overhead.

For physically implementing a quantum computer, many different candidates are being pursued, among them (distinguished by the physical system used to realize the qubits):

A large number of candidates demonstrates that the topic, in spite of rapid progress, is still in its infancy. There is also a vast amount of flexibility.

The class of problems that can be efficiently solved by quantum computers is called BQP, for "bounded error, quantum, polynomial time". Quantum computers only run probabilistic algorithms, so BQP on quantum computers is the counterpart of BPP ("bounded error, probabilistic, polynomial time") on classical computers. It is defined as the set of problems solvable with a polynomial-time algorithm, whose probability of error is bounded away from one half.[61] A quantum computer is said to "solve" a problem if, for every instance, its answer will be right with high probability. If that solution runs in polynomial time, then that problem is in BQP.

BQP is contained in the complexity class #P (or more precisely in the associated class of decision problems P^#P),[62] which is a subclass of PSPACE.

BQP is suspected to be disjoint from NP-complete and a strict superset of P, but that is not known. Both integer factorization and discrete log are in BQP. Both of these problems are NP problems suspected to be outside BPP, and hence outside P. Both are suspected to not be NP-complete. There is a common misconception that quantum computers can solve NP-complete problems in polynomial time. That is not known to be true, and is generally suspected to be false.[62]

The capacity of a quantum computer to accelerate classical algorithms has rigid limits: upper bounds on the complexity of quantum computation. The overwhelming part of classical calculations cannot be accelerated on a quantum computer.[63] A similar fact prevails for particular computational tasks, like the search problem, for which Grover's algorithm is optimal.[64]

Bohmian mechanics is a non-local hidden-variable interpretation of quantum mechanics. It has been shown that a non-local hidden-variable quantum computer could implement a search of an N-item database in at most O(N^(1/3)) steps. This is slightly faster than the O(√N) steps taken by Grover's algorithm. Neither search method would allow quantum computers to solve NP-complete problems in polynomial time.[65]

Although quantum computers may be faster than classical computers for some problem types, those described above cannot solve any problem that classical computers cannot already solve. A Turing machine can simulate these quantum computers, so such a quantum computer could never solve an undecidable problem like the halting problem. The existence of "standard" quantum computers does not disprove the Church–Turing thesis.[66] It has been speculated that theories of quantum gravity, such as M-theory or loop quantum gravity, may allow even faster computers to be built. Currently, defining computation in such theories is an open problem due to the problem of time, i.e., there currently exists no obvious way to describe what it means for an observer to submit input to a computer and later receive output.[67][68]

Visit link:
Quantum computing - Wikipedia

What Is Quantum Computing? The Next Era of Computational …

When you first stumble across the term quantum computer, you might pass it off as some far-flung science fiction concept rather than a serious current news item.

But with the phrase being thrown around with increasing frequency, it's understandable to wonder exactly what quantum computers are, and just as understandable to be at a loss as to where to dive in. Here's the rundown on what quantum computers are, why there's so much buzz around them, and what they might mean for you.

All computing relies on bits, the smallest unit of information that is encoded as an on state or an off state, more commonly referred to as a 1 or a 0, in some physical medium or another.

Most of the time, a bit takes the physical form of an electrical signal traveling over the circuits in the computer's motherboard. By stringing multiple bits together, we can represent more complex and useful things like text, music, and more.

The two key differences between quantum bits and classical bits (from the computers we use today) are the physical form the bits take and, correspondingly, the nature of data encoded in them. The electrical bits of a classical computer can only exist in one state at a time, either 1 or 0.

Quantum bits (or qubits) are made of subatomic particles, namely individual photons or electrons. Because these subatomic particles conform more to the rules of quantum mechanics than classical mechanics, they exhibit the bizarre properties of quantum particles. The most salient of these properties for computer scientists is superposition. This is the idea that a particle can exist in multiple states simultaneously, at least until that state is measured and collapses into a single state. By harnessing this superposition property, computer scientists can make qubits encode a 1 and a 0 at the same time.

The other quantum mechanical quirk that makes quantum computers tick is entanglement, a linking of two quantum particles or, in this case, two qubits. When the two particles are entangled, the change in state of one particle will alter the state of its partner in a predictable way, which comes in handy when it comes time to get a quantum computer to calculate the answer to the problem you feed it.
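That predictable linkage can be sketched with a tiny simulation (illustrative code, not from the article; the amplitude representation and `measure` helper are assumptions for the sketch). In the maximally entangled "Bell" state, the two qubits always measure the same way: outcomes 00 and 11 each occur half the time, and 01 and 10 never occur.

```python
import math
import random

# A two-qubit state as four amplitudes over the basis |00>, |01>, |10>, |11>.
# The Bell state puts all the weight on the correlated outcomes 00 and 11.
bell = (1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2))

def measure(state):
    # Sample one joint outcome with probability |amplitude|^2.
    probs = [abs(a) ** 2 for a in state]
    r = random.random()
    for outcome, p in zip(("00", "01", "10", "11"), probs):
        r -= p
        if r <= 0:
            return outcome
    return "11"  # guard against floating-point rounding

outcomes = {measure(bell) for _ in range(1000)}
print(outcomes)  # only "00" and "11" ever appear
```

Learning the first qubit's result therefore fixes the second qubit's result, which is the correlation quantum algorithms exploit when steering qubits toward an answer.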

A quantum computer's qubits start in their 1-and-0 hybrid state as the computer initially starts crunching through a problem. When the solution is found, the qubits in superposition collapse to the correct orientation of stable 1s and 0s for returning the solution.

Aside from the fact that they are far beyond the reach of all but the most elite research teams (and will likely stay that way for a while), most of us don't have much use for quantum computers. They don't offer any real advantage over classical computers for the kinds of tasks we do most of the time.

However, even the most formidable classical supercomputers have a hard time cracking certain problems due to their inherent computational complexity. This is because some calculations can only be achieved by brute force, guessing until the answer is found. They end up with so many possible solutions that it would take thousands of years for all the world's supercomputers combined to find the correct one.

The superposition property exhibited by qubits can allow supercomputers to cut this guessing time down precipitously. Classical computing's laborious trial-and-error computations can only ever make one guess at a time, while the dual 1-and-0 state of a quantum computer's qubits lets it make multiple guesses at the same time.

So, what kind of problems require all this time-consuming guesswork calculation? One example is simulating atomic structures, especially when they interact chemically with those of other atoms. With a quantum computer powering the atomic modeling, researchers in material science could create new compounds for use in engineering and manufacturing. Quantum computers are well suited to simulating similarly intricate systems like economic market forces, astrophysical dynamics, or genetic mutation patterns in organisms, to name only a few.

Amidst all these generally inoffensive applications of this emerging technology, though, there are also some uses of quantum computers that raise serious concerns. By far the most frequently cited harm is the potential for quantum computers to break some of the strongest encryption algorithms currently in use.

In the hands of an aggressive foreign government adversary, quantum computers could compromise a broad swath of otherwise secure internet traffic, leaving sensitive communications susceptible to widespread surveillance. Work is currently being undertaken to mature encryption ciphers based on calculations that are still hard for even quantum computers to do, but they are not all ready for prime-time, or widely adopted at present.

A little over a decade ago, actual fabrication of quantum computers was still in its incipient stages. Starting in the 2010s, though, development of functioning prototype quantum computers took off. A number of companies have assembled working quantum computers in the last few years, with IBM going so far as to allow researchers and hobbyists to run their own programs on its machine via the cloud.

Despite the strides that companies like IBM have undoubtedly made to build functioning prototypes, quantum computers are still in their infancy. Currently, the quantum computers that research teams have constructed require a lot of overhead for executing error correction. For every qubit that actually performs a calculation, there are several dozen whose job is to compensate for that one's mistakes. The aggregate of all these qubits makes up what is called a logical qubit.

Long story short, industry and academic titans have gotten quantum computers to work, but they do so very inefficiently.

Fierce competition between quantum computer researchers is still raging, between big and small players alike. Among those who have working quantum computers are the traditionally dominant tech companies one would expect: IBM, Intel, Microsoft, and Google.

As exacting and costly a venture as creating a quantum computer is, a surprising number of smaller companies and even startups are rising to the challenge.

The comparatively lean D-Wave Systems has spurred many advances in the field, and proved it was not out of contention by answering Google's momentous announcement with news of a huge deal with Los Alamos National Laboratory. Smaller competitors like Rigetti Computing are also in the running to establish themselves as quantum computing innovators.

Depending on who you ask, you'll get a different frontrunner for the most powerful quantum computer. Google certainly made its case recently with its achievement of quantum supremacy, a milestone Google itself more or less devised: the point at which a quantum computer first outperforms a classical computer at some computation. Google's Sycamore prototype, equipped with 54 qubits, broke that barrier by zipping through a problem in just under three and a half minutes that would take the mightiest classical supercomputer 10,000 years to churn through.
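The size of that claimed gap is easy to put in numbers. Google's figure for Sycamore's runtime was about 200 seconds, and the arithmetic below turns the 10,000-year classical estimate into a speedup factor; it is just back-of-the-envelope math on the figures quoted above.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600           # ~31.6 million seconds
classical_seconds = 10_000 * SECONDS_PER_YEAR   # claimed supercomputer time
sycamore_seconds = 200                          # just under 3.5 minutes

speedup = classical_seconds / sycamore_seconds
print(f"Claimed speedup: ~{speedup:.2e}x")      # roughly 1.6 billion times
```

It is worth noting that this 10,000-year estimate was itself disputed by rival researchers, so the true factor depends on how good the classical baseline is.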

Not to be outdone, D-Wave boasts that the devices it will soon be supplying to Los Alamos weigh in at 5,000 qubits apiece, although it should be noted that the quality of D-Wave's qubits has been called into question before. IBM hasn't made the same kind of splash as Google and D-Wave in the last couple of years, but they shouldn't be counted out yet, either, especially considering their track record of slow and steady accomplishments.

Put simply, the race for the worlds most powerful quantum computer is as wide open as it ever was.

The short answer is: not really, at least for the near-term future. Quantum computers require an immense amount of equipment and finely tuned environments to operate. The leading architecture requires cooling to mere degrees above absolute zero, meaning they are nowhere near practical for ordinary consumers to ever own.

But as the explosion of cloud computing has proven, you don't need to own a specialized computer to harness its capabilities. As mentioned above, IBM is already offering daring technophiles the chance to run programs on a small subset of its Q System One's qubits. In time, IBM and its competitors will likely sell compute time on more robust quantum computers for those interested in applying them to otherwise inscrutable problems.

But if you aren't researching the kinds of exceptionally tricky problems that quantum computers aim to solve, you probably won't interact with them much. In fact, quantum computers are in some cases worse at the sort of tasks we use computers for every day, purely because quantum computers are so hyper-specialized. Unless you are an academic running the kind of modeling where quantum computing thrives, you'll likely never get your hands on one, and never need to.
