Monthly Archives: April 2020

Netflix Wins Rights to Ted Melfi's The Starling With Melissa McCarthy and Kevin Kline – TheWrap

Posted: April 24, 2020 at 2:43 pm

Netflix has won the global rights to The Starling in an auction, the streamer announced Monday. The dramedy is directed by Ted Melfi and stars Melissa McCarthy, Kevin Kline, Timothy Olyphant, Chris O'Dowd and Daveed Diggs.

The Starling is described as a heartwarming and comic story about a woman who, after suffering a hardship, becomes obsessed with killing a small starling nesting in her backyard that harasses and attacks her, a bird that comes to represent all her problems. In the process, she forms an unlikely friendship with a quirky psychologist-turned-veterinarian with his own troubled past.

Melfi is directing the film from a script by Matt Harris, and he'll also produce alongside Kimberly Quinn and Limelight's Dylan Sellers and Chris Parker. The Starling comes from Limelight, Entertainment One (eOne) and Boies Schiller Entertainment. The executive producers are Boies Schiller Entertainment's Zack Schiller and David Boies and eOne's Jen Gorton and Zev Foreman.


Also co-starring in the film are Skyler Gisondo, Loretta Devine, Laura Harrier, Rosalind Chao and Kimberly Quinn. The Starling is currently in post-production.

McCarthy, who starred in Melfi's St. Vincent back in 2014, has taken a dramatic turn in recent years with work in Can You Ever Forgive Me? and The Kitchen. She'll also next appear in the live-action remake of Disney's The Little Mermaid.

The Starling is Melfi's first feature since 2016's Best Picture-nominated Hidden Figures.

CAA Media Finance and UTA Independent negotiated the deal with Netflix.

Deadline first reported news of the deal.

The actress has come a long way since her days playing Sookie

"Go" (1999)

McCarthy made her feature film debut with a supporting role in "Go," directed by Doug Liman.

"Charlie's Angels: Full Throttle" (2000)The actress had a small role as Doris, a woman flirting with Jimmy Bosley at the crime scene.

"Gilmore Girls" (2000-2007)

McCarthy was cast as Sookie St. James, the best friend of Lorelai Gilmore, in the WB television series. The series ended in 2007, and McCarthy was not asked to return for the reboot announced in February.

"Curb Your Enthusiasm" (2004)McCarthy played a saleswoman in an episode titled "The Surrogate," in which Larry David gets a heart monitor and uses the device to get out of uncomfortable situations.

In 2005, McCarthy married Ben Falcone, fellow actor and future "Bridesmaids" co-star (seen here at the 2007 Sundance Film Festival).

"Mike &Molly" (2010-2015)"Mike & Molly" premiered on CBS in 2010 and starred McCarthy and Billy Gardell as a couple who fall in love. The show was cancelled in January 2016.

McCarthy even earned an Oscar nomination for her role in "Bridesmaids," and presented at the 2012 ceremony with co-star Rose Byrne.

"This Is 40" (2012)

With Paul Rudd and Leslie Mann in the leads of Judd Apatow's comedy, McCarthy played a kid's mom who gets into a verbal argument with Rudd's character, Pete, at school.

"Identity Thief" (2013)

The film was a surprise hit at the box office, debuting to $34.5 million and grossing $134.5 million despite terrible reviews. Jason Bateman co-starred in the film about a man whose identity is stolen by a woman.

"The Heat" (2013)Directed by Paul Feig, McCarthy teamed up with Sandra Bullock to take down a mobster. The film grossed $230 million globally from a $43 million budget.

"Tammy" (2014)

The film, which received mixed reviews, had McCarthy in the role of a recently unemployed woman who goes on a road trip with her alcoholic grandmother. The film made $84.5 million domestically.

"The Boss" (2016)

McCarthy stars as a disgraced industry titan who goes to prison for insider trading. She then tries to redeem herself by starting a new empire with brownies.


Continue reading here:

Netflix Wins Rights to Ted Melfi's The Starling With Melissa McCarthy and Kevin Kline - TheWrap

Posted in Superintelligence | Comments Off on Netflix Wins Rights to Ted Melfi's The Starling With Melissa McCarthy and Kevin Kline – TheWrap

Google’s Head of Quantum Computing Hardware Resigns – WIRED

Posted: at 2:41 pm

In late October 2019, Google CEO Sundar Pichai likened the latest result from the company's quantum computing hardware lab in Santa Barbara, California, to the Wright brothers' first flight.

One of the lab's prototype processors had achieved quantum supremacy, evocative jargon for the moment a quantum computer harnesses quantum mechanics to do something seemingly impossible for a conventional computer. In a blog post, Pichai said the milestone affirmed his belief that quantum computers might one day tackle problems like climate change, and the CEO also name-checked John Martinis, who had established Google's quantum hardware group in 2014.

Here's what Pichai didn't mention: soon after the team first got its quantum supremacy experiment working a few months earlier, Martinis says, he had been reassigned from a leadership position to an advisory one. Martinis tells WIRED that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis resigned from Google early this month. "Since my professional goal is for someone to build a quantum computer, I think my resignation is the best course of action for everyone," he adds.

A Google spokesman did not dispute this account, and says that the company is grateful for Martinis's contributions and that Neven continues to head the company's quantum project. Parent company Alphabet has a second, smaller quantum computing group at its X Labs research unit. Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

Google's quantum computing project was founded in 2006 by Neven, who pioneered Google's image search technology, and initially focused on software. To start, the small group accessed quantum hardware from the Canadian startup D-Wave Systems, including in a collaboration with NASA.


The project took on greater scale and ambition when Martinis joined in 2014 to establish Googles quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Qubits are analogous to the bits of a conventional computer, but in addition to representing 1s and 0s, they can use quantum mechanical effects to attain a third state, dubbed a superposition, something like a combination of both. Qubits in superposition can work through some very complex problems, such as modeling the interactions of atoms and molecules, much more efficiently than conventional computer hardware.
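A rough way to see why classical hardware struggles here (an illustrative back-of-the-envelope sketch, not from the article): exactly describing n qubits in superposition takes 2^n complex amplitudes, so the memory required doubles with every qubit added. The 16-bytes-per-amplitude figure below assumes two 64-bit floats per complex number.

```python
# Number of complex amplitudes needed to describe n qubits exactly:
# it doubles with every added qubit (2**n), which is why classically
# simulating even ~50 qubits strains the largest supercomputers.
for n in (1, 10, 20, 53):
    amplitudes = 2 ** n
    # assume 16 bytes per amplitude (two 64-bit floats: real + imaginary)
    print(f"{n:>2} qubits -> {amplitudes} amplitudes (~{amplitudes * 16} bytes)")
```

At 53 qubits (the size of Google's supremacy chip), that is about 9 quadrillion amplitudes, which is the intuition behind the "10,000 years on a supercomputer" comparison below.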

How useful that is depends on the number and reliability of the qubits in your quantum processor. So far the best demonstrations have used only tens of qubits, a far cry from the hundreds or thousands of high-quality qubits experts believe will be needed to do useful work in chemistry or other fields. Google's supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem, one with no practical application, that the company calculated would take a supercomputer on the order of 10,000 years.

Martinis leaves Google as the company and its rivals in quantum computing face crucial questions about the technology's path. Amazon, IBM, and Microsoft, as well as Google, offer their prototype technology to companies such as Daimler and JP Morgan so they can run experiments. But those processors are not large enough to work on practical problems, and it is not clear how quickly they can be scaled up.

When WIRED visited Google's quantum hardware lab in Santa Barbara last fall, Martinis responded optimistically when asked whether his hardware team could see a path to making the technology practical. "I feel we know how to scale up to hundreds and maybe thousands of qubits," he said at the time. Google will now have to do it without him.


The rest is here:

Google's Head of Quantum Computing Hardware Resigns - WIRED


Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology – Analytics Insight

Posted: at 2:41 pm

Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology

The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges

Efficient quantum computing is expected to enable advancements that are impossible with classical computers. Scientists from Japan and Australia have collaborated to propose a novel two-dimensional design that can be constructed using existing integrated-circuit technology. This design solves typical problems facing the current three-dimensional packaging for scaled-up quantum computers, bringing that future one step closer.

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and of industrialists in the pharmaceutical, aircraft, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits, or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that the qubits be arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbors and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.
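The scaling problem described above can be made concrete with a quick count (an illustrative calculation, not taken from the paper): in an n x n array, the perimeter grows only linearly while the interior grows quadratically, so ever more qubits end up unreachable from the edge.

```python
def edge_and_interior(n: int) -> tuple[int, int]:
    """Count perimeter vs. interior qubits in an n x n square array."""
    if n <= 2:
        return n * n, 0          # every qubit sits on the edge
    edge = 4 * n - 4             # perimeter: grows linearly with n
    return edge, n * n - edge    # interior: grows quadratically

for n in (3, 10, 30):
    e, i = edge_and_interior(n)
    print(f"{n}x{n}: {e} edge qubits, {i} interior qubits")
```

Already at 10x10, interior qubits (64) outnumber edge qubits (36), which is why external wiring has traditionally been forced into the third dimension.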

A group of scientists from Tokyo University of Science, Japan, the RIKEN Centre for Emergent Matter Science, Japan, and the University of Technology Sydney, led by Prof Jaw-Shen Tsai, proposes a unique solution to this qubit-accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a square lattice array of qubits and stretched out each column in the 2D plane. They then folded each successive column on top of the other, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D structures such as airbridges at the points of overlap are enough, and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably.
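The stretch-and-fold step can be sketched as a relabeling of lattice coordinates. The toy model below is purely illustrative (the paper's exact construction may differ): flatten the lattice column by column into a single chain, then fold that chain in half so every qubit lands in one of two adjacent rows.

```python
def fold_to_bilinear(n: int):
    """Toy model of the folding idea: stretch an n x n lattice out
    column by column into one chain, then fold the chain in half so
    every qubit lands in one of two rows -- all qubits on an edge."""
    chain = [(row, col) for col in range(n) for row in range(n)]  # stretch columns
    half = (len(chain) + 1) // 2
    top = chain[:half]
    bottom = chain[half:][::-1]  # fold the tail back under the head
    return top, bottom

top, bottom = fold_to_bilinear(4)
print(len(top), len(bottom))  # two rows of 8 qubits each
```

Every qubit now sits in one of the two rows, so every qubit is edge-accessible; only the couplings that the fold brings into conflict need a local airbridge.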

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. Results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed that their architecture solves several problems that plague 3D structures: they are difficult to construct, there is crosstalk (signal interference between waves transmitted across two wires), and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated-circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."

###

Reference

Title of original paper: Pseudo-2D superconducting quantum computing circuit for the surface code: the proposal and preliminary tests

Journal: New Journal of Physics

DOI: 10.1088/1367-2630/ab7d7d

Tokyo University of Science (TUS) is a well-known and respected university, and the largest science-specialized private research university in Japan, with four campuses in central Tokyo and its suburbs and in Hokkaido. Established in 1881, the university has continually contributed to Japan's development in science through inculcating the love of science in researchers, technicians, and educators.

With a mission of "Creating science and technology for the harmonious development of nature, human beings, and society," TUS has undertaken a wide range of research from basic to applied science. TUS has embraced a multidisciplinary approach to research and undertaken intensive study in some of today's most vital fields. TUS is a meritocracy where the best in science is recognized and nurtured. It is the only private university in Japan that has produced a Nobel Prize winner and the only private university in Asia to produce Nobel Prize winners within the natural sciences field.

Website:https://www.tus.ac.jp/en/mediarelations/

Dr Jaw-Shen Tsai is currently a Professor at the Tokyo University of Science, Japan. He began research in physics in 1975 and continues to hold interest in areas such as superconductivity, the Josephson effect, quantum physics, coherence, qubits, and artificial atoms. He has 160+ research publications to his credit and serves as the lead author of this paper. He has also won several awards, including Japan's Medal of Honor with Purple Ribbon.

Professor Jaw-Shen Tsai

Department of Physics

Tokyo University of Science

Tsutomu Shimizu

Public Relations Division

Tokyo University of Science

Email: mediaoffice@admin.tus.ac.jp

Website: https://www.tus.ac.jp/en/mediarelations/


Read the original post:

Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology - Analytics Insight


Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access – Forbes

Posted: at 2:41 pm

Zapata's quantum coders, ready for a hot & noisy ride.

We're on the road to quantum computing. But these massively powerful machines are still at a somewhat embryonic prototype stage, and we still have several key challenges to overcome before we can start to build more of them.

As a quantum reminder: traditional computers compute on the basis of binary 1s and 0s, so all values and mathematical logic are essentially established from a base of those two values. Quantum superposition particles (known as qubits) can be 1 or 0, or anywhere in between, and the value expressed can differ depending on the angle from which the qubit is viewed. With massively more breadth, we can create a lot more algorithmic logic and computing power.

One of the main challenges associated with building quantum computing machines is the massive heat they generate. Scientists have been working with different semiconducting materials, such as so-called quantum dots, to help overcome the heat challenge. The issue is that qubits are special, qubits are powerful, but qubits are also fragile... and heat is one of their sworn enemies.

Another core challenge is noise.

As computations pass through the quantum gates that make up the quantum circuits in our new super quantum machines, they create a lot of noise disturbance (think of an engine revving louder as it speeds up), which is why we have come to define and accept the term Noisy Intermediate-Scale Quantum (NISQ) for this class of quantum applications.

As beautifully clarified by theoretical physicist John Preskill in his 2018 paper: "Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing."

The fact that we know about the heat and noise challenges hasn't stopped companies like Strangeworks, D-Wave Systems, ColdQuanta and others (including usual suspects Intel, IBM and Microsoft) forging on with development in the quantum space. Joining that list is Boston-headquartered Zapata Computing, Inc. The company describes itself as the quantum software company for near-term/NISQ-based quantum applications, empowering enterprise teams. Near-term in this case meaning, well, now, i.e. quantum technology we can actually use on quantum devices of about 100-300 qubits.

Zapata's latest quantum leap (pun absolutely intended) is an early-access program for Orquestra, its platform for quantum-enabled workflows. The company claims to have provided a software- and hardware-interoperable enterprise quantum toolset, i.e., again, quantum tools we can actually use in modern-day enterprise IT departments.

"Using Zapata's unified Quantum Operating Environment, users can build, run and analyze quantum and quantum-inspired workflows. This toolset will empower enterprises and institutions to make their quantum mark on the world, enabling them to develop quantum capabilities and foundational IP today while shoring up for derivative IP for tomorrow," says CEO Christopher Savoie. "It is a new computing paradigm, built on a unified enterprise framework that spans quantum and classical programming and hardware tools. With Orquestra, we are accelerating quantum experiments at scale."

Zapata's Early Access Program for Orquestra is aimed at users with backgrounds in software engineering, machine learning, physics, computational chemistry or quantum information theory who are working on the most computationally complex problems.

Orquestra is agnostic across the entire software and hardware stack. It offers an extensible library of open source and Zapata-created components for writing, manipulating and optimizing quantum circuits and running them across quantum computers, quantum simulators and classical computing resources. It comes equipped with a versatile workflow system and Application Programming Interfaces (APIs) to connect all modes of quantum devices.

"We developed Orquestra to scale our own work for our customers and then realized the quantum community needs it, too. Orquestra is the only system for managing quantum workflows," said Zapata CTO Yudong Cao. "The way we design and deploy computing solutions is changing. Orquestra's interoperable nature enables extensible and modular implementations of algorithms and workflows across platforms and unlocks fast, fluid repeatability of experiments at scale."

So we're on a journey. The journey is the road from classical to quantum, and the best advice is to insist upon an interoperable vehicle (as Zapata has provided here) and to take a modular and extensible approach. In car-analogy terms, that would mean breaking your journey up into bite-size chunks and making sure you have enough gas for the long haul when it comes. The quantum software parallel is obvious enough not to need explaining.

Even when quantum evolves to become more ubiquitously available, many people think it will still be largely delivered as a cloud computing Quantum-as-a-Service (QaaS) package, but understanding the noisy overheated engine room in the meantime makes for a fascinating movie preview.

See the original post here:

Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access - Forbes


Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

Posted: at 2:41 pm

In recent years, some big tech companies like IBM, Microsoft, Intel, or Google have been working in relative silence on something that sounds great: quantum computing. The main problem with this is that it is difficult to know what exactly it is and what it can be useful for.

There are some questions that can be easily answered. For example, quantum computing is not going to help you get more FPS out of your graphics card at the moment. Nor will it be as easy as swapping your computer's CPU for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but is divided into small packets, or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. Over the years other physicists developed it and came to surprising conclusions about matter, of which two will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and with which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.
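Both limitations can be checked directly with a toy illustration: 3 classical bits give exactly 2^3 = 8 distinct states, and logical operations act on explicit bits, never "for free".

```python
from itertools import product

# All distinct states representable with n = 3 classical bits: 2**3 = 8.
states = list(product([0, 1], repeat=3))
print(len(states))  # 8

# Logical operations act on explicit bits of concrete values.
a, b = 0b101, 0b011
print(a & b, a | b, (~a) & 0b111)  # AND, OR, NOT (masked to 3 bits)
```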

Well, superposition and entanglement allow us to reduce these limitations: with superposition, we can store far more than just 2^n states with n quantum bits (qubits), and entanglement maintains certain relations between qubits in such a way that operations on one qubit necessarily affect the rest.

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though we can keep many more states in n qubits, in practice we can only read 2^n different ones. As we saw in an article in Genbeta about the foundations of quantum computing: a qubit is not just worth 1 or 0 like a normal bit; it can be 1 with 80% probability and 0 with 20%. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities that each value had are lost, because measuring the qubit modifies it.
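The "1 with 80% and 0 with 20%" example can be mimicked with a classical toy sampler (purely illustrative; a real qubit is not a random-number generator): each read yields only a 0 or a 1, and the underlying probabilities can only be estimated by repeating the experiment on identically prepared qubits.

```python
import random

random.seed(42)

def measure(p0: float = 0.8) -> int:
    """Toy measurement of a qubit that reads '0' with probability 0.8:
    a single read returns only 0 or 1; the amplitudes themselves are lost."""
    return 0 if random.random() < p0 else 1

reads = [measure() for _ in range(10_000)]
print(f"fraction of 0s: {reads.count(0) / len(reads):.3f}")  # close to 0.8
```

No single read reveals 0.8; only the long-run frequency over many preparations does, which is exactly the gap between what qubits hold and what we can extract.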

This discrepancy between the information kept by the qubits and what we can read led Benioff and Feynman to demonstrate that a classical computer would not be able to simulate a quantum system without a disproportionate amount of resources, and to propose models for a quantum computer that would be able to do that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which allowed two quite relevant algorithms to be developed: quantum annealing in 1989 and Shor's algorithm in 1994. The first finds minimum values of functions, which, put like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, the minimum value will tell us how to configure the neural network to be as efficient as possible.

The second algorithm, Shor's algorithm, helps us decompose a number into its prime factors much more efficiently than we can on a normal computer. Put like that, again, it doesn't sound at all interesting. But if I tell you that RSA, one of the most widely used algorithms to protect and encrypt data on the Internet, is based on the fact that factoring numbers is exponentially slow (adding a bit to the key roughly doubles the time a brute-force attack takes), then things change. A quantum computer with enough qubits would render many encryption systems completely obsolete.
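To see the asymmetry RSA relies on, here is a naive classical factorizer (illustrative only; real attacks use much better algorithms, though all known classical ones remain super-polynomial): trial division takes on the order of sqrt(N) steps, i.e. roughly 2^(bits/2), so every extra key bit multiplies the work by about 1.4, while Shor's algorithm on a large enough quantum computer would scale only polynomially in the bit length.

```python
def trial_factor(n: int) -> tuple[int, int]:
    """Factor n by trial division: ~sqrt(n) steps, i.e. about
    2**(bits/2) -- exponential in the bit length of the key."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

print(trial_factor(15))         # (3, 5)
print(trial_factor(101 * 103))  # (101, 103)
```

Multiplying two primes is instant; recovering them from the product is the expensive direction, and that one-way gap is exactly what a large quantum computer would collapse.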

Until now, quantum computing is a field that hasn't seen much application in the real world. To give us an idea: with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers less than 1,048,576 (2^20), which as you can imagine is not very impressive.

Still, the field is evolving promisingly. In 1998 the first quantum computer was built: it had only two qubits, and it needed a nuclear magnetic resonance machine to solve a toy problem (the so-called Deutsch-Jozsa problem). In 2001 Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of executing quantum annealing, with 16 qubits. This year, the same company announced a 2000-qubit quantum annealing computer. The new IBM computers, on the other hand, although they have fewer qubits, are able to implement generic algorithms, not only quantum annealing. In short, it seems that the push is strong and that quantum computing will be increasingly applicable to real problems.

What can those applications be? As we mentioned before, the quantum annealing algorithm is very appropriate for machine learning problems, which makes the computers that implement it extremely useful, even though the only thing they can do is run that single algorithm. If systems capable of, for example, transcribing conversations or identifying objects in images can be translated so that they are trained on quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding the optimal treatment method for a patient or studying the possible structures of complex molecules.

Generic quantum computers, which have fewer qubits right now, could run more algorithms. For example, they could be used to break much of the cryptography in use right now, as we discussed earlier (which explains why the NSA wanted to have a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they could be very useful as efficient simulators of quantum systems.

Unfortunately, algorithms and code for classical computers can't simply be run on quantum computers to magically get an improvement in speed: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That, at first, greatly restricts the applications of quantum computers and will be a problem to overcome as those systems mature.

However, the main problem facing quantum computing is building the computers themselves. Compared to a normal computer, a quantum computer is an extremely complex machine: it operates at temperatures close to absolute zero (-273 °C), the qubit substrates are superconducting, and the components needed to read and manipulate the qubits are not simple either.

What can a non-quantum quantum computer be like? As we explained before, the two relevant concepts of a quantum computer are superposition and entanglement, and without them there cannot be the speed improvements that quantum algorithms promise. If disturbances in the computer quickly push superposed qubits into classical states, or break the entanglement between several qubits, what we have is not a quantum computer but merely an extremely expensive machine that can only run a handful of algorithms no faster than a normal computer (and will probably give erroneous results).

Of the two properties, entanglement is the more difficult to maintain and to prove exists. The more qubits there are, the easier it is for one of them to become disentangled (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see that correct results come out to say that there are entangled qubits: looking for evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in their early days.

A priori, with the materials quantum computers are currently built from, it does not seem that miniaturization is very feasible. But there is already research on new materials that could be used to create more accessible quantum computers. Who knows whether fifty years from now we will be able to buy quantum CPUs to improve the speed of our computers.


The rest is here:

Will Quantum Computing Really Change The World? Facts And Myths - Analytics India Magazine


Google’s top quantum computing brain may or may not have quit – Fudzilla

Posted: at 2:41 pm

We will know when someone opens his office door

John Martinis, who established Google's quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now legendary quantum supremacy experiment working, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, and says that the company is grateful for Martinis's contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Continued here:

Google's top quantum computing brain may or may not have quit - Fudzilla


Devs: How the Quantum Computer Works & Mysteries That Remain – Screen Rant

Posted: at 2:41 pm

Devs' final episode answered many of fans' questions about the quantum computer at the heart of the Devs project, but mysteries about the computer and the characters' fates remain. The miniseries is an exclusive production between FX and Hulu, as part of the new FX on Hulu banner. Written and directed by Alex Garland, the show dives into some heady existential ideas, using the central mystery of Lily Chan (Sonoya Mizuno) investigating the mysterious death of her boyfriend as a vehicle to explore themes of quantum physics, determinism, and free will - all being manipulated behind the scenes of the nefarious tech company that she works for.

From the very first episode of the series, audiences were keyed into the fact that the tech corporation Amaya was working on a secretive project in their Devs division. The reveal came sooner than expected, as episode 2 confirmed the suspicions of the show's most ardent fans: the Devs team is working on an extremely powerful quantum computer, the purpose of which far exceeds the limitations of the real-world quantum computers being worked on at IBM and Google. Amaya's computer runs a specific set of data and code; more directly, the quantum computer is capable of distilling the universe down to matters of cause and effect, making it essentially able to predict the future.


The quantum computer's reliance on determinism, which focuses on a myopic cause-and-effect-dependent view of reality, has been the center of Devs' intra-character conflicts. Forest (Nick Offerman) fired Lyndon (Cailee Spaeny) because of their disagreement about the Many Worlds Theory and a Determinist understanding of reality, and leading into the finale, audiences were keenly aware of a statement made by Katie (Alison Pill) back in episode 6: the quantum computer can't see past a fixed point, one that involves Lily in the Devs laboratory. Episode 8's reveals answered fans' questions about the computer's functionality, but not all of the explanations hold up.

Throughout the series, Forest and Katie have maintained that the quantum computer is determinist in theory and that no variations can occur because their reality is set in stone. This falls in line with Forest's personal philosophy and his reasons for clinging to determinism: if everything is predetermined, then he has no personal culpability in the death of his wife and child. This extends to the murder of Sergei (Karl Glusman) and Katie's assistance in Lyndon's unexpected death. But after Lyndon improved the Devs projections by introducing the Many-Worlds Theory, it became clear that Forest and Katie were adhering to the quantum computer's projections not because they had to, but because they wanted to.

However, after Lily arrives at Devs in episode 8, she sees the future predicted for her by Forest's deterministic projection: on the Devs screen, she shoots Forest in the face, and the bullet pierces the lift's glass, breaking the airtight seal that keeps the lift afloat. Lily plunges to her death. As the scene plays out, however, she tosses away the gun as the lift's doors close, ensuring that she won't follow the same sequence the computer predicted. Her choice breaks the deterministic framework that Forest and Katie have clung to throughout the series, and when Forest is reincarnated in the Devs simulation, he realizes that determinism was a faulty philosophy, a way of looking at the world that fails to fit with the data.

Lily's choice supports two concomitant theories in quantum physics. The Many-Worlds Theory, posited by Hugh Everett, has already been debated throughout the show's run, but since Lily's choice was motivated by her observation of the outcome, the Copenhagen Theory also has merit. As described by Katie's teacher in episode 5, the Copenhagen Theory "suggests that the act of measurement affects the system." Despite Katie scoffing at this theory, Devs' finale offers evidence for both the Copenhagen and Many-Worlds Theories.


There's a popular fan theory regarding the show that originated on Reddit, from user emf1200, that suggests the entire show takes place within a simulation. This comes from the fact that the projection software works by simulating events through the usage of the predictive algorithm: the Devs team isn't technically peering backwards through time; they're reconstructing time and viewing it like a movie. Episode 7 has a scene where Stewart (Stephen McKinley Henderson) shows off the computer's predictive capabilities to a group of employees, and casually mentions how "somewhere in each box, there's another box." This implies that within the simulation the Devs team is watching, there's another version of the Devs team watching another simulation, and so on and so forth. By this logic, there's enough evidence to suggest that the show fans are watching is not the prime universe, but simply a simulation somewhere within a stack of simulations.

Though the finale did not strictly follow this theory - there was indeed a prime reality - when Forest and Lily are reincarnated in the Devs computer, a life that Katie characterizes as "indistinguishable" from reality, they essentially enter the "box within a box." They each become Neo (Keanu Reeves) from The Matrix, minus the superpowers, knowing that they are in a simulation with the power to exercise free will within each reality.

Up until episode 4, the Devs team was convinced (at Forest's insistence) that the universe operated on the de Broglie-Bohm theory, a deterministic interpretation of quantum physics that suggests events are set in stone as the result of cause and effect. This produced some results, namely the preliminary version of the projection that could only render hazy, static-filled visions of the past and future. However, in episode 7, Stewart and the rest of the team perfected the quantum computer by switching out Forest's determinist theory for Lyndon's Many-Worlds theory, which Stewart says "is the universe as is." But once the quantum computer operated under the Many-Worlds Theory, why did it fail to predict Lily's decision to throw away the gun? According to Stewart and Lyndon, the multiverse exists, but the predictions made by the project are only of one universe. All Lyndon's algorithm did was clear up the static, not change the nature of the prediction.

But this raises yet another question: why did the deterministic computer projection stop accurately predicting the future at the point of Lily's death, when the actual moment that violated the laws of determinism was her decision to toss the weapon, not when she died? This question brings up a frustrating issue with the show's conclusion. Adherents of each theory about quantum superposition can find evidence to support their position, and Devs offers no definitive conclusion. Copenhagen enthusiasts note that Lily's observation affected the outcome, Many-Worlds theorists are pleased with the free will implications of Lily's decision, and Determinists note that despite Lily tossing away the gun, Forest and Lily still died in the same manner the computer predicted.


In the simulation, Forest states that he "exercised a little free will" by giving each version of Lily and Forest in the Devs simulation knowledge of other worlds. This effectively deals with the show's recurrent theme of Forest using determinism as a scapegoat to avoid personal accountability, because each version of Forest must reckon with the knowledge that other Forests are living under better or worse outcomes. However, while the show's conclusion holds up thematically, the failure of the quantum computer to accurately predict the final episode's outcome remains a mystery, one that Devs as a whole didn't adequately address.




Go here to see the original:

Devs: How the Quantum Computer Works & Mysteries That Remain - Screen Rant


The future of quantum computing in the cloud – TechTarget

Posted: at 2:41 pm

AWS, Microsoft and other IaaS providers have jumped on the quantum computing bandwagon as they try to get ahead of the curve on this emerging technology.

Developers use quantum computing to encode problems as qubits, which compute multiple combinations of variables at once rather than exploring each possibility discretely. In theory, this could allow researchers to quickly solve problems involving different combinations of variables, such as breaking encryption keys, testing the properties of different chemical compounds or simulating different business models. Researchers have begun to demonstrate real-world examples of how these early quantum computers could be put to use.

However, this technology is still being developed, so experts caution that it could take more than a decade for quantum computing to deliver practical value. In the meantime, there are a few cloud services, such as Amazon Braket and Microsoft Quantum, that aim to get developers up to speed on writing quantum applications.

Quantum computing in the cloud has the potential to disrupt industries in a similar way as other emerging technologies, such as AI and machine learning. But quantum computing is still being established in university classrooms and career paths, said Bob Sutor, vice president of IBM Quantum Ecosystem Development. Similarly, major cloud providers are focusing primarily on education at this early stage.

"The cloud services today are aimed at preparing the industry for the soon-to-arrive day when quantum computers will begin being useful," said Itamar Sivan, co-founder and CEO of Quantum Machines, an orchestration platform for quantum computing.

There's still much to iron out regarding quantum computing and the cloud, but the two technologies appear to be a logical fit, for now.

Cloud-based quantum computing is more difficult to pull off than AI, so the ramp up will be slower and the learning curve steeper, said Martin Reynolds, distinguished vice president of research at Gartner. For starters, quantum computers require highly specialized room conditions that are dramatically different from how cloud providers build and operate their existing data centers.

Reynolds believes practical quantum computers are at least a decade away. The biggest drawback lies in aligning the quantum state of qubits in the computer with a given problem, especially since quantum computers still haven't been proven to solve problems better than traditional computers.

Coders also must learn new math and logic skills to utilize quantum computing. This makes it hard for them since they can't apply traditional digital programming techniques. IT teams need to develop specialized skills to understand how to apply quantum computing in the cloud so they can fine tune the algorithms, as well as the hardware, to make this technology work.

Current limitations aside, the cloud is an ideal way to consume quantum computing, because quantum computing has low I/O but deep computation, Reynolds said. Because cloud vendors have the technological resources and a large pool of users, they will inevitably be some of the first quantum-as-a-service providers and will look for ways to provide the best software development and deployment stacks.

Quantum computing could even supplement the general compute and AI services cloud providers currently offer, said Tony Uttley, president of Honeywell Quantum Solutions. In that scenario, quantum hardware would integrate with classical cloud computing resources in a co-processing environment.

The cloud plays two key roles in quantum computing today, according to Hyoun Park, CEO and principal analyst at Amalgam Insights. The first is to provide an application development and test environment for developers to simulate the use of quantum computers through standard computing resources.

The second is to offer access to the few quantum computers that are currently available, in the way mainframe leasing was common a generation ago. This improves the financial viability of quantum computing, since multiple users can increase machine utilization.

It takes significant computing power to simulate quantum algorithm behavior from a development and testing perspective. For the most part, cloud vendors want to provide an environment to develop quantum algorithms before loading these quantum applications onto dedicated hardware from other providers, which can be quite expensive.

However, classical simulations of quantum algorithms that use large numbers of qubits are not practical. "The issue is that the size of the classical computer needed will grow exponentially with the number of qubits in the machine," said Doug Finke, publisher of the Quantum Computing Report. So, a classical simulation of a 50-qubit quantum computer would require a classical computer with roughly 1 petabyte of memory. This requirement will double with every additional qubit.
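
Finke's rule of thumb is easy to sanity-check with a few lines of Python. The exact figures depend on how many bytes you assume per amplitude; this sketch assumes 8 bytes (single-precision complex), so treat the constants as illustrative rather than definitive:

```python
# Memory needed to hold the full state vector of an n-qubit system:
# 2**n complex amplitudes. We assume 8 bytes per amplitude here
# (single-precision complex); double precision would double these figures.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 8) -> int:
    """Bytes required to store all 2**n amplitudes on a classical machine."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**30:,.0f} GiB")

# Each additional qubit doubles the requirement:
assert state_vector_bytes(51) == 2 * state_vector_bytes(50)
```

Whatever per-amplitude size you pick, the doubling with each added qubit is what makes large classical simulations infeasible.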


But classical simulations for problems using a smaller number of qubits are useful both as a tool to teach quantum algorithms to students and also for quantum software engineers to test and debug algorithms with "toy models" for their problem, Finke said. Once they debug their software, they should be able to scale it up to solve larger problems on a real quantum computer.

In terms of putting quantum computing to use, organizations can currently use it to support last-mile optimization, encryption and other computationally challenging issues, Park said. This technology could also aid teams across logistics, cybersecurity, predictive equipment maintenance, weather predictions and more. Researchers can explore multiple combinations of variables in these kinds of problems simultaneously, whereas a traditional computer needs to compute each combination separately.

However, there are some drawbacks to quantum computing in the cloud. Developers should proceed cautiously when experimenting with applications that involve sensitive data, said Finke. To address this, many organizations prefer to install quantum hardware in their own facilities despite the operational hassles, Finke said.

Also, a machine may not be immediately available when a quantum developer wants to submit a job through quantum services on the public cloud. "The machines will have job queues and sometimes there may be several jobs ahead of you when you want to run your own job," Finke said. Some of the vendors have implemented a reservation capability so a user can book a quantum computer for a set time period to eliminate this problem.

IBM was first to market with its Quantum Experience offering, which launched in 2016 and now has over 15 quantum computers connected to the cloud. Over 210,000 registered users have executed more than 70 billion circuits through the IBM Cloud and published over 200 papers based on the system, according to IBM.

IBM also started the Qiskit open source quantum software development platform and has been building an open community around it. According to GitHub statistics, it is currently the leading quantum development environment.

In late 2019, AWS and Microsoft introduced quantum cloud services offered through partners.

Microsoft Quantum provides a quantum algorithm development environment, and from there users can transfer quantum algorithms to Honeywell, IonQ or Quantum Circuits Inc. hardware. Microsoft's Q# scripting offers a familiar Visual Studio experience for quantum problems, said Michael Morris, CEO of Topcoder, an on-demand digital talent platform.

Currently, this transfer involves the cloud providers installing a high-speed communication link from their data center to the quantum computer facilities, Finke said. This approach has many advantages from a logistics standpoint, because it makes things like maintenance, spare parts, calibration and physical infrastructure a lot easier.

Amazon Braket similarly provides a quantum development environment and, when generally available, will provide time-based pricing to access D-Wave, IonQ and Rigetti hardware. Amazon says it will add more hardware partners as well. Braket offers a variety of different hardware architecture options through a common high-level programming interface, so users can test out the machines from the various partners and determine which one would work best with their application, Finke said.

Google has done considerable core research on quantum computing in the cloud and is expected to launch a cloud computing service later this year. Google has been more focused on developing its in-house quantum computing capabilities and hardware rather than providing access to these tools to its cloud users, Park said. In the meantime, developers can test out quantum algorithms locally using Google's Cirq programming environment for writing apps in Python.

In addition to the larger offerings from the major cloud providers, there are several alternative approaches to implementing quantum computers that are being provided through the cloud.

D-Wave is the furthest along, with a quantum annealer well-suited for many optimization problems. Other alternatives include QuTech, which is working on a cloud offering of its small quantum machine built on its spin qubit technology, and Xanadu, which is developing a quantum machine based on photonic technology.

Researchers are pursuing a variety of approaches to quantum computing -- using electrons, ions or photons -- and it's not yet clear which approaches will pan out for practical applications first.

"Nobody knows which approach is best, or which materials are best. We're at the Edison light bulb filament stage, where Edison reportedly tested thousands of ways to make a carbon filament until he got to one that lasted 1,500 hours," Reynolds said. In the meantime, recent cloud offerings promise to enable developers to start experimenting with these different approaches to get a taste of what's to come.

Visit link:

The future of quantum computing in the cloud - TechTarget


Advanced Encryption Standard (AES): What It Is and How It Works – Hashed Out by The SSL Store – Hashed Out by The SSL Store

Posted: at 2:41 pm

Understanding the advanced encryption standard on a basic level doesn't require a higher degree in computer science or Matrix-level consciousness. Let's break AES encryption down into layman's terms.

Hey, all. We know information security has been a hot topic since, well, forever. We entrust our personal and sensitive information to lots of major entities and still have problems with data breaches, data leaks, etc. Some of this happens because of weak networking security protocols or bad authentication management practices, but really, there are many ways that data breaches can occur. However, the actual process of decrypting a ciphertext without a key is far more difficult. For that, we can thank encryption algorithms like the popular advanced encryption standard and the secure keys that scramble our data into indecipherable gibberish.

Let's look into how AES works and the different applications for it. We'll be getting a little into some matrix-based math, so grab your red pills and see how far this rabbit hole goes.

Let's hash it out.

You may have heard of the advanced encryption standard, or AES for short, but may not know the answer to the question "what is AES?" Here are four things you need to know about AES:

The National Institute of Standards and Technology (NIST) established AES as an encryption standard nearly 20 years ago to replace the aging data encryption standard (DES). After all, AES encryption keys can go up to 256 bits, whereas DES stopped at just 56 bits. NIST could have chosen a cipher that offered greater security, but the tradeoff would have required greater overhead that wouldn't be practical. So, they went with one that had great all-around performance and security.

AES's results are so successful that many entities and agencies have approved it and utilize it for encrypting sensitive information. The National Security Agency (NSA), as well as other governmental bodies, utilize AES encryption and keys to protect classified or other sensitive information. Furthermore, AES is often included in commercial-based products, including but not limited to:

Although it wouldn't literally take forever, it would take far longer than any of our lifetimes to crack an AES 256-bit encryption key using modern computing technology. This is from a brute force standpoint, as in trying every combination until we hear the click/unlocking sound. Certain protections are put in place to prevent things like this from happening quickly, such as a limit on password attempts before a lockdown, which may or may not include a time lapse before trying again. When we are dealing with computation in milliseconds, waiting 20 minutes to try another five times would seriously add to the time taken to crack a key.

Just how long would it take? We are venturing into "a thousand monkeys working on a thousand typewriters to write A Tale of Two Cities" territory. The number of possible combinations for AES 256-bit encryption is 2^256. Even if a computer can do multiple quadrillions of instructions per second, we are still in that eagles-wings-eroding-Mount-Everest time frame.

Needless to say, it's waaaaaaaaaaaaaaaaaaay (there's not enough memory on our computers to support the number of a's that I want to convey) longer than our current universe has been in existence. And that's just for a 16-byte block of data. So, as you can see, brute forcing AES, even 128-bit AES, is futile.
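
To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The guessing rate below is a made-up, generous assumption; the point is the order of magnitude, not the exact figure:

```python
# Back-of-the-envelope brute-force estimate for a 256-bit key.
# The guess rate is a hypothetical, generous assumption.

GUESSES_PER_SECOND = 10 ** 15        # one quadrillion guesses per second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
AGE_OF_UNIVERSE_YEARS = 1.38e10      # about 13.8 billion years

keyspace = 2 ** 256                  # number of possible 256-bit keys
years_to_try_all = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"Keyspace: about {keyspace:.2e} keys")
print(f"Time to exhaust it: about {years_to_try_all:.2e} years")
print(f"That is about {years_to_try_all / AGE_OF_UNIVERSE_YEARS:.2e} universe lifetimes")
```

Even if an attacker only needs to search half the keyspace on average, knocking one factor of two off a number this size changes nothing.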

That would likely change, though, once quantum computing becomes a little more mainstream, available, and effective. Quantum computing is expected to break AES encryption and require other methods to protect our data, but that's still a ways down the road.


To better understand what AES is, you need to understand how it works. But in order to see how the advanced encryption standard actually works, we first need to look at how it is set up and the rules concerning the process based on the user's selection of encryption strength. Typically, when we discuss using higher bit levels of security, we're looking at things that are more secure and more difficult to break or hack. While the data blocks are broken up into 128 bits, the key sizes come in a few varying lengths: 128 bits, 192 bits, and 256 bits. What does this mean? Let's back it up for a second here.

We know that encryption typically deals in scrambling information into something unreadable and an associated key to decrypt the scramble. AES scrambling procedures use four operations in rounds, meaning that the procedure performs the operations and then repeats the process on the previous round's results a set number of times. Simplistically, if we put in X and get out Y, that would be one round. We would then put Y through the paces and get out Z for round 2. Rinse and repeat until we have completed the specified number of rounds.

The AES key size, specified above, determines the number of rounds that the procedure will execute. For example, a 128-bit key performs 10 rounds, a 192-bit key performs 12 rounds, and a 256-bit key performs 14 rounds.

As mentioned, each round has four operations.

So, you've arrived this far. Now, you may be asking: why, oh why, didn't I take the blue pill?

Before we get to the operational parts of the advanced encryption standard, let's look at how the data is structured. What we mean is that the data the operations are performed upon is not left-to-right sequential as we normally think of it. It's stacked in a 4×4 matrix of 128 bits (16 bytes) per block, in an array that's known as a state. A state looks something like this:

So, if your message was blue pill or red, it would look something like this:

So, just to be clear, this is just a 16-byte block, and every group of 16 bytes in a file is arranged in such a fashion. At this point, the systematic scramble begins through the application of each AES encryption operation.
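
In code, the column-first layout looks something like this (a quick illustrative sketch using the 16-byte message above, not production AES code):

```python
# AES fills the 4x4 state column by column: byte i of the block lands at
# row i % 4, column i // 4. Using the article's 16-byte example message:

block = b"blue pill or red"
assert len(block) == 16

state = [[block[col * 4 + row] for col in range(4)] for row in range(4)]

for row in state:
    print(" ".join(chr(b) for b in row))

# Row 0 holds bytes 0, 4, 8 and 12 of the original block:
assert [chr(b) for b in state[0]] == ["b", " ", "l", " "]
```

Notice that reading the printed rows left to right does not reproduce the original message; that is exactly the stacked, column-major arrangement described above.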

As mentioned earlier, once we have our data arrangement, there are certain linked operations that will perform the scramble on each state. The purpose here is to convert the plaintext data into ciphertext through the use of a secret key.

The four types of AES operations are as follows (note: we'll get into the order of the operations in the next section): AddRoundKey, SubBytes, ShiftRows and MixColumns.

As mentioned earlier, the key size determines the number of rounds of scrambling that will be performed. AES encryption uses the Rijndael key schedule, which derives the per-round subkeys from the main key through a process called key expansion.

The AddRoundKey operation takes the current state of the data and executes the XOR Boolean operation against the current round's subkey. XOR means "exclusive or," which yields a result of true if the inputs differ (e.g., one input must be 1 and the other must be 0 to be true). There will be a unique subkey per round, plus one more (which will run at the end).
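
As a sketch, AddRoundKey is essentially a one-liner. The subkey below is a placeholder for illustration, not output from the real key schedule:

```python
# AddRoundKey is a byte-wise XOR of the state with the round subkey.

def add_round_key(state: bytes, subkey: bytes) -> bytes:
    return bytes(s ^ k for s, k in zip(state, subkey))

state = b"blue pill or red"
subkey = bytes(range(16))       # placeholder subkey, NOT from a real key schedule

mixed = add_round_key(state, subkey)

# XOR is its own inverse, which is part of what makes decryption possible:
assert add_round_key(mixed, subkey) == state
```

That self-inverse property is why the same operation appears in both encryption and decryption.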

The SubBytes operation, which stands for substitute bytes, will take the 16-byte block and run it through an S-Box (substitution box) to produce an alternate value. Simply put, the operation will take a value and then replace it by spitting out another value.

The actual S-Box operation is a complicated process, but just know that it's nearly impossible to decipher with conventional computing. Coupled with the rest of the AES operations, it will do its job to effectively scramble and obfuscate the source data. The S in the white box in the image above represents the complex lookup table for the S-Box.
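
Mechanically, though, SubBytes is just a table lookup. The real AES S-box is a specific, fixed 256-entry table defined in the standard; the table built below is a deliberately made-up toy (NOT the real S-box) that only exists to show the lookup-and-invert mechanics:

```python
# SubBytes is a straight table lookup. We build a TOY invertible table
# here (NOT the real AES S-box) just to show how the mechanics work.

toy_sbox = [(7 * x + 3) % 256 for x in range(256)]  # bijective: gcd(7, 256) == 1

toy_inverse = [0] * 256
for x, y in enumerate(toy_sbox):
    toy_inverse[y] = x

def sub_bytes(state: bytes, box) -> bytes:
    return bytes(box[b] for b in state)

state = b"blue pill or red"
substituted = sub_bytes(state, toy_sbox)

# The substitution must be reversible, or decryption would be impossible:
assert sub_bytes(substituted, toy_inverse) == state
```

The real S-box is chosen for strong cryptographic properties, not just invertibility, but the lookup itself works exactly like this.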

The ShiftRows operation is a little more straightforward and easier to understand. Based on the arrangement of the data, the idea of ShiftRows is to move the positions of the data within their respective rows with wrapping. Remember, the data is arranged in a stacked arrangement and not left to right like most of us are used to reading. The image provided helps to visualize this operation.

The first row goes unchanged. The second row shifts the bytes to the left by one position with row wrap around. The third row shifts the bytes one position beyond that, moving the byte to the left by a total of two positions with row wrap around. Likewise, this means that the fourth row shifts the bytes to the left by a total of three positions with row wrap around.
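The row-wrapping behavior described above can be sketched in a couple of lines:

```python
# ShiftRows rotates row r of the 4x4 state left by r positions with
# wrap-around: row 0 is untouched, row 1 shifts by one, and so on.

def shift_rows(state):
    return [row[r:] + row[:r] for r, row in enumerate(state)]

state = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11],
         [12, 13, 14, 15]]

shifted = shift_rows(state)
assert shifted[0] == [0, 1, 2, 3]      # row 0 unchanged
assert shifted[1] == [5, 6, 7, 4]      # rotated left by one, with wrap
assert shifted[3] == [15, 12, 13, 14]  # rotated left by three, with wrap
```

Because each column of the state came from consecutive bytes of the input, this step spreads the bytes of each original column across all four columns.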

The MixColumns operation, in a nutshell, is a linear transformation of the columns of the dataset. It uses matrix multiplication and bitwise XOR addition to output the results. Each column's data, which can be represented as a 4×1 matrix, is multiplied against a fixed 4×4 matrix, with the arithmetic performed in a structure called a Galois field. That will look something like the following:

As you can see, there are four input bytes that are run against a 4×4 matrix. In this case, matrix multiplication has each input byte affecting each output byte and, obviously, yields an output of the same size.
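
For the curious, here's a small Python sketch of one column passing through MixColumns, including the Galois-field multiplication. The final assertion uses the standard test vector from the AES specification:

```python
# MixColumns multiplies each 4x1 column by a fixed 4x4 matrix, with the
# arithmetic done in the Galois field GF(2^8): "addition" is XOR, and
# multiplication by 2 uses the AES reduction polynomial (0x1B).

def xtime(a):                      # multiply by 2 in GF(2^8)
    a <<= 1
    return (a ^ 0x1B) & 0xFF if a & 0x100 else a

def gmul(a, b):                    # general GF(2^8) multiplication
    result = 0
    while b:
        if b & 1:
            result ^= a
        a = xtime(a)
        b >>= 1
    return result

MIX_MATRIX = [[2, 3, 1, 1],
              [1, 2, 3, 1],
              [1, 1, 2, 3],
              [3, 1, 1, 2]]

def mix_column(col):
    out = []
    for row in MIX_MATRIX:
        acc = 0
        for coeff, byte in zip(row, col):
            acc ^= gmul(coeff, byte)   # XOR is addition in GF(2^8)
        out.append(acc)
    return out

# Standard test vector from the AES specification:
assert mix_column([0xDB, 0x13, 0x53, 0x45]) == [0x8E, 0x4D, 0xA1, 0xBC]
```

Note that this is only one column; the real operation applies the same multiplication to all four columns of the state.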

Now that we have a decent understanding of the different operations utilized to scramble our data via AES encryption, we can look at the order in which these operations execute. It will be as such: an initial AddRoundKey with the first subkey, then each main round runs SubBytes, ShiftRows, MixColumns and AddRoundKey in that order, and the final round runs SubBytes, ShiftRows and AddRoundKey only.

Note: The MixColumns operation is not in the final round. Without getting into the actual math of this, there's no additional benefit to performing this operation. In fact, doing so would simply make the decryption process a bit more taxing in terms of overhead.

If we consider the number of rounds and the operations per round that are involved, by the end of it you should have a nicely scrambled block. And that is only a 16-byte block. Consider how much information that equates to in the big picture. It's minuscule compared to today's file and packet sizes! So, if each 16-byte block has seemingly no discernible pattern (at least, no pattern that can be deciphered in a timely manner), I'd say AES has done its job.

We know the advanced encryption standard algorithm itself is quite effective, but its level of effectiveness depends on how it's implemented. Unlike the brute force attacks mentioned above, effective attacks are typically launched on the implementation and not on the algorithm itself. This can be equated to attacking users as in phishing attacks versus attacking the technology behind the service/function that may be hard to breach. These can be considered side-channel attacks, where the attacks are carried out on other aspects of the entire process and not the focal point of the security implementation.

While I always advocate going with a reasonable/effective security option, a lot of AES encryption is happening without you even knowing it. It's locking down spots of the computing world that would otherwise be wide open. In other words, there would be many more opportunities for hackers to capture data if the advanced encryption standard wasn't implemented at all. We just need to know how to identify the open holes and figure out how to plug them. Some may be able to use AES and others may need another protocol or process.

Appreciate the encryption implementations we have, use the best ones when needed, and happy scrutinizing!

Read the original post:

Advanced Encryption Standard (AES): What It Is and How It Works - Hashed Out by The SSL Store - Hashed Out by The SSL Store


Grandparents rely on drive-bys, technology to stay in touch – Journal Review

Posted: April 23, 2020 at 2:46 am

By CANDY NEAL

HUNTINGBURG, Ind. (AP) Michelle McCain misses holding her grandchildren.

And they miss her.

"Not being able to hug, kiss, and hold them has been so heartbreaking," she said. "It's been really hard with our youngest grandchild, Lakelyn. She is 7 months old now and it's hard not to snuggle with her."

But Michelle does get to visit with the children from time to time.

Her two older sons, Tim and Tyler Rainey, have done drive-by visits with the grandkids in the backseat. So while Michelle and her husband, Brad, are on the porch of their rural Holland home, the grandkids are talking and beaming smiles to them out of their truck window. Doing this has given the grandparents the chance to see 10-year-old Addysen, 8-year-old Hadley, 6-year-old Remington, 2-year-old Canyon and 7-month-old Lakelyn. The grandkids even left Easter flowers for Michelle.

"We have developed a game of blowing and catching kisses from each other," Michelle said. "They all remain in their trucks and we stand on our front porch and talk and blow kisses."

Daughter-in-law Erica came up with the idea.

"Hadley wanted to see us," Michelle said, "and (Erica) parked on the other side of the highway on the county road and let us holler and wave to Hadley. Then, Tyler followed suit with his three kids, but he drove up through our front yard next to our front porch."

"Our front yard now has a path developing in it," Michelle said, laughing.

Grandma and grandpa did their own drive-bys on Easter, delivering barbecue chicken, potato salad and Easter baskets. They've also been involved in their grandkids' education, ordering activity books for them at the beginning of their e-learning sessions at home. Michelle's also done some online videos of the grandkids.

"Just fun things for them since Grandma's hugs are on hold for a while," she said.

Penny Spangler of Jasper, left, watches her husband, Tom, reach out to their great-granddaughter, Carter Snyder, 11 months, as Carter's parents, Abbey and Collin Snyder, watch at their home in Huntingburg on Saturday.

Tom and Penny Spangler have also gotten to see their grandchildren and great-grandchildren through drive-by visits.

Before the pandemic, the Spanglers had a tradition of having the grandchildren and great-grandchildren over for dinner on Monday nights, called MeMe's Monday Night Meals. That would include the grandkids and great-grandchildren in Huntingburg, Ireland and Tell City, if possible. They also have grandchildren in Indianapolis.

"We found that food really attracts grandkids and great-grandkids," Tom said.

MeMe and Poppa Spangler would provide a home-cooked meal with dessert, and the grandchildren would bring their families over to spend time eating and catching up.

"When the pandemic started, I know they couldn't come see us," Penny said. "So two weeks ago, I fixed a big old meatloaf, butter noodles and bar cookie dessert. And we went around and we delivered meals to the kids that day."

"We put them on their steps," Tom said. "And then stand by the car, and they'll come out."

Since the family can't be around the table together sharing the meal, they send a video of them eating, something with the baby, something creative. "It's been funny," he said.

On Easter, the Spanglers went to see their Tell City family, including 18-month-old great-granddaughter Madison. That time, they stood on the porch and talked to the kids through the window.

They also keep in touch by phone, calling and using the many video communication apps.

"We love Snapchat," Penny said. "We have a family group. So we're taking pictures and sending pictures back and forth. When we send it to family, then we know it gets to everybody."

The latest video app they're using now is Zoom. "We had 11 people on Zoom on Easter," Penny said. "We're hoping to do that every Sunday night."

All this helps them keep in touch with their family. "All the videos, Snapchats, Zoom, all that keeps us in touch," Penny said. "We are thankful to have a way to see our grandkids and great-grandkids."

Grandparents who have been active with their grandkids may find it hard to be separated from them now.

"There hadn't been a week or weekend that I hadn't seen some of them," Michelle said. "And now, honestly, I have cried myself to sleep a couple nights. I miss them."

Technology has helped. "A couple years ago, I got each household an Alexa from Amazon for Christmas," Michelle said. "So they can call me and we can see each other. Technology has come in handy during this difficult time."

"But the drive-by visits are the next best thing to getting and giving Grandma hugs!"


Read more here:

Grandparents rely on drive-bys, technology to stay in touch - Journal Review
