Confidential computing is needed for AI systems in general, and for AI-based self-driving cars in particular.
Let's play a spying game.
A friend of yours wants to write down a secret and pass along the note to you. There is dire concern that an undesirable interloper might intercept the note. As such, the secret is first encrypted before being written down and thus will be inscrutable to anyone who perchance intercepts it. All told, the message will look scrambled or seem like gobbledygook.
You have the password or key needed to decrypt the message.
After the note has passed through many hands, it finally reaches you. The fact that many others saw and ostensibly were able to read the note is of no consequence. They could not make heads or tails of what it said.
Upon receiving the encrypted message, you decide to decrypt it. Voila, you can now see what it says. The message has been successfully and securely received and deciphered. The world is saved and everyone can rejoice.
But wait a second: when you decrypted the message, you wrote it down, and meanwhile, a dastardly spy was looking over your shoulder. The snoop has now seen the entire message, spying it in all its glory while in plaintext. The jig is up. Sadly, after the note had gone from hand to hand and been protected that entire time, at this last moment the secret was revealed.
Maybe worse still, you are the one that revealed it (i.e., you being the intended receiver).
What went wrong?
Some might refer to this as the last-mile problem or, perhaps more aptly in this instance, the last-step problem.
You see, the catchphrases of last-mile or last-step are often used when describing a situation that has a kind of gap or arduous challenge at the very end of a task or activity. For example, in the telecommunications industry, there is the notion that the hardest and most costly part of providing high-speed networking to homes is the so-called last mile from the main trunk to the actual home of the consumer.
Envision a cable that runs down the middle of a neighborhood street; the last mile would be to make all the offshoot branches that need to extend from the centerline to each specific domicile. The number of such branches is high. It is one thing to simply lay down the center cable, and quite another, hugely costly effort to then string out to each house. Even though the actual distance from the center to each house is not a mile, the notion is that you've reached the proverbial last mile or last step involved in the process.
This last-mile or last-step can be the weak link in a long chain of efforts.
Imagine expending an enormous amount of time and energy toward getting a slew of things done, and at the last stage, the matter is ultimately either untenable or somehow spoiled. This is indeed what happened in the scenario of you getting the purposely encrypted note. Once it got into your hands, you decrypted it and did not realize that you were being spied upon.
The classic line is to always make sure that you finish what you start. We might wish to augment the line by adding that you need to finish fully and with the proper gusto, else the finish might be undermined and become the exasperating, disappointing point of weakness in a long series of otherwise careful steps.
In the cybersecurity field, there are three major ways that data, such as the message on the note, is usually intended to be protected or secured:
Data at rest (standing still data)
Data in transit (flowing data)
Data in use (when being read or utilized)
Your friend's note to you was in transit when it was being passed along from person to person on its way to you. At some point, perhaps the note was sitting on someone's desk for a while, waiting for them to pick it up and continue the note's journey to you. That would be data at rest.
That covers the data at rest and the data in transit instances.
When you opted to decrypt the note and take a look at what it said, the data was considered in use at that time. Per the saga, this is when things went awry and a despicable spy saw the message. Up until that moment, the message was relatively secret and secure. The last mile or last step exposed it.
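To make the three states of data concrete, here is a minimal Python sketch of the note's journey. The cipher below is a deliberately toy XOR scheme, used purely for illustration; a real system would use a vetted cipher such as AES-GCM.

```python
import secrets

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher for illustration ONLY -- real systems
    # use a vetted scheme such as AES-GCM, never this.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

toy_decrypt = toy_encrypt  # XOR is its own inverse

key = secrets.token_bytes(16)  # the "password" only the receiver holds
note = b"meet at the old clock tower at noon"

# Data at rest: the note sits on someone's desk in encrypted form.
stored = toy_encrypt(note, key)

# Data in transit: the same ciphertext is what travels hand to hand.
in_transit = stored

# Data in use: only at the very last step is it decrypted -- and this
# is exactly the moment the over-the-shoulder spy is waiting for.
revealed = toy_decrypt(in_transit, key)
assert revealed == note
```

Notice that the protection holds for the first two states automatically; it is the third state, data in use, where the plaintext briefly exists and the last-step exposure occurs.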
I bring this up to highlight a hot new trend known as confidential computing.
We will use the tale of the encrypted note to help explore the nature of confidential computing. Admittedly, the parable is not wholly on-target with the topic, but you will soon see that it does offer a semblance of insightful parallels.
Confidential computing is usually associated with making use of cloud computing.
Cloud computing is the now familiar notion of using unseen computing resources that are available via remote access. Referring to this as cloud computing is an easy way to envision the matter and has fortuitously been an extremely catchy way to denote various computers as being in the cloud and available for use.
When your data is placed into a cloud-based computer, you likely want to feel comfortable that the data is well-protected. If the data is sitting in a database, perhaps a fiendish hacker might try to access the data. You want to prevent the cybercrook from being able to see your precious data, and ergo there are typically cybersecurity locks that seek to keep the bad hackers out of the database.
Suppose though the evildoer cracks through the locks. Aha, by encrypting the data, which is sitting at rest, the ability to do anything untoward with the data is greatly lessened. Though the hacker might be able to see the data, it is scrambled and generally unusable.
Imagine that there is a need to share the data and thus copy it to another database on a different computer. While the data is in this transit from one database to another, it is potentially vulnerable to prying eyes. If the data is encrypted while in transit, the interloper will presumably not gain much since the data is inscrutable.
We now are heading to the last mile or last step.
Assume that at some point the data will be needed for making calculations. The database with the encrypted data is accessed, and the data, while still encrypted, is copied over to a computer that is doing the computations. All's good so far.
Upon the encrypted data being brought into the CPU (Central Processing Unit) of the computer, at this last mile or last step, it is now necessary to decrypt it; otherwise, the data won't be of much use for making the desired calculations if it remains in an encrypted format.
Here is the potential loophole in all of this series of carefully encrypted steps. Now that the data is momentarily decrypted for use while inside the CPU, it becomes potentially open for a wrongdoer to peek at it. Akin to the saga of your having gotten the encrypted note into your hands, and then doing the decryption, there is a chance that someone might be spying and able to see the now in-the-clear message.
Your first thought might be that the idea of a cybercriminal hacking all the way into the inner guts of the CPU while it is processing seems nearly unimaginable.
Can they really do that?
The answer is yes, it is possible.
That being said, it is generally a quite difficult trick to pull off. Numerous system protections would have to be overcome. Nonetheless, a very determined and crafty cyber hacker could devise such a devilish scheme (especially when you include nation-state elements of cybersecurity, see my coverage at this link here).
This last-mile cybersecurity concern is being partially mitigated by the use of confidential computing.
Within the CPU of a computer arranged for confidential computing, a special, highly secure enclave is set up. This is usually done via a hardware-based environment that governs the execution of the CPU's running tasks. In industry parlance, this is known as a Trusted Execution Environment (TEE). Some keys or passwords are kept under added protection and used only when the last step occurs.
The enclave tries to hide from any other resources what is going on inside it. Remember how you inadvertently allowed a spy to look over your shoulder? That is the type of intrusion that the enclave is constructed, fortress-like, to keep at bay.
Here's why this is especially relevant to cloud computing.
Suppose the cloud computer being used has somehow gotten malware on it. If there isn't a provision of confidential computing, the risks of the malware peeking at the CPU and catching the data in an unencrypted format are heightened. Likewise, the operating system (OS) of the cloud computer could potentially take a peek (perhaps the OS then leaks it elsewhere), and even (sadly) there is a possibility that employees of the cloud provider might have access to take a look.
Via the use of the TEE and the enclave, the notion is that none of those other potential interlopers can see what is going on inside the CPU during the computational efforts. Furthermore, confidential computing typically includes a feature whereby, upon detecting that an interloper is trying to do something untoward, whatever CPU action might have been planned or underway is canceled and an alert is raised.
This could be likened to you having potentially noticed that a person was spying over your shoulder. I'd assume that had you realized the interloper was present, you would have stopped decrypting the secret message. Of course, you might have already started to decrypt it, in which case maybe the interloper saw some of it, but at least you would curtail your activities at that juncture. Plus, you likely would have called for the cops to come and bust the reprehensible spy.
Most of the major cloud providers have made available various flavors of confidential computing, including the biggies such as IBM Cloud, Amazon AWS, Microsoft Azure, Google Cloud, Oracle Cloud, and others. The makers of CPUs are also integral to the confidential computing architecture and thus companies such as Intel, AMD, and the like are involved.
As eloquently stated in a paper by IBM Fellow and CTO for Cloud Security, Nataraj Nagaratnam: "As companies rely more and more on public and hybrid cloud services, data privacy in the cloud is imperative. The primary goal of confidential computing is to provide greater assurance to companies that their data in the cloud is protected and confidential, and to encourage them to move more of their sensitive data and computing workloads to public cloud services."
There is a well-known group called the Confidential Computing Consortium (CCC) that has banded together numerous cloud providers, hardware vendors, and software development outfits to focus on confidential computing. Per the posted CCC remarks of Stephen Walli, Governing Board Chair: "The Confidential Computing Consortium is a community focused on open source licensed projects securing data in use and accelerating the adoption of confidential computing through open collaboration."
For those readers that are adept at programming, you likely know that your encrypted data, while sitting in a database, is usually decrypted once you bring the data into the internal memory of the computer system. This is done so that the CPU can readily use the data for doing computations. In the confidential computing arrangement, the data is not decrypted until the final moment, the last mile or last step, of being placed into the CPU for use. Therefore, even while sitting in internal memory, the data is encrypted and less vulnerable to cyberattack.
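The pattern can be sketched in a few lines of Python. This is only a software simulation of the idea, not a real TEE (actual enclaves such as Intel SGX or AMD SEV are hardware features with their own SDKs); the toy XOR cipher and the `trusted_compute` boundary are illustrative stand-ins.

```python
import secrets

KEY = secrets.token_bytes(16)  # in a real TEE, this key never leaves the enclave

def _xor(data: bytes) -> bytes:
    # Toy XOR cipher, illustration only -- not real cryptography.
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(data))

# Ordinary memory holds only ciphertext, mimicking confidential computing:
# a process that dumps this variable sees nothing but scrambled bytes.
encrypted_record = _xor(b"42")

def trusted_compute(ciphertext: bytes) -> int:
    # Stand-in for the enclave boundary: decryption happens only inside
    # this narrow function, and only the computed result leaves it.
    value = int(_xor(ciphertext))
    return value * 2

result = trusted_compute(encrypted_record)
```

The design point is the narrowness of the boundary: everything outside `trusted_compute` only ever touches ciphertext, shrinking the window during which a snoop could catch the plaintext.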
One additional quick point is that this scheme for confidential computing does not guarantee that no one can ever hack it. The cybersecurity field is an ongoing game of cat and mouse. Each new protection that is devised will be deviously picked apart until some unforeseen hole or gotcha is discovered. The hole will usually get plugged. Meanwhile, the gambit continues as the cybercrooks try to find a means to undo or overcome the plug or look for other ways to break in.
This is a never-ending cycle.
In that sense, the confidential computing approach is another added layer of cybersecurity. The more layers that you have, the harder it likely becomes for someone to crack through. At your home, you might have a gated fence around your property (a layer of protection), locks on your doors and windows (another layer of protection), and a motion detector inside the house (yet an additional layer). The belief is that by placing numerous hurdles in the way of a robber, they will be rebuffed in their intrusion efforts.
Having those added layers is not cost-free. For each layer, you need to ascertain the cost of the added protection versus the risks and consequences of someone breaking in. This is of course the same for confidential computing. Whether you require confidential computing is contingent on the type of computing activities you are undertaking, the magnitude of cybersecurity you are desirous of achieving, the risks and adverse consequences if a cyber breach occurs, etc.
Your car might also have various layers of security protection. There are locks on the car doors. The windows are made of materials that are hard to smash. Any motion immediately next to the vehicle might be detected and cause the horn to sound. And so on.
Speaking of cars, the future of cars consists of AI-based true self-driving cars.
Allow me to briefly elaborate on this point and then tie things to the topic of confidential computing.
There isn't a human driver involved in a true self-driving car. Keep in mind that true self-driving cars are driven via an AI driving system. There isn't a need for a human driver at the wheel, nor is there a provision for a human to drive the vehicle. For my extensive and ongoing coverage of Autonomous Vehicles (AVs) and especially self-driving cars, see the link here.
Here's an intriguing question that is worth pondering: Will confidential computing be useful for the advent of AI systems all told, and particularly for the advent of AI-based true self-driving cars?
Before jumping into the details, I'd like to further clarify what is meant when referring to true self-driving cars.
Understanding The Levels Of Self-Driving Cars
As a clarification, true self-driving cars are ones in which the AI drives the car entirely on its own, and there isn't any human assistance during the driving task.
These driverless vehicles are considered Level 4 and Level 5 (see my explanation at this link here), while a car that requires a human driver to co-share the driving effort is usually considered at Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).
There is not yet a true self-driving car at Level 5; we don't yet know whether this will be possible to achieve, nor how long it will take to get there.
Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend, see my coverage at this link here).
Since semi-autonomous cars require a human driver, the adoption of those types of cars won't be markedly different from driving conventional vehicles, so there's not much new per se to cover about them on this topic (though, as you'll see in a moment, the points next made are generally applicable).
For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that's been arising lately: despite those human drivers that keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that the driver can take their attention away from the driving task while driving a semi-autonomous car.
You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3 car.
AI And Self-Driving Cars And Confidential Computing
For Level 4 and Level 5 true self-driving vehicles, there wont be a human driver involved in the driving task.
All occupants will be passengers.
The AI is doing the driving.
One aspect to immediately discuss entails the fact that the AI involved in today's AI driving systems is not sentient. In other words, the AI is altogether a collective of computer-based programming and algorithms, and most assuredly not able to reason in the same manner that humans can.
Why this added emphasis about the AI not being sentient?
Because I want to underscore that when discussing the role of the AI driving system, I am not ascribing human qualities to the AI. Please be aware that there is an ongoing and dangerous tendency these days to anthropomorphize AI. In essence, people are assigning human-like sentience to today's AI, despite the undeniable and inarguable fact that no such AI exists as yet.
With that clarification, you can envision that the AI driving system won't natively somehow know about the facets of driving. Driving, and all that it entails, will need to be programmed as part of the hardware and software of the self-driving car.
Let's dive into the myriad aspects that come into play on this topic.
One overarching point worthy of particular attention is that any AI system, and especially one running in the cloud, should potentially be making use of confidential computing.
This is regrettably not a top-of-mind consideration for many AI developers.
The typical focus for AI software engineers is primarily on the underlying AI capabilities such as employing advanced uses of Machine Learning (ML) and Deep Learning (DL). Once the AI system is ready to be fielded, the AI builders tend to be less attentive to what happens when the program is placed into operational use. The assumption is that whatever existent cybersecurity is already available in the execution environment will probably be sufficient.
The average AI developer usually wants to get back to their AI bag-of-tricks and continue tweaking the AI-related elements of the system, or perhaps move onward to some other new development that requires their honed skills at crafting AI systems. Concerns about whether the prevailing execution environment for their budding AI system is highly secure do not explicitly enter into their mindset, nor are they found in their usual toolset.
Some will exhort: "Hey, I'm not a darned cybersecurity expert, I'm an AI developer" (that line is a heartfelt homage to the classic refrain in Star Trek: "I'm a doctor, darn it, not an engineer").
The thing is, the best AI systems can be readily brought to their knees if the cybersecurity is not top-notch and using all available layers of protection. Up until recently, many AI systems were not necessarily aimed at domains that entailed potentially high risks and pronounced adverse consequences if the AI was undermined at execution.
Nowadays, with AI becoming pervasive across all manner of applications, the idea of treating AI systems as merely experimental prototypes is long gone.
Simply stated, any AI developer worth their salt should be seriously giving due consideration to how their AI systems will be deployed, including what kinds of cyberattacks might be launched to undercut the AI system processing. Since the AI developer ought to know where the especially vulnerable weaknesses in their AI lie while it is executing, they should take a close look at confidential computing as a potential countermeasure and gauge whether this added layer of security is warranted.
I'm not saying that it will always be a necessity, just that with AI systems of a sensitive nature running in the cloud, it is prudent and nearly obligatory to consider which of the numerous potential cybersecurity precautions should be undertaken.
Hopefully, that will be a useful call to arms for those AI developers that haven't yet taken into account the utility of confidential computing. And perhaps a startling wake-up blaring of trumpets for some.
Moving beyond the overall notion of all types of AI systems that are running in the cloud, lets next take a gander at the use of the cloud for the specific advent of AI-based true self-driving cars. The most commonly anticipated use of the cloud for self-driving cars encompasses the use of OTA (Over-The-Air) electronic communications capabilities.
Via OTA, various patches and updates stored in the cloud for a fleet of self-driving cars can be downloaded into each autonomous vehicle and automatically installed. This is handy for remotely pushing out new features for the AI driving system or providing bug fixes, and it avoids having to bring the vehicles to a dealer site or repair shop merely to do needed software updates.
The OTA will also enable the ease of uploading data from the self-driving cars into the fleet-provided cloud. Self-driving cars will have a sensor suite that includes video cameras, radar, LIDAR, ultrasonic units, thermal imaging, and other such devices. The data they collect can be usefully analyzed by bringing together the data across an entire fleet of self-driving cars and conglomerating it in the cloud.
For my extensive coverage of the cloud as it pertains to autonomous vehicles and also for self-driving cars, see the link here.
So, you might be wondering, what does this have to do with confidential computing?
Think of it this way: if there are programs and data in the cloud that are going to potentially be downloaded and installed into the AI driving systems, this becomes a handy and sneaky path for a cyber attacker to get their malware into the self-driving cars. The cybercrook merely plants the evil-doing elements into the cloud and then patiently waits until the OTA mechanism does the rest of the work for the wrongdoer by broadcasting it out into the fleet.
Whereas most people tend to be thinking about how an AI driving system might get corrupted or undermined by someone physically accessing the autonomous vehicle, the likely greater threat comes from using the OTA to do so. The innocent beauty of the OTA is that it is an already assumed trusted avenue to directly get something inserted into the AI driving system, and this will happen across an entire fleet of self-driving cars. Imagine that there were hundreds, maybe thousands, perhaps hundreds of thousands of self-driving cars and all of them were using an OTA to get updates from a fleet cloud.
Okay, so we might want to put some devoted attention to what is happening in the fleet cloud.
The more cybersecurity we put there, the lower the chances that the OTA will become a specter of doom. It could be that the judicious use of confidential computing for the fleet cloud will curtail, or at least make much harder, the possibility of launching a cyberattack that might get carried out into the AI driving systems of the fleet.
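One conventional defense on the vehicle side, which confidential computing in the fleet cloud would complement, is for the car to cryptographically verify each OTA payload before installing it. The sketch below is a minimal, hypothetical illustration using an HMAC with a shared fleet key; a production OTA pipeline would instead use asymmetric signatures (e.g., Ed25519), with the private key held in the automaker's protected signing environment.

```python
import hashlib
import hmac

# Hypothetical shared fleet key, for illustration only. Real OTA systems
# use public-key signatures so vehicles hold no secret signing material.
FLEET_KEY = b"demo-fleet-signing-key"

def sign_update(firmware: bytes) -> bytes:
    # The fleet cloud signs each update before broadcasting it.
    return hmac.new(FLEET_KEY, firmware, hashlib.sha256).digest()

def install_if_authentic(firmware: bytes, signature: bytes) -> bool:
    # The vehicle refuses any OTA payload whose signature does not
    # verify, blocking the "plant malware in the cloud" attack path.
    if not hmac.compare_digest(sign_update(firmware), signature):
        return False  # reject: tampered or unsigned update
    # ... apply the update here ...
    return True

update = b"ai-driving-system v2.1 patch"
good_sig = sign_update(update)
assert install_if_authentic(update, good_sig)
assert not install_if_authentic(update + b"!", good_sig)  # tampered payload
```

Note that this check only helps if the signing step itself is trustworthy, which is precisely where a hardened, confidential-computing-protected fleet cloud comes in.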
Another potential use of confidential computing would be for the execution or processing that takes place inside self-driving cars.
When the AI driving system is being executed on the onboard computer processors, this execution obviously needs to be highly secure too. The tough tradeoff is that confidential computing tends to incur a performance hit on the processors and thus presents a somewhat complicated consideration when dealing with real-time systems. Keep in mind that real-time processing is controlling the actions of the self-driving car. Any substantive delay in processing times can be problematic.
Self-driving cars are real-time machines that also just so happen to involve life-or-death matters.
You typically do not have that same life-or-death concern for an everyday cloud-based application. If the cloud processing has any modicum of delay, this might be of little consequence. In addition, because a cloud-based application resides in the cloud, you can readily toss more processors at the application or reallocate to using faster processors available in the cloud.
For a self-driving car, the processors installed into the autonomous vehicle are generally not as readily switched out, since that can be a very physical effort and logistically costly to undertake. Automakers and self-driving tech firms are pretty much stuck once they've decided which processors to put into their self-driving cars. They've got to hope that the choice will last a while.