Real-Time Data Analytics and The Metaverse – Nanalyze

Posted: December 19, 2021 at 6:38 pm

One of tech's biggest news stories coming out of 2021 would be Facebook's rebrand to Meta. Like Google's rebrand to Alphabet, it's likely that nobody will adhere to the new name, but everyone sure took note of the strategic direction change towards the metaverse. The Web 3.0 / creator economy / DeFi crowd quickly attached themselves to the theme, and everyone had a good group wank on Twitter over what visionaries they are. Truth be told, the metaverse is an idea that's been around for decades.

Let's get back to basics for a moment. Four years ago, we published a piece titled How to Invest in the Singularity – It's Near, which summarized a keynote given by Masayoshi Son, CEO and Founder of SoftBank, about the importance of ARM.

He says that the chip maker he acquired last year, ARM Holdings, is expected to ship over 1 trillion IoT chips in the next 20 years, with ARM IoT chips commanding an 80% market share.

The metaverse we want to invest in is exactly that. It's the same vision that Jensen Huang of NVIDIA has: one built from scratch where the physical world is connected to the virtual world. All these IoT chips and sensors are turning our entire planet into a digital twin, a metaverse.

Then there's Facebook's vision of everyone going into a virtual reality (VR) world where they show off their NFTs and desperately try not to offend anyone.

Digital twins produce a tremendous amount of big data exhaust that can be analyzed for insights. The old factoid that 90% of the world's data was created in the last two years is probably equally true today, as the volume of data being produced has exploded and will continue to grow in the coming years.

It's why Warren Buffett set aside his aversion to tech stocks and invested in Snowflake, a cloud-based data management platform that helps companies do more with less. Such a value proposition will sell equally well in times of economic turmoil, and investors have clambered on board. The stock remains extremely overvalued, probably for good reason, and now has a market capitalization of over $100 billion.

We won't invest in any firm with a simple valuation ratio over 40, no matter how grand the value proposition might be. We may never be able to invest in Snowflake, but we're fine with that, especially when there are other big data companies out there we might find equally compelling. One area that sparked our interest is real-time data analytics, something that's particularly applicable to the metaverse.

How do tech visionaries stay relevant when technologies change so quickly? Two reasons. First, they learn exceptionally fast. Back in the day, rock star programmers could fake it until they made it by learning languages and platforms at the speed of projects. Second, some things don't change. More processing power and lower latency are always desirable attributes. NVIDIA (NVDA) became the biggest semiconductor company in the world because they built better processors. As for latency, it all comes down to one thing: how fast can we analyze the big data exhaust spewing forth from our growing population of digital twins? One answer is two words: Apache Kafka.

Web 3.0 wankers are always droning on about how decentralized autonomous organizations (DAOs) are the future. ConstitutionDAO was supposed to show the world this grand vision of technological socialism, but it only managed to burn millions in gas fees and fail miserably at what it set out to do. Truth be told, the DAO ethos has been around for a while. It's called open-source software. Developers have a special place in their hearts for a technology that is shared in a community where everyone works to make it better. Few understand just how strong this allegiance is in the software development community.

Some of the smartest nerds on this planet will gladly dedicate every iota of their spare time to helping out the open-source community. It's not because they're losers, it's because open-source communities are very effective at harnessing resources to collaborate and produce great things that wouldn't be possible if a single entity owned the software in question. The world's largest open-source foundation is Apache. And if the name offends you, they couldn't care less. They're too busy making great things happen instead of complaining on Twitter to get attention.

Anyone who slings code for a living has probably used at least some of the $22 billion worth of Apache open-source software products made available to the public at large at 100% no cost. Apache projects are overseen by self-selected teams of active volunteers who contribute to their respective projects for no compensation except street cred. Projects are self-governing with a heavy slant towards driving consensus to maintain momentum and productivity. This incredibly successful decentralized autonomous organization has managed to produce over 300 projects, some of which probably even sound familiar to the general population, like Hadoop.

One of those projects is Apache Kafka, one of the most successful projects Apache has ever delivered, and one being used by over 100,000 organizations globally.

Apache Kafka is the most popular open-source stream-processing software for collecting, processing, storing, and analyzing data at scale. Best known for its excellent performance, low latency, fault tolerance, and high throughput, it's capable of handling thousands of messages per second. It's now being used by 80% of the Fortune 500.
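To make the "high throughput, low latency" claim a bit more concrete, here's a minimal sketch of what pushing a stream of digital twin sensor readings into Kafka might look like using the kafka-python client. The broker address, topic name, and sensor payload are illustrative assumptions on our part, not anything specific to the companies discussed here.

```python
# Minimal sketch, assuming a Kafka broker at localhost:9092 and the
# kafka-python package installed. Topic name and payload are made up.
import json
import random
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),  # JSON-encode each event
)

# Pretend each message is a reading from a "digital twin" of a machine on a factory floor.
for _ in range(1000):
    reading = {
        "sensor_id": "machine-42",
        "temperature_c": round(random.uniform(20.0, 90.0), 2),
        "timestamp": time.time(),
    }
    # send() is asynchronous; messages are batched behind the scenes for throughput.
    producer.send("sensor-readings", value=reading)

producer.flush()  # block until all buffered messages have been delivered
```

The point of the sketch is simply that producers fire events into a topic as they happen, and Kafka's job is to make those events durable and available to consumers with minimal delay.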

Example use cases include messaging, website activity tracking, operational metrics, log aggregation, and real-time stream processing.

The ability to glean insights from data while it is being generated is the final frontier when it comes to predictive analytics for business decision making.
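As a rough illustration of gleaning insights while the data is being generated, the sketch below consumes the hypothetical sensor-readings topic from the producer example above and flags anomalous temperatures as events arrive, rather than waiting for an overnight batch job. Again, the topic name, threshold, and broker address are assumptions for illustration.

```python
# Minimal sketch, assuming the same local broker and the hypothetical
# "sensor-readings" topic from the producer example above.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-readings",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",  # start from the oldest available message
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

TEMP_ALERT_C = 80.0  # arbitrary threshold for this illustration

# Each message is handled as it arrives, i.e. while the data is being generated.
for message in consumer:
    reading = message.value
    if reading["temperature_c"] > TEMP_ALERT_C:
        print(f"ALERT: {reading['sensor_id']} reported {reading['temperature_c']} C")
```

Swap the print statement for a dashboard update, a fraud check, or a machine-shutdown command and you have the basic shape of real-time data analytics.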

Snowflake was the largest software initial public offering (IPO) ever, coming out of the gates with a $68 billion valuation. With a current market cap of over $100 billion (IBM's market cap is $114 billion), it's already grown enough to fall into our mega-size bucket.

Another big player in this space, Databricks, is private but said to command a valuation of $100 billion or more in a possible IPO. Both of these companies are likely to maintain their lofty valuations over time, which means we need to think about other opportunities to invest in the exponential growth of big data. There's one pure-play company a subscriber brought to our attention which somehow missed our daily IPO screening process. It's a $16 billion real-time data company that was founded by the creators of Apache Kafka. From what we've seen so far, it may find a place in our own tech stock portfolio. Coming soon: a deep dive into a firm with a very compelling value proposition.

If we think of the big data opportunity holistically, we can divide it into two components: static data and real-time data. Everyone understands that the quicker information is available for making business decisions, the more valuable it is. The most responsive way to provide data analytics is while the data is being generated. Real-time data analytics is the new paradigm for organizations of all kinds as we slowly build up the metaverse. Next up, we'll look at how we might be able to invest in this theme.

Want to know what 30 tech stocks we own right now? Want to know which ones we think are too risky to hold? Become a Nanalyze Premium member and find out today!
