Monthly Archives: June 2020

The history (and evolution) of athlete activism – The Drum

Posted: June 13, 2020 at 12:58 am

Sports stars are increasingly using their huge personal platforms to speak out about injustice, even if sometimes their sponsors would prefer them not to. Simon Oliveira, the sport and entertainment specialist who has worked with the likes of David Beckham, Usain Bolt, Neymar Jr and Liam Payne, examines the rise of athlete activism, and what it means for brand partners.

Athletes around the world have been central to the global outpouring of emotion since the senseless death of George Floyd.

In times of tragedy and injustice, the power of social media is never more evident. Since that fateful day, almost every globally recognised sportsperson has taken to their personal platforms to express their thoughts and feelings on the endemic racism that still exists in society. Anger has now turned to activism.

Athlete activism is not a new phenomenon, of course. Back in 1967, heavyweight boxing champion Muhammad Ali refused to be drafted into the US military, citing his religious beliefs and opposition to the Vietnam War. This led to a three-and-a-half-year hiatus from the sport as he was denied a boxing license in every US state and stripped of his passport.

This incident was followed by one of the most symbolic moments in sports history when Tommie Smith and John Carlos delivered a Black Power salute on the medal rostrum at the Mexico 68 Olympics. At the time, the Olympic Games was by far the biggest global platform for an athlete due to its worldwide media exposure, led by the advent of colour television coverage. Carlos and Smith were expelled from the Games, ostracised for their actions and received death threats on their return to the United States.

Fast forward 50 years and athletes are now publishing titans, with bigger personal followings, reach and influence than most traditional media and many of the brands that endorse them. Culturally, however, it feels like we are at a very similar moment to that of the aforementioned events and the counterculture generation of the late 60s and early 70s. But the journey to a point where athletes feel empowered to share their sentiments in such an open and honest way has been a long one.

The experiences of Ali, Carlos, Smith and countless other athletes who were unafraid to voice their opinions to media willing to tell their story led to a wall of silence among many who followed, increasingly wary of professional recriminations.

In addition, during the late 80s and early 90s, as commercial interests in sport grew and media began covering off-field stories as well as on-field ones, sports stars and celebrities became front-page news. Consequently, many athletes started to protect themselves by sanitising their public views. The assumption was that to attract endorsements and appease clubs, leagues and commercial partners, athletes should avoid having a voice on cultural, political or societal issues and stick to sport.

Michael Jordan is a very relevant example of this apolitical approach, and the subject was revisited in Netflix and ESPN's 'The Last Dance' documentary, which reflected on the time Jordan declined to publicly endorse Harvey Gantt, the African-American former Democratic mayor of Charlotte, in his racially contentious Senate race against Republican Jesse Helms.

"Republicans buy sneakers, too," was the line Jordan famously used by way of explanation at the time. He clarified his position during the documentary, stating that activism is simply not in his nature: "Was that selfish? Probably," he admits. "But that's where my energy was." Having released a statement (via social media) regarding George Floyd's untimely death and subsequently pledged to donate $100 million over the next 10 years to racial equality and social justice causes, it appears his stance has now changed significantly.

George Floyd's tragic death has been the catalyst for the global Black Lives Matter movement, but the efforts of athletes to call out racially motivated injustices had already begun to reach a crescendo.

In 2016, Colin Kaepernick first took a knee, with his position supported and endorsed by his sponsor, Nike. In 2018, LeBron James was told to "shut up and dribble" by a Fox News host, and subsequently launched UNINTERRUPTED, his own athlete empowerment brand. Last year, Raheem Sterling challenged the British media's perception of black players and has taken a lead in calling out racism in football.

Athletes have never held more influence in their hands. Many have exceptional spokespeople who have a natural gift for storytelling that plays out on their social media platforms. Unlike other socially constructed media celebrities, for example, the Kardashian-Jenners, who share the rarefied air of 100 million followers or more, not all sportspeople are comfortable in the social media spotlight. Let's not forget, these are professional athletes first and foremost and that defines them. Some enlist the services of support teams to help manage these assets. Others simply choose not to engage at all.

The efforts of Kaepernick, James, Sterling and many more, in shining a light on injustice and keeping issues front of mind for their millions of followers, have put collective pressure on the federations, leagues and brands they represent to stand for something more than just social media statements. As President Obama said in his commencement speech this week, peaceful protests "make the folks in charge uncomfortable".

While many brands and organisations, including sporting institutions, have shown solidarity with the George Floyd protests, the question now is how they will contribute to real change. Presently, the activism on display is driven from the bottom up by athletes themselves. The NFL's public shift on the Black Lives Matter movement, which eventually led to a video from commissioner Roger Goodell condemning racism and admitting wrongdoing last week, started with a rogue NFL video producer editing a video featuring prominent players without his employer's knowledge. The leadership of major leagues, competitions and brands must make social purpose a priority by educating fans and consumers and ensuring stricter punishments for socially unacceptable behaviour.

Broader society, and by proxy the media, now celebrate influential voices who are not afraid to put their heads above the parapet. Moving forwards, more brands and media owners must support these sentiments with actions. Those who display purpose and authenticity, and deliver on their promises, will be the ones that thrive.

Pandora's box is now well and truly open, and sportsmen and women will continue to take strength from the positive reactions to their willingness to speak out. Top-level athletes, and the organisations they represent, can collectively reach and connect with more people than almost any politician. This is an age where you can educate, influence and inspire, and make a real difference for generations to come. The challenge now for athletes will be to ensure they remain accountable and live up to their claims.

Let's be clear: there are some deep-seated, undeniable issues prevalent in many western democracies. However, we do still have the right to protest and to voice our opinion, in person or on social media. Let's hope that athletes take their new-found activism and continue to call out injustice wherever they see it, whatever the country, empowered to do so by the brands, teams and countries they represent.

Simon Oliveira is the managing director of KIN Partners. He has worked with stars including David Beckham, Usain Bolt, Neymar Jr, Lewis Hamilton, Andy Murray and Liam Payne, was a founding partner in content studio OTRO, and has co-produced documentaries, such as I am Bolt and Class of '92.


A Long Childhood May Be How Crows and Jays Evolved Their Smarts – Smithsonian.com

Posted: at 12:58 am

A big brain is useless without the protection and education provided by an extended, nurturing parental presence, according to a new study comparing the lengthy childhoods of humans and certain brainy birds.

The average adult human's brain accounts for about 2 percent of body weight but consumes 20 percent of the calories its owner burns. In childhood the brain's caloric demands are even greater, peaking at 43 percent of kids' daily energy requirements.

"Brains are weird adaptations – they come empty and are very costly," Michael Griesser, an evolutionary biologist from the University of Konstanz and co-author of the new research, tells Natalie Parletta of Cosmos. "So it takes individuals a lot of time to make this adaptation worthwhile."

Studies of people and primates suggest that extended parenting is key to making the brain's metabolic costs worthwhile, and thus to the evolution of smarts more broadly, the researchers write. To paint a fuller picture of the role extended parenting plays in helping offspring survive, and in the evolution of greater and more varied cognitive abilities, the researchers looked to a more distant branch of the evolutionary tree: birds.

Corvid birds – a group including crows, ravens and jays – are noted brainiacs of the avian world and also spend extra time rearing their young. To systematically study where corvids stand relative to their feathered brethren, the researchers compiled a database of the life histories of thousands of bird species, including 127 corvids, reports Amanda Heidt for Science.

Compared with the other birds in the database, corvids spend more time in the nest before fledging, are doted on by their parents for longer and have larger brains relative to their bodies, the researchers report this month in the journal Philosophical Transactions of the Royal Society B.

The study also included extensive field observations of two corvid species: New Caledonian crows and Siberian jays. The jays that watched their parents solve experimental puzzles learned faster and also received more food from their parents, per Cosmos. For the young jays, hanging around their folks made them more likely to survive and pass on their genes to offspring of their own, according to a statement.

These jays stay in family groups for as long as four years. In contrast, a group of chicken-like birds called megapodes don't even incubate their eggs, which they lay in burrows or inside piles of decaying leaves. Megapode young begin life by digging their way through several feet of rotting plant material or soil and emerge able to fly and fend for themselves.

While observing the New Caledonian crows, the researchers saw parents who were tolerant of their offspring's meddling as the adults were trying to use sticks to gather food. Tolerant parents are essential for the mischievous youngsters, which take up to a year to grasp valuable and complex life skills and stay with their parents for up to three years, according to the study.

"Both humans and corvids spend their youth learning vital skills, surrounded by tolerant adults which support their long learning process," says Natalie Uomini, a researcher studying the evolution of cognition at the Max Planck Institute and the study's lead author, in the statement.

Moreover, corvids and humans have the ability for lifelong learning – a flexible kind of intelligence which allows individuals to adapt to changing environments throughout their lifetime.

The researchers argue that the development of extended parenting is pivotal in the evolution of increasingly advanced cognitive abilities, a subject of intense debate. They write that "extended parenting provides a safe haven, access to tolerant role models, reliable learning opportunities and food," which makes offspring more likely to survive.

This pushes evolution in two ways. First, if the offspring of long-suffering, devoted parents live longer and have more babies, those traits may become more common through natural selection. Second, it also creates a situation that might allow uncommonly smart offspring to thrive, pushing forward the evolution of new cognitive skill-sets that take months or years to develop.

Uomini tells Science that studies into the development of other animals, even ones as different from us as birds, can grant humans insights into the evolutionary conditions that helped our big brains and our intelligence to evolve.


8 Wild Examples of Evolution Copying Itself – Gizmodo

Posted: at 12:58 am

Artist's reconstruction of Homo floresiensis. Image: Kinez Riza

Paleontological evidence suggests humans aren't immune to the effects of convergent evolution. Homo sapiens is the last human species left standing, but plenty of other humans have walked the Earth, including Neanderthals, Denisovans, H. erectus and H. naledi, among others.

In 2004, scientists working on the island of Flores discovered evidence of a diminutive human species, called Homo floresiensis and popularly known as the "Hobbit." Now extinct, this archaic human stood no taller than 3 feet 7 inches (109 centimeters). Incredibly, evidence of a second diminutive species, named Homo luzonensis, was uncovered in the Philippines last year.

These species lived at roughly the same time – around 50,000 years ago – but nowhere close to each other. Their striking physical similarities have been attributed to an evolutionary process known as insular dwarfism, in which a species shrinks over time owing to limited resources. Perhaps not coincidentally, both human species lived on islands, which are known to produce diminutive species of various sorts.


The evolution of the luxury retail experience – FashionUnited UK

Posted: at 12:58 am

Trendstop | Friday, 12 June 2020

Pre-pandemic, experiential luxury was one of the biggest growth areas of the luxury sector. Instagrammable moments became more highly prized among Millennial and Gen Z consumers than the continued accumulation of product. Despite a hiatus caused by the outbreak, the consumer shift to a "less but better" mentality will inevitably reignite desire for experience-led initiatives.

Trendstop invites FashionUnited readers to discover how the luxury market is reinventing and innovating post-coronavirus.

Luxury brands with a wealth of outlets and flagship locations are rethinking how to repurpose them in light of social distancing guidelines. Labels such as Valentino are reinventing the in-store event in a virtual format. In a first of its kind, the Valentino Garavani Backnet showcase invited guests to visit its Milan outlet remotely, with sales staff on hand to guide them through customisation options and the purchase of its namesake sneaker.

Taking an omnichannel approach allows brands to combine current resources with new innovations to transform their operations in the new-normal climate. Merging the physical and digital worlds can create innovative retail experiences with the seamless service consumers expect from luxury retailers, and brands that are investing in and building on these capabilities will have the advantage over their competitors. Understanding their audience, and the differing impacts felt at different consumer levels, allows brands to plan future targeted marketing events.

FashionUnited readers can get free access to Trendstop's Resort SS20 Key Themes Directions report, featuring all the essential themes of the season. Simply click here to receive your free report.

Trendstop.com is one of the world's leading trend forecasting agencies for fashion and creative professionals, renowned for its insightful trend analysis and forecasts. Clients include H&M, Primark, Forever 21, Zalando, Geox, Evisu, Hugo Boss, L'Oreal and MTV.


Why the evolution of blockchain reliability is critical to protecting your digital assets – The European Sting

Posted: at 12:58 am

(Credit: Unsplash)

This article is brought to you thanks to the collaboration of The European Sting with the World Economic Forum.

Author: Amy Steele, Audit & Assurance Partner, Deloitte & Touche LLP

Much has been said in recent months about blockchain technology and the security and reliability its networks can offer. Given that risks and controls may look different in the blockchain ecosystem, especially when supporting a company's financial reporting process, protecting digital assets deserves further exploration.

Why effectively designed blockchains are key

In traditional contexts, an asset can be observed or tracked via source documents or physical observation. Digital assets, however, exist as records maintained on a blockchain, often with no further physical representation. They comprise a broad range of items represented as binary data with usage rights, ranging from certificates to crypto-assets. If the blockchain breaks down, a company may not be able to assert through its financial reporting that the digital asset exists. The existence of digital assets therefore depends on the reliability of the blockchain (i.e., the blockchain technology and its support network).

What is the World Economic Forum doing about blockchain?

Blockchain is an early-stage technology that enables the decentralized and secure storage and transfer of information. It has the potential to be a powerful tool for tracking goods, data, documentation and transactions. In this way, it can cut out intermediaries, reduce corruption, increase trust and empower users.

The potential uses of blockchain technology are essentially limitless, as every transaction is recorded and distributed on a ledger that is almost impossible to hack. Though the most well-known use case is cryptocurrencies, blockchain is being positioned to become a global decentralized source of trust that could be used to collect taxes, reduce financial fraud, improve healthcare privacy and even ensure voting security.

Blockchain has the potential to upend entire systems, but it also faces challenges. Read more about the work we have launched on blockchain and distributed ledger technologies to ensure the technology is deployed responsibly and for the benefit of all. We're working on accelerating the most impactful blockchain use cases, ranging from making supply chains more inclusive to making governments more transparent, as well as supporting central banks in exploring digital currencies.

Identifying vulnerabilities

A reliable blockchain should have an effective design for its intended purpose, properly recording a digital asset's creation or transfer through several elements: deployment services, the consensus protocol, network enablers, cryptographic linking of blocks and the developer community.

It may be difficult to conclude that a digital asset actually exists when one or more of these elements is vulnerable to breaking down. Below we explore some of these vulnerabilities and what companies can do to enhance their internal controls over blockchain reliability.


Deployment services are the technology and service providers that allow individuals and businesses to interact with a digital asset. Examples include blockchain explorer software, digital wallets, custodial services and exchanges. There are variations in the types of deployment services and their vulnerabilities; however, those that offer data services (e.g., blockchain readers, analytics) present unique challenges to existence. To support the assertion that a digital asset exists, one would look to its representation on the blockchain.

Within financial reporting, a company may use a service to directly read and report on the status of its digital assets (e.g., type, quantity, historical transactions). The information obtained may have been sourced from the blockchain, but it may be incomplete or inaccurate through errors in the software or through manipulation. Companies may look to obtain a service auditor's report from the data service provider to support the reliability of its internal controls.

The objective of the consensus protocol is to ensure that the blockchain network's node operators reach the same conclusion about the validity of transactions. A consensus protocol includes rules for: designating which node operator(s) determine which transactions should be accepted as true in the next block of recorded transactions; operating the incentive model that encourages participation and discourages bad actors; and sharing information equally so that node operators hold a common truth of facts.
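The first two rule categories – designating a proposer and reaching agreement – can be illustrated with a toy sketch in Python. The stake-weighted lottery, the two-thirds quorum and all names here are illustrative assumptions, not the rules of any real protocol:

```python
import random

def pick_proposer(stakes: dict, rng: random.Random) -> str:
    """Toy leader designation: choose which node operator proposes the
    next block, weighting each operator by its stake (the incentive model)."""
    operators = list(stakes)
    weights = [stakes[op] for op in operators]
    return rng.choices(operators, weights=weights, k=1)[0]

def accepted(votes: dict, quorum: float = 2 / 3) -> bool:
    """Toy agreement rule: a proposed block is accepted only when at least
    a supermajority of node operators judge it valid."""
    return sum(votes.values()) / len(votes) >= quorum

rng = random.Random(42)
proposer = pick_proposer({"alice": 50, "bob": 30, "carol": 20}, rng)
assert proposer in {"alice", "bob", "carol"}
assert accepted({"alice": True, "bob": True, "carol": False})       # 2/3 agree
assert not accepted({"alice": True, "bob": False, "carol": False})  # only 1/3
```

Real protocols add far more (fork-choice rules, penalty systems, message propagation), but even this sketch shows why a small operator set weakens the guarantee: with three voters, a single dishonest one can swing the quorum.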

The reliability of blockchain records may be vulnerable when a blockchain's open-source software contains errors or bugs. Critical vulnerabilities have been uncovered within the source code of various blockchains that, if exploited, could have triggered an unintentional hard fork (i.e., a split of the blockchain). Unfortunately, service auditor reports are not available for public blockchains, and it may not be feasible or effective for companies to perform their own source code reviews. Companies may consider ways to assess new developments and reports of vulnerabilities in code versions.

The network enablers comprise the blockchain node operators that perform the essential tasks of validating new blocks and mining (specific to proof-of-work blockchains) for the network. Most node operators are honest and seek to support the reliability of blockchain records in pursuit of the consensus protocol's incentive model. However, there are a variety of attack vectors against blockchain networks, depending on how the consensus protocol is designed, and there have been attacks that resulted in transactions being removed, reorganized and replaced. Blockchain records also seemingly become less reliable with fewer node operators in a network. Implementing security policies (e.g., notarizing blocks, penalty systems) may be considered during a blockchain's lifecycle, but these tend to reduce the speed of transaction processing. Companies may consider employing their own monitoring activities to understand and respond to risks.

The security of blockchain technology comes from data being cryptographically linked through the chain of blocks. This key feature of blockchain also poses challenges to reversing bad transactions or fixing unreliable smart contracts caused by user error or poor design. Companies may need to rely on the internal controls operated by smart contract owners, and consider employing their own internal controls around initiating transactions and recovering unintended transactions with smart contracts.
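The cryptographic linking described above can be sketched in a few lines of Python. This is a toy hash chain using SHA-256; the field names are assumptions and no real blockchain's block format is implied:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's full contents, including the hash of its predecessor."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash: str, data: str) -> dict:
    return {"prev_hash": prev_hash, "data": data}

def chain_is_valid(chain: list) -> bool:
    """Valid only if every block references the hash of the block before it."""
    return all(curr["prev_hash"] == block_hash(prev)
               for prev, curr in zip(chain, chain[1:]))

genesis = make_block("0" * 64, "genesis")
b1 = make_block(block_hash(genesis), "Alice pays Bob 5")
b2 = make_block(block_hash(b1), "Bob pays Carol 2")
assert chain_is_valid([genesis, b1, b2])

# Altering any earlier block changes its hash and breaks every later link,
# which is why bad transactions cannot simply be edited out after the fact.
b1["data"] = "Alice pays Bob 500"
assert not chain_is_valid([genesis, b1, b2])
```

The final assertion captures both sides of the trade-off in the paragraph above: tampering is immediately detectable, and by the same token an erroneous transaction cannot be quietly reversed in place.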

Each public blockchain is supported by a community of developers, who may be individuals, groups of individuals or formal organizations. Their effectiveness is key to blockchain reliability. The community promotes adoption of its blockchain, responds to feedback from users and node operators, performs research and development on the source code, organizes version updates, and performs source code testing and monitoring. The community is often organized around a non-profit foundation that provides governance over the blockchain. While commonly known digital assets may have active foundations, there are many abandoned digital assets whose foundation dissolved or failed to form. Companies may consider employing their own monitoring activities to assess a community or foundation's effectiveness, integrity, talent and version updates.


Unique risks and challenges emerge when considering financial reporting in this ecosystem. This article has only begun to touch on challenges with the reliability of blockchain technology and its records – a key consideration for assessing whether a company's digital assets even exist (the World Economic Forum's Blockchain Deployment Toolkit, launched last month, discusses these considerations in more detail). Still, it is increasingly clear that each of the elements that support reliability deserves further analysis, and many questions remain about these systems of internal controls. Certainly, authoritative bodies around the world are proposing frameworks, issuing guidance and providing feedback for companies and service providers. Companies will only fully benefit from adopting digital assets by being proactive and savvy in meeting these complex challenges, with the help of effectively designed internal controls over financial reporting.

Contributors to this report include: Tim Davis, Risk & Financial Advisory Principal, Deloitte & Touche LLP; Brian Hansen, Audit & Assurance Partner, Deloitte & Touche LLP; Peter Taylor, Audit & Assurance Senior Manager, Deloitte & Touche LLP


Origin Stories RNA, DNA, and a Dose of Imagination – Discovery Institute

Posted: at 12:58 am

Editors note: Eric Anderson is an attorney, software company executive, and co-author of the recently released book, Evolution and Intelligent Design in a Nutshell.

A new paper in Nature seeks to shed light on life's origins from non-life on the early Earth – that is, on abiogenesis. Several outlets have picked up the story, including New Scientist. Phys.org explains that the research, led by Cambridge scientists, "shows for the first time how some of the building blocks of both DNA and RNA could have spontaneously formed and co-existed in the primordial soup on Earth."

My purpose is not to question the research protocol or the results. No doubt the work is impeccable and the results as described. I am willing to assume that the researchers recreated early Earth conditions and demonstrated realistic prebiotic synthesis of deoxyadenosine, deoxyinosine, cytidine, and uridine. (Of course, early Earth conditions continue to be debated.)

This is fascinating, and it contributes to our understanding of these potential building blocks of RNA and DNA. What we do have to watch out for is how research results get interpreted within a naturalistic framework and sometimes get presented without the critical context.

The paper mentions the ongoing debate about whether abiogenesis began with RNA as the first carrier of information, giving rise to DNA (the RNA World hypothesis), or with RNA and DNA together at the same time. One of the challenges with the RNA World has been the lack of a plausible scenario to produce DNA from RNA under realistic prebiotic conditions. In addition, the genetic systems we are familiar with in biology today use both DNA and RNA, prompting some to suggest that a contemporaneous rise of DNA and RNA simplifies the abiogenesis scenario and, as the authors argue, streamlines the eventual genetic takeover of homogeneous DNA from RNA as the principal information-storage molecule.

On the other hand, one of the challenges facing this RNA/DNA World scenario is that it would have required the abiotic synthesis of building blocks of both DNA and RNA in close proximity and, preferably, under the same geochemical scenario. It is in this area that the researchers seek to make a contribution, demonstrating that certain building blocks of both RNA and DNA can be synthesized with prebiotically plausible reactions and substrates.

They conclude that RNA and DNA building blocks may have coexisted on the early Earth before the emergence of life. This is notable and seems, under the assumptions granted above, a reasonable conclusion.

However, the mere existence of RNA and DNA building blocks tells us little about the formation of life, with all its interrelated systems, including the information required to build and maintain and self-replicate that life.

The ultimate goal of abiogenesis researchers is to explain the emergence of life from non-life. John Sutherland, leader of the Cambridge group and one of the most accomplished origin-of-life researchers, observes: "Our work suggests that in conditions consistent with shallow primordial ponds and rivulets there was a mixed genetic system with RNA and DNA building blocks co-existing at the dawn of life. This fulfills what many people think is a key precondition for the spontaneous emergence of life on Earth."

This is an impressive statement, and if we are not careful we might get the impression that there was in fact a mixed genetic system with RNA and DNA building blocks under early Earth conditions. We might get the further impression that, because a "key precondition" has been fulfilled, we are moving steadily closer to explaining the spontaneous emergence of life on Earth. However, a closer look is warranted.

First, there was no mixed genetic system. In fact, there wasn't any kind of functional system, just molecules interacting under the normal tug-and-pull of physics and chemistry. Furthermore, a genetic system requires a special kind of functional capability. It requires not just DNA and RNA, but the information content that ebbs and flows between them. Meaning, a symbolic code, pursuant to which a string of nucleotides in the primordial DNA would be interpreted and translated into another state – the symbolic and the immaterial being coaxed into the concrete and the material. Such a system necessarily involves information processing, with not just bare DNA storage at hand, but also retrieval and translation mechanisms. Such a system arising through unguided natural processes has never been observed, and we have theoretical and practical reasons to conclude it never will be.

Sutherland does not claim that their research demonstrates the existence of a primitive genetic system. He is more thoughtful in his wording, saying only that the authors' work "suggests" a mixed genetic system. But that suggestion gains traction only in the context of an a priori story that has been assumed: namely, that there must have been some kind of simple, naturally occurring system that eventually gave rise to the spontaneous emergence of life on Earth. Take away that assumption, and the suggestion of an early primitive genetic system arising through unguided natural processes evaporates.

The authors demonstrated the production of potential building blocks. Not all the building blocks needed for RNA and DNA as we know them, mind you, but some of them. They then suggest that this reduced number of building blocks could have served as a kind of alternative genetic alphabet. The headline from Cambridge teases us with this very possibility: a "primitive genetic alphabet based on RNA and DNA."

Yet from an information-theoretic standpoint, having building blocks for a potential alphabet is unremarkable. I can build an information storage system out of sticks and stones.

There are multiple elephants in this room: What is the functional context of that system? Where does the information come from? How is it retrieved? How is it interpreted and acted upon? What is the overall meaning and purpose of that information? How does it become directed toward a comprehensive, integrated, cohesive end, producing a living organism? The mere existence of molecules that could hypothetically serve as physical carriers of hypothetical symbols as part of a hypothetical primordial alphabet tells us nothing in response to these questions.

The researchers understand this, no doubt. The problem comes when we fail to appreciate that there is a fundamental difference in kind between having some building blocks on the one hand, and putting those building blocks to use in constructing a sophisticated functional system on the other. It is a critical distinction, and we might begin to suspect that no amount of research into the former can solve the latter. These are fundamentally different kinds of issues.

Are, then, research efforts toward forming the building blocks of RNA and DNA putting us on the path to a naturalistic abiogenesis explanation? I'll do you one better than building blocks. Let's stack the deck heavily in favor of the naturalistic story.

I'll give you all the nucleotides you want, formed and activated and ready to line up into nice polynucleotide chains. I'll even give them to you in just the right proportions for optimum effect. I'll give you the most hospitable environment for your fledgling structures to form. I'll throw in whatever type of energy source you want: just the right amount to facilitate the chemical reactions; not too much to destroy the nascent formations. I'll spot you that all these critical conditions occur in the same location and at the same time. Shoot, I'll even step in to prevent the inevitable interfering cross-reactions. I'll also miraculously make your fledgling chemical structures immune from both their natural rate of breakdown and from breakdown by other reagents in the environment.

Every one of the foregoing gifts represents an open question and a challenge to the abiogenesis account. Now, what do you think the next step is? What is your theory about how life forms?

There is no naturalistic answer. But taking time to at least think through the myriad problems with abiogenesis should be a required exercise for anyone proposing a naturalistic scenario.

Sutherland mentions that the group's research fulfills a key precondition for the spontaneous emergence of life on Earth. What was fulfilled? His point is that many people believe life had to start with both DNA and RNA together, rather than through the traditional RNA World scenario in which RNA slowly gave rise to DNA.

He is right. There are good reasons for thinking life started with both DNA and RNA. That's both because (as noted above) no one has been able to propose a plausible scenario that would produce DNA from RNA under real-world conditions, and because the genetic systems we are familiar with in biology indeed include both DNA and RNA.

Yet the recent research did not demonstrate that such a genetic system can realistically arise by itself. Nor did it bring us closer to demonstrating the spontaneous emergence of life on Earth.

After all, beyond a genetic system, to keep life going on the early Earth another capability is required: self-replication. Many origin-of-life researchers view self-replication as the key goal, relying on the Darwinian process of mutation and natural selection to build the rest of the systems for the first organism. Yet as I demonstrate in my engineering analysis in Evolution and Intelligent Design in a Nutshell, self-replication is not the starting point for the origin of life. Instead, it lies at the end of an extremely complicated, sophisticated, and specified engineering process (p. 84).

When we analyze what is required for the emergence of the simplest form of life... no, when we step back further and analyze what is required for that oft-imagined precursor to life, a single self-replicating molecule... we find that the abiogenesis story has set forth on the wrong path from square one.

Make no mistake. This impressive work was performed by some of the most capable researchers in this field. The authors deserve recognition for it. Origin-of-life research doesn't get any better.

Yet for anyone tempted to think we are on our way to explaining the origin of life in naturalistic terms, what do we really have? Well, we now have building blocks on the early Earth that, potentially, could be used in some later process as part of the production of DNA and RNA. Alternatively, as the authors suggest, perhaps the building blocks could have served as the initial information carrier and then later turned into modern RNA and DNA.

Either way, these would then need to be carefully strung together into information-rich molecules, based on a symbolic code. That in turn would require multiple machines and interrelated systems to access, interpret, and utilize the information, which would further require a suite of hundreds of genes, components, and systems to survive in a prebiotic environment and self-replicate. None of these steps is plausible by purely natural means. All of them speak to the need for intelligent input.

Turning from the first life to a wonder of contemporary technology, consider a self-driving car. A marvel of engineering, a self-driving car has copious systems and sub-systems and components, made of numerous materials, organized in just the right way, and humming along under the control of sophisticated software dancing through carefully designed circuit boards and integrated circuit chips made from silicon. Yet, based on the engineering analysis I lay out in our book, we know that a self-replicating system requires much more than even all this.

Discovering how building blocks of RNA and DNA may have formed has about as much relevance to the spontaneous emergence of life on Earth as the discovery of naturally occurring silicon does to the spontaneous emergence of a self-driving car.

These building blocks have no interest in turning into an organism nor any tendency to do so. Instead, they will do what they always do, drifting in the primordial soup, suffering their normal rate of breakdown and unwilling to control their natural urges to cross-react with other interfering chemicals. All the while they are unaware of and indifferent to the fact that they might have, in the right hands, served as carriers for a simple genetic alphabet.

The modesty of the result, mere building blocks, is due to the fact that this is all we can get from unguided natural processes. The result, kind of, perhaps, with some imagination, might look like something that could be a precursor to life.

Forever lacking are the key elements required for life: coordinated activity, coherent function, regulated control, meaningful information, purposeful intent. These things require intelligence.

See original here:

Origin Stories RNA, DNA, and a Dose of Imagination - Discovery Institute


The Sword on their musical evolution, gear epiphanies and secret pedalboard weapons – Guitar World

Posted: at 12:58 am

Fresh off one hiatus only to find themselves in a pandemic-enforced second, one that has grounded their forthcoming tour with Primus, The Sword have arrived at a surreal point in their history.

But what better time to delve into that history than the present? The Sword have just spent over a year putting together Chronology 2006-2018 (CD) and Conquest of Kingdoms (vinyl), deluxe box sets collecting their biggest riffers and a cornucopia of unreleased material, and guitarists J.D. Cronise and Kyle Shutt join us on a socially distanced conference call to retrace their steps through riffs, guitars and tones gone by.

The Sword were formed in Austin, Texas, back in 2006. Hopped-up on the heady riff-work of Led Zeppelin, Black Sabbath, Sleep et al, they soon found an international audience on the back of the loose-leaf metal of their debut LP, Age of Winters, with the anthemic Freya growing the buzz around the band after it was picked up for the videogame Guitar Hero.

In an increasingly digitized culture, The Sword reveled in anachronisms - large stacks, bigger riffs and epic arrangements that interpreted the heavy metal idiom at its most narratively focused. There were concept records, such as Warp Riders, with album art and songwriting that suggested The Sword were holdovers from the '70s and just released from the amber.

"We had a bunch of material, different songs, different riffs, and I had this story in my head," recalls Cronise. "It was a matter of figuring out which music we had that would fit which parts of the story. It was like scoring a film that didn't exist. It was like, 'What would work for a chase scene?'"

The Sword found a lot of early success, and that divided people. But that didn't matter, so long as people were talking about them. There were high-profile tours, with Metallica, Guns N' Roses, Kyuss.

When drummer Trivett Wingo was replaced by Santiago Vela III, aka Jimmy, in 2010/11, their songwriting evolved again. "I think with knowing what Trivett was capable of and switching to Jimmy, having a different style, it just kind of changed the approach that I took when writing riffs," says Cronise.

"Especially with Used Future," adds Shutt. "A lot of those songs came from beat patterns that Jimmy laid down and was just put into a Dropbox folder. So you were literally writing riffs to beats that already exist."

I didn't do any solos in the early days, and I think I did all my tracks in three-and-a-half hours or something like that

Over the course of seven studio albums, The Sword weaned themselves off the mega-watt approach to explore different styles of gain. Much of their musical evolution was seeded in the pursuit of new vintage tones. There's a neat symmetry in that: the sense that no matter what stage they were at in this evolution, The Sword were always born too late somehow.

You have just put together an anthology, so let's start back at the beginning. What are your memories of recording Age of Winters?

J.D. Cronise: It was very D.I.Y. We recorded it at Brian Richie's house, basically in his room, and various places around the house. We used the whole house as our recording studio and it was done piecemeal when we had time to do it.

What gear were you using back then?

Cronise: That's a good question. I wish that I had documented that sort of thing better. I know for a lot of my tracks on that record I was using an SG Faded. I don't remember what year it was but it was one of those ones with the real thin finish.

It had the crescent moon inlays and that was one of the only SGs I ever owned, and I played that just 'cos it was really easy to play. I found that I could track really fast with it. I used that and a Les Paul Custom, but mostly the SG.

Kyle Shutt: I didn't have a whole lot of gear at the time. I was very young when the band first started. I had a Guild S-100. I think it was a '71. It was in a natural finish with the clear pickguard, and I used that for all my tracks. My amp, I think, was Brian's Laney AOR 100, and I used a [Maxon DS-830] Distortion Master, and that was on the whole time.

I didn't do any solos in the early days, and I think I did all my tracks in three-and-a-half hours or something like that... In Brian's kitchen! It was an absurdly short amount of time for the amount of copies that album sold. [Laughs]

Of course, Freya blew up and you were everywhere. Had things changed by the time you recorded Gods of the Earth?

The thing about The Sword was that it was divisive from day one. You either loved it or you just totally hated it and wouldn't shut up about how much you hated it

Cronise: Well, we did that in a little studio in Austin. That was the most challenging for me personally, because me and the engineer did a lot of the producing and the mixing of the record. I just remember a lot of long hours in the studio staring at the mixing board.

Shutt: We bit off a lot. "Let's do it to tape! Let's do it ourselves to tape!" It was a lot of learning technical things.

Cronise: I had the most to do with how that record sounds and it is the one that a lot of people think sounds the worst! [Laughs] The thing is, I listen to it and it's like, "Yep!" But that's exactly what I wanted it to sound like. I think the cymbals are too loud but I turned them up on purpose. Love it or hate it, that's how it was intended.

Shutt: It's a lot of people's favorite. The thing about The Sword was that it was divisive from day one. You either loved it or you just totally hated it and wouldn't shut up about how much you hated it. It worked to our advantage. It was great.

Dividing opinion is good though. At least people have an opinion.

Shutt: Totally!

Cronise: Way better than being ignored.

How do you look back at your playing and composition then?

Cronise: Well, it is weird. It's not necessarily what I would do now but it is exactly what I wanted to do then. But yeah, listening to those old arrangements and riffs and stuff... Wow! It was just so proggy, really extended arrangements with so many parts and so many little nuances and...

There was no being five minutes late with Metallica. You know exactly what is expected of you and where you need to be every night

Shutt: ...so many notes! When we did Warp Riders we had just come off some crazy world tour with Metallica, when three years before we were working in a video store and a photocopy store. We dove head-first into the music and that was all we did. I hate to say it was overthought, but it was prog, it was thinking man's music.

Touring with Metallica must have made you such better players.

Cronise: I am sure it did. Yeah, the regimentation of it and having to be ready to go every night at an exact time. There was no being five minutes late. I definitely think it drilled us hard. It was very predictable. You know exactly what is expected of you and where you need to be every night, but at the same time it is also nice to play a show in a club where nobody is going to lose enough money to buy a Mercedes if you're not onstage at exactly 10 o'clock!

Shutt: That's a real quote! [laughs]

Tell us about how your relationship with gain changed over the years. You've dialed it back a bit.

Cronise: I wanted to explore things that were a little more classic-sounding, vintage-sounding. We really did the super-high-gain thing on the first few records and I just got into playing more vintage-style guitars and playing vintage fuzz pedals.

We tracked Used Future with little tube combo amps. You can't make every record with Orange stacks and dimed Big Muffs. You've got to try some different things.

Shutt: Like J.D. said, you can only do so much diming your amp and going for it, making these crazy towering arrangements, super-metal stuff.

When a lot of people didn't like the fact that High Country or Used Future were toned down a little bit, they'd come see us live and it was just as heavy as the old stuff. When you compared our old material with the new it all made sense. I think that everyone who saw us realized that.

On the first few records, we went in with all of our gear, what we would take to a show, and that is how we recorded, probably, the first four records

Cronise: On the first few records, we went in with all of our gear, what we would take to a show, and that is how we recorded, probably, the first four records. But on the last couple of records, we have worked with producers who have had a little bit more input, different perspectives, and to us, like Kyle said, they are tools to take advantage of.

And that in and of itself is a process of discovery.

Cronise: It's a little bit more experimental. It was definitely a refreshing thing to record with these little manageable Princetons rather than a full half-stack in the studio.

Is that a sign of maturity, appreciating the power of diming a small amp...?

Cronise: Well! I don't wanna use the M word too much but... [Laughs]

Yet the Princeton is a refined choice...

Cronise: Yeah, I used the Princeton and I have gotten really into a Vox Tone Bender. I used that for a lot of my stuff, to the point where I had to buy one after we were done with the record. They are not cheap but I loved it so much that I had to have one.

Well, this is a good time to ask about gear epiphanies. What have been the big ones for you?

Shutt: When I was younger I thought you had to have this specific amp, and that is what you need to rock, or this guitar or this pedal. I don't know when I realized this but it sort of became apparent that you need to be able to have a discernible style no matter what you are playing through.

That's when I got more into pickups, swapping out things, and trying to get away from Orange and Marshall - no disrespect to them but everyone plays them. I needed an amp, so I got Brooks Harlan from the band War on Women to build me two versions of this 50-watt amp [Big Crunch One Knob] that I just love. It is louder than God.

It is not the '70s anymore. There are very few arena rock bands left with huge crews and semi-trucks to carry around their Marshall stacks. It is just not a thing anymore

And I partnered up with Reverend Guitars to make my own signature model. I really tried to focus on what my sound is and what I needed to have that. I wouldn't call it an epiphany but it was a gradual evolution from having played a lot of gear.

Cronise: Yeah, I'd say the same. It's my tastes changing and realizing that you don't need these certain things to play a certain type of music. I think that many of the gear companies use that to market the gear. In heavy music, bigger is better, all that sort of thing, and I think people are learning nowadays that it is not.

It is not the '70s anymore. There are very few arena rock bands left with huge crews and semi-trucks to carry around their Marshall stacks. It is just not a thing anymore. I mean, it is for a very few bands but those bands won't be around forever and people are realizing you don't need that stuff to have a loud guitar sound or have a heavy guitar tone.

Well, stage volume has changed for a start. You can't play so loud.

Cronise: I remember a couple of years ago seeing Uncle Acid & the Deadbeats here in Asheville and they sounded amazing! It was so heavy and loud, and you couldn't even see their amps, they were so small. You had to get right to the front of the stage to see what they were playing.

They were playing little combos and it sounded great. It wasn't necessarily an epiphany but it was a demonstration: "See? You don't need big stacks to be heavy." A lot of those European bands have a good appreciation of vintage gear and how to get good sounds.

With me, my personal gear philosophy is: I don't just use vintage gear, but if the vintage gear is better I'd rather use that, and if there is a point in time when a certain thing was best, I'm going to use that one. I use a mix of all kinds of stuff but there is a bit of me that thinks, "Well, they did a great one in the '70s, so why am I going to buy a shitty one from the '90s?"

I am a big fan of one-knob pedals but I can deal with three if necessary. One or two is ideal

I like things that are simple. If the new thing is new and shiny but has more buttons and more functions and a bunch of crap I don't need, I would rather have the MkI or MkII version with no extra knobs, no extra features.

Like, if there are more than three knobs on a guitar pedal, you start to get a bit anxious...

Cronise: Precisely. Any more than three... You are absolutely right. I am a big fan of one-knob pedals but I can deal with three if necessary. One or two is ideal.

Were there any pedalboard secret weapons?

Cronise: Yes, it's called The Pedal, Hush Systems' The Pedal, not to be confused with the Rocktron Hush pedal. It is the predecessor, the two-channel version. I always used it. It squashes noise; it's like the best noise gate ever, and it adds a little compression and I have always had it on my board and I probably always will. I have two, three or four of them in a closet somewhere.

Shutt: And they are so cheap. Every time we would see one we would just buy one.

Cronise: They are discontinued. You used to be able to get them in discount pedal bins for 50 bucks or something. I have hoarded them. That's my secret weapon. Everything else I have used has pretty much changed.

As far as fuzz or distortion, I've gone through a ton of those. I have always used a [MXR] Phase 90. That has been my tried and trusted phaser forever, but that's not a main part of my sound. But Hush Systems' The Pedal, that's the secret sauce.

These days it takes more for me to write a song than it did 10 or 15 years ago. I have to feel like that song needs to be written

Shutt: Especially back in the day when we were playing loud, with so much gain. We had to get rid of that gnarly feedback for clean stops and it was just the best. It wasn't really a gate; it was a hiss-reducer or something like that?

Okay, last one. We started at the beginning, but what's next for The Sword?

Cronise: Well, man, it's kind of in limbo until things get moving again. We've been on this hiatus for a couple of years and kinda laying low, and this anthology came together. It just so happened that when this came out we got this offer to do the Primus tour. It wasn't necessarily a premeditated return to action.

Will there be new material?

Cronise: Yeah, maybe. These days it takes more for me to write a song than it did 10 or 15 years ago. I have to feel like that song needs to be written. These days I am a little more thoughtful. "Does this song need to exist? Am I saying something I haven't said before? Am I expressing something that no one has expressed better before me?" So I don't write as prolifically as I used to. If new Sword material materializes at some point then it will! [Laughs]

Shutt: Haha!

Cronise: But, that being said, I do miss playing live. Even though we might not have a record coming out in the foreseeable future, I still like playing shows. I am still proud of our work, which is why we are putting out this anthology to begin with. We still want to play live together because that is where all the fun is.

Shutt: Totally.

Chronology 2006 - 2018 and Conquest of Kingdoms are out on June 19 via Craft Recordings.

View post:

The Sword on their musical evolution, gear epiphanies and secret pedalboard weapons - Guitar World


The Next Evolution and Debate in the Cloud – Traders Magazine

Posted: at 12:58 am

The cloud.

It's the buzzword that keeps on buzzing. Years after the concept became a hot topic in mainstream circles, it is still the subject of many debates within the financial services industry, and everyone, from the vendors hawking transformative solutions to the leeriest buy-side firms, seems to have an opinion.

That's not to say that nothing has changed, however. A Refinitiv report last fall predicted that the financial services industry will spend 48% of its collective IT budget on cloud services in 2020, up from 41% in 2019. Around the same time, a survey by IHS Markit and WBR revealed that 80% of buy-side respondents will use the cloud for data management by the end of 2020. Those figures are a far cry from the early days of the cloud, in which tepid adoption rates and cautious technologists dominated the conversation.

"Historically, one of the common misconceptions from the buy side with respect to technology strategy is that the perceived risks of the cloud can outweigh the benefits," said Hoony Youn, CTO at MackeyRMS, a provider of SaaS-delivered research management software for investment managers. "I think that has changed dramatically over the past few years. As the buy side has embraced the cloud, many of the older perceptions, that the cloud is less reliable and less secure, have proven to be quite the opposite."

This greater understanding has led to a fruitful period for vendors of solutions that leverage the cloud, to the extent that the cloud is now a selling point: it connotes sleekness, modernity and a rejection of legacy technology. But the industry's education is not complete, and many of the continued misconceptions boil down to a single point: not all cloud systems are created equal.

One of the key divides in how these providers use the cloud is single tenancy versus multitenancy. Conversations with several individuals in the space revealed that this is no minute detail; in fact, it can have dramatic effects on efficiency, security and cost. In order to fully take advantage of rapid innovation, firms must dive deeper and learn what is really under the hood of their cloud solutions, and this is a natural place to start.

Another way to look at the single-tenancy-versus-multitenancy question is: how many different active versions of the software exist? With a single-tenant solution, every client has its own independent database and instance of the software; with a multitenant solution, a single instance of the software serves the entire client base.
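The distinction can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the two deployment models as just described, not any vendor's actual architecture: under single tenancy each client carries its own instance and version to patch separately, while a multitenant upgrade reaches every client at once.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the two deployment models (illustrative only).

@dataclass
class SingleTenantDeployment:
    """Each client gets its own software instance and database."""
    client: str
    version: str                                    # upgraded per client
    database: dict = field(default_factory=dict)

@dataclass
class MultiTenantDeployment:
    """One shared instance serves all clients; data is keyed by tenant."""
    version: str                                    # one version for everyone
    databases: dict = field(default_factory=dict)   # tenant_id -> data

    def upgrade(self, new_version: str) -> None:
        # A single release updates the entire client base at once.
        self.version = new_version

# Single-tenant: three clients means three instances to patch separately.
single = [SingleTenantDeployment(c, "1.0") for c in ("A", "B", "C")]

# Multitenant: one instance, one upgrade, all tenants current.
multi = MultiTenantDeployment("1.0", {"A": {}, "B": {}, "C": {}})
multi.upgrade("1.1")
```

The trade-off the article goes on to describe falls out of this shape: the per-client instances are easier to isolate and customize, while the shared instance makes upgrades and cost-sharing trivially uniform.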

While there are diverse offerings within these groups, proponents of single-tenant solutions typically tout their security, reliability and controllability, while those in favor of multitenancy point to its cost-effectiveness, efficiency and wider ecosystem.

"Deployment models are becoming quite complex given the importance of the cloud in the capital markets," said Brad Bailey, Research Director at Celent. "For those most concerned with control, means of access to the cloud and extension of their own infrastructure, single tenancy is often the choice. For those that are looking for maximum cost savings, ease of deploying changes across multiple instances and rapid cycles of updates, multitenancy would likely be the choice."

So, on one side there is a model that can better align with a firm's existing technology strategy, while the other offers a number of key efficiencies at the cost of control. In many ways, it is a reincarnation of the old debates over the cloud itself.

And just like those old debates, the topic elicits strong feelings from those in the space. Sean Sullivan, CRO at LiquidityBook, a provider of SaaS-based, multitenant order management solutions, believes that due to their reliance on professional services, single-tenant solutions pose a business risk for vendor and client alike.

"While they are wrapped in the cloud and all the rhetoric that comes with that, single-tenant solutions share a lot of similarities with legacy systems," said Sullivan. "They ignore the fundamental benefits of the cloud. When every client requires bespoke upgrades, localized customizations and patch releases, processes can become backed up and the product inevitably suffers, and this is especially true during times of disruption. Multitenant systems are far nimbler. Every single new functionality we build is released to the same instance, so our clients don't have to worry about racking up professional services fees and our ability to service them is never compromised."

LiquidityBook has ridden its model to significant success in recent years: in 2019, the firm posted a 33% year-over-year improvement in revenues. 2020 is shaping up to be even more successful, says Sullivan, especially with many legacy and single-tenant providers facing hurdles related to the COVID-19 pandemic.

"Our experience with multitenant trading technology has been very positive," said Ben Searle, CIO at LevinEasterly Partners, a private asset management firm specializing in value investing. "Deploying LiquidityBook's LBX Buy Side required heavy collaboration and some extensive back-end work at the outset, but very little since then. We now have access to a high-performance product and benefit from regular updates while avoiding legacy processes and servicing fees."

This process is typical of multitenant solutions: while they often require a lengthy bespoke integration up front, every subsequent upgrade occurs automatically, saving time and resources in the long run. Meanwhile, with every new client the multitenant provider signs, the product is built out and improved, and the existing client base reaps the rewards. It's all part of what Sullivan calls an "ecosystem of good ideas."

"With a multitenant solution, clients should expect better ROI via shared resources, databases and applications," said Tom Pfister, Vice President of Global Product Strategy at Confluence Technologies, which provides a suite of reporting solutions to large asset managers. "Hard costs are shared more efficiently across many tenants. In the same way, clients are effectively sharing their brainpower, making their innovations available to other tenants and driving the entire industry forward."

Forging the Future

Looking ahead, one thing seems clear: the cloud is here to stay. As firms get smarter on the topic and refine their strategies, multitenant solutions appear likely to continue to gain market share on their single-tenant counterparts.

"Our multitenant model provides a huge range of advantages for our business," said Pfister. "It greatly simplifies upgrades and client onboarding and reduces the costs of hardware, IT and product delivery. These are passed on to our clients, so everyone involved benefits."

Youn echoed this sentiment, calling multitenancy "a win-win situation" that delivers "a tremendous amount of operational and cost efficiency."

Of course, single-tenant solutions are not going anywhere, and many firms that emphasize autonomy or want to have their data completely isolated from other customers will continue to go that route. In some ways, these offerings bridge the gap between monolithic legacy platforms and dynamic multitenant systems, filling an important void in the market.

But no matter what route they choose to go, the bottom line is that firms must do their homework in order to fully realize the benefits of the cloud. Simply being in the cloud is just the beginning, and exactly how each system leverages the cloud could have a major impact. Every firm's needs are different, but by educating themselves and asking the right questions, firms will be better equipped to continue the cloud conversation no matter how long it lasts.

The following article appeared on Traders Magazine in June 2019

The Cloud Services Providers' Next Play?

The times they are a-changin'.

Though Bob Dylan originally sang those words in 1964, they've probably never been truer than they are today, at least when it comes to technology.

Name the industry, and in almost all cases technology-led disruptors have entered and seriously shaken up the status quo. Amazon's effect on retail is obviously the prime example, but Uber's impact on the livery industry, Netflix's impact on content distribution and even Casper's impact on the mattress industry are others. Perhaps the only major industry to have *not* seen a significant impact due to the entry of new, tech-savvy disruptors is the capital markets.

But in the opinion of some, it is only a matter of time before the major global banks, most of which operate aging, overly complex technology stacks, see the same type of competition that other industries have felt.

Complexity Without Cloud

Currently, trying to decipher a problem in a large bank's capital markets infrastructure is like an archaeological dig, according to Tony Amicangioli, Founder and CEO of capital markets infrastructure technology provider HPR (formerly known as Hyannis Port Research).

"As you cut through the crust of technological layers you might find that in the early 2000s whoever was head of IT thought C++ was the solution to everything. Then they were replaced by a Java devotee, and the most recent layers were built by a true believer in Python and their team. Multiply this across different regions (Europe, AsiaPac and the Americas) and again by asset class, with some solutions developed in-house, others by vendors, and it becomes unmanageable. Fixing it seems impossible. How are you going to take all these mission-critical systems off-line and rebuild from scratch?" said Amicangioli.

Firms like Amazon, Apple, Google and Microsoft (the cloud natives, if you will) don't have this problem. Their foundational technology leverages the Cloud natively and is built to scale. At their essence, they are very simple. Take Google, which is fundamentally a distributed, de facto operating system. Ten Google apps may do ten very different things, but the underlying technology is highly unified. This efficiency and unification have enabled the rapid ascent of these companies.

Cloud technology is not just about moving applications to a central providers data center, says Amicangioli. In our view, effectively leveraging the Cloud is about unifying your systems and simplifying your development approaches within a singular and universal computing environment. We see all technology frameworks ultimately destined for this since it almost always represents the most cost-effective, responsive and performant environment.

Amicangioli knows. He was previously CTO of Tower Researchs Lime Brokerage subsidiary, one of the early winners in the race to sponsor HFT providers. Before that he founded one of the first Cloud startups, and early in his career served as an executive at hardware powerhouse Juniper Networks.

Ive been in business long enough to watch the likes of Sun Microsystems and Digital Equipment, once giants in the technology industry, vanish, he said. At times I get the same uneasy feeling when looking at some of todays largest banks given where their technology is in comparison to the entrants we all know are coming. Without a doubt, there will be winners and losers.

Trend Toward Simplicity

Its no secret that the stacks run by virtually all of the tier 1 global banks are in need of a major overhaul, and that is not news to the executives at those firms, either, said a former bank technology executive who asked to remain nameless. I think the failure to act has been driven in part by a belief that the regulatory and capital moats that we have in the capital markets will be sufficient deterrents, but I cant see that being the case for too much longer.

To that point, being in a highly regulated industry is no protection. Consider Oscar, the health insurance startup that last year attracted a $375 million investment from Google parent company Alphabet. Oscars proven that its possible to enter a complex market and gain competitive advantages by building a cloud-native technology stack from the ground up, resulting in better efficiency, better margins and ultimately better service to customers.

Thats not to say that Wall Street has been oblivious to the threat from tech-led competitors, however. JP Morgan CEO Jamie Dimon warned in a 2015 investor letter that Silicon Valley is coming, and two years ago noted that the bank is spending nearly $10B per year on remaining competitive. Other banks have similarly robust technology budgets, but far too much of that spending is to fund finger-in-the-dyke projects, and not nearly enough goes to the strategic ones, says the tech exec.

Some see the capital markets dividing into two camps when it comes to technology: the haves and the have-nots. With the stakes so high, CTOs fear of making mistakes can induce a form of paralysis, ushering their firms into the have-not category through inaction. As cloud adoption becomes the key driver of success, the industry will see these two camps transform into winners and losers, these people say.

Its tempting to see the devolution of large financial services firms as inevitable, but really, its not too late yet, said Amicangioli. The bank that locks on to this reality, finding efficiencies through technology, is going to do very well. The potential upside for being the first to truly get this right really cant be overstated.

said Youn, CTO at MackeyRMS, a provider of SaaS-delivered research management software for investment managers. "I think that has changed dramatically over the past few years. As the buy side has embraced the cloud, many of the older perceptions, that the cloud is less reliable and less secure, have proven to be quite the opposite."

This greater understanding has led to a fruitful period for vendors of solutions that leverage the cloud, to the extent that the cloud is now a selling point: it connotes sleekness, modernity and a rejection of legacy technology. But the industry's education is not complete, and many of the continued misconceptions boil down to a single point: not all cloud systems are created equal.

One of the key divides in how these providers use the cloud is single tenancy versus multitenancy. Conversations with several individuals in the space revealed that this is no minor detail; in fact, it can have dramatic effects on efficiency, security and cost. In order to fully take advantage of rapid innovation, firms must dive deeper and learn what is really under the hood of their cloud solutions, and this is a natural place to start.

E Pluribus Unum

Another way to look at the single tenancy versus multitenancy question is: how many different active versions of the software exist? With a single tenant solution, every client has its own independent database and instance of the software; with a multitenant solution, a single instance of the software serves the entire client base.
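The difference can be made concrete with a small sketch. The following is a hypothetical, minimal Python illustration (the table name, tenant names and helper function are invented for the example, not drawn from any vendor's actual product): in the multitenant model, one shared database and one code path serve every client, with rows scoped by a tenant identifier; in the single-tenant model, each client gets its own isolated database and, in practice, its own running instance of the application.

```python
import sqlite3

# Multitenant: one shared database serves every client; rows are
# scoped by a tenant_id column and a single code path serves all tenants.
shared = sqlite3.connect(":memory:")
shared.execute("CREATE TABLE orders (tenant_id TEXT, symbol TEXT, qty INTEGER)")
shared.execute("INSERT INTO orders VALUES ('fund_a', 'AAPL', 100)")
shared.execute("INSERT INTO orders VALUES ('fund_b', 'MSFT', 50)")

def orders_for(tenant):
    # Every query must filter on tenant_id: in a multitenant system the
    # isolation boundary is enforced in software, not by infrastructure.
    return shared.execute(
        "SELECT symbol, qty FROM orders WHERE tenant_id = ?", (tenant,)
    ).fetchall()

# Single-tenant: each client gets its own independent database, and an
# upgrade must be rolled out to every instance separately.
tenant_dbs = {name: sqlite3.connect(":memory:") for name in ("fund_a", "fund_b")}
for db in tenant_dbs.values():
    db.execute("CREATE TABLE orders (symbol TEXT, qty INTEGER)")
tenant_dbs["fund_a"].execute("INSERT INTO orders VALUES ('AAPL', 100)")

print(orders_for("fund_a"))  # each tenant sees only its own rows
```

The trade-off discussed below falls out of this structure: the shared instance means one upgrade reaches everyone at once, while the per-tenant databases offer stronger isolation at the cost of duplicated maintenance.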

While there are diverse offerings within these groups, proponents of single-tenant solutions typically tout their security, reliability and controllability, while those in favor of multitenancy point to its cost-effectiveness, efficiency and wider ecosystem.

"Deployment models are becoming quite complex given the importance of the cloud in the capital markets," said Brad Bailey, Research Director at Celent. "For those most concerned with control, means of access to the cloud and extension of their own infrastructure, single tenancy is often the choice. For those that are looking for maximum cost savings, ease of deploying changes across multiple instances and rapid cycles of updates, multitenancy would likely be the choice."

So, on one side there is a model that can better align with a firm's existing technology strategy, while the other offers a number of key efficiencies at the cost of control. In many ways, it is a reincarnation of the old debates over the cloud itself.

And just like those old debates, the topic elicits strong feelings from those in the space. Sean Sullivan, CRO at LiquidityBook, a provider of SaaS-based, multitenant order management solutions to the buy and sell sides, believes that due to their reliance on professional services, single-tenant solutions pose a business risk for vendor and client alike.

"While they are wrapped in the cloud and all the rhetoric that comes with that, single-tenant solutions share a lot of similarities with legacy systems," said Sullivan. "They ignore the fundamental benefits of the cloud. When every client requires bespoke upgrades, localized customizations and patch releases, processes can become backed up and the product inevitably suffers, and this is especially true during times of disruption. Multitenant systems are far nimbler. Every new piece of functionality we build is released to the same instance, so our clients don't have to worry about racking up professional services fees and our ability to service them is never compromised."

LiquidityBook has ridden its model to significant success in recent years; in 2019, the firm posted a 33% year-over-year improvement in revenues. 2020 is shaping up to be even more successful, says Sullivan, especially with many legacy and single-tenant providers facing hurdles related to the COVID-19 pandemic.

"Our experience with multitenant trading technology has been very positive," said Ben Searle, CIO at LevinEasterly Partners, a private asset management firm specializing in value investing. "Deploying LiquidityBook's LBX Buy Side required heavy collaboration and some extensive back-end work at the outset, but very little since then. We now have access to a high-performance product and benefit from regular updates while avoiding legacy processes and servicing fees."

This process is typical of multitenant solutions: while they often require a lengthy bespoke integration up front, every subsequent upgrade occurs automatically, saving time and resources in the long run. Meanwhile, with every new client the multitenant provider signs, the product is built out and improved, and the existing client base reaps the rewards. It's all part of what Sullivan calls an "ecosystem of good ideas."

"With a multitenant solution, clients should expect better ROI via shared resources, databases and applications," said Tom Pfister, Vice President of Global Product Strategy at Confluence Technologies, which provides a suite of reporting solutions to large asset managers. "Hard costs are shared more efficiently across many tenants. In the same way, clients are effectively sharing their brainpower, making their innovations available to other tenants and driving the entire industry forward."

Forging the Future

Looking ahead, one thing seems clear: the cloud is here to stay. As firms get smarter on the topic and refine their strategies, multitenant solutions appear likely to continue to gain market share on their single-tenant counterparts.

"Our multitenant model provides a huge range of advantages for our business," said Pfister. "It greatly simplifies upgrades and client onboarding and reduces the costs of hardware, IT and product delivery. These savings are passed on to our clients, so everyone involved benefits."

Youn echoed this sentiment, calling multitenancy "a win-win situation" that delivers "a tremendous amount of operational and cost efficiency."

Of course, single-tenant solutions are not going anywhere, and many firms that emphasize autonomy or want to have their data completely isolated from other customers will continue to go that route. In some ways, these offerings bridge the gap between monolithic legacy platforms and dynamic multitenant systems, filling an important void in the market.

But no matter what route they choose to go, the bottom line is that firms must do their homework in order to fully realize the benefits of the cloud future. Simply being in the cloud is just the beginning, and exactly how each system leverages the cloud could have a major impact. Every firms needs are different, but by educating themselves and asking the right questions, the industry will be better equipped to continue the cloud conversation no matter how long it lasts.

The following article appeared in Traders Magazine in June 2019:

The Cloud Services Providers' Next Play?

"The times they are a-changin'."

Though Bob Dylan originally sang those words in 1964, they've probably never been truer than they are today, at least when it comes to technology.

Name the industry, and in almost all cases technology-led disruptors have entered and seriously shaken up the status quo. Amazon's effect on retail is the prime example, but Uber's impact on the livery industry, Netflix's impact on content distribution and even Casper's impact on the mattress industry are others. Perhaps the only major industry to have *not* seen a significant impact from the entry of new, tech-savvy disruptors is the capital markets.

But in the opinion of some, it is only a matter of time before the major global banks, most of which operate aging, overly complex technology stacks, see the same type of competition that other industries have felt.

Complexity Without Cloud

Currently, trying to decipher a problem in a large bank's capital markets infrastructure is like an archaeological dig, according to Tony Amicangioli, Founder and CEO of capital markets infrastructure technology provider HPR (formerly known as Hyannis Port Research).

"As you cut through the crust of technological layers, you might find that in the early 2000s whoever was head of IT thought C++ was the solution to everything. Then they were replaced by a Java devotee, and the most recent layers were built by a true believer in Python and their team. Multiply this across different regions (Europe, AsiaPac and the Americas) and again by asset class, with some solutions developed in-house and others by vendors, and it becomes unmanageable. Fixing it seems impossible. How are you going to take all these mission-critical systems offline and rebuild from scratch?" said Amicangioli.

Firms like Amazon, Apple, Google and Microsoft, the cloud natives, if you will, don't have this problem. Their foundational technology leverages the Cloud natively and is built to scale. At their essence, they are very simple. Take Google, which is fundamentally a distributed, de facto operating system. Ten Google apps may do ten very different things, but the underlying technology is highly unified. This efficiency and unification have enabled the rapid ascent of these companies.

"Cloud technology is not just about moving applications to a central provider's data center," says Amicangioli. "In our view, effectively leveraging the Cloud is about unifying your systems and simplifying your development approaches within a singular and universal computing environment. We see all technology frameworks ultimately destined for this, since it almost always represents the most cost-effective, responsive and performant environment."

Amicangioli knows. He was previously CTO of Tower Research's Lime Brokerage subsidiary, one of the early winners in the race to sponsor HFT providers. Before that he founded one of the first Cloud startups, and early in his career he served as an executive at hardware powerhouse Juniper Networks.

"I've been in business long enough to watch the likes of Sun Microsystems and Digital Equipment, once giants in the technology industry, vanish," he said. "At times I get the same uneasy feeling when looking at some of today's largest banks, given where their technology is in comparison to the entrants we all know are coming. Without a doubt, there will be winners and losers."

Trend Toward Simplicity

"It's no secret that the stacks run by virtually all of the tier 1 global banks are in need of a major overhaul, and that is not news to the executives at those firms, either," said a former bank technology executive who asked to remain nameless. "I think the failure to act has been driven in part by a belief that the regulatory and capital moats that we have in the capital markets will be sufficient deterrents, but I can't see that being the case for too much longer."

To that point, being in a highly regulated industry is no protection. Consider Oscar, the health insurance startup that last year attracted a $375 million investment from Google parent company Alphabet. Oscar has proven that it's possible to enter a complex market and gain competitive advantages by building a cloud-native technology stack from the ground up, resulting in better efficiency, better margins and ultimately better service to customers.

That's not to say that Wall Street has been oblivious to the threat from tech-led competitors, however. JP Morgan CEO Jamie Dimon warned in a 2015 investor letter that "Silicon Valley is coming," and two years ago noted that the bank is spending nearly $10 billion per year on remaining competitive. "Other banks have similarly robust technology budgets, but far too much of that spending is to fund finger-in-the-dyke projects, and not nearly enough goes to the strategic ones," says the tech exec.

Some see the capital markets dividing into two camps when it comes to technology: the haves and the have-nots. With the stakes so high, CTOs' fear of making mistakes can induce a form of paralysis, ushering their firms into the have-not category through inaction. As cloud adoption becomes the key driver of success, the industry will see these two camps transform into winners and losers, these people say.

"It's tempting to see the devolution of large financial services firms as inevitable, but really, it's not too late yet," said Amicangioli. "The bank that locks on to this reality, finding efficiencies through technology, is going to do very well. The potential upside for being the first to truly get this right really can't be overstated."

See the article here:

The Next Evolution and Debate in the Cloud - Traders Magazine


Steph Curry vs Steve Nash comparison: Evolution of the NBA – Franchise Sports

Posted: at 12:58 am

Guards Stephen Curry and Steve Nash both helped revolutionise the point guard position in the NBA, bringing a new aspect of basketball into play.

Nash was drafted in the first round, 15th overall, by the Phoenix Suns back in 1996; Curry was also drafted in the first round, seventh overall, by the Golden State Warriors in 2009.

Steve Nash would go on to play 18 seasons in the NBA with the Phoenix Suns, Dallas Mavericks and Los Angeles Lakers. During his career he won two MVP awards, made the All-Star team eight times and the All-NBA team seven times, and led the league in assists five times.

Despite never winning an NBA Championship, Nash was inducted into the Basketball Hall of Fame in 2018 alongside such greats as Jason Kidd and Ray Allen.

Curry has currently played 11 seasons in the NBA, all of them as a Golden State Warrior. In this time Curry has won three NBA Championships and two MVPs, been a six-time All-Star and six-time All-NBA player, and led the league in both scoring and steals in the 2015-2016 season.

Both Steve Nash and Stephen Curry won their two MVP (Most Valuable Player) awards back to back.

Nash won his during the 2004/2005 and 2005/2006 seasons while Curry won his during the 2014/2015 and 2015/2016 seasons.

During their back to back MVP seasons both Curry and Nash shot over 50% from two-point range and over 40% from three-point range.

When it comes to shooting, Curry and Nash are similar. Both are reliable from the free throw line, with Nash shooting 90.4% and Curry sitting at 90.6% for their careers.

Even though Curry has taken nearly three times as many three-pointers as Nash did, they are still close in this area as well. While Curry can shoot contested threes, Nash was not as comfortable doing so, and had to use his playmaking ability to create more open shots.

Curry sits on a 43.5% career three-point percentage, with Nash only just below him at 42.8%, achieved in an era when far fewer threes were shot and 35% was considered a top shooting percentage.

Unsurprisingly, they are similar from two-point range as well. Curry takes more two-point shots per game, 8.9 to Nash's 7.4, but Nash has the slightly higher percentage, shooting 51.8% to Curry's 51.5%.
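Pulling together the career figures quoted above, a quick Python sketch makes the comparison explicit (the numbers are only those cited in this article, and the stat labels are our own shorthand):

```python
# Career shooting figures as quoted in the article above.
stats = {
    "Nash":  {"FT": 90.4, "3P": 42.8, "2P": 51.8, "3PA_per_game": 3.2},
    "Curry": {"FT": 90.6, "3P": 43.5, "2P": 51.5, "3PA_per_game": 8.2},
}

# Across every shooting split the gap is under one percentage point.
for split in ("FT", "3P", "2P"):
    gap = abs(stats["Curry"][split] - stats["Nash"][split])
    print(f"{split}: Curry {stats['Curry'][split]}% vs Nash {stats['Nash'][split]}% (gap {gap:.1f})")

# Where they truly differ is volume: Curry attempts far more threes per game.
ratio = stats["Curry"]["3PA_per_game"] / stats["Nash"]["3PA_per_game"]
print(f"Three-point attempts per game: {ratio:.1f}x Nash's rate")
```

The output underlines the article's point: as pure shooters the two are nearly indistinguishable, and the real evolution is in how often the shot is taken.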

In some ways this amounts to a passing of the torch: Nash was primarily a 2000s player and Curry a 2010s player, so Curry's era succeeds Nash's dominant spell.

Nash shot the three-pointer at a very high percentage, but he did not attempt many per game, averaging 3.2 for his career with a career high of 4.7.

Then, when Curry came into the NBA, he built on the foundations of Nash's game, taking what Nash did with three-point shooting to the next level with 8.2 attempts per game. Curry has also had three seasons in which he averaged over 10 three-point attempts per game.

Their playmaking may differ in the eyes of some fans; given the eras in which Nash and Curry have played, Nash had more opportunity to show his playmaking ability.

Through Steve Nash's time in the NBA, the three-point shot was not seen as being as important as it is now. This led to Nash being more of a playmaker than a scorer like Curry is now, even though he was a top scorer.

With Curry it is the other way around. In today's era the three-point shot is used more than ever, which gives Curry less chance to show his playmaking ability; he has been taking 10.7 threes per game since the 2015/16 season, when three-pointers started to be shot far more often.

Because of this, Curry's playmaking goes more under the radar, even though his career average of 6.6 assists per game shows he can move the ball around.

Originally posted here:

Steph Curry vs Steve Nash comparison: Evolution of the NBA - Franchise Sports


Evolution of the BMW X6 – CarWale

Posted: at 12:58 am

Before the BMW X6 came into existence, there were coupes and there were SUVs. Some mad lads in the Bavarian carmaker's R&D department decided to merge these two unlikely body styles into one, the top bosses at BMW gave it the green flag, and thus was born the X6. The carmaker coined the term SAC, meaning Sports Activity Coupe, but what the X6 really did was pioneer an all-new body style: the coupe-SUV. Let us take a detailed look at the evolution of the BMW X6.

The X6 Concept

At the 2007 Frankfurt Motor Show, BMW introduced a hunky chunk of metal with the top half of a sports coupe and the bottom half of an SUV. This satyr-like concept received a mixed bag of reactions, but since it was almost production-ready, BMW went ahead and did what it intended to.

First Generation: E71 (2008-2014)

Work on the E71-gen X6 began shortly after the E70-gen X5 in the early 2000s. It was based on the previous-gen 5 Series and 6 Series platform, developed under the leadership of Peter Tuennermann. Pierre Leclercq's lead design was approved in 2006, and the first-gen X6 broke cover at the 2008 Detroit Motor Show. The first-gen model was sold globally until 2014.

BMW also introduced a high-performance X6 M guise with a 550bhp 4.4-litre V8. xDrive all-wheel drive was standard, and the other powertrains included 3.0-litre straight-six petrol and diesel engine options. There was also an X6 ActiveHybrid on sale for a short while.

Second Generation: F16 (2015-2019)

The second-generation X6 debuted at the 2014 Paris Motor Show. By that time, the E71 had already found 2.5 lakh takers globally. It was based on the F15-gen X5, from which its styling was borrowed too, but BMW maintained the sloping roofline while offering a slightly larger and more practical boot. It also got more engine options and was sold in more markets across the globe than before.

The impractical yet mental X6 M (F86) was one of the quickest vehicles in its class. By this time, many other manufacturers had started to copy the X6's formula: Mercedes-Benz did it with the GLC/GLE Coupe, the Toyota C-HR is another example, and even the Audi Q8 and Lamborghini Urus carry similar coupe-SUV styling.

Third Generation: G06 (2019-present)

Last year, BMW debuted the third-generation X6. It is now based on the same CLAR platform as the 7 Series and the X7, and it is not only bigger but also more spacious inside and more powerful under the skin. And for the first time, it gets an illuminated kidney grille, which is fancy, to say the least. It was launched in India earlier this week in the xDrive40i guise, but we expect more powerful derivatives like the M50i and the full-blown X6 M to be introduced later.

Originally posted here:

Evolution of the BMW X6 - CarWale
