Neil Young’s Spotify tiff is a reminder that tech giants always win – Euronews

The opinions expressed in this article are those of the author and do not represent in any way the editorial position of Euronews.

As a listener, you might not care. But as an artist, it can be a tough pill to swallow to know that an algorithm, as opposed to human preference, might be behind your success or failure, Jonah Prousky writes.

Neil Young and Joni Mitchell begrudgingly returned their music to Spotify last month, two years after leaving the platform in protest of its largest podcaster, Joe Rogan.

According to Young, Rogan was using the platform to spread misinformation about the COVID-19 pandemic.

"They can have Rogan or Young. Not both," Young wrote to his manager at Warner Music Group.

It turns out, Spotify can have both.

And, no matter what you think of Young's protest (or boycott, or whatever it was), his clash with Spotify is a reminder that tech giants have a funny way of getting what they want, and that resistance from artists is usually futile.

Many creators have long been frustrated with platforms like Spotify and YouTube due to the algorithms they employ, which in part drive views and streams, and by extension, pay.

Most creators, however, don't have the clout to issue ultimatums, nor the money to leave these platforms.

While some artists on Spotify make a decent living, there is a far, far greater volume of artists, literally millions of them, who are struggling to make ends meet from their streaming royalties, according to Rolling Stone.

Also, without an established audience of one's own, artists are pretty much beholden to Spotify and YouTube for views.

According to Forbes, Spotify holds a dominant 30.5% of the music streaming market, more than double its nearest competitor, Apple Music, which has a 13.7% share. YouTube is virtually unrivalled.

Who cares, you might say; Spotify is beloved. And hasn't the company done a lot to democratise music?

It's true, the company cut out a lot of the red tape associated with the legacy music business by giving new artists a direct line (and business model) for reaching listeners.

That ethos is even enshrined in the company's mission statement, which is to "unlock the potential of human creativity by giving a million creative artists the opportunity to live off their art and billions of fans the opportunity to enjoy and be inspired by it".

The company has done much to advance that mission. It's capable of launching music careers in ways that never would have been possible in decades past. An artist's streams, and by extension earnings, can skyrocket almost overnight if their songs make it onto one of the platform's most-listened-to playlists.

It can quite literally be the difference between driving Uber and making music on the side, and earning $200,000 (€187,880) in streaming royalties.

So any attempt to criticise the platform ought to be mindful of what it's done for some musicians. But, in many ways, the platform's algorithm has homogenised music tastes around a small number of top artists, making it harder for new musicians to gain traction.

Algorithms", wrote Scott Timberg in a column for Salon, "are about driving you closer and closer to what you already know. And instead of taking you toward what you want to listen to, they direct you toward slight variations of what youre already consuming.

What people are already consuming is just a small subset of Spotify's artist base, whose tunes gobble up our collective attention.
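The narrowing effect Timberg describes falls naturally out of similarity-based recommendation. The sketch below is a hypothetical, simplified illustration, with made-up track names and feature vectors rather than anything from Spotify's actual system: because candidate tracks are ranked by closeness to what a listener already plays, near-duplicates of their existing taste rise to the top.

```python
# A minimal, hypothetical sketch of a content-based recommender.
# Assumes each track is described by a small feature vector (e.g. tempo,
# energy, acousticness). Illustrative only; not Spotify's actual system.
import numpy as np

# Hypothetical catalogue: track name -> feature vector
catalogue = {
    "indie_folk_a": np.array([0.20, 0.30, 0.90]),
    "indie_folk_b": np.array([0.25, 0.35, 0.85]),
    "arena_rock_a": np.array([0.80, 0.90, 0.10]),
    "ambient_a":    np.array([0.10, 0.05, 0.70]),
}

def cosine(u, v):
    # Cosine similarity between two feature vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def recommend(listening_history, k=2):
    # Average the listener's history into a single "taste" vector...
    taste = np.mean([catalogue[t] for t in listening_history], axis=0)
    # ...then rank unheard tracks by similarity to that vector.
    candidates = [t for t in catalogue if t not in listening_history]
    ranked = sorted(candidates, key=lambda t: cosine(taste, catalogue[t]), reverse=True)
    return ranked[:k]

# A listener who only plays indie folk gets more of the same.
print(recommend(["indie_folk_a"]))  # ['indie_folk_b', 'ambient_a']
```

Real recommenders are far more sophisticated, but the basic incentive, surfacing whatever is most similar to past listening, is the one the critics above are describing.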

"In 2013, the top 1% of artists accounted for over three-quarters of all revenue from recorded music sales. In that year, 20% of songs on Spotify had never been streamed," wrote Ludovic Hunter-Tilney for the Financial Times.

Maybe that's always been the case, you'll wonder. I mean, anyone who's seen The X Factor knows that not every artist is worthy of our attention. But the decision of what and who to listen to used to be a human one.

As a listener, you might not care, especially if you think the algorithm has a good handle on your taste. But as an artist, it can be a tough pill to swallow to know that an algorithm, as opposed to human preference, might be behind your success or failure.

So, say you're a musician or content creator who feels the algorithm has treated you unfavourably. What are you going to do, leave? Boycott?

Well, some are. A growing wave of artists and content creators are leaving Spotify and YouTube, often for platforms like Substack and Patreon, where their earnings aren't beholden to the algorithm.

Platforms like Substack and Patreon allow creators to own their audience: earnings there aren't tied to views; rather, audience members pay creators directly and the platforms take a small cut.

Still, that move is really only viable for established artists like Young and Mitchell who have audiences.

So, if you're just starting out as a musician or content creator, you really have no choice but to dig in your heels and hope the algorithm likes your stuff.

Jonah Prousky is a Canadian freelance writer based in London. His work has appeared in several leading publications including the Canadian Broadcasting Corporation (CBC), Toronto Star, and Calgary Herald.

At Euronews, we believe all views matter. Contact us at view@euronews.com to send pitches or submissions and be part of the conversation.

Elon Musk Thinks Cannibals Are Invading the United States – Futurism

In his latest racist outburst, multihyphenate billionaire Elon Musk joined other conservative pundits in accusing Haitian migrants of being "cannibals," arguing that they shouldn't be allowed to move to the US.

The news comes after political unrest in the island nation came to a head this week. On Monday, Haiti's prime minister Ariel Henry agreed to resign if other Caribbean nations were to form a transitional government on behalf of the country. The statement angered Haitians, triggering mass protests, with tires being burned in the streets.

Meanwhile, Musk took to his social media platform X to further unverified and sensationalist claims of cannibalism arising out of the conflict, as NBC reports.

Case in point: today, in response to the report, the mercurial CEO tweeted a link to a video that claimed to show evidence of cannibalism in Haiti.

The video was promptly taken down by X, Axios reports, which stated that the video had violated its rules.

In other words, even Musk's own social media company isn't willing to support his increasingly racist anti-immigration posts.

Ever since Musk took over the company formerly known as Twitter, hate speech has flourished on the platform. The billionaire has spread his own share of misinformation as well, from bogus COVID-19 data to false information about the Israel-Gaza conflict.

Musk has also made plenty of his own racist remarks on his platform. In January, he argued that Black students at Historically Black Colleges and Universities (HBCUs) have lower IQs and therefore shouldn't become pilots, ridiculous claims that were met with horror by civil rights groups.

Most recently, the billionaire took aim at the people of Haiti, playing into debunked tropes.

Over the weekend, Musk tweeted "cannibal gangs..." in response to a clip by right-wing commentator Matt Walsh about unrest in Haiti.

"Civilization is fragile," he wrote in response to another since-deleted video, which claimed to show footage of a "cannibal gang eating body parts."

This week, Musk joined right-wing commentator Ian Miles Cheong, who had argued on X that there were "cannibal gangs in Haiti who abduct and eat people."

"If wanting to screen immigrants for potential homicidal tendencies and cannibalism makes me 'right wing,' then I would gladly accept such a label!" an incensed Musk wrote in a reply to a separate post in which Cheong complained about the NBC report. "Failure to do so would put innocent Americans in [sic] mortal risk," he added, failing to provide any evidence for his outlandish claims.

As experts have since pointed out, the posts were likely the result of gang propaganda campaigns designed to stoke fear, as NBC reports. While it's still possible that the odd gang leader is indeed capable of such ghoulish acts, generalizing these claims is not only misleading (a State Department spokesperson told the broadcaster that it had received no credible reports of cannibalism) but also clearly plays into racist tropes that date back to colonial times.

There's also the issue of basic human decency. Through no fault of its residents, Haiti is in crisis; instead of wondering how the country he immigrated to could help, Musk is punching down at the most extreme examples of social dysfunction he can find online.

"It is very disturbing that Elon Musk would repeat these absurdities that do, indeed, have a long history," Yale University professor of French and African diaspora studies Marlene Daut told NBC.

In short, it's yet another troubling sign of Musk's descent into extreme right-wing circles, as he uses his considerable following and social media network to further conspiracy theories and racist disinformation.

"A whole population is getting blamed for what some psycho gang members are doing," Washington-based lawyer and moderator of the subreddit r/Haiti, told NBC. "It is racist. It is dehumanizing."

Supreme Court Will Decide What Free Speech Means on Social Media – Gizmodo

The Supreme Court is hearing two cases on Monday that could set new precedents around free speech on social media platforms. The cases challenge two similar laws from Florida and Texas, respectively, which aim to reduce Silicon Valley censorship on social media, much like Elon Musk has done at X in the last year.

After four hours of opening arguments, Supreme Court Justices seemed unlikely to completely strike down Texas and Florida's laws, according to Bloomberg. Justice Clarence Thomas said social media companies were engaging in censorship. However, Chief Justice John Roberts questioned whether social media platforms are really a public square. If not, they wouldn't fall under the First Amendment's protections.

At one point, the lawyer representing Texas shouted out, "Sir, this is a Wendy's." He was trying to prove a point about public squares and free speech, but it didn't make much sense.

The cases, Moody v. NetChoice and NetChoice v. Paxton, both label social media platforms as a digital public square and would give states a say in how content is moderated. Both laws are concerned with conservative voices being silenced on Facebook, Instagram, TikTok, and other social media platforms, potentially infringing on the First Amendment.

"Silencing conservative views is un-American, it's un-Texan and it's about to be illegal," said Texas Governor Greg Abbott on X in 2021, announcing one of the laws the Supreme Court is debating on Monday.

"If Big Tech censors enforce rules inconsistently, to discriminate in favor of the dominant Silicon Valley ideology, they will now be held accountable," said Florida Governor Ron DeSantis in a 2021 press release announcing his new law.

NetChoice, a coalition of tech's biggest players, argues that these state laws infringe on a social media company's right to free speech. The cases have made their way to the United States' highest court, and a decision could permanently change social media.

The laws could limit Facebook's ability to censor pro-Nazi content on its platform, for example. Social media companies have long been able to dictate what kind of content appears on their platforms, but the topic has taken center stage in the last year. Musk's X lost major advertisers following a rise in white supremacist content that appeared next to legacy brands, such as IBM and Apple.

NetChoice argues that social media networks are like newspapers, and that they have a right to choose what appears on their pages, litigator Chris Marchese told The Verge. The New York Times is not required under the First Amendment to let Donald Trump write an op-ed, and NetChoice argues the same goes for social media.

NetChoice's members include Google, Meta, TikTok, X, Amazon, Airbnb, and other Silicon Valley staples beyond social media platforms. The association was founded in 2001 to make "the Internet safe for free enterprise and free expression".

Social and political issues have consumed technology companies in recent months. Google's new AI chatbot Gemini was accused of being racist against white people last week. In January, Mark Zuckerberg, sitting before Senate leaders, apologized to a room of parents who said Instagram contributed to their children's suicides or exploitation.

Both of these laws were created shortly after Twitter, now X, banned Donald Trump in 2021. Since then, Musk has completely revamped the platform into a "free speech absolutist" site. Similar to Governors Abbott and DeSantis, Musk is also highly concerned with so-called liberal censorship on social media.

The Supreme Court's decision on these cases could have a meaningful impact on how controversy and discourse play out on social media. Congress has faced criticism for its limited role in regulating social media companies in the last two decades, but this decision could finally set some ground rules. It's unclear which way the Court will lean on these cases, as the issues have little precedent.
