This week's notable links
This is my regular digest of links and media I found notable over the last week. Did I miss something? Let me know!
During Meta's earnings call, Mark Zuckerberg said that Facebook and Instagram data is used to train the company's AI models.
“On Facebook and Instagram, there are hundreds of billions of publicly shared images and tens of billions of public videos, which we estimate is greater than the Common Crawl dataset and people share large numbers of public text posts in comments across our services as well.”
He's playing to win: one unstated competitive advantage is that Meta actually has the legal right to use training data generated on its own services. Most users probably aren't aware of it, but by posting content there, they grant the company rights to use it. If OpenAI runs afoul of copyright law, Meta's tech has a path forward.
It's a jarring thought, though. I'm certainly not keen on a generative model being trained on my son's face, for example. I'm curious how many users will feel the same way. #AI
‘The Messenger’ Implosion Once Again Shows The Real Problem With U.S. Journalism Is Shitty Management By Visionless, Fail-Upward Brunchlords
"If you’ve spent any time in journalism, it’s completely wild to think about what a small team of smart, hungry journalists and editors could do with $50 million. It’s enough to staff a team of hard-nosed ProPublica-esque journalists for the better part of the next decade."
While we're here, might I suggest donating to ProPublica so those hard-nosed journalists can stick around to do exactly that? #Media
A lovely interview with Winnie Lim, whose deeply human, beautifully-written blog is one of my absolute must-reads.
This spoke to me, except substitute Oxford for Singapore: "I felt very alienated and lonely as a young person in the 1990s. It was incredible to discover the internet and know there is an entire world out there, that there are actually many people living diverse lives that were not visible or encouraged in Singapore."
Winnie and I both worked at Medium at different times, and yet we both have a very strong own-your-own-domain philosophy. Her blogging story is really similar to mine, even if the content of her blog is very much her own.
Just a complete pleasure to read. #Culture
"OpenAI’s GPT-4 only gave people a slight advantage over the regular internet when it came to researching bioweapons, according to a study the company conducted itself." Uh, great?
"On top of that, the students who used GPT-4 were nearly as proficient as the expert group on some of the tasks. The researchers also noticed that GPT-4 brought the student cohort’s answers up to the “expert’s baseline” for two of the tasks in particular: magnification and formulation." Um, splendid?
"However, the study’s authors later state in a footnote that, overall, GPT-4 gave all participants a “statistically significant” advantage in total accuracy." Ah, superb? #AI
"It should be obvious that any technology prone to making up facts is a bad fit for journalism, but the Associated Press, the American Journalism Project, and Axel Springer have all inked partnerships with OpenAI."
The conversation about AI at the Online News Association conference last year was so jarring to me that I was angry about it for a month. As Tyler Fisher says here, it presents an existential risk to the news industry - and beyond that, following a FOMO-driven hype cycle rather than building things based on what your community actually needs is a recipe for failure.
As Tyler says: "Instead of trying to compete, journalism must reject the scale-driven paradigm in favor of deeper connection and community." This is the only real path forward for journalism. Honestly, it's the only real path forward for the web, and for a great many industries that live on it. #AI
Josh Marshall on The Messenger: "It really is like if you were on a parachute jump and some cocky idiot just jumped out of the plane with no chute saying he had it covered and, obviously, plummeted to the ground and died."
Beyond the well-deserved snark, this is actually a great breakdown of what went wrong here, and why businesses like The Messenger don't work anymore. The scale-advertising-social equation is obsolete.
Forgive me if it sounds like I'm banging some sort of drum, but you really do need to build deeper relationships through community, get to know the people you're serving, and build something that meets their unmet needs incredibly well. A content farm ain't it. #Media
Transport for London have redesigned the Tube map in concentric circles as part of a promotional partnership with a phone company. Just one of the many, many ways public transit is desperately grasping for funds all over the world.
Here in Philly, SEPTA is working to rename stations based on corporate sponsorships. The Tube has actually done this before, renaming Bond Street to Burberry Street for London fashion week. That renaming (like these new maps, presumably) was temporary; SEPTA's corporate station names would be permanent.
I don't blame transit authorities for trying to make up for budget shortfalls however they can. But it's also sad. Public transit is an important public good; it's a real shame that we can't seem to fully fund it from the public purse. The point is not for transit to be profitable, it's to provide real infrastructure that lifts everybody up. #Society
"Research has shown that the more readers know about our reporters, the more likely they are to understand the rigors of our journalistic process and trust the results." So the NYT enhanced its journalist profiles to make them more human.
People trust people, not brands. The design makes sense: it deepens the relationship between a reader and the journalist whose work they're interacting with.
I think these are just the first steps of that humanization, though. Newsrooms need to transition from thinking about "audience" to "community": a one-way broadcast relationship to the kind of two-way conversation the internet was built for. #Media
A really great piece about blameless postmortems and how the psychological safety to tell the truth leads to fewer mistakes and - in the case of the aviation industry - fewer lives lost.
"It’s often much more productive to ask why than to ask who. [...] A just organizational culture recognizes that a high level of operational safety can be achieved only when the root causes of human error are examined; who made a mistake is far less important than why it was made."