This week's notable links
This is my regular digest of links and media I found notable over the last week. Did I miss something? Let me know!
Calm Down—Your Phone Isn’t Listening to Your Conversations. It’s Just Tracking Everything You Type, Every App You Use, Every Website You Visit, and Everywhere You Go in the Physical World
[Jonathan Zeller at McSweeney's]
"We do not live in some tech dystopia in which our smartphones clandestinely use their mics to pick up every word we say and then feed us commercial messages based on them. The truth is simpler and not at all alarming: your phone only seems to be listening to you because it’s collecting data about every word you type, every website you visit, and, through GPS tracking, everywhere you go in the physical world."
No notes: this is pretty good.
[Link]
Peter Capaldi says posh actors are smooth, confident and tedious
[Vanessa Thorpe in The Guardian]
“Art is about reaching out. So I think it’s wrong to allow one strata of society to have the most access.”
This is an older article, but it resonated with me so much that I wanted to share it immediately.
This is so important, and a sign of what we've lost:
“I went [to art school] because the government of the day paid for me to go and I didn’t have to pay them back. There was a thrusting society then, a society that tried to improve itself. Yes, of course, it cost money. But so what? It allowed people from any kind of background to learn about Shakespeare, or Vermeer.”
A culture where only the rich are afforded the space, training, and platform to make art is missing the voices that make it special.
The same goes for other spaces: newsrooms where only the wealthy can serve as journalists cannot accurately represent the people who depend on them. Technology without class diversity is myopic. Above all else, a culture of rich people is boring as hell.
Art school - like all school - should be free and available to everyone. It's tragic that it's not. We all lose out, regardless of our background.
[Link]
Fighting bots is fighting humans
"I fear that media outlets and other websites, in attempting to "protect" their material from AI scrapers, will go too far in the anti-human direction."
I've been struggling with this.
I'm not in favor of the 404 Media approach, which is to stick an auth wall in front of your content, forcing everyone to register before they can load your article. That isn't a great experience for anyone, and I don't think it's sustainable for a publisher in the long run.
At the same time, I think it's fair to try and prevent some bot access at the moment. Disallowing AI agents in your robots.txt - although, as recent news has shown, perhaps not as effective as it ought to be - seems like the right call to me.
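For anyone who wants to try this, here's a minimal sketch of what that robots.txt approach looks like. The user-agent tokens below (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google's AI training) are real published tokens as of this writing, but the list changes over time, and - as noted above - compliance is voluntary on the crawler's part:

```text
# Ask known AI training crawlers to stay out
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everyone else - human browsers, search engines - is unaffected
User-agent: *
Disallow:
```

Note that `Google-Extended` only controls AI training use; it doesn't affect ordinary Google Search indexing.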
Clearly an AI agent isn't a human. But for ad hoc queries - where an agent retrieves content from a website in direct response to a user's request - it is acting on behalf of a human. Is it a browser, then? Maybe? If it is, we should just let it through.
It's accessing articles as training data that I really take issue with (as well as the subterfuge of not always advertising what it is when it accesses a site). In these cases, content is copied into a corpus in a manner that's outside of its licensing, without the author's knowledge. That sucks - not because I'm in favor of DRM, but because often the people whose work is being taken are living on a shoestring, and the software is run by very large corporations who will make a fortune.
But yes: I don't think auth walls, CAPTCHAs, paywalls, or any added friction between content and audience are a good idea. These things make the web worse for everybody.
Molly's post is in response to an original by Manu Moreale, which is also worth reading.
[Link]
Law enforcement is spying on thousands of Americans’ mail, records show
[Drew Harwell at the Washington Post]
"Postal inspectors say they fulfill [requests from law enforcement to share information from letters and packages] only when mail monitoring can help find a fugitive or investigate a crime. But a decade’s worth of records, provided exclusively to The Washington Post in response to a congressional probe, show Postal Service officials have received more than 60,000 requests from federal agents and police officers since 2015, and that they rarely say no."
I wish this were surprising. Something similar seems to have gone on in every trusted facet of American life: from cell phone providers to online library platforms to license plate readers on the roads. It's all part of an Overton window shift into pervasive surveillance that has been ongoing for decades.
Senator Ron Wyden is right to be blunt:
“These new statistics show that thousands of Americans are subjected to warrantless surveillance each year, and that the Postal Inspection Service rubber stamps practically all of the requests they receive.”
We shouldn't accept it. And yet, by and large, we do.
[Link]
The Future of Fashion Commerce Is a Designer's AI Bot Saying You Look Great and Your Personal AI Bot Sifting Through the Bullshit
"The best commerce platforms will be constantly grooming you, priming you, shaping you to buy. The combination of short-term and long-term value that leads to the optimal financial outcome for the business."
I think this is inevitably correct: the web will devolve into a battle between different entities who are all trying to persuade you to take different actions. That's already been true for decades, but it's been ambient until now; generative AI gives it the ability to literally argue with us. Which means we're going to need our own bots to argue back.
Hunter's analogy of a bot that's supposedly in your corner calling bullshit on all the bots trying to sell things to you is a good one. Except, who will build the bot that's in your corner? Why will it definitely be so? Who will profit from it?
What a spiral this will be.
[Link]
Why does moral progress feel preachy and annoying?
[Daniel Kelly and Evan Westra in Aeon]
"Many genuinely good arguments for moral change will be initially experienced as annoying. Moreover, the emotional responses that people feel in these situations are not typically produced by psychological processes that are closely tracking argument structure or responding directly to moral reasons."
This is a useful breakdown of why arguments for social progress encounter so much friction, and why the first emotional response may be to roll our eyes. It's all about our norm psychologies - and some people have stronger reactions than others.
As the authors make clear here, people who are outside of the mainstream culture for one reason or another (immigration, belonging to a minority or vulnerable group, and so on) already feel friction from the prevailing norms being misaligned with their own psychology. For people whose psychology is aligned with those norms, change is that much harder.
But naming it is at least part of the battle:
"Knowing this fact about yourself should lead you to pause the next time you reflexively roll your eyes upon encountering some new, annoying norm and the changes its advocates are asking you to make. That irritation is not your bullshit detector going off."
Talking about these effects, and understanding their origins, helps everyone better understand their reactions and get to better outcomes. Social change is both necessary and likely to happen regardless of our reactions. It's always better to be a person who celebrates progressive change rather than someone who creates friction in the face of it.
[Link]