I wouldn’t call this a “what a week in entertainment!” media week, but at least one outlet ran a “special emergency” newsletter, so clearly we had news. Instead of that big story, or the continued musical chairs at WarnerMedia, I’ve had my eye on a few stories that add up to a bigger one.
Most Important Story of the Week – YouTube Battles Child Pornography/Predators
This was a contender for the most important story last week, but got bumped since it isn’t really a “one-time” story. It’s slightly evergreen. Since we invented video on the internet, we’ve had these problems. Slate had an article on Periscope and child predators back in 2017. So I’m not just picking on YouTube even though I put it in the headline; any social media platform with video or images eventually has to deal with predators targeting teenagers and children.
Let’s stay on that for one quick moment.
“Predators”
“Targeting”
“Children”
I just paused to think, “Is that too strong?” It really isn’t. It just describes what is happening. If you doubt this, read the excellent Wired article that set off the furor. Clearly, this is a problem. (Again, pair it with the Slate article above and doubtless this problem happens on multiple social platforms.)
The best defense of YouTube (and others) has been something along the lines of, “Well, you know, they do have to handle billions of videos and trillions of interactions.” This is fair, yet I feel like I need a Matt Levine-esque analogy to explain why this isn’t really a defense.
Let’s say you owned a park. For some reason, you were able to monetize parks and turn them into private places. And since this is the tech age, say you turned one park into 10,000 parks. All sorts of kids started playing there, mostly with their parents, but sometimes you convinced parents the kids were safe in the parks without them. Then all of a sudden a bunch of creepy dudes in their forties started hanging out at the parks without kids. And then they talked to the kids and eventually asked the young girls to expose themselves. If you were making money off that enterprise, is saying, “Well, I own 10,000 parks, I can’t keep child predators out of all of them!” really a defense? The answer would be, “Well, you’d better damn well try.”
See, running safe parks is part of the requirement of running parks. And with video, running a service where predators can’t target children should be part of the requirement.
The problem with YouTube, Periscope and others (who likely have the same problems) is that my 10,000 parks don’t even capture the scale. I’d need 1,000,000,000 parks! This is the challenge of social video: content is no longer curated by executives in Hollywood offices working by the dozens, but by engineers optimizing equations on computers anywhere around the globe.
Like I said above, what makes it work is also what creates these shady underbellies, as Slate called them. This is where I concede the very eloquent defense of the tech companies. Tyler Cowen (who I saw linked to by Kevin Drum) makes the case that when it comes to social media, we have a trade-off of three forces: the scale we want to achieve, the costs to review all the content, and the consistency to get rid of only the bad content. Cowen and Drum argue you can only have two of the three. That’s hard to disagree with.
This view was echoed recently in Dylan Byers’ newsletter too, which linked to Wired writer Antonio Garcia Martinez. To summarize the challenge facing YouTube and others: “All detractors have to do is point to one bad piece of content, whereas YouTube is hosting billions of videos.” That’s a hard point to disagree with. If you want YouTube to exist, meaning you think it is valuable, you have to accept that it is huge, and hence hard to police perfectly. Further, it isn’t like YouTube is doing nothing to combat these issues.
Ultimately, while I understand the scale of the problem, I don’t think those defenses get it quite right. I have a few counters for today. Basically, regulators should demand YouTube (and other social platforms with video or images) do better when it comes to children, and not just reactively after bad press:
First, this isn’t about all content, but clearly illegal/evil content.
The counter is summarized in this Chris Mims tweet, which doesn’t mention the child pornography issue, but is in the same family.
https://twitter.com/mims/status/1101134318989164550
My “synthesis” is that we can’t control all content, but we can try to control content that is clearly evil, for lack of a better word. Promoting genocide? Yep. Interfering with democracy? Yep. And content that hurts children. (Anti-vaccine content is a tougher call, but given that kids can die when they contract these illnesses, it merits solutions too.) The fact is, as compelling as the “this isn’t a huge problem” argument is, if a social platform helps cause a genocide or lets creepy men flock to videos of teens, that’s a problem. And illegal. Even more so if you’re monetizing that interaction. One of the costs of running a video platform is finding this content and banning it.
Second, these companies are WILDLY profitable.
We don’t know if YouTube is profitable, but we know Google/Alphabet is. Google had $19 billion in free cash flow by one estimate. So if it needed an extra $1 billion each year to fight child pornography on top of what it’s already spending, well, that’s an easy judgment call. Do it! You’d still have $18 billion in free cash flow. Same for Facebook (see my long read below), which had $10 billion in free cash flow. For the privilege of making tons of money, protect children and stop genocide. That’s an easy call.
Moreover, my gut says the “they are already doing a lot” defense doesn’t hold up to real-world scrutiny. If you asked Google, “How many ad-technology programmers do you have?” and then, “How many content-moderation programmers do you have?” I bet the former outnumbers the latter. The fact is, Google makes money off ads, so it employs a lot of people building that technology. Same with Facebook and Twitter and so on. Frankly, the economics tell big tech companies that they need to do just enough to avoid bad press, then focus the rest of their resources on selling more ads.
Third, I’m not worried about content that is “close” to child pornography
Inevitably someone will complain their video was banned or hidden because of moderation. We need to ignore this, and we can only do that with better information. I think of websites on breast cancer being hidden because of the word “breast” and likely explicit images. That’s an okay tradeoff, lest we find ourselves deluged with nude images. So yeah, if your video is hurt in the rankings because it somehow qualifies as a close call, but the algorithm did it to protect children, okay! Ask what made it a close call in the first place.
Fourth, some steps to fix this seem insanely easy.
This is the other, supremely obvious argument against the “scale” defense. As the Wired article mentioned, Google finishes search terms by sometimes adding “young” to terms like “yoga”. Google, why would you do that? Just stop! Does it take a rocket scientist to figure out that the terms used by child predators are inappropriate? Certain terms like “teen” or “young” shouldn’t autocomplete next to certain other words on your search engine, Google. I can’t imagine a product manager confronted with this fact couldn’t fix the problem.
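To show how small a lift this could be, here’s a minimal sketch of an autocomplete filter in Python. To be clear, the term lists and the pairing rule are my made-up illustrations, not Google’s actual implementation:

```python
# A minimal sketch of an autocomplete filter of the kind described above.
# The term lists and pairing rule are hypothetical illustrations on my part,
# not Google's actual implementation.

RISKY_MODIFIERS = {"young", "teen", "little"}          # hypothetical blocklist
SENSITIVE_TOPICS = {"yoga", "gymnastics", "swimsuit"}  # hypothetical topics

def filter_suggestions(suggestions: list[str]) -> list[str]:
    """Drop suggestions that pair a risky modifier with a sensitive topic."""
    safe = []
    for suggestion in suggestions:
        words = set(suggestion.lower().split())
        if words & RISKY_MODIFIERS and words & SENSITIVE_TOPICS:
            continue  # suppress the suggestion entirely
        safe.append(suggestion)
    return safe

# "yoga" should never autocomplete to "young yoga":
print(filter_suggestions(["yoga poses", "young yoga", "yoga mat"]))
# -> ['yoga poses', 'yoga mat']
```

Crude? Sure. But a crude filter on a handful of known-bad pairings beats autocompleting them.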
Or, as the Wired and Slate articles mentioned, it is easy to see that previously unsuccessful feeds suddenly have wildly popular videos featuring arguably inappropriate content. Just shut those down after they trigger a threshold, or flag them to a moderator. Again, this isn’t insanely hard, though designing the algorithms and processes requires money, people (product managers and engineers) and a determined, informed and focused effort to fix the problem.
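To illustrate, here’s a toy version of that threshold check in Python. The 50x multiplier and the data shapes are assumptions I invented for illustration; a real system would be far more sophisticated:

```python
# A toy sketch of the threshold idea above: flag channels whose new videos
# suddenly outperform their historical baseline by a wide margin. The
# multiplier and cutoff values are assumptions for illustration only.

from statistics import mean

SPIKE_MULTIPLIER = 50        # hypothetical: 50x the channel's typical views
NEW_CHANNEL_CUTOFF = 100_000 # hypothetical: flag big debuts with no history

def should_flag_for_review(past_view_counts: list[int], new_video_views: int) -> bool:
    """Return True when a video wildly outperforms the channel's history."""
    if not past_view_counts:
        return new_video_views > NEW_CHANNEL_CUTOFF
    baseline = mean(past_view_counts)
    return new_video_views > SPIKE_MULTIPLIER * max(baseline, 1)

# Example: a channel averaging ~200 views posts a video with 1.5 million.
print(should_flag_for_review([150, 220, 180], 1_500_000))  # -> True
```

The point isn’t that this exact rule is right; it’s that anomaly flags like this are cheap relative to the free cash flow numbers above.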
Bonus Long Read – The Secret Lives of Facebook Moderators
I wouldn’t call this long article the “flip side” of content moderation, but more an explanation that algorithms are still incredibly imperfect. (If they weren’t? Well, then the robots would have taken over.) To fill the gap, we still need people to manually review lots of content. And in an “oh, that’s not surprising at all” reveal, Facebook uses a contracting company to outsource this important function, pays the people very poorly and keeps everyone under NDAs to hide any bad behavior. So when it comes to weighing cost against effectiveness, Facebook deliberately pays very little, and as a result the employees suffer. This was another great read.
Other Contenders for Most Important Story
Samsung kills Blu-ray players (new ones)
I’ve taken a controversial position on physical discs before: they aren’t dead!
This is shocking in some quarters. Physical discs are in a strange middle ground: you can make money off them, but they aren’t a growth industry by any means. That all said, when device makers stop making new players, the industry’s time horizon gets shorter. (To be clear, Samsung will keep selling existing players, just not making new versions.)
WarnerMedia hires Bob Greenblatt; Studio Head under Investigation after The Hollywood Reporter Story
Last week, rumors quickly emerged that Bob Greenblatt, formerly of Showtime and NBC, would be taking over part of WarnerMedia’s empire, including HBO and Turner, explaining why Plepler and Levy left. This came true this week, along with a larger reorg in which Greenblatt, Kevin Tsujihara and Jeff Zucker got expanded roles. The best analogy for me is that this is like a new NBA owner putting in a new coach and general manager. (See: the Lakers.)
As for the Kevin Tsujihara story, my response was “Yikes.” Others have covered this better, from THR (which broke it) to the Ankler to Deadline (which didn’t cover the story itself, but did cover the Warner investigation into it). Notably, Variety has still not run this story, as far as I can tell. That is its own interesting piece of media analysis, but again I’ll focus on the business angle. If Tsujihara follows Jeff Bewkes, John Martin, David Levy and Richard Plepler out the door (who knows what the investigation will find?), then the chaos at AT&T’s newest acquisition goes beyond just “bringing in our guys.” There is such a thing as too much turnover, and AT&T might have reached it.
Update to Old Ideas
This week I found a series of articles touching on previous columns, and I love to keep these things updated.
Neal Rothschild in Axios on the Engagement Rates of Social Platforms
Axios dug into engagement rates on various platforms, which in my mind are a better way to measure influence than follower counts alone. So when you’re evaluating how well you are doing as a company, you need to focus on getting engaged followers, not just total followers. (Those you can buy.) Right now, Instagram tops the engagement metrics, at least among top accounts.
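For a back-of-the-envelope sense of the metric, here’s a quick sketch in Python; the accounts and numbers are invented for illustration:

```python
# Engagement rate as interactions per follower, the metric Axios dug into.
# The follower and interaction counts below are made up for illustration.

def engagement_rate(interactions: int, followers: int) -> float:
    """Engagement rate = interactions (likes + comments) / followers."""
    return interactions / followers if followers else 0.0

# A 2M-follower account averaging 60k interactions per post "beats" a
# 10M-follower account averaging 50k, despite having far fewer followers.
print(f"{engagement_rate(60_000, 2_000_000):.2%}")   # -> 3.00%
print(f"{engagement_rate(50_000, 10_000_000):.2%}")  # -> 0.50%
```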
Rachel Kraus in Mashable on Pinterest as “Good” Social Platform
I liked this article about how Pinterest knows its customers, and doesn’t try to go too far beyond that. The key is they know they are a digital pinboard, not some engine for changing the world or taking over your life. I like that focus on customers and how it has guided slow but engagement-rich growth.
In a column a few weeks back, I praised NBC for trying to vary how it does primetime ads. I’ve seen the change on Bravo (Top Chef) and NBC (Brooklyn 99). Apparently the initial signs are positive, though take that with a heaping dose of salt for the obvious potential bias: NBC wouldn’t tell us if this weren’t working, would it? Also, the pods were in the highest-rated shows, which is another potential source of bias.
BBC and More Netflix Datecdotes
Finally, I can’t resist a good “broadcaster says Netflix has X ratings” story, this time from the UK! This one compares The Crown to some other BBC shows, which end up globally on channels ranging from BBC America to PBS to Netflix itself.