Each week, at the end of both the TV and Film sections in the Streaming Ratings Report, I nominate the “Dogs Not Barking” of the week, which may lead you to ask:
What is a “Dog Not Barking”?
Good question! It’s a term that I started using, inspired by a Sherlock Holmes short story, to describe how a certain type of film or TV show underperforms on the streaming ratings charts.
So today, here’s a quick explainer that I can link to every time we use the term.
What is a Dog Not Barking?
In short, it’s a flop. A dud. Instead of a hit, it’s a miss. Basically, it’s any streaming TV show or movie that didn’t do well.
As long as Hollywood has been making movies and TV shows, there’ve been movies and TV shows that fail. As far as streaming ratings go, we call them “Dogs Not Barking”.
But It’s More Complicated Than That (Because Everything In Streaming Is)
In the before times—think pre-2013—TV ratings were pretty simple. Each week, Nielsen told you how many people watched every TV show and movie that aired on broadcast and cable. How well did a movie or TV show do? Just look at the ratings! You didn’t need the Entertainment Strategy Guy to explain it to you. If you followed the ratings each week—as almost everyone in town did—you had a good sense of what was popular and what wasn’t.
And that’s mostly the case for broadcast and cable television to this day. Take, for example, this headline:
Wow, that’s pretty straightforward: we know the ratings for every network show. What didn’t work on broadcast? Just scroll down to the bottom of this list to see for yourself.
The problem with streaming ratings is that they aren’t nearly as comprehensive as broadcast ratings. Nielsen (per agreements with the streamers) only releases the top thirty TV shows (licensed and original) and films each week. (This, of course, doesn’t mean that there are no ratings…just fewer of them.)
If a TV show or film falls outside of the top thirty, who knows how well it did?
That lack of knowing is the problem. In the absence of data, though, we don’t assume the worst. (By “we”, I mean the collective media ecosystem.) Instead, if a show debuts and fails to chart, we just don’t notice it.
In “The Adventure of Silver Blaze”, Sherlock Holmes investigates the theft of a racehorse and the murder of its trainer, and has this exchange with a detective:
Gregory (Scotland Yard detective): Is there any other point to which you would wish to draw my attention?
Holmes: To the curious incident of the dog in the night-time.
Gregory: The dog did nothing in the night-time.
Holmes: That was the curious incident.
These “Dogs Not Barking” are an important part of the streaming ratings story each week. It’s not what happened, but what didn’t happen: every week, shows fail to make the ratings charts. This adds up to dozens and dozens of TV shows every year. By process of elimination, like Sherlock Holmes deducing that something is wrong, we know these shows didn’t do well.
Which means they’re flops. Bombs. Duds. Misses.
We call them “Dogs Not Barking” or DNBs for short.
And some of these shows are huge, so we track the biggest misses. At the end of the year, we nominate the “Dogs Not Barking” of the Year.
Could Some of These DNB Shows Be Hits?
Probably not, for two reasons.
First, the “floor”, so to speak, for the Nielsen ratings isn’t that high. The last show on the Originals list usually gets about 3-4 million viewership hours, which isn’t much, especially compared to broadcast television. Maybe in a few years this will change, but for now, if a show misses the Nielsen rankings entirely, that’s a clear sign it didn’t do well.
For Netflix, which releases a global top ten list (something every streamer can and should be required to do; pay attention, WGA, DGA, PGA and SAG-AFTRA!), if a show misses their rankings, it’s almost assuredly a miss. (Though we wish they released a U.S. (or U.S./Canada) list as well…)
Second, we use other metrics beyond Nielsen to identify misses. For a TV show or film to become a “Dog Not Barking”, it has to do poorly on, or just not show up on, every metric (or almost every metric) we use. (And we’re increasing the number of data sources in our Streaming Ratings Report each month.)
– On TV Time…did it make the list for fewer than two weeks?
– Are its IMDb scores low?
– Samba TV mostly just reports hits, but occasionally provides numbers on misses.
And so on.
Shows can fail to make Nielsen’s top ten lists but still make TV Time’s lists (like many popular Paramount+ TV shows). Or shows can fail to make the Nielsen lists but still have intense fan interest, like Heartstopper, which isn’t a hit, but also isn’t a bomb, since a lot of fans seem to really love it.
If a film or TV show misses out on all of our ratings sources—or does poorly on the metrics we do have—we know it’s a miss or a flop or a dud or a bomb.
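If it helps to see that process of elimination written out, here’s a minimal sketch in Python of how a DNB check could be encoded. To be clear, the field names, the IMDb cutoff, and the “all but one metric” rule are illustrative assumptions on my part, not the report’s actual methodology.

```python
# Illustrative sketch only: field names and thresholds are hypothetical,
# not the Streaming Ratings Report's actual data pipeline.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ShowMetrics:
    made_nielsen_top_lists: bool                  # charted on any Nielsen weekly top list?
    made_netflix_global_top_ten: Optional[bool]   # None if it isn't a Netflix title
    tv_time_weeks_charted: int                    # weeks on TV Time's charts
    imdb_score: Optional[float]                   # None if there's no meaningful score
    samba_reported_as_miss: bool                  # Samba TV occasionally flags misses
    is_big_budget_original: bool                  # cheap filler content doesn't count


def is_dog_not_barking(m: ShowMetrics) -> bool:
    """Process of elimination: a show is a DNB only if it misses (or does
    poorly on) essentially every metric we have, AND it's the kind of big,
    expensive show whose failure actually matters."""
    if not m.is_big_budget_original:
        return False  # filler that doesn't chart isn't a story

    checks = [
        not m.made_nielsen_top_lists,
        m.tv_time_weeks_charted < 2,
        m.samba_reported_as_miss,
    ]
    if m.made_netflix_global_top_ten is not None:   # only applies to Netflix titles
        checks.append(not m.made_netflix_global_top_ten)
    if m.imdb_score is not None:
        checks.append(m.imdb_score < 6.0)           # hypothetical cutoff

    # "Every metric or almost every metric": allow at most one metric to look OK.
    return sum(checks) >= len(checks) - 1


# Example: a pricey original that charted nowhere and landed soft reviews.
example = ShowMetrics(
    made_nielsen_top_lists=False,
    made_netflix_global_top_ten=None,   # not a Netflix show
    tv_time_weeks_charted=1,
    imdb_score=5.8,
    samba_reported_as_miss=True,
    is_big_budget_original=True,
)
print(is_dog_not_barking(example))  # True
```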
Is Every Show That Fails To Chart a Flop?
Not really. For one, there’s a lot of filler content out there, and a lot of it’s really cheap. I’m talking about genres like anime, kids’ programming, game shows, sports docs, standup specials, reality shows, foreign films, and true crime docs. These shows don’t cost much to make, so if they don’t make the ratings/rankings charts, it’s not a big deal.
We’re mainly looking to call out big, expensive TV shows.
Like Pachinko, an Apple TV+ show that reports say cost as much as The Crown to make (over $100 million), and whose producer said on KCRW’s The Business (and other outlets) that it was a challenge to get the show made because it had so, so many red flags. (In retrospect, it probably makes sense why no one watched it.) For that much money, Apple probably could have made fifty hours of a multi-cam sitcom. Or about 100,000 hours of House Hunters.
You need Ted Lasso-sized numbers to justify an eight-zero budget.
Why does this matter?
It matters so the industry doesn’t waste money making dozens of TV shows that 90% of America doesn’t care about and doesn’t want to watch. As belts tighten in response to Wall Street’s fickleness about streaming, or a possible recession, this will only become more important. Hollywood’s free lunch, as Richard Rushfield has noted, is probably coming to an end.
Everyone working in the entertainment industry should know what works and what doesn’t. What serves their audience and what doesn’t. What’s popular and what isn’t.
But the current media landscape doesn’t provide these answers. So we do. Unlike in the past, when Nielsen ratings covered everything, the bombs are now hidden, which allows more bombs to get made, something I saw firsthand when I worked at a streamer.