The same people saying click-bait is a problem are the ones clicking on that bait

I was going to write something really profound about Facebook’s announcement that it was cracking down on “click bait” type stories, penalizing them in the News Feed algorithm based on a number of factors, but then Dave Coustan wrote this and why bother. See his post for how this is likely to impact brand and other publishers.

The question I keep asking, though, is how did Facebook ultimately decide this was a problem it needed to address? It says these types of posts were resulting in a steady amount of negative feedback, both on pages themselves and in a survey, where people said these types of “…and you won’t believe what happened next” articles weren’t very good.

But then why do they so frequently appear in people’s News Feeds? I can understand why people can feel like they’ve been the victim of a bait-and-switch, but then why take the step of engaging with that post in such a way that similar stories show up more regularly?

This is, to me, a bit like surveys where people tell a news organization they want more hard-hitting stories about international news but then tune in only when one celebrity gets in a traffic accident while driving naked. Surveys are notoriously bad ways of gauging actual behavior, and the changes now being made based on those results are going to impact a lot of publishing programs.

There’s beauty in Twitter’s unfiltered stream

Both Mathew Ingram at GigaOm and Charlie Warzel at Buzzfeed (there are others but those are the biggest ones I’ve seen) have pieces about the stark contrast between what people are seeing on Facebook and what people are seeing on Twitter over the last week or so.

The gist of this is that Facebook has been filled with videos of people taking the Ice Bucket Challenge, a stunt designed to raise awareness of and get people to donate to ALS charities, while Twitter has been the place to turn to for updates about what’s happening in Ferguson, MO as well as more recent stories about the beheading of a journalist in Syria and more.

I won’t belabor the point but instead encourage everyone to read those two pieces linked above. But I will take a moment and reiterate how the difference is largely – almost solely – because one uses an algorithm to decide what to display (Facebook) while the other is the unfiltered stream (Twitter).

While Twitter’s lack of algorithm may not mean someone sees everything – a lot is missed in the time we’re looking away – it does mean they see a lot more. And it’s easier for people to catch up because there’s often someone who’s X amount of time behind the news and who’s sharing things now, a bit later than when it was breaking.

As Twitter makes more and more rumblings about introducing some sort of algorithm to the Timeline (something that isn’t necessarily bad…as long as they either make it an opt-in feature or something I can easily opt out of) I hope they’re looking at all the commentary about this issue and that it’s giving them pause. While they can muck around with officially making Favorites something that appears in your Timeline in addition to Retweets all they want, the beauty is in the stream. It’s messy, it’s imperfect and it’s often wrong. But it’s also a magnificent example of a wonderful, flowing public conversation. And it’s *much* more important to have this sort of free-flow of ideas and news than be subject to some system’s idea of what is or isn’t important.

Two Tragedies, Two Opportunities to Reconsider Social Media Publishing

People were stunned Tuesday afternoon when news broke that Robin Williams had, by his own hand, passed away. That includes me. Williams was a huge talent and an incredibly loved and influential comedian, actor and person.

As the afternoon went on, more and more people in the “Social Media Marketing” wing of Twitter in particular came out with their opinion that, in the wake of the tragedy, brands should pause their social publishing programs. While I certainly see their point, I disagreed. Williams’ death was certainly sudden and Twitter was filled with an outpouring of emotion over it, but in my opinion it didn’t rise to the level where brands appear incredibly insensitive and tone-deaf with their scheduled updates. There was nothing inappropriate about continuing to publish at that time the way there is when there’s a shooting at a school, a bomb goes off somewhere in the U.S. or something similarly massive happens. It’s hard calculus, and I certainly respected differing opinions, but this didn’t meet the necessary criteria to make that recommendation.

(Note that if we paused every time a bomb goes off elsewhere in the world no brands would ever tweet again until the second coming of Christ himself. We also differentiate between shootings. 12 people could be killed in Chicago over this weekend and every brand in America would continue to publish without a second thought.)

After watching on Twitter as events unfolded in Ferguson, MO last night, though, I’m increasingly of the opinion that the situation *there* does cross that threshold.

Think about it: Right now in an American city – one just 275 miles from my front door – there are police confronting unarmed citizens with riot gear, including long-range rifles, tear gas and more. That’s really happening. Journalists are being taken to prison, peaceful protestors are being tear-gassed and more.



(image via @chicagotribune)

Imagine if brands took a stand and said, “In light of the actions happening in Ferguson, MO we are taking a break from our social media marketing. When we feel the situation has been resolved, or there’s a clear path that’s being taken to that resolution, we will resume regular posts. Until then, our thoughts are with the people of Ferguson.”

Now that might sound like crazy talk. Take a stand on a public issue that has little to nothing to do with a company’s business? That’s nuts, and this shouldn’t impact marketing one way or the other, right? But that’s exactly what brands and companies are doing every day with issues like birth control, same-sex marriage and a host of other social issues.

Just as taking a stand on same-sex marriage shows companies attempting to convey human emotion and push back against the status quo, doing so around the events in Ferguson would show that they’re not blind to the events around them, events that are, on many levels, hard to believe.

I’ll admit I’m not ready to make this call myself just yet. It’s so outside the norm that I’m still hesitant to recommend what I’ve outlined above. But it’s something that’s growing and growing in the back of my mind and, if things aren’t resolved there soon, I may work up the nerve to at least try to lead by example.

Facebook Save and the needed next step in read-it-later apps

Facebook introduced Save last week, their own version of a “read it later” service that allows people to save interesting stories for reading at a later time. The idea being that while they’re scanning Facebook they may see something that looks interesting but they don’t have time right then to click through to the story or to fully digest it, so they need to save it for later, when they presumably have more time. This puts them in competition with existing services like Pocket, Instapaper and a handful of others.


When I first read about the service it struck me as a solution in need of a problem. That’s largely because of not only how I see people using Facebook, which is in large chunks of time so they can sufficiently scan their Newsfeed, but also in how Facebook has spent so many calories recently positioning itself as part of the ongoing, current conversation. They want to be the place where people are talking about what’s on TV, what events are happening and so on, so anything that is geared toward time-shifting seems inconsistent and a bit odd.

Going back to the first point, “save for later” functionality has in my experience always been something that’s been sought by the power-user set and not necessarily by the mass audience, who just want to go online to check email and check for updates from friends, family and the brands who they’re hoping share coupons with them. That’s part of the reason why RSS, which by its very nature was all about time-shifted viewing and saving deep reading for a later time, never caught on, because the desire to do so just wasn’t there in a mass audience. They didn’t need to do that because they had their bookmarked pages and were fine with that workflow. And it’s part of the reason why the average web user hasn’t even heard of Pocket et al, because that’s not how they’re consuming content online.

So it’s not clear who Facebook Save is really meant for. And the best guess I can make is that it’s intended to be something that brand publishers are supposed to applaud because it dangles the possibility that while someone may not click through to a story now, they may do so later.

But metrics for Facebook Save were not part of the announcement and it’s unclear as to whether this is something that will be introduced or added on down the road. Without it, though, it will be difficult for publishers to truly measure how and to what extent people are using this tool.

Facebook Save is, though, not unique in this manner. To the best of my knowledge none of the big save-for-later apps offer any sort of metrics for publishers. And those numbers are absolutely necessary for publishers in the same way that “live plus seven” ratings numbers have increasingly become necessary for TV networks. Let’s walk through why:

I, as a brand publisher, post X story on Facebook today. And I can see in my site analytics that views to that page/post spiked in the period immediately following that, knowing that links on Facebook (or Twitter or anywhere else) have a very short half-life. There may be a few moderate bumps down the road but generally after a certain period of time you’re going to see those numbers flat-line.

But now factor in save-for-later functionality, whether it’s within the Facebook framework or in something like Pocket, which integrates with Feedly, Digg Reader, Twitter’s mobile app and more. If I, as publisher, can start to attribute traffic to those apps then I have that much more complete a picture of my readers. And it’s not just that, since apps like Pocket allow for reading completely within the app, without sending any traffic to the source site. So if these apps were to work with publishers to show how many people were saving stories from their domains, how many articles they were saving, where they save the article from (Twitter, RSS, Facebook, etc.), how long they saved the story before they read it and so on, then all of a sudden publishers are swimming in additional reader data that can help drive strategy on a number of levels.

It also opens up a world of possibilities for the apps themselves, as well as their business models specifically. Imagine if apps were able to sell ads to publishers that looked like recommended reads, giving readers the option to either save it for later reading or to go ahead and visit the site directly. There’s value in both options.

There’s also this from Om Malik:

Those are great ideas that would, yes, require some development work. But honestly a network that’s based on similar reading habits is of much more use to me than some other social networks. It’s basically what Google Reader Shared Items was before it was so brutally ignored and killed. It’s easy to see scenarios where I say “Ooo…X Friend would like this story, I’ll message it to them” within the Pocket – or other app – framework.

On an even more basic level, these apps need to more fully embrace a variety of networks. For instance, it would be nice to be able to choose LinkedIn, Flipboard or other networks to share a story from Pocket instead of just Twitter or Facebook. (Note that I don’t use Buffer, which is an option within Pocket. Not because I don’t want to, I’ve just never taken the time to set something like this up and it’s for an even thinner slice of the audience than users of save-for-later apps.)

Better metrics and data would serve the save-for-later market well. Since Facebook is now treading into these waters in a serious (even if it might be a drastic misreading of their audience) manner it’s a good time for the existing players to start mining new areas to prove their value to readers and publishers, both important stakeholders in how the tools they produce get used.

Facebook Reach (Or You Can’t Have the Glengarry Leads)

Brian Boland, who leads the Ads Product Marketing team at Facebook, has published a very interesting and much-discussed piece on the Facebook Business Blog about the decrease in organic reach most brand publishers have been, are and will be seeing for their posts.

Boland lays the fault for that dropping reach at the feet of two things: first, there’s more content being produced than ever before; second, this is actually how the News Feed is supposed to work, pulling out what it deems to be important – or of value – and showing it off while hiding other posts. He specifically denies that reach is being throttled in an effort to encourage companies to buy ads to boost that reach.

We’ve no choice but to take Boland at his word. But there are several points at which it’s easy to contest what he says here:

First, for most brands (at least in my experience) the falloff wasn’t a gradual thing; it was sudden. One month everything was running at a certain level and the next the bottom fell all the way out. That’s not because 300% more people were posting and fighting for News Feed position than they were the month before. That’s because *something* about the algorithm changed and it directly impacted how brands were doing business.

Second, it’s a bit hard to take his initial statements about organic reach *not* being tied to a desire for ad revenue when the entire second half of the post is filled with examples and stories about how buying ads is the smart play for brands who actually want to reach people. It’s a bit like being told “No, you won’t die unless I amputate your leg. But let me tell you the 17 reasons amputating your leg is the only way you’ll live.”

But the part that sticks out for me most is that this is business as usual for Facebook, that they’re just doing what they should be doing. Which is fine, right up to the point where you realize the following:

They’re telling you what should or shouldn’t be important to you.

Put all the social media strategy concerns aside for a moment and think of how monumental that is. We’ve not only ceded control of what we see to Facebook (and others) but we’ve given up the ability to, on any level, make value designations for ourselves. And we’ve done this out of a sense of it being more convenient this way.

Facebook has often tried to compare itself to a newspaper, which often makes that same sort of decision-making. Some stories make it into the paper (you can also use the analogy of a TV or radio broadcast if you like) and some don’t. Those decisions are out of the control of the reader and in the hands of someone else – a gatekeeper, to use the vernacular – whose job it is to rank stories in order of importance. After a certain point there’s no more room for them.

But that analogy falls apart for me when you realize that Facebook is not the one producing the news, a term that’s loose enough to include everything from your aunt’s picture of her flowers to updates from your favorite retailer to a story from CNN about the federal budget. It’s the platform on which many news producers – again, using a fairly broad definition of the word – distribute their material.

So it’s not a newspaper, simply doing what it does. It’s more like a television set – the actual physical appliance – deciding what shows you can and can’t watch based on some unseen algorithm that allows for no override when you, the viewer, realize it’s not doing what you want it to.

It’s in that sense that Facebook, I think, needs to stop acting like a newspaper and we all need to stop thinking that “well, that’s just how it works” is a legitimate rationale.

Instead it needs to act more like a newspaper stand, allowing for indiscriminate access to whatever material the audience would like and allowing each member of that audience to set their own priorities. X person wants this, Y person wants that and it’s up to them to decide. And if they want everything – the proverbial firehose – that’s their choice as well.

To do that Facebook would need to stop basing an individual’s News Feed on each individual piece of content or the activity of others in someone’s network and start giving each user more control at the Page level. Let me mark This Page as a Tier 1, where I see everything they publish, this one as Tier 2 where I only see some of what’s published and so on.

*This* would be Facebook acting more like a neutral delivery platform than anything they’ve done before. Which of course means it likely won’t happen.

For brand publishers there’s some serious debating that needs to happen about what’s more valuable for them: encouraging fans to consume content in the News Feed, where exposure is a dicey proposition, or encouraging them to visit the Facebook Page directly to check for recent updates. And no, you’re not imagining things: That sounds like exactly how the web worked before there was such a thing as real-time streams, social networks or anything else. It was the static web and it was inconvenient. But right now there may be more value in making a Facebook Page a destination site than something that’s about the stream.

That reality calls into question, though, the very value proposition of Facebook itself. If you’re pushing people to visit a Page directly as opposed to waiting until (or if) something crosses the News Feed, then what’s the value of doing *anything* on the rented land of Facebook where you’re constrained by a set of T&C that’s outside your control as opposed to building a site – or microsite – for a particular kind of content and owning the entire user experience, as well as capturing all the data that comes with that scenario?

It also reminds me of the existence of a technology that allows you to *not* have to bookmark all your favorite websites to see what’s new but still allows you to have absolute control over what you consume and what you do with it: RSS. That simple, neglected and much-derided format lets you see everything that’s new from a site, all within a “reader” that is completely neutral, treating content from one site in the same way it does all the others, with publishers themselves in charge of most of the user experience.

I’m not saying everyone needs to stop using Facebook for brand publishing tomorrow. Far from it; there’s still a lot to be said for the “fish where the fish are” philosophy. But right now we’re in an environment where the fishermen’s lines are being shortened by the people who own the lake, and they’re being told they can reach more fish if they just buy this longer line, though making money off the longer line is totally not the reason the lines were cut in the first place.

Or, to put it another way: these leads are not for you.


That, to me, is making increasingly less sense.

Watch this brilliant…

Not that I’m questioning how great this John Oliver bit is, but have we entered the stage of the video’s life yet where people are calling it “brilliant” simply because others have used that word and the people who are just getting to it now don’t want to look less than insightful as to its brilliance? Because that’s how it’s increasingly reading to me.

This is by no means unique, just the latest example of a video (or story, or photo) that cycles through the online world over the course of days, and you see the same sort of thing with movie reviews and other mass-consumption items. A sort of group-think sets in where each person needs to match – or outdo – the hyperbole assigned to something by those who have already commented on it. It’s the internet equivalent of Fredo’s “I’M SMART!” lament in The Godfather Part II.

Making a mountain out of an ant hill

If you want a good look at the current, depressing state of film “journalism” you can’t do better than the recent round of coverage of the Ant-Man movie Marvel Studios has in the works.

To recap the facts: Writer/director Edgar Wright has been working on the movie for something like six or seven years. The movie finally got a release date for 2015 and buzz had been building because geeks already love Wright and the couple of snippets of test/concept footage he’s released have been pretty cool. Then last week he pulled out suddenly, citing the usual “creative differences.” There was some mention of last-minute script revisions done without his guidance, but no one – at least not that I’ve seen – has gone on record confirming that.


Since the news broke, though, the various movie and entertainment sites and blogs have gone absolutely bonkers with speculation, rumor, conjecture and hearsay, all written either in a way that over-sensationalizes something that comes nowhere near being a fact or in a way that states as fact the latest rumor or bit of gossip. And because everyone is unwilling to cede pageviews to any other site these rumors (and worse) show up on every site, since heaven forbid they not appear on the first page of search results for “ant man director.”

I can’t help thinking “this is what we’ve wrought” whenever 35% of my RSS reading is some variation on the latest non-story. But mostly it just makes me sad, both that there’s so little reporting going on and that there’s such a mad rush for pageviews that the slightest whisper, regardless of its source, triggers a mad dash to the “Publish” button.

While I’m fighting the tendency to go down the “things were different in my day” road, things have changed a lot in the last several years when it comes to the movie blog/site world. The big sites have gotten bigger, some smaller sites have faded out, new players have emerged, some former leaders in the space have gone corporate and so on. But one thing that hasn’t changed – at least from my perspective – is the relationship between these sites and the studios they report on.

As these sites began to grow, studio publicists began to notice them and included them in their press outreach. So they would get pitched news and so on and even got invited to location visits. But, because these sites and their writers were basically just taking what was spoon-fed to them and/or aggregating stories from the Hollywood trades, there was little in the way of “reporting” that developed.

These sites were started and run by fans and not reporters, which at first is what was great about them. “Wow, here are fans just talking about what’s important to *them* and not being all stodgy and old-fashioned about it.” That’s exactly what the early days of the social web were supposed to enable, and they did. But then things began to grow and we entered into a state of what can only be called Mutually Assured Destruction.

See, it’s not like any one site at this point can say “You know what, we’re done with (fill in the name of any property, especially one that caters to geek culture) rumors, we’re going to do some hard reporting.” To do so would be to cede pageviews – and therefore ad revenue – to the four dozen competing sites who are going to be all too happy to continue running every piece of rampant speculation they can.

So everyone goes on their merry way, adding question marks to headlines (something that should result in immediate confiscation of your keyboard) and acting like they’re just doing what their readers want them to do.

It’s enough to make you long for the good old days of restricted access to publishing platforms, strict editorial oversight and other vestiges of the “old media” world. But now instead of worrying about whether they got the story right – or even if there are *any* facts in what’s being written – all concerns are about filling the content maw. If X number of posts aren’t published these sites can’t fill their ad quota, won’t publish enough tweets to get traction and so on. Every story, rumor and report must be commented on, regardless of whether either the source or the comment has any credibility.

The bigger, more serious problem is that what started out as a fringe problem has now infected the whole organism. Traditional Hollywood trades that *had* the reporting resources have followed everyone to the bottom at the same time they’ve had systemic problems faced by others in the media industry, problems that have resulted in layoffs of the staff that could chase down leads and verify rumors before they went to print. Now everyone is on the gossip train for better or worse.

I hold out hope that things will course correct in the near future and we won’t be stuck in a world where truth is unknowable because no one has the skills to divine it from the other, more salacious voices whispering in their ear. But I fear that won’t be coming any time soon, which is too bad since this version of things is, to be honest and personal, beginning to impact my enjoyment of the movies themselves. More so, in fact, than any of the countless spoilers that are posted on these sites in the lead-up to release, right up until the point where the same sites that have run 54 unfounded rumors complain that they don’t want to be spoiled about the movie itself.

Yes, this is a rant. But it’s one that I hope points to some serious problems. I don’t know what the solution is outside of pleading with everyone to do a better job when doing their job. But I’m hoping that there are people who will fight to do things more above-board and in a more responsible – one might even say journalistic – manner.
