FreakyFriday: Weird of the Day – Google Doesn’t ‘Get’ the Flu

Looks like Dr. Google misdiagnosed this year’s flu season. A widely circulated article published by Nature this week describes how Google’s flu-tracking application ended up overestimating this year’s epidemic.

If you’re not familiar with Google’s flu tracker, it’s one of a number of projects falling under Google.org, which seeks to leverage Google products for social good. Check out their site – not only are the projects really cool, but they show how the data generated by search goes far beyond marketing. The flu tracker attempts to measure the spread and severity of outbreaks based in part on Web searches – that is, people searching for flu symptoms and other related topics.
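If you’re curious how that works under the hood, here’s a minimal sketch of the basic idea – every number below is made up for illustration, and Google’s actual model is far more sophisticated. The intuition: fit historical flu-related search volume against officially reported case rates, then use this week’s searches to “nowcast” this week’s flu activity.

```python
# Toy sketch of search-based flu "nowcasting" (illustrative only;
# not Google's actual model). All data below is invented.
import numpy as np

# Hypothetical weekly history: fraction of searches that are
# flu-related, and the reported rate of influenza-like illness
# (ILI) doctor visits for the same weeks.
query_share = np.array([0.010, 0.015, 0.022, 0.030, 0.041])
ili_rate = np.array([0.8, 1.3, 2.0, 2.9, 4.1])  # percent of visits

# Simple least-squares fit: ili_rate ~ slope * query_share + intercept
slope, intercept = np.polyfit(query_share, ili_rate, 1)

# Predict this week's flu activity from this week's search volume.
this_week_query_share = 0.055
print(f"Estimated ILI rate: {slope * this_week_query_share + intercept:.1f}%")
```

Note the assumption baked into that fit: that searching tracks sickness. As we’re about to see, that’s exactly where things went sideways.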

Neat, right? And, actually, the project has historically been fairly accurate, at least enough so that medical researchers planned to take a serious dive into Google’s numbers this flu season.

The flu was bad this year – just not as bad as Google predicted. The search giant’s numbers were roughly double what the Centers for Disease Control actually observed, according to Nature.

It doesn’t take Nate Silver to figure out some of what went wrong. Among other things, Google didn’t account for the – excuse the pun – viral nature of this year’s flu season. Media coverage about the predicted severity of this year’s flu – including stories about Google’s incredible flu tracker – boosted the number of Web searches for flu-related topics, throwing off Google’s algorithms.

Nature posits that this is a temporary setback for a promising approach, but GigaOM and others point out that the whole issue raises important questions about the reliability of Web data.

My takeaway – and the takeaway for search marketers – is that search data, click-through rates and all the other numbers we pull out of the Web are incredibly powerful tools, but they still need real-world context to be used effectively.

The freaky thing is, numbers don’t always mean what we think they do. Nor can they be divorced from facts on the ground. For a blunt example, Carnival Cruise Lines is probably seeing a spike in Web searches this week coinciding with its well-publicized fiasco in the Gulf of Mexico. And anybody who is familiar with cyberchondria knows that not everybody who searches for disease symptoms on the Web is actually sick.

Digging into the numbers still requires human expertise. Google alone can’t cure what ails you – or your business.

FreakyFriday: Weird of the Day – Robots vs. Writers

By: iCopywriter blogger Alex Dalenberg

Guess what? The writers are still winning. That may come as a shock to anyone who works in the media industry (your humble blogger is himself a veteran of a mass newsroom layoff, even at the tender age of 27).

As if old media’s struggle to adapt to digital disruption weren’t bad enough, anyone who has been watching the bleeding edge of online content over the past few years knows that the geeks who rule our world are hard at work on the robots that will finish us all off.

Not physical robots, of course, but algorithms that can aggregate and even write stories in the place of humans. Statsheet.com is one of these: it automatically writes game recaps based on data. For example, this story about Gonzaga crushing Lewis & Clark State* in basketball this week was not written by a human. Of course, the writing is, shall we say, robotic.
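StatSheet hasn’t published its code, so here’s just a toy sketch of how this kind of data-to-text generation typically works: pick phrasing based on the numbers (say, the margin of victory) and pour the box score into a template. The scores and player name below are invented placeholders.

```python
# Toy data-to-text game recap (not StatSheet's actual system).
# Scores and the player name below are invented placeholders.

def recap(winner: str, loser: str, w_score: int, l_score: int,
          top_scorer: str, points: int) -> str:
    margin = w_score - l_score
    if margin >= 20:
        verb = "crushed"   # blowout
    elif margin >= 10:
        verb = "beat"      # comfortable win
    else:
        verb = "edged"     # close game
    return (f"{winner} {verb} {loser} {w_score}-{l_score}. "
            f"{top_scorer} led all scorers with {points} points.")

print(recap("Gonzaga", "Lewis & Clark State", 98, 67, "J. Placeholder", 24))
```

Multiply that by a few hundred templates and a live stats feed and you have a recap machine – which also explains why the prose reads the way it does.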

But this week, one of the pioneers of this kind of automation, TechMeme founder Gabe Rivera, had some interesting things to say about the old-fashioned process of curating news. This is via the tech blog GigaOM, by the way.

Rivera’s site, of course, automatically aggregates Silicon Valley headlines from around the Web — and it works quite well. But in the past few years, he’s added human editors throughout the country to help the machines do their job better. The problem is that the algos can’t yet sense when a story is played out, or not truly worthy of the front page. For now, human news sense still can’t be replicated.

That’s not to say algorithms and aggregation are dead. The fairly obvious takeaway is that the publishers who will be successful are the ones who find the best mix of automation and the human touch. Excellent content – and how it’s organized and presented – still rules the day.

Which actually isn’t so freaky after all.

* Side note: the human who edited this piece took a closer look at that Gonzaga-Lewis & Clark State bot-originated article. We ran it through a Flesch-Kincaid readability calculator, which suggested that FIVE of the 18 sentences be revised. That’s nearly 30 percent of the content.
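For the curious, Flesch-Kincaid scores boil down to a simple formula over sentence length and syllable counts. Here’s a sketch of the standard Flesch Reading Ease formula (higher scores mean easier reading); the syllable counter below is a crude heuristic, and real calculators use better ones.

```python
import re

def flesch_reading_ease(text: str) -> float:
    """Standard Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier reading."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    # Crude syllable heuristic: count vowel groups per word, minimum 1.
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    word_count = max(1, len(words))
    return (206.835 - 1.015 * (word_count / sentences)
            - 84.6 * (syllables / word_count))

print(round(flesch_reading_ease("Of course, the writing is, shall we say, robotic."), 1))
```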

Hmmmm… Can you get bot content for your site? Sure. Will it be a cheaper option for you? Yep. Will Google know if your content was bot-generated? Bet on it – they’re not dummies. Will they penalize you? We think so… are you willing to chance it? iCopywriter: real, live, human writers and editors.

Photo Credit: Sebastianlund