Nieman Journalism Lab
This Week in Review: Debating Google and evil, and a case study in breaking news accuracy
Posted: 27 Jan 2012 08:00 AM PST

Google, social search, privacy, and evil: Two weeks after Google raised the ire of Facebook and Twitter by privileging Google+ within its search results, engineers at the two companies (plus MySpace) came out with a sharp response: a browser bookmarklet, not-so-subtly titled “Don’t Be Evil,” that removes the extra weighting Google+ results get in the new Search Plus Your World feature. Search Engine Land’s Danny Sullivan has a thorough explanation of what the tool does, and search veteran John Battelle described what this “well-timed poke in the eye” means within Silicon Valley.

Some tech bloggers agreed with the sentiment behind the new hack: PandoDaily’s Sarah Lacy said Google needs to acknowledge to its users that it’s no longer presenting unbiased and objective search results, and her colleague MG Siegler and Daring Fireball’s John Gruber argued that Google’s big problem isn’t ethical but practical — it’s damaging its product by making results less relevant.

Others didn’t see Google as the villain in this situation: Tech entrepreneur Chris Dixon argued that Twitter is asking for a sweetheart deal — top Google search rankings for its information without giving Google firehose access to it. Om Malik and Mathew Ingram of GigaOM pointed out that Facebook’s record in putting user needs before its own gain is pretty spotty itself. Danny Sullivan proposed a truce between Google, Facebook, and Twitter based on making users’ public information available to any search engine, treating social actions as proprietary and profiles as search metadata, and making contacts portable.

Google fueled more suspicion of evil later in the week when it announced a new privacy policy that will unite its tracking of users’ behavior across search, Gmail, YouTube, and Google+ — a change users can’t opt out of. TechCrunch’s Eric Eldon explained the reason for the move: Google’s trying to improve the quality of its social data to compete with Facebook’s growing pool. The obvious question, as Mathew Ingram framed it, is whether all this information sharing will be good for users or just for Google’s advertisers.

Gizmodo’s Mat Honan led the way in charging the latter, saying that Google is taking away the user control that helped form the cornerstone of its “don’t be evil” philosophy. Devin Coldewey of TechCrunch and Christopher Dawson of ZDNet argued the opposite, that Google is only simplifying its privacy policies, which should make them easier to understand and maybe even more helpful for users. Danny Sullivan’s response was mixed, as he pointed out both potential benefits and concerns for users. That ambivalence was shared by Wired’s Tim Carmody, who concluded that Google is not evil, but “something else, something more than a little uncanny, something that despite conjecture, projections, fictions, and a combination of excitement and foreboding, we haven’t fully prepared ourselves to recognize yet.”

Elsewhere in the Google empire, Google+ announced a change to its real-names-based policy, allowing “established pseudonyms.” ZDNet’s Violet Blue noted that the allowance of pseudonyms is still quite limited, and Trevor Gilbert of PandoDaily said this change is probably related to Google+ pseudonyms’ value in Google’s new integrated social search function.
Adam Shostack of Emergent Chaos argued that the initial insistence on real names was a big part of Google+’s disappointing start, and the AP’s Jonathan Stray wondered why Google is so insistent on real names in the first place.

JoePa’s death and breaking news accuracy: We saw an interesting case study in breaking news, accuracy, and Twitter last weekend when the death of longtime Penn State football coach Joe Paterno was falsely reported Saturday night by a Penn State student news site called Onward State, then spread across Twitter. (Paterno died the following morning.) Jeff Sonderman of Poynter put together a useful Twitter timeline of the mishap, which prompted an apology and resignation from the site’s managing editor, Devon Edwards, though he’ll stay on staff there. Some other news organizations that repeated the error, most prominently CBSSports.com, published their own apologies, too.

The following day, Onward State explained how the error occurred — one reporter got an email that turned out to be a hoax, and another reporter was dishonest in his confirmation of it. Daniel Victor of ProPublica gave a more detailed account with some background about how the site has combined reporting and aggregation. Poynter’s Craig Silverman gave a parallel explanation of how the AP decided not to run with the report.

Silverman also reviewed the aftermath of the erroneous report, concluding that journalists are too focused on the benefits of reporting news first without looking enough at the risks. He chastised CBS Sports for not crediting Onward State with the scoop, then shifting the blame onto the site when the story was shown to be false. Sports blogger Clay Travis said CBS’ dubious behavior — particularly running with an unconfirmed bombshell report without linking to the source — was a function of “search whoring,” a tactic he said is running rampant in sports journalism.

GigaOM’s Mathew Ingram went easier on Onward State, saying its process wasn’t much different from that of established news orgs and praising the site for its quick corrections and transparency. King Kaufman of the sports site Bleacher Report may have drawn the simplest, best lesson out of all of this: “Only report what you know to be true, and tell your audience how you know it.” And while writing about an unrelated story, the Lab’s Gina Chen gave some other tips on bringing clarity to breaking news in a real-time environment.

Lessons from the SOPA/PIPA fight: The web declared victory last Friday in the fight over SOPA and PIPA with the postponement of both bills, then shifted promptly to postmortem mode for much of this week. Talking Points Memo’s Carl Franzen had a great account of how all this happened, and New York magazine’s Will Leitch said this was a seminal moment in the ascendancy of the web’s ethic of collaborative creation over Hollywood’s traditional gatekeeping model. On the What It All Means front, one post stands out: renowned Harvard network scholar Yochai Benkler’s seven lessons from the SOPA/PIPA fight, in which he explained the tension between Hollywood’s desire for increased copyright control and the freedom of the web that gives rise to the networked public sphere.
Last week’s events, he wrote, gave a glimpse of the power of that networked public, which he argued is more legitimate than the power of money: “if the industry wants to be able to speak with the moral authority of the networked public sphere, it will have to listen to what the networked public is saying and understand the political alliance as a coalition.”

Several others, including the Guardian’s Dan Gillmor, also warned of the entertainment industry’s lust for control and the copyright fights that will continue to flow out of that desire. NYU prof Clay Shirky argued this point most forcefully, cautioning us not to underestimate how far the industry will go to regain its control, and Instapaper founder Marco Arment told us not to underestimate how much the industry loathes assertive users: “They see us as stupid eyeballs with wallets, and they are entitled to a constant stream of our money.” Venture capitalist Fred Wilson was more positive in his assessment of what’s next, urging the entertainment and tech industries to come together under a set of shared goals and principles.

Reading roundup: Several other ongoing discussions were still on slow burn this week. Here’s a quick review of those:

— New York Times public editor Arthur Brisbane issued his formal follow-up to his much-maligned “truth vigilantes” column, saying that he’s okay with the Times doing routine fact-checking and rebutting of officials’ false claims in news articles, as long as it does so very carefully and cautiously. Brisbane also stated his case on CNN’s Reliable Sources, and NPR ombudsman Edward Schumacher-Matos examined the issue as well. Voice of San Diego, meanwhile, published its own manifesto for truth vigilantism.

In other fact-checking news, Politifact, still smarting from the controversy around its Lie of the Year choice, faced renewed criticism over its rating of statements in President Barack Obama’s State of the Union address as only “Half True” despite also saying they were factually accurate. That earned blowback from economist Jared Bernstein, MSNBC’s Rachel Maddow, and others. Politifact revised its rating to “Mostly True,” but Maddow wasn’t satisfied, saying to Politifact: “You are undermining the definition of the word fact in the English language by pretending to it in your name.”

— Textbooks for Apple’s newly updated iBooks platform are flying off the digital shelves, though concerns about rights issues are lingering. John Gruber explained how different Apple’s proprietary file format looks depending on where you’re coming from, and Cult of Mac’s Mike Elgan argued against Apple’s rights critics. Here at the Lab, Matthew Battles said it’ll take a lot more than Apple to fix what’s wrong with education publishing.

— A Pew report found that tablet and e-reader ownership nearly doubled over the holidays. As The New York Times explained, growth was particularly strong among women, the wealthy, and the highly educated. The Atlantic’s Megan Garber wondered if the gift-giving bump is really as good as it seems for Apple and Amazon.

— A few interesting pieces on online sharing: Reuters’ Felix Salmon reflected on how sharing will disrupt the web’s traditional model, and Poynter’s Jeff Sonderman wrote a guide to making news content shareable. The Lab’s Justin Ellis also gave some engagement tips based on Facebook data, and ProPublica’s Daniel Victor looked at the viral success of images on Facebook. Researcher Nick Diakopoulos crunched some New York Times numbers to see what news gets shared on Twitter.
— Finally, a couple of enlightening exit interviews with Raju Narisetti, who is leaving The Washington Post’s top digital post for The Wall Street Journal: one at the Lab and another at Poynter.

Photo of Joe Paterno statue by Penn State used under a Creative Commons license.
David Skok: Aggregation is deep in journalism’s DNA
Posted: 27 Jan 2012 07:00 AM PST

Below are a few quotes. Can you guess when each was written, and to what they refer?
Sounds a bit like an online aggregator, doesn’t it, pulling a few salient points from a much longer work? One more:
These are, in fact, a series of subscriber comments sent to Time magazine in the weeks following its launch on March 3, 1923.

This past weekend, I was out with a friend who happens to be a former editor at Time. We were analyzing the current state of the news media in light of recent developments, including The Huffington Post’s plans to launch a 24-hour live web TV network and Buzzfeed’s aggressive push into politics. These organizations — often lambasted for aggregating others’ content while producing little of their own — are repositioning themselves with new strategies, with more room for distinctive, often original content. My friend argued this was nothing new. Henry Luce’s Time started as a full-fledged aggregator almost 89 years ago.

A quick visit to the library confirmed his statements. Sure enough, all 29 pages of the black-and-white weekly — its signature red-border cover not yet developed — were packed with advertisements and aggregation. This wasn’t just rewrites of the week’s news; it was rip-and-read copy from the day’s major publications — The Atlantic Monthly, The Christian Science Monitor, and the New York World, to name a few. Today, of course, Time, between print and online properties, reaches a global audience of 25 million; it employs celebrated journalists and editors, and it remains among America’s preeminent journalism institutions.

Using history as our guide, we shouldn’t be surprised by the recent developments at The Huffington Post and Buzzfeed — nor should we be surprised when, in the coming months and years, other sites disdained by some make similar moves. These are organizations beginning their march up the value chain — beyond LOLcats to politics, beyond aggregation to original content, beyond cheap to upmarket. My friend is right: This is entirely predictable, and furthermore, precisely what disruption theory predicts.

Clay Christensen’s theory of disruption, first described in his seminal book The Innovator’s Dilemma, argues that this pattern repeats itself from industry to industry. New entrants to a field start at the low end, establish a foothold, eat away at the customer base of incumbents — and then move up the value chain. It happened with Japanese automakers in the 1980s, who started with cheap subcompacts and moved up to making Lexuses. It happened in the steel industry, where mini-mills began as a cheap, lower-quality alternative to established integrated mills, then moved their way up, pushing aside the industry’s giants.

In the news business, newcomers do this by delivering a product that is faster and more personalized than that provided by the bigger, more established news organizations. They also create new market demand by engaging new audiences. (A 17-year-old may not read The New York Times, but they may stumble upon Buzzfeed to see that viral cat video.)

Herein lies Christensen’s critical point, and one that media companies should not forget. Because new-market disruptions initially attract those who aren’t traditional consumers of The New York Times or The Wall Street Journal, these incumbent organizations feel little pain or threat. So they stay the course on content, competing on “quality” against these new-market disruptors. Meanwhile, the disruptors, once they establish themselves at the market’s low end, move into the space previously held by the incumbents by producing cheaper, personalized content. It is not until the disruption is in its final stages that it erodes the position of the incumbents.
This is the definition of the innovator’s dilemma.

There are two critical points to be made. First, the aggregators of today will be the original reporters of tomorrow. Those of us who care about good journalism shouldn’t dismiss the Buzzfeeds of the world because they aren’t creating high-quality reporting. Their search for new audiences will push them into original content production. Buzzfeed may be focused on cat videos and aggregation now, but disruption theory argues that content companies like it will move into the realm of The Huffington Post — which, in turn, has already indicated its desire to compete more directly with The New York Times.

Second, and perhaps more important, is that despite the obituaries for quality journalism, we can take comfort in remembering that we’ve been here before. We need look no further than that same 1923 volume of Time magazine. Under a passage entitled “Machines Do It”:
Mr. Bliven is Bruce Bliven, at the time the former managing editor of The New York Globe and soon to become editor of The New Republic. Bliven’s quote wasn’t given in an interview with a Time reporter. It was a rip-and-read from an article Bliven had written in that month’s Atlantic Monthly titled “Our Changing Journalism.” Time’s report went on for several more paragraphs, summarizing and quoting.

We’ve been here before. The question is not how aggregation is ruining journalism, but how traditional journalism will respond to it.

Editor’s Note: Reading David’s piece made me want to hunt down Bliven’s original essay. The closest I could find is this, an old how-to-write instruction book for schoolchildren that includes an excerpt of Bliven’s original piece. It still makes for provocative reading, almost a century later.