Facebook’s “Photo Memories” and filter failure

If you’ve scanned your friends’ photo albums on Facebook recently, you may have noticed a new feature on the right sidebar labeled “Photo Memories.”  This raises an important issue that I’ve been meaning to blog about for a while: our collective digital memory.  It’s the subject of a fairly new book titled Delete: The Virtue of Forgetting in the Digital Age.

I’ve not yet read the book, but I listened to a talk by the author, Viktor Mayer-Schönberger, at Harvard’s Berkman Center, as well as his conversation with Farhad Manjoo of Slate on Bloggingheads TV.  (While the Berkman talk is a more thorough discussion of his ideas, in some ways the Bloggingheads talk is clearer and more illuminating.)

Here’s how Berkman describes the book:

DELETE argues that in our quest for perfect digital memories where we can store everything from recipes and family photographs to work emails and personal information, we’ve put ourselves in danger of losing a very human quality—the ability and privilege of forgetting. Our digital memories have become double-edged swords—we expect people to “remember” information that is stored in their computers, yet we also may find ourselves wishing to “forget” inappropriate pictures and mis-addressed emails. And, as Mayer-Schönberger demonstrates, it is becoming harder and harder to “forget” these things as digital media becomes more accessible and portable and the lines of ownership blur (see the recent Facebook controversy over changes to their user agreement).

Mayer-Schönberger examines the technology that’s facilitating the end of forgetting—digitization, cheap storage and easy retrieval, global access, and increasingly powerful software—and proposes an ingeniously simple solution: expiration dates on information.

The cataloging of our lives online is a relatively new phenomenon, so we haven’t had much time to consider its impact.  But it’s going to be interesting.  Here’s Facebook VP Christopher Cox explaining the potential impact of Facebook Places:

Too many of our human stories are still collecting dust on the shelves of our collections at home…Those stories are going to be pinned to a physical location so that maybe one day in 20 years our children will go to Ocean Beach in San Francisco, and their little magical thing will start to vibrate and say, ‘This is where your parents first kissed.’

Google CEO Eric Schmidt puts it this way:

I don’t believe society understands what happens when everything is available, knowable and recorded by everyone all the time.

According to the Wall Street Journal, Schmidt  “predicts, apparently seriously, that every young person one day will be entitled automatically to change his or her name on reaching adulthood in order to disown youthful hijinks stored on their friends’ social media sites.”

In case all of this wasn’t tricky enough, applying expiration dates to information, should we want to take Mayer-Schönberger’s advice, is somewhere between difficult and impossible.  Luckily, deleting information isn’t the only way to control our collective memory.  As Clay Shirky says, “There’s no such thing as information overload – only filter failure.”  What we view online is only partially determined by what’s online, because everything’s online.  What we view is determined largely by our filters.  Facebook is a filter, as is Google.  My RSS reader is a filter, as is my email inbox.

Blogging for Foreign Policy, Evgeny Morozov is thinking along the same lines:

So what else could we do, given that expiration-date-technology capable of destroying all copies is not an option? This is an easy one: make offensive information harder to find. After all, it’s the fact that our data is findable – most commonly through search engines – that makes us really concerned.

To apply Shirky’s maxim to the question of digital memory, let’s return to Facebook’s Photo Memories.  My friends’ photo albums from college have been on Facebook for years.  But, until recently, I had to dig to find them.  They registered in my feed only when someone tagged or commented on them, and as time passed that happened less and less, and then not at all.  And so those photos, though available, were no longer viewed.  And then the filter was changed.  Suddenly, albums from years ago are being thrust in front of me and I’m looking through some of them again.  The point is that digital memory is about more than availability.  In practice, it’s about filters.

What do we want to remember and what do we want to forget?  I’m not sure.  I find several of Mayer-Schönberger’s examples of the dangers of remembering to be quite compelling.  But, in general, the availability of more and more information also has plenty of upside.  Obviously, the question is about balancing the two, and I think it’s clear that we don’t yet have any idea where that balance should be struck.

In the meantime, perhaps we should focus on improving the design and governance of our filters.  We should be in favor of openness, transparency, democracy and individual autonomy.  This isn’t the same as saying we want information to be available and transparent.  But if our filters are open, transparent, and democratic, we’ll at least have an easier time evaluating and improving them.

Perhaps we won’t miss forgetting as much as we think.  (Will we wistfully look back at old photos and fondly remember forgetting?)  Yet, there’s reason to think that if we do ever want to forget, the solution lies in filtering the past rather than deleting it.

Apple’s music social network, Ping, and a follow-up on Facebook feeds

After my last post, I had a few conversations with friends about a categorized or sortable Facebook feed.  The point of my post was as much about how Facebook could have better managed the transition from profiles to feeds as it was about categorizing updates, but the latter was, for whatever reason, what I ended up debating.

What I heard from multiple people was, basically, “I don’t use Facebook that way.”  As far as I can tell, my friends mostly use Facebook to post pictures, view pictures, and write on walls.

Perhaps this is the wrong conclusion to draw, but I think this is just more evidence that Facebook is continuing to miss out on an opportunity.  The people I spoke to aren’t thinking about Facebook as a tool for discovering new music, books, interests, etc. because Facebook hasn’t made it easy for them to do so.

If my friends are any indication (and perhaps they’re not?), Facebook users view the platform as a place to see what their friends are up to, and to stay in touch.  They don’t see it as an unprecedented social graph with massive potential to inform and recommend various aspects of their lives.

But that’s what it is.

Perhaps the introduction of a music social network by Apple will spur innovation at Facebook.  After all, is a sortable/categorized news feed really all that much to ask?  Maybe I’m nuts, but I think people would learn to appreciate it quickly.

UPDATE: Apparently, Facebook is testing a feature to “subscribe” to certain friends, to be sure you don’t miss any of their updates.  I have nothing to say about this now, and it’s not directly related to what I’m talking about, but I’ll count it as news feed innovation.

Facebook Profiles and the News Feed

Richard McManus has a post at ReadWriteWeb on the declining significance of profile pages on Facebook.  This is as good a jumping off point as any for a post I’ve been wanting to write about what I consider a missed opportunity for Facebook.  But first, a little background from McManus:

As Facebook becomes more and more popular, the social network giant is putting more emphasis on the real-time feed. In other words, the activities of your friends displayed in reverse chronological order on your Facebook homepage. In the old days of Facebook – and indeed traditionally with social networks like MySpace and Friendster – you’d visit a person’s profile page to see what they’re up to. Facebook changed this paradigm in September 2006, when it introduced the news feed as the primary way to keep track of your friends. In October 2009, that feature was re-named the “live feed” and Facebook introduced a more filtered news feed for your homepage.

Now on to my gripe…*

It’s easy to imagine why Facebook would want to push the feed.  The more often content is updated, the more often you’re likely to check the site.  When a huge percentage of the site’s content is relatively static – profile pages – there is less reason to visit.

Yet, profile pages were central to users’ conception of the site.  This may explain at least a small part of the anger over the introduction of the News Feed and the more recent “Connections”.

And my feed on Facebook is pretty uninteresting.  Though Facebook has taken some steps to improve the relevance of the feed, there’s more work to be done.

Which is why I wish they’d built the feed around the existing structure of the profile.  Facebook had already divided my life into a surprisingly useful, yet simple, set of categories: music, books, TV, movies, activities and interests.  Why not structure a news feed around these categories, plus a Twitter-esque “what are you up to?” (or Facebook’s “What’s on your mind?”)?

For each category, imagine you replaced “Favorite” with “Latest” and posted updates by category.  What are you listening to these days? What’s your favorite TV show this season? What book did you just finish?

Now imagine a feed divvied up by these categories.  You could see all the updates at once, of course.  But when looking for new music I could click the Music tab on my Feed to see all my friends’ updates on “Latest Music.”
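To make the idea concrete, here’s a minimal sketch of what that kind of categorized feed could look like under the hood.  The category names, data shapes, and function names are my own illustration – nothing here reflects Facebook’s actual internals or API.

```python
# Illustrative sketch of a categorized feed: each update carries a category
# tag, and the feed can be viewed whole or filtered down to a single "tab".
# Categories and data structures here are hypothetical, not Facebook's.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

CATEGORIES = {"music", "books", "tv", "movies", "activities", "interests", "status"}

@dataclass
class Update:
    friend: str
    category: str          # one of CATEGORIES
    text: str
    posted_at: datetime

def feed(updates: list[Update], category: Optional[str] = None) -> list[Update]:
    """Return updates newest-first, optionally restricted to one category."""
    selected = updates if category is None else [u for u in updates if u.category == category]
    return sorted(selected, key=lambda u: u.posted_at, reverse=True)

# Clicking the "Music" tab would then amount to: feed(all_updates, category="music")
```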

I suspect that this adaptation of the profile structure would have provoked less rage than the original News Feed rollout.  And though that opportunity is missed, it may not be too late to introduce some sort of basic tag/category structure that accomplishes the same thing.

I know it’d make me check Facebook more often.

*I hesitate to second guess these decisions as I generally think users’ reactions against changes to Facebook – the News Feed being the most prominent example – represent a disappointing bias against change of any kind.  Most users didn’t think much about the changes, nor did they give themselves time to grow accustomed to them; they simply protested something new.

Population and the Senate

Admittedly, this has nothing to do with media or the web – the usual topics of this blog… but the U.S. Senate has been on my mind recently for many reasons, including this excellent New Yorker piece by George Packer.

Since each state has two senators regardless of population, I started to wonder what the least representative possible Senate majority would look like.  In other words: what percentage of the U.S. population lives in the 26 least populous states (since the senators of those states, acting together, could technically form a majority)?

It turns out that a Senate majority could be assembled representing only 16% of the country.  A filibuster could be broken (60 votes, 30 states) by a coalition representing 22% of the country.

(To reach these figures I used the Census Bureau’s 2009 state population estimates.)
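For anyone who wants to check the arithmetic, here’s a minimal sketch of the calculation.  It assumes the estimates live in a CSV with “state” and “population” columns; the file name and column names are placeholders, not the Census Bureau’s actual format.

```python
# Sketch of the arithmetic above: share of U.S. population living in the
# n least populous states. Assumes a CSV of the Census Bureau's 2009 state
# population estimates; the file name and column names are placeholders.
import csv

def coalition_share(csv_path: str, n_states: int) -> float:
    """Fraction of the total population living in the n least populous states."""
    with open(csv_path, newline="") as f:
        populations = [int(row["population"]) for row in csv.DictReader(f)]
    populations.sort()
    return sum(populations[:n_states]) / sum(populations)

if __name__ == "__main__":
    path = "state_populations_2009.csv"  # hypothetical file of 50 state estimates
    print(f"Bare majority (26 states):     {coalition_share(path, 26):.0%}")
    print(f"Cloture coalition (30 states): {coalition_share(path, 30):.0%}")
```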

I don’t want to make any normative claim here.  The Senate wasn’t designed to reflect state population and when the nation was founded senators weren’t even directly elected.  I don’t have firm opinions about how I’d reform the Senate so my purpose here is merely descriptive.

That hypothetical majority – 52 Senators from the 26 least populous states – would consist of 23 Republicans, 27 Democrats and 2 Independents.

The map below shows the 26 least populous states in DARK GRAY, the next 4 least populous (needed to reach a 60 vote coalition) in LIGHT GRAY, and the 20 most populous in WHITE.

Least representative possible Senate majority

Of course, in the 60-vote Senate, blocking legislation is significantly easier than passing it.  As David Roberts of Grist notes, senators representing a mere 8.3% of the population could successfully filibuster legislation.

A journalistic ethic for the 21st century

I’ve been thinking a lot recently about how to preserve the best features of traditional journalism as media continues to be transformed in the online era.  At its best, journalism is defined by the ethic of its practitioners, though defining that ethic can be difficult.  Some might say it’s about objectivity – or even impartiality.  It’s skeptical; it’s independent; it’s “just the facts”.

Whatever it is, there’s a good argument to be made that it’s badly in disrepair, or perhaps even fundamentally ill-suited to the web.  NYU Professor Jay Rosen, for one, has mounted a sophisticated critique against what he dubs “the quest for innocence”, described as “the desire to be manifestly agenda-less and thus ‘prove’ in the way you describe things that journalism is not an ideological trade.”

Yet even if the journalistic ethic is flawed, that doesn’t mean there aren’t aspects worth preserving.  But what are they?  And how might they differ from the basic requirements of what we might call “intellectual integrity” that one would expect from good non-journalistic writing like that of an academic blogger?

I’d puzzled over this question for a while and, despite lots of tentative thoughts, hadn’t really come up with a good model or set of guidelines.  And then Dave Weigel got fired from the Washington Post, setting off debate over the role of the blogger/reporter.  Via Harvard’s Nieman Lab I came across this post by Jim Henley outlining the blog-reporter ethos:

* original reporting on first-hand sources
* a frankly stated point-of-view
* tempered by a scrupulous concern for fact
* an effort to include a fair account of differing perspectives
* ending in a willingness to plainly state conclusions about the subject

If this sounds familiar, that’s because it’s basically magazine-reporter ethos, says Henley.

I submit that this is just magazine-journalism ethos with the addition of cat pictures. If you think about what good long and short-form journalism looks like at a decent magazine, it looks like the bullet-points above…

…What blogging does is enable the magazine-journalism ethos to meet a frequent publication schedule – even more frequent than the newspaper’s traditional daily schedule.

I’m not a journalist, but this strikes me as exactly right.  Good magazine journalism contains a healthy dose of journalistic ethic without the worst excesses of the quest for innocence.

I have seldom read Weigel, but my preferred example of this blog-reporter ethos is Andy Revkin, author of the blog Dot Earth and former New York Times science reporter.  Here’s how Revkin describes his blogging:

I’ve spent a quarter century doing “conventional” journalism, and sought to create Dot Earth as an unconventional blog. It is not a spigot for my opinion. It is instead a journey that you’re invited to take with me. It is certainly not conventional journalism. To my mind, for most of the issues that will shape this century most profoundly, the old model of journalism is no longer a good fit.

And:

Lately, I’ve been describing the kind of inquiry I do on Dot Earth as providing a service akin to that of a mountain guide after an avalanche. Follow me and I can guarantee an honest search for a safe path. This is a big contrast from the dominant journalism paradigm of the last century, crystallized in Walter Cronkite’s “That’s the way it is” signoff.

As a regular Dot Earth reader, I’d argue that Revkin’s blogging is consistent with Henley’s vision.  The approach Henley outlines and – I believe – Revkin practices strikes me as the best of both worlds.

While Revkin states above that the blog is “not a spigot for my opinion,” it is not voiceless; and to the extent that it is opinionated, it isn’t opinionated in the manner of an op-ed column.  Freed from some of journalism’s more damaging constraints, it brings a perspective to bear – but one tempered by a journalistic ethic fit for the 21st century.

The vanishing media middle class

Pew published a study a while back titled New Media, Old Media on the differences between new/social media and the traditional press.  There are a lot of interesting takeaways, among them:

The BBC, CNN, the New York Times and the Washington Post accounted for fully 80% of all links [in the blogosphere].

But what about the long tail?  What about the internet fragmenting culture?  Does this data contradict those claims?  Not entirely.  The best framework I’m aware of for understanding this phenomenon comes from Matthew Hindman’s 2009 book The Myth of Digital Democracy (read Chapter 1 here).

Hindman, a political scientist at Arizona State University, writes about what he calls the “missing middle” which he describes as follows:

On the one hand, the news market in cyberspace seems even more concentrated on the top ten or twenty outlets than print media is.  On the other, the tiniest outlets have indeed earned a substantial portion of the total eyeballs…. It is the middle-class outlets that have seen relative decline in the online world.

The rich get richer; the poor get richer.  The middle class gets the squeeze.

The Pew study doesn’t necessarily confirm Hindman’s analysis – he has his own data – and I’m not in any position to judge his methods.  The point, however, is that the concentration of sources linked to in the blogosphere does not necessarily contradict claims of fragmentation or the long tail.  It merely complicates them.

Case Study: Why I don’t need to pay for news

I learned yesterday via Twitter that Republican Senator Lindsey Graham would not be supporting the Kerry-Lieberman climate & energy bill that he helped to craft.  Grist reporter David Roberts tweeted:

Profile in Courage: Lindsey Graham now says he’ll vote against #climatebill. Not enough offshore drilling left. http://bit.ly/bHraHO

The link, to National Journal, requires a subscription and offers only a teaser. Annoying.  But I got the news and surely would hear more soon.

And I did, this morning, when I checked my email and opened up the day’s Wonkbook, Ezra Klein’s morning policy roundup.  Here’s what Wonkbook gave me:

Citing changes to the offshore drilling provisions, Lindsey Graham says he’ll vote against the climate bill he helped write: http://bit.ly/bj4U7J

Nothing I didn’t know.  And still the gated link.  So I headed to my RSS reader.  Sure enough, Brad Plumer had a nice post on the subject.  Of course, he’s got a National Journal subscription, so in the course of the post he gives me three paragraphs from the original article, while adding analysis of his own.  And in case that wasn’t enough, the Washington Post has its own post, which links to the National Journal piece and borrows a few quotes from Graham.

I wrote previously about how hard it is for paywalls to compete with free content.  As long as this sort of quoting is legal, there’s just no need for me to pay for hard news.  Not only can other reporters cover the same story once it’s been broken, but bloggers can quote paragraphs at a time.  So if you want me to support your journalism you’re better off asking nicely.

The future of paying attention

Nicholas Carr has a short piece in The Wall Street Journal reiterating his argument that the web is “turning us into scattered and superficial thinkers.”  (More Carr here and here.)  Is the web uniquely full of “constant distractions and interruptions”?  Carr drives home his point with a comparison to another medium:

It is revealing, and distressing, to compare the cognitive effects of the Internet with those of an earlier information technology, the printed book. Whereas the Internet scatters our attention, the book focuses it. Unlike the screen, the page promotes contemplativeness.

Sure enough, one of the attributes of the book, as a technology, that I’ve lately come to appreciate is the focus it lends.  Being offline, and thus comparatively free of distraction, can be a technological benefit.

But in a companion WSJ piece, Clay Shirky takes issue with the comparison:

In the history of print, we got erotic novels 100 years before we got scientific journals, and complaints about distraction have been rampant; no less a beneficiary of the printing press than Martin Luther complained, “The multitude of books is a great evil. There is no measure of limit to this fever for writing.” Edgar Allan Poe, writing during another surge in publishing, concluded, “The enormous multiplication of books in every branch of knowledge is one of the greatest evils of this age; since it presents one of the most serious obstacles to the acquisition of correct information.”

The key, Shirky argues, is that we, as a society, learn how to make good use of new mediums.

The response to distraction, then as now, was social structure. Reading is an unnatural act; we are no more evolved to read books than we are to use computers. Literate societies become literate by investing extraordinary resources, every year, training children to read. Now it’s our turn to figure out what response we need to shape our use of digital tools.

This is probably not enough to satisfy Carr, who fears that we’ll trade in a superior set of habits and structures for an inferior one.  From the Carr piece:

[Developmental psychologist Patricia] Greenfield concluded that “every medium develops some cognitive skills at the expense of others.” Our growing use of screen-based media, she said, has strengthened visual-spatial intelligence, which can improve the ability to do jobs that involve keeping track of lots of simultaneous signals, like air traffic control. But that has been accompanied by “new weaknesses in higher-order cognitive processes,” including “abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination.” We’re becoming, in a word, shallower.

Carr’s prescription is less time online.  But I’m personally much more intrigued by the project of designing the habits, norms, technologies and social structures that allow us to maximize the benefits of the web.

This is what author and academic Howard Rheingold calls “infotention.” As he puts it:

Infotention is a word I came up with to describe the psycho-social-techno skill/tools we all need to find our way online today, a mind-machine combination of brain-powered attention skills with computer-powered information filters… Knowing what to pay attention to is a cognitive skill that steers and focuses the technical knowledge of how to find information worth your attention. More and more, knowing where to direct your attention involves a third element, together with your own attentional discipline and use of online power tools – other people.

Cognitive discipline + technology + a good network = the future of paying attention.  No one knows how well it will work, but few can deny the potential offered by the web.  Learning how to use it well is one of the greatest challenges we now face.

Framing your friendships

Two things are indisputably true of Tyler Cowen: he has an interesting mind, and he has an economist’s mind.

This struck me as I was reading Chapter 4 of Create Your Own Economy, titled ‘IM, Cell Phones, and Facebook’.  It’s a quirky (and occasionally funny) chapter about how our choice of communication platform impacts our communications.

On one level it’s the “medium is the message” thesis.  But, since he’s a behavioral economist, Cowen frames his argument as a competition between frames of reference.

We choose to send or receive messages in particular ways, in part, to determine which kinds of framing effects will influence our thoughts and emotions.  The greater the number of media we have to choose from, the more likely this process will suit our tastes.

Though economists often discuss framing effects in the context of bias or irrationality, Cowen focuses on the potential benefits of competition between mediums.  “Facebook,” he writes, “has made me friendlier… It is a framing effect that I have chosen to keep, and to my advantage.”

Framing effects may not be the simplest lens through which to view his basic point: greater choice in our communications is a good thing.  But it’s an interesting lens.  And an economist’s lens.

Tyler Cowen on cultural literacy

I’m reading Tyler Cowen’s book Create Your Own Economy and I’ll be posting thoughts and snippets as I go.  Here’s Cowen on the new cultural literacy:

What cultural literacy means today is not whether you can “read” all the symbols in a Rubens painting but whether you can operate an iPhone and other web-related technologies.  The iPhone, if used properly, can get you to a website on Rubens as well.  The question is not whether you know the classics but whether you are capable of assembling your own blend of cultural bits.  When viewed in this light, today’s young people are very culturally literate and in fact they are very often the cultural leaders and creators. (pg. 59)

I’m only a few chapters in, but I’m greatly enjoying the book.  And, of course, I can’t recommend Cowen’s blog, Marginal Revolution, highly enough.