Dec 23, 2010
 

My recent post on net neutrality unpacked a hypothetical offered by Reason magazine.  I laid out several reasons I thought their reasoning was flawed, and I want to briefly add another.  First, here’s the hypothetical:

If AT&T DSL blocked your access to Google because they wanted you to use Yahoo, what would you do? Probably cancel your plan and go to a provider that gives you easy access to your favorite sites.

When it comes to ISPs discriminating between services or content, blocking sites is a blunt instrument (and one the FCC has now disallowed).

Much trickier is discriminating via speed of service.  In the context of the hypothetical that means AT&T agreeing to make Yahoo arrive faster than Google or vice versa.

Even if you think blocked sites are enough to make users change ISPs, you might reasonably conclude they’re less likely to do so over this more subtle kind of discrimination.  Especially if they don’t even know if it’s going on.

Of course, the FCC is now requiring ISPs to “disclose what steps they take to manage their networks”.  Does Reason at least think that’s a good thing?  Or do they think consumers will sleuth out this kind of discrimination on their own and switch to neutral ISPs?

Dec 23, 2010
 

Happy Holidays, courtesy of someecards.com:
[someecards.com card: "I'm assuming Facebook will let Santa know what I want for Christmas"]

Seriously, though, I want to thank everyone who has taken the time to read this blog over the past year.  Especially those regular readers who’ve done me the favor of diligently reading every post.  The feedback you’ve provided has been invaluable.  Thank you.

Here’s one more someecard. I couldn’t resist:
[someecards.com card: "Happy Holidays to all my Facebook friends and the marketers accessing my private Facebook information"]

Dec 22, 2010
 

I’ve held off posting anything about Wikileaks, as the subject’s complexity is a little daunting.  I still don’t have any polished thoughts, but I’ll offer a few unpolished ones alongside some reading recommendations:

If you haven’t been following this story, check out The Beginner’s Guide to Wikileaks at The Atlantic.

One of the basic aspects of the story I was missing at first was the extent to which Wikileaks worked with media organizations and even governments to redact the documents and decide what to publish.  To learn more about that, read Glenn Greenwald here.

The one question that consistently hurt my ability to think clearly about this story was “Is Wikileaks good or bad?” or, put another way, “Should these cables have been published?”  So my advice is to put that aside for now and focus on a few other interesting aspects of the story, like…

Is Wikileaks a new kind of media organization or a new kind of source? The New York Times treats it as “a source, not a partner”, according to NYT executive editor Bill Keller.  An excellent summary of his comments on Wikileaks is available at the Nieman Lab.  For a different perspective try Matthew Ingram of GigaOM arguing “Like It or Not, WikiLeaks is a Media Organization”. NYT’s David Carr has thoughts here.

Another interesting line of inquiry looks at how governments can exert indirect control over organizations like Wikileaks in cases where they lack the ability to exert direct control.  Henry Farrell at Crooked Timber has a good post on this topic.

Another thing I’ve been pondering is what predispositions predict one’s opinion of Wikileaks.  This post by Tom Slee, which I found via both Clay Shirky and Crooked Timber, puts it this way:

Your answer to “what data should the government make public?” depends not so much on what you think about data, but what you think about the government.

I think he’s only partly right.  What you think about government matters tremendously.  But I wouldn’t downplay data.  I’m finding, in my reading and conversations, that what you think about Wikileaks also hinges on what you think about technology.  All else equal, if you’re bullish about technology’s prospects for improving the world, you’re more likely to approve of Wikileaks’ data dump.  Ditto if you’re already sympathetic to hacker culture.  Or if you generally view increased access to information as crucial to improving society.

Put these two items – thoughts on government and thoughts on technology – together and I think it explains much of the disconnect between the standard Washingtonian’s view of Wikileaks and the standard geek view.  The latter is dominated by a combination of liberals and libertarians, both of which are likely to harbor deep suspicions about the government’s handling of international affairs.  Add to that a predisposition towards technology – contrasted with a view where tech can cause as many problems as it solves – and a true disconnect is revealed.  For these two reasons, the geek world has a much stronger bias towards transparency than the beltway insider.

I don’t mean “bias” in a pejorative way, and certainly don’t mean to suggest that one or the other view is closer to being right.  My own sympathies in this case are all over the map.  But I’d love to test my theory.  How much power would questions about Iraq and waterboarding have in predicting sympathy to Wikileaks? I imagine quite a lot.  But what about one’s reaction to a statement like “information wants to be free”? I’d bet that has some predictive power as well.

In closing, in place of any master synthesis or confident opinion, I’ll simply link to Clay Shirky’s post on the topic, which I think lays out the issues nicely.

For more reading, The Atlantic has a terrific roundup of reactions here.  My Delicious links on Wikileaks are here.

Dec 21, 2010
 

Net neutrality is an issue where my bias (in favor) is so clear that I try to be extra careful not to stake out a position before I’ve thoroughly researched it.  For that reason I’m still not sure what I think about net neutrality even in principle, much less the FCC’s recently proposed rules.

So I was interested to see what points libertarian magazine Reason put forth in this anti-net neutrality video:

To start, I want to zero in on this line:

If AT&T DSL blocked your access to Google because they wanted you to use Yahoo, what would you do? Probably cancel your plan and go to a provider that gives you easy access to your favorite sites.

There are multiple things wrong with this statement.

First, this scenario completely ignores the widespread lack of competition between ISPs.  The consumer behavior the video describes only makes sense in the context of competitive internet service.  The libertarian response is that the remedy is to reform telecom regulation to foster competition.  But if that’s really your argument, you have to make some reference to it instead of misleading viewers into thinking that competition already exists.  In other words, the fact that the scenario Reason describes is only even possible if significant reforms pass first seems relevant.

Ok, now put that aside for a moment, and imagine enough competition existed for this kind of consumer behavior to be possible.  Is it plausible?  The video uses a nice trick to make us think it is: appealing to our universal love of Google.  So let’s flip it around and try that on for size:

If AT&T DSL blocked your access to Yahoo because they wanted you to use Google, what would you do?  Off the top of my head my answer is “Probably nothing.”

The third misleading thing about that example is the choice between incumbents.  Yahoo or Google?  If you tell me I can’t use Google, I know enough about how much I love it to seek out another provider.

But what about Google vs. the next great search company?  If Google can reach into its deep pockets to ensure its searches are delivered faster, that makes it a lot harder for an emerging company/technology to compete for market share.  New entrants not only lack the deep pockets to pay ISPs, they lack the name recognition required to convince consumers to seek out neutral ISPs.  Even if I would switch ISPs to make sure Google isn’t disadvantaged relative to Yahoo, would I actively switch from an ISP that equally prioritized incumbents in order to access new entrants I’d never heard of?

I don’t consider any of these to be case-closed arguments in favor of net neutrality.  But if these are the best arguments against net neutrality, they’re fairly weak.  The libertarian trump card, played at the end of the video, is unforeseen consequences, and I take that point seriously.  That’s one of the reasons I’ve not yet reached a firm position on net neutrality.

But while I may not have a stance on the issue, I do have a starting point, and it’s this: something incredibly important is at stake here.  Both at the software layer and the content layer, we are seeing the rise of a fascinating model of information production.  I’ll once again defer on defining that model, except to say that it is significantly non-commercial.  Non-commercial production is threatened by a non-neutral net, even more than emerging commercial entities are.  If you care about preserving that non-commercial aspect, if only to learn more about it and to see its full potential, you should care a lot about net neutrality.

Dec 17, 2010
 

Growing up I remember my parents subscribing to three magazines: The New Yorker, The Atlantic and Harper’s.  Over the past couple of weeks I’ve seen updates on how each is faring online.  Results vary.  A lot.

Let’s start with the worst…

Harper’s

It wasn’t all that long ago that I was considering subscribing to Harper’s.  They have some great essays, including an excellent nonfiction piece from August: Happiness Is a Worn Gun.  But if this week’s column by the magazine’s publisher is any indication, Harper’s holds the web in outright contempt.  Publisher John MacArthur describes his web strategy as “protectionist,” and that says it all.  MacArthur is impervious to the recommendations of “internet hucksters”.  Moreover, he’s “offended” by “the online sensibility”.

The internet’s impact on our politics is a controversial topic.  There is a diverse range of respectable views on the subject.  But it’s clear MacArthur has no interest in pro-internet arguments, no matter their merit or credentials.  He prefers to dismiss them, just like he dismissed email.  And we all know how that turned out… No joke, this is the argument.  You can’t make this up.

With a publisher like this at the helm, I’m not optimistic about Harper’s chances.  And that’s a real shame.

The New Yorker

The New Yorker is my boring, middle-of-the-road example here.  It’s not a natural web innovator, but it hasn’t rolled over either.  They’ve got a robust blogs section.  And they recently redesigned their site.  It feels more web-like, yet preserves much of the classic New Yorker look.

The Atlantic

Full disclosure: The Atlantic was my favorite of the three growing up.  And it still is.  These days a lot of that is due to its excellence online.  The magazine has assembled a top-notch team of bloggers, built around the celebrity of Andrew Sullivan.  And just months ago it launched a terrific Technology channel, which quickly became my favorite source of tech news and analysis.  On top of that, they’ve parlayed it all into profit.

Online matters

I could go on at length about all the reasons online matters, but for now I’ll address only how it impacts my consumption and support of these magazines.  I can say with confidence that so long as John MacArthur is publisher of Harper’s I will never subscribe.  I was given a subscription to The New Yorker as a gift, and I’m really enjoying it.  But would I purchase it on my own?  Doubtful.  In any given week I read maybe two New Yorker pieces.  But The Atlantic?  It’s increasingly a go-to source for me.  I haven’t subscribed to the magazine, but that’s more a reflection of how I read the news.  If they ever offered a donation model, I’d happily support them.  Not that they’d need it, given their success with advertising revenue.

Dec 16, 2010
 

Several weeks ago Steven Johnson took to the op-ed page of The New York Times to defend his excellent new book on innovation and to declare “I am not a Communist.”  The question of possible communist sympathies was raised, apparently, on his book tour, in reference to his support of what he dubs “fourth quadrant” innovation.  The “fourth quadrant” refers to innovations produced by networked non-market actors, a category that includes open-source software, among other things, and one Johnson argues has an unparalleled track record in fostering breakthroughs.

Does that make him a communist?  He doesn’t think so:

the problem is that we don’t have a word that does justice to those of us who believe in the generative power of the fourth quadrant… The choice shouldn’t be between decentralized markets and command-and-control states.

And he’s right.  The rise of the web has exposed the market-state dichotomy as transparently inadequate.  Projects like Linux and Wikipedia hint at the emergence of a very different model of economic organization, one that seemingly fits neither category.

It is a model we are only beginning to understand, and yet in many ways it challenges some of our core beliefs about how to organize a society.  In the contest between markets and central planning, the market has been largely (and largely justifiably) ascendant.  Yet the lessons of its ascendancy are subtly and not-so-subtly contradicted by the ways in which we organize, communicate and produce information online.

To understand how, we have to temporarily return to the battle between market and state.  In The Future of Ideas Lawrence Lessig writes:

Over the past hundred years, much of the heat in political argument has been about which system for controlling resources – the state or the market – works best.  That war is over.  For most resources, most of the time, the market trumps the state.  There are exceptions, of course, and dissenters still.  But if the twentieth century taught us one lesson, it is the dominance of private over state ordering.*

Why?  That is, of course, a question fit for a lifetime of inquiry.  But let me take a stab at summing it up: because humans are selfish and stupid.

Motivation

Markets motivate us by aligning incentives.  We are more likely to exert effort when doing so directly benefits us.  A considerable portion of social science revolves around this tenet, which might be expressed in shorthand as “most of us are self-interested most of the time.”  We often simplify even further by treating selfishness as profit maximization.  As Harvard’s Yochai Benkler explains it in his masterpiece The Wealth of Networks:

Much of economics achieves analytic tractability by adopting a very simple model of human motivation… Adding more of something people want, like money, to any given interaction will, all things considered, make that interaction more desirable to rational people.  While simplistic, this highly tractable model of human motivation has enabled policy prescriptions that have proven far more productive than prescriptions that depended on other models of human motivation — such as assuming that benign administrators will be motivated to serve their people, or that individuals will undertake self-sacrifice for the good of the nation or the commune. (pg. 92)

Information

Markets prevail over central planning in large part due to the stupidity (read: cognitive constraints) of central planners.  We can only gather and process so much information, which means our actions have unforeseen consequences, the future is hard to predict, and so on.  Here I’ll lean on Cass Sunstein channeling Hayek in his book Infotopia:

Hayek claims that the great advantage of prices is that they aggregate both the information and the tastes of numerous people, incorporating far more material than could possibly be assembled by any central planner or board… For Hayek, the key economics question is how to incorporate that unorganized and dispersed knowledge.  That problem cannot possibly be solved by any particular person or board.  Central planners cannot have access to all of the knowledge held by particular people.  Taken as a whole, the knowledge held by those people is far greater than that held by even the most well-chosen experts. (pg. 119)

Similarly, in his 1977 book “Politics and Markets”, political scientist Charles Lindblom describes the “key difference” between markets and central planning as “the role of intellect in social organization” with “on the one side, a confident distinctive view of man using his intelligence in social organization [central planning]; on the other side, a skeptical view of his capacity [markets].” (pg. 248)

The Networked Information Economy

At the macro level markets continue to maintain these advantages over planning.  But is there another game in town?  What we see on the web challenges us to at least reconsider the unassailability of markets, both with respect to motivation and information.  Asks Benkler:

Why can fifty thousand volunteers successfully coauthor Wikipedia… and then turn around and give it away for free?  Why do 4.5 million volunteers contribute their leftover computer cycles to create the most powerful supercomputer on Earth, SETI@Home?

Econ 101 has a hard time answering.  The high-profile success of these and other projects forces us to remember that the simplistic model of human motivation, central as it is to our faith in markets, was never universally true.  Further, these successes invite us to revisit the usefulness of such an assumption, and to strive for a more complete model of human motivation.  We create and produce for any number of reasons beyond profit, including altruism, status, or even – in a world of low transaction costs – boredom.

Just as the market’s claim to dominance in motivating us is starting to be challenged, some are revisiting its dominance in aggregating information.  Sunstein explores the subject in Infotopia and highlights increasing efforts to aggregate human preferences online, including at Amazon and Netflix.  If it’s obvious that we are doing better and better at aggregating information thanks to the Net, it’s less obvious how this might challenge the role of the market.

Imagine that Netflix has a small, fixed number of copies of a rare movie to rent, and that it’s in high demand.  Who should get it first?  Auction the privilege off to the highest bidder, responds the free market advocate.  And, particularly in a scenario where customers have equal wealth at their disposal, this method has a lot to recommend it.  The market is incredibly efficient at allocating resources under ideal conditions.  Tremendous gains in human welfare have been predicated on this fact.  But Netflix is developing sophisticated algorithms that use your preferences for movies you’ve seen to predict which movies you’ll like.  Is it so hard to believe that some day an algorithm could – given the aim of maximizing viewer enjoyment – “beat the market” in determining how to distribute the movie?
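To make that contrast concrete, here’s a minimal toy sketch in Python.  Everything in it (the viewer names, the bids, the predicted enjoyment scores) is made up for illustration; it is not a claim about how Netflix actually works.  It simply ranks the same viewers two ways: by willingness to pay, and by predicted enjoyment.

```python
# Toy illustration only: a handful of hypothetical viewers competing for
# three copies of a scarce movie. None of this reflects any real Netflix system.

COPIES = 3

viewers = {
    # name: (bid in dollars, predicted enjoyment on a 0-10 scale)
    "alice": (12.00, 4.1),
    "bob":   (3.50, 9.3),
    "carol": (9.00, 8.7),
    "dave":  (1.00, 9.8),
    "erin":  (6.00, 2.2),
}

def allocate(copies, score):
    """Hand the copies to whichever viewers rank highest under `score`."""
    ranked = sorted(viewers, key=lambda name: score(viewers[name]), reverse=True)
    return ranked[:copies]

# Market-style allocation: the highest bidders get the copies.
auction_winners = allocate(COPIES, score=lambda v: v[0])

# Algorithmic allocation: maximize predicted enjoyment instead.
enjoyment_winners = allocate(COPIES, score=lambda v: v[1])

print("auction winners:  ", auction_winners)    # ['alice', 'carol', 'erin']
print("enjoyment winners:", enjoyment_winners)  # ['dave', 'bob', 'carol']
```

The point isn’t that the second ranking is better; it’s that once preferences can be predicted with any accuracy, ranking by predicted enjoyment becomes a coherent alternative to ranking by willingness to pay.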

We are undoubtedly in the early stages of understanding what motivates us to collaborate online (and off), and probably even less far along in our efforts to manage and make useful the wealth of information online, including identifying and aggregating our preferences.  I’ve been purposefully vague here in describing the new model I’m discussing.  Better defining that model will be the topic of a future post.

My argument here is simply that our increasingly connected world – what Benkler calls the “networked information economy” – invites us to question some of the most basic premises that have led us to organize our society around the market.  It would be foolish to let those premises, and the new models that challenge them, go unexamined.

*Lessig is explicit that he is talking about consumption, not production.  It’s a useful distinction; however, the two are more related than he seems to admit in this instance.
**In borrowing Lessig’s words here I don’t mean to subscribe to any aggressively free-market worldview.  The choice between a centrally planned economy and a mostly privately organized one may be settled.  The battle over where to draw the lines in the mixed economy rages on.
Dec 5, 2010
 

I honestly couldn’t tell you what the home pages for several of my favorite websites look like.  I do a lot of my reading through an RSS reader, which delivers new content from the blogs and news feeds to which I subscribe.  When I do visit the actual sites it’s almost exclusively through links from Twitter, Facebook, Gchat, email, etc., which direct me to specific pieces of content.  I almost never head over to a site’s homepage to see what’s promoted there.

But the home page still matters.  A lot.  That’s my takeaway from Gawker founder Nick Denton’s description of the site’s redesign.  The redesign “represents an evolution of the very blog form that has transformed online media over the last eight years,” according to Denton.  And one of the central changes is replacing the reverse-chronological content display of the typical blog with “one visually appealing ‘splash’ story, typically built around compelling video or other widescreen imagery and run in full.”

There’s a lot more in the post, including thoughts on the merits of video and of scoops over aggregation.  But more than anything, I was surprised by the enduring emphasis on the home page, even in the age of Twitter.  Maybe Beyond the Times oughta mix up the home page a bit?

(If you don’t have time to read the whole Denton post, The Atlantic’s Alexis Madrigal has a Gawker-esque top 5 takeaways post.)
