Product & Startup Builder

Filtering by Category: Media

Building in someone else's yard

Added on by Chris Saad.

Loic Le Meur writes over on LinkedIn about his mistakes betting on Twitter with his company Seesmic. Seesmic produced a series of great Twitter clients for multiple platforms (mobile, web, desktop, etc.). When Twitter started shutting down developers and releasing its own official clients, Seesmic's business was undermined and the company was ultimately shuttered.

I'm not blaming Twitter for this strategic change – they did not know they would take that decision at the time when they were fully supporting their ecosystem. I blame myself entirely. I should have never dedicated all my team resources to build on one platform. That is a lesson learned the hard way along with many other developers. I was too excited and became blind.
...
Here are my two cents for entrepreneurs betting on someone else's success: be careful that everything can change from one day to another and all the rules will change. I will never be that dependent on anyone anymore.

 

Loic is a wicked smart and very successful entrepreneur. He's always smiling, generous and well liked by his peers. It's a real shame that Twitter pivoted in the way that it did to undermine his business.

I'd like to refine Loic's lessons learned a little here, though. In my opinion the problem was not betting on someone else's platform but rather...

  1. Betting on Twitter, which is not a platform but a media company
  2. Betting on one media company rather than several

Whenever a company makes money from ads, it's not a platform/technology company - it's a media company. As a media company, it needs to control the eyeballs so that it can control the ad impressions.

To be fair, though, Twitter's ad revenue model wasn't in place when Loic started betting on them. It was clear, however, that their revenue model was still in flux and that ads would play a role in order to keep the service free for end-users.

The reality is companies successfully rely on other platforms all the time. Amazon Web Services is a great example of this. There's never a risk that AWS is going to start turning off or competing with its developers because it is a true platform.

Like AWS, Echo is a true platform. We make our money by encouraging developers to build world class apps on our platform and we even help them sell those apps to major customers.

Facebook, Twitter, etc. were never true technology platforms. They are distribution channels. They are data sources. They are social services. But they are not platforms.

Ironically, this is still happening today. Major media companies and developers still spend enormous sums of money encouraging their users to participate on Twitter and Facebook as 'outsourced engagement platforms'. Media companies, who should understand the value of owning the audience and the ad impressions, are happily outsourcing them to competing media companies (Facebook and Twitter). I write more about this over on the Echo blog.

The key, then, is not to avoid 3rd party platforms, but rather to understand the difference between platforms, products, services and media companies. You need to understand the incentives, revenue flows and business models so you can align your company and product with the value chain.

 

Dark Social and Facebook+

Added on by Chris Saad.

Working with large brands at Echo is thrilling. They have the content, products and reach that matter in everyday people's lives. This means that even small improvements in their Real-time, Social strategy result in big impacts on large groups of people. One of the prevailing misconceptions we find when we first get started with a new customer, however, is that Facebook is Social. Facebook Comments, Facebook Likes and Facebook Fan Pages are often seen as the beginning and the end of the social 'strategy'.

For as long as I can remember, my career has been about helping others see that Facebook (or Myspace or AOL, etc.) can only ever be one part of the larger web and Internet landscape. The percentage fluctuates, of course, but it is never 100%.

A new article in The Atlantic this week, however, reminds us that not only is Facebook only a fraction of the overall web (in terms of traffic referrers and participation) but also that it's not even the biggest fraction. It also reminds us that while modern social networking has introduced many powerful novelties, being social on the internet is far from a new phenomenon. In fact, it has been a pervasive part of internet interactions since the beginning - think email and instant messaging, for example. These 'old' tools continue to have a huge (in fact the largest) impact on your referrer traffic and engagement.

This engagement, however, is under-measured and not well understood. The Atlantic postulates that it appears in web analytics as referrer-less visits to pages other than the home page or section fronts - the assumption being that direct traffic to deep links can only come from people sharing links with one another using tools that don't leave referrer signatures. So The Atlantic has taken to calling this class of traffic 'Dark Social'.
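
To make the heuristic concrete, here is a minimal sketch of how a site might bucket its own pageviews along these lines. It assumes you have the referrer and landing path for each visit; the section fronts, social domains and field names are illustrative examples, not The Atlantic's actual methodology.

```python
from urllib.parse import urlparse

# A minimal sketch of the heuristic described above: a visit with no referrer
# that lands on a deep link (not the home page or a section front) is assumed
# to come from email, IM or other tools that strip the referrer.
# The section fronts and social domains below are illustrative examples.

SECTION_FRONTS = {"/", "/politics/", "/technology/"}
SOCIAL_HOSTS = ("facebook.com", "twitter.com", "t.co")

def classify_visit(referrer: str, landing_path: str) -> str:
    """Bucket a single pageview for a referral-traffic report."""
    if referrer:
        host = urlparse(referrer).netloc.lower()
        if any(host.endswith(social) for social in SOCIAL_HOSTS):
            return "standard social"
        return "other referral"
    if landing_path in SECTION_FRONTS:
        return "direct"          # typed-in or bookmarked front pages
    return "dark social"         # referrer-less traffic to a deep link

print(classify_visit("", "/archive/2012/10/dark-social/263523/"))  # -> dark social
```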

Below is a chart of their referral traffic as measured by Chartbeat. Most notably, they have labeled the relevant traffic as 'Dark Social' on the chart.

This chart clearly shows that, for The Atlantic, Dark Social and non-Facebook 'Standard Social' together account for almost 80% of all referral traffic.

In this light it is obvious that what’s needed is a ‘Facebook+’ strategy. Or better put, a strategy that puts your website at the center, with Mobile + Desktops + Facebook + Twitter + Reddit + Digg + StumbleUpon + Dark Social + many others as link distribution pipes.

This means that for maximum coverage and distribution, every login, sharing, commenting, following, notification, trending surface can’t just be a Facebook widget. You need white label Social Software Infrastructure that connects your audience to your site using the tools, technologies and distribution opportunities of the entire web.

The web has always been, and will always continue to be the platform. Social or otherwise.

The Open Web Is Dead - Long live the Open Web

Added on by Chris Saad.

Yesterday Robert Scoble once again declared that the Open Web was dead. His argument was that Apps and proprietary black holes like Facebook are absorbing all the light (read: users, attention, value, investment) and taking our beloved open platform right along with it. In his post, he kindly (but incorrectly) named me as the only person who really cares about the Open Web. While that's flattering, I think he's wrong about me being the only one who cares.

But he is right about the Open Web. It's in real danger. URLs are fading into the background, native mobile apps are all the rage and Facebook threatens to engulf the web into a proprietary black hole.

But I think there's a bigger problem going on right now. Not just with the web, but with Silicon Valley (as stewards of the web). We've lost sight of the things that matter. We're obsessed with quick wins, easily digestible VC pitches, stock options and flipping for a Ferrari.

There's more to this game than that. Let me touch on some of the things I see going on.

  1. Lead, not just cheerlead. In our obsession with being seen by our micro-audiences as 'thought leaders' or 'futurists', it's always very tempting to watch which way the wind is blowing and shout loudly that THERE is the future. Like a weather vane, it's easy to point the way the wind is blowing, but our biggest, best opportunity is not to declare a popular service 'the next big thing' just because a few visible people are hanging out there. Rather, our collective and individual responsibility is to help articulate a direction we think moves the state of the art forward for both the web and for society at large. Something, as leaders of this field, we believe in. Just like VCs develop an investment thesis, we should all have a vision for where the web is going (and how it should get there) and actively seek out, support and promote quiet heroes who are building something that moves the needle in the right direction.
  2. Add to the web's DNA. Almost every startup I see today is focused on building an 'App' and calling it a 'Platform'. Too often (almost every time), though, these apps are nothing more than proprietary, incremental and niche attempts at making a quick buck. We need more companies to think deeper. Think longer term. What are you doing to change the fabric of the web's DNA forever? How can you contribute to the very essence of the Internet the same way that TCP/IP, HTTP, HTML, JS and so many other technologies have done? Even proprietary technologies have provided valuable evolutions forward - things like Flash and yes, even FB. How are you going to live forever? This is why Facebook used to call itself a 'Social Utility' instead of a 'Social Network'. Mark Zuckerberg was never content to be the next Myspace Tom. He wanted to be the next Alexander Graham Bell. And now he is.
  3. Don't just iterate, innovate. Of course, someone has to build apps. We can't all be working at the infrastructure layer. But too many of the apps we choose to build (or champion) are incremental. As startup founders, investors and influencers, it's so easy to understand something that can be described as the 'Flipboard of Monkeys' instead of thinking really hard about how a completely new idea might fit into the future. Sure, there are plenty of good business and marketing reasons why you shouldn't stray too far from the beaten path, broadening it one incremental feature at a time, but the core essence of what you're working on can't be yet another turn of a very tired wheel. If you're shouting 'Me too' then you're probably not thinking big enough.
  4. B2C, not Ego2C. Silicon Valley is clearly a B2C town. We all love the sexy new app that our mother might eventually understand. Something we can get millions of users to use so we can show them lots of ads. Besides the fact that I think we should focus a little more on B2B, the problem is we're not really a B2C town at all. We're actually more focused on what I will call Ego2C. That is, we pick our favorite apps based on how famous the founding team is OR how easily we can use the app to build yet another niche audience for ourselves (and brands/marketers). It would be a tragedy if the social web revolution boils down to new methods of PR and marketing. But that's what we seem to be obsessed with. As soon as any app from a famous founder gets released we give it tons of buzz while plenty of more deserving projects get barely a squeak. If the app gets a little traction (typically the ones that have Ego mechanics baked in) you see a million posts about how marketers can exploit it. Inevitably the app developers start to focus on how to 'increase social coefficients' instead of how to help human beings make a connection or find utility in their lives.
  5. "Users don't care" Speaking more specifically about the Open vs. Closed debate, too often we hear the criticism "Users don't care about open". This is absolutely true and the reason why most open efforts fail. Users don't care about open. They care about utility and choice. This is why the only way to continue propagating the open web is to work with BUSINESS. B2B. Startups, Media Brands, The bigco Tech companies. They care about open because the proprietary winners are kicking the losers ass and that usually means there are at least 1 or more other guys who need a competitive advantage. They need to team up and build, deploy and popularize the open alternative.  That's why open always wins. There's always plenty of losers around who are going to commoditize the popular closed thing. As technology leaders we're paid to care about things users don't care about. Things that shape the future. While users, in the short term, might not care, we should dare to think and dream a little bigger. As a case study look at Android vs. iOS. iOS is more profitable for a single company, but the other is now a force of nature.
  6. Death is just a stage of life. Just because something is no longer interesting doesn't mean it's dead. Its spirit, and oftentimes the actual technology, lives on, one layer below the surface. RSS is a great example of this. RSS's spirit lives on in ActivityStreams and the general publish/subscribe model. It is powering almost every service-to-service interaction you currently enjoy. Is it dead, or has it simply become part of the DNA of the Internet? Could RSS (or something like it) be better exposed higher up in the stack? Absolutely, but that will take some time, thoughtful execution and influencers who are willing to champion the cause. The same is true for OpenID and OAuth.
  7. The arc of the universe is long, but it bends towards Open. The battle of Open vs. Closed is not a zero-sum game. Both have their time. It's a sine wave. First, closed, proprietary solutions come to define a new way of fulfilling a use case and doing business. They solve a problem simply and elegantly and blaze a path to market awareness, acceptance and commercialization. Open, however, always follows. Whether it's a year, a decade or a century, Open. Always. Wins. The only question is how long, as an industry, we are going to keep our tails tucked between our legs in front of the great giant proprietary platform of the moment, or whether we are going to get our act together to ensure the "Time to Open" is as short as possible. It takes courage, co-ordination and vision, but we can all play our part to shorten the time frame between the invention of a proprietary app and the absorption of that value into the open web platform.
  8. Acknowledge reality. FB has won. It's done. Just like Microsoft won the Desktop OS (in part handed to them by IBM), so too has FB won the Social OS (in part handed to them by Microsoft). For now. Acknowledging the truth is the first step to changing it. The only question now is how long we're all willing to wait until we get our act together to turn the proprietary innovation of the 'social graph' into part of the open web's core DNA. We need to recognize our power. They have ~1B users? The open web has more. Chances are that the major website or brand you work for has plenty of its own users as well. Are you going to send them to FB, or are you going to invest in your own .com? Trust me, I know it's really, really easy to take what you're given because you're too busy putting out a million fires. But as technology leaders I challenge us all to build something better. We're the only ones who can.
  9. [Edit] Don't kill Hollywood. Did you catch the YC post calling for Silicon Valley to kill Hollywood? Not only was this reckless and short-sighted, it's the exact opposite of what we should be doing. Instead of trying to kill or cannibalize media companies and content creators, how about we work with them to create the next generation of information technology? They have the audiences + information and we have the technology. Instead, most Silicon Valley companies, by virtue of their B2C focus, are too busy leeching off major media instead of finding ways to help transform it. Sure, most of them move slowly - but move they do. Move they must. Helping them is very profitable. I write more about this on the Echo blog - calling it 'Real-time Storytelling'.
  10. [Edit] Today's data portability problem. When I started the DataPortability project the issue of the time was personal data portability. That's not the case anymore. While user-centric data portability is still being done via proprietary mechanisms, it's a) actually possible and b) moving more towards open standards every day. The real issue right now is firehoses. Access to broad corpuses of data so that 3rd parties can innovate is only possible through firehoses (for now). To put it another way, the reason Google was possible was because the open web was crawl-able - for free - with no biz dev deal. The reason FB was possible was because the open web allowed any site to spring up and do what it wanted to do. Today, too much of our data is locked up in closed repositories that can and must be cracked open. Google's move to exclude other socnets (besides G+) from its search results until it has free and clear access to them might be inconvenient for users in the short term, but, as a strategic forcing function, it is in the best interest of the open web long term.

End of rant.

Analysis of F8, Timeline, Ticker and Open Graph

Added on by Chris Saad.

So at F8 last week Facebook announced Ticker, Timeline and extensions to the Open Graph API to allow for new verbs and nouns. Here's what really happened.

  • They split their single 'News Feed' into 3 levels of filtering: Now (Ticker), Relevant (News Feed) and Historical (Timeline). (Side note: we've had a 'Ticker'-style product at Echo that we called 'Community Stream' for a long time now - and most of our customers and partners said to us, 'Why would we want to show all that data? It's just noisy.' Maybe now they will take a second look.) Question: Will G+, Twitter and the REST of the web adopt the same model? They should.
  • This allows FB to collect more 'noise' (also known as synaptic firings or Attention data) which, in turn, allows them to find more signal (also known as synaptic inferences or attention management). I've long said that the answer to information overload is not LESS information - it's MORE. The more information you have, the more ability you have to find patterns and surface them in relevant places (I said it so long ago I can't even find the link). Question: Will independent websites think to collect their OWN Attention data BEFORE sending it to FB so they can leverage it for their own purposes? The value of this data is incalculable.
  • Having these new presentation metaphors in place, they then created a mechanism to collect more data in the form of expanded Verbs and Nouns in the Open Graph API. With this new API, users are now expected to abandon explicit gestures of sharing and instead accept that every action they take is auto-shared to their friends. Question: When will the first horror stories start coming out about engagement ring purchases, personal health issues and sexual orientations being inappropriately revealed due to auto-sharing?
  • Using all the bling of the Timeline, along with new messaging and a simple little opt-in toggle of 'Add to my timeline', they managed to re-launch 'Beacon' without anyone noticing (none of the tech blogs I saw even mentioned it). Question: Why did none of the tech media cover that angle of the story?

I continue to be in awe of Facebook's scale, seriousness, ambition and momentum. There has never been anything like it before.

They have created an Attention Management Platform that rivals Google Search and easily outclasses many of my best ideas about Attention Management and Personal Relevancy from back when I was thinking about the problem.

It's breathtaking.

And since it is all done with hard links to a single proprietary hub, it is eating the web like a cancer.

Before F8 it was clear that Google+ was 1 or 2 years behind FB. Now they are 3 or 4.

Only time will tell who, how and why more open systems will begin to reassert themselves in the ecosystem. My bet is that it won't come from a B2C copycat, though. It will come from a well organized, commercially incentivized B2B play.

The part that still confuses me, though, is why ANY serious media company would want its news to load in a 'FB canvas app' instead of its own website. It makes zero sense. None of this changes the reality that you need to own your own data and your own point source. I made a little comparison table earlier in the week that explains why.

WSJ Outsources its business to Facebook

Added on by Chris Saad.

Today WSJ announced that it has built a news publishing platform that lives inside Facebook - effectively outsourcing its core website to the social networking giant. The number of reasons this is a bad idea is staggering. I've tried to summarize them in a spreadsheet comparing a FB approach versus an Open Web approach.

Please feel free to contribute.

Real Names getting Real Attention

Added on by Chris Saad.

There's a lot of fury on the web right now about 'Real Names'. FB is trying to use it as a unique feature of their comments system, claiming it reduces trolling and low-value comments. Of course that isn't really true. For one, any commenting system could force FB login. For another, users will troll with or without their name attached and, worse yet, many legitimate users won't participate for any number of reasons if they can't use a pseudonym. There are plenty of better ways to increase quality in your comments, including participation from the content creators, game mechanics, community moderation and more.

The real debate, however, is about G+ trying to copy FB's stance on Real Names. They are insisting all user accounts use them and are actively shutting down accounts that violate the policy. They are being so heavy-handed about it that even people who ARE using their real name are getting notices of violation - most notably Violet Blue.

I'm not really an expert on pseudonyms, shared contexts and anonymity so I'm going to stay out of this debate.

The real question for me, however, is what Google's strategic business reason for this policy is. There must be a long-term plan/reason for it; otherwise they wouldn't be insisting so hard.

My assumption is that it's related to their intention to become a canonical people directory and identity provider on the internet to compete with FB in this space.

FB, after all, does not just get its power from news feeds and photo apps - it gets it from the deep roots it has laid down into the DNA of the internet as the provider of first-class identity infrastructure and identity information.

In this sense, FB's social contract has served them very well, and Google's attempt to copy it is a hint that they understand FB is not just a .com feature set, but a powerful identity utility. They must understand (and in some cases seem to understand) that strategy and its aggressiveness if they are to properly compete with the monopoly. My only hope, however, is that they are coming up with their own inspired counter-strategy rather than just copying the moves they see on the surface - because that's doomed to fail.

What is 'Real-time as a Service'?

Added on by Chris Saad.

First, to define 'Real-time': Real-time means no CDN or cache latency. When there is new data in the database, it's immediately available to the end-user.

Real-time is not needing to hit the refresh button to see new information. It's when information folds into the page while you're reading it.

Real-time is a new volume and velocity of data. A lot of web data used to consist of 'Blog Posts' or 'News Articles'. Documents. Real-time web data is about activities. Granular, human-readable micro-stories about the actions that users take.

"I read this", "I rated this", "I commented on this", "I shared this", "I edited this" and so on. Why? Because capturing, surfacing and socializing real-time activity data is part of the core essence of the social web. The ability to see not just the result of actions by users, but the play-by-play stream of those actions along side faces, names and time/date stamps takes an experience from a static 'snapshot' into a living, breathing stream. Further, by enabling users to like, reply, flag, share and otherwise interact with these activities, sites are creating new opportunities for engagement, conversation and conversion.

Real-time is a presentation metaphor. It often (but not always) takes the form of a reverse chronological stream with nested comments and likes. It helps users understand the order of things and mixes content with conversation in a way that drives engagement and return visits.

Real-time means filters instead of facts. Let the user decide what they want to see - to craft an experience that makes sense for them, and their friends.

Now, what is 'Real-time as a Service'?

If all the things above are true, then it changes everything we used to know about web infrastructure, databases, user interfaces and tools for moderation or curation.

APIs can no longer be request-response. Databases must now store far more data at far faster rates. User interfaces need to factor in names, faces and actions. Moderation and curation tools must leverage algorithms, crowd sourcing and real-time flows.

Real-time as a service, then, is cloud infrastructure that helps make this transition easier.

It is a database that can handle new magnitudes of scale - handling hundreds or thousands of write events per second. Not just to a flat table, but to a hierarchical tree of arbitrary activities.

Site -> Section -> Article -> Rating -> Comment -> Reply -> Like.
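
A toy sketch of that hierarchy, to show the shape of the data rather than any particular implementation (the node type and fields here are hypothetical):

```python
from dataclasses import dataclass, field

# A toy node type for the Site -> Section -> Article -> Comment -> Reply -> Like
# hierarchy. A real store would persist and index this; the class only
# illustrates the shape of the tree.
@dataclass
class Node:
    kind: str                       # "site", "section", "article", "comment", "like", ...
    payload: dict = field(default_factory=dict)
    children: list["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

site = Node("site", {"url": "https://example.com"})
article = site.add(Node("section", {"name": "news"})).add(Node("article", {"id": "story-123"}))
comment = article.add(Node("comment", {"text": "Great article!"}))
comment.add(Node("like", {"user": "user:42"}))
```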

It's a database that can store all items permanently so that users can visit old streams at any time. Permanent storage that can also handle localized annotations. Localized annotations are the ability to modify the metadata of an activity - say a Tweet (promote it, tag it, retarget it in the tree, etc.) - in such a way that your view of a tweet is different from another customer's view.

It's a database that enables not just the ability to perform an SQL-like search query, but also to receive continuous updates when the data changes - so that you can modify the UI on the fly.
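
A minimal in-memory sketch of that idea follows. Real systems would do this over a message bus or a push connection; the class and method names below are made up for illustration.

```python
from typing import Callable

# A toy, in-memory illustration of a 'live' query: instead of returning a
# one-off result set, the store remembers the query and pushes matching
# activities to a callback as they are written. Names are made up.
class LiveStore:
    def __init__(self) -> None:
        self.items: list[dict] = []
        self.live_queries: list[tuple[Callable[[dict], bool], Callable[[dict], None]]] = []

    def search(self, matches, on_update):
        """Return current matches AND keep pushing future ones to on_update."""
        self.live_queries.append((matches, on_update))
        return [item for item in self.items if matches(item)]

    def write(self, item: dict) -> None:
        self.items.append(item)
        for matches, on_update in self.live_queries:
            if matches(item):
                on_update(item)   # where a UI would fold the new row into the page

store = LiveStore()
store.search(lambda a: a.get("verb") == "commented", lambda a: print("new comment:", a))
store.write({"verb": "commented", "actor": "user:42"})   # triggers the callback
```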

It's a database that returns not just flat query results, but a hierarchical tree - allowing you to present the activity in context.

It's a database that handles not just a few hundred users requesting (reading) data, but a few million users swarming to see the latest action in a sports game or a concert.

It's a database that organically makes connections between items by understanding the relationships of URLs and #tags to make implicit links in the graph where and when they're needed. For example a tweet mentioning acme.com should be attached to Acme.com in the tree.
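
Sketched very roughly, that kind of implicit linking could be as simple as scanning each activity for URLs and #tags and indexing it under those keys so it can be attached to the matching node in the tree. The regexes and key normalization below are simplified illustrations, not how any particular product actually does it.

```python
import re

# A rough sketch of implicit linking: scan incoming activities for URLs and
# #tags and index them under those keys so they can be attached to the matching
# node in the tree. The regexes and normalization are simplified illustrations.
URL_RE = re.compile(r"https?://\S+|(?<!\S)(?:www\.)?[a-z0-9-]+\.(?:com|org|net)\b")
TAG_RE = re.compile(r"#(\w+)")

index: dict[str, list[dict]] = {}   # e.g. "acme.com" -> [activities]

def attach(activity: dict) -> None:
    text = activity.get("content", "")
    for url in URL_RE.findall(text):
        key = url.split("//")[-1].strip("/").lower()
        index.setdefault(key, []).append(activity)
    for tag in TAG_RE.findall(text):
        index.setdefault("#" + tag.lower(), []).append(activity)

attach({"content": "Loving the new widgets from acme.com #realtime"})
print(sorted(index))   # -> ['#realtime', 'acme.com']
```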

And most importantly, it's a database company that understands that the opportunity of the Real-time, Social Web is far too big and moves far too quickly to possibly be built by a single vendor. A company that, as a result of this understanding, chooses open standards over proprietary formats, and partnership with best-of-breed partners over trying to build mediocre versions of everything by itself.

Polls, Ratings, Comments, Live Blogging, Forums, Data Bridging, Data Enriching, Visualization, Moderation, Curation, Analytics, Game Mechanics, Authentication... the list is endless. They are all transformed by the Real-time web. They must all be part of Real-time as a Service.

And finally, Real-time as a Service is about service. Enterprise grade support. Best in class uptime. White label.

That's Real-time as a Service.

Further Reading

NYT Paywall, Huffpo Lawsuit - Symptoms of the same misconception

Added on by Chris Saad.

Over the last few days I have been debating the NYT paywall on a private email thread with friends. I didn't feel the need to post it on my blog because I thought that paywalls were so obviously a losing strategy that it was a waste of time to comment.

But combined with the recent lawsuit against the Huffington Post and Arianna Huffington's eloquent response yesterday, I felt it was worthwhile to re-publish my thoughts here. Most of them are based on thinking and writing that I did many years ago around Attention. Most of that old writing has been lost in the blog shuffle. Hopefully one day I will dig it up and re-post it in a safe place.

On to the issue...

The price of content

I believe that people have historically paid for the medium not the content.

They pay for 'Cable' not for 'CNN News'. They pay for 'The Paper' not for the content in the newspaper. They pay for 'CDs' not for the music on the album.

Also, they paid a lot because the medium was perceived to be scarce (scarce materials, scarce shelf space, scarce advertising dollars, scarce talented people).

Consumers are not stupid; they understand (if only somewhere in the back of their minds) that the COST of creating and distributing things has been deflated by a growing list of converging trends.

We live in a world of abundance (in the area of digital content anyway). Shelf space is infinite (database entries), any kid in a basement can make content and there is no physical media anymore so cost of distribution has disappeared as well.

The scarcity now is on the consumption side - Attention is the scarce resource. Value is derived from scarcity.

That's why on the Internet, Attention allocation systems (Google Search, FB News Feed etc) are attracting traffic, engagement and ultimately profit.

In this new world, the price of content must be reduced significantly as shakeouts and rebalancing occurs - because the cost of producing it is approaching zero.

The more the Music, TV and News industries fight this, the more they leave themselves open to disruption by Google, FB, Twitter and the rest of Silicon Valley.

This is not even to mention that everyone is producing content now. Tweets, Photos, Videos - it's abundant. Of course most of it isn't very 'good' by J school standards - but that's irrelevant. The world has never rewarded good with any consistency.

Also just because content is not good, doesn't mean it isn't personally meaningful.

For example, I care more what my child (theoretical child of course) posts to FB than the most important journalist in all the world says on CNN.

But please don't confuse my dispassionate assessment of the issue with pleasure or happiness at the demise of mainstream media.

I am simply stating the facts because without understanding those facts we can't begin to change them (if that's what the media world decides to do).

In terms of making a judgement of those facts, I think that curators who weave and summarize a broader narrative in the form of 'reporting' are critical for an informed citizenry and a functional democracy. I believe in it so much that I have dedicated my life to helping mainstream media companies stay relevant and to co-writing things like this: http://aboutecho.com/2010/08/18/essay-real-time-storytelling/

But I also believe that mainstream mass media broke an ancient (and by ancient, I mean as old as rudimentary human communication) pattern of people telling each other personal stories, replacing it with editorialized mass broadcasts.

The Internet may just be restoring the balance. The result is some massive restructuring of inflated budgets, processes, offices, costs etc. While we're in the middle of that restructuring, it looks like a media apocalypse. Until it settles down and a new equilibrium is found.

Here's what Arianna wrote on the subject:

The key point that the lawsuit completely ignores (or perhaps fails to understand) is how new media, new technologies, and the linked economy have changed the game, enabling millions of people to shift their focus from passive observation to active participation -- from couch potato to self-expression. Writing blogs, sending tweets, updating your Facebook page, editing photos, uploading videos, and making music are options made possible by new technologies.

The same people who never question why someone would sit on a couch and watch TV for eight hours straight can't understand why someone would find it rewarding to weigh in on the issues -- great and small -- that interest them. For free. They don't understand the people who contribute to Wikipedia for free, who maintain their own blogs for free, who tweet for free, who constantly refresh and update their Facebook pages for free, and who want to help tell the stories of what is happening in their lives and in their communities... for free.

Free content -- shared by people who want to connect, share their passions, and have their opinions heard -- fuels much of what appears on Facebook, Twitter, Tumblr, Yelp, Foursquare, TripAdvisor, Flickr, and YouTube. As John Hrvatska, a commenter on the New York Times, wrote of the Tasini suit, "So, does this mean when YouTube was sold to Google that all the people who posted videos on YouTube should have been compensated?" (And Mr. Hrvatska no doubt contributed that original and well-reasoned thought without any expectation he'd be paid for it. He just wanted to weigh in.)

Read more on her post

Update

And here's a bit of 'Free Content' - a conversation I had on Twitter with someone who disagreed with this post.

What is Echo StreamServer?

Added on by Chris Saad.

Yesterday we announced a new Echo product called StreamServer. There is very little more I can say that Khris Loux has not already said so eloquently on stage at the #e2 launch event.

When you work so hard and long on something (depending on how you look at it, StreamServer was either 15 years, 2.5 years or 1 year in the making), it's hard to sum it all up in one 1-hour event.

But that's what we tried to do.

We tried to thread the needle between a contemporary story about activity data, the existential change (read: opportunity or threat) occurring on the web as traffic and monetization flow to proprietary social networking platforms, the opportunity for every major node on the web to be just as powerful and innovative, the need for open standards and powerful cloud services as the basis of the rebuttal, and our deep desire to make this an industry-wide effort. We tried to communicate the important role of aggregation and the pivotal role that mainstream media, e-commerce, entertainment, startups and agencies play in curating activity information for the masses.

We also tried to communicate that this was not just a pipe dream, but rather a commercial reality for major customers. A solution running at scale. A new distribution and monetization opportunity for 3rd party devs and a future-ready piece of infrastructure for media companies.

I think we did the best job possible at threading all these stories, and doing it with a human, authentic voice through the lens of customer and partner experiences.

I'm proud of the work we've done so far, and the tireless efforts of the Echo team and our customer/partner devs.

And all of that being said, though, we are only at the beginning. We have just planted the first seed and I look forward to helping it grow.

So what is StreamServer in my words?

It is the real-time, social-scale database that Twitter, Facebook, Quora, Foursquare and others built, delivered as an EC2-style cloud service. Turn it on, and forget about managing the data or scaling the infrastructure.

It is the first of its kind and it will hopefully form the basis of many new companies as they deliver many new, novel and innovative experiences to customers and end users everywhere.

And it's a bet on the future of open standards, developer ecosystems and a heterogeneous web made up of first-class social nodes.

It's Real-time as a Service.

New Twitter. Feature comparison

Added on by Chris Saad.

Jeremiah and I wrote an analysis of the New Twitter vs. Current Facebook. Here's a snippet:

Situation: Twitter’s new redesign advances their user experience

Twitter has announced a new redesign today, yet looking at the news, there hasn't been a detailed breakdown of these two leading social networks. Overall, Twitter's new features start to resemble some features of a traditional social network, beyond their simple messaging heritage. We took the key features from both social websites, did a comparison and voted on the stronger player.

[Great Detailed Graph goes here - See it on Jeremiah's blog]

Our Verdict: Facebook Features Lead Over Twitter’s New Redesign

Facebook's features offer a more robust user experience, and they have a longer history of developing the right relationships with media, developers, and their users. Twitter, a rapidly growing social network, has launched a series of new features (described by the founder as "smooth like butter") that provide users with a snappy experience and enhanced features.

We tallied the important features of this launch and of their overall expansion strategy and have concluded that Facebook's features continue to hold dominance over Twitter, despite the noticeable improvements. While we don't expect that Twitter wants to become 'another Facebook', they should play to their strengths, remaining nimble and lightweight while allowing developers and content producers to better integrate into their system.

Check out the full results over on his blog.

Guest Post: Facebook's world view

Added on by Chris Saad.

Just wanted to share with you here that I wrote a guest post on Mashable last week about Facebook's world view. Be sure to check it out here.

Are these blunders a series of accidental missteps (a combination of ambition, scale and hubris) or a calculated risk to force their world view on unsuspecting users (easier to ask for forgiveness)? Only the executives at Facebook can ever truly answer this question.

What’s clear, though, is that their platform is tightly coupled with countless other websites and applications across the web, and their financial success is aligned with many influential investors and actors. At this stage, and at this rate, their continued success is all but assured.

But so is the success of the rest of the web. Countless social applications emerge every day and the rest of the web is, and always will be, bigger than any proprietary platform. Through its action and inaction, Facebook offers opportunities for us all. And in the dance between their moves and the rest of the web’s, innovation can be found.

The only thing that can truly hurt the web is a monopoly on ideas, and the only ones who can let that happen are web users themselves.

Guest Post: Facebook's claims about data portability are false

Added on by Chris Saad.

I have published a guest post on RWW about Facebook's recent privacy challenges and their claims about data portability.

"The lack of honesty and clarity from the company and its representatives ... and the continued trend of taking established language - such as "open technology" or "data portability" - and corrupting it for its own marketing purposes, is far more disconcerting than the boundaries it's pushing with its technology choices."

Read it here.

Open is not enough. Time to raise the bar: Interoperable

Added on by Chris Saad.

Last week Elias Bizannes and I wrote a post, Assessing the Openness of Facebook's 'Open Graph Protocol'. To summarize that post: it's clear that Facebook is making a play to create, aggregate and own not only identity on the web, but everything that hangs off it - from interests to engagement - not just on their .com but across all sites. To do this they are giving publishers token value (analytics and traffic) in exchange for taking over parts of the page with pieces of Facebook.com, without giving them complete access to the user, their data or the user experience (all to the exclusion of any other player). In addition, they are building a semantic map of the Internet that will broker interests and data on a scale never before seen anywhere.

In the face of such huge momentum and stunningly effective execution (kudos to them!), aiming for (or using the word) Open is no longer enough. The web community needs to up its game.

The same is true for data portability - the group and the idea. Data portability is no longer enough. We must raise the bar and start to aim for Interoperable Data Portability.

Interoperability means that things work together without an engineer first having to figure out what's on the other end of an API call.

When you request 'http://blog.areyoupayingattention.com' it isn't enough that the data is there, or that it's 'open' or 'accessible'. No. The reason the web works is because the browser knows exactly how to request the data (HTTP) and how the data will be returned (HTML/CSS/JS). This is an interoperable transaction.

Anyone could write a web server, create a web page, or develop a web browser and it just works. Point the browser somewhere else, and it continues to work.

Now map this to the social web. Anyone could (should be able to) build an open graph, create some graph data, and point a social widget to it and it just works. Point the social widget somewhere else, and it continues to work.

As you can see from the mapping above, the interaction between a social widget and its social graph should be the same as that of a browser and a web server. Not just open, but interoperable, interchangeable and standardized.
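
To make the analogy concrete, here's a sketch of what that would feel like in code. It assumes a hypothetical, agreed-upon endpoint path and response format - the point is that nothing in the widget is specific to any one vendor.

```python
import json
from urllib.request import urlopen

# A sketch of the argument above: if every 'social server' exposed its graph in
# one agreed-upon format at one agreed-upon path, a widget could be pointed at
# any of them and just work, the way a browser works against any web server.
# The endpoint path and response shape here are hypothetical, not a real spec.

def render_stream(social_server: str) -> None:
    """Fetch a site's activity stream and print it; nothing here is specific
    to any one vendor, only to the (assumed) shared format."""
    with urlopen(f"{social_server}/.well-known/activity-stream") as response:
        stream = json.load(response)
    for item in stream.get("items", []):
        print(f"{item['actor']} {item['verb']} {item['object']}")

# Point the widget at one social server...
# render_stream("https://cnn.example.com")
# ...or at another; the widget code does not change.
# render_stream("https://blog.example.org")
```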

Why? Innovation.

The same kind of innovation we get when we have cutting edge web servers competing to be the best damned web server they can be (IIS vs. Apache), and cutting edge websites (Yahoo vs. MSN vs. Google vs. Every other site on the Internet) and cutting edge browsers (Netscape vs. IE vs. Safari vs. Chrome). These products were able to compete for their part in the stack.

Imagine if we had gotten stuck with IIS, Netscape and AltaVista locking down the web with their own proprietary communication channels. The web would have been no better than every closed communication platform before it. Slow, stale and obsolete.

How do we become interoperable? It's hard. Really hard. Those of us who manage products at scale know it's easy to make closed decisions. You don't have to be an evil mastermind - you just have to be lazy. Fight against being lazy. Think before you design, develop or promote your products - try harder. I don't say this just to you, I say it to myself as well. I am just as guilty of this as anyone else out there developing product. We must all try harder.

Open standards are a start, but open protocols are better. Transactions that, from start to finish, provide for Discoverability, Connectivity and Exchange of data using well known patterns.

The standards groups have done a lot of work, but standards alone don't solve the problem. It requires product teams to implement the standards, and this is the area I am far more interested in these days: how do we implement these patterns at scale?

Customers (i.e. publishers) must also demand interoperable products. Products that don't just connect them to Facebook or Twitter but rather make them first-class nodes on the social web.

Like we said on the DataPortability blog:

In order for true interoperable, peer-to-peer data portability to win, serious publishers and other sites must be vigilant to choose cross-platform alternatives that leverage multiple networks rather than just relying on Facebook exclusively.

In this way they become first-class nodes on the social web rather than spokes on Facebook’s hub.

But this is just the start. This just stems the tide by handing the keys to more than one player so that no one player kills us while the full transition to a true peer-to-peer model takes place.

If the web is to truly stay open and interoperable, we need to think bigger and better than just which big company (or companies) we want to hand our identities to.

Just like every site on the web today can have its own web server, every site should also have the choice to host (or pick) its own social server. Every site should become a fully featured peer on the social web. There is no reason why CNN cannot be just as functional, powerful, effective and interchangeable as Facebook.com.

If we don't, we will be stuck with the IIS, IE and Netscape of the social web, and innovation will die.

Missed opportunities in Publishing

Added on by Chris Saad.

MG Siegler over on TechCrunch yesterday wrote a story about how the AP is tweeting links to its stories. Those links, however, are not to its website. Instead, those Twitter links lead to Facebook copies of its stories! Here's a snippet of his post:

The AP is using their Twitter feed to tweet out their stories — nothing new there, obviously — but every single one of them links to the story on their Facebook Notes page. It’s not clear how long they’ve been doing this, but Search Engine Land’s Danny Sullivan noted the oddness of this, and how annoying it is, tonight. The AP obviously has a ton of media partners, and they could easily link to any of those, or even the story hosted on their own site. But no, instead they’re copying all these stories to their Facebook page and linking there for no apparent reason.

As Sullivan notes in a follow-up tweet, “i really miss when people had web sites they owned and pointed at. why lease your soul to facebook. or buzz. or whatever. master your domain.”

What’s really odd about this is the AP’s recent scuffle with Google over the hosting of AP content. The two sides appeared to reach some sort of deal earlier this month (after months of threats and actual pulled content), but now the AP is just hosting all this content on Facebook for the hell of it?

To me this isn't unusual at all. In fact it's common practice amongst 'social media experts'. Many of us use/used tools like FriendFeed, Buzz, Facebook etc not just to share links, but to actually host original content. We actively send all our traffic to these sites rather than using them as draws back to our own open blog/publishing platforms.

I completely agree with MG. Sending your audience to a closed destination site which provides you no brand control, monetization or cross-sell capability shows a profound misunderstanding of the economics of publishing.

Some will argue that the content should find the audience, and that they should be free to read it wherever they like. Sure, I won't disagree with that, but actively generating it in a non-monetizable place and actively sending people there seems like a missed opportunity to me. Why not generate it on your blog and then simply share the links in other places? If those users choose to chat over there, that's fine, but the first, best place to view the content and observe the conversation should always be at the source - at YOUR source. YOUR site.

Some will argue that those platforms generate more engagement than a regular blog/site. They generate more engagement because your blog is not looked after. You're using inferior plugins and have not taken the time to consider how your blog can become a first-class social platform. You're willing to use tools that cannibalize your audience rather than attract them. You're willing to use your blog as a traffic funnel back to other destination sites by replacing big chunks of it with FriendFeed streams rather than hosting your own LifeStream like Louis Gray and Leo Laporte have done.

Some will argue (or not, because they don't realize or don't want to say it out loud) that they are not journalists, they are personalities, and they go wherever their audience is. They don't monetize their content, they monetize the fact that they HAVE an audience by getting paying jobs that enable them to evangelize through any channel that they choose. Those people (and there are very few of them) have less incentive to consolidate their content sources (although there are still reasons to do so). Unfortunately, though, media properties sometimes get confused and think they can do the same thing.

The list of reasons why publishing stuff on Buzz or FriendFeed or Facebook as a source rather than as an aggregator is a bad idea goes on and on, so I will just stop here.

I'm glad MG has picked up on it and written about it on Techcrunch.

#blogsareback

Update: Steve Rubel is agreeing with the AP's approach. Using all sorts of fancy words like Attention Spirals, Curating and Relationships, Steve is justifying the AP's ritual suicide of its destination site in favor of adding value, engagement and traffic to Facebook. Sorry Steve, but giving Facebook all your content and your traffic and not getting anything in return is called giving away the house.

Again, I'm not advocating that you lock content away behind paywalls, I'm simply saying that you need to own the source and make your site a first-class citizen on the social web. Not make Facebook the only game in town by handing it your audience.

Google Buzz = FriendFeed Reborn

Added on by Chris Saad.

FriendFeed was dead, now it is re-born as Google Buzz. I've not been able to try the product yet, but philosophically and architecturally it seems superior to FriendFeed.

Here are my observations so far:

Consumption Tools

Buzz is better than FriendFeed because Google is treating it as a consumption tool rather than a destination site (by placing it in Gmail rather than hosting it on a public page). FriendFeed should have always been treated this way. Some people got confused and started hosting public discussions on FriendFeed.

That being said, though, I've long said that news and sharing are not the same as an email inbox, and those sorts of items should not be 'marked as read' but rather stream by in an ambient way.

While Buzz is in fact a stream, it is its own tab that you have to focus on rather than a sidebar you can ignore (at least as far as I can tell right now).

How it affects Publishers (and Echo)

The inevitable question of 'How does this affect Echo' has already come up on Twitter. Like FriendFeed before it, Buzz generates siloed conversations that do not get hosted at the source.

So, the publisher spends the time and money to create the content and Buzz/Google get the engagement/monetization inside Gmail.

For some reason, all these aggregators think that they need to create content to be of value. I disagree. I long for a pure aggregator that does not generate any of its own content such as comments, likes, shares etc.

That being said, however, the more places we have to engage with content the more reasons there are for Echo to exist so that publishers can re-assemble all that conversation and engagement back on their sites.

Synaptic Connections

Note that they don't have a 'Follow' button - it's using synaptic connections to determine who you care about. Very cool! I worry though that there might not be enough controls for the user to override the assumptions.

Open Standards

Already, Marshall is calling it the savior of open standards. I don't think Open Standards need to be saved - but they certainly have all the buzzwords on their site, so that's promising.

That's it for now, maybe more later when I've had a chance to play with it.

Update: After playing with it this morning, and reading a little more, it's clear that this is actually Jaiku reborn (not FriendFeed), because the Jaiku team were involved in building it. They deserve a lot of credit for inventing much of this stuff in the first place - long before FriendFeed.

Also, having used it only for an hour, the unread count on the Buzz tab is driving me nuts. It shouldn't be there. It's a stream not an inbox. Also it makes no sense why I can't display buzz in a sidebar on the right side of my primary Gmail inbox view. That would be ideal.

It's also funny to me that some people have tried to give Chris Messina credit for Buzz even though he's been at Google for no more than a month. They clearly don't understand how long and hard it is to build product. Messina is good, but he ain't that good :)

Facebook and the future of News

Added on by Chris Saad.

Marshall Kirkpatrick has written a thoughtful piece over on Read/Write Web entitled 'Facebook and the future of Free Thought' in which he explains the hard facts about news consumption and the open subscription models that were supposed to create a more open playing field for niche voices. In it, he states that news consumption has barely changed in the last 10 years. RSS and Feed Readers drive very little traffic and most people still get their news from hand selected mainstream portals and destination sites (like MSN News and Yahoo news etc). In other words, mainstream users do not curate and consume niche subscriptions and are quite content to read what the mainstream sites feed them.

This is troubling news (pun intended) for those of us who believe that the democratization of publishing might open up the world to niche voices and personalized story-telling.

Marshall goes on to argue that Facebook might be our last hope. That since everyone spends all their time in Facebook already, the service has an opportunity to popularize the notion of subscribing to news sources and thereby bring to life our collective vision of personalized news for the mainstream. Facebook already does a great deal of this, with users getting large amounts of news and links from their friends as they share and comment on links.

Through my work with APML I have long dreamed of a world where users are able to view information through a highly personalized lens - a lens that allows them to see personally relevant news instead of just popular news (note that Popularity is a factor of personal relevancy, but it is not the only factor). That doesn't mean the news would be skewed to one persuasion (liberal or conservative for example) but rather to a specific topic or theme.

Could Facebook popularize personalized news? Should it? Do we really want a closed platform to dictate how the transports, formats and tools of next generation story-telling get built? If so, would we simply be moving the top-down command and control systems of network television and big media to another closed platform with its own limitations and restrictions?

Personalized news on closed platforms is almost as bad as mainstream news on closed platforms. News organizations and small niche publishers both need a way to reach their audience using open technologies or we are doomed to repeat the homogenized news environment of the last 2 decades. The one that failed to protect us from a war in Iraq, failed to innovate when it came to on-demand, and failed to allow each of us to customize and personalize our own news reading tools.

That's why technologies like RSS/Atom, PubSubHubbub and others are so important.

What's missing now is a presentation tool that makes these technologies sing for the mainstream.

So far, as an industry, we've failed to deliver on this promise. I don't have the answers for how we might succeed. But succeed we must.

Perhaps established tier 1 media sites have a role to play. Perhaps market forces that are driving them to cut costs and innovate will drive these properties to turn from purely creating mainstream news editorially toward a model where they curate and surface contributions from their readership and the wider web.

In other words, Tier 1 publishers are being transformed from content creators to content curators - and this could change the game.

In the race to open up and leverage social and real-time technologies, these media organizations are actually making way for the most effective democratization of niche news yet.

Niche, personalized news distributed by open news hubs born from the 'ashes' of old media.

Don't like the tools one hub gives you? Switch to another. The brands we all know and love have an opportunity to become powerful players in the news aggregation and consumption game. Will they respond in time?

Thanks to my experience working with Tier 1 publishers at Echo, I have high hopes that many of them will learn and adapt. But much more work still remains.

Learn more about how news organizations are practically turning into personalized news curation hubs over on the Echo Blog.

Facebook privacy changes are not evil

Added on by Chris Saad.

I give Facebook a lot of crap. But I don't think their latest privacy changes are all that nefarious. It's pretty obvious what they are doing. They want search inventory to sell to Google and Microsoft. They want to be as cool as Twitter.

I think the more important story is that they are turning their square into a triangle.

A well-placed friend of mine (who shall remain nameless) gave me this metaphor (I will try not to butcher it too much).

Twitter is like a triangle. A small group of people (on top) broadcasting to a large group of people at the bottom.

Facebook is/was more like a square. Everyone communicating more or less as equal peers (at least on their own personal profile pages).

This is very rare on the internet. It's rare anywhere really. It's unusual to have a platform that encourages so much 'public' peer-2-peer participation.

It's clear, however, that Facebook is trying to have its cake and eat it too. They want to be a triangle for those who want one, and a square for those who want one of those.

Will it work? Maybe. They are a 'Social Utility' after all. They have never thought of themselves as a vertical social network with a static social contract. As I've said before, their ability to change and evolve at scale is beyond impressive. It has never been seen before.

From College kid profile pages, to app platform, to stream platform, to stream platform with deep identity and routing. Their flexibility, rate of change and reinvention is staggering. They put Madonna and Michael Jackson to shame.

Ultimately Facebook wants to be the Microsoft Outlook and Google AdSense of the Social Web all rolled into one. Maybe throw some PayPal in for good measure.

To do this I think you will see them continue to provide square or triangle options for their users (with their own personal bias towards triangles) and deprecate legacy parts of their system like canvas pages and groups.

Ultimately, though, the real opportunity is to look beyond the public vs. private debate and observe the 'Multiple Publics' that Danah Boyd and Kevin Marks speak about. But that's a post for another day.

Is this good or bad for us? I'm not sure it matters. It's another big bet for the company though, and it was a necessary step to clean up the half steps that resulted in privacy setting hell on the service so far.

Redefining Open

Added on by Chris Saad.

In my mind, there are four kinds of open.

  • Torvalds Open.
  • Zuckerberg Open.
  • Not Open but we use the word Open anyway.
  • Saad Open.

This fragmentation has diluted the word open to the point where it almost has no value.

It's time to re-define the word open. First let me explain each category.

Torvalds Open.

In Linus Torvalds' world (he's the guy who created Linux), Open means that the software is developed through a community process. The source code is visible to and modifiable by anyone, and is available for free.

This is called 'Open Source'.

Companies may package and bundle the software in new and novel ways, and provide support and services on top for a fee.

The problem with Open Source on the web is that the software itself has less value than the network effects and uptime provided by a branded, hosted experience. Running Twitter.com on open source software, for example, would have very little value, because Twitter's lock-in is not its software but rather its namespace (@chrissaad) and its ecosystem of developers all building software with dependencies on its proprietary API.

Open Source is useful, interesting and important, but it is not what I mean when I talk about the Open Web. Its value is well understood, and it is no longer the first, best way of making our world (and the Internet) a better place - at least not in the way it was when client-side software was the primary way we used computers.

Zuckerberg Open.

When Mark Zuckerberg talks about open, he is not talking about Technology. He is talking about human interactions.

Ever since Data Portability became popular (via the DataPortability project), Facebook has gone to great lengths to redefine the word Open to mean the way people interact with each other.

In doing so, they have managed to, in large part, co-opt the word and claim their platform makes people 'more open'.

In many respects, and by their definition, they are right. Facebook has encouraged a mind-bending number of people to connect and share with each other in ways that had previously been reserved for bloggers and other social media 'experts'.

Facebook deserves a lot of credit for introducing social networking to the masses.

Their definition of Open, however important, is not the kind I'm talking about either.

Not Open but we use the word Open anyway.

This is when a platform or product has an API and therefore claims to have an 'Open Platform'.

There's nothing open about having an API. It's just having an API. The platform could be closed or open depending on how the application and API are built and what limitations are placed upon them.

In most cases, an 'Open Platform' is not actually open, it's just a platform.

Saad Open

My definition of open is very specific. In fact a better way to describe it would be Interoperable and Distributed.

To explain, let me provide some compare and contrast examples.

Twitter is closed because it owns a proprietary namespace (e.g. @chrissaad). The only way to address people is using their username system. They own those usernames and have final authority over what to do with them.

They are closed because they do not provide free and clear access to their data without rate limiting that access or requiring deals for improved quality of service.

They are also closed because they are not a federated system. You cannot start your own Twitter-style tool and communicate with users on Twitter, or vice versa. The only way to message people on Twitter is to use Twitter's proprietary APIs for submitting and retrieving data.

A proprietary API is an API that is special to a company and/or produces data that is not in an open standard.

Wordpress, on the other hand (and to contrast), is an open system. Let's compare point for point.

It does not own the namespace on which it is developed. The namespaces are standard URLs. This blog, for example, is hosted at blog.areyoupayingattention.com. Wordpress does not own that domain.

Wordpress produces a single type of data - blog posts. Those blog posts are accessible using an open standard - RSS or Atom. There is no rate limit on accessing that data.

Wordpress is a federated system. While they provide a hosted solution at Wordpress.com for convenience, there is nothing stopping me from switching to Blogger or Tumblr. The tools you use to consume my blog would remain unchanged, and the programmers who make those tools would not need to program defensively against Wordpress' API. They simply need the URL of my RSS feed and they are good to go.
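As an illustration, here is a minimal sketch (not Wordpress's or Echo's actual code) of what it takes to consume a blog using nothing but its feed URL. The URL is a placeholder; point it at any RSS or Atom feed - Wordpress, Blogger, Tumblr or otherwise - and the code does not change.

```python
# Minimal sketch: consuming any blog with nothing but its feed URL.
# The URL is a placeholder; any RSS/Atom feed works, whatever software produced it.
import feedparser  # pip install feedparser

FEED_URL = "https://blog.example.com/feed"

feed = feedparser.parse(FEED_URL)
print("Feed title:", feed.feed.get("title", "(untitled)"))

for entry in feed.entries[:5]:
    # Every conforming feed exposes the same standard fields.
    print(entry.get("title"), "->", entry.get("link"))
```

No API keys, no rate limits, no defensive coding against a particular vendor - that is what an open, federated system buys you.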

This makes Wordpress an open tool in the open blogosphere.

Blogging is open.

Microblogging should be open too.

To summarize: Open, by my definition, does not mean the software is open source or free. It means that the software receives open standards data, provides open standards data, has an interoperable API and can easily be switched out for other software.

Today I was challenged on Twitter that Echo is not 'Open' because it is proprietary code and costs money to use.

This person does not understand my definition of Open. Echo is open because it is not a destination site, it sits on any site anywhere. The owner of that site can take it off and replace it with another engagement tool at any time. The data being absorbed by Echo, for the most part, is RSS or Atom, and the data coming out of Echo is RSS.

It does not have any proprietary namespaces (except our useless legacy login system which we are trying to get rid of as quickly as possible) and does not pretend to create some amazing social network of its own. It is just a tool to communicate on the open, social web.

Is Echo perfect? No, of course not, but our intention is to make each and every aspect of the product as interoperable and distributed as possible. We will even use and contribute to open source where appropriate.

How does your product, or the tools you choose, compare? Tell me in the comments.

Next up, we should start to redefine the 'Open' community that creates open standards. Much of it is not very open.

Merry Christmas - The power of memes

Added on by Chris Saad.

Many, many of the things in our lives could be called 'Memes'.  Here's what happens when you type 'Define:meme' into Google.

Memes are everywhere. We just experienced a country-wide meme here in the US called 'Thanksgiving'. We are about to hit a similar meme (except this one is global) called 'Christmas'.

Memes are fascinating things. They are almost as important as Context, Perspective and Metaphor. Together, these four things compose the great majority of our thought processes.

What is this like? (Metaphor) What else is going on? (Context) What does everyone else think? (Meme) What does my experience and current state of mind tell me? (Perspective)

Some memes emerge organically over time - like folding the end of the hotel toilet paper into a little triangle. Others are created through brute force, by strategic construction and repetition. No one has mastered this better than the extreme right wing of the US political system. Fox News is a bright, shining example of how to craft, seed, propagate and manipulate a meme.

Silicon Valley loves a meme. We live on them. In fact one could argue that the whole ecosystem would shut down without the meme of the day, week and bubble.

.Com, Web 2.0, Data Portability, Real-time web, RSS is dead, Myspace, Facebook, Twitter, Cloud, Semantic Web, Synaptic Web and so on and so forth.

Like in real life, some of these memes emerge organically, some through brute force. Some make more sense than others. Some of these memes get undue attention. Some are created to stir controversy. Others form organically to create a shorthand. Some are genuine cultural shifts that have been observed and documented.

These memes matter. They matter a lot. They dictate a large part of how people act, what they pay attention to and their assumptions about the world in which they live, and the people they encounter. In Silicon Valley they dictate who gets heard and which projects get funded. They form the basis of many of our decisions.

Some services like Techmeme do a very good job at capturing daily memes. I've yet to see a service that captures memes that span weeks, months, years or even decades though. I dream of such a service. Particularly one focused on news memes.

Imagine being able to zoom in and out of the news and drag the timeline back and forth, like some kind of Google Maps for headlines. Imagine being able to read about an IED explosion in Baghdad and quickly understand its context in the decade-long struggle for the entire region through some kind of clustered headline/topic view.

Consider the context, perspective and metaphoric power such a tool would give us. How could it change our world view and help turn the temporary, vacuous nature of a microblog update into something far more substantial and impactful, with an inline summary of the rich historical narrative to which it belongs?

The algorithm to create such correlations and the user interface to present them would, I imagine, challenge even the smartest mathematicians and user interaction designers. Its commercial value is vague at best. It probably shouldn't be attached to a business at all - maybe it should be some kind of Wikipedia-style gift to the world.
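To make the idea slightly less abstract, here is a toy sketch of one naive starting point - grouping related headlines with off-the-shelf tools. The headlines are invented examples, and a real system would need to handle time, scale and nuance far beyond this.

```python
# Toy sketch: grouping related headlines into rough topic clusters.
# The headlines are invented examples for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

headlines = [
    "IED explosion rocks Baghdad marketplace",
    "Coalition troops expand patrols in Baghdad",
    "Tech startup raises new funding round",
    "Venture capital funding hits record high",
]

# Represent each headline as a TF-IDF vector, then cluster similar ones together.
vectors = TfidfVectorizer(stop_words="english").fit_transform(headlines)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, headline in sorted(zip(labels, headlines)):
    print(label, headline)
```

Stretching something like this across decades of archives - and making the result browsable like a map - is where the real mathematical and design challenge lives.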

Maybe the news media - Reuters, CNN and the Washington Post - will take it upon themselves to sponsor such a project in an effort to re-contextualize their news archives in the new AAADD, real-time, now-now-now, everyone-is-a-journalist media world.

I've bought some domains and done some mockups of such a service, but I probably would never have the time or the patience to build it - at least not in the foreseeable future.

Maybe I'm just dreaming. But I think it's a good dream!