Product & Startup Builder

Building in someone else's yard

Added on by Chris Saad.

Loic Le Meur writes over on LinkedIn about his mistakes betting on Twitter with his company Seesmic. Seesmic produced a series of great Twitter clients for multiple platforms (mobile, web, desktop, etc.). When Twitter started shutting down developers and releasing its own official clients, Seesmic's business was undermined and ultimately shuttered.

I'm not blaming Twitter for this strategic change – they did not know they would take that decision at the time when they were fully supporting their ecosystem. I blame myself entirely. I should have never dedicated all my team resources to build on one platform. That is a lesson learned the hard way along with many other developers. I was too excited and became blind.
...
Here are my two cents for entrepreneurs betting on someone else's success: be careful that everything can change from one day to another and all the rules will change. I will never be that dependent on anyone anymore.

 

Loic is a wicked smart and very successful entrepreneur. He's always smiling, generous and well liked by his peers. It's a real shame that Twitter pivoted in the way that it did to undermine his business.

I'd like to refine Loic's lessons learned a little here, though. In my opinion the problem was not betting on someone else's platform but rather...

  1. Twitter is not a platform, it's a media company
  2. Betting on one media company rather than multiple

Whenever a company makes money from ads, it's not a platform/technology company - it's a media company. As a media company, it needs to control the eyeballs so that it can control the ad impressions.

To be fair, though, Twitter's ad revenue model wasn't in place when Loic started betting on them. It was clear, however, that their revenue model was still in flux and that ads would play a role in order to keep the service free for end-users.

The reality is that companies successfully rely on other platforms all the time. Amazon Web Services is a great example of this. There's never a risk that AWS will shut off or start competing with its developers, because it is a true platform.

Like AWS, Echo is a true platform. We make our money by encouraging developers to build world class apps on our platform and we even help them sell those apps to major customers.

Facebook, Twitter etc were never true technology platforms. They are distribution channels. They are data sources. They are social services. But they are not platforms.

This is still happening today. Major media companies and developers still spend enormous sums of money encouraging their users to participate on Twitter and Facebook as 'outsourced engagement platforms'. Ironically, media companies, which should understand the value of owning the audience and the ad impressions, are happily outsourcing them to competing media companies (Facebook and Twitter). I write more about this over on the Echo blog.

The key, then, is not to avoid 3rd party platforms, but to understand the difference between platforms, products, services and media companies. You need to understand the incentives, revenue flows and business models so you can work out how to align your company and product with the value chain.

 

Dark Social and Facebook+

Added on by Chris Saad.

Working with large brands at Echo is thrilling. They have the content, products and reach that matter in everyday people's lives. This means that even small improvements in their Real-time, Social strategy result in big impacts on large groups of people. One of the prevailing misconceptions we find when we first get started with a new customer, however, is that Facebook is Social. Facebook comments, Facebook Likes and Facebook Fan Pages are often seen as the beginning and the end of the social 'strategy'.

For as long as I can remember, my career has been about helping others to remember that Facebook (or Myspace or AOL etc) can only ever be one part of the larger web and Internet landscape. The percentage fluctuates of course but it is never 100%.

A new article in The Atlantic this week, however, reminds us that not only is Facebook only a fraction of the overall web (in terms of referral traffic and participation) but also that it's not even the biggest fraction. It also reminds us that while modern social networking has introduced many powerful novelties, being social on the internet is far from a new phenomenon. In fact, it has been a pervasive part of internet interactions since the beginning - think Email and Instant Messaging, for example. These 'old' tools continue to have a huge (in fact the largest) impact on your referrer traffic and engagement.

This engagement, however, is under-measured and not well understood. The Atlantic postulates that it appears in web analytics as unknown referrers landing on pages other than the home page or section fronts - assuming that direct traffic to deep links can only come from people sharing links with one another using tools that don't leave referrer signatures. The Atlantic has taken to calling this class of traffic 'Dark Social'.
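
To make the heuristic concrete, here's a minimal sketch of how a single pageview might be bucketed. It's my own illustration of the logic described above, not Chartbeat's or The Atlantic's actual methodology, and the host lists and deep-link rule are assumptions.

```typescript
// A minimal sketch of the 'Dark Social' heuristic described above.
// Host lists and the deep-link rule are illustrative, not Chartbeat's actual logic.

type Channel = "search" | "social" | "dark social" | "direct" | "other";

const SEARCH_HOSTS = ["google.", "bing.", "yahoo."];
const SOCIAL_HOSTS = ["facebook.com", "twitter.com", "t.co", "reddit.com", "digg.com", "stumbleupon.com"];

function isDeepLink(path: string): boolean {
  // Anything beyond the home page or a section front counts as a deep link.
  const segments = path.split("/").filter(Boolean);
  return segments.length > 1;
}

function classify(referrer: string | null, landingPath: string): Channel {
  if (!referrer) {
    // No referrer signature: home-page visits are 'direct', but deep links were most
    // likely shared person-to-person (email, IM, apps) - i.e. Dark Social.
    return isDeepLink(landingPath) ? "dark social" : "direct";
  }
  const host = new URL(referrer).hostname;
  if (SEARCH_HOSTS.some((s) => host.includes(s))) return "search";
  if (SOCIAL_HOSTS.some((s) => host.endsWith(s))) return "social";
  return "other";
}

// Example: a link pasted into an email and opened from a mail client.
console.log(classify(null, "/technology/archive/2012/10/dark-social/263523/")); // "dark social"
```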

Below is a chart of their referral traffic as measured by Chartbeat. Most notably, they have labeled the relevant traffic as 'Dark Social' on the chart.

This chart clearly shows that, for The Atlantic, Dark Social and non-Facebook 'Standard Social' together account for almost 80% of all referral traffic.

In this light it is obvious that what’s needed is a ‘Facebook+’ strategy. Or better put, a strategy that puts your website at the center, with Mobile + Desktops + Facebook + Twitter + Reddit + Digg + StumbleUpon + Dark Social + many others as link distribution pipes.

This means that for maximum coverage and distribution, every login, sharing, commenting, following, notification, trending surface can’t just be a Facebook widget. You need white label Social Software Infrastructure that connects your audience to your site using the tools, technologies and distribution opportunities of the entire web.
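
As a rough illustration of what 'many distribution pipes' looks like in practice, here's a hedged sketch of a share surface that covers several channels plus a tagged copy-link so at least some Dark Social sharing stays attributable. The share-endpoint URLs and the src parameter are assumptions used to illustrate the idea, not a recommendation of specific integrations.

```typescript
// Illustrative sketch of a 'Facebook+' share surface: every channel the page supports,
// not just a Facebook widget. Share-endpoint URLs are assumptions - check each
// network's current documentation before relying on them.
const channels = {
  facebook: (url: string) =>
    `https://www.facebook.com/sharer/sharer.php?u=${encodeURIComponent(url)}`,
  twitter: (url: string, text: string) =>
    `https://twitter.com/intent/tweet?url=${encodeURIComponent(url)}&text=${encodeURIComponent(text)}`,
  reddit: (url: string, title: string) =>
    `https://www.reddit.com/submit?url=${encodeURIComponent(url)}&title=${encodeURIComponent(title)}`,
  email: (url: string, subject: string) =>
    `mailto:?subject=${encodeURIComponent(subject)}&body=${encodeURIComponent(url)}`,
};

// Tag copy-to-clipboard links so person-to-person shares can still be attributed later.
function copyLink(url: string): string {
  const u = new URL(url);
  u.searchParams.set("src", "share_copy"); // illustrative parameter name
  return u.toString();
}
```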

The web has always been, and will always continue to be the platform. Social or otherwise.

The Open Web Is Dead - Long live the Open Web

Added on by Chris Saad.

Yesterday Robert Scoble once again declared that the Open Web was dead. His argument was that Apps and proprietary black holes like Facebook are absorbing all the light (read: users, attention, value, investment) and taking our beloved open platform right along with it. In his post, he kindly (but incorrectly) named me as the only person who really cares about the Open Web. While that's flattering, I think he's wrong about me being the only one who cares.

But he is right about the Open Web. It's in real danger. URLs are fading into the background, native mobile apps are all the rage and Facebook threatens to engulf the web in a proprietary black hole.

But I think there's a bigger problem going on right now. Not just with the web, but with Silicon Valley (as stewards of the web). We've lost sight of the things that matter. We're obsessed with quick wins, easily digestible VC pitches, stock options and flipping for a Ferrari.

There's more to this game than that. Let me touch on some of the things I see going on.

  1. Lead, don't just cheerlead. In our obsession with being seen by our micro-audiences as 'thought leaders' or 'futurists' it's always very tempting to watch which way the wind is blowing and shout loudly that THERE is the future. Like a weather vane, it's easy to point the way the wind is blowing, but our biggest, best opportunity is not to declare a popular service 'the next big thing' just because a few visible people are hanging out there. Rather, our collective and individual responsibility is to help articulate a direction we think moves the state of the art forward for both the web and for society at large. Something, as leaders of this field, we believe in. Just like VCs develop an investment thesis, we should all have a vision for where the web is going (and how it should get there) and actively seek out, support and promote quiet heroes who are building something that moves the needle in the right direction.
  2. Add to the web's DNA. Almost every startup I see today is focused on building an 'App' and calling it a 'Platform'. Too often (almost every time), though, these apps are nothing more than proprietary, incremental and niche attempts at making a quick buck. We need more companies to think deeper. Think longer term. What are you doing to change the fabric of the web's DNA forever? How can you contribute to the very essence of the Internet the same way that TCP/IP, HTTP, HTML, JS and so many other technologies have done? Even proprietary technologies have provided valuable evolutions forward - things like Flash and yes, even FB. How are you going to live forever? This is why Facebook used to call itself a 'Social Utility' instead of a 'Social Network'. Mark Zuckerberg was never content to be the next Myspace Tom. He wanted to be the next Alexander Graham Bell. And now he is.
  3. Don't just iterate, innovate. Of course, someone has to build Apps. We can't all be working at the infrastructure layer. But too many of the Apps we choose to build (or champion) are incremental. As startup founders, investors and influencers it's so easy to understand something that can be described as the 'Flipboard of Monkeys' instead of thinking really hard about how a completely new idea might fit into the future. Sure, there are plenty of good business and marketing reasons why you shouldn't stray too far from the beaten path, broadening it one incremental feature at a time, but the core essence of what you're working on can't be yet another turn of a very tired wheel. If you're shouting 'Me too' then you're probably not thinking big enough.
  4. B2C, not Ego2C. Silicon Valley is clearly a B2C town. We all love the sexy new app that our mother might eventually understand. Something we can get millions of users to use so we can show them lots of ads. Besides the fact that I think we should focus a little more on B2B, the problem is we're not really a B2C town at all. We're actually more focused on what I will call Ego2C. That is, we pick our favorite apps based on how famous the founding team is OR how easily we can use the app to build yet another niche audience for ourselves (and brands/marketers). It would be a tragedy if the social web revolution boils down to new methods of PR and marketing. But that's what we seem to be obsessed with. As soon as any app from a famous founder gets released we give it tons of buzz while plenty of more deserving projects get barely a squeak. If the app gets a little traction (typically the ones that have Ego mechanics baked in) you see a million posts about how marketers can exploit it. Inevitably the app developers start to focus on how to 'increase social coefficients' instead of how to help human beings make a connection or find utility in their lives.
  5. "Users don't care" Speaking more specifically about the Open vs. Closed debate, too often we hear the criticism "Users don't care about open". This is absolutely true and the reason why most open efforts fail. Users don't care about open. They care about utility and choice. This is why the only way to continue propagating the open web is to work with BUSINESS. B2B. Startups, Media Brands, The bigco Tech companies. They care about open because the proprietary winners are kicking the losers ass and that usually means there are at least 1 or more other guys who need a competitive advantage. They need to team up and build, deploy and popularize the open alternative.  That's why open always wins. There's always plenty of losers around who are going to commoditize the popular closed thing. As technology leaders we're paid to care about things users don't care about. Things that shape the future. While users, in the short term, might not care, we should dare to think and dream a little bigger. As a case study look at Android vs. iOS. iOS is more profitable for a single company, but the other is now a force of nature.
  6. Death is just a stage of life. Just because something is no longer interesting doesn't mean it's dead. Its spirit, and oftentimes the actual technology, lives on, one layer below the surface. RSS is a great example of this. RSS's spirit lives on in ActivityStreams and the general publish/subscribe model. It is powering almost every service-to-service interaction you currently enjoy. Is it dead, or has it simply become part of the DNA of the Internet? Could RSS (or something like it) be better exposed higher up in the stack? Absolutely, but that will take some time, thoughtful execution and influencers who are willing to champion the cause. The same is true for OpenID and OAuth.
  7. The arc of the universe is long but it bends towards Open. The battle of Open vs. Closed is not a zero sum game. Both have their time. It's a sine wave. First, closed, proprietary solutions come to define a new way of fulfilling a use case and doing business. They solve a problem simply and elegantly and blaze a path to market awareness, acceptance and commercialization. Open, however, always follows. Whether it's a year, a decade or a century, Open. Always. Wins. The only question is how long, as an industry, are we going to keep our tail tucked between our legs in front of the great giant proprietary platform of the moment, or are we going to get our act together to ensure the "Time to Open" is as short as possible? It takes courage, co-ordination and vision, but we can all play our part to shorten the time frame between the invention of a proprietary app and the absorption of that value into the open web platform.
  8. Acknowledge reality. FB has won. It's done. Just like Microsoft won the Desktop OS (in part handed to them by IBM), so too has FB won the Social OS (in part handed to them by Microsoft). For now. Acknowledging the truth is the first step to changing it. The only question now is how long we're all willing to wait until we get our act together to turn the proprietary innovation of the 'social graph' into part of the open web's core DNA. We need to recognize our power. They have ~1B users? The open web has more. Chances are that the major website or brand you work for has plenty of its own users as well. Are you going to send them to FB, or are you going to invest in your own .com? Trust me, I know it's really, really easy to take what you're given because you're too busy putting out a million fires. But as technology leaders I challenge us all to build something better. We're the only ones who can.
  9. [Edit] Don't kill Hollywood. Did you catch the YC post calling for Silicon Valley to kill Hollywood? Not only was this reckless and short sighted, it's the exact opposite of what we should be doing. Instead of trying to kill or cannibalize media companies and content creators, how about we work with them to create the next generation of information technology? They have the audiences + information and we have the technology. Instead, most Silicon Valley companies, by virtue of their B2C focus, are too busy leeching off major media instead of finding ways to help transform it. Sure, most of them move slowly - but move they do. Move they must. Helping them is very profitable. I write more about this on the Echo blog - calling it 'Real-time Storytelling'.
  10. [Edit] Today's data portability problem. When I started the DataPortability project, the issue of the day was personal data portability. That's not the case anymore. While user-centric data portability is still being done via proprietary mechanisms, it's a) actually possible and b) moving more towards open standards every day. The real issue right now is firehoses. Access to broad corpuses of data so that 3rd parties can innovate is only possible through firehoses (for now). To put it another way, the reason Google was possible was because the open web was crawl-able - for free - with no biz dev deal. The reason FB was possible was because the open web allowed any site to spring up and do what it wanted to do. Today, too much of our data is locked up in closed repositories that can and must be cracked open. Google's moves to exclude other socnets (besides G+) from their search results until they had free and clear access to them might be inconvenient for users in the short term, but, as a strategic forcing function, they are in the best interest of the open web long term.

End of rant.

Analysis of F8, Timeline, Ticker and Open Graph

Added on by Chris Saad.

So at F8 last week Facebook announced Ticker, Timeline and extensions to the Open Graph API to allow for new verbs and nouns. Here's what really happened.

  • They split their single 'News Feed' into 3 levels of filtering: Now (Ticker), Relevant (News Feed), Historical (Timeline). (Side note: we've had a 'Ticker' style product at Echo that we called 'Community Stream' for a long time now - and most of our customers and partners said to us 'why would we want to show all that data? It's just noisy'. Maybe now they will take a second look.) Question: Will G+, Twitter and the REST of the web adopt the same model? They should. (A rough sketch of this three-tier model follows this list.)
  • This allows FB to collect more 'noise' (also known as synaptic firings or Attention data) which, in turn, allows them to find more signal (also known as synaptic inferences or attention management). I've long said that the answer to information overload is not LESS information - it's MORE. The more information you have, the more ability you have to find patterns and surface them in relevant places (I said it so long ago I can't even find the link). Question: Will independent websites think to collect their OWN Attention data BEFORE sending it to FB so they can leverage it for their own purposes? The value of this data is incalculable.
  • Having these new presentation metaphors in place, they then created a mechanism to collect more data in the form of expanded Verbs and Nouns in the Open Graph API. With this new API, users are now expected to abandon explicit gestures of sharing and instead accept that every action they take is auto-shared to their friends. Question: When will the first horror stories start coming out about engagement ring purchases, personal health issues and sexual orientations being inappropriately revealed due to auto-sharing?
  • Using all the bling of the Timeline, along with new messaging and a simple little opt-in toggle of 'Add to my timeline', they managed to re-launch 'Beacon' without anyone noticing (none of the tech blogs I saw even mentioned it). Question: Why did none of the tech media cover that angle of the story?
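
To make the three-tier split concrete, here's a rough sketch of one activity stream presented at the three levels of filtering described above. It's an illustration of the model, not Facebook's implementation; the field names and thresholds are assumptions.

```typescript
// Illustrative only: one stream of activities, three presentation tiers.
interface Activity {
  actor: string;
  verb: string;      // e.g. "listened to", "read", "commented on"
  object: string;
  published: Date;
  relevance: number; // 0..1, however the ranking model scores it
}

// Ticker ("Now"): everything, newest first, no filtering.
const ticker = (stream: Activity[]) =>
  [...stream].sort((a, b) => b.published.getTime() - a.published.getTime());

// News Feed ("Relevant"): only the activities that clear a relevance bar.
const newsFeed = (stream: Activity[], threshold = 0.6) =>
  ticker(stream).filter((a) => a.relevance >= threshold);

// Timeline ("Historical"): everything, grouped by year for retrospective browsing.
const timeline = (stream: Activity[]) =>
  ticker(stream).reduce<Record<number, Activity[]>>((byYear, a) => {
    const year = a.published.getFullYear();
    (byYear[year] ??= []).push(a);
    return byYear;
  }, {});
```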

I continue to be in awe of Facebook's scale, seriousness, ambition and momentum. There has never been anything like it before.

They have created an Attention Management Platform that rivals Google Search and easily outclasses many of my best ideas about Attention Management and Personal Relevancy from back when I was thinking about the problem.

It's breathtaking.

And since it is all done with hard links to a single proprietary hub, it is eating the web like a cancer.

Before F8 it was clear that Google+ was 1 or 2 years behind FB. Now they are 3 or 4.

Only time will tell who, how and why more open systems will begin to reassert themselves in the ecosystem. My bet is that it won't come from a B2C copycat, though. It will come from a well organized, commercially incentivized B2B play.

The part that still confuses me, though, is why ANY serious media company would want their news to load in a 'FB canvas app' instead of their own website. It makes zero sense. None of this changes the reality that you need to own your own data and your own point source. I made a little comparison table earlier in the week that explains why.

WSJ Outsources its business to Facebook

Added on by Chris Saad.

Today WSJ announced that it has built a news publishing platform that lives inside Facebook - effectively outsourcing its core website to the social networking giant. The number of reasons this is a bad idea is staggering. I've tried to summarize them in a spreadsheet comparing a FB approach versus an Open Web approach.

Please feel free to contribute.

Real Names getting Real Attention

Added on by Chris Saad.

There's a lot of fury on the web right now about 'Real Names'. FB is trying to use it as a unique feature of their comments system, claiming it reduces trolling and low value comments. Of course that isn't really true. For one, any commenting system could force FB login. Two, users will troll with or without their name attached and, worse yet, many legitimate users won't participate for any number of reasons if they can't use a pseudonym. There are plenty of better ways to increase quality in your comments, including participation from the content creators, game mechanics, community moderation and more.

The real debate, however, is about G+ trying to copy FB's stance on Real Names. They are insisting all user accounts use them and are actively shutting down accounts that violate the policy. They are being so heavy-handed about it that even people who ARE using their real name are getting notices of violation - most notably Violet Blue.

I'm not really an expert on pseudonyms, shared contexts and anonymity so I'm going to stay out of this debate.

The real question for me, however, is what Google's strategic business reason for this policy is. There must be a long-term plan/reason for it, otherwise they wouldn't be insisting so hard.

My assumption is that it's related to their intention to become a canonical people directory and identity provider on the internet to compete with FB in this space.

FB, after all, does not just get its power from news feeds and photo apps - it gets it from the deep roots it has laid down into the DNA of the internet as the provider of 1st class identity infrastructure and identity information.

In this sense, FB's social contract has served them very well, and Google's attempt to copy it is a hint that they understand FB is not just a .com feature set, but a powerful identity utility. They must (and in some cases seem to) understand that strategy and its aggressiveness if they are to properly compete with the monopoly. My only hope, however, is that they are coming up with their own inspired counter-strategy rather than just copying the moves they see on the surface - because that's doomed to fail.

What is 'Real-time as a Service'?

Added on by Chris Saad.

First, to define 'Real-time': Real-time means no CDN or cache latency. When there is new data in the database, it's available to the end-user.

Real-time is not needing to hit the refresh button to see new information. It's when information folds into the page while you're reading it.

Real-time is a new volume and velocity of data. A lot of web data used to consist of 'Blog Posts' or 'News Articles'. Documents. Real-time web data is about activities. Granular, human-readable micro-stories about the actions that users take.

"I read this", "I rated this", "I commented on this", "I shared this", "I edited this" and so on. Why? Because capturing, surfacing and socializing real-time activity data is part of the core essence of the social web. The ability to see not just the result of actions by users, but the play-by-play stream of those actions along side faces, names and time/date stamps takes an experience from a static 'snapshot' into a living, breathing stream. Further, by enabling users to like, reply, flag, share and otherwise interact with these activities, sites are creating new opportunities for engagement, conversation and conversion.

Real-time is a presentation metaphor. It often (but not always) takes the form of a reverse chronological stream with nested comments and likes. It helps users understand the order of things and mixes content with conversation in a way that drives engagement and return visits.

Real-time means filters instead of facts. Let the user decide what they want to see - to craft an experience that makes sense for them, and their friends.

Now, what is 'Real-time as a Service'?

If all the things above are true, then it changes everything we used to know about web infrastructure, databases, user interfaces and tools for moderation or curation.

APIs can no longer be request-response. Databases must now store far more data at far faster rates. User interfaces need to factor in names, faces and actions. Moderation and curation tools must leverage algorithms, crowd sourcing and real-time flows.

Real-time as a service, then, is cloud infrastructure that helps make this transition easier.

It is a database that can handle new magnitudes of scale - handling hundreds or thousands of write events per second. Not just to a flat table, but to a hierarchical tree of arbitrary activities.

Site -> Section -> Article -> Rating -> Comment -> Reply -> Like.

It's a database that can store all items permanently so that users can visit old streams at any time. Permanent storage that can also handle localized annotations. Localized annotations are the ability to modify the metadata of an activity - say a Tweet (promote it, tag it, retarget it in the tree, etc.) - in such a way that your view of a tweet is different from another customer's view.
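
Here's a hedged sketch of what a localized annotation might look like in practice: the canonical item stays untouched and each customer's view is produced by overlaying their own metadata on top of it. The names are illustrative assumptions, not Echo's actual API.

```typescript
// Illustrative sketch: per-customer annotation overlays on a shared canonical item.
interface CanonicalItem {
  id: string;
  source: "twitter" | "native";
  text: string;
  parentId?: string; // position in the content tree
}

interface Annotation {
  promoted?: boolean;
  tags?: string[];
  parentId?: string; // retarget the item to a different node in the tree
}

// customerId -> itemId -> annotation; the canonical store is never mutated.
const overlays = new Map<string, Map<string, Annotation>>();

function annotate(customerId: string, itemId: string, ann: Annotation): void {
  const forCustomer = overlays.get(customerId) ?? new Map<string, Annotation>();
  forCustomer.set(itemId, { ...forCustomer.get(itemId), ...ann });
  overlays.set(customerId, forCustomer);
}

function viewOf(customerId: string, item: CanonicalItem): CanonicalItem & Annotation {
  const ann = overlays.get(customerId)?.get(item.id) ?? {};
  return { ...item, ...ann };
}

// The same tweet looks different to two different customers.
const tweet: CanonicalItem = { id: "t1", source: "twitter", text: "Great match!", parentId: "article-42" };
annotate("acme", "t1", { promoted: true, tags: ["sports"] });
console.log(viewOf("acme", tweet));   // promoted and tagged for Acme
console.log(viewOf("globex", tweet)); // untouched canonical view for everyone else
```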

It's a database that enables not just the ability to perform an SQL-like search query, but also continuously updates you when the data changes - so that you can modify the UI on the fly.
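
A minimal sketch of that 'query that keeps updating you' idea: the caller registers a predicate and a callback, and every new write that matches is pushed to the callback. Purely illustrative; not the actual interface of any product mentioned here.

```typescript
// Illustrative live-query sketch: a one-shot query plus a continuous update channel.
type Predicate<T> = (item: T) => boolean;
type Listener<T> = (item: T) => void;

class LiveStore<T> {
  private items: T[] = [];
  private subscriptions: { match: Predicate<T>; notify: Listener<T> }[] = [];

  // Classic request-response query.
  query(match: Predicate<T>): T[] {
    return this.items.filter(match);
  }

  // Live query: returns current results and keeps pushing future matches.
  subscribe(match: Predicate<T>, notify: Listener<T>): T[] {
    this.subscriptions.push({ match, notify });
    return this.query(match);
  }

  write(item: T): void {
    this.items.push(item);
    for (const s of this.subscriptions) {
      if (s.match(item)) s.notify(item); // fold new data into the UI on the fly
    }
  }
}

// Usage: keep a comment widget current without the user hitting refresh.
const store = new LiveStore<{ articleId: string; text: string }>();
store.subscribe(
  (c) => c.articleId === "article-42",
  (c) => console.log("render:", c.text)
);
store.write({ articleId: "article-42", text: "First!" }); // pushed to the widget immediately
```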

It's a database that returns not just flat query results, but a hierarchical tree - allowing you to present the activity in context.

It's a database that handles not just a few hundred users requesting (reading) data, but a few million users swarming to see the latest action in a sports game or a concert.

It's a database that organically makes connections between items by understanding the relationships of URLs and #tags to make implicit links in the graph where and when they're needed. For example a tweet mentioning acme.com should be attached to Acme.com in the tree.
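
A hedged sketch of that kind of implicit linking: pull URLs and #tags out of an activity's text and attach the activity to any node in the tree that is already keyed by that host or tag. The extraction rules here are simplified assumptions, not the product's actual matching logic.

```typescript
// Illustrative sketch: derive implicit graph edges from URLs and #tags in an activity's text.
const HOST_PATTERN = /\b(?:https?:\/\/)?((?:[a-z0-9-]+\.)+[a-z]{2,})\b/gi;
const TAG_PATTERN = /#(\w+)/g;

function extractKeys(text: string): string[] {
  const keys: string[] = [];
  for (const m of text.matchAll(HOST_PATTERN)) keys.push(m[1].toLowerCase());
  for (const m of text.matchAll(TAG_PATTERN)) keys.push(`#${m[1].toLowerCase()}`);
  return keys;
}

// Tree nodes already keyed by a host or tag (e.g. the node for Acme.com).
const nodesByKey = new Map<string, string>([["acme.com", "node:acme-homepage"]]);

function implicitParents(tweetText: string): string[] {
  return extractKeys(tweetText)
    .map((key) => nodesByKey.get(key))
    .filter((node): node is string => Boolean(node));
}

console.log(implicitParents("Loving the new release from acme.com #launch"));
// -> ["node:acme-homepage"]
```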

And most importantly, it's a database company that understands that the opportunity of the Real-time, Social Web is far too big and moves far too quickly to possibly be built by a single vendor. A company that, as a result of this understanding, chooses open standards over proprietary formats; Partnership with best-of-breed partners over trying to build mediocre versions of everything by itself.

Polls, Ratings, Comments, Live Blogging, Forums, Data Bridging, Data Enriching, Visualization, Moderation, Curation, Analytics, Game Mechanics, Authentication... the list is endless. They are all transformed by the Real-time web. They must all be part of Real-time as a Service.

And finally, Real-time as a Service is about service. Enterprise grade support. Best in class uptime. White label.

That's Real-time as a Service.

Initial quick thoughts on Google+

Added on by Chris Saad.

It's certainly very slick, but it's a few years behind FB. I mean that not just in timing and network effects, but in the much more strategic sense of platform ambition. FB.com was the FB strategy 4 years ago. FB is now going for the rest of the web. Its reach and role as an identity provider and social infrastructure player make it much more important (and harder to beat) than launching a cool new service. So hopefully the Google+ team is thinking WAY beyond this as a destination site when they are thinking about Google's social strategy.

So far the broad-ranging announcements, from the +1 button to Google Analytics adding Social, bode well for this being a company-wide, product-wide refresh. The key to success will be in thinking about the need to compete with FB beyond the walls and products of Google.

The key to that, of course, will be to get deep adoption by major sites.

Update: Upon thinking about it a little more, Google has once again missed an opportunity to play to their strengths. With the document web they played the role of aggregator and algorithmic signal detection system. With the social web, their ideal strategy would be to build the ultimate social inbox. A place where I can navigate, consume AND interact with Facebook + Twitter + Foursquare + Quora +++ all in one place.

Instead they created yet another content source.

NYT Paywall, Huffpo Lawsuit - Symptoms of the same misconception

Added on by Chris Saad.

Over the last few days I have been debating the NYT paywall on a private email thread with friends. I didn't feel the need to post it on my blog because I thought that paywalls were so obviously a losing strategy that it was a waste of time to comment.

But combined with the recent lawsuit against the Huffington Post and Arianna Huffington's eloquent response yesterday, I felt it was worthwhile to re-publish my thoughts here. Most of them are based on thinking and writing that I did many years ago around Attention. Most of that old writing has been lost in the blog shuffle. Hopefully one day I will dig it up and re-post it in a safe place.

On to the issue...

The price of content

I believe that people have historically paid for the medium not the content.

They pay for 'Cable' not for 'CNN News'. They pay for 'The Paper' not for the content in the newspaper. They pay for 'CDs' not for the music on the album.

They also paid a lot because the medium was perceived to be scarce: scarce materials, scarce shelf space, scarce advertising dollars, scarce talented people.

Consumers are not stupid, they understand (if only somewhere at the back of their mind) that the COST of creating and distributing things has been deflated by a growing list of converging trends.

We live in a world of abundance (in the area of digital content anyway). Shelf space is infinite (database entries), any kid in a basement can make content, and there is no physical media anymore, so the cost of distribution has disappeared as well.

The scarcity now is on the consumption side - Attention is the scarce resource. Value is derived from scarcity.

That's why on the Internet, Attention allocation systems (Google Search, FB News Feed etc) are attracting traffic, engagement and ultimately profit.

In this new world, the price of content must be reduced significantly as shakeouts and rebalancing occurs - because the cost of producing it is approaching zero.

The more the Music, TV and News industries fight this, the more they leave themselves open to disruption by Google, FB, Twitter and the rest of Silicon Valley.

This is not even to mention that everyone is producing content now. Tweets, Photos, Videos - it's abundant. Of course most of it isn't very 'good' by J school standards - but that's irrelevant. The world has never rewarded good with any consistency.

Also just because content is not good, doesn't mean it isn't personally meaningful.

For example, I care more what my child (theoretical child of course) posts to FB than the most important journalist in all the world says on CNN.

But please don't confuse my dispassionate assessment of the issue with pleasure or happiness at the demise of mainstream media.

I am simply stating the facts because without understanding them we can't begin to change them (if that's what the media world decides to do).

In terms of making a judgement on those facts, I think that curators who weave and summarize a broader narrative in the form of 'reporting' are critical for an informed citizenry and a functional democracy. I believe in it so much that I have dedicated my life to helping mainstream media companies stay relevant and co-writing things like this: http://aboutecho.com/2010/08/18/essay-real-time-storytelling/

But I also believe that mainstream mass media broke an ancient (and by ancient, I mean as old as rudimentary human communication) pattern of people telling each other personal stories vs. getting all their stories/news from editorialized mass broadcasts.

The Internet may just be restoring the balance. The result is some massive restructuring of inflated budgets, processes, offices, costs etc. While we're in the middle of that restructuring, it looks like a media apocalypse. Until it settles down and a new equilibrium is found.

Here's what Arianna wrote on the subject:

The key point that the lawsuit completely ignores (or perhaps fails to understand) is how new media, new technologies, and the linked economy have changed the game, enabling millions of people to shift their focus from passive observation to active participation -- from couch potato to self-expression. Writing blogs, sending tweets, updating your Facebook page, editing photos, uploading videos, and making music are options made possible by new technologies.

The same people who never question why someone would sit on a couch and watch TV for eight hours straight can't understand why someone would find it rewarding to weigh in on the issues -- great and small -- that interest them. For free. They don't understand the people who contribute to Wikipedia for free, who maintain their own blogs for free, who tweet for free, who constantly refresh and update their Facebook pages for free, and who want to help tell the stories of what is happening in their lives and in their communities... for free.

Free content -- shared by people who want to connect, share their passions, and have their opinions heard -- fuels much of what appears on Facebook, Twitter, Tumblr, Yelp, Foursquare, TripAdvisor, Flickr, and YouTube. As John Hrvatska, a commenter on the New York Times, wrote of the Tasini suit, "So, does this mean when YouTube was sold to Google that all the people who posted videos on YouTube should have been compensated?" (And Mr. Hrvatska no doubt contributed that original and well-reasoned thought without any expectation he'd be paid for it. He just wanted to weigh in.)

Read more in her post.

Update

And here's a bit of 'Free Content' - a conversation I had on Twitter with someone who disagreed with this post.

What is Echo StreamServer?

Added on by Chris Saad.

Yesterday we announced a new Echo product called StreamServer. There is very little more I can say that Khris Loux has not already said so eloquently on stage at the #e2 launch event.

When you work so hard and long on something (depending on how you look at it, StreamServer was either 15, 2.5 or 1 year in the making) it's hard to sum it all up in one 1-hour event.

But that's what we tried to do.

We tried to thread the needle between a contemporary story about activity data, the existential change (read: opportunity or threat) occurring on the web as traffic and monetization flow to proprietary social networking platforms, the opportunity for every major node on the web to be just as powerful and innovative, the need for open standards and powerful cloud services as the basis of the rebuttal, and our deep desire to make this an industry wide effort. We tried to communicate the important role of aggregation and the pivotal part that mainstream media, e-commerce, entertainment, startups and agencies play in curating activity information for the masses.

We also tried to communicate that this was not just a pipe dream, but rather a commercial reality for major customers. A solution running at scale. A new distribution and monetization opportunity for 3rd party devs and a future-ready piece of infrastructure for media companies.

I think we did the best job possible at threading all these stories, and doing it with a human, authentic voice through the lens of customer and partner experiences.

I'm proud of the work we've done so far, and the tireless efforts of the Echo team and our customer/partner devs.

And all of that being said, though, we are only at the beginning. We have just planted the first seed and I look forward to helping it grow.

So what is StreamServer in my words?

It is the real-time, social-scale database that Twitter, Facebook, Quora, Foursquare and others built, delivered as an EC2-style cloud service. Turn it on, and forget about managing the data or scaling the infrastructure.

It is the first of its kind and it will hopefully form the basis of many new companies as they deliver many new, novel and innovative experiences to customers and end users everywhere.

And it's a bet on the future of open standards, developer ecosystems and a heterogeneous web made up of first-class social nodes.

It's Real-time as a Service.

New Twitter. Feature comparison

Added on by Chris Saad.

Jeremiah and I wrote an analysis of the New Twitter vs. Current Facebook. Here's a snippet:

Situation: Twitter’s new redesign advances their user experience

Twitter has announced a new redesign today, yet looking at the news, there hasn't been a detailed breakdown of these two leading social networks. Overall, Twitter's new features start to resemble some features of a traditional social network, beyond their simple messaging heritage. We took the key features from both social websites, did a comparison and voted on the stronger player.

[Great Detailed Graph goes here - See it on Jeremiah's blog]

Our Verdict: Facebook Features Lead Over Twitter’s New Redesign

Facebook’s features offer a more robust user experience, and they have a longer history of developing the right relationships with media, developers, and their users. Twitter, a rapidly growing social network has launched a series of new features (described by the founder as “smooth like butter”) that provide users with a snappy experience and enhanced features.

We tallied the important features of this launch against their overall expansion strategy and have concluded that Facebook's features continue to hold dominance over Twitter, despite the noticeable improvements. While we don't expect that Twitter wants to become 'another Facebook', they should play to their strengths, remaining nimble and lightweight while allowing developers and content producers to better integrate into their system.

Check out the full results over on his blog.

Guest Post: Facebook's claims about data portability are false

Added on by Chris Saad.

I have published a guest post on RWW about Facebook's recent privacy challenges and their claims about data portability.

"The lack of honesty and clarity from the company and its representatives ... and the continued trend of taking established language - such as "open technology" or "data portability" - and corrupting it for its own marketing purposes, is far more disconcerting than the boundaries it's pushing with its technology choices."

Read it here.

Diaspora is not the answer to the Open Web, but that's ok

Added on by Chris Saad.

For whatever reason, a new project called Diaspora is getting a lot of attention at the moment. They are four young guys who have managed to crowdsource $100k+ to build an open, privacy-respecting, peer-to-peer social network. A number of people have asked me what I think, so instead of repeating myself over and over I thought I would write it down in one place.

First, I don't think Diaspora is going to be the 'thing' that solves the problem. There are too many moving parts and too many factors (mainly political) to have any single group solve the problem by themselves.

Second, I don't think that's any reason to disparage or discourage them.

When we launched the DataPortability project, we didn't claim we would solve the issue, but rather create a blueprint for how others might implement interoperable parts of the whole. We soon learned that task was impractical to say the least. The pieces were not mature enough and the politics was far too dense.

Instead, we have settled for providing a rolling commentary and context on the situation and promoting the efforts of those that are making strides in the right direction. We also play the important role of highlighting problems with closed or even anticompetitive behaviors of the larger players.

The problem with the DataPortability project, though, was not its ambition or even its failure to meet those ambitions, but rather the way the 'old guard' of the standards community reacted to it.

The fact of the matter is that the people who used to be independent open advocates were actually quite closed and cliquey. They didn't want 'new kids on the block' telling them how to tell their story or promote their efforts. Instead of embracing a new catalyzing force in their midst, they set about ignoring, undermining and even actively derailing it at every opportunity.

Despite my skepticism about Diaspora, though, I don't want to fall into the same trap. I admire and encourage the enthusiasm of this group to chase their dream of a peer-to-peer social network.

Do I think they will succeed with this current incarnation? No. Do I think they should stop trying? No.

While this project might not work, their effort and energy will not go to waste.

I think we need more fresh, independent voices generating hype and attention for the idea that an open alternative to Facebook can and must exist. Their success in capturing people's imagination only shows that there is an appetite for such a thing.

What they might do, however, is strongly consider how their work might stitch together existing open standards efforts rather than inventing any new formats or protocols. The technologies are getting very close to baked and are finding their way into the web at every turn.

We all need to do our part to embed them into every project we're working on so that peer-to-peer, interoperable social networking will become a reality.

Welcome to the party Diaspora team, don't let the old guard (who have largely left for BigCo's anyway) scare you off.

Google Buzz = FriendFeed Reborn

Added on by Chris Saad.

FriendFeed was dead, now it is re-born as Google Buzz. I've not been able to try the product yet, but philosophically and architecturally it seems superior to FriendFeed.

Here are my observations so far:

Consumption Tools

Buzz is better than FriendFeed because Google is treating it as a consumption tool rather than a destination site (by placing it in Gmail rather than hosting it on a public page). FriendFeed should have always been treated this way. Some people got confused and started hosting public discussions on FriendFeed.

That being said, though, I've long said that news and sharing is not the same as an email inbox and those sorts of items should not be 'marked as read' but rather stream by in an ambient way.

While Buzz is in fact a stream, it is its own tab that you have to focus on rather than a sidebar you can ignore (at least as far as I can tell right now).

How it affects Publishers (and Echo)

The inevitable question of 'How does this affect Echo' has already come up on Twitter. Like FriendFeed before it, Buzz generates siloed conversations that do not get hosted at the source.

So, the publisher spends the time and money to create the content and Buzz/Google get the engagement/monetization inside Gmail.

For some reason, all these aggregators think that they need to create content to be of value. I disagree. I long for a pure aggregator that does not generate any of its own content such as comments, likes, shares etc.

That being said, however, the more places we have to engage with content the more reasons there are for Echo to exist so that publishers can re-assemble all that conversation and engagement back on their sites.

Synaptic Connections

Note that they don't have a 'Follow' button - it's using synaptic connections to determine who you care about. Very cool! I worry though that there might not be enough controls for the user to override the assumptions.

Open Standards

Already, Marshall is calling it the savior of open standards. I don't think Open Standards need to be saved - but they certainly have all the buzzwords on their site, so that's promising.

That's it for now, maybe more later when I've had a chance to play with it.

Update: After playing with it this morning, and reading a little more, it's clear that this is actually Jaiku reborn (not FriendFeed), because the Jaiku team were involved in building it. They deserve a lot of credit for inventing much of this stuff in the first place - long before FriendFeed.

Also, having used it only for an hour, the unread count on the Buzz tab is driving me nuts. It shouldn't be there. It's a stream not an inbox. Also it makes no sense why I can't display buzz in a sidebar on the right side of my primary Gmail inbox view. That would be ideal.

It's also funny to me that some people have tried to give Chris Messina credit for Buzz even though he's been at Google for no more than a month. They clearly don't understand how long and hard it is to build product. Messina is good, but he ain't that good :)

Facebook and the future of News

Added on by Chris Saad.

Marshall Kirkpatrick has written a thoughtful piece over on Read/Write Web entitled 'Facebook and the future of Free Thought' in which he explains the hard facts about news consumption and the open subscription models that were supposed to create a more open playing field for niche voices. In it, he states that news consumption has barely changed in the last 10 years. RSS and Feed Readers drive very little traffic and most people still get their news from hand selected mainstream portals and destination sites (like MSN News and Yahoo news etc). In other words, mainstream users do not curate and consume niche subscriptions and are quite content to read what the mainstream sites feed them.

This is troubling news (pun intended) for those of us who believe that the democratization of publishing might open up the world to niche voices and personalized story-telling.

Marshall goes on to argue that Facebook might be our last hope - that since everyone spends all their time in Facebook already, the service has an opportunity to popularize the notion of subscribing to news sources and thereby bring to life our collective vision of personalized news for the mainstream. Facebook already does a great deal of this, with users getting large amounts of news and links from their friends as they share and comment on links.

Through my work with APML I have long dreamed of a world where users are able to view information through a highly personalized lens - a lens that allows them to see personally relevant news instead of just popular news (note that Popularity is a factor of personal relevancy, but it is not the only factor). That doesn't mean the news would be skewed to one persuasion (liberal or conservative for example) but rather to a specific topic or theme.
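
As a toy illustration of that kind of lens (my own sketch, not how APML or any production ranker actually works), score each story by the overlap between its topics and the reader's attention profile, and let popularity contribute as just one weighted factor.

```typescript
// Toy personal-relevance scorer: popularity is one factor, not the only one.
interface Story {
  title: string;
  topics: string[];   // e.g. ["open-web", "politics"]
  popularity: number; // 0..1, normalized share/click volume
}

// The reader's attention profile: topic -> interest weight (0..1).
type AttentionProfile = Record<string, number>;

function relevance(story: Story, profile: AttentionProfile, popularityWeight = 0.3): number {
  const interest =
    story.topics.reduce((sum, t) => sum + (profile[t] ?? 0), 0) /
    Math.max(story.topics.length, 1);
  return (1 - popularityWeight) * interest + popularityWeight * story.popularity;
}

const profile: AttentionProfile = { "open-web": 0.9, "data-portability": 0.8, celebrity: 0.05 };

const stories: Story[] = [
  { title: "New open standard for activity streams", topics: ["open-web"], popularity: 0.2 },
  { title: "Celebrity breakup shocks fans", topics: ["celebrity"], popularity: 0.95 },
];

// The niche story outranks the popular one for this particular reader.
stories
  .map((s) => ({ title: s.title, score: relevance(s, profile) }))
  .sort((a, b) => b.score - a.score)
  .forEach((s) => console.log(s.score.toFixed(2), s.title));
```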

Could Facebook popularize personalized news? Should it? Do we really want a closed platform to dictate how the transports, formats and tools of next generation story-telling get built? If so, would we simply be moving the top-down command and control systems of network television and big media to another closed platform with its own limitations and restrictions?

Personalized news on closed platforms is almost as bad as mainstream news on closed platforms. News organizations and small niche publishers both need a way to reach their audience using open technologies, or we are doomed to repeat the homogenized news environment of the last 2 decades. The one that failed to protect us from a war in Iraq, failed to innovate when it came to on-demand, and failed to allow each of us to customize and personalize our own news reading tools.

That's why technologies like RSS/Atom, PubSubHubbub and others are so important.
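
For context on how lightweight that open plumbing is, here's a rough sketch of a PubSubHubbub-style subscription request. The hub.* parameter names are from my recollection of the spec, so treat them as assumptions and check the spec before relying on them.

```typescript
// Hedged sketch: subscribing a callback URL to a feed topic via a PubSubHubbub-style hub.
// The hub.* parameter names follow my recollection of the spec - verify before use.
async function subscribe(hubUrl: string, topicUrl: string, callbackUrl: string): Promise<void> {
  const body = new URLSearchParams({
    "hub.mode": "subscribe",
    "hub.topic": topicUrl,       // the RSS/Atom feed to follow
    "hub.callback": callbackUrl, // where the hub should push new entries
  });
  const res = await fetch(hubUrl, { method: "POST", body });
  if (!res.ok) throw new Error(`Subscription request failed: ${res.status}`);
  // The hub then verifies the callback and pushes new feed content to it as it arrives -
  // no polling, no refresh button.
}
```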

What's missing now is a presentation tool that makes these technologies sing for the mainstream.

So far, as an industry, we've failed to deliver on this promise. I don't have the answers for how we might succeed. But succeed we must.

Perhaps established tier 1 media sites have a role to play. Perhaps market forces that are driving them to cut costs and innovate will drive these properties to turn from purely creating mainstream news editorially toward a model where they curate and surface contributions from their readership and the wider web.

In other words, Tier 1 publishers are being transformed from content creators to content curators - and this could change the game.

In the race to open up and leverage social and real-time technologies, these media organizations are actually making way for the most effective democratization of niche news yet.

Niche, personalized news distributed by open news hubs born from the 'ashes' of old media.

Don't like the tools one hub gives you? Switch to another. The brands we all know and love have an opportunity to become powerful players in the news aggregation and consumption game. Will they respond in time?

Due to my experience working with Tier 1 publishers for Echo, I have high hopes for many of them to learn and adapt. But much more work still remains.

Learn more about how news organizations are practically turning into personalized news curation hubs over on the Echo Blog.

A call for focus from the open standards community

Added on by Chris Saad.
Time to refocus the open community

Over on the Open Web Foundation mailing list, Eran Hammer-Lahav - who, despite his gruff and disagreeable personality, I respect greatly for his work in the development of open standards - is effectively calling for a complete shakeup of the foundation and the effort being poured into the 'common ground' of the standards community.

Let me define the 'Common Ground' as I see it.

Building strong common ground is like building strong open standards deep into the stack. Just like a software stack, our community needs a stack of organizations that are loosely coupled and open to participation. Groups like the W3C and IETF provide a rock solid core, more agile groups focused on specific standards like OpenID and OAuth are in the middle, and a project like the DataPortability project was supposed to be on top - a kind of user interface layer.

You see, good standards efforts are necessarily projects that work to solve one small problem well. The problems are often deep technical challenges that attract passionate and, let's face it, geeky people to hack, debate and decide on details that don't hit the radar for 99.9% of the population.

The problem, of course, is that the rest of the world has to care for a standard to matter.

Leaders and project managers need to be found, real companies need to get involved (not just their staff), collaboration platforms need to facilitate real and open discussion, calls for collaboration need to be heard, specs need to be written (and written well), libraries need to be written, governance needs to be put in place and so on.

Also, once the standard is (half) baked, less involved hackers need to participate to test the theories in the real world. Less savvy developers need to hear about the standard and understand it. Business people need to understand the value of using a standard over a proprietary solution. They also need IP protections in place to ensure that by using the standard they are not putting their company at risk. Marketing people need to know how to sell it to their customer base. Customers need to know how to look for and choose open solutions to create a marketplace that rewards openness.

All of this is 'Common Ground'. It is common to any standards effort and there should - no, must - be an organization that is just as lean, mean and aggressive as Facebook in place to provide these resources if we are ever going to compete with closed solutions.

At the start of 2008 the DataPortability project became very popular. Its goal was not to build standards, but rather to promote them. To provide much of the common ground that I described above.

The DP project's particular mission, in my mind at least, was to focus on the marketing effort. To build a massive spotlight and to shine that intense light on the people, organizations and standards that were getting the job done.

Is the OWF providing a generic legal/IPR framework? Fantastic! It was the DPP's job to let everyone know - developers, business execs, media, potential editors, contributors and more. Our job was not, and should never be, to start the framework itself, but rather to advocate for, provide context around and promote the hell out of someone else's effort to do so.

Is a conference happening next year? Excellent. It was the DPP's job to get in touch with the conference organizer and organize not just a DP panel, but a DP Track, and to create room (and perhaps even a narrative) inside which the people doing the actual work can speak.

Has Facebook just announced a new feature that could have been achieved through a combination of existing open standards? Then it is the DPP's job to consult with each of those standards groups and create a cohesive response/set of quotes for the media to use.

What is the relationship between Facebook Platform, OpenSocial, Open Standards, OpenID, OAuth, Portable Contacts and Twitter's 'Open API'? DataPortability.org should have the answer neatly described on its website.

Unfortunately, though, many in the standards community chose to fight the creation of the project for whatever reasons crossed their mind at the time. They used all sorts of methods to undermine the effort. Some that would put Fox News to shame.

The result, of course, has been a diversion from the important work of providing this common ground to the standards community to a self-protective state of creating governance and creating our own 'deliverables' in order to justify and protect our own existence.

I have, as a result of a series of unfortunate events, fallen out of touch with the Steering group at the DPP. Moving to the US, getting disillusioned with the community I admired (not those involved with the DPP - my friends at the DPP Steering group have always performed very admirably and worked extremely hard) and ultimately shifting my world view to realize that the best contribution I can make - the best way to really move the needle - is to ship Data Portability compliant software at scale.

At this juncture, however, I think it's time for us all to refocus on our original mission for the DataPortability Project.

To restate my humble view on the matter:

  • To provide a website that explains data portability to various audiences in neat and concise ways. It is the onramp for the standards community. You should be able to send anyone to 'dataportability.org' and they 'get it' and know what to do next.
  • To provide context and advocacy on news and developments from inside and outside the standards community so that media, execs and less involved developers can understand and react.
  • To build a community of interested parties so that they can swarm to the aid of standards groups or the standards effort in general.
  • To act as a market force to (yes, I'm going to say it) pick winners. To highlight what works, what doesn't and what should be done next to move the whole effort forward. Nothing is as powerful as removing confusion and planting a big red flag on the answer.
  • To recognize that we have the authority to do whatever we want to do because we are an independent, private group who has chosen to create public/transparent processes. We need to believe in ourselves. If we do good work, then people will listen. If we don't, then they can listen to someone else.

This necessarily means that the only real deliverable from the project would be a small set of communication tools that build community, context and advocacy around what we believe is the 'truth' (or at least things worth paying attention to) in the broader standards community.

In my book that is not only a very worthy effort, it is increasingly critical to the success and health of the web.

Over on the Open Web Foundation mailing list Eran Hammer-Lahav who, despite his gruff and disagreeable personality, I respect greatly for his work in the development of open standards, is effectively calling for a complete shakeup of the foundation and the work being poured into the 'common ground' of the standards efforts.

Let me define the 'Common Ground' as I see it.

Building strong common ground is like building strong open standards deep into the stack. Just like a software stack, our community needs a stack of organizations that are loosely coupled and open to participation. Groups like the W3C and IETF provide a rock solid core, more agile groups focused on specific standards like OpenID and Oauth are in the middle and a project like the DataPortability project was supposed to be on top - a kind of user interface layer.

You see, good standards efforts are neccessarily projects that work to solve one small problem well. The problems are often deep technical challenges that attract passionate and, let's face it, geeky people to hack, debate and decide on details that don't hit the radar for 99.9% of the population.

The problem, of course, is that the rest of the world has to care for a standard to matter.

Leaders and project managers need to be found, real companies need to get involved (not just their staff), collaboration platforms need to facilitate real and open discussion, calls for collaboration need to be heard, specs need to be written (and written well), libraries need to be written, governance needs to be put in place and so on.

Also, once the standard is (half) baked, less involved hackers need to participate to test the theories in the real world. Less savvy developers need to hear about the standard and understand it. Business people need to understand the value of using a standard over a proprietary solution. They also need IP protections in place to ensure that by using the standard they are not putting their company at risk. Marketing people need to know how to sell it to their customer base. Customers need to know how to look for and choose open solutions to create a marketplace that rewards openness.

All of this is 'Common Ground'. It is common to any standards effort, and there should - no, must - be an organization in place that is just as lean, mean and aggressive as Facebook to provide these resources if we are ever going to compete with closed solutions.

At the start of 2008 the DataPortability project became very popular. Its goal was not to build standards, but rather to promote them. To provide much of the common ground that I described above.

The DP project's particular mission, in my mind at least, was to focus on the marketing effort. To build a massive spotlight and to shine that intense light on the people, organizations and standards that were getting the job done.

Is the OWF providing a generic legal/IPR framework? Fantastic! It was the DPP's job to let everyone know - developers, business execs, media, potential editors, contributors and more. Our job was not, and should never be, to start the framework itself, but rather to advocate for, provide context around and promote the hell out of someone else's effort to do so.

Is a conference happening next year? Excellent. It was the DPP's job to get in touch with the conference organizer, organize not just a DP panel but a DP Track, and to create room (and perhaps even a narrative) inside which the people doing the actual work can speak.

Has Facebook just announced a new feature that could have been achieved through a combination of existing open standards? Then it is the DPP's job to consult with each of those standards groups and create a cohesive response/set of quotes for the media to use.

What is the relationship between Facebook Platform, OpenSocial, Open Standards, OpenID, OAuth, Portable Contacts and Twitter's 'Open API'? DataPortability.org should have the answer neatly described on its website.

Unfortunately, though, many in the standards community chose to fight the creation of the project for whatever reasons crossed their mind at the time. They used all sorts of methods to undermine the effort. Some that would put Fox News to shame.

The result, of course, has been a diversion from the important work of providing this common ground to the standards community to a self-protection state of creating governance and creating our own 'deliverables' in order to justify and protect our own existence.

I have, as a result of a series of unfortunate events, fallen out of touch with the Steering group at the DPP. Moving to the US, getting disillusioned with the community I admired (not those involved with DPP. My friends at the DPP Steering group have always performed very admirably and worked extremely hard) and ultimately shifting my world view to realize that the best contribution I can make - the best way to really move the needle - is to ship Data Portability compliant software at scale.

At this juncture, however, I think it's time for us all to refocus on our original mission for the DataPortability Project.

To restate my humble view on the matter:

  • To provide a website that explains data portability to various audiences in neat and concise ways. It is the onramp for the standards community. You should be able to send anyone to 'dataportability.org' and they 'get it' and know what to do next.
  • To provide context and advocacy on news and developments from inside and outside the standards community so that media, execs and less involved developers can understand and react.
  • To build a community of interested parties so that they can swarm to the aid of standards groups or the standards effort in general.
  • To act as a market force to (yes I'm going to say it) pick winners. To highlight what works, what doesn't and what should be done next to move the whole effort forward. Nothing is as powerful as removing confusion and planting a big red flag on the answer.
  • To recognize that we have the authority to do whatever we want to do because we are an independent, private group who has chosen to create public/transparent processes. We need to believe in ourselves. If we do good work, then people will listen. If we don't then they can listen to someone else.

This necessarily means that the only real deliverable from the project would be a small set of communication tools that build community, context and advocacy around what we believe is the 'truth' (or at least things worth paying attention to) in the broader standards community.

Many have scoffed at these goals in the past, claiming that there was no 'value'. In my book this set of goals is not only worthy, it is increasingly critical to the success and health of the web.

Facebook privacy changes are not evil

Added on by Chris Saad.

I give Facebook a lot of crap. But I don't think their latest privacy changes are all that nefarious. It's pretty obvious what they are doing. They want search inventory to sell to Google and Microsoft. They want to be as cool as Twitter.

I think the more important story is that they are turning their square into a triangle.

A well placed friend of mine (who shall remain nameless) gave me this metaphor (I will try not to butcher it too much).

Twitter is like a triangle: a small group of people (at the top) broadcasting to a large group of people down the bottom.

Facebook is/was more like a square. Everyone communicating more or less as equal peers (at least on their own personal profile pages).

This is very rare on the internet. It's rare anywhere really. It's unusual to have a platform that encourages so much 'public' peer-2-peer participation.

It's clear, however, that Facebook is trying to have its cake and eat it too. They want to be a triangle for those who want one, and a square for those who want one of those.

Will it work? Maybe. They are a 'Social Utility' after all. They have never thought of themselves as a vertical social network with a static social contract. As I've said before, their ability to change and evolve at scale is beyond impressive. It has never been seen before.

From College kid profile pages, to app platform, to stream platform, to stream platform with deep identity and routing. Their flexibility, rate of change and reinvention is staggering. They put Madonna and Michael Jackson to shame.

Ultimately Facebook wants to be the Microsoft Outlook and Google Adsense of the Social Web all rolled into one. Maybe throw some PayPal in for good measure.

To do this I think you will see them continue to provide square or triangle options for their users (with their own personal bias towards triangles) and deprecate legacy parts of their system like canvas pages and groups.

Ultimately, though, the real opportunity is to look beyond the public vs. private debate and observe the 'Multiple Publics' that Danah Boyd and Kevin Marks speak about. But that's a post for another day.

Is this good or bad for us? I'm not sure it matters. It's another big bet for the company though, and it was a necessary step to clean up the half steps that resulted in privacy setting hell on the service so far.

A failure of Imagination and Conviction

Added on by Chris Saad.

As you might know if you follow my work even remotely, my projects almost always come from a place of philosophical supposition. That is, I first create a model that I think matches the current and emerging state of the world, and then I create a product, project, format or other artifact that works inside, encourages or commercializes that model. Many of my colleagues at JS-Kit do the same thing. Khris Loux and I, for example, spend hours and hours discussing our shared world views and how this translates to features, business direction and general life goals.

This methodology allows us to couch our decisions in well thought out mental models to make them more consistent, predictable and, we hope, more effective.

Over the years, and with my friends, I've proposed a number of these philosophical models including APML, DataPortability and most recently (this time working with Khris) SynapticWeb.

One of the hardest aspects of creating a philosophical model, however, is truly letting it guide you. To trust it. To take its premise to the logical conclusion. Another challenge is explaining this methodology (and the value of the resulting outcomes) to others who a) don't think this way and b) have not taken the time to examine and live the model more fully.

Many times, the choices and decisions that I/we make from these models are nuanced, but the sum of their parts, we believe, is significant.

Let me give some concrete examples.

Social Media

There is this ongoing tension between the value of social/user generated media and the media produced by 'Journalists'. Sure, social media is amazing, some say, but bloggers will never replace the role of Journalists.

The fact of the matter is, if your philosophical world view is that Social Media is important, that it is a return to one-to-one personal storytelling and that it allows those in the know - involved in the action - to report their first hand accounts, then you must necessarily expand your imagination and have the conviction to follow that line of logic all the way to the end.

If you do, you must necessarily discover that the distinction between Journalists and 'Us' as social media participants (all of us) is authority, perspective, distribution and an attempt at impartiality.

In the end, however, we are each human beings (yes, even the journalists). Journalists are imbued with authority because a trusted news brand vets and pays them; they are given the gift of perspective because they sit above the news and are not part of it; they have distribution because their media outlet prints millions of pieces of paper or reaches into the cable set top boxes of millions of homes; and their impartiality is a lie.

Can't these traits be replicated in social media? Of course they can.

Reputation can be algorithmically determined or revealed through light research/aggregation; perspective can be factored in by intelligent human beings or machines that find both sides of a story; distribution is clearly a solved problem through platforms like Twitter, Digg and others; and impartiality is still a lie. At least in social media bias is revealed, and transparency is the new impartiality.

I don't mean to provide an exhaustive argument here for why Social Media as a philosophical framework holds up as a new paradigm for news gathering and reporting - only to give an example of how we must allow ourselves to imagine outside the box and have the conviction to fully believe in our own assumptions.

Streams

The same type of artificial mental barriers have appeared at every step of the way with each of the philosophical frameworks in which I have participated. Streams is the most recent.

When we launched Echo we proposed that any conversation anywhere, irrespective of the mode or channel in which it was taking place, had the potential to be a first class part of the canonical and re-assembled lifestream of a piece of content.

Many pushed back. "Oh a Tweet can't possibly be as valuable as a comment" they lamented. They're wrong.

A Tweet, an @ Reply, a Digg, a Digg Comment, a Facebook Status Update, a Facebook Comment, an 'on page' comment and any other form of reaction each have just as much potential for value as the other.

Some have created artificial distinctions between them. They separate the stream into 'Comments' and 'Social Reactions'. I have news for everyone. A comment is a social reaction. Thinking of it as anything less is a failure of imagination and conviction. The trick is not a brute force separation of the two, but rather a nuanced set of rules that help diminish the noise and highlight the signal - wherever it might be - from any mode or channel. We've started that process in Echo with a feature we call 'Whirlpools'.
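To make the idea concrete, here is a minimal, hypothetical sketch of channel-agnostic scoring. The rules, weights and sample reactions are invented purely for illustration - this is not Echo's actual Whirlpools implementation.

```python
# A minimal, hypothetical sketch of channel-agnostic signal filtering.
# The rules, weights and sample data are invented for illustration;
# this is not Echo's actual Whirlpools implementation.

reactions = [
    {"channel": "twitter", "author_reputation": 0.9,
     "text": "Great breakdown of why a tweet and a comment deserve equal treatment"},
    {"channel": "on-page", "author_reputation": 0.2, "text": "first!"},
    {"channel": "facebook", "author_reputation": 0.6,
     "text": "Shared with my team - the point about re-assembling the stream matters"},
]

def signal_score(reaction):
    """Score a reaction on its merits, not on which channel it came from."""
    substance = min(len(reaction["text"]) / 100.0, 1.0)  # substantive text scores higher
    return 0.7 * reaction["author_reputation"] + 0.3 * substance

# Highlight the signal wherever it lives, instead of splitting
# 'comments' from 'social reactions' by channel.
for r in sorted(reactions, key=signal_score, reverse=True):
    print(f"{signal_score(r):.2f} [{r['channel']}] {r['text']}")
```

The point of the sketch is that the channel never appears in the scoring rule - a tweet and an on-page comment are judged by the same criteria.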

Communities

Another interesting failure of imagination that I come up against a lot lately is the notion of community building.

With Echo, we have taken the philosophical position that users already have a social network - many have too many of them in fact. There is no reason for them to join yet another network just to comment. Not ours, not our publisher's.

No, instead they should be able to bring their social network with them, participate with the content on a publisher's website, share with their existing friends on existing social networks, and leave just as easily.

By using Echo, you are not joining 'our community'. You already have a community. If anything you are participating in the publisher's community - not ours.

We don't welcome new customers to 'Our community'. Instead we help their users bring their community to a piece of content, interact, share and leave.

Publishers invest large quantities of capital in producing high quality content only to have the engagement and monetization opportunities occur on Social Networks. In these tough economic times, publishers cannot afford to bleed their audience and SEO to yet another social network just to facilitate commenting. That is the opposite of the effect they are trying to achieve by adding rich commenting in the first place.

If we use our imagination, and have the conviction to see our ideas through, we realize that publishers need tools that encourage on-site engagement and re-assemble offsite reactions as well - not bolster the branded 3rd party communities of the products they use.

Be Brave

In summation - be brave. Observe the world, define a philosophical framework, imagine the possibilities and have the conviction to follow through on your ideas. Stop being lazy. Stop stopping short of taking your impulses to their logical conclusions. I've found that when you consistently execute on your vision it might be a little harder to sell your point of differentiation, but your contributions will ultimately be better, more consistent and longer lasting - for your company, the web and the rest of the world.

Redefining Open

Added on by Chris Saad.

In my mind, there are four kinds of open.

  • Torvalds Open.
  • Zuckerberg Open.
  • Not Open but we use the word Open anyway.
  • Saad Open.

This fragmentation has diluted the word open to the point where it almost has no value.

It's time to re-define the word open. First let me explain each category.

Torvalds Open.

In Linus Torvalds' world (the guy who created Linux), Open means that the software is developed through a community process. The source code is visible and modifiable by anyone and is available for free.

This is called 'Open Source'.

Companies may package and bundle the software in new and novel ways, and provide support and services on top for a fee.

The problem with Open Source on the web is that the software itself has less value than the network effects and up-time provided by a branded, hosted experience. Running Twitter.com on open source software, for example, would have very little value because Twitter's lock-in is not their software, but rather their namespace (@chrissaad) and their developer ecosystem, all developing software with dependencies on their proprietary API.

Open Source is useful, interesting and important, but is not what I mean when I talk about the Open Web. I feel like its value is well understood and it is no longer the first, best way of making our world (and the Internet) a better place - at least not in the way it once was, when client-side software was the primary way we used computers.

Zuckerberg Open.

When Mark Zuckerberg talks about open, he is not talking about Technology. He is talking about human interactions.

Ever since the popularity of Data Portability (via the DataPortability project) Facebook has gone to great lengths to redefine the word Open to mean the way people interact with each other.

In doing so, they have managed to, in large part, co-opt the word and claim their platform makes people 'more open'.

In many respects, and by their definition, they are right. Facebook has encouraged a mind-bending number of people to connect and share with each other in ways that had previously been reserved for bloggers and other social media 'experts'.

Facebook deserves a lot of credit for introducing social networking to the masses.

Their definition of Open, however important, is not the kind I'm talking about either.

Not Open but we use the word Open anyway.

This is when a platform or product has an API and its creators therefore claim that they have an 'Open Platform'.

There's nothing open about having an API. It's just having an API. The platform could be closed or open depending on how the given application and API are built and what limitations are placed upon them.

In most cases, an 'Open Platform' is not actually open, it's just a platform.

Saad Open

My definition of open is very specific. In fact a better way to describe it would be Interoperable and Distributed.

To explain, let me provide some compare and contrast examples.

Twitter is closed because it owns a proprietary namespace (e.g. @chrissaad). The only way to address people is using their username system. They own those usernames and have final authority over what to do with them.

They are closed because they do not provide free and clear access to their data without rate limiting that access or cutting deals for improved quality of service.

They are also closed because they are not a federated system. You cannot start your own Twitter style tool and communicate with users on Twitter or vice versa. The only way to message people on Twitter is to use Twitter's proprietary APIs for submitting and retrieving data.

A proprietary API is an API that is special to a company and/or produces data that is not in an open standard.

Wordpress, on the other hand (and to contrast), is an open system. Let's compare point for point.

It does not own the namespace on which it is developed. The namespaces are standard URLs. This blog, for example, is hosted at blog.areyoupayingattention.com. Wordpress does not own that domain.

Wordpress produces a single type of data - blog posts. Those blog posts are accessible using an open standard - RSS or Atom. There is no rate limit on accessing that data.

Wordpress is a federated system. While they provide a hosted solution at Wordpress.com for convenience, there is nothing stopping me from switching to Blogger or Tumblr. The tools that you would use to consume my blog would remain unchanged, and the programmers who make those tools would not need to program defensively against Wordpress' API. They simply need to be given the URL of my RSS feed and they are good to go.
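As a rough illustration of that point, here is a minimal sketch of a feed consumer that only needs a URL. The feed path shown is an assumption for illustration - any blog's RSS URL would work the same way.

```python
import urllib.request
import xml.etree.ElementTree as ET

def fetch_titles(feed_url):
    """Fetch an RSS 2.0 feed from any host and return its item titles.

    The consumer is host-agnostic: Wordpress, Blogger or Tumblr all
    expose the same open standard, so nothing here is vendor specific.
    """
    with urllib.request.urlopen(feed_url) as response:
        tree = ET.parse(response)
    # RSS 2.0 nests items under channel/item; each item carries a title.
    return [item.findtext("title") for item in tree.iter("item")]

# Hypothetical feed URL - swap in any blog's feed and the code is unchanged.
print(fetch_titles("http://blog.areyoupayingattention.com/feed"))
```

If I move the blog to another host, only the URL changes - the consuming code stays the same, which is exactly the property a proprietary API does not give you.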

This makes Wordpress an open tool in the open blogosphere.

Blogging is open.

Microblogging should be open too.

To summarize: Open, in my definition, does not mean the software is open source or free. It means that the software receives open standards data, provides open standards data, has an interoperable API and can easily be switched out for other software.

Today I was challenged on Twitter that Echo is not 'Open' because it is proprietary code and costs money to use.

This person does not understand my definition of Open. Echo is open because it is not a destination site, it sits on any site anywhere. The owner of that site can take it off and replace it with another engagement tool at any time. The data being absorbed by Echo, for the most part, is RSS or Atom, and the data coming out of Echo is RSS.
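To show what 'the data coming out is RSS' looks like in practice, here is a minimal sketch that serializes a list of reactions as a plain RSS 2.0 feed. The sample data and element layout are illustrative only - they follow the generic RSS spec, not Echo's actual feed.

```python
import xml.etree.ElementTree as ET

# Hypothetical reactions; any engagement tool could expose them this way.
reactions = [
    {"title": "Comment from Jane",
     "link": "http://example.com/post#reaction-1",
     "description": "Transparency is the new impartiality."},
]

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Reactions to a piece of content"
ET.SubElement(channel, "link").text = "http://example.com/post"
ET.SubElement(channel, "description").text = "The conversation, exposed as an open standards feed"

for reaction in reactions:
    item = ET.SubElement(channel, "item")
    for field in ("title", "link", "description"):
        ET.SubElement(item, field).text = reaction[field]

print(ET.tostring(rss, encoding="unicode"))
```

Because the output is a standard feed rather than a proprietary format, any RSS-aware tool can consume it without programming against a vendor-specific API.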

It does not have any proprietary namespaces (except our useless legacy login system which we are trying to get rid of as quickly as possible) and does not pretend to create some amazing social network of its own. It is just a tool to communicate on the open, social web.

Is Echo perfect? No, of course not, but our intention is to make each and every aspect of the product as interoperable and distributed as possible. We will even use and contribute to open source where appropriate.

How does your product, or the tools you choose, compare? Tell me in the comments.

Next up, we should start to redefine the 'Open' community that creates open standards. Much of it is not very open.