Product & Startup Builder


The Open Web Is Dead - Long live the Open Web

Added on by Chris Saad.

Yesterday Robert Scoble once again declared that the Open Web was dead. His argument was that Apps and proprietary black holes like Facebook are absorbing all the light (read: users, attention, value, investment) and taking our beloved open platform right along with it. In his post, he kindly (but incorrectly) named me as the only person who really cares about the Open Web. While that's flattering, I think he's wrong about me being the only one who cares.

But he is right about the Open Web. It's in real danger. URLs are fading into the background, native mobile apps are all the rage and Facebook threatens to engulf the web in a proprietary black hole.

But I think there's a bigger problem going on right now. Not just with the web, but with Silicon Valley (as stewards of the web). We've lost sight of the things that matter. We're obsessed with quick wins, easily digestible VC pitches, stock options and flipping for a Ferrari.

There's more to this game than that. Let me touch on some of the things I see going on.

  1. Lead, not just cheerlead. In our obsession with being seen by our micro-audiences as 'thought leaders' or 'futurists', it's always very tempting to watch which way the wind is blowing and shout loudly that THERE is the future. Like a weather vane, it's easy to point the way the wind is blowing, but our biggest, best opportunity is not to declare a popular service 'the next big thing' just because a few visible people are hanging out there. Rather, our collective and individual responsibility is to help articulate a direction we think moves the state of the art forward for both the web and for society at large. Something, as leaders of this field, we believe in. Just like VCs develop an investment thesis, we should all have a vision for where the web is going (and how it should get there) and actively seek out, support and promote quiet heroes who are building something that moves the needle in the right direction.
  2. Add to the web's DNA. Almost every startup I see today is focused on building an 'App' and calling it a 'Platform'. Too often (almost every time), though, these apps are nothing more than proprietary, incremental and niche attempts at making a quick buck. We need more companies to think deeper. Think longer term. What are you doing to change the fabric of the web's DNA forever? How can you contribute to the very essence of the Internet the same way that TCP/IP, HTTP, HTML, JS and so many other technologies have done? Even proprietary technologies have provided valuable evolutions forward - things like Flash and yes, even FB. How are you going to live forever? This is why Facebook used to call itself a 'Social Utility' instead of a 'Social Network'. Mark Zuckerberg was never content to be the next Myspace Tom. He wanted to be the next Alexander Graham Bell. And now he is.
  3. Don't just iterate, innovate. Of course, someone has to build Apps. We can't all be working at the infrastructure layer. But too many of the Apps we choose to build (or champion) are incremental. As startup founders, investors and influencers, it's so easy to understand something that can be described as the 'Flipboard of Monkeys' instead of thinking really hard about how a completely new idea might fit into the future. Sure, there are plenty of good business and marketing reasons why you shouldn't stray too far from the beaten path, broadening it one incremental feature at a time, but the core essence of what you're working on can't be yet another turn of a very tired wheel. If you're shouting 'Me too' then you're probably not thinking big enough.
  4. B2C, not Ego2C. Silicon Valley is clearly a B2C town. We all love the sexy new app that our mother might eventually understand. Something we can get millions of users to use so we can show them lots of ads. Besides the fact that I think we should focus a little more on B2B, the problem is we're not really a B2C town at all. We're actually more focused on what I will call Ego2C. That is, we pick our favorite apps based on how famous the founding team is OR how easily we can use the app to build yet another niche audience for ourselves (and brands/marketers). It would be a tragedy if the social web revolution boils down to new methods of PR and marketing. But that's what we seem to be obsessed with. As soon as any app from a famous founder gets released, we give it tons of buzz while plenty of more deserving projects get barely a squeak. If the app gets a little traction (typically the ones that have Ego mechanics baked in), you see a million posts about how marketers can exploit it. Inevitably the app developers start to focus on how to 'increase social coefficients' instead of how to help human beings make a connection or find utility in their lives.
  5. "Users don't care". Speaking more specifically about the Open vs. Closed debate, too often we hear the criticism "Users don't care about open". This is absolutely true and the reason why most open efforts fail. Users don't care about open. They care about utility and choice. This is why the only way to continue propagating the open web is to work with BUSINESS. B2B. Startups, media brands, the BigCo tech companies. They care about open because the proprietary winner is kicking the losers' asses, and that usually means there is at least one other player who needs a competitive advantage. They need to team up and build, deploy and popularize the open alternative. That's why open always wins. There are always plenty of losers around who are going to commoditize the popular closed thing. As technology leaders we're paid to care about things users don't care about. Things that shape the future. While users, in the short term, might not care, we should dare to think and dream a little bigger. As a case study, look at Android vs. iOS. iOS is more profitable for a single company, but Android is now a force of nature.
  6. Death is just a stage of life. Just because something is no longer interesting doesn't mean it's dead. Its spirit, and oftentimes the actual technology, lives on, one layer below the surface. RSS is a great example of this. RSS's spirit lives on in ActivityStreams and the general publish/subscribe model. It is powering almost every service-to-service interaction you currently enjoy. Is it dead, or has it simply become part of the DNA of the Internet? Could RSS (or something like it) be better exposed higher up in the stack? Absolutely, but that will take some time, thoughtful execution and influencers who are willing to champion the cause. The same is true for OpenID and OAuth.
  7. The arc of the universe is long, but it bends towards Open. The battle of Open vs. Closed is not a zero-sum game. Both have their time. It's a sine wave. First, closed, proprietary solutions come to define a new way of fulfilling a use case and doing business. They solve a problem simply and elegantly and blaze a path to market awareness, acceptance and commercialization. Open, however, always follows. Whether it's a year, a decade or a century, Open. Always. Wins. The only question is how long, as an industry, we are going to keep our tails tucked between our legs in front of the great proprietary platform of the moment, or whether we are going to get our act together to ensure the "Time to Open" is as short as possible. It takes courage, coordination and vision, but we can all play our part to shorten the time frame between the invention of a proprietary app and the absorption of that value into the open web platform.
  8. Acknowledge reality. FB has won. It's done. Just like Microsoft won the Desktop OS (in part handed to them by IBM), so too has FB won the Social OS (in part handed to them by Microsoft). For now. Acknowledging the truth is the first step to changing it. The only question now is how long we're all willing to wait until we get our act together to turn the proprietary innovation of the 'social graph' into part of the open web's core DNA. We need to recognize our power. They have ~1B users? The open web has more. Chances are that the major website or brand you work for has plenty of its own users as well. Are you going to send them to FB, or are you going to invest in your own .com? Trust me, I know it's really, really easy to take what you're given because you're too busy putting out a million fires. But as technology leaders I challenge us all to build something better. We're the only ones who can.
  9. [Edit] Don't kill Hollywood. Did you catch the YC post calling for Silicon Valley to kill Hollywood? Not only was this reckless and short-sighted, it's the exact opposite of what we should be doing. Instead of trying to kill or cannibalize media companies and content creators, how about we work with them to create the next generation of information technology? They have the audiences + information and we have the technology. Instead, most Silicon Valley companies, by virtue of their B2C focus, are too busy leeching off major media instead of finding ways to help transform it. Sure, most of them move slowly - but move they are. Move they must. Helping them is very profitable. I write more about this on the Echo blog - calling it 'Real-time Storytelling'.
  10. [Edit] Today's data portability problem. When I started the DataPortability project, the issue of the time was personal data portability. That's not the case anymore. While user-centric data portability is still being done via proprietary mechanisms, it's a) actually possible and b) moving more towards open standards every day. The real issue right now is firehoses. Access to broad corpora of data so that 3rd parties can innovate is only possible through firehoses (for now). To put it another way, the reason Google was possible was because the open web was crawlable - for free - with no biz dev deal. The reason FB was possible was because the open web allowed any site to spring up and do what it wanted to do. Today, too much of our data is locked up in closed repositories that can and must be cracked open. Google's move to exclude other socnets (besides G+) from their search results until they had free and clear access to them might be inconvenient for users in the short term, but, as a strategic forcing function, it is in the best interest of the open web long term.
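The point in item 6 about RSS living on as plumbing can be made concrete: any client that understands the format can consume any feed, with no biz dev deal. A minimal sketch using only Python's standard library, with the feed XML inlined so the example is self-contained (a real client would fetch it over HTTP):

```python
# Minimal sketch: the "spirit of RSS" one layer below the surface.
# Any client that speaks the format can consume any conforming feed.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site</title>
    <item><title>Post one</title><link>http://example.com/1</link></item>
    <item><title>Post two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (channel_title, [(item_title, item_link), ...])."""
    root = ET.fromstring(xml_text)
    channel = root.find("channel")
    items = [(i.findtext("title"), i.findtext("link"))
             for i in channel.findall("item")]
    return channel.findtext("title"), items

title, items = parse_feed(FEED)
```

Point the same parser at any other RSS 2.0 feed and it continues to work - that interchangeability is exactly what made the open web crawlable in the first place.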

End of rant.

Analysis of F8, Timeline, Ticker and Open Graph

Added on by Chris Saad.

So at F8 last week Facebook announced Ticker, Timeline and extensions to the Open Graph API to allow for new verbs and nouns. Here's what really happened.

  • They split their single 'News Feed' into 3 levels of filtering: Now (Ticker), Relevant (News Feed), Historical (Timeline). (Side note: we've had a 'Ticker'-style product at Echo that we called 'Community Stream' for a long time now - and most of our customers and partners said to us 'why would we want to show all that data? It's just noisy'. Maybe now they will take a second look.) Question: Will G+, Twitter and the REST of the web adopt the same model? They should.
  • This allows FB to collect more 'noise' (also known as synaptic firings or Attention data) which, in turn, allows them to find more signal (also known as synaptic inferences or attention management). I've long said that the answer to information overload is not LESS information - it's MORE. The more information you have, the more ability you have to find patterns and surface them in relevant places (I said it so long ago I can't even find the link). Question: Will independent websites think to collect their OWN Attention data BEFORE sending it to FB so they can leverage it for their own purposes? The value of this data is incalculable.
  • Having these new presentation metaphors in place, they then created a mechanism to collect more data in the form of expanded Verbs and Nouns in the Open Graph API. With this new API, users are now expected to abandon explicit gestures of sharing and instead accept that every action they take is auto-shared to their friends. Question: When will the first horror stories start coming out about engagement ring purchases, personal health issues and sexual orientations being inappropriately revealed due to auto-sharing?
  • Using all the bling of the Timeline, along with new messaging and a simple little opt in toggle of 'Add to my timeline' they managed to re-launch 'Beacon' without anyone noticing (none of the tech blogs I saw even mentioned it). Question: Why did none of the tech media cover that angle of the story?
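To make the expanded Verbs and Nouns above more concrete, here is a rough sketch of the shape of an Open Graph 'action' publish: an app-defined verb applied to a noun (an object URL), posted on behalf of the user. The namespace, verb, object type, URL and token below are placeholders, not a working integration, and the endpoint shape is a hedged recollection rather than something from this post. The sketch only builds the request; it sends nothing.

```python
# Hedged sketch of publishing "<user> <verb> <object>" to an Open
# Graph-style API. All concrete values here are illustrative placeholders.
from urllib.parse import urlencode

GRAPH_ROOT = "https://graph.facebook.com"

def build_action_request(namespace, verb, object_type, object_url, access_token):
    """Return (url, form_body) for an app-defined action publish."""
    url = f"{GRAPH_ROOT}/me/{namespace}:{verb}"
    body = urlencode({object_type: object_url, "access_token": access_token})
    return url, body

url, body = build_action_request(
    "myapp", "read", "article",
    "http://example.com/story", "TOKEN")
```

The privacy question in the bullet above falls out of this shape: once an app can emit any verb it defines, "frictionless" auto-sharing is just this call fired on every user action.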

I continue to be in awe of Facebook's scale, seriousness, ambition and momentum. There has never been anything like it before.

They have created an Attention Management Platform that rivals Google Search and easily outclasses many of my best ideas about Attention Management and Personal Relevancy from back when I was thinking about the problem.

It's breathtaking.

And since it is all done with hard links to a single proprietary hub, it is eating the web like a cancer.

Before F8 it was clear that Google+ was 1 or 2 years behind FB. Now it is 3 or 4.

Only time will tell who, how and why more open systems will begin to reassert themselves in the ecosystem. My bet is that it won't come from a B2C copycat, though. It will come from a well organized, commercially incentivized B2B play.

The part that still confuses me, though, is why ANY serious media company would want their news to load in a 'FB canvas app' instead of their own website. It makes zero sense. None of this changes the reality that you need to own your own data and your own point source. I made a little comparison table earlier in the week that explains why.

Real Names getting Real Attention

Added on by Chris Saad.

There's a lot of fury on the web right now about 'Real Names'. FB is trying to use it as a unique feature of their comments system claiming it reduces trolling and low value comments. Of course that isn't really true. For one, any commenting system could force FB login. Two, users will troll with or without their name attached and, worse yet, many legitimate users won't participate for any number of reasons if they can't use a pseudonym. There are plenty of better ways to increase quality in your comments including participation from the content creators, game mechanics, community moderation and more.

The real debate, however, is about G+ trying to copy FB's stance on Real Names. They are insisting all user accounts use them and are actively shutting down accounts that violate the policy. They are being so heavy-handed about it that even people who ARE using their real name are getting notices of violation - most notably Violet Blue.

I'm not really an expert on pseudonyms, shared contexts and anonymity so I'm going to stay out of this debate.

The real question for me, however, is: what is Google's strategic business reason for this policy? There must be a long-term plan for it, otherwise they wouldn't be insisting so hard.

My assumption is that it's related to their intention to become a canonical people directory and identity provider on the internet to compete with FB in this space.

FB, after all, does not just get its power from news feeds and photo apps - it gets it from the deep roots it has laid down into the DNA of the internet as the provider of first-class identity infrastructure and identity information.

In this sense, FB's social contract has served them very well, and Google's attempt to copy it is a hint that they understand FB is not just a .com feature set, but a powerful identity utility. They must understand (and in some cases seem to) that strategy and its aggressiveness if they are to properly compete with the monopoly. My only hope, however, is that they are coming up with their own inspired counter-strategy rather than just copying the moves they see on the surface - because that's doomed to fail.

Initial quick thoughts on Google+

Added on by Chris Saad.

It's certainly very slick, but it's a few years behind FB. I mean that not just in timing and network effects, but in the much more strategic sense of platform ambition. FB.com was the FB strategy 4 years ago. FB is now going for the rest of the web. Its reach and role as an identity provider and social infrastructure player makes it much more important (and harder to beat) than launching a cool new service. So hopefully the Google+ team is thinking WAY beyond this as a destination site when they are thinking Google Social Strategy.

So far the broad-ranging announcements, from the +1 button to Google Analytics adding Social, bode well for this being a company-wide, product-wide refresh. The key to success will be in thinking about the need to compete with FB beyond the walls and products of Google.

The key to that, of course, will be to get deep adoption by major sites.

Update: Upon thinking about it a little more, Google has once again missed an opportunity to play to their strengths. With the document web, they played the role of aggregator and algorithmic signal detection system. With the social web, their ideal strategy would be to build the ultimate social inbox. A place where I can navigate, consume AND interact with Facebook + Twitter + Foursquare + Quora +++ in one place.

Instead they created yet another content source.

What is Echo StreamServer?

Added on by Chris Saad.

Yesterday we announced a new Echo product called StreamServer. There is very little more I can say that Khris Loux has not already said so eloquently on stage at the #e2 launch event.

When you work so hard and long on something (depending on how you look at it, StreamServer was either 15, 2.5 or 1 year in the making), it's hard to sum it all up in a single one-hour event.

But that's what we tried to do.

We tried to thread the needle between a contemporary story about activity data, the existential change (read: opportunity or threat) occurring on the web as traffic and monetization flow to proprietary social networking platforms, the opportunity for every major node on the web to be just as powerful and innovative, the need for open standards and powerful cloud services as the basis of the rebuttal, and our deep desire to make this an industry-wide effort. We tried to communicate the important role of aggregation and the pivotal job that mainstream media, e-commerce, entertainment, startups and agencies play in curating activity information for the masses.

We also tried to communicate that this was not just a pipe dream, but rather a commercial reality for major customers. A solution running at scale. A new distribution and monetization opportunity for 3rd-party devs and a future-ready piece of infrastructure for media companies.

I think we did the best job possible at threading all these stories, and doing it with a human, authentic voice through the lens of customer and partner experiences.

I'm proud of the work we've done so far, and the tireless efforts of the Echo team and our customer/partner devs.

And all of that being said, though, we are only at the beginning. We have just planted the first seed and I look forward to helping it grow.

So what is StreamServer in my words?

It is the real-time, social-scale database that Twitter, Facebook, Quora, Foursquare and others built, delivered as an EC2-style cloud service. Turn it on, and forget about managing the data or scaling the infrastructure.
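To illustrate the 'turn it on and forget about the infrastructure' idea, here is a toy sketch: an in-memory stand-in for a hosted activity store with submit and search operations. The class and method names are illustrative inventions, not Echo's actual API; a real client would be talking to a hosted service rather than a local list.

```python
# Hypothetical sketch of a "real-time, social-scale database" client.
# Names and shapes are invented for illustration only.
import itertools

class StreamClient:
    """Toy in-memory stand-in for a hosted real-time activity store."""
    def __init__(self):
        self._items = []
        self._seq = itertools.count()  # monotonic ordering key

    def submit(self, actor, verb, obj):
        """Store one activity ('<actor> <verb> <object>')."""
        item = {"actor": actor, "verb": verb,
                "object": obj, "seq": next(self._seq)}
        self._items.append(item)
        return item

    def search(self, verb=None):
        """Newest-first query, optionally filtered by verb."""
        hits = [i for i in self._items if verb is None or i["verb"] == verb]
        return sorted(hits, key=lambda i: i["seq"], reverse=True)

client = StreamClient()
client.submit("alice", "post", "http://example.com/story")
client.submit("bob", "comment", "http://example.com/story#c1")
latest = client.search()[0]
```

The point of the sketch is the contract, not the storage: the publisher submits activities and queries streams, and scaling the write/read path is the service's problem.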

It is the first of its kind and it will hopefully form the basis of many new companies as they deliver many new, novel and innovative experiences to customers and end users everywhere.

And it's a bet on the future of open standards, developer ecosystems, and a heterogeneous web made up of first-class social nodes.

It's Real-time as a Service.

New Twitter. Feature comparison

Added on by Chris Saad.

Jeremiah and I wrote an analysis of the New Twitter vs. Current Facebook. Here's a snippet:

Situation: Twitter’s new redesign advances their user experience

Twitter has announced a new redesign today, yet looking at the news, there hasn't been a detailed breakdown of these two leading social networks. Overall, Twitter's new features start to resemble some features of a traditional social network, beyond their simple messaging heritage. We took the key features from both social websites, did a comparison, and voted on the stronger player.

[Great Detailed Graph goes here - See it on Jeremiah's blog]

Our Verdict: Facebook Features Lead Over Twitter’s New Redesign

Facebook’s features offer a more robust user experience, and they have a longer history of developing the right relationships with media, developers, and their users. Twitter, a rapidly growing social network has launched a series of new features (described by the founder as “smooth like butter”) that provide users with a snappy experience and enhanced features.

We tallied the important features of this launch against their overall expansion strategy and have concluded that Facebook's features continue to hold dominance over Twitter, despite the noticeable improvements. While we don't expect that Twitter wants to become 'another Facebook', they should play to their strengths, remaining nimble and lightweight while allowing developers and content producers to better integrate into their system.

Check out the full results over on his blog.

Guest Post: Facebook's world view

Added on by Chris Saad.

Just wanted to share with you here that I wrote a guest post on Mashable last week about Facebook's world view. Be sure to check it out here.

Are these blunders a series of accidental missteps (a combination of ambition, scale and hubris) or a calculated risk to force their world view on unsuspecting users (easier to ask for forgiveness)? Only the executives at Facebook can ever truly answer this question.

What’s clear, though, is that their platform is tightly coupled with countless other websites and applications across the web, and their financial success is aligned with many influential investors and actors. At this stage, and at this rate, their continued success is all but assured.

But so is the success of the rest of the web. Countless social applications emerge every day and the rest of the web is, and always will be, bigger than any proprietary platform. Through its action and inaction, Facebook offers opportunities for us all. And in the dance between their moves and the rest of the web’s, innovation can be found.

The only thing that can truly hurt the web is a monopoly on ideas, and the only ones who can let that happen are web users themselves.

Guest Post: Facebook's claims about data portability are false

Added on by Chris Saad.

I have published a guest post on RWW about Facebook's recent privacy challenges and their claims about data portability.

"The lack of honesty and clarity from the company and its representatives ... and the continued trend of taking established language - such as "open technology" or "data portability" - and corrupting it for its own marketing purposes, is far more disconcerting than the boundaries it's pushing with its technology choices."

Read it here.

Diaspora is not the answer to the Open Web, but that's ok

Added on by Chris Saad.

For whatever reason, a new project called Diaspora is getting a lot of attention at the moment. They are four young guys who have managed to crowdsource $100k+ to build an open, privacy-respecting, peer-to-peer social network. A number of people have asked me what I think, so instead of repeating myself over and over I thought I would write it down in one place.

First, I don't think Diaspora is going to be the 'thing' that solves the problem. There are too many moving parts and too many factors (mainly political) to have any single group solve the problem by themselves.

Second, I don't think that's any reason to disparage or discourage them.

When we launched the DataPortability project, we didn't claim we would solve the issue, but rather create a blueprint for how others might implement interoperable parts of the whole. We soon learned that task was impractical to say the least. The pieces were not mature enough and the politics was far too dense.

Instead, we have settled for providing a rolling commentary and context on the situation and promoting the efforts of those that are making strides in the right direction. We also play the important role of highlighting problems with closed or even anticompetitive behaviors of the larger players.

The problem with the DataPortability project, though, was not its ambition or even its failure to meet those ambitions, but rather the way the 'old guard' of the standards community reacted to it.

The fact of the matter is that the people who used to be independent open advocates were actually quite closed and cliquey. They didn't want 'new kids on the block' telling them how to tell their story or promote their efforts. Instead of embracing a new catalyzing force in their midst, they set about ignoring, undermining and even actively derailing it at every opportunity.

Despite my skepticism about Diaspora, though, I don't want to fall into the same trap. I admire and encourage the enthusiasm of this group to chase their dream of a peer-to-peer social network.

Do I think they will succeed with this current incarnation? No. Do I think they should stop trying? No.

While this project might not work, their effort and energy will not go to waste.

I think we need more fresh, independent voices generating hype and attention for the idea that an open alternative to Facebook can and must exist. Their success in capturing people's imagination only shows that there is an appetite for such a thing.

What they might do, however, is strongly consider how their work might stitch together existing open standards efforts rather than inventing any new formats or protocols. The technologies are getting very close to baked and are finding their way into the web at every turn.

We all need to do our part to embed them into every project we're working on so that peer-to-peer, interoperable social networking will become a reality.

Welcome to the party Diaspora team, don't let the old guard (who have largely left for BigCo's anyway) scare you off.

Open is not enough. Time to raise the bar: Interoperable

Added on by Chris Saad.

Last week Elias Bizannes and I wrote a post, Assessing the Openness of Facebook's 'Open Graph Protocol'. To summarize that post: it's clear that Facebook is making a play to create, aggregate and own not only identity on the web, but everything that hangs off it. From Interests to Engagement - not just on their .com but across all sites. To do this they are giving publishers token value (analytics and traffic) to take over parts of the page with pieces of Facebook.com without giving them complete access to the user, their data or the user experience (all at the exclusion of any other player). In addition, they are building a semantic map of the Internet that will broker interests and data on a scale never before seen anywhere.

In the face of such huge momentum and stunningly effective execution (kudos to them!), aiming for (or using the word) Open is no longer enough. The web community needs to up its game.

The same is true for data portability - the group and the idea. Data portability is no longer enough. We must raise the bar and start to aim for Interoperable Data Portability.

Interoperability means that things work together without an engineer first having to figure out what's on the other end of an API call.

When you request 'http://blog.areyoupayingattention.com' it isn't enough that the data is there, or that it's 'open' or 'accessible'. No. The reason the web works is because the browser knows exactly how to request the data (HTTP) and how the data will be returned (HTML/CSS/JS). This is an interoperable transaction.

Anyone could write a web server, create a web page, or develop a web browser and it just works. Point the browser somewhere else, and it continues to work.

Now map this to the social web. Anyone could (should be able to) build an open graph, create some graph data, and point a social widget to it and it just works. Point the social widget somewhere else, and it continues to work.

As you can see from the mapping above, the interaction between a social widget and its social graph should be the same as that of a browser and a web server. Not just open, but interoperable, interchangeable and standardized.
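The browser/web-server mapping above can be sketched in code: a 'widget' that only knows a standardized request/response contract works unchanged against any conforming 'social server'. The servers and the JSON shape below are hypothetical illustrations of the idea, not a real standard.

```python
# Sketch of the interoperability argument: the widget knows the contract,
# not the server. Both servers and the JSON shape are hypothetical.
import json

def render_widget(fetch, graph_url):
    """Render a follower list from any server that speaks the contract."""
    doc = json.loads(fetch(graph_url))
    return [f["displayName"] for f in doc["followers"]]

# Two independent "social servers" returning the same standardized shape.
def server_a(url):
    return json.dumps({"followers": [{"displayName": "Alice"}]})

def server_b(url):
    return json.dumps({"followers": [{"displayName": "Bob"},
                                     {"displayName": "Carol"}]})

# Point the same widget code somewhere else, and it continues to work.
names_a = render_widget(server_a, "http://a.example/graph")
names_b = render_widget(server_b, "http://b.example/graph")
```

Swap in a third server that honors the same contract and the widget needs no changes - that is the browser/web-server property applied to the social layer.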

Why? Innovation.

The same kind of innovation we get when we have cutting edge web servers competing to be the best damned web server they can be (IIS vs. Apache), and cutting edge websites (Yahoo vs. MSN vs. Google vs. Every other site on the Internet) and cutting edge browsers (Netscape vs. IE vs. Safari vs. Chrome). These products were able to compete for their part in the stack.

Imagine if we had gotten stuck with IIS, Netscape and AltaVista locking down the web with their own proprietary communication channels. The web would have been no better than every closed communication platform before it. Slow, stale and obsolete.

How do we become interoperable? It's hard. Really hard. Those of us who manage products at scale know it's easy to make closed decisions. You don't have to be an evil mastermind - you just have to be lazy. Fight against being lazy. Think before you design, develop or promote your products - try harder. I don't say this just to you, I say it to myself as well. I am just as guilty of this as anyone else out there developing product. We must all try harder.

Open standards are a start, but open protocols are better. Transactions that, from start to finish, provide for Discoverability, Connectivity and Exchange of data using well known patterns.

The standards groups have done a lot of work, but standards alone don't solve the problem. It requires product teams to implement the standards, and this is an area I am far more interested in these days. How do we implement these patterns at scale?

Customers (i.e. Publishers) must also demand interoperable products. Products that not just connect them to Facebook or Twitter but rather make them first class nodes on the social web.

Like we said on the DataPortability blog:

In order for true interoperable, peer-to-peer data portability to win, serious publishers and other sites must be vigilant to choose cross-platform alternatives that leverage multiple networks rather than just relying on Facebook exclusively.

In this way they become first-class nodes on the social web rather than spokes on Facebook’s hub.

But this is just the start. This just stems the tide by handing the keys to more than one player so that no one player kills us while the full transition to a true peer-to-peer model takes place.

If the web is to truly stay open and interoperable, we need to think bigger and better than just which big company (or companies) we want to hand our identities to.

Just like every site on the web today can have its own web server, every site should also have the choice to host (or pick) its own social server. Every site should become a fully featured peer on the social web. There is no reason why CNN can not be just as functional, powerful, effective and interchangeable as Facebook.com.

If we don't, we will be stuck with the IISes, IEs and Netscapes of the social web, and innovation will die.

Google Buzz = FriendFeed Reborn

Added on by Chris Saad.

FriendFeed was dead; now it is reborn as Google Buzz. I've not been able to try the product yet, but philosophically and architecturally it seems superior to FriendFeed.

Here are my observations so far:

Consumption Tools

Buzz is better than FriendFeed because Google is treating it as a consumption tool rather than a destination site (by placing it in Gmail rather than hosting it on a public page). FriendFeed should have always been treated this way. Some people got confused and started hosting public discussions on FriendFeed.

That being said, though, I've long said that news and sharing are not the same as an email inbox; those sorts of items should not be 'marked as read' but rather stream by in an ambient way.

While Buzz is in fact a stream, it is its own tab that you have to focus on rather than a sidebar you can ignore (at least as far as I can tell right now).

How it affects Publishers (and Echo)

The inevitable question of 'How does this affect Echo' has already come up on Twitter. Like FriendFeed before it, Buzz generates siloed conversations that do not get hosted at the source.

So, the publisher spends the time and money to create the content and Buzz/Google get the engagement/monetization inside Gmail.

For some reason, all these aggregators think that they need to create content to be of value. I disagree. I long for a pure aggregator that does not generate any of its own content such as comments, likes, shares etc.

That being said, however, the more places we have to engage with content the more reasons there are for Echo to exist so that publishers can re-assemble all that conversation and engagement back on their sites.

Synaptic Connections

Note that they don't have a 'Follow' button - it's using synaptic connections to determine who you care about. Very cool! I worry though that there might not be enough controls for the user to override the assumptions.

Open Standards

Already, Marshall is calling it the savior of open standards. I don't think Open Standards need to be saved - but they certainly have all the buzzwords on their site, so that's promising.

That's it for now, maybe more later when I've had a chance to play with it.

Update: After playing with it this morning, and reading a little more, it's clear that this is actually Jaiku reborn (not FriendFeed), because the Jaiku team were involved in building it. They deserve a lot of credit for inventing much of this stuff in the first place - long before FriendFeed.

Also, having used it for only an hour, the unread count on the Buzz tab is driving me nuts. It shouldn't be there. It's a stream, not an inbox. Also, it makes no sense that I can't display Buzz in a sidebar on the right side of my primary Gmail inbox view. That would be ideal.

It's also funny to me that some people have tried to give Chris Messina credit for Buzz even though he's been at Google for no more than a month. They clearly don't understand how long and hard it is to build product. Messina is good, but he ain't that good :)

Facebook and the future of News

Added on by Chris Saad.

Marshall Kirkpatrick has written a thoughtful piece over on Read/Write Web entitled 'Facebook and the future of Free Thought' in which he explains the hard facts about news consumption and the open subscription models that were supposed to create a more open playing field for niche voices. In it, he states that news consumption has barely changed in the last 10 years: RSS and feed readers drive very little traffic, and most people still get their news from hand-selected mainstream portals and destination sites (like MSN News and Yahoo News). In other words, mainstream users do not curate and consume niche subscriptions and are quite content to read what the mainstream sites feed them.

This is troubling news (pun intended) for those of us who believe that the democratization of publishing might open up the world to niche voices and personalized story-telling.

Marshall goes on to argue that Facebook might be our last hope - that since everyone already spends all their time in Facebook, the service has an opportunity to popularize the notion of subscribing to news sources and thereby bring to life our collective vision of personalized news for the mainstream. Facebook already does a great deal of this, with users getting large amounts of news and links from their friends as they share and comment on links.

Through my work with APML I have long dreamed of a world where users are able to view information through a highly personalized lens - a lens that allows them to see personally relevant news instead of just popular news (note that Popularity is a factor of personal relevancy, but it is not the only factor). That doesn't mean the news would be skewed to one persuasion (liberal or conservative for example) but rather to a specific topic or theme.
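
To make the 'personalized lens' idea concrete, here is a toy scoring sketch (the weights and field names are my own assumptions, not an APML implementation): popularity contributes to a story's relevancy, but it is blended with the reader's topic affinities rather than dominating them.

```python
# Blend personal topic affinity with global popularity.
# Weights are illustrative; a real system would learn them.

def relevancy(story_topics, popularity, profile, topic_weight=0.7, pop_weight=0.3):
    """Score a story for one reader: affinity-weighted, popularity-aware."""
    if story_topics:
        affinity = sum(profile.get(t, 0.0) for t in story_topics) / len(story_topics)
    else:
        affinity = 0.0
    return topic_weight * affinity + pop_weight * popularity

profile = {"open-web": 0.9, "celebrity-gossip": 0.1}
niche = relevancy(["open-web"], popularity=0.2, profile=profile)
mainstream = relevancy(["celebrity-gossip"], popularity=0.9, profile=profile)
# For this reader, the niche open-web story outranks the popular gossip story.
```

The design point is in the blend: popularity never disappears from the score, it just stops being the only factor, which matches the parenthetical above.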

Could Facebook popularize personalized news? Should it? Do we really want a closed platform to dictate how the transports, formats and tools of next generation story-telling get built? If so, would we simply be moving the top-down command and control systems of network television and big media to another closed platform with its own limitations and restrictions?

Personalized news on a closed platform is almost as bad as mainstream news on a closed platform. News organizations and small niche publishers both need a way to reach their audience using open technologies, or we are doomed to repeat the homogenized news environment of the last two decades - the one that failed to protect us from a war in Iraq, failed to innovate when it came to on-demand, and failed to allow each of us to customize and personalize our own news reading tools.

That's why technologies like RSS/Atom, PubSubHubbub and others are so important.

What's missing now is a presentation tool that makes these technologies sing for the mainstream.

So far, as an industry, we've failed to deliver on this promise. I don't have the answers for how we might succeed. But succeed we must.

Perhaps established tier 1 media sites have a role to play. Perhaps market forces that are driving them to cut costs and innovate will drive these properties to turn from purely creating mainstream news editorially toward a model where they curate and surface contributions from their readership and the wider web.

In other words, Tier 1 publishers are being transformed from content creators to content curators - and this could change the game.

In the race to open up and leverage social and real-time technologies, these media organizations are actually making way for the most effective democratization of niche news yet.

Niche, personalized news distributed by open news hubs born from the 'ashes' of old media.

Don't like the tools one hub gives you? Switch to another. The brands we all know and love have an opportunity to become powerful players in the news aggregation and consumption game. Will they respond in time?

Due to my experience working with Tier 1 publishers for Echo, I have high hopes for many of them to learn and adapt. But much more work still remains.

Learn more about how news organizations are practically turning into personalized news curation hubs over on the Echo Blog.

A call for focus from the open standards community

Added on by Chris Saad.

Over on the Open Web Foundation mailing list, Eran Hammer-Lahav (whom, despite his gruff and disagreeable personality, I respect greatly for his work in the development of open standards) is effectively calling for a complete shakeup of the foundation and of the work being poured into the 'common ground' of the standards efforts.

Let me define the 'Common Ground' as I see it.

Building strong common ground is like building strong open standards deep into the stack. Just like a software stack, our community needs a stack of organizations that are loosely coupled and open to participation. Groups like the W3C and IETF provide a rock-solid core, more agile groups focused on specific standards like OpenID and OAuth sit in the middle, and a project like DataPortability was supposed to be on top - a kind of user interface layer.

You see, good standards efforts are necessarily projects that work to solve one small problem well. The problems are often deep technical challenges that attract passionate and, let's face it, geeky people to hack, debate and decide on details that don't hit the radar for 99.9% of the population.

The problem, of course, is that the rest of the world has to care for a standard to matter.

Leaders and project managers need to be found, real companies need to get involved (not just their staff), collaboration platforms need to facilitate real and open discussion, calls for collaboration need to be heard, specs need to be written (and written well), libraries need to be written, governance needs to be put in place and so on.

Also, once the standard is (half) baked, less involved hackers need to participate to test the theories in the real world. Less savvy developers need to hear about the standard and understand it. Business people need to understand the value of using a standard over a proprietary solution. They also need IP protections in place to ensure that by using the standard they are not putting their company at risk. Marketing people need to know how to sell it to their customer base. Customers need to know how to look for and choose open solutions to create a marketplace that rewards openness.

All of this is 'Common Ground'. It is common to any standards effort, and there should - no, must - be an organization just as lean, mean and aggressive as Facebook in place to provide these resources if we are ever going to compete with closed solutions.

At the start of 2008 the DataPortability project became very popular. Its goal was not to build standards, but rather to promote them - to provide much of the common ground that I described above.

The DP project's particular mission, in my mind at least, was to focus on the marketing effort: to build a massive spotlight and to shine that intense light on the people, organizations and standards that were getting the job done.

Is the OWF providing a generic legal/IPR framework? Fantastic! It was the DPP's job to let everyone know - developers, business execs, media, potential editors, contributors and more. Our job was not, and should never be, to start the framework itself, but rather to advocate for, provide context around and promote the hell out of someone else's effort to do so.

Is a conference happening next year? Excellent. It was the DPP's job to get in touch with the conference organizer and organize not just a DP panel but a DP track - to create room (and perhaps even a narrative) inside which the people doing the actual work can speak.

Has Facebook just announced a new feature that could have been achieved through a combination of existing open standards? Then it is the DPP's job to consult with each of those standards groups and create a cohesive response/set of quotes for the media to use.

What is the relationship between Facebook Platform, OpenSocial, open standards like OpenID, OAuth and Portable Contacts, and Twitter's 'Open API'? DataPortability.org should have the answer neatly described on its website.

Unfortunately, though, many in the standards community chose to fight the creation of the project for whatever reasons crossed their mind at the time. They used all sorts of methods to undermine the effort. Some that would put Fox News to shame.

The result, of course, has been a diversion from the important work of providing this common ground to the standards community into a self-protective state of creating governance and our own 'deliverables' in order to justify and protect our own existence.

I have, as a result of a series of unfortunate events, fallen out of touch with the Steering Group at the DPP. Moving to the US, getting disillusioned with the community I admired (not those involved with the DPP - my friends on the DPP Steering Group have always performed admirably and worked extremely hard) and ultimately shifting my world view led me to realize that the best contribution I can make - the best way to really move the needle - is to ship data-portability-compliant software at scale.

At this juncture, however, I think it's time for us all to refocus on our original mission for the DataPortability Project.

To restate my humble view on the matter:

  • To provide a website that explains data portability to various audiences in neat and concise ways. It is the onramp for the standards community. You should be able to send anyone to 'dataportability.org' and they 'get it' and know what to do next.
  • To provide context and advocacy on news and developments from inside and outside the standards community so that media, execs and less involved developers can understand and react.
  • To build a community of interested parties so that they can swarm to the aid of standards groups or the standards effort in general.
  • To act as a market force to (yes, I'm going to say it) pick winners. To highlight what works, what doesn't and what should be done next to move the whole effort forward. Nothing is as powerful as removing confusion and planting a big red flag on the answer.
  • To recognize that we have the authority to do whatever we want to do because we are an independent, private group that has chosen to create public/transparent processes. We need to believe in ourselves. If we do good work, people will listen. If we don't, they can listen to someone else.

This necessarily means that the only real deliverable from the project would be a small set of communication tools that build community, context and advocacy around what we believe is the 'truth' (or at least things worth paying attention to) in the broader standards community.

Many have scoffed at these goals in the past, claiming there was no 'value'. In my book this set of goals is not only worthy, it is increasingly critical to the success and health of the web.

A failure of Imagination and Conviction

Added on by Chris Saad.

As you might know if you follow my work even remotely, my projects almost always come from a place of philosophical supposition. That is, I first create a model that I think matches the current and emerging state of the world, and then I create a product, project, format or other artifact that works inside, encourages or commercializes that model. Many of my colleagues at JS-Kit do the same thing. Khris Loux and I, for example, spend hours and hours discussing our shared world views and how they translate into features, business direction and general life goals.

This methodology allows us to couch our decisions in well thought out mental models to make them more consistent, predictable and, we hope, more effective.

Over the years, and with my friends, I've proposed a number of these philosophical models including APML, DataPortability and most recently (this time working with Khris) SynapticWeb.

One of the hardest aspects of creating a philosophical model, however, is truly letting it guide you. To trust it. To take its premise to the logical conclusion. Another challenge is explaining this methodology (and the value of the resulting outcomes) to others who a) don't think this way and b) have not taken the time to examine and live the model more fully.

Many times, the choices and decisions that I/we make from these models are nuanced, but the sum of their parts, we believe, is significant.

Let me give some concrete examples.

Social Media

There is this ongoing tension between the value of social/user-generated media and the media produced by 'Journalists'. Sure, social media is amazing, some say, but bloggers will never replace the role of journalists.

The fact of the matter is, if your philosophical world view is that social media is important - that it is a return to one-to-one personal story-telling and that it allows those in the know, involved in the action, to report their first-hand accounts - then you must necessarily expand your imagination and have the conviction to follow that line of logic all the way to the end.

If you do, you must necessarily discover that what distinguishes journalists from 'us' as social media participants (all of us) is authority, perspective, distribution and an attempt at impartiality.

In the end, however, we are each human beings (yes, even the journalists). Journalists are imbued with authority because a trusted news brand vets and pays them; they are given the gift of perspective because they sit above the news and are not part of it; they have distribution because their media outlet prints millions of pieces of paper or reaches into the cable set-top boxes of millions of homes; and their impartiality is a lie.

Can't these traits be replicated in social media? Of course they can.

Reputation can be algorithmically determined or revealed through light research/aggregation; perspective can be factored in by intelligent human beings or machines that find both sides of a story; distribution is clearly a solved problem through platforms like Twitter, Digg and others; and impartiality is still a lie. At least in social media bias is revealed, and transparency is the new impartiality.

I don't mean to provide an exhaustive argument here for why social media as a philosophical framework holds up as a new paradigm for news gathering and reporting - only to give an example of how we must allow ourselves to imagine outside the box and have the conviction to fully believe in our own assumptions.

Streams

The same artificial mental barriers have appeared at every step of the way with each of the philosophical frameworks in which I have participated. Streams is the most recent.

When we launched Echo we proposed that any conversation anywhere, irrespective of the mode or channel in which it was taking place, had the potential to be a first class part of the canonical and re-assembled lifestream of a piece of content.

Many pushed back. "Oh a Tweet can't possibly be as valuable as a comment" they lamented. They're wrong.

A Tweet, an @ Reply, a Digg, a Digg Comment, a Facebook Status Update, a Facebook Comment, an 'on page' comment and any other form of reaction each have just as much potential for value as the other.

Some have created artificial distinctions between them. They separate the stream into 'Comments' and 'Social Reactions'. I have news for everyone: a comment is a social reaction. Thinking of it as anything less is a failure of imagination and conviction. The trick is not a brute-force separation of the two, but rather a nuanced set of rules that help diminish the noise and highlight the signal - wherever it might be - from any mode or channel. We've started that process in Echo with a feature we call 'Whirlpools'.
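
To illustrate the idea (this is a toy heuristic of my own, not Echo's actual 'Whirlpools' logic): every reaction lives in one stream, and a simple signal score ranks items by substance and identity while ignoring which channel they came from.

```python
# Toy signal ranking for a unified reaction stream.
# Field names ("channel", "author_verified", etc.) are illustrative.

def signal_score(item):
    """Heuristic: substance and identity matter more than channel."""
    score = min(len(item["text"]), 200) / 200.0       # reward substance, capped
    score += 0.5 if item.get("author_verified") else 0.0
    score -= 0.8 if item.get("is_duplicate") else 0.0
    return score

stream = [
    {"channel": "twitter", "text": "Great analysis of the open stack and why hubs beat silos.", "author_verified": True},
    {"channel": "on-page", "text": "+1"},
    {"channel": "digg", "text": "first!!", "is_duplicate": True},
]

ranked = sorted(stream, key=signal_score, reverse=True)
# The substantive tweet outranks the thin on-page '+1', channel notwithstanding.
```

Note that an on-page comment gets no automatic bonus over a tweet: the ranking is entirely about signal, which is the whole argument against a brute-force comments-versus-reactions split.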

Communities

Another interesting failure of imagination that I come up against a lot lately is the notion of community building.

With Echo, we have taken the philosophical position that users already have a social network - many have too many of them in fact. There is no reason for them to join yet another network just to comment. Not ours, not our publisher's.

No, instead they should be able to bring their social network with them, participate with the content on a publisher's website, share with their existing friends on existing social networks, and leave just as easily.

By using Echo, you are not joining 'our community'. You already have a community. If anything, you are participating in the publisher's community - not ours.

We don't welcome new customers to 'Our community'. Instead we help their users bring their community to a piece of content, interact, share and leave.

Publishers invest large quantities of capital in producing high-quality content only to have the engagement and monetization opportunities occur on social networks. In these tough economic times, publishers cannot afford to bleed their audience and SEO to yet another social network just to facilitate commenting. That is the opposite of the effect they are trying to achieve by adding rich commenting in the first place.

If we use our imagination, and have the conviction to see our ideas through, we realize that publishers need tools that encourage on-site engagement and re-assemble offsite reactions as well - not bolster the branded 3rd party communities of the products they use.

Be Brave

In summation: be brave. Observe the world, define a philosophical framework, imagine the possibilities and have the conviction to follow through on your ideas. Stop being lazy. Stop stopping short of taking your impulses to their logical conclusions. I've found that when you consistently execute on your vision, it might be a little harder to sell your point of differentiation, but your contributions will ultimately be better, more consistent and longer lasting for your company, the web and the rest of the world.

Calling for open

Added on by Chris Saad.

Steve Gillmor often writes fantastic (and fantastically long) editorials on the landscape of the real-time web, but they are often very dense and sometimes fail to cover some key points. I thought I would take the liberty of translating and correcting his latest post with my own contributions.

Ever since FriendFeed was sold to Facebook, we’ve been told over and over again that the company and its community were toast. And as if to underline the fact, FriendFeed’s access to the Twitter firehose was terminated and vaguely replaced with a slow version that is currently delivering Twitter posts between 20 minutes and two hours after their appearance on Twitter. At the Realtime CrunchUp, Bret Taylor confirmed this was not a technical but rather a legal issue. Put simply, Twitter is choking FriendFeed to death.

Translation: The FriendFeed team were absorbed by way of acquisition. Twitter has terminated their priority access to Twitter data because FriendFeed is now owned by Twitter's primary competitor.

Correction: Of course Twitter turned them off. Facebook is Twitter's self-declared number one competitor. When you own the platform and the protocol you have every right to protect your own arse. In fact they have an obligation to their shareholders and investors.

What’s odd about this is that most observers consider FriendFeed a failure, too complicated and user-unfriendly to compete with Twitter or Facebook. If Twitter believed that to be the case, why would they endeavor to kill it? And if it were not a failure? Then Twitter is trying to kill it for a good reason. That reason: FriendFeed exposes the impossible task of owning all access to its user’s data. Does Microsoft or Google or IBM own your email? Does Gmail apply rate limiting to POP3 and IMAP?

Translation: Most commentators think that FriendFeed is dead because the founders have been bought by and buried inside Facebook. If FriendFeed is so dead, why is Twitter trying to choke it?

Correction: FriendFeed is clearly dead. If you have ever worked for a startup and tried to ship a running product, you know that focus is the only thing that will keep you alive. Facebook is a massive platform serving a scale of social interaction previously seen only in distributed systems like email. The last thing Facebook wants is for its newly acquired superstar team to waste time working on a platform that no longer matters to its commercial success or the bulk of its users (i.e. FriendFeed).

Twitter is choking FriendFeed for another reason - because its systems are now essentially just a proxy to Facebook. As stated above, Twitter cannot give its number one competitor priority access to one of its major assets (i.e. timely access to the data).

The data over which Microsoft and Google do not exercise hoarding tactics (the examples Steve gave were IMAP and POP3) flows through open standards and open protocols.

I am never sure about Steve's position on open standards; he often vacillates between championing the open cause through projects like the Attention Trust and claiming things like APML and DataPortability are bullshit - maybe he just doesn't like me (that can't be right, can it Steve?).

The fact is, however, that open standards and protocols are the basis for open systems, which is why companies like Microsoft and Google do not control your email. Twitter and Facebook are not open systems.

So the reason Twitter is killing FriendFeed is because they think they can get away with it. And they will, as far as it goes, as long as the third party vendors orbiting Twitter validate the idea that Twitter owns the data. That, of course, means Facebook has to go along with it. Playing ball with Twitter command and control doesn’t make sense unless Facebook likes the idea of doing the same thing with “their” own stream. Well, maybe so. That leaves two obvious alternatives.

The first is Google Wave, which offers much of the realtime conversational technology FriendFeed rebooted around, minus a way of deploying this stream publicly. The Wave team seems to be somewhat adrift in the conversion of private Waves to public streams, running into scaling issues with Wave bots that don’t seem to effectively handle a publishing process (if I understood the recent briefing correctly.) But if Waves can gain traction around events and become integrated with Gmail as Paul Buchheit recently predicted, then an enterprising Wave developer might write a bot that captures Tweets as they are entered or received by Twitter and siphons them into the Wave repository in near realtime.

Translation: Twitter is killing FriendFeed because they think no one will notice or care enough to stop them - Twitter has more than enough momentum and support to continue along its current path. Facebook won't cry foul because they are doing the same hoarding with their own data.

Maybe Google Wave might save the day, but they seem to have lost their way.

Correction: Actually, the only people who can call bullshit on Twitter and Facebook are us, the media. We are all media, after all. Steve Gillmor is in fact one of the loudest voices - he should call bullshit on closed systems in general. Instead we all seem to be betting on one closed system to do better than another closed system.

We are like abused wives going back for more, each time pretending that our husbands love us. Guess what, they don't love us. They love their IPO.

I was the first to support Google Wave very loudly and proudly. I met with the team and was among the first to get in and play with the preview. It is a revolution in collaboration and how to launch a new open system. It is not, however, a Twitter or Facebook competitor. Especially not in its current state. It is not even a replacement to email. It is simply the best damned wiki product ever created.

Waves are the 180° opposite of FriendFeed and Facebook or even Twitter. They are open, flexible and lacking any structure whatsoever. Their current container, the Google Wave client, however, is totally sub-optimal for a messaging metaphor, much less a many-to-many passive social platform. It is a document development platform. Nothing more.

The same could be true of Microsoft’s deal for the firehose, but here, as with Google, Twitter may not want to risk flaunting ownership of a stream that can so easily be cloned for its enterprise value. And as easily as you can say RSS is dead, Salesforce Chatter enters the picture. Here’s one player Twitter can’t just laugh off. First of all, it’s not Twitter but Facebook Benioff is cloning, and a future Facebook at that, one where the Everyone status will be built out as a (pardon the expression) public option. This free cross-Web Chatter stream will challenge Facebook’s transitional issues from private to public, given that Salesforce’s cloud can immediately scale up to the allegedly onerous task of providing personalized Track on demand.

Translation: Maybe the enterprise players - specifically Salesforce's Chatter - will save the day.

Correction: Doubtful. This is just another closed system for a specific vertical. It's long overdue. It is awesome. But it is not a Facebook or Twitter competitor, much less an open alternative to the proprietary messaging systems we keep flocking to. It is simply a long overdue expansion of the simple changelog tracking feature on ERP assets. It's a simple feature that was prompted by a simple question: "Why doesn't the asset changelog include more data - including social data?" Duh. I was doing this in my own web-based CRM at the start of the decade.

It’s likely this pressure can be turned to good use by Facebook, unencumbered as they are by any licensing deal with Twitter. Instead, a Chatter alliance with the Facebook Everyone cloud puts Salesforce in the interesting position of managing a public stream with Google Apps support, which eventually could mean Wave integration. Where this might break first is in media publishing, as Benioff noted at the CrunchUp. Twitter’s leverage over its third party developers could be diluted significantly once Salesforce offers monetization paths for its Force.com developers. So much so that this may call Twitter’s bluff with FriendFeed.

Translation: No idea

But FriendFeed has always been more of a tactical takedown of Twitter than an actual competitor, a stalking horse for just the kind of attack Twitter seems most afraid of. No wonder the speed with which Twitter is introducing metadata traps to lock down the IP before a significant cloud emerges to challenge its inevitability. Lists, retweets, location — they’re all based on raising the rate limiting hammer to discourage heading for the exits. It’s not that retweets reduce the functionality of the trail of overlapping social circles, it’s that they lock them behind the Wall.

Translation: Twitter is introducing more metadata into tweets to maintain its lock-in through API limits etc.

Correction: On this point Steve is partially correct. This isn't about rate limiting though - it's about turning Twitter's proprietary protocol into a real-time transport for all the data the web has to offer. It is not about API limits but rather cramming so much value into the pipe that the pipe becomes like water - you gotta drink from it or you're going to die.

I don’t expect anyone from Twitter to answer the simple question of when will Twitter give FriendFeed the same access they provide other third party client vendors. For now, it’s frustrating to not see the flow of Twitter messages in realtime, but over time we’ll build tools on top of FriendFeed to take such embargoed messages private. Once inside FriendFeed, the realtime conversations that result are just the kind of high value threads Chatter will support, Wave will accelerate, and Silverlight will transport. Keep up the good work, Twitter.

Translation: I doubt Twitter will play nice with FriendFeed and give them equal access again, because once items are inside FriendFeed they turn into rich conversations. Conversations that Chatter will support, Wave will accelerate and Silverlight will transport.

Correction: Actually Twitter does not and has never given fair and equal access to its data. FriendFeed had a moment in the sun with first class access the likes of which almost no one else has seen before or since.

I have no idea how Chatter fits into the B2C picture - it is clearly an Enterprise play for Salesforce. Wave indeed will act as a great interface through which to participate in real-time threads. The threads themselves, however, will need to be generated or framed by much more rigid systems designed for public discussion.

Silverlight is great for rich web apps. It is Microsoft's way of bringing the richness of the client into the browser. Just like .NET is to Java, Silverlight is to Flash. A way for Microsoft to leverage a key technology component without handing the crown to someone/something it doesn't control. But I'm not sure it fits into this discussion.

In the end, the only real solution for all of this, of course, is a return to the way the web has always worked (well). Open systems. The transport should not be Twitter, Facebook, FriendFeed, Wave or any other nonsense. It should be RSS and Atom (ActivityStrea.ms specifically) transported over PubSubHubBub and read by open standards aggregators. The namespaces should be OpenID based and adoptable by all.
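To make the open stack above concrete, here is a minimal, illustrative sketch of how an aggregator would subscribe to a blog's Atom feed through a PubSubHubBub hub so that new entries get pushed in real time. The hub, topic and callback URLs are hypothetical placeholders, not real endpoints.

```python
# Sketch of a PubSubHubbub subscribe request (the protocol later
# standardized as WebSub): the subscriber POSTs a form-encoded body
# to the hub, which verifies the callback and then pushes new Atom
# entries to it as they are published.
import urllib.parse

HUB_URL = "https://hub.example.com/"                        # hypothetical hub
TOPIC = "https://blog.example.com/feed.atom"                # Atom feed to follow
CALLBACK = "https://aggregator.example.com/push-callback"   # our receiving endpoint

def build_subscribe_request(topic, callback):
    """Build the form-encoded body for a PubSubHubbub subscribe call."""
    return urllib.parse.urlencode({
        "hub.mode": "subscribe",
        "hub.topic": topic,
        "hub.callback": callback,
        "hub.verify": "async",  # hub confirms via a GET to the callback
    })

body = build_subscribe_request(TOPIC, CALLBACK)
print(body)
```

With this in place, any open-standards aggregator can follow any blog the moment it publishes - no proprietary firehose deal required.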

The sooner the early adopter community realizes this, commentators push for it and developers code for it, the better off we will all be.

Disclosure: I work for JS-Kit, creators of Echo - one of the largest providers of Real-time streams. I also Tweet - trying to find an alternative though!

Twitter Lists and Namespaces

Added on by Chris Saad.

A very important fact that seems to be getting little to no coverage at the moment about Twitter Lists is the issue of namespaces. Twitter's number one asset is its control and allocation of namespaces. Those little things we call 'Usernames'. @chrissaad is not just my Twitter Name, it is a short form addressable identity that concretely links to my Twitter inbox any time someone uses it in a Tweet.

Addressable, convenient namespaces that can be used in a sentence like this are so interesting and important that Facebook went to great lengths to copy them. Nothing on the open web has yet come close to this simplicity and effectiveness. Which is not to say there won't be an alternative soon.

The important fact with Twitter usernames, though, is that they are unique. There is a finite and shared 'space' in which 'names' can be allocated.

The result is that early adopters end up with all the best names and squatters rush to lock up all the best phrases. Latecomers to the system end up with names like chris2423.

Twitter Lists, however, are different. They include the list creator's username. For example my JS-Kit list is "@ChrisSaad/jskit".

As you can see, the list 'jskit' is attached to my username. This means that each user has their own namespace.

The result: there can't be a landrush for list names because the list naming convention sits on top of the username. It also means that no one can own a definitive list on a subject because each list is subjective.

This is an important design decision for Twitter. One that has both pros and cons for the community. Overall, however, I think the decision was a correct one. Lists can rise and fall organically (or at least based on the influence and popularity of their creators) without the pain and pressure (for Twitter) of maintaining yet another shared namespace.
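The difference between the two namespaces can be sketched in a few lines. This is illustrative pseudocode of the design, not Twitter's actual implementation: usernames live in one shared global namespace, while list names are scoped under their owner's username.

```python
# Global username namespace: one shared space, first claimant wins.
usernames = set()

def register_username(name):
    """Claim a name in the single global namespace; fails if taken."""
    if name.lower() in usernames:
        return False  # land-rush: collisions are global
    usernames.add(name.lower())
    return True

# List namespace: keyed by (owner, list name), so it sits on top
# of the username and collisions are only ever local to one user.
lists = set()

def create_list(owner, list_name):
    """Two different users can both own a list called 'jskit'."""
    key = (owner.lower(), list_name.lower())
    if key in lists:
        return False  # collision only within one user's own lists
    lists.add(key)
    return True

register_username("chrissaad")      # True
register_username("ChrisSaad")      # False - global collision
create_list("chrissaad", "jskit")   # True
create_list("someoneelse", "jskit") # True - no land-rush
```

The design choice is visible in the addressing scheme itself: "@ChrisSaad/jskit" carries its owner in the key, so Twitter never has to arbitrate who "deserves" a list name.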

Twitter's username namespace, however, is just waiting to cause all sorts of headaches. I don't envy their position and I can't wait for an open alternative.

Stalqer - Viral Loops and Network Effects

Added on by Chris Saad.

Today a company I am advising has launched in the press and will soon be available in the Apple App Store. They are called Stalqer and, as Techcrunch writes, they are basically Foursquare on steroids.

I think that's a pretty good description. The fact is, however, the most impressive thing about Stalqer is not what it does but how it does it. Rather than approaching acquisition and retention of users like any typical app, it uses data portability, viral loops and network effects to on-board and engage users on an ongoing basis.

Not enough app developers consider this when engineering their user experiences, and the result is usually a big 'Techcrunch' launch and a big flame-out as users flock for a 5-minute road test and never return.

Mick (CEO of Stalqer) and his team, however, have almost turned virality and network effects into a science.

Here are some of the highlights of their product decisions.

  1. Instead of building yet another registration and friending system, they simply import your Facebook Friends.
  2. Instead of being content to be confined by Facebook's data licensing limitations, they merge and mingle FB data with other data sources (in this case, your phone's address book!) to access email addresses and phone numbers.
  3. Instead of assuming that their app lives in a vacuum, they are using other data sources (Facebook, Phone Book and eventually others) to aggregate location data and make a best guess at friend locations even if they aren't using the app.
  4. Instead of being limited by their active user base, they encourage existing users to manipulate and optimize profiles of non-users - the effect being that even if you don't use Stalqer, chances are one of your friends is doing the work of checking you in. Don't like where they put you - then sign up and get back control!
  5. Instead of letting the multitasking limitations of the iPhone limit their background tracking capabilities, they innovated their way out of the problem using amazing email tricks.
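Point 2 above - merging a social-graph import with the phone's address book - can be sketched roughly. The record shapes and the matching rule (joining on a normalized full name) are my own illustrative assumptions, not Stalqer's actual code.

```python
# Hedged sketch: enrich imported social-graph friends with email and
# phone numbers from a local address book, joining on normalized names.

def normalize(name):
    """Lowercase and collapse whitespace so 'Ada  Lovelace' matches 'ada lovelace'."""
    return " ".join(name.lower().split())

def merge_contacts(social_friends, address_book):
    """Return social records enriched with email/phone where a match exists.

    social_friends: [{"name": ...}]            (e.g. a Facebook friend import)
    address_book:   [{"name": ..., "email": ..., "phone": ...}]
    """
    by_name = {normalize(c["name"]): c for c in address_book}
    merged = []
    for friend in social_friends:
        record = dict(friend)
        local = by_name.get(normalize(friend["name"]))
        if local:
            record["email"] = local.get("email")
            record["phone"] = local.get("phone")
        merged.append(record)
    return merged

friends = [{"name": "Ada Lovelace"}, {"name": "Alan Turing"}]
book = [{"name": "ada  lovelace", "email": "ada@example.com", "phone": "555-0100"}]
result = merge_contacts(friends, book)
```

The design point is that neither source alone is enough: the social graph supplies the friend list, the address book supplies the contact channels, and the merge produces records richer than either silo would license on its own.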

The list of innovations goes on and on.

The Stalqer team have done an amazing job of baking in the right workflows to ensure maximum adoption and engagement based on their primary use case (discovering people around you) without resorting to raw gaming tricks like points and badges.

I can't wait to see how the app performs and what they do next!

As a side note, I too have been experimenting with non-obvious network effects in my day-job. More on that later...

FriendFeed is over - Time for a Blog Revolution

Added on by Chris Saad.

The blog revolution that I spoke of in my previous post 'Blogs are Back' feels to me, right now, like the Iranian revolution that almost happened a couple of months back. It is in danger of fading away as we get wrapped up in 'what will Facebook do next' mania. You see, a couple of months ago there seemed to be an awakening that blogs are the first, best social networking platforms. This realization seemed to be driven by many converging factors including...

  1. Twitter Inc decisions that have not reflected the will of the community – particularly changing the @ behavior, changing their API without informing developers, making opaque decisions with their Suggested User List and limiting access to their Firehose.
  2. Facebook’s continued resistance to true DataPortability
  3. The emergence of tools and technologies that turn blogs into real-time, first class citizens of the social web. Tools like Lijit, PubSubHubBub and of course Echo.
  4. A broader understanding that blogs are a self-owned, personalized, tool agnostic way to participate in the open social web.
  5. FriendFeed selling out to Facebook
  6. A flurry of great posts on the subject
  7. The broader themes of the Synaptic Web

Instead though, it now seems that many bloggers are holding on desperately to the notion that FriendFeed may survive or that Facebook may get better. They continue to pour their content, conversation and influence into a platform that does not hold their brand, their ads or their control. We all seem desperate to see what next move these closed platforms make.

I have news for you - FriendFeed is dead. The team has moved on to work with the core Facebook team.

At best, FriendFeed will go the way of Del.icio.us and Flickr - stable but not innovating. At worst, it will go the way of Jaiku or even Dodgeball.

It's time we start re-investing in our own, open social platforms. Blogs. Blogs are our profile pages - social nodes - on the open, distributed social web.

Blogs missing a feature you like from FriendFeed? Build a plugin. There's nothing Facebook or FriendFeed does that a blog can't do with enough imagination.

Our job now, as early adopters and social media addicts, should be to build the tools and technologies to educate the mainstream that blogs and blogging can be just as easy, lightweight, social and exciting as Facebook. Even more so.

All that's needed is a change in perspective and slight tweaks around the edges.

Blogs are back.

Who's with me?

Blogs are Back

Added on by Chris Saad.

When Khris and I showed Robert Scoble Echo prior to the Launch at the Real-Time Crunchup he said "Wow, Blogs are Back!". I couldn't agree more. It looks like his sentiment is starting to propagate.

When I say Blogs are Back, I mean that other forms of social media (Twitter, Facebook, FriendFeed etc) are now finding their rightful balance with the first and foremost social platform: blogging.

This is not to suggest that other forms of interaction are going away, only that there is a natural equilibrium to be struck.

There are a number of factors that are helping this trend along.

They include:

  1. Twitter Inc decisions that have not reflected the will of the community - particularly changing the @ behavior, changing their API without informing developers, making opaque decisions with their Suggested User List and limiting access to their Firehose.
  2. Facebook's continued resistance to true DataPortability
  3. The emergence of tools and technologies that turn blogs into real-time, first class citizens of the social web. Tools like Lijit, PubSubHubBub and of course Echo.
  4. A realization that blogs are a self-owned, personalized, tool agnostic way to participate in the open social web.
  5. The broader themes of the Synaptic Web

I also discussed this with Dave Winer, Doc Searls and Marshall Kirkpatrick the other day on the BadHairDay podcast.

You can also see previous references to this in my 'What is Echo' post. I've also posted a more detailed account of how Echo fits into this notion on the JS-Kit blog.

Robert Scoble and Shel Israel have also posted on this. I also registered 'BlogsAreBack.com' (what should I do with it?).

I look forward to seeing what this new trend brings!