Product & Startup Builder


Building in someone else's yard

Added on by Chris Saad.

Loic Le Meur writes over on LinkedIn about his mistakes betting on Twitter with his company Seesmic. Seesmic was a company that produced a series of great Twitter clients for multiple platforms (Mobile, Web, Desktop etc). When Twitter started shutting down developers and releasing its own official clients, Seesmic's business was undermined and the company ultimately shuttered.

I'm not blaming Twitter for this strategic change – they did not know they would take that decision at the time when they were fully supporting their ecosystem. I blame myself entirely. I should have never dedicated all my team resources to build on one platform. That is a lesson learned the hard way along with many other developers. I was too excited and became blind.
...
Here are my two cents for entrepreneurs betting on someone else's success: be careful that everything can change from one day to another and all the rules will change. I will never be that dependent on anyone anymore.

 

Loic is a wicked smart and very successful entrepreneur. He's always smiling, generous and well liked by his peers. It's a real shame that Twitter pivoted in the way that it did to undermine his business.

I'd like to refine Loic's lessons learned a little here, though. In my opinion the problem was not betting on someone else's platform but rather...

  1. Twitter is not a platform; it's a media company
  2. Betting on a single media company rather than several

Whenever a company makes money from Ads, it's not a platform/technology company - it's a media company. As a media company, it needs to control the eyeballs so that it can control the ad impressions.

To be fair, though, Twitter's ad revenue model wasn't in place when Loic started betting on them. It was clear, however, that their revenue model was still in flux and that ads would play a role in order to keep the service free for end-users.

The reality is companies successfully rely on other platforms all the time. Amazon Web Services is a great example of this. There's never a risk that AWS is going to start turning off or competing with its developers because it is a true platform.

Like AWS, Echo is a true platform. We make our money by encouraging developers to build world class apps on our platform and we even help them sell those apps to major customers.

Facebook, Twitter etc were never true technology platforms. They are distribution channels. They are data sources. They are social services. But they are not platforms.

Ironically, this is still happening today. Major media companies and developers still spend enormous sums of money encouraging their users to participate on Twitter and Facebook as 'outsourced engagement platforms'. Media companies, who should understand the value of owning the audience and the ad impressions, are happily outsourcing them to competing media companies (Facebook and Twitter). I write more about this over on the Echo blog.

The key, then, is not avoiding 3rd party platforms, but rather to understand the difference between platforms, products, services and media companies. It's key to understand the incentives, revenue flows and business models so you can understand how to align your company and product with the value chain.

 

Dark Social and Facebook+


Working with large brands at Echo is thrilling. They have the content, products and reach that matter in everyday people’s lives. This means that even small improvements in their Realtime, Social strategy result in big impacts on large groups of people. One of the prevailing misconceptions we find when we first get started with a new customer, however, is that Facebook is Social. Facebook comments, Facebook Likes and Facebook Fan Pages are often seen as the beginning and the end of the social ‘strategy’.

For as long as I can remember, my career has been about helping others to remember that Facebook (or Myspace or AOL etc) can only ever be one part of the larger web and Internet landscape. The percentage fluctuates of course but it is never 100%.

A new article in The Atlantic this week, however, reminds us that not only is Facebook only a fraction of the overall web (in terms of traffic referrers and participation) but also that it's not even the biggest fraction. It also reminds us that while modern social networking has introduced many powerful novelties, being social on the internet is far from a new phenomenon. In fact, it has been a pervasive part of internet interactions since the beginning - think Email and Instant Messaging for example. These ‘old’ tools continue to have a huge (in fact the largest) impact on your referrer traffic and engagement.

This engagement, however, is under-measured and not well understood. The Atlantic postulates that it appears in web analytics as traffic with no known referrer landing on pages other than the home page or section fronts - assuming that direct traffic to deep links can only come from people sharing links with one another using tools that don’t leave referrer signatures. So The Atlantic has taken to calling this class of traffic ‘Dark Social’.
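To make the heuristic The Atlantic describes concrete, here is a minimal sketch of how an analytics tool might classify a single page view. The domain lists and function name are hypothetical illustrations, not any real product's API; real tools maintain far longer referrer lists.

```python
from urllib.parse import urlparse

# Hypothetical referrer lists for illustration only.
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "t.co", "reddit.com"}
SEARCH_DOMAINS = {"google.com", "bing.com"}

def classify_visit(referrer: str, landing_path: str) -> str:
    """Classify one page view using the heuristic described above."""
    if referrer:
        host = urlparse(referrer).netloc.lower().removeprefix("www.")
        if host in SOCIAL_DOMAINS:
            return "social"
        if host in SEARCH_DOMAINS:
            return "search"
        return "other"
    # No referrer at all: a bare homepage hit was plausibly typed in,
    # but a deep link with no referrer was almost certainly shared via
    # email, IM or another tool that strips referrer signatures.
    if landing_path in ("", "/"):
        return "direct"
    return "dark social"

print(classify_visit("", "/2012/10/dark-social/"))                        # → dark social
print(classify_visit("http://www.facebook.com/", "/2012/10/dark-social/"))  # → social
```

The key design choice is the last branch: referrer-less traffic to a deep link is reclassified from "direct" to "dark social", which is exactly the reallocation the article argues for.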

Below is a chart of their referral traffic as measured by Chartbeat. Most notably, they have labeled the relevant traffic as 'Dark Social' on the chart.

This chart clearly shows that, for The Atlantic, Dark Social and non-Facebook ‘Standard Social’ together account for almost 80% of all referral traffic.

In this light it is obvious that what’s needed is a ‘Facebook+’ strategy. Or better put, a strategy that puts your website at the center, with Mobile + Desktops + Facebook + Twitter + Reddit + Digg + StumbleUpon + Dark Social + many others as link distribution pipes.

This means that for maximum coverage and distribution, every login, sharing, commenting, following, notification, trending surface can’t just be a Facebook widget. You need white label Social Software Infrastructure that connects your audience to your site using the tools, technologies and distribution opportunities of the entire web.

The web has always been, and will always continue to be the platform. Social or otherwise.

The Open Web Is Dead - Long live the Open Web


Yesterday Robert Scoble once again declared that the Open Web was dead. His argument was that Apps and proprietary black holes like Facebook are absorbing all the light (read: users, attention, value, investment) and taking our beloved open platform right along with it. In his post, he kindly (but incorrectly) named me as the only person who really cares about the Open Web. While that's flattering, I think he's wrong about me being the only one who cares.

But he is right about the Open Web. It's in real danger. URLs are fading into the background, native Mobile apps are all the rage and Facebook threatens to engulf the web into a proprietary black hole.

But I think there's a bigger problem going on right now. Not just with the web, but with silicon valley (as stewards of the web). We've lost sight of the things that matter. We're obsessed with quick wins, easily digestible VC pitches, stock options and flipping for a Ferrari.

There's more to this game than that. Let me touch on some of the things I see going on.

  1. Lead, not just cheerlead. In our obsession with being seen by our micro-audiences as 'thought leaders' or 'futurists' it's always very tempting to watch which way the wind is blowing and shout loudly that THERE is the future. Like a weather vane, it's easy to point the way the wind is blowing, but our biggest, best opportunity is not to declare a popular service 'the next big thing' just because a few visible people are hanging out there. Rather, our collective and individual responsibility is to help articulate a direction we think moves the state of the art forward for both the web and for society at large. Something, as leaders of this field, we believe in. Just like VCs develop an investment thesis, we should all have a vision for where the web is going (and how it should get there) and actively seek out, support and promote quiet heroes who are building something that moves the needle in the right direction.
  2. Add to the web's DNA. Almost every startup I see today is focused on building an 'App' and calling it a 'Platform'. Too often (almost every time), though, these apps are nothing more than proprietary, incremental and niche attempts at making a quick buck. We need more companies to think deeper. Think longer term. What are you doing to change the fabric of the web's DNA forever? How can you contribute to the very essence of the Internet the same way that TCP/IP, HTTP, HTML, JS and so many other technologies have done? Even proprietary technologies have provided valuable evolutions forward - things like Flash and yes, even FB. How are you going to live forever? This is why Facebook used to call itself a 'Social Utility' instead of a 'Social Network'. Mark Zuckerberg was never content to be the next Myspace Tom. He wanted to be the next Alexander Graham Bell. And now he is.
  3. Don't just iterate, innovate. Of course, someone has to build Apps. We can't all be working at the infrastructure layer. But too many of the Apps we choose to build (or champion) are incremental. As startup founders, investors and influencers it's so easy to understand something that can be described as the 'Flipboard of Monkeys' instead of thinking really hard about how a completely new idea might fit into the future. Sure, there are plenty of good business and marketing reasons why you shouldn't stray too far from the beaten path, broadening it one incremental feature at a time, but the core essence of what you're working on can't be yet another turn of a very tired wheel. If you're shouting 'Me too' then you're probably not thinking big enough.
  4. B2C, not Ego2C. Silicon valley is clearly a B2C town. We all love the sexy new app that our mother might eventually understand. Something we can get millions of users to use so we can show them lots of ads. Besides the fact that I think we should focus a little more on B2B, the problem is we're not really a B2C town at all. We're actually more focused on what I will call Ego2C. That is, we pick our favorite apps based on how famous the founding team is OR how easily we can use the app to build yet another niche audience for ourselves (and brands/marketers). It would be a tragedy if the social web revolution boils down to new methods of PR and marketing. But that's what we seem to be obsessed with. As soon as any app from a famous founder gets released we give it tons of buzz while plenty of more deserving projects get barely a squeak. If the app gets a little traction (typically the ones that have Ego mechanics baked in) you see a million posts about how marketers can exploit it. Inevitably the app developers start to focus on how to 'increase social coefficients' instead of how to help human beings make a connection or find utility in their lives.
  5. "Users don't care". Speaking more specifically about the Open vs. Closed debate, too often we hear the criticism "Users don't care about open". This is absolutely true and the reason why most open efforts fail. Users don't care about open. They care about utility and choice. This is why the only way to continue propagating the open web is to work with BUSINESS. B2B. Startups, media brands, the bigco tech companies. They care about open because the proprietary winners are kicking the losers' asses, and that usually means there is at least one other player who needs a competitive advantage. They need to team up and build, deploy and popularize the open alternative. That's why open always wins. There are always plenty of losers around who are going to commoditize the popular closed thing. As technology leaders we're paid to care about things users don't care about. Things that shape the future. While users, in the short term, might not care, we should dare to think and dream a little bigger. As a case study, look at Android vs. iOS. iOS is more profitable for a single company, but the other is now a force of nature.
  6. Death is just a stage of life. Just because something is no longer interesting doesn't mean it's dead. Its spirit, and often the actual technology, lives on, one layer below the surface. RSS is a great example of this. RSS's spirit lives on in ActivityStreams and the general publish/subscribe model. It is powering almost every service-to-service interaction you currently enjoy. Is it dead, or has it simply become part of the DNA of the Internet? Could RSS (or something like it) be better exposed higher up in the stack? Absolutely, but that will take some time, thoughtful execution and influencers who are willing to champion the cause. The same is true for OpenID and OAuth.
  7. The arc of the universe is long, but it bends towards Open. The battle of Open vs. Closed is not a zero-sum game. Both have their time. It's a sine wave. First, closed, proprietary solutions come to define a new way of fulfilling a use case and doing business. They solve a problem simply and elegantly and blaze a path to market awareness, acceptance and commercialization. Open, however, always follows. Whether it's a year, a decade or a century, Open. Always. Wins. The only question is how long, as an industry, we are going to keep our tails tucked between our legs in front of the great giant proprietary platform of the moment, or whether we are going to get our act together to ensure the "Time to Open" is as short as possible. It takes courage, co-ordination and vision, but we can all play our part to shorten the time frame between the invention of a proprietary app and the absorption of that value into the open web platform.
  8. Acknowledge reality. FB has won. It's done. Just like Microsoft won the Desktop OS (in part handed to them by IBM), so too has FB won the Social OS (in part handed to them by Microsoft). For now. Acknowledging the truth is the first step to changing it. The only question now is how long we're all willing to wait until we get our act together to turn the proprietary innovation of the 'social graph' into part of the open web's core DNA. We need to recognize our power. They have ~1B users? The open web has more. Chances are that the major website or brand you work for has plenty of its own users as well. Are you going to send them to FB, or are you going to invest in your own .com? Trust me, I know it's really, really easy to take what you're given because you're too busy putting out a million fires. But as technology leaders I challenge us all to build something better. We're the only ones who can.
  9. [Edit] Don't kill Hollywood. Did you catch the YC post calling for silicon valley to kill Hollywood? Not only was this reckless and short-sighted, it's the exact opposite of what we should be doing. Instead of trying to kill or cannibalize media companies and content creators, how about we work with them to create the next generation of information technology? They have the audiences + information and we have the technology. Instead, most silicon valley companies, by virtue of their B2C focus, are too busy leeching off major media instead of finding ways to help transform it. Sure, most of them move slowly - but move they are. Move they must. Helping them is very profitable. I write more about this on the Echo blog - calling it 'Real-time Storytelling'.
  10. [Edit] Today's data portability problem. When I started the DataPortability project the issue of the time was personal data portability. That's not the case anymore. While user-centric data portability is still being done via proprietary mechanisms, it's a) actually possible and b) moving more towards open standards every day. The real issue right now is firehoses. Access to broad corpuses of data so that 3rd parties can innovate is only possible through firehoses (for now). To put it another way, the reason Google was possible was because the open web was crawl-able - for free - with no biz dev deal. The reason FB was possible was because the open web allowed any site to spring up and do what it wanted to do. Today, too much of our data is locked up in closed repositories that can and must be cracked open. Google's move to exclude other socnets (besides G+) from their search results until they had free and clear access to them might be inconvenient for users in the short term, but, as a strategic forcing function, it is in the best interest of the open web long term.

End of rant.

WSJ Outsources its business to Facebook


Today WSJ announced that it has built a news publishing platform that lives inside Facebook - effectively outsourcing their core website to the Social Networking Giant. The number of reasons this is a bad idea is staggering. I've tried to summarize them in a spreadsheet comparing a FB approach versus an Open Web approach.

Please feel free to contribute.

Real Names getting Real Attention


There's a lot of fury on the web right now about 'Real Names'. FB is trying to use it as a unique feature of their comments system claiming it reduces trolling and low value comments. Of course that isn't really true. For one, any commenting system could force FB login. Two, users will troll with or without their name attached and, worse yet, many legitimate users won't participate for any number of reasons if they can't use a pseudonym. There are plenty of better ways to increase quality in your comments including participation from the content creators, game mechanics, community moderation and more.

The real debate, however, is about G+ trying to copy FB's stance on Real Names. They are insisting all user accounts use them and are actively shutting down accounts that violate the policy. They are being so heavy-handed about it that even people who ARE using their real name are getting notices of violation - most notably Violet Blue.

I'm not really an expert on pseudonyms, shared contexts and anonymity so I'm going to stay out of this debate.

The real question for me, however, is: what is Google's strategic business reason for this policy? There must be a long-term plan/reason for it, otherwise they wouldn't be insisting so hard.

My assumption is that it's related to their intention to become a canonical people directory and identity provider on the internet to compete with FB in this space.

FB, after all, does not just get its power from news feeds and photo apps - it gets it from the deep roots it has laid down into the DNA of the internet as the provider of 1st class identity infrastructure and identity information.

In this sense, FB's social contract has served them very well, and Google's attempt to copy it is a hint that they understand FB is not just a .com feature set, but a powerful identity utility. They must (and in some cases seem to) understand that strategy and its aggressiveness if they are to properly compete with the monopoly. My only hope, however, is that they are coming up with their own inspired counter-strategy rather than just copying the moves they see on the surface - because that's doomed to fail.

Initial quick thoughts on Google+


It's certainly very slick, but it's a few years behind FB. I mean that not just in timing and network effects, but in the much more strategic sense of platform ambition. FB.com was the FB strategy 4 years ago. FB is now going for the rest of the web. Its reach and role as an identity provider and social infrastructure player makes it much more important (and harder to beat) than launching a cool new service. So hopefully the Google+ team is thinking WAY beyond this as a destination site when they are thinking Google Social Strategy.

So far the broad-ranging announcements, from the +1 button to Google Analytics adding Social, bode well for this being a company-wide, product-wide refresh. The key to success will be in thinking about the need to compete with FB beyond the walls and products of Google.

The key to that, of course, will be to get deep adoption by major sites.

Update: Upon thinking about it a little more. Google has once again missed an opportunity to play to their strengths. With the document web they played the role of aggregator and algorithmic signal detection system. With the social web, their ideal strategy would be to build the ultimate social inbox. A place where I can navigate, consume AND interact with Facebook + Twitter + Foursquare + Quora +++ in one place.

Instead they created yet another content source.

NYT Paywall, Huffpo Lawsuit - Symptoms of the same misconception


Over the last few days I have been debating the NYT paywall on a private email thread of friends. I didn't feel the need to post it on my blog because I thought that paywalls were so obviously a losing strategy that it was a waste of time to comment.

But combined with the recent lawsuit against the Huffington Post and Arianna Huffington's eloquent response yesterday, I felt it was worthwhile to re-publish my thoughts here. Most of them are based on thinking and writing that I did many years ago around Attention. Most of that old writing has been lost in the blog shuffle. Hopefully one day I will dig it up and re-post it in a safe place.

On to the issue...

The price of content

I believe that people have historically paid for the medium not the content.

They pay for 'Cable' not for 'CNN News'. They pay for 'The Paper' not for the content in the newspaper. They pay for 'CDs' not for the music on the album.

They also paid a lot because the medium was perceived to be scarce (scarce materials, scarce shelf space, scarce advertising dollars, scarce talented people).

Consumers are not stupid, they understand (if only somewhere at the back of their mind) that the COST of creating and distributing things has been deflated by a growing list of converging trends.

We live in a world of abundance (in the area of digital content anyway). Shelf space is infinite (database entries), any kid in a basement can make content and there is no physical media anymore so cost of distribution has disappeared as well.

The scarcity now is on the consumption side - Attention is the scarce resource. Value is derived from scarcity.

That's why on the Internet, Attention allocation systems (Google Search, FB News Feed etc) are attracting traffic, engagement and ultimately profit.

In this new world, the price of content must be reduced significantly as shakeouts and rebalancing occurs - because the cost of producing it is approaching zero.

The more the Music, TV and News industry fight this, the more they leave themselves open to disruption by Google, FB, Twitter and the rest of silicon valley.

This is not even to mention that everyone is producing content now. Tweets, Photos, Videos - it's abundant. Of course most of it isn't very 'good' by J school standards - but that's irrelevant. The world has never rewarded good with any consistency.

Also just because content is not good, doesn't mean it isn't personally meaningful.

For example, I care more what my child (theoretical child of course) posts to FB than the most important journalist in all the world says on CNN.

Please don't confuse my dispassionate assessment of the issue with pleasure or happiness at the demise of mainstream media, though.

I am simply stating the facts because without understanding those we can't begin to change them (if that's what the media world decided to do).

In terms of making a judgement of those facts, I think that curators who weave and summarize a broader narrative in the form of 'reporting' are critical for an informed citizenry and a functional democracy. I believe in it so much that I have dedicated my life to helping mainstream media companies stay relevant and co-writing things like this: http://aboutecho.com/2010/08/18/essay-real-time-storytelling/

But I also believe that mainstream mass media broke an ancient pattern (and by ancient, I mean as old as rudimentary human communication) of people telling each other personal stories, replacing it with editorialized mass broadcasts.

The Internet may just be restoring the balance. The result is some massive restructuring of inflated budgets, processes, offices, costs etc. While we're in the middle of that restructuring, it looks like a media apocalypse - at least until it settles down and a new equilibrium is found.

Here's what Arianna wrote on the subject:

The key point that the lawsuit completely ignores (or perhaps fails to understand) is how new media, new technologies, and the linked economy have changed the game, enabling millions of people to shift their focus from passive observation to active participation -- from couch potato to self-expression. Writing blogs, sending tweets, updating your Facebook page, editing photos, uploading videos, and making music are options made possible by new technologies.

The same people who never question why someone would sit on a couch and watch TV for eight hours straight can't understand why someone would find it rewarding to weigh in on the issues -- great and small -- that interest them. For free. They don't understand the people who contribute to Wikipedia for free, who maintain their own blogs for free, who tweet for free, who constantly refresh and update their Facebook pages for free, and who want to help tell the stories of what is happening in their lives and in their communities... for free.

Free content -- shared by people who want to connect, share their passions, and have their opinions heard -- fuels much of what appears on Facebook, Twitter, Tumblr, Yelp, Foursquare, TripAdvisor, Flickr, and YouTube. As John Hrvatska, a commenter on the New York Times, wrote of the Tasini suit, "So, does this mean when YouTube was sold to Google that all the people who posted videos on YouTube should have been compensated?" (And Mr. Hrvatska no doubt contributed that original and well-reasoned thought without any expectation he'd be paid for it. He just wanted to weigh in.)

Read more on her post

Update

And here's a bit of 'Free Content' - a conversation I had on Twitter with someone who disagreed with this post.

New Twitter. Feature comparison


Jeremiah and I wrote an analysis of the New Twitter vs. Current Facebook. Here's a snippet:

Situation: Twitter’s new redesign advances their user experience

Twitter has announced a new redesign today, yet looking at the news, there hasn’t been a detailed breakdown of these two leading social networks. Overall, Twitter’s new features start to resemble some features of a traditional social network, beyond their simple messaging heritage. We took the key features from both social websites, did a comparison, and voted on the stronger player.

[Great Detailed Graph goes here - See it on Jeremiah's blog]

Our Verdict: Facebook Features Lead Over Twitter’s New Redesign

Facebook’s features offer a more robust user experience, and they have a longer history of developing the right relationships with media, developers, and their users. Twitter, a rapidly growing social network, has launched a series of new features (described by the founder as “smooth like butter”) that provide users with a snappy experience and enhanced features.

We tallied the important features of this launch against their overall expansion strategy and have concluded that Facebook’s features continue to hold dominance over Twitter, despite the noticeable improvements. While we don’t expect that Twitter wants to become ‘another Facebook’, they should play to their strengths: remaining nimble and lightweight while allowing developers and content producers to better integrate into their system.

Check out the full results over on his blog.

Guest Post: Facebook's world view


Just wanted to share with you here that I wrote a guest post on Mashable last week about Facebook's world view. Be sure to check it out here.

Are these blunders a series of accidental missteps (a combination of ambition, scale and hubris) or a calculated risk to force their world view on unsuspecting users (easier to ask for forgiveness)? Only the executives at Facebook can ever truly answer this question.

What’s clear, though, is that their platform is tightly coupled with countless other websites and applications across the web, and their financial success is aligned with many influential investors and actors. At this stage, and at this rate, their continued success is all but assured.

But so is the success of the rest of the web. Countless social applications emerge every day and the rest of the web is, and always will be, bigger than any proprietary platform. Through its action and inaction, Facebook offers opportunities for us all. And in the dance between their moves and the rest of the web’s, innovation can be found.

The only thing that can truly hurt the web is a monopoly on ideas, and the only ones who can let that happen are web users themselves.

Diaspora is not the answer to the Open Web, but that's ok


For whatever reason, a new project called Diaspora is getting a lot of attention at the moment. They are four young guys who have managed to crowdsource $100k+ to build an open, privacy-respecting, peer-to-peer social network. A number of people have asked me what I think, so instead of repeating myself over and over I thought I would write it down in one place.

First, I don't think Diaspora is going to be the 'thing' that solves the problem. There are too many moving parts and too many factors (mainly political) to have any single group solve the problem by themselves.

Second, I don't think that's any reason to disparage or discourage them.

When we launched the DataPortability project, we didn't claim we would solve the issue, but rather create a blueprint for how others might implement interoperable parts of the whole. We soon learned that task was impractical to say the least. The pieces were not mature enough and the politics was far too dense.

Instead, we have settled for providing a rolling commentary and context on the situation and promoting the efforts of those that are making strides in the right direction. We also play the important role of highlighting problems with closed or even anticompetitive behaviors of the larger players.

The problem with the DataPortability project, though, was not its ambition or even its failure to meet those ambitions, but rather the way the 'old guard' of the standards community reacted to it.

The fact of the matter is that the people who used to be independent open advocates were actually quite closed and cliquey. They didn't want 'new kids on the block' telling them how to tell their story or promote their efforts. Instead of embracing a new catalyzing force in their midst, they set about ignoring, undermining and even actively derailing it at every opportunity.

Despite my skepticism about Diaspora, though, I don't want to fall into the same trap. I admire and encourage the enthusiasm of this group to chase their dream of a peer-to-peer social network.

Do I think they will succeed with this current incarnation? No. Do I think they should stop trying? No.

While this project might not work, their effort and energy will not go to waste.

I think we need more fresh, independent voices generating hype and attention for the idea that an open alternative to Facebook can and must exist. Their success in capturing people's imagination only shows that there is an appetite for such a thing.

What they might do, however, is strongly consider how their work might stitch together existing open standards efforts rather than inventing any new formats or protocols. The technologies are getting very close to baked and are finding their way into the web at every turn.

We all need to do our part to embed them into every project we're working on so that peer-to-peer, interoperable social networking will become a reality.

Welcome to the party, Diaspora team - don't let the old guard (who have largely left for BigCos anyway) scare you off.

Open is not enough. Time to raise the bar: Interoperable

Added on by Chris Saad.

Last week Elias Bizannes and I wrote a post Assessing the Openness of Facebook's 'Open Graph Protocol'. To summarize that post, it's clear that Facebook is making a play to create, aggregate and own not only identity on the web, but everything that hangs off it. From Interests to Engagement - not just on their .com but across all sites. To do this they are giving publishers token value (analytics and traffic) to take over parts of the page with pieces of Facebook.com without giving them complete access to the user, their data or the user experience (all at the exclusion of any other player). In addition, they are building a semantic map of the Internet that will broker interests and data on a scale never before seen anywhere.

In the face of such huge momentum and stunningly effective execution (kudos to them!), aiming for (or using the word) Open is no longer enough. The web community needs to up its game.

The same is true for data portability - the group and the idea. Data portability is no longer enough. We must raise the bar and start to aim for Interoperable Data Portability.

Interoperability means that things work together without an engineer first having to figure out what's on the other end of an API call.

When you request 'http://blog.areyoupayingattention.com' it isn't enough that the data is there, or that it's 'open' or 'accessible'. No. The reason the web works is because the browser knows exactly how to request the data (HTTP) and how the data will be returned (HTML/CSS/JS). This is an interoperable transaction.

Anyone could write a web server, create a web page, or develop a web browser and it just works. Point the browser somewhere else, and it continues to work.

Now map this to the social web. Anyone could (should be able to) build an open graph, create some graph data, and point a social widget to it and it just works. Point the social widget somewhere else, and it continues to work.

As you can see from the mapping above, the interaction between a social widget and its social graph should be the same as that of a browser and a web-server. Not just open, but interoperable, interchangeable and standardized.
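To make the browser/server analogy concrete, here is a minimal sketch of an interoperable transaction (my own illustration, not from the original post): a generic client that can pull the title out of any HTML page, precisely because every server on the other end speaks the same well-known protocols. It doesn't matter what software serves the page.

```python
# A minimal illustration of an interoperable transaction: a generic
# client that works against ANY web server, because both sides speak
# well-known standards (HTTP for the request, HTML for the response).
from urllib.request import urlopen
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Extract the <title> of any HTML page, regardless of which server produced it."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def fetch_title(url):
    # The client does not need to know whether the server is Apache,
    # IIS, or anything else -- only that it speaks HTTP and returns HTML.
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()
```

Point `fetch_title` at any URL and it just works - which is exactly the property the post argues a social widget should have against any social graph.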

Why? Innovation.

The same kind of innovation we get when we have cutting edge web servers competing to be the best damned web server they can be (IIS vs. Apache), and cutting edge websites (Yahoo vs. MSN vs. Google vs. Every other site on the Internet) and cutting edge browsers (Netscape vs. IE vs. Safari vs. Chrome). These products were able to compete for their part in the stack.

Imagine if we got stuck with IIS, Netscape and Altavista locking down the web with their own proprietary communication channels. The web would have been no better than every closed communication platform before it. Slow, stale and obsolete.

How do we become interoperable? It's hard. Really hard. Those of us who manage products at scale know it's easy to make closed decisions. You don't have to be an evil mastermind - you just have to be lazy. Fight against being lazy. Think before you design, develop or promote your products - try harder. I don't say this just to you, I say it to myself as well. I am just as guilty of this as anyone else out there developing product. We must all try harder.

Open standards are a start, but open protocols are better. Transactions that, from start to finish, provide for Discoverability, Connectivity and Exchange of data using well known patterns.
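As a sketch of that Discoverability, Connectivity and Exchange pattern: the endpoint and field names below are my own illustrative assumptions, not any real specification, though the `/.well-known/` discovery convention is a real pattern on the web (later formalized in RFC 6415's host-meta).

```python
# A sketch of the Discoverability -> Connectivity -> Exchange pattern.
# Endpoint and field names here are hypothetical illustrations.

def discovery_url(domain):
    # Step 1: Discoverability -- a well-known location any client can
    # construct without prior knowledge of the server.
    return f"https://{domain}/.well-known/host-meta"

def find_endpoint(descriptor, rel):
    # Step 2: Connectivity -- read the descriptor returned by the
    # discovery step to learn where (and how) to connect.
    for link in descriptor.get("links", []):
        if link.get("rel") == rel:
            return link.get("href")
    return None

# Step 3: Exchange -- once the endpoint is known, data flows using a
# well-known format, so any client can talk to any server. Example
# descriptor a (hypothetical) server might publish:
descriptor = {
    "subject": "example.com",
    "links": [
        {"rel": "activity-stream", "href": "https://example.com/activities"},
    ],
}
```

Because each step uses a well-known pattern, no engineer has to first figure out what's on the other end of the API call - which is the definition of interoperability given above.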

The standards groups have done a lot of work, but standards alone don't solve the problem. It requires product teams to implement the standards, and this is an area I am far more interested in these days. How do we implement these patterns at scale?

Customers (i.e. Publishers) must also demand interoperable products. Products that don't just connect them to Facebook or Twitter but rather make them first-class nodes on the social web.

Like we said on the DataPortability blog:

In order for true interoperable, peer-to-peer data portability to win, serious publishers and other sites must be vigilant to choose cross-platform alternatives that leverage multiple networks rather than just relying on Facebook exclusively.

In this way they become first-class nodes on the social web rather than spokes on Facebook’s hub.

But this is just the start. This just stems the tide by handing the keys to more than one player so that no one player kills us while the full transition to a true peer-to-peer model takes place.

If the web is to truly stay open and interoperable, we need to think bigger and better than just which big company (or companies) we want to hand our identities to.

Just like every site on the web today can have its own web server, every site should also have the choice to host (or pick) its own social server. Every site should become a fully featured peer on the social web. There is no reason why CNN can not be just as functional, powerful, effective and interchangeable as Facebook.com.

If we don't, we will be stuck with the IIS, IE and Netscapes of the social web and innovation will die.

Missed opportunities in Publishing

Added on by Chris Saad.

MG Siegler over on Techcrunch yesterday wrote a story about how the AP is tweeting links to its stories. Those links, however, are not to its website. Instead those Twitter links lead to Facebook copies of their stories! Here's a snippet of his post:

The AP is using their Twitter feed to tweet out their stories — nothing new there, obviously — but every single one of them links to the story on their Facebook Notes page. It’s not clear how long they’ve been doing this, but Search Engine Land’s Danny Sullivan noted the oddness of this, and how annoying it is, tonight. The AP obviously has a ton of media partners, and they could easily link to any of those, or even the story hosted on their own site. But no, instead they’re copying all these stories to their Facebook page and linking there for no apparent reason.

As Sullivan notes in a follow-up tweet, “i really miss when people had web sites they owned and pointed at. why lease your soul to facebook. or buzz. or whatever. master your domain.”

What’s really odd about this is the AP’s recent scuffle with Google over the hosting of AP content. The two sides appeared to reach some sort of deal earlier this month (after months of threats and actual pulled content), but now the AP is just hosting all this content on Facebook for the hell of it?

To me this isn't unusual at all. In fact it's common practice amongst 'social media experts'. Many of us use/used tools like FriendFeed, Buzz, Facebook etc not just to share links, but to actually host original content. We actively send all our traffic to these sites rather than using them as draws that bring readers back to our own open blog/publishing platforms.

I completely agree with MG. Sending your audience to a closed destination site which provides you no brand control, monetization or cross-sell capability shows a profound misunderstanding of the economics of publishing.

Some will argue that the content should find the audience, and they should be free to read it wherever they like. Sure, I won't disagree with that, but actively generating it in a non-monetizable place and actively sending people there seems like a missed opportunity to me. Why not generate it on your blog and then simply share the links in other places? If those users choose to chat over there, that's fine, but the first, best place to view the content and observe the conversation should always be at the source, at YOUR source. YOUR site.

Some will argue that those platforms generate more engagement than a regular blog/site. They generate engagement because your blog is not looked after. You're using inferior plugins and have not taken the time to consider how your blog can become a first class social platform. You're willing to use tools that cannibalize your audience rather than attract them. You're willing to use your blog as a traffic funnel back to other destination sites by replacing big chunks of it with FriendFeed streams rather than hosting your own LifeStream like Louis Gray and Leo Laporte have done.

Some will argue (or not, because they don't realize or don't want to say it out loud) that they are not journalists, they are personalities, and they go wherever their audience is. They don't monetize their content, they monetize the fact that they HAVE an audience by getting paying jobs that enable them to evangelize through any channel that they choose. Those people (and there are very few of them) have less incentive to consolidate their content sources (although there are still reasons to do so). Unfortunately, though, media properties sometimes get confused and think they can do the same thing.

The list of reasons why publishing stuff on Buzz or FriendFeed or Facebook as a source rather than an aggregator goes on and on, so I will just stop here.

I'm glad MG has picked up on it and written about it on Techcrunch.

#blogsareback

Update: Steve Rubel is agreeing with the AP's approach. Using all sorts of fancy words like Attention Spirals, Curating and Relationships, Steve is justifying the AP's ritual suicide of its destination site in favor of adding value, engagement and traffic to Facebook. Sorry Steve, but giving Facebook all your content and your traffic and not getting anything in return is called giving away the house.

Again, I'm not advocating that you lock content away behind paywalls, I'm simply saying that you need to own the source and make your site a first-class citizen on the social web. Not make Facebook the only game in town by handing it your audience.

Google Buzz = FriendFeed Reborn

Added on by Chris Saad.

FriendFeed was dead, now it is re-born as Google Buzz. I've not been able to try the product yet, but philosophically and architecturally it seems superior to FriendFeed.

Here are my observations so far:

Consumption Tools

Buzz is better than FriendFeed because Google is treating it as a consumption tool rather than a destination site (by placing it in Gmail rather than hosting it on a public page). FriendFeed should have always been treated this way. Some people got confused and started hosting public discussions on FriendFeed.

That being said, though, I've long said that news and sharing is not the same as an email inbox and those sorts of items should not be 'marked as read' but rather stream by in an ambient way.

While Buzz is in fact a stream, it is its own tab that you have to focus on rather than a sidebar you can ignore (at least as far as I can tell right now).

How it affects Publishers (and Echo)

The inevitable question of 'How does this affect Echo' has already come up on Twitter. Like FriendFeed before it, Buzz generates siloed conversations that do not get hosted at the source.

So, the publisher spends the time and money to create the content and Buzz/Google get the engagement/monetization inside Gmail.

For some reason, all these aggregators think that they need to create content to be of value. I disagree. I long for a pure aggregator that does not generate any of its own content such as comments, likes, shares etc.

That being said, however, the more places we have to engage with content the more reasons there are for Echo to exist so that publishers can re-assemble all that conversation and engagement back on their sites.

Synaptic Connections

Note that they don't have a 'Follow' button - it's using synaptic connections to determine who you care about. Very cool! I worry though that there might not be enough controls for the user to override the assumptions.

Open Standards

Already, Marshall is calling it the savior of open standards. I don't think Open Standards need to be saved - but they certainly have all the buzzwords on their site so that's promising.

That's it for now, maybe more later when I've had a chance to play with it.

Update: After playing with it this morning, and reading a little more, it's clear that this is actually Jaiku reborn (not FriendFeed), because the Jaiku team were involved in building it. They deserve a lot of credit for inventing much of this stuff in the first place - long before FriendFeed.

Also, having used it only for an hour, the unread count on the Buzz tab is driving me nuts. It shouldn't be there. It's a stream not an inbox. Also it makes no sense why I can't display buzz in a sidebar on the right side of my primary Gmail inbox view. That would be ideal.

It's also funny to me that some people have tried to give Chris Messina credit for Buzz even though he's been at Google for no more than a month. They clearly don't understand how long and hard it is to build product. Messina is good, but he ain't that good :)

Facebook and the future of News

Added on by Chris Saad.

Marshall Kirkpatrick has written a thoughtful piece over on Read/Write Web entitled 'Facebook and the future of Free Thought' in which he explains the hard facts about news consumption and the open subscription models that were supposed to create a more open playing field for niche voices. In it, he states that news consumption has barely changed in the last 10 years. RSS and Feed Readers drive very little traffic and most people still get their news from hand selected mainstream portals and destination sites (like MSN News and Yahoo news etc). In other words, mainstream users do not curate and consume niche subscriptions and are quite content to read what the mainstream sites feed them.

This is troubling news (pun intended) for those of us who believe that the democratization of publishing might open up the world to niche voices and personalized story-telling.

Marshall goes on to argue that Facebook might be our last hope - that since everyone already spends all their time in Facebook, the service has an opportunity to popularize the notion of subscribing to news sources and thereby bring to life our collective vision of personalized news for the mainstream. Facebook already does a great deal of this, with users getting large amounts of news and links from their friends as they share and comment on links.

Through my work with APML I have long dreamed of a world where users are able to view information through a highly personalized lens - a lens that allows them to see personally relevant news instead of just popular news (note that Popularity is a factor of personal relevancy, but it is not the only factor). That doesn't mean the news would be skewed to one persuasion (liberal or conservative for example) but rather to a specific topic or theme.
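To make the point that popularity is only one factor in personal relevancy, here is a toy scoring function. Everything in it (the weights, the field names, the blend) is my own hypothetical illustration, not APML or any real product:

```python
# Toy illustration: personal relevancy blends topic match with global
# popularity, so a niche story can outrank a popular one for a given
# reader. Weights and fields are hypothetical.

def relevancy(item, interests, popularity_weight=0.3, topic_weight=0.7):
    topics = item["topics"]
    # Fraction of the item's topics that match this reader's interests.
    match = len(set(topics) & set(interests)) / len(topics) if topics else 0.0
    # Blend personal topic match with global popularity (both 0..1).
    return topic_weight * match + popularity_weight * item["popularity"]

interests = {"open-standards", "dataportability"}
stories = [
    {"title": "Celebrity gossip", "topics": ["gossip"], "popularity": 0.9},
    {"title": "New OAuth draft", "topics": ["open-standards"], "popularity": 0.2},
]
ranked = sorted(stories, key=lambda s: relevancy(s, interests), reverse=True)
# For this reader, the niche story outranks the more popular one.
```

The lens stays topical rather than ideological: the weighting is over subjects the reader cares about, not over persuasions.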

Could Facebook popularize personalized news? Should it? Do we really want a closed platform to dictate how the transports, formats and tools of next generation story-telling get built? If so, would we simply be moving the top-down command and control systems of network television and big media to another closed platform with its own limitations and restrictions?

Personalized news on closed platforms is almost as bad as mainstream news on closed platforms. News organizations and small niche publishers both need a way to reach their audience using open technologies, or we are doomed to repeat the homogenized news environment of the last two decades. The one that failed to protect us from a war in Iraq, failed to innovate when it came to on-demand, and failed to allow each of us to customize and personalize our own news reading tools.

That's why technologies like RSS/Atom, PubSubHubbub and others are so important.

What's missing now is a presentation tool that makes these technologies sing for the mainstream.

So far, as an industry, we've failed to deliver on this promise. I don't have the answers for how we might succeed. But succeed we must.

Perhaps established tier 1 media sites have a role to play. Perhaps market forces that are driving them to cut costs and innovate will drive these properties to turn from purely creating mainstream news editorially toward a model where they curate and surface contributions from their readership and the wider web.

In other words, Tier 1 publishers are being transformed from content creators to content curators - and this could change the game.

In the race to open up and leverage social and real-time technologies, these media organizations are actually making way for the most effective democratization of niche news yet.

Niche, personalized news distributed by open news hubs born from the 'ashes' of old media.

Don't like the tools one hub gives you? Switch to another. The brands we all know and love have an opportunity to become powerful players in the news aggregation and consumption game. Will they respond in time?

Due to my experience working with Tier 1 publishers for Echo, I have high hopes for many of them to learn and adapt. But much more work still remains.

Learn more about how news organizations are practically turning into personalized news curation hubs over on the Echo Blog.

A call for focus from the open standards community

Added on by Chris Saad.
Time to refocus the open community

Over on the Open Web Foundation mailing list, Eran Hammer-Lahav (whom, despite his gruff and disagreeable personality, I respect greatly for his work in the development of open standards) is effectively calling for a complete shakeup of the foundation and of the work being poured into the 'common ground' of the standards efforts.

Let me define the 'Common Ground' as I see it.

Building strong common ground is like building strong open standards deep into the stack. Just like a software stack, our community needs a stack of organizations that are loosely coupled and open to participation. Groups like the W3C and IETF provide a rock solid core, more agile groups focused on specific standards like OpenID and OAuth are in the middle, and a project like the DataPortability project was supposed to be on top - a kind of user interface layer.

You see, good standards efforts are necessarily projects that work to solve one small problem well. The problems are often deep technical challenges that attract passionate and, let's face it, geeky people to hack, debate and decide on details that don't hit the radar for 99.9% of the population.

The problem, of course, is that the rest of the world has to care for a standard to matter.

Leaders and project managers need to be found, real companies need to get involved (not just their staff), collaboration platforms need to facilitate real and open discussion, calls for collaboration need to be heard, specs need to be written (and written well), libraries need to be written, governance needs to be put in place and so on.

Also, once the standard is (half) baked, less involved hackers need to participate to test the theories in the real world. Less savvy developers need to hear about the standard and understand it. Business people need to understand the value of using a standard over a proprietary solution. They also need IP protections in place to ensure that by using the standard they are not putting their company at risk. Marketing people need to know how to sell it to their customer base. Customers need to know how to look for and choose open solutions to create a marketplace that rewards openness.

All of this is 'Common Ground'. It is common to any standards effort, and there should - no, must - be an organization that is just as lean, mean and aggressive as Facebook in place to provide these resources if we are ever going to compete with closed solutions.

At the start of 2008 the DataPortability project became very popular. Its goal was not to build standards, but rather to promote them. To provide much of the common ground that I described above.

The DP project's particular mission, in my mind at least, was to focus on the marketing effort. To build a massive spotlight and to shine that intense light on the people, organizations and standards that were getting the job done.

Is the OWF providing a generic legal/IPR framework? Fantastic! It was the DPP's job to let everyone know - developers, business execs, media, potential editors, contributors and more. Our job was not, and should never be, to start the framework itself, but rather to advocate for, provide context around and promote the hell out of someone else's effort to do so.

Is a conference happening next year? Excellent. It was the DPP's job to get in touch with the conference organizer, organize not just a DP panel, but a DP Track, and to create room (and perhaps even a narrative) inside which the people doing the actual work can speak.

Has Facebook just announced a new feature that could have been achieved through a combination of existing open standards? Then it is the DPP's job to consult with each of those standards groups and create a cohesive response/set of quotes for the media to use.

What is the relationship between Facebook Platform, OpenSocial, Open Standards, OpenID, OAuth, Portable Contacts and Twitter's 'Open API'? DataPortability.org should have the answer neatly described on its website.

Unfortunately, though, many in the standards community chose to fight the creation of the project for whatever reasons crossed their mind at the time. They used all sorts of methods to undermine the effort. Some that would put Fox News to shame.

The result, of course, has been a diversion from the important work of providing this common ground to the standards community into a self-protective state of creating governance and our own 'deliverables' in order to justify and protect our own existence.

I have, as a result of a series of unfortunate events, fallen out of touch with the Steering group at the DPP. Moving to the US, getting disillusioned with the community I admired (not those involved with the DPP; my friends on the DPP Steering group have always performed admirably and worked extremely hard) and ultimately shifting my world view to realize that the best contribution I can make - the best way to really move the needle - is to ship Data Portability compliant software at scale.

At this juncture, however, I think it's time for us all to refocus on our original mission for the DataPortability Project.

To restate my humble view on the matter:

  • To provide a website that explains data portability to various audiences in neat and concise ways. It is the onramp for the standards community. You should be able to send anyone to 'dataportability.org' and they 'get it' and know what to do next.
  • To provide context and advocacy on news and development from inside and outside the standards community so that media, execs and less involved developers can understand and react
  • To build a community of interested parties so that they can swarm to the aid of standards groups or the standards effort in general.
  • To act as a market force to (yes I'm going to say it) pick winners. To highlight what works, what doesn't and what should be done next to move the whole effort forward. Nothing is as powerful as removing confusion and planting a big red flag on the answer.
  • To recognize that we have the authority to do whatever we want to do because we are an independent, private group who has chosen to create public/transparent processes. We need to believe in ourselves. If we do good work, then people will listen. If we don't, then they can listen to someone else.

This necessarily means that the only real deliverable from the project would be a small set of communication tools that build community, context and advocacy around what we believe is the 'truth' (or at least things worth paying attention to) in the broader standards community.

Many have scoffed at these goals in the past, claiming there was no 'value'. In my book this set of goals is not only very worthy, it is increasingly critical to the success and health of the web.

Redefining Open

Added on by Chris Saad.

In my mind, there are four kinds of open.

  • Torvalds Open.
  • Zuckerberg Open.
  • Not Open but we use the word Open anyway.
  • Saad Open.

This fragmentation has diluted the word open to the point where it almost has no value.

It's time to re-define the word open. First let me explain each category.

Torvalds Open.

In Linus Torvalds' world (the guy who created Linux), Open means that the software is developed through a community process. The source code is visible and modifiable by anyone and is available for free.

This is called 'Open Source'.

Companies may package and bundle the software in new and novel ways, and provide support and services on top for a fee.

The problem with Open Source on the web is that the software itself has less value than the network effects and up-time provided by a branded, hosted experience. Running Twitter.com on open source software, for example, would have very little value because Twitter's lock-in is not their software, but rather their name space (@chrissaad) and their developer ecosystem all developing software with dependencies on their proprietary API.

Open Source is useful, interesting and important, but is not what I mean when I talk about the Open Web. I feel like its value is well understood, and it is no longer the first, best way of making our world (and the Internet) a better place - at least not in the way it was when client-side software was the primary way we used computers.

Zuckerberg Open.

When Mark Zuckerberg talks about open, he is not talking about Technology. He is talking about human interactions.

Ever since the popularity of Data Portability (via the DataPortability project) Facebook has gone to great lengths to redefine the word Open to mean the way people interact with each other.

In doing so, they have managed to, in large part, co-opt the word and claim their platform makes people 'more open'.

In many respects, and by their definition, they are right. Facebook has encouraged a mind-bending number of people to connect and share with each other in ways that had previously been reserved for bloggers and other social media 'experts'.

Facebook deserves a lot of credit for introducing social networking to the masses.

Their definition of Open, however important, is not the kind I'm talking about either.

Not Open but we use the word Open anyway.

This is when a platform or product has an API, and its makers therefore claim they have an 'Open Platform'.

There's nothing open about having an API. It's just having an API. The platform could be closed or open depending on how the given application and API are built and what limitations are placed upon them.

In most cases, an 'Open Platform' is not actually open, it's just a platform.

Saad Open

My definition of open is very specific. In fact a better way to describe it would be Interoperable and Distributed.

To explain, let me provide some compare and contrast examples.

Twitter is closed because it owns a proprietary namespace (e.g. @chrissaad). The only way to address people is using their username system. They own those usernames and have final authority over what to do with them.

They are closed because they do not provide free and clear access to their data: access is rate limited unless you cut a deal for improved quality of service.

They are also closed because they are not a federated system. You cannot start your own Twitter-style tool and communicate with users on Twitter, or vice versa. The only way to message people on Twitter is to use Twitter's proprietary APIs for submitting and retrieving data.

A proprietary API is an API that is special to a company and/or produces data that is not in an open standard.

Wordpress, on the other hand (and to contrast), is an open system. Let's compare point for point.

It does not own the namespace on which it is developed. The namespaces are standard URLs. This blog, for example, is hosted at blog.areyoupayingattention.com. Wordpress does not own that domain.

Wordpress produces a single type of data - blog posts. Those blog posts are accessible using an open standard - RSS or Atom. There is no rate limit on accessing that data.

Wordpress is a federated system. While they provide a hosted solution at Wordpress.com for convenience, there is nothing stopping me from switching to Blogger or Tumblr. The tools you use to consume my blog would remain unchanged, and the programmers who make those tools would not need to program defensively against Wordpress' API. They simply need to be given the URL of my RSS feed and they are good to go.
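This is the whole point of interoperability: a feed reader needs nothing but the feed URL and a standard XML parser. A minimal sketch in Python, using only the standard library (the feed content and URLs below are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document, as any blog host (Wordpress, Blogger,
# Tumblr) would serve it. In practice you would fetch this XML from
# the feed URL; the document itself is the only contract.
RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Are You Paying Attention</title>
    <item><title>Redefining Open</title><link>http://example.com/open</link></item>
    <item><title>Twitter Lists and Tags</title><link>http://example.com/lists</link></item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Parse an RSS feed into (title, link) pairs -- no vendor API required."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

posts = read_feed(RSS_SAMPLE)
print(posts[0][0])  # prints "Redefining Open"
```

Swap the hosted blog for Blogger or Tumblr and this code does not change; only the URL you fetch the XML from does.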

This makes Wordpress an open tool in the open blogosphere.

Blogging is open.

Microblogging should be open too.

To summarize: Open, in my definition, does not mean the software is open source or free. It means that the software receives open standards data, provides open standards data, has an interoperable API and can easily be switched out for other software.

Today I was challenged on Twitter that Echo is not 'Open' because it is proprietary code and costs money to use.

This person does not understand my definition of Open. Echo is open because it is not a destination site, it sits on any site anywhere. The owner of that site can take it off and replace it with another engagement tool at any time. The data being absorbed by Echo, for the most part, is RSS or Atom, and the data coming out of Echo is RSS.

It does not have any proprietary namespaces (except our useless legacy login system which we are trying to get rid of as quickly as possible) and does not pretend to create some amazing social network of its own. It is just a tool to communicate on the open, social web.

Is Echo perfect? No, of course not, but our intention is to make each and every aspect of the product as interoperable and distributed as possible. We will even use and contribute to open source where appropriate.

How does your product, or the tools you choose, compare? Tell me in the comments.

Next up, we should start to redefine the 'Open' community that creates open standards. Much of it is not very open.

Merry Christmas - The power of memes

Added on by Chris Saad.

Many, many of the things in our lives could be called 'Memes'.  Here's what happens when you type 'Define:meme' into Google.

Memes are everywhere. We just experienced a countrywide meme here in the US called 'Thanksgiving'. We are about to hit a similar meme (except this one is global) called 'Christmas'.

Memes are fascinating things. They are almost as important as Context, Perspective and Metaphors. Together these four things compose the great majority of our thought processes.

What is this like (metaphor), What else is going on (context), What does everyone else think (meme), What does my experience and current state of mind tell me (Perspective).

Some memes emerge organically over time - like folding the end of hotel toilet paper into a little triangle. Others are created through brute force by strategic construction and repetition. No one has mastered this better than the extreme right wing of the US political system. Fox News is a bright, shining example of how to craft, seed, propagate and manipulate a meme.

Silicon Valley loves a meme. We live on them. In fact one could argue that the whole ecosystem would shut down without the meme of the day, week and bubble.

.Com, Web 2.0, Data Portability, Real-time web, RSS is dead, Myspace, Facebook, Twitter, Cloud, Semantic Web, Synaptic Web and so on and so forth.

Like in real life, some of these memes emerge organically, some through brute force. Some make more sense than others. Some of these memes get undue attention. Some are created to stir controversy. Others form organically to create a shorthand. Some are genuine cultural shifts that have been observed and documented.

These memes matter. They matter a lot. They dictate a large part of how people act, what they pay attention to and their assumptions about the world in which they live, and the people they encounter. In Silicon Valley they dictate who gets heard and which projects get funded. They form the basis of many of our decisions.

Some services like Techmeme do a very good job at capturing daily memes. I've yet to see a service that captures memes that span weeks, months, years or even decades though. I dream of such a service. Particularly one focused on news memes.

Imagine being able to zoom in and out of the news, and drag the timeline back and forth like some kind of Google Maps for headlines. Imagine being able to read about an IED explosion in Baghdad and quickly understand its context in the decade-long struggle for the entire region through some kind of clustered headline/topic view.

Consider the context, perspective and metaphoric power such a tool would give us. How could it change our world view and help turn the temporary, vacuous nature of a microblog update into something far more substantial and impactful, with an inline summary of the rich historic narrative inside which it belongs?

The algorithm to create such correlations, and the user interface to present them, would challenge even the smartest mathematicians and user interaction designers, I imagine. Its commercial value is vague at best. It probably shouldn't be attached to a business at all - maybe it should be some kind of Wikipedia-style gift to the world.
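To make the idea slightly more concrete, here is a deliberately naive sketch of the clustering step: grouping dated headlines into named topic threads along a timeline. The headlines and keyword lists are invented, and a real system would need genuine topic modeling rather than keyword matching:

```python
# Hypothetical dated headlines; keyword matching below is a crude
# stand-in for the real clustering algorithm imagined in the post.
headlines = [
    (2003, "Invasion of Iraq begins"),
    (2006, "Sectarian violence rises in Iraq"),
    (2010, "IED explosion in Baghdad kills dozens"),
    (2008, "Global financial crisis deepens"),
]

def topic_timeline(items, topics):
    """Group headlines under the first matching topic, sorted by year --
    a clustered, zoomed-out headline view in miniature."""
    timeline = {name: [] for name in topics}
    for year, text in sorted(items):
        low = text.lower()
        for name, keywords in topics.items():
            if any(k in low for k in keywords):
                timeline[name].append((year, text))
                break
    return timeline

tl = topic_timeline(headlines,
                    {"Iraq war": ["iraq", "baghdad"], "Economy": ["financial"]})
print([year for year, _ in tl["Iraq war"]])  # [2003, 2006, 2010]
```

Even this toy version shows the payoff: the 2010 Baghdad headline lands inside a thread that stretches back to 2003, which is exactly the historic context a lone update lacks.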

Maybe the news media (Reuters, CNN and the Washington Post) might take it upon themselves to sponsor such a project in an effort to re-contextualize their news archives in the new AAADD, real-time, now, now, now, everyone-is-a-journalist media world.

I've bought some domains and done some mockups of such a service, but I probably would never have the time or the patience to build it - at least not in the foreseeable future.

Maybe I'm just dreaming. But I think it's a good dream!

Twitter Lists and Tags

Added on by Chris Saad.

In my previous post (written 5 minutes ago) I talk about Twitter Lists in relation to shared namespaces (hint: they are not in a shared namespace). Another under-reported fact, however, is that lists are also tags. They are a great way for Twitter to learn how its users are perceived and grouped. (As a side note, they are also a great way for people to see how others perceive them - one of my favorite lists I'm included in: @chadcat/unreasonably-talented haha.)

One could easily imagine an algorithm that derives accurate APML data about each user, not just by looking at their Tweet history, but also by checking their Bio and the Tweet histories/Bios of the people they are listed with. The list name itself, in fact, is a very concentrated form of topic/tag data.

Do lists double as Twitter's user tagging feature?

Who will be the first to ship an automated user discovery directory based on analyzing the relationship between users who are on the same lists?
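As a sketch of both ideas - list slugs as tags, and co-listing as a discovery signal - here is some illustrative Python. All usernames and list slugs below are invented, and real input would come from Twitter's lists API:

```python
from collections import Counter
from itertools import combinations

# Hypothetical data: list slug -> set of members.
lists = {
    "chadcat/unreasonably-talented": {"chrissaad", "loic"},
    "dev/realtime-web": {"chrissaad", "alice", "bob"},
    "dev/open-standards": {"chrissaad", "alice"},
}

# Lists as tags: the slug's topic part becomes a weighted tag on each
# member -- a crude stand-in for APML-style attention data.
tags = Counter()
for slug, members in lists.items():
    topic = slug.split("/", 1)[1]
    for user in members:
        tags[(user, topic)] += 1

# Co-listing as a discovery signal: users who share lists are likely related.
colisted = Counter()
for members in lists.values():
    for a, b in combinations(sorted(members), 2):
        colisted[(a, b)] += 1

print(tags[("chrissaad", "realtime-web")])  # prints 1
print(colisted[("alice", "chrissaad")])     # prints 2
```

Ranking `colisted` pairs by count is already a minimal "people you might want to follow" directory; the tag counts feed a per-user interest profile.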

I hope MrTweet is already working on this!

Twitter Lists and Namespaces

Added on by Chris Saad.

A very important fact that seems to be getting little to no coverage at the moment about Twitter Lists is the issue of namespaces. Twitter's number one asset is its control and allocation of namespaces. Those little things we call 'Usernames'. @chrissaad is not just my Twitter Name, it is a short form addressable identity that concretely links to my Twitter inbox any time someone uses it in a Tweet.

Addressable, convenient namespaces that can be used in a sentence like this are so interesting and important that Facebook went to great lengths to copy them. Nothing on the open web has yet come close to this simplicity and effectiveness. Which is not to say there won't be an alternative soon.

The important fact with Twitter usernames, though, is that they are unique. There is a finite and shared 'space' in which 'names' can be allocated.

The result is that early adopters end up with all the best names and squatters rush to lock up all the best phrases. Latecomers to the system end up with names like chris2423.

Twitter Lists, however, are different. They include the list creator's username. For example my JS-Kit list is "@ChrisSaad/jskit".

As you can see, the list 'jskit' is attached to my username. This means that each user has their own namespace.

The result: there can't be a land rush for list names, because the list naming convention sits on top of the username. It also means that no one can own the definitive list on a subject, because each list is subjective.

This is an important design decision for Twitter. One that has both pros and cons for the community. Overall, however, I think the decision was a correct one. Lists can rise and fall organically (or at least based on the influence and popularity of their creators) without the pain and pressure (for Twitter) of maintaining yet another shared namespace.

Twitter's username namespace, however, is rife with all sorts of headaches waiting to happen. I don't envy their position, and I can't wait for an open alternative.