FriendFeedHolic – A Social Media Ranking Model for Advertising and Marketing Success

One of the most challenging things in social media is finding the conversation leaders: the people who drive the conversation and create a community.
FriendFeedHolic (ffholic) has taken the base knowledge that exists in FriendFeed and added a ranking mechanism to it, based on input and output. In fact, it weights participation in the FriendFeed community more heavily than participation in other communities.
This is important. Although FriendFeedHolic is separate from FriendFeed, it has found a way to isolate and target the users who are most likely to participate and create conversations. These users, be they Scoble or Mona N, are where advertisers and marketers can target their money.
How would they do this?
Think about it. If someone who is a prolific commenter or conversation-creator on FriendFeed creates new content, they are assigned a higher ranking in the new conversation-driven ad-discovery model that advertisers will have to create to succeed.
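The weighted-participation idea above can be pictured as a simple score. This is purely illustrative: the weights, field names, and function here are my own invention, not FriendFeedHolic's actual (unpublished) formula.

```python
# Hypothetical sketch of a participation ranking that weights native
# FriendFeed activity more heavily than activity imported from other
# communities. All weights and names are invented for illustration.

def participation_score(ff_comments, ff_likes, external_posts,
                        w_comment=3.0, w_like=1.0, w_external=0.5):
    """Score a user: comments and likes made inside FriendFeed count
    more than content merely imported from elsewhere."""
    return (w_comment * ff_comments
            + w_like * ff_likes
            + w_external * external_posts)
```

Under this toy scheme, a user with 10 comments and 5 likes on FriendFeed but only 20 imported posts would outrank someone with 40 imported posts and no native activity.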
This new targeted advertising logic will be forced to discover:

  • The content of the conversation
  • The context of the conversation
  • The tone of the conversation
  • The participants in the conversation

This model will be able to identify when a conversation is inward-facing, involving mostly super-users, or when it engages a wide spectrum of people.
Conversations among super-users will lead to more passive advertising being shown, as that is a spectator event, with only a few participants.
Conversations created by super-users, or that involve super-users, but that draw higher participation from the general community, will get more intelligent attention to ensure that the marketing messages and advertising shown fit the four criteria above.
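One way to picture the inward-facing versus wide-spectrum distinction is a toy classifier over a conversation's commenters. The threshold and the notion of a predefined super-user set are assumptions for illustration only, not part of any real FriendFeed or FriendFeedHolic API.

```python
# Illustrative sketch: label a conversation by how much of it is
# carried by super-users. The 0.6 threshold is an invented example.

def classify_conversation(comment_authors, super_users, threshold=0.6):
    """Return 'inward-facing' when super-users dominate the thread,
    'broad' when the general community carries most of the discussion."""
    if not comment_authors:
        return "broad"
    super_count = sum(1 for a in comment_authors if a in super_users)
    ratio = super_count / len(comment_authors)
    return "inward-facing" if ratio >= threshold else "broad"
```

A thread where two of three commenters are super-users would be tagged inward-facing (a spectator event, per the argument above), while a thread where one super-user sparks twenty general-community replies would be tagged broad.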
In this new model, advertisers will have to see that they can’t simply slap a set of ads up on the popular kids web sites. They will have to understand who leads a community, who generates buzz, and who can engage the most people on a regular basis.
In this model, the leader has far less power than the community that they create. And maintain.

Advertising to the Community: Is PageRank a Good Model for Social Media?

In previous posts about advertising and marketing to the new social media world [here and here], I postulated that it is very difficult to assign a value to a stream of comments, a community of followers, or a conversation.
As always, Google seems (to think) it has the answer. BusinessWeek reports on the vague concept of PageRank for the People [here]. Matt Rhodes agrees with this idea, arguing that advertising will become more and more focused on the community, rather than on the content.
Where the real value in this discussion lies is in targeting the advertising to be relevant to the conversation. It’s not just matching the content. It’s all about making the advertising relevant to the context.
Is the tone of the conversation about the brand positive or negative? I like to point out that my articles about Gutter Helmet create a content match in the AdSense logic, which drives that product to be advertised alongside them. What is lost in the AdSense logic is that I am describing my extremely negative experience with Gutter Helmet.
Shouldn’t the competitors of Gutter Helmet be able to take advantage of this, based on the context of the article? Shouldn’t Gutter Helmet be trying to respond to these negative posts by monitoring the conversation and actively trying to turn a bad customer experience into a positive long-term relationship?
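The Gutter Helmet example can be reduced to a toy rule: match on content, but let tone decide who gets the placement. This is an illustrative sketch only; real systems like AdSense are vastly more sophisticated, and the word list, brand names, and logic here are assumptions.

```python
# Toy tone-aware ad selection. A content match alone is not enough:
# a negative-tone page is an opportunity for the competitor, not the
# brand being complained about. Word lists and names are invented.

NEGATIVE_WORDS = {"broken", "terrible", "negative", "complaint", "refund"}

def pick_ad(article_words, brand, competitor_brand):
    """Return the brand to advertise, the competitor on negative-tone
    pages, or None when there is no content match at all."""
    words = {w.lower() for w in article_words}
    if brand.lower() not in words:
        return None                      # no content match
    if words & NEGATIVE_WORDS:
        return competitor_brand          # negative tone: competitor wins
    return brand                         # positive/neutral tone: brand ad
```

Under this rule, a glowing review of the hypothetical brand "Acme" shows an Acme ad, while a complaint mentioning Acme shows its rival's ad instead: exactly the opportunity the paragraph above argues competitors should have.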
Conversation and community marketing is a far more complex problem than a modified PageRank algorithm can solve. It is not simply about the number of connections or the level of engagement. In the end, it is about ensuring that advertisers can target their shrinking marketing dollars at the conversations that matter most.
Injecting irrelevant content into conversation is not the way to succeed in this new approach. Being an active participant in the conversation is the key.
In effect, the old model, based on reaching the most eyeballs at the lowest cost, is failing. A BuzzLogic-style model that examines conversations and encourages firms to intelligently and actively engage in them is the one that will win.
The road to success is based on engagement, not eyeballs.

The Dog and The Toolbox: Using Web Performance Services Effectively

The Dog and The Toolbox

One day, a dog stumbled upon a toolbox left on the floor. There was a note on it, left by his master, which he couldn’t read. He was only a dog, after all.
He sniffed it. It wasn’t food. It wasn’t a new chew toy. So, being a good dog, he walked off and lay on his mat, and had a nap.
When the master returned home that night, the dog was happy and excited to see him. He greeted his master with joy, and brought along his favorite toy to play with.
He was greeted with yelling and anger and “bad dog”. He was confused. What had he done to displease his master? Why did the master keep yelling at him, and pointing at the toolbox? He had been good and left it alone. He knew that it wasn’t his.
With his limited understanding of human language, he heard the words “fix”, “dishwasher”, and “bad dog”. He knew that the dishwasher was the yummy cupboard that all of the dinner plates went into, and came out less yummy and smelling funny.
He also knew that the cupboard had made a very loud sound that had scared him two nights ago, and had then spilled yucky water on the floor. He had barked to wake his master, who came down, yelling at the dog, then yelling at the machine.
But what did fix mean? And why was the master pointing at the toolbox?

The Toolbox and Web Performance

Far too often, I encounter companies that have purchased a Web performance service that they believe will fix their problems. They then pass the day-to-day management of this information on to a team that is already overwhelmed with data.
What is this team supposed to do with this data? What does it mean? Who is going to use it? Does it make my life easier?
When it comes time to renew the Web performance services, the company feels cheated. They end up yelling at the service company that sold them this useless thing, or at their own internal staff for not using the tool.
To an overwhelmed IT team, Web performance tools are another toolbox on the floor. They know it’s there. It’s interesting. It might be useful. But it makes no sense to them, and is not part of what they do.
Giving your dog the toolbox does not fix your dishwasher. Giving an IT team yet another tool does not improve the performance of a Web site.
Only in the hands of a skilled and trained team does the Web performance of a site improve, or the dishwasher get fixed. As I have said before, a tool is just a tool. The question that all organizations must face is what they want from their Web performance services.
Has your organization set a Web performance goal? How do you plan to achieve your goals? How will you measure success? Does everyone understand what the goal is?
After you know the answers to those questions, you will know that, as amazing as he is, your dog will never be able to fix your dishwasher.
But now you know who can.

Managing Web Performance: A Hammer is a Hammer

Give almost any human being a hammer, and they will know what to do with it. Modern city dwellers, ancient jungle tribes, and most primates would all look at a hammer and understand instinctively what it does. They would know it is a tool to hit other things with. They may not grasp some of the subtleties, such as that it is designed to drive nails into other things, not to beat other creatures into submission, but they would know that this is a tool that is a step up from the rock or the tree branch.
Simple tools produce simple results. This is the foundation of a substantial portion of the Software-as-a-Service (SaaS) model. SaaS is a model which allows companies to provide a simple tool in a simple way to lower the cost of the service to everyone.
Web performance data is not simple. Gathering the appropriate data can be as complex as the Web site being measured. The design and infrastructure that supports a SaaS site is usually far more complex than the service it presents to the customer. A service that captures the complexity of your site will likely not provide data that is easy to digest and turn into useful information.
As any organization that has purchased a Web performance measurement service, a monitoring tool, or a corporate dashboard expecting instant solutions will tell you, there are no easy solutions. These tools are the hammer, and just having a hammer does not mean you can build a house or craft fine furniture.
In my experience, there are very few organizations that can craft a deep understanding of their own Web performance from the tools they have at their fingertips. And the Web performance data they collect about their own site is about as useful to them as a hammer is to a snake.

Web Performance and Advertising: Latency Kills

One of the ongoing themes is the way that slow or degrading response times can have a negative effect on how a brand is perceived. This is especially true when you start placing third-party content on your site. Jake Swearingen, in an article at VentureBeat, discusses the buzz currently running through the advertising world that Right Media is suffering from increasing latency, a state that is being noticed by its customers.
In the end, the trials and tribulations of a single ad-delivery network are not relevant to world peace and the end of disease. However, the performance of an advertising platform has an effect on the brands that host the ads on their sites, and on the brand of the ad platform itself. And in a world where there are many players fighting for second place, it is not good to have a reputation for being slow.
The key differentiators between advertising networks fighting for revenue are not always the number of impressions or the degree to which they have penetrated a particular community. An ad network is far more palatable to visitors when it can deliver advertising to a visitor without affecting or delaying the ability to see the content they originally came for.
If a page is slow, the first response is to blame the site, the brand, the company. However, if it is clear that the last things to load on the page are the ads, then the angst and anger turn toward those parts of the page. And if visitors see ads as inhibitors to their Web experience, the ad space on a page is more likely to be ignored or seen as intrusive.

Welcome Back!

If you can see this post, the DNS system has finally propagated my new host information out to the Web, and you have reached me at the new server, located at BlueHost.
After my LinkedIn request last night, I got two separate recommendations for BlueHost, both from folks I highly respect.
Let me know what you think.

Web Performance: Managing Web Performance Improvement

When starting with new clients, finding the low-hanging fruit of Web performance is often the simplest thing that can be done. By recommending a few simple configuration changes, these early stage clients can often reap substantial Web performance improvement gains.
The harder problem is that organizations struggle to build on these early wins and create an ongoing culture of Web performance improvement. Stripping away the simple fixes often exposes deeper, more basic problems that may have nothing to do with technology. In some cases, there is no Web performance improvement process simply because of the pressure and resource constraints the organization faces.
In other cases, a deeper, more profound distrust between the IT and Business sides of the organization leads to a culture of conflict, a culture where it is almost impossible to help a company evolve and develop more advanced ways of examining the Web performance improvement process.
I have written on how Business and IT appear, on the surface, to be a mutually exclusive dichotomy in my review of Andy King’s Website Optimization. But this dichotomy only exists in those organizations where conflict between business and technology goals dominate the conversation. In an organization with more advanced Web performance improvement processes, there is a shared belief that all business units share the same goal.
So how can a company without a culture of Web performance improvement develop one?
What can an organization crushed between limited resources and demanding clients do to make sure that every aspect of their Web presence performs in an optimal way?
How can an organization marked by a lack of transparency and open distrust between groups evolve to adopt an open and mutually agreed-upon performance improvement process?
Experience has shown me that a strong culture of Web performance improvement is built on three pillars: Targets, Measurements, and Involvement.

Targets

Setting a Web performance improvement target is the easiest part of the process to implement. It is almost ironic that it is also the part of the process that is most often ignored.
Any Web performance improvement process must start with a target. It is the target that defines the success of the initiative at the end of all of the effort and work.
If a Web performance improvement process does not have a target, then the process should be immediately halted. Without a target, there is no way to gauge how effective the project has been, and there is no way to measure success.

Measurements

Key to achieving any target is the ability to measure progress toward it. However, before success can be measured, how it will be measured must be determined. There must be clear definitions of what will be measured, how, from where, and why each measurement is important.
Defining how success will be measured ensures transparency throughout the improvement process. Allowing anyone who is involved or interested in the process to see the progress being made makes it easier to get people excited and involved in the performance improvement process.
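Together, a target and a measurement definition make success checkable rather than arguable. As a minimal sketch, assume response-time samples in seconds and a target stated as "the 95th percentile must stay under 2.0 seconds"; both numbers are invented examples, not recommendations.

```python
# Minimal sketch: a target plus a defined measurement yields an
# unambiguous pass/fail. The 95th-percentile / 2.0 s target is an
# invented example for illustration.

def percentile(samples, pct):
    """Nearest-rank percentile of a non-empty list of numbers."""
    ordered = sorted(samples)
    k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[k]

def meets_target(samples, target_seconds=2.0, pct=95):
    """True when the chosen percentile of measured response times
    falls within the agreed target."""
    return percentile(samples, pct) <= target_seconds
```

The point is not the arithmetic but the transparency: anyone involved in the initiative can run the same check against the same data and see the same answer.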

Involvement

This is the component of the Web performance improvement process that companies have the greatest difficulty with. One of the great themes that defines the Web performance industry is the openly hostile relationships between IT and Business that exist within so many organizations. The desire to develop and ingrain a culture of Web performance improvement is lost in the turf battles between IT and Business.
If this energy could be channeled into proactive activity, the Web performance improvement process would be seen as beneficial to both IT and Business. But what this means is that there must be greater openness to involve the two parts of the organization in any Web performance improvement initiative.
Involving as many people as is relevant requires that all parts of the organization agree on how improvement will be measured, and what defines a successful Web performance improvement initiative.

Summary

Targets, Measurements, and Involvement are critical to Web performance initiatives. The highly technical nature of a Web site and the complexities of the business that this technology supports should push companies to find the simplest performance improvement process that they can. What most often occurs, however, is that these three simple process management ideas are quickly overwhelmed by time pressures, client demands, resource constraints, and internecine corporate warfare.

Web Performance: Outages and Reputation

In the last few months, I have talked on a couple of occasions on how an outage can affect a brand, be it personal or corporate [here and here].
Yesterday my servers experienced an 11-hour network outage due to a broken upstream BGP route.
It’s sometimes scary to see how worn the cobbler’s shoes are.

GrabPERF Network Outage

Today, there was a network outage that affected the servers from September 21, 2008 15:30 GMT until September 22, 2008 01:45 GMT.
The data from this period has been cut and hourly averages have been re-calculated.
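The clean-up described above (cutting the affected period and recomputing hourly averages) can be sketched as follows. GrabPERF's actual schema and tooling are not shown here; the `(timestamp, response_time)` tuple format is an assumption for illustration.

```python
# Hypothetical sketch: drop samples inside a known outage window,
# then recompute hourly averages from what remains.
from collections import defaultdict
from datetime import datetime

def hourly_averages(samples, outage_start, outage_end):
    """samples: iterable of (datetime, float). Returns {hour: average}
    with every sample inside [outage_start, outage_end] excluded."""
    buckets = defaultdict(list)
    for ts, value in samples:
        if outage_start <= ts <= outage_end:
            continue                     # cut data from the outage window
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[hour].append(value)
    return {hour: sum(v) / len(v) for hour, v in buckets.items()}
```

Any measurement taken during the outage reflects the broken network path rather than the sites being monitored, so excluding the window keeps the remaining hourly averages honest.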
We apologize for the inconvenience.