Archives for the month of: December, 2011

These kinds of posts are always a bit of a gamble. This time next year I could either be revered as a technological oracle or shamed as a false prophet. So with this in mind I will avoid predicting the Rise of the Robots and have a look at what other people are saying before sticking my neck out.


Whereas the size of your Facebook network is probably in excess of 100 people, Tim Bajarin predicts that 2012 will see the rise of social networking tools that allow us to interact with smaller groups of friends.

Perfectly placed to embrace this trend of intimate social networking is the Circles feature of Google+. You can easily organize your contacts into friends, colleagues, groups and so on, and interact with each circle in a unique way. For a brand this could involve organizing your fans and advocates; for a company, different departments.

I expect 2012 to see major gains for the infant social network. According to one report, Google+ already has 650,000 members – and at its current growth rate is set to hit the 300 million mark by the end of 2012. I don’t think 2012 will be the year that Google+ explodes (I think Google are playing the long game) but it will certainly seep into new areas and open up new possibilities for social networking.

Integration with other Google services such as Mail, Android and the ever-improving Google Apps office suite will all offer an incentive for businesses to sign up. American manufacturing giant General Motors has reportedly signed a deal for access to Google Apps for its 100,000-strong workforce – I’m sure that the features of Google+ will find an abundance of uses in huge corporations like this.

However, the most important factor in Google+’s growth through 2012 will be how the network affects normal search. Google+ brand pages will soon be placed on the first page of Google search results, and articles that people in your network ‘+1’ will be given weighting in any search query you make through Google.

This relationship between search and social will make it an important battleground for the 2012 US presidential elections. A Google search for ‘healthcare’ will present pages that people in your Google+ network have shared – so it is crucial for any political campaign to penetrate people’s Google networks.


Whilst Google+ will find itself a home, it won’t come close to the king of social networks. Pretty much everyone agrees that Facebook, valued at $100 billion, will continue to ascend. Frictionless sharing (when anything you read, watch or listen to on the web is posted to Facebook automatically) will continue to grow – yet it will need to be significantly tweaked as people realize that they don’t want everything posted to the world.

Having acquired the location check-in service Gowalla this year, Facebook is likely to see a growth in location updates. Marketers still don’t know how to deal with check-ins, but 2012 will see that change. One hotel has already offered a discount to people who match a real-life check-in with a Facebook one.

My main prediction about Facebook is a change of public consciousness about the network. I think that in 2012 people will realize the implications of a world where every location they check into, every song they listen to, every news article they share and every comment they make is recorded and displayed as part of their Facebook Timeline.

People will realise that the Timeline is something they can look back at in 40 years’ time – a complete record of their own life – and this will have a profound effect on our relationship with social networks. The effects of this are impossible to guess.

The Media

Newspaper print revenues will inevitably continue to plummet, but new models will begin to rise. News organizations will begin creating Facebook apps to follow the success of the Guardian and NY Times.

Citizen journalism will continue to soar as new tools allow for better organisation of contributions and developments of a news story. These new tools are also creating a new breed of journalist – the curator. Content curation, categorization and dissemination will become more crucial as journalism moves into a ‘decentralized, real-time, collaborative, and curated future’.

TV 2.0

The humble television set is due an upgrade. Using my Virgin Media box seems archaic compared to the potential of the internet. Apple will release an astronomically priced TV and create a buzz; then, towards the end of the year, Google will release their fairly priced version just in time for Christmas.

‘The Battle for the Living Room’ will start in earnest, but games consoles are far better placed than most to win. Having browsed through YouTube on my TV using voice commands and hand gestures with my Kinect (yes, like Minority Report), I don’t feel much need to change. And as Matt Roseff says, ‘any company who hopes to compete with the Xbox by selling an add-on box that DOESN’T play games is in a deep state of denial’.

Open-source social network

The main problem with Facebook is that it is run for profit. 2012 will see more adverts crammed into the website – and they have just announced a daily sponsored advert that will be placed in your news feed. For people that care about these things, liberation could be in sight!

Joe Brockmeier predicts that Mozilla, the guys behind Firefox, will release an open-source, privacy-enabled version of Facebook (without adverts). Whilst I hope this is true, and I will certainly be signing up, I doubt that this David and Goliath fight will be won by the little guy.

Digital Identification

The era of the fingerprint is over, suggests Amy Webb. Police forces around the world are using iris-scanning iPhone apps and biometric cameras (which can scan 46,000 data points on a face) to query government databases. The latest update to Google’s Android mobile operating system uses facial recognition to unlock a handset – and I imagine this technology will soon be used to pay for goods. Will we see frictionless check-ins based on face-recognition cameras in 2012?

Finally – The Rise of the Robots

I know I said I wouldn’t talk about robots, but I reckon this year we will see the early stages of the new Robotic Age. Robotics will take over jobs ranging from the menial to the educational and medical. The sex industry will begin selling shedloads of pleasure robots, voice recognition will become almost perfect, and humans will become more cyborg-like as we begin to implant computer chips into our bodies.


Blekko is a search engine that allows you to narrow down your search in a unique way. After entering a search query, add a ‘/’ followed by a word. So if you search for ‘Global Warming /conservative’ you will see results only from conservative websites. Add ‘/tech’ to a search and you will see only technology-related websites.

The website offers an extensive list of slashtags available for you to start using, and encourages you to develop your own.

A great additional feature is the ability to delve into the SEO of the site – so if you search for a domain name using the ‘/seo’ slashtag you can see all the relevant information.

The site also allows you to mark anything as spam and remove it from all future search results.

Check out the welcome video for more information.

In my last post I described some of the main metrics used in social network analysis graphs. In this post I am going to look at some of the important considerations regarding the look and design of a network diagram.

A social network can be very vast, and a network diagram can quickly become cluttered and unreadable. Netviz Nirvana is a set of principles developed to combat this, and it can guide you in your graphing projects. Your diagram should come as close as possible to meeting these requirements:

  • Node Visibility – Each node should stand apart and clear from all others – no node should occlude another node.
  • Edge Visibility – You should be able to count the number of edges coming off every node.
  • Edge Crossing – The fewer crossings, the better. The more often an edge crosses another, the more visually complex the image becomes, and the harder it is to follow paths.
  • Edge Tunnels – These occur when a node lies on an edge that is not its own. The problem could lie with either the position of the node or the position of the edge.
  • Text Readability – All text should be clear enough for a reader to read.
  • Text Distinction – All text should be appropriately truncated (use a key if necessary).
  • Clusters and outliers should be clearly visible and distinct.

These are all good points to keep in mind when producing a graph. I would add that all colours used should be distinct from each other, and that they shouldn’t clash (you want your diagram to look good, don’t you?).

For a video that goes a bit further into Netviz Nirvana – click here

The last few years have seen the adoption of social networking increase rapidly. From Facebook to Twitter,  LinkedIn to Flickr – there is a social network for just about anything.

As the revolution of social networking continues unabated, there comes a growing need to explore patterns within the networks – a process called social network analysis (SNA).

Previously, the world of social network analysis could only be accessed with a bit of computing knowledge. However, an open-source program called NodeXL has changed that by bringing some of the important metrics used to understand a network, and the ability to create impressive network graphs, into Excel.

NodeXL makes understanding a social network graph easy for anyone who can navigate a spreadsheet. Excel is often where the world of computer programmers and the rest of us can meet up and speak the same language. NodeXL also makes it easy to import data from existing social networks such as Twitter, Flickr and YouTube.

The people who can begin to make use of network graphs range from marketers to activists – and I imagine they are now a staple of any well-equipped social media political campaign. Using a social network graph you can (among other things):

  • Spot the trusted influencers in a network
  • Find the important people that act as bridges between groups
  • Uncover isolated people and groups
  • Find the people who seem good at connecting a group
  • Plot who is at the centre and who is at the periphery of a network
  • Work out where the weakest points of a network are
  • Assess who is best placed to replace a network admin

There are two basic components of a social graph:
  • Node: In a social network a node will usually represent a single person – but it can also represent an event, a hashtag, etc.
  • Edge: A connection or interaction between two nodes – such as a friendship on Facebook, a follow on Twitter, attendance at an event, or use of a Twitter hashtag.
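
NodeXL stores these components as worksheet rows, but the same node/edge model can be sketched in a few lines of plain Python – the names and data below are made up purely for illustration:

```python
# A tiny social graph as an adjacency dict: each node maps to the
# set of nodes it is connected to.

# Undirected (mutual) edges, e.g. Facebook friendships:
friends = {
    "Alice": {"Bob", "Carol"},
    "Bob":   {"Alice"},
    "Carol": {"Alice"},
}

# Directed edges, e.g. Twitter follows: Alice follows Bob,
# but Bob does not follow Alice back.
follows = {
    "Alice": {"Bob"},
    "Bob":   set(),
}

def degree(graph, node):
    """Number of edges attached to a node."""
    return len(graph[node])

print(degree(friends, "Alice"))   # 2
print("Alice" in follows["Bob"])  # False - the edge only goes one way
```

The only design decision here is that a directed edge lives in one node’s set but not the other’s – which is exactly the undirected/directed distinction the Facebook and Twitter examples illustrate.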

One major question that social network analysis asks is how connected nodes (or people) are. But what determines how connected any person is? What metrics can be used to work out how influential or powerful any individual player is?

These are some of the major metrics used in Nodexl – and they offer a good way to start thinking about your own networks:

  • Centrality – A key term which refers to how ‘in the middle’ a node is in a network.
  • Degree centrality – a count of the number of nodes a node is connected to. This could be the number of people that follow you on Twitter, or the number of people that viewed a YouTube video. It is important to remember that a high degree score isn’t necessarily the most important factor in measuring a node’s importance.
  • In Degree and Out Degree – A connection between two nodes can be undirected (we are mutual friends on Facebook) or directed (you follow someone on Twitter that doesn’t follow you back). The In-Degree refers to the number of inbound connections, and Out-Degree refers to the number of outbound connections.
  • Geodesic distances – A geodesic distance is the shortest possible path between two nodes (popularly known as the degree of separation). In social network analysis, a node’s shortest and longest geodesic distances are recorded (the longest distance between a node and any other node is sometimes referred to as its eccentricity, and can be used to work out the diameter of a network). The average geodesic distance of an entire network is worked out to assess how close community members are to each other.
  • Closeness centrality – This metric determines how well connected a node is in the overall network. It takes into account a node’s geodesic distance from all other nodes. Using this metric you can find people that don’t have strong connections.
  • Betweenness centrality – A score of how often a node is on the shortest path between two other nodes. This can be thought of as a bridge score – how important a node is at bridging other connections. People with a high betweenness centrality are often known as key players. A node could only have a degree centrality of 2, but if those two connections bridge to large unconnected groups, then that node will have a high betweenness centrality.
  • Eigenvector centrality – This looks at how well connected the people you are connected to are. It scores how much of a network a node can reach compared with the same amount of effort expended by every other node in the network.

I am going to be exploring social network analysis over the next few weeks and blogging what I find here – if you want to follow along make sure you follow me on Twitter or subscribe for updates.

My last two blog posts have explored the basic concepts of SEO and how SEO is used to get to the top of Google News. In this post I want to shift over to the murky side of SEO and see how it is used as one of the ‘Dark Arts’.

An undercover investigation by the Bureau of Investigative Journalism recently exposed the inner workings of one of Britain’s largest lobbying companies, Bell Pottinger. Posing as agents for a country with terrible human rights abuses, the investigative team secretly recorded senior executives making promises to use the ‘dark arts’ to help bury negative coverage of human rights violations and child labour.

The techniques used by Bell Pottinger ranged from using their connections to the Prime Minister to ‘fixing Wikipedia’ and manipulating Google to ‘drown’ out negative coverage of their clients.

The manipulation of Google that Bell Pottinger refers to is the mastery of SEO techniques to push positive media coverage up a Google search engine results page (known as a SERP) and to pull negative coverage down. The idea is that the higher up a SERP a link is, the more likely it is to be visited by the searcher – a study found that 42% of searchers click on the first link on a page, and that 90% click somewhere on the first 10 links.

The use of SEO to give your company a boost in search engine authority is nothing new – the process of using it to drown out negative coverage is a capability Bell Pottinger bragged about pioneering in 2007, calling it ‘Crisis Management’.

(from PR Week)

The group claims the firm, headed by MD Paul Mead, will link PR with SEO in a genuinely new way.

‘Previously SEO has only been used to make sure a brand is noticed and high-up on a relevant search,’ said BP Group chairman Kevin Murray. ‘What we are doing is taking the world’s biggest reputation management tool – Google – and turning it into a tool for crisis management.’

Whilst the basics of SEO are simple enough, becoming an expert takes a lot of effort and the field changes daily. Google is said to have over 10,000 signals that tell it how to rank web pages – signals that are kept a closely guarded secret. It is an SEO specialist’s job to experiment with Google and crawl through the hundreds of blogs and forums dedicated to the topic.

The unethical dark side of SEO has a name, Black Hat Search Engine Optimisation. It is frowned upon by the more ethically minded SEO practitioners and generally considered a short term solution to an SEO problem. However, for a lobbying company like Bell Pottinger – these short term solutions can be just the fix needed to drown out negative coverage.

Common techniques include:

  • Keyword Stuffing: This is where as many keywords as possible are stuffed into the content and the metadata. This is quite an old technique that most search engines can now detect and discount.
  • Invisible Text: Placing a long list of keywords in white text on a white background so that it is invisible to a viewer but visible to a search engine.
  • Doorway Page: A page on a website, optimised for search engines, that viewers will never see. If anyone does happen to visit this page, they are redirected to the main page.
  • Link Farming: Harvesting links from unrelated pages
  • Throw Away Domains: Purchasing domains with a keyword heavy address and linking to your page.
  • Deceptive Headlines: Luring people to your site with misinformation to increase your authority.

Another interesting technique is called ‘Google Bombing’. This is a process that involves creating lots of links around the web that point to the page being bumped up, and filling the anchor text (the visible, clickable part of a hyperlink) with the keywords. One recent example was when pro-lifers used a Google Bomb to bring the Wikipedia page for ‘Murder’ to the top of a SERP for ‘Abortion’.

These are just a handful of techniques for unethical optimisation. SEO is a non-stop dance between the search engines that are trying to create a useful and fair search engine results page and those that try to manipulate it. Of course, the people that are best at manipulating SEO are those that attract the highest fees – fees that only companies with the budget of a lobbying firm like Bell Pottinger can afford.

The Bell Pottinger investigation highlights the incredible importance of SEO in our digital world. If anything is to be learnt, it is that those that support human rights must learn these techniques in order to combat them.

I last blogged some notes on the basic concepts behind SEO – Keywords, Links and Content. But what are the methods for getting a story to the top of a Google News search? Being the first news website listed on a search result page would be gold dust to a news organization.

Of course, Google doesn’t divulge the secrets of its trade – so it is up to the SEO specialists to try and work it out. A study released in September asked the top SEO practitioners of major news organisations what they thought were the most important factors. The report is wide ranging and quite detailed – so here is my reading of the most important/interesting considerations.

The ten top signals in order of importance are:

  1. Category authority – if you keep writing optimized stories about a topic then you will gain authority in that area.
  2. Keywords in headline and page titles
  3. Domain authority – the news organisation domain has lots of quality inbound links
  4. Social sharing – lots of tweets, Facebook shares and G+ mentions. This is set to become more important, as it has recently been announced that articles that your friends have G+’d will be highlighted
  5. First to publish the story – this will increase the amount of inbound links
  6. Citation rank – the number of high quality sites that link (cite) to a news story
  7. Unique articles
  8. High CTR (click through rates) – the more clicks a site gets from either Google News or other Google SERPs (search engine results page).
  9. Quality content – Google evaluates the quality of the content and looks for things like typos and copied content. Apparently, one spelling mistake can blacklist your site!
  10. Use of Google News XML sitemap – a way of structuring your news site in a way that Google can easily understand

Other important factors to consider are:

  • Using an author tag (an HTML tag that declares the author of an article)
  • Having many different authors
  • Number of articles published by that author
  • Local relevance – a local website would rank highly on a local issue
  • Using a syndication tag – if content is syndicated out to other sites, this tag lets Google know which was the original source.
  • Keyword should be first word of page title (for more keyword specific considerations click here)
  • Google will search for the presence of words it considers related to the news story
  • Up-votes for the article on social sites like StumbleUpon and Reddit are relevant
  • A good sentiment of words used in citations and social links.
  • Creating several articles about a topic in a short time increases authority
  • The number of images in an article is important – and the image should be hosted on the same domain as the article
  • A long download time for the article will negatively affect authority

An understanding of SEO (Search Engine Optimisation) is important to anyone that creates content for the web. SEO is the process of making a website show up as high as possible in search engine results, which increases the chances of it being visited by the searcher.

The world of SEO is enormous, and there are many different methods available. Getting the right combination of SEO factors, or signals, is the key to SEO success. The exact recipe for how a company like Google performs search is a company secret, but is said to involve 10,000 different ranking signals!

So, whilst you could never hope to learn them all, here is an overview of some of the main methods that you can understand and start considering for your site.

Inbound Links

The number of inbound links (or backlinks) to your website is a key method that search engines use to assess the authority of your website. It is a major feature of Google’s PageRank algorithm, and is used by other search engines such as Technorati.

When a website links to yours it acts as a ‘vote’ of authority and improves the SEO of your site. The more inbound links you have to your website, the more authority your site is given.

Some inbound links are worth more than others – so a link from the Guardian or the BBC is worth much more than a link from a small WordPress blog.

Also, if the site that links to yours has similar content then the link is considered more relevant than from a site that features irrelevant content.

A further consideration is the anchor text of the link. Anchor text refers to the visible, clickable words that make up the link. If the inbound link includes relevant anchor text, it improves the SEO rating.
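
The ‘vote’ idea behind inbound links is roughly what PageRank formalises: a page’s score is fed by the scores of the pages linking to it, shared out across each linker’s outbound links. Here is a toy sketch in Python – this is not Google’s actual algorithm (which weighs thousands of extra signals), and the 0.85 damping factor is just the commonly cited textbook default:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the pages it links to.
    Each round, a page passes a share of its rank to every page it
    links to; a small constant keeps every page above zero."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outbound in links.items():
            for target in outbound:
                # Each outbound link carries an equal share of the vote.
                new_rank[target] += damping * rank[page] / len(outbound)
        rank = new_rank
    return rank

# Invented example: both blogs link to the news site,
# and the news site links back to one of them.
links = {
    "news-site": ["blog-a"],
    "blog-a": ["news-site"],
    "blog-b": ["news-site"],
}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # news-site - it has the most inbound votes
```

Because a linker’s own rank flows into the pages it links to, a single vote from a high-authority page can outweigh many votes from obscure ones – which is exactly why a Guardian or BBC link is worth more than one from a small blog.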

Blog Comments

When you leave a comment on a news site or blog, you are normally given the opportunity to include a link to your site. Whilst this link is largely ignored by the major search engines, it can still draw traffic to a website if the comment is interesting enough to warrant a reader’s further interest.


Keywords

In order to show up in search engines, you need to know what words and phrases people are searching for. If you are running a beauty website, do people use the term ‘face cosmetics’ or ‘make-up’ more often? Ensuring that you have a high quantity of these highly searched-for terms is key to driving traffic to your site.

Keyword density is the percentage of your page that is made up of a given keyword. So, if you have 100 words and 10 of them are the keyword, then you have a 10% keyword density for that keyword. Whilst it may be tempting to continually repeat the keywords throughout the site, it is important to strike a balance between repetition and writing that reads well.

Also – too high a keyword density will make Google flag your page as spam. A good rule of thumb is a 3–6% keyword density. If you are struggling to find places to include the keywords, use them in place of pronouns.
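
The density figure is just a word-count ratio, so it is easy to check your own pages. A minimal sketch in Python – the sample sentence is invented, and real SEO tools normalise text rather more carefully:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words made up of `keyword`."""
    # Pull out word tokens, keeping hyphens and apostrophes so that
    # terms like 'make-up' count as a single word.
    words = re.findall(r"[a-z'-]+", text.lower())
    if not words:
        return 0.0
    return 100 * words.count(keyword.lower()) / len(words)

page = ("Make-up tips for beginners: our make-up guide covers "
        "brushes, primer and everything else about make-up.")

# 3 matches out of 15 words = 20.0% - well above the 3-6% rule of
# thumb, so this copy would read as stuffed.
print(keyword_density(page, "make-up"))  # 20.0
```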

Where these words are placed is also important. The main areas to place keywords are:

  • Domain Name – Google ranks pages with a keyword in the URL highly. Ensure that any blog posts have the dominant keywords included in the URL.
  • Headers – Don’t go for puns or clever titles when naming a page; make them as explanatory and keyword-rich as possible. There are several different types of headers and sub-headers, and a search engine will look at all of them for clues to the site’s content.
  • File names – If you are uploading media, make sure you give the file an appropriate keyword-based name.
  • Meta description – This bit appears underneath your site in search results, so make sure it includes the keywords. If you are blogging using WordPress, you can add a plugin that lets you write a meta description for each post.

Research is an important part of finding the right keywords, and here is a pretty comprehensive list of tools to use.


Content

Having good quality content is the most important element of SEO. Search engines have access to the amount of time users spend on your page, and know whether people are clicking links on your page or bouncing straight away again.

You need to ensure that you are offering something that other people are not. Content should be fresh, topical and relevant – this is the kind of content that is enjoyed and shared by people.

Keeping content fresh is also important for another reason. If a particular search term becomes unusually popular for a short amount of time, Google will work out why – and if you have content associated with the reason for this spike you will improve your SEO. This is called Query Deserved Freshness – read more here.

To find out more about SEO – check out the authoritative Search Engine Land