Archives for posts with tag: News

An internet blackout by some of the internet heavyweights is looking much more likely. Mashable, one of the biggest tech websites out there, published an editorial calling for a campaign to inform the masses about the danger posed by SOPA.

Facebook, Google and Wikipedia. You’re the Big Three in this fight. You’ve already publicly affirmed your opposition to SOPA. Now it’s time to really be a part of the fight.

Everyone in the tech community knows about SOPA, but that isn’t enough – the anti-SOPA movement needs the average Joe to understand and protest against the bill.

A blackout of Facebook, Google and Wikipedia would get the world talking. It would be on the front page of newspapers (except possibly the SOPA-supporting Murdoch press). People would ask ‘what is it about SOPA that causes these internet behemoths to take such drastic action?’

January 18th is the date set by members of online community Reddit for the blackout. Hacktivist collective Anonymous have tweeted that they will embark on radio silence on that day, and Wikipedia founder Jimmy Wales has stated that he hopes Wikipedia will be ready to get involved:

I’m all in favor of it [a January 18 blackout of Wikipedia], and I think it would be great if we could act quickly to coordinate with Reddit. I’d like to talk to our government affairs advisor to see if they agree on this as useful timing, but assuming that’s a greenlight, I think that matching what Reddit does (but in our own way of course)[…]

Of course, we really need Google to get involved. After all, ‘Don’t be evil’ is their informal corporate motto. They have stepped up to the mark before by removing Google search capabilities from China, now we have to hope they are prepared to step up again.

The SOPA bill is the desperate bite of a wounded and dying entertainment industry. The internet has liberated artists and content providers. We are seeing the emergence of an organic internet marketplace, free from the layers of middlemen that have exploited artists for so long. They have been creaming money off the work of others for so long that they think what they do is natural.

January 18th is set to be an important day for the internet. How important is up to the Big Three.


Today offered a glimpse of a truly amazing future for conscientious shoppers that want to boycott products.

A team of anti-SOPA activists (read about the Stop Online Piracy Act here) have created an app that allows you to scan a barcode from a product and see whether the product is made by one of the 800 SOPA-supporting companies.

It works by automatically checking a product against a database of companies. If the scanned product comes from a SOPA-supporting company, then a big red ‘x’ is displayed on the screen – enabling the shopper to choose not to purchase.

The idea behind the boycott app is brilliant and could be applied to anything. Simply change the list of companies in the database to whoever you want. If, for example, you want to boycott GlaxoSmithKline after hearing about their exploitative and illegal vaccine tests that killed 14 babies – you could add them to your ‘boycott list’. Don’t like Coca-Cola for any of their irresponsible acts – add them to the list.
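The mechanics are simple enough to sketch in a few lines of Python. Everything here – the barcodes, company names and boycott list – is invented for illustration; the real app presumably queries a maintained database of the 800 SOPA supporters rather than a hard-coded table:

```python
# Hypothetical barcode -> manufacturer lookup. A real app would query
# a maintained product database instead of a hard-coded dict.
BARCODE_TO_COMPANY = {
    "012345678905": "ExampleCorp",   # fictional SOPA supporter
    "098765432109": "OtherCo",       # fictional neutral company
}

# The boycott list is just a set of company names -- swap it for any
# campaign's list and the rest of the app works unchanged.
BOYCOTT_LIST = {"ExampleCorp"}

def check_product(barcode, boycott=BOYCOTT_LIST):
    """Return (company, flagged); flagged True means show the big red 'x'."""
    company = BARCODE_TO_COMPANY.get(barcode)
    if company is None:
        return None, False
    return company, company in boycott
```

Swapping `BOYCOTT_LIST` for a campaign-supplied list is exactly the generalisation described above – the scanning and lookup logic never changes.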

In a world where mobile apps seem to be the domain of marketers – it is refreshing to see mobile technology being used by activists to empower consumers and help hold corporations accountable.

Ideally this tool should become open source so that any activists or consumers can create their own unique database of companies to use with the app. Campaigning groups could make lists for supporters to upload to the boycott app. It could even be used to discover things about products when in a store – e.g. this cereal manufacturer CEO kills baby seals or this fashion designer has links to the far right.

Barcode scanning is something that is set to become more popular among consumers. This app is the latest incarnation of a broader trend of scanning technology. Amazon recently released a popular mobile ‘Price Check’ app that encourages consumers to scan products they come across in bricks-and-mortar stores and receive a discount if they buy the product online through the Amazon app.

You could argue that the time it takes to scan every item of a weekly supermarket shop would be a barrier. However, jump a year or so into the future and every item will contain an RFID (radio frequency identification) chip, which is a superior and more efficient method of identifying objects than a normal barcode.

Then the same kind of frictionless technology we are seeing with Facebook will be a part of our shopping experience. Put a product in your shopping basket and your phone will give you a little alert if it is to be boycotted. Check out this ubercool video on the RFID future of shopping to get what I mean.

I last blogged some notes on the basic concepts behind SEO – Keywords, Links and Content. But what are the methods for getting a story to the top of a Google News search? Being the first news website listed on a search result page would be gold dust to a news organization.

Of course, Google doesn’t divulge the secrets of its trade – so it is up to the SEO specialists to try and work it out. A study released in September asked the top SEO practitioners of major news organisations what they thought were the most important factors. The report is wide-ranging and quite detailed – so here is my reading of the most important/interesting considerations.

The ten top signals in order of importance are:

  1. Category authority – if you keep writing optimized stories about a topic then you will gain authority in that area.
  2. Keywords in headline and page titles
  3. Domain authority – the news organisation domain has lots of quality inbound links
  4. Social sharing – lots of tweets, Facebook shares and G+ mentions. This is set to become more important, as it has recently been announced that articles that your friends have G+’d will be highlighted
  5. First to publish the story – this will increase the amount of inbound links
  6. Citation rank – the number of high quality sites that link (cite) to a news story
  7. Unique articles
  8. High CTR (click-through rate) – the more clicks a site gets from Google News or other Google SERPs (search engine results pages), the better
  9. Quality content – Google evaluates the quality of the content and looks for things like typos and copied content. Apparently, one spelling mistake can blacklist your site!
  10. Use of Google News XML sitemap – a way of structuring your news site in a way that Google can easily understand
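For reference, a minimal Google News sitemap (signal 10) is a standard XML sitemap with a `news:` extension attached to each article URL – the domain, names and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/news/sample-article.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2012-01-05</news:publication_date>
      <news:title>Sample Article Headline</news:title>
    </news:news>
  </url>
</urlset>
```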

Other important factors to consider are:

  • Using an author tag (an HTML tag that declares the author of an article)
  • Having many different authors
  • Number of articles published by that author
  • Local relevance – a local website would rank highly on a local issue
  • Using a syndication tag – if content is syndicated out to other sites, this tag lets Google know which was the original source.
  • Keyword should be first word of page title (for more keyword specific considerations click here)
  • Google will search for the presence of words it considers related to the news story
  • Up-votes for the article on social sites like StumbleUpon and Reddit are relevant
  • Positive sentiment in the words used in citations and social links.
  • Creating several articles about a topic in a short time increases authority
  • The number of images in an article is important – and the image should be hosted on the same domain as the article
  • A long download time for the article will negatively affect authority
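The study gives an ordering but no numbers, so any weighting is guesswork – still, a useful mental model is to picture the ranking as a weighted sum of the signals above. The weights below are entirely invented for illustration:

```python
# Invented weights mirroring the ranked list above -- Google publishes
# no such figures, so treat this purely as a mental model.
SIGNAL_WEIGHTS = {
    "category_authority": 10,
    "keywords_in_headline": 9,
    "domain_authority": 8,
    "social_sharing": 7,
    "first_to_publish": 6,
    "citation_rank": 5,
    "unique_article": 4,
    "click_through_rate": 3,
    "content_quality": 2,
    "news_sitemap": 1,
}

def rank_score(signals):
    """signals maps a signal name to a strength in [0, 1]."""
    return sum(SIGNAL_WEIGHTS.get(name, 0) * strength
               for name, strength in signals.items())
```

The point of the sketch is the shape, not the numbers: a story strong on several high-weight signals (category authority, headline keywords) beats one that only optimises a low-weight signal.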

Alan Rusbridger, the Guardian Editor in Chief, has outlined 15 reasons why Twitter is important. I have listed them below and then added 5 other reasons. I’ll add more as I think of them.

  1. It helps with distribution of news.
  2. It is often the source of breaking news.
  3. It can often outdo Google when it comes to search.
  4. It is a formidable aggregation tool.
  5. It is a great reporting tool – for both finding information and asking the crowd.
  6. It is great for marketing and letting people involved in your content know that it is there.
  7. It is a series of common conversations with instant feedback.
  8. It is a diverse environment.
  9. It is opening up a new tone of writing – brief but humorous, succinct and more personal.
  10. It levels the playing field – hard work is rewarded.
  11. It has different values – a story may make it into all the nationals but have little Twitter impact, and vice-versa.
  12. It has a long attention span – conversations around a topic can last for ages.
  13. It creates communities.
  14. It changes notions of authority.
  15. It is an agent of change.

And here are my additions:

  1. You can follow events as they unfold – any news event will be given a hashtag and you can easily find help from the eyes on the ground.
  2. It allows you to be in several places at once.
  3. It encourages serendipity – you stumble across ideas and people that will completely change your opinions and direction.
  4. You can contact people directly – and it is much more likely you will get a response.
  5. It is perfect for finding the exact person you are looking for – not to mention the possibilities in Geo-Tagging.

The amount you can fit into the 140-character limit of a tweet has just had a significant increase. A popular link-sharing website has begun to offer a way to send multiple links as one shortened URL – Bundles. They aren’t the first company to do this – I’ve been using another service for some time to send multiple links – but they are the first ‘big’ brand to do so. And I hope they are the ones to help it really catch on.

As you would expect, the process is refined. You can arrange the links in your bundle, add notes, descriptions and headings, share your bundle and then start a conversation.

I have always been surprised that more people don’t tweet multiple URLs. The possibilities are endless and it is a brilliant way to tell a story.

You could:

  • Collect a series of related YouTube videos
  • Show all the links in an internet debate or controversy
  • Tweet your morning reading to your followers
  • Show off your favourite sites
  • Give all the perspectives in an argument
  • Collect a series of training materials

Can you think of any more?

One feature that I would love to see is the ability to automatically create a bundle from all the links I have open in my browser.

(Bundle is part of my Web 2.0 Toolbox)

With such an abundance of information flying at us from around the web, the role of the curator is becoming increasingly important.

A curator is responsible for helping a clear narrative emerge from all the noise. They find the important parts and pull them together to create a compelling experience.

As blogging platforms have evolved they have increasingly helped willing curators easily save and present content, allowing them to create highly engaging content for their readers.

Tumblr in particular has recently blossomed as a curating platform.

The easy to use blogging platform has attracted some of the top news organisations. They are finding content from across the web and using Tumblr to create an informal, human and social ‘scrapbook’ (Newsweek and TotalFilm are two excellent examples).

Storify is the next step in the evolution of blogging platforms.

Designed to enable curators to pull what people post on social networks into a compelling story – it allows a user to seamlessly pull video, photos or tweets into a single Storify story.

After playing around with it my initial thoughts are that it would be excellent for creating a story on any event that is being tweeted.

A curator can watch an event unfold on Twitter (by simply following a hashtag from within Storify) and drag and drop noteworthy tweets into a Storify story.

They can add text comments at any point, easily input any relevant links/images/videos/updates, give the story a headline and summary, and then embed the story on a website.

It is quick, easy and intuitive (key ingredients for any popular publishing platform) and destined for large scale adoption.

ReadWriteWeb have already demonstrated its potential by publishing an interview with Twitter founder Evan Williams.

For more excellent examples check out the Storify blog. And to gain access to the beta, use this code: TCDISRUPT

As if declining newspaper sales hadn’t made journalists fear for their jobs enough – along comes the robot reporter!

I’m not talking about C3PO with a notepad – but rather the ability computers have to analyse data and write reports automatically.

Narrative Science is a company that runs software which receives data, analyses it and writes a piece of news copy – all without a human hand in sight.

So far the software has been used to successfully report on sports games which, due to financial constraints, couldn’t normally be reported on. But the possibilities of this new form of reporting are enormous.
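Narrative Science’s system is proprietary, but the basic idea – turning structured data into copy via templates and simple rules – can be sketched. The team names, data fields and thresholds here are all made up:

```python
# Toy data-to-text generator: a box score in, a sentence or two out.
# Real systems layer far more linguistic variation and statistical
# analysis on top of this basic pattern.
def game_recap(game):
    home, away = game["home"], game["away"]
    winner, loser = (home, away) if home["runs"] > away["runs"] else (away, home)
    recap = (f"{winner['name']} beat {loser['name']} "
             f"{winner['runs']}-{loser['runs']} on {game['date']}.")
    # A simple rule standing in for the kind of insight a real system
    # derives statistically (e.g. 'coming out of a slump').
    if winner["pitcher_strikeouts"] >= 10:
        recap += (f" {winner['pitcher']} struck out "
                  f"{winner['pitcher_strikeouts']} batters.")
    return recap
```

Feed it a game’s numbers and it produces serviceable, if formulaic, copy – which is exactly the niche of low-budget sports games it has filled so far.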

This technology could potentially be applied to any piece of reporting that requires analysis of data. And with the amount of data around us piling up at a ridiculous rate, there is definitely a need for something to help make sense of it all.

Of course, I seriously doubt that a computer would be able to dig out the finer details that a trained human reporter’s eye would uncover. But it is equally unlikely that a human could spot the details and patterns in data that a computer easily can.

The example given by one of the partners of Narrative Science is proof enough: “One machine-generated game story suggested that the pitcher’s excellent performance during that game indicated that he might be coming out of a slump.”

So – with governments around the world opening up their data, alongside a growing need to monitor the actions of bankers and a web that keeps the world constantly producing data, this new method of reporting could really catch on.

And how will this affect journalism? I doubt it will lead to news companies sacking their staff and hiring companies like Narrative Science.

I don’t believe that automating news reporting will diminish the role of human reporters – rather it will help reporters get closer to the story hidden within the data. And save a lot of time.