Good blogging practice – publishing reliable information

The web is a massive bank of data, far too big to be regulated. Because it can’t be regulated, it is very easy for false information to spread – fast.

If you are a blogger, it is really important that you publish information which is reliable and trustworthy. Don’t simply repeat what the crowd says unless you know it is right: doing so not only misleads your readers, but can also earn you penalties from search engines. If you get a reputation for publishing unreliable content, the likelihood is that your readership will fall.

Before you publish something that you have found elsewhere, you need to make sure that it is accurate and reliable.

How to Mythbust Rumours

When you find information on the web, it is always a good idea to check that it appears elsewhere before treating it as reliable. A general rule of thumb is to check that what you are reading also appears on 3 other sites, one of which is a highly reputable site.
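
To make the rule of thumb concrete, here is a minimal sketch (in Python) of how a ‘3 and 1’ check could work. The function name, the reputable-domain list and the idea of simply counting domains are all illustrative assumptions, not a real fact-checking service:

```python
# A minimal sketch of the "3 and 1" rule: a claim counts as verified if it
# appears on at least 3 other sites, at least one of which is reputable.
# The reputable-domain list here is just an example.

REPUTABLE_DOMAINS = {"bbc.co.uk", "theguardian.com", "gov.uk"}

def passes_three_and_one(domains_claim_appears_on):
    """domains_claim_appears_on: sites (besides your original source)
    where you have found the same claim."""
    if len(domains_claim_appears_on) < 3:
        return False
    return any(d in REPUTABLE_DOMAINS for d in domains_claim_appears_on)

# Two ordinary blogs plus one reputable outlet -> verified.
print(passes_three_and_one(["exampleblog.com", "another.net", "bbc.co.uk"]))  # True
print(passes_three_and_one(["exampleblog.com", "another.net"]))               # False
```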

So what is a reputable website?

Government Websites

There are a few ways to identify whether a site is reputable or not. One is to see if it is a government website: any site which is government run is likely to be very reputable. Government websites usually end in their own unique domain name extension. In the USA, government sites end in .gov or .fed.us; in the UK it is .gov.uk; in France .gouv.fr; in Canada .gc.ca; in India .gov.in – and the list goes on.
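
As a quick illustration, a sketch like the following (an assumed helper, not any official list or API) could flag URLs whose hostname ends in one of the government extensions above:

```python
from urllib.parse import urlparse

# Government domain extensions mentioned above; extend as needed.
GOV_SUFFIXES = (".gov", ".fed.us", ".gov.uk", ".gouv.fr", ".gc.ca", ".gov.in")

def looks_like_government_site(url):
    """Rough heuristic: does the URL's hostname end in a government suffix?"""
    host = urlparse(url).hostname or ""
    return host.endswith(GOV_SUFFIXES)

print(looks_like_government_site("https://www.usa.gov/visas"))      # True
print(looks_like_government_site("https://www.example.com/news"))   # False
```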

Major News Corporations

Government sites won’t always cover the things you want to verify, though, so there are other ways to spot a reputable site. Big news websites like BBC.co.uk/News and Guardian.co.uk will usually only publish information that is factual and accurate, so you can usually trust them.

The information they publish is likely to be accurate; however, it may not be impartial, so that is something to watch out for. News firms often take a political side, and therefore report news in a certain way – and may only publish part of a story.

High PageRank Sites

Google PageRank is calculated largely from the number of backlinks a page or site has. If a website has a very high PageRank (6+), it is likely that a lot of other sites link to it – most probably because it publishes a lot of high quality content, which people find useful and therefore link back to. High PageRank sites aren’t always trustworthy, but the higher up the PageRank scale you go, the less likely it is that a site is providing false information.

If a website is a PageRank 8, 9 or 10 then, unless it has manipulated Google’s algorithm (through black hat SEO, which only works for a short while before Google catches it), the site is likely to be extremely reliable and reputable, so you should be able to trust the information, data and facts it publishes.
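
While Google’s production system is far more elaborate (and secret), the core PageRank idea from Larry Page and Sergey Brin’s original paper is public: links act as votes, and votes from important pages count for more. A toy power-iteration sketch shows the principle; the graph and parameter values are made up for illustration:

```python
# Toy PageRank power iteration. Links act as votes; a vote from an
# important page is worth more than one from an unimportant page.

def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: share its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Everyone links to "hub", so it ends up with the highest score.
toy = {"hub": ["a"], "a": ["hub"], "b": ["hub"], "c": ["hub", "a"]}
print(pagerank(toy))
```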

1,000,000 to 1

If 1 highly reputable site is saying one thing, but 1 million other (not reputable) sites are saying another, the chances are that the 1,000,000 sites are just recycling the same false information, creating a massive bank of it. This is one reason why you should always verify information with at least one reputable site. Be careful who you trust on the web.

Academic Research

Verifying information with at least 3 sources, one of which is reputable, is also the standard advice in academic research. If you apply the same standards on your blog, you can’t go wrong! Search engines and readers alike will respect you for providing good quality, highly reputable content.

Technology Bloggers Policy

Every time I write an article and quote information, statistics etc., I try to follow the 3 and 1 rule: check that the information appears on 3 other sites, at least one of which is ‘reputable’. This means that everything I write should be reliable.

Our post guidelines ask all writers to use the 3 and 1 rule; however, we cannot guarantee that they all do. In our Privacy Policy, we state how we try to ensure all content is true and factual, but it is always advisable to verify information independently for yourself.

Do You Verify Your Content?

Do you always try to use the 3 and 1 rule when publishing information? That applies not only to blog posts, but also to comments. If not, what measures do you use – or don’t you think it really matters?

What are your thoughts on the recent PageRank update?

Matt Cutts, the head of Google’s webspam team (and a guru on everything search), is always telling webmasters not to obsess too much about PageRank. I would agree: it is not always that accurate (give or take 1 rank either way), probably because it is not publicly updated very frequently – it is always being recalculated, but the results are not released to the public regularly.

At the end of the day, PageRank is just a lovely green (or maybe a not so lovely white) bar that a page is given. It doesn’t necessarily correlate with how a site performs in the SERPs, and it doesn’t guarantee good rankings.

That said, I am pleased for the blog, as our green increased a little, and white retreated back, as Technology Bloggers jumped from a 3 to a 4 🙂

Google say on their own website that PageRank represents:

“Google’s view of the importance of a webpage”

So basically, pages ranked 0 or 1 aren’t (in Google’s view) that special: there are loads of them out there, and nothing makes them stand out. Pages with a PageRank of 2 are more important – special, but not that special. The further up the scale you go, the more valuable your page is considered. You might have a high value homepage but low-ranked internal pages; that is to be expected, as a lot of the algorithm is based on links.
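
Google has never published how the 0–10 toolbar figure is derived from the underlying score; a common assumption in the SEO community is that it is a roughly logarithmic bucketing, which would explain why climbing from a 3 to a 4 is far easier than from an 8 to a 9. A purely hypothetical sketch of that idea:

```python
import math

# Purely hypothetical: Google has never published how the 0-10 toolbar
# value is derived. A common assumption is a roughly logarithmic bucketing
# of the underlying link-based score. The base chosen here is invented.

def toolbar_pagerank(score, base=8.0):
    """Map an underlying score in (0, 1] to a 0-10 toolbar-style bucket."""
    if score <= 0:
        return 0
    bucket = 10 + math.log(score, base)  # log distance below the maximum of 1.0
    return max(0, min(10, round(bucket)))

for s in (1e-9, 1e-6, 1e-3, 0.1, 1.0):
    print(s, "->", toolbar_pagerank(s))  # 0, 3, 7, 9, 10
```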

One would assume that if you have serious traffic, you should be right at the top of the PageRank scale, as people find your page very useful, and therefore Google must think your page is important.

Google’s PageRank

Until very recently, Google.com had been a PageRank 10. It is the most visited site on the internet, by a long way. From what I understand, the site receives around 1,050,000,000 (1.05 billion) unique visitors a year. Facebook is second, with around 950,000,000 (0.95 billion) unique visitors a year – note that not all of those visitors have accounts.

Twitter gets just 220,000,000 (0.22 billion) visitors a year. So why is it, then, that in the recent PageRank update Google ranked its main homepage (Google.com) 9/10 and Facebook 9/10, but ranked Twitter 10/10? Twitter is one of only around 10 sites on the net with a PageRank of 10, yet it is only the 8th most visited site globally, whereas giants Google and Facebook are clear leaders.

The UN and the U.S. Government’s Official Web Portal are two of the few other sites on the web with a PageRank of 10. Updates over the last year have seen a lot of PageRank 10s lose their rankings. Why?

Larry Page and Sergey Brin – the founders of Google.

Is the web getting less ‘important’? What are your thoughts on this? I find it really interesting that Larry Page’s algorithm (he is a co-founder of Google) now ranks Google itself below the top.

Talk to me 🙂

How Google’s Panda Algorithm is changing the web

On April 11, 2011 Google announced sweeping changes to its search index on its Webmaster Central Blog. The algorithm was adjusted in an effort to promote high quality websites and weed out websites with poor content. The adjustment incorporated “feedback signals”, which are expected to help Google users find better search results.

What started out as something small

What Amit Singhal, a Google Fellow, described as a “small” update affecting only 2% of US queries has turned out to be anything but small. CNET describes the algorithm change as “radical” and reports on the websites that have been hit with reduced rankings.

Great websites drop in ranking

The British Medical Journal, the Cult of Mac and wikiHow all dropped in visibility. These are considered to be reputable sources, not content farms, so the reductions in rank suggest the algorithm still needs some fine tuning. Other websites that might genuinely be considered content farms have been dropped as well.

Facebook, Yelp & Twitter rise in rankings

The websites that increased in ranking include Facebook, Twitter and Yelp. Their rise begs a question, though: Facebook already has over 500 million active users, 50% of whom log in each day – does it really not already appear “high” enough in the search results? In addition, social media is changing search results more broadly, with the adoption of the “Plus One” on Google and “Likes” on Bing.

Facebook rising and wikiHow falling in the rankings.

Local Searches Soar

Despite the rise of social media websites and some news sources, many poor quality websites have fallen in ranking, as was expected from this Panda algorithm shift.

Local searches, however, have significantly improved due to the shift in the algorithm. Websites of companies that are popular in the United States (David’s Bridal, Barnes & Noble, Walgreens) all appear prominently in the US search index, but not in the UK index. This shift paves the way for better-tailored international search results – a significant win for Google’s index.