Do you make use of Google Webmaster Tools?

Everybody knows that if you want to run a successful website, Google is one of the most important factors to consider.

How Google understands, interprets and indexes your site is crucial to where it appears in the SERPs. And how well your site performs in the SERPs (specifically the Google SERPs) can be a big determinant of how much traffic it receives, and ultimately how popular and successful it is.

Google Webmaster Tools is a very useful, yet often underused, service that can help site owners improve the quality and quantity of traffic their sites receive. In this article I will outline some of the key features I find useful, and some of the main reasons why I use Webmaster Tools.

Google Webmaster Tools

Why Google?

Google dominates the search market, with more than 90% of all searches being done through Google, according to StatCounter Global Statistics. Therefore the chances are that the majority of the search traffic your site receives comes from Google. It would be naive to ignore the tools Bing and Yahoo make available to webmasters, but if you plan on focusing on just one, Google is probably the wisest choice.

Google is renowned for its major updates, with Penguin and Panda just two recent examples. Webmaster Tools can be a great aid in helping you understand how your site has been affected by the changes and why, so you can either keep doing things the way you are, or change your strategy.

See how well you are doing

The most recent Google Webmaster Tools update has divided the dashboard into five easy-to-understand sections:

- Configuration – how your site is set up (locality, URL preferences, sitelinks etc.)
- Health – how Google crawls your site, any errors or malware it detects, and the URLs Google is denied from crawling
- Traffic – how people find your site, which search queries you appear for, who links to your content with what keywords, and how Google+ influences your visitors
- Optimization – tips and tweaks on how you could adjust your content and sitemap to improve your search position
- Labs – the latest tools Google is trialling that may be of use

Find crawler errors

One of the main reasons I use Google Webmaster Tools is that it lets me see how Google views and interprets the sites I administer. Should there ever be an error, I can understand what Google is struggling to read or crawl, and try to address the issue. Google lets you view and test the specific URLs your robots.txt file is blocking Google from indexing and crawling – there is a difference between the two. If you are denying it access to something by mistake, you can then rectify this.
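To illustrate the sort of mistake worth catching here, consider a hypothetical robots.txt where a rule intended to hide admin pages accidentally blocks a whole content directory:

```
# Block crawlers from admin pages
User-agent: *
Disallow: /wp-admin/
# Oops: this second rule makes every post under /articles/ uncrawlable
Disallow: /articles/
```

The robots.txt tester in Webmaster Tools lets you paste in a URL and see exactly which rule, if any, is blocking it – far quicker than puzzling it out by hand.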

Google also lets you see any pages it cannot find. If you run a site based on a content management system (like one powered by WordPress), it is common to change things through the system and inadvertently create errors, leaving pages not found where they either should be, or once were. Google lets you see when it can't find pages, when it is denied access to pages, and when inadequate redirects are in place.
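When a page has genuinely moved, a permanent (301) redirect tells both visitors and Google where it has gone, clearing up the "not found" errors this report highlights. On an Apache server (which most WordPress hosts use), a sketch of such a rule in .htaccess – the paths here are purely illustrative:

```
# Permanently redirect an old post URL to its new home
Redirect 301 /old-post/ http://www.example.com/new-post/
```

A 301 (rather than a temporary 302) is what signals to Google that the new URL should inherit the old one's place in the index.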

If you don’t use Google Webmaster Tools and don’t reduce the problems Google encounters when crawling your site, the likelihood is that your site will suffer in the SERPs – there isn’t much debate about that.

Labs

Google say that:

“Webmaster Tools Labs is a testing ground for experimental features that aren’t quite ready for primetime. They may change, break or disappear at any time.”

however, this doesn't mean that these tools should be ignored – in fact, I think they are one of the most overlooked resources Google provides to webmasters.

One of the current ‘Labs’ tools that I find very useful is the ‘Site performance’ tool. Google may not yet have generated any information about your site, but if you are one of the lucky ones to be analysed, it can prove very interesting. In Google’s own words:

“This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users.”

As page load time becomes more and more important to users, and therefore to search engines, this page is of crucial importance to many people.

Improvements

You might not expect it, but in the ‘Optimization’ section, under ‘HTML Improvements’, Google will actually suggest areas where you could improve your code to make your content the best it can be. Common errors Google suggests correcting include missing or duplicate title tags (in most cases, an SEO no-no) and meta tag issues.
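For example, the kind of markup this report encourages is a unique title and meta description on every page. A minimal sketch (the wording is purely illustrative):

```html
<head>
  <!-- Unique per page: missing or duplicate titles are flagged
       under 'HTML Improvements' -->
  <title>How to Use Google Webmaster Tools | Example Blog</title>
  <meta name="description"
        content="A walk-through of the Webmaster Tools dashboard:
                 configuration, health, traffic, optimization and labs.">
</head>
```

If two pages end up with identical titles, that is usually a sign of a templating slip worth fixing.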

Traffic

The tools in the ‘Traffic’ section are probably the ones I use the most. ‘Search Queries’ gives you fantastic insight into where your site appears in search results across different locations around the world. If you pair Webmaster Tools with Analytics, this becomes a lot more useful.

‘Links to your site’ and ‘Internal links’ let you see your most linked-to content, and the keywords used in the links pointing to it. Generally speaking, if you want to rank well for a keyword, you need to have some links (internal and/or external) using that keyword.

The great thing about Google Webmaster Tools is that it integrates with many other Google programs, in order to improve your total control and visibility of your site. AdSense, Analytics, YouTube and AdWords are just some of the other Google products that Webmaster Tools integrates with.

That is just a quick overview of what Webmaster Tools has to offer. If you own a website, I strongly recommend that you explore it further to help improve your site’s visibility in the search results, and to enable you to weather algorithm changes (like Penguin and Panda) that little bit better.

Do you use Webmaster Tools? What are your favourite features?

Problems with online anonymity

The internet probably knows what your favourite shoes look like. How, you may ask? Your data is being monitored through your PC without you hearing so much as a peep about it. Private firms can spy on users from the comfort of their own computers.

The FTC has recently issued a report advising private firms to be more open about their data collection practices. New laws regarding user privacy are also currently being worked on.

Users who want to preserve any semblance of privacy left are looking into do-not-track tools. Some suggest adding a do-not-track option directly into browsers, while others are in favor of different software that can curb data collection altogether.

With regular website cookies come other tracking cookies that help the sites we’re visiting identify our user pattern and collect our data. Current data collection practices aren’t transparent, so we have no idea what these sites are up to once they have what they need.

Failure to comply

The universal do-not-track button goes only as far as requesting that a website not track a user’s information as they browse. However, there is no guarantee that the site will comply with the request.
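Technically, do-not-track is nothing more than an HTTP header sent along with each request. A minimal Python sketch shows just how little is actually transmitted (example.com is a placeholder; the receiving site is entirely free to ignore the header):

```python
import urllib.request

# Build (but don't send) a request carrying the Do Not Track header.
# "DNT: 1" is the whole mechanism -- compliance is voluntary.
req = urllib.request.Request(
    "http://example.com/",
    headers={"DNT": "1"},
)

# urllib normalises header names to capitalised form ("Dnt")
print(req.get_header("Dnt"))
```

There is no enforcement layer behind that single header, which is exactly why it offers no guarantee.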

This option does close to nothing in terms of blocking a website’s access, largely because it can’t. Google’s recent fine for lifting data from open Wi-Fi connections without user permission, and Facebook’s app accessing texts on users’ cell phones, are proof that firms don’t always adhere to the norms of privacy – and those are two really big firms.

At best, a do-not-track tool will lull you into a false sense of security, when in reality you have more than one front to protect yourself on. Large private firms aren’t the only ones collecting data; there are numerous other threats to take into account.

Monsters beneath your bed

Fighting tracking cookies alone is like looking for the monster in the closet without checking what’s hiding under your bed.

Options such as AVG’s Do-Not-Track or DNT+ will only go as far as the do-not-track button is meant to. Meanwhile, PC monitoring tools and other forms of spyware could already exist on your system – granted, that data would be going to a person rather than a company.

Most computer monitoring software is wired to record your browsing history. Whether or not you’re deleting your cookies becomes irrelevant here. The same is the case with spyware or malware that you mistakenly download by clicking on obscure links or opening spam emails.

No free lunches

Free Wi-Fi is a real treat until you realise there’s a chance it is being watched with network monitoring software that can record every move you make in your browser. Tools such as Firesheep and Wireshark let anyone on the same network capture your unencrypted traffic. The Wi-Fi owner has no need to break into your system manually, or be anywhere near you, to figure out what you’re using the Wi-Fi for.

That’s if you’re using someone else’s Wi-Fi. But even if your own Wi-Fi is open, you’re in danger of being attacked. Reports that Google was lifting private data through open Wi-Fi networks first surfaced during 2010, and regardless of how apologetic Google was, it never stopped the practice.

Even with new laws in place for the preservation of user data and more transparency as to what cookies are infiltrating user systems, there’s still a large potential for data collection against a user’s will.

The best idea is to take a holistic approach to your browsing experience and stay safe on all sides – after all, do-not-track tools are only a small aspect of online safety, not the key to it.

How to find images for your website

There’s absolutely nothing wrong with taking a DIY approach to building your website. If you choose the right tools *cough* WordPress *cough* you can produce something that looks very professional without having to know web design inside and out.

As with Creative Commons images, always check the usage rights of every image carefully. You might be required to credit the photographer, or usage may be forbidden in certain situations.


Purchase cheap stock photography

It is all well and good looking for free pictures, but it’s often easier to invest a small amount of cash than to spend hours hunting for the right totally free image.

That is where websites like iStock Photo and ShutterStock come in. These vast repositories contain thousands of pictures, most of which you can buy for just a few pounds. Sure, they can be frustratingly clichéd at times, but a bit of experimentation with your search terms generally gets results. Expect to pay from £1 upwards per image.

Ask permission

This is most likely your best option if you are looking for an image of a current event or a specific individual to use with an article or blog post on your website. Amateur photographers are often pleased to let their photos be used at no charge – if you ask nicely.

A good way to find pictures is through Flickr. For example, there are many David Cameron and Tom Cruise pictures to choose from. As soon as you’ve found a photo you like, just use Flickr’s contact option to send the photographer a message asking their permission.

Do be wary of using photos of well-known individuals – whilst it’s generally okay to use them alongside news stories and other editorial, you could get into difficulty if it looks like they’re endorsing your product or service.

Take your own photos

With even cheap mobile phones able to create reasonable-quality pictures, you don’t have to be a pro to capture photos that are good enough for the web.

Even though the company is in all sorts of difficulty, you will find some good suggestions for taking better photos on the Kodak website.

Assuming you already own a camera, this method is practically free – and it has other benefits over stock pictures. For example, do you think website visitors would prefer to see a generic image of someone on the telephone, or an actual member of your sales team at work in your office?

How do you find photos for your website? Leave a comment to let us know.