Robotic Warfare

Noel Sharkey is a professor at the University of Sheffield in the UK, and he has just written an article for CNN. A specialist in robotics and artificial intelligence, he is leading a call to ban the development of “autonomous” killing machines.

The phrase might bring to mind a killer robot, and as many will know, there are already plenty of unmanned systems in operation. Drones are very much in the press, but they are flown by a pilot, and the decision to kill someone is taken by a human, even if that human is several thousand miles from the action.

But Sharkey is concerned about the future development of systems that can be programmed for a task and then make decisions autonomously while carrying it out. He does not believe that a computer can make the types of decisions necessary in warfare, or at least not with the morality and judgement they demand.

BAE Spider Robot

There are two real sides to the argument about robotics in war. One states that the mechanization of warfare would lead to fewer casualties, more precision, less danger for the troops and, all in all, a cleaner fight: no more massacres of civilians by a soldier taking retribution for an unrelated attack, fewer accidental deaths, and so on.

But on the other side we are talking about machines making decisions that should incorporate humanity. How many deaths are justified for a particular objective? Is the death of an individual really of strategic advantage? What if the machines malfunction, or are taken over by hackers? Who can be held responsible for their actions? And aren’t we more likely to go to war if we can send machines and leave the boys at home?

All of these arguments are fought over within the robotics community, but we should remember that we have already travelled some way down the road of computerized and mechanized war. The anti-aircraft and missile defence systems being deployed in Asia today are no longer mechanical affairs; they are computerized systems that all but fire themselves, and they certainly do not require a person to aim them as in the old films.

Bomb disposal robots, unmanned vehicles and the like are already deployed, and mechanical spider troops that really do bring the idea of cyber war into the modern scenario are under development, as this article explains.

One problem is that of foresight: how can we make legislation today when we have no real idea of how, and how much, technology will advance in the foreseeable future? This type of robotics also often comes from, or aids, other developments, such as the robotic surgical machinery that I reviewed in a previous post. Infiltration and influence are everywhere.

If you would like to get an idea of how far we have come in terms of movement, take a look at this BBC video. A Boston company has produced a robot for military use (testing chemical suits) that moves remarkably like a human.

I have also written a couple of articles covering this issue on the Bassetti Foundation website. Read this article about recruiting robots for combat for an overview and follow the links.

Here you will also find an interview with robotics professor Ronald Arkin, in which he describes how looking for funding led him into designing robots that were paid for by the US military. The military is of course the largest investor, a rather sobering thought given the current state of university funding.

Do you make use of Google Webmaster Tools?

Everybody knows that if you want to run a successful website, Google is one of the most important factors to consider.

How Google understands, interprets and indexes your site is crucial to where your site appears in the SERPs, and how well your site performs in the SERPs (specifically the Google SERPs) can be a big determinant of how much traffic your site receives, and ultimately how popular and successful your site is.

Google Webmaster Tools is a very useful service for improving the quality and quantity of traffic that your site receives, yet it is often underused by site owners. In this article I will outline some of the key features I find useful, and some of the main reasons why I use Webmaster Tools.

Google Webmaster Tools

Why Google?

Google dominates the search market, with more than 90% of all searches being done through it, according to StatCounter Global Stats. Therefore the chances are that the majority of traffic your site receives through search is from Google. It would be naive to ignore the search tools that Bing and Yahoo make available to webmasters; however, if you plan on focusing on just one, Google is probably the wisest choice.

Google is renowned for its major updates, Penguin and Panda being just two recent examples. Webmaster Tools can be a great aid in helping you understand how your site has been affected by such changes and why, so you can either keep doing things the way you are or change your strategy.

See how well you are doing

The most recent Google Webmaster Tools update has divided the dashboard into five easy-to-understand sections:

Configuration: how your site is set up (locality, URL preferences, sitelinks etc.).
Health: how Google crawls your site, any errors or malware it detects, and the URLs Google is denied from crawling.
Traffic: how people find your site, which search queries you appear for, who links to your content with which keywords, and how Google+ influences your visitors.
Optimization: tips and tweaks on how you could adjust your content and sitemap to improve your search position (a minimal sitemap sketch follows below).
Labs: the latest tools Google is trialling that may be of use.
Webmaster Tools Options - Dashboard, Messages, Configuration, Health, Traffic, Optimization
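To give a flavour of what the Optimization section works with, here is a minimal sitemap sketch following the sitemaps.org protocol; the URL and date are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want Google to know about -->
      <url>
        <loc>http://www.example.com/robotic-warfare/</loc>
        <lastmod>2012-07-01</lastmod>
      </url>
    </urlset>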

Find crawler errors

One of the main reasons I use Google Webmaster Tools is that it lets me see how Google views and interprets the sites I administer. Should there ever be an error, I am able to understand what Google is struggling to read or crawl, and then try to address the issue. Google lets you view and test the specific URLs your robots.txt file is blocking Google from indexing and crawling – there is a difference. If you are denying it access to something by mistake, you can then rectify this.
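To make that crawling/indexing distinction concrete, here is a minimal robots.txt sketch; the paths are hypothetical:

    # Hypothetical robots.txt
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /drafts/

    # Note: Disallow only blocks crawling. A blocked URL that is linked
    # from elsewhere can still end up indexed; keeping a page out of the
    # index needs a noindex meta tag on a crawlable page instead.
    Sitemap: http://www.example.com/sitemap.xml

If Webmaster Tools shows a blocked URL you actually want crawled, removing or narrowing the matching Disallow line is the fix.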

Google also lets you see any pages it cannot find. If you run a site based on a content management system (like one powered by WordPress), it is common to change things using the system, and unforeseen errors can be created, leading to pages not being found where they either should be, or once were. Google lets you see when it can’t find pages, when it is denied access to pages, and when inadequate redirects are in place.
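When a post has genuinely moved, a permanent redirect stops both Google and your visitors from hitting those missing pages. A minimal sketch for an Apache .htaccess file, assuming hypothetical paths:

    # Permanently (301) redirect a moved post to its new URL
    Redirect 301 /old-post/ http://www.example.com/new-post/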

If you don’t use Google Webmaster Tools and don’t reduce the problems Google encounters when crawling your site, the likelihood is that your site will suffer in the SERPs – there isn’t much debate about that.

Labs

Google say that:

“Webmaster Tools Labs is a testing ground for experimental features that aren’t quite ready for primetime. They may change, break or disappear at any time.”

However, this doesn’t mean that these tools should be ignored; in fact, I think they are probably one of the most overlooked resources that Google provides webmasters with.

One of the current ‘Labs’ tools that I think is very useful is the ‘Site performance’ tool. Google may not have generated any information about your site; however, if you are one of the lucky ones to be analysed, this can prove a very interesting tool. In Google’s own words:

“This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users.”

As page load time becomes more and more important to users and search engines alike, this page is of crucial importance for many people.

Improvements

You might not expect it, but in the ‘Optimization’ section, under ‘HTML Improvements’, Google will actually suggest areas where you could improve your code to ensure that your content is the best it can be. Common errors Google suggests for correction include missing or duplicate title tags (in most cases an SEO no-no) and meta tag issues.
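As an illustration, a minimal sketch of a page head that keeps this sort of report happy; the title and description text are hypothetical, and each page should have its own:

    <head>
      <!-- One unique, descriptive title per page -->
      <title>Recruiting Robots for Combat | Example Blog</title>
      <!-- One unique meta description per page -->
      <meta name="description" content="A short, unique summary of what this page covers.">
    </head>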

Traffic

The tools in the ‘Traffic’ section are probably the ones I use the most. ‘Search Queries’ gives you a fantastic insight into where your site is appearing in search results in different locations across the world. If you pair Webmaster Tools with Analytics, this can become a lot more useful.

‘Links to your site’ and ‘Internal links’ let you see your most linked-to content, and the keywords used in the links pointing to it. Generally speaking, if you want to rank well for a keyword, you need to have some links (internal and/or external) using that keyword, as the sketch below illustrates.
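A hypothetical internal link whose anchor text carries the keyword you would like the target page to rank for:

    <!-- The anchor text ("robotic warfare") is the keyword signal -->
    <a href="/robotic-warfare/">robotic warfare</a>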

The great thing about Google Webmaster Tools is that it integrates with many other Google programs, improving your overall control and visibility of your site. AdSense, Analytics, YouTube and AdWords are just some of the other Google products that Webmaster Tools integrates with.

That is just a quick overview of what Webmaster Tools has to offer. If you own a website, I strongly recommend that you explore it further to help improve your site’s visibility in the search results, and to enable you to weather algorithm changes (like Penguin and Panda) that little bit better.

Do you use Webmaster Tools? What are your favourite features?