This week the news is full of Lenovo, a computer manufacturer that has been selling machines fitted with what some call malware, or at best adware. Magic in the machine indeed!
The mal/adware in question is made by a company called “Superfish.” The software is essentially a browser add-on that injects adverts into the websites you visit. Details here.
Besides taking up space on your computer, the add-on is also dangerous, because it undermines basic computer security protocols.
That’s because it tampers with the widely used system of website certificates that lets your computer tell a genuine site from an impostor, making it much harder to recognise, say, a fake bank website. This means you are more likely to hand over your personal data, let nasty things into your computer, and allow other people to monitor what you do.
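To see what is being undermined: a well-behaved HTTPS client checks the server's certificate against a set of trusted root authorities before sending anything sensitive. A minimal sketch in Python of the verification a browser normally performs (the function name is my own, for illustration):

```python
import socket
import ssl

def fetch_leaf_certificate(host: str, port: int = 443) -> dict:
    """Connect over TLS and return the server's certificate,
    verified against the system's trusted root authorities."""
    ctx = ssl.create_default_context()   # CERT_REQUIRED plus hostname checking
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()     # raises SSLError if the cert is bad
```

Superfish installed its own trusted root certificate on the machine, so checks like these would pass even for pages it had intercepted and rewritten, which defeats the whole point of the verification.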
No good, I hear you say, and all so that they can feed you adverts while you browse.
But this news raises another question: what else is in the computer? What else is it programmed to do? The simple answer is that I, and probably most of you, do not know. We have bought a machine that does the things we want it to do, but who knows what else?
Now, as I eat my breakfast, I like to read the ingredients on the side of the packet. It is good for language skills, as it is usually printed in several languages. But can I do the same with my computer? You don’t get much in the way of documentation with a $400 laptop; certainly not considering what is inside it.
So the computer company in question have disabled something at their end and declared the problem resolved. But if they tell you that they fixed the problem, are you going to believe them? After they did something that put your computer and everything saved on it at risk? Or should you wipe the hard drive, install a fresh operating system and start again?
So if you cannot trust wifi, or computer manufacturers, or Google, or Facebook, or Samsung to treat your data securely and correctly, who can you trust? And more to the point, why are we giving them our lives to play with?
I know what I like, and I know what I don’t like, but the problem is, so does everyone else. Who would have thought that just liking something on Facebook could be so important? Recent research seems to show that studying what you have liked can reveal more about your personality than you would imagine.
University researchers have just published a study (read it here) called “Computer-based personality judgments are more accurate than those made by humans”. They claim that what you ‘like’ on Facebook gives away your personality, to the point that a computer program can gauge your responses to questions better than your friends can.
Well, how can that be? Judging personality is a honed social skill, but their research, based on just over 86,000 volunteers and their friends’ responses, seems to show that computers can do it better using nothing but information about what you like on Facebook.
As the researchers say in their report, “Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy”.
Their findings show that with a sample of 100 likes, the computer can outperform your friends in predicting your answers to a standard personality test. Obviously, the more likes the computer has, the better it performs, which means that every like has its place, tells a story, guides a narrative, and shapes the computer’s definition of who you are.
So if you like certain types of things, your personality is likely to reflect this. If you like dancing and having a sun tan, you are probably extroverted; if you like Salvador Dali, you are probably open to experience and more adventurous in your lifestyle choices. You get the picture? This lets the machine better predict whether you will deviate from social norms or stay within them, experiment or not.
Well, if a computer can determine that I am (as we all know after the brain-electrocuting experiments) open to experience, then that could be used to market stuff to me, to guess how I might live my life in terms of personal choices (including health risks), and to put me into a little box for insurance or job-hunting purposes. Computers are better at predicting life outcomes than my friends are. This is serious!
Obviously computer power will massively increase in the future, and we will no doubt see the development of automated personality assessment tools. How they will be used is anybody’s and everybody’s guess, and all they need is for us to keep giving all of this data away to Facebook for free.
Anyway, if you are interested: I don’t like Dali, or Iggy Pop, or the KLF Arts Foundation; I only listen to Beethoven; I don’t use Tor; and I drive a Skoda. I must have the perfect personality for any highly paid and respectable job. Find me on LinkedIn – I don’t use Facebook.
Having a fast website is very important. As I mentioned in my Black Friday post, nobody likes a slow website, and if your site takes more than a few seconds to load, the chances are you are losing visitors because of that lag.
This article contains a few easy-to-implement tips which you can use to help reduce the load time of your website.
Keep Your Code Tidy
Unless something goes wrong, or someone chooses to view your source code, most people who visit your website will never see any of the code that is stuffed away behind the scenes. That doesn’t mean it isn’t important, however. After all, the code at the back end is what creates the website at the front end.
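Tidy source doesn’t have to mean bulky delivery, either: many sites minify their CSS and JavaScript before serving it, stripping comments and whitespace the browser doesn’t need. A deliberately simplistic sketch of the idea in Python (real minifiers handle many more edge cases, so treat this as an illustration only):

```python
import re

def minify_css(css: str) -> str:
    """Crude CSS minifier: drop comments and squeeze whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* comments */
    css = re.sub(r"\s+", " ", css)                    # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # no spaces around punctuation
    return css.strip()

minify_css("body {\n  color: red;\n}")   # -> "body{color:red;}"
```

In practice you would run a proper minifier as part of your build or let a plugin do it, but the saving it captures is exactly this: fewer bytes for the same rules.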
Reduce Files Fetched
It is good practice to fetch as few files as possible when loading your website. Many sites use separate style sheets for different parts of the site – for example, one for text, one for images and another for general layout. Every file that your page calls upon increases its overall load time, and fetching one big CSS document will usually be faster than fetching three smaller ones.
Also consider how many external resources you load. Adding a Facebook like button, for example, requires the user’s browser to visit Facebook’s servers to pull the code across whilst loading your page. A plain link, or a delayed load on things like social sharing buttons, can give you a big speed boost.
Optimise Your Images
Images make your content more exciting, but if you don’t optimise them, they can really slow down your page load time. There are various ways to reduce the file size of your images without compromising on quality.
When you take a picture, it is often much bigger than you really need it to be. By resizing photos before you upload them, you can massively reduce the file size. If you leave the file big but resize it using HTML or CSS – by setting a smaller height and width – the end user still has to download the big image, and their browser then has to squash it down to fit your new dimensions.
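The arithmetic behind this is stark. Before any compression, an image costs roughly width × height × 3 bytes (one byte each for red, green and blue), so halving both dimensions quarters the data. A quick sketch (the dimensions are just typical examples):

```python
def raw_size_mb(width: int, height: int, bytes_per_pixel: int = 3) -> float:
    """Approximate uncompressed size of an RGB image in megabytes."""
    return width * height * bytes_per_pixel / 1_000_000

camera_photo = raw_size_mb(4000, 3000)   # 36.0 MB of raw pixels
web_version = raw_size_mb(800, 600)      # 1.44 MB of raw pixels
```

Compression shrinks both figures considerably, but the ratio between them survives: a photo resized before upload starts from a fraction of the data.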
Choose The Right File Type
The most commonly used image formats are .jpg, .gif and .png, and different images lend themselves to different formats. Reducing the number of colours available to a GIF or a PNG-8 image will reduce the file size, whilst reducing the image quality will shrink a JPEG file.
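The effect of reducing colours is easy to demonstrate with a general-purpose compressor – PNG uses the same DEFLATE algorithm as Python’s zlib module – since data with only a few distinct values compresses far better than data with many. A small illustration:

```python
import zlib

# 16 KB of pixel data using 256 distinct "colours"...
many_colours = bytes(range(256)) * 64

# ...versus 16 KB using only 2 distinct "colours"
few_colours = bytes([0, 255]) * 8192

print(len(zlib.compress(many_colours)))  # compresses, but stays larger
print(len(zlib.compress(few_colours)))   # compresses to far fewer bytes
```

Real images are less regular than this, but the principle is the same: fewer distinct colours give the compressor more repetition to exploit.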
Use An Image Compressor
Image compressors are another way to shrink images. Technology Bloggers currently uses a WordPress plugin called WP Smush.it, which uses the Yahoo! Smush.it tool to reduce image files.
Here is a picture that I took several years ago whilst in South Africa.
The full-sized image was 3.44 megabytes. Resizing it in Photoshop got that down to 1.61 megabytes. Because the image was quite big and contained lots of colours, GIF and PNG-8 made it look too pixelated, so the choice was between PNG-24 and JPEG. PNG-24 squashed the image down to 831 kilobytes, whilst JPEG compressed it to a tidy 450 kilobytes. That is a lot smaller than the original file, but it would still take a long time to load on a slow connection, so by taking a very small hit on image quality I got the file size down to 164 kilobytes. Finally, running the image through Smush.it took it down to 157 kilobytes. Some images see a big reduction from Smush.it; most (like this one) see a smaller reduction of just a few percent.
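Working through the numbers above, each step’s saving is easy to quantify. A quick sketch using the sizes from this example (treating 1 MB as 1,000 KB for simplicity):

```python
steps = {
    "original": 3.44 * 1000,            # sizes in kilobytes
    "resized in Photoshop": 1.61 * 1000,
    "saved as JPEG": 450,
    "quality reduced": 164,
    "after Smush.it": 157,
}

original = steps["original"]
for name, size in steps.items():
    saving = 100 * (1 - size / original)
    print(f"{name}: {size:.0f} KB ({saving:.0f}% smaller than the original)")
```

The final file is roughly 95% smaller than the one that came out of the camera, and almost all of that saving came from resizing and the JPEG quality setting rather than the compressor plugin.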
Use A Content Delivery Network
Content delivery networks, or CDNs, can help to improve a website’s speed and make it more reliable. Put very simply, without a CDN anyone who tries to access your site is directed to your hosting provider, who then serves your website and all its files from their server. This means that if your host goes down because of a fault or a sudden surge in traffic, you lose your site; and if your host is far from a user, it can take a long time for the two to communicate.
With a CDN, users can fetch your site faster, because it is served from multiple locations around the world. Additionally, many CDNs can cache a copy of your site, so if your host goes offline they can serve a static version to users until it comes back up.
For example, Technology Bloggers is currently hosted in Gloucester in the UK. If you access us from Australia, CloudFlare (the CDN we use) will send you to its closest data centre – which could be in Australia – and that data centre will deliver the files you need to see our site. It is faster because neither your requests nor the data sent back to you has to travel all the way to the UK.
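Physics alone makes the case for a nearby data centre: light in optical fibre travels at roughly 200,000 km/s, so distance puts a hard floor under round-trip time before the server does any work at all. A rough sketch (the distances are approximate, and real routes are longer than straight lines):

```python
FIBRE_SPEED_KM_PER_MS = 200  # light in fibre travels ~200,000 km/s

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time imposed by distance alone."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

print(min_round_trip_ms(15000))  # Australia to the UK: ~150 ms minimum
print(min_round_trip_ms(50))     # a nearby CDN node: well under 1 ms
```

A page that makes dozens of requests pays that round trip repeatedly, which is why moving the files closer to the user helps so much.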
Control Your Cache
If you use a CMS, the chances are your content is dynamically delivered upon request: when a user requests a page, your site creates it and then sends it back. By using some form of caching, you can create a static copy of your site, so it doesn’t have to build the content each time a user visits. There are various plugins that can help with this; Technology Bloggers uses CloudFlare’s caching system, as I have found it works better than the WordPress plugins I have tried. Using too many plugins also slows your site down, which is another reason I let the CDN manage it.
A user’s browser also saves files for later, in case they visit your site again. You can control which files are saved, and for how long, by adding caching headers to your .htaccess file.
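On an Apache server, those caching headers might look something like the fragment below. This is a sketch only – the file types and lifetimes are illustrative, and it assumes the mod_expires module is enabled on your host:

```apache
# Tell browsers how long they may keep each file type before re-fetching it
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Longer lifetimes mean fewer repeat downloads, but also that visitors may see stale files after you change something, so pick durations to match how often each type of file actually changes.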
How To Test If Your Site Is Faster
Refreshing your page and timing it with a stopwatch is one way to gauge how quickly your site loads. It probably isn’t the best way, though!
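If you do want a quick, repeatable measurement of your own, a small script beats the stopwatch. A sketch in Python (the URL is a placeholder, and note this times only the HTML document itself, not the images, styles and scripts a real page load also fetches):

```python
import time
import urllib.request

def timed(action) -> float:
    """Return the seconds a zero-argument callable takes to run."""
    start = time.perf_counter()
    action()
    return time.perf_counter() - start

def fetch(url: str) -> None:
    """Download a URL once, reading the full response body."""
    with urllib.request.urlopen(url) as response:
        response.read()

# e.g. print(f"{timed(lambda: fetch('https://example.com/')):.2f} s")
```

Run it a few times and take the median, since any single request can be skewed by network noise. For the full picture, though, the tools below do a much more thorough job.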
There are various websites which rate your site’s speed performance. I tend to measure Technology Bloggers using four main speed-analysis sites.
Google are keen for the web to be faster and offer a very useful tool which scores your site for both mobile and desktop load time. It also suggests what it believes is slowing your site down, and shows an image of your fully loaded site – all the content above the fold. Unfortunately, the test doesn’t actually state how fast your site loads, just how well optimised it is.
Probably the most thorough site I use is WebPageTest, which presents loads of different information, including first-view load time, repeat-view load time (which should be quicker if you have user-side caching), a waterfall view of all the files loading, a visual representation of how your site loads, suggestions as to where performance issues lie, and plenty more.
Pingdom is another useful tool. It gives a handy speed score and tells you how fast your site is compared to other sites it has tested. It also saves your results, so you can view historic test speeds on a graph and see how your site’s speed has changed.
GTmetrix is another useful site. It too gives lots of detail and helps you see what is slowing your site down. GTmetrix also lets you compare one site to another – I’m not sure how useful that is, but it is interesting to see how your competitors’ sites compare to your own.
Remember to enjoy your new, faster site! Hopefully your visitors will too. 🙂