# How many human brains would it take to store the internet?

Last September I wrote an article exploring how much data the human brain could hold if it were a hard drive.

I concluded that the average human brain can store approximately 300 exabytes of data.

I really enjoyed writing the article, and although it might not be 100% accurate (okay, maybe not even 50% accurate), I learnt a lot and found it a fascinating topic to research.

Now that I have an estimate of how much data the human brain can potentially hold, I want to know how many people's heads would be needed to store the internet.

It sounds like a simple question, but it’s almost like asking how big is the Universe!

### Eric Schmidt

In 2005, Google's then-CEO Eric Schmidt famously said of the size of the internet:

“A study that was done last year indicated roughly five million terabytes. How much is indexable, searchable today? Current estimate: about 170 terabytes.”

So in 2004, the internet was estimated to be 5 exabytes (or roughly 5,000,000,000,000,000,000 bytes).
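To spell that conversion out, here is a quick Python sketch (using decimal units, where a terabyte is 10^12 bytes):

```python
terabytes = 5_000_000             # Schmidt's 2004 estimate of the web, in terabytes
bytes_total = terabytes * 10**12  # decimal terabytes to bytes

exabytes = bytes_total / 10**18   # 1 exabyte = 10^18 bytes
print(exabytes)                   # → 5.0
```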

### The Journal Science

In early 2011, the journal Science calculated that the amount of data in the world in 2007 was equivalent to around 300 exabytes. That's a lot of data, and most of it would have been stored in such a way that it was accessible via the internet, whether publicly accessible or not.

So in 2007, the memory capacity of just one average person could have stored all the virtual data in the world. Technology has some catching up to do. Mother Nature is walking all over it!

### The Impossible Question

In 2013, the size of the internet is unknown. Without mass global collaboration, I don't think we will ever know how big it is. The problem is defining what is and isn't the internet. Is a business's intranet that is accessible from external locations (making it an extranet) part of the internet? Arguably, yes.

A map of the known and indexed internet, developed by Ruslan Enikeev using Alexa rank

I could try to work out how many sites there are, and then multiply this by the average site size. However, what's the average size of a website? YouTube is petabytes in size, whilst my website is just kilobytes. How do you average that out?

See the red circle? That is pointing at Technology Bloggers! Yes we are on the internet map.

The internet is now too big to quantify, so I can't determine its size. My best chance is a rough estimate.

### How Big Is The Internet?

What is the size of the internet in 2013? Well, if in 2004 Google had indexed around 170 terabytes of an estimated 5-million-terabyte net, then it had indexed around 0.0034% of the web at that time.

On Google's How Search Works feature, the company boasts that its index is well over 100,000,000 gigabytes. That's 100,000 terabytes, or 100 petabytes. The web, however, has exploded since 2004, and the deep, unindexed web is thought to dwarf the part that search engines can see. So suppose (and this is very much a guess) that Google's index now covers only around 0.000001% of the web. Then 0.000001% of the web is 100 petabytes.

100 petabytes times 1,000,000 is equal to 100 zettabytes, meaning 1% of the net is equal to around 100 zettabytes. Multiply 100 zettabytes by 100 and you get 10 yottabytes, which is (by my calculations) equivalent to the size of the web.
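For anyone who wants to check the arithmetic, here is the extrapolation in Python (the 0.000001% indexed figure is, remember, just my guess):

```python
google_index_bytes = 100 * 10**15      # Google's index: 100 petabytes
indexed_fraction = 0.000001 / 100      # assumed share of the web indexed (0.000001%)

web_size_bytes = google_index_bytes / indexed_fraction
print(round(web_size_bytes / 10**24))  # size of the web in yottabytes → 10
```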

### How Many People Would It Take To Memorise The Internet?

If the web is equivalent to 10 yottabytes (or 10,000,000,000,000,000,000,000,000 bytes) and the memory capacity of a person is 0.0003 yottabytes (0.3 zettabytes), then currently, in 2013, it would take around 33,333 people to store the internet in their heads.
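The same sum in Python, using the 0.3-zettabyte-per-brain figure from my earlier article:

```python
web_bytes = 10 * 10**24      # my estimate of the web: 10 yottabytes
brain_bytes = 0.3 * 10**21   # 0.3 zettabytes (300 exabytes) per person

people_needed = web_bytes / brain_bytes
print(round(people_needed))  # → 33333
```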

### A Human Internet

The population of Earth is currently 7.09 billion. So if there were a human internet, whereby all the people on Earth were connected, how much data could we collectively hold?

The calculation: 0.0003 yottabytes x 7,090,000,000 = 2,127,000 yottabytes.

A yottabyte is currently the biggest officially recognised unit of data; the next step up (which isn't yet officially recognised) is the brontobyte. So if mankind were to max out its memory, we could store 2,127 brontobytes of data.

That means the internet would take up a tiny 0.00047% of humanity's memory capacity.
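Putting the whole "human internet" calculation into one Python sketch:

```python
brain_yb = 0.0003            # memory of one person, in yottabytes
population = 7.09 * 10**9    # Earth's population in 2013

total_yb = brain_yb * population
print(round(total_yb))       # → 2127000 yottabytes, i.e. 2,127 brontobytes

web_yb = 10                  # my estimate of the internet's size, in yottabytes
share = web_yb / total_yb * 100
print(round(share, 5))       # → 0.00047 (per cent of humanity's capacity)
```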

The conclusion of my post on how much data the human brain can hold was that we won’t ever be able to technically match the amazing feats that nature has achieved. Have I changed my mind? Not really, no.

Christopher Roberts has written 222 articles for Technology Bloggers. Founder of Technology Bloggers and admin team member, I am a big fan of blogging, but a big hater of spam! If you check my archive, you will see that I write about most of the topics we cover, although I specialise in technology. I like to do my bit for charity, and want to improve the world. Find out more about me on my about.me profile, or check out my online home :-)
This entry was posted in Computers, Internet and Science.

### 6 Responses to How many human brains would it take to store the internet?

1. Michael from programming says:

This is a great post that was entertaining to read, but I must say that everything seems a bit like fantasy, since it's not certain how big the internet actually is, or what actually qualifies as being the internet.
It would be amazing though for us to be able to store data on something that was close to our brain capacity!

• Christopher Roberts says:

Hi Michael,

The human brain is pretty amazing and there are some things about it I think we will probably never know. Organic storage would be an interesting idea, though it could have major ethical implications.

I have no doubt that one day we will be able to store 0.3 zettabytes (or the memory capacity of a human brain) on a single physical hard disk, but it might be a few years away yet!

Thanks for the comment, and welcome to the community!

2. Neil from SEO Services Sydney says:

33,000 people still doesn't seem like a lot. It's amazing to think that in the mid-2000s you could fit the entire internet in one person's brain. Organic storage sounds a bit sci-fi, however, who knows, one day it might be a possibility.

• Christopher Roberts says:

If we accept Moore’s law, then very soon, data storage of that size could be possible, even without organic storage…

I love topics like this Neil, although highly subjective, they are great fun to research and discuss!

• Neil from SEO Services Sydney says:

Hi Christopher,
Moore's law still seems to be holding for storage, however it seems to have slowed quite significantly in relation to processing power.

• Christopher Roberts says:

You think?

In 1995, around 5 million transistors could exist in the same area that only 500,000 could in 1985. Wind forward to 2005 and that figure jumps to over a billion!

Most smartphones today have more processing power than top of the range desktops 10 years ago.

Things are moving, but sometimes it is hard to see…