The size of the Internet – and the human brain

How many human brains would it take to store the Internet?

Last September I asked: if the human brain were a hard drive, how much data could it hold?

The human hard drive: the brain

I concluded that approximately 300 exabytes (or 300 million terabytes) of data can be stored in the memory of the average person. Interesting stuff, right?

Now that I know how much computer data the human brain can potentially hold, I want to know how many people's brains would be needed to store the Internet.

To do this I need to know how big the Internet is. That can’t be too hard to find out, right?

It sounds like a simple question, but it's almost like asking how big the Universe is!

Eric Schmidt

In 2005, Google's then-CEO Eric Schmidt famously said of the size of the Internet:

“A study that was done last year indicated roughly five million terabytes. How much is indexable, searchable today? Current estimate: about 170 terabytes.”

So in 2004, the Internet was estimated to be 5 exabytes (or 5,000,000,000,000,000,000 bytes).

The Journal Science

In early 2011, the journal Science calculated that the amount of data in the world in 2007 was equivalent to around 300 exabytes. That’s a lot of data, and most would have been stored in such a way that it was accessible via the Internet – whether publicly accessible or not.

So in 2007, the average memory capacity of just one person could have stored all the virtual data in the world. Technology has some catching up to do. Mother Nature is walking all over it!

The Impossible Question

In 2013, the size of the Internet is unknown. Without mass global collaboration, I don't think we will ever know how big it is. The problem is defining what is the Internet and what isn't. Is a business's intranet that is accessible from external locations (so really an extranet) part of the Internet? Arguably yes, it is.

A graph of the internet

A map of the known and indexed Internet, developed by Ruslan Enikeev using Alexa rank

I could try to work out how many sites there are, and then multiply this by the average site size. However, what's the average size of a website? YouTube is petabytes in size, whilst my personal website is just kilobytes. How do you average that out?

Part of the graph of the internet

See the red circle? That is pointing at Technology Bloggers! Yes we are on the Internet map.

The Internet is now too big to quantify, so I can't determine its size. My best chance is a rough estimate.

How Big Is The Internet?

What is the size of the Internet in 2013? Or to put it another way, how many bytes is the Internet? Well, if in 2004 Google had indexed around 170 terabytes of an estimated 5 million terabyte net, then it had indexed roughly 0.0034% of the web at that time.

On Google's How Search Works feature, the company boasts that its index is well over 100,000,000 gigabytes. That's 100,000 terabytes, or 100 petabytes. Google's index has grown almost six-hundredfold since 2004, but the web itself has grown far, far faster, so let's assume – and this is a guess, nothing more – that those 100 petabytes now represent only around 0.000001% of the web, with the deep web, private networks and other unindexed content making up the rest.

100 petabytes times 1,000,000 is 100 zettabytes, meaning 1% of the net would be around 100 zettabytes. Multiply 100 zettabytes by 100 and you get 10 yottabytes, which is (by my calculations) the size of the web.

So the Internet is 10 yottabytes! Or 10,000,000,000,000 (ten trillion) terabytes.
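The chain of reasoning above can be sketched in a few lines of Python. The 0.000001% indexed fraction is the post's guess, not a measured figure:

```python
# Back-of-envelope: if Google's ~100-petabyte index covers only
# 0.000001% of the web (an assumption, not a measurement),
# how big is the whole Internet?
PB = 10 ** 15  # petabyte, decimal units
YB = 10 ** 24  # yottabyte

google_index = 100 * PB

# Dividing by 0.000001% (a fraction of 1e-8) is the same as
# multiplying by 100,000,000; integer maths keeps it exact.
internet_size = google_index * 100_000_000

print(internet_size // YB)  # → 10 (yottabytes)
```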

How Many People Would It Take To Memorise The Internet?

If the web is equivalent to 10 yottabytes (or 10,000,000,000,000,000,000,000,000 bytes) and the memory capacity of a person is 0.0003 yottabytes (0.3 zettabytes), then currently, in 2013, it would take around 33,333 people to store the Internet – in their heads.
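As a quick sketch of that division, using the 300-exabytes-per-person figure from the earlier post:

```python
# How many ~0.3-zettabyte human memories hold a 10-yottabyte Internet?
ZB = 10 ** 21  # zettabyte
YB = 10 ** 24  # yottabyte

internet = 10 * YB
per_person = 3 * ZB // 10  # 0.3 zettabytes (300 exabytes), as an integer

people = internet / per_person
print(round(people))  # → 33333
```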

A Human Internet

The population of Earth is currently 7.09 billion. So if there were a human Internet, whereby every person on Earth was connected, how much data could we all hold?

The calculation: 0.0003 yottabytes x 7,090,000,000 = 2,127,000 yottabytes.

A yottabyte is currently the biggest officially recognised unit of data; the next step up (which isn't yet recognised) is the brontobyte. So if mankind were to max out its memory, we could store 2,127 brontobytes of data.

I estimate that the Internet would take up a tiny 0.00047% of humanity's memory capacity.
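A sketch of the human-Internet sums above:

```python
# Humanity's combined memory capacity, and the share of it
# a 10-yottabyte Internet would occupy.
EB = 10 ** 18  # exabyte
YB = 10 ** 24  # yottabyte

per_person = 300 * EB        # ~300 exabytes of memory each
population = 7_090_000_000   # world population, 2013

total_capacity = per_person * population
print(total_capacity // YB)  # → 2127000 (yottabytes)

internet = 10 * YB
print(f"{internet / total_capacity:.5%}")  # → 0.00047%
```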

The conclusion of my post on how much data the human brain can hold was that we won’t ever be able to technically match the amazing feats that nature has achieved. Have I changed my mind? Not really, no.

10 thoughts on “The size of the Internet – and the human brain”

  1. This is a great post that was entertaining to read, but I must say it all seems a bit like fantasy, since it's not certain how big the Internet actually is, or what actually qualifies as being the Internet.
    It would be amazing though for us to be able to store data on something that was close to our brain capacity!

    • Christopher Roberts

      Hi Michael,

      The human brain is pretty amazing and there are some things about it I think we will probably never know. Organic storage would be an interesting idea, though it could have major ethical implications.

      I have no doubt that one day we will be able to store 0.3 zettabytes (or the memory capacity of a human brain) on a single physical hard disk, but it might be a few years away yet!

      Thanks for the comment, welcome to the community 🙂
      Christopher – Admin Team

  2. 33,000 people still doesn’t seem like a lot. It’s amazing to think that in the mid-2000s you could fit the entire capacity of the Internet in one person’s brain. Organic storage sounds a bit sci-fi, however who knows, one day it might be a possibility.

    • Christopher Roberts

      If we accept Moore’s law, then very soon, data storage of that size could be possible, even without organic storage…

      I love topics like this Neil, although highly subjective, they are great fun to research and discuss!

        • Christopher Roberts

          You think?

          In 1995, around 5 million transistors could exist in the same area that only 500,000 could in 1985. Wind forward to 2005 and that figure jumps to over a billion!

          Most smartphones today have more processing power than top of the range desktops 10 years ago.

          Things are moving, but sometimes it is hard to see…

  3. Pingback: The Deep Web
  4. I would argue a bit that even if people’s brains are full, they are mostly storing the same information that other people have already stored. 1,000,000,000 people have seen the Eiffel Tower, at least in pictures, and if we take the 100 or 1,000 most different memories of it, we get the full picture.
    My point is that the size would have meaning after compression (at least after removing duplicates), and it would probably go below 1,000 yottabytes. Now if we remove the “data” that has no value at all to any of the people now or in the future, it would shrink to somewhere near 1 yottabyte. While most of it would perhaps be audio-visual, internal memories of complex feelings and memories of inputs from other senses.

  5. “I concluded that approximately 300 exabytes (or 300 million terabytes) of data can be stored in the memory of the average person. Interesting stuff right?”

    In my opinion 300 exabytes is a gross overestimation of the capacity of the human brain.

    Why? Even if we assume that the main sensory input when we are awake (through vision and hearing) is stored permanently in our brain – 12 hours per day for 70 years – it would hardly be 10 petabytes. Of course, daily experience shows that such a thing as lossless storage of information in our brain can’t be farther from the truth.
    Furthermore, it has been shown that we do not actually store whole images but only the most important parts of them. All in all, I would be surprised if coherent and ready-to-use knowledge of this nature amounted to more than some tens of terabytes.

    Plus, when we talk about info, we usually mean useful info: facts, mathematical formulas, explanations of structures, abstract concepts. And in this regard human capacity is pitifully poor, because this info requires effort to be absorbed: dedication, mental and physical stamina.

    Have you ever tried to learn a technical or scientific book of 2,000 pages, or worth about 200 megabytes? Not memorise it, but comprehend its info so you can use it in practice with accuracy, recalling more than 95% of its contents. I don’t know many people that would be able to do it sooner than a month, and a good number of decent learners would require months. Even if you are a superman and able to achieve the aforementioned feat in 15 days (impossible imo), you would need a year to learn 4.8 gigabytes (200 MB × 2 × 12 months), and wouldn’t be able to learn more than about 350 gigabytes in 70 years (≈5 GB × 70). Now, the above makes the unrealistic assumption that you would study non-stop for 70 years and only stop to eat a bit, go to the bathroom and sleep.

    So you see that our actual academic potential is in the range of gigabytes. Don’t underestimate this number because you are used to terabytes or petabytes. 1 gigabyte of structured knowledge is huge and can perform miracles. I am just trying to point out that everyday experience shows us our limits, and that we should be more humble.

    Yes, our brain is a big miracle, but it is no god; it has limits. And these limits may seem small, but they are big enough. Hey, we have reached this point with the known “limited” capacities of our brain. This means something.

  6. So the Internet may well be big enough to be self-aware. The question is, if it were, would we know? If I were a self-aware Internet, I wouldn’t let on; would you?
