A Zettabyte Is Big. Really Big.


A colleague and I sometimes laugh about weird headlines. My favorites from the past week or so include “Fearsome Lawn Ornament Shot Dead by Cops,” “Proven: Sharks Love AC/DC” and “Roads to Close for Zombie Mayhem in Atlanta Suburb.”

But this headline got me not for its humor, but for being totally and completely unfathomable: “Global Internet Traffic Is Expected to Quadruple by the Year 2015.”

Read the story and you find out that Internet data flow is expected to reach nearly 1 zettabyte per year in just four years. For the record, my spell check hates the word zettabyte, and I’m not fond of it either. But if you want to at least attempt to comprehend how much information that is, it’s the equivalent of all the digital data in existence in 2010. So, to reiterate: just a handful of years from now, we’re slated to create in a single year as much information as decades have produced to date.

I think it was Buckminster Fuller who said that if a person wants to radically change the world he should invent a device that the world can’t live without. Well, it seems our 21st-century world can’t live without the Internet. And I wonder if we’ll ever fully grasp how profoundly it’s changing us.

So I’ll leave you with this whopper of a (made-up) headline: “World Wastes $15.3 Trillion in Lost Productivity While Trying to Figure Out How Big a Zettabyte Is.” No. That’s not crazy enough. How about this real one: “Man With Dead Weasel Arrested for Assault.”

Who wrote this?

Meredith has had two careers: one as a writer/editor for both Focus on the Family and The Navigators, and one as an English teacher trekking far-flung corners of Europe, Africa and Asia. She now rejoins Focus, but with souvenirs—including new eyes with which to better view American culture.

Have something to say? Leave a comment.

Anonymous More than 1 year ago

Comment by  YetAnotherTeen:

And then you have to ask, "If human creativity is really as useless to God as people say, why did He give us creativity?"

Anonymous More than 1 year ago

Comment by  illuminati:

It really is just my opinion, based on personal beliefs, which are sinful and imperfect.

The sheer size of the zettabyte is astounding. Even when I begin to think of eternity and existing within it, my mind begins to reel with the sheer possibility. Atomic bombs, holographic displays, the internet...these are all inventions that I think humans could never have imagined. I am certain that if I have children, they will be dealing with technologies that even our most innovative and advanced scientists would laugh at and simply say, "impossible!"

However, the idea of technology playing into Heaven isn't me saying that human endeavors will be able to contribute to the Perfection which He has achieved. That would be blasphemous. However, I have no doubt that He will put us to work, and might it be possible that human souls with an engineering bent (I am certainly not one!) would find an infinite amount of things to do?

Anonymous More than 1 year ago

Comment by  Nick:

I have a problem with your reply, Illuminati; I believe that it is fundamentally self-defeating. (For those who don't know, the "Technological Singularity" is a technological threshold, predicted and heralded by Ray Kurzweil, at which humans become able to merge their consciousness with computers.) You label the Tower of Babel as a "prideful construction" and then you go on to tell how this "new" technology is actually the key to bringing us back to a sinless, pre-fall state. Can't you see that you have made the Technological Singularity into a new Tower of Babel? The only difference is the level of technology we are dealing with. The story is still the same. Humanity is using technology (whether it be merging human and computer consciousness or using baked bricks instead of stone) to try to reach God.

Anonymous More than 1 year ago

Comment by  Hithwenur:

I have a close friend and relation who is also a Singularitarian, only minus the Christ part of the equation which you claim. So while I disagree with you on the idea of God's likelihood of building a major theoretical human invention into His re-creation, I do find it intriguing how you are able to believe the one and the other at the same time.

However much I may doubt your particular hypothesis, though, you do raise a question which I imagine I'll be thinking about for a while now. How much of what humans have made with their God-given ingenuity will be in the re-creation/post-Second Coming world? How much of it will He even allow to still be necessary?

Anonymous More than 1 year ago

Comment by  illuminati:

I am one of the few with this belief, but I think the Technological Singularity will be an intrinsic part of Christ restoring the earth and humanity back to its pre-Fall state. I'm no scientist, but we've seen almost mechanical functions occur in our bodies on a cellular level, and I have trouble seeing why God, Himself the Creator and Builder of everything, would have an aversion to human industriousness. I'm not talking about the sort of prideful construction like at the Tower of Babel, but modern medicine, communication, and the way information is stored and shared; for example, individuals from disparate cultures can now form empathic relationships because of the internet, which can allow them to communicate without regard to distance or language (i.e. pictures).

Whenever we talk about Christ's return to earth, it's as if suddenly the mental picture in our heads goes back to "Bible Times": sandals, togas, palm branches, and an archaic scene resembling childish illustrations, and perhaps much further from what it was really like back then. I'm not saying Heaven will have iPods, and perhaps all of our machinations will be rendered pointless in the sheer Glory of His Presence, but our ultimate return to Christ will be a progression and mark a change that macroevolution could never imagine. Might technology be one of His tools?

Anonymous More than 1 year ago

Comment by  Julie B:

How about this one: Flying bear kills two Canadians in freak accident.


Anonymous More than 1 year ago

Comment by  AndStuff:

I love those crazy headlines.

Anonymous More than 1 year ago

Comment by  Hithwenur:

Joey--WOW. That was one very long and detailed reply. Not that I'm feeling like I have full comprehension on the first read.

Meredith--I can't find it anymore, but I'm remembering a headline I ran into recently... "Man mauled by wombat." And the opening paragraph read something very close to, "Mr. So-and-so, formerly of Queensland, stepped out of his trailer house this morning, found a wombat on his doormat, and proceeded to step on it. The wombat reacted rather badly."

Anonymous More than 1 year ago

Comment by  Joey:

Hey Meredith - and all the other viewers reading this. I'm Joey, and I'm a network administrator during the day. If I may be so bold, I'll do my best to give the concept of scale to a zettabyte.

Mathematically speaking, a zettabyte is 10^21 bytes, or 1,000,000,000,000,000,000,000 bytes. I'll wager that most of the people reading this are like me and wouldn't notice whether I'm a zero or two off, because it falls under the category of "unfathomably huge." So, let's try to make sense of this.

Let's take a technology that most people are familiar with: the DVD. DVDs come in a few flavors, so we'll skip to the largest - Dual Layer. Most movies come out on this format, and it has a capacity of 8.5 GBytes. If we were to get a stack of DVD movies and copy all of their data onto a one-zettabyte hard drive one dual-layer DVD at a time, it would take 117,647,058,823.5 (that's 117.6 BILLION) movies to do it. If you had a stupidly fast DVD drive that could copy one of those discs per minute (something that challenges the laws of physics, as the disc would likely fly apart; it takes about 10-15 minutes for my desktop to copy a full disc like that), it'd take 223,833.8 years to copy it all.
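For anyone who wants to check that back-of-envelope math, it fits in a few lines of Python (the one-disc-per-minute copy rate is, again, purely hypothetical):

```python
ZETTABYTE = 10**21                 # one zettabyte, in bytes
DVD_DL = 8.5 * 10**9               # dual-layer DVD capacity, in bytes

discs = ZETTABYTE / DVD_DL         # discs needed to hold one zettabyte
years = discs / (60 * 24 * 365)    # at one disc copied per minute

print(f"{discs:,.1f} discs")       # 117,647,058,823.5 discs
print(f"{years:,.1f} years")       # 223,833.8 years
```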

Now, here's the clincher that seems to have been misconstrued, Meredith. That's the amount of data TRANSFERRED, not GENERATED or STORED. For example, Windows XP Service Pack 3 is about 320MBytes. If we assume that that patch has been transmitted via Windows Update 50 million times (not at all an unrealistic number; it's likely a bit low in reality), then that one single service pack accounts for about 16 petabytes of that traffic, or roughly 0.0016% of the data being transferred. The same 320MBytes is being transmitted 50 million times; no one is sitting there writing the code by hand all that time. To give better perspective, consider that nearly 24% of all internet traffic is attributed to Netflix streaming. Sure, it's a LOT of data to transmit, but no Netflix subscriber is storing that data anywhere; that's the whole point of streaming. Conversely, Sylvester Stallone isn't acting out his part in "Rocky IV" each time someone watches the movie, and Wall-E isn't being rendered at a render farm on a per-user basis. We're not creating zettabytes of data every year (that estimate, if memory serves, is closer to the petabyte vicinity, a full six orders of magnitude less); we're retransmitting the same data over and over again to millions upon millions of people.
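The service-pack arithmetic, for the record (the 50-million download count is my own ballpark assumption, not an official figure):

```python
PATCH = 320 * 10**6        # Windows XP SP3, roughly 320 MBytes
DOWNLOADS = 50_000_000     # assumed number of transmissions
ZETTABYTE = 10**21

traffic = PATCH * DOWNLOADS            # total bytes moved by this one patch
print(traffic / 10**15, "petabytes")   # 16.0 petabytes
print(100 * traffic / ZETTABYTE, "%")  # 0.0016 % of a zettabyte
```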

Another part of this is simply the number of people accessing the data. There's a constant outcry that we're running out of IPv4 addresses, since we've "only" got four billion of them. China is plugging in a whole lot more, and more and more internet-connected phones are being used to watch cat videos on the go. Since the data is transmitted redundantly, it's simply being transmitted more times, to more people. Rebecca Black's notorious "Friday" video was viewed over 180 million times, by well over a hundred million people. The internet-connected population is growing, so distributing data to more people is inevitably going to bump up that count.

The data itself is also much more intensive - an all-day command line session with my Linux server takes about 1MByte - tops - of data to be transferred. Twenty to 25 years ago, when EVERYONE ran a command line of some variety, it would have been perfectly normal for an office worker to send perhaps double that over a local area network in a day. By contrast, our software-as-a-service vendor takes about 100-200MBytes of traffic in and out, per user, and even THAT is to an extent cached and optimized. Anyone want to go back to using WordPerfect 5.1? If you don't know what I'm talking about, the Wikipedia entry is pretty interesting, and I still have one of those F-key maps that everyone had on their keyboards at the time, since there was no mouse input. My point in bringing this up is that saying "we're transmitting more data" tells only half the story, since data is now being optimized for the people using the computers instead of for the machines themselves. When the data is machine-optimized (like my command line server session), it is much less taxing to the computer. When the computer makes things friendly for end users, it still has to translate everything back into a language it understands. This is a good thing, but it's still going to be more intensive for computers (including all those pesky intermediate computers, like your home router, which is actually a computer).

So yes, a one with twenty-one zeroes after it is a mind-bogglingly large number. But when viewed in the context of "it's a very, very large number amongst a swath of smaller but still very large numbers," it's much less surprising. 1 zettabyte of Word documents and Twitter posts... yes, a very large number. 1 zettabyte over the course of a year, divided by ~2 billion people, each pulling down 2 billion bytes a pop for every Netflix movie they watch and not storing it in any appreciable manner... well, it kinda kills the magic a bit for me.
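And the per-person math at the end, just to make it concrete (the two billion users and two gigabytes per streamed movie are round figures, not measurements):

```python
ZETTABYTE = 10**21
USERS = 2 * 10**9             # rough internet-connected population
MOVIE = 2 * 10**9             # ~2 GBytes per streamed movie

per_user = ZETTABYTE / USERS            # bytes per person per year
print(per_user / 10**9, "GB/person")    # 500.0 GB per person per year
print(per_user / MOVIE, "movies")       # 250.0 movie-sized streams each
```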