
mortlocli

macrumors 6502a
Original poster
Feb 23, 2020
686
635
"How much storage do I need to download the Internet?"

Read this today:
'This new standard avoids the limitations of the BIOS. The UEFI firmware can boot from drives of 2.2 TB or larger—in fact, the theoretical limit is 9.4 zettabytes. That’s roughly three times the estimated size of all the data on the Internet.'

Source:

Ans:
Roughly 3.1 zettabytes (theoretically).
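As a sanity check, the quoted figures can be reproduced with a quick back-of-the-envelope calculation. This is a hedged sketch assuming GPT's 64-bit logical block addresses, the common 512-byte sector size, and decimal zettabytes (10^21 bytes):

```python
# GPT partition tables use 64-bit logical block addresses (LBAs), so with
# the common 512-byte sector the addressable limit is 2^64 * 512 bytes.
SECTOR_BYTES = 512
max_bytes = 2**64 * SECTOR_BYTES    # 9,444,732,965,739,290,427,392 bytes
ZETTABYTE = 10**21                  # decimal zettabyte

max_zb = max_bytes / ZETTABYTE      # ~9.44 ZB -- the article's "9.4"
internet_zb = max_zb / 3            # ~3.15 ZB -- hence "roughly 3.1"
print(round(max_zb, 2), round(internet_zb, 2))
```

So the "three times the internet" claim pegs the internet at a bit over 3.1 ZB, consistent with the answer above.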
 

Amethyst1

macrumors G3
Oct 28, 2015
9,359
11,489

mortlocli

macrumors 6502a
Original poster
Feb 23, 2020
686
635
NetBSD is the OS that runs on everything though. ;)



As of when? :)

I mean, loads and loads of stuff is uploaded every picosecond, so the space required keeps growing.
One should always consider the context at the time of writing, don't you think? Time changes everything... well, almost.
 
Ans:
Roughly 3.1 zettabytes (theoretically).

Trick question!

The real question on my mind: how much storage space to cover the entire history of the internet’s contents, including revisions to sites; fly-by-night storage locker links like Mega; DMCA-takedown(ed) content; and everything file-shared with a publicly accessible link?

It’s unanswerable, which is a shame. It didn’t have to be this way.
 

mortlocli

macrumors 6502a
Original poster
Feb 23, 2020
686
635
Trick question!

The real question on my mind: how much storage space to cover the entire history of the internet’s contents, including revisions to sites; fly-by-night storage locker links like Mega; DMCA-takedown(ed) content; and everything file-shared with a publicly accessible link?

It’s unanswerable, which is a shame. It didn’t have to be this way.
That's just simple mathematics, surely?
Nothing basic algebra wouldn't solve... now we have a starting point:
X = 3.1

An online course may be required... Udemy perhaps?
 
'Tis a good question, mag... when one thinks of the claim that the Internet is the closest thing to a human brain.

I guess it’s a matter of looking at the internet as the nearest technological creation to a brain, either in the context of a snapshot of that “brain” or in the added dimension across time. I’m referring more to all the slices of time, combined.

Put another way: the current human brain might recognize the earlier human brain iteration of itself, were they to meet, but I have doubt that the current “brain” that is the internet today would be able to recognize “itself” at any temporal slice in the past. Pick a year… let’s go with 2001. Sure, archive-dot-org has tried to capture static, faithful snapshots (even if many earlier sites have missing or broken components within the snapshots), but so much more is just plain lost to time. Internet 2023 wouldn’t know Internet 2001, even when it has this photo album of sorts to look at.

Now answer the other age-old question: can I download the internet with only 8 GB of RAM, or should I upgrade to 16 GB?

Start with 8 now, upgrade later as needed. :D
 

Amethyst1

macrumors G3
Oct 28, 2015
9,359
11,489
BRB gonna try to install NetBSD on my 1974 touring bicycle… HEY, YOU NEVER KNOW…
Who’s the first to run an alternative OS on an SSD controller? [Some of] those are ARM.

One should always consider the context at the time of writing of anything written ..don't you think?
Sure… and the site doesn’t provide that info. It says the article was updated three months ago. When was it originally written, and what was updated? Dunno.
 

antiprotest

macrumors 601
Apr 19, 2010
4,022
14,103
So if the Internet has about 3 zettabytes, I'd guess about 2 zettabytes of that are adult content, or about 1 zettabyte if you don't count my own redundant collection stored in the cloud. That leaves 1 zettabyte of regular content on the internet.
 

mortlocli

macrumors 6502a
Original poster
Feb 23, 2020
686
635
Well, in conclusion, taking into account the RAM factor raised by Lee, for a little future-proofing, we could say:
4 zettabytes of RAM!!!

Hey - 'tis only a matter of time...
 

chown33

Moderator
Staff member
Aug 9, 2009
10,760
8,454
A sea of green
As soon as you answer the capacity question, you trip over the bandwidth question.

Given that the internet is always growing, what bandwidth would be needed to exceed its growth rate?

Or put another way, if the internet size grows X per day, and you don't have bandwidth that exceeds X/day, you'll never be able to copy it, because it will always be growing faster than you're copying.

Then you run into the question of how fast the RAM has to be, and how many parallel channels you need in order to fill it.

And how many terawatts of power would be needed to power the copy?
I envision huge tracts of land covered in flux capacitors.
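The bandwidth argument above can be sketched in a few lines: the copy only terminates if your bandwidth exceeds the growth rate, and the finish time is the current size divided by the surplus. The figures below are invented for illustration, not real measurements of the internet's growth:

```python
# chown33's argument: if the internet grows X per day and you can't copy
# faster than X per day, the backlog never shrinks and the copy never ends.
def days_to_copy(initial_zb, growth_zb_per_day, bandwidth_zb_per_day):
    """Days until the copy catches up, or None if it never finishes."""
    if bandwidth_zb_per_day <= growth_zb_per_day:
        return None  # backlog grows at least as fast as you copy
    return initial_zb / (bandwidth_zb_per_day - growth_zb_per_day)

print(days_to_copy(3.1, 0.001, 0.002))  # finishes in ~3100 days
print(days_to_copy(3.1, 0.002, 0.001))  # None: never catches up
```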
 

Analog Kid

macrumors G3
Mar 4, 2003
8,935
11,536
As soon as you answer the capacity question, you trip over the bandwidth question.

Given that the internet is always growing, what bandwidth would be needed to exceed its growth rate?

Or put another way, if the internet size grows X per day, and you don't have bandwidth that exceeds X/day, you'll never be able to copy it, because it will always be growing faster than you're copying.

Then you run into the question of how fast the RAM has to be, and how many parallel channels you need in order to fill it.

And how many terawatts of power would be needed to power the copy?
I envision huge tracts of land covered in flux capacitors.

Bandwidth is the only thing keeping us from crossing into the singularity, I think.

Given that Bing and ChatGPT scrape the internet for content and most of the new content seems to be transcripts of sessions with Bing (well, Sydney 😈) and ChatGPT, if we weren't bandwidth constrained the universe would turn in on itself.
 

appltech

macrumors 6502a
Apr 23, 2020
688
166
Just buy those nuggets with an infinite amount of storage and copy the Internet an infinite number of times.

As entropy rises, you buy two, three, etc. disks, so inevitably you'd reach the moment when you start copying/saving your own data, which is endlessly copied onto your nuggets.

In short: being part of the entropy, you cannot measure or embrace the data correctly in full, unless you're not part of it.
 

Unami

macrumors 65816
Jul 27, 2010
1,358
1,564
Austria
Trick question!

The real question on my mind: how much storage space to cover the entire history of the internet’s contents, including revisions to sites; fly-by-night storage locker links like Mega; DMCA-takedown(ed) content; and everything file-shared with a publicly accessible link?

It’s unanswerable, which is a shame. It didn’t have to be this way.
I'm sure the amount of data currently stored correlates with the total data transferred (there are probably records of that), so you could extrapolate a pretty good guesstimate if you know those two figures over a fixed period of time.

Then just get a hard drive double that size, and you're good to go for some time.
[image: IBM 350 disk drive]
(That's an IBM 350 with 5 MB of storage - I'm sure they can store more nowadays.)
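Unami's guesstimate can be sketched as a simple linear extrapolation from two measurements, followed by the "double it" safety margin. All figures here are invented for illustration:

```python
# Given the internet's size at two points in time, extrapolate linearly
# and buy a drive twice the projected size, per Unami's rule of thumb.
def drive_to_buy(size_then_zb, size_now_zb, years_between, years_ahead):
    growth_per_year = (size_now_zb - size_then_zb) / years_between
    projected = size_now_zb + growth_per_year * years_ahead
    return 2 * projected  # "just get a hard drive double that size"

# E.g. 2.0 ZB five years ago, 3.1 ZB now, planning five years ahead:
print(drive_to_buy(2.0, 3.1, 5, 5))  # ~8.4 ZB
```

Linear extrapolation understates things if growth is exponential, which is exactly why the doubling margin only buys you "some time".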
 