OSDev.org

The Place to Start for Operating System Developers

All times are UTC - 6 hours




 Post subject: Physics stuff (was Linux "High Memory")
PostPosted: Wed Jan 15, 2014 3:07 am 

Joined: Wed Mar 09, 2011 3:55 am
Posts: 509
Brendan wrote:
Of course now we know that "X bytes of RAM will be enough for everything!" is always wrong sooner or later.


Once you start getting into address space widths of more than 160 bits or so, you're talking about an amount of RAM that would consume a significant fraction of the mass of the Earth, the galaxy, or even the observable universe (around 280 bits you hit one byte per particle in the universe).
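A rough back-of-envelope check of those figures (an illustrative Python sketch; the one-atom-per-bit assumption and the ~10^80 particle estimate are my own round numbers, not anything from the post):

Code:
import math

EARTH_MASS     = 5.97e24    # kg
SI_ATOM_MASS   = 4.65e-26   # kg per silicon atom (~28 g/mol divided by Avogadro's number)
PARTICLE_COUNT = 1e80       # rough estimate of particles in the observable universe

# Mass of a 2^160-byte memory, assuming one silicon atom per stored bit
atoms_needed = (2 ** 160) * 8
print(atoms_needed * SI_ATOM_MASS / EARTH_MASS)   # ~0.09 Earth masses

# Address width (in bits) at which you reach one byte per particle
print(math.log2(PARTICLE_COUNT))                  # ~266 bits with this estimate; larger
                                                  # particle counts push it toward ~280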

So 2^512 bytes *will* be enough for *everything*.

Quote:
For example, for 64-bit 80x86, kernel space can be as large as 128 TiB and (like Linus working on 80386 machines) computers with that much RAM seem hard to imagine now. If you assume RAM sizes double every 3 years then you can expect to see computers with more than 128 TiB in about 30 years. When that happens, you can expect Linux developers will start whining about CPU manufacturers failing to increase virtual address space sizes (and blaming the CPU designers for the kernel's own stupidity) again.


More likely, CPU manufacturers will increase the virtual address space size well before that point: the word size for x86-64 is already 64 bits, so the address space has room for expansion without designing an entirely new architecture.


 Post subject: Re: Linux "High Memory"
PostPosted: Wed Jan 15, 2014 4:41 pm 

Joined: Wed Mar 05, 2008 12:52 am
Posts: 142
linguofreak wrote:
So 2^512 bytes *will* be enough for *everything*.

That's still a silly statement. Assuming we became able to accurately and fully model everything in the universe, we would then certainly want extra space to store x frames of y results of z simulations for some unknown values of x, y, and z.


 Post subject: Re: Linux "High Memory"
PostPosted: Wed Jan 15, 2014 5:19 pm 

Joined: Fri Jun 13, 2008 3:21 pm
Posts: 1700
Location: Cambridge, United Kingdom
inx wrote:
linguofreak wrote:
So 2^512 bytes *will* be enough for *everything*.

That's still a silly statement. Assuming we became able to accurately and fully model everything in the universe, we would then certainly want extra space to store x frames of y results of z simulations for some unknown values of x, y, and z.


You don't seem to understand that to get to 2^280-bits you'd have to turn all the contents of the universe into memory at perfect bitwise efficiency.

This is, of course, patently impossible (Also speed of light delays would make the latency of a floppy disc favorable to that of this memory, though this memory could no doubt do higher transfer rates)


 Post subject: Re: Linux "High Memory"
PostPosted: Wed Jan 15, 2014 5:53 pm 

Joined: Wed Dec 01, 2010 3:41 am
Posts: 1761
Location: Hong Kong
I believe computing as we know it will be revolutionized long before any single computer gets anywhere close to 2^280-bit processing.
Be it quantum computing (if multiple histories turn out to be more physical than mathematical equations), bio-chemically powered computers, optical computers, or a way to utilize all the computers on the internet itself.


 Post subject: Re: Linux "High Memory"
PostPosted: Wed Jan 15, 2014 8:24 pm 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

Owen wrote:
inx wrote:
linguofreak wrote:
So 2^512 bytes *will* be enough for *everything*.

That's still a silly statement. Assuming we became able to accurately and fully model everything in the universe, we would then certainly want extra space to store x frames of y results of z simulations for some unknown values of x, y, and z.


You don't seem to understand that to get to 2^280-bits you'd have to turn all the contents of the universe into memory at perfect bitwise efficiency.


Obviously, inx is right - 2^512 bytes will not be enough for everything. If there's an upper limit to RAM size, then that just means we can't get enough RAM for everything.

Owen wrote:
This is, of course, patently impossible (Also speed of light delays would make the latency of a floppy disc favorable to that of this memory, though this memory could no doubt do higher transfer rates)


If you think "one bit per particle in the observable universe" is a limit; then alternatives include:
  • find ways to observe more of the universe
  • store more than 1 bit of information per atom (e.g. let's say a single atom has a speed ranging from 1 to (2**32+1) in one of 2**32 directions - you'd be able to store 64 bits of information per atom that way).
  • simply use sub-atomic particles instead (e.g. maybe use photons, because they're unlimited - you can create more photons in your spare time if you ever run out)


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 7:23 am 

Joined: Fri Jun 13, 2008 3:21 pm
Posts: 1700
Location: Cambridge, United Kingdom
Brendan wrote:
If you think "one bit per particle in the observable universe" is a limit; then alternatives include:
  • find ways to observe more of the universe

Relativity says no to that one (Also: If you can observe more than the observable universe, then you have implied the ability for time travel, and I don't even want to think of the complications that implies)
Brendan wrote:
  • store more than 1 bit of information per atom (e.g. let's say a single atom has a speed ranging from 1 to (2**32+1) in one of 2**32 directions - you'd be able to store 64 bits of information per atom that way).

Heisenberg's Uncertainty Principle says no to that one
Brendan wrote:
  • simply use sub-atomic particles instead (e.g. maybe use photons, because they're unlimited - you can create more photons in your spare time if you ever run out)

Only in the presence of infinite energy.

The point of using the number of atoms in the universe as an absolute upper bound is that it is already far beyond reach. The number of bits you can store per atom is finite, but more importantly you need to arrange the atoms in some sort of rigid lattice to stop them wandering off (because if they're moving you will never find them again - see the uncertainty principle), and a single-atom-thick sheet of carbon (pretty much the optimum as far as single-atom-thick sheets go) does not have the rigidity you would need while manipulating the atoms to store data.

We are using the same kind of principle here which says you can't brute force a 128-bit symmetric encryption key (In this case because it requires more energy than exists in the solar system)

Also note I'm not disputing anything of a quantum computing nature - quantum computing works very differently, and therefore traditional applications are unlikely to run on quantum computers


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 7:27 am 

Joined: Mon Jul 05, 2010 4:15 pm
Posts: 595
We all know that the answer will be 42 anyway.


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 9:04 am 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

Owen wrote:
Brendan wrote:
If you think "one bit per particle in the observable universe" is a limit; then alternatives include:
  • find ways to observe more of the universe

Relativity says no to that one (Also: If you can observe more than the observable universe, then you have implied the ability for time travel, and I don't even want to think of the complications that implies)


From wikipedia:

"Some parts of the universe may simply be too far away for the light emitted from there at any moment since the Big Bang to have had enough time to reach Earth at present, so these portions of the universe would currently lie outside the observable universe. In the future, light from distant galaxies will have had more time to travel, so some regions not currently observable will become observable."

Basically, the only thing we need to do to observe more of the universe is wait (do nothing for long enough).

Owen wrote:
Brendan wrote:
  • store more than 1 bit of information per atom (e.g. let's say a single atom has a speed ranging from 1 to (2**32+1) in one of 2**32 directions - you'd be able to store 64 bits of information per atom that way).

Heisenberg's Uncertainty Principle says no to that one


Heisenberg's Uncertainty Principle says "the more precisely the position of some particle is determined, the less precisely its momentum can be known, and vice versa". This means we can know the momentum extremely precisely if we don't care what the position is (and in my silly example, there wasn't any need to care what the position was anyway).

We could also do the reverse - rely on the position without caring what the momentum is. For example, let's have 2**64 empty spaces and one atom. Whichever space the atom is in determines the value that atom represents.

Owen wrote:
Brendan wrote:
  • simply use sub-atomic particles instead (e.g. maybe use photons, because they're unlimited - you can create more photons in your spare time if you ever run out)

Only in the presence of infinite energy.


So now you're saying the real limit is the amount of energy in the universe, and not the number of atoms?

Owen wrote:
The point of using the number of atoms in the universe as an absolute upper bound is that it is already far beyond reach. The number of bits you can store per atom is finite, but more importantly you need to arrange the atoms in some sort of rigid lattice to stop them wandering off (because if they're moving you will never find them again - see the uncertainty principle), and a single-atom-thick sheet of carbon (pretty much the optimum as far as single-atom-thick sheets go) does not have the rigidity you would need while manipulating the atoms to store data.


Even if you accept that as a correct upper limit (which is highly dubious), it's still wrong (we'll still reach a point where we want to store more than we possibly can).

Owen wrote:
We are using the same kind of principle here which says you can't brute force a 128-bit symmetric encryption key (In this case because it requires more energy than exists in the solar system)


Which is also wrong - if you're extremely lucky, you might guess the key on your very first attempt and consume a very negligible amount of energy.

Of course the energy you consume isn't destroyed either (conservation of energy). You could provide all the energy you need using a lemon and 2 nails (e.g. copper and zinc), as long as you're able to recycle "waste energy" back into a usable form.


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 9:37 am 

Joined: Fri Jun 13, 2008 3:21 pm
Posts: 1700
Location: Cambridge, United Kingdom
Brendan wrote:
Hi,

Owen wrote:
Brendan wrote:
If you think "one bit per particle in the observable universe" is a limit; then alternatives include:
  • find ways to observe more of the universe

Relativity says no to that one (Also: If you can observe more than the observable universe, then you have implied the ability for time travel, and I don't even want to think of the complications that implies)


From wikipedia:

"Some parts of the universe may simply be too far away for the light emitted from there at any moment since the Big Bang to have had enough time to reach Earth at present, so these portions of the universe would currently lie outside the observable universe. In the future, light from distant galaxies will have had more time to travel, so some regions not currently observable will become observable."

Basically, the only thing we need to do to observe more of the universe is wait (do nothing for long enough).

"However, due to Hubble's law regions sufficiently distant from us are expanding away from us much faster than the speed of light (special relativity prevents nearby objects in the same local region from moving faster than the speed of light with respect to each other, but there is no such constraint for distant objects when the space between them is expanding; see uses of the proper distance for a discussion), and the expansion rate appears to be accelerating due to dark energy"
Brendan wrote:
Owen wrote:
Brendan wrote:
  • store more than 1 bit of information per atom (e.g. let's say a single atom has a speed ranging from 1 to (2**32+1) in one of 2**32 directions - you'd be able to store 64 bits of information per atom that way).

Heisenberg's Uncertainty Principle says no to that one
Heisenberg's Uncertainty Principle says "the more precisely the position of some particle is determined, the less precisely its momentum can be known, and vice versa". This means we can know the momentum extremely precisely if we don't care what the position is (and in my silly example, there wasn't any need to care what the position was anyway).
We could also do the reverse - rely on the position without caring what the momentum is. For example, let's have 2**64 empty spaces and one atom. Whichever space the atom is in determines the value that atom represents.

If you know the position, you don't know the momentum - which means it is moving in a direction you do not know!

Brendan wrote:
Owen wrote:
Brendan wrote:
  • simply use sub-atomic particles instead (e.g. maybe use photons, because they're unlimited - you can create more photons in your spare time if you ever run out)

Only in the presence of infinite energy.


So now you're saying the real limit is the amount of energy in the universe, and not the number of atoms?

Owen wrote:
The point of using the number of atoms in the universe as an absolute upper bound is that it is already far beyond reach. The number of bits you can store per atom is finite, but more importantly you need to arrange the atoms in some sort of rigid lattice to stop them wandering off (because if they're moving you will never find them again - see the uncertainty principle), and a single-atom-thick sheet of carbon (pretty much the optimum as far as single-atom-thick sheets go) does not have the rigidity you would need while manipulating the atoms to store data.


Even if you accept that as a correct upper limit (which is highly dubious), it's still wrong (we'll still reach a point where we want to store more than we possibly can).

And I want infinite free energy. That isn't going to happen.
Brendan wrote:
Owen wrote:
We are using the same kind of principle here which says you can't brute force a 128-bit symmetric encryption key (In this case because it requires more energy than exists in the solar system)


Which is also wrong - if you're extremely lucky, you might guess the key on your very first attempt and consume a very negligible amount of energy.

For specific keys, yes, you might successfully brute force them.

Of course, if your random number generator spat out the key "0x00000000000000000000000000000000", then I'd probably question why it hadn't failed its internal self-test.
Brendan wrote:
Of course the energy you consume isn't destroyed either (conservation of energy). You could provide all the energy you need using a lemon and 2 nails (e.g. copper and zinc), as long as you're able to recycle "waste energy" back into a usable form.
Brendan


There are actually scientific principles which define the minimum energy required to perform computation (That is, the minimum energy required to do a trivial binary operation on two bits). The energy consumed is turned into heat (At 100% efficiency), and therefore the amount of energy you can recover from that is dependent upon the amount of cold you can find (per the laws of thermodynamics), where "cold" is defined as something colder than your heat source, in this case your supercomputer.

Given that you know the temperatures of your heat source and heat sink, you can calculate the maximum power this engine can produce (The Carnot Engine is an idealised heat engine which defines the maximum amount of energy you can extract from a thermal gradient). Of course, that is going to decrease over time (as you heat up your heat sump).

Given that 99.9% of the mass of the solar system is the Sun, you only have 0.1% of it to exploit as cold (requiring a rather warm supercomputer if you wish to use, say, Mercury for this - with a resulting decrease in computational efficiency), so we can determine the amount of energy you can recover in this manner.

Suffice to say that it is negligible.

P.S. the calculations which show that the solar system contains insufficient energy to brute force a 128-bit key assume, for purposes of simplification, that the supercomputer is running at 0K (Impossible) and that it has an infinite heat sink at a temperature of 0K (Also impossible)
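To get a concrete feel for the "minimum energy per operation" figure referred to above, here is a small illustrative sketch (my own numbers, not Owen's; it uses the standard Landauer bound of kT·ln 2 per bit erased, which is of the same order as the 4kT practical figure quoted later in the thread):

Code:
import math

K_BOLTZMANN = 1.380649e-23        # Boltzmann constant, J/K

def min_energy_per_bit(temp_kelvin):
    """Landauer lower bound: minimum energy (joules) to erase one bit at temperature T."""
    return K_BOLTZMANN * temp_kelvin * math.log(2)

print(min_energy_per_bit(3.0))    # ~2.9e-23 J near the cosmic microwave background temperature
print(min_energy_per_bit(300.0))  # ~2.9e-21 J at room temperature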


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 10:56 am 

Joined: Thu Jul 05, 2012 5:12 am
Posts: 923
Location: Finland
If we think about 20-bit addressing, I do not think it was ever seriously thought to be "enough for everything". 32-bit addressing is a lot bigger, of course, but according to Wikipedia there were 4,830,979,000 people in 1985. That is more than a 32-bit number can hold. If people thought a 32-bit number was like infinity, it was not even enough to count the people in the world. There were "practical" examples of the limits of previous bit sizes, and I do not think anyone wise enough thought that those were "enough for everything". If there were people saying otherwise, they ended up being wrong sooner or later. If we are talking about 64-bit addressing, it still is not big enough and I think we have known that from the very beginning.
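A quick check of that arithmetic (illustrative only; the population figure is the one quoted above):

Code:
# 1985 world population (figure quoted above) vs. the largest count an
# unsigned 32-bit integer can represent
print(2 ** 32)          # 4,294,967,296
print(4_830_979_000)    # larger than 2^32, so it does not fit in 32 bits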

If we are talking about very big numbers, there are no "practical" examples of them being insufficient. I think this makes a huge difference if we leave the theoretical debate aside.

_________________
Undefined behavior since 2012


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 11:09 am 

Joined: Thu Jul 05, 2012 5:12 am
Posts: 923
Location: Finland
If we simulated the universe, would that simulation contain the computer that does the simulation?

_________________
Undefined behavior since 2012


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 11:39 am 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

Owen wrote:
Brendan wrote:
Owen wrote:
We are using the same kind of principle here which says you can't brute force a 128-bit symmetric encryption key (In this case because it requires more energy than exists in the solar system)


Which is also wrong - if you're extremely lucky, you might guess the key on your very first attempt and consume a very negligible amount of energy.

For specific keys, yes, you might successfully brute force them.

Of course, if your random number generator spat out the key "0x00000000000000000000000000000000", then I'd probably question why it hadn't failed its internal self-test.


And in the same way, if your brute force algorithm starts by testing the least likely key it's equally broken.

Owen wrote:
Brendan wrote:
Of course the energy you consume isn't destroyed either (conservation of energy). You could provide all the energy you need using a lemon and 2 nails (e.g. copper and zinc), as long as you're able to recycle "waste energy" back into a usable form.


There are actually scientific principles which define the minimum energy required to perform computation (That is, the minimum energy required to do a trivial binary operation on two bits). The energy consumed is turned into heat (At 100% efficiency), and therefore the amount of energy you can recover from that is dependent upon the amount of cold you can find (per the laws of thermodynamics), where "cold" is defined as something colder than your heat source, in this case your supercomputer.

Given that you know the temperatures of your heat source and heat sink, you can calculate the maximum power this engine can produce (The Carnot Engine is an idealised heat engine which defines the maximum amount of energy you can extract from a thermal gradient). Of course, that is going to decrease over time (as you heat up your heat sump).


Here's the first part of Carnot's maximum efficiency formula: n = W/Qh

W is the work done (equivalent to the electricity generated in our case), and Qh is the heat put into the system (equivalent to the heat produced by the CPU). The CPU converts 100% of the electrical energy into heat; therefore Qh = W. The maximum efficiency is W/Qh = W/W = 1. Basically, 100% efficiency is possible. This should not be surprising at all - it's just reinforcing conservation of energy (an isolated system with no energy escaping).

Of course perfection is rare; but "100% perfect conversion of heat to electricity" isn't needed to show that the "brute forcing a 128-bit symmetric encryption key requires more energy than exists in the solar system" theory is false. If it's only 99% efficient, I'm guessing you'd need 2% of the energy that exists in the solar system. :)


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 2:30 pm 

Joined: Fri Jun 13, 2008 3:21 pm
Posts: 1700
Location: Cambridge, United Kingdom
You can go and do the calculations for it if you want. With perfect computational efficiency, the total energy in the solar system is less than that required to do 2^127 block decryptions.

And the first thing you ignored is that the Carnot engine equations only apply to a heat engine - a device which converts thermal energy to work. The other thing you'll notice is that n = 1 - Tc/Th -- or that the maximum conversion efficiency is proportional to the difference in temperature between your hot and cold reservoirs.
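To put rough numbers on that formula, a small illustrative sketch (the temperatures are arbitrary examples of my own choosing, not anything from the thread):

Code:
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible to work between two reservoirs (kelvin)."""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(350.0, 300.0))   # ~0.14 for a warm computer and a room-temperature sink
print(carnot_efficiency(350.0, 3.0))     # ~0.99 if you somehow had a 3 K cold sink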


 Post subject: Re: Linux "High Memory"
PostPosted: Thu Jan 16, 2014 10:48 pm 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

Owen wrote:
You can go and do the calculations for it if you want. With perfect computational efficiency, the total energy in the solar system is less than that required to do 2^127 block decryptions.


No, I can't. The calculations rely on the false assumption that changing state must consume energy (rather than merely storing energy and reclaiming previously stored energy). All calculations that rely on a false assumption are wrong by default.

Owen wrote:
And the first thing you ignored is that the Carnot engine equations only apply to a heat engine - a device which converts thermal energy to work.


You already stated that CPUs convert 100% of electrical energy into heat energy (I implicitly agree). Converting heat back into some other form of energy (e.g. electricity) is the only part where we disagree, which is what the Carnot engine does.

Owen wrote:
The other thing you'll notice is that n = 1 - Tc/Th -- or that the maximum conversion efficiency is proportional to the difference in temperature between your hot and cold reservoirs.


Given that it's possible to use heat pumps to shift the heat back (and that the inefficiency of the heat pump just creates more heat that can be reclaimed), the efficiency of both the heat engine and the heat pump are irrelevant. For example, you could have 3 objects (CPU, heat store and cold store) where heat pumps are used to ensure that the temperatures of the CPU and cold store remain constant (by pumping excess heat into the heat store).

Mostly, there are only 2 ways to show that this setup isn't possible. Either you prove that conservation of energy is false (e.g. energy is destroyed), or you prove that it's impossible to prevent energy from escaping out of an isolated system.


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 Post subject: Re: Linux "High Memory"
PostPosted: Fri Jan 17, 2014 3:04 am 

Joined: Fri Jun 13, 2008 3:21 pm
Posts: 1700
Location: Cambridge, United Kingdom
Brendan wrote:
Hi,

Owen wrote:
You can go and do the calculations for it if you want. With perfect computational efficiency, the total energy in the solar system is less than that required to do 2^127 block decryptions.


No, I can't. The calculations rely on the false assumption that changing state must consume energy (rather than merely storing energy and reclaiming previously stored energy). All calculations that rely on a false assumption are wrong by default.


"The temperature of the cosmic microwave background radiation gives a practical lower limit to the energy consumed to perform computation of approximately 4kT per state change, where T is the temperature of the background (about 3 kelvins), and k is the Boltzmann constant. While a device could be cooled to operate below this temperature, the energy expended by the cooling would offset the benefit of the lower operating temperature."

(See Lloyd 1999)

(Or: You can't break even except at absolute zero, and you can't ever reach absolute zero)

Brendan wrote:
Owen wrote:
And the first thing you ignored is that the Carnot engine equations only apply to a heat engine - a device which converts thermal energy to work.


You already stated that CPUs convert 100% of electrical energy into heat energy (I implicitly agree). Converting heat back into some other form of energy (e.g. electricity) is the only part where we disagree, which is what the Carnot engine does.


The Carnot engine converts a difference in temperatures into energy; it does not convert heat alone.

Brendan wrote:
Owen wrote:
The other thing you'll notice is that n = 1 - Tc/Th -- or that the maximum conversion efficiency is proportional to the difference in temperature between your hot and cold reservoirs.


Given that it's possible to use heat pumps to shift the heat back (and that the inefficiency of the heat pump just creates more heat that can be reclaimed), the efficiency of both the heat engine and the heat pump are irrelevant. For example, you could have 3 objects (CPU, heat store and cold store) where heat pumps are used to ensure that the temperatures of the CPU and cold store remain constant (by pumping excess heat into the heat store).

Mostly, there are only 2 ways to show that this setup isn't possible. Either you prove that conservation of energy is false (e.g. energy is destroyed), or you prove that it's impossible to prevent energy from escaping out of an isolated system.


Cheers,

Brendan


The energy required to pump heat against a temperature gradient ΔT is at least as great as the energy that a Carnot engine can recover from that gradient. In other words, your heat pump would consume all the energy from your Carnot engine just to maintain the temperature gradient, leaving nothing over.
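A small illustrative sketch of that break-even point (my own arbitrary numbers; both machines are assumed ideal and reversible):

Code:
T_HOT, T_COLD = 350.0, 3.0     # example reservoir temperatures, kelvin
Q_HOT = 1000.0                 # joules of heat delivered to the hot store

# Best case: work a Carnot engine can extract from Q_HOT flowing back "downhill"
work_recovered = Q_HOT * (1.0 - T_COLD / T_HOT)

# Minimum work an ideal heat pump needs to push Q_HOT "uphill" into the hot store
# (ideal heating COP = T_HOT / (T_HOT - T_COLD), so work = Q_HOT / COP)
work_to_pump = Q_HOT * (T_HOT - T_COLD) / T_HOT

print(work_recovered, work_to_pump)   # equal in the ideal limit; any real machine
                                      # needs strictly more work to pump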

