PDA

View Full Version : 128 bit ubuntu



jmore9
November 1st, 2009, 01:55 AM
Can you just imagine a 128-bit Ubuntu! Now that would be a very fast machine indeed.

tom66
November 1st, 2009, 01:58 AM
It may well be slower, because it would be copying around larger, 16-byte pointers instead of 4-byte or 8-byte ones. Plus, few, if any, programs can use more than 64-bit math.
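
Just to make the pointer-size point concrete, here's a minimal C sketch (the 16-byte case is hypothetical, since there's no mainstream 128-bit target to compile it on):

    #include <stdio.h>

    int main(void)
    {
        /* Pointer width tracks the target architecture: 4 bytes on a
           32-bit build, 8 bytes on a 64-bit build, and it would be
           16 bytes on a hypothetical 128-bit target. */
        printf("pointer: %zu bytes\n", sizeof(void *));
        printf("long:    %zu bytes\n", sizeof(long));
        return 0;
    }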

hobo14
November 1st, 2009, 02:57 AM
It may well be slower, because it would be copying around larger, 16-byte pointers instead of 4-byte or 8-byte ones. Plus, few, if any, programs can use more than 64-bit math.

I imagine the OP means 128 bit Ubuntu on 128 bit hardware, which would not be slower.

Crunchy the Headcrab
November 1st, 2009, 03:03 AM
Meh. We're barely using 64-bit now. More bits doesn't mean faster, btw.

The Toxic Mite
November 1st, 2009, 07:45 AM
I don't get why 64-bit architectures are better than 32-bit ones... :?

dragos240
November 1st, 2009, 07:55 AM
I don't get why 64-bit architectures are better than 32-bit ones... :?

The only difference is that 64-bit archs can use more than 4 GB of RAM.

Bachstelze
November 1st, 2009, 08:20 AM
The only difference is that 64-bit archs can use more than 4 GB of RAM.

No it's not.

Dayofswords
November 1st, 2009, 08:39 AM
just throwing the idea out there


1024-bit ubuntu

Frak
November 1st, 2009, 08:46 AM
Can you just imagine a 128-bit Ubuntu! Now that would be a very fast machine indeed.
If you recompiled every program to run on 128-bit hardware, it would run at nearly the same speed as its 32-bit equivalent.

Only a few number-crunching applications would truly benefit from it.

coldReactive
November 1st, 2009, 09:10 AM
No it's not.

*cough* Might as well give a link to back it up.

http://en.wikipedia.org/wiki/64-bit#32_vs_64_bit

spupy
November 1st, 2009, 02:54 PM
I think 64-bit will be adequate for quite a long time. Maybe for some kind of scientific research more is needed, but why do you need more than 17 billion gigabytes of RAM for a desktop PC? :) Windows 15?

coldReactive
November 1st, 2009, 03:00 PM
I think 64-bit will be adequate for quite a long time. Maybe for some kind of scientific research more is needed, but why do you need more than 17 billion gigabytes of RAM for a desktop PC? :) Windows 15?

Actually..............

64-bit processors can handle up to 16.8 million terabytes of RAM.

NoaHall
November 1st, 2009, 03:00 PM
The only difference is that 64-bit archs can use more than 4 GB of RAM.

lol? The whole point of 64-bit is that it can handle word sizes of 64 bits, which is twice as large. This means, *in theory*, that it can deal with twice as much data in the same amount of time. PAE increases the physical address size to 36 bits, which is why it can support more RAM.
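
To put numbers on that, here's a small C sketch of the memory ceiling each address width implies, assuming byte-addressable memory (ldexp(1.0, n) computes 2^n exactly):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Maximum byte-addressable memory is 2^(address width) bytes. */
        printf("32-bit:       %2.0f GiB\n", ldexp(1.0, 32 - 30)); /*  4 GiB */
        printf("36-bit (PAE): %2.0f GiB\n", ldexp(1.0, 36 - 30)); /* 64 GiB */
        printf("64-bit:       %2.0f EiB\n", ldexp(1.0, 64 - 60)); /* 16 EiB */
        return 0;
    }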

afeasfaerw23231233
November 1st, 2009, 03:02 PM
I found a Wikipedia article about 128-bit.
http://en.wikipedia.org/wiki/128-bit

infestor
November 1st, 2009, 03:05 PM
In a world where 64-bit-optimized programs are not yet the majority, I think switching to 128-bit is really far off.

coldReactive
November 1st, 2009, 03:06 PM
I found a Wikipedia article about 128-bit.
http://en.wikipedia.org/wiki/128-bit

Some graphics cards even do 512-bit ;)

spupy
November 1st, 2009, 03:23 PM
I think 64-bit will be adequate for quite a long time. Maybe for some kind of scientific research more is needed, but why do you need more than 17 billion gigabytes of RAM for a desktop PC? :) Windows 15?


Actually..............

64-bit processors can handle up to 16.8 million terabytes of RAM.

Yes, it is the same number. ;) (Although I'm starting to get confused with milliard and billion... :O)

Dark Aspect
November 1st, 2009, 04:41 PM
The bits do not affect speed but rather the precision of a given operation. That's what my computer programming professor says; you can take it or leave it.

Frak
November 1st, 2009, 05:34 PM
Actually..............

64-bit processors can handle up to 16.8 million terabytes of RAM.
It's the same number (of course, roughly because the OP wasn't being specific).

gletob
November 1st, 2009, 05:36 PM
Actually..............

64-bit processors can handle up to 16.8 million terabytes of RAM.

And that's also equal to 17,179,869,184 GiB, or about 17.2 billion,
or 17.6 trillion megabytes.

So:
147,573,952,589,676,412,928 bits
18,446,744,073,709,551,616 bytes
18,014,398,509,481,984 kilobytes
17,592,186,044,416 megabytes
17,179,869,184 gigabytes
16,777,216 terabytes
16,384 petabytes
16 exabytes > the most proper way to say it, since the number is under 1024
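
Here's a quick C sketch that checks those figures without overflowing anything; 2^64 itself doesn't fit in a 64-bit integer, so each unit is computed as 2^(64 - unit shift):

    #include <stdio.h>

    int main(void)
    {
        /* 2^64 bytes expressed in binary units; shifting by
           (64 - unit_shift) keeps every value inside 64 bits. */
        printf("TiB: %llu\n", 1ULL << (64 - 40)); /* 16,777,216 */
        printf("PiB: %llu\n", 1ULL << (64 - 50)); /* 16,384     */
        printf("EiB: %llu\n", 1ULL << (64 - 60)); /* 16         */
        return 0;
    }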

AlexZaim
November 1st, 2009, 05:57 PM
Not all 64 bits are used, by the way. My professor said that the max count is 52 (maybe just for memory addressing; I'm not sure on this one). Which means that an average user uses way fewer than 50!
That's why I think 128 bits is kind of too futuristic for now.
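
For what it's worth, 52 is the x86-64 ceiling on physical addresses; current chips implement 48 virtual address bits, with bits 48-63 required to mirror bit 47 (the "canonical" form). A small C sketch of that check (assumes the usual two's-complement arithmetic right shift):

    #include <stdio.h>
    #include <stdint.h>

    /* An x86-64 virtual address is canonical when sign-extending
       from bit 47 reproduces the original value. */
    static int is_canonical(uint64_t addr)
    {
        int64_t sext = (int64_t)(addr << 16) >> 16;
        return (uint64_t)sext == addr;
    }

    int main(void)
    {
        printf("%d\n", is_canonical(0x00007fffffffffffULL)); /* 1: top of lower half   */
        printf("%d\n", is_canonical(0x0000800000000000ULL)); /* 0: non-canonical hole  */
        printf("%d\n", is_canonical(0xffff800000000000ULL)); /* 1: start of upper half */
        return 0;
    }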

xir_
November 1st, 2009, 06:11 PM
128-bit processors would be very useful to me and most theoretical scientists.

I've always been intrigued by this rumor of 512-bit graphics processors; it's strange that most academics say "we will never use graphics cards as they only support single precision".

Is this just an outdated idea on their part, or is the processor not a true 512-bit processor?

hobo14
November 1st, 2009, 11:46 PM
The only difference is that 64-bit archs can use more than 4 GB of RAM.

What rubbish.



The bits do not affect speed but rather the precision of a given operation. That's what my computer programming professor says; you can take it or leave it.

Also an incorrect statement as a whole. Yes, more bits allow higher precision and won't really speed up many existing programs, but more bits do allow more speed (a higher rate of data processed over time).
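
A toy C sketch of "more data over time" (real memcpy is far cleverer; this just shows that each step moves one full native word, so a wider machine moves more bytes per step):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Each assignment copies one register-sized word: 4 bytes on
           32-bit, 8 on 64-bit, 16 on a hypothetical 128-bit machine. */
        uintptr_t src[4] = {1, 2, 3, 4};
        uintptr_t dst[4];
        for (int i = 0; i < 4; i++)
            dst[i] = src[i];
        printf("moved %zu bytes in 4 steps\n", sizeof dst);
        return 0;
    }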



If you recompiled every program to run on 128-bit hardware, it would run at nearly the same speed as its 32-bit equivalent.

Only a few number-crunching applications would truly benefit from it.

True for most existing software, but as soon as 128-bit hardware is available, software for it will begin to be written, and all that new software will run much faster on 128-bit hardware than it could on 32-bit hardware.

stmiller
November 2nd, 2009, 04:51 AM
64-bit operating systems can run 32-bit code natively, so (as long as your CPU is 64-bit) there's no reason not to use a 64-bit operating system these days.

As 4 GB becomes the minimum/standard, we will look back at these historical 32-bit vs 64-bit chats with a laugh or a smile. :)

Sporkman
November 2nd, 2009, 04:58 AM
Actually..............

64-bit processors can handle up to 16.8 million terabytes of RAM.

Same thing.

Frak
November 2nd, 2009, 04:59 AM
64-bit operating systems can run 32-bit code natively

Or you could have the Itanium, which could run 32-bit applications, but only with a performance loss. Also, it had to be done through software, since the processor doesn't natively support 32-bit execution.

Sporkman
November 2nd, 2009, 05:00 AM
It's the same number (of course, roughly because the OP wasn't being specific).

Beat me to it.

Exodist
November 2nd, 2009, 05:02 AM
64-bit operating systems can run 32-bit code natively, so (as long as your CPU is 64-bit) there's no reason not to use a 64-bit operating system these days.

As 4 GB becomes the minimum/standard, we will look back at these historical 32-bit vs 64-bit chats with a laugh or a smile. :)

You have to have 32-bit libs for backwards compatibility, so most useful 64-bit OSes are 32/64-bit hybrids.

I don't think 128-bit CPUs for scientific use are too far in the future.
The only reason we don't see more talk of them now is that the CPU is not the bottleneck, and everyone is working on fiber-optic solutions to help resolve this issue before moving forward. So the next motherboards to support 128-bit chips may be fiber-optic boards of some form or fashion.

Frak
November 2nd, 2009, 05:05 AM
You have to have 32-bit libs for backwards compatibility, so most useful 64-bit OSes are 32/64-bit hybrids.

I don't think 128-bit CPUs for scientific use are too far in the future.
The only reason we don't see more talk of them now is that the CPU is not the bottleneck, and everyone is working on fiber-optic solutions to help resolve this issue before moving forward. So the next motherboards to support 128-bit chips may be fiber-optic boards of some form or fashion.
AMD is working on a 128-bit processor now. They've already released working prototypes to Microsoft so they can develop for it.

Exodist
November 2nd, 2009, 05:22 AM
AMD is working on a 128-bit processor now. They've already released working prototypes to Microsoft so they can develop for it.
Sweetness :-)

stinger30au
November 2nd, 2009, 05:22 AM
Can you just imagine a 128-bit Ubuntu! Now that would be a very fast machine indeed.

You can do this now:

if you install 64-bit Ubuntu on a 64-bit machine and then install 64-bit Ubuntu inside Ubuntu using VirtualBox, you have a dual 64-bit PC, so it must be 128-bit!

Cowabunga, dude!
128-bit system...

yippie

hahahahahahahahahahaa:p;)

toupeiro
November 2nd, 2009, 06:45 AM
128-bit processors would be very useful to me and most theoretical scientists.

I've always been intrigued by this rumor of 512-bit graphics processors; it's strange that most academics say "we will never use graphics cards as they only support single precision".

Is this just an outdated idea on their part, or is the processor not a true 512-bit processor?

GPGPU processing is being used pretty heavily in parallel compute clusters these days. The nice thing about GPGPU versus FPGA is how much easier it is to program for. In any case, I fail to see where 128-bit CPU processing would benefit you more than 64-bit right now. Compute nodes, and software that can really utilize parallel clusters, never max out the architecture's capacity, because results scale much better with a large quantity of nodes than with the quality or capacity of one or a few. Not to say that we won't get there, but there is a whole lot of life still in x64. We're only scratching the surface.

xir_
November 2nd, 2009, 05:29 PM
GPGPU processing is being used pretty heavily in parallel compute clusters these days. The nice thing about GPGPU versus FPGA is how much easier it is to program for. In any case, I fail to see where 128-bit CPU processing would benefit you more than 64-bit right now. Compute nodes, and software that can really utilize parallel clusters, never max out the architecture's capacity, because results scale much better with a large quantity of nodes than with the quality or capacity of one or a few. Not to say that we won't get there, but there is a whole lot of life still in x64. We're only scratching the surface.


In quantum chemistry precision is everything; parallelisation wouldn't be nearly as useful, as the most computationally intensive code is iterative and cannot be split.

But at 64-bit we still often run into rounding errors, even in atomic units.
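
A tiny C illustration of that kind of rounding: doubles carry a 53-bit significand (about 15-16 decimal digits), so a small term added to a large one can simply vanish:

    #include <stdio.h>

    int main(void)
    {
        /* At 1e16 the spacing between representable doubles is
           already 2.0, so adding 1.0 changes nothing. */
        double big = 1.0e16;
        printf("%g\n", (big + 1.0) - big); /* prints 0, not 1 */
        return 0;
    }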

One project I would love to do is to see if I could code a Fock matrix diagonaliser using CUDA; if I could do that, it could really speed some stuff up in chemistry. The problem is that these matrices grow with n^7 if you are lucky (where n is the number of Gaussian basis functions).

Still, I'm really unclear about what a 512-bit graphics processor means. Is that the precision of the GPU's FLOPs, or just some memory-bandwidth doohickey?