Physicists Find Nature's Limit to faster PCs



stuart.reinke
November 29th, 2009, 08:58 PM
Thought this was interesting.

http://www.foxnews.com/story/0,2933,564286,00.html?sPage=fnc/scitech/personaltechnology

This means that sooner or later I will find myself owning the fastest available computer, even though it will probably be several years old before I can afford to buy it.:D

Psumi
November 29th, 2009, 09:06 PM
The limit is not even given, for shame.

LinuxFanBoi
November 29th, 2009, 09:25 PM
The limit is not even given, for shame.

Don't worry, we will not reach any theoretical speed limits with any microprocessor built with today's materials and technology.

Information moves as a result of matter moving. In today's microprocessors, the matter moving is electrons. Since our current understanding of physics doesn't allow anything to move faster than the speed of light, the limit is 299,792,458 m/s. If one day light (moving photons) is the medium by which information is moved, we may approach that limit; however, you must keep in mind that a chain is only as strong as its weakest link. In this case, information can only travel as fast as the slowest medium.

Given two computers connected to the internet, one by copper wire and the other by fiber optic link, the maximum speed of any data transfer between them is limited by the one on copper wire, as it is the slower of the two.
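To put that speed-of-light limit in perspective, here's a quick back-of-the-envelope calculation (my own illustration using the standard constant, not numbers from the article): even at light speed, a signal can only cover about 10 cm during one cycle of a 3 GHz clock, which is roughly the size of a motherboard.

```python
# How far can a signal travel in one clock cycle, even at light speed?
C = 299_792_458        # speed of light in a vacuum, m/s
CLOCK_HZ = 3e9         # a typical 3 GHz CPU clock (illustrative choice)

cycle_time = 1 / CLOCK_HZ          # duration of one clock cycle, seconds
distance_m = C * cycle_time        # max distance a signal covers per cycle

print(f"One cycle lasts {cycle_time * 1e12:.1f} ps")
print(f"Light travels at most {distance_m * 100:.1f} cm per cycle")
```

So even before any material limits kick in, geometry alone puts a ceiling on how far apart the parts of a fast chip can be.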

Right now, it seems the limit on how fast we can make a chip is bound by the amount of heat the materials it's made of can withstand.

Some may argue that cooling solutions are the answer. I say this is where price also becomes a limiting factor. Is it really cost effective to spend $2000 on an elaborate phase-change cooling setup to keep a $300 chip cool? Is it even practical? You start getting into issues with protection from condensation that complicate a typical PC beyond what the average user is willing, or even able, to take on.

There was a time when I used water cooling in my PC to eke out a significant performance increase from a cheaper processor. Eventually, it became cheaper to buy the faster chip than it was to set up a water cooling system.

That being said, it's still cool (no pun intended) to buy the faster chip and water cool it anyway to get as much out of it as it can handle without melting down. I've personally found that in these cases it's other hardware that reaches its limit before my chip does.

gnomeuser
November 29th, 2009, 09:35 PM
Don't worry, we will not reach any theoretical speed limits with any microprocessor built with today's materials and technology.

Right now, it seems the limit on how fast we can make a chip is bound by the amount of heat the materials it's made of can withstand.

Some may argue that cooling solutions are the answer. I say this is where price also becomes a limiting factor. Is it really cost effective to spend $2000 on an elaborate phase-change cooling setup to keep a $300 chip cool? Is it even practical? You start getting into issues with protection from condensation that complicate a typical PC beyond what the average user is willing, or even able, to take on.

There was a time when I used water cooling in my PC to eke out a significant performance increase from a cheaper processor. Eventually, it became cheaper to buy the faster chip than it was to set up a water cooling system.

That being said, it's still cool (no pun intended) to buy the faster chip and water cool it anyway to get as much out of it as it can handle without melting down. I've personally found that in these cases it's other hardware that reaches its limit before my chip does.

I've wanted to do water cooling for a while, but I am concerned about all the issues, such as critical pump failures and the corrosion and conductivity problems the water can cause with metal parts. It seems to me that to get this all working reliably you need to invest a great deal of time and have redundancy as well as early warning systems and periodic inspection and testing procedures. That, however, is just not how I want to use a computer; I want to turn it on and go about my business.

I don't, however, want watercooling for overclocking purposes. I would want to replace every fan with such a system; then, to eliminate the heatsink fan, I might want to bury a tank in my garden and let Mother Nature cool it.

A completely silent setup, that would be bliss.

Now science has come to my rescue a bit on this: with ARM Cortex-A9 CPUs and my present dual-core Intel Atom 330, you get decent performance with very low heat output, meaning a smaller fan, if any, is needed.

Still, the lure of mixing water with electricity is undeniable... it just seems so... wrong, and yet... tempting. My inner engineer is just begging to budget an attempt. It can only end up cool: either with explosions and great bodily harm, or with a silent computing experience.

TheNessus
November 29th, 2009, 09:36 PM
Binary computers may have a size limit (and it's far from now anyway).

But once there are quantum computers... well, that's a whole new universe.

LinuxFanBoi
November 29th, 2009, 09:45 PM
I've wanted to do water cooling for a while, but I am concerned about all the issues, such as critical pump failures and the corrosion and conductivity problems the water can cause with metal parts. It seems to me that to get this all working reliably you need to invest a great deal of time and have redundancy as well as early warning systems and periodic inspection and testing procedures. That, however, is just not how I want to use a computer; I want to turn it on and go about my business.

I water cooled for the better part of the last five years. Up until my last build it was cost effective; now, not so much, and I didn't do it because I wanted a quieter machine.

The last cooling system I built was 100% enclosed within my case, using 1/2" interior-diameter tubing and a 12V DC pump. If your pump fails, your system has mechanisms in place that will shut down your PC before it reaches a temperature that will damage it.

One key thing to do when building a setup is to connect everything without any of the PC components installed and let it run for 24 hours to test it for leaks. Also, it is advisable to use deionized water, as it is less likely to conduct electricity, though it still can.

For your first setup I would highly recommend buying something like a Koolance Exos; this was my first one and I loved it. My second system was a custom-built setup from Danger Den, and I did it for about $200.

Danger Den also has a lot of useful info on installation and construction. Their website is:

http://www.dangerden.com/

Mike'sHardLinux
November 29th, 2009, 09:47 PM
Water cooling generally is not quieter than air cooling. You still use at least one fan, unless you use one of the fanless setups with the huge external heatsinks (http://www.newegg.com/Product/Product.aspx?Item=N82E16835118015&cm_re=zalma_water-_-35-118-015-_-Product). But even that setup makes some noise. The local Fry's had a system set up with that cooler, and the pump makes noise. If I could hear it in the retail store's noisy environment, I am sure it would be audible at home.

It is actually not difficult to set up water cooling. Just pay attention to detail and make sure hoses and clamps are secured and you most likely won't have any problems. I did it for about a year, and it was easy. I never had a single leak or issue at all.

LinuxFanBoi, that article is about the speed of processing, not networking. Really, those are two different issues. Though I agree, it comes down to moving electrons in some medium. Different media have different resistance to the flow of electrons, which slows them down and also generates heat. There are tons of articles on the web about new materials being tested for use in CPUs. Check out IEEE Spectrum (http://spectrum.ieee.org/).

ZankerH
November 29th, 2009, 09:48 PM
The article assumes the atom is the smallest possible unit capable of carrying out operations in binary logic. For all we know, the same could be done with subatomic particles, of which the majority probably haven't even been discovered yet. For all we know, a single atom could hold the potential for computing power greater than all the computing devices in existence today. We simply don't know enough about subatomic particles yet to pass judgement like this.

falconindy
November 29th, 2009, 09:48 PM
Binary computers may have a size limit (and its far from now anyway)

But once there will be Quantum computers... well, that's a whole new universe.
You should read the article. The whole point of it is that eventually we will have stable quantum computers. It's reasonable to expect that the maximum speed of this kind of processor is based on the minimum length of time required for a qubit (quantum bit) to change its state.
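For the curious, the kind of bound being discussed here is usually stated as the Margolus-Levitin limit: a system with average energy E needs at least t = h/(4E) to evolve into a distinguishable state. A rough sketch of the numbers (my own illustration; I'm assuming this is the limit the article alludes to, since it doesn't give one):

```python
# Rough sketch of the Margolus-Levitin bound: a system with average
# energy E cannot reach a distinguishable state in less than h/(4E).
H = 6.62607015e-34     # Planck's constant, J*s

def max_ops_per_second(energy_joules):
    """Upper bound on state transitions per second for a given energy."""
    min_flip_time = H / (4 * energy_joules)   # seconds per transition
    return 1 / min_flip_time                  # transitions per second

# With a whole joule of energy, the ceiling is absurdly high:
print(f"{max_ops_per_second(1.0):.2e} ops/s")   # on the order of 10^33
```

Which is why, as LinuxFanBoi said above, nothing built with today's materials comes anywhere near the theoretical ceiling.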

Fenris_rising
November 29th, 2009, 09:55 PM
Head over to the Wizd forums. Water cooling information and heavily modded PCs await to tempt you!

regards

Fenris

MadCow108
November 29th, 2009, 09:59 PM
The article assumes the atom is the smallest possible unit capable of carrying out operations in binary logic. For all we know, the same could be done with subatomic particles, of which the majority probably haven't even been discovered yet. For all we know, a single atom could hold the potential for computing power greater than all the computing devices in existence today. We simply don't know enough about subatomic particles yet to pass judgement like this.

As far as I understood this far too brief article, it has nothing to do with any physical realization but more with the general fundamentals of quantum information theory (of course the whole theory could be wrong, but that would be on some crazy scale not applicable on Earth).

(Also, we know loads about subatomic particles; they are useless for any kind of practical purpose. They are either extremely unstable or bound by so-called "confinement" to nuclear states.)

Mike'sHardLinux
November 29th, 2009, 10:01 PM
(Also, we know loads about subatomic particles; they're useless for any kind of practical purpose.)

I was thinking along similar lines. The Heisenberg uncertainty principle (http://en.wikipedia.org/wiki/Heisenberg_uncertainty_principle) came to mind. But maybe, in the future, technology will conquer such problems.
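As a toy illustration of why the uncertainty principle matters at these scales (my own numbers, assuming an electron confined to roughly one atomic width of 0.1 nm), the position-momentum relation Δx·Δp ≥ ħ/2 already forces a velocity uncertainty of hundreds of kilometers per second:

```python
# Toy Heisenberg estimate: confine an electron to ~0.1 nm (atomic scale)
# and see how large the resulting velocity uncertainty must be.
HBAR = 1.054571817e-34         # reduced Planck constant, J*s
ELECTRON_MASS = 9.1093837e-31  # kg

dx = 1e-10                  # position uncertainty: ~one atom width, m
dp = HBAR / (2 * dx)        # minimum momentum uncertainty, kg*m/s
dv = dp / ELECTRON_MASS     # corresponding velocity uncertainty, m/s

print(f"Velocity uncertainty: {dv / 1000:.0f} km/s")
```

So pinning a bit down to something smaller than an atom runs straight into quantum fuzziness, which is part of why the article treats the atom as a natural floor.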

ZankerH
November 29th, 2009, 10:02 PM
As far as I understood this far too brief article, it has nothing to do with any physical realization but more with the general fundamentals of quantum information theory (of course the whole theory could be wrong, but that would be on some crazy scale not applicable on Earth).

(Also, we know loads about subatomic particles; they are useless for any kind of practical purpose. They are either extremely unstable or bound by so-called "confinement" to nuclear states.)

"Loads" isn't enough. The discoveries we haven't made yet could easily force us to change the very fundamentals of the way we understand matter works.

LinuxFanBoi
November 29th, 2009, 10:04 PM
LinuxFanBoi, that article is about the speed of processing, not networking. Really, those are two different issues.

Sorry, I wasn't trying to claim that it was; I just needed an example for my weakest-link scenario. The reason I used it is that you may at some point have a CPU that uses light as the medium, but you also have to bus the information from input through the CPU and then to output. If the information is moved anywhere in the system by a medium other than light, the speed of light is no longer the limitation.

MadCow108
November 29th, 2009, 10:12 PM
"Loads" isn't enough. The discoveries we haven't made yet could easily force us to change the very fundamentals of the way we understand matter works.

On a theoretical level, yes. But in practice nothing will change.
Elementary particle physics hasn't brought up anything useful since 1930.

The current theory (the Standard Model) fits all experimental evidence to huge precision. Its problems concern more what happened before and directly after the Big Bang, a situation clearly not relevant to us now.