Barring everything that I don't know about (which is everything), Ace, I was running the numbers while ignoring the fact that they wouldn't "scale linearly," as you say. I knew something like that would probably make my calculations moot. But Ace, what would happen? Would the end computer for $350 million be more powerful than the numbers I came up with, or would it be slower?
And by your guesstimation, how much either way?
C'mon, Mr. Smart Guy, enlighten us laymen.
(Btw, I love your icon, Crypt)
It would definitely come out with less power than the numbers you were coming up with. Basically, the more nodes you add to a cluster, the lower your teraflops-per-node average will be for the cluster as a whole, because a growing share of each node's time goes to communicating with the other nodes instead of computing. To be honest, I couldn't even come close to a reasonable guess at what kind of performance you would get out of $350 million worth of Xserve G5s; it would depend on which interconnects they use and how they lay out the cluster network. The only thing I can say for sure is that it would eat the Earth Simulator alive.
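To sketch why per-node performance drops as the node count grows, here's a toy Amdahl's-law model. Every number in it is a made-up illustrative assumption, not a real Xserve G5 benchmark: the 16 GFLOPS peak-per-node figure and the `serial_fraction` (the share of work that can't be parallelized, standing in for interconnect and coordination overhead) are just placeholders so the sublinear curve is visible.

```python
# Toy model of sublinear cluster scaling (illustrative assumptions only).
# peak_per_node and serial_fraction are made up; real results depend on
# the interconnect and the cluster's network layout.

def cluster_gflops(nodes, peak_per_node=16.0, serial_fraction=0.001):
    """Amdahl-style estimate: speedup = 1 / (s + (1 - s) / N)."""
    speedup = 1.0 / (serial_fraction + (1.0 - serial_fraction) / nodes)
    return peak_per_node * speedup

for n in (1, 100, 1000, 10000):
    total = cluster_gflops(n)
    # Efficiency = achieved throughput vs. perfect linear scaling.
    print(f"{n:>6} nodes: {total / 1000:8.2f} TFLOPS "
          f"({total / (n * 16.0):.0%} of linear scaling)")
```

Even with a tiny serial fraction, the model goes from near-perfect scaling at a handful of nodes to a small fraction of linear at ten thousand, which is the "lower teraflops-per-node average" effect in miniature.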