Following my computer's slow demise into oblivion, I've started to look at a rebuild; any opinions on these components? I'm going for the bare minimum and trying to salvage what I can from my current machine (which in turn I'm thinking of cannibalising into a Linux machine for various bodging purposes), although I'm not sure what to do with the IDE drives...
CPU
AMD Athlon 64 X2 5600+ AM2 2.8GHz
Motherboard
Asustek AM2 nForce 590 SLI ATX 8 ch Audio 2xLan W/Less Lan
RAM
Crucial 2x1GB 240-Pin DIMM PC2-6400 Unbuffered Non-ECC CL4
Graphics Card
Gigabyte GeForce 8800GTS 320MB PCIE DVI
Hard drive
Western Digital Caviar 160GB 7200RPM S300 16MB
DVD
LiteOn DVD+-R/RW/RAM 20x Internal SATA Black Nero 7
PSU
Dabs Value 650W PSU Black 12cm SATA PFC
I find it, frankly, terrifying getting back into the PC buying game after all this time. And expensive.
4
Thesizzler wrote: Bare minimum? That thing is like three times better than mine!
Future-proofing.
Hunter wrote: AMD Athlon 64 X2 5600+ AM2 2.8GHz?
Dual-core, 64-bit, AM2 socket. I believe. Granted, very few apps use multithreading, but I've read that (for example) BioShock does. Plus, using an AM2 socket means I should be able to upgrade to, say, a quad core when it becomes necessary.
I've been away far too long.
At least, that's what I think.
10
"...Granted, very few apps use multithreading,..."
This is true, but there is usually something going on in the background, and the OS can divvy up the chores, so to speak. While lighting mines I will open SpeedFan, which monitors the usage of each core. The core that is running radiosity will be maxed out. The other core will vary, from a few percent just running SpeedFan, to whatever another app will take. With a single-core processor, lighting the mine was very risky and would fail if any other app opened, for example if Norton was scheduled to run a virus scan and I forgot to disable it. (Although using W98 at the time may have been a factor, also... lol!) So the second core seems to act as a backup, a reserve of computing power. Based on my observation, that is.
(See this machine here.)
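The pattern described above can be sketched in code. This is a minimal illustration (in Python, with a hypothetical `heavy_job` standing in for the radiosity pass): the heavy work runs in its own process, so the OS can schedule it on one core while the main program keeps working, leaving the second core "open for business".

```python
# Sketch: run a CPU-bound job in a separate process so the OS can place it
# on its own core, while the main process stays free for other work.
# `heavy_job` is a made-up stand-in for a lighting/radiosity pass.
from multiprocessing import Process, Queue

def heavy_job(n, out):
    # stand-in for CPU-bound work: sum of squares up to n
    total = sum(i * i for i in range(n))
    out.put(total)

if __name__ == "__main__":
    q = Queue()
    p = Process(target=heavy_job, args=(100_000, q))
    p.start()
    # the main process is still responsive here, like the second core
    # that only runs SpeedFan and whatever else comes along
    other_work = sum(range(1000))
    p.join()
    print(q.get(), other_work)
```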
11
WillyP wrote: ...This is true, but there is usually something going on in the background, and the OS can divvy up the chores, so to speak... So the second core seems to act as a backup, a reserve of computing power...
That's not really multithreading, though. Multithreading (in this context meaning splitting a workload across multiple parallel processing units, as opposed to software multithreading, which pretty much every application does) is a program using synchronized execution threads spread across parallel processors, or more specifically, processing paths: multiple ALUs and so on (I forget the proper name). The synchronization is the important bit; the problem with dual core is thread contention across the cores, which is why you see single-threaded programs running on one core.
Of course the OS will split the running tasks across cores, but this is not really very different from time-slicing them on a single core... arguably it can even be worse, as it's harder to achieve peak performance: you can't (AFAIK) perfectly load-balance a single program across two discrete CPU cores.
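To make the distinction concrete, here is a rough sketch (in Python, using processes as the parallel units, since that is what maps onto separate cores there) of a single job genuinely split across cores, as opposed to the OS just parking unrelated tasks on different cores. The function names are invented for illustration.

```python
# Sketch: one workload deliberately partitioned across parallel workers,
# then recombined - the "synchronized threads" idea, as opposed to the
# OS merely scheduling independent programs onto different cores.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=2):
    # carve [0, n) into one chunk per worker and fan them out
    step = n // workers
    chunks = [(k * step, n if k == workers - 1 else (k + 1) * step)
              for k in range(workers)]
    with Pool(workers) as pool:
        # pool.map is the synchronization point: it waits for every
        # chunk before the partial results are combined
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(1000))
```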
12
I understand what you're saying, but I think that one core running at 100% and another running at 5% is still faster than one single core running at the same clock speed. So what if it's not true multithreading, or not optimized... Of course, having two cores doubles your potential computing power, not your actual computing power. But if a single core is at 100%, well, that's all there is. The same app running one core of two at 100% still leaves the other core open for business.
13
WillyP wrote: ...one core running at 100% and another running at 5% is still faster than one single core that is running at the same clock speed... the same app running one core of two at 100% still leaves the other core open for business.
Well, the point is that if you look at bang-for-buck, it's inefficient. Future-proofed, of course (since more and more programs will be parallelized), but not actually great value. Yeah, you can get two ~2.8GHz cores, but they probably won't be as good as a theoretical (and cheaper) 5GHz single core would be. Not that you can compare on the basis of clock speed alone, of course, and now I think of it I'm not sure whether a multicore speed rating is based on individual or combined throughput.
Especially if you have shared resources like cache or memory manager; for example I think the Intel chips share L2 cache, which could possibly cause contention between (hardware) threads.
Anyways, it's not really taken advantage of at the moment. It's partially used; if you say you have a 100% used core and a 5% used core, then really it's highly inefficient. I'm just betting it won't be in the future, now that these things are taking off.
14
Well, C# is supposed to automatically adopt multithreading using a generic thread pool (though there are some doubts as to exactly how much time this will save), so multithreading will certainly become more commonplace.
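For what the thread-pool idea looks like in practice, here is the equivalent in Python's standard library rather than C# (the C# details above are second-hand, so this is just an illustration): a generic pool hands work items to a fixed set of threads, and `transform` is a made-up per-item job.

```python
# Sketch: a generic thread pool distributing work items, the same idea
# attributed to C# above. `transform` is a hypothetical per-item task.
from concurrent.futures import ThreadPoolExecutor

def transform(x):
    # any per-item work; threads help most when this blocks on I/O
    return x * x

def pooled_map(items, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves input order even though items run concurrently
        return list(pool.map(transform, items))

print(pooled_map([1, 2, 3, 4]))  # → [1, 4, 9, 16]
```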
The big problem with multithreading, as Aldo says, is that you have to protect variables that are shared between threads; otherwise you'll get situations where one routine is trying to read a variable that another routine is trying to write, and so on. This, I think, is what causes stability problems in older programs.
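The read-while-writing hazard just described, and the standard fix, can be sketched in a few lines (Python here for illustration; the same pattern applies in C# with `lock`): a shared counter is guarded so only one thread updates it at a time.

```python
# Sketch: protecting a shared variable with a lock. Without the lock, two
# threads can read the same old value of `counter` and one update is
# silently lost - exactly the read/write clash described above.
import threading

counter = 0
lock = threading.Lock()

def safe_increment(times):
    global counter
    for _ in range(times):
        with lock:        # serialize the read-modify-write
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(10_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # → 40000 with the lock; often less without it
```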
That said, I use a lot of rendering software, like LightWave, which has had multithreaded support for a long time, and the difference in speed on the Core 2 Duo 6600 there is startling, to say the very least.