I am looking into getting a new workstation, but I am confused about the tradeoffs between number of cores, number of processes and processor speed. Can anyone point me to a good source to help me to decide which computer to buy with Gauss in mind? I want to use it for ecological simulation, and will have large matrices and arrays. Thanks so much, Peter.
It will depend on the operations in the algorithm and how the data is accessed.
The biggest bottleneck is usually moving data from memory to the CPU, rather than the computation itself. So I would look more at memory bandwidth than at CPU clock speed.
If your simulations are independent, then I would definitely go for more cores, rather than faster cores.
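To illustrate why independent runs favor more cores: each run can simply be farmed out to its own core with no communication between them. A minimal Python sketch of the idea (GAUSS has its own threading facilities; `run_simulation` here is a hypothetical stand-in for one simulation run, not anything from your code):

```python
from multiprocessing import Pool

def run_simulation(seed):
    # Hypothetical stand-in for one independent simulation run;
    # it just reduces a deterministic series derived from the seed.
    return sum((seed * k) % 7 for k in range(1000))

if __name__ == "__main__":
    # Each seed is an independent run, so throughput scales with
    # the number of worker processes (i.e. with core count).
    with Pool(processes=4) as pool:
        results = pool.map(run_simulation, range(8))
    print(len(results))
```

Because the runs never talk to each other, doubling the cores roughly doubles how many runs finish per hour, which is usually a better buy than a small clock-speed bump.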
Are there any particular workstations you are considering?
| Component | Qty |
| --- | --- |
| Intel i9-12900 (16 cores / 24 threads, 2.4 GHz base, up to 5.1 GHz, 30 MB cache) | 1 |
| ASUS TUF GAMING Z690-PLUS WIFI D4 motherboard | 1 |
| Micron Crucial P5 Plus 1 TB M.2 PCIe 2280 solid-state drive | 2 |
| ASUS CPU water-cooling radiator | 1 |
| Leadtek T400 graphics card (4 GB GDDR6, 64-bit, CUDA) | 1 |
| Seasonic FOCUS PX-750 (750 W, 2 x 8-pin, platinum, fully modular) power supply | 1 |
| Logitech MK295 keyboard and mouse | 1 |
| Windows 10 Pro (English) 64-bit | 1 |
| ASUS TUF Gaming case | 1 |
Thanks so much for responding to my question. This is a workstation that I have been quoted. The simulations that I am doing involve iterative matrix operations on very large nonsparse matrices. The workstation is for development of the simulation program. I will ultimately run it on an HPC system to fully investigate it.
That looks like a good computer for your purposes. The only thing I might consider changing is the RAM, depending on the size of the matrices and whether you will run multiple simulations at once.
If your total loaded data will be more than 1/4 of your total RAM, you might want more. For context, a dense 100,000 x 1,000 matrix of double-precision (8-byte) values takes about 100,000 x 1,000 x 8 bytes = 800 MB. So if you had 20 of those resident at a time (about 16 GB, which is already 1/4 of 64 GB), I would increase the memory from 64 GB to 128 GB.
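The arithmetic above is easy to adapt to your own matrix sizes. A small Python sketch (the dimensions are just the examples from this thread, and `matrix_mb` is a throwaway helper, not a library function):

```python
def matrix_mb(rows, cols, bytes_per_element=8):
    """Memory footprint of a dense matrix in MB, assuming
    8-byte double-precision elements by default."""
    return rows * cols * bytes_per_element / 1e6

# One 100,000 x 1,000 dense double matrix:
one_matrix = matrix_mb(100_000, 1_000)      # 800.0 MB
# Twenty of them resident at the same time:
total_gb = 20 * one_matrix / 1000           # 16.0 GB
print(one_matrix, total_gb)
```

Plug in your actual row and column counts (and element size, if you store something other than doubles) and compare the total against 1/4 of the RAM in the quote.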
Let us know if you have any more questions!