I'm running some Monte Carlo simulation procedures that basically involve drawing random Bernoulli variates inside a threadfor loop. The problem is that when I run GAUSS overnight, memory usage builds up until it reaches the machine's maximum, which greatly reduces the speed of the simulation.
I've cleared the output window, and there are no global or local variables accumulating in the background. It would be great if you could tell me what is causing this memory problem.
Thanks a lot!
How about using Halton draws instead? I believe the following link includes GAUSS code showing how to generate quasi-random numbers from a Halton sequence.
Visit his website for the other associated code; you might find the answer you are looking for.
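In case it helps to see the idea without GAUSS, here is a minimal Python sketch of the technique (the function names are my own, not from the linked code): it generates the Halton (van der Corput) sequence in a given base and thresholds the points against the success probability p to produce quasi-random Bernoulli draws.

```python
def halton(n, base=2):
    """First n points of the Halton / van der Corput sequence in the given base."""
    seq = []
    for i in range(1, n + 1):
        f, r = 1.0, 0.0
        # Reverse the base-`base` digits of i across the radix point.
        while i > 0:
            f /= base
            r += f * (i % base)
            i //= base
        seq.append(r)
    return seq

def bernoulli_from_halton(n, p, base=2):
    """Quasi-random Bernoulli draws: 1 if the Halton point falls below p."""
    return [1 if u < p else 0 for u in halton(n, base)]

if __name__ == "__main__":
    print(halton(4))                      # [0.5, 0.25, 0.75, 0.125]
    print(bernoulli_from_halton(8, 0.5))  # [0, 1, 0, 1, 0, 1, 0, 1]
```

Because the sequence is deterministic and low-discrepancy, it can also reduce simulation variance relative to pseudo-random draws, though it is not a fix for the memory issue itself.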