Disclaimer: I am a C newbie.
I wrote a program to generate a bunch of random numbers in a normal distribution with a given mean (mean) and standard deviation (stddev). The numbers are put into an array r. The number of random numbers to generate is specified by the constant NUMRANDOMS.
I wrote a function getRandom that is passed mean and stddev. It creates the random numbers and puts them into r.

I wrote two versions of getRandom. The first version creates r as a static array and returns a pointer to r:
double * getRandom(double mean, double stddev) {
    static double r[NUMRANDOMS];
    for (int i = 0; i < NUMRANDOMS; i++) {
        r[i] = normal(mean, stddev);
    }
    return r;
}
That version works perfectly no matter how big I set NUMRANDOMS. I have gone up to 100 million and it works fine.
For the second version of getRandom, I pass it the array r and the length of the array (i.e., NUMRANDOMS):
void getRandom(double r[], int len, double mean, double stddev) {
    for (int i = 0; i < len; i++) {
        r[i] = normal(mean, stddev);
    }
    return;
}
That version works fine for NUMRANDOMS values of 1,000, 10,000, or 100,000, but when I set it to one million I get the runtime error 0xC00000FD. From other posts, I learned that this error code means "stack overflow".

Why does the second version work fine until one million? Why does the first version work fine even with 100 million?
Most likely, you are allocating r on the stack in the caller, like:

    double r[NUMRANDOMS];
    getRandom(r, NUMRANDOMS, ...);
In the first version, on the other hand, you declared it static:

    static double r[NUMRANDOMS];
With static, the memory is reserved by the operating system when it loads the program, so it never touches the stack. The stack in typical operating systems is, by default, quite small (for example, 1 MB on Windows). In contrast, the BSS/data segments usually have a limit orders of magnitude higher.
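If you don't want a static array (for example, because the function must be reentrant), a third option is to allocate the array on the heap with malloc, which is also not subject to the small stack limit. A minimal sketch of this approach; the normal() here is just a placeholder stub that returns the mean, since the post does not show the real implementation:

    #include <stdio.h>
    #include <stdlib.h>

    #define NUMRANDOMS 1000000

    /* Placeholder standing in for the questioner's normal(mean, stddev);
     * the real implementation is not shown in the post. */
    static double normal(double mean, double stddev) {
        (void)stddev;
        return mean; /* placeholder value, not an actual random draw */
    }

    void getRandom(double r[], int len, double mean, double stddev) {
        for (int i = 0; i < len; i++) {
            r[i] = normal(mean, stddev);
        }
    }

    int main(void) {
        /* Heap allocation: limited by available memory, not stack size. */
        double *r = malloc(NUMRANDOMS * sizeof *r);
        if (r == NULL) {
            fprintf(stderr, "out of memory\n");
            return 1;
        }
        getRandom(r, NUMRANDOMS, 100.0, 15.0);
        printf("%f\n", r[0]);
        free(r);
        return 0;
    }

Note the if (r == NULL) check: unlike a stack overflow, a failed malloc can be detected and handled gracefully at runtime. Remember to free the buffer when you are done with it.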
User contributions licensed under CC BY-SA 3.0