J.C. Pizarro wrote:
Q: What does it print?
A: Random data.
Q: How does it print?
A: Stochastically.
I said "it prints stochasticly random data". Is it wrong?
Yes.
Stochastic, Adjective: of or pertaining to a process involving a
randomly determined sequence of observations each of which is considered
as a sample of one element from a probability distribution.
It's not "stochastically" printing anything. In fact, for any given
instance of the compiled unit, it's entirely predictable what's going to
happen: just look at the machine code. And it's hardly random data
anyway. Five 32-bit numbers where most of the bits are the same are
hardly "random." At best they're different, and more correctly, undefined.
You don't need to invent terminology.
int a;                   /* indeterminate value */
printf("a == %d\n", a);  /* reading a here is undefined behavior */
That's not "random," nor is it "stochastic," or even "perplexing!" for
that matter. It's undefined. I can't tell you what it will print.
But I can justify what it did print [if that makes any sense...].
Tom