J.C. Pizarro wrote:
My input sequence for the program was { 1 to 10, 1 to 10, 1 to 10, 1
to 10, 1 to 10 }
and the output sequence was { a sequence of random and inestimable data }.
Ok, first off, I'm a cryptographer so "random" has an actual meaning to
me, as opposed to "undetermined."
Second, there is a difference between "undefined" and "random."
Third, those numbers are NOT random. Five random 32-bit numbers should
have quite a few differing bits between them. If I took two random
32-bit numbers I'd expect on average 16 bits to differ between them
(more to the point, 16 is the single most likely outcome).
Fourth, it's not a "random process." The compiled instance of your
undefined code will follow a very logical and predictable course given
the current contents of uninitialized memory.
Suppose in
int a;
printf("a = %d\n", a);
the memory for "a" contained the int "5". This is undefined code, as
in, the standard has no prescribed behaviour for the resulting program.
However, it WILL print "5" every time the "a" variable contains 5. It
WILL do that. As in, it's NOT a random process.
int a;
printf("a == %d\n", a);
That's not "random," nor is it "stochastic," or even "perplexing!" for
that matter. It's undefined. I can't tell you what that will print.
But I can justify what it did print [if that makes any sense...].
Tom
It's not a "functional programming", it's an "imperative programming",
they are different.
?
Use terminology from the standard if you're going to use anything. What
I wrote is a syntactically correct program (hence it compiles). It's
just not going to produce a defined behaviour (since a is uninitialized).
Here is your homework list of words to learn
- undefined
- defined
- behaviour
- deterministic
- initialized [and uninitialized]
I understand that English is likely not your first language, but please
stop trying to invent new terminology. Explain what you mean in simple
English, and we can help you explain it in terminology prescribed by the
standards.
Tom