Patrick Horgan wrote:
debugger_gcc wrote:
> hello,
> For the scenario where I declare a variable as int \uA0A0; in a C program, I get an error, "error: universal character \uA0A0 is not valid in an identifier", when compiling with gcc 4.4.1 on Linux. However, the only constraints the C99 standard (ISO/IEC 9899:TC3) places on universal characters are those in Section 6.4.3.

To which I had replied:
> If you back up to 6.4.2, you'll find out that a universal character can't be the first character of an identifier.

I was completely wrong in that. A universal character CAN be the first character of an identifier, as long as it does not designate a digit and is listed in Annex D. \uA0A0 is not listed in Annex D, which is why gcc rejects it.
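Here is a minimal sketch of how the rule plays out. The identifier name \u00C5ngstrom is just an example I made up: \u00C5 sits in the Annex D range 00C0-00D6 and does not designate a digit, while \uA0A0 falls in no Annex D range. With gcc around 4.4 you will likely also need -std=c99 -fextended-identifiers before universal character names are accepted in identifiers at all.

[code]
#include <stdio.h>

/* \u00C5 (LATIN CAPITAL LETTER A WITH RING ABOVE) is in the
   Annex D range 00C0-00D6 and is not a digit, so C99 permits it
   even as the first character of an identifier.                 */
int \u00C5ngstrom = 1;

/* \uA0A0 is not in any Annex D range, so uncommenting this line
   reproduces the diagnostic from the original post:
   error: universal character \uA0A0 is not valid in an identifier
int \uA0A0;
*/

int main(void)
{
    printf("%d\n", \u00C5ngstrom);
    return 0;
}
[/code]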
Patrick