Re: How to turn a glibc internal variable into an externally resolvable symbol?

"Haizhi Xu" <xuhaizhi@xxxxxxxxxxx> writes:

> I have a glibc internal variable named "__libc_missing_32bit_uids"
> declared in getuid.c
> The original declaration is
> int __libc_missing_32bit_uids attribute_hidden = -1;
> 
> Now I NEED to turn it into an external symbol. So I changed the above
> code to
> int __libc_missing_32bit_uids = 1;
> weak_extern (__libc_missing_32bit_uids);
> 
> After compiling, it does not work as I expected.
> From getuid.os, it seems right...
> 00000000    w   O   .data   00000004 __libc_missing_32bit_uids
> 
> But in libc.so, it is changed to a local variable..
> 001166d8  l    O  .data 00000004  __libc_missing_32bit_uids
> 
> Can anybody tell me what I need to do?  What does the 'O' mean in the
> third column of the objdump output?

This is probably not a gcc question.  It sounds more like a glibc
question or a binutils question.

My guess is that it became a local variable in libc.so because glibc is
linked with a symbol version script (generated from its Versions files)
which forces every symbol that is not explicitly listed there to be
local.  But I don't know for sure.  It's an issue of how glibc is built.
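
To see that mechanism in isolation, here is a minimal sketch; the file
names, symbol names, and version script below are invented for
illustration and are not taken from the glibc build:

    /* example.c */
    int my_exported_var = 1;   /* listed in the version script: stays global */
    int my_hidden_var   = 2;   /* not listed: the linker demotes it to local */

    /* example.map */
    EXAMPLE_1.0 {
      global:
        my_exported_var;
      local:
        *;
    };

    $ gcc -fPIC -shared -Wl,--version-script=example.map example.c -o libexample.so
    $ objdump -T libexample.so | grep my_   # only my_exported_var is dynamic
    $ objdump -t libexample.so | grep my_   # my_hidden_var is now 'l ... O'

If a version script is indeed the cause in your case, changing the C
declaration alone will not help; the symbol would also have to be
listed in the appropriate Versions file so that it survives into
libc.so as a global.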

The 'O' in the objdump output means that the symbol names an object,
as opposed to a function.
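
For comparison, with an invented file like

    /* symdemo.c */
    int some_object = 42;      /* object: type column shows 'O' */
    int some_function(void)    /* function: type column shows 'F' */
    {
        return some_object;
    }

'objdump -t symdemo.o' would list some_object with 'O' in the type
column and some_function with 'F'; the exact addresses and sizes will
of course differ.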

Ian
