Hello,

I was having issues using pj_sock_setsockopt() with PJ_SOL_IP on a Windows system. The call failed because it ended up addressing the socket level instead of the IP protocol level. (On Windows, SOL_SOCKET itself is 0xFFFF, so the fallback value makes PJ_SOL_IP collide with PJ_SOL_SOCKET.) I can work around the problem by adding the following to my config_site.h, or as a preprocessor flag:

    #ifndef SOL_IP
    #   define SOL_IP IPPROTO_IP
    #endif

but I was wondering whether this is the intended solution, or whether this is actually a bug.

In pjlib\src\pj\sock_bsd.c, PJ_SOL_IP is set to SOL_IP if that macro exists; it does not exist on Windows, so 0xFFFF is used instead:

    /*
     * Socket level values.
     */
    const pj_uint16_t PJ_SOL_SOCKET = SOL_SOCKET;
    #ifdef SOL_IP
    const pj_uint16_t PJ_SOL_IP = SOL_IP;
    #else
    const pj_uint16_t PJ_SOL_IP = 0xFFFF;
    #endif /* SOL_IP */

Searches for IPPROTO_IP on the internet show that 0 is the common value, not 0xFFFF. Was 0xFFFF chosen for *nix compatibility?

On my system, SOL_SOCKET and IPPROTO_IP are defined in:

    C:\Program Files\Microsoft SDKs\Windows\v6.0A\Include\WinSock2.h
    C:\Program Files\Microsoft SDKs\Windows\v6.0A\Include\ws2def.h

Thank you,
Yann