Recently some folks were having trouble with podracer parsing some feeds, and I asked whether any of them were from feedburner.com. You can tell by looking at the addresses you're subscribed to. I don't use podracer, but my home-grown Perl script, plpodder, was choking on every FeedBurner feed. I just made a change and it works again.

For background, my script used the parse_file method from the XML::LibXML package to retrieve the URL and parse the feed. This worked fine until a week or so ago. The strange thing is that parse_file accepts either a URL or a local file name. When I downloaded the same feed with wget first and pointed parse_file at the local copy, it parsed fine. So something is off with how XML::LibXML->parse_file fetches documents over HTTP, as opposed to reading local files. I now wget the feed and run parse_file on the downloaded file, and it seems to be working perfectly. I only have one feed that's consistently broken, and those look like legitimate parse errors; the errors with the old method claimed the file was empty.

Anyway, podracer may be suffering from something similar. I haven't changed anything about libxml2 or any of its components, so maybe FeedBurner changed something that triggers a timing issue when libxml fetches over HTTP. Either way, I'm getting my podcasts again, and that's what matters! :)

--
HolmesGrown Solutions
The best solutions for the best price!
http://holmesgrown.ld.net/
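P.S. In case it helps anyone hitting the same thing, here's a rough sketch of the kind of change I made in plpodder. The feed URL and temp-file path below are just placeholders, not the actual values from my script:

    use strict;
    use warnings;
    use XML::LibXML;

    # Placeholders -- substitute your real feed URL and a writable temp path.
    my $url  = 'http://feeds.feedburner.com/SomePodcast';
    my $file = '/tmp/feed.xml';

    # Fetch the feed with wget instead of letting parse_file go over HTTP.
    system('wget', '-q', '-O', $file, $url) == 0
        or die "wget failed for $url: $?";

    # Parse the local copy -- handing parse_file the URL directly is what
    # started failing on FeedBurner feeds.
    my $parser = XML::LibXML->new();
    my $doc    = $parser->parse_file($file);

    # ...then walk the DOM for enclosure URLs as before...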