Re: Problems with working with large text files

Adam Niedzwiedzki wrote:
I have a simple PHP script that I run from the command line; it opens an
HTTP web log, processes it, then zips it when done.
If the HTTP log is under 200MB (approx) this all hums along nicely, but as
soon as the file is over 300MB PHP falls over.

Fatal error: Out of memory (allocated 378535936) (tried to allocate
381131220 bytes)
I'm running PHP 5.2.2 on Windows 2003 64-bit Enterprise.
I have my php.ini memory_limit set to -1, and in my scripts I set the
following:

;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;

max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60	; Maximum amount of time each script may spend parsing request data
memory_limit = -1		; Maximum amount of memory a script may consume (128MB)

I also have this inline in the code:

ini_set("memory_limit",-1);
set_time_limit(0);

It seems to fall over on either fopen() or gzcompress(), or both, if the
file is over 300MB.
Does anyone know of another option to tell PHP to be unlimited in its RAM
usage?
The machine it's running on has 8GB of RAM, with over 3GB free (it's a
quad Opteron box).

Anyone have any clues to help me out? :(

Yeah, don't load the whole frickin' log into memory at the same time. Refactor your code so it can process the log line by line and you'll save yourself many, many headaches in the future. I've never come across a good reason to load a large file into memory all at once just to "process" it.
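As a minimal sketch of what that looks like: read with fgets() and write straight into a gzopen() stream, so only one line is ever in memory regardless of the log's size. The file names here are just placeholders for illustration (and the sample-log line at the top only exists so the snippet runs standalone):

```php
<?php
// Hypothetical paths -- point these at your real log.
$srcPath = 'access.log';
$dstPath = 'access.log.gz';

// Tiny sample log so this sketch runs standalone.
file_put_contents($srcPath, "GET /index.php 200\nGET /foo.php 404\n");

$in  = fopen($srcPath, 'rb');
$out = gzopen($dstPath, 'wb9'); // compression level 9, streamed straight to disk

// Only the current line is held in memory, no matter how big the log is.
while (($line = fgets($in)) !== false) {
    // ...do your per-line processing here...
    gzwrite($out, $line);
}

fclose($in);
gzclose($out);
```

Unlike gzcompress(), which needs the entire string in memory at once, gzwrite() compresses incrementally, so a 300MB (or 3GB) log costs you roughly one line's worth of RAM.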

-Stut

--
PHP Windows Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

