On Mon, Mar 19, 2007 at 08:03:35PM -0400, Matthew Martz said:
> >> The databases appear to be about 450MB-600MB.
> >
> > Not bad, but what operation were you performing on the database at the
> > time? Complex queries with joins and such will drive memory/processor
> > usage up.
>
> I was selecting a database from the drop-down menu.
>
> >> On a side note, I had a requirement from the dev group here to set PHP
> >> to a maximum memory usage of 1GB due to the large files that are being
> >> parsed.
> >
> > Holy CRAP that's huge. Is this a real need, or is this just the dev
> > group wanting to not fix a problem? This seems like it could very
> > quickly get out of hand if more than a user or two decide to kick off
> > similar processes.
>
> I have no idea where the file comes from, but it is a 600-800MB delimited
> txt document. They use a web page to parse the file and create an Excel
> spreadsheet.
>
> I don't know what the txt file contains or why they need it converted to
> xls.
>
> In any case, I am told to make the system capable of meeting the dev
> team's needs.

I think what you are running into is that phpMyAdmin isn't the right tool
for this. It sounds like phpMyAdmin is loading the entire file into memory
and working on it there... That works fine with smaller files but doesn't
scale.

We ran into a similar need on an internal project and wrote our own tool
for importing and manipulating GIS data, which can be huge (many gigs).
Our tool is quite complex because we auto-identify file formats and columns
based on content, but it's trivial if you don't need that level of
complexity. Perl is *Perfect* for this kind of thing.
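
For what it's worth, here is a rough, untested sketch of the streaming
approach, assuming the input is tab-delimited and that a .csv Excel can
open is an acceptable stand-in for a real .xls (the file names are just
placeholders):

#!/usr/bin/perl
use strict;
use warnings;

# Stream a big delimited text file to CSV one line at a time instead of
# slurping the whole thing into memory.  Adjust the split pattern to
# whatever delimiter the file actually uses.

my ($in_file, $out_file) = @ARGV;
die "usage: $0 input.txt output.csv\n" unless defined $out_file;

open my $in,  '<', $in_file  or die "can't read $in_file: $!\n";
open my $out, '>', $out_file or die "can't write $out_file: $!\n";

while (my $line = <$in>) {            # only one line in memory at a time
    chomp $line;
    my @fields = split /\t/, $line, -1;

    # Quote any field containing a comma or a quote so Excel parses it.
    for (@fields) {
        if (/[",]/) {
            s/"/""/g;
            $_ = qq("$_");
        }
    }
    print {$out} join(',', @fields), "\n";
}

close $in;
close $out or die "close failed on $out_file: $!\n";

If they really do need native .xls rather than CSV, Spreadsheet::WriteExcel
from CPAN can write it, though I haven't checked how it behaves memory-wise
on files that size.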