Re: a better way to do a data import?

If you unset/NULL out *every* variable, then you should not run out of
RAM...
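
Just to make that concrete, here is a stripped-down version of that kind
of loop with the cleanup spelled out. The table and column names
('widgets', id/name/price) are ones I made up, and note that the query
*result* holds memory too, not just the PHP arrays:

<?php
// Sketch only: 'widgets' and its columns are invented names.
$db = pg_connect('dbname=mydb');
$defaults = array('id' => null, 'name' => null, 'price' => null);

$fh = fopen('import.txt', 'r');
while (($cols = fgetcsv($fh, 4096, "\t")) !== false) {
    $row = array_combine(array_keys($defaults), $cols);

    $res = pg_query_params($db,
        'SELECT id, name, price FROM widgets WHERE id = $1',
        array($row['id']));
    $existing = pg_fetch_assoc($res);
    pg_free_result($res);   // result sets pin memory until freed

    if ($existing) {
        // array_diff_assoc compares key/value pairs, so it yields
        // exactly the columns whose values changed
        $changed = array_diff_assoc($row, $existing);
        // ... UPDATE only the columns in $changed for this id ...
    } else {
        // ... INSERT the new row ...
    }

    unset($row, $existing, $changed, $res);   // NULL out *everything*
}
fclose($fh);
?>

One more thing worth checking: plain array_diff ignores the keys when it
compares, so it can misreport which columns changed; array_diff_assoc is
the one that compares column-by-column.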

It might be a heck of a lot faster to bulk-load the file into a temp
table (LOAD DATA INFILE in MySQL; COPY in Postgres, which is what
you're on), and then use SQL statements to compare the data and
update/insert any altered/missing data...
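
In Postgres terms that comes out roughly like the sketch below. Again,
'widgets' and its columns are invented names, and note that
COPY ... FROM 'file' reads a file on the *server* and needs superuser
rights; from PHP you'd likely use pg_copy_from() or psql's \copy
instead:

<?php
// Sketch of the staging-table route; 'widgets' is a made-up table.
$db = pg_connect('dbname=mydb');

// 1. Bulk-load the file into a temp table shaped like the real one
//    (COPY is the Postgres counterpart of LOAD DATA INFILE).
pg_query($db, "CREATE TEMP TABLE widgets_in (LIKE widgets)");
pg_query($db, "COPY widgets_in FROM '/tmp/import.txt' WITH DELIMITER E'\\t'");

// 2. Update rows that exist but have changed.
pg_query($db, "
    UPDATE widgets w
       SET name = i.name, price = i.price
      FROM widgets_in i
     WHERE w.id = i.id
       AND (w.name IS DISTINCT FROM i.name
            OR w.price IS DISTINCT FROM i.price)");

// 3. Insert rows that are missing entirely.
pg_query($db, "
    INSERT INTO widgets (id, name, price)
    SELECT i.id, i.name, i.price
      FROM widgets_in i
     WHERE NOT EXISTS (SELECT 1 FROM widgets w WHERE w.id = i.id)");
?>

That's three statements instead of 200k round-trips, and all the
comparing happens inside the database.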

I do something similar with about the same number of rows, so it can
be done...

On Mon, January 21, 2008 11:35 am, blackwater dev wrote:
> I have a text file that contains 200k rows. These rows are to be
> imported into our database. The majority of them will already exist,
> while a few are new. Here are a few options I've tried:
>
> I've had PHP cycle through the file row by row, and if the row is
> there, delete it and do a straight insert, but that took a while.
>
> Now I have PHP get the row from the text file and do array_combine
> with a default array I have in the class, so I can have key/value
> pairs. I then take that generated array and do array_diff against
> the data array I pulled from the db, which leaves me the columns
> that are different, so I do an update on only those columns for that
> specific row. This is slow, and after about 180,000 rows PHP throws
> a memory error. I'm resetting all my vars to NULL at each iteration,
> so I'm not sure what's up.
>
> Anyone have a better way to do this? In MySQL, I could simply do a
> REPLACE on each row... but not in postgres.
>
> Thanks!
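
(And on the REPLACE point: Postgres has no native upsert, so the usual
per-row stand-in is "try the UPDATE first, INSERT only if nothing
matched". A minimal sketch, with invented names, and race-prone if two
writers hit the same id at once:)

<?php
$db = pg_connect('dbname=mydb');                      // assumed connection
list($id, $name, $price) = array(1, 'foo', '9.99');   // example values

$res = pg_query_params($db,
    'UPDATE widgets SET name = $2, price = $3 WHERE id = $1',
    array($id, $name, $price));

if (pg_affected_rows($res) === 0) {
    // No existing row matched, so insert it instead.
    pg_query_params($db,
        'INSERT INTO widgets (id, name, price) VALUES ($1, $2, $3)',
        array($id, $name, $price));
}
?>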


-- 
Some people have a "gift" link here.
Know what I want?
I want you to buy a CD from some indie artist.
http://cdbaby.com/from/lynch
Yeah, I get a buck. So?


