RE: Copying large volumes of data to a DB

Hi,

Sort of yes.

It is actually transmitted as SOAP, but I'm using a PHP PEAR package
called Services_Ebay, which manages the SOAP/XML interface and presents
the data to me as UTF-8 text.
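For what it's worth, here is a minimal sketch of the "buffer n records,
then flush" idea from my original question. The table name
`ebay_categories`, its columns, and the sample data are made up purely
for illustration; real code would run the SQL through mysql_query() (and
escape values properly) rather than echoing it:

```php
<?php
// Sketch: collect category records as they arrive, and every $batchSize
// rows issue ONE multi-row INSERT instead of one query per record.
// This keeps memory bounded on a slow link while cutting DB round-trips.

$batchSize = 500; // tune for memory use vs. number of queries

// Build a single multi-row INSERT statement from buffered records.
// NOTE: addslashes() is only for the sketch; real code should use
// mysql_real_escape_string() on a live connection.
function build_insert(array $buffer)
{
    $rows = array();
    foreach ($buffer as $cat) {
        $rows[] = sprintf("(%d, '%s')", $cat['id'], addslashes($cat['name']));
    }
    return 'INSERT INTO ebay_categories (id, name) VALUES '
         . implode(', ', $rows);
}

// Stand-in for the stream of categories Services_Ebay hands back.
$incoming = array(
    array('id' => 1, 'name' => 'Antiques'),
    array('id' => 2, 'name' => 'Books'),
    array('id' => 3, 'name' => 'Coins'),
);

$buffer = array();
foreach ($incoming as $cat) {
    $buffer[] = $cat;
    if (count($buffer) >= $batchSize) {
        echo build_insert($buffer) . "\n"; // mysql_query() in real life
        $buffer = array();
    }
}
if (!empty($buffer)) {
    echo build_insert($buffer) . "\n"; // flush whatever is left over
}
?>
```

The per-batch flush means nothing is lost if the line drops mid-download,
and the DB never sees the whole multi-megabyte payload in one transaction.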

Thanks

Alan 

-----Original Message-----
From: Bastien Koert [mailto:bastien_k@xxxxxxxxxxx] 
Sent: 04 July 2005 23:39
To: lord_alan@xxxxxxxxxxx; php-db@xxxxxxxxxxxxx
Subject: RE:  Copying large volumes of data to a DB

How is the data coming down? Text?

Bastien

>From: Alan Lord <lord_alan@xxxxxxxxxxx>
>To: php-db@xxxxxxxxxxxxx
>Subject:  Copying large volumes of data to a DB
>Date: Mon, 04 Jul 2005 22:45:04 +0100
>
>Hi All,
>
>I'm starting to build an app to manage my eBay sales. Now one of the 
>things I need to do is to write the Category information into a local
>DB.
>
>Once the initial download is done I'm O.K. about updating the content,
>but my question relates more to this:
>
>What is the best way to manage the Mbytes of data which will be coming
>down a relatively slow line (DSL) and then write it into my DB?
>
>Should I have some sort of buffer where I read n bytes or records,
>write those to the DB, and repeat until the process is finished? Or do
>I wait until the lot has downloaded and then write it to my DB in one
>big gulp?
>
>I've done plenty of reading/writing to/from DBs, but it has always
>been single records in the past, and I'm just wondering about the
>"best" way to handle this?
>
>Thanks in advance
>
>Alan
>
>

-- 
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

