Re: large database

On Dec 11, 2012 2:25 PM, "Adrian Klaver" <adrian.klaver@xxxxxxxxx> wrote:
>
> On 12/11/2012 01:58 PM, Mihai Popa wrote:
>>
>> On Tue, 2012-12-11 at 10:00 -0800, Jeff Janes wrote:
>>>
>>> On Mon, Dec 10, 2012 at 12:26 PM, Mihai Popa <mihai@xxxxxxxxxxx> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I've recently inherited a project that involves importing a large set of
>>>> Access mdb files into a Postgres or MySQL database.
>>>> The process is to export the mdb's to comma-separated files, then import
>>>> those into the final database.
>>>> We are now at the point where the csv files are all created and amount
>>>> to some 300 GB of data.
>>>
>>>
>>> Compressed or uncompressed?
>>
>>
>> uncompressed, but that's not much relief...
>> and it's 800GB not 300 anymore. I still can't believe the size of this
>> thing.
>
>
> Are you sure the conversion process is working properly?
>
Another question: is there a particular reason you're converting to CSV before importing the data?

All major ETL tools that I know of, including the major open source ones (Pentaho / Talend), can move data directly from Access databases to PostgreSQL, with no CSV intermediate step. In addition, provided the table names are the same across the Access files, you can iterate over all of the Access files in a directory in one pass.

