Hello all, and thanks for the answers. Just to clarify, I don't know C yet, so I hope I won't have to learn C to do this. About the insertions: yes, I'll probably use Perl if you say it's the fastest. Since the insertion will only happen once, or maybe once a month, I don't mind if it takes even a week. And since I'm more familiar with MySQL, I think I'd rather use it.

As for what I'll do after that: I'll probably run several regex queries and things like that. You have a point that I don't know all the queries I'll run yet, but I'll probably do them with Perl. And although PHP can handle files that large, shouldn't I split them anyway? In case of some error or for debugging, it's better to do it before than after, no?

I've read a bit about working with large databases, but since I haven't used regex much in MySQL queries, I'd like to know how long you think a simple regex search (or LIKE?) on the database would take, given that the pattern will probably appear in most of the entries. I know I can do a test search on a small sample and multiply the time by the number of entries (or something similar), but say I do it in PHP: that estimate could be way off, because the data sample I received is only about 1 MB. Multiplying that by so much could be very inaccurate, even if I loop over the file several times, because a regex search on a table with 10 entries and a regex search on a table with 9 billion entries are not the same thing at all... that is the problem. Is anyone familiar with working with databases this large?
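One thing I've gathered so far about the extrapolation question: REGEXP (and a LIKE with a leading wildcard) can't use an index, so MySQL ends up scanning every row, which is exactly why a timing taken on a 1 MB sample won't scale cleanly to billions of rows. In case it helps the discussion, this is a minimal sketch of how I was thinking of timing a test search with Perl and DBI; the database, table, and column names (mydb, entries, body) and the pattern are just placeholders for my own data, not anything suggested in the answers:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use Time::HiRes qw(gettimeofday tv_interval);

    # Connection details are placeholders -- adjust to your own database.
    my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                           'user', 'password', { RaiseError => 1 });

    # Time one REGEXP search on the sample table (hypothetical table/column).
    my $start = [gettimeofday];
    my $sth   = $dbh->prepare(
        q{SELECT COUNT(*) FROM entries WHERE body REGEXP 'some.*pattern'}
    );
    $sth->execute();
    my ($matches) = $sth->fetchrow_array();
    my $elapsed   = tv_interval($start);

    printf "Matched %d rows in %.3f seconds\n", $matches, $elapsed;
    $dbh->disconnect;

I'd run that against samples of a few different sizes rather than just the 1 MB file, to get at least a rough idea of how the time grows before trusting any multiplication.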