On 08/21/2017 08:11 AM, Ilias Stamatis wrote:
There are two scenarios here, in my opinion: backup, and emergency backup :-)

As I've previously stated, performance is important. It should not take forever to process a 100-million-entry database. The tool should use multiple database files (id2entry + friends) if that lets us generate the LDIF faster. But if some of those indexes are corrupted, then we need an alternative algorithm that generates the LDIF from id2entry alone. Also, if we are dealing with a corrupted db, then performance is not important; recovery is. So if we can do it fast, do it; otherwise grind it out.

All that being said, there is something we need to consider which I don't have an answer for: when databases do get corrupted, which files typically get corrupted? Is it the indexes, or is it id2entry? To be honest, database corruption doesn't happen very often, but the tool should be smart enough to realize that the data could be inaccurate. Perhaps a parent entry could be missing, etc. So the tool should be robust enough to use multiple techniques to complete an entry, and if it can't, it should log something, or better yet write the entry to a rejects file that an admin can take and repair manually.

I know this is getting more complicated, but we need to keep these things in mind.

Regards,
Mark
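The fallback strategy described above could be sketched roughly as follows. This is a hypothetical illustration only: the helper names, the in-memory dict representation of id2entry, and the index consistency check are all invented for the sketch, not actual 389-ds APIs or file formats.

```python
class CorruptIndexError(Exception):
    """Raised when an index fails a consistency check (hypothetical)."""

def indexed_order(indexes):
    """Fast path: derive an export order from the index data.

    Raises CorruptIndexError if the index disagrees with itself
    (here, a trivial duplicate-id check stands in for real validation).
    """
    order = indexes.get("entryrdn", [])
    if len(order) != len(set(order)):
        raise CorruptIndexError("duplicate ids in entryrdn index")
    return order

def entry_complete(entry, id2entry):
    """An entry is complete only if its parent (when it has one) exists."""
    parent = entry.get("parentid")
    return parent is None or parent in id2entry

def export_ldif(id2entry, indexes=None):
    """Return (ldif_lines, rejected_ids).

    Prefer the fast indexed path; if the indexes are missing or
    corrupted, fall back to scanning id2entry alone (slow but
    recoverable). Entries that cannot be completed go to a rejects
    list for manual repair by an admin instead of being emitted
    as possibly inaccurate data.
    """
    try:
        ids = indexed_order(indexes) if indexes else sorted(id2entry)
    except CorruptIndexError:
        ids = sorted(id2entry)  # grind it out from id2entry only
    ldif, rejects = [], []
    for eid in ids:
        entry = id2entry.get(eid)
        if entry is None or not entry_complete(entry, id2entry):
            rejects.append(eid)
            continue
        ldif.append("dn: %s" % entry["dn"])
    return ldif, rejects
```

The point of the shape is that both failure modes land in the same place: a corrupted index silently downgrades to the id2entry-only scan, while an uncompletable entry is diverted to the rejects list rather than dropped or exported inaccurately.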
_______________________________________________
389-devel mailing list -- 389-devel@xxxxxxxxxxxxxxxxxxxxxxx
To unsubscribe send an email to 389-devel-leave@xxxxxxxxxxxxxxxxxxxxxxx