List,

I have a client process that is causing some performance issues with my app. On my current platform, the standard automated profiling tools are not available, so I can't easily determine where the bottleneck in the process is. I have an inkling that rewriting the client process as a stored procedure (implemented in C) will give at least a modest speed boost.

What I'm trying to evaluate is the risk proposition of my effort. I really need a big bang for the buck, and I'm willing to risk a week of effort if the general consensus is that it's a good bet. I estimate a week to re-implement it as a stored procedure, vs. about three weeks to manually profile what I need. Will I find out in three weeks what I already suspect? And if I do profile it and fine-tune the slow points, would the re-architected client STILL run faster as a stored proc?

So, any sage words of wisdom from those who've been down this path? I realize that without specifics no one can say for sure; general feelings from those with experience are welcome.

Process specifics: the client process currently runs once a night, on a separate machine from the database. Values in one table are updated by evaluating user-defined formulas against values in another table. Here's the pseudo-code:

    for each formula_row in formula_table:
        newValue = evaluate(formula_row.body)
        update formula_table set value = newValue where sid = formula_row.sid

The evaluate function is a typical lex/yacc parser. The formula body may refer to other formulas in the table, or to constant values in another table.

If I do implement it as a stored proc, I'd likely add triggers so that the formulas are re-evaluated whenever values in the constants table are updated.
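To make the shape of the job a bit more concrete, here's a rough sketch of that loop written as a generic Python DB-API 2.0 client, for illustration only. The real process isn't Python; the connection handling, table and column names, '%s' parameter style, and the evaluate() wrapper around the parser are all placeholders, not my actual code:

    # Rough sketch of the nightly recalculation, as a generic DB-API 2.0
    # client.  Illustration only: connection setup, table/column names,
    # the '%s' parameter style (which varies by driver), and the
    # evaluate() parser wrapper are placeholders.

    def recalc_formulas(conn, evaluate):
        cur = conn.cursor()
        cur.execute("SELECT sid, body FROM formula_table")

        for sid, body in cur.fetchall():
            # evaluate() is the lex/yacc parser; it may look up other
            # formulas or rows in the constants table, so each pass can
            # add more round trips between the client box and the server.
            new_value = evaluate(body)
            cur.execute(
                "UPDATE formula_table SET value = %s WHERE sid = %s",
                (new_value, sid),
            )

        conn.commit()

What the sketch highlights is that every formula costs at least one UPDATE round trip between the client machine and the database, plus whatever lookups evaluate() makes along the way, and that per-row chatter is the overhead I suspect moving the logic server-side would remove.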