> Hi,
> I want to know what are the best practices to use in order to
> decompose a big query which contains so many joins. Is it
> recommended to use stored procedures? Or is there any other
> solution?

The main problem with stored procedures is that they can really screw
up optimization. Most of the time you'll get more bang for the buck by
managing the SQL itself, breaking it into more manageable chunks.

Views can be a big help for this.

Materialized views can help by pushing lock-heavy or long-running
subqueries into the wee hours, where resources can be managed a bit
more easily (e.g., you don't collide with interactive queries, or you
may be able to just lock the tables and be done with it). They also
give you a more stable dataset to start with (e.g., no need to worry
about concurrent inserts throwing off totals, or needing table locks
to avoid that).

Temporary tables can help by breaking out messy portions of the query
and allowing you to pre-filter portions of the result.

Scratch tables (i.e., a permanent table that gets truncated after use)
can also help by staging portions of the query in indexed areas. With
careful use they can be a big help because you can pre-aggregate
portions of the query into a table with nice, specific indexes.

Many of the worst queries involve reporting on aggregates over time
that are re-calculated for each report. Pre-aggregating on, say, a
daily basis can both simplify the reporting query and allow you to
push some of the work off to the wee hours. You also get the benefit
of more specific values, which may allow for unique indexes on the
subtotals.

If the query involves "many" joins there is a good chance that they
break into clumps of related data. Breaking, say, daily usage vs. user
account vs. accounting data into separate [materialized] views or temp
tables keeps the joins more manageable and helps release resources
that might otherwise get consumed for the entire query.

Rough sketches of each of these follow. The syntax is
Postgres-flavored, and all of the table and column names are made up
for the sake of example.
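A view wrapping one clump of joins, so the big query touches one
relation instead of two (hypothetical users/accounts tables):

  create view user_account_v as
  select u.user_id,
         u.login,
         a.account_id,
         a.plan_type
    from users    u
    join accounts a on a.account_id = u.account_id;

  -- the reporting query now joins the clump as a single unit:
  --   select ... from daily_usage d
  --     join user_account_v u on u.user_id = d.user_id;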
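A materialized view for a long-running aggregate, refreshed from cron
in the wee hours (assumes a Postgres recent enough to have them; the
unique index is what lets the refresh run concurrently with readers):

  create materialized view usage_summary_mv as
  select user_id,
         used_at::date   as use_date,
         sum(bytes_used) as total_bytes
    from usage_log
   group by user_id, used_at::date;

  create unique index usage_summary_mv_ux
      on usage_summary_mv (user_id, use_date);

  -- nightly, from the scheduler:
  --   refresh materialized view concurrently usage_summary_mv;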
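A temp table pre-filtering one messy portion before the main join; the
analyze hands the planner real row counts for the reduced set:

  create temporary table recent_users as
  select user_id, account_id
    from users
   where last_login > now() - interval '30 days';

  analyze recent_users;

  select r.user_id, sum(l.bytes_used)
    from recent_users r
    join usage_log    l on l.user_id = r.user_id
   group by r.user_id;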
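A scratch table doing the same staging job, but permanent, so its
nice, specific indexes survive between runs:

  -- report_scratch is an ordinary table with its own indexes, e.g.
  --   create index report_scratch_ix
  --       on report_scratch (account_id, use_date);

  truncate table report_scratch;

  insert into report_scratch (user_id, account_id, use_date, bytes_used)
  select u.user_id, u.account_id, l.used_at::date, l.bytes_used
    from usage_log l
    join users     u on u.user_id = l.user_id
   where l.used_at >= current_date - 30;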
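Daily pre-aggregation: fold in yesterday's rows once from a nightly
job instead of re-calculating them for every report. The primary key
here is the "unique index on the subtotals":

  create table daily_usage_totals
  ( user_id     integer not null,
    use_date    date    not null,
    total_bytes bigint  not null,
    primary key (user_id, use_date)
  );

  -- nightly job:
  insert into daily_usage_totals
  select user_id, used_at::date, sum(bytes_used)
    from usage_log
   where used_at >= current_date - 1
     and used_at <  current_date
   group by user_id, used_at::date;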
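With the clumps broken out, the final report joins a few small,
pre-digested pieces rather than the whole raw mess:

  select t.use_date,
         u.plan_type,
         sum(t.total_bytes)
    from daily_usage_totals t
    join user_account_v     u on u.user_id = t.user_id
   group by t.use_date, u.plan_type;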
--
Steven Lembark                                      1505 National Ave
Workhorse Computing                                 Rockford, IL 61103
lembark@xxxxxxxxxxx                                    +1 888 359 3508