Hi,

Would it be correct to say that the (target-machine-influenced) RTL representation carries less high-level information ("intelligence" is probably the wrong word) than the tree representation (GENERIC or GIMPLE)? Sort of like describing a circle with a point and a radius versus as a polygon?

I am trying to figure out where to place this optimization: early, while the tree is still available, or purely in RTL (where, from various reading, most other optimizations seem to take place)? Can someone suggest a good optimization (one available in the 3.4.6 era) to study?

Finally, does the 3.4.6 tree representation have a name? Was it called GENERIC, or is that 4.x only? Are the two similar (given that 4.x descends from the 3.4.x line)? I took a look at the -fdump-translation-unit output and it is 6400+ lines for a simple file. (I guess what I am looking for here are pointers on filtering out the less interesting stuff.)

The goal of the optimization I am working on is to change the way string constants are accessed on register-rich architectures like PowerPC. So another way to ask my question is: are string literals easily identifiable in RTL (I have only started looking at RTL)? I currently figure it is easier to gather information about a function's string pool from the tree representation (I guess "IL" is the GCC-approved term).

Thanks!
kevin
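
P.S. To make that last question concrete, below is roughly the kind of tree-level scan I have in mind for a function's string pool. This is only a sketch and is not tested against the 3.4.6 sources: it assumes walk_tree() (declared in tree-inline.h in 3.4, if I remember right, and in tree.h later) and DECL_SAVED_TREE() behave the way they do in the releases I have read, and the names note_string_cst / count_string_literals are made up for illustration. The exact set of GCC-internal includes also varies by release.

/* Sketch only: walk a function body and record its string literals.  */

#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "tree.h"
#include "tree-inline.h"   /* for walk_tree, I believe */

/* Callback invoked on every tree node; note the STRING_CSTs.  */
static tree
note_string_cst (tree *tp, int *walk_subtrees ATTRIBUTE_UNUSED, void *data)
{
  if (TREE_CODE (*tp) == STRING_CST)
    {
      /* TREE_STRING_POINTER (*tp) / TREE_STRING_LENGTH (*tp) give the
         literal's bytes and length; here we only count the nodes.  */
      ++*(int *) data;
    }
  return NULL_TREE;   /* NULL_TREE means "keep walking".  */
}

/* Return how many string literals appear in FNDECL's body.  */
static int
count_string_literals (tree fndecl)
{
  int count = 0;
  walk_tree (&DECL_SAVED_TREE (fndecl), note_string_cst, &count, NULL);
  return count;
}

If the same information is just as easy to get at from RTL (scanning for SYMBOL_REFs into the constant pool, maybe?), I would be happy to hear that instead.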