question about optimization and integer truncation

Hi,

I have a question regarding where a certain optimization takes place in gcc. Given the following code:

u32 string_area_size;
u32 cid_list_size;
u32 count;
...
cid_list_size = sizeof(struct acpica_device_id_list) +
    ((count - 1) * sizeof(struct acpica_device_id)) + string_area_size;


I get the following SSA representation:

D.27638_41 = count_7 + 268435455;
D.27639_42 = D.27638_41 * 16;
D.27640_43 = D.27639_42 + string_area_size_4;
cid_list_size_44 = D.27640_43 + 24;
D.27641_45 = (acpi_size) cid_list_size_44;

As you can see, the computation of (count - 1) * sizeof(...) has been rewritten into a form that relies on unsigned integer truncation: the constant 268435455 is 0x0fffffff, so (count + 0x0fffffff) * 16 equals (count - 1) * 16 modulo 2^32.
Can you tell me where exactly in gcc this optimization happens? (which file/function)

Thanks,
Emese

