I made some simple comparisons of CPU usage between running Wine and running XP in VirtualBox. In both cases I ran exactly the same two applications with roughly the same configuration. The Wine version is 1.3.26. The XP is XP Home with SP2, trimmed down to nearly bare-bones (only the minimum number of services running). The computer is a Dell Studio laptop with an "Intel Core 2 Duo CPU P8700 @ 2.53GHz" and 4GB of RAM, running 64-bit Ubuntu 10.10. The virtual machine for XP is configured with 1 CPU and 512MB of RAM. In each case I used "top" to measure CPU usage. The results are as follows.

These numbers are from running Wine:

Code:
  PID USER      PR  NI  VIRT   RES   SHR S %CPU %MEM    TIME+  COMMAND
13770 cnbiz850  20   0 2665m  110m   15m S   16  2.8  1:12.70  mytrader2009.ex
14073 cnbiz850  20   0 2632m   60m   11m S   14  1.5  0:45.51  TradeBlazer.exe
14230 cnbiz850  20   0  9036  6080   708 S   10  0.2  0:29.95  wineserver
13773 cnbiz850  20   0  7084  4356   692 S    6  0.1  0:35.47  wineserver
14258 cnbiz850  20   0 2603m   31m   14m S    2  0.8  0:06.99  tbdatacenter.ex

The following numbers are from running XP:

Code:
  PID USER      PR  NI  VIRT   RES   SHR S %CPU %MEM    TIME+  COMMAND
20603 cnbiz850  20   0 1268m  649m  599m S   23 16.5  3:56.50  VirtualBox

Notice that the difference in CPU usage is pretty significant: in the Wine case the total is about 48% (16 + 14 + 10 + 6 + 2), while in the XP case it is 23%. I have been under the impression throughout the years that Wine uses far fewer resources than a virtual machine, and I feel I understand why that should be true in theory. Can anyone explain the results above? Is there anything wrong, or am I missing something?
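In case anyone wants to reproduce the comparison, here is roughly how I'd total the Wine-side usage from top's batch mode instead of adding the rows by hand. This is only a sketch: the process-name patterns come from my listing above, and it assumes top's default column layout, where field 9 is %CPU. Since the first sample top prints is an average since each process started, it takes two samples a second apart and only sums the second one.

Code:
top -b -n 2 -d 1 | awk '
  /%CPU/ { sample++ }       # each header row marks the start of a new sample
  sample == 2 && /mytrader|TradeBlazer|wineserver|tbdatacenter/ { total += $9 }
  END { printf "Wine total: %.1f%%\n", total }'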