Hi Christian,

On Wed, 20 May 2020 at 16:00, Christian König <ckoenig.leichtzumerken@xxxxxxxxx> wrote:
>
> So I've used an ancient system (32bit) to set up a test box for this.
>
> The first GPU I could test is an RV280 (Radeon 9200 PRO) which is
> easily 15 years old.

Oh, I have one of those in a box somewhere, but no AGP machine to
install it in (yet).

> What happens in AGP mode is that glxgears shows artifacts during
> rendering on this system.
>
> In PCI mode those rendering artifacts are gone and glxgears seems to
> draw everything correctly now.
>
> Performance is obviously not comparable, because in AGP mode we don't
> render all triangles correctly.

I agree: correctness before performance, always.

> The second GPU I could test is an RV630 PRO (Radeon HD 2600 PRO AGP)
> which is more than 10 years old.
>
> As far as I can tell this one works perfectly fine in both AGP and
> PCIe mode.
>
> Since this is only a 32bit system I couldn't really test any OpenGL
> game that well.

I usually test with my distro's games (Debian or Ubuntu, in my case).
For example, I used Nexuiz when wiring up the shader cache on r300g.

> But for glxgears, switching from AGP to PCIe mode seems to result in
> a roughly 5% performance drop.
>
> The surprising reason for this is not the better TLB performance, but
> the lack of USWC support for the PCIe GART in radeon.
>
> So if anybody wants to get their hands dirty and squeeze a bit more
> performance out of the old hardware, porting USWC from amdgpu to
> radeon shouldn't be too much of a problem.

Well, FWIW, I would argue that a regression in performance, if
avoidable, should be avoided. I have nowhere near enough knowledge of
the driver to do the port myself, but I'll gladly test any patches, on
both x86-64 (Radeon Xpress 1250) and ppc32 (Mobility Radeon 9550). A
rough sketch of what I understand the USWC mapping to involve is in
the P.S. below.

Cheers,
Rui
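
P.S. For anyone who wants to pick up the USWC port: as I understand
it, the core idea is that GART-backed buffers get mapped
write-combined on the CPU side, so streaming writes are batched into
bursts instead of going out as individual uncached transactions.
Below is a minimal, illustrative sketch of what a write-combined CPU
mapping looks like at the kernel level. The actual port would go
through TTM's caching states (the way amdgpu does for buffers created
with its AMDGPU_GEM_CREATE_CPU_GTT_USWC flag) rather than a raw vmap,
and map_gart_pages_uswc is just a name I made up for illustration:

  #include <linux/vmalloc.h>
  #include <linux/mm.h>

  /*
   * Map an array of GART-backed pages write-combined on the CPU side.
   * Writes through such a mapping are buffered by the CPU and flushed
   * as bursts, which is where the USWC speedup comes from.
   */
  static void *map_gart_pages_uswc(struct page **pages,
                                   unsigned int npages)
  {
          return vmap(pages, npages, VM_MAP,
                      pgprot_writecombine(PAGE_KERNEL));
  }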
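
If it helps anyone reproduce the comparison: I believe radeon's
agpmode module parameter (radeon.agpmode=-1 on the kernel command
line) forces PCI GART mode on AGP hardware, and running glxgears with
Mesa's vblank_mode=0 environment variable set should keep the
benchmark from being capped by vsync.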