On Fri, Jun 13, 2008 at 2:38 PM, Daniel Brown <parasane@xxxxxxxxx> wrote:
> On Fri, Jun 13, 2008 at 4:35 PM, Nathan Nobbe <quickshiftin@xxxxxxxxx> wrote:
> >
> > btw. im curious to see your implementation dan ;)
>
> I thought I was going to have time to do it all start to finish
> today, but that won't be the case. However, once I have it working,
> it'll be in the Subversion repository and on the website of my as-yet
> unannounced non-profit open source company.

it took me longer than i thought too. the main issues are performance ones.

1. time to capture a single screen (need a decent box to ensure this is reasonable)
   there are several steps here: load the url from the client, bring it up in a browser,
   snap it, put it on disk so subsequent requests can be fast, resize it, and finally
   ship it off to the client via php's binary output support. (rough sketch below.)

2. handling simultaneous requests
   using the browser / full-screen-capture approach, simultaneous requests have to be
   queued. perhaps a priority queue is best, because cached requests can be served
   immediately. this is what i ran into anyway, and i had plans to streamline my
   prototype, but other things came up :D (see the second sketch below.)

anyway, the best way to do it would be to hack something into firefox directly.

another, lesser issue is capturing all of a page that runs off the screen -- for
example, a web page that produces a scroll bar. you can't get that with a snap of the
screen because it's cut off... and don't bother with export-to-print either, that
butchers the web pages =/

with my prototype on a ~400MHz linux box with a crappy video card, it was taking
something like 30 seconds for a complete turnaround on an uncached request, using
lighttpd, php, ImageMagick, ajax and some other cli programs i can't remember off the
top of my head.

again, eager to see your stab at it.

-nathan
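
p.s. here's a rough sketch of the capture / cache / resize / serve step from point 1.
capture.sh is just a hypothetical stand-in for whatever actually drives the browser and
grabs the screen, and the paths and thumbnail size are made up; the resize uses
ImageMagick's convert on the command line:

<?php
// thumb.php?url=... -- a minimal sketch, not the real prototype
$url      = $_GET['url'];
$cacheDir = '/var/cache/thumbs';
$thumb    = $cacheDir . '/' . md5($url) . '.png';

if (!file_exists($thumb)) {
    $full = $cacheDir . '/' . md5($url) . '-full.png';

    // 1. bring the url up in the browser and snap the screen
    //    (capture.sh is hypothetical -- browser + screen grab glue)
    shell_exec('/usr/local/bin/capture.sh ' . escapeshellarg($url)
             . ' ' . escapeshellarg($full));

    // 2. resize with ImageMagick so repeat hits are cheap
    shell_exec('convert ' . escapeshellarg($full)
             . ' -resize 200x150 ' . escapeshellarg($thumb));
}

// 3. ship it off via php's binary output support
header('Content-Type: image/png');
header('Content-Length: ' . filesize($thumb));
readfile($thumb);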
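
and a minimal take on point 2: cached requests are served immediately and never queue,
while uncached ones take turns at the single browser/display by serializing on a file
lock rather than a real priority queue (capture.sh is again hypothetical):

<?php
// returns the path to a cached thumbnail, generating it if needed
function fetch_thumb($url, $cacheDir = '/var/cache/thumbs')
{
    $thumb = $cacheDir . '/' . md5($url) . '.png';

    if (file_exists($thumb)) {
        return $thumb;                      // cached: served immediately, no queue
    }

    $lock = fopen($cacheDir . '/browser.lock', 'a');
    flock($lock, LOCK_EX);                  // wait our turn for the browser

    if (!file_exists($thumb)) {             // another request may have built it while we waited
        shell_exec('/usr/local/bin/capture.sh ' . escapeshellarg($url)
                 . ' ' . escapeshellarg($thumb));
    }

    flock($lock, LOCK_UN);
    fclose($lock);
    return $thumb;
}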