Re: Re: > 120 meg of json analyzed by the browser...

On Thu, 2010-01-28 at 17:30 +0100, Jochem Maas wrote:

> On 1/28/10 5:03 PM, Rene Veerman wrote:
> > Oh, I forgot to mention that Firefox takes about a gigabyte of memory
> > after having stalled at "200MB parsed" in a 330MB document.
> > 
> > And despite using setTimeout(), Firefox frequently freezes (for about
> > 2 to 10 minutes) before updating the decoding-status display again.
> > 
> > I'd really appreciate someone checking my non-eval() JSON
> > parser-decoder to see if my code is at fault, or if I've triggered a
> > Firefox 'bug'.
> 
> Just guessing, but I doubt you have a real issue in your parser-decoder;
> the memory used by Firefox seems reasonable to my untrained eye. I'd guess
> that a factor-of-five memory overhead is normal given the amount of
> abstraction involved in the browser doing its thing.
> 
> Pretty cool what you're trying to do - but totally nuts, of course :)
> 
> I would think that your only real recourse is to chunk the output
> of both the server and the browser so that you can, theoretically, page
> through the data structure in the browser ... might be totally impractical,
> if not impossible. Not to mention you're likely to have to use file-based
> storage for the chunks of output on the server side to avoid running out
> of memory.
> 
> hard problem!
> 
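
On the server side, the file-based chunking you describe could be as
simple as splitting the result set into numbered JSON files before the
browser ever asks for them. A rough, untested sketch - the query, paths
and chunk size below are all placeholders:

<?php
// Split a huge result set into numbered JSON chunk files on disk, so no
// single request ever has to build (or parse) a 300MB string.
$chunkSize = 1000;               // rows per chunk - tune to taste
$chunkDir  = '/tmp/json-chunks'; // hypothetical location

if (!is_dir($chunkDir)) {
    mkdir($chunkDir, 0777, true);
}

$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->query('SELECT * FROM huge_table');

$rows  = array();
$chunk = 0;

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $rows[] = $row;
    if (count($rows) === $chunkSize) {
        file_put_contents("$chunkDir/chunk-$chunk.json", json_encode($rows));
        $rows = array();
        $chunk++;
    }
}

if ($rows) {
    file_put_contents("$chunkDir/chunk-$chunk.json", json_encode($rows));
    $chunk++;
}

// Record how many chunks exist so the paging script knows when to stop.
file_put_contents("$chunkDir/count.json", json_encode(array('chunks' => $chunk)));
?>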


You could page through the data and make it look like it's all happening
in the browser with a bit of clever Ajax.
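
A minimal sketch of the PHP end of that paging, assuming the
(hypothetical) chunk files above: the Ajax call asks for ?page=N and only
ever gets one small slice back.

<?php
// page.php - serve a single pre-built chunk per request.
$chunkDir = '/tmp/json-chunks';
$page     = isset($_GET['page']) ? (int) $_GET['page'] : 0;

$meta   = json_decode(file_get_contents("$chunkDir/count.json"), true);
$chunks = $meta['chunks'];

header('Content-Type: application/json');

if ($page < 0 || $page >= $chunks) {
    echo json_encode(array('error' => 'no such page', 'chunks' => $chunks));
    exit;
}

// The chunk is already valid JSON on disk, so stream it out as-is.
readfile("$chunkDir/chunk-$page.json");
?>

The JavaScript side then fetches page 0, renders it, and pulls the next
page on demand, so Firefox never has to hold more than one chunk in
memory at a time.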

Thanks,
Ash
http://www.ashleysheridan.co.uk


