I'm using python-fu to script a sequence of animation frames in Partha's
GIMP 2.7.5 for Win32 (on Vista) -- thanks! It should end up a relatively
long sequence, probably several thousand frames, but the animation
itself is simple and very algorithmic: a series of layer pans, zooms,
fades, a little text, and the drawing and stroking of a growing path,
over and over -- a glorified slideshow. I don't mind if it takes
overnight to render. Anyway, I got the first version to load and start
writing frames out as .pngs (targeting a later assembly pass with
ffmpeg). But it crashes GIMP entirely around frame 175 or so (it starts
out quick and slows noticeably as it nears the crash). It turns out the
script leaks memory pretty badly, and it dies right about when Task
Manager shows memory use in the gimp process passing 2GB. The signed
32-bit pointer limit, I guess.
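For concreteness, the per-frame write-out step I'm aiming at looks
roughly like this (a sketch only -- the path and frame number are
placeholders, gp is gimp.pdb as in the console session below, and
gimp_layer_new_from_visible / file_png_save are the PDB calls I mean):

frame = gp.gimp_layer_new_from_visible(img, img, "frame")
img.insert_layer(frame, position=0)
gp.file_png_save(img, frame, "C:/frames/frame_0001.png", "frame_0001.png",
                 0, 9, 1, 1, 1, 1, 1)  # no interlace, max compression, write all chunks
img.remove_layer(frame)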
The image is 1280x720 RGB; the largest layers are 4800x3400 or so. Maybe
10 layers at most at any one time.
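Back of the envelope (assuming 8-bit RGBA tiles, which may be off), the
layers alone are nowhere near 2GB:

4800 * 3400 * 4 / (1024.0 * 1024.0)   # ~62 MB per big layer, so ten of them is ~620 MB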
The original intent was to manipulate, copy, and create layers as needed
within a single image, make a new layer from visible, write that out,
then remove old layers and start over for the next frame. I can get
individual commands and short sequences to have the apparent desired
effect when typed into the console (after a lot of reviewing source on
github -- this is a pretty cranky API; don't get me started on the
*_insert_* methods' "parent" parameters!). Thinking I was missing
something about properly managing gimp or python, I tried isolating a
much simpler set of commands: basically, copy a reference layer from one
image into a new displayed image, then destroy that new image and start
over. Just entering them into the console repeatedly, I can watch the
same gimp process's memory usage jump on each iteration -- at first by
just under 1MB per step, then by steadily larger amounts. From the layer
toolbox, my layer count doesn't seem to grow beyond what's intended,
though I suppose there could be layers not properly inserted or deleted
that wouldn't show in the toolbox.
I'm stumped on how to use python-fu to create this animation (I came to
gimp because synfig couldn't handle all of my over one hundred source
images), and I haven't found any example scripts that deal with this
scale yet. Do the experts on this list have any corrections to what I'm
doing wrong, or any tips or tricks I might apply to manage this issue?
For example, if it's not my script, are certain methodologies more leaky
than others (opacity changes vs visibility changes, copy-pasting visible
into a re-used layer vs creating a new layer from visible, layer copies
within an image vs across images, layer creation/destruction vs image
creation/destruction)? A sketch of the re-used-layer idea follows below.
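The re-used-layer variant would look something like this (an untested
sketch; gimp_edit_copy_visible, gimp_edit_paste, and
gimp_floating_sel_anchor are the PDB calls, and the "scratch" layer is
hypothetical):

# once, at setup: a persistent RGBA frame buffer (1 = RGBA_IMAGE, 0 = NORMAL_MODE)
scratch = gp.gimp_layer_new(img, 1280, 720, 1, "scratch", 100, 0)
img.insert_layer(scratch, position=0)
scratch.visible = False   # keep it out of its own "visible" composite

# then, per frame: paste the visible composite into the same layer
gp.gimp_edit_copy_visible(img)
pasted = gp.gimp_edit_paste(scratch, True)
gp.gimp_floating_sel_anchor(pasted)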
Here's the simple command sequence that demonstrates the "leak". I start
with an image containing an RGB layer of 4800x3400 or so at layers[1]
(there's a transparent top layer). Open Task Manager and watch memory
use in the gimp process. Open the python-fu console, initialize
gp = gimp.pdb and src = gimp.image_list()[0], then iterate the following:
img = gp.gimp_image_new(1280, 720, 0)                      # 0 = RGB
dsp = gp.gimp_display_new(img)
lyr = gp.gimp_layer_new_from_drawable(src.layers[1], img)  # copy the big source layer
img.insert_layer(lyr, position=0)
gp.gimp_display_delete(dsp)                                # sole display, so the image should go with it
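A variant that skips the display entirely might help isolate whether the
display is implicated (again a sketch; gimp_image_delete is the PDB call
for deleting an image that has never been displayed):

for i in range(50):
    img = gp.gimp_image_new(1280, 720, 0)
    lyr = gp.gimp_layer_new_from_drawable(src.layers[1], img)
    img.insert_layer(lyr, position=0)
    gp.gimp_image_delete(img)   # only valid while the image has no display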
I see memory use jump after each iteration. Any help is appreciated.
twv@