Re: open/save/export

hi,

after some more thinking, here's my reasoning for why 'Save' is indeed the problem.

This becomes clear in comparison with a database-driven, version-controlled approach
as sketched e.g. in the last mockup of [1]. A Save command is superfluous in that
scenario. When an image gets edited, it's reasonable to assume the user wants to keep
the changes. Otherwise she can undo, revert (= bulk undo) or simply delete the image.
The only I/O commands required here are Import and Export, to exchange images between
the world and the database. An easy interface _and_ technically clean, i think.
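To make the idea concrete, here's a toy sketch of that model in Python. All names are hypothetical and nothing here reflects actual GIMP code; the point is just that edits persist automatically and the only I/O is Import/Export:

```python
# Toy sketch: a version-controlled image store with no Save command.
# Every edit is recorded immediately; Import and Export are the only I/O.

class StoredImage:
    def __init__(self, pixels):
        self.history = [pixels]      # every revision is kept

    def edit(self, operation):
        # applying an edit automatically persists the new revision
        self.history.append(operation(self.history[-1]))

    def undo(self):
        if len(self.history) > 1:
            self.history.pop()

    def revert(self):
        # bulk undo: back to the imported state
        del self.history[1:]

class ImageDatabase:
    def __init__(self):
        self.images = {}

    def import_image(self, name, pixels):
        self.images[name] = StoredImage(pixels)

    def export_image(self, name):
        # hand the current revision back to the outside world
        return self.images[name].history[-1]

db = ImageDatabase()
db.import_image("photo", [0, 0, 0])
db.images["photo"].edit(lambda px: [p + 1 for p in px])  # kept, no Save needed
db.images["photo"].undo()                                # back to the import
print(db.export_image("photo"))
```

Note there's no "unsaved changes" state anywhere: the user only ever decides when to Import and when to Export.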


Now, current GIMP already employs a sort of temporary database of version-controlled
images: the working set of currently opened images.
Prior to editing, images are opened (= imported) into the working set.
The equivalent of Export is Save-a-Copy.

The major difference from a user's point of view is that the composition
explicitly has to be saved to disk, otherwise it gets lost.
So obviously the problem must be rooted in having to Save manually,
which is a legacy concept from the era of floppy disks [2].


The current spec builds a clean model on top of that legacy Save concept,
and the result is not as easy as we all would like it to be. The fact that a lot of
effort is required to communicate the model in the UI is probably a sign of that.

In the mid-term i see GIMP going the database-driven path anyway [3], but for now
we clearly have to support the classic Open/Edit/Save cycle. For that scenario,
it seems we can't have both easy and clean. So i think it's worthwhile
to reconsider easy-but-dirty models.


greetings,
peter




Notes sorted from on-topic to off-topic:

[1] http://gimp-brainstorm.blogspot.com/2009/04/version-control.html

[2] slightly related, i was quite surprised to find that messing with files also
    accounts for a good share of the accidental complexity of batch processing. compare:
    http://gimp-brainstorm.blogspot.com/2009/05/basic-batch-processing.html

[3] the canonical objections are that version control is too expensive for
    graphical work and that databases lead to application lock-in.

    With GEGL under the hood, the first objection is no longer valid. In general,
    it is wasteful to store a second XCF instead of a diff of the GEGL tree.
    And storing each composition in a self-contained file will face efficiency
    problems as well: consider a composition created from 5 JPEGs of a gigapixel
    each. Should all the source data be duplicated in the XCF?
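A back-of-envelope calculation shows the scale of the problem. The figures below are illustrative assumptions (uncompressed RGBA in a self-contained file, a few kilobytes for references plus a serialized op tree), not measurements:

```python
# Rough arithmetic for the 5-gigapixel-JPEGs example above.

sources = 5
pixels_per_source = 1_000_000_000   # 1 gigapixel per source (assumption)
bytes_per_pixel = 4                 # RGBA, uncompressed (assumption)

# self-contained XCF: all source pixels duplicated into the composition file
self_contained = sources * pixels_per_source * bytes_per_pixel

# database approach: file references plus a diff of the GEGL tree (guess)
diff_based = 10_000

print(f"self-contained copy: {self_contained / 1e9:.0f} GB")   # 20 GB
print(f"reference + diff:    {diff_based / 1e3:.0f} kB")       # 10 kB
```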


    Application lock-in is a serious concern, though. I think there's consensus
    that at the desktop level, hierarchical file systems don't serve users well anymore,
    given their thousands of multimedia documents. Applications like F-Spot and Rhythmbox
    maintain their own databases, which leads to lock-in or at best duplicated effort.

    So ideally, these databases would be provided by the desktop environment,
    and that's the point when GIMP will surely jump on that train.
    Anyhow, it is debatable whether a private database for GIMP causes application
    lock-in any worse than the XCF format already does now.


_______________________________________________
Gimp-developer mailing list
Gimp-developer@xxxxxxxxxxxxxxxxxxxxxx
https://lists.XCF.Berkeley.EDU/mailman/listinfo/gimp-developer
