[Bf-python] Persistent data: new way

Willian Padovani Germano wgermano at ig.com.br
Sun Sep 7 02:51:50 CEST 2003


Hi Michel,

From: "Michel Selten" <michel.s at home.nl>
(...)
> I see from your examples that it is possible to store arrays of data and
> assign that array to one key, so that could be enough for storing big
> chunks of data.

Sure, and that's actually a problem.  The data in this dictionary will be
kept for the rest of the Blender session (or until someone deletes part or
all of it, of course).
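
For reference, a minimal sketch of how a script might use it.  The module
name Registry comes from this thread; the function names SetKey, GetKey and
RemoveKey are only my working assumption and may still change:

    from Blender import Registry

    # store a small dict of settings under one key
    mysettings = {'tolerance': 0.01, 'subdivisions': 4}
    Registry.SetKey('myscript', mysettings)

    # later, possibly in another run of the script (same Blender session):
    data = Registry.GetKey('myscript')
    if data:
        tolerance = data['tolerance']

    # remove the key when it's no longer needed, freeing the memory
    Registry.RemoveKey('myscript')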

> I assume you have thought of some types of data that are cumbersome to
> store with this module and consider those as 'big chunks of data'. So my
> question then would be: do you plan to have something for big chunks of
> data in the future then?

By big chunk I do mean that: big.  Python has no problem storing any kind of
pytype in its dicts, of course, and that's what a script can generate: some
pytype.  As you can see, the problem is preventing scripts from keeping too
much data in the Registry dict.  For that, it's better that they save the
info to a file and reload it when needed.  We can add support for not
accepting more than some N KB or MB, whatever is decided, if necessary.
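
For instance, a script could put the big chunk itself on disk and keep only
a small reference to it in the Registry.  A rough sketch using the standard
pickle module (the file name and key are just illustrative):

    import pickle
    from Blender import Registry

    big_data = range(100000)           # stand-in for a large computed result

    # write the big chunk to disk ...
    fname = '/tmp/myscript_cache.pkl'  # hypothetical location
    f = open(fname, 'wb')
    pickle.dump(big_data, f)
    f.close()

    # ... and register only where to find it
    Registry.SetKey('myscript', {'cache_file': fname})

    # later: reload the big chunk only when it is actually needed
    info = Registry.GetKey('myscript')
    if info:
        f = open(info['cache_file'], 'rb')
        big_data = pickle.load(f)
        f.close()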

> This will make the Blender.ReleaseGlobalDict() function obsolete then?
> Would it be wise to add a warning message to the function when it is
> called? Just like I added one to the Object.getSelected() functions.

Yes, completely obsolete.  I'll make it do nothing, and we should make it
warn the users (preparing for this, I had already mentioned in the docs that
it would probably go away).
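
Conceptually something like this (the real function lives on the C side, so
this is just an illustration and the exact warning text is not decided):

    import sys

    def ReleaseGlobalDict():
        # deprecated: kept only so old scripts don't break; does nothing
        sys.stderr.write(
            "Warning: Blender.ReleaseGlobalDict() is deprecated and "
            "does nothing now.\n")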

And that's not the only change that well-behaved scripts will have to
support in order to benefit from the new hooks, so it's ok to make the
change now.

--
Willian, wgermano at ig.com.br



