[Bf-committers] Blender 2.5+ Roadmap (Proposal)
Fri, 19 Sep 2003 19:27:34 +0200
In the long text below you'll find my ideas on a new architecture for
Blender. I want to thank Jeroen Lamain (RoccoD on irc) for helping me
out with some issues and reviewing this text. The text below is still a
concept and very much a draft: some issues are still open and the
details are missing. Instead of fleshing out all the details first, I
decided to
throw it out in the open.
What I hope to achieve with this text is to get an idea if the concept
is something to go for, or to simply stop with it.
Let the discussion begin.
Blender 2.5+ Roadmap
Blender is a software package that found its roots in the early 90s.
Since that time, a lot of additions and modifications have been made.
The original structure is still there, but it is no longer a good fit
for the huge number of features.
This document describes the ideas for a new, restructured Blender. The
intention is not to put the emphasis on new features, but more on the
internal architecture and the use of plugins.
Since most developers work on Blender in their spare time, no
commitments can be asked of those developers, so it's important to
have something that works even when only halfway finished. With the
concept below, new releases of Blender can be made even while the
restructuring is in full progress. It's also possible to just start
from scratch if the preference goes to that option; the ideas can
still be applied in that case.
* General concept
To get a better understanding of the sources, an option is to split
the current source base into modules. Each module has a dedicated
task within Blender. A module can be thought of as a Windows DLL or a
Unix shared library.
A module exports a fixed set of functions and the user of the module
needs to know which functions are exported.
One step further is to think of components instead of modules.
Several component architectures exist on various platforms: think of
COM, CORBA, .NET and others. Most of these component architectures
are either not platform independent or have no C interface.
The concept behind component development is very interesting though!
Each component has a clearly defined interface and the calling
application uses the interface. It is unknown to the application what
component is actually being called. Using that mechanism, it's
possible to use either component1 or component2 for a specific task.
As long as the interface is the same, this is no problem. Think for
example of the old Python implementation and the new implementation.
The interface to both implementations is the same; the underlying
code, however, is not - including part of the functionality.
Another example is the current discussion about the use of the game
engine. If the interface to both engines is the same, the user can use
either Enji or Ketsji.
The most interesting component technology (with regards to Blender) is
XPCOM. This technology is used by Mozilla and is available on a lot of
platforms - including the ones Blender runs on. Unfortunately, XPCOM
has no C bindings. So we have to drop this ... for now.
This doesn't mean we can't use the same concept. Each component will
be implemented as a plugin. The interface of the component is provided
in an easy-to-read text file, preferably XML. The core of Blender
parses such an interface file and knows which functions the specific
plugin has and how to call those functions.
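As a sketch of what such an interface file might look like (the element
names and layout here are purely hypothetical; the actual schema would
still have to be designed):

```xml
<!-- hypothetical plugin interface description -->
<plugin name="sequencer" library="sequencer_plugin">
  <interface iid="a3f2c1d4-0b2e-4f6a-9c8d-1e2f3a4b5c6d" name="ISequencer">
    <function name="AddStrip">
      <arg name="type" type="int"/>
      <return type="int"/>
    </function>
  </interface>
  <!-- where the plugin hooks into the UI -->
  <extension point="menu/add" label="Sequence Strip" icon="strip.png"/>
</plugin>
```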
Each component needs the following three functions:
- AddRef ()
Increases the reference count of the component.
- Release ()
Decreases the reference count of the component. When the reference
count is zero, the component can be freed from memory.
- QueryInterface (IID *)
This is the function that queries the component for the requested
interface. An IID (Interface ID) is a 128-bit number. It can be
generated by the uuidgen tool or an equivalent.
The calling function calls QueryInterface with the requested
interface as an argument. The IID is provided in the XML interface
file. Through the returned interface, the calling function can access
the component's functionality.
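In C, the three mandatory functions can be expressed as a table of
function pointers placed at the start of each component. A minimal
sketch (all struct and function names here are illustrative, not an
existing Blender API):

```c
#include <string.h>
#include <stdlib.h>

/* A 128-bit interface ID, as generated by uuidgen. */
typedef struct IID { unsigned char bytes[16]; } IID;

/* Every component interface starts with these three functions. */
typedef struct IUnknown {
    int (*QueryInterface)(struct IUnknown *self, const IID *iid, void **out);
    int (*AddRef)(struct IUnknown *self);
    int (*Release)(struct IUnknown *self);
} IUnknown;

/* All-zero ID used as the base interface ID in this sketch. */
static const IID IID_IUnknown = {{ 0 }};

typedef struct Component {
    IUnknown iface;   /* must be first so IUnknown* and Component* coincide */
    int refcount;
} Component;

static int comp_addref(IUnknown *self) {
    Component *c = (Component *)self;
    return ++c->refcount;
}

static int comp_release(IUnknown *self) {
    Component *c = (Component *)self;
    int n = --c->refcount;
    if (n == 0)
        free(c);          /* last reference gone: free the component */
    return n;
}

static int comp_query(IUnknown *self, const IID *iid, void **out) {
    if (memcmp(iid, &IID_IUnknown, sizeof(IID)) == 0) {
        comp_addref(self);  /* hand out a new reference */
        *out = self;
        return 0;           /* success */
    }
    *out = NULL;
    return -1;              /* interface not supported */
}

Component *component_create(void) {
    Component *c = malloc(sizeof(Component));
    if (!c) return NULL;
    c->iface.QueryInterface = comp_query;
    c->iface.AddRef = comp_addref;
    c->iface.Release = comp_release;
    c->refcount = 1;        /* the creator holds the first reference */
    return c;
}
```

The layout mirrors what COM and XPCOM do, which is what would make a
later move to a real component architecture easier.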
One technology we need to make the above mechanism work is the
ability to dynamically load and unload libraries. This is a bit
troublesome since each platform has different ideas about how to
support it.
This is where nspr (Netscape Portable Runtime) comes in. This library
provides a lot of generic features. One of those features is the
support for dynamic loading of shared libraries. The nspr library is
used by Mozilla and is highly portable (according to the website, over
20 platforms are supported - including the ones Blender needs to run
on).
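NSPR exposes this as PR_LoadLibrary, PR_FindSymbol and
PR_UnloadLibrary, which wrap the platform primitives. On a POSIX
system the underlying mechanism looks roughly like this (a minimal
sketch; the helper names are mine, not NSPR's):

```c
#include <dlfcn.h>   /* POSIX dynamic loading, wrapped by NSPR elsewhere */
#include <stddef.h>

/* Load a shared library; a NULL path returns a handle for the
 * main program itself. */
void *plugin_open(const char *path) {
    return dlopen(path, RTLD_NOW);
}

/* Look up an exported symbol in a loaded library. */
void *plugin_symbol(void *handle, const char *name) {
    return dlsym(handle, name);
}

/* Drop our reference to the library; 0 on success. */
int plugin_close(void *handle) {
    return handle ? dlclose(handle) : -1;
}
```

On Windows the same roles are played by LoadLibrary, GetProcAddress
and FreeLibrary, which is exactly the per-platform divergence NSPR
hides for us.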
* Speed issues
The above mechanism looks like it will slow down Blender at startup.
A similar discussion came up just a few weeks ago about
loading/scanning Python scripts.
The general idea for plugins is to have one config file that holds
references to all available plugins for Blender. At the first startup
of Blender, no such config file exists, so Blender needs to look for
the plugins and create the config file. This first startup will take
the longest. After that, Blender reads the config file, scans the
directory containing the plugins and records the changes (if any).
(Side note: With XPCOM, this is handled differently - and faster.)
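The change detection can be as simple as comparing the modification
time recorded in the config file against the plugin file on disk (a
sketch assuming POSIX stat(); the function names are illustrative):

```c
#include <sys/stat.h>
#include <time.h>

/* Return the file's modification time, or (time_t)-1 if it
 * cannot be read. */
time_t plugin_mtime(const char *path) {
    struct stat st;
    if (stat(path, &st) != 0)
        return (time_t)-1;
    return st.st_mtime;
}

/* A plugin needs rescanning if it vanished or its mtime differs
 * from the value recorded in the config file. */
int plugin_needs_rescan(const char *path, time_t recorded_mtime) {
    time_t now = plugin_mtime(path);
    return now == (time_t)-1 || now != recorded_mtime;
}
```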
Components are loaded on-demand, so it may well be that Blender will
start up just as fast as it currently does, or even faster.
Plugins can register themselves at so-called extension points in the
UI (toolbar, menu, toolbox, ...). By default, there will only be an
icon and/or label. When this item is selected, the plugin will be
loaded. The icons, labels and information on where to connect to in
the main application are stored in the XML file amongst other
information.
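Inside the core, such registration could be nothing more than a table
of (extension point, label, callback) entries that the UI walks when
it builds a menu. A sketch, with purely illustrative names:

```c
#include <string.h>

#define MAX_EXTENSIONS 64

typedef struct Extension {
    const char *point;         /* e.g. "menu/add" */
    const char *label;         /* shown in the UI before the plugin loads */
    void      (*invoke)(void); /* loads/runs the plugin on first use */
} Extension;

static Extension extensions[MAX_EXTENSIONS];
static int extension_count = 0;

/* Register an item at an extension point; returns its index or -1. */
int extension_register(const char *point, const char *label,
                       void (*invoke)(void)) {
    if (extension_count >= MAX_EXTENSIONS)
        return -1;
    extensions[extension_count].point = point;
    extensions[extension_count].label = label;
    extensions[extension_count].invoke = invoke;
    return extension_count++;
}

/* Count the items registered at one extension point (e.g. to size
 * a menu before drawing it). */
int extension_count_at(const char *point) {
    int i, n = 0;
    for (i = 0; i < extension_count; i++)
        if (strcmp(extensions[i].point, point) == 0)
            n++;
    return n;
}
```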
* Incremental development
One of the things I kept in mind is that this new architecture must
be possible to introduce in the current source tree without affecting
release planning. Of course, if there's a
decision that this new architecture should be started from scratch,
then that's possible too!
We have this big code base, into which we add support for plugins.
From there, we specify what part of the code we
want to 'convert' to a plugin. During this time, new development on
that specific part of the code may be a bit difficult, but this is
where communication is important. At all times we should know what
part of the original code is being converted to a plugin.
If possible, the original code should stay accessible and working
so that a new Blender release is not hindered by an in-progress
plugin. Look at how I added the EXPPYTHON configuration option to
the build environment.
* Reusing current code
The figure below shows what each component can contain:
+---------------------+
| Python API          |
+---------------------+
| UI blocks           |
+---------------------+
| Data modification   |
| routines            |
+---------------------+
| Data storage        |
+---------------------+
(Note: this is just an example. Other components may contain different
layers.)
Most of the code in the current code base can be taken and put in one
of the above presented layers. The new Python implementation can be
used without major modifications (each module already stands on its
own).
* .Blend file compatibility
Each component maintains its own data. This means that if a component
is not used by the user, no data from that component will be stored in
the .blend file. If a component is used, and it needs to store data to
a file, it also needs to leave some identification behind in the
.blend file. If such a .blend file is later loaded, Blender can
automatically load the required plugins. Such an identification may
be a UUID (a 128-bit number); most systems come with uuidgen or a
similar tool to generate one.
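In the file itself, the identification could be nothing more than 16
raw bytes compared with memcmp; a small sketch for parsing the
textual form that uuidgen prints (the struct and function names are
illustrative):

```c
#include <string.h>
#include <stdio.h>

/* A 128-bit plugin identifier, e.g. produced once by uuidgen and
 * compiled into the plugin. */
typedef struct PluginUUID { unsigned char bytes[16]; } PluginUUID;

/* Parse the canonical textual form "8-4-4-4-12" hex digits.
 * Returns 0 on success, -1 on malformed input. */
int plugin_uuid_parse(const char *text, PluginUUID *out) {
    static const int dash_pos[] = { 8, 13, 18, 23 };
    int i, t = 0;
    if (strlen(text) != 36)
        return -1;
    for (i = 0; i < 4; i++)
        if (text[dash_pos[i]] != '-')
            return -1;
    for (i = 0; i < 36; ) {
        unsigned int byte;
        if (text[i] == '-') { i++; continue; }
        if (sscanf(text + i, "%2x", &byte) != 1)
            return -1;
        out->bytes[t++] = (unsigned char)byte;
        i += 2;
    }
    return t == 16 ? 0 : -1;
}

int plugin_uuid_equal(const PluginUUID *a, const PluginUUID *b) {
    return memcmp(a->bytes, b->bytes, 16) == 0;
}
```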
The current .blend file format is not suitable for the above presented
mechanism. There is a project working on .XML file format support in
Blender. A .blend file format importer needs to be written (copy /
paste) to support backwards compatibility. The .XML file format should
then become the new file format for Blender.
* Critical issues
At the moment there's one critical issue that I can think of: Blender
needs to run on many platforms. One aspect of this is that general
external libraries are used; another is a solid, working build
system. Since Blender has become open source, some new build systems
have been introduced, most of them specific to one platform/tool. One
of these systems did aim at cross-platform support, but failed to be
implemented correctly - I'm talking about the autoconf build
environment. This has drawn some scepticism towards that system.
There's always been the original Make environment developed during
the NaN period. Currently, this environment has no support for
building plugins. And since building plugins is different on each
platform, we're going to face some heavy development in this
department if we want to add that to the original Make environment.
The nspr library has a very impressive build environment supported on
a lot of platforms. This environment is based on the autoconf system;
exactly the same system some of us are a bit sceptical about.
We can look at how nspr implements the platform specific options and
adapt our system to use those features.
My proposal would be to create a small test application to test out
the build system. This small application should then be tested by the
platform maintainers - and if possible updated to get it to work on
their specific platform.
* External plugin support
The new architecture opens possibilities for third parties to develop
plugins for Blender. We need to provide them with a solid foundation
to build their plugins on. Again, the most important part of this is
the build system. We can't expect all third parties to just put the
source code for their plugins on blender.org.
Of course, we can reuse a lot of the build system used for Blender.
The only support third parties need is plugin compilation, so the
rest is a candidate for removal.
* Plan of attack
The task ahead is pretty huge, so a small plan is very useful: others
can see where we are, and it helps ensure no steps are overlooked.
a. Work on the Critical Issue - build environment
b. Create a mechanism for how Blender can be distributed with plugins.
Eventually, users will get more files needed by Blender. These need
to be installed on the system as simply as possible for the user.
c. Update the current Blender build system with the proof of concept
generated in step (a).
d. Implement the plugin mechanism in Blender
Besides converting the current code base to plugins, a mechanism
must be in place to actually load/use those plugins.
e. Create a couple of small plugins to test out the mechanism
developed in step (d).
For example, we could take Suzanne and convert her to a plugin. Ah,
Suzanne as the pioneer! This step should be done in parallel with
the previous step (d).
f. First major plugin could be the material system
g. Incrementally convert the rest of the source base to plugins
* Looking to the future
The ultimate situation of the Blender code base would be that the core
is only a very small application that contains just enough
functionality to load and unload plugins. It may take some time before
this goal has been achieved.
The next step could be to actually work on a real component structure.
The core would need to be rewritten in C++, XPCOM support would need
to be added, and either a small C++ layer added to each plugin or the
plugins converted from C to C++ entirely.
Since this is looking far into the future, I don't want to spend too
much time discussing this subject. It's just that I want to point out
that with the plugin / component architecture, we can have a solid
foundation that can be easily updated.
(Recommended reading: Design Patterns, Gamma et al.)
* Closing thoughts
I do not want to commit myself to work on this task alone. It is huge
and currently there is still a lot that needs to be fleshed out.
My current goals are to collect all comments and update this document.
Also, I want to further investigate the possibilities, pitfalls and