[Bf-committers] Proposal/Potential Roadmap for Blender/BGE Integration

Dalai Felinto dfelinto at gmail.com
Sat Jun 22 07:54:25 CEST 2013


My 20 cents:

Sharing Blender Data:
Why can’t we do what Cycles does? I’m almost sure Cycles copies (or
converts) some of the data so that Blender can keep changing things
while rendering. Anyway, I think the downside of not allowing
play/pause, changing a material, changing an object property, … during
the game is a big one. But I’m sure that even if the Blenderplayer is
a separate process we can get that to work.
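For illustration, a minimal sketch of that "copy on start" idea - the engine
works on a deep copy of the scene data, so edits made in Blender during play
never touch the running copy (names here are made up, not actual Blender/BGE
API):

```python
import copy

def start_game(scene_data):
    """Return an independent snapshot the engine may freely modify."""
    return copy.deepcopy(scene_data)

scene = {"objects": {"Cube": {"material": "Red", "props": {"health": 10}}}}
running = start_game(scene)

# The artist keeps editing in Blender...
scene["objects"]["Cube"]["material"] = "Blue"

# ...while the running game still sees the snapshot it started with.
assert running["objects"]["Cube"]["material"] == "Red"
```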

As for embedding on OS X, I think there may be a solution, but it’s
far from straightforward, let alone “elegant”. At least that was my
first impression when looking into it.

Viewport Drawing:
I like most of it, but I didn’t understand the need for dropping
Single Texture mode. Can’t it be kept as a subset of Multitexture?
(It would disappear as a mode, but the same or similar results should
be achievable with the same amount of work from the user.)

I don't even care if internally it's implemented as GLSL shadeless. If
implemented properly, I believe it can be just as fast as the original
fixed pipeline. So it's not a matter of implementation, but of keeping
a simple workflow/pipeline for the user - in this case, letting the
user assign texture faces without having to set up a full material.
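To make that concrete, a hypothetical sketch of keeping Single Texture as a
thin preset on top of the material system - a texture face simply becomes a
shadeless material, so the user assigns a texture without building a full
material (all names are invented for illustration):

```python
def single_texture_to_material(image_name):
    """Map an old Single Texture face to an equivalent shadeless material."""
    return {
        "name": "ST_" + image_name,
        "use_shadeless": True,      # no lighting, like the old fixed pipeline
        "textures": [image_name],   # the single face texture
    }

mat = single_texture_to_material("crate.png")
assert mat["use_shadeless"] and mat["textures"] == ["crate.png"]
```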

Cheers,
Dalai
--
blendernetwork.org/member/dalai-felinto
www.dalaifelinto.com


2013/6/21 Mitchell Stokes <mogurijin at gmail.com>:
> I'm all for some better integration between Blender and the BGE. I've
> posted an article on my blog detailing how I think we should take our first
> steps toward better integration:
> http://mogurijin.wordpress.com/2013/06/21/towards-tighter-blender-and-bge-integration/
>
> Below is the full blog post for everyone's convenience:
>
> In Ton’s recent blog post, he discussed a roadmap for Blender 2.7, 2.8 and
> beyond, which included tighter integration between Blender and the
> BGE. While this initially caused quite the stir in the BGE community with
> some thinking this meant dropping the BGE entirely, I see it more as a
> desire to get the two to share more code. Blender has smoke, fluids and
> particles; why shouldn’t we use those in the BGE? Too slow? Then let’s speed
> them up and make Blender users happier in the process. The way I see it,
> the BGE can benefit from new features in Blender and Blender can benefit
> from performance improvements from the BGE. But, how do we get there?
> That’s what I aim to discuss in this article.
>
> Sharing Blender Data
>
> The first major problem that needs to be tackled is how the BGE handles
> Blender data. Currently, one of the BGE’s major design decisions is to
> never modify Blender data. While the BGE does modify Blender data in a few
> places (most notably lights), we’ve mostly stuck to this design principle,
> which has helped prevent numerous bugs and potential corruption of users’
> data. However, in doing so, we’ve had to recreate most of Blender’s data
> structures and convert all Blender data to BGE data. This also limits how
> we can interact with existing Blender tools. Blender has a lot of powerful
> mesh editing tools, but we can’t use those in the BGE because they require
> a Blender Mesh object while the BGE has a RAS_MeshObject, and using the
> original Blender Mesh would cause that data to change.
>
> If we want a tighter integration between Blender and the BGE, we need to
> allow the BGE to have more direct control over Blender data. This means we
> need to find a way to allow the BGE to modify and use Blender data without
> changing the original data. The most obvious method is to give the BGE a
> copy of all of the data and then just trash the copy when the BGE is done.
> However, I think there is a more elegant solution to the problem. If
> you look at the existing code base, you can see that the Blenderplayer
> actually doesn’t have to worry about modifying Blender data as long as it
> never saves the Blendfile it reads. Only the embedded player has issues
> because it is using the Blender data already loaded in Blender. So, why not
> have the embedded player read from disk like the Blenderplayer? When the
> embedded player starts, the current Blendfile could be saved to disk and
> then loaded by the BGE. There are some details that have to be worked out
> here though, such as where do we save the file? A temporary location (e.g.,
> /tmp)? That will cause path issues in larger games. Instead, I see two
> feasible locations: the original file or the original file appended with a
> “~”. The first would behave like a compiler does: you save before running
> your program. That is the approach I prefer. However, this changes the
> current behavior, which might upset some users.
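(The save-location choice above is simple enough to sketch; both options keep
the file in its original directory, which avoids the relative-path problems a
/tmp location would cause. The function name is hypothetical.)

```python
def embedded_save_path(blend_path, overwrite_original=True):
    """Where the embedded player saves the file before starting the game:
    either the original file (compiler-like) or a '~' sibling next to it."""
    return blend_path if overwrite_original else blend_path + "~"

# Compiler-like behavior: save over the original before running.
assert embedded_save_path("/home/a/game.blend") == "/home/a/game.blend"
# Safer variant: a sibling file, so current behavior is preserved.
assert embedded_save_path("/home/a/game.blend", False) == "/home/a/game.blend~"
```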
>
> A more long term solution to the problem of modifying Blender data is to
> drop the embedded player. As I mentioned before, the Blenderplayer doesn’t
> run into issues using Blender data since it doesn’t share a memory space
> with Blender. And, since the Blenderplayer supports being embedded into
> other applications, we can still have games running in what appears to be
> the viewport. In other words, we would not lose features! Some benefits to
> this approach:
>
>   * Get rid of a lot of code (the whole source/gameengine/BlenderRoutines
> folder)
>   * A lot less duplicate code
>   * Smaller Blender runtime size (all BGE code would only be in the
> Blenderplayer, and not Blender)
>   * Playing the game in the viewport and the Blenderplayer would be
> guaranteed to be the same (right now small differences exist)
>   * The ability to modify Blender data without breaking Blender
>   * A BGE crash won’t affect Blender since they will be in separate
> processes (like Chrome tabs)
>
> However, there are some downsides, which include:
>
>   * It will be more difficult to affect the BGE from Blender. At the moment
> this isn’t a problem, but if we want some goodies like Unity offers with
> adjusting the game using the editor while the game is running, we’d need to
> develop some inter-process communication protocol to get Blender and the
> BGE communicating.
>   * We currently don’t allow embedding on OS X. I’m not sure if this is a
> limitation of OS X itself, or a lack of development effort on our part.
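On the first downside: a rough sketch of what such an inter-process protocol
could look like - JSON messages over a socket or pipe, applied by the player to
its own copy of the scene. The message fields and function names here are
hypothetical, not an existing Blender/BGE protocol.

```python
import json

def encode_update(object_name, prop, value):
    """Blender side: serialize a live-edit message for the running game."""
    return json.dumps({"cmd": "set_property",
                       "object": object_name,
                       "prop": prop,
                       "value": value}).encode("utf-8")

def apply_update(scene, payload):
    """BGE side: decode a message and apply it to the running scene copy."""
    msg = json.loads(payload.decode("utf-8"))
    if msg["cmd"] == "set_property":
        scene[msg["object"]][msg["prop"]] = msg["value"]

scene = {"Cube": {"speed": 1.0}}
apply_update(scene, encode_update("Cube", "speed", 2.5))
assert scene["Cube"]["speed"] == 2.5
```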
>
> Using Blender Data
>
> So, we’ve got some ways to minimize the issues of the BGE using Blender
> data, but what do we do with it? First off, I’d start to clean up the BGE
> code to use DNA data as storage and then shift the focus of the various BGE
> classes to act as wrappers around that storage. Where possible, the member
> functions of those classes could delegate to the various Blender kernel
> (BKE) functions. Once that is done, we can look into what Blender goodies
> we can start adding to the BGE using these new classes.
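The wrapper-plus-delegation pattern described above might look roughly like
this - the BGE class holds no duplicate data, only a reference to the DNA
struct, and its methods delegate to kernel (BKE-style) functions. All names
here are illustrative stand-ins, not real Blender symbols.

```python
def bke_mesh_vertex_count(dna_mesh):
    """Stand-in for a BKE kernel function operating on DNA data."""
    return len(dna_mesh["verts"])

class KX_MeshWrapper:
    """BGE-side wrapper: DNA data is the single source of storage."""
    def __init__(self, dna_mesh):
        self._dna = dna_mesh

    def vertex_count(self):
        # Delegate to the kernel instead of duplicating the logic or the data.
        return bke_mesh_vertex_count(self._dna)

dna = {"verts": [(0, 0, 0), (1, 0, 0), (0, 1, 0)]}
assert KX_MeshWrapper(dna).vertex_count() == 3
```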
>
> Viewport Drawing
>
> While the BGE and Blender already share a fair amount of viewport drawing
> code (especially in GLSL Mode), this area could be much improved. The first
> task here is to get all of the OpenGL (and any calls to bf_gpu) into the
> Rasterizer, and only the Rasterizer. This requires moving material and
> lighting data out of Ketsji and into the Rasterizer. Once this is done, we
> can worry about how the BGE handles its drawing. The Rasterizer should
> have two modes (possibly implemented as two Rasterizers): fixed function
> pipeline and programmable pipeline. To do this, I would propose dropping
> Singletexture and making Multitexture code the basis for the fixed function
> Rasterizer, while GLSL mode would be the basis for the programmable
> Rasterizer. The programmable Rasterizer could have an OpenGL minimum of 2.1
> as Ton suggested for his proposed roadmap, but I’d keep the fixed function
> Rasterizer as compatible with older hardware as possible.
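A toy sketch of that two-Rasterizer split, selecting an implementation from the
scene's shading mode (class names, mode strings, and GL minimums other than the
2.1 figure are assumptions for illustration):

```python
class FixedFunctionRasterizer:
    min_gl = (1, 1)   # stay compatible with old hardware
    def draw(self, mesh):
        return "fixed:" + mesh

class ProgrammableRasterizer:
    min_gl = (2, 1)   # OpenGL 2.1 minimum, as in Ton's proposed roadmap
    def draw(self, mesh):
        return "glsl:" + mesh

def make_rasterizer(mode):
    # Multitexture feeds the fixed pipeline; GLSL mode the programmable one.
    return ProgrammableRasterizer() if mode == "glsl" else FixedFunctionRasterizer()

assert make_rasterizer("glsl").min_gl == (2, 1)
assert make_rasterizer("multitexture").draw("cube") == "fixed:cube"
```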
>
> After we have the Rasterizer cleaned up, we can start offloading as many
> tasks as possible from the Rasterizer to the bf_gpu module, which the
> viewport code also uses. The more we can put into this module, the more
> Blender and the BGE can share viewport drawing. Ideally, the Rasterizer
> would not have any OpenGL code and would rely entirely on bf_gpu,
> maximizing code reuse and sharing.
>
> Conclusion
>
> Using the ideas outlined in this article, we’d have two main points of
> interaction between Blender and the BGE: BKE and bf_gpu. We could certainly
> look into more ways to increase integration between Blender and the BGE,
> but what I have discussed here will give us more than enough work for the
> foreseeable future. Also, please note that this is only a proposal and a
> listing of ideas, and by no means a definitive plan. Discussion and
> feedback are much encouraged and appreciated.
>
> --Mitchell Stokes
> _______________________________________________
> Bf-committers mailing list
> Bf-committers at blender.org
> http://lists.blender.org/mailman/listinfo/bf-committers

