[Bf-committers] New event system

Matt Ebb matt at mke3.net
Mon Jul 2 01:57:40 CEST 2007


On 6/28/07, Jean-Luc Peurière <jlp at nerim.net> wrote:
>
> however we need to :
>
> ...
>
> - get rid of all the while loops and define clear communication paths
>

This is something that interests me greatly. All the tight subloops (e.g. the
knife tool) that block all other input have got to go, replaced by something
that works with the main event system. Being able to pick up tools like that
or edge slide is great functionality, but it's very silly that using them
blocks the entire UI: the only way to interact with them is via
difficult-to-discover modal hotkeys, and the entire range of richer forms of
interaction through Blender's GUI gets thrown aside.

One thing I wanted to ask about in this events work is mouse dragging.
Dragging the mouse is already used in some areas of Blender (moving panels,
sliders, node connections) and could be put to very good use in a lot of
other areas, such as the outliner or re-arranging
constraints/modifiers/textures/menu items, if it weren't currently so
difficult (verging on impossible).

I already implemented a form of drag and drop in the outliner [1], and it
posed some hurdles:
* The entire outliner data structure was being re-built on *every redraw*.
Not only is this wasteful, especially when you have heaps of objects, but it
meant that as an item was dragged around, all of the items would flicker in
and out, cycling through each one as things changed underneath. To solve
this, I added a flag that could be checked to prevent re-building while a
drag is in progress - like if (!is_dragging) { rebuild }
* Because there is no global-level support for it, I had to make a horrible
subloop and do everything manually: using low-level commands to redraw the
screen, detecting when drags were started and stopped, keeping track of
where the start and end points were, and displaying something to follow the
mouse pointer.
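The rebuild guard mentioned in the first point could look roughly like this
(a minimal sketch in C - the names and globals here are illustrative, not
Blender's actual outliner code):

```c
#include <stdbool.h>

/* Sketch of the rebuild guard: the outliner tree is only re-built
 * when no drag is active, so dragged items keep stable positions
 * instead of flickering as the tree changes underneath them.
 * All names here are hypothetical. */

static bool is_dragging = false;
static int rebuild_count = 0;

static void outliner_rebuild(void)
{
    rebuild_count++;  /* stands in for re-building the whole tree */
}

static void outliner_redraw(void)
{
    if (!is_dragging)
        outliner_rebuild();  /* skip the rebuild while a drag is live */
    /* ... draw the (possibly stale) tree ... */
}
```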

It would be great if the event system itself could help here; it would
simplify things a lot and make it much easier for this sort of thing to be
used across Blender without re-inventing the wheel each time. I won't claim
to be an expert on the best way to implement it, but it would make sense for
drags to be a proper low-level event. They could be detected at the lowest
level when processing mouse input (i.e. when a button is held down and the
displacement crosses a threshold, or when it's released).
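The low-level detection could be as simple as this sketch (a hypothetical
state machine - the struct, function names, and threshold value are all
assumptions for illustration):

```c
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical low-level drag detection: a drag starts once the
 * button is held down and the pointer moves past a small threshold,
 * and ends when the button is released. */

#define DRAG_THRESHOLD 3  /* pixels; illustrative value */

typedef struct DragState {
    bool button_down;
    bool dragging;
    int start_x, start_y;
} DragState;

static void drag_on_button_down(DragState *ds, int x, int y)
{
    ds->button_down = true;
    ds->dragging = false;
    ds->start_x = x;
    ds->start_y = y;
}

static void drag_on_motion(DragState *ds, int x, int y)
{
    if (ds->button_down && !ds->dragging &&
        (abs(x - ds->start_x) > DRAG_THRESHOLD ||
         abs(y - ds->start_y) > DRAG_THRESHOLD)) {
        ds->dragging = true;  /* here a DRAG_START event would be sent */
    }
}

static void drag_on_button_up(DragState *ds)
{
    ds->button_down = false;
    ds->dragging = false;  /* here a DRAG_END event would be sent */
}
```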

Then global info could be set and accessed anywhere in Blender (perhaps
similar to get_mbut()): you could have something like get_dragging(float
*startcoord) that would tell you whether the user is doing a drag motion,
and where it started (and ended?). Tool coders could then just use the main
event system and let it do the hard work of checking input devices and
redrawing the screen, and life would be much easier.
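A query in that spirit could be sketched like this (get_dragging() is the
hypothetical function suggested above; the globals and the setter are
assumptions, since only get_mbut() exists as a precedent):

```c
#include <stdbool.h>

/* Hypothetical global drag query, in the spirit of get_mbut():
 * tool code asks the event system whether a drag is in progress
 * and where it started. */

static bool g_dragging = false;
static float g_drag_start[2] = {0.0f, 0.0f};

/* Would be called by the low-level event code that detects drags. */
static void events_set_drag(bool active, float x, float y)
{
    g_dragging = active;
    if (active) {
        g_drag_start[0] = x;
        g_drag_start[1] = y;
    }
}

/* Returns true while a drag is active; copies out the start point. */
static bool get_dragging(float *startcoord)
{
    if (g_dragging && startcoord) {
        startcoord[0] = g_drag_start[0];
        startcoord[1] = g_drag_start[1];
    }
    return g_dragging;
}
```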

(as an aside, the main reason the outliner stuff wasn't taken much further
is that there are no generic functions in Blender for parenting or datablock
assignment such as object->object or object->list of objects or
obdata->object, etc. etc. and I would have to rewrite all of that code
myself, which I wasn't very keen on. Ideally this sort of functionality
should be split up, or perhaps made as 'tools' in a tool API)

cheers,

Matt



[1] http://mke3.net/weblog/whirlwind-minus-the-tour/