[Bf-taskforce25] windowmanager issues

Ton Roosendaal ton at blender.org
Wed Nov 19 19:21:32 CET 2008


Hi,

Thanks! See comments mixed in below.

> * wm_apple.c: should be moved to ghost?

Yes, it can be moved. It was old cruft from ghostwinlay.c; I wasn't sure 
where to put it, so I'll leave that to Jean Luc.

> * Keymap storage: they are currently as ListBases in wmWindowManager.
> Maybe this needs a wmKeymap struct with name + the ListBase, so it's
> more extensible.

Yes, this is still an open issue.

The original idea: only the global window/screen-level keymaps were 
supposed to live there.
The other ones should be at the Area-type level, so they stay nicely 
local and pluggable.

Problem: how do we register custom keymaps (like for transform)? And 
how do you find maps if you want to make your own key bindings? Your 
suggestion could work; this needs to be solved.
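To make the suggestion concrete, here is a minimal C sketch of such a 
named keymap wrapper; the names (wmKeymap, WM_keymap_ensure) are 
hypothetical, not a settled API:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Sketch: wrap the ListBase of keymap items in a named struct, so
 * editors and plugins can look maps up by name. Hypothetical names. */

typedef struct ListBase { void *first, *last; } ListBase;

typedef struct wmKeymap {
    struct wmKeymap *next, *prev;
    char idname[64];    /* e.g. "View3D", "Transform" */
    ListBase items;     /* the existing ListBase of keymap items */
} wmKeymap;

/* Find a keymap by name in the window manager's list, creating it
 * if it does not exist yet. */
wmKeymap *WM_keymap_ensure(ListBase *keymaps, const char *idname)
{
    wmKeymap *km;

    for (km = keymaps->first; km; km = km->next)
        if (strcmp(km->idname, idname) == 0)
            return km;

    km = calloc(1, sizeof(wmKeymap));
    strncpy(km->idname, idname, sizeof(km->idname) - 1);

    if (keymaps->last) {
        km->prev = keymaps->last;
        ((wmKeymap *)keymaps->last)->next = km;
    }
    else {
        keymaps->first = km;
    }
    keymaps->last = km;
    return km;
}
```

This would let transform register its own map once and let a 
key-binding editor enumerate all maps by name.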

> * There seems to be no way to specify "any key", or "any modifier key"
> in keymaps.

For the time being we should avoid making hotkeys that flexible. When 
would you need that?

> * wm_event_system.c. "MVC demands to not draw in event handlers... for
> now we leave it.". I think we should be strict here and not allow it.

Sure, it was a note I wrote when coding in the first week. :)

> * The exact function of the area/region refresh() callback is not
> clear to me. Comment says external bContext changes, but I'm not sure
> what that means, also compared to init(). For example size changes
> causes an init() call in which the area repositions regions so that's
> not refresh(). Other things may be handled in listener(). So I'm not
> sure what kind of things that would be handled there.

I thought it would be nice to separate init/config code (UI layout, 
size) from contextual changes (new active object, new material to 
draw, etc.).
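A minimal sketch of that separation, with hypothetical names; the 
window manager would route size changes to init() and context changes 
to refresh():

```c
#include <assert.h>

/* Sketch: init() runs for layout/size changes, refresh() for external
 * context changes (new active object, new material). Hypothetical. */

typedef struct AreaType {
    void (*init)(void);     /* UI layout, size, region setup */
    void (*refresh)(void);  /* bContext changed: active object, etc. */
} AreaType;

static int n_init, n_refresh;
static void demo_init(void)    { n_init++; }
static void demo_refresh(void) { n_refresh++; }

/* What the window manager would call, schematically. */
static void area_resized(AreaType *at)         { at->init(); }
static void area_context_changed(AreaType *at) { at->refresh(); }
```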

> * The area init() callback seems to be doing region registration,
> setting up callbacks and such. Maybe regiontypes should get registered
> with their callbacks, keymaps, etc more like for areas with a unique
> name in an initialization when Blender starts, not repeatedly?

Sounds cool. At the time I stopped coding, areas/regions were mostly 
undefined.

> * Will Panels be regions? Makes sense if you do this and then allow
> regions to be free floating, or 'docked', etc. Maybe not a wm topic
> per se, just trying to think of what functions a region would have,
> i.e. if you would be able to drag them around, or tab them, add
> regions from a menu etc. Also if the regions can basically be opened
> and closed at will then you will need most code in the regions and not
> in the area because the area will know little about its contents.

The concept of panels itself is under review still! :)

Regions were meant to divide an area in useful ways, strictly 
following the subdivision principle. Action, Ipo, etc. all need good 
region handling (channel list). We should also look at context-editing 
in areas, like using a mini-outliner or so.
Regions also make it possible to turn a 3D window into a 4-way split, 
with only one header/toolbar, etc.
Something like the "Nkey" properties panel can also become a region, on 
the side or bottom of an area. That will make going to full-area-window 
useful too.

Optionally, floating button panels/toolbars can be allowed too. They 
can be coded as an exceptional region inside the area context (to 
prevent them from being dragged around to other editors!).

> * Areas have spacedata, do regions need something similar? I've added
> a void *regiondata; pointer since I needed it for the UI code, which
> is used by the region. Should this be a list like spacedata (imagine
> tabbed regions)?

Sounds cool, needs a bit of experimenting... customizing area regions 
is also an issue.

> * Modal operator context vs. handler. A transform operator wants to
> handle events for the full window, blocking. However it still works on
> the data in its area/region. I've added a convention now that the
> window/area/region where the operator was invoked 'owns' it as long
> as the operator is running (in the modalops list) and will cancel it
> if the window/area/region is deleted. That may not be ideal, and
> it's still not clear which window/area/region should be in the context
> when the modal() callback is used, the one where the handler is or
> where the operator was invoked?

Can't the modal-handler store a subwindow index, so it can check/set 
context?
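A minimal sketch of that idea, with hypothetical names: the handler 
remembers where the operator was invoked, and the event loop restores 
that context before calling modal():

```c
#include <assert.h>

/* Sketch: the modal handler stores a reference to the subwindow
 * (here simplified to ids) where the operator was invoked, and the
 * event loop sets the context from it. Hypothetical names. */

typedef struct bContext { int area_id, region_id; } bContext;

typedef struct wmEventHandler {
    int op_area_id, op_region_id;   /* subwindow at invoke time */
    int (*modal)(bContext *C);
} wmEventHandler;

static int grab_modal(bContext *C)
{
    /* the operator sees the area it was invoked in */
    return C->area_id;
}

/* Event loop: restore invoke-time context before dispatching. */
static int wm_handler_modal_call(bContext *C, wmEventHandler *h)
{
    C->area_id = h->op_area_id;
    C->region_id = h->op_region_id;
    return h->modal(C);
}
```

That way modal() always runs with the invoking window/area/region in 
the context, regardless of where the handler itself lives.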

> * Operators vs. handlers. My initial interpretation of operators was
> that they would be doing things that seem to be intended for handlers
> instead. My idea of handlers was that they were just a way to 'hook
> up' an operator somewhere and operators might keep running for a
> while. Now I'm not sure anymore what would be a handler or an operator
> or something else, for example: the timeline play, shift+p preview
> render, autosave, opening a file dialog to set an image path.

I'll document that better; but in theory the operator was meant to be 
callable by anyone, to perform operations on data, especially without 
needing handlers.
Your examples:

- timeline play = notifier (UI stuff)
- shift+p previewrender = notifier
- autosave = operator with timer handler
- opening a file dialog to set an image path = notifier

Preferably, UI stuff should not be done with operators, unless it 
involves user actions like dragging area edges or splitting areas.

However, I see the issue... if you want keymaps for such notifiers, we 
have to allow this... I'll think it over.

> * Operators have an interface with callbacks and defined lifetime, for
> handlers this does not exist, so it's not that extensible currently
> for python or for decoupling things. Think for example a verse
> handler, continuous exporter for a game engine. Do handlers need a way
> to make new handlers without changing the windowmanager code itself
> like exists for operators?

Not sure what this means... isn't this example just a timer handler?

> * Timers, are they events, notifiers, or something else? Can a region
> get a timer event for shift+p preview render, or is that a handler,
> operator, .. ?

Hrms... at first I thought of making a couple of special handler 
types: a timer handler, a uiBlock handler, a keymap handler, etc.
Also the afterqueue concept in Blender needs to work well again; I 
haven't thought that through yet!
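A rough sketch of what such special handler types could look like, 
with hypothetical names: one handler struct with a type tag, so the 
event loop knows how to treat it:

```c
#include <assert.h>
#include <stddef.h>

/* Sketch: a tagged handler struct. The event loop matches keymap
 * handlers against events, routes events to uiBlock handlers, and
 * fires timer handlers on timer events. Hypothetical names. */

typedef enum { HANDLER_KEYMAP, HANDLER_UI, HANDLER_TIMER } eHandlerType;

typedef struct wmEventHandler {
    eHandlerType type;
    void *keymap;    /* HANDLER_KEYMAP: map to match events against */
    void *uiblock;   /* HANDLER_UI: block that owns the events */
    double interval; /* HANDLER_TIMER: seconds between firings */
} wmEventHandler;

/* Should this handler receive a timer event? */
int handler_accepts_timer(const wmEventHandler *h)
{
    return h->type == HANDLER_TIMER;
}
```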

> * Error reporting. I've added a way to report errors and warnings
> generally and add them to a list in the windowmanager, this can be a
> way to make that non-blocking, but it may not be ideal. Many places
> that want to report errors or warnings may not have a
> context/windowmanager pointer available.

If we want to extend this to a built-in console, some global (or Main) 
can store this?
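A minimal sketch of that, assuming a simple global report list that a 
console or header could display later; names are hypothetical:

```c
#include <assert.h>
#include <string.h>

/* Sketch: non-blocking reporting. Warnings/errors are appended to a
 * global list (a fixed array for brevity), reachable without a
 * context/windowmanager pointer. Hypothetical names. */

enum { RPT_WARNING, RPT_ERROR };

typedef struct Report { int type; char msg[128]; } Report;

static Report report_list[64];
static int report_count;

void report(int type, const char *msg)
{
    if (report_count < 64) {
        report_list[report_count].type = type;
        strncpy(report_list[report_count].msg, msg, 127);
        report_count++;
    }
}
```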

> * Notifier/listener. Still not sure about the types of notifiers that
> will exist. Redraw and changed are clear to me, but what is the
> purpose of an area_split, area_drag, gesture_changed notifier?

I already removed the first two :)
A notifier should ideally not say what should be done, but say *what 
happened*, so the listeners can evaluate that themselves. This is still 
WIP... there are only a few notifiers now.
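A minimal sketch of that principle, with hypothetical categories: the 
notifier only states what happened, and each listener decides whether 
it cares:

```c
#include <assert.h>
#include <stddef.h>

/* Sketch: a notifier carries a category ("what happened") plus a data
 * reference, not a command. Categories and names are hypothetical. */

enum { NC_OBJECT, NC_MATERIAL, NC_SCENE };

typedef struct wmNotifier { int category; void *reference; } wmNotifier;

static int view3d_needs_redraw;

/* The 3D view's listener evaluates the notifier itself: it reacts to
 * object and material changes, ignores the rest. */
static void view3d_listener(const wmNotifier *note)
{
    if (note->category == NC_OBJECT || note->category == NC_MATERIAL)
        view3d_needs_redraw = 1;
}
```

No listener is ever told "redraw"; each one derives that from what 
changed.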


> * Notifier/listener. How can we avoid allqueue(REDRAWX, 0) everywhere?
> Since object_change, material_changed type notifiers don't work well
> with the delayed nature of listener if you wanted to find out if a
> specific object changed for example, how do areas and regions find out
> that this data changed? Immediate notifiers? Notifiers that don't pass
> any specific information about which data changed, just that some
> material changed for example?

You're probably picturing something that's depsgraph based?
Notifiers/listeners are only for UI stuff... and don't have to be very 
picky?

Thanks!

-Ton-


>
> Brecht.