[Soc-2012-dev] Fwd: Multitouch Week 6 Update

Nick Rishel nrishel at iupui.edu
Sat Jun 30 09:45:22 CEST 2012


All Multitouch Weekly Reports:
<http://wiki.blender.org/index.php/User:PrototypeNM1/Weekly_Report#Week_6>

Week 6 synopsis:

   - I realized my prior design ideas will not work for BGE
      - As a result, this week was spent figuring out how this might work
      instead
      - Another result is what I believe is becoming a better overall
      design (the old design was starting to feel too hard-coded)
      - I will begin work on the alternative design tomorrow
   - Mike set up a conference call with the BlenderTUIO developers
      - They are interested in developing tools on the resulting platform
      - They might be able to provide bug and usability testing
      - Another call will be set up in the next week or two
      - They have already confirmed with user testing what I previously
      believed to be important points to accomplish with this project
   - An internal touch library will need to be written
      - This is the only foreseeable way to have a single touch manager for
      BGE and Blender
      - Possible context information will need to be registered to the
      manager (during initialization in Blender and as the game
developer devises
      contexts in BGE)
      - To expose actions to the user, the Python API will be used for
      operations
   - Next week's work
      - Begin work on the library
      - Write functions to register contexts with the touch manager and add
      to Blender's initialization
      - Write functions to feed touch information to the touch manager and
      add to Blender's main loop
      - Try to finish stream generator prior to the midterm

Full Week 6 report from Wiki:
Tuesday

In an IRC conversation with jesterKing and defelinto, I realized that some
fundamental design directions I intended to take were not going to work
if I wanted the same code base to serve both BGE and Blender. jesterKing
promptly informed me that the BGE and Blender code bases diverge strongly,
so I can make fewer assumptions about what information is available; I
cannot base my system solely around context in Blender.

This information, while important, is troubling because of how far it sets
me back. In the end, though, I believe a better design will result from it:
the approach I had been taking was becoming a very hard-coded solution,
which would certainly limit usability.
Wednesday

I sat in on a Skype conference with Mike and previous Blender TUIO
developer Marc Herrlich. Marc expressed interest in finding students to
test and develop touch input in Blender, with respect to creating new
gestures in an existing touch environment. This relationship would benefit
us both: it would reduce my burden to create gestures once the system is
in place, allowing more time to stabilize the system instead, while giving
Marc a research area for his students that allows rapid prototyping of
gesture commands.

In our discussion, Marc confirmed what I believed to be the most
immediately important target for gesture recognition: tablet and touch
support for sculpting.
Friday

I believe the best solution to the code duplication problem is a single
touch library which both Blender and BGE share. This library should
provide a means of registering possible context information; in Blender
this would be, at minimum, the editor type and what is directly under the
touch point. It might also be useful to know the editor's mode, but for
the time being I will assume this is context I do not need. Upon
initialization, Blender should register the existing editor types with the
touch library, as well as what might be under any given touch point. In
BGE, providing context and under-touch information might be best left to
the game designer, which means exposing context registration to Python.
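The registration scheme described above might be sketched as follows. This
is a hypothetical Python model of the planned library, not actual Blender
or BGE API; every name here (TouchManager, register_context,
known_contexts, the context strings) is mine for illustration only.

```python
class TouchManager:
    """Illustrative sketch of a shared touch manager: Blender would
    register its editor types at initialization, while a BGE game
    developer could register game-specific contexts at runtime through
    an exposed Python API."""

    def __init__(self):
        # context name -> set of things that may lie under a touch point
        self._contexts = {}

    def register_context(self, name, possible_values):
        """Register a context (e.g. an editor type) together with what
        might be found directly under a touch point within it."""
        self._contexts[name] = set(possible_values)

    def known_contexts(self):
        """Return the registered context names, sorted for display."""
        return sorted(self._contexts)


mgr = TouchManager()

# Blender side: registered once during initialization.
mgr.register_context("VIEW_3D", ["object", "empty_space"])
mgr.register_context("IMAGE_EDITOR", ["uv_vertex", "empty_space"])

# BGE side: a game designer registers contexts as the game defines them.
mgr.register_context("game_hud", ["button", "joystick_area"])

print(mgr.known_contexts())
```

The point of the sketch is only the shape of the interface: a single
registry that both applications feed, with BGE reaching it through Python.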

I believe it would be best to handle actions through Python. This would
work in both BGE and Blender and, more importantly, would make
user-defined actions during touch events easily customizable (very
important for BGE). It also leaves open the possibility of complex actions
written in Python by users.
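A minimal sketch of what Python-side action handling could look like,
assuming a registry keyed by (context, gesture); the function names and
the "pinch" gesture here are hypothetical, not an existing API.

```python
# Hypothetical registry mapping (context, gesture) pairs to
# user-defined Python callbacks.
_actions = {}

def register_action(context, gesture, callback):
    """Bind a user-supplied callback to a gesture within a context."""
    _actions[(context, gesture)] = callback

def dispatch(context, gesture, *args):
    """Invoke the callback for this (context, gesture), if any."""
    callback = _actions.get((context, gesture))
    if callback is not None:
        return callback(*args)
    return None

# A user-defined action: the same mechanism would serve both Blender
# operators and BGE game logic.
def zoom(factor):
    return "zoom by %.1f" % factor

register_action("VIEW_3D", "pinch", zoom)
print(dispatch("VIEW_3D", "pinch", 1.5))  # zoom by 1.5
```

Because the callbacks are plain Python, arbitrarily complex user-written
actions drop into the same dispatch path as built-in ones.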

I don't like that not much coding work was done this week; most of my time
ended up committed to researching how this problem is addressed in
Blender's existing code. The best example I have found thus far is
GHOST's usage.
Next Week

My first goal for next week is to have a library started which is fed the
editor types available in Blender and, given context information, can
display a list of possible contexts. To accomplish this, I will need to
find the appropriate place in Blender's initialization to feed this
information to the touch manager. I want to have this finished by Tuesday.

After this, I will begin work on feeding it touch information: individual
touch points, the editor they reside in, and relative changes in location.
The week after next is the beginning of midterm reviews; before that
process begins, I want to have the stream generator mostly finished.
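The touch-feeding step could be sketched like this: track each touch
point's editor and last position, and report relative motion as new
samples arrive. Again a hypothetical Python model; the class and method
names are my own, not part of any planned interface.

```python
class TouchStream:
    """Illustrative per-touch-point stream: stores each point's editor
    and last position, and yields relative changes in location."""

    def __init__(self):
        self._points = {}  # touch id -> (editor, x, y)

    def feed(self, touch_id, editor, x, y):
        """Record a sample; return (dx, dy) relative to the previous
        sample for this touch id, or (0, 0) for a new touch or one
        that has crossed into a different editor."""
        prev = self._points.get(touch_id)
        self._points[touch_id] = (editor, x, y)
        if prev is None or prev[0] != editor:
            return (0.0, 0.0)
        return (x - prev[1], y - prev[2])

    def release(self, touch_id):
        """Forget a touch point once it lifts."""
        self._points.pop(touch_id, None)


stream = TouchStream()
stream.feed(1, "VIEW_3D", 10.0, 10.0)        # new touch -> (0.0, 0.0)
dx, dy = stream.feed(1, "VIEW_3D", 12.5, 9.0)
print(dx, dy)  # 2.5 -1.0
```

In the real library this state would be fed from Blender's main loop (or
the BGE equivalent), with gesture recognition consuming the deltas.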

