[Bf-python] Strange behavior of Pysqlite

Roland Hess rolandh at reed-witting.com
Mon Dec 5 15:08:06 CET 2005


Not really looking for a solution to this, as the workaround is 
simple enough, but I did want to point this out for posterity's sake.

I'm using SQLite in Blender via the Pysqlite Python package. When I 
make my connection object and cursors as standard Py objects, they 
work fine. But when I attach those objects to the global Blender
object for use across different functions (Blender.dbConn and
Blender.dbCursor), instead of simply passing them around as
parameters (dbConn and dbCursor), I see a significant slowdown. I
didn't have this problem when using Blender with MySQL via the
MySQLdb Python package. I have absolutely no idea where one would
look to solve something like this, and my guess is that it's a
complete non-issue, as the number of people doing what I'm doing is
probably very near zero. Anyway, a quick example to show:

import Blender
from pysqlite2 import dbapi2 as sqlite

def dotest(dbConn, dbCursor):
    SQL = "INSERT INTO tblActors (x) VALUES (5);"

    # Using the objects passed in as parameters
    for i in range(2200):
        dbCursor.execute(SQL)
    dbConn.commit()

    # Using the objects hung off the global Blender object
    for i in range(2200):
        Blender.dbCursor.execute(SQL)
    Blender.dbConn.commit()

dbConn = sqlite.connect("dbActors.db")
dbCursor = dbConn.cursor()

Blender.dbConn = dbConn
Blender.dbCursor = dbCursor

dotest(dbConn, dbCursor)

Assuming correct syntax on the Python code and the SQL (didn't bother 
to look it up for this email), I get results of around 1.13 seconds 
for the test using the passed objects (dbConn), and around 86 seconds 
for the test using the global object (Blender.dbConn).
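For what it's worth, if the slowdown really does come from looking up
Blender.dbCursor afresh on every pass through the loop (just a guess on
my part), binding the bound method to a local name once, before the
loop, should sidestep it. A minimal sketch of the idea, using the
standard sqlite3 module and a plain object standing in for Blender
(both stand-ins are assumptions, since this isn't runnable outside
Blender itself):

```python
import sqlite3

class Holder:
    """Stand-in for the Blender module object (assumption for this sketch)."""
    pass

Blender = Holder()
Blender.dbConn = sqlite3.connect(":memory:")
Blender.dbCursor = Blender.dbConn.cursor()
Blender.dbCursor.execute("CREATE TABLE tblActors (x INTEGER)")

SQL = "INSERT INTO tblActors (x) VALUES (5)"

# Hoist the attribute lookup out of the loop: resolve
# Blender.dbCursor.execute once, then call the local name.
execute = Blender.dbCursor.execute
for i in range(2200):
    execute(SQL)
Blender.dbConn.commit()

count = Blender.dbCursor.execute(
    "SELECT COUNT(*) FROM tblActors").fetchone()[0]
print(count)
```

If the local-alias version runs as fast as the plain dbCursor version,
attribute lookup is the culprit; if it is still slow, the cost is
somewhere inside pysqlite itself.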

Strange. That's all I'm saying.
-- 
Roland Hess - harkyman


