I have a SQLAlchemy application that is raising a timeout error:

TimeoutError: QueuePool limit of size 5 overflow 10 reached, connection timed out, timeout 30
I read in another post that this happens when the session is not closed, but I don't know whether that applies to my code.
I connect to the database in __init__.py:
from sqlalchemy import create_engine

from .dbmodels import (
    DBSession,
    Base,
)

# loadConfigVar() reads the corresponding setting from the configuration
engine = create_engine("mysql://" + loadConfigVar("user") + ":" + loadConfigVar("password")
                       + "@" + loadConfigVar("host") + "/" + loadConfigVar("schema"))

# Bind the engine to the session and the Base model class
DBSession.configure(bind=engine)
Base.metadata.bind = engine
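In case it matters, I think I could also raise the limits by passing pool settings to create_engine (untested, and I suspect it would only hide a leak rather than fix it); pool_size, max_overflow and pool_timeout are the arguments I believe control this:

# Untested sketch: explicit pool settings on the engine.
# The numbers are only examples, not recommendations.
engine = create_engine(
    "mysql://" + loadConfigVar("user") + ":" + loadConfigVar("password")
    + "@" + loadConfigVar("host") + "/" + loadConfigVar("schema"),
    pool_size=10,      # persistent connections kept in the pool
    max_overflow=20,   # extra connections allowed on top of pool_size
    pool_timeout=60)   # seconds to wait for a free connection before TimeoutError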
Then, in another Python file, I gather some data in two functions using the DBSession that I initialized in __init__.py:
import transaction

from .dbmodels import DBSession
from .dbmodels import resourcestatsModel

# toolkit, addFesturedCount and FeaturedGroups are defined elsewhere in this module


def getFeaturedGroups(max=1):
    try:
        # Get the number of downloads per resource
        transaction.commit()
        rescount = DBSession.connection().execute(
            "select resource_id, count(resource_id) as total FROM resourcestats")

        # Move the data to an array
        resources = []
        for row in rescount:
            data = {}
            data["resource_id"] = row.resource_id
            data["total"] = row.total
            resources.append(data)

        # Get the list of groups
        group_list = toolkit.get_action('group_list')({}, {})
        for group in group_list:
            # Get the details of each group
            group_info = toolkit.get_action('group_show')({}, {'id': group})
            # Count the features of the group
            addFesturedCount(resources, group, group_info)

        # Order the FeaturedGroups by total
        FeaturedGroups.sort(key=lambda x: x["total"], reverse=True)
        print FeaturedGroups

        # Move the data of each group to the result array
        result = []
        count = 0
        for group in FeaturedGroups:
            group_info = toolkit.get_action('group_show')({}, {'id': group["group_id"]})
            result.append(group_info)
            count += 1
            if count == max:
                break
        return result
    except:
        return []
def getResourceStats(resourceID):
    transaction.commit()
    return DBSession.query(resourcestatsModel).filter_by(resource_id=resourceID).count()
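From what I have read, I suspect I am supposed to finish the transaction and remove the scoped session myself when something goes wrong, roughly like this (untested sketch; transaction.abort() and DBSession.remove() are the calls I believe apply here):

def getResourceStats(resourceID):
    # Untested sketch: always end the transaction and hand the session back,
    # even if the query raises, so the connection returns to the pool.
    try:
        count = DBSession.query(resourcestatsModel).filter_by(
            resource_id=resourceID).count()
        transaction.commit()
        return count
    except:
        transaction.abort()   # roll back the zope transaction
        raise
    finally:
        DBSession.remove()    # close the thread-local session

Is this the kind of cleanup that is expected, or is it handled for me by the transaction manager?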
The session object itself is created like this:
# Basic SQLAlchemy types
from sqlalchemy import (
    Column,
    Text,
    DateTime,
    Integer,
    ForeignKey,
)

# SQLAlchemy declarative base
from sqlalchemy.ext.declarative import declarative_base

from sqlalchemy.orm import (
    scoped_session,
    sessionmaker,
)

# Use zope.sqlalchemy's transaction manager
from zope.sqlalchemy import ZopeTransactionExtension

# Main plugin session
DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))
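My rough understanding of the lifecycle is sketched below; I am not sure the remove() step is actually required in my setup, which is part of what I am asking (this assumes "import transaction" and the resourcestatsModel from above):

# Untested sketch of what I think one round trip looks like:
session = DBSession()                              # thread-local session from the registry
total = session.query(resourcestatsModel).count()  # a pool connection is checked out here
transaction.commit()                               # ZopeTransactionExtension commits the session with the transaction
DBSession.remove()                                 # closes the session; is this the step I am missing?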
Because the session is created in __init__.py and the rest of the code just uses it, at which point do I need to close the session? Or what else do I need to do to manage the pool size?
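For debugging, I was planning to print the pool status around the calls above (I believe QueuePool has a status() method for this, but I am not sure it is the recommended way to spot leaked connections):

# Untested debugging sketch; "engine" is the engine configured in __init__.py,
# and pool.status() returns a one-line summary of checked-in/checked-out connections.
def logPoolStatus(label):
    print label, engine.pool.status()

logPoolStatus("before getFeaturedGroups")
getFeaturedGroups()
logPoolStatus("after getFeaturedGroups")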
question from: https://stackoverflow.com/questions/24956894/sql-alchemy-queuepool-limit-overflow