Hi Dick,
I've got an application of approximately 30k lines of code using Python 2.4.1 and wxPython 2.6.1.
It stores data on a PostgreSQL backend.
The middleware I use for database connectivity is PsycoPg.
It runs all day in doctors' offices and there is no memory leak of this nature.
I also have a spam reporting tool with which I stuff spam and statistics into a PostgreSQL database. It's only about 3k lines of code, but I rarely shut it down because spam continues to arrive. Its current memory usage is about 5k, but its peak was 21.7k.
It sounds to me like there may be a reference to a query that is not being properly dropped.
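To make the "lingering reference" idea concrete, here's a rough sketch of the pattern I mean (the connection string, table name, and ReportPanel class are all made up, and I'm just assuming the usual DB-API calls PsycoPg gives you). Keeping fetchall() results on a long-lived object pins every row in memory for as long as that object lives:

    import psycopg  # assumption: the DB-API module PsycoPg installs

    # Hypothetical connection string -- substitute your own.
    conn = psycopg.connect("dbname=spamdb")

    class ReportPanel:
        def load_stats_leaky(self):
            cur = conn.cursor()
            cur.execute("SELECT * FROM spam_stats")  # hypothetical table
            self.rows = cur.fetchall()  # reference kept on a long-lived object
            cur.close()                 # cursor is closed, but self.rows still
                                        # holds every row in memory

        def load_stats_tidier(self):
            cur = conn.cursor()
            cur.execute("SELECT count(*) FROM spam_stats")
            total = cur.fetchone()[0]   # fetch only what you need...
            cur.close()
            return total                # ...and let the rest go out of scope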
If you're not using PsycoPg, I give it two thumbs up! I would like to think that PostgreSQL has nothing to do with this issue, but the database adapter may. I was using PygreSQL but found a few minor issues with it that drove me nutzo. When PostgreSQL 8 came out, I found it fairly painless to migrate to PsycoPg and found its speed a nice advantage.
In either event, len(gc.get_objects()) should probably become your new friend and be scattered amongst your K's of code.
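Something along these lines is what I mean (the log_objects name and the idea of hooking it to a timer are just my suggestion, not anything from your code). Call it at suspect points and watch which counts keep climbing:

    import gc

    def log_objects(label=""):
        # Count live objects by type so you can see which ones keep growing.
        objs = gc.get_objects()
        counts = {}
        for o in objs:
            name = type(o).__name__
            counts[name] = counts.get(name, 0) + 1
        top = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:10]
        print "%s: %d live objects, top types: %r" % (label, len(objs), top)

    # e.g. log_objects("after running the report") sprinkled through the code,
    # or fired from a wx.Timer every few minutes.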
-Joe
Dick Kniep wrote:
···
Hi list,
We have developed a big application (approximately 55 KLines) with a Postgres database on wxPython 2.4.2.4.
When a user logs on it uses approximately 60 MB, of which 21 MB is shared memory. After a while memory use shoots up to 250 MB+. Of course this is unacceptable. Does anyone on the list have experience with an application of this size, and can anyone advise me on how to find out where the memory is lost?
As the application has grown over time, the programming has been sloppy in places, but the memory loss is so large that I can hardly understand how it can happen.
Cheers,
Dick Kniep