Hi everyone. I'm designing an interface that contains quite a few images (png
format) which I would like to embed in a .py module. I will be generating
the .py files using img2py. I've noticed a few odd things that I would
appreciate some clarification on.
I designed a small test script to display 1) an image built from the string
embedded in the .py file generated by img2py, and 2) the same PNG loaded
directly from file.
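For anyone unfamiliar with what img2py produces: it essentially base64-encodes the PNG bytes into a string constant inside the generated .py module, which can later be decoded back into image data at runtime. Here's a minimal sketch of that idea in plain Python (no wxPython needed), using a hypothetical 1x1 PNG as the embedded data:

```python
import base64

# A tiny 1x1 PNG, base64-encoded -- the same basic idea img2py uses
# when it writes your image data into a .py module as a string.
PNG_DATA = (
    "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJ"
    "AAAADUlEQVR42mP8z8BQDwAEhQGAhKmMIQAAAABJRU5ErkJggg=="
)

def get_png_bytes():
    """Decode the embedded string back into raw PNG bytes."""
    return base64.b64decode(PNG_DATA)

if __name__ == "__main__":
    data = get_png_bytes()
    # Every valid PNG file starts with this eight-byte signature.
    print(data[:8] == b"\x89PNG\r\n\x1a\n")
```

In a wxPython app, img2py wraps this in a `wx.lib.embeddedimage.PyEmbeddedImage` instance, so you'd call something like `MyImage.GetBitmap()` instead of decoding by hand.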
Here's what I've noticed. Using Process Explorer to monitor memory usage, the
GUI with the img2py image string uses FAR less working-set memory than the one
that loads the PNG directly. Here are some stats I pulled:
GUI with image stream from img2py:
Private bytes: 15,660K, Working set: 2,396K (GUI maximized on screen)
Private bytes: 15,660K, Working set: 888K (GUI minimized)
GUI with image as png:
Private bytes: 15,576K, Working set: 21,300K (GUI maximized on screen)
Private bytes: 15,576K, Working set: 21,288K (GUI minimized)
So my questions are: why the difference in memory usage? Will I see an
improvement in start-up times with the img2py approach? Are there any
downsides to embedding images? Thanks
--
View this message in context: http://wxpython-users.1045709.n5.nabble.com/Embedded-images-using-img2py-memory-usage-tp4413728p4413728.html
Sent from the wxPython-users mailing list archive at Nabble.com.