I work with large-ish datasets: 4D medical images. Using the fantastic numpy package, working with them is a breeze, and fairly memory efficient even. There is one tiny problem though: I occasionally work with a lot of these 4D images at the same time. Loading a single 150MB image is no problem, but opening 30 of them gets tricky. Especially when one needs to manipulate these images, making copies of them in order to manipulate them in different ways. When my 16GB laptop ran out of memory, I decided to simply break up the script, because I don't know how to manage memory in Python as well as in C++. Through clever but simple scoping rules and RAII, managing memory in C++ is easy peasy. Python, however, not so much. I looked into this a bit deeper today, and it seems I simply can't. I thought of using the with statement, but it turns out scoping and RAII are simply not linked in Python the way they are in C++. More details here. Apparently dynamic languages and RAII do not go together. Here I was thinking numpy had solved all my problems with Python (fast numerics), but alas…
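To make the scoping point concrete, here is a small sketch (the array is a tiny stand-in for a real 150MB image): a with statement manages one resource, not the variables assigned inside its block. Python has no block scope, so the array outlives the block, unlike a C++ local that would be destroyed at the closing brace.

```python
import io
import numpy as np

# A with block manages a single resource (here, a file-like object)...
with io.StringIO() as f:
    a = np.zeros((10, 10))  # tiny stand-in for a 150MB 4D image

# ...but Python has no block scope: `a` is still alive out here, so its
# buffer was NOT freed when the block ended. In C++, a local declared
# inside the braces would have been destroyed by now (RAII).
print(a.shape)

# The closest manual equivalent is dropping the reference explicitly;
# once no references remain, the array's memory can be reclaimed.
del a
```

This is why the with statement doesn't help here: it runs `__exit__` on the context manager (closing the file-like object), but says nothing about other names bound inside the block.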
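As a small aside on the copying problem: every plain numpy expression allocates a fresh array, but many operations have in-place variants that reuse an existing buffer, which can cut peak memory when the original values are no longer needed. A minimal sketch, with a toy array standing in for a real image:

```python
import numpy as np

a = np.ones((4, 4), dtype=np.float32)  # toy stand-in for a 150MB image

b = a * 2                 # allocates a brand-new array the same size as a
a *= 2                    # in-place: reuses a's own buffer, no extra copy
np.multiply(a, 2, out=a)  # the same idea, with an explicit out= argument
```

This doesn't solve the scoping problem, but it at least avoids doubling memory for every transformation.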
Perhaps Julia is my last, best hope for peace and easy memory management. I'll have to dig into that another day.