[IPOL discuss] [IPOL tech] Démo de la méthode K_SVD
Miguel Colom
colom at cmla.ens-cachan.fr
Wed Oct 23 21:13:44 CEST 2013
Quoting Nicolas Limare <nicolas.limare at cmla.ens-cachan.fr>:
> It's not the "demo system" (ie python code) that is stopping processes
> demanding too much memory, it is the operating system kernel.
Well, I consider our memory configuration and policy to be part of
the demo system.
> And for the general discussion, could you elaborate on why denoising
> methods need more than 1G memory? I have no opinion on this point, but
> I would like to understand where this memory need comes from. 1Gb is
> more than 130 million double-precision variables.
Despite all the optimizations we apply, denoising (and also noise
estimation) is very costly in memory. I'm thinking of the (as yet
unpublished) Noise Clinic.
In general, you consider overlapping blocks of size w*w. A typical
value is w=8, so in practice the amount of memory you need just to
keep the patches in memory is w*w = 64 times the size of the image.
And of course, some algorithms need to deal with color, so they must
have all channels loaded at once and can't process them sequentially.
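To make the factor concrete, here is a minimal back-of-the-envelope
sketch (illustrative numbers, not any specific IPOL implementation):
with one overlapping w*w patch centered on every pixel, the patch
stack holds roughly w*w copies of the image.

```python
# Rough memory estimate for patch-based denoising (a sketch with
# hypothetical parameters, not the Noise Clinic's actual settings).

def patch_memory_bytes(height, width, channels, w, bytes_per_value=8):
    """Memory to store one w*w patch per pixel, in double precision."""
    n_patches = height * width  # one patch centered on each pixel
    return n_patches * w * w * channels * bytes_per_value

# A 10-megapixel color image with w = 8:
mem = patch_memory_bytes(2500, 4000, 3, 8)
print(round(mem / 2**30, 1), "GiB")  # -> 14.3 GiB, i.e. w*w = 64x the raw image
```

Even a modest image blows far past a 1 GB limit once every pixel
carries its own 8x8 patch across three channels.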
Moreover, the denoising algorithms are multiscale, which means they
build a sort of pyramid of images, and that means more memory.
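The pyramid overhead is bounded, though: assuming a dyadic pyramid
that halves each dimension per scale (an assumption on my part, not
a statement about the Noise Clinic), the total size is a geometric
series.

```python
# Sketch: total size of a dyadic image pyramid relative to the base
# image. Halving each dimension divides the pixel count by 4 per scale.

def pyramid_factor(n_scales):
    """1 + 1/4 + 1/16 + ... over n_scales levels."""
    return sum(0.25 ** s for s in range(n_scales))

print(pyramid_factor(4))  # -> 1.328125, approaching the 4/3 limit
```

So the pyramid alone adds at most a third more memory; it is the
combination with the patch factor above that hurts.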
They are also multifrequency, so they need to keep information about
the image at each frequency.
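These factors multiply. A toy combined estimate (all numbers
illustrative and hypothetical, including the number of frequency
bands) shows why denoising demos hit memory limits so easily:

```python
# Combining the illustrative factors: raw image size, overlapping
# patches, multiscale pyramid, and one full-size copy per frequency band.
image_gib    = 0.24       # ~10 Mpixel RGB image in doubles (approx.)
patch_factor = 8 * 8      # w*w overlapping patches, w = 8
pyramid      = 4.0 / 3.0  # dyadic pyramid limit
bands        = 3          # per-frequency copies (hypothetical count)

total_gib = image_gib * patch_factor * pyramid * bands
print(round(total_gib, 2), "GiB")  # -> 61.44 GiB in the worst case
```

In practice implementations stream patches and reuse buffers instead
of materializing everything at once, but the worst-case product shows
where the pressure comes from.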
Denoising is likely to be one of the topics that demands the most
resources (memory and CPU).