Abstract
Across the sciences, we routinely subtract or divide stochastic signals. For instance, consider a stochastic realization, x, generated from the addition or multiplication of two stochastic signals, a and b, namely x = a + b or x = ab. In the x = a + b example, a can be fluorescence background and b the signal of interest, whose statistics are to be learned from the measured x. Similarly, when writing x = ab, a can be the illumination intensity and b the intensity due to the fluorescent molecules of interest. Yet dividing or subtracting stochastic signals amplifies noise, and we ask whether, using the statistics of a and the measurement of x as input, we can recover the statistics of b. Here, we show how normalizing flows can approximate the probability distribution over b, thereby avoiding subtraction or division altogether. This approach is implemented in our software package, NFdeconvolve, available on GitHub along with tutorials.
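The noise amplification motivating this work can be illustrated with a minimal numerical sketch. Below, we assume (hypothetically) Gaussian background a and signal b in the additive model x = a + b, and compare the true variance of b with the variance obtained by naively subtracting fresh draws of the background from the measurement; the distributions and parameters are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical setup: a is background (statistics assumed known),
# b is the signal whose statistics we wish to recover.
a = rng.normal(5.0, 1.0, n)      # background, std 1.0
b = rng.normal(2.0, 0.5, n)      # signal, true variance 0.25
x = a + b                        # measured realization

# Naive deconvolution: subtract independent fresh draws of the background.
a_fresh = rng.normal(5.0, 1.0, n)
b_naive = x - a_fresh

# Var(x - a') = Var(a) + Var(b) + Var(a') = 1 + 0.25 + 1 = 2.25,
# far exceeding the true Var(b) = 0.25: subtraction amplifies noise.
print(f"true Var(b):  {b.var():.3f}")
print(f"naive Var(b): {b_naive.var():.3f}")
```

Rather than subtracting, NFdeconvolve sidesteps this inflation by learning the probability distribution over b directly with a normalizing flow.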