Variance as error for aperture photometry #1063

Open
ysBach opened this issue Aug 7, 2020 · 1 comment
Comments

ysBach (Contributor) commented Aug 7, 2020

Currently, `_prepare_photometry_data` / `do_photometry` take `error` as a standard-deviation map and then convert it internally to a variance map.
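
For context, here is a minimal sketch of the current calling pattern (the positions and sizes are made up; the `error` keyword is the 1-sigma uncertainty, which gets squared internally into a variance map):

```python
import numpy as np
from photutils.aperture import CircularAperture, aperture_photometry

data = np.random.normal(100.0, 10.0, size=(256, 256)).astype('float32')
sigma = np.full_like(data, 10.0)              # user-supplied stddev map
aperture = CircularAperture([(128.0, 128.0)], r=5.0)

# Internally photutils does roughly `variance = error ** 2` before summing.
table = aperture_photometry(data, aperture, error=sigma)
print(table['aperture_sum'], table['aperture_sum_err'])
```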

This could be simplified by allowing the user to provide a variance map instead of a standard-deviation map. It would be useful because the stddev path (building a stddev map and then squaring it again internally) takes more time than supplying the variance map directly (timings with IPython's `%timeit`):

```python
import numpy as np
np.random.seed(1234)
bias = 100
gain = 9.5
rdnoise = 10.1

for npix in [1024, 2048, 4096]:
    print(npix)
    # Simulate a frame: Poisson-noised signal (in electrons) converted
    # to ADU via the gain, plus a bias level and read noise.
    data2d = (np.random.uniform(1000, 30000, size=(npix, npix))
              + np.random.normal(scale=rdnoise, size=(npix, npix)))
    data2d = bias + np.random.poisson(data2d) / gain
    data2d = data2d.astype('float32')
    var = data2d / gain + rdnoise**2              # variance map
    sigma = np.sqrt(data2d / gain + rdnoise**2)   # stddev map

    %timeit data2d / gain + rdnoise**2            # build the variance map directly
    %timeit np.sqrt(data2d / gain + rdnoise**2)   # build the stddev map (user side)
    %timeit sigma**2                              # stddev -> variance (photutils side)

# 1024
# 448 µs ± 89.6 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
# 1.41 ms ± 5.11 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
# 649 µs ± 29.6 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
# 2048
# 4.77 ms ± 223 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
# 9.82 ms ± 316 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
# 4.07 ms ± 366 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
# 4096
# 21.8 ms ± 1.01 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
# 38.8 ms ± 1.02 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
# 16.9 ms ± 396 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
```

The first timing is the cost of preparing the variance map directly, while the sum of the last two is what the current version of photutils requires (the user builds a stddev map, and `_prepare_photometry_data` then squares it back into a variance map). Accepting a variance map would therefore save roughly 50-70% of the time spent on variance preparation. In my case (~1000 light-curve images from a 4K CCD in float32), that is about 35 ms * 1000 ~ 35 sec saved in `_prepare_photometry_data`. This is NOT a major improvement, but I think it is worth mentioning; I cannot come up with an example where it would be a major one (though I feel there must be some cases).

One possibility, imho, is to add a `variance` keyword to `aperture_photometry` and `_prepare_photometry_data`.
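
For illustration only, the branching inside `_prepare_photometry_data` could look roughly like the hypothetical helper below (neither the `variance` keyword nor this helper exists in photutils today):

```python
import numpy as np

def _prepare_variance(error=None, variance=None):
    """Hypothetical helper: return a variance map from either input.

    Exactly one of `error` (stddev map) or `variance` may be given.
    """
    if error is not None and variance is not None:
        raise ValueError('Specify only one of error or variance.')
    if variance is not None:
        return np.asanyarray(variance)       # use directly, no squaring needed
    if error is not None:
        return np.asanyarray(error) ** 2     # current behavior: square the stddev
    return None
```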

What do you think? If it is too minor an improvement, I will close the issue soon.

larrybradley (Member) commented

@ysBach I've thought about this in the past, but it wasn't a high priority. I did significantly improve the performance with stddev errors, but going to variance would allow for some further (smaller) gains. Many people do have stddev errors so that option makes sense. My thoughts were to add an uncertainty keyword that could accept many types of uncertainties (stddev, var, inverse-var, astropy.nddata.NDUncertainty, etc.) using the astropy.nddata.NDData uncertainty conventions.
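
As an illustration of that idea, the normalization step could look something like the sketch below, assuming the standard `astropy.nddata` uncertainty classes; the function name and the bare-array fallback are hypothetical:

```python
import numpy as np
from astropy.nddata import InverseVariance, StdDevUncertainty, VarianceUncertainty

def uncertainty_to_variance(uncertainty):
    """Hypothetical sketch: convert a supported uncertainty to a variance map."""
    if isinstance(uncertainty, VarianceUncertainty):
        return np.asanyarray(uncertainty.array)
    if isinstance(uncertainty, StdDevUncertainty):
        return np.asanyarray(uncertainty.array) ** 2
    if isinstance(uncertainty, InverseVariance):
        return 1.0 / np.asanyarray(uncertainty.array)
    if isinstance(uncertainty, np.ndarray):
        # Bare array: treat as a stddev map, like the existing `error` keyword.
        return uncertainty ** 2
    raise TypeError(f'unsupported uncertainty type: {type(uncertainty)!r}')
```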

In any case, the `error` (stddev) keyword has been in `aperture_photometry` for so long now that I'd rather not remove it. Any changes would likely require an additional keyword. I'll think about this more. Thanks!
