
monitor's Introduction

The DESC Monitor

Travis CI build status

Extracts the light curves of ("monitors") all time-variable objects of cosmological interest. The basic concept is a general-purpose light curve extraction tool that can be sub-classed to specialize in supernova monitoring or lensed quasar monitoring. In an ideal world, this code would merely take care of the book-keeping involved in assembling light curves in the appropriate format for modeling codes like SNCosmo and SLTimer, by simply querying the various DM Level 2 databases. However, we are aware that we may need to implement multi-object extended MultiFit (a.k.a. "SuperFit") routines in the Monitor if the Level 2 light curves do not provide sufficient accuracy. (This is a particular worry for the SLMonitor, which will need to deal with highly blended objects.) The Monitor is therefore being developed as "Level 3" code, following DM standards and module structure.

Setup and testing:

From bash:

$ source setup/setup.sh
$ nosetests

or use the c-shell alternative. The Monitor uses some DM stack code, notably the Butler; see the installation notes for help getting set up.

Setting up ssh tunnel for database access:

In order to produce light curves from the output of the Twinkles project, one needs access to the SQL database where the DM processed output is stored. To do that, one needs to set up an ssh tunnel. We use the same tools as for the connection to the UW LSST CATSIM Database, with instructions here. Follow the step at the beginning to install the necessary tools, then replace the step 1 code with the following command-line entry, with your NERSC username in the proper place:

ssh -L 3307:scidb1.nersc.gov:3306 [email protected]

Do not worry about step 2 and continue to step 3 where you are instructed to create a db-auth.paf file, but replace the code on the website with the following parameters:

database: {
  authInfo: {
    host: '127.0.0.1'
    port: 3307
    user: $db_username
    password: $db_password
  }
}

If you do not have the $db_username or $db_password and are interested in access, please contact a member of the Monitor team for more information.

Demo

We have a couple of example notebooks to help users get started with the Monitor:

  • depth_curve_example.ipynb

    • Provides an introduction to the Monitor's light curve construction tools and shows how to open a database connection to get data from NERSC. This is the best starting point if you want to try the Monitor for the first time.
  • simple_error_model.ipynb

    • An introduction to the analysis possible with the Monitor. This is where we show results from our latest analysis of the Twinkles project; other users can use it as a jumping-off point for their own analysis.

People

License etc

This is open source software, available under the BSD license. If you are interested in this project, please drop us a line via the hyperlinked contact names above, or by opening an issue.

monitor's People

Contributors

jbkalmbach, jchiang87, rbiswas4, drphilmarshall, brianv0, linan7788626


monitor's Issues

License information in source files, README etc

Hey @brianv0 - check out our poster child DESC code project! We used @jchiang87's template repo, and @jbkalmbach is working on the python source files. Would you mind please looking through these and adding appropriate license notes to each one, in the recommended locations and formats? The README might need improving as well. If you can give us a good example here, we'll follow it forever. Thanks! :-)

LightCurve class

@jbkalmbach

I propose we make a LightCurve class as the centerpiece of the Monitor code. We'll want to be able to do things like this:

import desc.monitor
Object = 2341626719828
we = desc.monitor.LightCurve()
we.build_lightcurve(of=Object)
we.visualize_lightcurve(style='interaction')

See #6 for thoughts on light curve visualization! :-)
I think this class is the key component of the Simplest Possible Implementation #1

Simplest possible implementation

I love a one-issue coding project. Let's extract a Twinkles light curve, Bryce!

  • Write LightCurve class #7
  • Get test dataset #4
  • Compute light curve quality #9

Comparison with SNLS/CFHTLS

Hi @DominiqueFouchez @KirkGilmore!

As you can see from #7, @jbkalmbach has implemented a very basic Monitor, and is working with @rbiswas4 and @jchiang87 to use it to extract light curves from the Twinkles Run 1 images/forced source catalogs. We are not doing any image differencing yet, let alone MultiFitting - that will come later.

Both of you have mentioned doing light curve analysis on CFHTLS data - starting from either the DESC-Reprocessed images and catalogs (Dominique) or the old SNLS data with the SNLS methods (Kirk). We should discuss these programs next week, but for now, I thought you'd like to start following the Monitor project, and add your preliminary thoughts to this thread.

Welcome! :-)

cc: @wmwv @saurabhwjha and @DarkEnergyScienceCollaboration/twinkles

monitor LightCurve non-fatal warnings and errors on NERSC

While trying to retrieve light curves from NERSC, @jbkalmbach and I got a non-fatal (for our work) error and warning. Does anyone know why we are getting these and whether they should be ignored?

  • When creating the ssh tunnel using the instructions in README:
    ssh -L 3307:scidb1.nersc.gov:3306 [email protected]
    we get the following at nersc:
ModuleCmd_Load.c(226):ERROR:105: Unable to locate a modulefile for 'Base-opts'

It does not seem to create a problem.

On my laptop I get the following:

from desc.monitor import LightCurve
lc = LightCurve()
lcs = lc.build_lightcurve_from_db(3245)

At this stage I get the warning:
WARNING: ErfaWarning: ERFA function "dtf2d" yielded 1 of "dubious year (Note 6)" [astropy._erfa.core]
However, it seems to be fine, in that I can at least get the light curve through:

lc.lightcurve

Speed up Monitor Light Curves for each Object

While trying to analyze the light curves of SNe using the Monitor, @jbkalmbach and I found that light curve retrieval was slow, and discussed how we could speed up the process.

We plan to speed up the process by using forcedSources and then joining with visit information either from the database or from OpSim outputs. This will automatically fix #46

Light curve quality metric

Reissuing from LSSTDESC/Twinkles#143 :

I suggest we use something like the rms difference between observed and true light curves, weighted by observation errors, to make a $\chi^2$-like quantity. Then, we can compute the number of sigma offset between the observation and the truth using the Gaussian (Fisher) approximation to the chi-squared distribution, to get $N_{\sigma} = \sqrt{2\chi^2} - \sqrt{2N_d}$, where $N_d$ is the number of datapoints.

we.compare_with_truth(reference='reference.fits')
we.compute_lightcurve_quality()
print("Observed light curve is",we.quality,"sigma from truth.")
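As a sketch of this metric (the function name and signature are hypothetical, not part of the Monitor API):

```python
import numpy as np

def lightcurve_quality(observed, truth, sigma):
    """Error-weighted chi^2 between observed and true fluxes, converted
    to a number-of-sigma offset via the Gaussian (Fisher) approximation
    N_sigma = sqrt(2*chi2) - sqrt(2*N_d)."""
    observed = np.asarray(observed, dtype=float)
    truth = np.asarray(truth, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    chi2 = np.sum(((observed - truth) / sigma) ** 2)
    return np.sqrt(2.0 * chi2) - np.sqrt(2.0 * observed.size)
```

A perfect match gives a negative value (since $\chi^2 = 0$), while a light curve offset from truth by many error bars gives a large positive one.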

Test observed light curves against references

@jchiang87 If you can take @rbiswas4's code and use it to make Monitor plots of an observed light curve with the reference light curve overlaid, you win the prize of being able to close this issue! You may need to upgrade the Monitor's visualization capabilities to do this - you can file that as an issue too, and hence double the size of your prize! :-)

Tiny example dataset

I think we need a very small data file checked in to this repo, to enable some basic tests to be run by Travis (and help development). @SimonKrughoff do you have a suitable ForcedSource table snippet that we could use for this purpose? I guess it needs at least two objects in it and at least a handful of epochs, and preferably multiple filters, but the objects needn't be anything special: all we're doing is validating at this point. Thanks!

Start Sphinx Documentation

As suggested by @drphilmarshall here!

@heather999 I believe we already have sphinx docstrings in some of our code in the Monitor (which should be using numpydoc). So, this may be another starting point for the sphinx documentation request in LSSTDESC/SLTimer#15. On the other hand, if SLTimer turns out to be an easier starting point, I would be happy to copy over the example from SLTimer!

Subtract DC level from light curves?

At the August 25th Twinkles weekly meeting, we identified a function we need the Monitor to perform: according to @rbiswas4 it is standard practice to subtract off the flux due to the host galaxy from SN forced photometry light curves. To implement this, we could have the Monitor find the "DC level" of a light curve (far from the SN brightening and fading) and subtract it from the light curve (propagating errors appropriately). We could avoid this if sncosmo performed fits that included a constant background level - does it?

As the seeing varies from visit to visit, the amount of galaxy flux brought into the flux measurements will change - but perhaps this can be taken to be an additional noise term? Advice would be most welcome before we code this, @rbiswas4 @wmwv
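A minimal sketch of the DC-level subtraction described above, assuming the "quiet" epochs far from the SN can be flagged by hand (the function name is hypothetical):

```python
import numpy as np

def subtract_dc_level(flux, flux_err, quiet_mask):
    """Estimate the constant host-galaxy ("DC") flux from the quiet
    epochs and subtract it, folding the uncertainty on the baseline
    mean into each point's error in quadrature."""
    flux = np.asarray(flux, dtype=float)
    flux_err = np.asarray(flux_err, dtype=float)
    quiet = np.asarray(quiet_mask, dtype=bool)
    dc = flux[quiet].mean()
    # standard error on the mean of the quiet epochs
    dc_err = flux[quiet].std(ddof=1) / np.sqrt(quiet.sum())
    return flux - dc, np.hypot(flux_err, dc_err)
```

This treats the DC level as constant; the seeing-dependent variation in galaxy flux discussed above would instead appear as an extra noise term.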

Simplest Possible Error Model

  • #61 Develop monitor functionality to get true fluxes from instance catalog files
  • #62 Match up an object's true fluxes to observed fluxes
  • #63 Develop 2-d bias map in depth, seeing space
  • #64 Develop 2-d variance map in depth, seeing space
  • #65 Make notebook to show bias maps and truth comparisons
  • #67 Use CcdVisit Table skyNoise to measure depth
  • #68 Have Monitor pull relevant Opsim data

The first major chunk of work in developing a working error model for Twinkles will be to develop a simple model outlined on the board pictured here. This is broken up into a list of issues outlined above.

Compute nightly fluxes and uncertainties

So far we have restricted the Twinkles light curves to one visit per filter per night - but in Run 3 we will get the full DDF stream, with multiple visits per night. The rough DM plan for the DDFs seems to be "do the same as we do in WFD, wherever possible" which means that we will probably just run the same Level 2 pipeline that we already developed, and so produce a ForcedSource for every visit, and ~10 per night. I think this means we need the Monitor to be able to compute nightly fluxes by combining the measurements taken in different visits during the same night. Comments welcome, SN aficionados! @rbiswas4 @wmwv

Can the Monitor write new tables to the Pserv database, @jchiang87 ? This would save us having to sum the same fluxes over and over again. Let's postpone implementing this until we've seen how fast it is to combine fluxes on the fly, though.
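One way to combine same-night visits, as a sketch (inverse-variance weighting; the function is hypothetical, not existing Monitor code, and taking the floor of the MJD as the night boundary is an assumption):

```python
import numpy as np

def combine_nightly(mjd, flux, flux_err):
    """Collapse all visits sharing a night (floor of the MJD) into a
    single inverse-variance-weighted flux and its uncertainty."""
    mjd = np.asarray(mjd, dtype=float)
    flux = np.asarray(flux, dtype=float)
    weight = 1.0 / np.asarray(flux_err, dtype=float) ** 2
    nights = np.floor(mjd).astype(int)
    combined = []
    for night in np.unique(nights):
        m = nights == night
        wsum = weight[m].sum()
        combined.append((int(night),
                         (weight[m] * flux[m]).sum() / wsum,
                         1.0 / np.sqrt(wsum)))
    return combined
```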

Module structure

@jchiang87

Hi Jim - Would you mind doing the necessary things to set this repo up so it can be summoned via

import desc.monitor

please? I'd like to see how this is done - normally I would put an __init__.py in monitor and insert statements like from source import *, so that I can then simply import monitor. But in LSSTDESC/Twinkles#105 you recommended renaming the monitor folder to python - so we might need some guidance in the README after you've done some moving and editing. Thanks!

Changes to TwinSN table?

Hi @rbiswas4,

I'm trying to run your https://github.com/DarkEnergyScienceCollaboration/Monitor/blob/master/examples/reference_lc.ipynb notebook, but I'm getting this error for cell 14:

DatabaseError                             Traceback (most recent call last)
<ipython-input-14-aeb7556a9869> in <module>()
----> 1 ids = reflc.allIdinTable(chunksize=None)
      2 print(ids.astype(int).values.flatten())

/u/gl/jchiang/links/desc_projects/Monitor/python/desc/monitor/truth.pyc in allIdinTable(self, sqlconstraint, chunksize)
    272 
    273         x = pd.read_sql_query(query, con=self.dbConnection,
--> 274                               chunksize=chunksize)
    275         return x
    276 

/nfs/farm/g/desc/u1/LSST_Stack_2016-04-12/lsstsw/miniconda/lib/python2.7/site-packages/pandas/io/sql.pyc in read_sql_query(sql, con, index_col, coerce_float, params, parse_dates, chunksize)
    429     return pandas_sql.read_query(
    430         sql, index_col=index_col, params=params, coerce_float=coerce_float,
--> 431         parse_dates=parse_dates, chunksize=chunksize)
    432 
    433 

/nfs/farm/g/desc/u1/LSST_Stack_2016-04-12/lsstsw/miniconda/lib/python2.7/site-packages/pandas/io/sql.pyc in read_query(self, sql, index_col, coerce_float, params, parse_dates, chunksize)
   1597 
   1598         args = _convert_params(sql, params)
-> 1599         cursor = self.execute(*args)
   1600         columns = [col_desc[0] for col_desc in cursor.description]
   1601 

/nfs/farm/g/desc/u1/LSST_Stack_2016-04-12/lsstsw/miniconda/lib/python2.7/site-packages/pandas/io/sql.pyc in execute(self, *args, **kwargs)
   1574             ex = DatabaseError(
   1575                 "Execution failed on sql '%s': %s" % (args[0], exc))
-> 1576             raise_with_traceback(ex)
   1577 
   1578     @staticmethod

/nfs/farm/g/desc/u1/LSST_Stack_2016-04-12/lsstsw/miniconda/lib/python2.7/site-packages/pandas/io/sql.pyc in execute(self, *args, **kwargs)
   1562                 cur.execute(*args, **kwargs)
   1563             else:
-> 1564                 cur.execute(*args)
   1565             return cur
   1566         except Exception as exc:

pymssql.pyx in pymssql.Cursor.execute (pymssql.c:7483)()

DatabaseError: Execution failed on sql 'SELECT snid FROM TwinkSN': (207, "Invalid column name 'snid'.DB-Lib error message 207, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\n")

Has there been a change to that table?

Requirements of Monitor

I got a crash of desc.monitor.render_fits_image due to the lack of WCSAxes.

render_fits_image(test_open[1], title='post.fits')

The error message was clear about what needed to be done to fix this, and is good enough for interactive use. But would it be good to add this to some kind of optional requirements document? (This is actually invoked by astropy, so it must be an optional dependency of astropy.)
The error message I received was:

ImportError                               Traceback (most recent call last)
<ipython-input-100-b2932ff943e5> in <module>()
----> 1 render_fits_image(test_open[1], title='post.fits')

/Users/rbiswas/doc/projects/DESC/Monitor/python/desc/monitor/Display.pyc in render_fits_image(hdu, cmap, stretch, xlabel, ylabel, title, subplot, fig, norm)
     37     if fig is None:
     38         fig = plt.figure()
---> 39     axes = fig.add_subplot(subplot, projection=wcs)
     40 
     41     im = plt.imshow(hdu.data, norm=norm, cmap=cmap, origin='lower',

/usr/local/software/lib/python2.7/site-packages/matplotlib/figure.pyc in add_subplot(self, *args, **kwargs)
    985         else:
    986             projection_class, kwargs, key = process_projection_requirements(
--> 987                 self, *args, **kwargs)
    988 
    989             # try to find the axes with this key in the stack

/usr/local/software/lib/python2.7/site-packages/matplotlib/projections/__init__.pyc in process_projection_requirements(figure, *args, **kwargs)
     98         projection_class = get_projection_class(projection)
     99     elif hasattr(projection, '_as_mpl_axes'):
--> 100         projection_class, extra_kwargs = projection._as_mpl_axes()
    101         kwargs.update(**extra_kwargs)
    102     else:

/usr/local/software/lib/python2.7/site-packages/astropy/wcs/wcs.pyc in _as_mpl_axes(self)
   3045             from wcsaxes import WCSAxes
   3046         except ImportError:
-> 3047             raise ImportError("Using WCS instances as Matplotlib projections "
   3048                               "requires the WCSAxes package to be installed. "
   3049                               "See http://wcsaxes.readthedocs.io for more "

ImportError: Using WCS instances as Matplotlib projections requires the WCSAxes package to be installed. See http://wcsaxes.readthedocs.io for more details.
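One way to surface this kind of optional dependency early, as a sketch (the helper function is hypothetical, not part of the Monitor):

```python
import importlib

def check_optional(module, hint):
    """Return True if an optional dependency imports cleanly; otherwise
    raise an ImportError that points at the install instructions."""
    try:
        importlib.import_module(module)
        return True
    except ImportError:
        raise ImportError(
            "Optional dependency '%s' is missing: %s" % (module, hint))
```

For example, `check_optional('wcsaxes', 'see http://wcsaxes.readthedocs.io')` could run at the top of `render_fits_image`, so the requirement is documented in code as well as surfaced in the error message.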

Make ipython demo notebook run remotely at NERSC

For the Oxford DESC Meeting, Twinkles would like to have an open house session. An ipython demo notebook that could generate both light curves and postage stamp images on the fly would be a great tool for this. To get things running, @jchiang87 suggested running the notebooks at NERSC so that the images could be served up in the browser.

zero points in desc.monitor.LightCurve.build_lightcurve* need to be set correctly

The fluxes extracted from the forced source FITS catalogs using this code
https://github.com/DarkEnergyScienceCollaboration/Monitor/blob/master/python/desc/monitor/monitor.py#L114
need to have the zero points set for each visit from the calexp data. See
https://github.com/DarkEnergyScienceCollaboration/pserv/blob/master/python/desc/pserv/utils.py#L73

For the lightcurves built from the Level 2 ForcedSource table, since the units of fluxes in that table are nmgy, this line
https://github.com/DarkEnergyScienceCollaboration/Monitor/blob/master/python/desc/monitor/monitor.py#L167
should be

lightcurve['zp'].append(22.5)
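For reference, a zero point of 22.5 is what makes nanomaggie (nmgy) fluxes come out as AB-like magnitudes; a quick sketch of the conversion:

```python
import math

def nmgy_to_mag(flux_nmgy, zp=22.5):
    """Magnitude from a flux in nanomaggies: 1 nmgy corresponds to
    magnitude 22.5, so m = zp - 2.5*log10(flux)."""
    return zp - 2.5 * math.log10(flux_nmgy)
```

so `nmgy_to_mag(1.0)` gives 22.5, and brighter (larger) fluxes give smaller magnitudes.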

DM setup failure, notes needed

Hey @jbkalmbach the tests currently fail for me, see below - apparently I need a stack module called daf.persistence, but I have never needed that before. Do I need to "setup" something? Happy to edit the README once I am up and running!

bash-3.2$ nosetests .
E
======================================================================
ERROR: Failure: ImportError (No module named daf.persistence)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pjm/lsst/DarwinX86/anaconda/2.2.0/lib/python2.7/site-packages/nose/loader.py", line 414, in loadTestsFromName
    addr.filename, addr.module)
  File "/Users/pjm/lsst/DarwinX86/anaconda/2.2.0/lib/python2.7/site-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/Users/pjm/lsst/DarwinX86/anaconda/2.2.0/lib/python2.7/site-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/Users/pjm/work/stronglensing/LSST/DESC/Monitor/tests/test_Monitor.py", line 7, in <module>
    import desc.monitor
  File "/Users/pjm/work/stronglensing/LSST/DESC/Monitor/python/desc/monitor/__init__.py", line 1, in <module>
    from monitor import *
  File "/Users/pjm/work/stronglensing/LSST/DESC/Monitor/python/desc/monitor/monitor.py", line 7, in <module>
    import lsst.daf.persistence as dp
ImportError: No module named daf.persistence

----------------------------------------------------------------------
Ran 1 test in 0.035s

FAILED (errors=1)

Replace nose with pytest as the testing engine?

LSST is migrating to py.test as the test runner rather than unittest. Like nose, which we have been using in Twinkles and Monitor, py.test will work with tests written in the unittest framework as well. I think this is an attractive option because of:

  • continued support for tests written in the unittest or nose frameworks
  • new features (I believe) not present in nose that make tests easier to write, with less boilerplate than unittest, which I often find excessive for simple tasks

So, I think we have two things to get done
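To illustrate the boilerplate difference, here is the same one-line check written both ways (the test names are illustrative):

```python
import unittest

# unittest style: class scaffolding even for a single check
class TestLightCurveLength(unittest.TestCase):
    def test_length(self):
        self.assertEqual(len([1, 2, 3]), 3)

# py.test style: a bare function and a plain assert do the same job,
# and py.test will still collect and run the unittest class above
def test_length():
    assert len([1, 2, 3]) == 3
```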

Comparison with reference light curves

  • Identify PhoSim Catalog entry for each forced photometry #21
  • Organize PhoSim Catalog information (including object metadata) into light curve #22
  • 3rd party operation, leading to light curve plots #29

Here's a good idea from @rbiswas4 , reproduced from #1:

I have bits of code generating light curves from simulated instance catalogs as examples in https://github.com/lsst/sims_catUtils/blob/master/examples/SNCatalog_example.ipynb but putting them together in a sensible, organized manner (like the monitor) would be helpful for many purposes.

Thanks Rahul!

Test data

@jbkalmbach we need a small test dataset to develop against. Any DM ForcedSource catalog would do, for example, although it would be fun if the three objects in that catalog had amusing light curves. And I guess we need at least two filters, to make it interesting. What do you think?

Since bandpasses are in the visits, do we need the bandpass argument separately?

bandpasses=['u', 'g', 'r', 'i', 'z', 'y'],
visitLists=[[488076, 488756], [490142], [495735],
            [490162, 495753], [490187, 490844],
            [490867, 495799]],
mjdFile=os.path.join('../data/', 'selectedVisits.csv'))

I'm not sure we want to pass in both the bandpasses and the visits when the visits already contain the bandpasses. Also, if we decide to pass both in anyway, can we keep this flat, so that the bandpasses array has the same shape as visitLists?
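One flat alternative, as a sketch: a single mapping keyed by bandpass, so the bands are implied by the keys and the two arguments can never disagree in shape (visit numbers copied from the snippet above):

```python
# visits grouped per band in one structure
visits_by_band = {
    'u': [488076, 488756],
    'g': [490142],
    'r': [495735],
    'i': [490162, 495753],
    'z': [490187, 490844],
    'y': [490867, 495799],
}

# both of the original arguments are recoverable if still needed
bandpasses = list(visits_by_band)
visitLists = [visits_by_band[b] for b in bandpasses]
```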

Calibrate SNCosmo fluxes correctly

In the Monitor code so far we have the ability to output DM forced photometry flux information as light curves in each filter. However, we're not sure of the zeropoints or how to get them. This needs to be figured out to understand flux values correctly.

SLMonitor

Hey @jbkalmbach - it looks like our strong lens capabilities are falling behind a bit. Correcting this oversight is an Epic task. What else do we need to do, apart from the following?

  • #40: Use RefLightCurves code from @rbiswas4 to extract reference lensed quasar light curves
  • Sub-class the Monitor into an SLMonitor that knows it needs to extract 2, 3 or 4 objects and plot/analyze them together. I know there is not much to sub-class yet - but maybe there will be more once we start trying to use the SLMonitor in anger?
  • Produce demo notebook walking the user through an example, and have someone else test it

Measure depths of visits

Since we are interested in comparing truths and detected values as a function of observation parameters, we have to measure the sky noise. Let us obtain the five-sigma depth values from:

  • The Opsim database
  • PhoSim Centroid Files and log files
  • The DM analysis of the images.

The interface that we are thinking of is:

monitor  = Monitor()
monitor.measure_depth(*args)
# where `*args` might tell us to obtain this from the OpSim Database, and obsHistID,  or a centroid file.

We need a workaround for the sncosmo/astropy incompatibilities in the conda distribution of lsst-sims

This problem is causing our builds to fail at travis-ci, e.g.,

https://travis-ci.org/DarkEnergyScienceCollaboration/Monitor/builds/147903861

@heather999 I think you were able to roll back astropy to a previous version for the SLAC installation of v12_0 so that it would work with the sncosmo in the conda distribution. If so, then I'd like to update the .travis.yml install script to do the same. What's the procedure for that?

@danielsf Any chance of getting a new conda dist of lsst-sims that doesn't suffer from this issue soon-ish?

Prevent the leakage of imports into namespace

Currently, modules in the monitor package import several packages without defining an __all__, and then use a from module import * style. This causes a leakage of all imported package names into the monitor namespace. We need to change this.
