kcor-pipeline's Introduction

KCor Pipeline

The COronal Solar Magnetism Observatory (COSMO) K-coronagraph (K-Cor) is one of three proposed instruments in the COSMO facility suite. It is specifically designed to study the formation and dynamics of coronal mass ejections and the evolution of the density structure of the low corona. The K-Cor records the polarization brightness (pB) formed by Thomson scattering of photospheric light by coronal free electrons. The National Center for Atmospheric Research (NCAR), via the National Science Foundation (NSF), provided full funding for the COSMO K-Cor, which was deployed to the Mauna Loa Solar Observatory (MLSO) in Hawaii in September 2013, replacing the aging MLSO Mk4 K-coronameter.

This pipeline produces level 1 and level 2 data products from the raw instrument data. The level 1 product contains polarization brightness (pB) images of the corona and sky, pB of the sky only, and total intensity, while the level 2 product contains pB images with sky polarization removed.

The pipeline has a near real-time component, which produces fully calibrated level 2 pB images, and an end-of-day component, which produces averages, differences, and many engineering products.

Requirements

  • IDL 8 or later
  • CMake 3.1.3 or later
  • MySQL developer installation
  • Python 2.7+ (or 3.x) to run the command line utility scripts, including the simulators; the production pipeline does not strictly require Python

Installation

To build the KCor pipeline code, your system must have IDL, the MySQL client development package, and CMake 3.1.3 or later. Make sure these are installed on your system before continuing.

These instructions will work on Linux and Mac systems. It should be possible to install the KCor pipeline on Windows systems, but it is not described here.

Configuring for your system

To configure the KCor pipeline for your system, do the following from the top-level of the pipeline source code (adjust the location of your IDL installation and the location where you want to install the pipeline to suit your system):

cd kcor-pipeline
mkdir build
cd build
cmake \
  -DCMAKE_INSTALL_PREFIX:PATH=~/software/kcor-pipeline \
  -DIDL_ROOT_DIR:PATH=/opt/share/idl8.5/idl85 \
  ..

There are example configuration scripts, linux_configure.sh and mac_configure.sh, in the pipeline source code that are more detailed examples of the above configuration command.

Build and install

Next, to build and install the KCor pipeline, run:

cd build
make install

Run the KCor pipeline

Config file

The options of the pipeline are specified via a configuration file. See the configuration specification file kcor.spec.cfg in the config directory of the distribution for all the options and their documentation. The filename of the config file must match the pattern kcor.[NAME].cfg with a name such as "production", "latest", or "geometry-fix". These configuration files must be placed in the config/ directory.

All files with the cfg extension in the config directory will be copied into the installation during a make install.
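The kcor.[NAME].cfg naming rule above can be expressed as a quick pattern check. The regular expression below is an illustration of the stated convention, not code from the pipeline:

```python
import re

# kcor.[NAME].cfg -- NAME is something like "production", "latest", or
# "geometry-fix"; the allowed character set is an assumption.
CONFIG_NAME_RE = re.compile(r"^kcor\.(?P<name>[A-Za-z0-9_-]+)\.cfg$")

def config_name(filename):
    """Return the NAME portion of a config filename, or None if no match."""
    m = CONFIG_NAME_RE.match(filename)
    return m.group("name") if m else None
```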

Process a day

For example, to process the data from 20220712 with the kcor.latest.cfg configuration file use the kcor utility script in the bin/ directory of the installation:

kcor process -f latest 20220712

Creating the configuration file, in this case kcor.latest.cfg, is the main work in running the pipeline.

Code for KCor pipeline

Directories

  • analysis for routines to perform various analyses of KCor data
  • bin for shell and Python scripts
  • cmake for CMake modules
  • cme_detection for code for the automated CME detection pipeline
  • config for configuration files
  • gen for non-KCor-specific MLSO IDL routines used in the KCor pipeline
  • hv for helioviewer specific IDL code
  • lib for 3rd party IDL routines used in KCor pipeline
  • observing for KCor-related observing code
  • resources for data files such as color tables used in KCor pipeline
  • scripts for various scripts to be run on processed data
  • src for KCor pipeline IDL code
  • ssw for SSW IDL routines used in KCor pipeline
  • stream for code to remove aerosols in real-time from stream data
  • unit for unit tests

kcor-pipeline's People

Contributors

detoma, jburkepile, kolinski, mgalloy, wafels


kcor-pipeline's Issues

Parse calibration text file

For example, a calibration text file is at:

/export/data1/Data/KCor/process/20161127/calibration_files.txt

The filenames in the first column and the exposure in the second column will be needed.
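A minimal sketch of the requested parser, assuming whitespace-separated columns with the filename first and the exposure second (the actual column layout should be checked against a real calibration_files.txt):

```python
# Sketch of parsing a calibration_files.txt listing; the two-column,
# whitespace-separated layout is an assumption.
def parse_calibration_list(lines):
    """Return a list of (filename, exposure) tuples from the text lines."""
    entries = []
    for line in lines:
        tokens = line.split()
        if len(tokens) < 2:
            continue  # skip blank or malformed lines
        entries.append((tokens[0], float(tokens[1])))
    return entries
```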

Complete camera non-linearity correction

  1. IG to complete acquisition of all camera data at various exposures and digital number (DN) values (SCOTT SEWELL / PHIL)
  2. Complete analysis and fitting of data to create pixel-by-pixel correction mapping for each of 4 cameras. (ALFRED)
  3. Implement correction in data pipeline.

Verify platescale before and after new camera stages

We have good alignment of KCor structures in low corona with other data (AIA, CoMP). Need to compare data in outer field with LASCO C2 (and possibly SWAP as well) to verify platescale.
Need to do this with data before and after installation of new camera stages in June 2015.

Reprocess script

We need a script to reprocess a day.

This script should do all the reprocessing for a day. For the database, this needs to:

  1. clear data for a day before starting
  2. generate a list of files with which to call the insert scripts
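The two steps above can be sketched as follows. The table names and the raw-file naming convention (YYYYMMDD_HHMMSS_kcor.fts.gz) are assumptions, and real code should use parameterized queries rather than string interpolation:

```python
import os

# Hypothetical table list for illustration only.
TABLES = ["kcor_img", "kcor_eng", "kcor_sci"]

def clear_day_statements(obs_day):
    """SQL to remove a day's rows before reprocessing (sketch only)."""
    return ["delete from %s where obs_day = '%s'" % (t, obs_day)
            for t in TABLES]

def raw_files_for_day(raw_dir, obs_day):
    """Sorted list of a day's raw files to feed to the insert scripts."""
    return sorted(f for f in os.listdir(raw_dir)
                  if f.startswith(obs_day) and f.endswith("_kcor.fts.gz"))
```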

Create NetCDF files

This requires:

  1. Switch to corrected calibration (Alfred's fix to bugs).
  2. Remove the phase angle in the transformation from Cartesian to tangential.
  3. Adjust scaling for the NRGFs.
  4. Add back a kludge value to prevent the background from going negative from sky polarization removal.

Create insert scripts

There are six scripts, corresponding to the KCor tables:

  • kcor_img_insert
  • kcor_cal_insert
  • kcor_sw_insert
  • kcor_eng_insert
  • kcor_hw_insert
  • kcor_mission_insert

Create GBU for calibration files

@jburkepile will provide min/max values for darks, flats, and the 8 (or 9) different angles for the two cameras. The GBU code just needs to check that the median for each image is in the allowed range for its type.
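The median check described above can be sketched as follows; the min/max limits here are placeholders, not the real values to be provided:

```python
# Placeholder limits per calibration type -- NOT the real allowed ranges.
LIMITS = {"dark": (0.0, 20.0), "flat": (1000.0, 30000.0)}

def median(values):
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def is_good(image_pixels, cal_type):
    """True if the image median falls in the allowed range for its type."""
    lo, hi = LIMITS[cal_type]
    return lo <= median(image_pixels) <= hi
```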

Add sine theta sky polarization correction

Sine (or cosine) theta correction removes the polarization change from horizon to zenith relative to the Sun's position. This should be removed in the geo-coordinate system.

Scan circular profile at many radii

Find the mean of a circular profile at each of about 50 prescribed radii of a single L1 image. For now: choose the 20th image unless there are fewer than 20 images, then choose the first image.

This should be added to the kcor_sci database table.
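The scan described above can be sketched as sampling each circle at fixed angular steps and averaging; the nearest-neighbor sampling and 360-point angular resolution are assumptions:

```python
import math

def circular_profile_means(image, xc, yc, radii, n_angles=360):
    """Mean intensity on a circle at each radius (image is a 2D list [y][x])."""
    means = []
    for r in radii:
        total = 0.0
        for i in range(n_angles):
            theta = 2.0 * math.pi * i / n_angles
            # nearest-neighbor sampling; real code might interpolate
            x = int(round(xc + r * math.cos(theta)))
            y = int(round(yc + r * math.sin(theta)))
            total += image[y][x]
        means.append(total / n_angles)
    return means
```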

Create subtraction and direct daily movies in GIF and MPEG

Joan says:

We can use mencoder to convert GIF movies to MPGs, which are a lot smaller than GIF animations. Let's talk in October about creating subtraction and direct K-Cor daily movies in both GIF and MPG formats in the KCor pipeline.

P.S. Here are the commands to create GIF and MPG movies. Giuliana sent me this command for creating a GIF movie from GIF images:

 convert -delay 10 -loop 0 file*gif animation.gif   (creates gif movie called animation.gif)

To convert to an mpg type:

mencoder animation.gif -ovc lavc -lavcopts vcodec=mpeg4 -o animation.mpg

  • direct daily movies
  • subtraction movies

Verify script

A script like CoMP's COMP_VERIFY would be nice for KCor.

Calibrated data engineering plots

Daily plots:

  1. intensity averaged over a fixed height vs time (for a couple radii)
  2. daily radial gradient plot, radius vs calibrated intensity (from a good average image for the day, see issue #36)

Missing keywords set to null

This refers to inserting values into the database. For example, in the SGS table, SGSRAZR and SGSDECZR were not present in the earlier part of the mission.

2 minute average and replace NRGF with 2 minute average NRGF

In the end-of-day processing, create a two minute average at the same times NRGF images are created. Also, replace existing NRGFs with an NRGF of the two minute average files. Create FITS and fullres/lowres GIFs for each of these files.

Steps

  • create two minute average FITS and fullres/lowres GIFs
  • create daily average FITS and fullres/lowres GIFs
  • distribute average FITS and GIF files
  • add the average files to the kcor_img table?
  • set num_kcor_pb_avg_fits, num_kcor_pb_dailyavg_fits, num_kcor_pb_avg_lowresgif, and num_kcor_pb_avg_fullresgif fields of mlso_numfiles table
  • delete existing NRGF FITS/GIF files in level1, archive, fullres, and cropped directories
  • delete existing NRGF files, i.e., producttype = 'nrgf', in kcor_img table
  • create a new NRGF file corresponding to each average file, including the daily average
  • distribute new averaged NRGF FITS and GIF files
  • add the new averaged NRGF files to the kcor_img table
  • correct num_kcor_nrgf_fits, num_kcor_nrgf_lowresgif, num_kcor_nrgf_fullresgif, and num_kcor_nrgf_dailyavg_fits entries in the mlso_numfiles table of the database
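The two-minute grouping in the steps above can be sketched as binning L1 files by observation time; the filename time format (YYYYMMDD_HHMMSS) is an assumption:

```python
import datetime

def two_minute_bins(filenames):
    """Group files into two-minute bins keyed by the bin start time."""
    bins = {}
    for f in sorted(filenames):
        t = datetime.datetime.strptime(f[:15], "%Y%m%d_%H%M%S")
        # round down to the even minute that starts this two-minute bin
        key = t.replace(second=0, minute=t.minute - t.minute % 2)
        bins.setdefault(key, []).append(f)
    return bins
```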

Use previously created cal files

This requires an epoch value, USE_PIPELINE_CALFILES, indicating when to start doing this. If this is set, search back in time to find the most recent cal file and use it.
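The backward search can be sketched as follows; the lookup callback and the one-year search window are assumptions:

```python
import datetime

def most_recent_cal_date(start_date, has_cal_file, max_days=365):
    """Walk backward from start_date to the most recent day with a cal file."""
    d = start_date
    for _ in range(max_days):
        if has_cal_file(d):
            return d
        d -= datetime.timedelta(days=1)
    return None  # nothing found within the search window
```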

Crash when cme_movies does not exist

Reported by Ben:

In kcor_cme_det_movie 102 error opening file>
/export/data1/Data/KCor/cme_movies/kcor_latest_cme_detction.gif

Here it looks like the problem is that the cme_movies directory does not exist.

Call calibration in the end of day run

The calibration should be performed at the end of the day run.

There might not be calibration from the day, so the calibration routines should handle that situation gracefully.

Amend FITS keywords for calibrated L2 data

Add

  • 'method' or 'technique' keyword for averaging
  • 'contrast' or some keyword to capture 'nrgf' (PRODUCT captures this)

Change scaling to units of B/Bsun

  • BSCALE to 1.
  • Change pipeline. We are currently multiplying by 1000 and saving as 16-bit integers. We want to multiply data by 10^-6 so they are in units of B/Bsun. Data will be saved as floats. Data values should be in the range of 10^-6 to 10^-9.
  • change BUNIT keyword value to "B/Bsun"
  • gain is in units of 1e-6 B/Bsun, correct for that when applying
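A worked sketch of the rescaling above: a calibrated value in units of 1e-6 B/Bsun becomes a float in B/Bsun. The sample value is illustrative only:

```python
def to_b_over_bsun(value_in_micro_bsun):
    """Convert from units of 1e-6 B/Bsun to B/Bsun (saved as a float)."""
    return value_in_micro_bsun * 1e-6
```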

Fix comment in BOPAL

  • change from 'Opal Transmission Calibration by Elmore at 775 nm' to 'Opal Transmission Calib. by Elmore at 775 nm'

Wavelength

  • Change WAVELNTH keyword (not FITS standard) to WAVECENT (leave as WAVELNTH)

Rsun

  • change RSUN to RSUN_OBS with the same comment
  • add R_SUN (radius of sun in KCor pixels) = RSUN_OBS / CDELT1
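A worked example of the proposed R_SUN keyword: the solar radius in KCor pixels from RSUN_OBS (arcsec) and the plate scale CDELT1 (arcsec/pixel). The numbers in the test are illustrative, not from a real header:

```python
def r_sun_pixels(rsun_obs_arcsec, cdelt1_arcsec_per_pixel):
    """R_SUN = RSUN_OBS / CDELT1 (solar radius in pixels)."""
    return rsun_obs_arcsec / cdelt1_arcsec_per_pixel
```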

Check values of sine(2*theta) sun-center correction

Additional parameters were added to the sine(2*theta) sky polarization fit to account for offsets between actual sun center and occulter center. However, the code sometimes gives large values for the offset between sun center and occulter center. Check the data to determine whether these offsets are reasonable.

Scan annulus at 1.3 and 1.8 R

The scans should be done at 0.5 degree resolution over plus/minus 0.01 R and performed on a single L1 image. For now: choose the 20th image unless there are fewer than 20 images, then choose the first image.

These scans will be added to the kcor_sci table daily.
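The annulus scan above can be sketched as averaging pixels whose radius lies within a band around r0, binned in 0.5-degree azimuthal sectors. Interpreting "plus/minus 0.01 R" as a fractional band around r0, and the exact binning, are assumptions:

```python
import math

def annulus_scan(image, xc, yc, r0, dr=0.01, bin_deg=0.5):
    """Mean intensity per 0.5-degree sector within r0 * (1 +/- dr)."""
    n_bins = int(360 / bin_deg)
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            r = math.hypot(x - xc, y - yc)
            if abs(r - r0) <= dr * r0:  # inside the annulus
                theta = math.degrees(math.atan2(y - yc, x - xc)) % 360.0
                b = int(theta / bin_deg) % n_bins
                sums[b] += value
                counts[b] += 1
    return [sums[i] / counts[i] if counts[i] else None
            for i in range(n_bins)]
```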

Automated, real-time CME detection

The CME detection program should be able to run all day from a cronjob. The current version can run as a batch job with all the data already present or as a GUI in real-time mode, but not as a combination of these, i.e., as a batch job where all the data is not already present.

Check impact of out-of-focus optics on camera 1

CONCLUSION: Impact is negligible.
Background: When new camera stages were installed in 2016, the transmitted camera (camera 1) could not be properly focused despite multiple attempts by Alfred, Dennis, and Ben. Dennis and Alfred concluded that the camera optics are out of focus. This can only be fixed by bringing the instrument down and shipping the box with optics and beam splitter back to HAO.
Test: I decided to check the impact of the out-of-focus camera on the data quality. I processed multiple days with only camera 0, only camera 1 and compared to data processed with both cameras.
Results: There is no noticeable difference except the noise in the far field is higher when using only one camera. This indicates the fuzziness from camera 1 is of the same magnitude as interpolating pixels during processing (distortion correction, rotating data for north up).
Here is an image processed with only camera 1 (out of focus camera):
20170820_193643_kcor

Here is the same image processed with only camera 0 (in focus camera):
20170820_193643_kcor

Here is the same image processed with both cameras (nominal processing):
20170820_193643_kcor

Make new engineering plots

  1. Reorganize/rename plots into KCor+{engineering, occulter centering, raw occulter} and SGS info.
  2. Add more SGS plots

Put L0 with corrected headers on HPSS

Must put originals in a specific location on the HPSS beforehand.

There are problems through 2015, but only 5-6 months besides lookup table issues.

Issue #95 handles fixes to the calibration data that need to be sent to the HPSS.

Create additional daily and cumulative engineering and science plots

The pipeline currently produces a few daily plots of information: modulator temperature, SGS DIMV, and SGS seeing values. Add software to the pipeline to create additional plots:

  1. values of calibrated pB at a fixed height integrated over 360 degrees for each image of the day
  2. values of calibrated pB vs. radius for each image of the day
  3. create longer time series plots showing average values of modulation matrix, daily pB vs height, daily pB at a fixed height.

Create realtime simulator

In order to test the realtime processing, we need a simulator that will "download" the data for a day into the raw directory as it would come in.

The simulator should assume that the first data file for the day corresponds to the current time when started and copy files into the new raw directory in realtime.

The interface should be:

kcor_simulator [-h] [--raw-dir RAW_DIR] [--batch-time BATCH_TIME] archivedir
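The scheduling logic behind the simulator can be sketched as shifting each file's observation time so the first file maps to the simulator's start time, preserving the original cadence. The filename time format (YYYYMMDD_HHMMSS) is an assumption:

```python
import datetime

def obs_time(filename):
    """Parse the observation time from a raw filename (assumed format)."""
    return datetime.datetime.strptime(filename[:15], "%Y%m%d_%H%M%S")

def release_schedule(filenames, start):
    """(filename, wall-clock release time) pairs preserving original cadence."""
    ordered = sorted(filenames, key=obs_time)
    offset = start - obs_time(ordered[0])
    return [(f, obs_time(f) + offset) for f in ordered]
```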

Verify stability of KCor calibration

  1. Plot the mean intensity of each type of raw calibration image for the entire KCor mission (~October 2013 to present) and compute the mean and standard deviation of each type. There are 3 types of calibration images: 1) Darks, 2) Flats, 3) calibration polarization (calpol) images. Calpol images are taken at 8 different angles (zero to 180 degrees in steps of 22.5 degrees. Zero and 180 degrees are equivalent). Each type and angle should be plotted separately.

  2. Use plots to determine stability of each type of calibration image (i.e. darks, flats, calpol at various angles) over the mission.

  3. Use plots to determine where changes in hardware and software caused large changes in the calibration values. Use these changes to define the start and end of each epoch.

  4. Visually inspect plots and remove outlying points; then recalculate mean and standard deviations for GBU (good-bad-ugly) assessment for each epoch and calibration image type.

  5. Create calibration files (.ncdf) for every day and check the day-to-day variability of the modulation matrix.

Determine cause of intensity offsets between Q and U polarization states

Q and U polarization states should have approximately the same background levels. Currently one state is consistently higher after the demodulation matrix is applied. Need to determine cause of this and fix. It may be due to cross talk.

  1. plot background levels of each state to quantify the differences.
