Persistent Identifier
|
doi:10.18738/T8/GV0ASD |
Publication Date
|
2021-05-27 |
Title
| ASTRIANet Data for: Python Computational Inference from Structure (PyCIS) |
Author
| Feuge-Miller, Benjamin (University of Texas at Austin)
Kucharski, Daniel (University of Texas at Austin)
Iyer, Shiva (University of Texas at Austin)
Jah, Moriba (University of Texas at Austin) |
Point of Contact
|
Feuge-Miller, Benjamin (University of Texas at Austin) |
Description
| Usage and Overview
This data set contains FITS-formatted files of annotated image data collected by the ASTRIANet telescope network, along with a technical specification document on the telescope sensor and an Excel-formatted table of example image annotations. Two full passes of data are provided as packaged tar.gz files containing contiguous observations of one LEO (Starlink-1422) and one GEO (Navstar-48) satellite over a 5-minute and 11-minute period respectively, and GIF-format video visualizations of each observation are provided. Individual zipped FITS files of single-frame data are also provided for the Starlink satellite. This data is provided as an example of typical ASTRIANet sensor capabilities, and may be used to run demonstrations of the related PyCIS object detection software. For more details on this software, the ASTRIANet telescope network, and the provided data and documents, please see the descriptions provided below.
ASTRIANet
ASTRIANet is a robotic telescope network operated by ASTRIA, designed to record time-series of optical observations of Anthropogenic Space Objects (ASOs) from all orbital regimes, LEO to GEO. The network supports the Space Domain Awareness (SDA) mission of ASTRIAGraph by closing the information lifecycle from raw observation to trajectory data fusion. Per its operational strategy, the network collects electro-optical observations with significant imaging and tracking noise, and does not collect traditional calibration frames. Accurate ASO detection and tracking algorithms are therefore required to enable Orbit Determination (OD) and bridge the ASTRIANet telescope network with the ASTRIAGraph database by delivering actionable trajectory information.
PyCIS Algorithm Details: A-Contrario Detection and Tracking
Under the so-called "a-contrario" paradigm, structures in data are considered "meaningful" if they are unlikely to occur by chance according to a background noise model. This "Helmholtz Principle" is formalized by a "Number of False Alarms" (or NFA) function which probabilistically defines a structure's “meaningfulness” given both a noise model assumption and some measurement function [2]. We use this dataset to demonstrate the feasibility of detecting ASOs from 16-Megapixel visual imagery observations under the “a-contrario” paradigm.
Our motivation in implementing an “a-contrario” detection scheme is twofold. Firstly, we seek to avoid traditional CCD sensor reductions (e.g. background subtraction, flat-fielding), which can reduce precision as well as operational observation time. Secondly, we seek to avoid making discrete point detections on each image frame, in contrast to many implementations of so-called “track-before-detect” schemes and other three-dimensional “a-contrario” algorithms.
Our "Python Computational Inference from Structure" (PyCIS) algorithm [1] builds upon existing low-level implementations of the “a-contrario” principle, which detect edge features in 2D image data by probabilistically detecting meaningful regions of gradient vector field alignments [3, 4]. We have extended these implementations to extract centerline trajectories of stars and other space objects moving over time. Maintaining the “a-contrario” framework, we then classify meaningful trajectory populations such as stars and sensor noise and identify potential anthropogenic space objects by rejection. For further details, please reference the related software.
Data
The curated set of data included in this submission is as follows:
- 20201220_45696_starlink-1422/*.zip: The raw optical image data from the ASTRIANet02 telescope (New Mexico Skies Observatory), stored as separate Flexible Image Transport System (FITS)-format files containing single Header/Data Units (HDUs) of single-exposure image frames and associated metadata. This header and image data can be processed using the Astropy module for Python (see the PyCIS GitHub for details). The ASTRIANet02 system follows a target track from publicly available two-line elements (TLEs) and collects unfiltered and uncorrected optical CCD data (see attached documents). Upon data collection, locally stored data is curated for quality and uploaded to the Corral storage system of the Texas Advanced Computing Center (TACC). This observation set includes 66 frames of data for a Starlink satellite collected on December 20, 2020 over a 5-minute observation, chosen for use in PyCIS demonstrations due to ASO visibility and motion patterns in the image plane.
- FULLPASS_20201220_45696_starlink-1422.tar.gz: A packaged collection of optical data of the Starlink-1422 LEO satellite collected on December 20, 2020 from the ASTRIANet02 telescope containing 66 individual FITS files over a 5-minute observation. While individual frames of FITS data may be downloaded for analysis (see above), this package is provided for users interested in the full time-series data.
- FULLPASS_20201224_26407_navstar-48.tar.gz: A packaged collection of optical data of the Navstar-48 GEO satellite collected on December 24, 2020 from the ASTRIANet02 telescope containing 93 individual FITS files over an 11-minute observation. This package is provided for users interested in full time-series data, and inspection of individual frames will require downloading the full pass.
- VISUAL_*.gif: Video visualizations of each FULLPASS data set. For visual comprehension, the videos use 4-pixel binning to reduce resolution to 1024x1024, are clipped and normalized to 5 standard deviations from the mean for improved contrast, and are animated at 5 frames per second.
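The preview transform used for the VISUAL_*.gif files (4-pixel binning, clipping at 5 standard deviations from the mean, and normalization) can be sketched as follows. This is an illustrative reimplementation, not the script that produced the GIFs; in practice each frame would first be loaded from its FITS file (e.g. via `astropy.io.fits`), and the function name and defaults here are assumptions.

```python
import numpy as np

def preview_frame(img, bin_factor=4, nsigma=5.0):
    """Reduce one image frame to an 8-bit preview: block-average binning
    (e.g. 4096x4096 -> 1024x1024 at bin_factor=4), clip to +/- nsigma
    standard deviations about the mean, then rescale to [0, 255]."""
    h, w = img.shape
    # Crop so both dimensions divide evenly, then average bin_factor x
    # bin_factor pixel blocks.
    binned = img[:h - h % bin_factor, :w - w % bin_factor].reshape(
        h // bin_factor, bin_factor, w // bin_factor, bin_factor
    ).mean(axis=(1, 3))
    mu, sigma = binned.mean(), binned.std()
    clipped = np.clip(binned, mu - nsigma * sigma, mu + nsigma * sigma)
    lo, hi = clipped.min(), clipped.max()
    return (255.0 * (clipped - lo) / (hi - lo + 1e-12)).astype(np.uint8)
```

Clipping before normalization keeps a few saturated star pixels from compressing the faint ASO signal into a handful of gray levels; the binning trades spatial resolution for a viewable file size.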
Documentation
Documentation on the sensor technical specifications and tracking information used to produce the dataset is included as follows:
- example_header.xlsx: Extracted header data from the 20201220013535730_45696.zip data file. Standard sensor information is provided, along with details on the object being tracked and the celestial environment, auto-populated by the telescope software during collection. The TLEs are sourced from Space Track or CelesTrak (for Starlink satellites). The RA_OBJ/DEC_OBJ coordinates are predictions from the associated TLE using SGP4, for use in validating detections. The header design complies with the Electro-Optical Space Situational Awareness (EOSSA) data product format [5].
- Questionnaire_sensor_NMSkies.docx: Technical specifications of the ASTRIANet telescope used to collect the observational data, including sensor details and station information.
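Header keywords such as RA_OBJ/DEC_OBJ can be accessed with the Astropy module mentioned above. The sketch below builds a minimal single-HDU FITS file in memory so it runs without the dataset; the header values are illustrative placeholders, not values from an actual ASTRIANet frame.

```python
import io
import numpy as np
from astropy.io import fits

# Stand-in for one ASTRIANet single-exposure frame: a single
# Header/Data Unit with a small image and example tracking keywords
# (values here are hypothetical, not from the real data).
hdu = fits.PrimaryHDU(data=np.zeros((64, 64), dtype=np.float32))
hdu.header["RA_OBJ"] = (123.456, "predicted RA of tracked object [deg]")
hdu.header["DEC_OBJ"] = (-12.345, "predicted Dec of tracked object [deg]")

buf = io.BytesIO()
hdu.writeto(buf)
buf.seek(0)

# Reading follows the same pattern as a downloaded *.fits frame.
with fits.open(buf) as hdul:
    ra = hdul[0].header["RA_OBJ"]
    dec = hdul[0].header["DEC_OBJ"]
    img = hdul[0].data.copy()  # copy so data survives file closure
```

For the real frames, the TLE-derived RA_OBJ/DEC_OBJ predictions give a coarse ground truth against which PyCIS detections can be compared.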
Usage
These datasets can be reused by policymakers, industry, researchers, and members of the public interested in compliance with international regulations for ASOs. The PyCIS algorithm and demonstrations which utilize this data set can be found at https://github.com/ut-astria/PyCIS [1]. (2021-11-12) |
Subject
| Engineering; Computer and Information Science |
Keyword
| Anthropogenic Space Objects
Space Domain Awareness
Optical Data
Detection Algorithm
Target Tracking |
Related Publication
| Feuge-Miller, B., “PyCIS - Python Computational Inference from Structure”, GitHub Repository, (2021) https://github.com/ut-astria/PyCIS |
Production Date
| 2020-12-20 |
Production Location
| ASTRIANet02 at New Mexico Skies Observatory |
Contributor
| Data Curator : Esteva, Maria |
Depositor
| Feuge-Miller, Benjamin |
Deposit Date
| 2021-05-27 |
Time Period
| Start Date: 2020-12-20 ; End Date: 2020-12-20 |
Data Type
| 16-Megapixel visual imagery; documentation |
Software
| PyCIS - Python Computational Inference from Structure, Version: 2021-06-12, https://github.com/ut-astria/PyCIS |
Other Reference
| [1] Feuge-Miller, B., “PyCIS - Python Computational Inference from Structure”, GitHub Repository, (2021): https://github.com/ut-astria/PyCIS; [2] Desolneux, A., Moisan, L., Morel, J.M., “Meaningful Alignments”, International Journal of Computer Vision, (2000), 40, 7-23, 10.1023/A:1026593302236; [3] von Gioi, R.G., Jakubowicz, J., Morel, J.M., Randall, G., “LSD: a Line Segment Detector”, Image Processing On Line, 2 (2012), pp. 35-55, http://dx.doi.org/10.5201/ipol.2012.gjmr-lsd; [4] Liu, C., Abergel, R., Gousseau, Y., Tupin, F., “LSDSAR, a Markovian a contrario framework for line segment detection in SAR images”, Pattern Recognition, Volume 98, (2020), 107034, ISSN 0031-3203, https://doi.org/10.1016/j.patcog.2019.107034; [5] Payne, T., Mutschler, S., Meiser, D., Crespo, R., and Shine, N., “A Community Format for Electro-Optical Space Situational Awareness (EOSSA) Data Products”, in Advanced Maui Optical and Space Surveillance Technologies Conference, (2014): https://ui.adsabs.harvard.edu/abs/2014amos.confE..90P/abstract |