A Unified Computational Framework for Temporal Reconstruction and Gap Mitigation in Gamma-Ray Burst Light Curves
This repository contains the implementation of a comparative framework for reconstructing Gamma-Ray Burst (GRB) light curves, specifically targeting the mitigation of temporal data gaps inherent to satellite observations. The framework integrates two distinct methodological paradigms: a deep learning approach utilizing an Attention-based U-Net architecture, and a statistical approach employing Quartic Smoothing Splines (QSS).
The primary objective is to enhance the fidelity of light curve feature extraction, specifically the determination of the plateau emission end time, when the temporal coverage is incomplete.
GRB light curves observed by instruments such as Swift (BAT/XRT) often suffer from temporal gaps due to orbital constraints, Earth occultation, or instrumental downtime. These gaps disrupt the continuity required for accurate determination of the plateau phase parameters.
The framework operates on light curve data comprising:
- Gamma-ray flux
- Asymmetric flux errors
This model adapts the U-Net architecture, originally designed for biomedical image segmentation, for 1D time-series reconstruction. The core innovation is the integration of attention gates (AGs) into the skip connections.
The network comprises an encoder-decoder structure:
- Encoder: Extracts hierarchical features at progressively lower temporal resolutions via Max Pooling.
- Bottleneck: Captures global context at the coarsest resolution.
- Decoder: Upsamples features using UpSampling1D layers.
Attention Gates (AGs) are inserted into the skip connections to suppress feature responses in irrelevant regions of the signal. For a skip-connection feature $x_i$ and a gating signal $g_i$ from the coarser decoder level, the attention coefficient is computed via additive attention:

$$\alpha_i = \sigma_2\left(\psi^{\top}\,\sigma_1\left(W_x^{\top} x_i + W_g^{\top} g_i + b_g\right) + b_{\psi}\right)$$

Where $\sigma_1$ is the ReLU activation, $\sigma_2$ is the sigmoid function, and $W_x$, $W_g$, $\psi$, $b_g$, $b_{\psi}$ are learnable parameters. The gated output $\hat{x}_i = \alpha_i \cdot x_i$ rescales the skip features before they are concatenated with the upsampled decoder features.
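The additive attention computation above can be sketched in plain NumPy (a minimal illustration of the mechanism, not the exact Keras layers used in Research.py; the parameter names mirror the equation):

```python
import numpy as np

def attention_gate(x, g, Wx, Wg, psi, bg, bpsi):
    """Additive attention gate on a 1D skip connection.

    x       : (timesteps, channels) skip-connection features
    g       : (timesteps, channels) gating signal from the decoder
    Wx, Wg  : (channels, inter) learnable projections
    psi     : (inter, 1) learnable projection to a scalar score
    bg, bpsi: biases
    """
    # sigma1 = ReLU over the summed linear projections
    q = np.maximum(x @ Wx + g @ Wg + bg, 0.0)
    # sigma2 = sigmoid yields attention coefficients alpha in (0, 1)
    alpha = 1.0 / (1.0 + np.exp(-(q @ psi + bpsi)))
    # rescale the skip features so irrelevant timesteps are suppressed
    return alpha * x
```

In the full model this gating is applied channel-wise with 1x1 convolutions before each skip concatenation in the decoder.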
Since the U-Net is deterministic, uncertainty is quantified via a stochastic error synthesis. The distribution of observed errors is fitted to a Normal distribution. During inference, synthetic noise is sampled and added to the mean prediction to generate a distribution of possible realizations. The 95% confidence interval is derived from the percentiles of these Monte Carlo samples.
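The stochastic error synthesis can be sketched as follows (a simplified illustration under the Normal-fit assumption described above; function and variable names are hypothetical, not the exact API of Research.py):

```python
import numpy as np

def monte_carlo_band(mean_pred, obs_errors, n_samples=1000, seed=0):
    """95% confidence band around a deterministic U-Net prediction.

    Fits the observed flux errors to a Normal distribution, draws
    synthetic noise realizations around the mean prediction, and reads
    off the 2.5th / 97.5th percentiles of the Monte Carlo samples.
    """
    mean_pred = np.asarray(mean_pred, dtype=float)
    rng = np.random.default_rng(seed)
    # Normal fit to the empirical error distribution
    mu, sigma = np.mean(obs_errors), np.std(obs_errors)
    # each realization = mean prediction + sampled synthetic noise
    noise = rng.normal(mu, sigma, size=(n_samples, mean_pred.size))
    samples = mean_pred[None, :] + noise
    lo, hi = np.percentile(samples, [2.5, 97.5], axis=0)
    return lo, hi
```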
This approach utilizes non-parametric regression to fit a smooth curve through the observed data points.
A smoothing spline of order $k$ minimizes the penalized weighted sum of squares:

$$\sum_{i=1}^{n} w_i \left(y_i - f(t_i)\right)^2 + \lambda \int \left(f^{(k-1)}(t)\right)^2 \, dt$$

Where:
- $y_i$ are the observed flux values at times $t_i$, with weights $w_i$ derived from the inverse measurement errors.
- $k=4$ denotes the quartic order (cubic penalty on curvature), providing a higher degree of smoothness compared to standard cubic splines ($k=3$).
- $\lambda$ is the smoothing parameter controlling the trade-off between data fidelity and smoothness.

The smoothing factor is scaled by a user-specified multiplier (the Spline Degree/Multiplier hyperparameter exposed in the application) to tune this trade-off for each light curve.
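A quartic smoothing spline of this kind can be fit with SciPy's `UnivariateSpline` (a minimal sketch on synthetic, purely illustrative light curve values; the actual preprocessing and smoothing-factor heuristics in Research.py may differ):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Illustrative light curve in log-log space (synthetic values)
t = np.linspace(0.0, 5.0, 40)                  # log10(time)
flux = -1.2 * t + 0.1 * np.sin(3 * t)          # log10(flux) with structure
err = np.full_like(t, 0.05)                    # per-point flux errors

# k=4 -> quartic spline; w = 1/sigma expresses residuals in units of sigma.
# s is the smoothing factor; scaling it by a multiplier trades data
# fidelity (small s) against smoothness (large s).
multiplier = 1.0
spl = UnivariateSpline(t, flux, w=1.0 / err, k=4, s=multiplier * len(t))

# Evaluate on a dense grid to fill the observational gaps
t_dense = np.linspace(t.min(), t.max(), 200)
flux_rec = spl(t_dense)
```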
To generate synthetic error bars for reconstructed points, the framework analyzes the empirical distribution of the input flux errors. It fits both Normal and Laplace distributions, selecting the best fit via log-likelihood maximization. This accounts for the often leptokurtic (heavy-tailed) nature of astrophysical errors which standard Gaussian assumptions fail to capture.
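The Normal-vs-Laplace model selection can be sketched with `scipy.stats` (a minimal illustration of the log-likelihood comparison; the function name is hypothetical):

```python
import numpy as np
from scipy import stats

def best_error_distribution(errors):
    """Fit Normal and Laplace to the empirical flux errors and keep
    the fit with the higher total log-likelihood (heavy-tailed error
    distributions favor the Laplace)."""
    fits = {}
    for name, dist in (("normal", stats.norm), ("laplace", stats.laplace)):
        params = dist.fit(errors)  # maximum-likelihood loc/scale
        fits[name] = (dist.logpdf(errors, *params).sum(), dist, params)
    best = max(fits, key=lambda k: fits[k][0])
    loglike, dist, params = fits[best]
    return best, dist, params

# Synthetic heavy-tailed errors: the Laplace fit should win
rng = np.random.default_rng(1)
errs = rng.laplace(0.0, 0.2, size=2000)
name, dist, params = best_error_distribution(errs)
```

The winning distribution is then sampled to assign synthetic error bars to the reconstructed points.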
It is highly recommended to use a virtual environment.
# Clone the repository
git clone https://github.com/your-username/grb-reconstructor.git
cd grb-reconstructor
# Create environment
python -m venv venv
source venv/bin/activate # Linux/Mac
# venv\Scripts\activate # Windows
# Install dependencies
pip install -r requirements.txt

The requirements.txt file lists the following dependencies:

streamlit>=1.28.0
numpy>=1.23.0
pandas>=1.5.0
matplotlib>=3.6.0
scikit-learn>=1.1.0
scipy>=1.9.0
tensorflow>=2.12.0
requests>=2.28.0
This application is secured with a password mechanism for restricted access. You must create a .streamlit/secrets.toml file in the root directory:
# .streamlit/secrets.toml
password = "your_secure_password_here"

Execute the Streamlit server via the terminal:
streamlit run Research.py

- Select Model: Use the sidebar to choose between "Attention U-Net" or "Quartic Smoothing Spline".
- Input Data: Provide a raw GitHub URL to a CSV file. The CSV must contain columns for Time, Flux, and corresponding positive/negative errors.
- Hyperparameters:
- Epochs/Batch Size: Relevant for the U-Net model training convergence.
- Spline Degree/Multiplier: Relevant for the QSS model to control smoothness vs. data fidelity.
The application generates:
- Reconstructed Plot: Visualizing observed points, the mean prediction curve, and the 95% confidence region.
- Downloadable CSV: Containing interpolated time steps, reconstructed flux values, and synthetic error margins.
- High-Res PDF: Vector-graphics format plot suitable for publication.
The input CSV structure should conform to the standard GRB light curve format. The script automatically detects columns using keyword matching.
| Recommended Column Names | Description |
|---|---|
| `t` or `time` | Time since trigger (seconds). |
| `flux` | Flux density. |
| `pos_flux_err` | Positive error bound on flux. |
| `neg_flux_err` | Negative error bound on flux. |
Note: Negative time or flux values are automatically filtered during preprocessing.
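The keyword-based column detection and the negative-value filter can be sketched as follows (an illustrative simplification; the keywords and matching rules in Research.py may differ):

```python
import pandas as pd

# Hypothetical keyword table: field -> substrings to search for
KEYWORDS = {
    "time":    ("t", "time"),
    "flux":    ("flux",),
    "pos_err": ("pos",),
    "neg_err": ("neg",),
}

def detect_columns(df):
    """Map required fields to CSV columns via case-insensitive
    substring matching, taking the first matching column per field."""
    mapping = {}
    for field, keys in KEYWORDS.items():
        for col in df.columns:
            if any(k in col.lower() for k in keys):
                mapping[field] = col
                break
    return mapping

df = pd.DataFrame({"time": [1.0, -2.0], "flux": [2.0, 3.0],
                   "pos_flux_err": [0.1, 0.1], "neg_flux_err": [0.1, 0.1]})
cols = detect_columns(df)
# Drop rows with non-positive time or flux, as noted above
df = df[(df[cols["time"]] > 0) & (df[cols["flux"]] > 0)]
```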
├── Research.py # Main Streamlit application logic
├── requirements.txt # Python dependencies
├── .streamlit/
│ └── secrets.toml # Authentication credentials (User created)
└── README.md # Documentation
If you use this code or the methodologies implemented herein, please cite the associated paper:
@article{KAUSHAL2025100519,
title = {Multi-Model Framework for Reconstructing Gamma-Ray Burst Light Curves},
journal = {Journal of High Energy Astrophysics},
pages = {100519},
year = {2025},
issn = {2214-4048},
doi = {https://doi.org/10.1016/j.jheap.2025.100519},
url = {https://www.sciencedirect.com/science/article/pii/S2214404825002009},
author = {A. Kaushal and A. Manchanda and M.G. Dainotti and K. Gupta and Z. Nogala and A. Madhan and S. Naqi and Ritik Kumar and V. Oad and N. Indoriya and Krishnanjan Sil and D.H. Hartmann and M. Bogdan and A. Pollo and J.X. Prochaska and N. Fraija and D. Debnath},
keywords = {γ-ray bursts, statistical methods, machine learning, light curve reconstruction},
}

This project is licensed under the MIT License - see the LICENSE file for details.
We acknowledge the use of public data from the Swift data archive. This implementation relies on the robust scientific stack provided by the Python community, specifically TensorFlow, SciPy, and Streamlit.