Uncovering the Magnetar Population with ZTF: Skysurvey Simulations of Type I Superluminous Supernova Detectability and Volumetric Rates
Dnr: NAISS 2025/22-1756
Type: NAISS Small Compute
Principal Investigator: Avinash Singh
Affiliation: Stockholms universitet
Start Date: 2025-12-17
End Date: 2027-01-01
Primary Classification: 10305: Astronomy, Astrophysics, and Cosmology
Webpage:


Abstract

Hydrogen-poor superluminous supernovae (SLSNe-I) are among the most luminous stellar explosions known, but their volumetric rates and progenitor channels remain poorly constrained because they are intrinsically rare and strongly affected by survey selection effects. SLSNe-I are leading candidates for tracing the birth of highly magnetised, rapidly rotating neutron stars (magnetars) and for probing massive star formation and chemical enrichment out to high redshift. Robust rate estimates therefore require both a well-defined observational sample and a detailed forward model of survey detectability.

This project will measure the volumetric rates of SLSNe-I in the local Universe (z < 0.5) using the Zwicky Transient Facility (ZTF), and use these results to inform forecasts for the Vera Rubin Observatory Legacy Survey of Space and Time (LSST). ZTF is a high-cadence, all-sky optical survey (47-square-degree field of view) that has operated since 2018 with an untargeted strategy, making it well suited for constructing a homogeneous sample of SLSNe-I with minimal host-galaxy bias. Spectroscopic classification by the Caltech/Palomar team provides a well-characterised reference sample on which to anchor the rate measurement.

The key technical challenge is to derive accurate detection efficiencies and selection functions for SLSNe-I in a real survey, given their diversity in light-curve shape, colour, and environment. To address this, we will use the skysurvey and simsurvey frameworks, together with sncosmo, to inject synthetic SLSNe-I into the real ZTF pointing history (Phases I and II, ~2500 days, typically two filters per night). For each realisation we will simulate up to 1,000,000 transients drawn from a parameterised luminosity function over 0.1 < z < 0.5, randomly sampling from a library of >100 SED templates and varying explosion parameters, host extinction, and sky position within the survey footprint. We will repeat these simulations for at least ten independent realisations (~10 million light curves in total) to suppress Monte Carlo noise.

The resulting selection functions will be used to infer volumetric SLSN-I rates from the observed ZTF sample, including systematic uncertainties from survey cadence, detection thresholds, and model diversity. We will then propagate these constraints to LSST-like survey configurations to predict SLSN-I yields, quantify the trade-off between LSST's lower cadence and greater depth, and assess the utility of SLSNe-I as probes of magnetar formation and massive star evolution at higher redshift.

Carrying out these simulations requires significant computing resources, because each synthetic transient must be propagated through thousands of survey epochs and processed through realistic detectability criteria. Our local benchmarks indicate a cost of a few CPU seconds per transient; for the ~10 million simulated light curves of the full campaign, this corresponds to roughly 1,000 to 10,000 core-hours. The workload is trivially parallel over transients and realisations, making it an excellent fit to the Tetralith architecture and enabling timely delivery of the selection functions and rate measurements that will directly support the exploitation of ZTF and the preparation for LSST.
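
To make the injection-recovery step concrete, the following minimal Python sketch realises SLSN-I-like light curves with sncosmo on a toy two-filter cadence standing in for the real ZTF pointing history and applies a simple detection cut. The 'nugent-hyper' template, the constant placeholder rate, the 3-day cadence, the assumed depth, and the two-point 5-sigma criterion are illustrative assumptions, not the project's actual SED library, footprint, or selection criteria; the production runs use skysurvey/simsurvey on the recorded ZTF pointings.

# Minimal injection-recovery sketch (illustrative assumptions only).
import numpy as np
from astropy.table import Table
import sncosmo

rng = np.random.default_rng(42)

# Toy survey cadence: g and r every 3 days over ~2500 days.
times = np.arange(0.0, 2500.0, 3.0)
n_epochs = 2 * times.size
obs = Table({
    "time": np.repeat(times, 2),
    "band": np.tile(["ztfg", "ztfr"], times.size),
    "zp": np.full(n_epochs, 30.0),                 # AB zero point (placeholder)
    "zpsys": np.full(n_epochs, "ab"),
    "gain": np.ones(n_epochs),
    # skynoise tuned so the 5-sigma limiting magnitude is ~20.5 (placeholder).
    "skynoise": np.full(n_epochs, 10 ** (-0.4 * (20.5 - 30.0)) / 5.0),
})

# Redshifts drawn from a constant placeholder rate of 3e-8 /Mpc^3/yr over
# 0.1 < z < 0.5, scaled to a 1000 deg^2 patch to keep the demo quick.
redshifts = list(sncosmo.zdist(0.1, 0.5, time=2500.0, area=1000.0,
                               ratefunc=lambda z: 3e-8))

# 'nugent-hyper' stands in here for the >100-template SLSN-I SED library.
model = sncosmo.Model(source="nugent-hyper")
params = []
for z in redshifts:
    model.set(z=z)
    # Peak absolute magnitude from a toy Gaussian luminosity function.
    model.set_source_peakabsmag(rng.normal(-21.0, 0.5), "bessellr", "ab")
    params.append({"z": z, "t0": rng.uniform(0.0, 2500.0),
                   "amplitude": model.get("amplitude")})

# Realise noisy light curves on the cadence and apply the detection cut.
lcs = sncosmo.realize_lcs(obs, model, params, trim_observations=True)
detected = [lc for lc in lcs if np.sum(lc["flux"] / lc["fluxerr"] > 5.0) >= 2]

print(f"injected: {len(lcs)}, detected: {len(detected)}, "
      f"efficiency: {len(detected) / max(len(lcs), 1):.2f}")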
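
The selection functions feed a standard efficiency-corrected rate estimator. A sketch of that bookkeeping, binned in redshift and with placeholder counts, efficiencies, and sky fraction, is given below; the binning and treatment of systematics in the actual analysis may differ.

# Efficiency-corrected volumetric-rate sketch, binned in redshift:
#   R(z_i) = N_obs(z_i) * (1 + z_i) / (eps(z_i) * V_c(z_i) * f_sky * T)
# where eps is the recovered fraction of injected events, V_c the comoving
# volume of the bin, f_sky the surveyed sky fraction, T the survey duration,
# and (1 + z_i) corrects for cosmological time dilation.
import numpy as np
import astropy.units as u
from astropy.cosmology import Planck18 as cosmo

z_edges = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
n_classified = np.array([12, 18, 9, 3])              # observed SLSNe-I per bin (placeholder)
n_injected = np.array([25000, 25000, 25000, 25000])  # injected per bin (placeholder)
n_recovered = np.array([14000, 9000, 4000, 1200])    # passing selection (placeholder)

eps = n_recovered / n_injected                       # selection efficiency per bin
z_mid = 0.5 * (z_edges[:-1] + z_edges[1:])
v_bin = cosmo.comoving_volume(z_edges[1:]) - cosmo.comoving_volume(z_edges[:-1])
f_sky = 0.6                                          # effective footprint fraction (placeholder)
t_survey = (2500.0 / 365.25) * u.yr

rate = n_classified * (1 + z_mid) / (eps * v_bin * f_sky * t_survey)
for zm, r in zip(z_mid, rate.to(u.Gpc ** -3 / u.yr)):
    print(f"z ~ {zm:.2f}: R ~ {r.value:.1f} Gpc^-3 yr^-1")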
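
Finally, a back-of-the-envelope check of the requested allocation, assuming the locally benchmarked cost of a few CPU seconds per transient and the planned ten realisations of one million transients each:

# Core-hour estimate for the full injection campaign.
n_realisations = 10
n_transients = 1_000_000
for sec_per_transient in (0.5, 1.0, 3.0):
    core_hours = n_realisations * n_transients * sec_per_transient / 3600.0
    print(f"{sec_per_transient:.1f} s/transient -> {core_hours:,.0f} core-hours")
# ~1,400 core-hours at 0.5 s/transient and ~8,300 at 3 s/transient,
# bracketing the 1,000 to 10,000 core-hour estimate quoted above.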