Measurement of Spin-Density Matrix Elements in φ(1020) → KSKL
Overview
As part of my Ph.D. dissertation, I measured the Spin-Density Matrix Elements (SDMEs) of the φ(1020) meson using data collected by the GlueX Collaboration. This measurement was published in Phys. Rev. C.
Paper abstract
We measure the Spin-Density Matrix Elements (SDMEs) for the photoproduction of φ(1020) off the proton in its decay to K⁰ₛK⁰ₗ, using 105 pb⁻¹ of data collected with a linearly polarized photon beam with the GlueX experiment. The SDMEs are measured in nine bins of the squared four-momentum transfer t in the range −t = 0.15–1.0 GeV², providing the first measurement of their t-dependence for photon beam energies Eγ = 8.2–8.8 GeV. We confirm the dominance of Pomeron exchange in this region and put constraints on the contribution of other Regge exchanges. We also find that helicity amplitudes in which the helicities of the photon and the φ(1020) differ by two units are negligible.
Paper highlights
- Large event sample: ~6.5 × 10⁵ φ(1020) mesons were analyzed, a dataset orders of magnitude larger than those of prior measurements in this beam-energy range.
- First measurement of the t-dependence of φ(1020) SDMEs in this photon-energy region (8.2–8.8 GeV).
- High-precision extraction of nine independent SDMEs using a linearly polarized beam, allowing the separation of natural- vs. unnatural-parity-exchange contributions and of helicity-flip vs. helicity-conserving amplitudes (the underlying angular-distribution formalism is sketched after this list).
- Confirmation that at low −t the production is consistent with s-channel helicity conservation (SCHC) and natural-parity exchange (NPE), whereas at higher −t deviations appear, in line with predictions from a Regge-based model by the Joint Physics Analysis Center (JPAC).
- Quantitative constraints on possible contributions from non-Pomeron exchanges (such as π and η exchange) and on double-helicity-flip amplitudes, which are found to be negligible within uncertainties.
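For context, such measurements conventionally parametrize the vector-meson decay angular distribution for a linearly polarized photon beam in the Schilling convention, where θ and φ are the decay angles of the kaon, Φ is the angle between the photon polarization plane and the production plane, and Pγ is the degree of linear polarization. A standard form of this parametrization (quoted here for orientation, not verbatim from the paper) is:

```latex
% Schilling-type decay angular distribution for vector-meson photoproduction
% with a linearly polarized photon beam of polarization degree P_gamma.
\begin{align}
W(\cos\theta,\varphi,\Phi) &= W^0(\cos\theta,\varphi)
  - P_\gamma \cos 2\Phi \, W^1(\cos\theta,\varphi)
  - P_\gamma \sin 2\Phi \, W^2(\cos\theta,\varphi), \\
W^0(\cos\theta,\varphi) &= \frac{3}{4\pi}\Big[
    \tfrac{1}{2}(1-\rho^0_{00})
  + \tfrac{1}{2}(3\rho^0_{00}-1)\cos^2\theta \nonumber\\
  &\qquad - \sqrt{2}\,\operatorname{Re}\rho^0_{10}\,\sin 2\theta\cos\varphi
  - \rho^0_{1-1}\,\sin^2\theta\cos 2\varphi \Big], \\
W^1(\cos\theta,\varphi) &= \frac{3}{4\pi}\Big[
    \rho^1_{11}\,\sin^2\theta + \rho^1_{00}\,\cos^2\theta \nonumber\\
  &\qquad - \sqrt{2}\,\operatorname{Re}\rho^1_{10}\,\sin 2\theta\cos\varphi
  - \rho^1_{1-1}\,\sin^2\theta\cos 2\varphi \Big], \\
W^2(\cos\theta,\varphi) &= \frac{3}{4\pi}\Big[
    \sqrt{2}\,\operatorname{Im}\rho^2_{10}\,\sin 2\theta\sin\varphi
  + \operatorname{Im}\rho^2_{1-1}\,\sin^2\theta\sin 2\varphi \Big].
\end{align}
```

The nine measurable SDMEs in this expression (ρ⁰₀₀, Re ρ⁰₁₀, ρ⁰₁₋₁, ρ¹₁₁, ρ¹₀₀, Re ρ¹₁₀, ρ¹₁₋₁, Im ρ²₁₀, and Im ρ²₁₋₁) correspond to the nine independent SDMEs mentioned above, extracted in each −t bin.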
Technologies Used
- Data analysis: Led the data-exploration phase, developed data-quality metrics, validated the analysis approach with Monte Carlo studies, and drove the modeling and interpretation of the measurement.
- High Performance Computing: Used HPC clusters at Jefferson Lab, as well as clusters of multiple 32-core servers, to process terabytes of data and run Monte Carlo simulations. Managed workloads with Jefferson Lab's swif2 batch system and with high-throughput computing systems such as HTCondor.
- Programming: Worked with C++ and Python to process and study this dataset efficiently. Also developed a multithreaded command-line tool to automate and speed up several steps of the analysis workflow (a minimal sketch of such a tool appears after this list).
- Linux environment: The HPC clusters I worked on ran CentOS and, later, AlmaLinux.
- Fitting framework: Used the AmpTools framework to perform unbinned extended maximum-likelihood fits of an angular-distribution model, extracting the SDMEs in each −t bin; the formalism incorporates the angular dependence of the vector-meson decay, the beam polarization, and the production kinematics (a simplified illustration of such a fit follows below).
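To give a flavor of the workflow automation mentioned above, here is a minimal Python sketch of a parallel analysis driver. It is a hypothetical stand-in for the tool described in the Programming item, not the actual code: the `process_run` function, the `.root` file pattern, and the directory layout are placeholders, and it uses a process pool as one simple way to parallelize in Python.

```python
"""Minimal sketch of a parallel analysis driver (hypothetical, not the real tool)."""
import argparse
from concurrent.futures import ProcessPoolExecutor, as_completed
from pathlib import Path


def process_run(path: Path) -> str:
    # Placeholder for one analysis step on a single run file
    # (e.g., skimming, histogramming, or preparing fit inputs).
    return f"processed {path.name}"


def main() -> None:
    parser = argparse.ArgumentParser(description="Process run files in parallel.")
    parser.add_argument("indir", type=Path, help="directory containing input run files")
    parser.add_argument("-j", "--jobs", type=int, default=8, help="number of worker processes")
    args = parser.parse_args()

    files = sorted(args.indir.glob("*.root"))
    # Fan the per-run work out over a pool of workers and report
    # results as they complete.
    with ProcessPoolExecutor(max_workers=args.jobs) as pool:
        futures = {pool.submit(process_run, f): f for f in files}
        for future in as_completed(futures):
            print(future.result())


if __name__ == "__main__":
    main()
```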
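AmpTools itself is a C++ framework, so the following is only a simplified Python illustration of the core idea behind the fits: an unbinned maximum-likelihood fit of an angular distribution. It fits a single SDME, ρ⁰₀₀, to toy data drawn from the cos θ distribution obtained by integrating the angular distribution above over φ and Φ; the extended-likelihood term and the detector acceptance of the real analysis are omitted.

```python
"""Toy unbinned maximum-likelihood fit of one SDME (illustration only)."""
import numpy as np
from scipy.optimize import minimize_scalar


def pdf(costheta, rho000):
    # phi- and Phi-integrated decay distribution, normalized on [-1, 1]:
    # W(cos theta) = (3/4) [ (1 - rho000) + (3 rho000 - 1) cos^2(theta) ]
    return 0.75 * ((1.0 - rho000) + (3.0 * rho000 - 1.0) * costheta**2)


rng = np.random.default_rng(seed=1)

# Generate toy "events" by accept-reject for a chosen true value;
# 1.5 bounds the density for any rho000 in [0, 1].
true_rho = 0.2
costh = rng.uniform(-1.0, 1.0, size=200_000)
u = rng.uniform(0.0, 1.5, size=costh.size)
data = costh[u < pdf(costh, true_rho)]


def nll(rho000):
    # Negative log-likelihood of the unbinned sample.
    p = pdf(data, rho000)
    if np.any(p <= 0.0):  # guard against log(0) at parameter extremes
        return np.inf
    return -np.sum(np.log(p))


result = minimize_scalar(nll, bounds=(0.0, 1.0), method="bounded")
print(f"fitted rho^0_00 = {result.x:.3f} (true value {true_rho})")
```

In the actual analysis, the fit covers the full three-angle distribution with the beam polarization included, and it is repeated independently in each −t bin.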