Intern Emma Carrier 2024

Emma was born and raised on the Big Island of Hawai’i and is currently enrolled at Emory University. She graduated from Kealakehe High School in 2023 and is now pursuing a B.S. in data science with a minor in astronomy. She plans on moving back to her hometown after graduating from college, bringing some of her technical knowledge into the community. In her free time, she loves to try new sports, play video games, and work on projects with her clubs.

Home Island: Hawai’i Island

High School: Kealakehe High School

Institution when accepted: Emory University

Project Title: Improving MUSE/VLT Data Cube Post-Processing for Optimal Source Detection

Site: Gemini Observatory, Hilo, Hawai’i

Mentors: Emanuele Farina, Brian Lemaux, & Anniek Gloudemans

Project Abstract:

Integral Field Spectroscopy (IFS) is revolutionizing how we study the Universe by combining spectroscopic and imaging capabilities to obtain a spectrum for every spatial element within a specified field of view. However, the complexity of these instruments makes data reduction and post-processing challenging. Separating non-astronomical information (noise) from astronomical photons (signal) is essential for the detection and identification of astrophysical sources, like galaxies and quasars. The REQUIEM survey is making use of these IFS techniques with the MUSE instrument on the Very Large Telescope (VLT) to study the first quasars and their surroundings. This survey requires the raw data to be optimally reduced and post-processed to identify and characterize faint astrophysical sources in the field of view. This project aims to optimize “cleaning” procedures to improve noise characterization and remove artifacts in the data cubes, which can enhance the reliability of the source detection software and generate more accurate source catalogs. The success metrics for the cleaning procedures will be determined iteratively, with each data cube compared to similar studies to verify the results. The total purity and completeness of the detections will then be maximized by optimizing the detection of “fake” sources injected into the data cubes. These two parameters quantify our confidence that detections are real (purity) and the extent to which real sources are recovered (completeness). The end result of this project is a Python framework that performs cleaning, source detection, and additional thresholding on the data cubes and produces catalogs of sources with quantified purity and measured physical properties.
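
To make the purity/completeness trade-off concrete, here is a minimal sketch of a fake-source injection and recovery loop. Everything in it is an illustrative stand-in, assuming a simple NumPy mock: the cube dimensions, the injected source brightness, the toy detect_sources() thresholding step, and the matching radius are hypothetical choices, not the actual MUSE/REQUIEM pipeline.

import numpy as np

rng = np.random.default_rng(42)

# Mock MUSE-like data cube (wavelength, y, x) of pure Gaussian noise.
cube = rng.normal(0.0, 1.0, size=(100, 50, 50))

# Inject "fake" point sources: a small constant flux added to every
# wavelength slice at random spatial positions.
n_fake = 20
fake_xy = rng.integers(5, 45, size=(n_fake, 2))
for y, x in fake_xy:
    cube[:, y, x] += 0.5  # roughly 5 sigma after collapsing the cube

def detect_sources(cube, threshold):
    # Toy detector: flag spaxels whose wavelength-collapsed S/N exceeds
    # the threshold. A real pipeline would use dedicated detection software.
    snr_image = cube.sum(axis=0) / np.sqrt(cube.shape[0])
    ys, xs = np.where(snr_image > threshold)
    return np.column_stack([ys, xs]).astype(float)

def purity_completeness(detections, injected, match_radius=1.0):
    # Purity: fraction of detections that match an injected source.
    # Completeness: fraction of injected sources that were recovered.
    if len(detections) == 0:
        return 0.0, 0.0
    dist = np.linalg.norm(detections[:, None, :] - injected[None, :, :], axis=-1)
    purity = np.mean(dist.min(axis=1) <= match_radius)
    completeness = np.mean(dist.min(axis=0) <= match_radius)
    return purity, completeness

# Sweep the detection threshold: raising it improves purity (fewer false
# positives) but lowers completeness (faint injected sources are missed).
for threshold in (2.0, 3.0, 4.0, 5.0):
    detections = detect_sources(cube, threshold)
    p, c = purity_completeness(detections, fake_xy.astype(float))
    print(f"threshold={threshold:.1f}  purity={p:.2f}  completeness={c:.2f}")

In the real pipeline the detector and cross-matching would come from the survey’s actual software stack; the sketch only shows the shape of the optimization loop: inject, detect, match, and score.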