Past research summary
During my undergraduate degree, I developed BiPMAP, a toolbox that leverages the characteristics of the human visual system to improve the display design process. It has been deployed by CIVO and various companies for VR/AR and mobile-phone applications. While interning at Illumina, I built a framework that unifies the company's genetic sequencing services, now deployed as DRAGEN-Reports. Since graduating, I have been building data-driven models of the early primate visual system at unprecedented spatiotemporal scale and with new levels of rigor (fixational-transients). In addition, I have been working with event-based cameras, retina-inspired sensors designed to capture scene dynamics. Recently, I showed theoretically and empirically that the noise in event cameras can be used to recover the static scene, which was previously thought to require additional hardware (Noise2Image).
Acknowledgements
I am grateful to both of the following for sponsoring my research and studies: