Principal Applied Research Scientist & technical lead at Meta Reality Labs, driving R&D across motion tracking, sensor fusion, and on-device AI for Smart Glasses and AR/VR. Core expertise spans IMU & multi-modal sensor fusion, indoor/outdoor motion tracking, SLAM & non-visual localization, multi-agent trajectory prediction, and LLM-based agentic frameworks. PhD, Electrical & Computer Engineering — UT Austin.
Open to collaboration — I currently co-supervise and collaborate with PhD students and researchers on motion and agentic AI problems. Please reach out if you're interested in working together.
Real-time ML for edge devices — fitness tracking, AR/VR, sensor fusion, and trajectory analysis under strict memory and power constraints.
Deep learning models for multi-agent motion forecasting, SLAM, non-visual localization, and multi-level data fusion systems.
Full-stack sensor system development, from signal quality assessment to application abstraction: IMU, GNSS, magnetometer, and ultrasonic sensors.
LLM-driven swarm intelligence frameworks, automated hardware debugging pipelines, and autonomous engineer persona agents.
Technical lead across 4+ product groups and 6+ teams. Research in outdoor/indoor motion tracking, fitness AI, SLAM pipelines, and non-visual localization for Smart Glasses and AR/VR devices. Architected LLM-based swarm intelligence frameworks for product development.

Technical lead for the algorithms team. Developed real-time deep learning systems for object tracking, lane detection, road recognition, and traffic sign recognition for autonomous vehicles.
Designed an algorithm for processing raw ultrasonic sensor data to detect and track multiple objects using unsupervised deep learning — subsequently patented. Worked on static mapping using Extended Kalman Filters.
Designed, built, tested, and debugged software and web-based applications across multiple client projects.
Co-founded and ran an AR/VR startup while also working as a freelance machine learning consultant specializing in computer vision.
1,699+ citations across CVPR, ECCV, ICCV, IEEE PerCom, and WACV. Google Scholar
Trajectory prediction for autonomous and non-autonomous objects. Member of the Mobile Automation and Sensing Systems (MASS) lab.
Open to research collaborations, speaking engagements, and advisory opportunities in applied AI, sensor systems, and autonomous systems.