Sibaek Lee


Hi! 😀 I am a PhD student in Intelligent Robotics at Sungkyunkwan University, advised by Prof. Hyeonwoo Yu, where I am a member of the Lab of Artificial Intelligence and Robotics (LAIR).

As a researcher, I hope my work contributes positively to society, and I believe artificial intelligence is a step toward achieving that goal. My research focuses on using deep learning techniques to enhance robots' spatial perception. By improving their ability to perceive their environment accurately, we can enable robots to make better decisions, perform more complex tasks, and ultimately help people in various ways.

Research Interests

  • 🧠 Deep Learning: 3D Vision, Vision Language Models, Generative Models, Bayesian Learning
  • 🤖 Robotics: Simultaneous Localization and Mapping (SLAM), Robot Perception and Navigation

News

Sep 2025 🎉 Our paper “LAMP: Implicit Language Map for Robot Navigation” is accepted to RA-L 2025.
Aug 2025 🏆 Honored to receive the LG AI Research Best Paper Award at the Conference on Korean Artificial Intelligence Association (CKAIA) 2025.
Jul 2025 🎉 Our paper “Spatial Coordinate Transformation for 3D Neural Implicit Mapping” is accepted to RA-L 2025.
Jan 2025 🎉 Our paper “Bayesian NeRF: Quantifying Uncertainty With Volume Density for Neural Implicit Fields” is accepted to RA-L 2025.
Aug 2024 🏢 Joined Naver LABS (Vision Group) as a 3D Vision & Deep Learning Research Intern.
Jun 2024 🎉 Our paper “Just flip: Flipped observation generation and optimization for neural radiance fields to cover unobserved view” is accepted to IROS 2024.

Selected Publications


2025

  1. RA-L
    LAMP: Implicit Language Map for Robot Navigation
    Sibaek Lee, Hyeonwoo Yu, Giseop Kim, and Sunwook Choi
    IEEE Robotics and Automation Letters (RA-L), 2025
    Vision Language Model Robot Navigation Scene Representation Embedded Systems
  2. Preprint
    Efficient 3D Perception on Embedded Systems via Interpolation-Free Tri-Plane Lifting and Volume Fusion
    Sibaek Lee, Jiung Yeon, and Hyeonwoo Yu
    arXiv preprint arXiv:2509.14641, 2025
    3D Perception Embedded Systems
  3. RA-L
    Spatial Coordinate Transformation for 3D Neural Implicit Mapping
    Kyeongsu Kang, Seongbo Ha, Sibaek Lee, and Hyeonwoo Yu
    IEEE Robotics and Automation Letters (RA-L), 2025
    NeRF 3D Representation Gaussian Process
  4. RA-L
    Bayesian NeRF: Quantifying Uncertainty With Volume Density for Neural Implicit Fields
    Sibaek Lee, Kyeongsu Kang, Seongbo Ha, and Hyeonwoo Yu
    IEEE Robotics and Automation Letters (RA-L), 2025
    NeRF 3D Representation Bayesian Learning

2024

  1. IROS
    Just flip: Flipped observation generation and optimization for neural radiance fields to cover unobserved view
    Sibaek Lee, Kyeongsu Kang, and Hyeonwoo Yu
    In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2024
    NeRF 3D Representation Bayesian Learning

2023

  1. Preprint
    Necessity feature correspondence estimation for large-scale global place recognition and relocalization
    Kyeongsu Kang, Minjae Lee, and Hyeonwoo Yu
    arXiv preprint arXiv:2303.06308, 2023
    Loop Closing Place Recognition SLAM