  • on: Nov. 22, 2021
  • in: Eurographics STAR

Neural Fields in Visual Computing and Beyond

  • Yiheng Xie
  • Towaki Takikawa
  • Shunsuke Saito
  • Or Litany
  • Shiqin Yan
  • Numair Khan
  • Federico Tombari
  • James Tompkin
  • Vincent Sitzmann
  • Srinath Sridhar
equal advising
@inproceedings{xie2021neuralfield,
    title = { Neural Fields in Visual Computing and Beyond },
    author = { Xie, Yiheng and 
               Takikawa, Towaki and 
               Saito, Shunsuke and 
               Litany, Or and 
               Yan, Shiqin and 
               Khan, Numair and 
               Tombari, Federico and 
               Tompkin, James and 
               Sitzmann, Vincent and 
               Sridhar, Srinath },
    year = { 2021 },
    booktitle = { Eurographics STAR },
}

Recent advances in machine learning have created increasing interest in solving visual computing problems using a class of coordinate-based neural networks that parametrize physical properties of scenes or objects across space and time. These methods, which we call neural fields, have seen successful application in the synthesis of 3D shapes and images, animation of human bodies, 3D reconstruction, and pose estimation. However, due to this rapid progress, many papers exist but a comprehensive review and formulation of the problem has not yet emerged. In this report, we address this limitation by providing context, mathematical grounding, and an extensive review of the literature on neural fields. This report covers research along two dimensions:

In Part I, we focus on techniques in neural fields by identifying common components of neural field methods, including representations, architectures, forward maps, and generalization methods.

In Part II, we focus on applications of neural fields to problems in visual computing and beyond (e.g., robotics, audio).

Our review shows the breadth of topics already covered in visual computing, both historically and in current incarnations, demonstrating the improved quality, flexibility, and capability brought by neural field methods. Finally, we present a companion website that contributes a living version of this review that can be continually updated by the community.
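For readers new to the area, the sketch below illustrates the basic object the report surveys: a coordinate-based MLP that maps a continuous space-time coordinate to a physical quantity. This is a minimal illustration written for this page, assuming PyTorch; the class name, layer sizes, and choice of output quantity are illustrative and not prescribed by the report.

```python
# Minimal sketch of a neural field: an MLP that maps a space-time coordinate,
# e.g. (x, y, z, t), to a physical quantity such as signed distance or color.
# Framework, class name, and hyperparameters are illustrative choices only.
import torch
import torch.nn as nn


class CoordinateMLP(nn.Module):
    def __init__(self, in_dim=4, hidden_dim=256, out_dim=1, num_layers=4):
        super().__init__()
        layers, dim = [], in_dim
        for _ in range(num_layers - 1):
            layers += [nn.Linear(dim, hidden_dim), nn.ReLU()]
            dim = hidden_dim
        layers.append(nn.Linear(dim, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, coords):
        # coords: (N, in_dim) continuous coordinates -> (N, out_dim) field values
        return self.net(coords)


# Query the field at 1024 random space-time coordinates.
field = CoordinateMLP()
coords = torch.rand(1024, 4)   # (x, y, z, t) in [0, 1)
values = field(coords)         # one scalar field value per coordinate
```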