Neural Implicit Mapping via Nested Neighborhoods

University of Coimbra

Neural Implicit Mapping makes it possible to transfer detailed normals between neural implicit surfaces, and to non-neural surfaces such as parametric patches and meshes, without UV-mapping or UV-unwrapping.


[Feb 27th 2023] Page online.
[Feb 27th 2023] Paper available.
[Feb 27th 2023] Video available.


We introduce a novel approach for rendering static and dynamic 3D neural signed distance functions (SDFs) in real-time. We rely on nested neighborhoods of zero-level sets of neural SDFs, and mappings between them. This framework supports animations and achieves real-time performance without the use of spatial data structures. It consists of three uncoupled algorithms representing the rendering steps. The multiscale sphere tracing focuses on minimizing iteration time by using coarse approximations on earlier iterations. The neural normal mapping transfers details from a fine neural SDF to a surface nested in a neighborhood of its zero-level set. The mapping is smooth and does not depend on surface parametrizations. As a result, it can be used to fetch smooth normals for discrete surfaces such as meshes and to skip later iterations when sphere tracing level sets. Finally, we propose an algorithm for analytic normal calculation for MLPs and describe ways to obtain sequences of neural SDFs to use with the algorithms.
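The analytic normal calculation can be sketched in a few lines for a sine-activated MLP. The gradient of the network is a chain-rule product of small matrices (the same GEMMs used in the forward pass), so no automatic differentiation is needed. The sine activation and layer shapes below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def mlp_sdf_and_grad(x, weights, biases):
    """Evaluate a sine-activated MLP SDF at x and its analytic gradient.

    The running Jacobian J is updated layer by layer with matrix
    products (GEMMs); no automatic differentiation is used.
    """
    a = x
    J = np.eye(x.shape[0])                     # Jacobian starts as identity
    for W, b in zip(weights[:-1], biases[:-1]):
        h = W @ a + b                          # linear layer (GEMM)
        J = (np.cos(h)[:, None] * W) @ J       # d sin(h)/dx = diag(cos h) W J
        a = np.sin(h)
    W, b = weights[-1], biases[-1]             # linear output layer
    f = (W @ a + b).item()
    grad = (W @ J).ravel()
    return f, grad
```

The gradient of the SDF is the (unnormalized) surface normal; a quick finite-difference check confirms the closed-form product matches the numerical derivative.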


The objective is to render level sets of neural SDFs in real-time and in a flexible way. Given the iterative nature of sphere tracing, a reasonable way to increase its performance is to optimize each iteration or to avoid iterations altogether. The key idea is to use neural SDFs with a small number of parameters as approximations during the earlier iterations, and to map the normals of the desired neural SDF to avoid the later iterations. These ideas come from the following fact: if the zero-level set of a neural SDF f is contained in a neighborhood V of the zero-level set of another neural SDF, then we can map f into V. We formalize and analyze this idea through the nesting condition defined below. Taking advantage of such neighborhoods results in novel algorithms for sphere tracing and normal mapping, which may be used in a variety of applications. We also propose an algorithm to calculate analytic smooth normals of neural MLP SDFs through GEMM kernels, without any use of automatic differentiation (more details in the paper).
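The coarse-to-fine marching can be sketched with ordinary sphere tracing. Here two analytic spheres stand in for a coarse/fine neural SDF pair (the functions and the inflation value δ are illustrative assumptions): the cheap SDF, inflated by δ, is traced first, and the expensive SDF only finishes the march inside the neighborhood.

```python
import numpy as np

def sphere_trace(sdf, origin, direction, t=0.0, eps=1e-4, max_iter=64):
    """Standard sphere tracing along a ray, starting at parameter t."""
    for _ in range(max_iter):
        d = sdf(origin + t * direction)
        if d < eps:
            return t, True
        t += d
    return t, False

# Hypothetical stand-ins for a coarse/fine neural SDF pair: two spheres.
fine = lambda p: np.linalg.norm(p) - 1.0       # detailed surface
coarse = lambda p: np.linalg.norm(p) - 0.98    # cheap approximation
delta = 0.05                                   # inflation parameter

def multiscale_trace(origin, direction):
    # Step 1: march with the cheap SDF until the ray enters the
    # delta-inflated neighborhood {coarse <= delta} of the coarse surface.
    t, hit = sphere_trace(lambda p: coarse(p) - delta, origin, direction)
    if not hit:
        return None
    # Step 2: evaluate the expensive SDF only inside the neighborhood.
    t, hit = sphere_trace(fine, origin, direction, t=t)
    return t if hit else None

t = multiscale_trace(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
```

In this toy setup the ray from z = -3 hits the unit sphere at t ≈ 2, with the expensive SDF evaluated only after the cheap one has carried the ray close to the surface.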


Nesting Condition

Let S1, S2, S3 be a sequence of surfaces, pairwise close and with increasing detail. To render S2 using the multiscale sphere tracing, we first sphere trace the boundary of a neighborhood of S1 (gray area), resulting in q1. Then we continue to sphere trace S2, reaching q2. Now we can increase the detail using neural normal mapping: since q2 belongs to a (tubular) neighborhood of S3, we evaluate the normal N3 at q2 of a parallel surface of S3 (red dotted). Notice that these surfaces share the same normal field and, as a consequence, no parametrizations such as UV-coordinates are needed.
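The normal fetch in this step can be sketched as follows. The point q2 only needs to lie in a tubular neighborhood of the detailed surface: parallel level sets share the same normal field, so the normalized gradient of the detailed SDF at q2 is already the shading normal, with no UV-coordinates involved. The bumpy-sphere SDF is an illustrative stand-in for a detailed neural SDF, and central differences stand in for the paper's analytic gradient:

```python
import numpy as np

# Hypothetical detailed SDF (a bumpy sphere) standing in for a neural SDF.
def f3(p):
    bump = 0.03 * np.sin(8.0 * p[0]) * np.sin(8.0 * p[1])
    return np.linalg.norm(p) - (1.0 + bump)

def neural_normal(p, sdf, h=1e-5):
    """Fetch a shading normal at p from the gradient of the detailed SDF.

    p may sit anywhere in a tubular neighborhood of the zero-level set,
    since parallel level sets share the same normal field.
    """
    g = np.array([(sdf(p + h * e) - sdf(p - h * e)) / (2 * h)
                  for e in np.eye(3)])
    return g / np.linalg.norm(g)

# q2 sits near, but not exactly on, the zero-level set of f3.
n = neural_normal(np.array([0.0, 0.0, 1.02]), f3)
```

The same call works for points sampled on a coarse mesh, which is how detailed normals are fetched for discrete surfaces without UV-unwrapping.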
These algorithms are possible because of the nesting condition. It states that every ray that intersects a detailed surface must also intersect a sufficiently inflated coarser version of it. In the figure, Sj is a coarser version of Sj+1. To use Sj as a conservative approximation of Sj+1, we inflate it by a parameter δ. We then say that the SDF fj+1 associated with Sj+1 is nested in the SDF fj associated with Sj. Notice that this inflation operation is straightforward in the implicit setting: simply subtract δ from fj. Please check the paper for formal guarantees about the nesting condition.
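The inflation and a numerical nesting check can be sketched directly. The coarse/fine pair below (a unit sphere and a bumpy sphere) and the value of δ are illustrative assumptions; the point is that inflating an implicit surface is just a subtraction, and nesting holds whenever δ exceeds the detail amplitude:

```python
import numpy as np

# Hypothetical coarse/fine pair standing in for neural SDFs:
# Sj is the unit sphere, Sj+1 adds small bumps of amplitude 0.03.
coarse = lambda p: np.linalg.norm(p) - 1.0

def fine_surface_point(d):
    """Point of the bumpy surface Sj+1 along unit direction d."""
    bump = 0.03 * np.sin(8.0 * d[0]) * np.sin(8.0 * d[1])
    return (1.0 + bump) * d

delta = 0.05  # inflation parameter; must exceed the bump amplitude

# Inflating Sj in the implicit setting: subtract delta from its SDF.
inflated = lambda p: coarse(p) - delta

# Numerical nesting check: every sampled point of the detailed surface
# must lie inside the inflated neighborhood {inflated <= 0}.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(1000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
nested = all(inflated(fine_surface_point(d)) <= 0.0 for d in dirs)
```

Here every surface point satisfies coarse(p) ≤ 0.03 < δ, so the nesting condition holds; shrinking δ below 0.03 would break it.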



Neural Implicit Mapping via Nested Neighborhoods

Vinícius da Silva, Tiago Novello, Guilherme Schardong, Luiz Schirmer, Hélio Lopes and Luiz Velho

Paper preprint (PDF, 4.2 MB)
arXiv version
BibTeX
Video

Please send feedback and questions to Vinícius da Silva.


@article{dasilva2023neural,
	title = {Neural Implicit Mapping via Nested Neighborhoods},
	author = {da Silva, Vin\'icius and Novello, Tiago and Schardong, Guilherme and Schirmer,
		Luiz and Lopes, H\'elio and Velho, Luiz},
	journal = {arXiv:2201.09147},
	year = {2023},
	month = jun
}


We would like to thank Towaki Takikawa, Joey Litalien, Kangxue Yin, Karsten Kreis, Charles Loop, Derek Nowrouzezahrai, Alec Jacobson, Morgan McGuire and Sanja Fidler for licensing the code of the paper Neural Geometric Level of Detail: Real-time Rendering with Implicit 3D Surfaces and project page under the MIT License. This website is based on that page.

We also thank the Stanford Computer Graphics Laboratory for the Bunny, Dragon, Armadillo, Happy Buddha, and Lucy models, acquired through the Stanford 3D Scanning Repository.