
Does wisp._C.ops.hashgrid_interpolate_cuda provide gradients …
Sep 13, 2022 · It seems that NVlabs' tiny-cuda-nn framework supports the computation of the gradient w.r.t. the input coordinates: method kernel_grid_backward_input in …
tiny-cuda-nn doesn't return correct gradient w.r.t. input ... - GitHub
Feb 26, 2023 · Below is an example comparing the gradient returned by tiny-cuda-nn with numerical gradients computed for a simple field based on a hash grid. The difference between …
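A gradient comparison of the kind described can be sketched as follows. Since tiny-cuda-nn may not be installed, a plain differentiable function stands in for the hash-grid field; the `field` and `numerical_grad` names are illustrative, not part of any library:

```python
import torch

def field(coords):
    # Hypothetical stand-in for a hash-grid-based field: a smooth
    # scalar function of 2-D coordinates, so the check is self-contained.
    return torch.sin(coords[..., 0]) * torch.cos(2.0 * coords[..., 1])

def numerical_grad(f, coords, eps=1e-4):
    # Central finite differences w.r.t. each input coordinate.
    grads = torch.zeros_like(coords)
    for i in range(coords.shape[-1]):
        shift = torch.zeros_like(coords)
        shift[..., i] = eps
        grads[..., i] = (f(coords + shift) - f(coords - shift)) / (2 * eps)
    return grads

coords = torch.tensor([[0.3, 0.7], [1.1, -0.2]], requires_grad=True)
field(coords).sum().backward()          # analytic gradient via autograd
analytic = coords.grad
numeric = numerical_grad(field, coords.detach())
print(torch.allclose(analytic, numeric, atol=1e-3))  # True for a smooth field
```

For a real encoding, `field` would be replaced by the hash-grid module under test; a large analytic-vs-numeric gap at that point is exactly the symptom the issue reports.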
Trying extrinsics optimization on a grid-based NeRF
Apr 16, 2023 · I was wondering if it would be possible to add gradient computation for the coordinates as well, since it would be a great enhancement to make codebook-based …
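Extrinsics-style optimization of the kind discussed above only works if gradients flow through the input coordinates. A minimal sketch, with a hypothetical differentiable `field` standing in for a grid-based NeRF and a learnable translation `offset` standing in for a camera-pose parameter:

```python
import torch

torch.manual_seed(0)

def field(coords):
    # Hypothetical differentiable stand-in for a grid-based field.
    return torch.sin(coords).sum(dim=-1)

# Learnable translation playing the role of an extrinsics offset.
offset = torch.zeros(2, requires_grad=True)
coords = torch.rand(64, 2)
# "Ground truth" rendered at a shifted pose we want to recover.
target = field(coords + torch.tensor([0.1, -0.05]))

opt = torch.optim.Adam([offset], lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(field(coords + offset), target)
    loss.backward()   # needs d(field)/d(coords) to flow into `offset`
    opt.step()
print(offset)  # converges toward [0.1, -0.05]
```

If the encoding's backward pass does not produce gradients w.r.t. its input coordinates, `offset.grad` stays `None` and the pose never updates, which is why the feature request matters for codebook/grid methods.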
Releases: NVIDIAGameWorks/kaolin-wisp - GitHub
Highlights First kaolin-wisp release. Supports full optimization pipelines of neural radiance fields and signed distance functions. Also includes an interactive renderer to visualize the …
RuntimeError: element 0 of tensors does not require grad and …
Aug 21, 2022 · RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn #23
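This error typically means `backward()` was called on a result whose inputs were never marked as requiring gradients, so no graph was recorded. A minimal reproduction and fix in plain PyTorch (the tensor names are illustrative):

```python
import torch

x = torch.rand(4, 3)          # leaf tensor: no requires_grad, no grad_fn
loss = (x * 2).sum()
try:
    loss.backward()
except RuntimeError as e:
    print(e)  # "element 0 of tensors does not require grad and does not have a grad_fn"

# Fix: mark the input (or model parameters) as requiring gradients
# before the forward pass so autograd records the graph.
x.requires_grad_(True)
loss = (x * 2).sum()
loss.backward()
print(x.grad)  # each element's gradient is 2.0
```

In an optimization pipeline the same symptom usually means the loss was computed from detached tensors or from parameters excluded from the optimizer's graph.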