
I'm trying to fit a large (relative to GPU memory size) Gaussian Process model on my dataset using GPyTorch with KeOps. However, it looks like I'm not able to benefit from KeOps in reducing the memory footprint.

I started from the GPyTorch tutorial with KeOps, taken from GitHub here. When running the notebook, I get the following error message: OutOfMemoryError: CUDA out of memory. Tried to allocate 528.38 GiB. I don't think the notebook expects the user to have 500+ GB of GPU RAM to run.

What am I doing wrong? Or is the example outdated? Currently on GPyTorch 1.13.
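As a sanity check on the error message, some back-of-the-envelope arithmetic (an assumption on my part: the failed allocation is a dense n × n float64 kernel matrix, exactly the object KeOps is meant to avoid materialising) suggests the training set is in the hundreds of thousands of points:

```python
import math

# 528.38 GiB, the figure from the CUDA OutOfMemoryError message
bytes_requested = 528.38 * 1024**3

# Assuming float64 entries (8 bytes each) in a dense square kernel matrix
entries = bytes_requested / 8

# Side length n of an n x n matrix with that many entries
n = math.isqrt(round(entries))
print(f"A dense kernel matrix that size implies roughly {n:,} training points")
```

That order of magnitude (~250k points) is consistent with a full covariance matrix being built on the GPU instead of KeOps' lazy, streamed reduction.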


asked Nov 21, 2024 at 13:17 by stavoltafunzia

1 Answer

Found it; it was a silly mistake. I had not installed pykeops before running the code. Oddly enough, if pykeops is not installed, the KeOps kernels in gpytorch.kernels.keops fall back to non-KeOps kernels. In my case the fallback happened silently, with no warning (somehow the warning generated here was suppressed).

I figured this out by inspecting the source code. IMHO, gpytorch.kernels.keops should raise an exception when it is used without pykeops installed.
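Since the fallback is silent, a quick pre-flight check can catch it. A minimal sketch (using only the standard library; the message text is mine, not GPyTorch's):

```python
import importlib.util

def keops_available() -> bool:
    """Return True if the pykeops package can be imported."""
    return importlib.util.find_spec("pykeops") is not None

# Check before building any gpytorch.kernels.keops kernel, since
# a missing pykeops makes them fall back to dense kernels silently.
if keops_available():
    print("pykeops found; KeOps kernels should be used")
else:
    print("pykeops NOT found; gpytorch.kernels.keops will silently "
          "fall back to plain, memory-hungry kernels")
```

Running this once at the top of the notebook would have surfaced the problem immediately instead of via a 500+ GiB allocation failure.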
