What is the GPU memory requirement for running prediction? Is it system dependent? If so, is there a simple way to estimate the memory required?
I was running inference for a complex with N_asym 6, N_token 2372, N_atom 18500, N_msa 4940 on a GPU with 24 GB of memory, and the job was killed by an OOM error:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 5.37 GiB. GPU
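For reference, here is the rough back-of-envelope estimate I tried myself. It assumes (this is my guess, not something taken from the code or docs) that peak activation memory is dominated by a pair representation of shape (N_token, N_token, c_pair) plus an MSA representation of shape (N_msa, N_token, c_msa) in bf16; the channel sizes and the copy factor are placeholder values, not the model's actual config:

```python
def estimate_activation_memory_gib(n_token: int, n_msa: int,
                                   c_pair: int = 128, c_msa: int = 64,
                                   bytes_per_elem: int = 2,
                                   n_copies: int = 4) -> float:
    """Very rough activation-memory estimate in GiB.

    Assumptions (not taken from the model config):
      - pair activations of shape (n_token, n_token, c_pair)
      - MSA activations of shape (n_msa, n_token, c_msa)
      - bf16 storage (2 bytes/element)
      - n_copies live copies of the activations (residual stream,
        attention temporaries, etc.) at peak
    """
    pair_elems = n_token * n_token * c_pair
    msa_elems = n_msa * n_token * c_msa
    return (pair_elems + msa_elems) * n_copies * bytes_per_elem / 1024**3


# The case from this issue: N_token=2372, N_msa=4940
print(f"~{estimate_activation_memory_gib(2372, 4940):.1f} GiB")  # ~11 GiB
```

With these guessed constants the estimate already lands around half of a 24 GB card before counting model weights and workspace buffers, which would be consistent with the OOM above, but I don't know whether these assumptions match the actual implementation.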