Commit e369846

Merge branch 'update_hugectr_version_24.4.0' into 'main'

Update new version: 24.4.0

See merge request dl/hugectr/hugectr!1530

2 parents 19e6017 + c809630

30 files changed: +59 −59 lines changed

HugeCTR/include/common.hpp (+2 −2)

@@ -58,8 +58,8 @@

 namespace HugeCTR {

-#define HUGECTR_VERSION_MAJOR 23
-#define HUGECTR_VERSION_MINOR 12
+#define HUGECTR_VERSION_MAJOR 24
+#define HUGECTR_VERSION_MINOR 4
 #define HUGECTR_VERSION_PATCH 0

 #define WARP_SIZE 32

README.md (+1 −1)

@@ -44,7 +44,7 @@ If you'd like to quickly train a model using the Python interface, do the follow

 1. Start a NGC container with your local host directory (/your/host/dir mounted) by running the following command:
 ```
-docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 **NOTE**: The **/your/host/dir** directory is just as visible as the **/your/container/dir** directory. The **/your/host/dir** directory is also your starting directory.

docs/source/hierarchical_parameter_server/hps_torch_user_guide.md (+3 −3)

@@ -33,12 +33,12 @@ HPS is available within the Merlin Docker containers, which can be accessed thro

 To utilize these Docker containers, you will need to install the [NVIDIA Container Toolkit](https://github.com/NVIDIA/nvidia-docker) to provide GPU support for Docker.

-The following sample commands pull and start the Merlin PyTorch container:
+The following sample commands pull and start the Merlin HugeCTR container:

-Merlin PyTorch
+Merlin HugeCTR
 ```shell
 # Run the container in interactive mode
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-pytorch:23.12
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 You can check the existence of the HPS plugin for Torch after launching the container by running the following Python statements:

docs/source/hierarchical_parameter_server/profiling_hps.md (+2 −2)

@@ -67,13 +67,13 @@ To build HPS profiler from source, do the following:
 Pull the container using the following command:

 ```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

 ```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 3. Here is an example of how you can build HPS Profiler using the build options:

docs/source/hugectr_user_guide.md (+1 −1)

@@ -83,7 +83,7 @@ The following sample command pulls and starts the Merlin Training container:

 ```shell
 # Run the container in interactive mode
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 ### Building HugeCTR from Scratch

hps_tf/hps_cc/config.hpp (+1 −1)

@@ -15,7 +15,7 @@
 */
 #pragma once

-// TODO: The configurations are not needed anymore in merlin-base:23.12
+// TODO: The configurations are not needed anymore in merlin-base:24.04
 // #include <absl/base/options.h>
 // #undef ABSL_OPTION_USE_STD_STRING_VIEW
 // #define ABSL_OPTION_USE_STD_STRING_VIEW 0

hps_tf/notebooks/hierarchical_parameter_server_demo.ipynb (+1 −1)

@@ -58,7 +58,7 @@
 "\n",
 "### Get HPS from NGC\n",
 "\n",
-"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

hps_tf/notebooks/hps_multi_table_sparse_input_demo.ipynb (+1 −1)

@@ -58,7 +58,7 @@
 "\n",
 "### Get HPS from NGC\n",
 "\n",
-"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

hps_tf/notebooks/hps_pretrained_model_training_demo.ipynb (+1 −1)

@@ -58,7 +58,7 @@
 "\n",
 "### Get HPS from NGC\n",
 "\n",
-"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

hps_tf/notebooks/hps_table_fusion_demo.ipynb (+1 −1)

@@ -57,7 +57,7 @@
 "\n",
 "### Get HPS from NGC\n",
 "\n",
-"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

hps_tf/notebooks/hps_tensorflow_triton_deployment_demo.ipynb (+7 −7)

@@ -58,7 +58,7 @@
 "\n",
 "### Get HPS from NGC\n",
 "\n",
-"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

@@ -854,9 +854,9 @@
 "INFO:tensorflow:Automatic mixed precision has been deactivated.\n",
 "2022-11-23 01:37:23.028482: I tensorflow/core/grappler/devices.cc:66] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 1\n",
 "2022-11-23 01:37:23.028568: I tensorflow/core/grappler/clusters/single_machine.cc:358] Starting new session\n",
-"2022-11-23 01:37:23.121909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
-"2022-11-23 01:37:23.128593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
-"2022-11-23 01:37:23.129761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
+"2022-11-23 01:37:24.041909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
+"2022-11-23 01:37:24.048593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
+"2022-11-23 01:37:24.049761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
 "\n",
 "################################################################################\n",
 "TensorRT unsupported/non-converted OP Report:\n",

@@ -872,9 +872,9 @@
 "For more information see https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html#supported-ops.\n",
 "################################################################################\n",
 "\n",
-"2022-11-23 01:37:23.129860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
-"2022-11-23 01:37:23.129893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
-"2022-11-23 01:37:23.120667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
+"2022-11-23 01:37:24.049860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
+"2022-11-23 01:37:24.049893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
+"2022-11-23 01:37:24.040667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
 ]
 },
 {

hps_tf/notebooks/sok_to_hps_dlrm_demo.ipynb (+1 −1)

@@ -58,7 +58,7 @@
 "\n",
 "### Get SOK from NGC\n",
 "\n",
-"Both SOK and HPS Python modules are preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"Both SOK and HPS Python modules are preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

hps_torch/notebooks/README.md (+3 −3)

@@ -13,7 +13,7 @@ If you prefer to build the HugeCTR Docker image on your own, refer to [Set Up th
 Pull the container using the following command:

 ```shell
-docker pull nvcr.io/nvidia/merlin/merlin-pytorch:23.12
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 ### Clone the HugeCTR Repository

@@ -28,7 +28,7 @@ git clone https://github.com/NVIDIA/HugeCTR
 1. Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

 ```shell
-docker run --runtime=nvidia --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr -p 8888:8888 nvcr.io/nvidia/merlin/merlin-pytorch:23.12
+docker run --runtime=nvidia --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 2. Start Jupyter using these commands:

@@ -55,4 +55,4 @@ The specifications of the system on which each notebook can run successfully are
 | Notebook | CPU | GPU | #GPUs | Author |
 | -------- | --- | --- | ----- | ------ |
-| [hps_torch_demo.ipynb](hps_torch_demo.ipynb) | Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz<br />512 GB Memory | Tesla V100-SXM2-32GB<br />32 GB Memory | 1 | Kingsley Liu |
+| [hps_torch_demo.ipynb](hps_torch_demo.ipynb) | Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz<br />512 GB Memory | Tesla V100-SXM2-32GB<br />32 GB Memory | 1 | Kingsley Liu |

hps_torch/notebooks/hps_torch_demo.ipynb (+1 −1)

@@ -60,7 +60,7 @@
 "\n",
 "### Get HPS from NGC\n",
 "\n",
-"The HPS Python module is preinstalled in the 23.12 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.12`.\n",
+"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container.\n",
 "\n",

hps_trt/notebooks/benchmark_tf_trained_large_model.ipynb (+5 −5)

@@ -1279,17 +1279,17 @@
 " ```shell\n",
 " git clone https://github.com/NVIDIA-Merlin/Merlin.git\n",
 " cd Merlin/docker\n",
-" docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.12 -f dockerfile.merlin .\n",
-" docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.12 -f dockerfile.tf .\n",
+" docker build -t nvcr.io/nvstaging/merlin/merlin-base:24.04 -f dockerfile.merlin.ctr .\n",
+" docker build -t nvcr.io/nvstaging/merlin/merlin-hugectr:24.04 -f dockerfile.ctr .\n",
 " cd ../..\n",
 " ```\n",
 "- **Option B (G+H optimized HugeCTR)**:\n",
 " ```shell\n",
 " git clone https://github.com/NVIDIA-Merlin/Merlin.git\n",
 " cd Merlin/docker\n",
 " sed -i -e 's/\" -DENABLE_INFERENCE=ON/\" -DUSE_HUGE_PAGES=ON -DENABLE_INFERENCE=ON/g' dockerfile.merlin\n",
-" docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.12 -f dockerfile.merlin .\n",
-" docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.12 -f dockerfile.tf .\n",
+" docker build -t nvcr.io/nvstaging/merlin/merlin-base:24.04 -f dockerfile.merlin.ctr .\n",
+" docker build -t nvcr.io/nvstaging/merlin/merlin-hugectr:24.04 -f dockerfile.ctr .\n",
 " cd ../..\n",
 " ````"
 ]

@@ -1325,7 +1325,7 @@
 "\n",
 "Your filesystem or system environment might impose constraints. The following command just serves as an example. It assumes HugeCTR was downloaded from GitHub into the current working directory (`git clone https://github.com/NVIDIA-Merlin/HugeCTR.git`). To allow writing files, we first give root user (inside the docker image you are root) to access to the notebook folder (this folder), and then startup a suitable Jupyter server.\n",
 "```shell\n",
-"export HCTR_SRC=\"${PWD}/HugeCTR\" && chmod -R 777 \"${HCTR_SRC}/hps_trt/notebooks\" && docker run -it --rm --gpus all --network=host -v ${HCTR_SRC}:/hugectr nvcr.io/nvstaging/merlin/merlin-tensorflow:23.12 jupyter-lab --allow-root --ip 0.0.0.0 --port 8888 --no-browser --notebook-dir=/hugectr/hps_trt/notebooks\n",
+"export HCTR_SRC=\"${PWD}/HugeCTR\" && chmod -R 777 \"${HCTR_SRC}/hps_trt/notebooks\" && docker run -it --rm --gpus all --network=host -v ${HCTR_SRC}:/hugectr nvcr.io/nvstaging/merlin/merlin-hugectr:24.04 jupyter-lab --allow-root --ip 0.0.0.0 --port 8888 --no-browser --notebook-dir=/hugectr/hps_trt/notebooks\n",
 "``` "
 ]
 },

hps_trt/notebooks/demo_for_hugectr_trained_model.ipynb (+1 −1)

@@ -31,7 +31,7 @@
 "\n",
 "### Use NGC\n",
 "\n",
-"The HPS TensorRT plugin is preinstalled in the 23.12 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:23.12`.\n",
+"The HPS TensorRT plugin is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container."
 ]

hps_trt/notebooks/demo_for_pytorch_trained_model.ipynb (+1 −1)

@@ -31,7 +31,7 @@
 "\n",
 "### Use NGC\n",
 "\n",
-"The HPS TensorRT plugin is preinstalled in the 23.12 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.12`.\n",
+"The HPS TensorRT plugin is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container."
 ]

hps_trt/notebooks/demo_for_tf_trained_model.ipynb (+1 −1)

@@ -31,7 +31,7 @@
 "\n",
 "### Use NGC\n",
 "\n",
-"The HPS TensorRT plugin is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
+"The HPS TensorRT plugin is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
 "\n",
 "You can check the existence of the required libraries by running the following Python code after launching this container."
 ]

notebooks/README.md (+3 −3)

@@ -19,16 +19,16 @@ git clone https://github.com/NVIDIA/HugeCTR
 Pull the container using the following command:

 ```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

 ```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR --network=host --runtime=nvidia nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR --network=host --runtime=nvidia nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

-> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12` container.
+> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:24.04` container.

 ## 3. Customized Building (Optional)

release_notes.md (+2 −2)

@@ -252,7 +252,7 @@ In this release, we have fixed issues and enhanced the code.

 ```{important}
 In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.3`.
-Afterward, the library will use calendar versioning only, such as `v23.12`.
+Afterward, the library will use calendar versioning only, such as `v23.01`.
 ```

 + **Support for BERT and Variants**:

@@ -334,7 +334,7 @@ The [HugeCTR Training and Inference with Remote File System Example](https://nvi

 ```{important}
 In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.2`.
-Afterward, the library will use calendar versioning only, such as `v23.12`.
+Afterward, the library will use calendar versioning only, such as `v23.01`.
 ```

 + **Change to HPS with Redis or Kafka**:

samples/criteo/README.md (+2 −2)

@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

 1. Pull the HugeCTR NGC Docker by running the following command:
 ```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```
 2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
 ```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:24.04
 ```

 ### Build the HugeCTR Docker Container on Your Own ###
