
TensorRT Custom Plugin Example

Adapted from the GeLU plugin at https://github.com/NVIDIA/TensorRT/tree/master/plugin/geluPlugin

This repository explains how to work with custom layers in an end-to-end deep learning pipeline. I have added a custom layer to the model architecture using the Keras custom Layer API, then trained the model on a demo dataset for dogs-vs-cats classification. After training, I convert the trained model to a TensorRT engine with the included converter code, and I have also included inference code to run inference with the converted engine.

Train the Model with GeLU

The model can be trained using the script do_train.py.
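For reference, the activation the custom layer computes is GeLU(x) = x·Φ(x), where Φ is the standard normal CDF. Fast GPU kernels such as the NVIDIA geluPlugin this repo adapts typically use the tanh approximation instead of the exact form. A stdlib-only sketch of both (function names here are illustrative, not taken from the repo):

```python
import math

def gelu_exact(x):
    # Exact GeLU: x * Phi(x), with Phi the standard normal CDF,
    # written in terms of erf.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Widely used tanh approximation of GeLU; fast CUDA kernels
    # commonly implement this form.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))
```

The two forms agree to within roughly 1e-3 over typical activation ranges, which is why the approximation is acceptable inside the TensorRT plugin.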

Save the Models

The do_train.py script saves the model after training.

Converting the Model

Before the model can be converted to a TensorRT engine file, we need to compile the custom plugin.

Compiling the custom layer

The custom TensorRT layer for GeLU is inside the "geluPluginv2" directory. To compile it:

cd geluPluginv2
mkdir build && cd build
cmake ..
make

Convert model to TRT Engine

You can use converter.py in the trt_utils directory to do this. Please change the path to libGeluPlugin.so in the script to the correct path on your system.
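Whatever conversion path converter.py takes, the compiled plugin must be registered with TensorRT before the model is parsed, or the custom GeLU op will not resolve. A minimal sketch of that registration step, assuming the tensorrt Python package is installed (the plugin path and function name below are placeholders, not names from the repo):

```python
import ctypes

# Placeholder path -- point this at your compiled plugin library.
PLUGIN_LIB = "geluPluginv2/build/libGeluPlugin.so"

def register_gelu_plugin(plugin_path=PLUGIN_LIB):
    """Make the custom GeLU plugin visible to TensorRT's plugin registry."""
    import tensorrt as trt  # imported lazily so the sketch loads without TensorRT

    # Loading the shared library runs its static initializers, which
    # register the plugin creator with TensorRT's global registry.
    ctypes.CDLL(plugin_path)

    # Also register TensorRT's built-in plugins.
    logger = trt.Logger(trt.Logger.WARNING)
    trt.init_libnvinfer_plugins(logger, "")
    return logger
```

The ctypes.CDLL call is the step the README's path instruction refers to: if the .so path is wrong, TensorRT fails to find the GeLU plugin creator during conversion.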

Inference code

You can run inference.py in the trt_utils directory. As in the last section, please change the path to libGeluPlugin.so to match your system.
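At inference time the plugin library likewise has to be loaded before the engine is deserialized, since TensorRT resolves the custom GeLU layer through the plugin registry. A minimal sketch of that loading step, assuming the tensorrt Python package (paths and the function name are placeholders):

```python
import ctypes

def load_engine(engine_path, plugin_path="libGeluPlugin.so"):
    """Deserialize a TensorRT engine that uses the custom GeLU plugin."""
    import tensorrt as trt  # imported lazily so the sketch loads without TensorRT

    # Load the plugin .so first so its static initializers register
    # the GeLU plugin creator with TensorRT.
    ctypes.CDLL(plugin_path)

    logger = trt.Logger(trt.Logger.WARNING)
    trt.init_libnvinfer_plugins(logger, "")
    with open(engine_path, "rb") as f:
        engine_bytes = f.read()
    return trt.Runtime(logger).deserialize_cuda_engine(engine_bytes)
```

Deserializing before the plugin is loaded is the most common failure mode here: TensorRT reports an unknown plugin and returns no engine.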

More Details

For more details, please refer to the documentation.
