Currently, we save all model parameters together. The problem is that if the code changes, we can no longer load the saved model. Even worse, the failure happens silently.
This problem becomes more serious when we fine-tune the BERT model. A common workflow is to fine-tune BERT first and then load the fine-tuned model to train the GNN model. Fine-tuning BERT is expensive, so once it is tuned, we want to keep reusing it even after the GraphStorm code changes.
To solve this problem, we should save the BERT model separately from the remaining model parameters.
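A minimal sketch of what this could look like (not GraphStorm's actual API): it assumes a hypothetical composite model that exposes a `bert` submodule, and uses Hugging Face's `save_pretrained`/`from_pretrained` for the BERT part so the fine-tuned encoder stays loadable even if the GNN-side code changes. The submodule name and file layout here are illustrative assumptions.

```python
import os
import torch


def save_model(model, save_dir):
    """Save the BERT encoder and the remaining parameters in separate files.

    Assumes a hypothetical composite model with a `bert` submodule
    (a Hugging Face model); everything else goes into a second checkpoint.
    """
    os.makedirs(save_dir, exist_ok=True)
    # save_pretrained() also writes the model config, so the fine-tuned
    # encoder can be reloaded independently of the rest of the code.
    model.bert.save_pretrained(os.path.join(save_dir, "bert"))
    # Collect all parameters that are not part of the BERT encoder.
    rest = {k: v for k, v in model.state_dict().items()
            if not k.startswith("bert.")}
    torch.save(rest, os.path.join(save_dir, "gnn_model.pt"))


def load_model(model, save_dir):
    """Restore the two checkpoints written by save_model()."""
    from transformers import AutoModel
    model.bert = AutoModel.from_pretrained(os.path.join(save_dir, "bert"))
    rest = torch.load(os.path.join(save_dir, "gnn_model.pt"))
    # strict=False lets the fine-tuned BERT checkpoint keep being reused
    # even if unrelated parts of the GNN model code have changed.
    model.load_state_dict(rest, strict=False)
```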