Hi, I want to fine-tune the model on my custom data. Do you have any suggestions about the GPU compute requirements for training? Thanks.