Commit b72c8d1

Merge pull request #100 from graphcore/conclusions
GPT-J POPXL: add conclusions to notebook to match with Paperspace
2 parents 2d8b565 + 78b1df0

File tree

1 file changed: +16 -0 lines changed


nlp/gpt_j/popxl/finetuning.ipynb

Lines changed: 16 additions & 0 deletions
@@ -969,6 +969,22 @@
     "print(out)\n",
     "# [{'generated_text': ' contradiction'}]"
    ]
+  },
+  {
+   "attachments": {},
+   "cell_type": "markdown",
+   "id": "0fd3bd49",
+   "metadata": {},
+   "source": [
+    "## Conclusion\n",
+    "This notebook has demonstrated how easy it is to fine-tune GPT-J on the Graphcore IPU for a text entailment task. While not as powerful as larger models for free text generation, medium-sized auto-regressive models such as GPT-J can still be successfully fine-tuned to handle a range of NLP downstream tasks such as question answering, sentiment analysis, and named entity recognition. In fact, for these kinds of tasks you don't need GPT-3 175B-sized models: GPT-J at 6B has very good language understanding and is well suited and highly efficient for most of these scenarios.\n",
+    "\n",
+    "In this example we fine-tuned GPT-J as a causal language model (CLM) for text entailment on the GLUE MNLI dataset.\n",
+    "\n",
+    "You can easily adapt this example to run custom fine-tuning on several downstream tasks, such as question answering, named entity recognition, sentiment analysis, and text classification in general, by preparing your data accordingly.\n",
+    "\n",
+    "Overall, this notebook showcases how GPT-J can be fine-tuned effectively and efficiently on the IPU. Next, find out how the fine-tuned model can be used on several downstream tasks in our Text generation on IPU using GPT-J – Inference notebook, GPTJ-generative-inference.ipynb."
+   ]
   }
  ],
  "metadata": {
