
Commit 0b92ae4

Adding packedBERT deep dive walkthrough to Tutorials
1 parent 30278aa commit 0b92ae4

7 files changed: 1,790 additions and 0 deletions

tutorials/blogs_code/README.md

Lines changed: 1 addition & 0 deletions

@@ -5,3 +5,4 @@ This directory contains code used in Graphcore blogs
 
 ## Contents
 - The [packedBERT](packedBERT) folder contains supporting code for: ["Packing: Towards 2x NLP BERT Acceleration"](https://arxiv.org/abs/2107.02027) and for the corresponding blog post: ["Introducing Packed BERT for 2x Training Speed-up in Natural Language Processing"](https://towardsdatascience.com/introducing-packed-bert-for-2x-faster-training-in-natural-language-processing-eadb749962b1).
+- The [finetuning-packedBERT](finetuning-packedBERT) folder contains a supporting deep-dive walkthrough for the "Accelerating fine-tuning and inference with PackedBERT" article and the [Paperspace fine-tuning packedBERT notebooks](https://console.paperspace.com/github/gradient-ai/Graphcore-HuggingFace?file=%2Fpacked-bert).
Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
+# A deeper dive into faster fine-tuning with Packed BERT
+
+This notebook is a walkthrough that dives deeper into fine-tuning with a more online-capable version of PackedBERT, building on the original packing approach used for BERT pre-training. It explores all of the modifications to preprocessing, dataset creation, the model itself and postprocessing that are required to implement packing for BERT (and similar models), using sequence classification as an example. The code is broken down and explained step by step, along with the reasoning behind the choices made.
+
+You can try the [notebooks](https://console.paperspace.com/github/gradient-ai/Graphcore-HuggingFace?file=%2Fpacked-bert) that use these functions in Optimum Graphcore on Paperspace.
+
+Before starting the walkthrough, ensure you have all requirements installed with:
+
+```
+pip install -r requirements.txt
+```
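To give a concrete feel for the packing idea this walkthrough covers, here is a minimal sketch (not code from the walkthrough or Optimum Graphcore) of grouping tokenized sequences into fixed-length packs with a simple longest-first, first-fit heuristic; the function name `greedy_pack` and the `max_length` default are illustrative assumptions only.

```
# Minimal sketch of sequence packing, assuming a longest-first, first-fit
# heuristic; not the walkthrough's implementation.
from typing import List

def greedy_pack(sequence_lengths: List[int], max_length: int = 512) -> List[List[int]]:
    """Group sequence indices into packs whose total length fits within max_length."""
    packs: List[List[int]] = []   # each pack holds the indices of the sequences it contains
    remaining: List[int] = []     # free token budget left in each pack
    # Place longer sequences first so they claim packs before short ones fill the gaps.
    order = sorted(range(len(sequence_lengths)), key=lambda i: -sequence_lengths[i])
    for idx in order:
        length = sequence_lengths[idx]
        for p, space in enumerate(remaining):
            if length <= space:          # first existing pack with enough room
                packs[p].append(idx)
                remaining[p] -= length
                break
        else:                            # no existing pack fits: start a new one
            packs.append([idx])
            remaining.append(max_length - length)
    return packs

# Seven sequences collapse into three packs of at most 512 tokens each,
# which is where the fine-tuning and inference speed-up comes from.
print(greedy_pack([60, 130, 45, 500, 220, 90, 310], max_length=512))
```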
Three image files added (40.7 KB, 196 KB and 79.4 KB); image diffs are not shown.
Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+transformers==4.25.1
+datasets
+numpy
+matplotlib
+torch==1.13.0

tutorials/blogs_code/finetuning-packedBERT/walkthrough.ipynb

Lines changed: 1773 additions & 0 deletions
Large diffs are not rendered by default.
