
Commit 8b2e521

natemail-aws authored and aws-mesharma committed
Neuron SDK 2.13.0 Release Notes and new samples
1 parent 6d7f621 commit 8b2e521


77 files changed: +18568 −15957 lines changed


README.md

+1 −1
@@ -14,7 +14,7 @@ Samples are organized by use case (training, inference) and deep learning framew
 
 | Usage | Description | Instance Type |
 | --- | --- | --- |
-| [Megatron-LM for Neuron](https://github.com/aws-neuron/aws-neuron-reference-for-megatron-lm) | A library that enables large-scale distributed training of language models such as GPT and is adapted from Megatron-LM. | Trn1, Trn1n |
+| [Nemo Megatron for Neuron](https://github.com/aws-neuron/neuronx-nemo-megatron) | A library that enables large-scale distributed training of language models such as Llama and is adapted from Nemo Megatron. | Trn1, Trn1n |
 | [AWS Neuron samples for ParallelCluster](https://github.com/aws-neuron/aws-neuron-parallelcluster-samples) | How to use AWS ParallelCluster to build HPC compute cluster that uses trn1 compute nodes to run your distributed ML training job. | Trn1, Trn1n |
 | [AWS Neuron samples for EKS](https://github.com/aws-neuron/aws-neuron-eks-samples) | The samples in this repository demonstrate the types of patterns that can be used to deliver inference and distributed training on EKS using Inferentia and Trainium. | Trn1, Trn1n |
 | [AWS Neuron samples for SageMaker](https://github.com/aws-neuron/aws-neuron-sagemaker-samples) | SageMaker Samples using ml.trn1 instances for machine learning (ML) training workloads on the AWS ML accelerator chips Trainium. | Trn1, Trn1n |

releasenotes.md

+12
@@ -1,5 +1,17 @@
 # Change Log
 
+## August, 28th 2023
+* Added sample script for LLaMA V2 13B model inference using transformers-neuronx
+* Added samples for training GPT-NEOX 20B and 6.9B models using neuronx-distributed
+* Added sample scripts for CLIP and Stable Diffusion XL inference using torch-neuronx
+* Added sample scripts for vision and language Perceiver models inference using torch-neuronx
+* Added camembert training/finetuning example for Trn1 under hf_text_classification in torch-neuronx
+* Updated Fine-tuning Hugging Face BERT Japanese model sample in torch-neuronx
+* Updated OPT and GPT-J transformers-neuronx inference samples to install transformers-neuronx from whl instead of using github repo
+* Upgraded numpy package to 1.21.6 in GPT-2 and several training samples under hf_text_classification in torch-neuronx
+* Removed pinning of torch-neuron and tensorflow-neuron libraries and other minor changes in several of torch-neuron and tensorflow-neuron Inf1 inference samples.
+
+
 ## February, 23rd 2023
 * Added OPT-13B, OPT-30B, OPT-66B inference examples under transformers-neuronx
 * Added distilbert-base-uncased training/finetuning example for Trn1 under torch-neuronx
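
Several of the new entries above center on the torch-neuronx ahead-of-time tracing flow: trace the model once, save the compiled artifact, then reuse it for inference. The sketch below illustrates that flow in outline only; the tiny model, input shape, and output file name are placeholder assumptions and are not taken from the actual CLIP, Stable Diffusion XL, or Perceiver samples.

```python
# A minimal sketch (not taken from the samples) of the torch-neuronx
# trace-and-save flow the new inference samples are built around.
# Running it requires a Trn1/Inf2 instance with torch-neuronx installed.
import torch
import torch_neuronx


class TinyModel(torch.nn.Module):
    """Placeholder model standing in for the real CLIP / SDXL / Perceiver components."""

    def __init__(self) -> None:
        super().__init__()
        self.linear = torch.nn.Linear(128, 64)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.linear(x))


model = TinyModel().eval()
example_input = torch.rand(1, 128)  # input shapes are fixed at compile time

# Compile for NeuronCores; torch_neuronx.trace invokes the Neuron compiler
# and returns a TorchScript module that executes on the accelerator.
neuron_model = torch_neuronx.trace(model, example_input)

# Persist the compiled module; it can be reloaded later with torch.jit.load().
torch.jit.save(neuron_model, "tiny_model_neuron.pt")

# Inference with the compiled module uses the regular PyTorch call interface.
output = neuron_model(example_input)
```

The actual samples follow the same trace-and-save shape but supply model-specific wrappers, example inputs, and postprocessing.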

tensorflow-neuron/inference/unet/UnetTF2.ipynb

+6 −9
@@ -20,15 +20,12 @@
    ]
   },
   {
-   "cell_type": "code",
-   "execution_count": null,
-   "id": "d1081826",
+   "attachments": {},
+   "cell_type": "markdown",
+   "id": "b79b1ad3",
    "metadata": {},
-   "outputs": [],
    "source": [
-    "# Set Pip repository to point to the Neuron repository\n",
-    "%pip config set global.extra-index-url https://pip.repos.neuron.amazonaws.com\n",
-    "# now restart the kernel"
+    "Verify that this Jupyter notebook is running the Python kernel environment that was set up according to the [Tensorflow Installation Guide](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/setup/tensorflow-neuron.html#setup-tensorflow-neuron). You can select the kernel from the \"Kernel -> Change Kernel\" option on the top of this Jupyter notebook page."
    ]
   },
   {
@@ -41,7 +38,7 @@
    "outputs": [],
    "source": [
     "#Install Neuron Tensorflow\n",
-    "%pip install -U tensorflow-neuron==2.5.2.2.1.14.0 neuron-cc matplotlib\n",
+    "%pip install -U tensorflow-neuron neuron-cc matplotlib\n",
     "# use --force-reinstall if you're facing some issues while loading the modules\n",
     "# now restart the kernel again"
    ]
@@ -206,7 +203,7 @@
    "x = next(iter(x))[0]\n",
    "\n",
    "# https://awsdocs-neuron.readthedocs-hosted.com/en/latest/neuron-guide/neuron-cc/command-line-reference.html\n",
-   "os.environ[\"NEURON_CC_FLAGS\"] = \"--verbose=DEBUG --neuroncore-pipeline-cores=1 --workdir=logs/ --dynamic-batch-size\"\n",
+   "os.environ[\"NEURON_CC_FLAGS\"] = \"--neuroncore-pipeline-cores=1 --workdir=logs/ --dynamic-batch-size\"\n",
    "neuron_model = tfn.trace(unet_model, x)\n",
    "neuron_model.save('unet_circles_neuron')"
    ]
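
For reference, once the pinned tensorflow-neuron version and the --verbose=DEBUG compiler flag are gone, the notebook's compile step boils down to the standalone sketch below. The MobileNetV2 stand-in model and random input tensor are assumptions used only to make the snippet self-contained; the notebook builds its own U-Net model and sample batch in earlier cells.

```python
# Sketch of the tensorflow-neuron compile step as it reads after this commit.
# `unet_model` and `x` are placeholders for the Keras model and sample batch
# the notebook constructs earlier; running this requires an Inf1 environment
# with tensorflow-neuron and neuron-cc installed.
import os
import tensorflow as tf
import tensorflow.neuron as tfn

unet_model = tf.keras.applications.MobileNetV2()  # stand-in for the U-Net model
x = tf.random.uniform([1, 224, 224, 3])           # stand-in for one input batch

# Compiler flags: --verbose=DEBUG has been dropped; only the pipeline-cores,
# workdir, and dynamic-batch-size options from the notebook remain.
os.environ["NEURON_CC_FLAGS"] = "--neuroncore-pipeline-cores=1 --workdir=logs/ --dynamic-batch-size"

# Trace the model into a Neuron-compiled artifact and save it as a SavedModel.
neuron_model = tfn.trace(unet_model, x)
neuron_model.save("unet_circles_neuron")
```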
