blog_posts: b641abcad3
| Field | Value |
|---|---|
| id | blog-posts#60-3703 |
| createdDate | 2024-05-29 13:52:05 |
| title | Large scale training with NVIDIA NeMo Megatron on AWS ParallelCluster using P5 instances |
| link | https://aws.amazon.com/blogs/hpc/large-scale-training-with-nemo-megatron-on-aws-parallelcluster-using-p5-instances/ |
| postExcerpt | Launching distributed GPT training? See how AWS ParallelCluster sets up a fast shared filesystem, SSH keys, host files, and more between nodes. Our guide has the details for creating a Slurm-managed cluster to train NeMo Megatron at scale. |
| featuredImageUrl | https://d2908q01vomqb2.cloudfront.net/e6c3dd630428fd54834172b8fd2735fed9416da4/2024/05/29/Large-scale-training-with-NeMo-Megatron-on-AWS-ParallelCluster-using-P5-instances-1-300x169.png |
| hash | b641abcad3 |
| contributors | Aman Shanbhag, Akshit Arora, Peter Dykas, Pierre-Yves Aquilanti, Sean Smith |
| modifiedDate | 2024-05-29 19:56:54 |
| displayDate | 29 May 2024 |
Links from other tables
- 16 rows in blog_post_tags reference this record via blog_post_hash