philschmid blog

#Machine Learning #Cloud #NLP #Serverless #Bert

MLOps: Using the Hugging Face Hub as model registry with Amazon SageMaker

November 16, 2021 · 8 min read

Learn how to automatically save your model weights, logs, and artifacts to the Hugging Face Hub using Amazon SageMaker and how to deploy the model afterwards for inference.
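
The heart of that workflow is the Trainer's built-in Hub integration. Here is a minimal sketch of the Hub-related configuration, assuming a fine-tuning script running inside a SageMaker training job; the repository name and token are placeholders, not values from the post.

```python
# Minimal sketch: Hub-related Trainer configuration inside a SageMaker
# training job. Repo name and token are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="/opt/ml/model",                 # SageMaker's model directory
    push_to_hub=True,                           # push weights, logs & artifacts
    hub_model_id="my-user/my-finetuned-model",  # hypothetical Hub repository
    hub_token="hf_xxx",                         # in practice, pass in as a hyperparameter
    hub_strategy="every_save",                  # upload on every checkpoint save
)
# Pass `training_args` to a transformers.Trainer; calling
# trainer.push_to_hub() after training uploads the final model and a model card.
```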

A remote guide to re:Invent 2021 machine learning sessions

November 11, 2021 · 6 min read

If you are like me, you are not from the USA and cannot easily travel to Las Vegas. Here is the perfect remote guide for your virtual re:Invent 2021, focused on NLP and Machine Learning.

MLOps: End-to-End Hugging Face Transformers with the Hub & SageMaker Pipelines

November 10, 2021 · 9 min read

Learn how to build an End-to-End MLOps Pipeline for Hugging Face Transformers from training to production using Amazon SageMaker.
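
To give an idea of the building blocks, here is a stripped-down sketch of a one-step SageMaker Pipeline around a Hugging Face estimator; the entry point, role ARN, and container versions are assumptions, and a real pipeline would add processing, evaluation, and deployment steps.

```python
# Minimal sketch: a one-step SageMaker Pipeline wrapping a Hugging Face
# training job. Script name, role, and versions are placeholders.
from sagemaker.huggingface import HuggingFace
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.steps import TrainingStep

role = "arn:aws:iam::123456789012:role/sagemaker-role"  # placeholder role ARN

estimator = HuggingFace(
    entry_point="train.py",          # hypothetical training script
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role=role,
    transformers_version="4.12",
    pytorch_version="1.9",
    py_version="py38",
)

train_step = TrainingStep(name="TrainHuggingFaceModel", estimator=estimator)

pipeline = Pipeline(name="hf-mlops-pipeline", steps=[train_step])
pipeline.upsert(role_arn=role)   # create or update the pipeline definition
pipeline.start()                 # kick off an execution
```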

Going Production: Auto-scaling Hugging Face Transformers with Amazon SageMaker

October 29, 2021 · 6 min read

Learn how to add auto-scaling to your Hugging Face Transformers SageMaker Endpoints.
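
For a taste, here is a sketch of the boto3 calls that attach a target-tracking scaling policy to an endpoint variant; the endpoint name, capacity limits, and target value are placeholders.

```python
# Sketch: register auto-scaling for a SageMaker endpoint variant with
# boto3. Endpoint name and limits are placeholders.
import boto3

asg = boto3.client("application-autoscaling")
resource_id = "endpoint/my-hf-endpoint/variant/AllTraffic"  # placeholder

asg.register_scalable_target(
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    MinCapacity=1,
    MaxCapacity=4,
)

asg.put_scaling_policy(
    PolicyName="invocations-target-tracking",
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 200.0,  # invocations per instance per minute (assumed)
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "SageMakerVariantInvocationsPerInstance"
        },
        "ScaleInCooldown": 300,
        "ScaleOutCooldown": 60,
    },
)
```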

Deploy BigScience T0_3B to AWS & Amazon SageMaker

October 20, 2021 · 5 min read

🌸 BigScience released their first modeling paper introducing T0, which outperforms GPT-3 on many zero-shot tasks while being 16x smaller! Deploy the 3-billion-parameter version (T0_3B) to Amazon SageMaker with a few lines of code to run a scalable production workload!
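
Those few lines look roughly like this sketch with the SageMaker Python SDK; the role ARN, instance type, and container versions are assumptions, while the model id is the real Hub id.

```python
# Sketch: deploy T0_3B straight from the Hugging Face Hub. Role and
# instance type are placeholders.
from sagemaker.huggingface import HuggingFaceModel

model = HuggingFaceModel(
    env={
        "HF_MODEL_ID": "bigscience/T0_3B",   # model on the Hugging Face Hub
        "HF_TASK": "text2text-generation",
    },
    role="arn:aws:iam::123456789012:role/sagemaker-role",  # placeholder
    transformers_version="4.12",
    pytorch_version="1.9",
    py_version="py38",
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.g4dn.2xlarge")
print(predictor.predict({"inputs": "Is this review positive or negative? Review: The food was great."}))
```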

Scalable, Secure Hugging Face Transformer Endpoints with Amazon SageMaker, AWS Lambda, and CDK

October 06, 2021 · 6 min read

Deploy Hugging Face Transformers to Amazon SageMaker and create an API for the Endpoint using AWS Lambda, API Gateway and AWS CDK.
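
The glue between API Gateway and the endpoint is a small Lambda function; here is a sketch of such a handler, with the endpoint name as a placeholder.

```python
# Sketch: Lambda handler that proxies API Gateway requests to a
# SageMaker endpoint. Endpoint name is a placeholder.
import os

import boto3

runtime = boto3.client("sagemaker-runtime")
ENDPOINT_NAME = os.environ.get("ENDPOINT_NAME", "my-hf-endpoint")  # placeholder

def handler(event, context):
    # API Gateway delivers the request body as a JSON string;
    # pass it through to the endpoint unchanged.
    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=event.get("body", "{}"),
    )
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": response["Body"].read().decode("utf-8"),
    }
```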

Few-shot learning in practice with GPT-Neo

June 05, 2021 · 6 min read

The latest developments in NLP show that you can overcome the need for large labeled datasets by providing a few examples at inference time with a large language model, a technique known as Few-Shot Learning. In this blog post, we'll explain what Few-Shot Learning is and explore how to apply it with a large language model called GPT-Neo.
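
As a preview, here is a minimal sketch of few-shot prompting with GPT-Neo through the transformers pipeline; the 1.3B checkpoint and the sentiment prompt are illustrative choices, not taken from the post.

```python
# Sketch: few-shot prompting with GPT-Neo. The prompt carries a few
# labeled examples; the model completes the last one.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

prompt = (
    "Review: This movie was fantastic!\nSentiment: positive\n\n"
    "Review: Total waste of two hours.\nSentiment: negative\n\n"
    "Review: The plot dragged but the acting was great.\nSentiment:"
)

result = generator(prompt, max_new_tokens=3, do_sample=False)
print(result[0]["generated_text"])
```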

Distributed Training: Train BART/T5 for Summarization using 🤗 Transformers and Amazon SageMaker

April 09, 2021 · 10 min read

Learn how to train summarization models in a distributed setup using Hugging Face Transformers and Amazon SageMaker, and afterwards upload them to huggingface.co.
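
Launching such a distributed job boils down to enabling the SageMaker data-parallel library on a Hugging Face estimator; in this sketch the training script, role ARN, versions, and hyperparameters are assumptions.

```python
# Sketch: distributed summarization training with SageMaker's
# data-parallel library. Script, role, and versions are placeholders.
from sagemaker.huggingface import HuggingFace

estimator = HuggingFace(
    entry_point="run_summarization.py",  # e.g. the transformers example script
    instance_type="ml.p3dn.24xlarge",    # multi-GPU instance for smdistributed
    instance_count=2,
    role="arn:aws:iam::123456789012:role/sagemaker-role",  # placeholder
    transformers_version="4.12",
    pytorch_version="1.9",
    py_version="py38",
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
    hyperparameters={"model_name_or_path": "facebook/bart-large-cnn", "do_train": True},
)
estimator.fit()  # starts the distributed training job
```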
