👋 Hi there, welcome to the 12th issue of the 🤗 newsletter! Here's what's been brewing this month:
- Part 2 of the 🤗 course
- AutoNLP Free Tier for one week
- We welcome GPT-J to the 🤗 Transformers family
- ... and more!
We're excited to release the second part of the 🤗 course in November 🥳. To celebrate, we'll be hosting a week-long community event packed with talks and hands-on projects! We've lined up a great list of speakers to kick off the event, including Jakob Uszkoreit (Inceptive), Mark Saroufim (Facebook), Jay Alammar (Cohere), Meg Mitchell (🤗), and Thomas Wolf (🤗).
Participants will work in teams to train a model and make a live demo that can be shown to friends, family, or prospective employers. Register here to take part and receive a certificate at the end of the event.
For one week, your first 🤗 AutoNLP project will be FREE! With AutoNLP, you can train, evaluate and deploy state-of-the-art transformer models without writing a single line of code 🤯. Sign up here 👉: huggingface.co/autonlp.
Please note that the free tier applies only to your first project and covers a limited number of concurrent model searches.
🤗 Transformers v4.11 was released two weeks ago, thanks to the work of more than 50 contributors! The highlight of this release is the inclusion of EleutherAI's GPT-J, a 6-billion-parameter model. It shows greatly improved text and code generation, as well as natural language understanding capabilities that unlock tasks such as arithmetic, paper writing, and many more.
The model is available on the Inference API and we encourage you to play with it to see just how crazy the generations are 🤯: https://huggingface.co/EleutherAI/gpt-j-6B
🤗Datasets recently crossed 10,000 ⭐ on GitHub! Thanks to our wonderful community, NLP practitioners now have access to 1,500+ datasets in hundreds of languages ❤️.
Here's what's been happening in 🤗Datasets this month:
📚Restructured documentation to make it easier for users to find the content they are looking for. Tutorials focus on helping new learners gain the basic skills they need to use 🤗Datasets, and how-to guides show users how they can apply their skills to solve real-world problems.
📜Our demo paper was accepted to EMNLP 2021 🥳
We’re thrilled to partner with DeepLearning.AI to create some great new content for their Natural Language Processing Specialization on Coursera!
With this update, you can access exciting new material and lectures that cover the state of the art in NLP. Sign up to be notified when it's live.
🌌 With Spaces coming out of beta soon, we've published two blog posts that show you how to easily demo your favorite models with Streamlit and Gradio.
🔐 The Hub now supports commit signature verification with GPG. Read here to find out why this matters.
Hugging Face Infinity is our new containerized solution to deploy fully optimized inference pipelines for state-of-the-art Transformer models into your own production environment 🔥. Watch Philipp Schmid optimize a Sentence-Transformer to achieve 1.Xms latency with Hugging Face Infinity on GPU!
If you are interested in trying out Infinity, sign up for a trial at hf.co/infinity-trial
🧑‍🔬 Our research at EMNLP and NeurIPS
To find out what our research team has been up to this month, check out these papers that have been accepted to the EMNLP and NeurIPS conferences!