Cracking Open the Hugging Face Transformers Library

Author: Murphy  |  Views: 20544  |  Time: 2025-03-23 13:09:33

This is the 3rd article in a series on using large language models (LLMs) in practice. Here I will give a beginner-friendly guide to the Hugging Face Transformers library, which provides an easy and cost-free way to work with a wide variety of open-source language models. I will start by reviewing key concepts and then dive into example Python code.

Photo by Jéan Béller on Unsplash

In the previous article of this series, we explored the OpenAI Python API and used it to make a custom chatbot. One downside of this API, however, is that calls cost money, and those per-call costs can add up quickly for some use cases.

In these scenarios, it may be advantageous to turn to open-source solutions. One popular way to do this is via Hugging Face's Transformers library.

What is Hugging Face?

Hugging Face is an AI company that has become a major hub for open-source machine learning (ML). Its platform has three major elements that allow users to access and share ML resources.

First is its rapidly growing repository of pre-trained, open-source ML models for tasks such as natural language processing (NLP), computer vision, and more. Second is its library of datasets for training ML models for almost any task. Third, and finally, is Spaces, a collection of open-source ML apps hosted by Hugging Face.

The power of these resources is that they are community-generated, which brings all the benefits of open source: they are cost-free, cover a wide diversity of tools, include many high-quality resources, and evolve at a rapid pace. While these make building powerful ML projects more accessible than ever, there is another key element of the Hugging Face ecosystem – the Transformers library.
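To give a taste of how little code the Transformers library requires, here is a minimal sketch of its high-level `pipeline()` interface applied to sentiment analysis. It assumes the `transformers` package (and a backend such as PyTorch) is installed; the first call downloads a default model from the Hugging Face Hub, so it needs an internet connection.

```python
from transformers import pipeline

# Build a ready-to-use sentiment-analysis pipeline.
# With no model specified, Transformers downloads a default
# pre-trained model from the Hugging Face Hub.
classifier = pipeline("sentiment-analysis")

# Run inference on a piece of text; the result is a list of
# dicts with a predicted label and a confidence score.
result = classifier("Hugging Face makes open-source ML easy!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The same `pipeline()` call works for many other tasks (e.g. `"summarization"` or `"text-generation"`) simply by changing the task string, which is what makes the library so approachable for beginners.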

Tags: Getting Started, Hugging Face, Large Language Models, Machine Learning, Python
