Transformers
One of the most prominent libraries for modern natural language processing (NLP), Transformers comes from the NLP powerhouse Hugging Face. The variety of pre-trained models available in Transformers is vast, with both foundational and fine-tuned models designed for tasks such as text classification, translation, question answering, and more.
Key Features:
- Versatility: models are available for multiple backends, including PyTorch and TensorFlow
- A plentiful collection of pre-trained models that can be fine-tuned and customized
- User-friendly APIs and extensive documentation
- A robust user community to answer questions and help troubleshoot
Transformers is good for new users, as it is very simple to pick up the basics, yet powerful enough to handle even the most complex of tasks. The library comes with extensive documentation, user-friendly APIs, and a nearly unfathomable collection of available models. With Transformers, beginners can start using state-of-the-art models without much deep learning expertise.
```python
# pip install transformers
#
# Note: if your environment has Keras 3 installed, Transformers may warn:
# "Your currently installed version of Keras is Keras 3, but this is not
# yet supported in Transformers. Please install the backwards-compatible
# tf-keras package with: pip install tf-keras"

from transformers import pipeline

# With no model specified, pipeline() downloads a default
# sentiment-analysis checkpoint.
classifier = pipeline("sentiment-analysis")

result = classifier("I like ice-cream.")
print(result)

result = classifier("I don't like ice-cream.")
print(result)

# The same two sentences in Russian: "I love ice cream." / "I don't love ice cream."
result = classifier("Я люблю мороженное.")
print(result)

result = classifier("Я не люблю мороженное.")
print(result)
```
Output:
```
All the weights of TFDistilBertForSequenceClassification were initialized from the PyTorch model.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFDistilBertForSequenceClassification for predictions without further training.
[{'label': 'POSITIVE', 'score': 0.9990261793136597}]
[{'label': 'NEGATIVE', 'score': 0.9980624318122864}]
[{'label': 'POSITIVE', 'score': 0.5804928541183472}]
[{'label': 'NEGATIVE', 'score': 0.770568311214447}]
```
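Notice that the English sentences score with near-total confidence while the Russian ones do not: the default sentiment-analysis checkpoint is trained on English text only. A sketch of one way to handle other languages, assuming the publicly available multilingual checkpoint `nlptown/bert-base-multilingual-uncased-sentiment` (chosen here for illustration; any multilingual sentiment model on the Hub would do):

```python
from transformers import pipeline

# Pin an explicit model instead of relying on the English-only default.
# This checkpoint rates text from "1 star" (negative) to "5 stars" (positive)
# across several languages.
classifier = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

result = classifier("Я люблю мороженное.")  # "I love ice cream." in Russian
print(result)
```

Passing `model=` also makes the example reproducible: the unpinned default checkpoint can change between Transformers releases.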