release notes
Published 1/8/2026
Contains breaking changes. This release candidate focused mostly on quantization support with the new dynamic weight loader, and includes a few notable 🚨 breaking changes 🚨:
from_pretrained is now auto! Mostly quality-of-life fixes, plus restored support for CPU offloading.
Added support for the fbgemm and quanto quantization backends.
The dynamic weight loader broke a few small things; this release adds glue for all models except MoEs.
Tokenization needed more refactoring; this time it's a lot cleaner!
rope_parameters to empty dict if there is something to put in it by @hmellor in https://github.com/huggingface/transformers/pull/42651

We omitted many other commits for clarity; thanks to everyone and to the new contributors!
Full Changelog: https://github.com/huggingface/transformers/compare/v5.0.0rc0...v5.0.0rc1
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.