Note
This note serves as a cheatsheet for the Hugging Face Hub and the transformers library.
Install and Download
Download models with huggingface-cli
pip install -U "huggingface_hub[cli]"
Then log in with an access token:
huggingface-cli login
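Once logged in, gated or private repos can be downloaded as well. A minimal sketch of the Python equivalent using huggingface_hub (the repo id "gpt2" is just an example placeholder):

```python
# download a full model snapshot from the Hub into the local cache
# (repo_id "gpt2" is an example; any Hub repo id works)
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="gpt2")
print(local_dir)  # path to the cached snapshot on disk
```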
Generation
sample
# https://github.com/huggingface/transformers/blob/831bc25d8fdb85768402f772cf65cc3d7872b211/src/transformers/generation/utils.py#L2724-L2725
next_token_scores = logits_processor(input_ids, next_token_logits)
next_token_scores = logits_warper(input_ids, next_token_scores)
logits_processors are always applied before logits_warpers.
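In other words, constraint-style processors (e.g. minimum length, repetition penalties) see the raw logits first, and sampling warpers (temperature, top-k/top-p) reshape the already-processed scores. A minimal sketch reproducing that ordering outside of generate() (the model name "gpt2" and the particular processors/warpers are just example choices):

```python
# reproduce the processor -> warper ordering for a single sampling step
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    LogitsProcessorList,
    MinLengthLogitsProcessor,
    TemperatureLogitsWarper,
    TopKLogitsWarper,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

input_ids = tokenizer("Hello, my name is", return_tensors="pt").input_ids

# processors enforce constraints on the raw logits ...
logits_processor = LogitsProcessorList(
    [MinLengthLogitsProcessor(10, eos_token_id=tokenizer.eos_token_id)]
)
# ... warpers then reshape the distribution for sampling
logits_warper = LogitsProcessorList(
    [TemperatureLogitsWarper(0.7), TopKLogitsWarper(50)]
)

with torch.no_grad():
    next_token_logits = model(input_ids).logits[:, -1, :]

# same order as in generate(): processors first, then warpers
next_token_scores = logits_processor(input_ids, next_token_logits)
next_token_scores = logits_warper(input_ids, next_token_scores)

probs = torch.nn.functional.softmax(next_token_scores, dim=-1)
next_token = torch.multinomial(probs, num_samples=1)
print(tokenizer.decode(next_token[0]))
```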