How to get the accuracy per epoch or step for the huggingface.transformers Trainer?

I'm using the huggingface Trainer with a BertForSequenceClassification.from_pretrained("bert-base-uncased") model.

Simplified, it looks like this:

from transformers import BertForSequenceClassification, BertTokenizer, Trainer, TrainingArguments

model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

training_args = TrainingArguments(
        output_dir="bert_results",
        num_train_epochs=3,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=32,
        warmup_steps=500,
        weight_decay=0.01,
        logging_dir="bert_results/logs",
        logging_steps=10
        )

trainer = Trainer(
        model=model,
        args=training_args,
        train_dataset=train_dataset,
        eval_dataset=val_dataset,
        compute_metrics=compute_metrics
        )

The logs contain the loss every 10 steps, but I can't find the training accuracy. Does anyone know how to get the accuracy, for example by changing the verbosity of the logger? I haven't been able to find anything about it online.

asked Sep 10 '25 by CptBaas

1 Answer

You can load the accuracy metric and use it inside your compute_metrics function. For example:

import numpy as np
from datasets import load_metric

metric = load_metric("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a tuple of (logits, labels) collected over the evaluation set
    predictions, labels = eval_pred
    predictions = np.argmax(predictions, axis=1)  # pick the highest-scoring class per example
    return metric.compute(predictions=predictions, references=labels)

This compute_metrics function is based on Hugging Face's text classification tutorial, and it worked in my tests.
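To have the accuracy reported per step or per epoch during training (rather than only when you call trainer.evaluate() yourself), you also need to tell the Trainer how often to run evaluation. A minimal sketch, reusing the model, datasets, and compute_metrics from above; evaluation_strategy and eval_steps are standard TrainingArguments options, though depending on your transformers version the first one may be named eval_strategy:

training_args = TrainingArguments(
        output_dir="bert_results",
        num_train_epochs=3,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=32,
        evaluation_strategy="steps",  # run evaluation (and compute_metrics) every eval_steps; use "epoch" for per-epoch
        eval_steps=10,
        logging_dir="bert_results/logs",
        logging_steps=10,
        )

trainer = Trainer(
        model=model,
        args=training_args,
        train_dataset=train_dataset,
        eval_dataset=val_dataset,
        compute_metrics=compute_metrics,
        )

trainer.train()  # the logs now include eval_loss and eval_accuracy at each evaluation step

Note that compute_metrics runs on the eval_dataset, so what you get is validation accuracy at each evaluation point, which is usually the quantity you want to monitor during training.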

answered Sep 12 '25 by lucasresck