# alime-reranker-large-zh

The alime reranker model: a cross-encoder that scores the relevance of Chinese query–passage pairs.

## Usage

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Query–passage pairs to score:
#   ["Where is West Lake?", "The West Lake scenic area is in Hangzhou, Zhejiang Province"]
#   ["The weather is nice today", "You are driving me crazy"]
pairs = [["θ₯ΏζΉ–εœ¨ε“ͺ?", "θ₯ΏζΉ–ι£Žζ™―εθƒœεŒΊδ½δΊŽζ΅™ζ±Ÿηœζ­ε·žεΈ‚"],
         ["δ»Šε€©ε€©ζ°”δΈι”™", "δ½ ε“ζ­»ζˆ‘δΊ†"]]

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained("Pristinenlp/alime-reranker-large-zh")
model = AutoModelForSequenceClassification.from_pretrained("Pristinenlp/alime-reranker-large-zh").to(device)
model.eval()

inputs = tokenizer(pairs, padding=True, truncation=True, max_length=512, return_tensors="pt").to(device)
with torch.no_grad():
    # One relevance logit per pair; higher means more relevant.
    scores = model(**inputs, return_dict=True).logits.view(-1).float()
print(scores.tolist())
```
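The scores above are unnormalized logits, so the typical way to use them is to sort a query's candidate passages by score. Below is a minimal sketch of that reranking step; `score_pairs` is a hypothetical stand-in for the model call (with hard-coded dummy scores), so the example stays self-contained, and in practice it would return `model(**inputs).logits.view(-1).tolist()`.

```python
# Sketch: rank candidate passages for one query by reranker score.
def score_pairs(pairs):
    # Dummy scores for illustration only; a real implementation would
    # tokenize the pairs and read logits from the model as shown above.
    dummy = {"West Lake is in Hangzhou": 7.2,
             "Hangzhou is in Zhejiang": 2.5,
             "The weather is nice": -3.1}
    return [dummy[passage] for _, passage in pairs]

def rerank(query, passages):
    scores = score_pairs([[query, p] for p in passages])
    # Pair each passage with its score and sort by descending relevance.
    return sorted(zip(passages, scores), key=lambda x: x[1], reverse=True)

query = "Where is West Lake?"
candidates = ["The weather is nice",
              "Hangzhou is in Zhejiang",
              "West Lake is in Hangzhou"]
for passage, score in rerank(query, candidates):
    print(f"{score:+.1f}  {passage}")
```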