---
library_name: peft
base_model: jinaai/jina-embeddings-v2-base-en
tags:
- medical
- cardiology
- embeddings
- domain-adaptation
- lora
- sentence-transformers
- sentence-similarity
language:
- en
license: apache-2.0
---

# CardioEmbed-Jina-v2

**Domain-specialized cardiology text embeddings using LoRA-adapted Jina-v2**

Part of a comparative study of 10 embedding architectures for clinical cardiology text.

## Performance

| Metric | Score |
|--------|-------|
| Separation Score | **-0.175** |

## Usage

```python
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

# The Jina-v2 base model ships custom modeling code, so trust_remote_code is required.
base_model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained("jinaai/jina-embeddings-v2-base-en")
model = PeftModel.from_pretrained(base_model, "richardyoung/CardioEmbed-Jina-v2")
```

## Training

- **Training Data**: 106,535 cardiology text pairs from medical textbooks
- **Method**: LoRA fine-tuning (r=16, alpha=32)
- **Loss**: Multiple Negatives Ranking Loss (InfoNCE)

## Citation

```bibtex
@article{young2024comparative,
  title={Comparative Analysis of LoRA-Adapted Embedding Models for Clinical Cardiology Text Representation},
  author={Young, Richard J and Matthews, Alice M},
  journal={arXiv preprint},
  year={2024}
}
```
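
## Example: Sentence Similarity

A minimal sketch of producing sentence embeddings with the adapted model and comparing them. Mean pooling over token embeddings and the `embed` helper are illustrative assumptions, not part of the released code; match the pooling strategy used during training if it differs.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

base_model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained("jinaai/jina-embeddings-v2-base-en")
model = PeftModel.from_pretrained(base_model, "richardyoung/CardioEmbed-Jina-v2")
model.eval()

def embed(texts):
    # Tokenize, run the adapted encoder, mean-pool token embeddings, and L2-normalize.
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    token_embeddings = outputs.last_hidden_state
    mask = inputs["attention_mask"].unsqueeze(-1).float()
    pooled = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
    return F.normalize(pooled, p=2, dim=1)

sentences = [
    "Atrial fibrillation increases the risk of thromboembolic stroke.",
    "Irregular atrial rhythm is associated with higher stroke risk.",
]
embeddings = embed(sentences)
similarity = embeddings[0] @ embeddings[1]
print(f"Cosine similarity: {similarity.item():.3f}")
```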
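
## Training Loss Sketch

For reference, a compact sketch of Multiple Negatives Ranking Loss (in-batch InfoNCE) as described above. This is an illustration of the objective, not the released training script; the `scale` value is an assumption.

```python
import torch
import torch.nn.functional as F

def multiple_negatives_ranking_loss(anchor_emb, positive_emb, scale=20.0):
    # Each anchor's matching positive (same row) is the target; all other
    # positives in the batch serve as in-batch negatives.
    scores = scale * F.normalize(anchor_emb, dim=1) @ F.normalize(positive_emb, dim=1).T
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```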