nielsr (HF Staff) committed
Commit a350e93 · verified · 1 Parent(s): e71673f

Add pipeline tag and library name to model card


Hi! I'm Niels from the Hugging Face community team.

This PR improves your model card's metadata by adding:
- `pipeline_tag: feature-extraction`: This helps users find your model when filtering by task on the Hub.
- `library_name: transformers`: This enables an automated code snippet on the model page, showcasing how to use the model with the Transformers library.

The rest of the model card remains unchanged.
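For reference, the card's full YAML frontmatter after this change, reassembled from the diff below:

```yaml
---
base_model:
- Qwen/Qwen3-Embedding-4B
language:
- en
- zh
license: apache-2.0
tags:
- embedding
- retriever
- RAG
pipeline_tag: feature-extraction
library_name: transformers
---
```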

Files changed (1)
  1. README.md +11 -5
README.md CHANGED
@@ -1,14 +1,16 @@
 ---
-license: apache-2.0
+base_model:
+- Qwen/Qwen3-Embedding-4B
 language:
 - en
 - zh
-base_model:
-- Qwen/Qwen3-Embedding-4B
+license: apache-2.0
 tags:
 - embedding
 - retriever
 - RAG
+pipeline_tag: feature-extraction
+library_name: transformers
 ---
 
 # Mindscape-Aware RAG (MiA-RAG)
@@ -99,13 +101,17 @@ Use this mode to retrieve narrative text chunks. A **Global Summary** is injecte
 def get_query_prompt(query, summary="", residual=False):
     """Construct input prompt with global summary (Eq. 5 in paper)."""
     task_desc = "Given a search query with the book's summary, retrieve relevant chunks or helpful entities summaries from the given context that answer the query"
-    summary_prefix = "\n\nHere is the summary providing possibly useful global information. Please encode the query based on the summary:\n"
+    summary_prefix = "
+
+Here is the summary providing possibly useful global information. Please encode the query based on the summary:
+"
 
     # Insert PAD token to capture residual embedding before the summary
     middle_token = tokenizer.pad_token if residual else ""
 
     return (
-        f"Instruct: {task_desc}\n"
+        f"Instruct: {task_desc}
+"
         f"Query: {query}{middle_token}{summary_prefix}{summary}{node_delimiter}"
     )
 
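The `get_query_prompt` helper in the diff references names defined elsewhere in the model card (`tokenizer`, `node_delimiter`). A minimal runnable sketch, with both stubbed as placeholder assumptions, reads:

```python
# Self-contained sketch of get_query_prompt as it appears before this commit.
# PAD_TOKEN and node_delimiter are placeholder assumptions: in the actual
# repo they come from the model's tokenizer (tokenizer.pad_token) and the
# retrieval code that splits context into nodes.
PAD_TOKEN = "<|pad|>"             # assumption: stand-in for tokenizer.pad_token
node_delimiter = "<|endoftext|>"  # assumption: stand-in for the repo's delimiter


def get_query_prompt(query, summary="", residual=False):
    """Construct input prompt with global summary (Eq. 5 in paper)."""
    task_desc = (
        "Given a search query with the book's summary, retrieve relevant "
        "chunks or helpful entities summaries from the given context that "
        "answer the query"
    )
    summary_prefix = (
        "\n\nHere is the summary providing possibly useful global "
        "information. Please encode the query based on the summary:\n"
    )

    # Insert PAD token to capture residual embedding before the summary
    middle_token = PAD_TOKEN if residual else ""

    return (
        f"Instruct: {task_desc}\n"
        f"Query: {query}{middle_token}{summary_prefix}{summary}{node_delimiter}"
    )


print(get_query_prompt("Who betrays the protagonist?", summary="A short summary.", residual=True))
```

With `residual=True`, the pad token lands between the query and the summary prefix, so a residual embedding can be read off before the summary tokens are encoded.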