sitloboi2012/SEMIKONG-8B-Instruct-GPTQ

SEMIKONG is a foundation model for the semiconductor manufacturing process. It aims to serve as a baseline for the future development of LLMs in use cases that require domain-specific knowledge of semiconductor manufacturing.

SEMIKONG is the result of a collaboration between FPT AI Center, AItomatic, and Tokyo Electron, with support from the AI Alliance and IBM.

Model Details

SEMIKONG is built on the Llama 3 Instruct model.

We divided the training process into three main phases: pretraining on domain knowledge -> self-finetuning on an instruction dataset -> merging and quantization.

The model was benchmarked on a domain-expert-reviewed dataset, which will be released to the community soon.

Model Description

  • Developed by: FPT AI Center, AItomatic and Tokyo Electron
  • Funded by: FPT AI Center, AItomatic
  • Shared by: AI Alliance
  • Model type: Instruction Model
  • Language(s): English
  • License: Apache License 2.0
  • Finetuned from model: Llama 3

Uses

The model can be used as an API endpoint for extended development use cases such as RAG, chatbots, etc.

You can use either vLLM or NVIDIA NIM to set up the serving infrastructure for the model.
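As one possible setup, here is a minimal sketch of querying the model through vLLM's OpenAI-compatible server. The server command, port, and the `ask` helper are illustrative assumptions, not part of this release:

```python
# Start vLLM's OpenAI-compatible server first (illustrative command):
#   vllm serve sitloboi2012/SEMIKONG-8B-Instruct-GPTQ --port 8000
import json
import urllib.request


def build_chat_request(question: str,
                       model: str = "sitloboi2012/SEMIKONG-8B-Instruct-GPTQ",
                       max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion payload for the vLLM endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "max_tokens": max_tokens,
    }


def ask(question: str, base_url: str = "http://localhost:8000/v1") -> str:
    """POST a chat request to the running server and return the reply text."""
    payload = json.dumps(build_chat_request(question)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same endpoint can then back a RAG pipeline or chatbot by routing retrieved context into the user message.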

Out-of-Scope Use

This first version of SEMIKONG supports only the etching process and general domain-specific knowledge of the semiconductor manufacturing process.

Future versions of SEMIKONG will extend its knowledge to the end-to-end semiconductor manufacturing process.

Model Card Authors

Model Card Contact

