Fix DPO finetuning example (#12313)
JinBridger authored Nov 1, 2024
1 parent 05c5d02 commit 126f95b
Showing 2 changed files with 5 additions and 4 deletions.
5 changes: 3 additions & 2 deletions python/llm/example/GPU/LLM-Finetuning/DPO/README.md
@@ -17,8 +17,9 @@ conda create -n llm python=3.11
 conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-pip install transformers==4.36.0 datasets
-pip install trl peft==0.10.0
+pip install datasets
+pip install peft==0.10.0
+pip install 'trl<0.9'
 # Note, if you don't want to reinstall BNBs dependencies, append the `--no-deps` flag!
 pip install --no-deps --force-reinstall 'https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_multi-backend-refactor/bitsandbytes-0.44.1.dev0-py3-none-manylinux_2_24_x86_64.whl'
 ```
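The README change above replaces an unpinned `pip install trl` with the bounded `pip install 'trl<0.9'`, plausibly because later trl releases reworked the `DPOTrainer` API that this example targets (the commit message only says it fixes the example). As a toy illustration of what a strict upper-bound specifier accepts, here is a stdlib-only sketch; it is a stand-in for pip's full PEP 440 handling, not the real resolver, and the helper name is invented for this example:

```python
def satisfies_upper_bound(version: str, bound: str = "0.9") -> bool:
    """Return True if `version` is strictly below `bound`.

    Toy comparison of dotted numeric releases only -- pre-release and
    post-release tags from PEP 440 are deliberately ignored here.
    """
    def parse(v: str) -> tuple:
        # Keep only numeric components, e.g. "0.8.6" -> (0, 8, 6).
        return tuple(int(part) for part in v.split(".") if part.isdigit())

    return parse(version) < parse(bound)

# A trl 0.8.x release satisfies 'trl<0.9'; 0.9.0 and later do not.
print(satisfies_upper_bound("0.8.6"))   # accepted by the pin
print(satisfies_upper_bound("0.9.0"))   # rejected by the pin
```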
4 changes: 2 additions & 2 deletions python/llm/example/GPU/LLM-Finetuning/DPO/dpo_finetuning.py
@@ -34,12 +34,12 @@
 import os
 import torch
 
-from ipex_llm.transformers.qlora import get_peft_model, prepare_model_for_kbit_training
-from ipex_llm.transformers import AutoModelForCausalLM
 import transformers
 from transformers import AutoTokenizer, TrainingArguments, BitsAndBytesConfig
 from datasets import load_dataset
 from peft import LoraConfig
+from ipex_llm.transformers.qlora import get_peft_model, prepare_model_for_kbit_training
+from ipex_llm.transformers import AutoModelForCausalLM
 from trl import DPOTrainer
 import argparse
 
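The hunk above moves the `ipex_llm` imports after `peft`, plausibly so that ipex-llm's QLoRA-patched `get_peft_model` is set up after the stock `peft` module has loaded (the diff itself does not state the reason). For orientation, the objective that the imported `DPOTrainer` optimizes can be sketched per preference pair in plain Python. This is a didactic sketch of the standard DPO loss, not trl's implementation; the function name and arguments are illustrative:

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """Per-pair DPO loss:
    -log(sigmoid(beta * (chosen log-ratio - rejected log-ratio))),
    where each log-ratio is log pi_policy(y|x) - log pi_ref(y|x)."""
    chosen_logratio = policy_chosen_logp - ref_chosen_logp
    rejected_logratio = policy_rejected_logp - ref_rejected_logp
    margin = beta * (chosen_logratio - rejected_logratio)
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# At initialization the policy equals the reference model, so both
# log-ratios are zero and the loss starts at -log(0.5) = log(2).
loss_at_start = dpo_loss(-1.0, -2.0, -1.0, -2.0)
```

Training pushes the chosen log-ratio above the rejected one, driving the loss below its `log(2)` starting point.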
