Limit trl version in example (#12332)
* Limit trl version in example

JinBridger authored Nov 5, 2024
1 parent 923d696 commit 82a61b5
Showing 15 changed files with 29 additions and 29 deletions.
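
This commit replaces the unpinned `trl` requirement with the PEP 440 specifier `"trl<0.12.0"`; the quotes matter because an unquoted `<` would be interpreted by the shell as output redirection. As a rough illustration of which versions the `<0.12.0` bound admits, here is a simplified sketch (it compares plain release segments only and ignores pre-/post-/dev-release handling, which pip's real resolver implements in full):

```python
def parse(version: str) -> tuple:
    # Simplified: split a release version like "0.11.4" into integer parts.
    # Full PEP 440 parsing (pre/post/dev segments, epochs) is more involved.
    return tuple(int(part) for part in version.split("."))

def satisfies_lt(installed: str, bound: str) -> bool:
    # True when installed < bound -- the check behind the "trl<0.12.0" pin.
    return parse(installed) < parse(bound)

print(satisfies_lt("0.11.4", "0.12.0"))  # True: trl 0.11.4 is accepted
print(satisfies_lt("0.12.1", "0.12.0"))  # False: trl 0.12.1 is rejected
```

Tuple comparison is lexicographic, so `(0, 11, 4) < (0, 12, 0)` holds while `(0, 12, 1) < (0, 12, 0)` does not, matching the intent of the pin.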
@@ -19,7 +19,7 @@ conda activate llm
# install ipex-llm with 'all' option
pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

-pip install torchvision tiktoken transformers==4.42.4 trl
+pip install torchvision tiktoken transformers==4.42.4 "trl<0.12.0"
```

On Windows:
@@ -30,7 +30,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
-pip install torchvision tiktoken transformers==4.42.4 trl
+pip install torchvision tiktoken transformers==4.42.4 "trl<0.12.0"
```

### 2. Run
@@ -18,7 +18,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

# install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

On Windows:
@@ -29,7 +29,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

## 2. Run
@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pyt

# transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
```
On Windows:

@@ -31,7 +31,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
```

### 2. Run
@@ -18,7 +18,7 @@ conda activate llm
# install ipex-llm with 'all' option
pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
pip install torchvision==0.16.2 --index-url https://download.pytorch.org/whl/cpu
-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
```
On Windows:

@@ -28,7 +28,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
pip install torchvision==0.16.2 --index-url https://download.pytorch.org/whl/cpu
-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
```

### 2. Run
4 changes: 2 additions & 2 deletions python/llm/example/CPU/PyTorch-Models/Model/glm4/README.md
@@ -21,7 +21,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

# install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

On Windows:
@@ -32,7 +32,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

### 2. Run
6 changes: 3 additions & 3 deletions python/llm/example/GPU/HuggingFace/LLM/gemma2/README.md
@@ -4,7 +4,7 @@ In this directory, you will find examples on how you could apply IPEX-LLM INT4 o
## Requirements
To run these examples with IPEX-LLM on Intel GPUs, we have some recommended requirements for your machine, please refer to [here](../../../README.md#requirements) for more information.

-**Important: According to Gemma2's requirement, please make sure you have installed `transformers==4.43.1` and `trl` to run the example.**
+**Important: According to Gemma2's requirement, please make sure you have installed `transformers==4.43.1` and `trl<0.12.0` to run the example.**

## Example: Predict Tokens using `generate()` API
In the example [generate.py](./generate.py), we show a basic use case for a Gemma2 model to predict the next N tokens using `generate()` API, with IPEX-LLM INT4 optimizations on Intel GPUs.
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

# According to Gemma2's requirement, please make sure you are using a stable version of Transformers, 4.43.1 or newer.
pip install "transformers>=4.43.1"
-pip install trl
+pip install "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -33,7 +33,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

# According to Gemma2's requirement, please make sure you are using a stable version of Transformers, 4.43.1 or newer.
pip install "transformers>=4.43.1"
-pip install trl
+pip install "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
4 changes: 2 additions & 2 deletions python/llm/example/GPU/HuggingFace/LLM/glm4/README.md
@@ -14,7 +14,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

# install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

# install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

## 2. Configures OneAPI environment variables for Linux
4 changes: 2 additions & 2 deletions python/llm/example/GPU/HuggingFace/LLM/llama3.1/README.md
@@ -17,7 +17,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

# transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -31,7 +31,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

# transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
4 changes: 2 additions & 2 deletions python/llm/example/GPU/HuggingFace/LLM/llama3.2/README.md
@@ -17,7 +17,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

pip install transformers==4.45.0
pip install accelerate==0.33.0
-pip install trl
+pip install "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -31,7 +31,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

pip install transformers==4.45.0
pip install accelerate==0.33.0
-pip install trl
+pip install "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
@@ -15,7 +15,7 @@ conda activate llm
# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
@@ -15,7 +15,7 @@ conda activate llm
# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
@@ -15,7 +15,7 @@ conda activate llm
# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install tiktoken transformers==4.42.4 trl
+pip install tiktoken transformers==4.42.4 "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
# below command will install intel_extension_for_pytorch==2.1.10+xpu as default
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install tiktoken transformers==4.42.4 trl
+pip install tiktoken transformers==4.42.4 "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
@@ -19,7 +19,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
pip install transformers==4.36.0 datasets
pip install peft==0.10.0
-pip install bitsandbytes scipy trl
+pip install bitsandbytes scipy "trl<0.12.0"
```

### 2. Configures OneAPI environment variables
2 changes: 1 addition & 1 deletion python/llm/example/GPU/Lightweight-Serving/README.md
@@ -41,7 +41,7 @@ pip install gradio # for gradio web UI
conda install -c conda-forge -y gperftools=2.10 # to enable tcmalloc

# for glm-4v-9b
-pip install transformers==4.42.4 trl
+pip install transformers==4.42.4 "trl<0.12.0"

# for internlm-xcomposer2-vl-7b
pip install transformers==4.31.0
4 changes: 2 additions & 2 deletions python/llm/example/GPU/PyTorch-Models/Model/glm4/README.md
@@ -16,7 +16,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

# install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

#### 1.2 Installation on Windows
@@ -29,7 +29,7 @@ conda activate llm
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

# install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
```

### 2. Configures OneAPI environment variables for Linux
