aGLM is the natural evolution of Professor Codephreak's work: automind led to automindx, and automindx led to MASTERMIND, with the goal of building an Autonomous General Learning Model. automindx is now the conda build environment for aGLM MASTERMIND with RAGE.

sudo apt install wget
# fetch the raw installer script (the /blob/ page URL serves HTML, not the script itself)
wget https://raw.githubusercontent.com/GATERAGE/aglm/main/aglm.install
chmod +x aglm.install && ./aglm.install
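
If the download succeeded, aglm.install should be a shell script rather than a saved HTML page; a quick check (assuming a POSIX shell):

head -n 1 aglm.install
#expect a shebang line such as #!/bin/bash, not <!DOCTYPE html>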

USE MANUAL INSTALL FOR NOW


clone aGLM and install requirements

git clone https://github.com/gaterage/aglm/
cd aglm
#install pip if you haven't already
sudo apt install python3-pip
#display version of pip installed
pip3 --version
#build a conda environment
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod +x Miniconda3-latest-Linux-x86_64.sh
sudo ./Miniconda3-latest-Linux-x86_64.sh
#reload the shell configuration; the example is for bash, change it for your shell
source ~/.bashrc
conda create --name automindx python=3.9.1
conda init
#reload the shell again so conda activate is available
source ~/.bashrc
conda activate automindx
#install automind requirements with pip
pip install -r requirements.txt
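
Optional sanity check that the environment and the llama-cpp-python dependency installed (llama-cpp-python is assumed here because the troubleshooting section below refers to it):

python --version
#should report Python 3.9.1 from the automindx environment
python -c "import llama_cpp; print(llama_cpp.__version__)"
#prints the llama-cpp-python version if the build succeeded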

the uiux call downloads the language model on first deployment

python3 uiux.py --model_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --tokenizer_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --model_type="ggml" --save_history --file_name="llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin"

To install, right-click automindx.install, choose "Save link as ...", then run: chmod +x automindx.install && ./automindx.install

automind.install

detailed and verbose procedure

  1. Right-click the following link: automindx.install

  2. Choose "Save link as..." or "Download linked file" from the context menu.

  3. Select a location on your computer to save the file.

  4. From the terminal, make the installer executable and run it (see the sketch after this list):

  5. chmod +x automindx.install && ./automindx.install
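
A minimal terminal sketch of steps 4 and 5, assuming the file was saved to ~/Downloads (adjust the path to wherever you saved it):

cd ~/Downloads
chmod +x automindx.install && ./automindx.install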

---------------------------------

Example: loading automind on Ubuntu 22.04 LTS

This example creates Professor Codephreak.

Professor Codephreak is an expert in machine learning, computer science, and computer programming. The codephreak agenda: to create the AUTOMINDx autonomous deployment.

Default model: llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin

manual install miniconda

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod +x Miniconda3-latest-Linux-x86_64.sh
sudo ./Miniconda3-latest-Linux-x86_64.sh
#reload the shell configuration; the example is for bash, change it for your shell
source ~/.bashrc
conda create --name automind python=3.9.1

#initialize conda for your shell (replace bash with your shell)
conda init bash
#activate the environment created above
conda activate automind

#conda reference: list environments, inspect packages, and deactivate
conda env list
conda list
conda info package_name
conda deactivate

clone original Professor Codephreak automind and install requirements

git clone https://github.com/Professor-Codephreak/automind
cd automind
#install pip if you haven't already
sudo apt install python3-pip
#display version of pip installed
pip3 --version
#install automind requirements with pip
pip install -r requirements.txt

the uiux call downloads the language model on first deployment

python3 uiux.py --model_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --tokenizer_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --model_type="ggml" --save_history --file_name="llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin"

the same call to uiux.py RUNS the model after the model has been downloaded

python uiux.py --model_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --tokenizer_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --model_type="ggml" --save_history --file_name="llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin"

file structure

A model_name.txt placed in the models folder with your model is read automatically; the above call to uiux.py overrides model_name.txt.
models folder = models
memory folder = memory
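
A minimal sketch, assuming model_name.txt simply holds the filename of the GGML model to load (the exact format is an assumption, not documented here):

echo "llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin" > models/model_name.txt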



# troubleshooting
llamacpp source build dependencies include
sudo apt-get install build-essential cmake gcc g++ git python3-dev python3-pip libstdc++6 make pkg-config
# git and wget
sudo apt-get install git wget
# manual llamacpp pip install and uninstall
pip uninstall llama-cpp-python
pip install llama-cpp-python
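# if a cached build keeps failing, a clean rebuild sometimes helps (standard pip flags, not specific to this project)
pip install --no-cache-dir --force-reinstall llama-cpp-python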
# On Ubuntu 22.04.6 automind.install works with cmake version 3.27.2
cmake --version
# On Ubuntu 22.04.6 automind.install works with gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
gcc --version
# config gcc alternatives
sudo update-alternatives --config gcc
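# if --config reports no alternatives for gcc, register them first (hypothetical example; assumes gcc-11 and gcc-12 are installed, adjust to your system)
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-11 110
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-12 120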
# install pip3
sudo apt-get install python3-pip
pip3 --version

# conda
conda deactivate
conda activate automindx
# diagnostics
sudo apt-get install hardinfo htop nvtop
# fortran for scipy
sudo apt install gfortran


Full command sequence: a rewrite of the basic manual install above

sudo apt-get install build-essential cmake gcc g++ git python3-dev python3-pip libstdc++6 make pkg-config
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod +x Miniconda3-latest-Linux-x86_64.sh
sudo ./Miniconda3-latest-Linux-x86_64.sh
source ~/.bashrc
conda create --name automind python=3.9.1
conda init
source ~/.bashrc
conda activate automind
git clone https://github.com/Professor-Codephreak/automind
cd automind
#install pip if you haven't already
sudo apt install python3-pip
#display version of pip installed
pip3 --version
#install automind requirements with pip
pip install -r requirements.txt


# RUN codephreak

python3 uiux.py --model_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --tokenizer_name="TheBloke/llama2-7b-chat-codeCherryPop-qLoRA-GGML" --model_type="ggml" --save_history --file_name="llama-2-7b-chat-codeCherryPop.ggmlv3.q4_1.bin"