Dataset Tools

Blog post

I use this collection of scripts to create new datasets for training LLMs.

  • stage1: Store the raw CSV file and clean up the base data manually
  • stage2: Create a JSONL file from each CSV file (a sketch of this step follows the list)
  • stage3: Combine all JSONL files into one
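
The stage2 script itself is not reproduced here; the following is only a minimal sketch of what that step amounts to, assuming each CSV row becomes one JSON object per line with the column headers as keys. The positional arguments mirror the usage shown under Example Usage, but the real create-jsonl.py may handle the conversion and logging differently.

```python
# Sketch of the stage2 step (CSV -> JSONL). The argument names match the CLI usage
# shown below; the row-to-object mapping (CSV columns become JSON keys) is an
# assumption, not necessarily what create-jsonl.py does.
import argparse
import csv
import json
from pathlib import Path


def main() -> None:
    parser = argparse.ArgumentParser(description="Convert a CSV file to JSONL.")
    parser.add_argument("input_file", help="Path to the source CSV file")
    parser.add_argument("output_dir", help="Directory to write the JSONL file into")
    parser.add_argument("output_file", help="Name of the JSONL file to create")
    args = parser.parse_args()

    output_path = Path(args.output_dir) / args.output_file
    with open(args.input_file, newline="", encoding="utf-8") as src, \
            open(output_path, "w", encoding="utf-8") as dst:
        for row in csv.DictReader(src):
            # One JSON object per line: column names become keys (assumed schema).
            dst.write(json.dumps(row, ensure_ascii=False) + "\n")


if __name__ == "__main__":
    main()
```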

Example Usage:

  • Required: Be sure to install the dependencies listed in requirements.txt
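
A standard way to do that is `pip install -r requirements.txt`, run from the repository root, ideally inside a virtual environment.
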
```
$ python ./data/stage2/create-jsonl.py
usage: create-jsonl.py [-h] input_file output_dir output_file
create-jsonl.py: error: the following arguments are required: input_file, output_dir, output_file

$ python ./data/stage3/combine-jsonl.py
usage: combine-jsonl.py [-h] directory_path output_file
combine-jsonl.py: error: the following arguments are required: directory_path, output_file

$ python ./data/stage2/create-jsonl.py ./data/stage1/scrape-results1.csv ./data/stage2 scrape-results1.jsonl
2024-05-21 03:33:50 [info     ] CSV processing complete        output_file=data/stage2/scrape-results1.jsonl

$ python ./data/stage3/combine-jsonl.py ./data/stage2 ./data/stage3/final.jsonl
2024-05-21 03:36:42 [info     ] Merged JSONL files             output_file=./data/stage3/final.jsonl
```
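
For reference, here is a minimal sketch of the stage3 merge, assuming combine-jsonl.py simply concatenates every *.jsonl file found in the given directory. The positional arguments match the usage above, but the real script's behavior (ordering, filtering, logging) may differ.

```python
# Sketch of the stage3 step (merge JSONL files). The argument names match the CLI
# usage shown above; concatenating every *.jsonl file in the directory is an
# assumption about what combine-jsonl.py does internally.
import argparse
from pathlib import Path


def main() -> None:
    parser = argparse.ArgumentParser(description="Combine JSONL files into one.")
    parser.add_argument("directory_path", help="Directory containing .jsonl files")
    parser.add_argument("output_file", help="Path of the combined JSONL file")
    args = parser.parse_args()

    with open(args.output_file, "w", encoding="utf-8") as dst:
        for jsonl_path in sorted(Path(args.directory_path).glob("*.jsonl")):
            with open(jsonl_path, encoding="utf-8") as src:
                for line in src:
                    line = line.strip()
                    if line:  # skip blank lines so the merged file stays one object per line
                        dst.write(line + "\n")


if __name__ == "__main__":
    main()
```

Note that the output file should live outside the input directory (as in the example above, which writes to ./data/stage3), so the merged file is not picked up as an input on a later run.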