A simple, lightweight, easy-to-use, pythonic AI conference paper collector.
Open Manual_total.xlsx; all crawled papers are listed there.
Filtering in Excel (custom filter conditions):
- AND: put the conditions on the same row
- OR: put the conditions on different rows
- Does not contain: `<>*substring*`
- Contains: `*substring*`
- Note: do not wrap conditions in quotes
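If you prefer to apply the same contains / not-contains logic programmatically, here is a minimal sketch assuming pandas is available and the workbook has a `Title` column (the column name and search terms are just examples, not the repo's actual layout):

```python
import pandas as pd

# Load the collected papers; file and column names are assumptions.
df = pd.read_excel("Manual_total.xlsx")

# AND (same row): title contains "detection" and does not contain "3D".
mask = df["Title"].str.contains("detection", case=False, na=False) & \
       ~df["Title"].str.contains("3D", case=False, na=False)

# OR (different rows): title contains "segmentation" or "tracking".
mask |= df["Title"].str.contains("segmentation|tracking", case=False, na=False)

print(df[mask])
```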
Conferences | 2018 | 2019 | 2020 | 2021 | 2022 |
---|---|---|---|---|---|
AAAI 🐛 | ☑ | ☑ | ☑ | ☑ | |
CVPR | ☑ | ☑ | ☑ | ☑ | |
ICCV | ☑ | ☑ | ☑ | ☑ | |
ECCV | ☑ | ☑ | ☑ | ☑ | |
ICLR | ☑ | ☑ | ☑ | ☑ | |
ICML | ☑ | ☑ | ☑ | ☑ | |
IJCAI | ☑ | ☑ | ☑ | ☑ | |
NIPS | ☑ | ☑ | ☑ | ☑ | |
Journals | 2018 | 2019 | 2020 | 2021 | 2022 |
---|---|---|---|---|---|
AI | | | | | |
TPAMI | | | | | |
TNN | | | | | |
- Scrapy
- Selenium
- Peewee (an ORM framework for SQLite; see the model sketch below)
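As a rough idea of how Peewee maps paper records onto SQLite, here is a minimal model sketch; the database file name and the fields are assumptions, not the repo's actual schema:

```python
from peewee import SqliteDatabase, Model, CharField, TextField, IntegerField

db = SqliteDatabase("papers.db")  # hypothetical database file

class Paper(Model):
    title = CharField()
    abstract = TextField(null=True)
    conference = CharField()
    year = IntegerField()

    class Meta:
        database = db

db.connect()
db.create_tables([Paper])
Paper.create(title="An Example Paper", abstract="...",
             conference="CVPR", year=2020)
```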
We provide a shell script to crawl all conferences' papers.
We assume your starting path is this repo's root directory.
After your environment is ready, just run:
cd spider_conference
bash ./crawl.sh
CSV files will be created under the project path.
If you only want to crawl a specific conference's papers, run:
scrapy crawl <CONFERENCE_NAME>
You can find the available <CONFERENCE_NAME> values in crawl.sh; running `scrapy list` also prints every registered spider.
After collecting all conferences, go to the project path and run:
python csv2xlsx.py
and you will get Collector.xlsx, containing all the papers.
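The exact logic of csv2xlsx.py is not reproduced here; as an illustration, a minimal pandas sketch that merges per-conference CSVs into one workbook might look like this (the file pattern and sheet naming are assumptions):

```python
import glob
import pandas as pd

# Merge every per-conference CSV into one workbook, one sheet per file.
with pd.ExcelWriter("Collector.xlsx") as writer:
    for path in sorted(glob.glob("*.csv")):
        sheet = path.rsplit(".", 1)[0][:31]  # Excel caps sheet names at 31 chars
        pd.read_csv(path).to_excel(writer, sheet_name=sheet, index=False)
```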
- Use Scrapy to fetch paper information (title, abstract, etc.) from the designated conference URLs (a minimal spider sketch follows this list).
- Save this information into CSV files or a lightweight SQLite database.
- Offer tools to pull the papers we want from the database and convert them into Excel format (in the future).
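For readers unfamiliar with Scrapy, the first bullet roughly corresponds to a spider like the one below; the spider name, start URL, and CSS selectors are placeholders, not the repo's actual values:

```python
import scrapy

class ExampleConferenceSpider(scrapy.Spider):
    # Name and URL are illustrative; each real venue needs its own spider.
    name = "EXAMPLE_CONF"
    start_urls = ["https://example.org/proceedings/2020"]

    def parse(self, response):
        # Selectors are placeholders; real proceedings pages differ per venue.
        for paper in response.css("div.paper"):
            yield {
                "title": paper.css("a.title::text").get(),
                "abstract": paper.css("p.abstract::text").get(),
            }
```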
This section records resources for developing a spider program.
- The abstracts of two records in ECCV.csv were split across multiple lines and need manual correction. In the CSV, search for `angle$ triplet needs to be maintained correctly. For example` and `ho$. Furthermore` to locate these two bad records.
- AAAI: over 1000 more records were collected than were actually published, indicating that some records were split into multiple rows.
- NIPS: some records are garbled when written into the CSV file.
- Possible problem: the CSV files should be saved and opened in UTF-8 encoding; otherwise the content of these records causes an error when the file is opened in Excel (see the check sketch after this list).
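As a quick sanity check for the split-record and encoding issues above, a small script along these lines can flag malformed rows (the file name is an example):

```python
import csv

# Read with utf-8-sig so a BOM written by Excel is handled transparently,
# then flag rows whose field count differs from the header, a common
# symptom of a record that was split across multiple lines.
with open("ECCV.csv", newline="", encoding="utf-8-sig") as f:
    reader = csv.reader(f)
    header = next(reader)
    for lineno, row in enumerate(reader, start=2):
        if len(row) != len(header):
            print(f"row {lineno}: expected {len(header)} fields, got {len(row)}")
```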
- Finish all listed conferences' spiders
- Add SQL support
- Create a tiny website, with a UI and some search tools.