An advanced web directory scanning tool, designed to be more powerful than DirBuster, Dirsearch, cansina, and Yu Jian.
After considerable research, we concluded that an excellent web directory scanning tool needs at least the following features:
- A concurrency engine
- Dictionary-based scanning
- Pure brute-force scanning
- Dynamic dictionary generation by crawling pages
- Fuzz scanning
- Custom requests
- Custom response processing...
Now take a look at Dirmap's features:
- Concurrent scanning of n targets × n payloads
- Recursive scanning
- Custom status codes that trigger recursive scanning
- Single- and multi-dictionary scanning
- Brute-forcing with a custom character set
- Dynamic dictionary scanning driven by a crawler
- Fuzzing target URLs with custom labels
- Custom request User-Agent
- Custom random request delay
- Custom request timeout
- Custom request proxy
- Custom regular expressions to match fake 404 pages
- Custom response status codes to process
- Skipping pages of size x
- Displaying content-type
- Displaying page size
- Saving results per domain name, with deduplication
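At its core, every feature above decorates the same loop: concurrently request `base_url + payload` for each payload and keep the interesting responses. A minimal sketch in Python (illustrative only, not Dirmap's actual code; the `fetch` callable is a stand-in for the real HTTP layer):

```python
# Minimal sketch of a concurrent dictionary scan (not Dirmap's implementation).
from concurrent.futures import ThreadPoolExecutor

def scan(base_url, wordlist, fetch, workers=10):
    """Probe base_url + payload for every payload, concurrently.

    `fetch` is any callable that returns an HTTP status code for a URL,
    so the network layer stays pluggable (and testable).
    """
    urls = [base_url.rstrip("/") + "/" + w.lstrip("/") for w in wordlist]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        codes = pool.map(fetch, urls)   # preserves input order
    # Keep only hits; plain 404s are discarded.
    return [(u, c) for u, c in zip(urls, codes) if c != 404]
```

Dictionary mode, brute-force mode, and crawler mode differ only in where `wordlist` comes from; response processing differs only in the filter at the end.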
Installation:

```shell
git clone https://github.com/H4ckForJob/dirmap.git && cd dirmap && python3 -m pip install -r requirement.txt
```
Single target (http is assumed by default):

```shell
python3 dirmap.py -i https://target.com -lcf
python3 dirmap.py -i 192.168.1.1 -lcf
```

Subnet (CIDR format):

```shell
python3 dirmap.py -i 192.168.1.0/24 -lcf
```

Network range:

```shell
python3 dirmap.py -i 192.168.1.1-192.168.1.100 -lcf
```

Target file (all of the above formats are supported in targets.txt):

```shell
python3 dirmap.py -iF targets.txt -lcf
```
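For example, a targets.txt mixing the supported formats might look like:

```
https://target.com
192.168.1.1
192.168.1.0/24
192.168.1.1-192.168.1.100
```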
- Results are automatically saved in the `output` folder in the project root directory.
- Each target generates a txt file named after the target: `domain/ip.txt`.
- Results are automatically deduplicated.
Customize the Dirmap configuration to explore its advanced features.
Advanced options are configured by loading a configuration file; **detailed configuration via command-line parameters is not supported**!
Edit `dirmap.conf` in the project root directory. Detailed usage instructions are included in `dirmap.conf` itself.
- Command-line parameter parsing, global initialization
- Engine initialization
  - set the number of threads
- Target initialization
  - automatic parsing of input formats (-i, inputTarget)
    - IP
    - Domain
    - URL
    - IP/MASK
    - IP Start-End
  - file reading (-iF, inputLocalFile)
    - automatic parsing of input formats (same as -i)
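The input formats above can be expanded into concrete hosts with the standard library. A sketch (illustrative; the function name is not Dirmap's own):

```python
# Expand an IP, CIDR subnet, or IP range into a list of hosts
# (single IPs, domains, and URLs pass through unchanged).
import ipaddress
import re

def expand_target(value):
    # IP Start-End range, e.g. 192.168.1.1-192.168.1.100
    m = re.fullmatch(r"(\d{1,3}(?:\.\d{1,3}){3})-(\d{1,3}(?:\.\d{1,3}){3})", value)
    if m:
        start = int(ipaddress.ip_address(m.group(1)))
        end = int(ipaddress.ip_address(m.group(2)))
        return [str(ipaddress.ip_address(i)) for i in range(start, end + 1)]
    # IP/MASK subnet, e.g. 192.168.1.0/24
    if re.fullmatch(r"\d{1,3}(?:\.\d{1,3}){3}/\d{1,2}", value):
        net = ipaddress.ip_network(value, strict=False)
        return [str(h) for h in net.hosts()]
    return [value]  # single IP, domain, or URL
```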
- Bruter initialization
  - load configuration mode
    - read command-line parameter values
    - read the configuration file (-lcf, loadConfigFile)
  - recursive scan options (RecursiveScan)
    - recursive scanning (-rs, recursive_scan)
    - status codes that trigger recursion (-rd, recursive_status_code)
    - exclude certain subdirectories (-es, exclude_subdirs)
  - scan mode options (ScanModeHandler)
    - dictionary mode (-dm, dict_mode)
      - load a single dictionary (-dmlsd, dict_mode_load_single_dict)
      - load multiple dictionaries (-dmlmd, dict_mode_load_mult_dict)
    - brute-force mode (-bm, blast_mode)
      - directory length range (required)
        - minimum length (-bmmin, blast_mode_min)
        - maximum length (-bmmax, blast_mode_max)
      - based on the default character sets
        - a-z
        - 0-9
      - based on a custom character set (-bmcc, blast_mode_custom_charset)
      - resume payload generation from a breakpoint (-bmrc, blast_mode_resume_charset)
    - crawler mode (-cm, crawl_mode)
      - custom HTML tags to parse (-cmph, crawl_mode_parse_html) (a:href, img:src, form:action, script:src, iframe:src, div:src, frame:src, embed:src)
      - parse robots.txt (-cmpr, crawl_mode_parse_robots)
      - dynamic fuzz scan based on crawl results (-cmdf, crawl_mode_dynamic_fuzz)
    - fuzz mode (-fm, fuzz_mode)
      - fuzz with a single dictionary (-fmlsd, fuzz_mode_load_single_dict)
      - fuzz with multiple dictionaries (-fmlmd, fuzz_mode_load_mult_dict)
      - fuzz label (-fml, fuzz_mode_label)
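Brute-force ("blast") mode enumerates every candidate name over a character set within a length range. A sketch of the payload generation (illustrative, not Dirmap's implementation):

```python
# Generate every string of length min_len..max_len over a character set.
from itertools import product

def blast_payloads(charset, min_len, max_len):
    for length in range(min_len, max_len + 1):
        for combo in product(charset, repeat=length):
            yield "".join(combo)
```

For charset "ab" and lengths 1-2 this yields a, b, aa, ab, ba, bb. The count grows as |charset|^length, which is exactly why resuming payload generation from a breakpoint (-bmrc) is worth having.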
  - request optimization options (RequestHandler)
    - custom request timeout (-rt, request_timeout)
    - custom request delay (-rd, request_delay)
    - limit the number of concurrent coroutines per target host (-rl, request_limit)
    - limit the number of retries (-rmr, request_max_retries)
    - HTTP persistent connections (-rpc, request_persistent_connect)
    - custom request method (-rm, request_method) (get, head)
    - 302 redirect handling (-r3, redirection_302) (follow redirects)
    - custom headers
      - custom additional headers (-rh, request_headers) (e.g. to satisfy 401 authentication)
      - custom User-Agent (-rhua, request_header_ua)
      - custom Cookie (-rhc, request_header_cookie)
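As an illustration of what these request options amount to, here is how a single probe request could be assembled with the standard library (a sketch under stated assumptions; Dirmap uses its own request engine, and `build_request` is a hypothetical helper):

```python
# Build a HEAD request with a custom User-Agent, Cookie, and extra headers.
import urllib.request

def build_request(url, user_agent="Mozilla/5.0", cookie=None, extra_headers=None):
    headers = {"User-Agent": user_agent}
    if cookie:
        headers["Cookie"] = cookie
    headers.update(extra_headers or {})
    return urllib.request.Request(url, headers=headers, method="HEAD")

# Sending it with a timeout would wrap:
#   urllib.request.urlopen(req, timeout=5)
# with retry and delay logic around the call.
```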
  - dictionary processing options (PayloadHandler)
    - payload modification: remove slashes
    - payload modification: prepend a slash
    - payload modification: capitalize the first letter
    - payload modification: remove file extensions
    - payload modification: remove non-alphanumeric characters
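The payload modifications above are simple string transforms. Sketched as pure functions (illustrative; the function names are not Dirmap's own):

```python
import re

def strip_slashes(p):       # remove slashes: "/admin/" -> "admin"
    return p.strip("/")

def prepend_slash(p):       # ensure a leading slash: "admin" -> "/admin"
    return p if p.startswith("/") else "/" + p

def capitalize_first(p):    # "admin" -> "Admin"
    return p[:1].upper() + p[1:]

def drop_extension(p):      # "index.php" -> "index"
    return p.rsplit(".", 1)[0] if "." in p else p

def alnum_only(p):          # remove non-alphanumeric characters
    return re.sub(r"[^0-9A-Za-z]", "", p)
```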
- Response processing module (ResponseHandler)
  - skip responses of size x bytes (-ss, skip_size)
  - automatically detect fake 404 pages (-ac4p, auto_check_404_page)
  - custom 503 page handling (-c5p, custom_503_page)
  - custom regular-expression matching of response content, followed by some action
    - custom regex match on the response (-crp, custom_response_page)
    - the action itself (not yet defined)
  - output only custom status codes (-rsc, response_status_code)
  - output payloads as full paths (by default the completed URL is output)
  - show content-type in the results
  - automatically deduplicate results
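Fake-404 detection usually works by probing a path that cannot exist, recording that response's signature, and discarding later responses that look the same. A sketch of the filtering step only (illustrative; Dirmap's exact heuristic may differ):

```python
# Build a predicate that flags responses resembling the baseline fake 404.
def make_404_filter(baseline_body, size_tolerance=16):
    baseline_size = len(baseline_body)
    def looks_like_404(body):
        # Here "looks the same" means: body size within the tolerance.
        return abs(len(body) - baseline_size) <= size_tolerance
    return looks_like_404
```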
- Status processing module (StatusHandler)
  - status display (waiting to start, running, paused, abnormal, completed)
  - progress display
  - status control (start, pause, resume, stop)
- Resume-scanning module (not yet implemented)
  - resume from a breakpoint
  - resume from a selected line
- Logging module (ScanLogHandler)
  - scan log
  - error log
- Proxy module (ProxyHandler)
  - single proxy (-ps, proxy_server)
  - proxy pool
- Debug mode option (DebugMode)
  - debug (--debug)
- Check-for-updates option (CheckUpdate)
  - update (--update)
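The start/pause/resume/stop control behind a status handler is commonly built on a `threading.Event` that workers check between payloads. A sketch of the idea (illustrative only; not Dirmap's StatusHandler code):

```python
import threading

class ScanControl:
    def __init__(self):
        self._running = threading.Event()
        self._stopped = threading.Event()
        self._running.set()             # start in the "running" state

    def pause(self):
        self._running.clear()           # checkpoint() will now block

    def resume(self):
        self._running.set()

    def stop(self):
        self._stopped.set()
        self._running.set()             # unblock any paused workers

    def checkpoint(self):
        """Workers call this between payloads; blocks while paused."""
        self._running.wait()
        return not self._stopped.is_set()   # False means: stop scanning
```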
The dictionary files are stored in the `data` folder in the project root directory.
- dict_mode_dict.txt: "dictionary mode" dictionary, taken from dirsearch's default dictionary
- crawl_mode_suffix.txt: "crawler mode" dictionary, taken from FileSensor's default dictionary
- fuzz_mode_dir.txt: "fuzz mode" dictionary, taken from DirBuster's default dictionary
- fuzz_mode_ext.txt: "fuzz mode" dictionary of common file extensions
- dictmult: default multi-dictionary folder for "dictionary mode", containing BAK.min.txt (small backup-file dictionary), BAK.txt (large backup-file dictionary), and LEAKS.txt (information-leak file dictionary)
- fuzzmult: default multi-dictionary folder for "fuzz mode", containing fuzz_mode_dir.txt (default directory dictionary) and fuzz_mode_ext.txt (default extension dictionary)
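Multi-dictionary modes boil down to merging several wordlist files into one deduplicated list. A sketch (illustrative; not Dirmap's loader):

```python
# Load several dictionary files into one order-preserving, deduplicated list.
def load_dicts(paths):
    seen, words = set(), []
    for path in paths:
        with open(path, encoding="utf-8", errors="ignore") as fh:
            for line in fh:
                word = line.strip()
                if word and word not in seen:   # skip blanks and duplicates
                    seen.add(word)
                    words.append(word)
    return words
```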
- "crawler mode" only crawls the current page of the target and is used to generate a dynamic dictionary. The project will separate the "crawler module" from the "generate dynamic dictionary function" in the future.
- If bruter.py line 517 (`bar.log.start()`) raises an error, install progressbar2 and uninstall progressbar, so that the identically named module is not imported instead. Thanks to the user who reported this.

```shell
python3 -m pip uninstall progressbar
python3 -m pip install progressbar2
```
- If you run into a problem while using the tool, feel free to open an issue.
- The project is under active maintenance and new features will be added; see the “TODO” list for details.
While writing Dirmap, I borrowed many patterns and ideas from excellent open-source projects, and I would like to express my gratitude to their authors.
mail: xxlin.ujs@qq.com