## An Aggressive OSINT Script in Python + Google
Unveiling Digital Constellations
The Quest for Hidden Files
In the vast expanse of the internet, information lies scattered like constellations in the night sky. Some data is readily accessible, while other nuggets remain elusive, tucked away in obscure corners. As seekers of knowledge, we yearn to unravel these mysteries: to find that elusive file, that critical document, or that hidden gem that holds the answers we seek.
FindAnyFile is a simple Python script designed to harness the power of Google Search in our quest for digital treasures.
Picture it as a digital bloodhound, sniffing out relevant documents, spreadsheets, and presentations across the vast online universe. Let's explore how this tool works, its ethical considerations, and its potential impact. May your cosmic searches lead you to hidden galaxies of wisdom!
Requirements:

- Python3
- Git
- Pip

```bash
git clone https://github.com/snooptsz/findanyfile
cd findanyfile
pip install -r requirements.txt
python3 FindAnyFile.py
```

Video: https://www.youtube.com/watch?v=kuZ4eVpAqVA
""""""
from googlesearch import search
def find_files(query, file_types):
"""
Searches for files online using Google Search.
Args:
query: The search terms for finding files.
file_types: List of file types to search for.
Returns:
A dictionary where keys are file types and values are lists of URLs containing potentially relevant files.
"""
file_links = {}
for file_type in file_types:
search_term = query + f" filetype:{file_type}"
results = search(search_term)
file_links[file_type] = results
return file_links
def main():
"""
Prompts the user for a search query and displays results for various file formats.
"""
query = input("Enter your search query for files: ")
file_types = ['txt', 'bin', 'asp', 'doc', 'pdf', 'exif', 'xls', 'xlsx', 'xml', 'csv', 'html', 'docx', 'msg', 'json', 'log', 'gdoc'] # Add more file types as needed
try:
file_links = find_files(query, file_types)
for file_type, links in file_links.items():
if links:
print(f"Found {file_type.upper()} files:")
for link in links:
print(link)
else:
print(f"No {file_type.upper()} files found for your query.")
except Exception as e:
print(f"An error occurred: {e}")
if __name__ == "__main__":
main()
"""""
Our tool wields a virtual crowbar, prying apart the layers of corporate intrigue. It can reveal clandestine boardroom discussions, financial maneuvers, and hidden agendas. The forbidden fruit of insider knowledge tempts us, but at what cost?
Like a digital whisperer, our script surfaces memos that were never meant to circulate. Classified directives, covert operations, and policy shifts emerge from the shadows. But tread lightly, for these corridors hold echoes of power struggles and delicate diplomacy.
The tool's gaze penetrates private realms: the intimate exchanges of emails, the hushed conversations in chat logs. It reveals love letters, family secrets, and confessions.
- Yet, here lies the greatest ethical precipice: to respect privacy or to delve deeper?
Ensuring strong cybersecurity measures is akin to maintaining order and cleanliness in our physical lives. We can take numerous actions to enhance our cybersecurity, such as regularly rotating passphrases (rather than short passwords), implementing robust data backup strategies, and using two-factor authentication as a minimum security standard. Beyond that, there are plenty of other valuable tips to consider.
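On the passphrase point, here is a minimal sketch of how one could be generated with Python's standard-library secrets module; the word list is a tiny illustrative sample, and a real one (for example a Diceware list) would be far larger:

```python
# A minimal sketch (not part of FindAnyFile): generating a random passphrase
# with the standard-library `secrets` module. The word list is illustrative;
# swap in a proper Diceware-style list for real use.
import secrets

WORDS = ["orbit", "cobalt", "lantern", "meadow", "pixel", "saffron", "tundra", "velvet"]

def make_passphrase(n_words: int = 5, sep: str = "-") -> str:
    """Return a passphrase built from n_words randomly chosen words."""
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

print(make_passphrase())  # e.g. "velvet-orbit-saffron-pixel-tundra"
```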
I will also provide some informative links and share one of my recent stories that presents an innovative approach to thwarting crawlers or bots by adjusting your "Robots.txt" file.
It is perfect for hunting data online. It can search for any type of file; just edit the file_types list in the script, as shown below.
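For instance, to hunt only for presentations and archives, the default list could be trimmed to those extensions (an illustrative edit, not the script's shipped configuration):

```python
# Illustrative edit inside FindAnyFile.py: narrow the search to a few formats.
file_types = ['ppt', 'pptx', 'zip', 'rar']  # replace or extend as needed
```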
A name and surname can be a good starting hint when hunting for people online. Note that names are rarely unique; countless similar ones exist all over the world. A tip that helps is to combine the name with the target's location. For example, if you are looking for a Mary based in Italy, focus on .it and .com results; you can likely exclude .fr, .au, or .uk domains, unless you are unsure which country the target's links live in. A query along those lines is sketched below.
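That location filter can be baked straight into the query with Google's site: exclusion operator; the name below is purely hypothetical:

```python
# A hedged sketch: narrowing a people search by location with Google operators.
# "Mary Rossi" is a hypothetical name, not a real target.
from FindAnyFile import find_files

query = '"Mary Rossi" Italy -site:fr -site:uk -site:au'  # drop unlikely TLDs
results = find_files(query, ['pdf', 'docx'])
```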
Are you hunting for a book you could not find? You know it lives somewhere on the World Wide Web, but you can't track it down. Cool, we got you. Just write your search, set the filetype you are after, and wait for results. Please be ethical and do not abuse the tool.
FindAnyFile is perfect for grey literature, as it queries Google across the entire web, reaching even files that were not always meant for the public. Be nice.
---
You can search for any file, but please be aware that the results may include confidential content. Note that the App queries the entire web through Google and can surface confidential and unprotected files, some of which may be unwanted. Please use the App ethically, report any security issues, and do not abuse it. Remember, this App is intended for educational purposes only; you are solely responsible for your actions.
Remember to use proxy servers, VPNs, sandbox environments, VMs or containers, and, at a minimum, a customizable and secure browser.
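On the proxy point, a minimal sketch: set the standard proxy environment variables before the search runs. Whether they are honored depends on the HTTP client the googlesearch package uses under the hood, so treat this as an assumption to verify; the address is a hypothetical local proxy.

```python
# A hedged sketch: route FindAnyFile's outbound traffic through a proxy via the
# standard proxy environment variables. Verify that the underlying HTTP client
# honors them; the address below is a hypothetical local proxy.
import os

os.environ["HTTP_PROXY"] = "http://127.0.0.1:8080"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"

from FindAnyFile import find_files  # import after the proxy is configured

results = find_files("site:example.com", ["pdf"])
```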
Always double-check and READ the web links before you open them. Only click through to government, police, or similar websites if the content is clearly in the public domain.
Lastly, remember that the App has a hidden power that can be unleashed in a different way. I will not disclose it for now, but I can tell you that if you blend the two Apps I used to make the App on a website (and I don't mean build a website), you will only have to copy and paste the code for your online ethical development.
---
The robots.txt file serves as a crucial tool for webmasters to communicate instructions to web robots, particularly search engine robots, about the crawling and indexing of specific pages on their websites. It plays a pivotal role in the Robots Exclusion Protocol (REP), governing the interactions between robots and web pages. A typical robots.txt file includes one or more rules, each comprising a User-agent line and one or more Disallow or Allow directives.
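A minimal illustration of such a file; the paths and the bot name are placeholders:

```
# Placeholder example of a robots.txt file
User-agent: *
Disallow: /private/
Allow: /private/press-kit/

# Block one specific (hypothetical) crawler entirely
User-agent: BadBot
Disallow: /
```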
- Full story on Medium: https://medium.com/the-first-digit/robots-txt-a-file-we-all-better-know-a-bit-about-218794b6c1c7
Disclaimer: No bananas, cucumbers, or watermelons were harmed during the making of F!Nd@NyFiL3.
XxX - The App was on GitHub already, but mysteriously, one day it just disappeared... Today, we made it live again with the whole profile. If anyone wants to get in touch, please do not damage it, as life is a circle... If so, it may disappear by mistake... - XxX