
Leaderboard

Popular Content

Showing content with the highest reputation on 01/04/23 in all areas

  1. Looks like they got hit. It's not mine, but the archive with the database is freely available. BIG LOL https://anonfiles.com/I7GbZbP8y5/jerryspizza.ro_dump_zip
    3 points
  2. Hi everyone, I have a cybersecurity project that I have been working on for a while together with a former coworker, and I would like to present it to you in the hope that you will come and test it. We were also at DefCamp, where we had a stand (thanks @Andrei for the opportunity) and ran a bug bounty program (which is still valid: whoever finds a valid bug and reports it will be rewarded with Emag vouchers, depending on severity).
     In short, it is https://razdon.com, a website that lets you onboard your site and watch your traffic LIVE, with extra security context for every request. The live part is presented as a dashboard where you can see a world map and all the requests coming toward your server's location. There is a screenshot attached below. As you can see, the project was developed on RST, and here you can see a window of roughly 8 hours with all the statistics related to RST over those 8 hours, plus the live traffic, of course.
     Besides the live dashboard, which requires minimal interaction (basically just selecting the site in scope, in case you have several), there is also a traffic analysis section. In traffic analysis you can search through all requests over a given period of time for something specific (e.g. all requests with a 4XX or 3XX status code). The analysis page also includes a short history of recent attacks, together with their type. You can see part of that page below.
     Another fairly interesting menu is the SSL certificates one, where you can check your certificate's expiration date (and in the future we will also implement an alerting system, both for certificates and for attacks). A screenshot of the certificates section is below.
     We could also implement a WAF, but for now we have zero focus in that direction. Very soon we will also release a beta of the artificial intelligence / machine learning part, which will maximize the efficiency of attack detection.
     That said, if anyone is interested in such a product, registrations are open and you can follow the necessary steps to view your traffic. To head off the "how do you do this" questions: the only thing required is the Apache or Nginx logs with the website's traffic. At the moment we collect these logs with a binary written in Go (for efficiency), but I will also add a raw variant that sends the logs to our API without running a binary, for safety reasons (see the sketch after this list). A small diagram to better understand how it all works is below. Thanks, and have a nice evening!
     P.S. If anyone wants to buy us for about 2-3 million euros, send me a PM; after that it gets more expensive.
    2 points
  3. @spiderincearca https://parsec.app if you haven't solved it by now.
    1 point
  4. This is an OSINT tool. Its main purpose is to collect information from different sources such as Google, Tinder, Twitter and more. It combines facial recognition methods to filter the results (see the sketch after this list) and uses natural language processing to extract important entities from the websites where the user appears. The tool calculates a final score that indicates how much public exposure a user has on the Internet. It has two modules that can work independently: a CLI and a Web Interface. Both modules are built with Docker and are easy to deploy. If you like the tool, give us a star!
     CLI
     CLI module for web scraping: Tinder, Instagram, Yandex, Google, Facebook, BOE, Twitter.
     Prerequisites: Docker and docker-compose
     Installation:
     docker build -t spyscrap .
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap [options]
     You must put the image you want to use for facial recognition under the shared Docker volume, as in the next example:
     docker run -ti -v /Users/ruthgnz/Documents/osint/SpyScrap/src/data:/spyscrap/data spyscrap -t twitter -n "ruth gonzalez novillo" -i ./data/descarga.jpeg
     Usage:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap [options]
     Get Tinder users and store the data in an sqlite3 database. The Tinder token must be captured from Local Storage when logging into the Tinder app:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t tinder -k TOKEN
     Search in Google. Add -i to download images and do facial recognition; add -p to only search a specific site (e.g. LinkedIn):
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap --tag google -n "<name surname>"
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap --tag google -n "<name surname>" -i <imagePath>
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap --tag google -n "<name surname>" -i <imagePath> -p "<Place>"
     Search Twitter profiles:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t twitter -n "<name surname>" -s <number of twitter pages to search>
     Search Facebook profiles. Add -i to download images and do facial recognition:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t facebook -n "<name surname>"
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap --tag facebook -n "<name surname>" -i <imagePath>
     Search Instagram profiles. Add -i to download the Instagram profile image and do facial recognition:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t instagram -n "<name surname>"
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t instagram -n "<name surname>" -i <imagePath>
     Search DNI, names and surnames in the BOE:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t boe -n "<text to search>" -s <number of BOE pages to search>
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap -t boe -n "<text to search>" -s <number of BOE pages to search> -e <boolean> -d <init date> -f <final date>
     Other examples:
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap main.py -t yandex -k <imgur id> -i <imagePath>
     docker run -ti -v /PATH/TO/SpyScrap/src/data:/spyscrap/data spyscrap main.py -t yandex -i <imgUrl>
     All results are stored in the Docker shared volume you configured on your host when running the container. In -v /PATH/TO/SpyScrap/src/data:/spyscrap/data, the first part is the path to your local folder and you can change it; the second part must stay as in the example (/spyscrap/data).
     Web Interface
     This is a wrapper for the CLI.
     Prerequisites: Docker and docker-compose
     Installation:
     cd web
     docker-compose up
     Once the images are built, open the browser at http://localhost. For searching in Tinder, you must put the database.db file created with the CLI into the volume inside the folder SpyScrap\web\data. You will also find the results of all your web-interface searches in this folder.
     DISCLAIMER: This tool is for educational purposes only. Please only use this tool on systems you have permission to access, and only for ethical purposes. Any actions and/or activities related to the tools we have created are solely your responsibility. Misuse of these tools can result in criminal charges being brought against the persons in question. We will not be held responsible in the event that criminal charges are brought against any individuals who misuse these tools to break the law.
     Authors: Ruth González - @RuthGnz, Miguel Hernández - @MiguelHzBz
     Thanks: BBVA Next Technologies SecLab Team. Feel free to collaborate! By @RuthGnz & @MiguelHzBz
     Download: SpyScrap-master.zip or git clone https://github.com/RuthGnz/SpyScrap.git Source
    1 point
  5. A DOM-based XSS in www.intel.com. Unfortunately, they do not offer money for web applications. https://app.intigriti.com/programs/intel/intel/detail The vulnerability has been reported.
    1 point
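
For context on the razdon.com post (item 2): the author says the raw alternative to their Go collector is simply sending the Apache/Nginx access logs to their API. Below is a minimal sketch of that idea in Python. The endpoint URL, auth header, payload shape, and batch size are hypothetical placeholders, since the post does not document the real API.

# Minimal sketch: follow an Nginx access log and ship new lines to a collection API.
# The endpoint, header, and payload shape are assumptions, not razdon.com's real API.
import time
import requests

LOG_PATH = "/var/log/nginx/access.log"
API_URL = "https://example.invalid/api/logs"   # placeholder endpoint
API_KEY = "YOUR_API_KEY"                       # placeholder credential
BATCH_SIZE = 50

def follow(path):
    """Yield new lines appended to the file, similar to `tail -f`."""
    with open(path, "r") as f:
        f.seek(0, 2)  # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line.rstrip("\n")

def main():
    batch = []
    for line in follow(LOG_PATH):
        batch.append(line)
        if len(batch) >= BATCH_SIZE:
            # Send the batch of raw log lines; the server parses them for security context.
            requests.post(
                API_URL,
                json={"lines": batch},
                headers={"Authorization": f"Bearer {API_KEY}"},
                timeout=10,
            )
            batch = []

if __name__ == "__main__":
    main()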
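
And for the SpyScrap post (item 4): the README says downloaded images are filtered with facial recognition against the reference photo passed via -i. The sketch below shows that kind of filtering step, assuming the common face_recognition Python package; SpyScrap's actual implementation may differ, and the file paths are placeholders.

# Sketch of a facial-recognition filter: keep only downloaded images matching a reference face.
# Assumes the `face_recognition` package (pip install face_recognition); paths are placeholders.
from pathlib import Path
import face_recognition

REFERENCE_IMAGE = "data/reference.jpeg"   # the photo passed with -i <imagePath>
CANDIDATES_DIR = "data/downloads"         # images scraped from Google/Twitter/etc.

def load_encoding(path):
    """Return the first face encoding found in an image, or None if no face is detected."""
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    return encodings[0] if encodings else None

def filter_matches(reference_path, candidates_dir, tolerance=0.6):
    reference = load_encoding(reference_path)
    if reference is None:
        raise ValueError("No face found in the reference image")
    matches = []
    for candidate in Path(candidates_dir).glob("*.jp*g"):
        encoding = load_encoding(candidate)
        if encoding is None:
            continue  # no face detected, discard the image
        if face_recognition.compare_faces([reference], encoding, tolerance=tolerance)[0]:
            matches.append(candidate)
    return matches

if __name__ == "__main__":
    for path in filter_matches(REFERENCE_IMAGE, CANDIDATES_DIR):
        print("match:", path)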