
Leaderboard

Popular Content

Showing content with the highest reputation on 09/09/17 in all areas

  1. dcrawl is a simple but smart multi-threaded web crawler for randomly gathering huge lists of unique domain names.

How does it work?
dcrawl takes one site URL as input and detects all <a href=...> links in the site's body. Each found link is put into the queue. Successively, each queued link is crawled in the same way, branching out to more URLs found in links on each site's body.

How smart crawling works:
- Branches out only to a predefined number of links found per hostname.
- Enforces a maximum number of allowed different hostnames per domain (avoids subdomain crawling hell, e.g. blogspot.com); see the small eTLD+1 sketch at the end of this post.
- Can be restarted with the same list of domains - the last saved domains are added to the URL queue.
- Crawls only sites that return a text/html Content-Type in the HEAD response.
- Retrieves a site body of at most 1 MB.
- Does not save inaccessible domains.

How to run?
go build dcrawl.go
./dcrawl -url http://wired.com -out ~/domain_lists/domains1.txt -t 8

Usage

 ___ __
 __| _/________________ __ _ _| |
 / __ |/ ___\_ __ \__ \\ \/ \/ / |
/ /_/ \ \___| | \// __ \\ /| |__
\____ |\___ >__| (____ /\/\_/ |____/
 \/ \/ \/ v.1.0

usage: dcrawl -url URL -out OUTPUT_FILE -t THREADS
  -ms int
        maximum different subdomains for one domain (def. 10) (default 10)
  -mu int
        maximum number of links to spider per hostname (def. 5) (default 5)
  -out string
        output file to save hostnames to
  -t int
        number of concurrent threads (def. 8) (default 8)
  -url string
        URL to start scraping from
  -v bool
        verbose (default false)

Mirror:

package main

import (
    "bufio"
    "flag"
    "fmt"
    "io"
    "io/ioutil"
    "net"
    "net/http"
    "net/url"
    "os"
    "regexp"
    "strings"
    "time"

    "golang.org/x/net/publicsuffix"
)

const Version = "1.0"
const BodyLimit = 1024 * 1024
const MaxQueuedUrls = 4096
const MaxVisitedUrls = 8192
const UserAgent = "dcrawl/1.0"

var http_client *http.Client

var (
    start_url           = flag.String("url", "", "URL to start scraping from")
    output_file         = flag.String("out", "", "output file to save hostnames to")
    max_threads         = flag.Int("t", 8, "number of concurrent threads (def. 8)")
    max_urls_per_domain = flag.Int("mu", 5, "maximum number of links to spider per hostname (def. 5)")
    max_subdomains      = flag.Int("ms", 10, "maximum different subdomains for one domain (def. 10)")
    verbose             = flag.Bool("v", false, "verbose (def. false)")
)

type ParsedUrl struct {
    u    string
    urls []string
}

func stringInArray(s string, sa []string) bool {
    for _, x := range sa {
        if x == s {
            return true
        }
    }
    return false
}

func get_html(u string) ([]byte, error) {
    req, err := http.NewRequest("HEAD", u, nil)
    if err != nil {
        return nil, err
    }
    req.Header.Set("User-Agent", UserAgent)
    resp, err := http_client.Do(req)
    if err != nil {
        return nil, err
    }
    if resp.StatusCode != http.StatusOK {
        return nil, fmt.Errorf("HTTP response %d", resp.StatusCode)
    }
    if _, ct_ok := resp.Header["Content-Type"]; ct_ok {
        ctypes := strings.Split(resp.Header["Content-Type"][0], ";")
        if !stringInArray("text/html", ctypes) {
            return nil, fmt.Errorf("URL is not 'text/html'")
        }
    }
    req.Method = "GET"
    resp, err = http_client.Do(req)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()
    b, err := ioutil.ReadAll(io.LimitReader(resp.Body, BodyLimit)) // limit response reading to 1MB
    if err != nil {
        return nil, err
    }
    return b, nil
}

func find_all_urls(u string, b []byte) []string {
    r, _ := regexp.Compile(`<a\s+(?:[^>]*?\s+)?href=["\']([^"\']*)`)
    urls := r.FindAllSubmatch(b, -1)
    var rurls []string
    ru, _ := regexp.Compile(`^(?:ftp|http|https):\/\/(?:[\w\.\-\+]+:{0,1}[\w\.\-\+]*@)?(?:[a-z0-9\-\.]+)(?::[0-9]+)?(?:\/|\/(?:[\w#!:\.\?\+=&%@!\-\/\(\)]+)|\?(?:[\w#!:\.\?\+=&%@!\-\/\(\)]+))?$`)
    for _, ua := range urls {
        if ru.Match(ua[1]) {
            rurls = append(rurls, string(ua[1]))
        } else if len(ua) > 0 && len(ua[1]) > 0 && ua[1][0] == '/' {
            up, err := url.Parse(u)
            if err == nil {
                ur := up.Scheme + "://" + up.Host + string(ua[1])
                if ru.MatchString(ur) {
                    rurls = append(rurls, ur)
                }
            }
        }
    }
    return rurls
}

func grab_site_urls(u string) ([]string, error) {
    var ret []string
    b, err := get_html(u)
    if err == nil {
        ret = find_all_urls(u, b)
    }
    return ret, err
}

func process_urls(in <-chan string, out chan<- ParsedUrl) {
    for {
        var u string = <-in
        if *verbose {
            fmt.Printf("[->] %s\n", u)
        }
        urls, err := grab_site_urls(u)
        if err != nil {
            u = ""
        }
        out <- ParsedUrl{u, urls}
    }
}

func is_blacklisted(u string) bool {
    var blhosts []string = []string{
        "google.com", ".google.", "facebook.com", "twitter.com", ".gov",
        "youtube.com", "wikipedia.org", "wikisource.org", "wikibooks.org",
        "deviantart.com", "wiktionary.org", "wikiquote.org", "wikiversity.org",
        "wikia.com", "deviantart.com", "blogspot.", "wordpress.com",
        "tumblr.com", "about.com",
    }
    for _, bl := range blhosts {
        if strings.Contains(u, bl) {
            return true
        }
    }
    return false
}

func create_http_client() *http.Client {
    var transport = &http.Transport{
        Dial: (&net.Dialer{
            Timeout: 10 * time.Second,
        }).Dial,
        TLSHandshakeTimeout: 5 * time.Second,
        DisableKeepAlives:   true,
    }
    client := &http.Client{
        Timeout:   time.Second * 10,
        Transport: transport,
    }
    return client
}

func banner() {
    fmt.Println(` ___ __ `)
    fmt.Println(` __| _/________________ __ _ _| | `)
    fmt.Println(` / __ |/ ___\_ __ \__ \\ \/ \/ / | `)
    fmt.Println(`/ /_/ \ \___| | \// __ \\ /| |__`)
    fmt.Println(`\____ |\___ >__| (____ /\/\_/ |____/`)
    fmt.Println(` \/ \/ \/ v.` + Version)
    fmt.Println("")
}

func usage() {
    fmt.Printf("usage: dcrawl -url URL -out OUTPUT_FILE\n\n")
}

func init() {
    http_client = create_http_client()
}

func main() {
    banner()
    flag.Parse()

    if *start_url == "" || *output_file == "" {
        usage()
        return
    }

    fmt.Printf("[*] output file: %s\n", *output_file)
    fmt.Printf("[*] start URL: %s\n", *start_url)
    fmt.Printf("[*] max threads: %d\n", *max_threads)
    fmt.Printf("[*] max links: %d\n", *max_urls_per_domain)
    fmt.Printf("[*] max subd: %d\n", *max_subdomains)
    fmt.Printf("\n")

    vurls := make(map[string]bool)
    chosts := make(map[string]int)
    dhosts := make(map[string]bool)
    ldhosts := make(map[string]int)

    var qurls []string
    var thosts []string

    fo, err := os.OpenFile(*output_file, os.O_APPEND, 0666)
    if os.IsNotExist(err) {
        fo, err = os.Create(*output_file)
    }
    if err != nil {
        fmt.Fprintf(os.Stderr, "ERROR: can't open or create file '%s'", *output_file)
        return
    }
    defer fo.Close()

    scanner := bufio.NewScanner(fo)
    nd := 0
    for scanner.Scan() {
        hn := scanner.Text()
        if hd, err := publicsuffix.EffectiveTLDPlusOne(hn); err == nil {
            ldhosts[hd] += 1
        }
        dhosts[hn] = true
        thosts = append(thosts, hn)
        nd++
    }
    fmt.Printf("[+] loaded %d domains\n\n", nd)

    w := bufio.NewWriter(fo)

    su := *start_url

    in_url := make(chan string)
    out_urls := make(chan ParsedUrl)

    for x := 0; x < *max_threads; x++ {
        go process_urls(in_url, out_urls)
    }

    tu := 1

    ups, err := url.Parse(su)
    if err != nil {
        fmt.Fprintf(os.Stderr, "[-] ERROR: invalid start URL: %s\n", su)
        return
    }
    if _, sd_ok := dhosts[ups.Host]; sd_ok {
        fmt.Printf("[*] start URL detected in saved domains\n")
        fmt.Printf("[*] using last %d saved domains for crawling\n", *max_threads)
        for _, d := range thosts[len(thosts)-*max_threads:] {
            fmt.Printf("[+] adding: %s\n", ("http://" + d))
            qurls = append(qurls, ("http://" + d))
        }
        in_url <- qurls[0]
    } else {
        in_url <- su
    }

    for {
        var purl ParsedUrl = <-out_urls
        tu -= 1

        if purl.u != "" {
            if du, err := url.Parse(purl.u); err == nil {
                if _, d_ok := dhosts[du.Host]; !d_ok {
                    fmt.Printf("[%d] %s\n", len(dhosts), du.Host)
                    dhosts[du.Host] = true
                    fmt.Fprintf(w, "%s\n", du.Host)
                    w.Flush()
                }
            }

            urls := purl.urls
            for _, u := range urls {
                // strip # out of url if exists
                u = strings.Split(u, "#")[0]
                up, err := url.Parse(u)
                if err == nil {
                    h := up.Host
                    hd := ""
                    d_ok := true
                    if hd, err = publicsuffix.EffectiveTLDPlusOne(h); err == nil {
                        if n, ok := ldhosts[hd]; ok {
                            if n >= *max_subdomains {
                                d_ok = false
                            }
                        }
                    }
                    _, is_v := vurls[u]
                    if !is_blacklisted(u) && chosts[h] < *max_urls_per_domain && !is_v && d_ok && len(qurls) < MaxQueuedUrls {
                        vurls[u] = true
                        chosts[h] += 1
                        if hd != "" {
                            ldhosts[hd] += 1
                        }
                        qurls = append(qurls, u)
                    }
                }
            }
        }

        if len(qurls) == 0 {
            fmt.Fprintf(os.Stderr, "ERROR: ran out of queued urls!\n")
            return
        }

        // push more urls to channel
        for tu < *max_threads && len(qurls) > 0 {
            u := qurls[0]
            qurls = append(qurls[:0], qurls[1:]...)
            in_url <- u
            tu++
        }

        if len(vurls) >= MaxVisitedUrls {
            vurls = make(map[string]bool)
        }
    }
}

License
dcrawl was made by Kuba Gretzky from breakdev.org and released under the MIT license.

Download
dcrawl-master.zip

Source
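A note on the -ms limit: dcrawl groups hostnames by their registrable domain (eTLD+1) via golang.org/x/net/publicsuffix, as the mirrored source above shows. The following standalone sketch (not part of dcrawl; the host names are made-up examples) shows what EffectiveTLDPlusOne returns, i.e. how different subdomains collapse into one counted domain:

// etld_sketch.go - illustrative only, not part of dcrawl.
// Shows how eTLD+1 grouping collapses different subdomains into one
// registrable domain, which is what the -ms counter is keyed on.
package main

import (
    "fmt"

    "golang.org/x/net/publicsuffix"
)

func main() {
    hosts := []string{
        "a.example.co.uk", // made-up hostnames, purely for illustration
        "b.c.example.co.uk",
        "www.example.com",
    }
    for _, h := range hosts {
        // EffectiveTLDPlusOne returns the registrable domain, e.g. both
        // *.example.co.uk hosts above map to "example.co.uk".
        d, err := publicsuffix.EffectiveTLDPlusOne(h)
        if err != nil {
            fmt.Println(h, "->", err)
            continue
        }
        fmt.Println(h, "->", d)
    }
}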
    3 points
  2. If you don't even know what the terms mean, who do you think is going to hire you? Besides, a Google search returns thousands of definitions for each of those terms. Probably the most important skill in IT is knowing how to search on Google.
    3 points
  3. These may come with a "surprise". Use them in a VM! If you are asked for a password, enter 123123.
https://mega.nz/#F!ZOYzwJbY!y6cxKgbl3S7P901yFou9Og
https://mega.nz/#F!FgBxkJaY!5twvwjAh9KN354Yt-arroQ
Cobalt Strike: https://mega.nz/#F!jIBEiBwS!qUEIWb5ey6ihSWVtjzMdeQ
Burp Suite: https://t.me/burpsuite/22
AWVS 10: https://mega.nz/#!MVoxkaqa!wgpk0Cv1I0kpX10tRofQTKiEoiSpFS84YaA6WmIE7kU
Netsparker Professional Edition 4.8.0.13139: http://uploaded.net/file/92okflte/Netsparker Pro 4.8.0.13139.rar
Again: these may come with a "surprise". Use them in a VM!

### Web Application Security Scanner List
Collaborative Penetration Test and Vulnerability Management Platform: https://github.com/infobyte/faraday
YGN tools: http://yehg.net/lab/#tools
Tools for auditing WAFs: https://github.com/lightbulb-framework/lightbulb-framework
Bypass mod_security WAF: https://github.com/sqlmapproject/sqlmap/blob/master/tamper/modsecurityzeroversioned.py
Vulnerability scanner based on the vulners.com audit API: https://github.com/videns/vulners-scanner
Belati - The Traditional Swiss Army Knife For OSINT: https://github.com/aancw/Belati
CrackMapExec - a Swiss army knife for pentesting networks: https://github.com/byt3bl33d3r/CrackMapExec
More info and other tools: https://byt3bl33d3r.github.io/
E-mail, subdomain and people-name harvester: https://github.com/laramies/theHarvester
BeEF - The Browser Exploitation Framework: http://beefproject.com/ https://github.com/beefproject/beef
    2 points
  4. Something broke when you rooted it (the APK wasn't good, it failed to write something, or wrote it badly). You need to flash an official firmware back onto it and then try rooting again. You could also try a different rooting APK. To keep the operating system from seeing that the device is rooted, try Magisk Manager.
    2 points
  5. https://cryptohub.nl/zines/ TeaMp0isoN hacked Unkn0wn.eu
    2 points
  6. There is a bigger list at http://web.textfiles.com/ezines/, and there are probably mirrors too. And here are some other cached ones: https://web.archive.org/web/20120426235852/http://www.gonullyourself.org:80/ezines/
    2 points
  7. So, raccoon, did you pass your baccalaureate? Mind you, hiring season on the construction sites is about to open.
    2 points
  8. I'll make you one; add me on Skype so we can discuss the price.
    1 point
  9. You failed: https://github.com/dumbo0 Running scans on your "turbo-turbine" system? It tells you clearly right there: due to RDPSecurityNegoFail try standard security layer. "How do I install programs in Debian?" You're a pathetic excuse for a human being!
    1 point
  10. gophirc - a simple IRC bot framework written from scratch in Go.

Description
Event-based IRC framework.

Warning
The API might break at any time.

Framework-managed events:
- Manages server PING requests (not CTCP PING)
- Registers on the first NOTICE *
- Identifies on RPL_WELCOME (event 001)
- Joins the received invites & sends a greeting to the channel
- Logs if the bot gets kicked from a channel

Features:
- Capability to connect to multiple servers
- Multiple callbacks per event (see the standard-library sketch of this pattern below)
- State & general logging
- Graceful exit handled by a SIGINT (Ctrl-C)
- Parses a user from an IRC-formatted nick!user@host into a User{}
- Config implements basic checking on values
- Basic commands already implemented - JOIN, PART, PRIVMSG, NOTICE, KICK, INVITE, MODE, CTCP commands
- Many (?) more

More: https://github.com/vlad-s/gophirc
Bonus, an IRC bot using gophirc - gophircbot: https://github.com/vlad-s/gophircbot
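To see the "multiple callbacks per event" idea in isolation, here is a minimal standalone sketch of an event-dispatch loop in plain Go, using only the standard library. The type and function names (Bus, Event, parseUser, ...) are illustrative assumptions, not gophirc's actual API; see the repository above for the real thing.

// bus_sketch.go - illustrative only; NOT gophirc's real API.
// Demonstrates a per-command callback registry by dispatching two
// hard-coded raw IRC lines. No network I/O involved.
package main

import (
    "fmt"
    "strings"
)

// User mirrors the idea of parsing an IRC "nick!user@host" prefix.
type User struct {
    Nick, User, Host string
}

// parseUser splits "nick!user@host" into a User struct.
func parseUser(prefix string) User {
    u := User{}
    if i := strings.Index(prefix, "!"); i >= 0 {
        u.Nick = prefix[:i]
        rest := prefix[i+1:]
        if j := strings.Index(rest, "@"); j >= 0 {
            u.User, u.Host = rest[:j], rest[j+1:]
        }
    }
    return u
}

// Event is a parsed IRC line: optional prefix, command, parameters.
type Event struct {
    Prefix  string
    Command string
    Params  []string
}

// parseLine does a minimal split of a raw IRC line into an Event.
func parseLine(raw string) Event {
    e := Event{}
    if strings.HasPrefix(raw, ":") {
        parts := strings.SplitN(raw[1:], " ", 2)
        e.Prefix, raw = parts[0], parts[1]
    }
    // A trailing parameter after " :" is kept as a single field.
    if i := strings.Index(raw, " :"); i >= 0 {
        e.Params = append(strings.Fields(raw[:i]), raw[i+2:])
    } else {
        e.Params = strings.Fields(raw)
    }
    e.Command, e.Params = e.Params[0], e.Params[1:]
    return e
}

// Bus keeps any number of callbacks per command name.
type Bus struct {
    callbacks map[string][]func(Event)
}

func (b *Bus) On(cmd string, cb func(Event)) {
    if b.callbacks == nil {
        b.callbacks = make(map[string][]func(Event))
    }
    b.callbacks[cmd] = append(b.callbacks[cmd], cb)
}

func (b *Bus) Dispatch(e Event) {
    for _, cb := range b.callbacks[e.Command] {
        cb(e)
    }
}

func main() {
    bus := &Bus{}
    // Two callbacks on the same event, matching the feature list above.
    bus.On("PRIVMSG", func(e Event) {
        u := parseUser(e.Prefix)
        fmt.Printf("<%s> %s\n", u.Nick, e.Params[len(e.Params)-1])
    })
    bus.On("PRIVMSG", func(e Event) { fmt.Println("logged:", e.Params) })
    // Server PING handling (a framework-managed event in gophirc).
    bus.On("PING", func(e Event) { fmt.Println("would reply: PONG", e.Params) })

    for _, raw := range []string{
        "PING :irc.example.net",
        ":alice!ali@host.example PRIVMSG #chan :hello there",
    } {
        bus.Dispatch(parseLine(raw))
    }
}

gophirc layers the same pattern over a real TCP connection, registration and state handling; this sketch only shows the dispatch mechanics.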
    1 point
  11. Joined 2015. After two years on RST, he is asking us what QA - quality assurance - means. https://www.nofap.com/forum/index.php?members/hanlee97.145191/ https://hackforums.net/showthread.php?tid=4432606 He had also posted on HF with his shoddy catalogue: https://hackforums.net/showthread.php?tid=4623029 Go do your homework and go out and play before I box your ears!!!
    1 point
  12. I think it's https://just-dice.com/
    1 point
  13. Official site: http://www.grab-m.com
Download Free Version: http://grab-m.com/download/setup.exe
A colleague and I struggled for a month on this little program. I hope you like it. If a simple search doesn't return many results, try a query such as: yahoo.com
    1 point
  14. This looks like that Darkcomet RAT; I remember using it 4-5 years ago. Unless it is in fact the very same thing with just a few updates.
    1 point
  15. The Python 2 version, as promised. I apologize for the quality of the code.
    1 point
  16. Yes, at the moment pwn is almost finished and has been tested by us (@SilenTx0 is still working on some tutorials). In a little while you will be able to access the beta version.
    1 point
  17. Black Hat. Published on Aug 31, 2017. A processor is not a trusted black box for running code; on the contrary, modern x86 chips are packed full of secret instructions and hardware bugs. In this talk, we'll demonstrate how page fault analysis and some creative processor fuzzing can be used to exhaustively search the x86 instruction set and uncover the secrets buried in your chipset. Full Abstract: https://www.blackhat.com/us-17/briefi... Download PDF: https://www.blackhat.com/docs/us-17/thursday/us-17-Domas-Breaking-The-x86-Instruction-Set-wp.pdf
    1 point
  18. There's nothing you can do with that list, it's full of minors.
    1 point
  19. Try getting into the port of Constanta. I hear it's easy to get in, and you don't even need IT skills.
    1 point
  20. Are you nuts, man? If you're just an intermediary, invoice as a private individual; set up a company and they'll be all over you like flies on crap. Once you have enough money to afford an accountant you can do it, but until then you're not a trader, just an agent making a bit of money in his spare time. And if you're making money with $600, set yourself up an offshore company in Delaware.
    1 point
  21. Here - and you only need the IMEI. Good luck!
    0 points
  22. Hi, I'm a beginner in IT security and hacking and I've set up a VM with Kali Linux. I found a few open ports on a site, and then tried to connect to them with netcat, but without success. Any advice would be helpful, thanks in advance. PS: When connecting to the ports I get "Connection refused".
    -1 points
  23. Hi, I made a live USB containing Parrot Security OS (I previously had Kali on a live USB) and I couldn't get Wi-Fi access working during the live session. I tried:
$ ifconfig **** down/up
$ ifconfig **** mode monitor
and nothing. Any advice?
    -1 points
  24. Hello! I need a programmer who can scrape 1 website + import ... I'm waiting for your offers!!! And if everything goes well, we'll keep working together in the future.
    -1 points
  25. I'm begging you from the bottom of my heart to help me out with 0.35 cents on PayPal; I want to buy a game and that's all I'm still missing. I'll also give you a 0.55 USP-S skin on CS:GO STEAM. PAYPAL EMAIL - specialminecraft690@yahoo.com
    -1 points
  26. Hi, I'd like to get a job, at least part-time, and I'd like to ask you about a few terms I've seen and didn't understand, so I'm turning to you and asking you to clarify them. Among other things, I'll mention that I'd like to work on the security/pentesting side, but I wouldn't mind working as a programmer either... However, in my current situation I'd accept pretty much anything in IT.
The terms I'd like you to explain:
- QA (seen at many companies)
- Survey (likewise, seen at several companies)
- Help-desk technician
Thanks in advance!!
    -4 points