Search the Community

Showing results for tags 'search'.

Found 15 results

  1. I needed to quickly find direct links to almost 50 books on Amazon, so I wrote about ten lines of code and was done.

     Requirements: Python 2.7
         pip install google

     googlesearch.py:

         from google import search
         from time import sleep
         import sys

         def direct_search(title, ext, multi='off'):
             print title
             sleep(2)  # pause between queries to avoid getting blocked
             for url in search(title + ' ' + ext, stop=10):
                 if url.endswith('.' + ext):
                     print url
                     if multi == 'off':
                         break  # keep only the first matching link

         if __name__ == "__main__":
             if len(sys.argv) < 4:
                 print 'usage: ./%s file.txt format multi=\'on/off\'' % sys.argv[0]
                 print 'ex.  : ./%s book-titles.txt pdf off' % sys.argv[0]
             else:
                 with open(sys.argv[1], 'r') as f:
                     for line in f:
                         line = line.rstrip()
                         if not line:
                             continue  # skip empty lines
                         direct_search(line, sys.argv[2], sys.argv[3])

     It is used as follows. Create a file and put the book titles in it, one per line, like this:

     fisier.txt
         Test-Driven Development with Python
         Fluent Python 1st Edition
         Foundations of Python Network Programming 3rd edition
         Python Network Programming Cookbook

     Then run the application with:

         ./googlesearch.py fisier.txt pdf off

     If you want to save the links to a file:

         ./googlesearch.py fisier.txt pdf off > urls.txt

     If you get no results, try raising the stop value to something above 40.
  2. SOURCE

     SophosLabs researchers recently uncovered a hack being used by unscrupulous web marketers to trick Google's page ranking system into giving them top billing, despite Google's ongoing efforts to thwart this sort of search poisoning. Over on the Sophos Blog, technical expert Dmitry Samosseiko explains how the scammers did it, and how SophosLabs spotted what they were up to. Here on Naked Security, we decided to take a look at why search engine poisoning matters, and what we can do as a community if we see that something is not what it seems.

     The power of search

     Put your hand up (literally, if you like) if you have ever done either or both of these:

     • Set out to research a topic or a product thoroughly. Used your favourite search engine. Then gone no further than the first couple of results on the very first page. Job done.
     • Used a search engine to gauge whether a business or website has been around a while and built up trust in that time. Seen it near the top of the first page of results. Job done.

     If you have, you aren't alone, and that's why doing well in search results is so important for a modern organisation. And that, in turn, is why Search Engine Optimisation (SEO) exists: you make every effort to write your web pages so they are clear and relevant, and you do your best to build up a reputation that makes already-trusted sites want to link to you. When others link to you, that acts as an implicit recommendation, and search engines let you bask in some of the reflected glory of the sites that have linked to you.

     Poisoning the chalice

     Of course, getting high up in the search rankings gives great results for cybercrooks too, and they don't play by the rules. Treachery by cybercrooks gives search companies a double whammy: the search engines end up not only giving away artificially high rankings for free, but also conferring trust even on web pages that put users in harm's way. As a result, the search companies have been in a constant battle with the Bad Guys to stamp out tricks that poison search rankings.

     One search poisoning technique involves being two-faced: looking honest and reputable when a search engine visits in the course of indexing the web, yet serving up malevolent content when a user clicks through. This trick is called cloaking, and it's been going on for years. As you can imagine, the search engines have become adept at detecting when websites feed back content that doesn't look right. For example, they can compare what happens when their own search engine software (known as a spider or a crawler) comes calling, and what shows up when a regular browser visits the site.

     Servers often tweak the pages they present depending on which browser you're using, so some variation between visits is to be expected. But if a browser sees a story about apples while the crawler is being sold on oranges, then something fishy is probably going on. Additionally, a search engine can analyse the pages that its crawler finds in order to estimate how realistic they look. Google's crawler is officially known as the Googlebot, and it has been taught to be rightly suspicious of web pages that seem to "try too hard" because they've been artificially packed with fraudulent keywords.

     Scamming the Googlebot

     But even Google doesn't get it right all the time. Indeed, SophosLabs recently spotted dodgy web marketers using a surprisingly simple trick to persuade the usually-sceptical Googlebot to accept bogus content. The trick inflated the reputation of dubious pages, and sent them dishonestly scooting up the search rankings. Our researchers immediately informed Google so that the problem could be fixed, but the story makes for fascinating reading. Dmitry Samosseiko of SophosLabs has published a highly readable report about what happened; we're not going to spoil the fun by repeating it here, so please head over to our Sophos Blog for the details.
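     As a rough illustration of the crawler-versus-browser comparison described above (a sketch, not the detection Google or Sophos actually runs; the Googlebot User-Agent string is the published one, everything else here is assumption):

         import difflib
         import requests

         BROWSER_UA = 'Mozilla/5.0 (Windows NT 6.1; rv:35.0) Gecko/20100101 Firefox/35.0'
         GOOGLEBOT_UA = ('Mozilla/5.0 (compatible; Googlebot/2.1; '
                         '+http://www.google.com/bot.html)')

         def fetch_as(url, user_agent):
             # Fetch the page while pretending to be the given client.
             return requests.get(url, headers={'User-Agent': user_agent}, timeout=10).text

         def looks_cloaked(url, threshold=0.5):
             # If the page served to "Googlebot" barely resembles the page
             # served to a browser (apples vs. oranges), flag it.
             browser_html = fetch_as(url, BROWSER_UA)
             crawler_html = fetch_as(url, GOOGLEBOT_UA)
             similarity = difflib.SequenceMatcher(None, browser_html, crawler_html).ratio()
             return similarity < threshold

         print(looks_cloaked('http://example.com/'))

     A low similarity ratio alone proves nothing, since servers legitimately vary pages per client; that is exactly why real detectors also render pages and verify crawler IPs via reverse DNS.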
  3. Seo Tools for Excel

     On-page SEO: By installing the SeoTools for Excel add-in you get access to a set of functions that are useful when working with online marketing. For on-page SEO analysis you have functions like HtmlH1, HtmlTitle and HtmlMetaDescription which you can use to verify that your pages are correctly set up.

     Off-page SEO: SeoTools also comes in handy when looking at off-page SEO factors. Using CheckBacklink you can verify that your backlinks are still available, and with GooglePageRank you can fetch PageRanks for multiple urls with ease.

     Integrations: SeoTools comes packed with integrations to other marketing platforms. Use the Google Analytics integration to build your own automated KPI reports, or Majestic to analyze your backlink profile. Some integrations are available for free while others require a Pro subscription.

     Scrapers: By building your own Scraper using our XML format you can extend SeoTools to integrate with an external API or fetch a particular part of any webpage. SeoTools includes over 25 open-source scrapers such as Facebook.Likes, Google.Results and Youtube.

     Spider: With the Spider you can combine all the features in SeoTools into a powerful webpage crawler. Either supply the Spider with a list of urls or start with a root url.

     Other Tools: Excel is an invaluable tool for any online marketer but some features are missing. SeoTools provides you with over 100 helper functions like XPathOnUrl, RegexpFind, DomainAge, SpinText and UrlProperty. Read the documentation for a full function reference.

     Pro Version: Pro edition only €49/year. With Pro you get:

     • Adfree! (no startup screen)
     • Unlimited pagecrawls with the amazing SeoTools Spider
     • Unlimited result rows in Google Analytics and MCF reporting
     • Google Adwords integration (requires API key)
     • SEMrush, Ahrefs and SEOlytics integrations
     • Email support
     • The incredible feeling of contributing to the future development of this amazing tool!

     SEMrush integration. Integrated features: Anchors, Backlinks, Competitors in organic search, Competitors in paid search, Display advertisers, Domain organic search keywords, Domain overview, Domain paid search keywords, Indexed pages, Keyword difficulty, Keyword overview, Organic results, Paid results, Phrase match keywords, Referring domains, Referring ips, Related keywords, Url organic search keywords, Url paid search keywords. This integration requires SeoTools Pro, a SEMrush subscription and a SEMrush API package.

     PS: Can the Pro version be cracked!?
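     For readers without Excel, here is a rough Python equivalent of what the on-page functions above return (HtmlTitle, HtmlH1 and HtmlMetaDescription are SeoTools names; the code below is an independent sketch, not the add-in itself):

         import requests
         from bs4 import BeautifulSoup

         def onpage_summary(url):
             # Pull the three classic on-page SEO fields from a page.
             soup = BeautifulSoup(requests.get(url, timeout=10).text, 'html.parser')
             title = soup.title.string.strip() if soup.title and soup.title.string else None
             h1 = soup.h1.get_text(strip=True) if soup.h1 else None
             meta = soup.find('meta', attrs={'name': 'description'})
             description = meta.get('content') if meta else None
             return {'title': title, 'h1': h1, 'description': description}

         print(onpage_summary('http://example.com/'))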
  4. Microsoft product manager Duane Forrester says it will encrypt all Bing search traffic later this year. Forrester says the move follows Redmond's 2014 decision to allow users to opt in to HTTPS for web searches.

     "Beginning this (Northern hemisphere) summer, we will begin the process of encrypting search traffic by default," Forrester blogged. "This means that traffic originating from Bing will increasingly come from https as opposed to http."

     Microsoft will also drop query search terms from referrer strings in a bid to further shore up privacy. Web ad bods will be able to learn the queries that lead users to their pages through Microsoft's search terms report, universal event tracking, and webmaster tools. "While this change may impact marketers and webmasters, we believe that providing a more secure search experience for our users is important," Forrester says.

     The HTTPS move brings Microsoft up to speed with Google, which began encrypting search traffic in 2011 and made it compulsory in 2013, and Yahoo!, which deployed HTTPS for its search in 2014. Encrypting search traffic and other non-sensitive web traffic is widely seen by privacy and security pundits as necessary for a safer web.

     Source
  5. Bugtroid is an innovative tool developed by the BugTraq team. The main feature of this application is that it bundles more than 200 Android and Linux (PRO) tools for pentesting. It has a menu organized by the nature of each tool, in which you can find:

     1. Anonymity: Proxy Browser, Clean Master, QuickLab cleaner, Orbot, Fakegps, ChangeMac, Orweb, Proxydroid, IP Checker, Proxy server, Direccion IP, Spy Kit-Universal Mailer
     2. 802.11 (WiFi): Claves wifi, Wifi Analyzer, WifiLeaks, Mac2wepkey, WifiKill, Wifi Radar, Airmon
     3. Brute force: Router Brute Force, Routerpwn, WIBR
     4. DDoS: AnDOSid, Droidswarm, Loic 1, Loic 2, SMS Bomber, SMS reliator, OFS Doser
     5. Crypto: HashDroid, HashDecrypt, Cryptonite, APG, CrypticSMS, HashPass
     6. Forensics: Loggy, Wifi Credentials Recovery, Undelete, CellID Info, aLogcat, Exif Viewer
     7. Networking: Wireless Tether, Network port database, aNmap, Foxfi, Fing, AndFTP, AndSMB, Wake on Lan, ConnectBot, SSHtunnel, Connect SQL
     8. Pentesting: Bulbsecurity framework, Nessus, Zanti, Dsploit, Wifiinspect
     9. People search: 123people, Gmon2, Wigle Wifi Wardriving, People Search, Search People, KGB People
     10. Remote: Flu Client, DynDNS, Blue Remote, No-Ip, Airdroid, TeamViewer, Android VNC
     11. Scripting: Scripting for android, Perl for android, Python for android, Llama Script Launcher
     12. Security: Stripg Guard, Keepass, Droidwall, Wifi Protector, Blacklist apk, Security Key Generator, RedPhone, DroidSheep Guard
     13. Sniffers: SSLStrip, Droid Sniff, Droidsheep, Dsploit, Shark, Shark Reader, Facesniff, ArpSpoof, Intercepter-NG
     14. System: Root Browser, Autorun, Cpuoverlay, Zarchiver, Osmonitor, ROM toolbox Lite, androPHP
     15. Web: Admin Panel Finder
     16. AVs: VirusTotal, Zoner Antivirus, Antivirus Dr Web, Avast, Avira

     Download: DepositFiles
  6. CPAN Search

     What is CPAN Search? CPAN Search is a search engine for the distributions, modules, docs, and IDs on CPAN. It was conceived and built by Graham Barr as a way to make things easier to navigate. Originally named TUCS [The Ultimate CPAN Search], it was later named CPAN Search or Search DOT CPAN. In plainer terms: ice cream in a cone, simple as that.

     Link: http://search.cpan.org/
  7. Selling GSA Search Engine Ranker. It is not stolen, it is mine, and I can prove it to anyone interested. Price: $69 via PayPal. I will send first to long-standing, trusted users; otherwise we do the transaction through a moderator.
  8. When it comes to search on mobile devices, users should get the most relevant and timely results, no matter if the information lives on mobile-friendly web pages or apps. As more people use mobile devices to access the internet, our algorithms have to adapt to these usage patterns. In the past, we’ve made updates to ensure a site is configured properly and viewable on modern devices. We’ve made it easier for users to find mobile-friendly web pages and we’ve introduced App Indexing to surface useful content from apps. Today, we’re announcing two important changes to help users discover more mobile-friendly content:

     1. More mobile-friendly websites in search results. Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high quality search results that are optimized for their devices. To get help with making a mobile-friendly site, check out our guide to mobile-friendly sites. If you’re a webmaster, you can get ready for this change by using the following tools to see how Googlebot views your pages: if you want to test a few pages, you can use the Mobile-Friendly Test; if you have a site, you can use your Webmaster Tools account to get a full list of mobile usability issues across your site using the Mobile Usability Report.

     2. More relevant app content in search results. Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.

     source: Google Webmaster
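     One quick self-check you can script while waiting on the Mobile-Friendly Test: does the page declare a responsive viewport at all? This sketch covers only that single heuristic; the real test weighs many more signals (tap targets, font sizes, content width):

         import requests
         from bs4 import BeautifulSoup

         def has_responsive_viewport(url):
             # A mobile-friendly page normally carries a viewport meta tag
             # with width=device-width; its absence is a strong red flag.
             soup = BeautifulSoup(requests.get(url, timeout=10).text, 'html.parser')
             tag = soup.find('meta', attrs={'name': 'viewport'})
             return tag is not None and 'width=device-width' in (tag.get('content') or '')

         print(has_responsive_viewport('http://example.com/'))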
  9. The Judicial Conference Advisory Committee on Criminal Rules voted 11-1 to modify a federal rule, known as Rule 41, that expands hacking authority for the FBI, the National Journal reported on Monday, citing a Department of Justice (DOJ) spokesperson. As the rule stands, judges can only approve search warrants for materials within their own judicial district; modified, courts would have the ability to grant search warrants for electronic information located outside their judicial district, the report indicates.

     A variety of organizations, including Google and a number of other civil rights and civil liberties groups, have spoken out against the proposal, but David Bitkower, deputy assistant attorney general of the Criminal Division in the Justice Department, defended it in a December 2014 memo. Bitkower wrote that “the proposed amendment would ensure that a court has jurisdiction to issue a search warrant in two categories of investigations involving modern Internet crime: cases involving botnets and cases involving Internet anonymizing techniques.” He went on to say, “The proposal would not authorize the government to undertake any search or seizure or use any remote search technique not already permitted under current law. As with all search warrant applications, such concerns must ultimately be resolved through judicial determination on a case-by-case basis.”

     Google responded in February with comments written by Richard Salgado, director of law enforcement and information security with Google. Salgado took issue with the amendment making it so the government may use “remote access” to search and seize or copy electronic data, stating that the wording is too vague and does not specify how searches will be conducted and what may be searched. “The term ‘remote access’ is not defined,” Salgado wrote. “Sample search warrants submitted by the DOJ to the Committee indicate that ‘remote access’ may involve network investigative techniques, or NITs, which include, for example, the installation of software onto a target device to extract and make available to law enforcement certain information from the device, including IP address, MAC address, and other identifying information.” Salgado went on to add, “In short, ‘remote access’ seems to authorize government hacking of any facility wherever located,” and later wrote that the amendment would authorize remote searches of millions of computers because, according to the FBI, botnets can grow to include a large number of computers.

     The Electronic Frontier Foundation (EFF) shared Google's concerns. In a comment emailed to SCMagazine.com on Tuesday, Hanni Fakhoury, EFF senior staff attorney, called the amendment a substantive legal change masquerading as a mere procedural rule change. “That is, by seeking to change the procedural rules about how the government can execute remote searches (which in essence means how they can deploy malware), the government is essentially pushing for approval of the idea that it should have the power to deploy malware and execute remote searches,” Fakhoury said. “To us, it seems like that's a decision Congress should make.”

     Piggybacking on that idea, Nathan Freed Wessler, staff attorney with the American Civil Liberties Union (ACLU), said in a comment emailed to SCMagazine.com on Tuesday that the amendment expands the government's ability to use malware and zero-day exploits, without imposing necessary protections. “The current proposal fails to strike the right balance between safeguarding privacy and internet security and allowing the government to investigate crimes,” Wessler said.

     The National Journal report indicated that a number of other steps must occur before the changes are made official, and the process could take longer than a year. A Justice Department spokesperson did not respond to a SCMagazine.com request for additional information.

     Source
  10. Spybot Search & Destroy 1.6.2 Security Center Service Privilege Escalation

      Vendor: Safer-Networking Ltd.
      Product web page: http://www.safer-networking.org
      Affected version: 1.6.2

      Summary: Spybot - Search & Destroy (S&D) is a spyware and adware removal computer program compatible with Microsoft Windows 95 and later. It scans the computer hard disk and/or RAM for malicious software.

      Desc: The application suffers from an unquoted search path issue impacting the service 'SBSDWSCService' for Windows deployed as part of Spybot S&D. This could potentially allow an authorized but non-privileged local user to execute arbitrary code with elevated privileges on the system. A successful attempt would require the local user to be able to insert their code in the system root path undetected by the OS or other security applications, where it could potentially be executed during application startup or reboot. If successful, the local user's code would execute with the elevated privileges of the application.

      Tested on: Microsoft Windows Ultimate 7 SP1 (EN)

      Vulnerability discovered by Aljaz Ceru (aljaz@insec.si)

      Advisory ID: ZSL-2015-5237
      Advisory URL: http://www.zeroscience.mk/en/vulnerabilities/ZSL-2015-5237.php
      17.02.2015

      ---

      C:\Users\user>sc qc SBSDWSCService
      [SC] QueryServiceConfig SUCCESS

      SERVICE_NAME: SBSDWSCService
              TYPE               : 10  WIN32_OWN_PROCESS
              START_TYPE         : 2   AUTO_START
              ERROR_CONTROL      : 1   NORMAL
              BINARY_PATH_NAME   : C:\Program Files\Spybot - Search & Destroy\SDWinSec.exe
              LOAD_ORDER_GROUP   :
              TAG                : 0
              DISPLAY_NAME       : SBSD Security Center Service
              DEPENDENCIES       : wscsvc
              SERVICE_START_NAME : LocalSystem

      C:\Users\user>

      Source
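      To hunt for the same class of bug on your own machine, here is a sketch that parses wmic service listings and flags paths that are unquoted yet contain a space before the executable name, which is the exact condition exploited above. It assumes a Windows host with wmic available (it is on Windows 7, the tested platform):

          import subprocess

          def unquoted_service_paths():
              # List every service's name and binary path in CSV form.
              out = subprocess.check_output(
                  ['wmic', 'service', 'get', 'Name,PathName', '/format:csv'])
              findings = []
              for line in out.decode('utf-8', 'ignore').splitlines():
                  parts = line.strip().split(',')
                  if len(parts) < 3:
                      continue  # skip blanks and malformed rows
                  name, path = parts[1], ','.join(parts[2:])
                  # Vulnerable pattern: not quoted, and a space before .exe,
                  # e.g. C:\Program Files\Spybot - Search & Destroy\SDWinSec.exe
                  if path and not path.startswith('"') and ' ' in path.split('.exe')[0]:
                      findings.append((name, path))
              return findings

          for name, path in unquoted_service_paths():
              print('%s -> %s' % (name, path))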
  11. There is an entire section of the Internet that you probably don't see on a daily basis. It's called the "Darknet" or "Deep Web", where all browsing is done anonymously. About a week ago, we reported on the 'Memex' Deep Web Search Engine, a Defense Advanced Research Projects Agency (DARPA) project to create a powerful new search engine that could find things on the deep web that aren't indexed by Google and other commercial search engines, but it isn't available to you and me.

      Now there is another search engine that will let anyone easily search the Deep Web for large swaths of information for free, and without an application; all you need is an Internet connection. Onion.City, a new search engine for online underground markets, makes it easier to find and buy drugs, guns and stolen credit cards directly from your Chrome, Internet Explorer or Firefox browser, without installing and browsing via the Tor Browser. Just two days after the Memex story came to light, Virgil Griffith announced the Onion.City Deep Web search engine on the Tor-talk mailing list. It gives you the feel of a normal search engine, but can search the ".onion" domains on the Deep Web and throw up results in your normal browser.

      ONION.CITY - SEARCH ENGINE FOR TOR ONION SITES

      The Onion.City darknet search engine is powered by a Tor2web proxy, which enables it to reach deep into the anonymous Tor network, find ".onion" sites by aggregating the hidden marketplaces, and make them available to a normal web browser with the easiest navigation. The Tor network is one of the most well-known darknets, where web addresses follow the form of a random string of letters followed by the ".onion" suffix and are only accessible through the Tor browser. Online users visit and run so-called hidden services on ".onion" domains, but the usual way to reach ".onion" websites is to first have a Tor browser. Onion.City, however, makes it easy and effective for Internet users to search the deep web from our favorite, insecure web browsers. Those who aren't very familiar with the Deep Web can read our detailed article on "What is the Deep Web? A first trip into the abyss".

      GRAMS - BLACK MARKET SEARCH ENGINE

      Onion.City isn't the first Deep Web search engine, however. Last year, Grams, the first search engine for online underground black markets, was launched. It lets anyone easily find illegal drugs and other contraband online, and it's pretty fast, like the Google search engine. Search engines like Grams and Onion.City are mostly considered illegal or illicit, but not every website on the Deep Web is dubious. The Frequently Asked Questions (FAQs) on the Onion.City website even provide an email address to report content that may be illegal, though it's unclear exactly what steps they'll take.

      For now, leaving controversies aside, Onion.City seems to be an effective Deep Web search engine, providing a means for regular web users to search for things they would otherwise have to work a little harder to find.

      Source
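      The mechanics behind that "no Tor needed" claim are simple: a Tor2web gateway serves hidden services under its own domain, so the browser-side transformation is just a hostname rewrite. A sketch (the onion address below is a made-up placeholder):

          def tor2web_url(onion_url, gateway='onion.city'):
              # http://<hash>.onion/... -> http://<hash>.onion.city/...
              # The gateway fetches the page over Tor and relays it in the
              # clear, which is why the browser side stays "insecure".
              return onion_url.replace('.onion', '.onion.' + gateway, 1)

          print(tor2web_url('http://abcdefghij234567.onion/search?q=books'))
          # -> http://abcdefghij234567.onion.city/search?q=books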
  12. The only thing I don't know is how to calculate the total number of comparisons and the average number. (After some calculations, I got as far as this: to check whether a value is present or not, the number of comparisons is log(n).) A little help on where to go from here?
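      A sketch of one way forward, assuming the question is about binary search in a sorted array of n elements: an element at depth d of the implicit comparison tree costs d+1 probes to find, so the total over all n successful searches is the sum of (depth+1) over every node, and because roughly half the nodes sit on the bottom level, the average comes out near log2(n) - 1 rather than log2(n). You can check this empirically, counting one three-way comparison per loop iteration:

          import math

          def comparisons_to_find(arr, target):
              # Standard binary search, returning how many probes it needed.
              lo, hi, probes = 0, len(arr) - 1, 0
              while lo <= hi:
                  mid = (lo + hi) // 2
                  probes += 1
                  if arr[mid] == target:
                      return probes
                  elif arr[mid] < target:
                      lo = mid + 1
                  else:
                      hi = mid - 1
              return probes  # unsuccessful search: about floor(log2(n)) + 1

          n = 1024
          arr = list(range(n))
          total = sum(comparisons_to_find(arr, x) for x in arr)
          print('total: %d  average: %.3f  log2(n): %.1f'
                % (total, total / float(n), math.log(n, 2)))
          # For n = 1024 this prints an average near 9.0, i.e. log2(n) - 1.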
  13. I figured I'd post this news here since it's not really security-related...

      By EJ Dickson on September 15, 2014

      This article contains sexually explicit material that may be NSFW.

      Everyone has a favorite search engine for finding porn. But it’s a well-established fact that thanks to certain tech juggernauts imposing restrictions on adult content in search results, some search engines are just better at finding smut than others (*cough Bing cough*). If you have an ultra-high-powered government job, or you share a computer with a roommate who’s studying for the clergy, there’s always a concern that your late-night searches for busty Brazilian teens will show up in your search history. But apparently, you won’t have to worry about that happening with Boodigo, which is being touted as “the world’s first adult search engine.”

      Unlike other search engines, which make it intentionally difficult for users to access naughtier content, Boodigo “is designed to find ‘real’ adult sites and give top listings to them,” Colin Rowntree, one of Boodigo’s founders, said in a press release. “That avoids the problem of going to Google, searching for, say, ‘blowjob’, and getting the first multiple results pages of Wikipedia articles, women’s magazine how-to guides, etc., before the online user can actually find a link to sites that focus on blowjob photos and movies.”

      Boodigo isn’t actually the first search engine designed exclusively for porn: There’s also Search.xxx, an adult-friendly mockup of Google, as well as PornMD. But unlike PornMD, which will take you directly to free tube sites (which many performers in the adult industry have claimed encourages the spread of illegal piracy), Boodigo is marketing itself as a search engine for the ethical porn aficionado: The site directs you to individual performer and studio pay sites, instead of sites that might feature illegally posted or unlicensed content.

      Curious about the potential of a porn search engine that encourages people to actually pay for porn, I decided to give Boodigo a whirl. I started with an easy one: adult performer and Duke porn star Belle Knox, whom I met at her birthday party earlier this year. Here’s what came up when I searched for Knox on Google, sans SafeSearch settings. And here’s Knox on Boodigo: these search results either link to Belle’s entries on various porn databases, or to pay sites that feature her work, where you have to again search for her there. (Not all of them even do: Baremaidens.com, for instance, which shows up in a Boodigo search, features performers named “Bailey Knox” and “Natasha Belle,” but not the Duke porn star herself.)

      Next, I tried “eel anal porn,” based on an unnamed coworker’s suggestion that a film called Eels Out the Ass Like Whoa is a real thing. When I searched on Google, the clip immediately came up in the second search result, for better or for worse. Sadly, that was not the case on Boodigo. Apparently, the site had some trouble differentiating between the specific niche I was searching for (i.e. eel anal porn) and good old-fashioned anal porn, which in the world of porn searches is kind of like being unable to tell the difference between a Burgundy and a Bordeaux and just saying, "meh, they're both red wines." Boodigo also pulled up a performer named “Anal Alan,” whom I had never heard of but who apparently has an empty YouTube channel. (Given that his height is listed as “0,” I guess it’s no surprise that his career never took off.)

      Because “eel anal porn” is admittedly fairly obscure, I decided to search for just “anal.” My luck was a little better with Boodigo this time around. Not so much with Google, however, which pulled up Wikipedia and the r/anal subreddit in lieu of actual anal porn. That's like asking for a glass of Bordeaux and getting a warm can of 7-Up instead. Shame on you, Google. Shame. On. You.

      So, OK, if it wants to go around calling itself the world’s first porn search engine, Boodigo obviously needs to work out a few kinks first. But in light of Google’s recent AdWords policy change restricting adult content advertising, many porn performers and producers have expressed concern that tech giants are increasingly censoring adult content, which might lead to them eliminating adult content from their platforms altogether. If that actually ever happens, a search engine like Boodigo won’t just be helpful to porn aficionados looking for a secure, anonymous, cookie-free J.O. experience; it’ll be necessary. Let’s just hope for the sake of eel anal enthusiasts that it tweaks its algorithms a bit first.

      Photo via morgueFile Archive (PD)

      Source: Can the 'world's first porn search engine' beat Google?
  14. Text dump websites are used by programmers and system administrators to share and store pieces of source code and configuration information. Two of the most popular text dump websites are pastebin and pastie. Day by day, more and more programmers, amateur system administrators and regular users are captivated by the attractive features of these web tools and use them to share large amounts of configuration and source code information. Therefore, as happens on every popular web platform, sensitive information sharing is inevitable. Potential attackers use these web platforms to gather information about their targets, while on the other side penetration testers search these sites to prevent critical information leakage.

      Most text dump web platforms offer a search mechanism, so anyone can manually query the database for matching strings. However, an automated script/tool capable of querying all these text dump websites and generating an overall search report would be very useful for the reconnaissance phase of a penetration test. Pen-testers can use such an automated tool to efficiently search for potential configuration and login credential leaks that would help an attacker profile the victim system and find a security hole.

      Recently I came across such a script on the web: pastenum. Pastenum is a ruby script written by Nullthreat, a member of the Corelan Team. It can query pastebin, pastie and github for user-defined strings and generate an overall html report with the search results.

      Installation information: http://redmine.corelan.be:8800/projects/corelan-pastenum/wiki
      Download: http://redmine.corelan.be:8800/attachments/download/477/Pastenum2.zip
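      The core idea is small enough to sketch in a few lines of Python (pastenum itself is Ruby; the search URLs below are illustrative placeholders, not the endpoints pastenum really queries, so check each site's current search page before relying on this):

          import requests

          SEARCH_URLS = {
              'pastebin': 'http://pastebin.com/search?q=%s',  # placeholder
              'pastie': 'http://pastie.org/search?q=%s',      # placeholder
          }

          def search_dumps(term):
              # Query each text-dump site for the term and collect hits
              # into one simple report, pastenum-style.
              report = {}
              for site, pattern in SEARCH_URLS.items():
                  resp = requests.get(pattern % term, timeout=10)
                  # Crude signal: does the results page mention the term at all?
                  report[site] = resp.status_code == 200 and term in resp.text
              return report

          print(search_dumps('example.com password'))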
  15. I know this sounds weird, I know I'll be considered a loser, and I know the post will get moved to the lamest-threads bin, maybe even take first place there... but... Can anyone get hold of tomorrow's BAC paper for the Romanian-language exam? I figure it's already uploaded on the inspectorate's site, but you probably need some registered account or full access to be able to find/download it... Still, if anyone is kind enough to try searching, I'll thank them from the bottom of my heart! Tnx ^^