
Leaderboard

Popular Content

Showing content with the highest reputation on 01/28/23 in all areas

  1. CloudBrute

A tool to find a company's (target's) infrastructure, files, and apps on the top cloud providers (Amazon, Google, Microsoft, DigitalOcean, Alibaba, Vultr, Linode). The outcome is useful for bug bounty hunters, red teamers, and penetration testers alike. The complete writeup is available here.

At a glance

Motivation
While working on HunterSuite, and as part of the job, we are always thinking of something we can automate to make black-box security testing easier. We discussed the idea of creating a multi-platform cloud brute-force hunter, mainly to find open buckets, apps, and databases hosted on the clouds, and possibly apps behind proxy servers.

Here is the list of issues we tried to fix:
- separated wordlists
- lack of proper concurrency
- lack of support for all major cloud providers
- requirement for authentication, keys, or cloud CLI access
- outdated endpoints and regions
- incorrect file storage detection
- lack of support for proxies (useful for bypassing region restrictions)
- lack of support for user-agent randomization (useful for bypassing rare restrictions)
- hard to use, poorly configured

Features
- Cloud detection (IPINFO API and source code)
- Supports all major providers
- Black-box (unauthenticated)
- Fast (concurrent)
- Modular and easily customizable
- Cross-platform (Windows, Linux, Mac)
- User-Agent randomization
- Proxy randomization (HTTP, SOCKS5)

Supported Cloud Providers
- Microsoft: Storage, Apps
- Amazon: Storage, Apps
- Google: Storage, Apps
- DigitalOcean: Storage
- Vultr: Storage
- Linode: Storage
- Alibaba: Storage

Version 1.0.0

Usage
Just download the latest release for your operating system and follow the usage. To make the best use of this tool, you have to understand how to configure it correctly. When you open your downloaded version, there is a config folder, and there is a config.yaml file in there. It looks like this:

providers: ["amazon","alibaba","amazon","microsoft","digitalocean","linode","vultr","google"] # supported providers
environments: [ "test", "dev", "prod", "stage" , "staging" , "bak" ] # used for mutations
proxytype: "http" # socks5 / http
ipinfo: "" # IPINFO.io API KEY

For the IPINFO API, you can register and get a free key at IPINFO. The environments are used to generate URLs, such as test-keyword.target.region and test.keyword.target.region, etc. We provide some wordlists out of the box, but it's better to customize and minimize your wordlists (based on your recon) before executing the tool. After setting up your API key, you are ready to use CloudBrute.

[CloudBrute ASCII banner] V 1.0.7

usage: CloudBrute [-h|--help] -d|--domain "<value>" -k|--keyword "<value>" -w|--wordlist "<value>" [-c|--cloud "<value>"] [-t|--threads <integer>] [-T|--timeout <integer>] [-p|--proxy "<value>"] [-a|--randomagent "<value>"] [-D|--debug] [-q|--quite] [-m|--mode "<value>"] [-o|--output "<value>"] [-C|--configFolder "<value>"]

Awesome Cloud Enumerator

Arguments:
-h --help          Print help information
-d --domain        domain
-k --keyword       keyword used to generate urls
-w --wordlist      path to wordlist
-c --cloud         force a search, check config.yaml providers list
-t --threads       number of threads. Default: 80
-T --timeout       timeout per request in seconds. Default: 10
-p --proxy         use proxy list
-a --randomagent   user agent randomization
-D --debug         show debug logs. Default: false
-q --quite         suppress all output. Default: false
-m --mode          storage or app. Default: storage
-o --output        Output file. Default: out.txt
-C --configFolder  Config path. Default: config

For example:

CloudBrute -d target.com -k target -m storage -t 80 -T 10 -w "./data/storage_small.txt"

Please note: the -k keyword is used to generate URLs, so if you want the full domain to be part of the mutation, you have to use it for both the domain (-d) and keyword (-k) arguments.

If a cloud provider is not detected, or you want to force searching on a specific provider, you can use the -c option:

CloudBrute -d target.com -k keyword -m storage -t 80 -T 10 -w -c amazon -o target_output.txt

Dev
Clone the repo, then:
go build -o CloudBrute main.go
go test internal

in action

How to contribute
- Add a module or fix something and then open a pull request.
- Share it with whomever you believe can use it.
- Do the extra work and share your findings with the community ♥

FAQ
How to make the best out of this tool? Read the usage.
I get errors; what should I do? Make sure you read the usage correctly, and if you think you found a bug, open an issue.
When I use proxies, I get too many errors, or it's too slow? That is because you are using public proxies; use private, higher-quality proxies. You can use ProxyFor to verify the good proxies with your chosen provider.
Too fast or too slow? Change the -T (timeout) option to get the best results for your run.

Credits
Inspired by every single repo listed here.

Sursa: https://github.com/0xsha/CloudBrute
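For illustration, a hedged sketch of the same invocation pattern pointed at application endpoints instead of storage, using only the documented flags; the wordlist path is a hypothetical placeholder, not a file guaranteed to ship with the tool:

# Hedged sketch: app-mode enumeration with results written to a file
# ./data/apps_small.txt is a placeholder wordlist path - substitute your own
CloudBrute -d target.com -k target -m app -t 80 -T 10 -w "./data/apps_small.txt" -o target_apps.txt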
    1 point
  2. Table of Contents
- Why is an IDS necessary?
- Hardware requirements
- Software requirements
- Switch Setup
- Install Elasticsearch, Kibana and Wazuh
- Configure Elasticsearch
- Configure Kibana
- Configure Filebeat
- Set up Suricata, Filebeat and the Rogue Access Point on the Raspberry Pi 4
- Configure Suricata
- Configure Filebeat
- Configure the Rogue Access Point
- Check the logs

Why is an IDS necessary?
The IDS analyses traffic flowing to the protected resource in order to detect and prevent exploits or other vulnerability issues. It can offer protection from external users as well as internal attackers, where traffic doesn't go past the firewall at all. In this article, I will explain how to build your own home network-based Intrusion Detection System (IDS) on a low budget.

NOTE: This network-based IDS could easily be turned into a strong SIEM by installing the Wazuh agent on all devices that are part of the infrastructure; see the example.

Hardware requirements
- Any router with multiple ports
- TP-Link TL-SG108E Smart Switch
- Raspberry Pi 4 8GB
- Netgear AC1200 network adapter
- 1 x DigitalOcean VPS, minimum requirements: 4 GB Memory / 50 GB Disk / Ubuntu 22.10 x64

Software requirements
- Elasticsearch
- Kibana
- Filebeat
- Filebeat modules
- Suricata

Switch Setup
First of all, we need to set the ports that we want to mirror; in this case, ports 1, 2, and 3 will be mirrored to port 8. Let's assume that you already have the Easy Smart Configuration Utility installed and configured.
- Log in to your switch's Windows application / web interface
- Go to "Monitoring"
- Choose the "Port Mirror" option in the left menu
All you have to do is change the Port Mirror status to enabled and the Mirroring Port to port 8, then enable "Ingress" and "Egress" for ports 1, 2, and 3, and click "Apply".

To check if the traffic is mirrored, log in to your Raspberry Pi and capture port 80 traffic while you do a browser/curl request to http://testphp.vulnweb.com/ from a device connected to port 1, 2, or 3. It looks good.
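A minimal capture sketch for that mirror check, assuming the Pi's wired interface is eth0 and tcpdump is installed (adjust the interface name to your setup):

# On the Raspberry Pi: watch mirrored HTTP traffic arriving on the wired interface
sudo tcpdump -i eth0 -n 'tcp port 80'

# On a device plugged into port 1, 2 or 3: generate a plain-HTTP request
curl http://testphp.vulnweb.com/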
Install Elasticsearch, Kibana and Wazuh
For this project, I used an Ubuntu VPS from DigitalOcean. As you probably know, these components require a lot of resources; if this system is used on a large scale, a distributed deployment is recommended. In my case I use an "All-in-one deployment", so I highly recommend a server with a minimum of 4 GB Memory / 50 GB Disk / Ubuntu 22.10 x64.

Log in and update your server:

apt-get update

Install the requirements:

apt-get install curl apt-transport-https zip unzip lsb-release libcap2-bin -y

Trust the GPG key and add Elasticsearch to your sources list:

curl -s https://artifacts.elastic.co/GPG-KEY-elasticsearch --max-time 300 | apt-key add -
echo 'deb https://artifacts.elastic.co/packages/7.x/apt stable main' | eval "tee /etc/apt/sources.list.d/elastic-7.x.list"

Trust the GPG key, add Wazuh to your sources list and update the system:

curl -s https://packages.wazuh.com/key/GPG-KEY-WAZUH --max-time 300 | apt-key add -
echo "deb https://packages.wazuh.com/4.x/apt/ stable main" | tee -a /etc/apt/sources.list.d/wazuh.list
apt-get update

Install the components:

apt-get install elasticsearch kibana=7.11.2 wazuh-manager filebeat -y

Configure Elasticsearch
Create the file below at /etc/elasticsearch/elasticsearch.yml:

network.host: 0.0.0.0
node.name: elasticsearch
cluster.initial_master_nodes: elasticsearch
# Transport layer
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.key: /etc/elasticsearch/certs/elasticsearch.key
xpack.security.transport.ssl.certificate: /etc/elasticsearch/certs/elasticsearch.crt
xpack.security.transport.ssl.certificate_authorities: /etc/elasticsearch/certs/ca/ca.crt
# HTTP layer
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.verification_mode: certificate
xpack.security.http.ssl.key: /etc/elasticsearch/certs/elasticsearch.key
xpack.security.http.ssl.certificate: /etc/elasticsearch/certs/elasticsearch.crt
xpack.security.http.ssl.certificate_authorities: /etc/elasticsearch/certs/ca/ca.crt
# Elasticsearch authentication
xpack.security.enabled: true
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch

Note: You can use the same configuration.

Create the file below at /usr/share/elasticsearch/instances.yml:

instances:
  - name: "elasticsearch"
    ip:
      - "127.0.0.1"

Note: Don't forget to change your public IP.

Generate the certificates using the bash script below:

#!/bin/bash
/usr/share/elasticsearch/bin/elasticsearch-certutil cert ca --pem --in instances.yml --keep-ca-key --out ~/certs.zip
unzip ~/certs.zip -d ~/certs
mkdir /etc/elasticsearch/certs/ca -p
cp -R ~/certs/ca/ ~/certs/elasticsearch/* /etc/elasticsearch/certs/
chown -R elasticsearch: /etc/elasticsearch/certs
chmod -R 500 /etc/elasticsearch/certs
chmod 400 /etc/elasticsearch/certs/ca/ca.* /etc/elasticsearch/certs/elasticsearch.*

Start Elasticsearch:

systemctl start elasticsearch

Generate the passwords:

/usr/share/elasticsearch/bin/elasticsearch-setup-passwords auto -b

You will receive an output like this:

Check if it works by accessing https://<your-public-ip>:9200 with the user elastic and your generated password.
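A hedged command-line version of that check, assuming you query from a host that can reach port 9200 (the -k flag skips certificate verification because the certificate is self-signed):

# Quick sanity check of the secured HTTP endpoint; substitute your generated password
curl -k -u elastic:<your_password> https://<your-public-ip>:9200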
Configure Kibana
Create the file below at /etc/kibana/kibana.yml:

server.host: 0.0.0.0
server.port: 443
elasticsearch.hosts: https://localhost:9200
elasticsearch.password: <elasticsearch_password>
# Elasticsearch from/to Kibana
elasticsearch.ssl.certificateAuthorities: /etc/kibana/certs/ca/ca.crt
elasticsearch.ssl.certificate: /etc/kibana/certs/kibana.crt
elasticsearch.ssl.key: /etc/kibana/certs/kibana.key
# Browser from/to Kibana
server.ssl.enabled: true
server.ssl.certificate: /etc/kibana/certs/kibana.crt
server.ssl.key: /etc/kibana/certs/kibana.key
# Elasticsearch authentication
xpack.security.enabled: true
elasticsearch.username: elastic
uiSettings.overrides.defaultRoute: "/app/wazuh"
elasticsearch.ssl.verificationMode: certificate

Note: Don't forget to add your generated password.

Configure the Kibana certs and install the Wazuh plugin:

#!/bin/bash
mkdir /usr/share/kibana/data
chown -R kibana:kibana /usr/share/kibana/
cd /usr/share/kibana
sudo -u kibana /usr/share/kibana/bin/kibana-plugin install https://packages.wazuh.com/4.x/ui/kibana/wazuh_kibana-4.1.5_7.11.2-1.zip
mkdir /etc/kibana/certs/ca -p
cp -R /etc/elasticsearch/certs/ca/ /etc/kibana/certs/
cp /etc/elasticsearch/certs/elasticsearch.key /etc/kibana/certs/kibana.key
cp /etc/elasticsearch/certs/elasticsearch.crt /etc/kibana/certs/kibana.crt
chown -R kibana:kibana /etc/kibana/
chmod -R 500 /etc/kibana/certs
chmod 440 /etc/kibana/certs/ca/ca.* /etc/kibana/certs/kibana.*
setcap 'cap_net_bind_service=+ep' /usr/share/kibana/node/bin/node

Start wazuh-manager:

systemctl start wazuh-manager

Configure Filebeat
Create the following file at /etc/filebeat/filebeat.yml:

# Wazuh - Filebeat configuration file
output.elasticsearch.hosts: ["127.0.0.1:9200"]
output.elasticsearch.password: <elasticsearch_password>
filebeat.modules:
  - module: wazuh
    alerts:
      enabled: true
    archives:
      enabled: false
setup.template.json.enabled: true
setup.template.json.path: /etc/filebeat/wazuh-template.json
setup.template.json.name: wazuh
setup.template.overwrite: true
setup.ilm.enabled: false
output.elasticsearch.protocol: https
output.elasticsearch.ssl.certificate: /etc/elasticsearch/certs/elasticsearch.crt
output.elasticsearch.ssl.key: /etc/elasticsearch/certs/elasticsearch.key
output.elasticsearch.ssl.certificate_authorities: /etc/elasticsearch/certs/ca/ca.crt
output.elasticsearch.username: elastic

Don't forget to edit the output.elasticsearch.hosts and output.elasticsearch.password parameters.

Download the wazuh-filebeat module and copy the certificates:

#!/bin/bash
curl -so /etc/filebeat/wazuh-template.json https://raw.githubusercontent.com/wazuh/wazuh/4.1/extensions/elasticsearch/7.x/wazuh-template.json --max-time 300
chmod go+r /etc/filebeat/wazuh-template.json
curl -s https://packages.wazuh.com/4.x/filebeat/wazuh-filebeat-0.1.tar.gz --max-time 300 | tar -xvz -C /usr/share/filebeat/module
mkdir /etc/filebeat/certs
cp -r /etc/elasticsearch/certs/ca/ /etc/filebeat/certs/
cp /etc/elasticsearch/certs/elasticsearch.crt /etc/filebeat/certs/filebeat.crt
cp /etc/elasticsearch/certs/elasticsearch.key /etc/filebeat/certs/filebeat.key

Start Filebeat:

systemctl start filebeat

Test the config:

filebeat test output

Ok, it looks good.

Set up Suricata, Filebeat and the Rogue Access Point on the Raspberry Pi 4
In order to install Filebeat, the source below should be added:
curl -s https://packages.wazuh.com/key/GPG-KEY-WAZUH --max-time 300 | apt-key add -
echo "deb https://packages.wazuh.com/4.x/apt/ stable main" | tee -a /etc/apt/sources.list.d/wazuh.list
apt-get update
apt install filebeat
apt install suricata

Configure Suricata
Make sure that the file /etc/systemd/system/suricata.service looks like this:

[Unit]
Description=Suricata Intrusion Detection Service
After=network.target syslog.target

[Service]
ExecStart=/usr/bin/suricata -c /etc/suricata/suricata.yaml -i eth0 -S /var/lib/suricata/rules/suricata.rules
ExecReload=/bin/kill -HUP $MAINPID
ExecStop=/bin/kill $MAINPID

[Install]
WantedBy=multi-user.target

Start Suricata:

sudo systemctl start suricata

To test your IDS, run the following script on any device whose traffic is mirrored. On your Raspberry Pi, use the command below to watch the logs:

sudo tail -f /var/log/suricata/fast.log

Configure Filebeat
Create the following file at /etc/filebeat/filebeat.yml:

# Wazuh - Filebeat configuration file
output.elasticsearch.hosts: ["206.189.6.131:9200"]
output.elasticsearch.username: elastic
output.elasticsearch.password: wB1t1Fhp7snQgsg0TaAY
filebeat.modules:
  - module: wazuh
    alerts:
      enabled: true
    archives:
      enabled: false
filebeat.config.modules:
  path: /etc/filebeat/modules.d/*.yml
setup.template.json.enabled: true
setup.template.json.path: /etc/filebeat/wazuh-template.json
setup.template.json.name: wazuh
setup.template.overwrite: true
setup.ilm.enabled: false
output.elasticsearch.ssl.verification_mode: none
output.elasticsearch.protocol: https
output.elasticsearch.ssl.certificate: /home/<any-location>/elastic-certs/certs/elasticsearch.crt
output.elasticsearch.ssl.key: /home/<any-location>/elastic-certs/certs/elasticsearch.key
output.elasticsearch.ssl.certificate_authorities: /home/<any-location>/elastic-certs/certs/ca/ca.crt
output.elasticsearch.username: elastic

Copy the certificates from the manager server to your Raspberry Pi:

scp -r root@<digital-server>:/etc/elasticsearch/certs/ /home/<any-location>/

Edit the output.elasticsearch.ssl.certificate, output.elasticsearch.ssl.key and output.elasticsearch.ssl.certificate_authorities parameters according to your certificate locations.

Enable the Suricata module:

sudo filebeat modules enable suricata

Let's check the Filebeat modules:

sudo filebeat modules list

Configure the Suricata module (/etc/filebeat/modules.d/suricata.yml) as below:

# Module: suricata
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.9/filebeat-module-suricata.html
- module: suricata
  # All logs
  eve:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/var/log/suricata/eve.json"]

Start Filebeat:

sudo systemctl start filebeat

Test the Filebeat output.

NOTE: As you probably noticed, the Filebeat configuration files on the Elasticsearch & Kibana (DigitalOcean) server and on the Raspberry Pi differ, because a different Filebeat version is used due to the Raspberry Pi's architecture. I also want to mention that the certificates are usually generated for only one IP, which is why the option output.elasticsearch.ssl.verification_mode: none is used.

Configure the Rogue Access Point
Q: Ok, maybe you are asking why I use a Rogue Access Point instead of a second Wi-Fi router?
A: With the AC1200 adapter acting as the access point, the entire traffic can be manipulated, which is not possible with a conventional router.
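Before configuring hostapd, a quick check, assuming the AC1200 adapter is driven by a standard nl80211 driver, is to confirm that the hardware supports AP mode at all:

# Look for "AP" in the supported interface modes of the wireless hardware
iw list | grep -A 10 "Supported interface modes"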
Note: To see what my system looks like, see the pic below.

Install the requirements:

sudo apt install iptables hostapd dnsmasq

Create your config file /etc/hostapd/hostapd.conf:

interface=wlan1
ssid=Syzhack
channel=4
hw_mode=g
wpa=3
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP CCMP
wpa_passphrase=Yours4f3pass
auth_algs=3
ap_max_inactivity=99999
ieee80211n=1
wmm_enabled=1

Adapt the script below with your configuration details; the wlan1 IP range can also be modified.

Note: You could run it in the background using a screen session or just create a systemd unit. Don't forget to set Burp Suite to listen on all interfaces and enable the "invisible proxying" option.

#!/bin/bash
airmon-ng check kill
ip link set dev wlan1 up
ip a a <your-range>/24 dev wlan1
iptables -F -t nat
iptables -X -t nat
iptables -F
iptables -X
iptables -P FORWARD ACCEPT
iptables -t nat -A POSTROUTING -s <your-range>/24 -o eth0 -j MASQUERADE
echo 1 > /proc/sys/net/ipv4/ip_forward
sysctl -w net.ipv4.ip_forward=1
sysctl -w net.ipv4.conf.all.send_redirects=0
iptables -t nat -A PREROUTING -i wlan1 -p tcp --dport 80 -j DNAT --to-destination <your-burp-proxy-ip>:8080
iptables -t nat -A PREROUTING -i wlan1 -p tcp --dport 443 -j DNAT --to-destination <your-burp-proxy-ip>:8080
sleep 3
hostapd -T -B /etc/hostapd/hostapd.conf -f /var/log/hostapd.log
sleep 3
dnsmasq -d -i wlan1

Run the script:

sudo bash mitm.sh

Check the Burp Suite results.

Check the logs
Go to your Kibana dashboard using your IP and the configured port (443 - https://<ip>) and wait for the Wazuh plugin to be configured automatically; if an error occurs, please wait.

The Wazuh logs should be displayed as follows:

Suricata logs:

I have been thinking about doing an experiment like this for a long time, and I implemented some of the ideas from this discussion. If you have any questions/feedback, I am at your disposal. // It looks better on the blog.
    1 point
  3. Prowler

Prowler is a Network Vulnerability Scanner implemented on a Raspberry Pi Cluster, first developed during the Singapore Infosec Community Hackathon - HackSmith v1.0.

Capabilities
- Scan a network (a particular subnet or a list of IP addresses) for all IP addresses associated with active network devices
- Determine the type of devices using fingerprinting
- Determine if there are any open ports on the device
- Associate the ports with common services
- Test devices against a dictionary of factory-default and common credentials
- Notify users of security vulnerabilities through a dashboard

Dashboard tour

Planned capabilities
- Greater variety of vulnerability assessment capabilities (webapp etc.)
- Select wordlist based on fingerprint

Hardware
- Raspberry Pi Cluster HAT (with 4 * Pi Zero W)
- Raspberry Pi 3
- Networking device

Software Stack
- Raspbian Stretch (Controller Pi)
- Raspbian Stretch Lite (Worker Pi Zero)
Note: For ease of setup, use the images provided by Cluster HAT!

Instructions
- Python 3 (not tested on Python 2)
- Python packages: see requirements.txt
- Ansible for managing the cluster as a whole (/playbooks)

Key Python Packages
- dispy (website) is the star of the show. It allows us to create a job queue that will be processed by the worker nodes.
- python-libnmap is the Python wrapper around nmap, an open source network scanner. It allows us to scan for open ports on devices.
- paramiko is a Python wrapper around SSH. We use it to probe SSH on devices to test for common credentials.
- eel is used for the web dashboard (separate repository, here)
- rabbitmq (website) is used to pass the results from the cluster to the eel server that is serving the dashboard page.

Ansible Playbooks
For the playbooks to work, Ansible must be installed (sudo pip3 install ansible). Configure the IP addresses of the nodes at /etc/ansible/hosts (a hedged sketch of such a file appears at the end of this item). WARNING: Your mileage may vary, as these were only tested on my setup.
- shutdown.yml and reboot.yml: self-explanatory
- clone_repos.yml: clones the prowler and dispy repositories (required!) on the worker nodes
- setup_node.yml: installs all required packages on the worker nodes. Does not clone the repositories!

Deploying Prowler
- Clone the git repository: git clone https://github.com/tlkh/prowler.git
- Install dependencies by running sudo pip3 install -r requirements.txt on the controller Pi
- Run ansible-playbook playbooks/setup_node.yml to install the required packages on the worker nodes
- Clone the prowler and dispy repositories to the worker nodes using ansible-playbook playbooks/clone_repos.yml
- Run clusterhat on on the controller Pi to ensure that all Pi Zeros are powered up
- Run python3 cluster.py on the controller Pi to start Prowler

To edit the range of IP addresses being scanned, edit the following lines in cluster.py:

test_range = []
for i in range(0, 1):
    for j in range(100, 200):
        test_range.append("172.22." + str(i) + "." + str(j))

Old Demos
- Cluster Scan Demonstration Jupyter Notebook
- Single Scan Demonstration Jupyter Notebook
- Try out the web dashboard here

Useful Snippets
- To run an ssh command on multiple devices, install pssh and run: pssh -h pssh-hosts -l username -A -i "command"
- To create the cluster (in compute.py): cluster = dispy.JobCluster(compute, nodes='pi0_ip', ip_addr='pi3_ip')
- Check connectivity: ansible all -m ping or ping p1.local -c 1 && ping p2.local -c 1 && ping p3.local -c 1 && ping p4.local -c 1
- Temperature check: /opt/vc/bin/vcgencmd measure_temp && pssh -h workers -l pi -A -i "/opt/vc/bin/vcgencmd measure_temp" | grep temp
- rpimonitor (how to install):

Contributors: Faith See, Wong Chi Seng, Timothy Liu

ABSOLUTELY NO WARRANTY WHATSOEVER! Feel free to submit issues though.

Download: prowler-master.zip

Source
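Referring back to the Ansible setup above, a minimal /etc/ansible/hosts sketch, assuming the four Pi Zeros resolve as p1.local through p4.local as in the connectivity check; the group name is an arbitrary choice:

# Hypothetical inventory for the ClusterHAT worker nodes
cat <<'EOF' | sudo tee /etc/ansible/hosts
[workers]
p1.local
p2.local
p3.local
p4.local
EOF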
    1 point
  4. I ordered a Raspberry Pi Zero and I am going to try to build a PoC. I will keep you posted on how it goes.
    1 point
  5. The Backdoor Factory Proxy (BDFProxy) v0.2

For security professionals and researchers only.

NOW ONLY WORKS WITH MITMPROXY >= v0.11

To install on Kali:
apt-get update
apt-get install bdfproxy

DerbyCon 2014 Presentation: about 18 minutes in is the BDFProxy portion.

Contact the developer on:
IRC: irc.freenode.net #BDFactory
Twitter: @Midnite_runr

This script rides on two libraries for usage: The Backdoor Factory (BDF) and mitmProxy.

Concept: Patch binaries during download, a la MITM.

Why: Because a lot of security tool websites still serve binaries via non-SSL/TLS means. Here's a short list:
- sysinternals.com
- Microsoft - MS Security Essentials
- Almost all anti-virus companies
- Malwarebytes
- Sourceforge
- gpg4win
- Wireshark
- etc...

Yes, some of those apps are protected by self-checking mechanisms. I've been working on a way to automatically bypass NSIS checks as a proof of concept. However, that does not stop the initial issue of bit-flipping during download and the execution of a malicious payload. Also, BDF by default will patch out the Windows PE certificate table pointer during download, thereby removing the signature from the binary.

Depends:
- Pefile - most recent
- ConfigObj
- mitmProxy - Kali build .10
- BDF - most current
- Capstone (part of BDF)

Supported Environment: Tested on all Kali Linux builds; whether a physical beefy laptop, a Raspberry Pi, or a VM, each can run BDFProxy.

Install:
BDF is in bdf/. Run the following to pull down the most recent:
./install.sh
OR:
git clone https://github.com/secretsquirrel/the-backdoor-factory bdf/
If you get a certificate error, run the following:
mitmproxy
and exit [Ctrl+C] after mitmProxy loads.

Usage:
Update everything before each use:
./update.sh

READ THE CONFIG!!! --> bdfproxy.cfg
You will need to configure your C2 host and port settings before running BDFProxy. DO NOT overlap C2 PORT settings between different payloads. You'll be sending Linux shells to Windows machines and things will be segfaulting all over the place. After running, there will be a Metasploit resource script created to help with setting up your C2 communications. Check it carefully. By the way, everything outside the [Overall] section updates on the fly, so you don't have to kill your proxy to change settings to work with your environment.

But wait! You will need to configure your mitm machine for mitm-ing! If you are using a WiFi Pineapple, I modded a script put out by Hak5 to help you with configuration. Run ./wpBDF.sh and enter the correct configs for your environment. This script configures iptables to push only http (non-SSL) traffic through the proxy; all other traffic is forwarded normally (see the sketch at the end of this item).

Then:
./bdf_proxy.py

Here's some sweet ascii art for possible physical settings of the proxy:
Lan usage: <Internet>----<mitmMachine>----<userLan>
Wifi usage: <Internet>----<mitmMachine>----<wifiPineapple>)))

Testing:
Suppose you want to test your setup by connecting with Firefox and FoxyProxy. Update your config as follows:
transparentProxy = None
Configure FoxyProxy to use BDFProxy as a proxy. The default port in the config is 8080.

Logging:
We have it. The proxy window will quickly fill with massive amounts of cat links depending on the client you are testing. Use tail -f proxy.log to see what is getting patched and blocked by your blacklist settings. However, keep an eye on the main proxy window if you have chosen to patch binaries manually; things move fast and behind the scenes there is multi-threading of traffic, but the initial requests and responses are locking for your viewing pleasure.

Attack Scenarios (all with permission of targets):
- Evil Wifi AP
- ARP redirection
- Physical plant in a wiring closet
- Logical plant at your favorite ISP

Sursa: https://github.com/secretsquirrel/BDFProxy
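As an illustration of the kind of rule wpBDF.sh applies, a minimal sketch, assuming the proxy listens transparently on port 8080 and clients sit behind a LAN interface whose name is a placeholder here:

# Push only plain HTTP from the LAN side into the proxy; everything else is forwarded normally
iptables -t nat -A PREROUTING -i <lan-iface> -p tcp --dport 80 -j REDIRECT --to-port 8080
echo 1 > /proc/sys/net/ipv4/ip_forward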
    1 point
  6. Protecting SCADA From the Ground Up – PDF Detecting Bluetooth Surveillance Systems – PDF Dropping Docs on Darknets: How People Got Caught – PDF Hacking 911: Adventures in Disruption, Destruction, and Death – PDF How to Disclose an Exploit Without Getting in Trouble – PDF Reverse Engineering Mac Malware – PDF NSA Playset: PCIe – PDF The Monkey in the Middle: A pentesters guide to playing in traffic. – PDF Investigating PowerShell Attacks – PDF Is This Your Pipe? Hijacking the Build Pipeline. – PDF Screw Becoming A Pentester – When I Grow Up I Want To Be A Bug Bounty Hunter! – PDF Home Alone with localhost: Automating Home Defense – PDF Meddle: Framework for Piggy-back Fuzzing and Tool Development – PDF Instrumenting Point-of-Sale Malware: A Case Study in Communicating Malware Analysis More Effectively – PDF White Paper One Man Shop: Building an effective security program all by yourself – PDF RF Penetration Testing, Your Air Stinks – PDF Touring the Darkside of the Internet. An Introduction to Tor, Darknets, and Bitcoin – PDF USB for all! – PDF ShareEnum: We Wrapped Samba So You Don’t Have To – PDF An Introduction to Back Dooring Operating Systems for Fun and Trolling – PDF Android Hacker Protection Level 0 – PDF Anatomy of a Pentest; Poppin’ Boxes like a Pro – PDF Bug Bounty Programs Evolution – PDF Extras Practical Foxhunting 101 – PDF Client-Side HTTP Cookie Security: Attack and Defense – PDF Bypass firewalls, application white lists, secure remote desktops under 20 seconds – PDF PropLANE: Kind of keeping the NSA from watching you pee – PDF Getting Windows to Play with Itself: A Hacker’s Guide to Windows API Abuse – PDF Weaponizing Your Pets: The War Kitteh and the Denial of Service Dog – PDF Through the Looking-Glass, and What Eve Found There – PDF White Paper Summary of Attacks Against BIOS and Secure Boot – PDF I am a legend: Hacking Hearthstone with machine learning – PDF The Secret Life of Krbtgt – PDF The $env:PATH less Traveled is Full of Easy Privilege Escalation Vulns – PDF Hacking US (and UK, Australia, France, etc.) traffic control systems – PDF The Cavalry Year[0] & a Path Forward for Public Safety – PDF NSA Playset: DIY WAGONBED Hardware Implant over I2C – PDF Abuse of Blind Automation in Security Tools – PDF Why Don’t You Just Tell Me Where The ROP Isn’t Suppose To Go – PDF Steganography in Commonly Used HF Radio Protocols – PDF Extras Saving Cyberspace by Reinventing File Sharing – PDF Empowering Hackers to Create a Positive Impact – PDF Just What The Doctor Ordered? – PDF Check Your Fingerprints: Cloning the Strong Set – PDF Shellcodes for ARM: Your Pills Don’t Work on Me, x86 – PDF Blowing up the Celly – Building Your Own SMS/MMS Fuzzer – PDF Mass Scanning the Internet: Tips, Tricks, Results – PDF Deconstructing the Circuit Board Sandwich: Effective Techniques for PCB Reverse Engineering – PDF Saving the Internet (for the Future) – PDF Burner Phone DDOS 2 dollars a day : 70 Calls a Minute – PDF Hack All The Things: 20 Devices in 45 Minutes – PDF Stolen Data Markets: An Economic and Organizational Assessment – PDF Raspberry MoCA – A recipe for compromise – PDF White Paper 1 White Paper 2 Girl… Fault-Interrupted. 
– PDF Extreme Privilege Escalation On Windows 8/UEFI Systems – PDF White Paper NinjaTV – Increasing Your Smart TV’s IQ Without Bricking It – PDF Oracle Data Redaction is Broken – PDF Weird-Machine Motivated Practical Page Table Shellcode & Finding Out What’s Running on Your System – PDF Catching Malware En Masse: DNS and IP Style – PDF White Paper Attacking the Internet of Things using Time – PDF Open Source Fairy Dust – PDF Learn how to control every room at a luxury hotel remotely: the dangers of insecure home automation deployment – PDF White Paper Generating ROP payloads from numbers – PDF DEF CON Comedy Jam Part VII, Is This The One With The Whales? – PDF The NSA Playset: RF Retroreflectors – PDF 1 PDF 2 VoIP Wars: Attack of the Cisco Phones – PDF Playing with Car Firmware or How to Brick your Car – PDF Measuring the IQ of your Threat Intelligence feeds – PDF Secure Because Math: A Deep Dive On Machine Learning-Based Monitoring – PDF Abusing Software Defined Networks – PDF NSA Playset : GSM Sniffing – PDF Cyberhijacking Airplanes: Truth or Fiction? – PDF Am I Being Spied On? Low-tech Ways Of Detecting High-tech Surveillance – PDF Detecting and Defending Against a Surveillance State – PDF Acquire current user hashes without admin privileges – PDF You’re Leaking Trade Secrets – PDF Veil-Pillage: Post-exploitation 2.0 – PDF From Raxacoricofallapatorius With Love: Case Studies In Insider Threat – PDF Don’t DDoS Me Bro: Practical DDoS Defense – PDF Advanced Red Teaming: All Your Badges Are Belong To Us – PDF I Hunt TR-069 Admins: Pwning ISPs Like a Boss – PDF The Only Way to Tell the Truth is in Fiction: The Dynamics of Life in the National Security State – PDF A Journey to Protect Points-of-sale – PDF Impostor — Polluting Tor Metadata – PDF Domain Name Problems and Solutions – PDF White Paper Optical Surgery; Implanting a DropCam – PDF Manna from Heaven: Improving the state of wireless rogue AP attacks – PDF The Open Crypto Audit Project – PDF Practical Aerial Hacking & Surveillance – PDF White Paper From root to SPECIAL: Pwning IBM Mainframes – PDF PoS Attacking the Traveling Salesman – PDF Don’t Fuck It Up! – PDF Source
    1 point
  7. This is a free 12-lesson course which will bend your brain a little, so it is not for the faint-hearted. As long as you like a tough mental challenge, this bare-metal course will give you a solid understanding of programming basics.

Requirements
In order to complete this course you will need a Raspberry Pi with an SD card and power supply, as well as another computer running a version of Linux, Microsoft Windows or Mac OS X that is capable of writing to the SD card and installing software. It is helpful, but not necessary, for your Raspberry Pi to be connected to a screen. In terms of software, you require a GNU compiler toolchain that targets ARMv6. Links for downloads are available on the Downloads Page, along with model answers for all of the exercises.

http://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/os/
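One hedged way to confirm such a toolchain is available before starting; arm-none-eabi-gcc is the usual name for a bare-metal ARM cross-compiler, though your download may use a different prefix:

# Verify the cross-compiler is on the PATH and note its version
arm-none-eabi-gcc --version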
    1 point