sicilianul

At Facebook, zero-day exploits, backdoor code bring war games drill to life


I recommend reading the story below even though it's long.

How do companies prepare for the worst? By exposing workers to lifelike crises.

[Image: Facebook war-games illustration]

Early on Halloween morning, members of Facebook's Computer Emergency Response Team received an urgent e-mail from an FBI special agent who regularly briefs them on security matters. The e-mail contained a Facebook link to a PHP script that appeared to give anyone who knew its location unfettered access to the site's front-end system. It also referenced a suspicious IP address that suggested criminal hackers in Beijing were involved.

"Sorry for the early e-mail but I am at the airport about to fly home," the e-mail started. It was 7:01am. "Based on what I know of the group it could be ugly. Not sure if you can see it anywhere or if it's even yours."

[Image: the FBI agent's e-mail that kicked off the drill]

Facebook employees immediately dug into the mysterious code. What they found only heightened suspicions that something was terribly wrong. Facebook procedures require all code posted to the site to be handled by two members of its development team, and yet this script somehow evaded those measures. At 10:45am, the incident received a classification known as "unbreak now," the Facebook equivalent of the US military's emergency DEFCON 1 rating. At 11:04am, after identifying the account used to publish the code, the team learned that the engineer who owned the account knew nothing about the script. One minute later, they issued a takedown to remove the code from their servers.

With the initial threat contained, members of various Facebook security teams turned their attention to how it got there in the first place. A snippet of an online chat captures some of the confusion and panic:

Facebook Product Security: question now is where did this come from

Facebook Security Infrastructure Menlo Park: what's [IP ADDRESS REDACTED]

Facebook Security Infrastructure Menlo Park: registered to someone in beijing…

Facebook Security Infrastructure London: yeah this is complete sketchtown

Facebook Product Security: somethings fishy

Facebook Site Integrity: which means that whoever discovered this is looking at our code

If the attackers were able to post code on Facebook's site, it stood to reason, they probably still had that capability. Further, they may have left multiple backdoors on the network to ensure they would still have access even if any one of them was closed. More importantly, it wasn't clear how the attackers posted the code in the first place. During the next 24 hours, a couple dozen employees from eight internal Facebook teams scoured server logs, the engineer's laptop, and other crime-scene evidence until they had their answer: the engineer's fully patched laptop had been targeted by a zero-day exploit that allowed attackers to seize control of it.

This is only a test

The FBI e-mail, zero-day exploit, and backdoor code, it turns out, were part of an elaborate drill Facebook executives devised to test the company's defenses and incident responders. The goal: to create a realistic security disaster to see how well employees fared at unraveling and repelling it. While the attack was simulated, it contained as many real elements as possible.

The engineer's computer was compromised using a real zero-day exploit targeting an undisclosed piece of software. (Facebook promptly reported the vulnerability to the developer, which was notified before the drill was disclosed to the rest of Facebook's employees.) The exploit allowed a "red team" composed of current and former Facebook employees to access the company's code production environment. The PHP code on the Facebook site contained a real backdoor. (It was neutralized by adding comment characters in front of the operative functions.) Facebook even recruited one of its former developers to join the red team to maximize what could be done with the access. The FBI e-mail came at the request of Facebook employees, in an attempt to see how quickly and effectively the various teams could work together to discover and solve the problems.
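To make the neutralization step concrete: the idea is simply to disable the backdoor's operative call while leaving the planted file in place for investigators. The sketch below is purely illustrative, written in Python rather than PHP, with an invented handler name; it is not Facebook's actual code.

# Hypothetical illustration of "defanging" a planted web backdoor by
# commenting out its operative call -- not the real Facebook script.
import subprocess


def handle_request(params):
    """Pretend request handler that an attacker slipped into production."""
    cmd = params.get("cmd", "")

    # Original, dangerous behavior, disabled by turning it into a comment:
    # return subprocess.check_output(cmd, shell=True)

    # Defanged behavior: record the attempt instead of executing anything.
    print(f"blocked backdoor invocation: {cmd!r}")
    return b""


if __name__ == "__main__":
    handle_request({"cmd": "cat /etc/passwd"})  # now only logs the attempt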

"Internet security is so flawed," Facebook Chief Security Officer Joe Sullivan told Ars. "I hate to say it, but it seems everyone is in this constant losing battle if you read the headlines. We don't want to be part of those bad headlines."

The most recent dire security-related headlines came last week, when The New York Times reported that China-based hackers had been rooting through the publisher's corporate network for four months. They installed 45 separate pieces of custom-developed malware, almost all of which remained undetected. The massive hack, the NYT said, was carried out with the goal of identifying sources used to report a series of stories about the family of China's prime minister. Among other things, the attackers were able to retrieve password data for every NYT employee and access the personal computers of 53 workers, some of them located inside the publisher's newsroom.

As thorough and persistent as the NYT breach was, the style of attack is hardly new. In 2010, hackers penetrated the defenses of Google, Adobe Systems, and at least 32 other companies in the IT and pharmaceutical industries. Operation Aurora, as the hacking campaign came to be dubbed, exploited zero-day vulnerabilities in Microsoft's Internet Explorer browser and possibly other widely used programs. Once attackers gained a foothold on employee computers, they used that access to breach other, more sensitive, parts of the companies' networks.

The hacks allowed the attackers to make off with valuable Google intellectual property and information about dissidents who used the company's services. It also helped coin the term "advanced persistent threat," or APT, used to describe hacks that last weeks or months and target a specific organization possessing assets the attackers covet. Since then, reports of APTs have become a regular occurrence. In 2011, for instance, attackers breached the servers of RSA and stole information that could be used to compromise the security of two-factor authentication tokens sold by RSA, a division of EMC. A few months later, defense contractor Lockheed Martin said an attack on its network was aided by the theft of the confidential RSA data relating to its SecurID tokens, which some 40 million employees use to access sensitive corporate and government computer systems.

"That was the inspiration around all this stuff," Facebook Security Director Ryan "Magoo" McGeehan said of the company's drills. "You don't want the first time you deal with that to be real. You want something that you've done before in your back pocket."

Even after employees learned this particular hack was only for practice—about a half hour after the pseudo backdoor was closed—they still weren't told of the infection on the engineer's laptop or the zero-day vulnerability that was used to foist the malware. They spent the next 24 hours doing forensics on the computer and analyzing server logs to unravel that mystery. "Operation Loopback," as the drill was known internally, is notable for the pains it took to simulate a real breach on Facebook's network.

"They're doing penetration testing as it's supposed to be done," said Rob Havelt, director of penetration testing at security firm Trustwave. "A real pen test is supposed to have an end goal and model a threat. It's kind of cool to hear organizations do this."

He said the use of zero-day attacks is rare but by no means unheard of in "engagements," as specific drills are known in pen-testing parlance. He recalled an engagement from a few years ago of a "huge multinational company" that had its network and desktop computers fully patched and configured in a way that made them hard to penetrate. As his team probed the client's systems, members discovered 20 Internet-connected, high-definition surveillance cameras. Although the default administrator passwords had been changed, the Trustwave team soon discovered two undocumented backdoors built into the surveillance cameras' authentication system.

[Image: footage from one of the hacked surveillance cameras]

Havelt's team exploited the backdoors to remotely take control of the cameras. With the ability to view their output, change their direction, and zoom in and out, the Trustwave employees trained them on computer keyboards as employees in the unidentified company entered passwords. With the help of the cameras' 10x zoom, the pen testers were able to grab a "ton" of credentials and use them to log in to the company's network. From there, the employees escalated privileges to gain administrative control of the network. (The Trustwave team later reported the vulnerability to the camera manufacturer, resulting in the eventual release of a security advisory.)

We "ended up with domain admin on the internal network just because [the client] left these cameras on the Internet," Havelt said during a talk at last year's RSA conference.

Havelt recalled a separate engagement in the last 12 months that involved a different client. After his team gained access to a system on the company's internal network, the hired hackers injected malicious code into webpages regularly accessed by the company's developers. The malicious Java applet exploited a recently discovered vulnerability in the Java software framework that Oracle had yet to patch. With full access to one of the developers' machines, the payload installed a new set of cryptographic keys authorized to access the company's servers over SSH, or secure shell. With that significant toehold established, the pen testers were able to escalate their control over the client's network.
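The SSH-key technique is worth spelling out: OpenSSH lets anyone log in whose public key appears in a user's authorized_keys file, so writing one extra line there is enough for persistent remote access. A minimal, hypothetical sketch follows, with an invented key; it is not Trustwave's actual payload.

# Hypothetical sketch of SSH-key persistence -- the key below is made up.
from pathlib import Path

ATTACKER_PUBKEY = "ssh-ed25519 AAAAC3...attacker-key... intruder@example"

def plant_key(home: Path = Path.home()) -> None:
    """Append the attacker's public key to ~/.ssh/authorized_keys."""
    ssh_dir = home / ".ssh"
    ssh_dir.mkdir(mode=0o700, exist_ok=True)
    auth_keys = ssh_dir / "authorized_keys"
    existing = auth_keys.read_text() if auth_keys.exists() else ""
    if ATTACKER_PUBKEY not in existing:
        with auth_keys.open("a") as fh:
            fh.write(ATTACKER_PUBKEY + "\n")

# Defenders look for exactly this: unexpected entries showing up in
# authorized_keys files on developer machines and servers.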

Adriel Desautels, CEO of pen testing firm Netragard, is also no stranger to the use of zero-day exploits, although he said he's often able to infect his clients using less sophisticated methods. During a recent engagement for a sensitive US government agency, for instance, his team used social engineering to trick an agency employee into clicking on a link. The link, unbeknownst to the employee, installed "Radon," pseudo-malware designed by Netragard to give its employees the same kind of sophisticated access that many state-sponsored hackers behind espionage campaigns enjoy.

With the employee's desktop computer infected, Radon rummaged through the agency's network and added malicious commands to the "batch file" every computer ran when it logged in. The modified file caused each computer to also become infected with Radon. Seizing control of hundreds of independent machines gave the Netragard hackers a higher likelihood of maintaining persistence over the network, even in the event that the initial infection was discovered and cleaned up.
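As a concrete illustration of the logon-script trick: when every workstation runs a shared script at login, appending one line to that script re-infects the whole fleet. The sketch below is hypothetical, with an invented share path and payload name; it is not Netragard's Radon.

# Hypothetical sketch of logon-script persistence on a Windows domain.
from pathlib import Path

LOGON_SCRIPT = Path(r"\\corp-dc01\netlogon\login.bat")         # invented path
IMPLANT_LINE = r"start /b \\corp-dc01\share\implant_stub.exe"  # invented payload

def add_persistence(script: Path = LOGON_SCRIPT) -> None:
    """Append the implant launcher to the shared logon script, once."""
    text = script.read_text()
    if IMPLANT_LINE not in text:
        script.write_text(text.rstrip("\n") + "\n" + IMPLANT_LINE + "\n")

# Because every machine runs the script at login, persistence survives the
# cleanup of any single infected host -- which is what makes the technique
# hard to root out once discovered.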

"Eventually, it was game over," Desautels told Ars. "We had more control over their network than they did. That's how you do it. You don't just infect one system and stick it in their network and then try to infect the company. That doesn't guarantee you're going to be successful."

Desautels praised the architects of Operation Loopback because Facebook "did more than most other companies in this industry will do." But he went on to say that the engagement was significantly more limited than most attacks waged by well-funded and experienced hackers who are intent on penetrating a Fortune 500 company.

"If this were a real attack, they probably would have gone after multiple employees, especially with a zero day," he explained. "Why target one user when you have potentially hundreds of users you can target and get hundreds of points of entry?"

Facebook, he continued, "probably got some good insight. But [the engagement] is not nearly as realistic as it would be if it was a nation-state attack just because [Operation Loopback] was very singular."

Stress testing Facebook's incident response

To be fair, the drill Facebook executives devised wasn't intended to replicate every characteristic of a real-world attack. Instead, the executives wanted to develop employees' ability to work together to respond to an attack that could have a catastrophic effect on the site's security. Sullivan, Facebook's CSO, calls it a "stress test" of his incident response team.

"The team had grown substantially in the prior year, and we wanted to see if everyone is going to start screaming at each other or blaming each other because 'your logging system broke,' or 'your automated alerting should have triggered over here.' That was the human side of the test."

Operation Loopback also wasn't the first drill to test employees' ability to respond effectively in times of crisis. Six months earlier, McGeehan, the company's security director, installed a host of powerful hacking tools on a laptop computer, connected it to Facebook's internal wireless network, and stashed it behind a supply cabinet in a public hallway. A few days later, employees on the company's physical security team reported the discovery of the mysterious laptop to the security team, touching off another tense response. Over the following day, employees scouring server logs found that the laptop's MAC, or media access control, address had shown up in accesses to key parts of Facebook's network.
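Tracing a rogue device through server logs works roughly like this: once the laptop's MAC address is known, every DHCP, wireless-controller, and application log is searched for it. A minimal sketch with invented paths and a made-up address, not Facebook's actual tooling:

# Hypothetical sketch of hunting a rogue device by MAC address.
import re
from pathlib import Path

ROGUE_MAC = "de:ad:be:ef:00:42"                              # made-up address
LOG_DIRS = [Path("/var/log/dhcp"), Path("/var/log/wlan")]    # invented paths

def find_sightings(mac: str, log_dirs=LOG_DIRS):
    """Yield (file, line) pairs for every log line mentioning the MAC."""
    pattern = re.compile(re.escape(mac), re.IGNORECASE)
    for log_dir in log_dirs:
        for path in sorted(log_dir.glob("*.log")):
            for line in path.read_text(errors="replace").splitlines():
                if pattern.search(line):
                    yield path, line

if __name__ == "__main__":
    for path, line in find_sightings(ROGUE_MAC):
        print(f"{path}: {line}")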


"The first thing is: 'Oh my God. Panic,'" McGeehan said as he recalled his team's response to the incident. For almost 24 hours, the situation gave most employees every indication of being real. "As we're dealing with this, we realize that our network has been intruded on by some bad guy. Everyone in this room [is] thinking about 'how are we going to tear down our entire network? How are we going to basically deal with the worse-case scenario as a security incident?"

To ratchet up the stress even further, the drill organizers sent an e-mail to members of Facebook's security team a few hours after the laptop was disconnected from the Facebook network. The e-mail purported to come from the Koobface Gang, whose members last year were identified as the perpetrators of virulent malware that spread over the social networking site. It made a series of demands of Facebook and promised serious reprisals if they weren't met.

With Project Vampire, as the drill was dubbed, the employees worked a full 24 hours before they learned it wasn't a real hack.

"We felt it was a necessary thing to have a great security team to put them through this kind of stuff," Sullivan explained. The organizers made an exception, however, when early in the drill, an employee said the magnitude of the intrusion he was investigating would require him to cancel a vacation that was scheduled to begin the following week. McGeehan pulled the employee aside and explained it was only a drill and then instructed him to keep that information private.

Drills that use real zero-day vulnerabilities, require outside penetration testing firms, and suck up hundreds or thousands of man-hours on non-production activities are expensive to carry out. But in a post-Operation Aurora world, where companies as security-savvy as Google and RSA are hacked and ransacked of valuable data, they are becoming increasingly necessary.

"These things used to be unheard of when back when, except for governmental type organizations," Trustwave's Havelt said. "Now, you're seeing this more in the private sector. It's good to see. If it were any other industry and it was any other critical function of a product not doing this you'd have people screaming that [the companies] were negligent and wanting to sue them left and right."

Source: At Facebook, zero-day exploits, backdoor code bring war games drill to life | Ars Technica

Via: Digg - What the Internet is talking about right now
