Episode 163

From Paul's Security Weekly

Sponsors

  • Tenable Network Security - This episode is sponsored by Tenable Network Security. Tenable is a developer of enterprise vulnerability, compliance and log management software, and most notably the creator of Nessus, the world's best vulnerability scanner. Tenable's Security Center software extends the power of Nessus through reporting, remediation workflow, IDS event correlation and much more. Tenable also offers a Nessus Professional Feed to detect vulnerabilities in your network today! Tenable – Unified Security Monitoring!
  • Core Security - This episode is also sponsored by Core Security Technologies, helping you penetrate your network. Now version 10.0 with WiFi-fu good to go! Rock out with your 'sploit out! Listen to this podcast and qualify to receive a 10% discount on Core Impact, the world's best penetration testing tool.
  • Trustwave SpiderLabs - Trustwave's SpiderLabs, providing advanced information security services to planet Earth. Visit them online at trustwave.com/spiderlabs!

Announcements & Shameless Plugs

PaulDotCom Security Weekly - Episode 163 - August 13th, 2009

  • BruCON in Brussels, baby. Are you in Europe around September 16th through the 19th? Doesn't matter. Be there!
  • The Louisville Metro InfoSec Conference in, well, Louisville, features John Strand as the keynote speaker and PaulDotCom's own Paul Asadoorian as a breakout speaker. If that were not enough, they will also have a Capture The Flag event and Irongeek! All of the above for the very low price of $99 on October 8th.

Episode Media

mp3

Interview: Roelof Temmingh

Roelof has been working in the security industry for 15 years. After completing his degree in electronic engineering, he worked for a crypto development house before co-founding SensePost in 2000 as technical director. He later headed up the innovation centre, where he developed many successful security assessment tools, and he has contributed to several books (e.g., Aggressive Network Self-Defense, How to Own a Continent, Nessus Network Auditing). At the start of 2007 he left SensePost to start Paterva, a privately held company that develops information gathering tools and provides information gathering services.

Roelof is with us to discuss Maltego Mesh, version 3 of Maltego, and the problems of 1:n / n:1 "conversations" leading to virtual populations of people who can sway public opinion, create FUD, or leave false footprints.

According to the video introduction, Maltego Mesh is a Firefox extension that parses interesting information from your browser's rendered page, like e-mail addresses, names, netblocks and IP addresses, then seamlessly exports them into Maltego.

Maltego is an open source intelligence and forensics application. It offers timely mining and gathering of information, as well as the representation of this information in an easy-to-understand format. Coupled with its graphing libraries, Maltego allows you to identify key relationships between pieces of information and uncover previously unknown relationships between them.

Tech Segment: Web App Testing: Become Spiderman

Spidering is important: it provides the foundation for your attacks. If you don't know what is on the web server, how do you know what to attack? Of course, manual spidering is the way to go. This method involves setting up a proxy (like TRAT or WebScarab) and just browsing the site. Use the site as it was intended to be used, recording all of the transactions. I highly recommend that you do this, as you may find things that a web spider may miss. Of course, the web spider is going to find things that you may miss in the manual inspection, such as hidden directories and other elements of the site that are not necessarily accessible during normal operations.

Fortunately there are several tools out there to help. We'll go through a few here, but I want to point out that this is different from mirroring. Mirroring is where you just want to make a copy of the web site, such as with wget or httrack. There is value in doing this! We do it on every pen test, mostly to analyze the metadata and to be able to view and search the HTML and JavaScript with local tools (like grep). What are we looking for? Comments in the code, version numbers of software running, internal IP addresses; all of this can be found (sometimes) in the site's code. Metadata, well, we talked A LOT about that in the past.
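Here's a minimal sketch of that local grep pass over a mirror, assuming the copy landed in a directory called mirror/; the file extensions and patterns are just examples:

import os
import re

# Walk a local mirror (e.g. from wget or httrack) and flag HTML comments
# and internal (RFC 1918) IP addresses. Directory name is an example.
MIRROR_DIR = "mirror"

comment_re = re.compile(r"<!--(.*?)-->", re.DOTALL)
ip_re = re.compile(r"\b(?:10|192\.168|172\.(?:1[6-9]|2\d|3[01]))(?:\.\d{1,3}){1,3}\b")

for root, _dirs, files in os.walk(MIRROR_DIR):
    for name in files:
        if not name.endswith((".html", ".htm", ".js", ".css")):
            continue
        path = os.path.join(root, name)
        with open(path, errors="ignore") as f:
            text = f.read()
        for comment in comment_re.findall(text):
            print(f"{path}: comment: {comment.strip()[:80]}")
        for addr in ip_re.findall(text):
            print(f"{path}: internal IP: {addr}")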

Spider Features

Important features your spider should have (a minimal sketch tying them together follows the list):

  • It should check robots.txt - A great way to get a glimpse into the web site and see some hidden directories is robots.txt. Well, sometimes! Some people are sneaky and will put nasty stuff in robots.txt. For example, if you access /private because it was in robots.txt, you may get logged or even banned from the web site. Whoops.
  • Support cookies - No question, one of the greatest challenges to spidering (or mirroring) is authentication and cookies. Many web sites expect to set cookies, and change the behavior of the site based on these cookies. For example, DVWA has a cookie that determines the security level of the site. The site acts weird if this cookie is not set, and therefore your spider will miss stuff. Not good!
  • Support other forms of authentication - Yes, that's a pun :) Ideally you should have the ability to enter a password for Basic auth (or Digest), and be able to enter information about a login form. This allows your spider to log in to the application. Bonus: if your spider can make sure that you are still logged in at regular intervals, that's nice!
  • Support regex excludes - This avoids problems such as logging out or deleting data while you are spidering a web site. I've done this a few times: scan a web site as a "test", and whoops, you just dropped all the tables. It is best, I find anyhow, to run your scans as a user that does not have privileges to hurt anything. Of course this can limit your results, but it's a good start until you really get a feel for how the app works.
  • Level of depth - Control how many levels deep you should spider. This helps keep the spider under control, especially for large sites.
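Here's that minimal sketch, gluing the features above together with the third-party requests and BeautifulSoup libraries; the target is the example server from this segment, and the exclude pattern and depth are just placeholders:

import re
import urllib.robotparser
from urllib.parse import urljoin, urlparse

import requests                # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

BASE = "http://192.168.1.218/mut/"            # example target from this segment
EXCLUDE = re.compile(r"logout|delete", re.I)  # regex excludes: don't log yourself out
MAX_DEPTH = 3                                 # level of depth

# Read robots.txt -- it may reveal hidden directories, or it may be a trap.
robots = urllib.robotparser.RobotFileParser(urljoin(BASE, "/robots.txt"))
robots.read()

session = requests.Session()  # cookie support for free
# Basic auth or a login form would go here, e.g.:
# session.auth = ("user", "pass")
# session.post(urljoin(BASE, "login.php"), data={"user_name": "...", "password": "..."})

seen = set()

def spider(url, depth=0):
    if depth > MAX_DEPTH or url in seen or EXCLUDE.search(url):
        return
    seen.add(url)
    if not robots.can_fetch("*", url):
        print(f"robots.txt disallows {url} -- worth a look (carefully)")
    resp = session.get(url)
    print(f"[{resp.status_code}] {url}")
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"])
        if urlparse(link).netloc == urlparse(BASE).netloc:  # stay on this site
            spider(link, depth + 1)

spider(BASE)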

DirBuster

Enumerating the directories is important, as new directories could reveal interesting files, or even whole applications! Sometimes an app is only accessible from a certain directory, and if you don't know it exists, you can't test it. DirBuster works great for this. I'm using Samurai because, well, all these tools are already installed and configured (and for the most part configured correctly and to my liking).

In Samurai, you can start DirBuster, then pull dictionary files from the /usr/bin/samurai/DirBuster/ directory. These work well to bust out the directories and find neat stuff. You then take this information and pass it to your spider, or just start testing what you find (e.g., /forum).
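DirBuster itself is point-and-click, but the idea behind it fits in a few lines. A sketch, assuming requests is installed and with an example wordlist filename:

import requests

BASE = "http://192.168.1.218/"
WORDLIST = "/usr/bin/samurai/DirBuster/directory-list-small.txt"  # example filename

# Request each candidate directory from the wordlist and report anything
# that doesn't come back as a 404.
with open(WORDLIST) as f:
    for line in f:
        word = line.strip()
        if not word or word.startswith("#"):
            continue
        resp = requests.get(BASE + word + "/", allow_redirects=False)
        if resp.status_code != 404:
            print(f"{resp.status_code} {BASE}{word}/")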

robots.txt

As I stated earlier, robots.txt can be a trap! But some people do not use it as a trap, and it's worth looking at. Some examples:
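Here's a made-up one in the same spirit (real sites omitted to protect the guilty):

# hypothetical robots.txt -- fetch a real one from http://www.example.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /backup/
Disallow: /old-site/
Disallow: /private/reports/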

Okay, I better stop, I think I hear the black helicopters coming...

w3af: webSpider

So, the w3af GUI crashed on me, seemingly for the last time, as I've switched to the command-line interface. I set up the web spider as follows:

target                      # enter the target configuration menu
set targetOS windows
set targetFramework php
set target http://192.168.1.218/mut
back
discovery                   # list the discovery plugins
discovery webSpider         # enable webSpider
discovery config webSpider  # configure webSpider
set onlyForward True        # only search within the directory provided
back
discovery                   # confirm webSpider is enabled
output console, htmlFile    # send results to the console and an HTML report
back
start                       # run the scan

Sample output:

- http://192.168.1.218/mut/index.php | Method: GET | Parameters: (php_file_name="add-to-you...", page="source-vie...", submit="Submit")
- http://192.168.1.218/mut/index.php | Method: GET | Parameters: (php_file_name="index.php", page="source-vie...")
- http://192.168.1.218/mut/index.php | Method: GET | Parameters: (php_file_name="index.php", page="source-vie...", submit="Submit")
- http://192.168.1.218/mut/index.php | Method: GET | Parameters: (php_file_name="register.p...", page="source-vie...", submit="Submit")
- http://192.168.1.218/mut/index.php?page=add-to-your-blog.php | Method: POST | Parameters: (input_from_form="")
- http://192.168.1.218/mut/index.php?page=dns-lookup.php | Method: POST | Parameters: (target_host="")
- http://192.168.1.218/mut/index.php?page=login.php | Method: POST | Parameters: (password="", user_name="")
- http://192.168.1.218/mut/index.php?page=register.php | Method: POST | Parameters: (password="", user_name="", my_signature="", password_confirm="")
- http://192.168.1.218/mut/index.php?page=text-file-viewer.php | Method: POST | Parameters: (text_file_name="http://www...")
- http://192.168.1.218/mut/index.php?page=user-info.php | Method: POST | Parameters: (view_user_name="", password="")
- http://192.168.1.218/mut/index.php?page=view-someones-blog.php | Method: POST | Parameters: (show_only_user="adrian")
- http://192.168.1.218/mut/index.php?page=view-someones-blog.php | Method: POST | Parameters: (show_only_user="ed")
- http://192.168.1.218/mut/index.php?page=view-someones-blog.php | Method: POST | Parameters: (show_only_user="|/bin/id")

Burp Suite

Burp provides what I would call an "active" spider. It's more manual than the others, but by far provides the most functionality. It has awesome support for authentication, regex excludes, cookies, robots.txt, etc. Use it! It's on Samurai; there's a bit of a learning curve, but with some time it's awesome.

Stories For Discussion

  1. UN Web Site Still Vulnerable After Two Years - [PaulDotCom] - This one is interesting, and a battle that many of us face. Here is a web site, for the UN mind you, that has been vulnerable to SQLi for two years! Now, it may be that there isn't much sensitive info, if any, on this site, and they just clean up the defacements. Kind of an interesting approach: is it cheaper for them to just live with the vulnerability and clean up after the defacements than to fix the problem? I would think that some hacker has penetrated more deeply, so do they just revert to a snapshot on their VM when they detect a hack? Or is this a honeypot of sorts, a low-hanging-fruit vulnerability that's just an illusion to keep the script kiddies busy, which really goes nowhere but serves to run interference? Okay, either I've had way too much to drink, or they should just fix the problem, or both.
  2. If you're playing blue team in a CTF, pay attention - [PaulDotCom] - Two SANS instructors, Hal Pomeranz and Jason Fossen, put together a guide for blue teams in CTF events. Its goal was to prevent the common complaint from the blue team in the first 15 minutes of play: "I just got my ass handed to me, now what?". To be honest, I was expecting advice that I could easily poke fun at and find ways around. I did not. Their advice is sound, and if you follow it you will do pretty well, that is, until I can get a kernel-level rootkit on your system; then it's game over. Rootkits are not something I would use on a pen test, but they should be taken into consideration when evaluating risk and categorizing threats against your environment. With a kernel-level rootkit, you won't see my connections. Someone needs to build this into a Metasploit payload... Carlos!
  3. Funny Article About Defcon - [PaulDotCom] - I have to say, my Defcon experience was very different. I found it to be a very cool conference, where "hackers" get together to socialize and interact. I am just glad to be a part of the community, as many of our ideals are in fact the same. However, don't let the power of being a hacker go to your head. Breaking into web sites without permission, stealing money from ATMs, stealing people's passwords for no reason: it's just asinine. We should be responsible, and hack to make things better and everyone more secure, not for personal gain or to prove a point (however, sometimes it's okay to hack just to prove a point; it depends what your point is :)
  4. Tor as a backdoor - [PaulDotCom] - We have more coming on this topic, but let's just say this is scary. An encrypted, anonymous, and cross-platform backdoor channel out of networks is a scary thing. As a defender, you should pay attention to this research and start thinking about what you have in place, or could put in place, to detect it.
  5. Social Zombies: Talk to them before blowing their heads off - [PaulDotCom] - Education is key, and if there is one thing that I arm my immediate family with, it's knowledge and education about attacks. So, you should read about the latest attacks against social networks; nothing is "private" if it's on the Internet. Also, I found it scary that you can create an encrypted C&C botnet channel over Twitter. This poses yet more new challenges for defenders, and, well, let's just say that after this year's BH & DC, I'm glad I don't have the job of a defender, because it's the toughest one in our industry, hands down.
  6. Why the long fix? - [Larry] - For some reason I'm shocked that it has been more than two years since this got hacked, and it still isn't fixed. I mean, simple script-kiddie-ish SQL injection that led to a defacement, and it still isn't fixed. And it is the freaking UN.
  7. We told you so! - [Larry] - So, remember that nice segment that Mick and I did on ID theft through P2P networks? Well, this guy is going to jail for using the information that he found for the forces of evil. On another note, this guy stole IDs and used the resulting funds for interesting purposes.
  8. UK ID cards cloned - [Larry] - Oh yes, RFID for the win. Adam Laurie (Major Malfunction) was able to read, clone and even modify the new UK ID cards in under 12 minutes. The changes included identifiable information, the picture, the address, and even notes; one was added to say "I am a terrorist, shoot on sight".
  9. 2wire Auth Bypass - [Larry] - While it only affects some older versions of the firmware, who updates these things? Certainly not the customer/end user; if it ain't broke, don't fix it. It seems to be a fairly easy attack: without authenticating, post a value to the password change page that makes the password longer than 512 characters, and the password is now changed without authentication (see the sketch below). Okay, so you ask, how popular are these 2wire routers? Just as an example, go look at the WPA-PSK rainbow tables for the list of included SSIDs, generated from the top 1000 SSIDs on WiGLE. Roughly 40% of the top 1000 are 2wire, and just over 5% of the entire database is 2wire (Linksys has about 20%, so still a decent number).
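A rough sketch of what that request might look like; 192.168.1.254 is the usual 2wire gateway address, but the page path and field name below are hypothetical stand-ins for the ones in the advisory:

import requests

# Hypothetical reproduction of the 2wire bypass described above: an
# unauthenticated POST with an over-long value to the password change page.
# The path and field name are made up -- check the advisory for the real ones.
router = "http://192.168.1.254"
resp = requests.post(
    router + "/password-change",       # hypothetical page path
    data={"new_password": "A" * 513},  # anything longer than 512 characters
)
print(resp.status_code)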