Introduction

First three things you do when you approach a web challenge:

  • Open up the developer tools, which can be done in a number of ways
    • Open up the menu (usually the three vertical dots) and select More Tools > Developer Tools
    • Right-click on the page and select the Inspect option
    • Just use the keyboard shortcut (F12, or Ctrl+Shift+I) to open the inspector if right-clicking is disabled on the page
    • We’ll probably go through the uses of the Developer Tools in a later post or workshop, but they’re quite useful for enumeration (finding out all possible information)
  • Check /robots.txt
    • An explanation: Google works by having heaps of web crawlers (you might learn about these in websystems) that go out and index the internet, which lets Google search queries return a list of results quickly. It’s essentially scraping the internet, and having your assets (images, gifs, etc.) downloaded costs the website host bandwidth, and that bandwidth costs money. So the robots.txt file that most websites have contains a list of directories that web crawlers (if they’re good, non-malicious web crawlers) should avoid if they encounter the website.
    • You can see some of these robots.txt files in websites that you may use everyday, e.g. https://www.reddit.com/robots.txt or https://www.facebook.com/robots.txt
    • You can access the robots.txt file by appending the URL of a website with /robots.txt, i.e. URL + /robots.txt
  • Viewing the source code of the website
    • Throwing view-source: in the address bar before the URL, i.e. view-source:https://www.facebook.com, will let you view the source code of the page. This is useful for spotting HTML comments or inline scripts that might not show up in the inspector straight away.
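
To show what you’re actually looking for in robots.txt, here’s a minimal Python sketch that pulls the Disallow entries out of a robots.txt body. The sample content is made up; in a real challenge you’d fetch URL + /robots.txt and feed the body in.

```python
# Extract the Disallow paths from the body of a robots.txt file.
# These are exactly the directories the site owner doesn't want
# crawlers poking at -- which makes them interesting to us.
def disallowed_paths(robots_txt):
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

# Invented sample -- real files look much the same
sample = """\
User-agent: *
Disallow: /admin/
Disallow: /secret-flag/   # hmm...
Allow: /public/
"""
print(disallowed_paths(sample))  # ['/admin/', '/secret-flag/']
```

Python’s standard library actually has urllib.robotparser for parsing these properly; the point of the sketch is just to show the format.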

So uh… what now?

Let’s talk a little bit about the approach to web hacking and hacking in general. The single most important thing when approaching any challenge, pentest, or whatever is enumeration.

Enumeration is basically trying to find as much information about a target as you can. With more information, you have more potential avenues to exploit and a better chance of getting the flag.

Enumeration can include things like:

  • Finding out possible directories in the web application

    • This lets you check out directories that aren’t apparent at first, giving you more information that could help you compromise the site.
    • Look up directory enumeration using tools like:
      • Dirb
      • Dirbuster
      • Gobuster
  • Checking what stack the web application uses:

    • Knowing what technology was used to build the web application can help you exploit it via known vulnerabilities in its underlying components.
      • Is it the LAMP stack?
      • Or maybe it’s the MEAN stack?
      • Is the server running a flask web app?
      • Could even be .NET Core?
  • Familiarise yourself with the flow of the application

    • Play around with the web app
    • Notice how it behaves, does anything seem out of place or odd?
    • Experiment with site functionality, is there a sign up or login process? Go through the process and understand what the web application is doing.

There are definitely more enumeration techniques/methods, but these general processes will allow you to get used to examining web applications and hopefully compromising them.
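To make the directory-enumeration idea concrete, here’s a rough Python sketch of what tools like dirb and gobuster do under the hood: take a wordlist, build candidate URLs, and report any that don’t come back 404. The target URL and wordlist here are made-up placeholders; real tools use big wordlists (e.g. from SecLists) and run many requests in parallel.

```python
import urllib.error
import urllib.request

def candidate_urls(base, words):
    """Join each wordlist entry onto the base URL."""
    base = base.rstrip("/")
    return [f"{base}/{word}" for word in words]

def probe(url, timeout=3):
    """Return the HTTP status code for url, or None if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code   # 401/403 etc. still tell us the path exists
    except urllib.error.URLError:
        return None

# Stand-in wordlist -- a real one is thousands of entries long
words = ["admin", "backup", "robots.txt", "uploads", ".git/HEAD"]

# Usage against a real target (not run here):
#   for url in candidate_urls("http://10.10.10.10", words):
#       status = probe(url)
#       if status is not None and status != 404:
#           print(status, url)
```

Note that a 403 Forbidden is still a hit: the directory exists, you just can’t read it directly, which is worth knowing.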


What the shit are the developer tools?

The developer tools are mainly meant for developers to test and measure the speed/efficiency of their applications and to make minute tweaks when they need to.

For us hackyhackhackers, the more useful tabs in the inspector are as follows (at least in Chrome):

  • Elements
    • This tab lets us check and make edits to the HTML/CSS on the page, which is really useful in cases where some of the web application logic is done client side, where we have control
  • Console
    • The console lets us write and execute JavaScript, as well as call JavaScript functions that may be present in the web application
  • Sources
    • Sources basically shows what files are included in the web application (that you can see), usually used for checking out JavaScript/CSS files for any juicy secrets/vulnerable functions
  • Network
    • Usually displays load times for resources; can be used to check the HTTP headers of resources that are loaded, good for examining whether any HTTP headers are out of place/strange.

I highly encourage you to have a poke around the developer tools to see how you can better use them for future challenges.

Um… HTTP Headers? Nani the fuck?

HTTP headers are essentially additional information passed along in the header section of HTTP requests and responses. They can specify things like cookies, user agents (what browser type is the request coming from), referer (which page you came from) and MANY more.

I would suggest researching HTTP headers on your own when you have the time.
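To get a feel for what headers actually look like on the wire, here’s a toy Python parser for the header block of a raw HTTP response. The sample response is invented, but note how much enumeration info headers leak: Server and X-Powered-By alone can tell you the stack.

```python
def parse_headers(raw_response):
    """Split the header block of a raw HTTP response into a dict."""
    head = raw_response.split("\r\n\r\n", 1)[0]  # headers end at the blank line
    status_line, *header_lines = head.split("\r\n")
    headers = {}
    for line in header_lines:
        name, _, value = line.partition(":")
        headers[name.strip()] = value.strip()
    return status_line, headers

# Invented sample response
raw = ("HTTP/1.1 200 OK\r\n"
       "Server: Apache/2.4.41\r\n"            # leaks the web server
       "Set-Cookie: session=eyJhbGciOi...\r\n"  # that 'ey' again...
       "X-Powered-By: PHP/7.4.3\r\n"
       "\r\n"
       "<html>...</html>")

status, headers = parse_headers(raw)
print(status)             # HTTP/1.1 200 OK
print(headers["Server"])  # Apache/2.4.41
```

In practice the Network tab or BurpSuite shows you all of this already; the sketch is just to demystify the name: value format.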

Things to look for in the HTTP headers:

  • Cookie

    • If the cookie starts with ey, it’s actually a JSON Web Token (JWT); you can decode it at jwt.io
    • If the cookie is readable, check whether admin privileges are set using the cookie
    • If you can see that admin privileges are set using cookies, you can use a Chrome extension called ModHeader, or BurpSuite, to set the cookie values.
  • User-Agent

    • Some pages may respond differently to different User-Agents
    • Using the ModHeader extension, you can alter your User-Agent and access pages for flags
    • For a list of User-Agents, Google is your friend
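
About that ey check: eyJ is just what base64 makes of {", the start of a JSON object, so every JWT’s header and payload begin with it. A JWT’s payload is only base64url-encoded, not encrypted, so you can read it without knowing the signing key. Here’s a sketch; the b64url helper and the token are built inline purely for demonstration (normally you’d grab the token from a cookie).

```python
import base64
import json

def jwt_payload(token):
    """Decode a JWT's payload (middle section) WITHOUT verifying the signature."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Helper to build a throwaway demo token (header.payload.signature)
def b64url(obj):
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).decode().rstrip("=")

token = (b64url({"alg": "HS256", "typ": "JWT"}) + "."
         + b64url({"user": "bob", "admin": False}) + ".fakesig")

print(token[:2])           # ey
print(jwt_payload(token))  # {'user': 'bob', 'admin': False}
```

If the payload has something like "admin": false in it, flipping that value (and then dealing with the signature, e.g. a weak secret or a server that accepts alg: none) is often the whole challenge.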

ToDo: Some actual exploitation/web techniques

I might not have gotten to them yet, but feel free to do your own research on some web attacks like:

  • SQL Injection

    • Quick little freebie:
    • The injection ' OR 1=1-- seems to work in most challenges, except for when it doesn’t
  • XML External Entities Injection (XXE)

  • Cross-site Scripting (XSS)

  • Template Injection

  • Directory Traversal

  • Local File Inclusion (LFI)

  • Remote Code Execution (RCE)

  • Cross-site Request Forgery (CSRF)
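
To see why that ' OR 1=1-- freebie works, here’s a sketch of the vulnerable pattern it targets: user input concatenated straight into a SQL query. The table and login function are made up, using an in-memory SQLite database so it actually runs.

```python
import sqlite3

# Toy database standing in for a real application's user table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, pass TEXT)")
con.execute("INSERT INTO users VALUES ('alice', 'hunter2'), ('bob', 'pw')")

def login(username, password):
    # The vulnerability: user input concatenated straight into the query.
    # (Safe code would use parameterised queries instead.)
    query = f"SELECT name FROM users WHERE name = '{username}' AND pass = '{password}'"
    return con.execute(query).fetchall()

print(login("alice", "wrong"))      # [] -- a normal failed login
print(login("' OR 1=1--", "junk"))  # [('alice',), ('bob',)] -- everyone!
```

The injected quote closes the string early, OR 1=1 makes the WHERE clause always true, and -- comments out the rest of the query (including the password check), so the query matches every row.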