Tackling Pico: picoCTF 2019 — where are the robots (web)

Bri
Nov 20, 2022

--

This was a pretty easy web exploitation challenge about poking around a site’s files and directories, in particular the robots.txt file.

The challenge description is as follows:

“Can you find the robots? https://jupiter.challenges.picoctf.org/problem/56830/ (link) or http://jupiter.challenges.picoctf.org:56830”

On clicking the link, we are taken to a simple web page.

Robots on a web page/site would most likely refer to the robots.txt file, which some websites use to give instructions to site crawlers about which directories and files can be crawled or indexed and which to avoid (more on robots.txt files can be found here & here).
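To get a feel for how a well-behaved crawler actually applies these rules, here is a minimal sketch using Python’s standard urllib.robotparser module. The rules and the /secret.html path below are made up for illustration and are not taken from the challenge.

```python
from urllib import robotparser

# Illustrative robots.txt rules (not the challenge's actual file):
# disallow a single hypothetical page for all user agents.
rules = [
    "User-agent: *",
    "Disallow: /secret.html",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Paths not listed under Disallow are fair game; listed ones are not.
print(rp.can_fetch("*", "https://example.com/index.html"))   # True
print(rp.can_fetch("*", "https://example.com/secret.html"))  # False
```

Note that robots.txt is purely advisory: a crawler (or a curious human) is free to ignore it, which is exactly what makes it interesting in a CTF.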

In this case, let’s see if this website has a robots.txt file, i.e. https://jupiter.challenges.picoctf.org/problem/56830/robots.txt
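You can open that URL directly in the browser, or pull it with a couple of lines of Python; this is a quick sketch assuming the third-party requests library is installed.

```python
import requests

# Fetch the challenge site's robots.txt and print whatever rules come back.
BASE = "https://jupiter.challenges.picoctf.org/problem/56830"

resp = requests.get(f"{BASE}/robots.txt")
print(resp.status_code)  # 200 if the file exists
print(resp.text)         # the raw robots.txt rules
```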

There’s a file, /1bb4c.html, that crawlers are instructed not to crawl or index.

Let’s look at the restricted file using the link: https://jupiter.challenges.picoctf.org/problem/56830/1bb4c.html

Sure enough, we find our flag! Yay!
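For completeness, the whole find-and-grab can also be scripted. The sketch below assumes the requests library and the usual picoCTF{...} flag format, and leaves the flag itself for you to retrieve.

```python
import re
import requests

# Fetch the disallowed page and search it for a picoCTF{...}-style flag.
BASE = "https://jupiter.challenges.picoctf.org/problem/56830"

html = requests.get(f"{BASE}/1bb4c.html").text
match = re.search(r"picoCTF\{.*?\}", html)
print(match.group(0) if match else "No flag found")
```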
