Capstone Project: Week 13, Finale
Hello and welcome to my penultimate capstone update post. This was the last week I had to add or modify things within my project, so I made the best of it.
This week I did only one major thing: actually fixing my application based on all of the vulnerabilities reported by my tests.
Hadolint
Hadolint reported several errors and warnings for the Dockerfile in my SKF-Labs application. I went through the Dockerfile and fixed all of the errors and warning messages:
- For the `MAINTAINER is deprecated` error, I simply replaced the `MAINTAINER` instruction with `LABEL` and updated the syntax accordingly (a sketch of the resulting Dockerfile changes is shown after this list).
- The next warning, about pinning versions, was a little harder to fix, as I did not know which package versions to use for that version of Alpine.
    - I went to https://pkgs.alpinelinux.org/packages to get the right packages, but they did not have a package version listed for `py3-pip`.
    - I decided to instead secure my container even more by using a more up-to-date version of the Alpine image, `alpine:3.15`. This version has all of the necessary package versions.
- For the `Set the SHELL option -o pipefail` warning, I simply prepended `SHELL ["/bin/bash", "-o", "pipefail", "-c"]` to my Dockerfile before any shell commands.
- For warning `SC2038`, I decided to simply ignore this rule in the `hadolint.yaml` file, because we are searching for all files with certain file extensions, not actually searching for non-alphanumeric filenames.
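To make those Dockerfile changes a little more concrete, here is a minimal sketch of what they look like together. This is illustrative only: the label value and the pinned package versions are placeholders rather than the exact ones from my project, and I show `/bin/ash` in the sketch because the base Alpine image does not ship bash.

```dockerfile
# Minimal sketch of the Hadolint fixes -- label and versions are placeholders
FROM alpine:3.15

# LABEL replaces the deprecated MAINTAINER instruction
LABEL maintainer="example@example.com"

# Addresses the "Set the SHELL option -o pipefail" warning for RUN lines that pipe.
# Alpine ships ash rather than bash, so the sketch uses ash; my Dockerfile used
# the /bin/bash form from Hadolint's suggestion.
SHELL ["/bin/ash", "-o", "pipefail", "-c"]

# Pin apk package versions so the pin-versions warning no longer fires
# (these version strings are examples, not the exact ones I used)
RUN apk add --no-cache python3=3.9.16-r0 py3-pip=20.3.4-r1
```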
WhiteSource Bolt
WhiteSource scanned my repository and found that all of the packages I had been using in my code were extremely out of date. So much so that after upgrading the packages in my `requirements.txt` file to the versions it specified, it only created new vulnerability alerts for those versions.
After some time, I finally upgraded my packages as far as I could without breaking anything, effectively eliminating all of the vulnerabilities reported by WhiteSource. I used the following versions in my `requirements.txt` file:
```
Flask==2.0.0
flask-cors==3.0.10
requests==2.26.0
Werkzeug==2.0.2
```
This did actually sort of break Nikto, as it apparently found something not in its database and wanted user confirmation before submitting it to that database, effectively holding up the pipeline. I did not notice this for almost half an hour, which cost me very precious build minutes!
I managed to fix this by simply specifying `-ask no` in the Nikto script.
Flake8
Fixing these findings was rather self-explanatory, as Flake8 is a simple linter. Nonetheless, it's safe to say that my code is much cleaner now because of it, and Flake8 has nothing to complain about anymore!
Nikto
The Nikto scan was the most fun one to fix. For this one, I had to modify some things in my `app.py` file.
- The `Python/3.6.9 version` error was fixed by simply updating my application requirements when I fixed the vulnerabilities found by WhiteSource Bolt.
- To fix the `/console` issue, I simply had to set `app.config['DEBUG'] = False` at the start of my Flask application.
- The `no CGI Directories found` message was fixed by just adding `-C all` to the command.
- For the two header errors, I had to research how to send headers along with rendered templates back to the client. After some digging in the Flask documentation, I finally figured out how. I send these headers after each request, as shown below:
```python
@app.after_request
def apply_caching(response):
    response.headers["X-Frame-Options"] = "SAMEORIGIN"
    response.headers['X-Content-Type-Options'] = 'nosniff'
    return response
```
Finally, the `Allowed HTTP Methods` message was still causing Nikto to throw an error, which was rather annoying to me. Nikto uses plugins that scan for specific vulnerabilities individually, and I found out that this HTTP method message was actually produced by one of these plugins.
I quickly found that there was no way to specify all plugins besides ... in the command. After some frustration, I simply decided to fork the Nikto repository and delete the HTTP method plugin. I win!
Now Nikto reports absolutely nothing, how nice?
Pytest
I was able to fix the pytest unit test by simply deleting the vulnerable code in the rendered Jinja2 template, as demonstrated in my week 11 blog post.
Bandit
For Bandit, I got the following vulnerability report:
```json
"results": [
    {
        "code": "39 if __name__ == '__main__':\n40 app.run(host='0.0.0.0')\n",
        "col_offset": 17,
        "filename": "/home/vsts/work/1/s/app.py",
        "issue_confidence": "MEDIUM",
        "issue_severity": "MEDIUM",
        "issue_text": "Possible binding to all interfaces.",
        "line_number": 40,
        "line_range": [
            40
        ],
        "more_info": "https://bandit.readthedocs.io/en/latest/plugins/b104_hardcoded_bind_all_interfaces.html",
        "test_id": "B104",
        "test_name": "hardcoded_bind_all_interfaces"
    }
]
```
I don't think that this is necessarily bad; it may even be necessary for my Azure App Service deployment. I believe this based on this article and on my research on Azure App Service, where I believe even Microsoft binds to 0.0.0.0 within their application, but I may be wrong. Regardless, I am willing to accept this risk.
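If I ever do want to make B104 go away without hard-coding the bind address, one option would be to read the host from an environment variable and default to localhost. This is only a sketch under my own assumptions (the `HOST` variable name is made up, and it would have to be set in the App Service configuration), not something I actually changed in my app:

```python
import os

from flask import Flask

app = Flask(__name__)

if __name__ == '__main__':
    # Default to localhost for local runs; set HOST=0.0.0.0 in the deployment
    # environment (e.g. App Service application settings) when it is needed.
    app.run(host=os.environ.get('HOST', '127.0.0.1'))
```

That way local runs stay on 127.0.0.1 while the deployed app can still listen on all interfaces.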
OWASP ZAP
ZAP without the aggressive scan doesn't really report much beyond what Nikto found, other than a few missing headers and tokens, which I don't believe are really necessary to fix at the moment.
However, I did discover while writing this post that I still had a vulnerability in my directory traversal function, as I had only deleted the place for the output to go (`{read}` in the template) and not the vulnerability itself.
After a few hours of trying to fix this vulnerability, I was only able to come up with a "pseudo solution," where the vulnerable pattern still exists (the user can still manually choose which file to access), but it has been mitigated within Flask.
Now, no matter what the user inputs, the application will always return `default.txt` unless one of the three valid options is specified. My code is shown here:
```python
@app.route("/pathtraversal", methods=['GET', 'POST'])
def pathtraversal():
    if request.method == 'POST':
        filename = request.form['filename']
        # Whitelist of files the user is allowed to read
        localfiles = {
            1: "text/intro.txt",
            2: "text/chapter1.txt",
            3: "text/chapter2.txt",
            4: "text/default.txt"
        }
        # Fall back to default.txt (key 4) unless the submitted filename
        # exactly matches one of the whitelisted entries
        locator = 4
        for i in localfiles:
            if filename == localfiles[i]:
                locator = i
                break
        filename = localfiles.get(locator)
        with open(filename, 'r') as f:
            read = f.read()
        return render_template("index.html", read=read)
    else:
        return redirect(url_for('start'))
```
With that done, every input besides the ones listed in the dropdown box will now always return the following:
I am pretty embarrassed at my mistake here, and I don't know if I'll have the time to go back and figure out a better solution. At least this makes the pain of my next mistake a little less disheartening.
GitHub Issue
To make sure everything worked out, I decided I wanted to deploy my `fixedbranch`. Instead of modifying the pipeline temporarily, I decided just to merge it into `main`, as I found out you could "revert" commits and pull requests. I did this, everything worked successfully, and then I reverted the merge request.
Then I decided to demonstrate my project to my friend Aristher and did the same process, only to find out that I could not merge `fixedbranch` into `main`, as Git did not detect any changes even though the branches were definitely different. I had effectively broken my `main` branch.
Thankfully, I had backed up my branch into `mainclone` right before committing, and its commit history remained intact at that point. To fix this, I simply renamed `main` to `OLDmain`, made `mainclone` the main branch, and then renamed it to `main`, effectively undoing everything I had done.
For my class presentation I will either break my branch again, or merge `fixedbranch` into another clone of `main` with a modified deployment pipeline and keep `main` untouched. We shall see. Still, I did panic for a bit! This just exemplifies the need to quicksave before you do anything risky!
Conclusion
Next week I will be demonstrating my project, so I will need to prepare my presentation, video, and paper, all of which should be included in the next and last blog post.