Capstone Project: Week 10, Communication

Hello and welcome to week 10 of my capstone project: CI/CD pipelines! This week was mostly about learning again: I have been going through some books on unit testing and DevOps methodologies, as well as exploring more of the Azure cloud by participating in the Microsoft Ignite Cloud Skills Challenge 2021. Still, I have made some progress on the project, and I am excited to share it with you all!


Outcomes

  • Slack Integration - Created a new Slack environment with integrations + webhooks from Azure DevOps, Datadog, GitHub, and Docker Hub.
    • The goal is to improve the productivity and efficiency of team members as well as collaboration, for example through quick references to work items.
    • Another goal is to allow for a quick response to alerts such as Datadog incidents or failed builds from Azure Pipelines.
  • Flake8 testing - Added flake8 to my pipeline for additional SAST testing.
  • Unit Testing - Learned how to write simple Python unit tests with pytest and the process for using it in Azure Pipelines.
    • I still need to integrate this and figure out how I am going to test for the vulnerability in this application - the autoescape issue in the Jinja2 template.

Communication

I have decided to go with Slack as my method of communication between team members. One of the important aspects of DevOps is establishing effective communication between different teams. Slack offers a lot of support for the creation of "apps", which add functionality to Slack and increase collaboration. For my project, I have implemented Datadog monitoring, GitHub, and Azure Pipelines + Boards integrations to help increase efficiency.

Datadog Monitoring Integration

One of the most important integrations for Slack and communication is integration with your monitoring service, which in this case is Datadog. Some of the main benefits of the Datadog integration are the ability to receive updates from your monitors to let the team know when something goes wrong, the ability to quickly pull down graphs and metrics to discuss with the team, and the ability to create an incident directly from the Slack channel.

Datadog Slack Integration

As one can see from the picture above, I was able to easily pull down some metrics, and I also received an alert that I had lost connection with my application. I was able to respond swiftly and found that one of my monitoring locations could not complete an HTTP GET request against the application. It turns out that one of the facilities Datadog uses was not working correctly. I can make any monitor notify Slack by adding a mention of the Slack channel to the message in the trigger portion of the monitor.

I have also since found out that Azure App Service does not accept pings, so I changed my routine five-minute ping to a simple TCP check. Now, whenever one of the facilities loses connectivity, an alert of the specified severity level is triggered and, if desired, notifies the team.

Another function is the ability to create an incident directly in chat. When something serious is discovered, a team member cannot afford to waste time logging on to the Datadog GUI to create a new incident, so the incident can be created directly from the Slack channel instead.

Datadog Incidents

GitHub Integration

I have also integrated GitHub into my Slack channel, allowing team members to quickly see any updates, such as commits, pull requests, and merges, that occur on the selected repository.

Slack GitHub Integration

Azure Pipelines + Boards Integration

While we already receive events from my Azure DevOps project via webhooks in Datadog, it would be even better to include them in the primary resource for communication. I am using two integrations: Azure Boards and Azure Pipelines.

The Slack Azure Boards integration allows me to quickly link work items to discuss with team members. I can also add a new work item of any type to quickly update progress. This helps improve efficiency and collaboration between team members.

Slack Azure Boards

I have also added the Slack Azure Pipelines integration, which allows the team to see updates on the progress of any pipeline you desire; you simply have to link it, and Azure and Slack take care of the rest. You can also approve or deny releases directly from Slack.

Slack Azure Pipelines

Flake8

I recently incorporated flake8 into my pipeline. It is an additional SAST tool that consists of Pyflakes, pycodestyle, and Ned Batchelder's McCabe script. Combined, these tools check for Python errors, issues with style conventions (linting), and code complexity.

I incorporated this into my pipeline with a simple script step, and I also tee the output to a text file so it can be saved as an artifact for future use. I set continueOnError: true because flake8 does detect some linting issues in my code and returns an error. These issues are non-critical, but I will make it a point to demonstrate fixing the issues it reports.

- script: |
    python -m pip install flake8
    flake8 . | tee $(Build.ArtifactStagingDirectory)/flake8output.txt
  displayName: 'Run Flake8 Test'
  continueOnError: true
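To actually publish that text file as a pipeline artifact, a step along the following lines should work; the artifact name here is just a placeholder of my choosing.

- task: PublishBuildArtifacts@1
  displayName: 'Publish Flake8 Output'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'flake8-results'   # placeholder artifact name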

Unit Testing

While I have not yet added this to my pipeline, unit testing is inevitably going to be a part of it. While SAST and DAST are great for a lot of different things, they are somewhat generic, and teams often require developers to write their own unit tests for the functionality they build. DevOps managers need to make sure that they can extract the tests developers routinely run and implement them in the pipeline. Not all unit tests need to be added, and adding them all could even waste time, but the skill is still required. As such, I have decided to move forward with learning how to write my own tests, something I haven't done before.

As my application is a Flask application, it is written in Python, which has led me to take courses on pytest, a widely used framework for writing Python-based tests. The picture below demonstrates a sample output from a successful pytest run.

Pytest Unit Test
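For reference, a minimal pytest test looks something like the sketch below; the add() function is purely hypothetical and exists only to illustrate the pattern.

# test_sample.py - a minimal, illustrative pytest example (not from my app)

def add(a, b):
    return a + b


def test_add():
    # pytest collects functions prefixed with "test_" and reports a pass
    # when the assertion holds
    assert add(2, 3) == 5


def test_add_negative():
    assert add(-1, 1) == 0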

However, I realized soon after starting that the vulnerability I am trying to detect is actually within the Jinja2 template. I will need to figure out how to run form validation against the Jinja2 template, where I will expect to get escaped characters but won't, which will cause the test to fail. I also don't have much Python to test, but I will still incorporate some Python functionality so I can run pytest on it and view the results.
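To make this concrete, here is a rough sketch of what such a test might look like using Flask's built-in test client. The route ("/"), the form field ("name"), and the app import are all assumptions for illustration; the real endpoints in my application may differ.

# test_escaping.py - a sketch of the escaping check described above
import pytest
from app import app  # assumed name for the Flask application module


@pytest.fixture
def client():
    # Flask's test client lets us exercise routes without running a server
    app.config["TESTING"] = True
    with app.test_client() as client:
        yield client


def test_user_input_is_escaped(client):
    # submit a payload that a safely configured Jinja2 template should escape
    payload = "<script>alert('xss')</script>"
    response = client.post("/", data={"name": payload})

    # if autoescaping is working, the escaped form appears in the response;
    # if the raw payload is reflected back, the template is vulnerable
    assert b"&lt;script&gt;" in response.data
    assert b"<script>alert" not in response.data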

I had initially been surprised to find that Bandit supports autoescape detection for Jinja2, but since the vulnerability is within the template itself, that won't work here. However, I could also try switching the vulnerability's location as a backup plan.

Still, I have discovered the process for unit testing in Azure Pipelines, at least for pytest. First, I will need to install pytest and pytest-azurepipelines within my pipeline, then run the tests within a script step and export the results to a JUnit XML file. From there, I can run the PublishTestResults@2 task to get the nifty test results page on my pipeline showing how many tests succeeded and failed!
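Based on that process, the steps would look roughly like the sketch below; the result file path and run title are assumptions and will depend on how I lay out the tests.

- script: |
    python -m pip install pytest pytest-azurepipelines
    python -m pytest --junitxml=junit/test-results.xml
  displayName: 'Run Pytest Unit Tests'

- task: PublishTestResults@2
  displayName: 'Publish Pytest Results'
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-results.xml'
    testRunTitle: 'Pytest Unit Tests'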

Conclusion

This week was mostly spent on learning how to run unit tests and integrating Slack into my DevOps infrastructure. Next week I should definitely have unit tests integrated into my pipeline and hopefully some more SAST + DAST integration!

Header image Designed by Freepik