How to Use GitHub Webhooks, Docker, and Python for Automatic End-to-End Deployments

by Chris Mahoney | Apr 2022

An in-depth look at a method for automatically keeping your App up to date

Repos to Apps through Webhooks and APIs
(Image by Author, source here and here)

1. Motivation
Section One: Context and Overview
2. Context
3. Other Methods
— 3.1. Manual Process
— — 3.1.1. Advantages
— — 3.1.2. Disadvantages
— 3.2. Docker Process
— — 3.2.1. Advantages
— — 3.2.2. Disadvantages
Section Two: Details About the App
4. The Automated Webhook Method
5. Explanation of the Key Files
— 5.1. File: /requirements.txt
— 5.2. File: /docker/uvicorn.Dockerfile
— 5.3. File: /docker/docker-compose.yml
— — 5.3.1. The environment section
— — 5.3.2. The volumes section
— — 5.3.3. The ports section
— 5.4. File: /src/api/main.py
— — 5.4.1. Import Libraries
— — 5.4.2. Compile Variables
— — 5.4.3. Set Landing Page
— — 5.4.4. Instantiate App
— — 5.4.5. Set Custom Functions
— — 5.4.6. Set Custom Classes
— — 5.4.7. Define Endpoint: Landing Page
— — 5.4.8. Define Endpoint: Health Check
— — 5.4.9. Define Endpoint: Main Endpoint
— 5.5. File: /templates/landing_page.html
Section Three: How to Use the App
6. How to Use It
— 6.1. Create Server (using AWS)
— 6.2. Add docker-compose.yml file
— 6.3. Launch Docker on Server
— 6.4. Add Webhook on Git (using GitHub)
— 6.5. Test it
Section Four: More Info
7. What's next
8. Where to Find More Information
9. The full file for: /src/api/main.py
10. Contact

1. Motivation

If you're anything like me, you're a curious creature. So, when I started learning about what Webhooks are, and how to use them, it felt like I was pulling at a loose thread on my t-shirt. The more I pulled, the more the world of Webhooks unravelled, the more I learnt, the more I understood, the more I uncovered about the mysterious world of APIs and Webhooks and Automation, and the more I wanted to learn!

So, the motivation here is to create a streamlined and seamless way of doing deployments from GitHub to production servers. I feel like it's possible to write a simple Python App to do this. So let's find out how!

This article is split into four sections, each outlining a different aspect of this process. Section Two is highly detailed, but I've tried to include as much description and as many screenshots as necessary, in order to make it easier to understand what is happening here. Section Three contains hands-on instructions for how to use the App. Enjoy!

2. Context

I've been coding for quite a few years now, and I have quite a few Repos on GitHub (and GitLab, and a few others). And I keep seeing this button:

GitHub > *Repo* > Settings > Webhooks > Add webhook
(Image by Author)

So, I read a bit about it on GitHub's Webhooks Guide page, along with a few Blogs on how to set it up (like this one and this one). The way I understand it is like this:

When you set up Webhooks on a Repo, and you commit then push to the Repo, GitHub will automatically send a HTTP POST request to the URL you specify.

The payload for that request will contain the metadata for the event from GitHub, but will not contain the content from the Repo itself. Therefore, the system that is receiving the Webhook request will then need to take some sort of action to pull or clone the latest data from the Repo.

This Webhook method is not exclusive to GitHub, but is also available on GitLab and BitBucket, and many other Git platforms.

3. Other Methods

First, it's important to know that there are many, many (many!) different ways to deploy apps in production environments. Let's look at two other common methods:

3.1. Manual Process

The manual intervention process would look something like this:

  1. Do git commit then git push on the Local PC to push to the Upstream Repo.
  2. When ready to deploy, manually ssh in to the target server.
  3. Do git clone (if for the first time) or git pull (for subsequent updates) to fetch the latest code from the Repo to the Server.

Clearly there are a lot of steps, and a lot of complexity, in this manual process. However, I've kept it to just the high-level deployment steps, so that you get the idea of the process.

3.1.1. Advantages:
— Easy to maintain
— Developer has full control
— No requirement/dependency on external tools or apps (eg. Docker)

3.1.2. Disadvantages:
— Requires time, effort, and focus from the Developer during deployment
— One wrong move, and errors can easily be made; and the Developer may not even realise it until much later
— Any dependencies (eg. other apps, other libraries, etc) will need to be manually executed in addition to this Git process.

3.2. Docker Process

The Docker app streamlines this process a lot, and makes the life of the Developer a lot easier. The high-level process is:

  1. Run docker build and docker run on the Local PC to ensure the image/container is working correctly (this can be streamlined by running docker compose up).
  2. Publish the container using docker login then docker push to upload the container to the container repository (this can be AWS ECR or Azure ACR or Docker Hub, or any other container hosting platform).
  3. Then ssh on to the Server, and do docker pull, followed by docker build and docker run (or just simply docker compose up, assuming the compose file is correctly set up).

While it seems like there are more steps in this process, the process is actually a lot simpler. That's because it's not deploying via a Git Repository, but rather a Container Repository. Which is to say that the container can be fully built and working on your Local PC, and when deployed to the Server it will work in exactly the same way.

3.2.1. Advantages:
— Docker will run on any system in exactly the same way; so it is very reliable
— All dependencies are already handled through Docker, therefore less effort/focus/concentration is needed from the Developer

3.2.2. Disadvantages:
— The configuration setup can sometimes be a little confusing
— If the Developer is not familiar with Docker, the setup and debugging can take a little while to finish

4. The Automated Webhook Method

It is very easy to automate this process. The solution here is to create a very, very simple Python App (using FastAPI) which will receive the HTTP POST request from Git, then call a Python script to pull the latest update from the Git upstream Repo. It is very similar to the abovementioned manual process, whereby the developer pulls the latest code from the Repo. However, the difference here is that we will use the power of Python to automatically pull the latest code for us.

The end-to-end process will look like this:

End-to-End Process Flow
(image by Author, source here and here)

To break it down step by step, this is what is happening:

  1. The developer works on the App on their local PC.
  2. The developer pushes updates to the upstream Git repo (in this example, we use GitHub; but any other hosting platform can also work).
  3. The Git Repo will then trigger the Webhook process, and will send a HTTP POST request to a specified URL.
  4. Behind that URL is an IP address, which needs to be hosted on a cloud computing platform (here, we use AWS, but any other cloud computing platform, like Azure, will also work).
  5. Depending on whether the Git repo uses HTTP or HTTPS, it will hit the Server on port 80 (for HTTP) or port 443 (for HTTPS).
  6. The server will then have its ports 80 and 443 exposed to the public, to allow network traffic to go through.
  7. Hosted on the server is a Docker container, and the container will map the external ports 80 and 443 to an internal port 8880, which can then be consumed by the internal App.
  8. Mounted within the Docker container is a FastAPI App which will listen to port 8880 and the Endpoint which is defined by the developer.
  9. Once the HTTP POST message hits the endpoint URL, the FastAPI App will call a Python script.
  10. That Python script will then trigger a git clone (or git pull) process to fetch the latest code from the upstream repo.
  11. Python will then save that data to a directory on Docker (in this instance, we will save it to the /app/repo directory, but this is configurable and you can save it anywhere).
  12. Docker will then use a process called volume mapping to persist the data from that directory on the container to a directory outside the container (which means that as soon as the /app/repo directory on the container is updated, the /repo directory outside the container will immediately also be updated).
  13. Once the /repo directory is updated at the server level, it is then possible to execute any other App as necessary.

To the astute eye, you'll notice here that it's actually possible to cluster multiple Docker containers together, so that one will run the App itself, instead of relying on the Server level to run it. Yes, that's right, and it's also quite common to do it like this. However, for simplicity, I've kept the process here to simply show the Webhook process from Git to the Server. Using this as a basis, it's then possible to add much more complexity on top.

5. Explanation of the Key Files

There are a number of key files used in this process. The next few sections will explain these files in detail.

5.1. File: /requirements.txt

The requirements are super simple. We only need three packages:

  1. fastapi: For actually building the API endpoints
  2. gitpython: For clone'ing/pull'ing from the upstream Repo
  3. python-decouple: For handling default environment variables

There will be some other libraries imported at run-time; however, these are part of the Python built-ins, and do not need to be added to the requirements.txt file.

/requirements.txt
(image by Author, source here and here and here)
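Since the linked gist isn't reproduced here, a minimal sketch of what this file might contain (leaving out version pins, which is an assumption) is:

    fastapi
    gitpython
    python-decouple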

5.2. File: /docker/uvicorn.Dockerfile

Within the world of Docker, the Dockerfile is a 'set of instructions' which needs to be executed on a particular container, in order to set it up for its operation. While there are many, many (many!) different options and configurations available, here we have kept things quite simple.

The specific steps are:

  1. Use the base image tiangolo/uvicorn-gunicorn-fastapi, which includes all of the underlying infrastructure needed for setting up and running a server for successfully running FastAPI.
  2. Copy the requirements.txt file from the local environment on to the container.
  3. Run three different PIP statements to:
    1. Upgrade pip, in case it's outdated,
    2. Install the packages from the requirements.txt file,
    3. Upgrade the uvicorn package, in case it's outdated.
  4. Copy all of the required files from the src and templates directories from the local environment on to the container.
  5. Set the working directory to the /app directory within the container.
  6. Set the command line, which will be executed once the container is finished being built and is ready to run.

/docker/uvicorn.Dockerfile
(image by Author, source here and here and here)
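As the gist itself is shown as an image, here is a minimal sketch of what this Dockerfile might look like, following the six steps above (the base-image tag and exact paths are assumptions):

    # 1. Base image with the infrastructure for running FastAPI
    FROM tiangolo/uvicorn-gunicorn-fastapi:python3.9

    # 2. Copy the requirements file from the local environment
    COPY requirements.txt /app/requirements.txt

    # 3. Run the three PIP statements
    RUN pip install --upgrade pip \
        && pip install -r /app/requirements.txt \
        && pip install --upgrade uvicorn

    # 4. Copy the required files from the src and templates directories
    COPY src /app/src
    COPY templates /app/templates

    # 5. Set the working directory
    WORKDIR /app

    # 6. Command to run once the container is built
    CMD ["uvicorn", "src.api.main:app", "--host", "0.0.0.0", "--port", "8880", "--root-path", ".", "--use-colors"]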

The final line (the CMD one) is quite interesting. Let me point out a few important bits of information:

  • It will execute the uvicorn application, which is effectively a server for handling the FastAPI processes and commands.
  • It will execute the application called app, found within the main.py module.
  • The main.py module is actually within a sub-directory on the container, to be found within the /src/api directory.
  • It is hosted to monitor the IP address 0.0.0.0, which is effectively saying 'monitor the localhost'. This is because the Server IP address is used by the external systems (via the URL), but once the signal is within the server itself, it will be found on the localhost IP. Therefore, Uvicorn only needs to monitor the localhost within the server itself.
  • The system will monitor port 8880, which is why it's important to map the external ports to the internal ports (this is covered in the next section, about the docker-compose.yml file).
  • The root-path of the application is the current folder (denoted with the "." syntax).
  • The output from the application will be printed to the terminal using some pretty colours; which is helpful for reading the log files and messages at a later time.

5.3. File: /docker/docker-compose.yml

After the Dockerfile, the docker-compose.yml file is the logical next step, which takes the process even further. It can be used to create clusters of containers, each of which can be set to work in unison and depend on one another. While that is amazing functionality, it's not needed for our purposes here. In this instance, we will only require one container, which will be built using the Dockerfile as defined above. The reason why we use the docker-compose.yml method is because it's convenient; it allows for easy addition of environment variables and volume mappings.

/docker/docker-compose.yml
(image by Author, source here and here and here)

Note here that this is a SAMPLE file for the docker-compose.yml. The Git URL from which this is pulling data is not actually an application (merely a collection of code snippets). So, this is merely showing the process of how to set up the application.

5.3.1. The environment section:

There are nine values that could/should be included in the environment section, including:

  1. GIT_URL (Mandatory): The URL from which the Repo will be cloned.
  2. API_ENDPOINT (Optional, with default /api/webhook): The Endpoint which will be called by the Webhook.
  3. REPO_DIR (Mandatory): The DIR to which the Repo will be cloned.
  4. VERSION (Optional, with default 0.0.1): The version number for the app.
  5. TITLE (Optional, with default Update from Git): The title of the app.
  6. DESCRIPTION (Optional, with default Automated process for pulling from Git repo upon webhook call.): The description of the app.
  7. CONTACT_NAME (Optional, with default None): The name of the person to contact about the app.
  8. CONTACT_URL (Optional, with default None): The website for the contact person.
  9. CONTACT_EMAIL (Optional, with default None): The email for the contact person.

See here for more details.

5.3.2. The volumes section:

In order to share storage between the parent environment (ie. the Server) and the Docker container, the volumes section is the mechanism designed to achieve persistence between the two environments. Effectively, the value on the left-hand-side of the colon (:) is the directory location on the parent environment, and the value on the right-hand-side is the directory location on the Docker container itself. By doing so, any changes whatsoever to either directory will instantaneously update the other.

Default values:

- ../repo:/app/repo

5.3.3. The ports section:

The port mappings are important to ensure that the external Ports can be correctly mapped to the internal processes within Docker. Since port 80 is for HTTP, and port 443 is for HTTPS processes, it's reasonable to expect any upstream Git process to utilise either of these. And since the FastAPI app is listening on port 8880, it's logical to map these two external ports (80 and 443) to the internal 8880 port. The mapping for 8880:8880 is likely redundant on the Server, but is included here because it's necessary for testing on the Local PC.

For ease of reference, the value on the left-hand-side of the colon (:) is the external port, and the value on the right-hand-side is the internal port.

Default values:

- 8880:8880
- 443:8880
- 80:8880
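Putting those three sections together, a sketch of what the docker-compose.yml might look like (the service name and build context are assumptions; the gist above has the authoritative version):

    version: "3"

    services:
      uvicorn:
        build:
          context: ..
          dockerfile: docker/uvicorn.Dockerfile
        environment:
          - GIT_URL=https://github.com/<username>/<repo>.git  # Mandatory
          - REPO_DIR=/app/repo                                # Mandatory
          - API_ENDPOINT=/api/webhook                         # Optional
          - VERSION=0.0.1                                     # Optional
        volumes:
          - ../repo:/app/repo
        ports:
          - 8880:8880
          - 443:8880
          - 80:8880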

5.4. File: /src/api/main.py

Due to the size of this file, I've broken it into chunks and explained each part individually. The full file is copied at the end of this article (see here).

5.4.1. Import Libraries

First thing to do is to import the required libraries. The way this is structured is to not import the full code base from each library, but instead to only import the required functions/classes from each.

More specifically:

  • exc_info for handling errors and returning them back through the API
  • config for handling the default environment variables
  • FastAPI and Query for running the API endpoints
  • BaseModel as a pedantic (or, 'pydantic' 😂) method for handling objects through the API endpoints
  • PlainTextResponse, JSONResponse, and HTMLResponse as the specific objects which will be returned by the FastAPI endpoints
  • Repo for actually calling the git clone and git pull methods
  • exists for checking whether the /repo folder exists or not
  • And finally, rmtree for physically deleting the data in /repo before executing the git processes.

/src/api/main.py (lines: 19–27)
(image by Author, source here and here and here)
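A sketch of what those imports would look like (assuming exc_info comes from the sys built-in):

    from sys import exc_info             # Error details, returned back through the API
    from decouple import config          # Default environment variables
    from fastapi import FastAPI, Query   # Running the API endpoints
    from pydantic import BaseModel       # Objects passed through the API endpoints
    from fastapi.responses import (      # Specific objects returned by the endpoints
        PlainTextResponse,
        JSONResponse,
        HTMLResponse,
    )
    from git import Repo                 # Calling the git clone and git pull methods
    from os.path import exists           # Checking whether the /repo folder exists
    from shutil import rmtree            # Deleting the data in /repo before cloning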

5.4.2. Compile Variables

As mentioned above, there are nine environment variables used in this process. Two of these are Mandatory, and the rest are Optional. The way these mandatory and optional values are handled is set up using the config() function of the python-decouple package. Effectively, this allows the developer to define a default value for the Environment variable, and to cast whatever value is parsed in to the desired data type.

/src/api/main.py (lines: 30–45)
(image by Author, source here and here and here)
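A sketch of how config() could be used here, with the defaults listed in section 5.3.1 (the cast arguments are assumptions):

    # Mandatory variables: no default, so a missing value raises an error at start-up
    GIT_URL = config("GIT_URL", cast=str)
    REPO_DIR = config("REPO_DIR", cast=str)

    # Optional variables: fall back to a default value if not set
    API_ENDPOINT = config("API_ENDPOINT", default="/api/webhook", cast=str)
    VERSION = config("VERSION", default="0.0.1", cast=str)
    TITLE = config("TITLE", default="Update from Git", cast=str)
    DESCRIPTION = config(
        "DESCRIPTION",
        default="Automated process for pulling from Git repo upon webhook call.",
        cast=str,
    )
    CONTACT_NAME = config("CONTACT_NAME", default=None)
    CONTACT_URL = config("CONTACT_URL", default=None)
    CONTACT_EMAIL = config("CONTACT_EMAIL", default=None)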

5.4.3. Set Landing Page

The landing page is incredibly simple. It's just a basic HTML table, which allows the Developer (and indeed any other user) to see the key environment variables directly on the Landing page for the app. The template which is loaded is defined further below in this article. The process to physically import it in to the Python environment is actually quite simple, as shown below. Moreover, the str.format() method is utilised to parse the specific Environment variables directly in to the HTML string. Computationally speaking, this is very efficient.

/src/api/main.py (lines: 48–61)
(image by Author, source here and here and here)
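A sketch of that import-and-format step (exactly which constants are injected is an assumption):

    # Read the raw HTML template, then inject the Global Constants in to it
    with open("templates/landing_page.html", "r") as file:
        LANDING_PAGE = file.read().format(
            TITLE=TITLE,
            VERSION=VERSION,
            GIT_URL=GIT_URL,
            REPO_DIR=REPO_DIR,
            API_ENDPOINT=API_ENDPOINT,
        )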

5.4.4. Instantiate App

Once all packages are loaded and constants defined, it's time to instantiate the App. The FastAPI package has a suite of very intuitive, very simple, and very helpful processes. Most of the default values are perfectly usable, which eliminates a lot of the complexity (and, indeed, ambiguity) on behalf of the developer. In just a few lines of code, the entire FastAPI application can be defined and ready to use. As shown below.

The one aspect which needs to be elaborated upon is the openapi_tags section. This is an aesthetic choice, so that when the /swagger page is loaded, it is very easy to see which API endpoints are grouped together in to which sections. This is another entirely optional (yet highly advised) configuration to use.

/src/api/main.py (lines: 64–76)
(image by Author, source here and here and here)
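A sketch of that instantiation (the tag names, the docs_url, and the contact structure are all assumptions):

    app = FastAPI(
        title=TITLE,
        version=VERSION,
        description=DESCRIPTION,
        contact={
            "name": CONTACT_NAME,
            "url": CONTACT_URL,
            "email": CONTACT_EMAIL,
        },
        docs_url="/swagger",
        openapi_tags=[
            {"name": "Landing Pages", "description": "Default pages for the App."},
            {"name": "Main Endpoints", "description": "Endpoints called by the Webhook."},
        ],
    )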

5.4.5. Set Custom Functions

While I've reserved a section for crafting some custom functions, there's really only one which I've added in here. It's a helper function which is used for removing all the files from within a directory. This is necessary because sometimes the gitpython package doesn't like to clone a repo in to a directory when there is already other data and files present in the same location. So, this function will readily remove the old files.

/src/api/main.py (lines: 85–89)
(image by Author, source here and here and here)
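A sketch of such a helper, using the exists and rmtree imports from above (the function name is hypothetical):

    def remove_dir(directory: str) -> None:
        """If the target directory already exists, delete it and everything in it."""
        if exists(directory):
            rmtree(directory)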

5.4.6. Set Custom Classes

The FastAPI package relies on the helpful processes contained within the BaseModel class of the pydantic package. By using this class, it's incredibly easy to define request and response objects to be used by the API. More details can be found here. For our purposes here, I've only defined three custom classes, and the structure of them is very simple and straightforward.

/src/api/main.py (lines: 97–110)
(image by Author, source here and here and here)
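A sketch of what those three classes might look like, matching the three response statuses used by the Main Endpoint below (all field names are assumptions):

    class Success(BaseModel):
        """Response object for a 200 status."""
        message: str
        git_url: str
        repo_dir: str

    class ValidationError(BaseModel):
        """Response object for a 422 status."""
        detail: str

    class InternalServerError(BaseModel):
        """Response object for a 500 status."""
        error_type: str
        error_message: str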

5.4.7. Define Endpoint: Landing Page

Now that everything is set up and ready to go, it's time to define the actual endpoints. Here, we will use three specific endpoints; the first of which is the Landing page. Each of these endpoints will utilise FastAPI's decorator method, specifically the @app.get() or @app.post() methods.

For this first Endpoint, we know that this is the landing page because we have defined path="/", which basically means that this will be the root directory for this website. Everything else will be built on top of this root directory. It's a very simple endpoint which returns a HTMLResponse class, containing the LANDING_PAGE which we defined earlier. Also, the schema for this endpoint is turned off, so that when the /swagger page is inspected, the reader is not seeing unnecessary and redundant schema definitions.

The code looks like this 👇

/src/api/main.py (lines: 118–134)
(image by Author, source here and here and here)
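A sketch of what that endpoint might look like (the tag name is an assumption, and the exact schema-suppression settings aren't reproduced here):

    @app.get(path="/", tags=["Landing Pages"], response_class=HTMLResponse)
    def landing_page():
        """Serve the pre-formatted LANDING_PAGE at the root of the site."""
        return HTMLResponse(content=LANDING_PAGE)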

When it's running, the Swagger page looks like this 👇

The Swagger page for the Landing Page Endpoint
(image by Author)

When it's running, the URL looks like this 👇

The URL page for the Landing Page Endpoint
(image by Author)

5.4.8. Define Endpoint: Health Check

The second API Endpoint to declare is a health check. This is important so that the user is able to know whether the entire application is up and running, and returning a healthy response. It's best practice to set up a single endpoint for this purpose.

The structure of this endpoint is just as simple as the Landing Page. Perhaps even simpler, because it returns a PlainTextResponse instead, which is only going to return a short string of text.

The code looks like this:

/src/api/main.py (lines: 137–153)
(image by Author, source here and here and here)
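A sketch of what it might look like (the /api/health path and the tag name are assumptions):

    @app.get(path="/api/health", tags=["Main Endpoints"], response_class=PlainTextResponse)
    def health_check():
        """Confirm that the application is up and running."""
        return PlainTextResponse(content="healthy")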

When it's running, the Swagger page looks like this:

The Swagger page for the Health Check Endpoint
(image by Author)

When it's running, the URL looks like this:

The URL page for the Health Check Endpoint
(image by Author)

5.4.9. Define Endpoint: Main Endpoint

And finally, we reach the Main Endpoint. This is where the main magic happens. While this Endpoint looks long, it's really not. This code chunk can be broken in to three parts:

  1. The decorator part:
    — This is the section that actually defines that it's a HTTP POST method.
    — The path is a parameter, as defined by the user in the API_ENDPOINT part of the environment variables.
    — The description is split over three lines (mainly so I can make it look aesthetically pleasing on the screen 😉), and it also includes the specific GIT_URL and REPO_DIR, as defined by the user, also in the environment variables.
    — It will always return a JSON object.
    — Depending on how the data is parsed to this API Endpoint, it will return one of three statuses: 200 for Success, 422 for Validation Error, or 500 for when there's an Internal Server Error. These are the three custom classes that we defined above.
  2. The declaration part:
    — This section is only two parameters: the git_url for where the data is being pulled from, and the repo_dir for where the data will be saved to.
    — Under the hood, each of them will be a Query object.
    — They only look long because the description part covers multiple lines.
  3. The function execution part:
    — The actual code execution part is going to try to do two things: 1) Check the target directory (and delete anything that exists in there, if necessary), and 2) Clone the latest data from the upstream repo.
    — If it meets an error, it will return an InternalServerError response to the caller through the API, so they can see and debug the error if necessary.
    — If it is successful, then it will return a Success response back to the caller.

The code looks like this:

/src/api/main.py (lines: 156–205)
(image by Author, source here and here and here)
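A sketch of those three parts together (the response shapes, parameter defaults, and helper name are assumptions; the gist has the authoritative version):

    @app.post(
        path=API_ENDPOINT,  # 1. The decorator part: a HTTP POST method
        tags=["Main Endpoints"],
        description=f"Clone from {GIT_URL} in to {REPO_DIR}.",
        responses={
            200: {"model": Success},
            422: {"model": ValidationError},
            500: {"model": InternalServerError},
        },
    )
    def main_endpoint(
        # 2. The declaration part: two Query parameters
        git_url: str = Query(default=GIT_URL, description="Where the data is pulled from."),
        repo_dir: str = Query(default=REPO_DIR, description="Where the data is saved to."),
    ):
        # 3. The function execution part
        try:
            # 1) Check the target directory, deleting anything already in there
            remove_dir(repo_dir)
            # 2) Clone the latest data from the upstream repo
            Repo.clone_from(url=git_url, to_path=repo_dir)
        except Exception:
            error_type, error_message, _ = exc_info()
            return JSONResponse(
                status_code=500,
                content={
                    "error_type": str(error_type),
                    "error_message": str(error_message),
                },
            )
        return JSONResponse(
            status_code=200,
            content={"message": "Success", "git_url": git_url, "repo_dir": repo_dir},
        )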

When it's running, the Swagger page looks like this 👇

The Swagger page for the Main Endpoint
(image by Author)

5.5. File: /templates/landing_page.html

As mentioned above, I've written a super simple HTML page which will be used as the Landing Page for the API. You'll notice two peculiarities here:

  1. The values within the <style> tag aren't proper CSS syntax, as they have double-curly-brackets ({{ and }}). This is because this data will be parse'd in to Python, which will then interpret anything within curly brackets to be parameters, and attempt to assign variables to them. To escape this, the double-curly-bracket syntax is used, as shown in the example below. This process is defined in the Python documentation.
  2. The values within the single-curly-brackets are the Global Constants, as defined by the environment variables (see above).

/templates/landing_page.html
(image by Author, source here and here and here)
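To illustrate the escaping, here is a tiny example of how str.format() treats the two bracket styles (the template text itself is made up):

    # Double braces survive formatting as literal CSS braces;
    # single braces are substituted with the Global Constants.
    template = "<style>h1 {{ color: steelblue; }}</style><h1>{TITLE} v{VERSION}</h1>"
    print(template.format(TITLE="Update from Git", VERSION="0.0.1"))
    # <style>h1 { color: steelblue; }</style><h1>Update from Git v0.0.1</h1>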

6. How to Use It

Now that we have built the app, and we understand what all of the components are doing, it's now time to set it up and begin using it.

6.1. Create Server (using AWS)

First thing's first, you need to create the server in the cloud. It's possible to use many different cloud-computing platforms; the two most popular ones are the Elastic Compute Cloud (aka EC2) on the Amazon Web Services (aka AWS) platform, and the Azure Virtual Machine (aka AVM) on the Microsoft Azure (aka Azure) platform. If you want to read about the differences between the two platforms, check out this article: Comparing AWS and Azure Compute Services.

For the purposes of this tutorial, we're going to use the AWS EC2 platform. There are many, many different tutorials about creating EC2 instances. Such as this one or this one or this one. I've personally found the articles by Jan Giacomelli to be the most helpful, because he includes plenty of details, explanations and screenshots. See the tutorial: Deploying Django to AWS with Docker.

Please pay close attention to the step in this Tutorial to Install Docker on the server. To add to this, there are two other pages on Docker Docs to help guide you with this installation: Install Docker Engine on Ubuntu and Install Docker Compose V2 on Linux. Once you ssh in to the server, the specific set of instructions which you'll need to run are:

Install Docker and Docker Compose on EC2 Instance
(image by Author, source here and here and here)
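The exact commands are shown in the gist above; one possible equivalent, using Docker's official convenience script, is:

    # Install Docker Engine (recent versions bundle the Docker Compose plugin)
    curl -fsSL https://get.docker.com -o get-docker.sh
    sudo sh get-docker.sh

    # Confirm that both are installed
    docker --version
    docker compose version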

Once you have created the EC2 instance and have it up and running successfully, it's important to add one extra step. That is to whitelist the IP addresses from GitHub, so that when GitHub triggers the HTTP POST request against your EC2 instance, it will not be blocked by the networking rules which are set up by default on the EC2. For more information about why this is important, see the details on About GitHub's IP Addresses and Option to Whitelist Webhooks IP addresses and GitHub Webhooks IP Ranges.

To whitelist these IPs, follow the instructions given in this tutorial: Authorize Inbound Traffic for your EC2 Instances. Once you have done so, your Inbound Rules should look like this 👇

Inbound Rules required for GitHub to access the EC2 Instance
(image by Author)

You may ask the question: "But why do I even need to do this?" or maybe "How do I know it has worked?" or maybe "How do I know if GitHub is failing because the IP was not whitelisted?". Well, the answer to all these questions is the same.

Specifically: if the GitHub IPs are not whitelisted, then when the Webhook is triggered from GitHub to AWS, it will be blocked, and the error message which is returned will be: failed to connect to host. See the below screenshot for an example:

Error message received when IP Addresses are not correctly whitelisted
(image by Author)

6.2. Add docker-compose.yml file to Server

Now that the EC2 instance is ready, the next step is to add the docker-compose.yml file to it. For this, there are three methods for adding the file to the Server:

  1. Using ssh:
    — Follow the instructions here and here for how to log on to the EC2 instance using ssh.
    — Once on the server, add an empty file using: touch docker-compose.yml.
    — Open the file using nano docker-compose.yml.
    — Literally copy-and-paste the text from your Local PC in to the server directly.
    — Save and close the file by pressing ctrl+O then ctrl+X.
  2. Using scp:
    — Follow the instructions here and here for how to execute an scp script to push files to a secure server.
    — The command should be something like:
    scp -i path/to/pem/file/ec2_identity_file.pem /path/on/localpc/docker-compose.yml username@server:path/on/server
  3. Using FileZilla (or other similar FTP software):
    — Follow the instructions here and here and here and here for how to push files to an EC2 instance using FileZilla.

The actual file which you should be copying to the server is given below. Be sure to update the <update> sections with your specific info.

Template for the docker-compose.yml file
(image by Author, source here and here and here)

6.3. Launch Docker on Server

Now that everything is set up and ready to go, the next bit is easy. Just one line of code to fire up the Docker container and start the API listener 👇

Launch Docker on the EC2 Instance
(image by Author, source here and here and here)
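That one line is presumably a docker compose command along these lines (the --detach flag is an assumption):

    docker compose up --detach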

And when you run it on the server, this is what it looks like:

Script for setting up Docker on the EC2 Server
(image by Author)

6.4. Add Webhook on Git (using GitHub)

Next, we need to actually configure GitHub to trigger the Webhook. The documentation provided by GitHub is actually really helpful with this, and it's quite easy to follow. In addition, there are some quite good Blogs and Tutorials available online for this, including: Deploy from Github/Gitlab to server using Webhook and Create your first CI/CD pipeline with Jenkins and GitHub.

In this example, I'll use another Repo of mine called code-snippets. While it's technically not an App, per se, it's still simple enough to help us understand the process.

When we navigate to the Repo, we can then open the Settings then Webhooks section, then press the Add Webhook button. As shown below.

The URL for this is also quite simple: https://github.com/<username>/<repo>/settings/hooks.

Where to find Webhooks on a GitHub Repo
(image by Author)

The next menu requires you to add the URL. This is the Elastic IP address which you had set on the EC2 Instance (see this section). If you know how to add sub-domains to your existing websites, then you can do that also. The part after the IP address is the same value which you had configured in the API_ENDPOINT value of the environment section of the docker-compose.yml file. It's all starting to fall in to place now.

For Content-Type, select application/json, leave the Events at just the push event, and then select Add Webhook.

Settings for adding a Webhook on GitHub
(image by Author)

Once saved, you'll see a message at the top of the screen:

Okay, that hook was successfully created. We sent a ping payload to test it out!

Successfully Created Webhook on GitHub Repo
(image by Author)

And when you go in to that Webhook to check the Recent Deliveries, you'll find one Request has already been sent, and a response already received. And, hopefully, if all of your configuration is successfully set up, you will see a green tick and a response status_code of: 200.

Confirmation of a Successful response from the GitHub Webhook ping
(image by Author)

6.5. Test it

Testing it is super easy. First, let's hop on to the EC2 server and check that the docker compose logs are looking good. Then we will check the contents of the directory using ls repo. This is the result:

Checking the files on the Server
(image by Author)

First thing's first, you can see that the ping request from GitHub has come through successfully! And the /repo directory already contains the updated data from the upstream Git repo!

Next, on our local PC, within the directory for the repo that we want to push to GitHub, let's just create one very small text file, called test.txt. Then we will do git add, git commit, git push to get it to the upstream Repo. The code looks like this 👇

Add a new file to Git and Push to upstream Repo
(image by Author, source here and here and here)
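The commands in that gist would be something like:

    touch test.txt
    git add test.txt
    git commit -m "Add test.txt"
    git push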

And the terminal looks like this:

Add a new file to Git
(image by Author)

And then, when we go back to the EC2 server to check it, we can re-run docker compose logs to confirm it has come through successfully (which it has!), and also check ls repo to confirm that the file is now present (which it is!). This is what it looks like:

Re-Checking the files on the Server
(image by Author)

Everything is now working like a charm!

7. What's next

Note: Now that the Webhooks are set up and your Server is automatically being kept up to date, there's technically one more step which you'd need to do. And that is usually to restart the app which you have running on the server, to ensure that it's always referencing the latest source files. However, because various apps have various ways of being restarted, I'm not going to cover them here.

Also, more recently, I've found that there's a Webhook process available for various Docker Container processes. Including on Docker Hub, AWS ECR, and Azure ACR. This is inspiration for me to do another blog on how to use this!

8. Where to Find More Information

Luckily, this process is not difficult. I've set it up to be open source, and readily available. All source code is saved to GitHub, and the ready-to-use container is hosted on Docker Hub.

9. The full file for: /src/api/main.py

/src/api/main.py
(image by Author, source here and here and here)

10. Contact

Thanks for reading.

Want to Connect? Shoot me queries, ideas, and suggestions at https://chrimaho.com.
