Creating My Own PaaS: Azure and AWS at Home (P1)

I often find myself relaxing, doing nothing, when a random project idea suddenly hits me and I think, "Man, that would be pretty fun to make". I then spend a few days building that project and have tons of fun doing it. I throw these projects on hosting services, but over time they stack up and start costing me a lot of money. I figured the best solution would be to host them myself, so they'd cost me next to nothing. That's why I decided to make my own PaaS.

The Creation Of My PaaS

Getting a Server

I figured that all a PaaS really does is take your code and run it on its servers, so I immediately knew I needed a place to actually run the code. I pulled out an old laptop of mine that was just collecting dust in a drawer. It had Windows 7 on it, so I completely wiped the laptop and installed Linux, going with the Ubuntu Server distro since it's pretty lightweight but doesn't require a lot of setup. I did this because Windows is rather heavy on resources and I won't need any of the features that make it heavy; Windows 7 is also no longer supported, and a lot of the applications and frameworks I'm going to need won't run on it.

Making my PaaS Accessible

The problem with hosting something locally is that it's, well, hosted locally. I needed to figure out how to make my PaaS accessible from outside my local network. The catch is that most home networks use dynamic IPs, meaning the IP address changes over time. If the IP changes, the address of my server changes, and I can't set up stable access to it.

Luckily, the solution was rather easy: I grabbed a domain I had lying around and set up a script on my Ubuntu Server to automatically update that domain to point to my IP. That way, whenever my IP changes, the domain is updated to point to the new one. This means whenever I want to reach my server, I can just go through the domain, because the domain will always point to my current IP.
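For the curious, a dynamic-DNS updater like this boils down to two HTTP calls: ask a public service what your IP is, then tell your DNS provider about it. Here's a sketch in Node, assuming Cloudflare as the DNS provider; `ZONE_ID`, `RECORD_ID`, and `CF_TOKEN` are placeholders, not my real values:

```javascript
// The body Cloudflare's "update DNS record" endpoint expects for an A record.
function dnsUpdatePayload(domain, ip) {
  return { type: 'A', name: domain, content: ip, ttl: 1, proxied: true };
}

// Run this on a timer, e.g. every five minutes via cron.
async function updateDns(domain) {
  // api.ipify.org is one of several services that simply echo your public IP
  const ip = (await (await fetch('https://api.ipify.org')).text()).trim();
  const url =
    `https://api.cloudflare.com/client/v4/zones/${process.env.ZONE_ID}` +
    `/dns_records/${process.env.RECORD_ID}`;
  await fetch(url, {
    method: 'PUT',
    headers: {
      Authorization: `Bearer ${process.env.CF_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(dnsUpdatePayload(domain, ip)),
  });
}
```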

Lastly, I set up a Cloudflare proxy, which means requests to the domain first go through Cloudflare before they reach me. I did that to make use of Cloudflare's encryption: Cloudflare automatically gives my domain an SSL/TLS certificate, which lets my domain use HTTPS.

Developing the PaaS backend

After getting the server running and making it accessible, I started developing the software that'll be responsible for actually deploying apps. I first thought about what a PaaS does when I deploy a project on it:

  1. I give it my project, normally via a GitHub repo link.
  2. It downloads the project and runs it on its server.
  3. It gives me options to customize the deployment, like setting a PORT to expose my app, setting custom domains, and setting environment variables.

This tells me I'm going to need to:

  1. Set up GitHub OAuth to log in via GitHub, so my server has access to my GitHub account and can fetch my projects.
  2. Write code that automatically downloads and stores the code from GitHub and then runs it.
  3. Create a config that lets me:
  • Automatically expose ports for my projects, give each project a domain, and route port 80 on that domain to the port the project is running on.
  • Set up a custom domain by issuing an SSL/TLS certificate for it.
  • Make an API that lets users add environment variables directly to their project.
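To make that concrete, the per-project config I have in mind could look something like this (the field names are illustrative, not a fixed schema):

```javascript
// Hypothetical record my PaaS would store per deployed project.
const projectConfig = {
  repo: 'https://github.com/someuser/some-project', // fetched via GitHub OAuth
  internalPort: 3000, // the port the app listens on inside its container
  domain: 'abc123.example.com', // auto-assigned subdomain, routed from port 80
  customDomain: null, // filled in once the user brings their own domain
  env: { NODE_ENV: 'production' }, // user-supplied environment variables
};
```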

My Progress

Setting up Github OAuth

Setting up GitHub OAuth was relatively straightforward. I wrote an Express server, then set up 'passport' to handle sessions. I then created a GitHub OAuth application and wired it into the flow so that users can go to a link, accept the OAuth prompt, have their session stored in a database, and be redirected to my homepage.
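passport hides most of the details, but under the hood the GitHub OAuth dance is just two steps: send the user to GitHub's authorize URL, then trade the `code` GitHub redirects back with for an access token. A sketch using only Node built-ins, where the client ID and redirect URI are placeholders:

```javascript
// Step 1: build the URL the user visits to grant access.
function githubAuthorizeUrl(clientId, redirectUri) {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    scope: 'repo', // lets the server read the user's repositories
  });
  return `https://github.com/login/oauth/authorize?${params}`;
}

// Step 2: after GitHub redirects back with ?code=..., exchange it for a token.
async function exchangeCodeForToken(clientId, clientSecret, code) {
  const res = await fetch('https://github.com/login/oauth/access_token', {
    method: 'POST',
    headers: { Accept: 'application/json', 'Content-Type': 'application/json' },
    body: JSON.stringify({ client_id: clientId, client_secret: clientSecret, code }),
  });
  return (await res.json()).access_token;
}
```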

Downloading and Running Projects

Setting up the code to download a repo was very easy: I used a package called simple-git, passed the user's access token and the project's repo link, and it downloaded the code to a specific place for me. Next, I simply ran "npm install", "npm run build" and "npm run start". This worked fine, but the problem was that it didn't run in an isolated context and only worked for NodeJS applications.

To run projects in an isolated context, I could spin up a 'virtual machine' and run each project inside it. This is essentially what Docker gives you: containers aren't full virtual machines, but each one is a fresh, isolated environment for a project to run in.

I set up Docker, and now each downloaded project gets a Docker image built from it, and that image is run as a container; if the project has no Dockerfile, I just drop in a basic one for NodeJS applications. This way every project runs in an isolated context and I can use any framework, not just NodeJS. In the future, I can make my server pick a Dockerfile automatically based on context clues: if package.json is in the project, it's NodeJS, so the default NodeJS Dockerfile gets loaded.

Issuing Custom Domains with Reverse Proxy

The next problem was giving each project its own domain and routing requests from that domain to the Docker container the project is running in. Giving each project a domain was very easy: when I told Cloudflare to route all requests for my domain to my server's IP, I also told it to route requests for any subdomain of my domain to the same place.

This is called a wildcard subdomain: any request to any subdomain gets routed to the same place. On my server, I can then check which subdomain the request came in on, and if it's the one I gave to a project, route the request to the port that project is running on.
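The subdomain check itself is only a few lines; something like this, with `example.com` and the routing-table entries standing in for my real domain and projects:

```javascript
// Given a Host header, the base domain, and a subdomain -> port table,
// return the local port the request should be forwarded to (or null).
function portForHost(host, baseDomain, routes) {
  const suffix = `.${baseDomain}`;
  if (!host.endsWith(suffix)) return null; // not one of my project subdomains
  const subdomain = host.slice(0, -suffix.length);
  return routes[subdomain] ?? null;
}
```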

Setting up the reverse proxy to actually route those requests was a different story, though. I couldn't figure out how to do it and ended up asking some of my friends for help. They first recommended Nginx; I tried it, but it wasn't the right tool for my project, since Nginx is hard to configure programmatically and more meant for manual use. Next, I was recommended Caddy, which does the same thing Nginx does but ships with a CLI (a Command Line Interface), meaning I can drive it with shell commands. This was perfect, since it means I can use it programmatically.

I played around with Caddy for a while, but it just wouldn't work: whenever I tried to use Caddy as a reverse proxy for a domain, the connection timed out. After an entire day of struggling, I decided not to use any external service and instead wrote my own reverse proxy. Once I did that, it worked perfectly.


I've learned a lot so far, and it's been an incredibly fun project to work on, although I decided to take a break to focus on my university projects a little. So far I can:

  • Log in via GitHub
  • Give the server a GitHub repo URL and have it download the project
  • Have the server automatically generate a Docker image, container, and docker-compose file, and spin the container up to run the project
  • Have the server expose a port and a domain for my project, and automatically route all requests to that domain to the port the Docker container is running on
  • Remove a project again by deleting its downloaded files, removing the Docker container and image, and then removing the reverse proxy route

What I still need to do however is:

  • Make the UI
  • Make an API to forward Docker logs to the UI
  • Allow users to set up custom domains

What I would like to do if I have time is:

  • Allow users to specify a custom subdomain instead of the gibberish I generate for them.
  • Allow cronjobs to be created, which just run scripts automatically at specific times
  • Potentially remake it or at least refactor the code. There's a lot I'd do differently if I had the chance to remake this project.

I'll post an update later once I've made some more progress, take care!