
dmytrostriletskyi/heroku-load-balancer


Cheaper than built-in, the load balancer for your Heroku pipeline applications. The way to host CPU-bound applications.


Getting started

What is a load balancer

A load balancer is a device that distributes network or application traffic across a cluster of servers. A load balancer sits between the client and the server farm, accepting incoming network and application traffic and distributing it across multiple backend servers. By balancing application requests across multiple servers, a load balancer reduces individual server load and prevents any one application server from becoming a single point of failure, thus improving overall application availability and responsiveness.

Illustration of how a load balancer works

Motivation

Heroku has a built-in load balancer:

Heroku’s HTTP routers distribute incoming requests for your application across your 
running web dynos. So scaling an app’s capacity to handle web traffic involves 
scaling the number of web dynos. A random selection algorithm is used for HTTP request 
load balancing across web dynos - and this routing handles both HTTP and HTTPS traffic.

But at the same time, the built-in load balancer can't help if your application is CPU-bound.

If you are processing individual requests slowly due to CPU or other shared resource 
constraints (such as database), then optimizing concurrency on the dyno may not help 
your application’s throughput at all.

So, here are the pros of the solution:

  • costs — the solution is cheaper. With the built-in approach, buying the Performance plan for your application (meaning 4 dynos, i.e. 4 server instances) costs $100 per month. With this solution, you can buy 4 independent applications on the Hobby plan ($7 per month per instance), set up identical software on each, and put the load balancer in front of them (another $7 per month), which costs $35 in total ($7 × 5), roughly 3 times cheaper,
  • CPU-bound applications — as mentioned above, Heroku cannot fully cover this case. You can buy the Performance plan, but it will not increase your CPU performance enough to be worth paying a few hundred dollars for it. If you instead create tens of instances running identical software and put a load balancer in front of them, it may solve your problem.

And the cons of the solution. Keep in mind that this solution requires multiple, technically independent applications, which do not behave as a single application:

  • any add-ons must be attached to each app manually, which makes operations more complex,
  • logging is spread across the apps, which makes debugging harder,
  • performance metrics are spread across the apps, which makes understanding app behavior harder,
  • the Heroku platform does not operate them as a single app, which can cause downtime during deployments and daily dyno cycling: when the single load-balancer Hobby dyno cycles (restarts) each day or on a deployment, the entire app goes offline temporarily,
  • added request latency (another two HTTP hops in front of the Heroku router).

Big thanks to @mars (who works at Heroku) for providing the information above in issue #6.

How to use

  1. Press the button named Deploy to Heroku below.

Deploy

  2. Enter the name of the application that will host the load balancer. Choose the region and add it to the pipeline if needed.

  3. Visit the Heroku account settings page, find the API Key section, reveal the key, and paste it into the HEROKU_API_KEY field.

  4. Open the preferred pipeline and copy its identifier from the URL. On the screenshot it is f64cf79b-79ba-4c45-8039-57c9af5d4508, indicated by the red arrow at the top.

  5. Return to the deployment page and paste the identifier into the PIPELINE_IDENTIFIER field.

  6. Press the button named Deploy app. The deployment process will start immediately, as illustrated below.

  7. When the build is finished, you can manage your application (rename it, etc.) and view it (open its URL in the browser).

  8. To check that the load balancer works properly, open the logs of each production back-end server (heroku logs --tail -a application-name in the terminal) and send a request to the load balancer application. The load balancer will proxy your requests to the back-end servers in round-robin fashion (one by one, in order), as in the example below.
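
A minimal way to observe the round-robin behavior from a terminal. The names application-name and load-balancer-app are placeholders for your own back-end and load balancer apps. Tail the logs of one back-end in a separate terminal session:

$ heroku logs --tail -a application-name

Then send a few requests to the load balancer; with two back-ends, roughly every second request should appear in each app's log:

$ for i in 1 2 3 4; do curl -s -o /dev/null -w "%{http_code}\n" https://load-balancer-app.herokuapp.com/; done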

How it works

  1. You specify the pipeline's identifier (PIPELINE_IDENTIFIER) to create a load balancer for its production-stage applications.
  2. The URLs of those applications are fetched through the Heroku API using your HEROKU_API_KEY (see the sketch below).
  3. A load-balancing configuration file is then generated from the fetched URLs.
  4. Nginx serves it, distributing requests in round-robin fashion (one by one, in order).
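
As an illustration of steps 2 and 3, the sketch below fetches the production applications of a pipeline and prints their URLs. It is not the project's actual code: it assumes the pipeline-couplings and apps endpoints of the Heroku Platform API v3 and uses curl with jq, and it reuses the dummy key and identifier from the Development section. Double-check the endpoint paths and field names against the Heroku API reference.

$ export HEROKU_API_KEY='8af7dbb9-e6b8-45bd-8c0a-87787b5ae881'
$ export PIPELINE_IDENTIFIER='f64cf79b-79ba-4c45-8039-57c9af5d4508'
$ # List the pipeline couplings and keep the identifiers of production-stage apps.
$ app_ids=$(curl -s -H 'Accept: application/vnd.heroku+json; version=3' \
      -H "Authorization: Bearer $HEROKU_API_KEY" \
      "https://api.heroku.com/pipelines/$PIPELINE_IDENTIFIER/pipeline-couplings" \
      | jq -r '.[] | select(.stage == "production") | .app.id')
$ # Resolve each app identifier to its public web URL.
$ for app_id in $app_ids; do \
      curl -s -H 'Accept: application/vnd.heroku+json; version=3' \
          -H "Authorization: Bearer $HEROKU_API_KEY" \
          "https://api.heroku.com/apps/$app_id" | jq -r '.web_url'; \
  done

The hostnames taken from these URLs are written into an Nginx upstream block. Round-robin is Nginx's default balancing method, so each back-end receives requests one by one, in order.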

Development

Clone the project with the following command:

$ git clone https://github.com/dmytrostriletskyi/heroku-load-balancer.git
$ cd heroku-load-balancer

To build the project, use the following command:

$ docker build -t heroku-load-balancer . -f Dockerfile

To run the project, use the following command. It will start the server and occupy the current terminal session:

$ docker run -p 7979:7979 -v $PWD:/heroku-load-balancer \
      -e PORT=7979 \
      -e HEROKU_API_KEY='8af7dbb9-e6b8-45bd-8c0a-87787b5ae881' \
      -e PIPELINE_IDENTIFIER='f64cf79b-79ba-4c45-8039-57c9af5d4508' \
      --name heroku-load-balancer heroku-load-balancer

If you need to enter the bash of the container, use the following command:

$ docker exec -it heroku-load-balancer bash

Clean all containers with the following command:

$ docker rm $(docker ps -a -q) -f

Clean all images with the following command:

$ docker rmi $(docker images -q) -f