
Challenge 6: Networking - Load balancing your WWW Server Farm

Here is what you will learn 🎯

  • How to load balance HTTP traffic across two web server VMs
  • How to create an external load balancer using the Azure portal
  • What an Azure external load balancer requires and how to configure it

Our final architecture should look like this: Final architecture

First you will deploy the starting environment, and then you will add the external load balancer.

Table of Contents

  1. Deploy the Starting Point
  2. Deploy the Load Balancer
  3. Test server outage (optional)
  4. Cleanup

Deploy the Starting Point

In this directory there is an ARM template that deploys two web server VMs and their required resources (networking, disks, ...):

'Starting Point' Architecture

Deploy this scenario into your subscription by clicking on the button.

| Name           | Value                                    |
| -------------- | ---------------------------------------- |
| Resource group | (new) rg-lbwww                           |
| Location       | West Europe                              |
| Admin user     | demouser                                 |
| Admin password | %some complex value%                     |
| Vm Size        | Standard_B2s or try e.g. Standard_F2s_v2 |
| Disk Sku       | StandardSSD_LRS                          |
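If you prefer the command line over the deploy button, a deployment along the following lines should also work. This is a minimal sketch: the template file name azuredeploy.json and the parameter names are assumptions, so check the template in this directory for the real ones.

```bash
# Create the target resource group (names taken from the table above)
az group create --name rg-lbwww --location westeurope

# Deploy the ARM template; file and parameter names below are assumptions,
# adjust them to match the actual template in this directory
az deployment group create \
  --resource-group rg-lbwww \
  --template-file azuredeploy.json \
  --parameters adminUser=demouser adminPassword='%some complex value%' \
               vmSize=Standard_B2s diskSku=StandardSSD_LRS
```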

The result should look similar to this:

Deployment result

Deploy the Load Balancer

Now let's add an external Azure load balancer in front of the two parallel web server machines.

[Azure Portal] 
-> '+' Add 
-> Search the marketplace for 'Load balancer'
| Name                   | Value                                |
| ---------------------- | ------------------------------------ |
| Resource group         | rg-lbwww                             |
| Name                   | lb-wwwfarm                           |
| Region                 | West Europe                          |
| Type                   | Public                               |
| SKU                    | Basic                                |
| Frontend Configuration | %Use existing public IP% pip-wwwfarm |
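As an alternative to the portal, a roughly equivalent CLI command is sketched below. The public IP name pip-wwwfarm comes from the table above; the frontend and backend pool names are assumptions you can change freely.

```bash
# Create a public Basic load balancer that reuses the existing public IP
az network lb create \
  --resource-group rg-lbwww \
  --name lb-wwwfarm \
  --sku Basic \
  --public-ip-address pip-wwwfarm \
  --frontend-ip-name fe-wwwfarm \
  --backend-pool-name be-wwwfarm
```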

Configure the Load Balancer

To get your load balancer working you need to configure the following (a CLI sketch of these steps follows the list):

  • A backend pool that contains the endpoints, i.e. the VMs to which traffic will be routed.

    backend pool

  • A health probe for TCP port 80 (HTTP) to check whether the endpoints are responsive to web requests.

    health probe

  • A load balancing rule to forward incoming traffic (TCP port 80) on the load balancer's frontend IP address to the backend pool (TCP port 80).

    lb rule
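The same configuration could roughly be scripted as shown below. This is a hedged sketch: the resource names (be-wwwfarm, probe-http, rule-http, fe-wwwfarm) and the NIC/ipconfig names of the VMs are placeholders, so look up the real names in your resource group.

```bash
# Backend pool (already created together with the load balancer above,
# shown here for completeness)
az network lb address-pool create \
  --resource-group rg-lbwww --lb-name lb-wwwfarm --name be-wwwfarm

# Health probe on TCP port 80
az network lb probe create \
  --resource-group rg-lbwww --lb-name lb-wwwfarm \
  --name probe-http --protocol Tcp --port 80

# Load balancing rule: frontend port 80 -> backend port 80, using the probe
az network lb rule create \
  --resource-group rg-lbwww --lb-name lb-wwwfarm \
  --name rule-http --protocol Tcp \
  --frontend-port 80 --backend-port 80 \
  --frontend-ip-name fe-wwwfarm \
  --backend-pool-name be-wwwfarm \
  --probe-name probe-http

# Add each VM's NIC ip configuration to the backend pool
# (NIC and ipconfig names are placeholders, repeat for the second VM)
az network nic ip-config address-pool add \
  --resource-group rg-lbwww --nic-name nic-vm1 --ip-config-name ipconfig1 \
  --lb-name lb-wwwfarm --address-pool be-wwwfarm
```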

To check whether your load balancer is working, make an HTTP request to the endpoint http://%PIP of your lb%. Depending on which endpoint serves your request, the result should look like one of these:

lbresult1

lbresult2
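A quick way to see the responses come from both web servers is a small curl loop, sketched below; the IP is a placeholder for your load balancer's public IP.

```bash
# Request the page several times; the responses should come from both
# backend VMs over time (replace the placeholder with your LB's public IP)
for i in $(seq 1 10); do
  curl -s http://<public-ip-of-your-lb>/
  echo
done
```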

Test server outage (optional)

  1. Stop one VM and verify that the web page is still being served (see the CLI sketch below).
  2. Restart the VM and check whether the load balancer notices it and balances the load across both VMs again.
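Stopping and restarting a VM can also be done from the CLI. A minimal sketch, assuming one of the VMs is named vm-www1 (check the actual VM names in the resource group):

```bash
# Stop (deallocate) one web server VM so the health probe marks it as down
az vm deallocate --resource-group rg-lbwww --name vm-www1

# ... test the endpoint again, then bring the VM back
az vm start --resource-group rg-lbwww --name vm-www1
```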

Cleanup

Delete the resource group rg-lbwww.
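If you prefer the CLI, the following command removes the whole resource group:

```bash
# Deletes the resource group and everything in it
az group delete --name rg-lbwww --yes
```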

◀ Previous challenge | 🔼 Day 1 | Next challenge ▶