Protecting your SQL Server with Windows Firewall – Part 3

So far in this series we have looked at why we need firewalls to protect our servers (Part 1) and how we capture inbound traffic on those servers (Part 2). Now we will look at how we can take that log data and process it so that we can build our firewall rules.

Reading The Log.

Building on my last post I am going to assume that you now have a neat folder full of log files from the Windows Firewall rules we configured. But how do we now look at what is there without spending hours playing with Notepad++? As with most things, PowerShell provides an answer for a quick and dirty way to process this data.

Note: If you are reading the Windows Firewall log directly then you will need to run these as admin.
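If you want a quick way to confirm that your current session is elevated before trying to read the log, a small check like this will do the job (a minimal sketch, using the standard WindowsPrincipal check):

## Warn if the current PowerShell session is not running elevated.
$Identity = [Security.Principal.WindowsIdentity]::GetCurrent()
$Principal = New-Object Security.Principal.WindowsPrincipal($Identity)
if (-not $Principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Warning "Not running as admin - reading the firewall log will likely fail."
}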

Here we are getting the log file location which we want to process as well as defining a working directory which we will use as we take the log file and turn it into a CSV so that it is more usable. If you have copied the file out to another location then you can just set the $FwLog variable as needed.

## Setup working variables.
$FwProfileName = "Domain"
$FwLog = Get-NetFirewallProfile -Name $FwProfileName `
    | Select-Object LogFileName
$WorkingFile = "$env:TEMP\FWLog_Temp.csv"

Building on this we can now have a look at what we have to work with in the log file. We could go open it in Notepad++ (remember to run as admin) or we can use Get-Content in PowerShell as seen below. We are combining it with a select just to grab the first 25 rows for this illustration.

Get-Content -Path $FwLog.LogFileName `
    | Select-Object -First 25

This reveals to us that the log file is space delimited for our data but has a bunch of info at the top relating to the content. Overall it has the information we need, but we cannot just import it and process it as if it were a CSV.

Get-Content PowerShell output of the raw Windows Firewall log.

This is why we set a working file: what we can do here is convert the log to CSV format with a little PowerShell trickery. We start by extracting the field list from the file header and turning it into a string array of the fields to act as a header for our CSV.

## Get the column headers from the Firewall Log File and turn it into a String Array to use as a header later.
[string]$FieldList = Select-String -Path $FwLog.LogFileName -Pattern "Fields"
$CsvHeader = $FieldList.Substring($FieldList.IndexOf(": ")+2) -split " "

Once we have our header defined, we can convert the content of the log file into a format we can use. In this case we are going to use Get-Content again to pull the data in, combined with Select-Object to skip the first five rows and go straight to the data.

Then we push that out to our working file before using Import-Csv to get the data back from it. Here we are specifying that the delimiter is a space and passing in the header string array we created a moment ago. Also, we must not forget to clean up after ourselves.

## Pull data from Firewall Log file (Need to run as admin),
## skip the first 5 rows of junk and then create our working CSV data.
$CsvBody = Get-Content -Path $FwLog.LogFileName `
    | Select-Object -Skip 5
$CsvBody | Out-File -FilePath $WorkingFile -Force
$CsvFinal = Import-Csv -Path $WorkingFile -Delimiter " " -Header $CsvHeader

## Clean up after ourselves.
Remove-Item -Path $WorkingFile -Force

Now we have the data in a format which we can make use of.

PowerShell output showing formatted Firewall log data in a structured table.

I would recommend that now you store the CSV somewhere nice and safe for future reference. This can be done with the Export-Csv PowerShell command.
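As a sketch of that step (the path here is just an illustrative example, set it to wherever you keep your records):

## Save the processed log data somewhere safe for future reference.
$CsvFinal | Export-Csv -Path "C:\FwLogs\FWLog_Processed.csv" -NoTypeInformation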

Finding Something Useful.

Now that we have the data in a variable we can look to query that rather than loading data into and out of files.

Here we are going to make use of the following PowerShell commands:

Where-Object
Group-Object
Sort-Object
Select-Object

These allow us to filter, sort, aggregate and output the information in a number of different ways.

By combining these we can run a basic query to pull back a consolidated list of sources which are sending data to our server. We can do this by using Where-Object to pull out rows where the path property is “RECEIVE”. Then we can group by the source IP, protocol, and the port the traffic was received on.

From here we can then look to sort and present it as required.

## Let's see what is talking to our server.
$CsvFinal | Where-Object {
    $_.Path -eq "RECEIVE"
} `
    | Group-Object -Property "src-ip","Protocol","dst-port" `
    | Sort-Object -Property Count -Descending `
    | Select-Object -Property Name, Count

The PowerShell above will output the following information:

PowerShell output of an aggregated and sorted data set of traffic received.

This lets us see what we are dealing with. Finding source IP addresses means we can backtrack into the appropriate network segments and sources. Now we have to start investigating whether these are appropriate or not. I’m sure that you will be in for a surprise or two along the way. But it is far better for us to know now and be able to do something about it, rather than waiting for a data breach and discovering it after the fact.

Wikipedia have a good list of common TCP and UDP ports which you can find here.

After doing a thorough review, identifying unauthorised sources, and building a list of things to resolve, we can focus on SQL Server traffic. We can take our previous PowerShell and, with some minor modifications to add predicates for the SQL Server port numbers, refine our log details further.

## Now focus on SQL Server
$CsvFinal | Where-Object {
    ($_.Path -eq "RECEIVE") -and 
    ($_."dst-port" -eq "1433" -or $_."dst-port" -eq "5022")
} `
    | Group-Object -Property "src-ip","Protocol","dst-port" `
    | Sort-Object -Property Count -Descending `
    | Select-Object -Property Name, Count

I am running SQL Server on its default port (we will discuss whether this is a good idea or not in another post). Additionally, we have an Always On Availability Group in play. This means that we need to make sure that the traffic can be sent between the endpoints on their ports.

Aggregated firewall log output showing SQL Server traffic.

In this example we are looking at a very simple scenario where we only have two other hosts sending data to our SQL Server. In more complex environments we will likely see many more entries here.

By iterating through this we can narrow down the Firewall rules we will need to build and deploy.

Turning The Firewall On.

Once all the hard work has been done to create the definitions for our rules, the implementation becomes a lot clearer. While it is possible to create these via the user interface for the Advanced Firewall config utility, we can also use New-NetFirewallRule in PowerShell, which lets us script the deployment and ongoing management of our rules.

Based on the previous information we are going to create two new rules for our SQL Server: one for TDS traffic from the source, and one for AG log transfer traffic on port 5022.

Now, one bone of contention is whether to implement a rule that also permits TDS traffic between the AG replicas. This is very much an “it depends” answer. On one hand we should be using Secure Access Workstations or dedicated management servers, so we should not need this. But at the same time, in a large-scale failure do you want/need to use management tools from the servers? This is another contentious subject covered really well by Andy Mallon in his blog post “Should I install SSMS on a server running SQL Server?”.

In this example we will not be creating a rule that permits TDS traffic between our servers in the same subnet.

Our first rule definition is below;

## Now we set the Firewall Rules which we want on the Server.
## TDS Traffic to SQL Server.
$FirewallRuleParams = @{
    Name = "Allow_TCP_1433_In"
    DisplayName = "Allow TCP 1433 In"
    Description = "Rule to allow all inbound TCP traffic to port 1433 for SQL Server."
    Action = "Allow"
    Direction = "Inbound"
    Profile = "Domain"
    RemoteAddress = "" ## Set this to your permitted source addresses.
    LocalPort = "1433"
    Service = "MSSQLSERVER"
    Enabled = "False"
}
New-NetFirewallRule @FirewallRuleParams

I would always advocate for making the rule as explicit as possible so that we can clearly define what is and is not permitted. Ambiguity in security situations is something that we need to avoid as much as possible. In this case traffic will only be allowed in via this rule if the SQL Server service is running.

Now we can use the same pattern for the AG traffic rule.

## AG Synchronisation Traffic to SQL Server.
$FirewallRuleParams = @{
    Name = "Allow_TCP_5022_In"
    DisplayName = "Allow TCP 5022 In"
    Description = "Rule to allow all inbound TCP traffic to port 5022 for SQL Server Availability Group synchronisation traffic."
    Action = "Allow"
    Direction = "Inbound"
    Profile = "Domain"
    RemoteAddress = "" ## Set this to your permitted source addresses.
    LocalPort = "5022"
    Service = "MSSQLSERVER"
    Enabled = "False"
}
New-NetFirewallRule @FirewallRuleParams

Once these are in place we need to enable them. Also, do not forget to disable the Allow-All rule so that we don’t undo all our hard work.

## Enable new rules, disable allow all.
Set-NetFirewallRule -Name "Allow_TCP_5022_In" -Enabled "True"
Set-NetFirewallRule -Name "Allow_TCP_1433_In" -Enabled "True"
Set-NetFirewallRule -Name "AllowAll-In" -Enabled "False"

These are not the only firewall rules which will likely be in place. We will need to make sure that all of the other rules required for AD authentication and domain traffic are also present. This will allow us to manage systems effectively while ensuring we have a secure environment.
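One quick way to audit what is currently active before locking things down is to list the enabled inbound rules; a minimal sketch:

## List the enabled inbound rules currently on the server.
Get-NetFirewallRule -Direction Inbound -Enabled True `
    | Sort-Object -Property DisplayName `
    | Select-Object -Property Name, DisplayName, Profile, Action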


Now that we have done the research and turned on the firewall, we should have a much more secure environment. However, we have also added another element to manage across our estate, so we now need to put some thought into how we manage Windows Firewall at a large scale.

In Part 4 of this series we will look at how we can manage Windows Firewall by adopting an Infrastructure as Code approach.
