Parsing IIS logs

By Liisa Tallinn and Raido Karro

This article is a step-by-step guide on pointing SpectX at your IIS logs, applying one of the built-in IIS parsers to your data and customizing the schema if needed. Once the parser matches the raw data, SpectX makes it quick and easy to run queries on a large number of log files from their current location. This is especially handy when volumes are large and time is limited.

IIS logs seem fairly easy to parse - the structure is well standardized and available in the header of the files. At the same time, there is no pattern to rule them all: logging use cases are unique, storage is limited, and not all IIS servers have every field and option turned on in their configuration. The situation gets even trickier when looking at multiple servers with multiple configurations. The solution we've drawn up at SpectX is a full IIS log pattern (or schema) covering all fields potentially available in IIS versions 8.5 and up. Fields not present in the data can be commented out in the pattern.

To get started, download and install SpectX. The Desktop edition is free of charge with unlimited data volumes. If you happen to get stuck anywhere, feel free to join our Slack community to ask for advice.

Where are IIS Log Files Stored?

If the IIS server runs on-prem and centralized logging has not been set up, the logs are most likely sitting on the server itself. The default configuration has them collected in the %SystemDrive%\inetpub\logs\LogFiles folder.
The fastest option for reading, parsing and analyzing these files with SpectX is to create an AD security group for users running SpectX on their machines and give it access to the IIS log folder. This lets a user map the log folder to the machine running SpectX and run complex queries without first importing or copying the data out of the log folder. If the IIS server runs in Azure and the logs sit in an Azure blob, SpectX can read the files directly from that blob. The third option is centralized logging: SpectX can read files from the central file storage over SSH, or much faster by adding the SpectX Source Agent to the server.

The Trouble with IIS Headers

Looking at the raw IIS file, it is quite easy for a human to grasp the data structure since all the fields are specified in the header, starting with #Fields:

It is slightly harder for a machine to read because the header lines starting with the '#' are present not only at the beginning of the file but also in the middle, repeating after a certain number of events. This metadata needs to be skipped to be able to analyze the actual data. Here's how to do it with SpectX. 
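The mechanics are easy to illustrate outside SpectX, too. Below is a minimal Python sketch (the sample lines and field names are invented for illustration) of what any W3C extended log reader has to do: pick up the schema from the '#Fields:' directive, skip every other '#' metadata line wherever it appears in the file, and zip the schema onto each data row.

```python
# Invented sample: note the header block repeating mid-file,
# just as it does in real IIS logs.
sample = """#Software: Microsoft Internet Information Services 8.5
#Fields: date time c-ip sc-status
2018-10-29 00:00:01 203.0.113.7 200
#Software: Microsoft Internet Information Services 8.5
#Fields: date time c-ip sc-status
2018-10-29 00:00:02 203.0.113.8 404
"""

def parse_w3c(lines):
    fields = []
    for line in lines:
        if line.startswith("#"):
            if line.startswith("#Fields:"):
                # The schema travels with the data itself.
                fields = line[len("#Fields:"):].split()
            continue  # skip all other metadata lines, wherever they occur
        yield dict(zip(fields, line.split()))

rows = list(parse_w3c(sample.splitlines()))
```

SpectX handles the skipping and the typing for you; the sketch only shows why the repeated header blocks would otherwise corrupt a naive line-by-line parse.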

Point to the Storage and Parse the Logs

If the IIS log folder is mapped to the machine running SpectX, you can simply open the Input Data browser and navigate to one of the IIS log files. If the logs are stored in the cloud, create a new datastore (New > Datastore) and configure SpectX's access to the storage. Having clicked on a log file in the data browser, follow these steps:

1.  Click 'Prepare pattern'. SpectX will automatically apply one of the three IIS patterns included in the Desktop edition to your data. When the pattern matches the data, fields in the parse preview light up in yellow and blue:

2. If you're not happy with the pattern, click on the shared > pattern folder in the resource tree and drag-drop another IIS pattern to the pattern editing window.

3. For further customization, take a look at the '#Fields' section in the raw file header. Compare the list with the fields listed in the SpectX pattern. Comment out fields in the pattern that are missing in the header.
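The comparison in the last step can be done by eye, but if a server logs many fields it is quicker to diff the lists mechanically. A small Python sketch (both field lists here are illustrative, not a real server's configuration) that reports which pattern fields are absent from a given header:

```python
# Illustrative '#Fields:' line from a raw log and an illustrative
# (shortened) field list from a full IIS pattern.
header = "#Fields: date time s-ip cs-method cs-uri-stem sc-status"
pattern_fields = ["date", "time", "s-ip", "cs-method", "cs-uri-stem",
                  "cs-username", "sc-status", "time-taken"]

present = set(header[len("#Fields:"):].split())
# Fields the pattern expects but this server never logs --
# these are the ones to comment out in the pattern.
to_comment_out = [f for f in pattern_fields if f not in present]
```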

Run the First Query

When the pattern matches the data, click on 'Prepare Query' and 'Run'. The result should give you the first 1000 lines of the IIS log file you selected, parsed and typed.

Tip! If you’d like to look at more than just one file, add an asterisk to the path at the beginning of the query. For example, the scope of a single file such as 'file:/C:/inetpub/logs/u_ex181029.log' can be expanded with a wildcard to cover the whole month ('file:/C:/inetpub/logs/u_ex1810*.log') or the full year ('file:/C:/inetpub/logs/u_ex18*.log').
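The wildcard works because IIS names daily log files u_ex<yymmdd>.log, so truncating the date and adding an asterisk widens the time range. A quick Python illustration (the file names are made up) of how the month and year patterns select files:

```python
from fnmatch import fnmatch

# Made-up file names following the IIS daily naming scheme u_ex<yymmdd>.log.
files = ["u_ex181029.log", "u_ex181030.log", "u_ex181101.log"]

# u_ex1810*.log -> October 2018 only; u_ex18*.log -> all of 2018.
october = [f for f in files if fnmatch(f, "u_ex1810*.log")]
year = [f for f in files if fnmatch(f, "u_ex18*.log")]
```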

If your logs are stored in multiple locations and you would like to analyze them all simultaneously, first create a datastore for each of the sources. Then use the following syntax to list all the datastore URIs as a source for the parser. 
@src = PARSE(pattern:$pattern, src: ['file:/C:/inetpub/logs/u_ex18*.log', 'file:/D:/inetpub/logs/u_ex1810*.log', 'file:/E:/inetpub/logs/u_ex181029.log']);

More Queries

Now that the data is parsed and the scope of the data is as broad as needed, you can start experimenting with queries. Take a look at these 20 questions you can ask of your IIS logs, complete with copy-pastable queries.
