Getting Started with Azure AD and Office 365 audit logs¶
After turning on audit logging in Azure AD and/or Office 365, you can either retrieve the logs via API, send the logs to blob storage, or export the data to a CSV file and query it with SpectX. When you’ve finished the setup, read more about parsing and analyzing Office 365 logs.
A) Retrieve the logs via Microsoft APIs¶
SpectX queries the data via Microsoft APIs and writes it to the storage of the machine running SpectX. Running or scheduling a query at least once a week lets you build a complete archive of logs, even though the API only returns the last 7 days.
This option is available as a preview until 1st July 2021 in the free edition of SpectX. To try this feature after this period, contact us for an Enterprise trial.
- Register SpectX in the Azure Management Portal. See the step-by-step instructions here . Make note of the Application ID, Tenant ID and Secret key - you’ll need them in step 3.
- Open SpectX and create a new datastore by clicking on New > Data Store
- Select msapi:// as the protocol from the drop-down menu and fill in the details:
- Insert the IDs you got in step 1
- Tick the boxes to mark which logs you’d like to retrieve from O365 and Azure
- Specify if you’d like to see the hour and log type in the file name and path
- Click on advanced configuration to specify the lag times and update frequencies for each log type.
- Give the datastore a name
Lag time is the period during which you expect a log record to show up in the cloud. For example, querying logs from 9AM to 10AM with a lag time of 2 hours means you expect that an event from that window can still appear in the logs up to 2 hours later.
Update frequency is the interval during which you don’t need SpectX to re-fetch data. For example, running a query at 10:03 AM and again at 10:05 AM with an update frequency of 5 minutes means the already existing data is not re-fetched, because the last fetch was made less than 5 minutes ago.
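The two settings above can be sketched in a few lines of Python. This is not SpectX’s actual implementation, just an illustration of the logic: the lag time extends the moment at which a queried window is considered complete, and the update frequency decides whether a repeated query re-fetches data.

```python
from datetime import datetime, timedelta

def fetch_window(period_start, period_end, lag):
    """A record for the queried period may arrive up to `lag` later,
    so the window is only complete at period_end + lag."""
    return period_start, period_end + lag

def needs_refetch(last_fetched_at, now, update_frequency):
    """Re-fetch only if the previous fetch is older than the update frequency."""
    return now - last_fetched_at >= update_frequency

# Querying 9AM-10AM logs with a 2-hour lag: data is complete only after noon.
start, end = fetch_window(datetime(2021, 5, 1, 9), datetime(2021, 5, 1, 10),
                          timedelta(hours=2))

# A query at 10:05 does not re-fetch data last fetched at 10:03
# when the update frequency is 5 minutes.
stale = needs_refetch(datetime(2021, 5, 1, 10, 3), datetime(2021, 5, 1, 10, 5),
                      timedelta(minutes=5))
```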
B) For Azure AD logs: querying the logs from an Azure container¶
You can route your Azure AD logs to Azure Blob Storage with this simple configuration.
Sign up for an Azure blob storage subscription and set up a destination container for your logs.
Log in to the Azure portal as a global admin or security admin.
Search for and select Azure Active Directory
Scroll down and click on Audit logs on the left menu
Click on ‘Export data settings’ and you’ll reach the Diagnostic settings view.
Create a new diagnostic setting
- Give the setting a name
- Select which logs you’d like to send to the container (Audit, SignIn etc.) and specify their retention period (leave it at 0 to keep them forever)
- Check the box ‘Archive to a storage account’
- Select the storage account you set up for your destination container
- Click on ‘Save’ in the top left corner
Once saved, you should soon see your log records in the container.
To query the container with SpectX, connect SpectX to the Azure container.
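Azure diagnostic settings typically write hourly append blobs named PT1H.json, where each line is a single JSON log record. If you ever need to inspect these files outside SpectX, a minimal Python sketch of parsing one blob’s contents could look like this (the record fields shown are hypothetical samples; downloading from the container is out of scope here):

```python
import json

def parse_diagnostic_blob(text):
    """Each non-empty line of a PT1H.json append blob is one JSON log record."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

# Two hypothetical Azure AD log records, one JSON object per line:
sample = (
    '{"time": "2021-05-01T09:12:00Z", "operationName": "Add user", "category": "AuditLogs"}\n'
    '{"time": "2021-05-01T09:15:00Z", "operationName": "Sign-in activity", "category": "SignInLogs"}\n'
)
records = parse_diagnostic_blob(sample)
```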
C) For Office 365 and Azure Audit logs: export the logs into a .csv file¶
If you’d like to work offline with log files, the best way to get the broadest possible Office 365 dataset extending back the maximum period of time (90 days) is to export the data into .csv using PowerShell. That script from Microsoft will extract an Office 365 users’ login history report, containing both successful and failed login attempts. Depending on your PowerShell skills, you can tweak the $AllAudits variable in the script to include even more fields in the exported file.
If PowerShell is not an option, export the logs via the Security & Compliance Center UI.
- Click on ‘Search’ in the left menu
- Then on ‘Audit log search’.
- Apply a filter (or skip filtering if you simply want the maximum amount of the latest records)
- Click on export data. Note that this method allows you to go back 90 days and returns no more than 50,000 rows at a time.
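If your tenant generates more than 50,000 audit records per export, one common workaround is to split the 90-day range into smaller windows and export each separately. A sketch of that idea (the dates and one-day granularity are illustrative; pick a window size that keeps each export under the cap):

```python
from datetime import date, timedelta

def daily_windows(start, end):
    """Split [start, end) into one-day windows so each export
    stays well under the 50,000-row cap."""
    windows = []
    day = start
    while day < end:
        windows.append((day, min(day + timedelta(days=1), end)))
        day += timedelta(days=1)
    return windows

# Cover a full 90-day retention period with 90 daily exports.
windows = daily_windows(date(2021, 2, 1), date(2021, 5, 2))
```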
To query the resulting .csv file(s) with SpectX, navigate to the file using the Input Data browser and press Prepare Query. Use an asterisk in the file path to query multiple files with a similar path.
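For a quick sanity check of an export before loading it into SpectX, the CSV can also be filtered with a few lines of Python. The column names below are hypothetical samples; check the header row of your actual export:

```python
import csv
import io

# Hypothetical header and rows standing in for a real audit log export.
sample = io.StringIO(
    "CreationTime,UserId,Operation,ResultStatus\n"
    "2021-05-01T09:12:00,alice@contoso.com,UserLoggedIn,Success\n"
    "2021-05-01T09:13:30,bob@contoso.com,UserLoginFailed,Failed\n"
)

# Keep only the failed login attempts.
failed = [row for row in csv.DictReader(sample)
          if row["ResultStatus"] == "Failed"]
```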