Configuring Custom Log Collection in Azure Monitor: A Step-by-Step Guide

The Log Analytics agent in Azure Monitor provides the Custom Logs data source, which collects events from text files on both Windows and Linux computers. Many applications log information to text files instead of standard logging services such as the Windows Event log or Syslog. Once the data is collected, you can either parse it into individual fields in your queries or extract it into separate fields during collection.

Please note the following important points:

  • This article covers collecting text logs with the Log Analytics agent. If you're using the Azure Monitor agent, refer to the documentation on how to collect text logs with the Azure Monitor Agent.
  • The legacy Log Analytics agent is deprecated and will no longer be supported after August 2024. To continue ingesting data, migrate to the Azure Monitor agent before then.

To ensure successful collection, the log files must meet the following criteria:

  • The log must have either a single entry per line or an entry that begins with a timestamp in one of the following formats:
    • yyMMdd HH:mm:ss
    • ddMMyy HH:mm:ss
    • MMM d hh:mm:ss
    • dd/MMM/yyyy:HH:mm:ss zzz
    • yyyy-MM-ddTHH:mm:ssK
  • The log file must not use circular logging, in which the log rotates by overwriting entries or by renaming the file and reusing it for continued logging.
  • The log file must be encoded in ASCII or UTF-8 format. Other encodings such as UTF-16 are not supported.
  • For Linux, time zone conversion is not supported for timestamps in the logs.
  • To avoid issues with log rotation caused by overwriting or renaming, it is recommended that the log file include the date and time it was created.
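A quick way to sanity-check a log file against the supported timestamp formats is to try parsing the start of each line. The sketch below maps the .NET-style format strings above to Python `strptime` directives; the mapping is an illustrative assumption for local validation, not something the agent itself runs.

```python
from datetime import datetime

# Illustrative mapping of the supported .NET-style formats to Python
# strptime directives (an approximation for local checking only).
FORMATS = [
    "%y%m%d %H:%M:%S",        # yyMMdd HH:mm:ss
    "%d%m%y %H:%M:%S",        # ddMMyy HH:mm:ss
    "%b %d %I:%M:%S",         # MMM d hh:mm:ss
    "%d/%b/%Y:%H:%M:%S %z",   # dd/MMM/yyyy:HH:mm:ss zzz
    "%Y-%m-%dT%H:%M:%S%z",    # yyyy-MM-ddTHH:mm:ssK
]

def has_supported_timestamp(line: str) -> bool:
    """Return True if the line starts with a timestamp in a supported format."""
    for fmt in FORMATS:
        # Timestamps vary in length, so try progressively shorter prefixes.
        for end in range(len(line), 0, -1):
            try:
                datetime.strptime(line[:end], fmt)
                return True
            except ValueError:
                continue
    return False
```

Lines that fail this check for every format would start a new record only under the New Line delimiter, not under a Timestamp delimiter.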

Please take note of the following:

  • If there are duplicate entries in the log file, Azure Monitor will collect them. However, the query results may be inconsistent, showing more events than the actual result count. It is advisable to investigate the application responsible for the log and address any issues causing this behavior before defining the custom log collection.

The following limits apply to a Log Analytics workspace:

  • Up to 500 custom logs can be created.
  • A table supports a maximum of 500 columns.
  • The column name cannot exceed 500 characters.

Please be aware of the following important information:

  • Custom log collection requires the application writing the log file to periodically flush the log content to the disk. This is necessary because the custom log collection relies on filesystem change notifications for the tracked log file.
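The flush requirement above can be satisfied by explicitly flushing after each write. A minimal sketch of such a writer follows; the file name and entry layout are hypothetical.

```python
import os
from datetime import datetime, timezone

def append_log_entry(path: str, message: str) -> None:
    """Append a timestamped entry and flush it to disk immediately,
    so filesystem change notifications fire for the collector."""
    # yyyy-MM-ddTHH:mm:ssK, one of the supported timestamp formats
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S%z")
    with open(path, "a", encoding="utf-8") as f:  # UTF-8 is a supported encoding
        f.write(f"{stamp} {message}\n")
        f.flush()               # flush Python's internal buffer
        os.fsync(f.fileno())    # ask the OS to commit the page cache to disk

append_log_entry("app.log", "service started")
```

Without the flush, entries can sit in the application's buffer indefinitely and never trigger a change notification.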

Defining a custom log table involves the following steps:

  1. Open the Custom Log wizard in the Azure portal, accessible through Log Analytics workspaces > your workspace > Tables.
  2. Select Create and then choose New custom log (MMA-based).
  3. By default, all configuration changes are automatically propagated to all agents. For Linux agents, a configuration file is sent to the Fluentd data collector.

Uploading and parsing a sample log is the next step. You will need to upload a sample log to the wizard, which will parse and display the entries for validation. The wizard uses a delimiter to identify each record, with New Line being the default delimiter for log files with one entry per line. If the line begins with a date and time in the available formats, you can specify a Timestamp delimiter, which supports entries spanning multiple lines.
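The Timestamp delimiter behavior can be modeled as: any line that begins with a timestamp starts a new record, and lines without one are folded into the previous record. The sketch below is a simplified illustration that recognizes only the yyyy-MM-ddTHH:mm:ssK format as a boundary; the wizard supports the other formats as well.

```python
import re

# Simplified boundary check: only the yyyy-MM-ddTHH:mm:ssK format.
TIMESTAMP = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}")

def split_records(lines):
    """Group lines into records, starting a new record at each timestamp."""
    records = []
    for line in lines:
        if TIMESTAMP.match(line) or not records:
            records.append(line)
        else:
            records[-1] += "\n" + line  # continuation of a multi-line entry
    return records
```

With the New Line delimiter, by contrast, every line is its own record regardless of content.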

After uploading the sample log, selecting the appropriate delimiter, and proceeding to the next step, you need to add log collection paths. This involves specifying the path and name of the log file or providing a path with a wildcard for the name. You can define multiple paths for a single log file, accommodating scenarios where a new file is created daily or when a specific size threshold is reached.

For example, if your application generates log files with names following the pattern logYYYYMMDD.txt in the directory C:\Logs, you can define the log collection path as C:\Logs\log*.txt.
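The wildcard matching can be tried out locally with Python's `glob` module; the sketch below uses a hypothetical directory and assumes the log*.txt naming pattern from the example above.

```python
import glob
import os

def matching_log_files(directory: str) -> list[str]:
    """Return the files in the directory that match the log*.txt wildcard."""
    return sorted(glob.glob(os.path.join(directory, "log*.txt")))
```

Files such as log20240101.txt and log20240102.txt match, while unrelated files in the same directory are ignored.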

When adding log collection paths, choose between Windows and Linux formats based on your needs. Enter the path and click the + button to add it. Repeat the process for any additional paths.

Next, provide a name and description for the log. The name will serve as the log type and will automatically end with “_CL” to indicate it as a custom log. You can add an optional description to provide further context.

After entering the log name and description, select Next to save the custom log definition.

It may take up to an hour for the initial data from a new custom log to appear in Azure Monitor. Collection starts from the point at which you defined the custom log, for the log files found in the paths you specified. The entries you uploaded during custom log creation aren't retained, but Azure Monitor does collect entries that already exist in the log files it locates.

Once Azure Monitor begins collecting from the custom log, you can use a log query to access the records. Use the custom log’s name as the Type in your query. Please note that if the RawData property is missing from the query results, you might need to close and reopen your browser.

To parse the custom log entries, it’s important to know that the complete log entry is stored in a single property called RawData. For better analysis, you may want to separate the different pieces of information into individual properties for each record. For options on parsing RawData into multiple properties, refer to the documentation on parsing text data in Azure Monitor.
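In Azure Monitor itself this splitting is done in log queries, but the idea can be sketched locally in Python. The field layout below (timestamp, code, status, message) is borrowed from the sample walkthrough later in this article and is an assumption about your log format.

```python
import re

# Hypothetical entry layout from the sample walkthrough:
# <timestamp> <code> <status> <message>
ENTRY = re.compile(
    r"^(?P<timestamp>\S+ \S+)\s+"
    r"(?P<code>\d+)\s+"
    r"(?P<status>\w+)\s+"
    r"(?P<message>.*)$"
)

def parse_raw_data(raw: str) -> dict:
    """Split a RawData string into named fields (returns {} if no match)."""
    m = ENTRY.match(raw)
    return m.groupdict() if m else {}
```

Splitting at collection time instead (via the wizard) produces the same fields as real table columns, at the cost of fixing the schema up front.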

If you need to delete a custom log table, you can follow the instructions provided in the documentation on deleting a table.

Data collection in Azure Monitor occurs approximately every 5 minutes for new entries in each custom log. The agent keeps track of its progress in each log file it collects from. If the agent goes offline for a period of time, Azure Monitor will resume collecting entries from where it left off, even if those entries were created while the agent was offline.
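The agent's resume-where-it-left-off behavior can be modeled as remembering a byte offset per file. The toy sketch below is an illustration of the checkpointing idea, not the agent's actual implementation.

```python
# Toy model of per-file checkpointing: remember the byte offset already
# read, so collection resumes where it left off after an outage.
offsets: dict[str, int] = {}

def collect_new_entries(path: str) -> list[str]:
    """Return entries appended to the file since the last call."""
    with open(path, "r", encoding="utf-8") as f:
        f.seek(offsets.get(path, 0))   # jump past everything already collected
        new_data = f.read()
        offsets[path] = f.tell()       # checkpoint the new position
    return [line for line in new_data.splitlines() if line]
```

Entries written while the collector was not running are still picked up on the next call, because the saved offset points at the last byte actually read.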

Each custom log record has a type that matches the log name you provided, along with the following properties:

  • TimeGenerated: the date and time that the record was collected by Azure Monitor.
  • SourceSystem: the type of agent the record was collected from (OpsManager for the Windows agent, Linux for all Linux agents).
  • RawData: the full text of the collected entry.
  • ManagementGroupName: the name of the management group for System Center Operations Manager agents, or AOI- for other agents.

The sample walkthrough in the article demonstrates the process of adding a custom log using a sample log file with entries consisting of a timestamp, code, status, and message. Screenshots are provided to illustrate uploading and parsing the sample log, adding log collection paths, and providing a name and description for the log. It also mentions the validation of custom log collection by querying the logs and parsing the log entries.

In situations where custom logs are not suitable due to different data structure, log file non-compliance, or the need for preprocessing or filtering before collection, alternate strategies are recommended. These include using custom scripts or other methods to write data to Windows Events or Syslog, which can be collected by Azure Monitor, or sending the data directly to Azure Monitor using the HTTP Data Collector API.
