How to track new lines in log files thanks to awk
Posted on 2024-12-15 11:36:00 from Vincent in OpenBSD
In this blog post I'll share a one-line awk command that lets me see only the new lines added since the previous check.
I'm heavily using this feature in my daily scripts, /etc/daily.local or /etc/weekly.local. This allows me to only see what has changed since the previous run.
This is really useful for the last command, for dmesg or for /var/log/messages.
The Core Concept
The script leverages the power of the awk command to compare two files: the current log and a stored "previous" log. By comparing them line by line, awk efficiently pinpoints the new lines that have been added to the log.
Unveiling the Magic: A Deeper Dive into the awk Command
At the heart of this script lies the powerful awk command:
awk 'NR==FNR {a[$0]++; next} !($0 in a)' "$previous" "$current"
Let's break down the awk command step by step:
NR==FNR:
- NR represents the record number (line number) counted across all input files processed so far.
- FNR represents the record number within the current input file; it resets to 1 for each new file.
- When NR == FNR, we're processing the first file (the "previous" log), because the two counters only match while awk reads its first input file.
a[$0]++:
- If the current line ($0) is from the first file, it's stored in an associative array a as a key.
- The value associated with the key is incremented, effectively counting the occurrences of each line in the first file.
next:
- This statement tells awk to skip to the next record (line) without further processing the current line. This ensures that lines from the first file never reach the second pattern and are never printed.
!($0 in a):
- When processing the second file (the "current" log), this condition checks whether the current line ($0) is present as a key in the a array.
- If the line is not found, it is a new line that wasn't in the previous log. Since no action block follows the condition, awk applies its default action and prints the line.
By combining these steps, awk efficiently identifies and prints the new lines, providing you with a concise and informative view of system changes.
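As a quick illustration, you can try the command on two small test files (the /tmp/old_demo and /tmp/new_demo names below are only for this example):
printf 'line1\nline2\n' > /tmp/old_demo
printf 'line1\nline2\nline3\n' > /tmp/new_demo
awk 'NR==FNR {a[$0]++; next} !($0 in a)' /tmp/old_demo /tmp/new_demo
Only line3 is printed, since it is the only line of the "current" file that was absent from the "previous" one.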
A script
The following script uses this powerful command:
#!/bin/sh
# Usage: some_command | new_lines.sh /path/to/previous_file
previous="$1"
if [ -z "$previous" ]; then
    echo "usage: ${0##*/} previous_file" >&2
    exit 1
fi
# Create a temporary file
current=$(mktemp)
# Read stdin into the temporary file
cat > "$current"
if [ -s "$previous" ]; then
    # Use awk to compare the temporary file with the previous file
    awk 'NR==FNR {a[$0]++; next} !($0 in a)' "$previous" "$current"
else
    # First run: nothing to compare against, so print everything
    cat "$current"
fi
# Save the current content as the new "previous" file
cat "$current" > "$previous"
# Remove the temporary file
rm "$current"
Practical Usage
To monitor daily changes in the last command output:
/usr/bin/last | /usr/local/bin/new_lines.sh /tmp/daily_last
If daily is not relevant for you, you could put the following line in /etc/weekly.local instead:
/usr/bin/last | /usr/local/bin/new_lines.sh /tmp/weekly_last
We could even have both in parallel since they store content in different files.
To track new kernel messages since the last check:
/sbin/dmesg | /usr/local/bin/new_lines.sh /tmp/daily_dmesg
To identify new entries in the system log:
/bin/cat /var/log/messages | /usr/local/bin/new_lines.sh /tmp/daily_messages
It's always better to run it once by hand first, so we have a copy of the current content in the "previous" file.
Then you are free to use it in cron or in /etc/daily.local, so you will receive this "delta" via email every day.
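For example, /etc/daily.local could simply contain the three commands shown above (adapt the list to what you want to monitor):
/usr/bin/last | /usr/local/bin/new_lines.sh /tmp/daily_last
/sbin/dmesg | /usr/local/bin/new_lines.sh /tmp/daily_dmesg
/bin/cat /var/log/messages | /usr/local/bin/new_lines.sh /tmp/daily_messages
Since the output of the daily script is mailed to you, the new lines will show up in that nightly report.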
Conclusion
This simple yet effective script, powered by the awk command, offers a valuable tool for system administrators to monitor log files efficiently. By using it in cron or in regular check scripts, you can save time and focus on addressing critical system issues.