Log Parser is a tool that has been around for quite some time (almost six years, in fact). I can’t really do any better than the description on the official download page, so here it is: “Log parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory”.
Log Parser is a command line (yes, command line!) tool that uses a SQL dialect to extract information from data sources. In particular, I have found it to be invaluable for extracting information from the web server logs of the sites that I manage and develop.
First, about that SQL syntax Log Parser uses to query the data sources… many developers seem to have a natural aversion to SQL. In addition, many new data access frameworks attempt to abstract SQL away from the developer. However, I have always found SQL easy to work with and believe it to be an essential tool that every developer should at least have a working knowledge of. For Log Parser, all that is necessary is a basic understanding of the core SQL SELECT statement, as implemented within Microsoft’s SQL Server (that is, T-SQL). That means you should be familiar with the following elements of a SELECT statement: TOP, FROM, INTO, WHERE, ORDER BY, GROUP BY. That’s all you need to perform most Log Parser operations.
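To make that concrete, here is the general shape that most of the examples below take (a throwaway sketch: the [LogFileName] placeholder follows the convention used throughout this post, and the TopPages.csv output name is purely illustrative):

logparser "SELECT TOP 25 cs-uri-stem, COUNT(*) AS Hits INTO TopPages.csv FROM [LogFileName] WHERE sc-status = 200 GROUP BY cs-uri-stem ORDER BY Hits DESC"

Swap in a real log path for [LogFileName] and this same skeleton, TOP, INTO, FROM, WHERE, GROUP BY, ORDER BY, covers the bulk of what follows.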
Curiously, Log Parser has never received the amount of attention that I think it deserves. Beyond a flurry of attention when it was first released, it seems to be mentioned rarely in official Microsoft communications or blogs. Despite that, it remains a viable and valuable tool for parsing not just web server log files, but all types of structured text-based data.
In this post, rather than explaining how to use Log Parser, I'll give a number of examples of its use. In addition, I'll point to some useful locations on the web where more Log Parser information can be found.
Examples
Keep in mind that most of the examples that I give here are all-in-one command line queries (even though many wrap to multiple lines when displayed here). However, queries can also be run as
logparser file:XXXXX.sql
where XXXXX is the name of a file containing a Log Parser-friendly SQL query. There are a couple of examples of this in the following list.
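As a quick illustration before we get to the list, a minimal query file (call it hits.sql; the name is hypothetical, and the query itself is just a stripped-down version of example 7 below) might contain nothing more than:

SELECT c-ip, COUNT(*) AS Hits
FROM [LogFileName]
GROUP BY c-ip
ORDER BY Hits DESC

and would be run with: logparser file:hits.sql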
The examples given here have been obtained from a variety of sources, including the documentation that ships with the tool, blogs and online documentation, and my own experience. Unfortunately, I don’t have a record of the origin of each individual example, as I’ve compiled these piecemeal over the last two or three years.
I hope you’ll find something useful here and gain an appreciation for just how robust this tool is.
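One last note: most of these queries run against IIS logs in the W3C Extended format (a few query the Event Log or a CSV file produced by an earlier query), and they generally rely on Log Parser choosing the input and output formats on its own. When it can't, the -i and -o switches name them explicitly, as in this quick illustration (RawHits.csv is just a made-up output name):

logparser "SELECT date, time, c-ip, cs-uri-stem, sc-status INTO RawHits.csv FROM logs\iis\ex*.log" -i:IISW3C -o:CSV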
1) All page hits by a given IP address
logparser "select cs-uri-stem, count(cs-uri-stem) as requestcount from [LogFileName] where c-ip = ’000.00.00.000′ group by cs-uri-stem order by count(cs-uri-stem) desc"
2) Hits on a particular page by IP address
logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] where cs-uri-stem like ‘/search.aspx%’ group by c-ip order by count(c-ip) desc"
3) ReverseDNS example. This attempts to find the host name associated with a given IP address.
logparser "select c-ip, REVERSEDNS(c-ip) from [LogFileName] where c-ip = ’000.00.00.000′ group by c-ip"
4) CSV example. All hits on a page, written to a CSV file.
logparser "select * into OUTPUT.CSV from [LogFileName] where cs-uri-stem like ‘/pagename.aspx’"
5) Chart example. All hits on a page by an IP address, displayed on a chart.
logparser "select c-ip, count(c-ip) as requestcount intologparserchart.gif from [LogFileName] where cs-uri-stem like ‘/pagename.aspx’ group by c-ip order by count(c-ip) desc" -o:chart
6) Hits per hour from a particular IP address
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip=’000.000.00.000′ group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"
7) Basic list of IP addresses generating traffic
logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] group by c-ip order by count(c-ip) desc"
8) Basic list of pages being hit
logparser "select cs-uri-stem, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’ group by cs-uri-stem order by count(cs-uri-stem) desc"
9) Basic list of pages being hit, including which IPs are doing the hitting
logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’ group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"
10) Pages being hit after a specific date and time
logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’ and date=’2009-06-04′ and time > ’15:00:00′ group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"
11) Counts of hits of ASPX/ASHX pages by hour from a particular IP address
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip=’000.000.00.00′ and (cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’) group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"
12) Counts of hits against specific pages by hour from a particular IP address
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), cs-uri-stem, count(*) as numberrequests from [LogFileName] where c-ip=’000.000.00.00′ and (cs-uri-stem like ‘%aspx%’ or cs-uri-stem like ‘%ashx%’) group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600)), cs-uri-stem order by numberrequests desc"
13) Top browsers
logparser "Select top 50 to_int(mul(100.0,PropCount(*))) as Percent, count(*) as TotalHits, cs(User-Agent) as Browser from [LogFileName] group by Browser order by Totalhits desc"
14) Hourly Bandwidth (chart)
logparser "Select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)) As Hour, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByHour.gif From [LogFileName] Group By Hour"
15) Requests by URI
logparser "SELECT top 80 QUANTIZE(TO_TIMESTAMP(date, time), 3600) as Hour, TO_LOWERCASE(STRCAT(‘/’,EXTRACT_TOKEN(cs-uri-stem,1,’/’))) as URI, COUNT(*) AS RequestsPerHour, SUM(sc-bytes) AS TotBytesSent, AVG(sc-bytes) AS AvgBytesSent, Max(sc-bytes) AS MaxBytesSent, ADD(1,DIV(Avg(time-taken),1000)) AS AvgTime, ADD(1,DIV(MAX(time-taken),1000)) AS MaxTime FROM [LogFileName] GROUP BY Hour, URI Having RequestsPerHour > 10 ORDER BY RequestsPerHour ASC"
16) Top 10 Images by size
logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),’/’) AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent From [LogFileName] Where (Extract_Extension(To_Lowercase(cs-uri-stem)) IN (‘gif’;’jpg’;’png’)) AND (sc-status = 200) Group By To_Lowercase(cs-uri-stem) Order By BytesSent, Hits, MaxTime DESC"
17) Top 10 URLs for a website, with total hits, max time to serve, and average time to serve
logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),’/’) AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent FROM [LogFileName] GROUP BY cs-uri-stem ORDER BY TotalHits DESC"
18) Top 20 clients
logparser "Select Top 20 c-ip AS Client, Count(*) AS Hits INTO Chart.gif FROM [LogFileName] GROUP BY c-ip ORDER BY Hits Desc"
19) Referrer Broken Links (i.e. external references to broken links on your site)
logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTOReferBrokenLinks.html FROM [LogFileName] WHERE cs(Referer) IS NOT NULL AND sc-status = 404 AND (sc-substatus IS NULL OR sc-substatus=0)" -tpl:ReferBrokenLinks.tpl
20) Status codes
logparser "SELECT sc-status As Status, COUNT(*) As Number INTOStatusCodes.gif FROM <2> GROUP BY Status ORDER BY Status"
21) Search the Event Log for W3SVC (IIS) entries and color-code them as Error, Warning, or Information. This example writes the output of the query to an HTML file that is generated using a template file.
logparser "SELECT TimeGenerated,EventTypeName,Strings,Message,CASE EventTypeName WHEN ‘Error event’ THEN ‘RED’ WHEN ‘Warning event’ THEN ‘YELLOW’ WHEN ‘Information event’ THEN ‘WHITE’ ELSE ‘BLUE’ END As Color INTO file.html FROM System WHERE SourceName = ‘W3SVC’" -tpl:IISEventLogEntries.tpl
Where IISEventLogEntries.tpl is a file that contains the following:
<LPHEADER>
<HTML>
<HEAD>
<STYLE>
TD { font-family: Arial };
TH { font-family: Arial };
</STYLE>
</HEAD>
<BODY>
<TABLE BORDERCOLOR="BLACK" BORDER="1" CELLPADDING="2" CELLSPACING="2">
<TR>
<TH COLSPAN=4 BGCOLOR="BLACK"><FONT COLOR=WHITE>New W3SVC Messages in System Event Log</FONT></TH>
</TR>
<TR>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Time Generated</TH>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Event Type</TH>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Strings</TH>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Message</TH>
</TR>
</LPHEADER>
<LPBODY>
<TR bgCOLOR="%Color%">
<TD>%TimeGenerated%</TD>
<TD>%EventTypeName%</TD>
<TD>%Strings%</TD>
<TD>%Message%</TD>
</TR>
</LPBODY>
</TABLE>
</BODY>
</HTML>
22) Upload Log Parser query results directly to a table in SQL Server
logparser "select * into LogTable from [LogFileName] where cs-uri-stem like ‘/folder/filename%’" -o:SQL -createTable:ON -server:[DatabaseServer] -database:[Database] -username:[SqlUser] -password:[SqlPassword]
23) Top 10 images by size sent. Note that this example also shows how to query multiple log files at once.
logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),’/’) AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO TOP10ImagesBySize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN (‘gif’;’jpg’;’png’)) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"
24) Browser types (two different approaches)
logparser "SELECT distinct cs(User-Agent), count(*) as hits INTOuseragentsalltypes.txt FROM logs\iis\ex*.log GROUP BY cs(user-agent) ORDER BY hits DESC"
logparser "SELECT TO_INT(MUL(100.0,PROPCOUNT(*))) AS Percent, COUNT(*) AS Hits, cs(User-Agent) as Browser INTO UseragentsHits.txt FROM logs\iis\ex*.log GROUP BY Browser ORDER BY HITS DESC"
25) Unique visitors per day. This requires two queries. The first query selects from the IIS logs into a CSV file, and the second selects from that CSV file.
logparser "SELECT DISTINCT cs-username, date INTOtempUniqueVisitorsPerDay.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL Group By Date, cs-username"
logparser "SELECT date, count(cs-username) as UniqueVisitors intotest.txt FROM tempUniqueVisitorsPerDay.csv GROUP BY date"
26) Top 10 largest ASPX pages.
logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),’/’) AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO top10pagesbysize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN (‘aspx’)) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"
27) Top 10 slowest ASPX pages
logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = ‘aspx’ GROUP BY cs-uri-stem ORDER BY MaxTime DESC"
28) Top 10 slowest ASPX pages on a specific day
logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = ‘aspx’ AND TO_STRING(To_timestamp(date, time), ‘MMdd’)=’1003′ GROUP BY cs-uri-stem ORDER BY MaxTime DESC"
29) Daily bandwidth
logparser "Select To_String(To_timestamp(date, time), ‘MM-dd’) As Day, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByDay.gif From logs\iis\ex*.log Group By Day"
30) Bandwidth by hour
logparser "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, SUM(sc-bytes) AS TotalBytesSent INTO BytesSentPerHour.gif FROM logs\iis\ex*.log GROUP BY Hour ORDER BY Hour"
31) Average page load time per user
logparser "Select Top 20 cs-username AS UserName, AVG(time-taken) AS AvgTime, Count(*) AS Hits INTO AvgTimePerUser.txt FROM logs\iis\ex*.log WHERE cs-username IS NOT NULL GROUP BY cs-username ORDER BY AvgTime DESC"
32) Average page load time for a specific user
logparser "Select cs-username AS UserName, AVG(time-taken) AS AvgTime, Count(*) AS Hits INTO AvgTimeOnSpecificUser.txt FROM logs\iis\ex*.log WHERE cs-username = ‘CONTOSO\User1234’ GROUP BY cs-username"
33) Error trends. This query is quite long and is more easily expressed in a text file than on the command line, so Log Parser reads and executes the query contained in the specified text file.
logparser file:errortrend.sql
Where errortrend.sql contains the following:
SELECT
TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
SUM(c200) AS 200s,
SUM(c206) AS 206s,
SUM(c301) AS 301s,
SUM(c302) AS 302s,
SUM(c304) AS 304s,
SUM(c400) AS 400s,
SUM(c401) AS 401s,
SUM(c403) AS 403s,
SUM(c404) AS 404s,
SUM(c500) AS 500s,
SUM(c501) AS 501s,
SUM(c502) AS 502s,
SUM(c503) AS 503s,
SUM(c504) AS 504s,
SUM(c505) AS 505s
USING
CASE sc-status WHEN 200 THEN 1 ELSE 0 END AS c200,
CASE sc-status WHEN 206 THEN 1 ELSE 0 END AS c206,
CASE sc-status WHEN 301 THEN 1 ELSE 0 END AS c301,
CASE sc-status WHEN 302 THEN 1 ELSE 0 END AS c302,
CASE sc-status WHEN 304 THEN 1 ELSE 0 END AS c304,
CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO ErrorChart.gif
FROM
logs\iis\ex*.log
GROUP BY
Day
ORDER BY
Day
34) Win32 errors
logparser "SELECT sc-win32-status as ErrorNumber, WIN32_ERROR_DESCRIPTION(sc-win32-status) as ErrorDesc, Count(*) AS Total INTO Win32ErrorNumbers.txt FROM logs\iis\ex*.log WHERE sc-win32-status>0 GROUP BY ErrorNumber ORDER BY Total DESC"
35) Substatus codes
logparser "SELECT sc-status, sc-substatus, Count(*) AS Total INTO 401subcodes.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY sc-status, sc-substatus ORDER BY sc-status, sc-substatus DESC"
36) Substatus codes per day. This is another example of executing a query contained in a text file.
logparser file:substatusperday.sql
Where substatusperday.sql contains the following:
SELECT
TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
SUM(c1) AS 4011,
SUM(c2) AS 4012,
SUM(c3) AS 4013,
SUM(c4) AS 4014,
SUM(c5) AS 4015,
SUM(c7) AS 4017
USING
CASE sc-substatus WHEN 1 THEN 1 ELSE 0 END AS c1,
CASE sc-substatus WHEN 2 THEN 1 ELSE 0 END AS c2,
CASE sc-substatus WHEN 3 THEN 1 ELSE 0 END AS c3,
CASE sc-substatus WHEN 4 THEN 1 ELSE 0 END AS c4,
CASE sc-substatus WHEN 5 THEN 1 ELSE 0 END AS c5,
CASE sc-substatus WHEN 7 THEN 1 ELSE 0 END AS c7
INTO
401subcodesperday.txt
FROM
logs\iis\ex*.log
WHERE
sc-status=401
GROUP BY
Day
ORDER BY
Day
37) Substatus codes per page
logparser "SELECT TOP 20 cs-uri-stem, sc-status, sc-substatus, Count(*) AS Total INTO 401Pagedetails.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY cs-uri-stem, sc-status, sc-substatus ORDER BY Total"
38) MB sent per HTTP status code
logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.htm FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"
39) 500 errors per ASPX and Domain User
logparser "SELECT cs-username, cs-uri-stem, count(*) as Times INTO 500PagesByUserAndPage.txt FROM logs\iis\ex*.log WHERE sc-status=500 GROUP BY cs-username, cs-uri-stem ORDER BY Times DESC"
40) Percent of 500 errors caused by each user
logparser "SELECT cs-username, count(*) as Times, propcount(*) as Percent INTO 500ErrorsByUser.csv FROM logs\iis\ex*.log WHERE sc-status=500 GROUP BY cs-username ORDER BY Times DESC"
41) Determine what percentage of the total bytes sent are being caused by each page type
logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.txt FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"
42) Top 20 pages with a specific HTTP return code
logparser "SELECT TOP 20 cs-uri-stem, sc-status, Count(*) AS Total INTO TOP20PagesWith401.txt FROM logs\iis\ex*.log WHERE TO_LOWERCASE(cs-uri-stem) LIKE ‘%.aspx’ and sc-status=401 GROUP BY cs-uri-stem, sc-status ORDER BY Total, cs-uri-stem, sc-status DESC"
43) Check traffic from IP addresses
logparser "Select c-ip AS Client, Div(Sum(cs-bytes),1024) As IncomingBytes(K), Div(Sum(sc-bytes),1024) As OutgoingBytes(K), MAX(time-taken) as MaxTime, AVG(time-taken) as AvgTime, count(*) as hits INTO errorsperip.txt FROM logs\iis\ex*.log GROUP BY client ORDER BY Hits DESC"
44) Check errors by IP address
logparser file:errorbyip.sql
Where errorbyip.sql contains the following:
Select
c-ip AS Client,
SUM(c400) AS 400s,
sum(c401) AS 401s,
SUM(c403) AS 403s,
SUM(c404) AS 404s,
SUM(c500) AS 500s,
SUM(c501) AS 501s,
SUM(c502) AS 502s,
SUM(c503) AS 503s,
SUM(c504) AS 504s,
SUM(c505) AS 505s
USING
CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO
IPNumberFileName.txt
FROM
logs\iis\ex*.log
WHERE
c-ip='<IP address goes here>'
GROUP BY
client
45) Find broken links
logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.txt FROM logs\iis\ex*.log WHERE cs(Referer) IS NOT NULL AND sc-status=404 AND (sc-substatus IS NULL OR sc-substatus=0)"
46) Top 10 pages with most hits
logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),’/’) AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent INTO Top10Urls.txt FROM logs\iis\ex*.log GROUP BY cs-uri-stem ORDER BY TotalHits DESC"
47) Unique users per browser type (requires two queries)
logparser "SELECT DISTINCT cs-username, cs(user-agent) INTO UserAgentsUniqueUsers1.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL GROUP BY cs-username, cs(user-agent)"
logparser "SELECT cs(user-agent), count(cs-username) as UniqueUsersPerAgent, TO_INT(MUL(PROPCOUNT(*), 100)) AS Percentage INTO UniqueUsersPerAgent.txt FROM UserAgentsUniqueUsers1.csv GROUP BY cs(user-agent) ORDER BY UniqueUsersPerAgent DESC"
48) Bytes sent per file extension
logparser "SELECT EXTRACT_EXTENSION( cs-uri-stem ) AS Extension, MUL(PROPSUM(sc-bytes),100.0) AS PercentageOfBytes, Div(Sum(sc-bytes),1024) as AmountOfMbBytes INTO BytesPerExtension.txt FROM logs\iis\ex*.log GROUP BY Extension ORDER BY PercentageOfBytes DESC"
49) Domains referring traffic to your site
logparser "SELECT EXTRACT_TOKEN(cs(Referer), 2, ‘/’) AS Domain, COUNT(*) AS [Requests] INTO ReferringDomains.txt FROM logs\iis\ex*.log GROUP BY Domain ORDER BY Requests DESC"
50) OS types (requires two queries)
logparser "SELECT DISTINCT c-ip, cs(user-agent) INTOUserAgentsUniqueUsers.csv FROM logs\iis\ex*.log WHERE c-ip <> NULL GROUP BY c-ip, cs(user-agent)"
logparser file:getos.sql
Where getos.sql contains the following:
SELECT
SUM (c70) AS Win7,
SUM (c60) AS Vista,
SUM (c52) AS Win2003,
SUM (c51) AS WinXP,
SUM (C50) AS Win2000,
SUM (W98) AS Win98,
SUM (W95) AS Win95,
SUM (W9x) AS Win9x,
SUM (NT4) AS WinNT4,
SUM (OSX) AS OS-X,
SUM (Mac) AS Mac-,
SUM (PPC) AS Mac-PPC,
SUM (Lnx) AS Linux
USING
CASE strcnt(cs(User-Agent),'Windows+NT+6.1') WHEN 1 THEN 1 ELSE 0 END AS C70,
CASE strcnt(cs(User-Agent),'Windows+NT+6.0') WHEN 1 THEN 1 ELSE 0 END AS C60,
CASE strcnt(cs(User-Agent),'Windows+NT+5.2') WHEN 1 THEN 1 ELSE 0 END AS C52,
CASE strcnt(cs(User-Agent),'Windows+NT+5.1') WHEN 1 THEN 1 ELSE 0 END AS C51,
CASE strcnt(cs(User-Agent),'Windows+NT+5.0') WHEN 1 THEN 1 ELSE 0 END AS C50,
CASE strcnt(cs(User-Agent),'Win98') WHEN 1 THEN 1 ELSE 0 END AS W98,
CASE strcnt(cs(User-Agent),'Win95') WHEN 1 THEN 1 ELSE 0 END AS W95,
CASE strcnt(cs(User-Agent),'Win+9x+4.90') WHEN 1 THEN 1 ELSE 0 END AS W9x,
CASE strcnt(cs(User-Agent),'Winnt4.0') WHEN 1 THEN 1 ELSE 0 END AS NT4,
CASE strcnt(cs(User-Agent),'OS+X') WHEN 1 THEN 1 ELSE 0 END AS OSX,
CASE strcnt(cs(User-Agent),'Mac') WHEN 1 THEN 1 ELSE 0 END AS Mac,
CASE strcnt(cs(User-Agent),'PPC') WHEN 1 THEN 1 ELSE 0 END AS PPC,
CASE strcnt(cs(User-Agent),'Linux') WHEN 1 THEN 1 ELSE 0 END AS Lnx
INTO
GetOSUsed.txt
FROM
UserAgentsUniqueUsers.csv
51) Get timeout errors from the server Event Log. Display results in a datagrid.
logparser "select * from \\servername\application where message like ‘%timeout expired%’" -i:EVT -o:datagrid
52) Get exceptions from the server Event (Application) Log
logparser "select timegenerated, eventtypename, eventcategoryname, message into webserverlog.csv from \\servername\application where message like ‘%myapplication%exception%’" -i:EVT
Links
Check out the links below to find more in-depth discussion of Log Parser, as well as even more examples of its usage.
- Visit the Log Parser Forums at the official Microsoft IIS Site (despite the tool’s age, these forums still get traffic)
- Purchase the Microsoft Log Parser Toolkit at Amazon.com (a Kindle edition is also available)
- Use Microsoft’s Logparser to Analyze IIS Logs – a Log Parser article on a blog named “Linux Lore”? Odd, but it’s a great reference article nonetheless.
- 23 Log Parser examples can be found at this MSDN blog
- Here’s an article posted to the official IIS blog shortly after the tool was released in 2006