Answers to: How to include parts of live log files into websitehttp://linuxexchange.org/questions/390/how-to-include-parts-of-live-log-files-into-website<p>How would I go about having Apache pull some lines from log files to be displayed on a website?</p> <p>I want to pull the last 5 visitors' IPs (possibly while censoring them slightly) and the last 3 IPs to be banned for trying to connect via SSH (which is currently blocked using fail2ban).</p> <p>I really have no clue where to start with this, other than that I already have Apache up and running and my website is hosted in what can be thought of as a "normal" configuration.</p> <p>The log files are in /var/log/ and I can control access to them.</p>enSat, 08 May 2010 13:55:57 -0400Answer by Web31337http://linuxexchange.org/questions/390/how-to-include-parts-of-live-log-files-into-website/394<p>I haven't used apache httpd in a long time, but if each log line begins with an IP, you can use </p> <pre><code>tail -n 5 /path/to/log | awk -F. '{print $1"."$2"."$3".x"}' </code></pre> <p>to extract the last 5 IPs with the last octet hidden, or just</p> <pre><code>tail -n 5 /path/to/log | awk '{print $1}' </code></pre> <p>to extract the last 5 IPs unmasked. This could run as a cron script that generates an HTML snippet to be included in your plain HTML document. 
I can't describe the same process for fail2ban, but I guess it's much the same; you'll just have to read the manuals for tools like <em>sed</em> and <em>awk</em> in order to parse the logs with a shell script.</p> <p>Just remember: if your logs contain resolved rDNS hostnames instead of IPs, it may be dangerous to display them on a website; who knows what a hostname could be set to?</p>Web31337Sat, 08 May 2010 13:55:57 -0400http://linuxexchange.org/questions/390/how-to-include-parts-of-live-log-files-into-website/394Answer by gregularexpressionshttp://linuxexchange.org/questions/390/how-to-include-parts-of-live-log-files-into-website/392<p>My first thought is that the logs in /var/log are probably owned by root, so your web applications won't be able to read them (you can confirm with <code>ls -al</code>).</p> <p>You could use a (Perl/PHP/shell) script (cron'd as root) to extract the last X IPs from /var/log/httpd/access_log and /var/log/fail2ban.log and write them to a file owned by Apache (or the site user if you're using suPHP/suEXEC), which could then be read by the site's pages and dynamically included.</p> <p>I can't speak much regarding PHP, but if you use shell or Perl, <code>awk</code> will help you get just the IPs.</p> <p>Log excerpt: </p> <blockquote> <p>221.192.199.xx - - [08/May/2010:08:32:22 +0100] "GET <a href="http://www.wantsfly.com/prx2.php?hash=6039A91133E74FD3D454BB8200505327F2E95B294F70" rel="nofollow">http://www.wantsfly.com/prx2.php?hash=6039A91133E74FD3D454BB8200505327F2E95B294F70</a> HTTP/1.0" 404 287 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"<br/> 221.192.199.xx - - [08/May/2010:08:57:01 +0100] "GET <a href="http://www.wantsfly.com/prx2.php?hash=6039A91133E74FD3D454BB8200505327F2E95B294F70" rel="nofollow">http://www.wantsfly.com/prx2.php?hash=6039A91133E74FD3D454BB8200505327F2E95B294F70</a> HTTP/1.0" 404 287 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"<br/> 94.232.9.xx - - [08/May/2010:12:37:04 +0100] "GET /w00tw00t.at.ISC.SANS.DFind:) HTTP/1.1" 400 310 "-" "-"</p> </blockquote> <p><code>tail -10 access_log | awk -F"-" '{print $1}'</code></p> <pre><code>221.192.199.xx
94.232.9.xx
221.192.199.xx
221.192.199.xx
94.232.9.xx
</code></pre>gregularexpressionsSat, 08 May 2010 13:15:10 -0400http://linuxexchange.org/questions/390/how-to-include-parts-of-live-log-files-into-website/392
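<p>Putting the two answers together, a minimal sketch of such a cron script could look like the following. The function name <code>render_recent</code>, the log paths, and the fail2ban line format (lines ending in <code>Ban &lt;IP&gt;</code>) are assumptions; check your own logs for the exact layout before relying on it.</p>

```shell
#!/bin/sh
# Sketch combining both answers: render the last 5 visitor IPs (last
# octet masked) and the last 3 fail2ban SSH bans as an HTML snippet.
# Paths and the fail2ban line format are assumptions -- adjust for
# your distribution.

render_recent() {
    access_log=$1   # e.g. /var/log/httpd/access_log
    f2b_log=$2      # e.g. /var/log/fail2ban.log

    echo '<h3>Last 5 visitors</h3><ul>'
    # Assumes each access-log line starts with the client IP;
    # print the first three octets and mask the fourth.
    tail -n 5 "$access_log" | awk -F. '{print "<li>"$1"."$2"."$3".x</li>"}'

    echo '</ul><h3>Last 3 SSH bans</h3><ul>'
    # fail2ban ban lines typically end with the banned IP, e.g.
    #   2010-05-08 12:37:04,123 fail2ban.actions: WARNING [ssh] Ban 94.232.9.1
    # so the IP is the last field of lines containing " Ban ".
    grep ' Ban ' "$f2b_log" | tail -n 3 | awk '{print "<li>"$NF"</li>"}'
    echo '</ul>'
}

# Cron'd as root, this could write a file Apache is allowed to read:
# render_recent /var/log/httpd/access_log /var/log/fail2ban.log > /var/www/html/recent.html
```

<p>Run from root's crontab every few minutes, the generated snippet can then be included in the site's pages, as suggested above.</p>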