Here's a little script you can use if you have Squid and PHP.
It will tell you what downloads are currently running through your Squid server. It is useful for monitoring large downloads (it will usually not report small downloads such as web pages or images, or anything that loads in less than 3 seconds - or whatever you have set $interval to).
It will also tell you the bandwidth used by each download and the host that requested it. The list of downloads is sorted in descending order of bandwidth.
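To make the mechanism concrete: the script samples each request's byte counter twice, $interval seconds apart, and divides the difference by the interval. A standalone sketch of that arithmetic (the byte counts here are made up):

```shell
# Two snapshots of a request's byte counter, taken $interval seconds apart
# (hypothetical values for illustration)
bytes1=120000
bytes2=135360
interval=3
# bandwidth in KB/s = (bytes2 - bytes1) / interval / 1024
kbps=$(awk -v a="$bytes1" -v b="$bytes2" -v i="$interval" \
  'BEGIN { printf "%.2f", (b - a) / i / 1024 }')
echo "$kbps KB/s"
```

This is exactly what the $Bps calculation in the script below does, just outside PHP.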
It is useful for finding out who is hijacking your Squid bandwidth, and what for.
It is my own work.
=========== WARNING! ===========
The forum would not let me post the correct spelling of the shell_exec function (probably a security measure). If the script below still spells it "shell_exxec", find every occurrence and remove one "x" before running it.
Thanks.
==============================
#!/usr/bin/php -q
<?php
// download monitor for squid
// written by rolf - email: rode at [remove text between brackets] dergham dot com

$interval = 3; // sampling interval in seconds
$results  = 6; // number of results shown
$squidclient_cmd = '/usr/sbin/squidclient -h 192.168.0.1 cache_object://192.168.0.1/active_requests'; // the squidclient command used to retrieve data from squid

// Split the raw squidclient output into per-connection blocks and extract
// the URI, the requesting peer and the byte counter of each request.
function parse_output ($active_requests) {
    $active_requests2 = array();
    $active_requests  = explode("\n\n", $active_requests);
    foreach ($active_requests as $key => $block) {
        $active_requests[$key] = explode("\n", $block);
    }
    array_splice($active_requests[0], 0, 10); // remove header
    array_pop($active_requests);
    foreach ($active_requests as $block) {
        $key = substr($block[0], 12);
        $active_requests2[$key]['uri']  = substr($block[8], 4);
        $active_requests2[$key]['peer'] = substr($block[4], 7);
        $out   = explode(',', $block[10]);
        $bytes = explode(' ', trim($out[0]));
        $active_requests2[$key]['bytes'] = $bytes[1];
    }
    return $active_requests2;
}

$output1 = $output2 = $result = array(); // init

// Take two snapshots of the active requests, $interval seconds apart.
$output1 = parse_output(shell_exec($squidclient_cmd));
sleep($interval);
$output2 = parse_output(shell_exec($squidclient_cmd));

// Compute the transfer rate of every request present in both snapshots,
// skipping the monitor's own cache_object:// requests.
foreach ($output2 as $key => $block) {
    if (isset($output1[$key]) and substr($block['uri'], 0, 15) != 'cache_object://') {
        $Bps = ($output2[$key]['bytes'] - $output1[$key]['bytes']) / $interval;
        $result_key   = $Bps.'-'.$key;
        $result_block = $block['uri']."\r\n".$block['peer']."\t".round($Bps / 1024, 2)." KB/s";
        $result[$result_key] = $result_block;
    }
}

krsort($result, SORT_NUMERIC); // fastest downloads first
$result = array_slice($result, 0, $results);

echo "=======================================\r\n";
if (count($result) > 0) {
    echo implode("\r\n", $result);
    echo "\r\n";
}
else {
    echo "No downloads.\r\n";
}
echo "=======================================\r\n";
?>
You need to make sure that the "squidclient" command bundled with squid is properly configured and working.
To test it you can issue the command
squidclient mgr:info
or
squidclient -h YOUR_SQUID_IP -p YOUR_SQUID_PORT mgr:info
If you get an error message, make sure that you have added YOUR_SQUID_IP:YOUR_SQUID_PORT to /etc/squid/cachemgr.conf
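For reference, here is a minimal sketch of the relevant configuration (Squid 2.5 defaults; the paths, addresses and ports are examples - adjust them for your setup):

```
# /etc/squid/cachemgr.conf - add the cache you want to query, as host:port
192.168.0.1:3128

# /etc/squid/squid.conf - cache manager ACLs (Squid 2.5 defaults;
# loosen "localhost" if you run squidclient from another machine)
acl manager proto cache_object
http_access allow manager localhost
http_access deny manager
```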
If it still doesn't work... Google is your friend.
Also edit the beginning of the above script so that it contains the correct squidclient command (see $squidclient_cmd).
This script was developed using Squid 2.5 STABLE 10. Let me know whether or not it works on any other Squid version... I can adapt it to other versions if required.
Anyway, if all works you should get output similar to this (in this case there are only 2 downloads, but it will show up to 6, or whatever you set $results to):
[root@summer root]# download_monitor
=======================================
http://us.download.nvidia.com/Windows/nForce/nTune/5.05.38.00/5.05.38.00_ntune_winxp_international.exe
192.168.0.126:1561 5.18 KB/s
http://www.lebgeeks.com/forums/
192.168.0.126:1573 0 KB/s
=======================================
(yes, I know, I shouldn't be logged in as root :) )
I am considering adding some command-line parameters to the script to make it more flexible and to enable further filtering and formatting of the results. For example:
- filtering results by host
- changing the number of results from the command line
- a one-line-per-result option so that you can grep the output
Your feedback is appreciated.
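As an illustration of the last point, here is what a hypothetical one-line-per-result format (tab-separated URL, peer and rate - all values here are invented) would buy you in terms of grep-ability:

```shell
# Sample of a proposed one-line-per-result output, filtered by host with grep
matches=$(printf '%s\n' \
  'http://example.com/big.iso	192.168.0.126:1561	5.18 KB/s' \
  'http://example.com/index.html	192.168.0.126:1573	0 KB/s' \
  'http://example.org/other.zip	192.168.1.14:1504	4.24 KB/s' \
  | grep -F '192.168.0.126')
echo "$matches"
```

With one record per line, standard tools like grep, sort and awk can do the per-host filtering without any changes to the script itself.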
Last edited by rolf (June 24 2007)
oh great rolf
someone is geeking
don't accuse me of being obsessed with databases
but it would be really nice if you could add a table to save the activity and logs
and then later you could view some reports and such
like the most downloaded item, what times users download the most, and so on
what a given host downloaded, its history, etc.
so if you like the idea tell me... maybe I can give you a hand and continue it if you don't have the energy for it
Not a bad idea mir, but for reports I'm using "squint", which parses Squid log files and gives you user activity reports.
Here is an online demo of the reports that it generates.
http://sqnt.sourceforge.net/example/index.html
As you can see, it does all this, so no need to reinvent the wheel...
BTW, you are obsessed with databases
Last edited by rolf (June 25 2007)
For those who don't use the default port (3128): you have to add "-p port" to the squidclient command (/usr/sbin/squidclient -h 192.168.0.1 -p port .....)
If you're interested I now have an ncurses version of that script.
Here's what the interactive ncurses version looks like.
I added a "truncate" function to truncate URLs to the screen width. It can be toggled by pressing t.
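The truncation itself can be sketched like this (my own illustration with a made-up URL, not rolf's actual function): keep the first (width - 4) characters and append " ..." as a marker, matching the sample output below.

```shell
# Truncate a long URL to a given screen width, marking the cut with " ..."
url='http://example.com/some/very/long/path/with/a/long/query?foo=bar&baz=qux'
width=64
if [ "${#url}" -gt "$width" ]; then
  short="$(printf '%s' "$url" | cut -c1-$((width - 4))) ..."
else
  short=$url
fi
echo "$short"
```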
I have also fixed an issue where you'd sometimes get negative bandwidth values.
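One plausible guard against negative values (my own assumption about the fix, not necessarily the actual code): if the byte counter went backwards between the two samples - e.g. a connection slot was reused by a new request - clamp the computed rate to zero instead of printing a negative number.

```shell
# Hypothetical counter values: the second sample is smaller than the first
bytes1=50000
bytes2=12000
interval=2
bps=$(( (bytes2 - bytes1) / interval ))
# clamp negative rates to zero
if [ "$bps" -lt 0 ]; then bps=0; fi
echo "${bps} B/s"
```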
Let me know if you're interested and I'll post the code.
Squid download monitor. Press Ctrl-C to end.
Probing every 2 seconds. Change with +/-
truncate:on paused
================================================================
http://www.symantec.com/security_response/writeup.jsp?docid= ...
192.168.1.14:1504 4.24 KB/s
http://www.lebanese-forces.org/vbullet/showthread.php?p=3518 ...
192.168.0.116:1505 1.41 KB/s
http://www.symantec.com/css/20070313/symantec2.css
192.168.1.14:1506 0 KB/s
================================================================
Last edited by rolf (June 26 2007)