Paul Matthews wrote:
>
> I do a daily aggregation of the arts files collected for all my routers by
> running a script which runs:-
> artsagg -q -T netm,intfm ag.$EXTENSION $AG_FILE_NAME
> Further, each month I run artsagg again on all my daily files to aggregate
> them down to a single file.
> Up till now this has been working fine, but I am now running into issues
> with some of the busier routers: I get an out-of-memory message and the
> output file is (probably) incomplete.
> Clearly there is a scaling issue when trying to aggregate so many files
> in one pass.
>
> What I want is one summary file at the end of each month for my trend
> analysis.
> Will artsagg support a more cumulative aggregation, i.e. can I aggregate
> using the previous aggregation file as both input and output? Something like:-
> artsagg -T netm,intfm -o ag.monthly ag.monthly ag.daily ???
>
> Any suggestions would be appreciated.
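(Spelled out as a script, the cumulative step proposed above might look
like the sketch below. The ag.monthly.tmp rename and the file names are
illustrative assumptions; whether artsagg can safely read and write the
same file in one invocation is exactly the open question:)

    #!/bin/sh
    # Hypothetical cumulative rollup: fold today's daily file into the
    # running monthly file. Writing to a temp file and then renaming
    # avoids ever opening ag.monthly as both input and output.
    DAILY=ag.daily
    MONTHLY=ag.monthly
    if [ -f "$MONTHLY" ]; then
        artsagg -T netm,intfm -o "$MONTHLY.tmp" "$MONTHLY" "$DAILY" &&
            mv "$MONTHLY.tmp" "$MONTHLY"
    else
        # The first daily file of the month simply seeds the monthly file.
        cp "$DAILY" "$MONTHLY"
    fi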
Hi Paul,
I have tried the cumulative aggregation with some success. My opinion
is that it helps "a bit". In my particular situation I was producing
arts files with 5-minute aggregation, and at the end of each day I
wanted an arts file with daily aggregation. I started by performing the
aggregation in one step; on the busier collectors this took a lot of
memory and they would end up swapping. I found that going from 5-minute
to 1-hour aggregation, and then to daily, was actually quicker and used
less memory. I'm not sure (logically) how that can be, but I put it
down to quirkiness in the memory handling of cflowd rather than the act
of aggregating stats over time per se. It's worth experimenting with, I
think, unless you have memory to burn. My artsagg processes were
consuming > 800MB at times!
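For reference, my staged run is driven by something along the lines of
the sketch below. The file names (arts.HHMM for the 5-minute files,
hourly.HH for the intermediates) are purely illustrative, and the flags
are just the ones already used in this thread:

    #!/bin/sh
    # Hypothetical staged rollup: fold each hour's twelve 5-minute
    # files into an hourly file, then the 24 hourly files into one
    # daily file, so no single pass touches all 288 inputs.
    HOUR=0
    while [ $HOUR -le 23 ]; do
        HH=`printf '%02d' $HOUR`
        artsagg -q -T netm,intfm -o hourly.$HH arts.$HH??
        HOUR=`expr $HOUR + 1`
    done
    artsagg -q -T netm,intfm -o ag.daily hourly.??

My guess is that each pass holding only a couple of dozen files' worth
of state is what keeps the peak memory down, but as I said above I
can't prove that's the mechanism.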
-Martin
--
Expert carrier network traffic analysis and visualisation
http://www.gadgets.co.nz/products.shtml
xenaphobia: The fear of being beaten to a pulp by a leather-clad,
New Zealand woman