caseih wrote: I'm guessing you're talking about percent CPU usage, correct?
Yes.
you could implement a moving average or two.
I am doing that with the average session load. Polling is done at 8 Hz. The oldest of the eight readings is dropped, a new reading is taken and the average of the 8 is calculated. The result is displayed at 2 Hz. My logic was to try to smooth out some of the volatility.
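In case it helps to see it, here is a minimal sketch of that 'average of 8' idea. The names and the circular buffer are only illustrative, not my actual code:

    ' 8-sample moving average: each new reading overwrites the oldest one
    Dim Shared As Double readings(0 To 7)
    Dim Shared As Integer head

    Sub AddReading(ByVal load As Double)
        readings(head) = load            ' drop the oldest, keep the newest
        head = (head + 1) Mod 8
    End Sub

    Function AverageLoad() As Double
        Dim As Double total
        For i As Integer = 0 To 7
            total += readings(i)
        Next
        Return total / 8
    End Function

AddReading would be called at 8 Hz, so the buffer always holds the last second of readings; AverageLoad is what gets shown at 2 Hz.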
Task Manager, Process Explorer and Process Hacker all have the option to choose different refresh rates, up to 10 s with the latter two. What none of them is clear about is the polling rate employed. A polling rate and a display rate need not be the same. Mine are not: the polling rate is 8 Hz and the display rate is 2 Hz.
Your suggestion of using large intervals got me wondering what the above tools would look like with large intervals. Since we are examining CPU load, i.e. % CPU usage, we clearly need at least two readings. If we 'dial in' the slowest refresh rate with any of the above tools there is a corresponding delay before the first readings are displayed. On the face of it, then, none of them employ any conditioning, but I am. The Performance Monitor also has a display delay corresponding to the refresh rate.
My next tweak was to synchronise the polling. If we poll at the system interrupt then, when we subtract two readings, we 'eliminate' the interrupt's resolution, so it does not matter whether the resolution is 1 ms or 16 ms or anything in between.
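To make the 'two readings' point concrete, here is a hedged sketch of getting % CPU load from the difference of two snapshots. I use GetSystemTimes purely for illustration - it is not necessarily the counter I actually poll, and the synchronising-to-the-interrupt part is not shown:

    #include "windows.bi"

    ' Turn a FILETIME (100 ns units) into a 64-bit integer.
    Function FtTo64(ByRef ft As FILETIME) As ULongInt
        Return (CULngInt(ft.dwHighDateTime) Shl 32) Or ft.dwLowDateTime
    End Function

    Dim As FILETIME idle1, kern1, user1, idle2, kern2, user2
    GetSystemTimes(@idle1, @kern1, @user1)
    Sleep 1000                                   ' one polling interval
    GetSystemTimes(@idle2, @kern2, @user2)

    ' Kernel time includes idle time, so busy = (kernel + user) - idle.
    Dim As ULongInt idleD  = FtTo64(idle2) - FtTo64(idle1)
    Dim As ULongInt totalD = (FtTo64(kern2) - FtTo64(kern1)) + (FtTo64(user2) - FtTo64(user1))
    Print "CPU load %: "; 100 * (totalD - idleD) / totalD

Because only the differences of the two snapshots are used, the resolution of the underlying tick falls out of the arithmetic, which is the point of the synchronising tweak above.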
When we have a tweaking regime it is sometimes worthwhile to halt the proceedings and ask if it is necessary to keep all the tweaks made to date. Some early tweaks could be dispensed with by virtue of later tweaks. Microsoft do this with consolidating updates. This happens with tax law where a consolidating tax act is introduced after several 'tweaking budgets'.
In my case I questioned the early 'average of 8' tweak, which preceded the later synchronising tweak. To my mind there is no case for not using the synchronising tweak. The 'average of 8' tweak needs fast polling, so I then questioned the fast polling itself.
We still have the issue of volatility. Contrary to my initial approach of faster sampling, I started thinking about digital sound sampling for audio CDs: the slower the sampling, the lower the apparent volatility. However, a slow sample rate is not a smoothing mechanism; a slow sample rate rejects information. The slower the sampling rate, the more important the Max Load metric becomes. We need to strike a balance.
There is a statistical method worth looking at, and that is confidence intervals. Here we would have not just an average but an interval encompassing that average, where we can say something like, "We are 95% confident that the CPU load is in the interval 18 - 22." With a balanced interval I would prefer to write that as 20 ± 2. Looking at the time graphs of the Performance Monitor, it seems to me that we should use balanced intervals or, in other words, two-sided confidence limits.
I do not want to bore you, but for 30 or fewer samples we should use the t-distribution, and for more than 30 samples we can use the normal distribution for our confidence calculations. I have opted for the normal distribution and am now sampling, and displaying, at one-second intervals. So we need to wait 30 seconds before accepting the confidence interval.
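For the record, the two-sided 95% interval under the normal approximation is mean ± 1.96 × s / sqrt(n), where s is the sample standard deviation and n = 30 one-second samples. A minimal sketch of that calculation (the function name and parameters are only illustrative):

    ' 95% two-sided confidence interval from the last n samples,
    ' normal approximation (z = 1.96). Returns the ± half-width.
    Function Confidence95(samples() As Double, ByRef lo As Double, ByRef hi As Double) As Double
        Dim As Integer n = UBound(samples) - LBound(samples) + 1
        Dim As Double mean, ssq
        For i As Integer = LBound(samples) To UBound(samples)
            mean += samples(i)
        Next
        mean /= n
        For i As Integer = LBound(samples) To UBound(samples)
            ssq += (samples(i) - mean) ^ 2
        Next
        Dim As Double sd = Sqr(ssq / (n - 1))        ' sample standard deviation
        Dim As Double halfWidth = 1.96 * sd / Sqr(n)
        lo = mean - halfWidth
        hi = mean + halfWidth
        Return halfWidth
    End Function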
I have coded the above and get this with one of dodicat's masterpieces. Over a session of 120 seconds we have a current load of 3.13%, a max load of 5.66%, and can say, with 95% confidence, that the load is between 2.12% and 4.56%. The ± 1.22 may be regarded as an indication of the volatility. Since the max load lies outside the limits, it can be regarded as unusual and was not sustained for very long.
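For completeness, the ± figure is just the half-width of that interval: (4.56 - 2.12) / 2 = 1.22, i.e. the interval is 3.34 ± 1.22.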
Here is a snapshot of the Performance Monitor, sampling at one-second intervals, for dodicat's program. The graph is not load but processor time. This is typical of a preemptive multitasking operating system - all appears calm and collected on the surface, but under the bonnet we have absolute mayhem, a bit like leaving the relativistic world for a visit to the quantum world. <smile>
I have tried this on a few loads and things are looking good.
I need some time to look at the 'paul doe' post and really like the idea of historical data having less credence with the passage of time.
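If I do go down that road, one common way of realising the idea - just a sketch, not necessarily what paul doe has in mind - is an exponentially weighted moving average, where each reading's weight decays as it ages; alpha here is a hypothetical smoothing factor:

    Const alpha As Double = 0.2      ' hypothetical smoothing factor, 0 < alpha <= 1
    Dim Shared As Double ewma

    Sub UpdateEwma(ByVal load As Double)
        ' each update discounts all previous readings by (1 - alpha)
        ewma = alpha * load + (1 - alpha) * ewma
    End Sub

With alpha = 0.2, each new reading carries 20% of the weight and every older reading is discounted by a further factor of 0.8 at each update.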
Thanks for the feedback, guys. I had 'called it a day' earlier, but it looks like I'll be at it a bit longer. I don't mind because I am enjoying myself. I should remind you that this thread is a "result of a conversation between dodicat and myself in another thread", and that thread was about graphics. <laugh>