Measuring Processing Requirements
There are several difficulties when determining processing requirements of a running process on today’s commonplace PC operating systems:
- A system whose sole current task is displaying a yellow rectangle and waiting for any keypress needs to be probed, so the measurement itself dilutes the genuine workload with additional context switches.
- On modern CPUs with dynamic frequency scaling and thermal throttling, determining the occupancy of a CPU by a specific process is not trivial; the measurements reported in this article are rough estimates at best.
- The CPU impact of such a trivial program on a recent 3000 MHz CPU is barely measurable in some cases.
- Some solutions are multithreaded (notably PyGame).
- Some solutions offload processing tasks to the GPU more than others (notably OpenGL).
- On a specific note: since I decommissioned my last genuine AT PC running MS-DOS years ago, the measurement for the DOS solution could only be made on the host operating system, under an emulator of the actual CPU. Thus I have no reliable measurement of the processor requirements of the DOS solution. One could argue that, since DOS is a single-tasking operating system, every task has a CPU occupancy of 100%, but that is a debatable interpretation at best.
These are the measurement methods I applied:
- On GNU/Linux I determined an average CPU load as reported by top.
- On Windows I determined an average CPU load using a Visual Studio profiling session.