Re: microsecond event scheduling in an application

From: Robert Hancock
Date: Sun Sep 20 2009 - 14:26:18 EST


On 09/08/2009 08:27 AM, Junhee Lee wrote:
> I am working on an event scheduler that handles events at microsecond
> resolution. This program is actually a network emulator built from
> simulation code, and I would like the emulator to behave the same way
> the simulation does, so high-resolution timer interrupts are required.
> But getting high-resolution timer interrupts by raising the tick
> frequency (the jiffies clock) would hurt system performance.
> Are there any suggestions or ways to support microsecond event
> scheduling without degrading performance?

Just increasing HZ will degrade performance, yes, but we have hrtimers now, which can provide granularities smaller than one jiffy, so raising HZ shouldn't be needed.
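
For reference (not part of the original mail): hrtimers also back the user-space high-resolution timer interfaces, so an application-level emulator can get sub-jiffy wakeups without touching HZ. Below is a minimal sketch using timerfd, assuming a kernel built with CONFIG_HIGH_RES_TIMERS and a 100 microsecond period chosen only for illustration:

    #include <stdio.h>
    #include <stdint.h>
    #include <unistd.h>
    #include <time.h>
    #include <sys/timerfd.h>

    int main(void)
    {
            /* 100 us periodic timer; backed by hrtimers, so the period
             * is not rounded up to a whole jiffy. */
            struct itimerspec its = {
                    .it_interval = { .tv_sec = 0, .tv_nsec = 100 * 1000 },
                    .it_value    = { .tv_sec = 0, .tv_nsec = 100 * 1000 },
            };
            uint64_t expirations;
            int i, fd;

            fd = timerfd_create(CLOCK_MONOTONIC, 0);
            if (fd < 0 || timerfd_settime(fd, 0, &its, NULL) < 0)
                    return 1;

            for (i = 0; i < 10; i++) {
                    /* read() blocks until the next expiry and reports how
                     * many periods have elapsed since the last read. */
                    if (read(fd, &expirations, sizeof(expirations)) !=
                        sizeof(expirations))
                            break;
                    printf("wakeup %d: %llu expiration(s)\n", i,
                           (unsigned long long)expirations);
            }
            close(fd);
            return 0;
    }

The timerfd descriptor can also be dropped into the emulator's existing poll()/epoll loop. Accuracy still depends on scheduling latency, so some jitter per wakeup is to be expected on a stock kernel.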