We present a laser delay control system based on adaptive averaging that exploits the jitter noise of the laser to stabilise the delay more precisely. The system comprises delay lines to measure and control the laser delay, and a microcontroller that runs our control algorithm. The algorithm regulates the laser delay on the basis of the average of the detected delay values, where both the step by which the delay is varied and the averaging length are chosen adaptively, depending on the distance from the target delay. Our complementary numerical simulations show that the jitter of the laser may play a beneficial role here: the error of the delay has a distinct minimum at a non-zero noise level. In a manner similar to the dithering principle applied in analogue-to-digital conversion, averaging the noise-modulated detection instances yields a precision in setting the delay that lies well beyond the resolution of the detection time windows and close to the theoretical limit set by the step size of the delay line we applied.
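The dithering effect described above can be illustrated with a minimal numerical sketch. The code below is not the authors' simulation; it is a toy model under assumed parameters (a detection window width `q`, a true delay that is not a multiple of `q`, and Gaussian jitter). Without noise, the quantised detector always reports the same window, so the estimate is stuck at the quantisation error; with jitter of roughly half a window width, averaging many detections recovers the delay to a precision far below the window width.

```python
import random

def quantize(t, q):
    """Detector reports the delay rounded to discrete time windows of width q."""
    return round(t / q) * q

def estimate_delay(true_delay, jitter_sigma, q, n_samples, rng):
    """Average n quantised, jitter-modulated detections (dithering)."""
    samples = [quantize(true_delay + rng.gauss(0.0, jitter_sigma), q)
               for _ in range(n_samples)]
    return sum(samples) / n_samples

rng = random.Random(42)
q = 1.0            # detection window width (arbitrary units, assumed)
true_delay = 0.3   # true delay, deliberately not a multiple of q
n = 10_000         # averaging length

# Zero jitter: every detection falls in the same window, estimate is biased.
err_no_noise = abs(estimate_delay(true_delay, 0.0, q, n, rng) - true_delay)

# Jitter comparable to the window width: averaging beats the window resolution.
err_dither = abs(estimate_delay(true_delay, 0.6 * q, q, n, rng) - true_delay)

print(f"error without noise: {err_no_noise:.4f}")
print(f"error with dithering noise: {err_dither:.4f}")
```

With zero noise the error equals the fixed quantisation offset (0.3 here), while with jitter the averaged estimate lands within a few hundredths of a window width, mirroring the distinct error minimum at non-zero noise reported in the abstract.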