We report a generalized approach to handling patient motion in robotic image-guided surgery, along with in silico and dry-laboratory tests on a neurosurgical robot setup. External patient motion events (excluding physiological motion) can occur during surgery despite body fixation and can endanger the surgical outcome, as they invalidate the registration between the robot and the patient. The core of the compensation algorithm is to combine the robot joint encoders, the tracking data and the internal coordinate frame of the intra-operative navigation system to ensure accurate execution of the pre-operative plan. The method allows continuous correction of the registration by identifying the actual surgical event in the operating room. From the control point of view, intra-operative events are categorized as robot motion, camera motion, patient motion and combinations of these. The registration update relies on the most reliable reference frame and on extending-window averaging to compensate for the patient motion that has occurred. Simulations were performed on a generic image-guided robot model and on a skull base surgery robot. Patient motion events were detected in 80% of the cases, which already yields a gradual improvement of the procedure. The proposed structure allows for a more generic, probability-based handling of operating room events that may lead to safer and more accurate surgical treatment in the future.
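To illustrate the extending-window averaging idea, the following minimal sketch (not the authors' implementation; the class, method names, and translation-only offsets are illustrative assumptions) restarts a cumulative averaging window whenever a patient-motion event is detected, so the registration correction converges as further tracking samples arrive:

```python
class ExtendingWindowRegistration:
    """Illustrative sketch: extending-window averaging of a registration
    correction. Offsets are simplified to 3-D translations (e.g., tracked
    patient-marker displacement in mm); a full system would average rigid
    transforms (rotation and translation)."""

    def __init__(self):
        self._sum = [0.0, 0.0, 0.0]   # per-axis sum of offsets since last event
        self._count = 0               # number of samples in the window

    def reset(self):
        # Called when a patient-motion event is detected:
        # discard pre-event samples and restart the window.
        self._sum = [0.0, 0.0, 0.0]
        self._count = 0

    def update(self, offset):
        # Fold one tracked offset into the extending window
        # and return the current correction estimate.
        self._sum = [s + o for s, o in zip(self._sum, offset)]
        self._count += 1
        return self.estimate()

    def estimate(self):
        # Mean over all samples since the last event;
        # zero correction before any sample has arrived.
        if self._count == 0:
            return [0.0, 0.0, 0.0]
        return [s / self._count for s in self._sum]
```

For example, feeding noisy samples around a true 2 mm shift along one axis yields a correction that settles at the sample mean, and `reset()` models the detection of a new motion event.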