Augmented Reality (AR) has been shown to benefit External Ventricular Drain (EVD) surgery by providing in-situ visual guidance during the operation. The key challenge in this procedure is estimating the spatial relationship between pre-operative images and the actual patient anatomy accurately and efficiently. Previous works have revealed trade-offs among tracking accuracy, workflow efficiency, and non-invasiveness in existing tracking pipelines. This research fully exploits the capabilities of Time-of-Flight (ToF) depth sensors, including retro-reflective tool tracking and dense surface information, to construct a convenient and accurate EVD guidance pipeline. Because previous studies have reported significant depth errors in ToF sensors, we first evaluated the feasibility of using them for surgical guidance by estimating their accuracy under different conditions and correcting this error in our pipeline. Our results show notable depth-value errors on human skin with the HoloLens 2 depth camera, underscoring the need for depth correction. The proposed depth correction method substantially reduced this error on head phantoms made of different materials. The corrected depth information is then used to reconstruct the head surface with sub-millimeter accuracy, validated on a series of 3D-printed models and a sheep head. To demonstrate the effectiveness of the proposed framework, we conducted a case study simulating EVD surgery: five surgeons each performed nine k-wire insertions on a head phantom under virtual guidance without surgical tool tracking. The results showed translational and orientational guidance accuracy competitive with previous research.
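
To make the depth-correction and surface-registration steps more concrete, the sketch below shows one plausible realization under simplified assumptions: a global linear bias model fitted against known reference distances for the ToF depth correction, and point-to-point ICP (via Open3D) to align the reconstructed head surface with the pre-operative model. The function names, the linear error model, and the use of Open3D are illustrative assumptions, not the implementation described in the paper.

```python
import numpy as np
import open3d as o3d


def fit_depth_correction(measured_mm, reference_mm):
    """Fit a linear bias model d_corr = a * d_meas + b from paired
    measurements at known reference distances (assumed calibration data)."""
    a, b = np.polyfit(np.asarray(measured_mm), np.asarray(reference_mm), deg=1)
    return a, b


def correct_depth(depth_map_mm, a, b):
    """Apply the fitted linear correction to a raw ToF depth map (in mm)."""
    depth_map_mm = np.asarray(depth_map_mm, dtype=np.float64)
    corrected = a * depth_map_mm + b
    corrected[depth_map_mm <= 0] = 0  # keep invalid pixels invalid
    return corrected


def register_surface_to_model(surface_points, model_points, voxel_mm=2.0):
    """Align the reconstructed head surface (Nx3, mm) to the pre-operative
    model surface (Mx3, mm) with point-to-point ICP; returns a 4x4 transform."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(surface_points))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(model_points))
    src = src.voxel_down_sample(voxel_mm)
    tgt = tgt.voxel_down_sample(voxel_mm)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_correspondence_distance=5.0,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```

In practice, the correction would likely be fitted per material or per distance range rather than globally, and the ICP would need a reasonable initial alignment (e.g. from the retro-reflective marker tracking mentioned above); both refinements are omitted from this sketch.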