A well-established scheme for target detection in infrared (IR) surveillance systems consists of applying a suitable decision rule to images from which the background clutter has previously been removed. Background removal is accomplished by subtracting, from the original image, an estimate of the spatially varying background signal produced by a background estimation algorithm (BEA). The overall target detection performance is strongly influenced by the effectiveness of the employed BEA. In particular, the BEA and its design parameters should be chosen so as to obtain an accurate estimate of the background signal and to avoid biases caused by the possible presence of targets (target leakage). In this work, we present a novel method for selecting and tuning the best-performing background removal technique for the detection of dim point targets. The proposed procedure is based on simulating dim targets implanted in an acquired sample image representing the scenario of interest. The best-performing BEA is chosen by exploring the performance of the detection scheme over several configurations of the characteristic parameters of the candidate BEAs. The effectiveness of the BEA selection procedure is evaluated in two case studies employing real image sequences acquired by IR cameras. The results confirm the benefits of the proposed technique: with the BEA tuned according to the proposed selection criterion, the number of false alarms is reduced by up to two orders of magnitude compared with BEAs in other common configurations.
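As an illustrative sketch only (the paper's specific BEAs and decision rules are not reproduced here), the background-subtraction detection scheme described above can be outlined as follows: a hypothetical local-mean BEA estimates the spatially varying background, the estimate is subtracted from the frame, and a simple threshold decision rule is applied to the residual. The window size `win`, the threshold factor `k`, the synthetic clutter model, and the implanted target amplitude are all assumptions chosen for this sketch, not values from the paper.

```python
import numpy as np

def estimate_background(image, win=5):
    """Hypothetical BEA: estimate the slowly varying background as the
    local mean over a win x win neighborhood (edge-replicated at borders)."""
    pad = win // 2
    padded = np.pad(image, pad, mode="edge")
    bg = np.empty(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            bg[i, j] = padded[i:i + win, j:j + win].mean()
    return bg

def detect(image, win=5, k=5.0):
    """Background removal followed by a simple decision rule:
    flag pixels whose residual exceeds k times the residual's std."""
    residual = image.astype(float) - estimate_background(image, win)
    return residual > k * residual.std()

# Synthetic example: smooth spatially varying clutter plus noise,
# with one dim point target implanted (mirroring the simulation idea).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 64)
background = 100.0 + 20.0 * np.outer(x, x)       # spatially varying background
frame = background + rng.normal(0.0, 1.0, (64, 64))
frame[32, 32] += 10.0                             # implanted dim point target
mask = detect(frame, win=5, k=5.0)
print("target detected:", bool(mask[32, 32]), "| detections:", int(mask.sum()))
```

Sweeping `win` and `k` over a grid on such implanted-target frames, and counting detections versus false alarms, mirrors the kind of parameter exploration the selection procedure performs.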