This paper presents a new lightweight dynamic control strategy for VSLAM. The strategy uses sensor motion to adapt, at runtime, the control parameters of both the algorithm and the computing device. We evaluate it on two platforms, a desktop and a mobile processor, in the context of both direct and indirect keyframe-based VSLAM algorithms (DSO and ORB-SLAM, respectively), using a control metric based on the change in camera pose over the trajectory of the sensor. As control parameters, the strategy uses DVFS (Dynamic Voltage and Frequency Scaling) on the device and a frame-skipping technique that attempts to identify and skip frames contributing little to the accuracy of the VSLAM algorithm. We present results from execution on a number of synthetic and real scenes taken from the ICL-NUIM and EuRoC MAV datasets, respectively, illustrating the power savings and the impact on accuracy and robustness. Our results show a best-case power reduction of 75% with marginal impact on the accuracy and robustness of the VSLAM algorithms over multiple runs of most scenes, compared to the original real-time versions of the algorithms. Analysis of the scenes that show the greatest impact on robustness indicates the cause lies at a few critical points in the trajectory, which motivates the continued search for improved control metrics.
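To make the pose-change control metric concrete, the sketch below shows one plausible way a frame-skipping decision could be driven by relative camera motion. This is an illustrative assumption, not the paper's implementation: the function names, the use of 4x4 homogeneous pose matrices, and the threshold values are all hypothetical.

```python
import numpy as np

def pose_change(T_prev, T_curr):
    """Relative motion between two 4x4 camera poses (hypothetical helper):
    returns translation distance and rotation angle in radians."""
    T_rel = np.linalg.inv(T_prev) @ T_curr
    trans = np.linalg.norm(T_rel[:3, 3])
    # Rotation angle recovered from the trace of the relative rotation matrix.
    cos_theta = np.clip((np.trace(T_rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot = float(np.arccos(cos_theta))
    return trans, rot

def should_skip(T_prev, T_curr, trans_thresh=0.01, rot_thresh=0.02):
    """Skip the incoming frame when estimated camera motion is small,
    i.e. the frame likely adds little to VSLAM accuracy.
    Thresholds are illustrative placeholders, not values from the paper."""
    trans, rot = pose_change(T_prev, T_curr)
    return trans < trans_thresh and rot < rot_thresh
```

The same motion estimate could also select a DVFS operating point, lowering frequency when the camera is nearly static and raising it during fast motion.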