The introduction of Variable Speed Drive (VSD) motor-driven systems in industry is driven by the desire to increase motor efficiency in the plant. The expected efficiency savings are usually determined by an initial energy assessment that considers factors such as the motor load type and operating conditions, and may include measurement of the actual motor load. However, once the system is installed and in operation, the designed-in energy efficiency of these systems may remain unchecked throughout the lifetime of the installation. Efficiency reductions may be caused by mechanical or electrical degradation of equipment that remains undetected by the drive or the user whilst the equipment appears to operate normally. On larger systems, the financial cost of reduced efficiency can be significant. The aim of this paper is to simulate minor deteriorations in the operating conditions of a standard motor controlled by a VSD and to ascertain whether the worsening condition can be detected at an early stage. The deterioration in motor condition is kept small enough to remain undetected by the VSD and not to trigger a drive fault. The paper also reviews the effect of the introduced motor imbalance on motor efficiency and introduces power factor measurement methods, which can be a useful indicator of increased operating costs for equipment. Test results from the two drive operating modes, Volts/Hertz (V/f) and Sensorless Vector (SV), are compared to determine whether there is any noticeable difference between the modes in the efficiency and power factor measurements obtained.
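Since the abstract leans on power factor as an indicator of operating cost, the underlying calculation can be sketched as follows. This is a generic illustration, not the measurement method of the paper: true power factor is the ratio of real power to apparent power, PF = P / (Vrms · Irms), computed here from simultaneously sampled voltage and current waveforms. The sample values (a 325 V peak sinusoid with current lagging by 30°) are purely illustrative.

```python
import math

def power_factor(v_samples, i_samples):
    """True power factor from simultaneously sampled voltage and
    current waveforms over an integer number of cycles:
    PF = P / (Vrms * Irms)."""
    n = len(v_samples)
    # Real power: mean of the instantaneous power v(t) * i(t)
    p = sum(v * i for v, i in zip(v_samples, i_samples)) / n
    vrms = math.sqrt(sum(v * v for v in v_samples) / n)
    irms = math.sqrt(sum(i * i for i in i_samples) / n)
    return p / (vrms * irms)

# Illustrative example: one cycle of a sinusoidal voltage with the
# current lagging by 30 degrees (displacement angle phi = pi/6)
n = 1000
v = [325.0 * math.sin(2 * math.pi * k / n) for k in range(n)]
i = [10.0 * math.sin(2 * math.pi * k / n - math.pi / 6) for k in range(n)]
pf = power_factor(v, i)  # for pure sinusoids this equals cos(phi) ≈ 0.866
```

For purely sinusoidal waveforms this reduces to the displacement power factor cos(φ); with the distorted currents drawn by a VSD front end, the sample-based ratio also captures the harmonic contribution, which is why it is the more informative quantity for cost monitoring.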