When calibrating a machine tool, multiple measurement tasks are performed, each of which has an associated uncertainty of measurement. International Standards and best-practice guides are available to aid with estimating the uncertainty of measurement for individual tasks, but they give little consideration to the temporal influence on uncertainty when measurements are interrelated. Additionally, there is an absence of any intelligent method capable of optimising (reducing) the estimated uncertainty of the calibration plan as a whole. In this work, the uncertainty of measurement reduction problem is described and modelled in a suitable language to allow state-of-the-art artificial intelligence planning tools to produce optimal calibration plans. The paper describes how the continuous, non-linear temperature aspects are discretised and modelled to make them easier for the planner to solve. In addition, detail is provided as to how the complex uncertainty equations are modelled in a restrictive language whose syntax heavily influences the encoding. An example is shown for a three-axis machine, where the produced plan exhibits intelligent behaviour in terms of scheduling measurements against temperature deviation and the propagation of error uncertainties. In this example, intelligently scheduling the calibration plan reduces the estimated uncertainty of measurement by 58%. This reduction in the estimated uncertainty of measurement results in an increased conformance zone, thus reducing false acceptance and rejection of workpieces.
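The core idea, scheduling measurement tasks against temperature deviation so that the root-sum-square combination of task uncertainties is minimised, can be illustrated with a toy model. The coefficients, task names, and temperature profile below are hypothetical placeholders, not the paper's actual uncertainty model or planner encoding; this is a minimal sketch of the scheduling effect only.

```python
import math

def task_uncertainty(base_u, temp_coeff, temp_deviation):
    """Standard uncertainty of one task: a fixed base term combined
    (root-sum-square) with a temperature-dependent term."""
    return math.sqrt(base_u**2 + (temp_coeff * temp_deviation)**2)

def combined_uncertainty(schedule, temp_profile, tasks):
    """RSS-combine task uncertainties, using the temperature deviation
    at each task's scheduled time slot."""
    return math.sqrt(sum(
        task_uncertainty(tasks[t]["base"], tasks[t]["coeff"],
                         temp_profile[slot]) ** 2
        for t, slot in schedule.items()))

# Hypothetical tasks (uncertainties in micrometres) and a discretised
# temperature-deviation profile |T - 20 degC| over time slots, i.e. the
# kind of discretisation a planner could reason over.
tasks = {"linear_x": {"base": 1.0, "coeff": 2.0},
         "straightness_y": {"base": 0.5, "coeff": 3.0}}
temp_profile = [0.1, 0.4, 0.8, 1.2]

# A naive schedule places tasks in high-deviation slots; an informed
# one places the most temperature-sensitive tasks where deviation is low.
naive = combined_uncertainty({"linear_x": 2, "straightness_y": 3},
                             temp_profile, tasks)
smart = combined_uncertainty({"linear_x": 1, "straightness_y": 0},
                             temp_profile, tasks)
assert smart < naive
```

In this toy setting, moving the temperature-sensitive tasks to low-deviation slots shrinks the combined uncertainty; the paper's planner performs an analogous optimisation over the full calibration plan.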
Number of pages: 10
Journal: Engineering Applications of Artificial Intelligence
Publication status: Published - 2014