Affiliations 

  • 1 Universiti Sains Malaysia

Abstract

We present an algorithm to reduce the number of slices obtained from 2D contour cross sections. The main aim of the algorithm is to filter out less significant slices while preserving an acceptable level of output quality and keeping the computational cost of surface reconstruction minimal. This research is motivated mainly by two factors. First, 2D cross-section data are often huge in size and high in precision, and the computational cost of reconstructing surfaces from them is closely related to the size and complexity of the data. Second, we can trade visual fidelity for computational speed if we remove visually insignificant data from the original dataset, which may contain redundant information. In our algorithm, we use the number of contour points on a pair of slices to calculate the distance between them. The decision to retain or reject a slice is based on comparing this distance against a threshold value. An optimal threshold value is derived to produce a set of slices that collectively represents the features of the dataset. We tested our algorithm on six different datasets, varying in complexity and size. The results show that the slice reduction rate depends on the complexity of the dataset, with the highest reduction percentage achieved for objects with many constant local variations. Our derived optimal thresholds appear to produce the right set of slices, with the potential to create surfaces that trade off accuracy and speed requirements.
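The selection scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: it assumes a single greedy pass over the slice sequence and uses the absolute difference in contour point counts as a stand-in for the paper's point-count-based distance; the function name `reduce_slices` and the rule of always keeping the first and last slices are assumptions for the example.

```python
def reduce_slices(slices, threshold):
    """Greedily filter a sequence of slices (each a list of 2D contour
    points), keeping a slice only when its distance from the last
    retained slice reaches the threshold.

    Illustrative distance: absolute difference in contour point counts,
    a simple proxy for a point-count-based slice distance.
    """
    if len(slices) <= 2:
        return list(slices)

    kept = [slices[0]]                      # always keep the first slice
    for s in slices[1:-1]:
        # Distance between candidate slice and last retained slice.
        if abs(len(s) - len(kept[-1])) >= threshold:
            kept.append(s)                  # significant change: retain
        # else: reject as visually insignificant / redundant
    kept.append(slices[-1])                 # always keep the last slice
    return kept


# Example: six slices whose contours have 10, 10, 10, 25, 25, 10 points.
slices = [[(0.0, 0.0)] * n for n in (10, 10, 10, 25, 25, 10)]
reduced = reduce_slices(slices, threshold=5)
print(len(reduced))  # fewer slices survive the filtering
```

Raising the threshold rejects more slices (faster reconstruction, lower fidelity), which mirrors the accuracy/speed trade-off the abstract describes.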