Thursday, September 20, 2007

A curvature estimation for pen input segmentation in sketch-based modeling

by

Dae Hyun Kim and Myoung-Jun Kim

Summary

The paper presents a new curvature estimation method based on local convexity and local monotonicity. The authors note that good curvature estimation is a prerequisite for proper stroke segmentation, and they identify two classes of prior approaches to this goal: scale-space decomposition (as in Sezgin) and estimating curvature directly from the input data. Their contribution is a new means of determining the region of support, improving on the standard sliding-window method. They provide two algorithms: one estimates curvature using local shape information (monotonicity), and the other estimates curvature using local convexity. They were able to achieve a success rate of 95%. They mention comparing four algorithms, but never report the individual success rates.
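To pin down what I take the convexity-based region of support to mean, here is a minimal Python sketch. The function names, the k_max cap, and the arc-length normalization are my own guesses, not the paper's exact formulation; points are (x, y) tuples from a resampled stroke, and i must be an interior index.

import math

def turn_angle(pts, i):
    # Signed turning angle at pts[i]: change in direction from the
    # incoming segment to the outgoing segment, wrapped to (-pi, pi].
    a = math.atan2(pts[i][1] - pts[i-1][1], pts[i][0] - pts[i-1][0])
    b = math.atan2(pts[i+1][1] - pts[i][1], pts[i+1][0] - pts[i][0])
    d = b - a
    while d <= -math.pi:
        d += 2 * math.pi
    while d > math.pi:
        d -= 2 * math.pi
    return d

def support_halfwidth(pts, i, k_max=10):
    # Grow the window while neighboring turn angles keep the same sign
    # as the center angle (local convexity). The k_max cap is my own
    # addition, not something the paper specifies.
    sign = 1 if turn_angle(pts, i) >= 0 else -1
    k = 1
    while (k < k_max and i - k >= 1 and i + k <= len(pts) - 2
           and sign * turn_angle(pts, i - k) >= 0
           and sign * turn_angle(pts, i + k) >= 0):
        k += 1
    return k

def curvature(pts, i):
    # Accumulated turning over the region of support, normalized by
    # the arc length of the window.
    k = support_halfwidth(pts, i)
    turn = sum(turn_angle(pts, j) for j in range(i - k + 1, i + k))
    arc = sum(math.dist(pts[j], pts[j + 1]) for j in range(i - k, i + k))
    return turn / arc if arc else 0.0

The point of stopping the window at a convexity violation is that the estimate at a corner never averages in points from the neighboring segment, which is exactly where a fixed-size sliding window smears the signal.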

Discussion

I had difficulty reading this paper; the organization left a lot to be desired. I have no idea why they discussed an evaluation test but didn't feel the need to share the results. This is one of those concepts I feel I will understand better once I start to implement it. The change in curvature (convexity) makes sense as a means of recognizing segmentation points; I've sketched my guess at how that might work below.
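To make that concrete for myself: given a curvature estimate like the sketch above, one naive way to turn it into segmentation points is to keep local maxima above a threshold. This is my own guess at a pipeline, not the paper's algorithm (which, as the comments below note, they don't spell out), and the threshold value is arbitrary.

def corner_candidates(pts, thresh=0.05):
    # Flag interior points whose curvature magnitude is a local maximum
    # above a threshold. The 0.05 is purely illustrative; the paper does
    # not give a threshold to quote. Reuses curvature() from the sketch
    # above.
    ks = [abs(curvature(pts, i)) for i in range(1, len(pts) - 1)]
    corners = []
    for j in range(1, len(ks) - 1):
        if ks[j] > thresh and ks[j] >= ks[j-1] and ks[j] >= ks[j+1]:
            corners.append(j + 1)  # ks[j] corresponds to pts[j + 1]
    return corners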

Citation

Kim, D. H., and Kim, M.-J. 2006. A curvature estimation for pen input segmentation in sketch-based modeling. Computer-Aided Design 38, 3, 238--248.

4 comments:

rg said...

I don't understand assuming the input is perfect and then ignoring some of it by resampling, especially when their features are highly local in nature. They've given up significant local feature data.

Paul Taele said...

Oh, really? Heh, I actually really enjoyed the Kim paper myself. Maybe it's just me, but I found the paper to be a breath of fresh air after reading the vague Sezgin and Yu papers. I do agree that the evaluation tests were not a critical necessity in the paper. I believe I actually skipped over that section completely after failing to see how it would contribute to my enlightenment. In addition, I was intrigued by the use of convexity as a metric to detect segmentation points, as you brought up. It felt like one of those things that seemed obvious after reading about it in a "why didn't I think of that" kind of way.

Brian David Eoff said...

The metrics made sense, but they gave no indication of how to incorporate them into an algorithm. Should I use them in the same way I use Fd and Fs in Sezgin? How should I calculate the initial error threshold? These metrics might (and I stress might) be usable for finding corners, but they give no sense of how. Sezgin left out a few thresholds; Kim left out the algorithm.

Anonymous said...

I haven't read the aforementioned article, but if it is used on discrete data (i.e., data on a regular grid with integer coordinates, up to a dilation), the sliding-window technique cannot work.
And even if a so-called good success rate is achieved, it is only on biased training sets, since for a digital curve you can always find infinitely many shapes with the same digitization... Thus the design of so-called "robust curvature" makes me laugh a lot :p