My first attempt would be to get the centerline of the fiber. Then you can add up the length in raster units by tracking the centerline from pixel to pixel.
Applying curve smoothing algorithms (assuming that is what you mean by "vector curve") might slightly improve the measurement. Alternatively, you could start by resampling the image, say 10-fold. That should improve the resolution of your measurement by a factor of roughly 10, but at the "cost" of significantly higher computational requirements.
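A minimal sketch of the resampling idea, assuming Python with scikit-image and that `fiber_gray` is your grayscale fiber image as a NumPy array (the variable name and the bright-fiber assumption are mine):

```python
from skimage import transform, filters, morphology

# fiber_gray: 2-D grayscale image of the fibers (assumed to exist already)
factor = 10                                          # upsampling factor
hires = transform.rescale(fiber_gray, factor, order=3)  # bicubic upsampling
binary = hires > filters.threshold_otsu(hires)       # assumes bright fibers on a dark
                                                      # background; invert if reversed
skeleton = morphology.skeletonize(binary)            # one-pixel-wide centerline

# any length measured on this skeleton is in upsampled pixels,
# so divide it by `factor` to convert back to original pixel units
```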
I will give more details about the problem. I have some fibers and I want to know their lengths. First, I applied a thresholding method to convert the fiber image to a binary image. On the binary image I applied a thinning algorithm and obtained the fiber skeleton (a raster curve). After that I calculated the sum of distances between consecutive pairs of skeleton pixels. This sum represents the fiber length.
The problem is the representation of the image. I am using the raster (pixel matrix) representation to calculate the fiber length, and it is not accurate enough. Maybe if I convert the curve from the raster representation to a vector representation I will get a better result. Is that possible?
OK, got it so far. If your 'skeleton' is what I described as the 'centerline', you are more or less following the method I described above. Converting to binary a bit later would not make a big difference.
I think it is safe to assume that the lengths you are getting are "too long". One question, though: are you counting only vertical/horizontal neighbors, or also diagonal ones? (A diagonal step should add a distance of sqrt(2), not 1.)
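To make the bookkeeping concrete, here is a small sketch (pure NumPy, assuming `skeleton` is a boolean 2-D array containing a single thin, non-branching line) that counts orthogonal steps as 1 and diagonal steps as sqrt(2):

```python
import numpy as np

def skeleton_length(skeleton):
    """Sum distances between 8-connected skeleton pixels:
    1 for horizontal/vertical neighbours, sqrt(2) for diagonal ones."""
    pts = set(map(tuple, np.argwhere(skeleton)))
    length = 0.0
    # only half of the 8 neighbour directions, so each pair is counted once
    for r, c in pts:
        for dr, dc, d in ((0, 1, 1.0), (1, 0, 1.0),
                          (1, 1, np.sqrt(2)), (1, -1, np.sqrt(2))):
            if (r + dr, c + dc) in pts:
                length += d
    return length
```

Note that for a branched skeleton this sums all branches, and even with the sqrt(2) weighting a staircase-like raster line still tends to come out slightly too long.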
You could try spline interpolation or Bézier curve approximation. I am not sure whether these will significantly improve the results, as I have little experience with them.
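If you want to try the spline route, here is a rough sketch with SciPy. It assumes `points` is an (N, 2) array of skeleton coordinates that are already ordered along the fiber (ordering them is a separate step); the smoothing factor `s` is just a starting guess:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# points: (N, 2) array of (row, col) skeleton coordinates, ordered along the fiber
tck, u = splprep([points[:, 0], points[:, 1]], s=len(points))  # smoothing spline
u_fine = np.linspace(0.0, 1.0, 10 * len(points))
x, y = splev(u_fine, tck)
# arc length of the smoothed curve = length of a densely sampled polyline
length = np.sum(np.hypot(np.diff(x), np.diff(y)))
```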
Or you could simply try the resampling approach, which is easily applied to the grayscale image.
An alternative would be to use "directional information" from a wider area. The discussion at https://www.researchgate.net/post/How_to_overcome_the_problem_in_image_pixels?tpr_view=fWJlzXZQRIRLYwsJz1k0hb1Pi8YojoKOzD70_3 should point you in the general direction. It is about segmentation, but the same ideas are useful for estimating line orientation beyond the immediate neighbors.
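One very simple way to bring in information beyond the immediate neighbours (this is only my own interpretation of the idea, not the method from the linked discussion) is to smooth the ordered skeleton coordinates over a small window before summing the step lengths:

```python
import numpy as np

def smoothed_length(points, window=5):
    """points: (N, 2) skeleton coordinates ordered along the fiber.
    Average each coordinate over `window` neighbours, then sum the
    distances along the smoothed polyline."""
    kernel = np.ones(window) / window
    rows = np.convolve(points[:, 0], kernel, mode='valid')
    cols = np.convolve(points[:, 1], kernel, mode='valid')
    return np.sum(np.hypot(np.diff(rows), np.diff(cols)))
```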
You are probably over-sampling, so to speak. Are you working with clumps of fibers or with identifiable single fibers? If you can easily identify the individual fibers, then the problem you are facing is an easy one.

The skeleton/centerline/medial axis/etc. only needs to be a single, clean line. If your skeleton is fuzzy or has a lot of branches, then you are not done pruning/thinning it.

Are your fibers singular or branching? If they are singular, they are easily approximated by a spline curve. If they are branching, you may still be able to approximate them with spline curves.

Are they relatively straight or curvy? If they are straight enough, you can just use least-squares fitting to fit a line to the data. Heck, if they are straight enough, you should drop the skeletonization and just use a curve-fitting method.

In any case, if the skeleton is clean, then the length of the skeletal components will be the length of the fiber. If it is not clean enough to get the length, then you are not done with the skeletonization process.
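For the near-straight case, a sketch of what the line-fitting idea could look like (here a total-least-squares fit via SVD/PCA, which avoids trouble with vertical fibers; `binary` is assumed to be a mask of a single fiber):

```python
import numpy as np

def straight_fiber_length(binary):
    """Fit a line to the pixels of a single, nearly straight fiber and
    return the extent of the pixels along that line."""
    pts = np.argwhere(binary).astype(float)   # (N, 2) pixel coordinates
    pts -= pts.mean(axis=0)                   # centre the point cloud
    # first principal axis = direction of largest spread of the pixels
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    proj = pts @ vt[0]                        # project pixels onto that axis
    return proj.max() - proj.min()            # length = extent along the axis
```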
That makes the problem sound a lot more challenging. Your biggest issue will be the clumps, not the branched and curvy fibers. Branched and curvy fibers can be problematic too, but with clumps it will be much more difficult to identify a single fiber.
Can you post a picture of the worst-case scenario?
Are you certain that you are able to identify which fiber each skeletal component (line segment?) is associated with? That is, are you sure you are not computing the sum of the lengths of the skeletons of all the fibers in a clump?
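One way to check that would be to label the connected components of the skeleton and measure each one separately; a sketch with scikit-image, assuming `skeleton` is the boolean skeleton image:

```python
from skimage import measure

# skeleton: boolean 2-D array that may contain several fibers or a clump
labels = measure.label(skeleton, connectivity=2)   # 8-connected components
for lab in range(1, labels.max() + 1):
    component = labels == lab
    # pixel count per component; feed each component into your length routine instead
    print(lab, component.sum(), "skeleton pixels")
```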
Do you have an image of your skeletal raster curves, computed from the binary image, overlaid on top of the original image? That way we can see the quality of the skeleton versus the original image.
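A quick way to produce such an overlay, assuming the `fiber_gray` and `skeleton` arrays from before:

```python
import matplotlib.pyplot as plt

plt.imshow(fiber_gray, cmap='gray')        # original image as the background
rows, cols = skeleton.nonzero()
plt.scatter(cols, rows, s=1, c='red')      # skeleton pixels drawn on top
plt.axis('off')
plt.savefig('skeleton_overlay.png', dpi=300)
```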
I tested two curve-approximation methods to improve the accuracy: Polynomial Interpolation and Anchored Discrete Convolution. Both approaches overestimated the fiber length. Any suggestions?