Concentricity - How are the calculations done?
I'm inspecting a manifold with a custom bore. See the screenshot below from the machining program. There's a concentricity callout on the deep bore (brown), referenced back to the small pilot (yellow). The tolerance is 0.05mm and we were getting numbers around 0.26mm. We produced this whole feature in the same setup, so we figured those numbers weren't a true reflection of reality. I also knew that the referenced datum (yellow) is very short and a long way from the controlled feature, so even the smallest measurement error is going to get amplified over that length.
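Just to show the lever-arm effect I mean, here's a quick sketch in Python. The lengths and errors below are made up for illustration, not taken from this part:

```python
import math

# Hypothetical numbers, only to show the lever-arm effect: a tiny centering
# error on a short datum feature tilts the constructed datum axis, and that
# tilt gets projected over the distance to the controlled feature.
datum_length = 5.0           # mm, length of the short pilot datum (assumed)
center_error = 0.002         # mm, apparent center shift across the datum (assumed)
distance_to_feature = 100.0  # mm, datum to the deep-bore circle (assumed)

tilt = math.atan2(center_error, datum_length)            # angular error of the datum axis
projected_error = distance_to_feature * math.tan(tilt)   # apparent offset at the feature

print(f"{projected_error:.3f} mm")  # ~0.040 mm from a 0.002 mm error on the datum
```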
So to put this into perspective, I checked 7 or 8 different circles that are theoretically concentric right down the length of the bore. I then constructed a best-fit line through those circle centers and leveled a coordinate system normal to that line. I set the origin at the intersection of that line and the entry plane and did a rotational alignment to one of the other edges. No big deal.
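Roughly what that construction does, sketched in numpy (the center coordinates here are made up, not my actual data):

```python
import numpy as np

# Circle centers measured down the bore (hypothetical values, in mm)
centers = np.array([
    [ 0.001, -0.002,   0.0],
    [ 0.002, -0.001,  20.0],
    [ 0.000,  0.001,  40.0],
    [-0.001,  0.002,  60.0],
    [ 0.001,  0.000,  80.0],
])

# Best-fit line through the centers: anchored at the centroid, direction =
# first principal component of the centered points (least-squares axis).
centroid = centers.mean(axis=0)
_, _, vt = np.linalg.svd(centers - centroid)
axis = vt[0]                                   # unit vector along the line

# Radial distance of each center from that axis -- the "variation from the
# axis" I reported after building the coordinate system.
rel = centers - centroid
radial = np.linalg.norm(rel - np.outer(rel @ axis, axis), axis=1)
print(np.round(radial, 4))
```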
I then reported all of the circle center points in that coordinate system. Of all of the circles, the largest deviation from the axis was 0.008mm, which is pretty good. One circle in particular (in the brown bore) was off center by 0.001mm in one direction and 0.003mm in the other. Its roundness was 0.003mm, so there are no big bumps in the form.
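For reference, here's how I'm turning that X/Y offset into a single distance from the axis (just the basic math, shown in Python):

```python
import math

dx, dy = 0.001, 0.003               # reported center offsets in the two axes (mm)
radial_offset = math.hypot(dx, dy)  # straight-line distance from the datum axis
print(f"{radial_offset:.4f} mm")    # ~0.0032 mm

# Even treated as a diameter-style result (2x the offset), that's only about
# 0.006 mm -- nowhere near the 0.062 mm the concentricity routine reported.
```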
I then reported concentricity for that same circle (in this same constructed reference frame), and it came out at 0.062mm. I've read the book on concentricity and I understand the theory of the diametrically opposed points and everything. There's a good reason people say "when in doubt, use runout"... But I'm having a hard time wrapping my head around how a circle that's off center by at worst 0.003mm can have a reported concentricity of 0.062mm.
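Here's a rough sketch of how I understand the median-point (diametrically opposed) evaluation; the scan data is simulated and this is my reading of the method, not necessarily what my software actually does:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scan of one circle: nominal radius 10mm, best-fit center offset
# from the datum axis by (0.001, 0.003)mm, plus a little per-point probing/
# form noise.  All of these numbers are made up for illustration only.
n = 360
theta = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
cx, cy = 0.001, 0.003
noise = rng.normal(0.0, 0.002, n)            # ~2 micron point-to-point noise
x = cx + (10.0 + noise) * np.cos(theta)
y = cy + (10.0 + noise) * np.sin(theta)

# Median points of diametrically opposed points: pair point i with the point
# half a revolution away and take the midpoint of the pair.
half = n // 2
mx = (x[:half] + x[half:]) / 2.0
my = (y[:half] + y[half:]) / 2.0

# Concentricity is a *diameter* zone about the datum axis that has to contain
# the worst median point, so the reported number is twice the largest offset.
center_offset = np.hypot(x.mean(), y.mean())   # rough best-fit center offset
worst_median = np.max(np.hypot(mx, my))
print(f"center offset      : {center_offset:.4f} mm")
print(f"worst median point : {worst_median:.4f} mm")
print(f"concentricity      : {2 * worst_median:.4f} mm")
```

At least in this simulation, noise that mostly averages out of the best-fit center lands straight in the individual median points, and the worst single median point gets doubled in the result.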
Can someone help me understand this? Thank you!