I am an archaeologist studying bedrock mortars (the holes in boulders and rock outcrops that Native Americans used to grind foodstuffs) in a large study area. I'm exploring whether the depth of an individual bedrock mortar indicates the specific kind of food that was meant to be ground in it; more specifically, whether a bounded range of depths is correlated with a specific type of food (e.g., mortars between 0.25 and 5.5 cm deep were mostly used to grind acorn). Unfortunately, I did not have the ability to test for food residues in the mortars, so I have only a single variable, depth, to work with. Instead of performing regressions or similar multivariate tests to correlate specific foods with specific depth ranges, I am looking for patterns in the depth variable itself that suggest preferred ranges. I assume these ranges will be evident as modes in the depth distribution, and potentially as components of a mixture distribution.
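For concreteness, here is a minimal sketch of the kind of analysis I have in mind, assuming Python with scipy and scikit-learn: a kernel density estimate to locate candidate modes, plus Gaussian mixture models compared by BIC to see whether a mixed distribution is supported. The file name mortar_depths.csv and the 1-6 component range are placeholders, not part of my data.

```python
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

# Hypothetical input: 699 depths in cm, one value per line (assumed file name).
depths = np.loadtxt("mortar_depths.csv")
x = depths.reshape(-1, 1)

# Kernel density estimate: local maxima in the smoothed curve are
# candidate modes, i.e. possible "preferred" depth ranges.
kde = stats.gaussian_kde(depths)
grid = np.linspace(depths.min(), depths.max(), 500)
dens = kde(grid)
is_peak = (dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])
print("candidate modes (cm):", grid[1:-1][is_peak])

# Fit Gaussian mixtures with 1-6 components and pick the count by BIC;
# a best fit with k > 1 is consistent with a mixed distribution.
fits = [GaussianMixture(n_components=k, n_init=5, random_state=0).fit(x)
        for k in range(1, 7)]
best = min(fits, key=lambda gm: gm.bic(x))
print("components preferred by BIC:", best.n_components)
print("component means (cm):", np.sort(best.means_.ravel()))
```

The KDE bandwidth and the choice of Gaussian components both matter here; with a right-skewed variable like depth, it may be worth repeating the fit on log-transformed depths before trusting the component count.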

I measured the depth (a continuous, interval-scale variable) of 699 bedrock mortars in my study area, and I can assume this is essentially the entire population. When I plot the values, the resulting distribution is highly non-normal: right-skewed, possibly leptokurtic, with a long tail and many outliers on the right, and a p-value of
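For reference, a short sketch of how those shape statistics and a normality test could be computed (same assumed input file as above; Shapiro-Wilk is one common choice, not necessarily the test I used):

```python
import numpy as np
from scipy import stats

depths = np.loadtxt("mortar_depths.csv")  # same assumed input as above

print("skewness:", stats.skew(depths))             # > 0 => right-skewed
print("excess kurtosis:", stats.kurtosis(depths))  # > 0 => leptokurtic
w, p = stats.shapiro(depths)                       # small p rejects normality
print(f"Shapiro-Wilk: W = {w:.3f}, p = {p:.2e}")
```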
