In my corrosion tests on a coated steel, I measured a coating potential nobler than that of the bare steel, but a negative inhibition efficiency (in percent). Is this normal? How can these results be explained?
I believe the problem stems from non-ideal homogeneity of the coating's physical characteristics (e.g. porosity and/or surface defects), which can be introduced during application or sometimes generated by contact of the surface layer with a particularly aggressive medium. If your coating, nobler than the substrate, were perfect in its deposition and in its interaction with the corrosive environment, the system would behave, from the corrosion point of view, as if the whole object were made of the coating material. If there are inhomogeneities, surface defects, porosity, etc., the system becomes accelerating rather than inhibiting, because a large cathodic surface (the noble coating) is coupled to a very small anodic surface (the substrate exposed at the defects). This situation can also favour "dangerous" phenomena such as loss of coating adhesion and localized corrosion of the substrate. It would be worth verifying these defects with surface and cross-sectional examinations and measurements.
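As a rough illustration of the area-ratio argument (a minimal sketch with hypothetical numbers, not a model of any particular system): under galvanic coupling, the cathodic current collected over the noble coating must be balanced by the anodic current at the small exposed defects, so the local anodic current density scales with the cathode/anode area ratio.

```python
# Rough sketch of the area-ratio argument under galvanic coupling.
# All numbers are hypothetical, for illustration only.

def local_anodic_current_density(i_cathodic, area_cathode, area_anode):
    """Anodic current density at the defects, from the charge balance
    i_a * A_a = i_c * A_c (uniform current densities assumed)."""
    return i_cathodic * (area_cathode / area_anode)

i_c = 1e-6              # A/cm^2, assumed cathodic current density on the coating
A_c, A_a = 10.0, 0.01   # cm^2, assumed cathode (coating) and anode (pore) areas

i_a = local_anodic_current_density(i_c, A_c, A_a)
print(f"Local anodic current density at the defects: {i_a:.2e} A/cm^2")
# -> 1.00e-03 A/cm^2, three orders of magnitude above the cathodic density,
#    which is why a porous noble coating concentrates attack at the defects.
```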
You had better repeat the test several times with different samples, and assess the corrosion performance by combining the corrosion current density and the corrosion potential.
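In the same spirit, the sign of the inhibition (protection) efficiency follows directly from the corrosion current densities. A minimal sketch of the usual calculation, with hypothetical i_corr values, shows that the efficiency goes negative whenever the coated sample corrodes faster than the bare one, consistent with the defect/porosity explanation above:

```python
# Minimal sketch of the usual efficiency calculation from Tafel-fit data.
# The i_corr values below are hypothetical, for illustration only.

def inhibition_efficiency(i_corr_bare, i_corr_coated):
    """Protection/inhibition efficiency in percent:
    IE% = (i_corr_bare - i_corr_coated) / i_corr_bare * 100."""
    return (i_corr_bare - i_corr_coated) / i_corr_bare * 100.0

i_bare = 5.0e-6     # A/cm^2, assumed i_corr of the uncoated steel
i_coated = 8.0e-6   # A/cm^2, assumed i_corr of the coated sample

print(f"IE = {inhibition_efficiency(i_bare, i_coated):.1f} %")
# -> IE = -60.0 %: negative because the coated sample passes a higher
#    corrosion current than the bare steel, despite its nobler potential.
```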