I am looking for an explanation of how an internal standard in HPLC analysis helps in obtaining accurate results in the event of sample losses during injection or from variations in flow rate.
An internal standard should be as chemically similar to the analyte as possible while still being distinguishable from it. This is why in mass spectrometry the preferred internal standard is an isotopically labeled version of the analyte: it behaves almost identically in the chromatography but is shifted in mass.
If there is sample loss in the analysis system, chances are that the internal standard is being lost in the same way and to the same extent as the analyte. The ratio of the two signals should therefore remain the same, and thus the loss is corrected for.
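To see why the ratio is what matters, here is a minimal sketch (with hypothetical peak areas) showing that a proportional loss hits the analyte and the internal standard alike, so it cancels out of the ratio:

```python
# Hypothetical peak areas for analyte and internal standard (IS).
analyte_area, is_area = 12000.0, 10000.0
ratio = analyte_area / is_area                       # 1.2

# Suppose 30% of the injected sample is lost; the same proportional
# loss applies to the IS, so the ratio is unchanged.
loss = 0.7
ratio_after_loss = (analyte_area * loss) / (is_area * loss)

assert abs(ratio - ratio_after_loss) < 1e-12         # still 1.2
```

Since quantification is done against this ratio rather than against the raw analyte peak area, the loss drops out of the final result.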
First you need to distinguish between surrogate standards (standards that are carried through the entire analytical process) and internal standards (standards that are added to the prepared extract just prior to injection). Internal standards are very useful for GC-MS; they are of limited use in LC-MS, since the injection process there is much more straightforward. An internal standard should be well characterized chromatographically and unaffected by your sample matrix; this allows you to correct for injection volume changes, volatilization changes (for GC-MS), and so on, as the sketch below illustrates.
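Here is a hedged sketch of how that injection-volume correction works in practice, using a relative response factor (RRF) determined from a calibration standard; all areas and concentrations are invented for illustration:

```python
def rrf(analyte_area, analyte_conc, is_area, is_conc):
    """Relative response factor from a calibration injection."""
    return (analyte_area / analyte_conc) / (is_area / is_conc)

def unknown_conc(analyte_area, is_area, is_conc, rrf_value):
    """Concentration of the unknown, quantified via the IS area ratio."""
    return (analyte_area / is_area) * is_conc / rrf_value

# Calibration injection (hypothetical areas):
f = rrf(analyte_area=15000.0, analyte_conc=10.0,
        is_area=9000.0, is_conc=5.0)

# Unknown accidentally injected with 20% more volume than intended;
# analyte and IS areas scale together, so the error cancels.
vol = 1.2
c = unknown_conc(analyte_area=7500.0 * vol, is_area=9000.0 * vol,
                 is_conc=5.0, rrf_value=f)
print(round(c, 3))  # 5.0 -- unaffected by the volume error
```

The key design point is that both peaks come from the same injection, so any multiplicative effect on that injection (volume, split ratio, detector drift between runs) divides out.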
What Dr. Smarason described is a special technique with surrogate standards known as isotope dilution. Isotope dilution is exceptionally good at correcting for sample matrix and extraction issues, since you are using an isotopically labeled version of the actual compound. It is carried through the entire analytical process, so you get an accurate assessment of both the sample preparation and the sample analysis technique. You might still use an internal standard to correct for injection changes (in GC-MS, you would always use an internal standard alongside isotope dilution), but your primary correction will be against your isotopically labeled surrogate standard.
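For concreteness, here is a minimal sketch of the isotope dilution arithmetic, assuming the labeled analog behaves identically to the native compound (all numbers invented):

```python
def isotope_dilution(native_area, labeled_area, spike_ng, sample_g, rrf=1.0):
    """Native concentration (ng/g) from the native/labeled area ratio.

    spike_ng is the known amount of labeled analog added *before*
    extraction; rrf corrects for any small response difference
    between the native and labeled forms.
    """
    return (native_area / labeled_area) / rrf * spike_ng / sample_g

# Spike 50 ng of labeled analog into 2 g of sample.
# Suppose only 60% of both species survives extraction and cleanup:
recovery = 0.6
native = isotope_dilution(native_area=4000.0 * recovery,
                          labeled_area=5000.0 * recovery,
                          spike_ng=50.0, sample_g=2.0)
print(native)  # 20.0 ng/g, regardless of the recovery factor
```

Because the spike goes in before any workup, every loss along the way (extraction, cleanup, injection) affects native and labeled compound together, which is why the recovery factor cancels out of the result.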