These are UV/UV-vis spectrometers. Modern HPLC detectors use photodiode (or CMOS) arrays. The bandwidth is a combination of the slit (or entrance optics, if there isn't an obvious slit) before the dispersion grating, and the range of wavelengths that impinge on an individual pixel on the detector.
Assuming a perfectly narrow entrance slit, light dispersed from the grating falls across the array, and each pixel sees a small range of wavelengths. On some systems, the output from several pixels can be combined (binned) to increase the signal, at the cost of spectral resolution.
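As a rough illustration of the trade-off above, here is a minimal sketch that treats the observed bandwidth as a combination of the slit-limited bandwidth and the wavelength span of the binned pixels. The quadrature-sum model, the dispersion figure, and the slit bandwidth are all illustrative assumptions, not specifications of any real detector.

```python
# Hedged sketch: effective spectral bandwidth of a diode-array detector.
# The quadrature-sum model and all numbers are illustrative assumptions.

def effective_bandwidth_nm(dispersion_nm_per_px: float, binned_pixels: int,
                           slit_bandwidth_nm: float) -> float:
    """Approximate the detector bandwidth as the quadrature sum of the
    slit-limited bandwidth and the wavelength span of the binned pixels."""
    pixel_span = dispersion_nm_per_px * binned_pixels
    return (slit_bandwidth_nm ** 2 + pixel_span ** 2) ** 0.5

# Binning more pixels raises signal but widens the effective bandwidth:
for n in (1, 2, 4, 8):
    bw = effective_bandwidth_nm(dispersion_nm_per_px=0.9, binned_pixels=n,
                                slit_bandwidth_nm=4.0)
    print(f"{n} pixel(s) binned -> ~{bw:.1f} nm bandwidth")
```

Note how, with a fixed slit, binning a single pixel leaves the bandwidth essentially slit-limited, which is why the entrance slit sets the lower limit mentioned below.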
The entrance slit may be an actual slit, or it may be part of the optics, such as a fiber-optic. In either case, it has a non-zero width, so the light hitting the dispersion grating is actually spread over an area of the grating. Since this is not adjustable in HPLC detectors, the entrance slit's contribution sets a lower limit on the achievable bandwidth.
For an image of this sort of detector, see: https://www.researchgate.net/profile/Zarrin_Eshaghi/publication/221915678/figure/fig3/AS:305167682555911@1449769053890/Schematic-of-a-photodiode-array-spectrophotometer_W640.jpg
Good question, and there is really no need to calculate it, as it is measured by the manufacturer. Having the documentation showing what it is will provide you with useful information for comparison if you change detectors.
Actually, this is an old carryover from UV/VIS spectrophotometers, where the SLIT WIDTH was fixed (*slit width here means the spectral bandwidth of a peak measured at half height). Sometimes you need to know it to determine resolution under one of the older regulated methods. Many of the older spectrophotometers had fixed slit widths based on the optical block design, but no standards existed then (or now) for what they should be. The slit needs to be wide enough to pass enough light, but not so wide that resolution is lost. A compromise is needed if it is a fixed value; if it is variable, it can be optimized for each sample (something very few do).

Many manufacturers used fixed values from 1 to 16 nm, with 8 nm being fairly common. Later, systems with variable slit openings became available, especially on scanning diode-array detectors (both for UV/VIS and HPLC, as many share the same optical blocks). The ability to vary the amount of light falling on the diodes lets users optimize the amount of energy detected to increase sensitivity or selectivity, as desired. For most users, it is of little importance as used in HPLC (unless you wish to optimize for the best S/N and sample selectivity).
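The definition in the note above (slit width as the bandwidth of a peak at half height) can be checked numerically: scan a narrow line, find where the recorded band crosses half its maximum, and take the width between the crossings. The Gaussian line shape and the 8 nm figure below are made-up example data, not measurements from any instrument.

```python
# Hedged sketch: estimate spectral bandwidth as full width at half maximum
# (FWHM) of a recorded band. The Gaussian "spectrum" is synthetic data.
import numpy as np

def fwhm_nm(wavelengths: np.ndarray, intensities: np.ndarray) -> float:
    """Width (nm) of the band at half its maximum intensity, using linear
    interpolation at the two half-height crossings."""
    half = intensities.max() / 2.0
    above = np.where(intensities >= half)[0]
    lo, hi = above[0], above[-1]

    def cross(i0: int, i1: int) -> float:
        # interpolate the wavelength where intensity passes through `half`
        w0, w1 = wavelengths[i0], wavelengths[i1]
        y0, y1 = intensities[i0], intensities[i1]
        return w0 + (half - y0) * (w1 - w0) / (y1 - y0)

    return cross(hi, hi + 1) - cross(lo - 1, lo)

wl = np.linspace(640, 672, 641)   # 0.05 nm steps around a 656 nm line
sigma = 8.0 / 2.3548              # Gaussian sigma for an 8 nm FWHM
spec = np.exp(-0.5 * ((wl - 656.0) / sigma) ** 2)
print(f"measured bandwidth ~ {fwhm_nm(wl, spec):.2f} nm")
```

The same half-height measurement is what a manufacturer reports as the instrument's spectral bandwidth, which is why the earlier answer says there is no need to calculate it yourself.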
Perhaps of much greater importance to most users is the ability to vary the signal bandwidth for each UV/VIS signal selected. Signal bandwidth can be optimized for each sample and method to improve either sample selectivity or signal intensity/sensitivity (see the linked article below for more info). An incorrect signal bandwidth can invalidate all data acquired with the method, so it is a critical point to understand when learning how to use an HPLC system.
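To make the selectivity-versus-sensitivity trade concrete, the sketch below averages simulated diode readings over a signal bandwidth centered on the monitoring wavelength. A narrow band tracks the analyte's absorbance apex; a very wide band dilutes it with baseline. The analyte band position, noise model, and all numbers are illustrative assumptions only.

```python
# Hedged sketch: effect of signal bandwidth on the reported absorbance.
# The analyte band, noise level, and wavelengths are made-up examples.
import numpy as np

rng = np.random.default_rng(0)
wl = np.arange(190, 401)                             # one diode per nm
true_abs = np.exp(-0.5 * ((wl - 254) / 10.0) ** 2)   # analyte band at 254 nm

def signal(center_nm: int, bandwidth_nm: int) -> float:
    """Average the noisy diode readings within +/- bandwidth/2 of center."""
    noisy = true_abs + rng.normal(0.0, 0.01, wl.size)
    mask = np.abs(wl - center_nm) <= bandwidth_nm / 2
    return float(noisy[mask].mean())

# A 4 nm band stays near the apex; a 100 nm band averages in baseline.
print(f"4 nm band:   {signal(254, 4):.3f}")
print(f"100 nm band: {signal(254, 100):.3f}")
```

Averaging more diodes does suppress noise, which is why a moderate bandwidth often gives the best S/N, but at some point the loss of selectivity (and of absorbance, as shown here) dominates.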
Why calculate the spectral bandwidth? Because calibration is concerned with spectral accuracy, which directly affects the absorbance and the calculated concentration of the analyte.