Suppose you have a sensor whose sample rate you cannot change, nor any other feature of it, including the anti-aliasing applied before the signal is sampled.

Besides this, suppose that the control loop of the system runs at a slower rate than the sensor, and you cannot change the control loop rate either.

Now, assume that the sensor rate is an integer multiple of the control loop rate.

Then, suppose that after every 5 sensor samples, the control loop runs once using the latest sample from the sensor (note: all 5 samples are stored).
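
For concreteness, here is a minimal sketch of the scheme I mean (the names `control_step`, `run`, and `DECIMATION` are placeholders of my own, not any real API): the sensor runs 5 times faster than the control loop, the 5 samples are stored, and the control law only ever sees the most recent one.

```python
DECIMATION = 5  # assumed ratio: sensor runs 5x faster than the control loop

def control_step(buffer):
    """Hypothetical control update: only the newest stored sample is used."""
    latest = buffer[-1]   # the other 4 stored samples are ignored
    return latest         # placeholder for the actual control law

def run(sensor_samples):
    """Feed the fast sensor stream to the slow control loop (decimation by 5)."""
    outputs, buffer = [], []
    for x in sensor_samples:
        buffer.append(x)
        if len(buffer) == DECIMATION:   # control loop fires on every 5th sensor sample
            outputs.append(control_step(buffer))
            buffer = []
    return outputs
```

Effectively this keeps only every 5th sample, i.e. down-sampling by 5 with no filtering in between.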

As far as I understand, this amounts to down-sampling the signal, which introduces aliasing.

How can this effect be minimized? Should I put a digital filter before the control loop?
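
To illustrate what I have in mind by "a digital filter before the control loop", one option I am considering (just a sketch under the assumptions above, not necessarily the right filter) is to average the 5 stored samples instead of taking only the latest one, i.e. a length-5 moving-average FIR applied before the rate reduction:

```python
def control_step_filtered(buffer):
    """Hypothetical alternative: average all 5 stored samples (length-5
    moving-average FIR), a crude low-pass applied before the rate reduction."""
    filtered = sum(buffer) / len(buffer)
    return filtered   # placeholder for the actual control law
```

Is something along these lines the right way to reduce the aliasing, or is a different filter needed?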
