The Hidden Markov Model (HMM) is a probabilistic Finite-State Machine (FSM) used to identify structures in sequential (or spatially ordered) data. The rule base of a Fuzzy Logic Controller (FLC) can likewise be viewed as an FSM with the ability to produce a sequence of output Membership Functions (MFs). The optimization problem then consists of:
+ building a functional mapping from the FLC rule base to an HMM, and
+ using all the available data to train the HMM.
Such an FLC reduces the processing time required to compute the control input by avoiding Mamdani-type inference over the rule base and the centre-of-gravity defuzzification procedure.
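To make the saving concrete, the step being avoided can be sketched as follows. This is a minimal, hypothetical illustration of centre-of-gravity (centroid) defuzzification over one aggregated output MF; the universe of discourse and the triangular MF are assumptions for the example, not part of the original design.

```python
import numpy as np

def centroid_defuzzify(universe, aggregated_mf):
    """Centre-of-gravity defuzzification: the crisp output is the
    membership-weighted mean over the universe of discourse."""
    return np.sum(universe * aggregated_mf) / np.sum(aggregated_mf)

# Hypothetical aggregated output MF: a triangle centred at 6 on [0, 10]
x = np.linspace(0.0, 10.0, 101)
mu = np.maximum(0.0, 1.0 - np.abs(x - 6.0) / 2.0)
crisp = centroid_defuzzify(x, mu)   # symmetric triangle -> centroid at 6.0
```

Every control step of a standard Mamdani FLC pays this aggregation-and-integration cost, which is what the proposed mapping onto an HMM sidesteps.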
Handover optimization can be driven in a Markov-based framework, but two factors dominate the computational cost:
(F1) injecting raw inputs directly into the HMM and training the HMM on them, and
(F2) the defuzzification calculations in the FLC.
If the HMM and the FLC are applied separately, these two factors make the solution computationally complex and hence slow.
To overcome factors (F1) and (F2), an FLC-based HMM can be used to optimize performance:
(S1) To overcome (F1): filter the raw inputs before injecting them into the HMM. For this task the FLC works as a low-pass filter. For example, a purely applied HMM might take in 100x100 input elements, whereas with the FLC this may reduce to 5x5 elements (the exact reduction depends on the number of variables chosen by the human expert).
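The filtering in (S1) can be sketched as fuzzification that collapses a large grid of continuous readings into a handful of linguistic labels. The MF centres and the 100x100 input size below are illustrative assumptions matching the example figures above, not a prescribed design.

```python
import numpy as np

# Hypothetical peaks of 5 triangular MFs partitioning the signal range
# [0, 1]; in practice these would be chosen by a human expert.
CENTRES = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

def fuzzify(sample):
    """Map a raw continuous sample to the index of the MF whose peak
    lies closest to it, collapsing the input into one of 5 symbols."""
    return int(np.argmin(np.abs(CENTRES - sample)))

# 100x100 raw readings, as in the purely-applied-HMM example
raw = np.random.default_rng(0).uniform(0.0, 1.0, size=(100, 100))
symbols = np.vectorize(fuzzify)(raw)   # same shape, but values in {0..4}
alphabet = np.unique(symbols)          # at most 5 distinct HMM observations
```

The HMM then only ever sees the small symbol alphabet, which is the low-pass-filter effect described above.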
(S2) To overcome (F2): bypass the defuzzification calculations in the FLC, which further reduces the complexity of the combined FLC-HMM.
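One way to read (S2) is that the FLC's fired output-MF indices are fed straight to the HMM as discrete observation symbols, with no crisp value ever computed. The sketch below assumes hypothetical parameters (2 hidden states, 3 MF labels) purely for illustration.

```python
import numpy as np

# Hypothetical discrete HMM whose observation alphabet is the FLC's
# output-MF indices, so no defuzzification step sits in between.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])            # state transitions
B  = np.array([[0.6, 0.3, 0.1], [0.1, 0.3, 0.6]])  # P(MF label | state)
pi = np.array([0.5, 0.5])                          # initial distribution

def forward_likelihood(obs):
    """Likelihood of an observation sequence via the forward algorithm."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

seq = [0, 1, 2, 2]              # fired output-MF indices from the FLC
p = forward_likelihood(seq)     # ~0.01172 for these parameters
```

Only the forward recursion remains per step: the centre-of-gravity integration of a standard Mamdani pipeline is skipped entirely.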
However, in addition to (S1) and (S2), many other refinements can improve the results further. For example, a higher-order HMM can be used, possibly together with novel algorithms for training the HMM.