I have a set of independent data whose final output (result) is in binary form (0 or 1). Which form of reliability analysis can be used for such datasets? I have looked at FOSM and AFOSM methods, but all of them apply to continuous data.
If this is a measurement tool such as a questionnaire or interview form, calculating a composite reliability coefficient may not be appropriate: such a measurement tool is not standardized, so it cannot measure a latent trait. If it is not an objective measurement tool, it may be more accurate to work with concepts such as inter-rater reliability. A sketch of one common inter-rater index follows below.
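If you do go the inter-rater route with binary codes, Cohen's kappa is a common agreement index. A minimal sketch in Python, assuming two raters have assigned hypothetical 0/1 codes to the same items:

```python
# Minimal sketch: inter-rater reliability (Cohen's kappa) for two raters
# assigning binary codes to the same items. The ratings below are
# illustrative placeholders, not real data.
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 1, 0, 1, 0, 1, 1, 0, 1]  # hypothetical codes from rater A
rater_b = [0, 1, 0, 0, 1, 0, 1, 1, 1, 1]  # hypothetical codes from rater B

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.3f}")  # 1 = perfect agreement, 0 = chance level
```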
I recommend logistic regression. It relates a continuous input variable to a binary outcome. For example, the Challenger explosion investigation used it to estimate the probability of O-ring impingement or burn-through as a function of pre-launch temperature.
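As a rough illustration of that kind of fit, here is a minimal Python sketch of a logistic regression of a binary failure indicator on a continuous temperature variable. The data are made up purely for illustration, not the actual Challenger O-ring records:

```python
# Minimal sketch: logistic regression of a binary outcome (failure yes/no)
# on a continuous predictor (temperature). Values are synthetic.
import numpy as np
import statsmodels.api as sm

temp = np.array([53, 57, 63, 66, 67, 68, 70, 70, 72, 73,
                 75, 76, 78, 79, 81], dtype=float)
failure = np.array([1, 1, 1, 0, 0, 0, 1, 0, 0, 1,
                    0, 0, 0, 0, 0])

X = sm.add_constant(temp)            # intercept + temperature
model = sm.Logit(failure, X).fit()   # maximum-likelihood logistic fit
print(model.summary())

# Predicted probability of failure at a new temperature value
p_low = model.predict([1.0, 55.0])
print(f"P(failure | 55 deg) = {p_low[0]:.3f}")
```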
I agree that the most advisable tool is logistic regression; as Steven mentioned, it is currently used to predict binary outcomes from continuous data in industry, medicine, and other applications.
KR-20 is typically used with binary data. However, if you want to use CFA to collect validity evidence based on internal structure, there is an option to use the WLSMV estimator (available in Mplus and R). The WLSMV estimator is recommended for binary and ordinal data, since it yields less biased parameter estimates. Then, you can use the reliability function in the semTools package.
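For the KR-20 route, the coefficient can be computed directly from a respondents-by-items matrix of 0/1 scores. A minimal Python sketch, with illustrative data (in practice you would substitute your own item responses):

```python
# Minimal sketch: KR-20 for dichotomously scored items (0/1).
# `data` is a respondents x items array; the values here are illustrative.
import numpy as np

data = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 1, 1, 1, 0],
])

def kr20(x: np.ndarray) -> float:
    k = x.shape[1]                         # number of items
    p = x.mean(axis=0)                     # proportion scoring 1 on each item
    q = 1.0 - p
    total_var = x.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

print(f"KR-20 = {kr20(data):.3f}")
```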