EFA explores the data and reduces a large number of observed variables to a few latent factors by showing which item loads on which factor. To decide how many latent factors to retain, researchers rely on rules of thumb such as eigenvalues greater than 1, scree plots, factor loadings greater than 0.55, and cross-loadings less than 0.45.
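As a rough illustration of these retention checks, here is a minimal Python sketch assuming the factor_analyzer package and a hypothetical CSV of survey items (the file name and data are made up, not from the original post); the 0.55/0.45 cut-offs follow the rule of thumb above.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical survey data: one column per observed item
items = pd.read_csv("survey_items.csv")  # assumed file name

# First pass just to inspect eigenvalues (Kaiser criterion; plot them for a scree plot)
fa = FactorAnalyzer(rotation=None)
fa.fit(items)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())  # eigenvalue-greater-than-1 rule
print("Factors with eigenvalue > 1:", n_factors)

# Second pass: extract the retained factors with a varimax rotation
fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)

# Flag items with a primary loading < 0.55 or a cross-loading > 0.45
primary = loadings.abs().max(axis=1)
cross = loadings.abs().apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
weak_items = loadings.index[(primary < 0.55) | (cross > 0.45)]
print("Items to consider dropping:", list(weak_items))
```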
It is better to use IBM's AMOS for CFA, which confirms the factor structure extracted by EFA and is theory-oriented: the model tests the extent to which the hypothesised relations are valid. To be more confident and certain about the hypothesised structural model, one needs to look at composite reliability (CR) and average variance extracted (AVE), and also to perform further validity tests such as convergent and discriminant validity tests. None of these is conducted in EFA, which is only a preliminary analysis. One can also exclude items whose factor loadings do not meet the criteria mentioned above.
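For reference, CR and AVE can be computed by hand from the standardised loadings a CFA reports (e.g. in AMOS). The sketch below is only an illustration with made-up loadings for a single construct; the usual thresholds are CR ≥ 0.7 and AVE ≥ 0.5.

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    loadings = np.asarray(loadings)
    errors = 1 - loadings ** 2  # error variance of each standardised item
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + errors.sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardised loadings."""
    loadings = np.asarray(loadings)
    return (loadings ** 2).mean()

# Hypothetical standardised loadings for one latent construct
loadings = [0.72, 0.68, 0.81, 0.76]
print("CR  =", round(composite_reliability(loadings), 3))       # want >= 0.7
print("AVE =", round(average_variance_extracted(loadings), 3))  # want >= 0.5
```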
Adding to these wonderful contributions by our circle in this field, note that there are two variants of SEM:
1. Covariance-based, e.g. AMOS, LISREL
2. PLS-based, e.g. SmartPLS, WarpPLS
*PLS = partial least squares
In both variants, researchers normally use the two-step approach suggested by Anderson and Gerbing. The steps are:
1. Measurement model assessment (this is CFA)
2. Structural model assessment (this varies by SEM variant, e.g. covariance-based SEM uses fit indices such as RMSEA and TLI, while PLS-SEM uses Q-squared and R-squared). A code sketch of both steps follows this list.
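As a rough sketch of how the two steps can look in code, here is an example using the Python semopy package (my choice of tool, not mentioned in the thread). The construct and item names are invented; the measurement part (step 1) and the structural regression (step 2) are written in lavaan-style syntax, and the exact fit statistics returned may vary by semopy version.

```python
import pandas as pd
import semopy

# Hypothetical dataset with observed items x1-x3 and y1-y3
data = pd.read_csv("survey_items.csv")  # assumed file name

model_desc = """
# Step 1: measurement model (CFA) - items load on their latent constructs
Attitude  =~ x1 + x2 + x3
Intention =~ y1 + y2 + y3

# Step 2: structural model - hypothesised path between constructs
Intention ~ Attitude
"""

model = semopy.Model(model_desc)
model.fit(data)

print(model.inspect())           # loadings and path estimates
print(semopy.calc_stats(model))  # fit indices (e.g. RMSEA, TLI, CFI)
```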
Your question on composite reliability, AVE, and discriminant and convergent validity comes down to the factor loadings, and these statistics belong to the CFA (measurement model) stage in both SEM variants.
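To make that concrete, discriminant validity is often checked with the Fornell-Larcker criterion: the square root of each construct's AVE should exceed its correlations with the other constructs. A minimal sketch, assuming you already have each construct's AVE and the inter-construct correlation matrix (the construct names and numbers here are made up):

```python
import numpy as np
import pandas as pd

constructs = ["Attitude", "Intention"]
ave = pd.Series([0.61, 0.58], index=constructs)  # hypothetical AVEs
corr = pd.DataFrame([[1.00, 0.43],               # hypothetical latent correlations
                     [0.43, 1.00]], index=constructs, columns=constructs)

# Fornell-Larcker table: sqrt(AVE) on the diagonal should exceed the off-diagonal correlations
fl_table = corr.copy()
np.fill_diagonal(fl_table.values, np.sqrt(ave))
print(fl_table)

for c in constructs:
    others = corr.loc[c].drop(c).abs()
    ok = np.sqrt(ave[c]) > others.max()
    print(f"{c}: discriminant validity {'supported' if ok else 'questionable'}")
```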
These links show you what to expect for CFA. Although written for PLS-SEM, the measurement model (CFA) assessment stage remains similar in both SEM approaches.