However, multiple regression will do the job for you.
You did not specify the exact dependent variable. Nonetheless, there is a solution.
Multiple regression will tell you whether there is a relationship between your regressors and the dependent variable. It will also tell you whether this relationship/association is significant or not (by looking at the p-values or t-statistics). More importantly, I would advise you to pay attention to the size of the variance, i.e. the standard errors. The standard errors should tell you whether you have omitted, or included too many of, the variables that explain changes in the dependent variable.
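If it helps to see this concretely, here is a minimal sketch in Python with statsmodels; the variable names and the simulated data are only placeholders for your own regressors and dependent variable.

    # Hypothetical example: regress one outcome on several predictors with OLS.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "openness": rng.normal(size=200),
        "extraversion": rng.normal(size=200),
        "self_esteem": rng.normal(size=200),
    })
    # Simulated dependent variable (e.g. a music-preference score)
    df["preference"] = 0.5 * df["openness"] + 0.2 * df["self_esteem"] + rng.normal(size=200)

    X = sm.add_constant(df[["openness", "extraversion", "self_esteem"]])
    model = sm.OLS(df["preference"], X).fit()
    # The summary reports coefficients, standard errors, t-statistics and p-values
    print(model.summary())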
Correlation analysis, on its own, really does not explain much compared to regression. However, if you want to make some sense of correlation results, you will need to run the correlations among all your variables and make sure that the statistical software reports their respective p-values. Based on the p-values, you can tell whether there is any significant association. The usual caveat is to avoid, or at least minimize, multicollinearity, i.e. where two of your explanatory variables are very highly correlated.
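A quick sketch of how you might get pairwise correlations together with their p-values; again, the column names and random data below are only placeholders.

    # Pairwise Pearson correlations with p-values using scipy.
    from itertools import combinations
    import numpy as np
    import pandas as pd
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    df = pd.DataFrame(rng.normal(size=(200, 3)),
                      columns=["openness", "self_esteem", "preference"])

    for a, b in combinations(df.columns, 2):
        r, p = pearsonr(df[a], df[b])
        print(f"{a} vs {b}: r = {r:.2f}, p = {p:.3f}")  # look for pairs with small p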
If your aim is to combine regression and correlation, be sure that you have a justifiable reason for doing so. Why? Because correlation only tells you whether there is an association; it says nothing (or very little) about the nature of that association, whereas regression does say something about the association among your variables. If your goal is only to find an association, then use correlation and forget regression. On the other hand, if your goal is to say something about the association between your variables, then regression is the tool for that.
If your data are classified and given in terms of frequencies, then I would suggest you conduct a chi-square test for pairs of variables. You can do this with an Excel template, and after testing the variables two by two you will see which pairs depend on each other. A second advantage is that you do not need to specify which variable depends on which; you can discuss the dependencies based on your results. If you need a template, just let me know, I have plenty of them. As an example, I have attached a test done in Excel with my own template.
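If you prefer to work outside Excel, a minimal sketch of the same chi-square test of independence in Python looks like this; the frequency table below is invented purely to show the mechanics.

    # Chi-square test of independence on a 2x2 (or larger) contingency table.
    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: categories of variable A; columns: categories of variable B
    table = np.array([[30, 10],
                      [20, 40]])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    # A small p-value suggests the two classified variables are not independent.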
If you had two sets of variables, you could consider canonical correlation analysis: it works like regression, with the two sets mutually predicting one another. However, you want to analyse three sets of variables, even though one of them comprises only a single variable, and canonical analysis has no solution for three sets. But you could split self-esteem into N groups (such as quartiles or quintiles) and look for differences in the canonical equations across the N groups.
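For what it's worth, a minimal sketch of canonical correlation between two sets of variables in Python with scikit-learn might look like this; the data and dimensions are placeholders, e.g. personality scores in one set and music-preference ratings in the other.

    # Canonical correlation between two sets of variables with scikit-learn.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 5))   # e.g. personality scores
    Y = rng.normal(size=(200, 3))   # e.g. music-preference ratings

    cca = CCA(n_components=2)
    X_c, Y_c = cca.fit_transform(X, Y)
    # Correlation of each pair of canonical variates
    for i in range(2):
        r = np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1]
        print(f"canonical correlation {i + 1}: {r:.2f}")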
Another solution is to resort to categorical canonical analysis: you group each variable into, let's say, deciles, and then apply CCA to the three subsets (personality, self-esteem, music preference). The significant (or meaningful) latent variables will show the relationship among the three sets. This would also allow you to explore non-linear relations. You can analyse your data in SPSS (categorical analysis, CATPCA).
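The discretisation step (grouping a continuous variable such as self-esteem into quantiles before the categorical analysis) can be sketched as follows, assuming pandas; in SPSS the CATPCA procedure handles this kind of grouping for you.

    # Group a continuous variable into quantile categories with pandas.qcut.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    self_esteem = pd.Series(rng.normal(size=200), name="self_esteem")

    # Quartiles here (q=4); use q=10 for deciles
    groups = pd.qcut(self_esteem, q=4, labels=["Q1", "Q2", "Q3", "Q4"])
    print(groups.value_counts())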
I would agree that, if you are still pursuing any of this, a correlation matrix would help you initially. I don't know how sophisticated you are looking to get if you are continuing with the project. It gives you an idea of how strong the associations are and will also give you some insight into what to expect in the subsequent regression. It will also tell you whether any of the items are so highly correlated that they are measuring the same thing, in which case you might need to pursue indices or factors. Coding your variables and deciding what you "want to know" will also inform your methodological choices.
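As a rough illustration, an initial correlation matrix can be produced in a couple of lines; the item names and data here are placeholders.

    # Quick correlation matrix to spot items that may be measuring the same thing.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    df = pd.DataFrame(rng.normal(size=(200, 4)),
                      columns=["item1", "item2", "item3", "item4"])

    corr = df.corr()
    print(corr.round(2))
    # Off-diagonal values above, say, 0.8 may indicate redundant items that
    # are better combined into an index or factor.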
You asked a general question with multiple possible answers! It depends on what you are looking for and what you want to achieve; for instance, a simple t-test can be useful, and I would also suggest a chi-square test for general work, but, as I said, it depends on the objectives you want to reach.
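As a small illustration of the t-test option, assuming two invented groups:

    # Independent-samples t-test with scipy; the groups are simulated.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(5)
    group_a = rng.normal(loc=0.0, size=50)
    group_b = rng.normal(loc=0.3, size=50)

    t, p = ttest_ind(group_a, group_b)
    print(f"t = {t:.2f}, p = {p:.3f}")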
I would recommend the analysis proposed by Patricia, since this CFA/SEM method reduces measurement error and is also the most flexible for testing different relations and models.
I am a little puzzled, though, by your indication that your variables were measured on an interval scale, since they come from questionnaires, which typically give ordered categorical (ordinal) variables. Ordinarily you should analyse these as categorical, which can be done with the mentioned method. Whether they could also be analysed as interval data depends on several factors; however, using CFA/SEM methods it is possible to test which type of analysis is best.
I would recommend using the Mplus program. It is the most powerful (in terms of the types of data and the types of analyses possible) and also the easiest (because of its meaningful analysis defaults, which can be overridden if necessary). The web site www.statmodel.com offers a lot of free teaching material, including videos of presentations and downloadable slides, as well as the possibility to try the program by downloading a free (not time-limited) demo with the same capabilities as the full program, only limited in the number of indicators (variables) it will include.
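If you would like to try the same kind of model outside Mplus first, here is a rough sketch of a small CFA/SEM in Python with the semopy package; using semopy is only an assumption here so the example can be self-contained, and the factor structure, variable names, and simulated data are all invented.

    # Rough CFA/SEM sketch with semopy (lavaan-style model syntax).
    import numpy as np
    import pandas as pd
    from semopy import Model

    rng = np.random.default_rng(6)
    f1 = rng.normal(size=300)
    f2 = 0.4 * f1 + rng.normal(size=300)          # two related latent factors
    data = pd.DataFrame({
        "x1": f1 + rng.normal(scale=0.5, size=300),
        "x2": f1 + rng.normal(scale=0.5, size=300),
        "x3": f1 + rng.normal(scale=0.5, size=300),
        "y1": f2 + rng.normal(scale=0.5, size=300),
        "y2": f2 + rng.normal(scale=0.5, size=300),
        "y3": f2 + rng.normal(scale=0.5, size=300),
    })

    desc = """
    F1 =~ x1 + x2 + x3
    F2 =~ y1 + y2 + y3
    F2 ~ F1
    """
    model = Model(desc)
    model.fit(data)
    print(model.inspect())  # loadings, structural path, standard errors, p-values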
I would recommend the analysis proposed by Patricia, since this CFA/SEM method reduces measurement error and is also the most flexible for testing different relationships and models. A free program to consider is jamovi, which allows you to perform CFA/SEM in a simple way. You can also compute the correlation matrix between the variables, including a graphical plot. I would also suggest the TOSTER option, which can be downloaded from the jamovi library, to evaluate whether there is equivalence between factors, groups, or sexes when correlations are absent.