A systematic review attempts to identify, assess, and synthesize all published work pertinent to a specific research question. A meta-analysis involves all of this as well, through a systematic identification and inclusion of studies. However, in a meta-analysis we additionally use statistical methods to synthesize information across studies. This can involve estimating the overall effect of a treatment from numerous studies, explaining heterogeneity between the effects found in the studies, and testing for publication bias. In short, to me a systematic review is a way of going about writing a comprehensive review article, whereas a meta-analysis is a quantitative approach that investigates between-study heterogeneity.
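To make the "statistical synthesis" part concrete, here is a minimal sketch (my own illustration, not taken from any particular review or textbook method) of fixed-effect, inverse-variance pooling; the effect estimates and standard errors are invented purely for demonstration.

    import math

    # Hypothetical study-level effect estimates (e.g. log odds ratios) and standard errors.
    effects  = [0.30, 0.10, 0.25, 0.05]
    std_errs = [0.12, 0.20, 0.15, 0.10]

    # Inverse-variance weights: more precise studies count for more.
    weights = [1.0 / se ** 2 for se in std_errs]

    # Fixed-effect pooled estimate and its standard error.
    pooled    = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    # Approximate 95% confidence interval around the pooled effect.
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled effect = {pooled:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")

The usual payoff is the narrower confidence interval around the pooled estimate than any single study provides on its own.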
A systematic review is the way you gather all the relevant studies. A meta-analysis is the statistical method you use to combine the estimated effects from the individual studies.
A systematic review is the first phase of a meta-analytic study, in which all relevant documents are gathered systematically. A meta-analysis, by contrast, is the statistical method for combining the effect measures from the studies identified in the systematic review.
Daniel & Gyanendra sum it up succinctly although I find Daniels final summarry slightly confusing:
"In short, to me a systematic review is a way of going about writing a comprehensive review article, whereas a meta-analysis is a quantitative approach that investigates between-study heterogeneity."
It seems to imply that the main purpose of meta-analysis is to 'investigate' heterogeneity. I can see that, in a formal way, it could be argued that it is, but I think it might be more useful to restate it as:
"A systematic review is a way of going about a review of the evidence using explicit methods to ensure that all evidence addressing a focused question is considered and the evidence is selected and presented in such a way as to minimize bias. A meta-analysis is a quantitative approach to synthesizing the evidence in such reviews, generally to obtain a more precise estimate of treatment effectiveness or some other parameter."
So meta-analysis can be regarded either as a subset of systematic reviews (as per Gyanendra and other answers) or as a particular set of techniques applied to data within a review. Of course the techniques can be applied to any data, but they are meaningless without the underlying systematic review and unbiased selection of evidence.
Systematic Review and Meta-analysis: When One Study Is Just Not Enough, by Amit Garg, Dan Hackam, and Marcello Tonelli, published in the Clinical Journal of the American Society of Nephrology. doi: 10.2215/CJN.01430307
A systematic review aims to identify all the relevant literature in a systematic manner. The protocol is prespecified and normally at least 2 people will decide which papers are included and extract relevant data.
A meta-analysis is a statistical method of combining the results. It is not always appropriate to do this, particularly if there is a lot of heterogeneity between the studies. You can do a systematic review without a meta-analysis.
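As an illustration of the heterogeneity point, the usual checks are Cochran's Q and the I² statistic; the sketch below (my own example with invented numbers, not from any real review) computes both from the same kind of inputs as in the pooling sketch above.

    # Hypothetical study effects and standard errors, as in the earlier sketch.
    effects  = [0.30, 0.10, 0.25, 0.05]
    std_errs = [0.12, 0.20, 0.15, 0.10]
    weights  = [1.0 / se ** 2 for se in std_errs]
    pooled   = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

    # Cochran's Q: weighted squared deviations of each study from the pooled effect.
    q  = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1

    # I^2: rough percentage of total variability due to between-study heterogeneity.
    i_squared = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    print(f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.1f}%")

A very high I² is one common signal that pooling the studies into a single summary estimate may not be appropriate.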
Peter Donald Griffiths sums this up well, I think. Also, a meta-analysis can be misleading if performed without a systematic review to ensure the data included are comprehensive and appropriately weighted for quality and the information provided. Within a systematic review, however, it is not always appropriate to perform a meta-analysis.
I agree with most responses above. Generally, we moved from summatively DESCRIBING previous studies (simple literature review) to ANALYTICALLY summarizing previous studies (systematic review). Because a simple literature review has a higher likelihood of bias, systematic reviews try to mitigate this by defining the selection criteria for articles a priori. Additionally, during a systematic review, one MAY CHOOSE to employ statistical techniques to quantify actual benefit or risk. If results derived from such rigorous pooled measures are reported, the study becomes a meta-analytical systematic review. Therefore, both terms can be used interchangeably where appropriate. But not all systematic reviews have a meta-analytical component (especially earlier articles).
PS: Because of such concerted efforts as CONSORT, acceptable systematic reviews must adhere to sound meta-analytical techniques.
Practically speaking, systematic overviews will have a quality score or some rigorous way to evaluate the quality of the individual studies being put together. Although meta-analyses will have inclusion/exclusion criteria and a rigorous literature search, they do not focus on the details of individual trial methodology or quality, but rather on individual trial results, their statistical distribution, and their aggregate results. Systematic overviews generally do both: a rigorous appraisal of trial methodology and a quantitative summary of trial results.
1: A systematic review is a step-by-step process for performing an unbiased literature review. It may or may not include a meta-analysis, which is a quantitative analysis of numerical data, or a meta-synthesis, which is a qualitative analysis of data.
2: A meta-analysis combines the numerical results of similar studies to provide a summary conclusion about the quantitative evidence. Most of the time you have to do a systematic review to find the studies for a meta-analysis, but not always. Sometimes, if you have access to the numerical data, you may not have to do a systematic review. For example, some drug companies run trials on their own newly developed drug and already hold the data, so they run a meta-analysis without doing a systematic review, because they know they are the only people with access to the data and they have all the data on the new drug. Otherwise, as Peter says in this discussion, it is not recommended to do a meta-analysis without first doing a systematic review.
So:
1) There are many systematic reviews that do not perform a meta-analysis.
2) There are some meta-analyses performed without a systematic review [mostly from drug companies].
3) The majority of systematic reviews also include a meta-analysis.
A clear explication from Farhad above. It is worth adding that a meta-analysis without a systematic review is likely to be meaningless, because it is simply an aggregation of 'some data' with no indication that it represents the population (of research studies) in an unbiased fashion. By contrast, a meta-analysis of results within the context of a properly conducted systematic review should:
i) minimise the risk of bias (because of steps taken to identify and select all valid studies)
ii) allow the risk of bias that arises from limitations of both the review and the original research to be properly assessed (because the process is systematic, replicable and includes an assessment of risk of bias in the source studies)
Perhaps there are some exceptions that I have not considered, but it strikes me that while you CAN do a meta-analysis without a systematic review, you SHOULD NOT actually do it (except as a learning exercise).
A meta-analysis requires deep knowledge and understanding of the analysis method, problem definition, and data used in each individual 'participating' publication. Weights are given to results according to the significance of each study, which is determined by the quality and completeness of the data used, the strength of the relationships, and more (e.g. relevance/specificity of the sample, appropriateness). It is very easy to get a meta-analysis wrong, and it requires mastery and deep subject-matter knowledge to conduct a successful one.
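For the purely statistical side of that weighting, one common scheme is the DerSimonian-Laird random-effects model, in which each study's weight combines its own precision with an estimate of between-study variance. The sketch below uses hypothetical numbers and does not capture the quality- and relevance-based judgements described above, which are usually handled through risk-of-bias assessment rather than in the formula itself.

    # Hypothetical study effects and standard errors, as in the earlier sketches.
    effects  = [0.30, 0.10, 0.25, 0.05]
    std_errs = [0.12, 0.20, 0.15, 0.10]
    w  = [1.0 / se ** 2 for se in std_errs]
    fe = sum(wi * e for wi, e in zip(w, effects)) / sum(w)

    # Between-study variance (tau^2) estimated from Cochran's Q (DerSimonian-Laird).
    q    = sum(wi * (e - fe) ** 2 for wi, e in zip(w, effects))
    df   = len(effects) - 1
    c    = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights: each study's sampling variance plus the between-study variance.
    w_re = [1.0 / (se ** 2 + tau2) for se in std_errs]
    pooled_re = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    print(f"tau^2 = {tau2:.4f}, random-effects pooled effect = {pooled_re:.3f}")

Once tau² is greater than zero, the weights become more even across studies than under the fixed-effect model, which is one reason the choice of model itself requires the judgement described above.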