Meta-analyses are often published without an accompanying systematic review, or with only a search strategy and no thorough evaluation of the methods under which the included studies were performed.
My personal opinion is that a systematic review should include not only a search strategy that enables other researchers to replicate your findings but also an evaluation, or at least an analysis, of the methods used. The evaluation issue is not a trivial one: the quality assessment tools that exist, although widely used, are surrounded by controversy.
I agree with Steffen that if high heterogeneity exists then studies should not be pooled together, but are we talking about statistical heterogeneity or heterogeneity in the ways these studies were conducted? I have read quite a few meta-analyses where the authors do not even explore the sources of heterogeneity in the studies they analyze, and this goes undetected by the reviewers as well.
I agree with all of the above; my contribution is to point out that the systematic review should include not only a reproducible literature search strategy but also: 1) a clearly articulated and reproducible strategy for including/excluding studies; and 2) an assessment of the quality of the individual studies.
I mostly agree with Steffen's response, especially in that I prefer using "meta-analysis" to mean statistical techniques for comparing and combining quantitative results from several similar studies. Regarding heterogeneity, however, in some situations it's informative to use meta-analytic techniques even in the presence of substantial heterogeneity, such as to quantify this variation/inconsistency or explore it using study-level features as covariates in meta-regression (e.g., effect moderators).
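To make the heterogeneity point concrete, here is a minimal Python sketch (with entirely made-up effect sizes and variances, not taken from any real review) of the quantities usually reported before deciding whether and how to pool: Cochran's Q, I², and the DerSimonian-Laird estimate of the between-study variance τ². In practice dedicated software (e.g., the metafor package in R) computes these and fits the meta-regressions mentioned above.

```python
# Minimal sketch with hypothetical data: quantify heterogeneity before pooling.
import numpy as np

yi = np.array([0.30, 0.45, 0.10, 0.62, 0.25])  # study effect estimates (e.g., log odds ratios)
vi = np.array([0.04, 0.09, 0.02, 0.12, 0.05])  # within-study variances

wi = 1.0 / vi                                   # inverse-variance (fixed-effect) weights
theta_fe = np.sum(wi * yi) / np.sum(wi)         # fixed-effect pooled estimate

# Cochran's Q: does the observed variation exceed what chance alone would produce?
Q = np.sum(wi * (yi - theta_fe) ** 2)
df = len(yi) - 1

# I^2: percentage of total variation attributable to between-study heterogeneity
I2 = max(0.0, (Q - df) / Q) * 100

# DerSimonian-Laird estimate of the between-study variance tau^2
C = np.sum(wi) - np.sum(wi ** 2) / np.sum(wi)
tau2 = max(0.0, (Q - df) / C)

print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%, tau^2 = {tau2:.3f}")
```

With τ² in hand, a random-effects pooled estimate simply re-weights the studies by 1/(vi + τ²), and a meta-regression adds study-level covariates (potential moderators) to that weighted model.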
For better or for worse, terminology for research synthesis varies considerably across disciplines and authors and changes over time. For instance, many social/behavioral scientists still use "meta-analysis" to refer to the larger endeavor of research synthesis, though this often seems restricted to syntheses of quantitative empirical results.
To give you a sense of the wide variety of terminology, here's a sample of terms I've seen used for activities that could be considered types or aspects of research synthesis: systematic review, meta-analysis, pooled analysis, data fusion, quantitative review, overview, tertiary review, best-evidence synthesis, narrative review, meta-synthesis, meta-ethnography, validity generalization, reliability generalization, benefit transfer, health technology assessment.
The following two articles offer useful distinctions among different types of research syntheses:
Cooper, H. (2003). Editorial. Psychological Bulletin, 129, 3-9. doi:10.1037/0033-2909.129.1.3
Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information and Libraries Journal, 26, 91–108. doi:10.1111/j.1471-1842.2009.00848.x
Finally, this post of mine on the Meta-Analysis Resources site describes my (free) bibliography on methodology for research synthesis, with particular attention to introductory treatments of meta-analysis and research synthesis more generally:
By way of an update to that post, I'll note that the largest publicly available version of my bibliography is now posted as (free) supporting information with this article:
Hafdahl, A. R. (2012). Article Alerts: Items from 2011, Part I. Research Synthesis Methods. doi:10.1002/jrsm.1069
If there is publication bias, then the meta-analysis will also produce biased estimates.
The validity of a meta-analysis is therefore crucial.
To assess it, we usually use funnel plots.
For more on the statistical methodology used in meta-analyses, you may find this article useful; it has a nice chapter on the funnel plot and its interpretation.
If bias is present, the funnel plot is usually asymmetrical and skewed.
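Since a couple of posts mention funnel plots, here is a minimal Python sketch of one using hypothetical effect estimates and standard errors (all numbers are invented for illustration). The effect estimate goes on the x-axis and the standard error, inverted, on the y-axis, so small imprecise studies scatter widely at the bottom and marked asymmetry hints at possible publication bias.

```python
# Minimal funnel plot sketch with hypothetical study results.
import numpy as np
import matplotlib.pyplot as plt

yi = np.array([0.12, 0.35, 0.48, 0.22, 0.55, 0.30, 0.64, 0.41])   # study effect estimates
sei = np.array([0.05, 0.10, 0.20, 0.08, 0.25, 0.12, 0.30, 0.15])  # their standard errors

pooled = np.sum(yi / sei**2) / np.sum(1.0 / sei**2)  # fixed-effect pooled estimate

fig, ax = plt.subplots()
ax.scatter(yi, sei)
ax.axvline(pooled, linestyle="--")  # reference line at the pooled estimate
ax.invert_yaxis()                   # convention: most precise studies at the top
ax.set_xlabel("Effect estimate")
ax.set_ylabel("Standard error")
ax.set_title("Funnel plot (hypothetical studies)")
plt.show()
```

Formal asymmetry tests (e.g., Egger's regression) complement the visual check, though both the plot and the tests have low power when only a few studies are available.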
Adding to what has been said, especially by Anastasia Chalkidou and Ted Stevenson, the most prominent difference is the risk of publication bias, which is greater in a meta-analysis done without a systematic review. If the meta-analysis is based on a systematic review, the risk of bias is dramatically lower, if not eliminated.
From a practical perspective, it is always a good idea to first search for pre-appraised systematic reviews with meta-analysis, and then for unfiltered systematic reviews with meta-analysis. The unfiltered ones must be appraised critically before you rely on them.
Then look for any meta-analyses that were not covered in the systematic reviews you already found; this will give you a clue about what those reviews missed.
A systematic review is a summary of the evidence on a particular topic, typically prepared by an expert or expert panel, that uses a rigorous process for identifying, appraising, and synthesizing studies to answer a specific clinical question.
Meta-analysis: many systematic reviews incorporate quantitative methods to summarize the results from multiple studies. These reviews are called meta-analyses.
Melnyk, B. M., & Fineout-Overholt, E. (2005). Evidence-Based Practice in Nursing & Healthcare. Philadelphia, PA: Lippincott Williams & Wilkins.
A systematic review answers a defined research question by collecting and summarising all empirical evidence that fits pre-specified eligibility criteria.
A meta-analysis is the use of statistical methods to summarise the results of these studies.
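As a deliberately tiny illustration of that "statistical methods" step, here is a hedged Python sketch of an inverse-variance fixed-effect pooled estimate with a 95% confidence interval; the effect sizes and variances are hypothetical.

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling with hypothetical inputs.
import numpy as np

yi = np.array([0.20, 0.35, 0.15, 0.40])  # hypothetical study effect estimates
vi = np.array([0.03, 0.06, 0.02, 0.08])  # their variances

wi = 1.0 / vi
pooled = np.sum(wi * yi) / np.sum(wi)
se = np.sqrt(1.0 / np.sum(wi))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"Pooled effect = {pooled:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```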
I agree with all of the answers given above. In addition, a meta-analysis is purely quantitative research, while a systematic review can be quantitative, qualitative, or mixed, depending on the research question to be answered.
Yes, the discussion above answered the question about the difference between the two methods. In addition, I recently came across several articles in BMC open-access journals on meta-ethnography, which look interesting. Maybe someone can elaborate on this in the future.
I agree with the contributors. A meta-analysis mostly deals with effects, i.e., quantitative results, while a systematic review is more in-depth and may incorporate a meta-analysis.
I agree with the answers above, that the systematic review and meta-analysis are two separate (but often related) approaches to answering your research question.
The systematic review approach may be a little different from a non-systematic or narrative review. It is typically used to ensure that your methods can be replicated by future researchers. I often tell my students to think of the "systematic" part as a "research road map" that gives directions to other scientists and leads them through all of the twists and turns you encountered before arriving at your final product. Because of this, we try to document the exact search terms used, reasons for excluding full-text articles, etc. Some would argue that the systematic review, when performed properly, reduces the likelihood of the bias a narrative review may have when the author is allowed to select the articles for the review on their own. With a systematic review, however, I usually take the perspective that other important articles (that may not have met the inclusion criteria) can still be included in the discussion section to help frame your topic in the broader context of the cumulative body of research.
The PRISMA guidelines should help provide some guidance.
http://www.prisma-statement.org
Meta-analysis is a statistical approach used to analyze a large collection of analysis results from individual studies to integrate findings that may vary across participants, contexts, and other study design factors. It is a valuable tool for practitioners, because this technique allows the reader to examine the treatment effect across multiple studies, yielding a larger sample than is possible in a single study and providing a more precise estimate of the treatment effect at the population level. Because of this, a major rationale for the use of meta-analysis is to increase the statistical power of hypothesis tests, and potentially to identify trends in the available data not detected in the original research. It is likely that a better estimate of the true population effect, as well as of potential moderators, can be obtained from this aggregate-level data due to the increased statistical power.

However, a systematic review and meta-analysis can only be as good as the articles on which it is based. So in that sense, I often tell students to imagine the meta-analysis as the "icing on the cake" or the "finishing touches" on your systematic review. If we are trying to gain a more thorough understanding of the basic fundamentals of a research area, I believe that a systematic review can be performed without a meta-analysis; however, a meta-analysis should always be built on a methodologically strong systematic review (except in special circumstances).
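To illustrate the precision/power argument with made-up numbers: the standard error of an inverse-variance pooled estimate is never larger than that of the most precise individual study, which is why confidence intervals narrow and power increases when studies are combined (assuming, of course, that pooling is appropriate in the first place). A minimal sketch:

```python
# Minimal sketch (hypothetical numbers): pooling shrinks the standard error.
import numpy as np

sei = np.array([0.20, 0.15, 0.25, 0.18])         # standard errors of four hypothetical studies
se_pooled = np.sqrt(1.0 / np.sum(1.0 / sei**2))  # fixed-effect pooled standard error

print(f"Smallest single-study SE: {sei.min():.3f}")
print(f"Pooled SE:                {se_pooled:.3f}")  # always <= the smallest single-study SE
```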