Evolutionary algorithms are one type of metaheuristic algorithm. Metaheuristics are divided into two categories: single-solution-based metaheuristics (S-metaheuristics) and population-based metaheuristics (P-metaheuristics). EAs are considered one type of P-metaheuristic, an example being the Genetic Algorithm (GA).
In fact, I confirm what Hamed says: a metaheuristic is a higher-level procedure, and metaheuristics include simulated annealing, evolutionary algorithms, ant colony optimization and particle swarm optimization. To allow you to broaden your knowledge on this subject, here are some attached files.
A heuristic is a technique for solving problems. It is used to speed up the search process, but you usually don't get the exact solution, so it is used when the problem is very hard to solve and you are satisfied with an approximate solution. Heuristics are problem-specific. Metaheuristics were designed to overcome this limitation; that is why evolutionary algorithms are a type of metaheuristic. Evolutionary algorithms are also stochastic optimizers because they use random variables, which means that each run of the algorithm can produce a different solution (it is non-deterministic). Hope this helps to better understand these terms.
All of the answers above contribute to a general understanding of the topic. Let me see if I can add some more.
For a particular class of optimization problems, finding the optimal solution is beyond the ability of exact algorithms (NP-hard problems). This is where heuristic optimization comes into play, traditionally by means of problem-specific heuristics that can be used by greedy algorithms (constructive, meaning they create a solution from scratch) or by hill-climbing algorithms (perturbative, meaning they try to improve an existing, feasible solution). They can also be combined into a single algorithm, where you start from a greedy solution and try to improve it via hill climbing.
The issue with any of those approaches is that they are sub-optimal in the context of multi-modal problems, meaning they get trapped in regions of the search space where they can only find a locally optimal solution. For greedy algorithms this is pretty straightforward, as they are optimal only for problems with particular characteristics. For hill climbing, what happens is that the degree of perturbation allowed (needed to keep the search computationally feasible) constrains which solutions can be reached from the solution you currently have. In that "neighborhood" of solutions, the best ones you can find are still not the global optimum, even though you cannot improve them any further using the perturbation you selected.
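To make this concrete, here is a minimal hill-climbing sketch in Python (the multimodal objective, step size and iteration budget are all just illustrative choices, not any specific method from this discussion); with a small neighbourhood the search can easily settle on a merely local maximum:

import math
import random

def f(x):
    # Multimodal objective: several local maxima, global maximum near x = 0.3
    return math.sin(5 * x) * math.exp(-x * x)

def hill_climb(x, step=0.05, iters=2000):
    # Perturbative search: accept a random neighbour only if it improves f
    for _ in range(iters):
        neighbour = x + random.uniform(-step, step)
        if f(neighbour) > f(x):
            x = neighbour
    return x

random.seed(3)
start = random.uniform(-2.0, 2.0)
best = hill_climb(start)
print(f"started at x = {start:.3f}, converged to x = {best:.3f}, f(x) = {f(best):.3f}")

Depending on the starting point, the run ends on whichever local maximum happens to be reachable within the chosen step size.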
To avoid these situations, and also to have problem-independent search strategies, many different ideas have been proposed; they are known as stochastic local search methods or metaheuristic algorithms. In a nutshell, they are iterative approaches that include some mechanism for escaping local optima, such as applying a random perturbation after a local optimum has been found and then resuming the search from a different region of the search space.
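A rough sketch of that escape mechanism, in the spirit of iterated local search (again with purely illustrative functions and parameters): after each local optimum, the current solution is kicked by a larger random perturbation and hill climbing is resumed from there:

import math
import random

def f(x):
    # Same multimodal objective as in the hill-climbing sketch above
    return math.sin(5 * x) * math.exp(-x * x)

def hill_climb(x, step=0.05, iters=500):
    for _ in range(iters):
        neighbour = x + random.uniform(-step, step)
        if f(neighbour) > f(x):
            x = neighbour
    return x

def iterated_local_search(restarts=20, kick=1.0):
    # After each local optimum, apply a larger random perturbation ("kick")
    # so the search resumes from a different region of the search space
    best = hill_climb(random.uniform(-2.0, 2.0))
    for _ in range(restarts):
        candidate = hill_climb(best + random.uniform(-kick, kick))
        if f(candidate) > f(best):
            best = candidate
    return best

random.seed(0)
x = iterated_local_search()
print(f"best x = {x:.3f}, f(x) = {f(x):.3f}")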
Evolutionary algorithms (EAs) are one type of metaheuristic known as population-based, which are essentially algorithms where the learning comes from interactions between multiple candidate solutions. In the case of EAs, besides applying perturbations to solutions (mutation), they also create novel solutions by recombining existing ones (crossover).
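As a rough illustration of that population-based idea (not any particular EA from the literature; the operators and parameter values below are just reasonable defaults), a minimal EA loop with uniform crossover and Gaussian mutation might look like this in Python:

import random

def fitness(ind):
    # Toy objective to maximise; the optimum is at the origin
    return -sum(x * x for x in ind)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.2, sigma=0.3):
    # Gaussian perturbation applied gene-wise with probability `rate`
    return [x + random.gauss(0, sigma) if random.random() < rate else x for x in ind]

def evolve(pop_size=30, dims=5, generations=100):
    pop = [[random.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children        # next generation
    return max(pop, key=fitness)

random.seed(42)
best = evolve()
print("best individual:", [round(x, 3) for x in best], "fitness:", round(fitness(best), 4))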
If you take a look at the Handbook of Metaheuristics by Gendreau and Potvin you'll see that there are many more metaheuristics, each with a different insight into how to avoid local optima and have an effective search. The question of which metaheuristic to choose and which ideas to implement for a given problem takes some experience (I'd say it's also NP-hard :P), or you can try automatic design approaches.
To keep it simple, let's say that a heuristic is a specific technique for solving a specific optimization problem and often works with one candidate solution at each step. On the other hand, a metaheuristic is more general and not tied to a specific problem; note also that a metaheuristic often works simultaneously with more than one candidate solution (a population).
Heuristic = A general (although rather simplistic) solution strategy or solution creation strategy (constructive heuristic).
Metaheuristic = A wide framework embodying a general heuristic scheme that can be adapted to a large number of problems and families of problems.
Evolutionary Algorithms = A family of metaheuristics that are population-based, meaning that a pool of solutions is used in every iteration of the solution process, and that use operators to combine/modify the solutions, aiming to iteratively improve/evolve the solutions of the pool based on a fitness function. The general heuristic pursued is inspired by Darwinian evolution, which is why several solutions are handled as a population of individuals that are combined to produce offspring, randomly, aiming to improve on the parents' fitness evaluation (the optimization criterion), thus evolving the initial population throughout the optimization process.
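To illustrate the fitness-driven selection step described above, here is a small, self-contained sketch of fitness-proportionate (roulette-wheel) selection; the toy population and fitness values are invented for the example:

import random

def roulette_select(population, fitnesses):
    # Fitness-proportionate selection: individuals with higher fitness are
    # more likely to become parents (assumes non-negative fitness values)
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

random.seed(1)
pop = ["A", "B", "C", "D"]
fits = [1.0, 2.0, 3.0, 10.0]
chosen = [roulette_select(pop, fits) for _ in range(10)]
print(chosen)  # "D" should dominate, since it carries most of the total fitness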
Although we have found in our research (please see the Haag Theorem and the String Theory Econophysics papers on this ResearchGate site /Soumitra K. Mallick) that metaheuristics and evolutionary algorithms can both be genetically programmed by using steady-state cardinal evaluation criteria like the utility function, they are not similar in terms of the information criterion. In the metaheuristics case the data is present at either end of the algorithm, while in evolutionary programs the data is enveloped; precisely because of the metaheuristics, the evolutionary episodes have to be tied to the heuristics as the inference. Please also see the references in the papers. This follows from String Theory.
Soumitra K. Mallick
for Soumitra K. Mallick, Nick Hamburger, Sandipan Mallick
I basically agree with the previous answers. I would just add that, besides Genetic Algorithms, all those algorithms that evolve populations of solutions through systems of operators can be understood as Evolutionary. If the operators imitate genetic processes, we are in the presence of Genetic Algorithms. Starting from this idea I have developed the Integration of Variables Method and some of its algorithms, which use non-genetic operators to evolve populations of codes. I am attaching two works in which you can find the method and some of the developed algorithms and applications.
José is right when he says that Evolutionary algorithms comprise a broad range of approaches, but in general this term is used to refer to Genetic Algorithms, Evolution Strategies or Differential Evolution.
Genetic Algorithms and Evolution Strategies have become quite similar, the main distinguishing feature nowadays being that ES approaches commonly use self-adaptation (co-evolution of the strategy parameters) to configure their own parameters.
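A minimal sketch of that self-adaptation idea, assuming a simple (1+1)-ES with log-normal adaptation of a single step size sigma (the objective function and constants are illustrative only):

import math
import random

def fitness(ind):
    # Sphere function (to minimise); optimum at the origin
    return sum(x * x for x in ind)

def es_one_plus_one(dims=5, generations=200, tau=None):
    # The mutation step size sigma is itself mutated and kept only when it
    # produces a better offspring, so the strategy parameter co-evolves
    # with the solution
    tau = tau or 1.0 / math.sqrt(dims)
    x = [random.uniform(-5, 5) for _ in range(dims)]
    sigma = 1.0
    for _ in range(generations):
        child_sigma = sigma * math.exp(tau * random.gauss(0, 1))
        child = [xi + child_sigma * random.gauss(0, 1) for xi in x]
        if fitness(child) <= fitness(x):
            x, sigma = child, child_sigma
    return x, sigma

random.seed(7)
best, final_sigma = es_one_plus_one()
print("fitness:", round(fitness(best), 6), "final sigma:", round(final_sigma, 4))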
Differential Evolution is very different from the others in that solutions are improved through vectorial operations, much like a gradient-descent approach. The main difference is that in DE new solutions result from vectorial operations between existing solutions (vectors in the search space), rather than from trying to identify the most promising hill-climbing path (and, as you can see, its application has therefore been mostly restricted to continuous optimization so far).
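For a concrete picture of those vectorial operations, here is a compact DE/rand/1/bin-style sketch in Python (the population size and the F and CR values are typical textbook defaults, not prescribed by the answer above):

import random

def fitness(v):
    # Sphere function (to minimise)
    return sum(x * x for x in v)

def differential_evolution(dims=5, pop_size=20, generations=200, F=0.8, CR=0.9):
    pop = [[random.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Vectorial mutation: combine three other population members
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            mutant = [a[d] + F * (b[d] - c[d]) for d in range(dims)]
            # Binomial crossover between the target vector and the mutant
            j_rand = random.randrange(dims)
            trial = [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dims)]
            # Greedy replacement: keep the trial only if it is at least as good
            if fitness(trial) <= fitness(pop[i]):
                pop[i] = trial
    return min(pop, key=fitness)

random.seed(0)
best = differential_evolution()
print("best fitness:", round(fitness(best), 8))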
Usually, two classes of optimisation techniques exist, namely Arithmetic Programming Approaches (APAs) and Meta-heuristic Programming Approaches (MPAs). The first class searches for a local optimum; however, it suffers from its limits, especially when a global optimum is sought. The second class can find a global optimum, although not the exact solution.
In metaheuristic approaches we can find different families, as follows:
1) Swarm-based algorithms such as PSO, ACO, ABC, Bat, etc.
2) Evolutionary-based optimization such as GA, DE, etc.
In addition, Evolutionary Algorithms contain all algorithms based on the notion of a population (P) that use selection and recombination to generate new points in the search space.
Metaheuristics can be categorized, based on their common features, into at least nine basic categories:
Evolution-inspired algorithms: These algorithms attempt to imitate the rules and laws of natural evolution in the biological world. Regardless of their nature, these evolution-based optimization algorithms are regarded as generic population-based metaheuristic algorithms. The search process of this class of algorithms has two focal stages: exploration and exploitation. The exploration phase precedes the exploitation phase, which can be regarded as the process of searching the promising regions of the search space in detail. At the exploration stage, the search process is launched with a randomly generated population, which is then evolved over a number of subsequent generations. The most notable point of these heuristics is that the next generation of individuals is shaped by selecting the best individuals and combining them together. Through this combination, the population is enhanced over the succeeding generations. On this basis, the optimizer at the exploration stage includes some design parameters that have to be randomized as much as possible to globally explore the promising regions of the solution search space.
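One simple way to picture the exploration-to-exploitation transition described above is a mutation step size that shrinks over the generations; the sketch below is only an illustration and is not taken from the cited reference:

import random

def fitness(ind):
    # Toy objective to maximise; the optimum is at the origin
    return -sum(x * x for x in ind)

def evolve(pop_size=30, dims=3, generations=100):
    # Exploration: start from a randomly generated population
    pop = [[random.uniform(-10, 10) for _ in range(dims)] for _ in range(pop_size)]
    for g in range(generations):
        # Exploitation gradually takes over: the mutation step shrinks
        # as the generations progress
        sigma = 2.0 * (1.0 - g / generations) + 0.01
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        offspring = [[x + random.gauss(0, sigma) for x in random.choice(elite)]
                     for _ in range(pop_size - len(elite))]
        pop = elite + offspring
    return max(pop, key=fitness)

random.seed(5)
best = evolve()
print("best fitness:", round(fitness(best), 4))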
To complete the discussion, please refer to page 39 and Figure 5 of the following reference:
Article Sea Lion Optimization Algorithm for Solving the Maximum Flow Problem