The question was not really specific. I assumed the questioner was looking for a single compile invocation for programs written in a single language, so I gave the simplest example I could find, hoping to have the questioner think about why there had to be a clue or a step to identify the language first.
If the question was about mixing statements from multiple languages within a single program, well, that would amount to creating a new language, and I would hate to look at the standard for that. (The Fortran 90 ISO standard on my shelf is 369 pages by itself!)
Have a look at the LLVM project. As Peter points out, a compiler is actually a set of tools that run in stages. The front end must be language-specific, but it can (and does) translate to an intermediate representation that can be shared by many languages. LLVM is a compiler infrastructure that lets you do precisely that: you can pick the pieces you would like your compiler to have and assemble them to suit your needs.
By the way, Microsoft's .NET framework and their MSIL aim at achieving the same goal.
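To illustrate the shared-IR idea, here is a toy sketch (this is not LLVM's or MSIL's actual design; the mini-languages, the stack IR, and every function name below are invented for the example): two language-specific front ends lower to one common intermediate representation, and a single back end consumes it.

```python
# Toy sketch of the shared-IR architecture: two front ends, one IR,
# one back end. Everything here is invented for illustration; it is
# not LLVM's (or .NET's) actual API or instruction set.

def frontend_infix(src):
    """Parse a tiny infix language like '3 + 4' into the common IR."""
    a, op, b = src.split()
    return [("push", int(a)), ("push", int(b)), (op,)]

def frontend_postfix(src):
    """Parse a tiny postfix (RPN) language like '3 4 +' into the same IR."""
    ir = []
    for tok in src.split():
        if tok in "+-*":
            ir.append((tok,))
        else:
            ir.append(("push", int(tok)))
    return ir

def backend(ir):
    """One back end (here, a stack-machine interpreter) serves every
    front end that emits this IR."""
    stack = []
    for instr in ir:
        if instr[0] == "push":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"+": a + b, "-": a - b, "*": a * b}[instr[0]])
    return stack.pop()

# Two different source languages, one IR, one back end:
print(backend(frontend_infix("3 + 4")))    # 7
print(backend(frontend_postfix("3 4 +")))  # 7
```

Adding a third language means writing only a new front end; the back end is untouched, which is exactly the economy these infrastructures are after.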
Peter is hinting at one of the obstacles to a "universal" compiler: a valid specification of the language semantics. Although several formats exist for grammar specification, these are construction rules, not interpretation rules. Standards committees do exist for some languages and would provide the information needed to specify the semantics of those languages; many languages, however, simply assume a set of semantics.
A "universal" compiler would need to accept the semantics and grammar rules for a language, plus the semantics and grammar rules for a target architecture, and then provide the mapping between them.
Some excellent technologies have been created for portions of this pathway. For example, projects around LLVM, the JVM, and GCC focus on supporting a number of different target architectures for a well-standardized input language, while domain-specific efforts such as Scala macros, work in the Haskell community, and even some uses of C++ templates focus on providing semantic specifications that can assist domain-oriented programming. (I apologize for any efforts I have left out; I am sure there are many.)
However, I am not aware of any supported standard for a meta-language that describes both the front-end and back-end semantics/grammar specifications. Without such a standard, these efforts produce partial solutions that may not transfer easily to new languages or architectures. The closest thing I can think of is perhaps RDF or some other ontology description language, which could be used to develop a standard that would then have to be supported by a community of compiler-compilers.
As Robert and Peter have pointed out, compilers generally take in source written in one language and output a translated (usually machine-language) executable.
The gcc driver derives a hint about which of its languages to parse from the extension at the end of the file name: .f = Fortran; .c = C; .C/.cpp = C++; .ada = Ada (the GNAT front end); and so forth.
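A toy model of that extension-based dispatch follows (the table and function names are invented for illustration; gcc's real driver is far more elaborate, with override flags such as `-x`):

```python
# Toy model of how a driver like gcc's maps a file extension to a
# language front end. The table mirrors the mapping described above;
# the real gcc driver is far more elaborate.
import os

EXTENSION_TO_LANGUAGE = {
    ".f":   "fortran",
    ".c":   "c",
    ".C":   "c++",   # dict keys are case-sensitive, so .c and .C differ
    ".cpp": "c++",
    ".ada": "ada",
}

def pick_frontend(filename):
    _, ext = os.path.splitext(filename)
    try:
        return EXTENSION_TO_LANGUAGE[ext]
    except KeyError:
        raise ValueError(f"don't know how to compile {filename}")

print(pick_frontend("main.c"))    # c
print(pick_frontend("solver.f"))  # fortran
```

The point is that the "universal" entry point still needs this one clue before it can hand the file to a language-specific front end.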
Abstracting syntax has been done with meta-languages such as BNF, VDL, and others. Tools such as yacc and bison take a meta-language specification and generate a compiler front end (the syntax-checking portion of the compiler).
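To make that concrete, here is a hand-written sketch of what such a generated front end does: a recognizer for a small BNF grammar (the grammar and all code below are invented for illustration; yacc/bison would generate a table-driven parser from a similar specification rather than recursive-descent code like this):

```python
# A tiny BNF grammar and a matching hand-written recognizer, sketching
# the syntax-checking stage that tools like yacc/bison generate from a
# grammar specification:
#
#   expr   ::= term { "+" term }
#   term   ::= factor { "*" factor }
#   factor ::= NUMBER | "(" expr ")"

def parse_expr(tokens, i=0):
    i = parse_term(tokens, i)
    while i < len(tokens) and tokens[i] == "+":
        i = parse_term(tokens, i + 1)
    return i

def parse_term(tokens, i):
    i = parse_factor(tokens, i)
    while i < len(tokens) and tokens[i] == "*":
        i = parse_factor(tokens, i + 1)
    return i

def parse_factor(tokens, i):
    if tokens[i] == "(":
        i = parse_expr(tokens, i + 1)
        if tokens[i] != ")":
            raise SyntaxError("expected ')'")
        return i + 1
    if tokens[i].isdigit():
        return i + 1
    raise SyntaxError(f"unexpected token {tokens[i]!r}")

def is_valid(src):
    """Syntax checking only, as the parsing stage of a front end does."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()
    try:
        return parse_expr(tokens) == len(tokens)
    except (SyntaxError, IndexError):
        return False

print(is_valid("1 + 2 * ( 3 + 4 )"))  # True
print(is_valid("1 + * 2"))            # False
```

The grammar in the comment is the part a tool like bison would accept as input; everything after it is the machinery such a tool would emit for you.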
In the mid-1980s, Nico Habermann (then at Carnegie Mellon) pursued a line of research (project Gandalf - http://books.google.com/books/about/The_Gandalf_Software_Development_Environ.html?id=LbP2GgAACAAJ) that would 'instantiate' the same toolset for each language environment one wanted to use.
No. But you can try some interesting websites such as www.compileonline.com, on which you can compile and execute programs in many programming and markup languages.