The subject matter of International Relations goes back thousands of years, perhaps much further, but the discipline's main development took place in the last century. It would not be wrong to date the academic study of international relations from the end of the First World War. Can a discipline established on the foundations laid in the USA and the UK really be evaluated as impartial and unbiased? When Britain is described as "the empire on which the sun never sets," why does the real meaning go unnoticed? Why is it not instead described as "the state that exploits every place on earth where the sun rises"? Why is Germany's challenge to this colonial state viewed through Britain's eyes? Why was the perception created that Britain represented good administration, and that the bad Germans came and broke the order? Why is the period after the First World War, when unjust treaties were signed that effectively invited the Second World War, called "idealism"? Whose idealism? Against whom? For what? Why was Germany, which could not cope with those harsh conditions and sought to change its situation, discredited as the real war-lover? Did the supposedly peace-loving Britain and the US actually do anything to prevent the Second World War, which was provoked by the punishing treaties that followed the First? Why was international relations called "realist" after the Second World War? What was your idealism, so that something could be understood from your subsequent realism? Why should we look at the world through the glasses of the United States or Britain? Why should we assimilate and accept a political history written and imposed on us by the state that openly exploited the world, and by the state that later committed the greatest crime against humanity in Japan?
