The Western world, also known as the West or the Occident, is a term that refers to different nations depending on the context; there is no agreed-upon definition of what all these nations have in common. The concept of the Western part of the world has its roots in the Greco-Roman civilization of Europe and the advent of Christianity.