Other forms: westernized; westernizing
To westernize is to impose aspects of European or North American culture on a group of people in another part of the world.
When a society westernizes, it adopts the norms, customs, or pop culture of Western nations. An African city with a McDonald's is one example of this, as is a trend of wearing jeans and sweatshirts in an Indian village. Throughout history, many peoples have been westernized by force, as colonizers moved in and imposed Christianity, Western-style clothing, or the English language. Westernize was first used in the 1800s, originally in reference to Japan becoming "more like the West."