When they say ‘politics’, what do they mean?

I just got off the phone with my cousin. It was an emotional call. She had just ended a relationship with a guy, not because she didn’t like him. She liked him well enough; what bothered her was his support of the current president. When she made it clear that her reason for the breakup was a lack of shared core values, he retorted that they do share values; what they don’t share is politics.

I don’t understand what politics is.

The dictionary definition of politics is: ‘the art or science concerned with winning and holding control over a government.’

Though the word is associated with government and the art of governing, the definition itself sounds generic and vague.

Was the Holocaust an affront to people’s core values or, since it was authorized by a government, a mere political action?

Was slavery an affront to human decency or, since it was sanctioned by a government, just politics?

When governments subjugate women and take away their rights, is that merely a political action?

The word ‘politics’ is opaque; it suggests something remote that does not affect our lives, as if government had no say in or influence over how life treats us. In reality, it’s smoke and mirrors, letting people hide behind a hazy concept while sacrificing the rights of their daughters, granddaughters, wives, and mothers for the promise of an elusive tax deduction or a perceived economic benefit.

The cryptic word ‘politics’ lets people outsource their biases to the government and pretend the outcome is beyond their control. The genteel members of society can go on practicing racism, misogyny, and other forms of bigotry by proxy while wearing a mask of civility.