I believe the original usage of "Left" and "Right" referred to the extent to which the state intervened in the economic life of the country. To wit, "left" equalled an activist state, while "right" equalled a laissez-faire state.
An activist state typically provides a plethora of government services, while a laissez-faire state provides the barest minimum.
It seems to me that these terms are used very sloppily on this board. People who have an extreme dislike of the current occupant of the White House are thought to be "left", while those who favour him are thought to be "right".
I don't actually believe there is ANY left-wing power or presence in the USA at all. The only time in history that the USA had an interventionist government was during the FDR years, largely as a response to the complete collapse of the capitalist system that produced the Great Depression. As such, it was a temporary phenomenon.
Since I don't see US media calling for an extension of socialized anything, by definition they are not left-wing at all.
The Democrats are no more left than the Republicans are.
These words, like "Liberal", have become mere epithets in US political discussion, a reflection of the almost complete banality of the political discourse in the USA.