In American politics, liberals are people who believe in government-sponsored social programs, take a generally tolerant stance on most issues, are concerned with preserving the environment, tend toward pacifism, and are pro-choice.

In America, "liberal" has somehow become a derogatory term. Blame the Republicans for that (and much more, nyah, nyah).

Writing nodes about politics is fun.