I continue to be amazed at how the word "liberal" has evolved into a political pejorative. Those who fall for this rhetorical ploy should take some time to do their homework. Merriam-Webster's dictionary defines "liberalism" as "a political philosophy based on belief in progress, the essential goodness of the human race, and the autonomy of the individual and standing for the protection of political and civil liberties."
What's so bad about that?