Your typical lefty can rattle off a dozen reasons America sucks on command. They constantly run down this country, express their admiration for other countries, and are famous for proclaiming that "there are many other better countries than the U.S." You used to understand that lefties fundamentally dislike their own country, but then something happened. Was the explanation as simple as "California happened"? What was it?