I know I shouldn't be at this point, but I feel like I have to point this out. For as long as this debate over health care has been going on, the refrain from those who are against this reform seems to always be that "this is destroying the America we know." And you know what? They're absolutely right. If we can successfully do this, then it really won't be the America that they knew anymore.
No longer will it be an America where your relative worth as a human being is determined solely by your financial wealth. No longer will it be an America where a person's chance at success is determined by the family they were born into rather than the work and dedication they are willing to put forth. No longer will this be a nation where it is acceptable to discriminate based on race, creed, sexual orientation, or any of the other classifications that have been thrown before us in an attempt to divide us rather than bring us together.
In short, we will truly be a progressive nation. We will leave behind the fears and trepidations of past generations and begin moving forward as one people. And that scares many conservatives. It scares them because they know that once that happens, conservatives won't be in charge anymore. They won't be able to dominate and control as they have in the past. Corporations won't be able to hold health care over their workers' heads as the only means of getting coverage. Workers won't be pitted against each other based on color or national origin. Finally, we can live out the hope of our Declaration that all men, all women, all people everywhere in this nation will truly be equal.
So please, everyone, keep fighting to make this happen. It won't be easy to toss aside the trappings of the past that hold us back. Many will fight to keep us where we are. We must stand together, and if we do, then we will win.