I went to Texas for the first time ever the weekend before Thanksgiving. I wouldn't say I'm a well-traveled person, but I have seen my share of the country. I've done most of the Southeast, Vegas/Arizona, Seattle, and of course most of New England. Every trip usually produced something different for me: a new perspective on the country. You could almost understand the culture of an area better by seeing it and experiencing it.
Okay, you know I'm lying. Tourism has actually killed the country. Living in New York, I think it's safe to say this city suffers the largest disconnect - Hollywood has managed to build an image of the city that exists in reality only for the super rich (Sex and the City, you know who you are). But as I see more and more of this great country, I feel as though I'm not seeing anything new. Homogeneity is the name of the game now. Commercialism and capitalism rule the land. Large corporations have dominated to the point where there are no more unique experiences, no more unique local products, no more tourist destinations that are any different.
Sure, visiting Hawaii is different from visiting Georgia: beaches and bikinis versus jeans and pickup trucks. But you still see the Best Buys, the McDonald's, the huge LUXURY! retail outlets with Saks, Bergdorf's, and Sharper Image discounts galore.
So what is really happening to this country? Do we really find comfort in seeing the same things over and over? Let me relate to you my experience in Austin/San Antonio to prove my point.
My image of Texas growing up was one of desert and cacti. Steers and cowboys. Now I'm not naive. I know that Texas isn't all of that. I understand that people there live normal lives just like I do in New York, with supermarkets and clothing stores, but you expect it all to reflect the local cuisine and the local fashion tastes, right? You don't expect HUGE chain stores selling the same thing you can get at home for the same price.
It took me two days to find the real Texas. My friend wanted true BBQ while we were down there, and our friend who had attended UT told us to head to the Salt Lick in Driftwood. The drive out there was probably my favorite part - huge ranches with horses and cattle, fences and pickups, and just open land as far as you could see. I think Driftwood was the highlight of the entire trip.
It differed so much from what we were used to seeing in San Antonio and Austin, and what we see around us in Jersey, New York, Connecticut, and the rest of the Northeast where I had grown up. Huge highways lined with grass and green trees. The aforementioned retail stores. Gigantic SUVs driven by soccer moms.
Do we really want this country to find common threads only in our infrastructure and shopping outlets? Are we not losing some piece of the unique identity that is America's 50 states, each independent yet forming a giant union? Is this the result of the Information Age? The Industrial Revolution? NAFTA? What really is happening in this country?
I'm not sure how I feel about it. Maybe I'm just disappointed that my own mental images did not sync up with what I saw. Maybe I'm a city boy who doesn't translate well to the country/suburbs. Or maybe I'm truly noticing what a lot of other Americans see and are troubled by. I'm not sure, but something is going on here in America, and I'm not sure I like it.
My friend told me yesterday that Taco Bell is finally coming to her small Massachusetts hometown. I asked her: the beginning of civilization or the end? She said the end. I tend to agree. Forget the health problems or the labor issues on the tomato farms. I'm not a small-town purist, but I wonder what this town will look like in 10, 15, 50 years. Is this really progress? I doubt it.