Sure, why not? We can't possibly screw our country over any more than we already have... The way I see it, America killed its own economy a long time ago, through advertising. Everyone comes here thinking "OMG, freedom and good wages!" because that's the image we've always projected, but once they arrive they find a different situation: life is a little better, sure, but the economy blows. So they stay and work the low-end jobs (and sometimes the high-end ones) because they're willing to do the extreme physical labor, and we end up with the people born here out of work.
IMO America shouldn't advertise.