The American people, by and large, don't know that the world around them is changing, wholesale, in so many ways, and isn't going back to the way it used to be.
At all.
One study I saw recently pointed out that our workforce is on track to be made up mostly of women.
Don't get me wrong--I'm no sexist.
This, to me, is not a bad thing; it's just a matter of fact.
And the thing is, up to now, it's been the other way around for, let's see--FOREVER?
The shift comes down to two factors.
One is that nearly everywhere in the world it is, sadly, cheaper to hire women than men, which drives this trend significantly.
The other is that, at least in the United States, more women than men are going to college--and that's a change, too, of course.
There are so many worldwide and nationwide changes that are taking place right now, it's hard to keep up.
Many political scientists and economists think the United States may have already lost its leading position in the world, politically and financially.
A smaller but related shift: Caucasians make up a shrinking share of the US population, while Hispanic and Latino populations grow every year. We've already passed the point where "White" people were the majority.
If the financial situation pans out the way so many economists warn, America will have fallen from its place of power and strength at the same time the White Man realizes he's lost his place alongside both women and Hispanics and Latinos.
Believe me, it's not a problem for me.
Let's hope it's not a problem for people of lesser education and financial means.