The GF told me her sister said that one of the women she worked with was told by the "human resources manager" that she had to go back to college and get a degree to continue doing the same job she's done for the last 29 years! The woman told the "human resources manager" to stick the job up her ass and quit. WTF!! Who, in their right mind, would think a college degree would make a woman with 29 years of experience any more knowledgeable about the job she's been doing for those 29 years? What makes these assholes think someone with a college degree automatically knows more than someone with 29 years of experience doing the same job?
When I did interviews for machinists, I wanted someone who had working experience and knew what they were doing. I didn't give a shit if they had a piece of paper from some stupid college! What the hell is our country coming to when actual working experience is no longer essential, but a damned piece of paper is? The GF told me that the hospital she works at now requires a college degree for a secretary/receptionist!
End of rant!!