Bryan and I were talking today about the future, whether we try to stay in Jackson throughout the winter and into next summer, or if we make a real push to find "professional" jobs. I'd really like employer-subsidized health insurance again. That would be nice.
I mentioned that neither of us has ever worked a job that required a college degree (even though we both have one). That's kind of lame. I also realized that I've never worked a job where I've had to wear anything fancier than jeans and a casual blouse. That's pretty awesome.
It's hard to know what to push for in this economy. "Real job" with benefits? Or seasonal work that's not fancy but is a bit more assured?
I read an article the other day that said bachelor's degrees are pretty much obsolete; they no longer guarantee a job. In some places a master's degree doesn't guarantee a job either; people with advanced degrees command a higher pay scale, so some employers don't want to hire them. Kind of ridiculous, right? I never considered NOT going to college - it was a given. It certainly doesn't do much nowadays though.
When I was switching majors at the end of my sophomore year of college, I considered becoming a radiologist or an ultrasound tech. I don't know if my chances for a job would have been better or worse.
Anyone else feeling very "bleh" about their college degree? Don't get me wrong, I like my job, but it does seem ridiculous at times that I still owe $17,000 in student loans for a college degree that I can't seem to put to use.