Is college (or, more generally, "higher education") necessary in today's world to live a better life? For the purpose of this discussion, we'll define "better life" as having a job in a field you enjoy that brings in enough income to support yourself and your family (assuming you have one or are planning to have one).
I question that every day. In this day and age, high school students strive to be the best in all that they do, as they should. This drive toward being the best leads them through many choices, but most end up concluding that college is the way to go. Again, for the purpose of this discussion, we'll define "college" as an educational institution higher than a community college.
And so, many end up spending a vast portion of the prime of their lives studying and trying to get into a good college. And they make it. So then what? In essence, college becomes an extension of high school; the only difference is that you start paying $9,000+ a year to spend the first two years learning things you do not really care about. You pick a major, only to discover that there are more people holding a degree in that field than there are open jobs.
So to answer my own question: no, college is certainly not a necessity for carving out a peaceful life for oneself. It may, in fact, even prove to be a nuisance for years after graduation, as you are plagued by student debt, something many recent graduates nowadays are finding out the hard way.