I have always wondered why parents drill the idea that "college is the only path" into our brains. College may open more doors for you; however, studies from 2013 suggest that some people with bachelor's degrees earn no more than people with only high school diplomas. Do parents tell us to go to college, or does society tell us to go to college?
What do I mean by society telling us to go to college? Here is an example. My parents immigrated to the United States and never attended college. Even though they never acquired a college education, they are still better off than many people who did attend. Yet they would still tell me that college is the best choice, even though their own lives show that it is not always the best pathway. They tell me college is the best pathway because society says it is. Even if studies show that students can do fine with only a high school diploma, society still looks down on that idea. Ever since I was a child, television, teachers, and almost everything around me has tried to persuade me to attend college. Is college the true pathway to success? I now wonder whether my choice to go to college came from society or from my own intuition.
College does not always guarantee success. If you do what you love, you don't need college to do it. If you need more evidence, here's a link to people who became rich without graduating from college. I'm not telling you to drop out; I just want you to know that people can do well without going to college.