After graduating from high school, the next chapter in life for most people is college. It was my next chapter too, but now that I am here, I am a little confused. The first two years feel like a complete joke: everyone is forced to take classes we already had in high school. For instance, I have to take a world music class, which is all about the different instruments used in different countries, even though my major is broadcast journalism. I do not feel I should be paying hundreds of dollars for a class that has nothing to do with my major. By the time we finish school in four or five years, we will have paid hundreds of thousands of dollars, and most people are left in debt with student loans. From what I can see, Americans are paying enormous sums for a piece of paper that says we graduated. I agree that college is important, but what I do not understand is why we have to take pointless classes just to earn enough credits to graduate, when we could instead be spending that money learning more about our majors and the careers we wish to pursue. Students are spending money so that one day they can make money.