I don't mean to sound flippant, but the honest truth is that most people who work in Hollywood have a degree in something that's not even remotely related to their career choice. For example, I have a degree in microbiology. The only time I've come close to using it was when I was writing script coverage for the first draft of the Dustin Hoffman movie, OUTBREAK.
The point is that most film and television skills are learned on the job. So internships, assistant work and the like become much more important than college classes.
On the other hand, that's not to say that your education is worthless -- far from it. Your education will enhance your ability to think, meet deadlines, and work with people, and it will help round out your knowledge both technically and culturally. It gives you something to draw upon as part of your own life experience, which is ultimately the most important experience to have in Hollywood. Additionally, getting a college degree will provide you with a backup plan should your dreams of working in film and television not pan out.
So, major in whatever subject you desire. If studying film and/or television isn't of interest or isn't available to you, get a few internships or low-level jobs in the field and let that be your training.