For-profit colleges are closing, and they're hurting their students in many ways beyond just the closure itself.
Seriously, why is education not considered a right in this country?
Why would we ever allow some organization or company to profit off of someone's education? I don't mean books, supplies, etc. I mean the act of education itself.