This is a very complex question, and the answer is complex as well.
In the post-WWII years, those with degrees were often automatically moved into the managerial ranks, regardless of what the degree was in. Not having a degree came to be seen as a mark against you, and as student loans and federal funding opened up, there was a SURGE for everyone to get a degree, ANY degree, since it appeared to guarantee a higher salary.
This game went on through the decadent 1980s and into the mid-1990s, when other similar stupidity seemed to permeate American society (like taking out unlimited 2nd mortgages on your home since home prices ALWAYS go up...).
As tuition and fees at all institutions have continued to outpace the cost of living, other realities (the Great Recession, wage stagnation, the housing bust, unemployment, etc.) have sucker-punched the workforce.
If you look only at the unemployment rates for people with degrees (~4%) versus those without degrees (~8%), you might conclude that you are indeed better off with a degree, but I will add this caveat:
Most of us don't have trust funds to pay for college, and it keeps getting more expensive. If you borrow $50,000 to $100,000 for a bachelor's degree (not uncommon today), you need to look at how much you'll make in your new profession and compare that to how much your student loans will cost each month.
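To see what that comparison looks like in practice, here is a rough sketch using the standard loan amortization formula. The interest rate and repayment term below are hypothetical placeholders, not figures from any actual loan program; plug in your own numbers.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortized loan payment: P * r(1+r)^n / ((1+r)^n - 1)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    if r == 0:
        return principal / n      # interest-free edge case
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# Hypothetical example: $75,000 borrowed at 6% over a 10-year term.
payment = monthly_payment(75_000, 0.06, 10)
print(f"Monthly payment: ${payment:,.2f}")

# Compare against take-home pay: a common rule of thumb is to keep
# loan payments under ~10% of gross monthly income.
gross_monthly_income = 4_000      # hypothetical starting salary / 12
print(f"Share of income: {payment / gross_monthly_income:.1%}")
```

With these made-up numbers the payment lands around $830 a month, over 20% of a $48,000 gross salary, which is exactly the kind of squeeze the paragraph above is warning about.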
Sometimes, it might be better to work a lower-paying job, attend school PART TIME (paying as much out of pocket as possible), and avoid the high student loan payments that accompany too many of the college degrees being awarded today.