Is university always necessary?

This is a widely debated question: is university always necessary? In my opinion, no, it is not. There are many ways to develop professionally without a university degree or the debt that comes with it, including apprenticeships and gaining work experience. Some companies will hire trainees for certain roles, allowing you to grow with the company and work your way up to the position you want.

However, some jobs do require a university degree. For example, if you want to practise medicine in the NHS, you need a medical degree in the field you wish to work in. As a bonus, many students on medical degrees are offered jobs through their work placements. But to conclude, my answer is no: do not stress about going to university. There are other ways to build yourself in this world.