Why gaming degrees need levelling up
Aleksey Savchenko
- Opinion & Analysis

A gaming degree might give graduates a 1-up in the employment market, but the qualifications themselves aren’t fit for purpose, writes Aleksey Savchenko
Game design degrees have exploded in popularity in recent years as young people are increasingly attracted to a career in an industry which is seen as fun, cool, and lucrative.
Among British universities, there is a continued arms race to attract students, fuelled by the mantra that you need a degree to find employment in a games studio. And it’s clearly working – the number of students enrolling on game design degrees has more than doubled in the space of a decade.
But having worked in the industry myself for more than 20 years, and having had to interview my fair share of candidates during that time, I’m here to speak an unspoken truth: game degrees are, largely, not fit for purpose.
While computer programming and computer art degrees tend to place graduates in the industry successfully, game design degrees, in general, do not. The reason is simple: they fail to prepare students for the job.
They’re great for covering the theoretical side of game design – storytelling, game mechanics, and artistic direction – but they fall down in providing the practical, hands-on skills actually needed to make games. Too often taught by people with no direct experience of the industry, these courses leave graduates without a solid understanding of the real-world nuts and bolts of game development: project management, debugging techniques and code optimisation.
They are equally unfamiliar with full-cycle development: blissfully unaware of the constraints of working within large, collaborative teams – involving not just designers but programmers, artists, sound engineers and more – and of the iterative process in which a game must go through multiple versions before it can be called ‘playable’.
And things are only getting worse.
Artificial intelligence has been embraced by the gaming industry because of the promise of automation, efficiency and cost reduction. The inevitable shift towards AI-driven tools has seen the role of the game developer being redefined, with the traditional programming and design skills they have brought to the table becoming less relevant with every passing year. Today’s AI-powered development tools can generate game assets, animate characters, write scripts, and even design entire levels with minimal human intervention. Studios no longer need teams of coders meticulously programming every detail. Instead, they need people who understand how to work with AI, guide it, and refine its output. The role of a game developer is, then, shifting from writing code to curating and directing AI-generated content.
In response, universities have introduced AI modules into their syllabuses as a further hook to bring in students. But this will do nothing to actually make them job-ready. Courses were already too broad and impractical. Now, even less classroom time is given to building a solid foundation in core development skills, displaced by equally shallow, sub-par instruction in AI.
The current situation is grossly unfair to graduates, who are being misled by false promises. Very few institutions are actually producing competent game designers, meaning many graduates are not able to secure a position in the industry.
And those who do get in face serious problems without a solid foundation in practical game design. Game development isn’t black and white – there will be many challenges along the way. Yet graduates are often ill-equipped to think on their feet and adapt when unexpected problems arise.
The increasing adoption of AI in the industry only presents more challenges to graduates. It is meant to provide an invaluable set of tools to enhance efficiency, but this is only true if you first know your craft. Without that grounding, game industry graduates risk becoming nothing more than low-level machine operators – lacking the critical thinking, perspective and ability to innovate or optimise the creative process. Take away those most human of skills and, as AI becomes ever more capable, you become increasingly dispensable. Many jobs could be lost this way.
The solution is simple: game development education needs a radical overhaul. Instead of producing generalists with broad but shallow knowledge, universities must focus on developing specialists with strong foundational skills. Education should incorporate hands-on, full-cycle project work, giving students practical experience in working within a team and in working on multiple games. Equally, students should come to understand the principles behind AI and automation rather than just learning how to push the right buttons.
Only then can they develop the expertise needed to maximise their chances of finding employment and, as technology inevitably continues to evolve, remain employable.

Aleksey Savchenko is a veteran game developer, futurist, author, and BAFTA member with nearly three decades’ expertise in the tech and entertainment industries. Currently the Director of RnD, Technology and External Resources at GSC Game World, he has worked on the studio’s acclaimed S.T.A.L.K.E.R. 2. He has also worked for Epic Games, known for Fortnite and its technical achievements in middleware technologies worldwide, playing an instrumental role in establishing Unreal Engine among Eastern European developers. He is the author of Game as Business and the Cyberside series of cyberpunk graphic novels.
Images: The European