It's a Tuesday morning in a professional recording studio, and an engineer is staring at a vocal take riddled with room noise and the distant rumble of traffic. Five years ago, this would have meant thirty minutes of painstaking manual work — surgical EQ cuts, careful gating, perhaps a resigned phone call to the artist about re-recording. Today, the engineer loads iZotope RX 11's Repair Assistant, watches it analyse the waveform, and accepts its suggested fixes with a few clicks. The vocal is clean in under two minutes. The coffee hasn't even gone cold.

This scene — or something very like it — is playing out in studios across Scotland and beyond. AI audio tools have crossed the threshold from curiosity to necessity, and the shift has been remarkably quiet.

The Tools That Changed the Room

The most widely adopted AI tools in professional audio aren't the headline-grabbing song generators. They're the workhorses: iZotope's RX suite for audio repair, its Neutron and Ozone plugins for mixing and mastering assistance, LANDR's automated mastering service, and an expanding roster of stem separation tools now built into major DAWs including Logic Pro, Ableton Live, and Cubase.

A 2026 survey by Sonarworks and Sound On Sound, polling more than 1,100 working producers and engineers, found that one in five is already a regular AI user, while nearly half have experimented with the technology. The most popular applications are firmly utilitarian — audio cleanup, noise reduction, stem separation, and session organisation. These are the tedious, time-consuming tasks that no engineer ever romanticised.

"iZotope has changed the entire process of editing and mixing," says Mike Marchain, supervising sound editor at Warner Bros., whose credits include All American and It's Always Sunny In Philadelphia. "The preservation of production recordings and their natural tone is key. RX 11 has refined and created more tools to enable us to streamline our cutting sessions from the outset."

The Craft Question

But acceptance is far from universal. The same Sonarworks survey revealed that more than a third of respondents worry AI tools could compromise their creative intent or undermine the individuality of their work. The biggest single concern wasn't job security — it was originality. Engineers fear a future awash in competent but characterless audio.

The generational divide is real. Younger engineers, who came up with AI assistance baked into their software, often see it simply as part of the toolkit — no more controversial than Auto-Tune was to the generation before. Experienced hands take a more guarded view, distinguishing sharply between AI that handles drudgery and AI that makes creative decisions.

"The line between assistance and authorship matters deeply," the Sonarworks report concluded. Producers are comfortable letting AI clean up a noisy recording. They are far less willing to let it decide how a vocal should sit in a mix.

Sound editor and designer Tim Walston, whose credits include Wicked and Aquaman and the Lost Kingdom, exemplifies the pragmatic middle ground: "RX Advanced continues to be my daily tool for cleaning, mastering, and creative audio manipulation. Every one of my sound effect recordings goes through RX before it hits my library."

Pricing, Clients, and the Business of Sound

The commercial implications are already being felt. LANDR now offers unlimited AI mastering from around £8 per month — a service that would have cost upwards of £100 per track from a human engineer. For demos, podcasts, and streaming-only releases, the maths is hard to argue with. For album projects and broadcast work, most studios still insist on human ears making the final call.
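The cost gap described above can be made concrete with a quick back-of-the-envelope calculation. This is a minimal sketch using the approximate figures quoted in the text (£8 per month for unlimited AI mastering versus £100 per track from a human engineer); real pricing varies by provider, engineer, and project scope.

```python
# Rough monthly cost comparison: AI mastering subscription vs.
# per-track human mastering. Figures are the approximate ones
# quoted above, not any provider's actual price list.

AI_MONTHLY_FEE = 8.0      # £ per month, unlimited tracks (approx.)
HUMAN_PER_TRACK = 100.0   # £ per track (approx.)

def monthly_cost(tracks: int, use_ai: bool) -> float:
    """Total mastering spend for one month."""
    return AI_MONTHLY_FEE if use_ai else tracks * HUMAN_PER_TRACK

# Even a single release per month makes the subscription cheaper,
# which is why the "maths is hard to argue with" for demos and
# streaming-only work.
for n in (1, 4, 12):
    ai = monthly_cost(n, use_ai=True)
    human = monthly_cost(n, use_ai=False)
    print(f"{n:2d} tracks/month: AI £{ai:.2f} vs human £{human:.2f}")
```

The arithmetic also shows why the hybrid model persists: the subscription price buys volume, not judgment, so budget-sensitive work migrates to AI while album and broadcast projects keep paying per-track rates for human ears.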

Clients are increasingly aware of what AI can do, and some are asking for it by name. Studios report being asked whether they use RX, or whether AI mastering is available as a budget option. The hybrid workflow — AI for the heavy lifting, human expertise for the creative decisions — is becoming the default pitch.

What's Next

The Sonarworks survey found that only 3.6 per cent of professionals regard AI as a passing fad. Nearly a third believe it is revolutionising the industry. The majority view, held by 58 per cent of respondents, is that AI will settle into a primarily supportive role — an increasingly capable assistant, but one that still requires a human at the desk.

The skills that matter are shifting accordingly. Manual audio editing and routine mix balancing are seen as declining in importance. Musicality, critical listening, emotional judgment, and the ability to communicate with artists are rising. The most successful engineers of the next decade may not be the ones who master every plugin, but those who know when to trust the machine — and when to override it.

The studio isn't being replaced. It's being reorganised. And the engineers who adapt will find that AI hasn't diminished the craft — it's simply changed what the craft demands.