New Product Alert: Curves AQ

The World’s First Autonomous EQ. Click ‘Learn’ – AQ does the rest.

Everyone is trying to create a ‘smart’ EQ, but none use truly generative technology.

That’s why we created Curves AQ – the world’s first autonomous EQ.

Click ‘Learn’ – AQ Does the Rest.

AQ learns your audio, decides what sound it should have, and creates 5 unique EQ profiles. These are not presets but original spectral curves based on the natural qualities of your audio: vocals, instruments, or full mixes.

Then, personalize your sound with 5 powerfully reimagined EQ controls.

Learn more »

Get it Now
Curves AQ is also included in the Waves Ultimate Subscription


When will it appear in the Mercury Bundle? Keen to try.

Hi @jpustin,

Welcome to the Waves Forum.

It is included in the Mercury Bundle. First, get the latest version of the Mercury Bundle you own: go to My Account > My Products and click Get Latest Version next to the Mercury Bundle.

Once done, proceed to Waves Central to activate and install the latest update.

If you need any assistance, do not hesitate to Contact Technical Support.

I guess I’m not surprised, but Waves knocked it out of the park with this one! I had previously bought a widely known competitor’s similar product, but I never found it that useful…

But this one hit hard with some clever innovations and a killer user experience… It’s more what I wish the other had been:

Offering a suggested curve + 4 variations was genius. This keeps the human involved in the aesthetics while still being fast.

One of 5 options is going to get you close, and then the TILT+OFFSET takes you the rest of the way. The manual editing is a nice touch as well.

There are little UX nuances too: if you hover over the tilt knob, you can quickly grab either the vertical amount adjustment or the left/right frequency adjustment.

And the offset to slide the curve up or down the frequency spectrum?! Seriously brilliant and unique.
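For anyone wondering what tilt and offset boil down to conceptually, here is a rough Python sketch (purely illustrative, with made-up numbers and a hypothetical 1 kHz pivot, not how Waves actually implements it): treat a curve as gain in dB over frequency, let tilt add a per-octave slope around a pivot, and let offset resample the curve at shifted frequencies so it slides along the spectrum.

```python
import numpy as np

def tilt_db(freqs_hz, slope_db_per_octave, pivot_hz=1000.0):
    # Tilt: add gain linearly per octave around a pivot frequency (pivot is an assumption)
    return slope_db_per_octave * np.log2(freqs_hz / pivot_hz)

def offset_curve(freqs_hz, curve_db, offset_semitones):
    # Offset: slide the whole curve along the log-frequency axis by
    # resampling it at shifted frequencies
    ratio = 2.0 ** (offset_semitones / 12.0)
    return np.interp(freqs_hz / ratio, freqs_hz, curve_db)

freqs = np.geomspace(20.0, 20000.0, 512)                  # log-spaced analysis frequencies
curve = -3.0 * np.exp(-(np.log2(freqs / 300.0) ** 2))     # made-up example dip around 300 Hz
shaped = offset_curve(freqs, curve, offset_semitones=5) + tilt_db(freqs, 1.5)
```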

Lastly, the addition of a LIVE version really makes me feel like you guys listen to customer requests. I’m thoroughly enjoying my Waves Update Plan at the moment. Thanks for adding it to Mercury!

EDIT: And the easy rollover edit of the LFHA frequency bands — there really is nothing like this. This isn’t a “fast follow” plugin – it’s an evolution of the whole genre.

Your team responsible for this deserves an epic cheer, lol. Seriously.


Hello,

Is there a known bug in the Live version? After the “Learn” process nothing happens… The non-Live version works fine, as expected. Tried it (both versions) in Reaper, SuperRack Performer and Live Professor with the same result!

Cheers
Johannes

Hey @JohannesKimstedt,

Welcome to the Waves Forum.

If you encounter any issues or suspect you have found a bug as mentioned, please Contact Technical Support and allow them to investigate further.

Nice move!! If only Waves could make a plugin that will wash the dishes and feed the cat for me?!? :thinking:

Gotta say, I haven’t had much luck with the training aspect of this plugin. It seems to keep suggesting moves that are the opposite of what I’d do normally (which maybe means that I’m just bad at mixing, but I like to think I can tell the difference between a muddy vocal and a muddier vocal…).

The blend of static and dynamic is a lot of fun, though. And in practice, it feels nice and quick to dial in an overall EQ manually and rely on the plugin to not overcook anything.

Maybe I’m missing something about the learning mode, though. Anyone got any success stories/tips to share?

Did you try feeding something into the sidechain and using Mix Sense? As you know, mixing is all about making decisions in context, and an AI is no different. If you feed it some “context” it should make more informed decisions.

But then, of course, there are all the other controls you can use to help sculpt it into something more desirable. Or you can simply add nodes, set the slider to Static, and use it like a regular EQ.
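Roughly the idea in sketch form (just an illustration of the concept, not the plugin’s actual algorithm; the band count and threshold here are arbitrary): compare the track’s spectrum with the sidechain’s and flag the bands where the “context” is nearly as loud as, or louder than, the track, since those are candidate regions for cuts.

```python
import numpy as np

def band_energies(signal, sr, n_bands=24):
    # Energy per log-spaced band of a mono signal (very crude: one FFT, no windowing)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    edges = np.geomspace(40.0, sr / 2.0, n_bands + 1)
    energies = np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                         for lo, hi in zip(edges[:-1], edges[1:])])
    return energies, edges

def masking_candidates(track, sidechain, sr, threshold_db=-6.0):
    # Bands where the sidechain ("context") is within threshold_db of, or louder
    # than, the track: candidate regions for cuts on the track
    t_e, edges = band_energies(track, sr)
    s_e, _ = band_energies(sidechain, sr)
    ratio_db = 10.0 * np.log10((s_e + 1e-12) / (t_e + 1e-12))
    return [(edges[i], edges[i + 1]) for i in np.flatnonzero(ratio_db > threshold_db)]
```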

Didn’t realise that was a thing it did? Reading the user guide, it sounds like a more advanced version of what I currently do with C6 and a sidechain (e.g. C6 on instrument bus with vox on the sidechain).

I don’t see anything about it using sidechain context when learning curves for an instrument? I’m not even sure I can route stuff to make that work well, but it is an interesting idea for sure.


Well, maybe not for the instrument, but if it can hear the rest of the mix, or the key parts, it should have a clearer idea of how to EQ a track so it fits better.

For example, if it’s a darker sounding track, it should, in theory, recognise that and provide you with darker profiles that would complement it. Or maybe it’ll observe where the build-ups are and be more focussed with what it’s removing.
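Just to illustrate the “darker sounding track” idea (again purely a sketch, not what AQ does internally): one simple proxy for darkness is the spectral centroid, the energy-weighted average frequency, where a lower centroid suggests darker material. Any cutoff you pick for “dark” versus “bright” would be arbitrary.

```python
import numpy as np

def spectral_centroid_hz(signal, sr):
    # Energy-weighted average frequency: lower values suggest "darker" material
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return float((freqs * spectrum).sum() / (spectrum.sum() + 1e-12))
```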