
Mix Challenge - General Gossip Thread

Ask us a question, give feedback, join surveys, make suggestions
Mister Fox
Site Admin
Posts: 2231
Joined: Fri Mar 31, 2017 16:15 CEST
Location: Berlin, Germany

Re: Mix Challenge - General Gossip Thread

#221

Post by Mister Fox »

In response to general feedback in Mix Challenge 069 / September 2020

JohnK wrote:
Tue Sep 29, 2020 11:43 CEST
@Mister Fox

My mix was -15.3 LUFS. The rules are set with a ±0.3 tolerance and a label of "probably mastered". Well, a mastered version of any sort would be -14 to -8/-10-ish or along those lines, depending on genre.

I'd totally accept it if I were disqualified by .4 basically (meaning if the mix were -15.6), but it's a bit too harsh imo :) It would be a huge difference if it was -12/-13 or something... it was a fun song to mix!

Just sharing my thoughts.

Cheers

This topic comes up pretty much monthly at this point. I'm sorry that you "didn't quite make it", but we do have long-known, well-established Rules and Guidelines.

You technically have a range of -23 LUFS ILk to -16 LUFS ILk at your disposal for a mixdown. Loudness normalization during evaluation of the mixes is not your concern, but a topic for the Song Provider, unless the add-on rules clearly state "track must be within a range of xyz LUFS ILk" (so far, that has never happened, but it could be a twist for a future mixing challenge). If you work with reference levels (e.g. -18 dBFS = 0 VU per the EBU R68 convention, or -20 dBFS = 0 VU per the SMPTE RP 0155 recommendation), your final mixdown should theoretically produce a readout between -20 LUFS ILk and -16 LUFS ILk. If you overshoot that (and/or -1 dBTP), you can easily handle it with proper gain settings on the summing bus and/or suitable clipping protection. This is part of the learning process of providing suitable material for the next editing step in line.

The tolerance on the data sheet is set to be a bit "more forgiving", since every tool out there works differently. But in practice...

:arrow: The value of -16 LUFS ILk and -1 dBTP is the absolute upper limit for your mixdown.

Please also see previous posts.
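If you want to sanity-check your own bounce against these numbers before uploading, a few lines of Python are enough. The following is only a rough sketch (not an official challenge tool): it assumes the soundfile, pyloudnorm, scipy and numpy packages are installed, "mixdown.wav" is a placeholder file name, and the 4x-oversampled peak is merely an approximation of a proper true-peak meter.

# Quick self-check of a mixdown against the loudness/true-peak limits above.
# Assumptions: pip install soundfile pyloudnorm scipy numpy; "mixdown.wav" is a placeholder.
import numpy as np
import soundfile as sf
import pyloudnorm as pyln
from scipy.signal import resample_poly

data, rate = sf.read("mixdown.wav")              # float samples, shape (frames, channels)

meter = pyln.Meter(rate)                         # ITU-R BS.1770 style meter
lufs = meter.integrated_loudness(data)           # integrated loudness in LUFS

oversampled = resample_poly(data, up=4, down=1, axis=0)       # 4x oversampling
true_peak_db = 20 * np.log10(np.max(np.abs(oversampled)))     # rough dBTP estimate

print(f"Integrated loudness: {lufs:.2f} LUFS (mixdown window: -23 to -16)")
print(f"Approx. true peak:   {true_peak_db:.2f} dBTP (upper limit: -1.0)")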



The "probably mastered" is but a tag on the data sheet. I can remove this in the sheet for MC070 / October 2020 to avoid further confusion.

Jorgeelalto
Posts: 98
Joined: Sat Apr 01, 2017 00:41 CEST
Location: Madrid, Spain

Re: Mix Challenge - General Gossip Thread

#222

Post by Jorgeelalto »

I think we should establish a common tool so that we can all measure our submissions and compare them objectively. This is going to keep coming up every time, because it's too confusing a matter, I think.
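As a possible starting point (just a sketch, nothing official): ffmpeg is free and available to everyone, and its loudnorm filter can be run in analysis-only mode to print integrated loudness and true peak as JSON, so every participant would read the same numbers. "entry.wav" below is a placeholder file name.

# Shared measurement idea: run ffmpeg's loudnorm filter in analysis mode and
# read the JSON report it prints to stderr. Assumes ffmpeg is on the PATH.
import json
import subprocess
import sys

def measure(path: str) -> dict:
    proc = subprocess.run(
        ["ffmpeg", "-hide_banner", "-nostats", "-i", path,
         "-af", "loudnorm=print_format=json", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    stderr = proc.stderr
    # The report is the last { ... } block in the filter output (flat JSON).
    start, end = stderr.rfind("{"), stderr.rfind("}") + 1
    return json.loads(stderr[start:end])

if __name__ == "__main__":
    stats = measure(sys.argv[1] if len(sys.argv) > 1 else "entry.wav")
    print(f"Integrated: {stats['input_i']} LUFS | True peak: {stats['input_tp']} dBTP")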

White Punk OD
Posts: 240
Joined: Tue Aug 14, 2018 23:58 CEST

Re: Mix Challenge - General Gossip Thread

#223

Post by White Punk OD »

Hello Oba, and everyone else who is interested,
I have learned a lot from Warren and Rick, and a number of others; I frequently use techniques they have explained.

Warren Huart (look for "Produce Like A Pro"; Ady seems to be part of that educational community) has tutorials on certain topics, interesting "Friday FAQ" sessions, and whole mixing sessions with detailed insight. He also has a number of free mixing track packages, with fully professional recording quality.

There is one distinction I would make, though: how well a song suits beginners or less experienced students.
Here in this exciting challenge house, we often get a number of components with issues, e.g. perhaps the drums and guitars are close to perfect, but the vocals have noise and distortion. Or in another package there is massive hi-hat bleed everywhere. Or there is a keyboard bass with no expression at all, just boomy. So we are also challenged to correct that, and to complete the production in a sensitive and non-compromising way, before we can complete a comprehensive mix.

But many of Warren's mix packages are for beginners: if you do everything as the standard procedure suggests, and have some taste for the balances, then things will fall into place easily. You can still make it much more exciting with effects and automation, but you will soon see the right path through the thick bunch of tracks. There are only a few of those, though, and you get charged when you subscribe to the full package.

Maria Elisa Ayerbe also has tutorials in Spanish! Great engineer.

Mixbustv (I won't judge their art and style, but there are technically very interesting hints to find):
https://www.youtube.com/user/mixbustv
https://www.youtube.com/watch?v=U9MhZLWem0g vocal cleaning

Rick Beato with a couple of very basic skills to train:
https://www.youtube.com/watch?v=EAGC2fUAU1M EQ -- #Oba this might be interesting to you! Watch @13:30
https://www.youtube.com/watch?v=7oOmX3JHwtE comp
https://www.youtube.com/watch?v=5lTYIHftOrg drums

My "algorithmic companion" in the cloud has learnt to throw in stuff like this:
https://www.youtube.com/watch?v=4ISF8qPupQE
Forget the fun part with the girl, and imagine if this were recorded as hot as our current challenge song.
You could not boost the air band as much as required to get a modern, intimate sound.
Among others, most Japanese mixing is absolutely top notch, but also straightforward.
For reasons that are partly physical, vocal mixing techniques are many, but they also repeat often, no matter whether it's J-pop, Baltic rock, or alternative ballads.

Addendum, also served by the cloud brain:
https://www.youtube.com/watch?v=UO_KcgJoQI0
So this is about referencing, too: basic sounds that became world famous and, in their components, are still relevant today. We just do different stereo panning nowadays, whereas back then they had a very limited track count, like 8 or 16, and only so many compressors and equalizers. Thus, some recordings were grouped and submixed in such a way that the complete bass + drums stem sat on one of 8 mono tracks.

Mister Fox
Site Admin
Posts: 2231
Joined: Fri Mar 31, 2017 16:15 CEST
Location: Berlin, Germany

Re: Mix Challenge - General Gossip Thread

#224

Post by Mister Fox »

The following was originally posted by @cpsmusic in the Mix Challenge 072 / December thread, post #132 on page 14

cpsmusic wrote:
Thu Dec 24, 2020 04:57 CET
After the previous Challenge I posted an idea, but it didn't get much of a response, so I thought I'd bring it up again. It was the idea of some kind of rating system for the mixes, so that people could get an idea of where they stood in the overall Challenge. I always try to take criticism on board, but it's hard to know if problem areas are minor, major or disastrous!

Anyway, my idea was to have some sort of ranking system, maybe like "A, B, C, D, E", where "A" goes through to the second round, "E" is a DNQ, and "B", "C", and "D" represent different grades, i.e. "B" is almost good enough to get into the second round, whereas a "C" might need work in a few areas first, and "D" has a way to go. I realise this is an extra burden for the song provider, but it would be a big help for those of us who want to improve.

Anyone else think that this would be a good idea?

I'd love to gather more feedback on this, maybe also put some things on the drawing board.
Please discuss!

White Punk OD
Posts: 240
Joined: Tue Aug 14, 2018 23:58 CEST

Re: Mix Challenge - General Gossip Thread

#225

Post by White Punk OD »

You may move all my posts about this over here.

A rating system only makes sense when everyone can add their votes and guesses.
The clients are the least consistent factor here; they will do what they do, and next month probably the opposite.
Many mixers here are recurring and will form opinion groups by taste; at the very least they will apply their own consistent standards.
The paying audience, the most general and most relevant judgement, is missing entirely (those 800 million YT subscribers, etc.).

I see 5 categories (quick scoring sketch after the list):
- spectrum (how much the mastering engineer would have to correct in subs, bass, air band, punch etc.)
- technical quality and solutions of issues
- musical quality and balances, good ideas using sounds and musical elements
- innovation, or great comeback of a certain style
- taste of the client, far from or close to what was expected
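
Just to illustrate (a throwaway sketch; the 1-5 scale and category keys are only my assumption, not an agreed Mix Challenge format), a feedback sheet over these categories could be as simple as:

# Hypothetical per-entry feedback sheet over the five categories above.
# The 1-5 scale and key names are assumptions, not an official format.
from statistics import mean

CATEGORIES = [
    "spectrum",           # how much a mastering engineer would still have to correct
    "technical_quality",  # quality of the technical solutions to issues
    "musical_balance",    # balances, ideas with sounds and musical elements
    "innovation",         # fresh ideas, or a great comeback of a certain style
    "client_taste",       # near or far from what the client expected
]

def score_entry(ratings: dict[str, int]) -> float:
    """Average one rater's 1-5 marks per category into a single summary number."""
    return round(mean(ratings[c] for c in CATEGORIES), 2)

# Example sheet for one entry (numbers made up):
example = {"spectrum": 4, "technical_quality": 3, "musical_balance": 5,
           "innovation": 3, "client_taste": 4}
print(score_entry(example))  # 3.8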

Mister Fox
Site Admin
Posts: 2231
Joined: Fri Mar 31, 2017 16:15 CEST
Location: Berlin, Germany

Re: Mix Challenge - General Gossip Thread

#226

Post by Mister Fox »

Just to point this out:
I think the main idea here is to create a "helper sheet" for the client/song provider, not turn the Mix Challenge into a "popular vote" engine (which I am highly against).


We did see glimpses of that in previous Mix(ing) Challenges, indeed. But we never had something "working"/official. The question for me would be: what do you folks (especially previous song providers) think about this, and how should it ideally look?

Technically speaking, the "Statistics Sheet" that I'm creating every month is already a starting point for spotting where things might have gone wrong. And even though this sheet means extra work on my end, I think I'll continue it for 2021, as it adds to the overall learning factor. It also helps less technically skilled song providers: we might not always have a recording engineer as the "client", but most often just musicians. So it takes away the pressure of "I need to check the technical details as well?!", and the focus can stay on the creative aspect (sound) of each entry.


So yeah... maybe we can create something that helps the Song Provider, and in turn also adds to the overall learning aspect of our community.

White Punk OD
Posts: 240
Joined: Tue Aug 14, 2018 23:58 CEST

Re: Mix Challenge - General Gossip Thread

#227

Post by White Punk OD »

Comrades should comment on each other's work anyway. So if we agree on categories and provide a spreadsheet, it's not a voting system as such, yet it gives better feedback, with statistically relevant attributes instead of random sentences.

cpsmusic
Posts: 30
Joined: Tue Nov 19, 2019 23:41 CET

Re: Mix Challenge - General Gossip Thread

#228

Post by cpsmusic »

White Punk OD wrote:
Thu Dec 24, 2020 06:56 CET
You may move all my posts about this over here.

A rating system only makes sense when everyone can add their votes and guesses.
The clients are the least consistent factor here; they will do what they do, and next month probably the opposite.
Many mixers here are recurring and will form opinion groups by taste; at the very least they will apply their own consistent standards.
The paying audience, the most general and most relevant judgement, is missing entirely (those 800 million YT subscribers, etc.).

I see 5 categories:
- spectrum (how much the mastering engineer would have to correct in subs, bass, air band, punch etc.)
- technical quality and solutions of issues
- musical quality and balances, good ideas using sounds and musical elements
- innovation, or great comeback of a certain style
- taste of the client, far from or close to what was expected
I don't think it needs to be this complicated. My idea was really about how "near/far" the mix was from what the song provider wanted. If the mixer decides to take the song in a different artistic direction, then that's a punt they have to live with!

E.C.Miraldo
Posts: 18
Joined: Sun Jul 05, 2020 16:26 CEST

Re: Mix Challenge - General Gossip Thread

#229

Post by E.C.Miraldo »

Mister Fox wrote:
Thu Dec 24, 2020 06:34 CET
The following was originally posted by @cpsmusic in the Mix Challenge 072 / December thread, post #132 on page 14

cpsmusic wrote:
Thu Dec 24, 2020 04:57 CET
After the previous Challenge I posted an idea, but it didn't get much of a response, so I thought I'd bring it up again. It was the idea of some kind of rating system for the mixes, so that people could get an idea of where they stood in the overall Challenge. I always try to take criticism on board, but it's hard to know if problem areas are minor, major or disastrous!

Anyway, my idea was to have some sort of ranking system, maybe like "A, B, C, D, E", where "A" goes through to the second round, "E" is a DNQ, and "B", "C", and "D" represent different grades, i.e. "B" is almost good enough to get into the second round, whereas a "C" might need work in a few areas first, and "D" has a way to go. I realise this is an extra burden for the song provider, but it would be a big help for those of us who want to improve.

Anyone else think that this would be a good idea?

I'd love to gather more feedback on this, maybe also put some things on the drawing board.
Please discuss!
I absolutely support this idea!



Although there is one thing that concerns me regarding these mix challenges:

We can all agree there is a lot of subjectivity when you're trying to "rate" a mix, right? You can break the rating into objective parts, but the ratings on those parts will still be subjective, even more so when it's a musician doing the rating, imho.

One problem I've seen plenty of times here (and I've been participating for almost a year) is that sound technicians and song providers alike have deficient monitoring. I don't have a studio, nor acoustic treatment, but I do have a measurement microphone and a DSP to compensate a little for the colouring of my system. I also listen to a lot of mixes and use reference tracks that I've worked with for a long time and heard on very, very transparent systems, so I am very comfortable with my system and its defects. I'd like to think I am aware of these issues.
It becomes extremely hard (to be very honest, it becomes impossible) to mix and/or rate a mix while listening on a system with ±20-30 dB cliffs in the response curve, which is not at all uncommon for a system where a guy just bought some HS5s and put them on top of the table.

Imagine the kick you're working on has its punch around 100-130 Hz, and your system has a -15 dB valley in that band. You will boost it in the mix, and everywhere else it will sound excruciating to listen to.

When sound technicians have these problems, you can clearly hear it in their mix (taking into account that you have a decent system), but when song providers aren't even aware that this exists, then the whole challenge is compromised and it's very demotivating.



I also think song providers should be discouraged from using the catchphrase "we want to hear your creativity". Mixing is NOT an essentially creative process; composing IS, music production IS, mixing and mastering ARE NOT. Because what it really sounds like is: "We are not very happy with the result we got in our work, so we want somebody to do it for us. Oh, and if you take a route we don't really like, then you're disqualified not on your mixing skills, but on your production and personal taste." Delays, reverbs, chorus: they all have a reason to be used; sometimes the reason is creative, and that is the job of the person making the music, not of the person mixing!

This is not at all directed at anyone, but it's something I hear a lot, even outside this forum. This needs to change. I am not a music producer and I am not hired to do music production. I am a sound technician and I am hired to mix and master a track. I can use a reference track to get your music very, very close to the sound spectrum of the reference, but I do not know how to use a reference track to get the "overall vibe", because that comes mostly from the music, not the sound! I also do not have a "vibe" knob.




That being said, a system that is separate from the challenge and from the song provider's rating, where everyone can rate every mix, would be fantastic, and I think it would create a much better learning environment in the community.

CantusPro
Posts: 13
Joined: Fri Dec 20, 2019 20:45 CET

Re: MIX CHALLENGE - MC073 February 2021 - Mix Round 1 in evaluation

#230

Post by CantusPro »

Hi all,

Please forgive me for my boldness, but I feel the need to argue my way out of disqualification. I have already been formally disqualified, and I still do not feel dissuaded from attempting this maneuver.

My track was disqualified for true peaking at -1.37 dBTP on one channel and -0.93 dBTP on the other channel. The specification for acceptance is -1 dBTP, but “(-0,95 (+-0,05dBTP tolerance))” is stated at the bottom of the disqualification sheet at https://mix-challenge.com/media/results ... istics.pdf

First, -0.95 +/- 0.05 spans -1.00 to -0.90 dBTP, and each of my channels peaks below -0.90 dBTP. That is the quotation from the bottom of the disqualification sheet, and that is how the arithmetic works. There may be diverging rules elsewhere, but this is what was written at the bottom of the disqualification sheet.

Second, the average of my dBTP levels is -1.15 dBTP ((-1.37 + -0.93) / 2). The disqualification sheet does not specify that no single channel may exceed the specified maximum true peak measurement, and the average of my channels has not exceeded that maximum. There may be diverging rules elsewhere, but this is not stated on the disqualification sheet.
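
For completeness, here is that arithmetic spelled out as a throwaway check (using only the figures already quoted above):

# Quick check of the quoted figures; nothing here beyond basic arithmetic.
left, right = -1.37, -0.93             # measured dBTP per channel
limit, tol = -0.95, 0.05               # "(-0,95 (+-0,05dBTP tolerance))" from the sheet

low, high = round(limit - tol, 2), round(limit + tol, 2)
print(low, high)                                  # -1.0 -0.9  -> stated tolerance window
print(all(ch <= high for ch in (left, right)))    # True: both channels peak below -0.90 dBTP
print(round((left + right) / 2, 2))               # -1.15 -> average of the two channels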

Again, I have already been disqualified from this contest, and I feel it is not unreasonable to try to argue my way back into the contest over -0.02 dB of calculated True Peak variance in one channel.

Thank you for reading this,

-Cantus
