What criteria would you use to properly evaluate trance tunes?

Aug 23, 2022
123 Posts
114 Thanked
For my Classic Trance Curated Database, I'm trying to come up with different criteria for rating the selected trance tunes, which would be extremely helpful for an eventual Classic Trance Top 1000 list, and for generally rating the best tunes the genre has to offer. After getting heavily inspired by some suggestions made by Hensmon, chatting a lot about this topic with Bing AI, and experimenting with their ideas (and some of my own) in the database, here's what I came up with. Please leave me some suggestions; any help would be appreciated on how I could make these criteria better, whether I should add new criteria, or whether I should combine multiple ones into one.

The first three criteria are fixed, and the remaining ones are up for discussion. They will all be used in a single equation (with different dominance) to create a final, weighted score.



1. Curator's Rating [FIXED] [40% dominance / x0.4 multiplier]

This primarily stands for the overall enjoyment level of the track. The curated database only features tracks with a 3.5-star rating or above (so tunes that are at the very least solid and worth listening to on their own, not just as part of mixes/sets).

2. YouTube Rating [FIXED] [20% dominance / x0.2 multiplier]

A rating based on the like-dislike ratio of each track. Depending on the number of uploads and views, a maximum of five uploads is taken into account when calculating this score. The overall like percentage is then converted to the star format used for all other criteria, so a track with 98.35% likes gets 4.92 stars out of 5.
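Here's a minimal Python sketch of that conversion (assuming the like/dislike counts are simply pooled across the selected uploads; the function name and data shape are my own, just for illustration):

```python
def youtube_star_rating(uploads):
    """uploads: list of (likes, dislikes) pairs, one per upload (max 5)."""
    likes = sum(l for l, _ in uploads)
    dislikes = sum(d for _, d in uploads)
    if likes + dislikes == 0:
        return 0.0  # no rating data available
    like_pct = 100 * likes / (likes + dislikes)
    return round(like_pct / 100 * 5, 2)  # map 0-100% onto 0-5 stars

print(youtube_star_rating([(9835, 165)]))  # -> 4.92 (98.35% likes)
```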

3. Discogs Rating [FIXED] [10% dominance / x0.1 multiplier]

The Discogs community's rating (x out of 5 stars) of the vinyl release on which the given track first appeared. If the track was released on multiple records in the same year, the most collected vinyl release (the one with the highest Have/Want statistics) is used to get the necessary data.
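A quick sketch of that selection rule, assuming "most collected" means the highest combined Have + Want count (the field names are invented for the example):

```python
def discogs_rating(releases):
    """releases: dicts with 'year', 'have', 'want', and 'rating' fields."""
    first_year = min(r["year"] for r in releases)  # year of first appearance
    candidates = [r for r in releases if r["year"] == first_year]
    most_collected = max(candidates, key=lambda r: r["have"] + r["want"])
    return most_collected["rating"]  # community stars, 0-5
```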



The following two criteria are partially subjective and require some knowledge of historical context, but they are still mostly objective and heavily based on data that can be acquired from multiple sources.

4. Impact & Recognition [5% dominance / x0.05 multiplier]


How influential and popular was the track among the trance community, the general audience, and the critics at the time of its release? Does it have a lasting legacy or a cult following? Did it gain recognition over time thanks to the exposure provided by streaming services or YouTube? Does it have a high number of vinyl releases, remixes, streams, or awards?

5. Originality [5% dominance / x0.05 multiplier]

How innovative and creative was the track at the time of its release? Did it introduce new sounds, production techniques, musical elements, influences, or subgenres to the trance scene? Does it stand out from other tracks of the same era or category?



These two criteria are mostly subjective as they explore the more emotional, artistic, and abstract sides of the selected tracks.

6. Melody & Emotion [5% dominance / x0.05 multiplier]


How well does the track convey a mood or a feeling to the listener? Does it evoke a sense of euphoria, nostalgia, sadness, or excitement? Does it have a memorable hook, a catchy melody, or a powerful climax?

7. Depth & Harmony [5% dominance / x0.05 multiplier]

How diverse and rich are the melodies and sounds of the track? Do they create a harmonious and coherent musical experience? How layered and intricate is the track? Does it reveal new aspects and nuances after repeated listens? Does the track express a message, and does it create a journey-like experience, taking the listener from one point to another?



The last two criteria are predominantly objective since they are mostly based on technical measurements and analyses of each track.

8. Structure & Arrangement [5% dominance / x0.05 multiplier]


How well does the track create a multi-layered and dynamic musical experience, with smooth transitions, effective build-ups and breakdowns, and complementary sounds and melodies? How well does it maintain its emotional impact and tempo throughout, without losing the listener's interest or attention?

9. Technical Quality [5% dominance / x0.05 multiplier]

How well is the track produced, mixed, and mastered? Does it have a clear and balanced sound, with no distortion, clipping, or noise? Does it have a good dynamic range, without loudness-war-style over-compression? Does it have a suitable tempo, pitch, and key for the genre and the mood?



10. Weighted Score

This is the combination of all criteria. Here's the equation for it:

( ( Curator's Rating x 0.40 + YouTube Rating x 0.20 + Discogs Rating x 0.10 + Impact & Recognition x 0.05 + Originality x 0.05 + Melody & Emotion x 0.05 + Depth & Harmony x 0.05 + Structure & Arrangement x 0.05 + Technical Quality x 0.05 ) / 5 ) x 100

Since the weights add up to 1, the maximum score we can get this way is 5. We therefore divide the final result by 5 and then multiply it by 100 to reach a final, percentage-based value (from 0.00% to 100.00%).
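The same equation as a small, self-contained Python sketch (the dictionary keys are just shorthand I made up for the criteria above):

```python
WEIGHTS = {
    "curator": 0.40, "youtube": 0.20, "discogs": 0.10,
    "impact": 0.05, "originality": 0.05, "melody": 0.05,
    "depth": 0.05, "structure": 0.05, "technical": 0.05,
}  # the weights sum to 1.00, so a perfect track scores 5 before scaling

def weighted_score(ratings):
    """ratings: dict mapping each criterion to a 0-5 star value."""
    raw = sum(ratings[name] * weight for name, weight in WEIGHTS.items())
    return round(raw / 5 * 100, 2)  # convert the 0-5 score to a percentage

print(weighted_score({name: 5.0 for name in WEIGHTS}))  # -> 100.0
```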
 

Dreamless

Member
Dec 7, 2022
64 Posts
39 Thanked
What you have feels overly complicated. Keep it simple bro, no rating system has this many variables and formulas behind it. Music is not a machine with parts to dissect in a microscopic way.

It’s cool to see a YouTube and Discogs rating in your spreadsheet, but these things are meaningless and not a good gauge of anything. Who actually rates on these platforms? I don’t believe numbers like this should factor into whether a song is good or not.

But I will say that weighting certain criteria is a good idea. Music quality is more important than recognition; otherwise popular hits skew too high. Just make all this much simpler. It’s never a good sign when something requires multiple paragraphs to explain.
 

Julian Del Agranda

Elite Member
Jul 3, 2020
1,617 Posts
1,862 Thanked
In the end it's all subjective. While I consider Flight 643 a proper trance anthem and a great production, others may say the melody is just one note repeating, and therefore "crap".
 

nightslapper

Senior Member
Oct 5, 2023
579 Posts
361 Thanked
I would measure the skulls of the trance producers

for real though, I just don't care about the labels (genres, I mean) of the music I enjoy
 

Propeller

Senior Member
Jul 20, 2020
687 Posts
432 Thanked
UK
I believe that evaluating music is mainly objective. Otherwise, you get complete newbies telling you that the latest Darren Porter track is the best thing ever.

The main criterion is historical context: what year was it released, what were its influences, and what came before it? When there is an abundance of quality to draw upon for inspiration, there is no excuse for producing poor-quality work other than a lack of talent / laziness / trying to fool others / trying to make a fast buck.

Conversely, when there is little/nothing to draw upon for inspiration and ideas, yet you produce an original banger that subsequently goes on to inspire others, then that is a true mark of greatness.

I believe this criterion is used when judging all types of art.
 

Jetflag

Elite Member
Jul 17, 2020
2,775 Posts
2,210 Thanked
Good question. I defo feel there are some objective standards, but the infamous qualia problem makes it hard to quantify.

art-wise, I think a proper concept is one thing.. I like it when there’s a clear idea behind a track. That said, I once saw a producer just doing whatever and proclaiming “yeah but I make chaos-trance and you mainstream sheeple wouldn’t understand anyway”

needless to say, he didn’t go on to become a very successful musician.

so a degree of structure, as well as some novelty and/or complexity without it becoming too much, often sets the mark for me.

then there are the technical aspects that are a factor..

and then there’s the “I like the way this feels or plays on my memories” subjective factor
 

SaltAcidFatHeatAcid

Senior Member
Jul 19, 2022
285 Posts
400 Thanked
and then there’s the “I like the way this feels or plays on my memories” subjective factor

I think a huge % of my favorite tracks are favorites because of this factor. Granted, some of them may check a lot of technical boxes as well. Sometimes a melody and a groove just resonate with me personally: the way I think, the way my brain hears keys and harmony, the way melodies and vocals or other elements interplay. I've definitely discussed this with others who do not see much in a track that blows my mind, b/c each of us processes music differently in our heads. I've also gotten almost nothing from tracks others consider legendary (for example, Aalto - Rush doesn't tickle my fancy at all, though I know it's technically a very good track. Don't kill me please).

To add to that, usually I only categorize music in ways that serve me as a DJ (I use tags and crates to designate these):
  • Good enough to make my library - If I kept it, it's of solid quality I would play out.
  • Best tracks - These stand out, tracks that are notably high quality and likely in my memory.
  • Epic - These are tracks I just can't get enough of that are truly a cut above.
The value of a track in a specific moment in a set is sometimes based on my assessment of quality. I may filter for epic and best tracks first to see if one fits. But often the best track for that moment in a set is chosen based on flow, character, and mood. It won't always have that best or epic designation, but it might be the perfect track for that moment in the set.
 
Aug 23, 2022
123 Posts
114 Thanked
I'll try to react to all of the comments here. First of all, I appreciate all the feedback. Second, my original idea was actually to keep the ratings simple and only include the Curator's Rating, which basically stands for my perceived overall quality / enjoyment level of the given music (something multiple people seem to have suggested here too). My plan still hasn't changed in the sense that I only plan to include tunes that have true artistic merit, stood the test of time, and still sound great today (so predominantly 4.0, 4.5, and 5.0-star tunes, with some 3.5-star picks).

And yeah, this works very well, as I feel there's a clear distinction between, for example, a 3.5 and a 4.0-star track. However, one of the goals of this Curated Database is to eventually make a Classic Trance Top 1000 list. And the thing is, no one can reliably rate/rank that many tunes in order; it's just impossible. When there are 300 4.5-star tunes, how do we differentiate one of them from the other 299?

Of course, we can probably assume that 5-star tracks will be at the very top of the list, 4.5-star tunes below that, etc. But how do we actually differentiate tracks with the same rating so we can formulate a top list and properly rank them?

Thus, it was recommended to me to use certain criteria to evaluate a tune's quality more precisely, and I actually believe it was a good suggestion. I think of these criteria as modifiers that give some wiggle room to the base Curator's Rating. The Curator's Rating is the dominant part of the equation I used above, so technically, every 4.5-star tune (for example) would stay relatively close to that rating in percentage (4.5 stars is 90%) after taking the modifiers into account. These modifiers (e.g. Depth & Harmony) have low dominance so they don't arbitrarily alter the score too much, but alter it just enough that we can differentiate two 4.5-star tunes from each other. For example, one might get 87.95% in total, while the other gets 93.28%.
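To illustrate, reusing the weighted_score sketch from my first post (all the numbers here are invented for the example):

```python
# Two hypothetical 4.5-star tunes that differ only in the other criteria
track_a = {"curator": 4.5, "youtube": 4.80, "discogs": 4.0,
           "impact": 4.0, "originality": 4.0, "melody": 4.0,
           "depth": 4.0, "structure": 4.0, "technical": 4.0}
track_b = {"curator": 4.5, "youtube": 4.95, "discogs": 4.5,
           "impact": 4.75, "originality": 4.75, "melody": 4.75,
           "depth": 4.75, "structure": 4.75, "technical": 4.75}

print(weighted_score(track_a))  # -> 87.2
print(weighted_score(track_b))  # -> 93.3
```

Both stay close to the 90% base of a 4.5-star Curator's Rating, but the modifiers separate them enough to rank one above the other.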

So again, the Curator's Rating, which stands for the overall quality of the track, would be the dominant part, but the small modifiers would help differentiate tracks that otherwise have very similar quality, just enough to make a top list possible.

And it's not like you HAVE TO look at all this data to understand what's going on anyway, as someone here suggested. There's literally a dedicated column for everything, so if you want, you can look only at the weighted rating. You just need to move your eyes a few millimeters. But the data is also there for those who want it.

I do agree with the statements that historical context matters and that music quality is the most important thing. Plus, it's interesting to see that even in this comment section there are so many different opinions when it comes to evaluating music. And I also agree that YouTube and Discogs ratings might not be super useful in all cases, but I wanted the community's opinion to be taken into account to some extent, and Discogs and YouTube data are still the best publicly available raw data we have about how the general community likes/dislikes certain tunes.
 
May 4, 2022
134 Posts
87 Thanked
A 5-star scale is too limiting and shoots you in the foot before you start. You’re having to come up with something too complicated to deal with that.

Your own personal editor’s rating is nice, and might be just fine for your database. If you want publicly powered ratings, then go for something simple, maybe even with voting. You can find a middle ground between your personal rating and the complex algorithmic formula you stated above; it doesn’t have to be one or the other.

Besides, keep in mind that when I see a 3/5-rated track, I think it must be average to poor, just like a 3/5 movie. I think most people interpret it this way (correct me if I’m wrong, guys?): 3/5 means there is no real quality worth my time. With 7/10 we understand more clearly: it’s good, but not remarkable.

Broaden the rating system and you solve half your problem
 
Aug 23, 2022
123 Posts
114 Thanked
Besides, keep in mind that when I see a 3/5-rated track, I think it must be average to poor, just like a 3/5 movie. I think most people interpret it this way (correct me if I’m wrong, guys?): 3/5 means there is no real quality worth my time. With 7/10 we understand more clearly: it’s good, but not remarkable.

Broaden the rating system and you solve half your problem

Agreed, though there are no 3-star tunes in the database.

"A curated collection that upon completion will cover the entire history of the classic trance era (mostly from 1992 to 2006), showcasing the best tunes the genre has to offer. There's zero tolerance for nostalgia. Only tracks that stood the test of time, have true artistic merit, and still sound great today are listed here. Thus, the overwhelming majority of the selected tunes have a Curator's Rating of 4 stars or above, with a few exceptions (this most often happens when the selected record has such a great melodic hook that it would be a shame to not feature it in this database, but the overall quality of it do not measure up to most of the other picks). This database includes a wide variety of information about each release, including BPM, key, and Camelot key data for mixers, YouTube, Discogs, and weighted ratings, CD/WEB availability, and many more."
 

Contraband

Member
Aug 29, 2023
52 Posts
31 Thanked
Pick like 3 or 4 criteria to help simplify. Impact & Recognition makes sense and is the easiest to measure and agree upon.

Originality is good too, but I prefer the word “innovation” that you used to describe it; it implies more influence on the scene. There are many weird and original tracks, but that doesn’t mean much unless they had a downstream effect, i.e. were innovative.

Kill all the melody, depth, structure, and arrangement ones; they’re too subjective. There are tracks with little or simple melody that are pure fire. Structure is not indicative of quality either, and the same goes for arrangements. Andy Blueman fans would probably tell you the depth, melody, and arrangements are god-tier in his music.

Just have a single criterion that captures musicality, but frame it as the subjective side of all this. Keep in mind those things you mentioned, but don’t overthink or over-isolate it.
 
Aug 23, 2022
123 Posts
114 Thanked
Pick like 3 or 4 criteria to help simplify. Impact & Recognition makes sense and is the easiest to measure and agree upon.

Originality is good too, but I prefer the word “innovation” that you used to describe it; it implies more influence on the scene. There are many weird and original tracks, but that doesn’t mean much unless they had a downstream effect, i.e. were innovative.

Kill all the melody, depth, structure, and arrangement ones; they’re too subjective. There are tracks with little or simple melody that are pure fire. Structure is not indicative of quality either, and the same goes for arrangements. Andy Blueman fans would probably tell you the depth, melody, and arrangements are god-tier in his music.

Just have a single criterion that captures musicality, but frame it as the subjective side of all this. Keep in mind those things you mentioned, but don’t overthink or over-isolate it.

I actually agree with you that the overall impact of technicality and structure is, for the most part, negligible when it comes to the overall quality of a tune. However, there has to be some wiggle room given to each track, so that if we have multiple 4.5-star tracks, we can use a set of criteria to differentiate them somewhat. This is what ultimately makes a Top 1000 list possible. E.g. the first 4.5-star track's overall score might be 88.23%, the second 4.5-star track's 90.11%, and so on.

So I think what ultimately matters is how much things like technical quality factor into the equation/formula, which I can tweak fairly easily for all tracks at once. Each of these criteria currently only influences 5% of the overall score.
 

Hensmon

Admin
TranceFix Crew
Jun 27, 2020
3,140 Posts
2,637 Thanked
UK
My feeling (as you already know) is the same as others' here... the simpler the better.

However, it depends on the goal. If you want a Top 1000 that is powered by public input, then the simple solution is far superior. If you instead want an 'editor's' rating associated with your database tracks, then coming up with a personal criteria set and algorithm to apply is cool and a nice idea. It's a huge amount of effort though, and personally the simple public method is more appealing, to me at least.