Reference Level and Cartridge Transfer Factor

This is where record cutters raise questions about cutting, and trade wisdom and experiment results. We love Scully, Neumann, Presto, & Rek-O-Kut lathes and Wilcox-Gay Recordios (among others). We are excited by the various modern pro and semi-pro systems, too, in production and development. We use strange, extinct disc-based dictation machines. And other stuff, too.

Moderators: piaptk, tragwag, Steve E., Aussie0zborn

Wayne Kirkwood
Posts: 41
Joined: Tue Jul 21, 2015 12:43 pm
Location: Dallas, Texas
Contact:

Reference Level and Cartridge Transfer Factor

Post #46500 by Wayne Kirkwood
Mon Mar 27, 2017 2:44 pm

There may be an official Lathe Troll "Reference Level Thread" but after searching for a few minutes I haven't found one.

I'm often asked to determine proper preamp gain based on vague phono cartridge level specifications.
Often an output level is quoted without any reverence to velocity.

"3.5 mV" doesn't tell me anything.
3.5 mV compared to what?

One manufacturer of some very expensive cartridges, when asked by my client, said: "I've never heard of an output level being referenced to a speed. The level is an absolute measurement."
This was supposedly from the guy who actually assembled them.

Really?
So we're supposed to believe this cart outputs its "2.12" mV whether the platter is stopped or spinning.
Yeah right...

Some manufacturers may specify 5 mV at 5 cm/second 1 kHz.
But what is "5 cm/second?"
Is it RMS or peak?

We're left to guess.
A search of my library led me to Vogel, "The Sound of Silence."

[Image: "Phono Cartridge Transfer Factor," Vogel, The Sound of Silence.]

Vogel says that the peak velocity provides the units for an RMS voltage measurement.
The mixed units seem odd.
Every test disc I have states velocity as an RMS unit, not peak.

Where does Vogel get his 8 cm/second peak value?
Converting 5 cm/second RMS to peak (multiplying by the square root of 2) gives just over 7 cm/second, not 8.

I'm sensing a lack of consensus here.
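For a sine wave the RMS/peak relationship is a fixed factor of sqrt(2), so the conversion can be sanity-checked in a couple of lines. This is a minimal sketch in Python, assuming nothing beyond sinusoidal modulation:

```python
import math

def rms_to_peak(v_rms):
    """Peak of a sine is sqrt(2) times its RMS value."""
    return v_rms * math.sqrt(2)

def peak_to_rms(v_peak):
    """RMS of a sine is its peak divided by sqrt(2)."""
    return v_peak / math.sqrt(2)

print(round(rms_to_peak(5.0), 2))  # 5 cm/s RMS  -> 7.07 cm/s peak
print(round(peak_to_rms(8.0), 2))  # 8 cm/s peak -> 5.66 cm/s RMS
```

Note that 8 cm/s peak works back to 5.66 cm/s RMS, not 5 — so Vogel's figure is not simply the peak equivalent of the common 5 cm/s RMS reference.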

When a manufacturer specifies 5 mV at 5 cm/second we can be fairly certain that the voltage measurement is RMS. (Though we can't be too sure of that either).
But when they say 5 cm/second what do they mean? RMS or peak?
Do they themselves know?

I checked a Stanton 681 with a stated Transfer Factor of 1 mV/cm/second at 1 kHz.
Based on 5 mV I applied the amount of gain that would be required to bring it up to +4 dBu RMS.
Both channels were within 1/2 dB based on 5 cm/second RMS lateral modulation and within the cart's channel balance spec by a wide margin.
(It was actually an STR-100 at 3.54 cm/sec RMS Left and Right only equivalent to 5cm/second RMS lateral/mono.)
Stanton used RMS measurements consistently...
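For anyone checking the arithmetic, here is the gain calculation sketched in Python. The only assumptions are the standard 0 dBu = 0.775 V RMS reference and a cartridge output quoted at the reference velocity:

```python
import math

DBU_REF = 0.775  # volts RMS at 0 dBu

def gain_db_needed(cart_mv, target_dbu=4.0):
    """Gain (dB) required to raise a cartridge's output at reference
    velocity up to the target level in dBu."""
    target_v = DBU_REF * 10 ** (target_dbu / 20)
    return 20 * math.log10(target_v / (cart_mv / 1000))

print(round(gain_db_needed(5.0), 1))  # ~47.8 dB for a 5 mV cart to reach +4 dBu
```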

For the manufacturer who stated "2.12 mV is an absolute" there's no real hope in decoding the specification. We can only guess.

But for a manufacturer that at least provides a velocity what is safe to assume?
RMS?
What do real standards tell us?


Re: Reference Level and Cartridge Transfer Factor

Post #46502 by Wayne Kirkwood
Mon Mar 27, 2017 3:39 pm

Earlier I wrote:
> Often an output level is quoted without any reverence to velocity.

I actually meant "reference." Too bad we can't correct typos, but it made me laugh.

jesusfwrl
Posts: 365
Joined: Thu May 16, 2013 2:24 pm
Location: Earth
Contact:

Re: Reference Level and Cartridge Transfer Factor

Post #46534 by jesusfwrl
Thu Mar 30, 2017 10:32 am

I totally agree with your remarks regarding the necessity of a reference point in cm/s when cartridge output levels are given in datasheets. I also agree that the velocity, just like any other amplitude specification, should explicitly state whether the figure is rms or peak.

Regarding the practical aspects of your question, I think you are attacking the problem from the wrong direction.

Even if a manufacturer were quite precise in informing you that a certain cartridge is designed to produce an output of 5mV for a peak recorded velocity of 7cm/s when terminated by a 47K load, you are still left with unit-to-unit deviation between samples of that cartridge.

Manufacturers offer this value as a rough indication of what kind of preamplifier would be required. These are by no means exact figures. In my experience, the deviation in output level between different samples of the exact same model of cartridge usually far exceeds 1dB. For cheaper cartridges the deviation can easily reach 3-4dB.

There are many standards out there that were in effect at different times in different countries. To complicate matters further, as you have already noticed, there are manufacturers out there who do not seem to understand the content of any such standards or even basic measurement principles.

The only way of calibrating the gain of a preamplifier for a particular cartridge is to hook up the actual cartridge and reproduce a test record containing a 1kHz lateral signal, recorded at a known level.

Then, the data given by the manufacturer becomes insignificant and you can do a proper calibration. This is the only accurate way in which this calibration can be done.

You do not even have to care much about the data offered by the manufacturer of the test record, since many of them failed to indicate whether the values given represent rms or peak velocities.

The easiest way is to verify the recorded level yourself, either by using the Buchmann-Meyer light-pattern method or by measuring the amplitude of groove displacement with a microscope fitted with a measurement reticle. Then you can be fairly certain of the recorded velocity on the test disk, which you can relate to the electrical output of the cartridge (which may be somewhat affected by the practical implementation of the circuit), and eventually arrive at a properly calibrated reproduction system.

You would still need to define your own preferred reference level, but at least that decision is then entirely up to you. As soon as the cartridge is replaced with a different one, the calibration procedure must be repeated, even if the replacement is an identical model.
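To connect the microscope measurement back to velocity: for sinusoidal modulation, peak velocity equals 2*pi*f times peak displacement. A small sketch, assuming a pure 1 kHz sine (the frequency and velocity values here are just the common reference figures):

```python
import math

def peak_displacement_um(v_rms_cm_s, freq_hz):
    """Peak groove displacement (micrometres) for a sine of the given
    RMS velocity: A_peak = v_peak / (2*pi*f)."""
    v_peak = v_rms_cm_s * math.sqrt(2)            # cm/s
    a_peak_cm = v_peak / (2 * math.pi * freq_hz)  # cm
    return a_peak_cm * 1e4                        # cm -> micrometres

print(round(peak_displacement_um(5.0, 1000), 2))  # ~11.25 um at 5 cm/s RMS, 1 kHz
```

So a reticle reading of roughly 11 um peak (22 um peak-to-peak) on a 1 kHz band corresponds to the common 5 cm/s RMS lateral reference.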

To resolve your confusion regarding Vogel's book, which I found interesting to read: his 8cm/s peak velocity is most likely in reference to DIN45547 (1981), which was and probably still is commonly used by German disk mastering engineers. The author mentions that his information comes from Guenther Pauler and SST in Frankfurt.

Most manufacturers of cartridges however appear to be using a 5cm/s rms lateral velocity (7cm/s peak lateral velocity), as their reference.

This is what I also use for calibrating my mastering systems. Keep in mind that this is only useful for measuring a nominal-level test tone, since the actual level of the music on the masters I cut is influenced by a huge variety of factors. I will easily reach peaks of 12dB over the nominal 7cm/s peak lateral velocity when space is not at a premium, but I will also cut super-long sides with levels barely reaching 0dB. This makes for at least 12dB of peak level deviation between different records with music on them.

Unless the user of the preamplifier is interested in measuring my peak levels, their listening experience does not depend much on the nominal level calibration. A far more important consideration is the absolute maximum peak level that can be encountered, which should not cause clipping under any circumstances.

It is customary to use a 1 kHz sine wave for all such level measurements.
If you would require further information or assistance on standards, level calibration methods, and procedures, or custom test records, to assist you in your commercial efforts, please do not hesitate to contact me privately.
~~~ Precision Mechanical Engineering, Analog Disk Mastering ~~~
Agnew Analog Reference Instruments: http://www.agnewanalog.com


Re: Reference Level and Cartridge Transfer Factor

Post #46566 by Wayne Kirkwood
Sun Apr 02, 2017 2:07 pm

jesusfwrl wrote:
> Even if a manufacturer would be quite precise in informing you that a certain cartridge is designed to produce an output of 5mV for a peak recorded velocity of 7cm/s when terminated by a 47K load, you are still left with sample deviation between different samples of each cartridge.
Thank you for the reply.

I agree that the sample deviation is relatively large and that precise gain calibration without a test record is not possible.

Here's my reasoning:
I suppose a typical channel balance error might be 1-2 dB in most carts.
Most seem to be far better than the published limit.
The dB difference between 5 and 7 cm/s is just under 3 dB.
When it comes to setting gain, the balance error is probably smaller than confusion resulting from imprecise specs.
jesusfwrl wrote:
> Regarding the practical aspects of your question, I think you are attacking the problem from the wrong direction.
I agree and going about it backwards is not by choice.
For my own carts I plop on a test disc and calibrate directly.
That's the straightforward method.

Unfortunately my customers are out of the area and I ship to them and, on request, preset the gain or simply tell them what approximate gain they'll need.
My interest is getting gain in the ballpark without having access to the customer's cart.
I can give them a DCR value to use for the gain trimmer so they can set the gain with an Ohmmeter if they don't have instruments or a test record.
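As an illustration of presetting gain by resistance alone: in a gain stage where the trimmer sits in the feedback divider, its DC resistance maps directly to a gain figure, so a customer can set it with nothing but an ohmmeter. The non-inverting topology (gain = 1 + Rf/Rg) and the 10k feedback resistor below are assumptions for the sketch, not Wayne's actual circuit:

```python
def trimmer_resistance_ohms(gain_db, r_feedback=10_000):
    """Ground-leg resistance Rg that yields the requested gain in a
    hypothetical non-inverting stage where gain = 1 + Rf/Rg."""
    gain_lin = 10 ** (gain_db / 20)
    return r_feedback / (gain_lin - 1)

print(round(trimmer_resistance_ohms(40.0), 1))  # ~101.0 ohms for 40 dB with Rf = 10k
```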

Most manufacturers' transfer factors are not too difficult to decipher and provide just enough detail to make me think they know what they're talking about.
Some disclose all the information leaving no guesswork.

When this particular manufacturer told my customer (supposedly via the guy actually making them) that the output was an absolute level, and that he'd never heard of referencing it to a speed, it seemed really odd.
Odd enough to not trust their specification.
(Turns out if you dig deep enough they do publish a spec relating to velocity for some of their other models.)
Maybe the guy making them was new and never read the test record label.
He was definitely not the source material expert...

I ended up assuming that 2.12 mV was relative to 5 cm/sec RMS @1 kHz.
I shipped it that way which turned out to be, based on the gain I used, in the ballpark.
If my customer had been sent a unit calibrated to the more common 5 mV at 5 cm/sec @ 1 kHz, his gain would have been about 7-1/2 dB too low and he would not have been able to hit +4 dBu at his output.
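The 7-1/2 dB figure follows directly from the ratio of the two sensitivities; a quick check with plain dB arithmetic, no other assumptions:

```python
import math

def db_ratio(a, b):
    """Level difference in dB between two like quantities (voltages or velocities)."""
    return 20 * math.log10(a / b)

print(round(db_ratio(5.0, 2.12), 2))  # ~7.45 dB: a 2.12 mV cart vs a 5 mV assumption
print(round(db_ratio(7.07, 5.0), 2))  # ~3.01 dB: peak vs RMS velocity confusion
```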

I set my Stanton 681EE using the "backwards" method to perform a reality check for both level and balance and was amazed at just how close the gain channel balance was.
Within a fraction of a dB - close enough not to reset it. YMMV.
