Service Manual jargon - help -

thinkchronicity

Member (SA)
Hope someone can decipher the Record Level Adjustment info in a couple of old Goldstar manuals (581 & 800).
The first step says to attach a signal generator @ 1kHz, -60dB, at line in, press record, measure line out with a VTVM, and adjust the trimpot for 'A', 'A' being the reading on the VTVM. The second step is to shove in a 1kHz test tape, press play, and it should read about 3dB lower than 'A'.

Question is, what's 'A'?
Does it stand for 'actual', so -60dB in this case?

Thanks
 

Superduper

Member (SA)
thinkchronicity said:
The first step says to attach a signal generator @ 1kHz, -60dB, at line in, press record, measure line out with a VTVM, and adjust the trimpot for 'A', 'A' being the reading on the VTVM.

Question is, what's 'A'?
Does it stand for 'actual', so -60dB in this case?
I'm confused. You included that A is "the reading on the VTVM"... where exactly did you get this "definition"? I looked at the TSR-581 manual and didn't see it defined anywhere. Besides, it doesn't make sense for A to be the reading on the VTVM, because the prescribed adjustment points obviously affect the VTVM reading: no matter where you set the adjustment, A would change with it, making A (if A = VTVM reading) meaningless as a target. Any setting would then be "correct", since A will always equal A.

If A were meant to be -60dB, why not specify that? After all, they explicitly prescribed a -60dB input, so why leave the output vague?

To my knowledge, there is no standard jargon that defines A. I have seen A used to mean amplitude, but usually it's qualified, e.g. "adjust for max A".

I'm a bit puzzled by this as well. My suggestion is to just leave it alone unless you have a good reason to make that adjustment.
 

thinkchronicity

Member (SA)
Superduper said:
I'm confused. You included that A is "the reading on the VTVM"... where exactly did you get this "definition"? [...] My suggestion is to just leave it alone unless you have a good reason to make that adjustment.
Thanks Norm for that.
I added the definition by looking at the block diagram above the instructions. If the input is fixed, then the only thing you can watch is the VTVM, and I presume the idea is to get a -60dB level there too, so input = output, by turning the Record Level pot(s). The next step, the 1kHz tape, then confirms your adjustment, even though it should read 3dB lower. Hence I think A is an abbreviation for 'actual', or maybe 'analogue', as in meaning the same?

The only reason for wanting to know is that I've changed the head with good playback results, so now it's just further down the rabbit hole of knowledge. Might as well check the record levels now.
Also, I believe a different-inductance head means the bias current should be tweaked. The manual states 10V AC on the head, so do I aim for 10V on this new head?
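
(Side note on why the inductance matters: at a fixed bias voltage, the current through the head scales inversely with its reactance. A minimal sketch, treating the head as a pure inductor; the 60kHz bias frequency and 100mH inductance here are made-up placeholders, not values from the Goldstar manual.)

```python
import math

def bias_current_ma(v_rms: float, f_hz: float, l_mh: float) -> float:
    """Rough head bias current, modelling the head as a pure inductor
    (ignores winding resistance and core losses)."""
    reactance_ohms = 2 * math.pi * f_hz * (l_mh / 1e3)
    return v_rms / reactance_ohms * 1e3  # amps -> mA

# Hypothetical values: 10 V RMS of bias at 60 kHz across a 100 mH head
print(f"{bias_current_ma(10.0, 60e3, 100.0):.2f} mA")  # ~0.27 mA

# The same 10 V across a head of half the inductance doubles the current:
print(f"{bias_current_ma(10.0, 60e3, 50.0):.2f} mA")   # ~0.53 mA
```

So a new head with a different inductance may need a different voltage across it to draw the same bias current, which is why the manual's 10V may not simply carry over.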

While I'm at it, the first half of the LED Meter Adjustment makes no sense at all. Lost in translation, perhaps.
 

Superduper

Member (SA)
Well, personally, I see nothing on the block diagram above the instructions that suggests or infers that A represents the VTVM reading at line-out. The input signal strength was very specific, so why not the output too? Furthermore, the input was specified in dB, but the output is supposed to be measured with a VTVM, which reads volts. Since signal generator outputs are usually given in either µV or dBm, are we supposed to convert from dBm to the microvolt equivalent of -60dB? Why not put the exact desired microvolt reading into the "adjust-for" field?

Finally, something to consider: the block diagram shows the signal being injected into the (mic) input, not line-input as you stated. This is a very important distinction, because microphones typically have very low signal strengths, which is why the diagram shows the signal passing through a mic amp and then a flat amp. After being amplified by two stages, it's fair to say the output signal will be far more robust than the input signal.

Let me also offer a suggestion which may make more sense to you: if the -60dB microphone input signal were kept at the same level at line-out (provided you could even measure -60dB), any device connected to the line-out jacks wouldn't even register such a low signal. Line-out should actually sit at the usual references: -10dBV (0.316V) consumer line level, or +4dBu (1.23V) professional (0dBu being 0.775V). In other words, the mic input level needs to be boosted by some 60dB to give the proper line-out signal strength. So presume we shoot for the consumer -10dBV line-level reference; in that case, we need to see 0.316V at line-out.
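
(A minimal sketch of the dB-to-volts conversions just mentioned; the 1V and 0.775V references are the standard dBV and dBu conventions, not anything from the Goldstar manuals.)

```python
def dbv_to_volts(dbv: float) -> float:
    """dBV is referenced to 1.0 V RMS."""
    return 1.0 * 10 ** (dbv / 20)

def dbu_to_volts(dbu: float) -> float:
    """dBu is referenced to 0.775 V RMS (1 mW into 600 ohms)."""
    return 0.775 * 10 ** (dbu / 20)

print(f"-10 dBV (consumer line) = {dbv_to_volts(-10):.3f} V")        # ~0.316 V
print(f" +4 dBu (pro line)      = {dbu_to_volts(4):.3f} V")          # ~1.228 V
print(f"  0 dBu                 = {dbu_to_volts(0):.3f} V")          # 0.775 V
print(f"-60 dBV (mic level)     = {dbv_to_volts(-60) * 1e3:.1f} mV") # 1.0 mV
```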

But let's move on from that discussion and look at a preceding adjustment: the VU meter adjustment. As you can see, they specify that when playing an MTT-112B test tape, the VU meter should read +4dB. I doubt you have an MTT-112B test tape, so you can instead inject a -20dB signal into line-in, in which case the VU meter should read 0dB. Since both procedures prescribe adjusting the same pots, one can surmise that either adjustment is fine for setting the VU meter.

Now, moving on, we also see that in the recording level adjustment the same test tape is prescribed for measuring the output, and the expected reading should be 3 or 4dB lower than A, depending on which SM you follow. Let's say 3dB. Now, I hope I haven't lost you, but if the VU meter adjustment expects +4dB using that tape, and the record level adjustment expects 3dB less than A, doesn't it stand to reason that A = +4dB + 3dB = +7dB? In the VU meter adjustment, the VU meter itself is what is being calibrated; in the rec level adjustment, you are measuring at line-out. If you trust the VU meter, you can use it to set +7dB with a -60dB input at mic. However, if you would prefer to set it using a DMM (presuming nobody has a VTVM anymore), then you should probably first find out what the line-out level is with the meters at +4dB (or with -20dB at line-in), and set your expected output 3dB above that reading.
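
(To make that inference explicit, here's the arithmetic as a throwaway check; the +4dB tape reading and the 3dB offset are straight from the SM, the rest is just dB addition.)

```python
# Figures from the service manual procedures:
vu_with_test_tape_db = 4.0  # VU meter adjustment: MTT-112B should show +4 dB
tape_below_a_db = 3.0       # Record level adjustment: same tape reads 3 dB below 'A'

# If the tape plays back at +4 dB, and that playback is defined as 3 dB
# below 'A', then 'A' sits 3 dB above the tape reading:
a_db = vu_with_test_tape_db + tape_below_a_db
print(f"A = +{a_db:.0f} dB on the (calibrated) meters")  # A = +7 dB
```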

As to your last question regarding the VU meter adjustment, it appears to me that either:
  1. You set the VU meter by injecting a -20dB signal at line-in and adjusting to read +4dB on the VU meter, or
  2. You set the VU meter by playing the MTT-112B test tape and adjusting to read +4dB on the VU meter.
Choose one or the other.
 

thinkchronicity

Member (SA)
Superduper said:
Well, personally, I see nothing on the block diagram above the instructions that suggests or infers that A represents the VTVM reading at line-out. [...] doesn't it stand to reason that A = +4dB + 3dB? [...] If you trust the VU meter, you can use it to set +7dB with a -60dB input at mic.
All good stuff, thanks a lot. My brain's loving this conundrum. The conundrum of the mysterious 'A'.

OK, the output = input thing was a bit of a clanger of mine. And I forgot it was a mic amp in the recording set-up. Anyway, I followed everything you said apart from the line starting 'however, if you would prefer to set using...', but still, 95% there (maybe). A couple more observations: in the LED section, aren't they telling us that their 0dB reference is 500mV, when they want us to input -20dB and read 50mV at line out? Step 1 here seems to be about calibrating your signal generator only... strange.
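
(If that reading is right, the arithmetic does point at a 500mV reference; a quick check, assuming the manual's -20dB is relative to the deck's own 0dB point:)

```python
line_level_mv = 50.0  # the 50 mV the manual pairs with its -20 dB figure
rel_db = -20.0

# Working back from "50 mV is -20 dB below the reference":
ref_mv = line_level_mv / 10 ** (rel_db / 20)
print(f"implied 0 dB reference = {ref_mv:.0f} mV")            # 500 mV

# And the sanity check the other way round:
print(f"-20 dB re 500 mV = {500 * 10 ** (-20 / 20):.0f} mV")  # 50 mV
```
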
Btw, the VU meters only go up to +5dB max, so I can't use that as a method for setting the recording level, and A would then be 5 + 3 = 8dB.
Also, something else I spotted in the record level adjustment: in step 1 it is actually printed "adjust for .A", i.e. dot-A, one-tenth of A, so that would now be -12dB - which is the 2nd LED bar lit up - for a 0.5mV (-60dB) input (if you go with my ref level above).
What do they mean by "flat amp"? Flat EQ?
I suppose all this dB stuff is a leftover from the old meters that sometimes had a dB scale on them. Hoping to get one of those one day, if only to make this business easier.
Well, it's not quite crystal clear yet, but you've moved me on a whole load, so I really appreciate it.
 

Superduper

Member (SA)
Note: it should be unnecessary to quote a post, especially a long one, if you reply immediately below it; if you want to accentuate a particular area, it's best to trim the quoted text down to the subject of your reply.

You have to excuse the discrepancies... I am reading from both the TSR-581 and TSR-801 service instructions, which are very similar. You never actually disclosed which model you are making adjustments on; I just figured it was a question in general. The TSR-581 apparently uses LED meters but the TSR-801 uses VU meters, so it's fair to say there might be some differences in that regard.

Because the VU meter adjustment and the recording level adjustment both use the MTT-112B test tape, and because you have a prescribed target for one of those tests, you can take the result of that test and apply the mathematical relationship between the two to come up with what A should be. The thing is this: once the VU meter adjustment is set, you can measure the line-out signal while playing the MTT-112B (or equivalent). Now, moving on to the record level adjustment, you don't know what A is, but you do know what the output from the MTT-112B is, right? And the instructions say to verify that the MTT-112B plays back 3dB less than A. So, with this information, you can figure it out, right?
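
(The same inference in voltage terms; a sketch only - the 42mV figure is a made-up placeholder for whatever you actually measure at line-out with the test tape playing, and the 3dB offset is the SM's.)

```python
measured_tape_out_mv = 42.0  # placeholder: your line-out reading with MTT-112B playing
tape_below_a_db = 3.0        # from the SM: tape playback sits 3 dB below 'A'

# 'A' as a voltage target for the record level adjustment:
a_mv = measured_tape_out_mv * 10 ** (tape_below_a_db / 20)
print(f"adjust record level for ~{a_mv:.1f} mV at line-out")  # ~59.4 mV
```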

As for when I said "however, if you would prefer to set using...", what I meant was that IF you trust the VU meter on the deck, you can use it and simply add 3dB. However, I thought it was a needle meter, not an LED bar. I would be less trusting of an LED meter than a needle, so forget that; at this point I'd suggest the method in the previous paragraph instead.

As for calibrating the signal generator with that test method: that is incorrect. Signal generators in general (and we are not talking about the consumer stuff, whether cheap eBay or computer-based) are lab-grade instruments and are supposed to be regularly calibrated. Furthermore, if the signal generator doesn't have a trustworthy way to inject the prescribed signal level, then that signal strength should be verified and monitored at the input. It is up to the technician to know their instruments and whether or not they are accurately calibrated. In any case, every signal generator should be inherently more accurate than the boombox being calibrated; using a boombox that is being calibrated to calibrate a lab-grade instrument would boggle the mind. What that VU adjustment procedure is doing is calibrating the VU meter on your boombox so that it displays signal strength accurately.

I'm not sure about your math, but at this point I don't want to invest any more brain cells.

As to your "dot A" observation... it's 99.9999% unlikely they mean .A; it's basically a printing artifact. The record level adjustment procedure in the TSR-801 manual is essentially identical, and there it does not show .A, just a clean A.

A flat amp is an amplifier that amplifies a signal without changing its frequency balance. Some amplifiers, such as a phono amp, deliberately change the tonal signature.
 

thinkchronicity

Member (SA)
Cheers for the heads-up on the customs round here, and for your other thoughts, of course. Still chuckling at the idea of a boombox being used to calibrate bench gear haha.
I should have mentioned that the 581 is the one I'm focusing on, as it's on the bench with a new head and soon a new roller. I'm not trying to overly complicate things, but I suppose it will seem that way in all the confusion I still have. Anyway, let's leave the record level adjustment out of it, and all that 'A' business, which as you say can be extrapolated from the meter test.
The problems I still have with setting up: 1. I don't have an MTT-112B tape. (I see GennLab do copies of nearly all the Teac test tapes APART from the one in question here. They have a 212B 1kHz at -4dB, but I don't know if that level can be trusted to be the same as the 112's. Their MTT 0dB tape is at Dolby level, btw.)
And 2. I'm not sure what -20dB on the signal gen looks like in volts... what's the reference? I have an old analogue SG with no dB scale.
Looking at the 800 manual at this point: it tells us, as you said, that -20dB at line-in should read 0dB on the VUs - that'll be +3dB on the 581's peak LEDs. The 581 manual tells us in step 1 that -20dB from the SG at line-in looks like 50mV at line-out. So on the 581, the 0dB (on the meters) line-out level would be 3dB down from 50mV, i.e. 35mV! That seems a long way off the 316mV consumer level... which begs the question, what would happen if I injected a normal consumer level at line-in? Massive distortion.
Surely line-in = line-out in (auto) record mode; that's why they're called line levels. In that case, going with the manual's take on the numbers, I assume that -20dB on my SG should measure 50mV at its terminals, unless I'm missing something.
But the main thing is:
I don't get why a -20dB line-in input should equate to 0dB on the VUs (+3dB on the LEDs). That's my headache.
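
(For what it's worth, the arithmetic in that paragraph checks out; a quick sketch, assuming the 581's meter 0dB sits 3dB below the +3dB point:)

```python
line_out_at_minus20_mv = 50.0  # 581 manual step 1: -20 dB in -> 50 mV out
led_reading_db = 3.0           # that same signal supposedly shows +3 dB on the LEDs

# If 50 mV lights +3 dB, meter 0 dB corresponds to 3 dB less voltage:
meter_zero_mv = line_out_at_minus20_mv * 10 ** (-led_reading_db / 20)
print(f"meter 0 dB ~= {meter_zero_mv:.0f} mV at line-out")  # ~35 mV

# ...which is a long way below the -10 dBV consumer reference:
consumer_mv = 1000 * 10 ** (-10 / 20)
print(f"consumer line level ~= {consumer_mv:.0f} mV")       # ~316 mV
```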

Sorry if this is making your head hurt. My brain VUVUs are smashing into the pillars just checking this through :'-(
 

Superduper

Member (SA)
OK, it's not your fault, because Goldstar really did a poor job on the SM instructions, and I think it could be dangerous to make presumptions given instructions this poor. On the -20dB thing with the TSR-581: I think you are misinterpreting it as measuring 50mV at line-out. Reading it again, it appears to be asking you to inject a -20dB signal into line-in with the SG output adjusted to 50mV; if you think about it, you are adjusting the SG output (line-in) to 50mV. It isn't a given that line-out will equal line-in; in fact, with a line/phono amp AND a flat amp in between, it's fair to say it probably doesn't. Neither should it be presumed that boombox gear conforms to any kind of standard reference. Again, you are combining things that don't seem to be related. You said you didn't understand why -20dB equates to 0dB on the VUs (+3dB on the LEDs); where did you get that? It seems you are only supposed to get that result when playing the MTT-112B test tape. Frankly, based on how I'm reading the instructions, the former (-20dB) test/adjustment doesn't even look like it has a purpose.

You are also presuming massive distortion at the output if the input is greater than -20dB. But you haven't even analyzed the circuit. Some circuits have auto-level features built in, and that may be the case here. You don't know this.

In any event, at this point, when dealing with a boombox, I would just adjust to what you feel sounds good and leave it at that. These aren't precision machines; in reality they are a far drop below even the most basic home stereo tape decks. Personally, I would just record your favorite programs using a home system and use the boombox for playback. Don't pull your hair out; it's not worth it, certainly not for this model. There are far more sophisticated models with high-quality decks, and on those perhaps it's worth calibrating everything more precisely, but on this one I don't think it's worth the effort.
 

thinkchronicity

Member (SA)
Thanks Norm. Yep, I know where you're coming from with your last paragraph - if it sounds good, it is good with these machines. This being my first proper restoration, I thought I'd go fully into things and sharpen my teeth for better models in the future. Plus, this 581 is a bit special, being made in Nigeria and rescued from a boot sale in the north of England, where it apparently had a hard life at the beach - if you press eject and get close enough you can still smell ozone and rock pools. It was a rusty, sandy affair when I got hold of it.
Superduper said:
By that, you said you didn't understand why -20db equates to 0db on the VU's (+3db on leds). Where did you get that? Because it seems you are only supposed to get that result when playing the MTT-112b test tape.
I got that from the 800 manual, VU meter adjustment step 1, and you reiterated it yourself in your second reply. The tape should read +3 on the VUs and +5dB on the LEDs. So I still find it puzzling that only 50mV at line-in is supposed to read 0dB on the VUs (+3 on the LEDs). The only other tape service manual I've really studied is the one for my Nak CR-2, which wants you to input 435mV to adjust the LEDs to 0dB. Very different beasts, clearly.
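
(Just to put a number on "very different beasts" - taking the two 0dB input levels quoted above at face value:)

```python
import math

nak_0db_mv = 435.0      # Nakamichi CR-2: input level for the LEDs to read 0 dB
goldstar_0db_mv = 50.0  # Goldstar: line-in level said to read 0 dB on the VUs

diff_db = 20 * math.log10(nak_0db_mv / goldstar_0db_mv)
print(f"the two 0 dB points differ by {diff_db:.1f} dB")  # ~18.8 dB
```
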
The good news is that we both agree their 0dB ref in mV is 500mV. With that info I should be able to do something. I'll get stuck into the practical side of this soon, and who knows, might even end up with a number for the mysterious 'A'.

A gold star to anyone who's worked it out yet. Should be a safe bet.
 

thinkchronicity

Member (SA)
So I did the tests with my signal generator, without making any pot adjustments inside the 581, just to see what levels are coming out - assuming no-one's had a go inside this box already, which is unlikely.

LED meter adjustment: 50mV (-20dB) line-in @ 1kHz read 50 and 56mV at line-out on the right & left channels respectively, in record mode (auto). On the LEDs I had -3dB showing. Zero gain, therefore line-in = line-out, as I expected. Their diagram with the line amp and flat amp is misleading. My boombox has a weaker right channel in playback/radio mode, so I'm not surprised at the different outputs. Wish I knew what to turn to fix that!

REC level adjustment: 0.5mV (-60dB) mic-in @ 1kHz gave 55mV line-out in auto record mode on the left channel. My right mic input channel has a hum problem, so I couldn't test that one. That means the internal mic amp has a tidy 40dB of gain, bringing things up to the previous test's line-out amplitude.
'A' in mV, then, is supposed to be 50 to 55, maybe.
I also played a 0dB Dolby-level test tape, and that showed 0dB on the LEDs, indicating to me that the meters are correctly calibrated. The only odd thing, then, is that according to the 800 manual, 50mV at line-in should light up +3dB on the LEDs, whereas I saw -3dB. A 6dB difference here. Hmmm.
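
(Checking the measured figures above with the same dB arithmetic; all voltages are the ones just quoted.)

```python
import math

def gain_db(v_out_mv: float, v_in_mv: float) -> float:
    return 20 * math.log10(v_out_mv / v_in_mv)

# LED meter test: 50 mV line-in -> 50 mV (R) and 56 mV (L) at line-out
print(f"line path gain, R: {gain_db(50, 50):+.1f} dB")   # +0.0 dB (unity)
print(f"line path gain, L: {gain_db(56, 50):+.1f} dB")   # ~+1.0 dB

# REC level test: 0.5 mV mic-in -> 55 mV line-out (left channel)
print(f"mic amp gain:      {gain_db(55, 0.5):+.1f} dB")  # ~+40.8 dB, the 'tidy 40 dB'
```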