>
> If you run a CRT marginally below its nominal anode voltage (18.4 kV vs.
> 19.5 kV), and proportionally lower the cathode, focus, and screen voltages,
> how will that affect the warm-up process? The complaint is that as the
> CRT warms up (over ~15 minutes), the change in display intensity is so
> dramatic that you go from no picture when cold to visible retrace when
> warm. This means continually (frequently) adjusting the screen voltage
> during the first 15 minutes of operation.
>
I seriously doubt that a small (less than 5 kV) change in high voltage would
have a noticeable effect on brightness. If the high voltage and focus voltage
don't track each other, you would see a focus shift. Of course, a drop in
high voltage WOULD change the screen voltage, and THAT would change
brightness. Another thing that would cause a slow change in brightness is
insufficient heater voltage to the CRT, or a weak CRT. But if changing the
HV module fixes the problem, it wouldn't be either of these.
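
(For reference, the proportional scaling described in the question works out
to roughly a 5.6% reduction. Here's a minimal Python sketch of that
arithmetic; only the 19.5 kV nominal / 18.4 kV actual anode figures come from
the post, and the other nominal electrode voltages are hypothetical
placeholders, not values from this monitor.)

# Proportional-scaling sketch: only the anode figures are from the post;
# the other nominal electrode voltages below are made-up placeholders.
NOMINAL_ANODE_KV = 19.5
ACTUAL_ANODE_KV = 18.4

scale = ACTUAL_ANODE_KV / NOMINAL_ANODE_KV   # ~0.944, about a 5.6% drop

# Hypothetical nominal electrode voltages (volts)
nominal = {"focus": 5400.0, "screen": 400.0, "cathode": 130.0}

for electrode, volts in nominal.items():
    print(f"{electrode:>8}: {volts:7.1f} V nominal -> {volts * scale:7.1f} V scaled")

print(f"scale factor: {scale:.3f}  ({(1 - scale) * 100:.1f}% reduction)")
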
On this particular monitor I'd be looking at the video B+ (181 volt) line to
the neck board. I've seen too many bad ZD902 zeners in the past to ever
start trusting that part. I'd just stick a new NTE5100A in there and see
what happens.
> Thanks,
> -Mark
>