I was just thinking about synchronisation and the "taped up" 32nd hole, and how it all works.
As far as I now understand it, the NBTVA standard has synch pulses for 31 of the 32 scanlines.
These pulses are then locked, via a phase-locked loop, to the incoming signal from the IR sensor looking at the 31 open holes on the disc (the 32nd being taped up).
Presumably the motor is being driven at some voltage, and the PLL modifies the voltage - increasing if the IR pulse arrives late, and decreasing if it is early.
And between pulse detections, the disc has some inertia but is also slowing due to friction. And there are the characteristics of the drive belt to consider.
But simplistically, if you drive the motor at some constant voltage, the disc will rotate at a constant speed. There may be acceleration issues changing from one voltage to another.
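The correction scheme I'm imagining could be sketched roughly like this (a toy proportional loop with made-up numbers, not anyone's actual circuit):

```python
# Toy sketch of the idea above: on each IR pulse, compare its arrival
# time with the expected time and nudge the motor drive voltage.
# All values (nominal voltage, gain, timings) are invented for illustration.

def update_drive(voltage, expected_t, actual_t, gain=0.5):
    """Proportional correction: pulse late -> raise voltage, early -> lower it."""
    error = actual_t - expected_t   # positive when the pulse arrives late
    return voltage + gain * error

v = 6.0                                                # nominal drive voltage
v = update_drive(v, expected_t=0.080, actual_t=0.082)  # pulse arrived 2 ms late
# v is now slightly higher, so the disc speeds up and the pulse
# should arrive a little earlier next time round
```

A real controller would also need some damping (integral/derivative terms, or the loop filter in a hardware PLL), or the disc would hunt back and forth around lock.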
So I got to thinking - why is this operating at 400 Hz (32 scanlines at 12.5 Hz)? Well, 31 if you want to be pedantic, so 387.5 Hz.
Why wouldn't it work with just a single synch hole? The electronics already provides a "frame synch" pulse, so the PLL could use this and mask off 31 of the 32 holes.
I find it hard to understand why, if a potentiometer can trim the frame lock to "reasonably OK" - that is, a slow roll - it is necessary to run the PLL at a much higher frequency than a human would need to keep the picture stable. A PLL would have far better control over voltage precision, at a far higher correction rate (even at only 12.5 Hz), than a human twiddling a pot. It should be able to do a splendid job of locking to a frame with just a single IR hole.
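One piece of arithmetic worth putting beside the question: if the disc's speed drifts between corrections (friction, belt slip, supply wobble), the error the loop must catch up on grows with the interval between sync pulses. A rough illustration (not a motor model - the drift rate is an arbitrary placeholder):

```python
# Rough illustration: assume the disc's speed drifts at some constant
# fractional rate between corrections. The accumulated error before the
# next correction scales with the time between sync pulses.
# speed_drift_per_s is an arbitrary made-up figure.

def worst_case_drift(correction_hz, speed_drift_per_s=0.001):
    """Fractional speed error accumulated between successive corrections."""
    return speed_drift_per_s / correction_hz

per_line  = worst_case_drift(387.5)  # correcting on every scanline hole
per_frame = worst_case_drift(12.5)   # correcting once per revolution
# per_frame is 31x per_line: a single-hole loop has to ride out (or
# predict) 31 times more drift between corrections than a per-line loop.
```

Whether that factor of 31 actually matters in practice - given the disc's inertia - is exactly what I'm asking.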
So what am I not understanding? Why isn't it done this (much simpler) way?