OK, we have a puzzle for everyone, and we're hoping the collective wisdom
here can provide some insight.
We do lots of DVD authoring and DVD replication jobs, but a recent one had an unacceptable number of complaints from the end users, and we can't figure out why. The job is a fairly straightforward authored DVD. We use DVD Studio Pro and Compressor. Each DVD has a 10-second first play, a single two-button menu, and a total of about 3 hours of video, encoded at a CBR of 2.5 Mbps. There are 12 different DVDs in this job, all with similar specs. The copies are duplicated since it is low volume. We've been doing this job for several years in pretty much the identical fashion, and this is the first year that there have been complaints. The big problem for us is that we can't recreate the problem here. The users complain that the DVD will play for a while, then "freeze" or become pixelated and then stop, and sometimes hang their computer. Most of the problems were on PCs (both new and old) with Windows Media Player, but it seems there were also complaints from standalone DVD players.
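As a quick sanity check on the encode itself, the numbers above can be run through some back-of-the-envelope arithmetic to confirm the payload fits comfortably on a single-layer DVD-R. The audio bitrate below is an assumption (the post doesn't state it); a typical stereo AC-3 track is used as a placeholder:

```python
# Capacity sanity check: does 3 h of CBR 2.5 Mbps video fit a DVD-5?
hours = 3
video_mbps = 2.5          # video CBR from Compressor, as stated in the post
audio_mbps = 0.224        # ASSUMED stereo AC-3 track; not stated in the post
seconds = hours * 3600
payload_gb = (video_mbps + audio_mbps) * seconds / 8 / 1000  # decimal GB
dvd5_gb = 4.7             # nominal single-layer DVD-R capacity, decimal GB
print(f"payload ~ {payload_gb:.2f} GB of {dvd5_gb} GB")      # well under capacity
```

So at roughly 3.7 GB the discs are nowhere near the capacity limit, and 2.5 Mbps CBR is a conservative rate for DVD-Video, which makes a bitrate-related mastering problem less likely on its face.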
So it seems we have borderline playable DVDs that pass our QC tests but have an unacceptable failure rate in the field. Now we're trying to determine the most likely culprit: either something in the authoring/encoding process, or something in the duplication process. The perplexing thing is that we do dozens of authoring and duplication jobs in the same fashion, but this job is the only one that is giving us grief.
We could suspect the master is at fault, since that is the one thing that is obviously unique to this job. But how likely is that? We've heard that there may be things in DVDSP that can create titles that have problems playing in certain situations. Can anyone confirm or deny that, or shed any more light on it? Is 2.5 Mbps from Compressor ever a problem? Is there any definitive way to test? And if it is in the authoring, would it still cause problems if the job were replicated?
If it's not in the master, then it would imply that the duplicates had problems. We know that DVD-Rs are not as playable as replicated discs, but we never see this number of problems. We rechecked the problem copies that were returned and found something disturbing. Our standard method of checking copies is to use "copy and compare" on our duplicators (Verity and R-Quest), and then also do a byte-count check on a test machine. All these discs pass those tests. However, we sent them to a partner to check on their Eclipse system, and some of the discs failed that compare test. We also have a standalone DVD tower with an A-card controller that also failed some, but not all, of the comparisons. What the heck is going on? If the Eclipse test implies that there may be a bit or two that doesn't match in a particular block, could that cause the failures in the field, or is it impossible to tell? And if we can't trust the compare function of the duplicators, what should we do? Is there some way to calibrate or verify the compare function?
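One independent cross-check worth considering: a byte-count check only proves the copy is the same *size* as the master, not that the bytes match, whereas a cryptographic checksum of the full disc image would catch even a single flipped bit. A minimal sketch in Python (the file paths are hypothetical; a copy's image could be ripped back to a file first, e.g. with `dd`):

```python
import hashlib

def sha256_of(path, chunk_size=1024 * 1024):
    """Stream a file (e.g. a ripped disc image) and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 1 MB chunks so multi-GB images don't have to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage -- compare the master image against an image
# ripped back from a suspect duplicate:
#   if sha256_of("master.img") != sha256_of("suspect_copy.img"):
#       print("Mismatch: the copy is not bit-identical to the master")
```

Because the digest is computed off-drive, this also sidesteps the question of whether any one duplicator's built-in compare can be trusted: two independent readers producing the same digest for the same disc is strong evidence the read itself is stable.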
So, what's your vote: authoring, replication, media (we use Ritek)? And what about the "compare" problem on the duplicators? Any help would be greatly appreciated.