Hi, Peter: What do you do in such situations? Sundar Dorai-Raj
and I have extended "glm" concepts to models driven by a sum of k
independent Poissons, with a linear model for log(defectRate[i])
for each source (i = 1:k). To handle convergence problems, etc., I
think we need informative Bayesian priors, but we're not there yet. In
any context where things are done more than once [which covers most
human activities], informative priors seem sensible. A related
question comes with data representing the differences between Poisson
counts, e.g., with d[i] = X[i]-X[i-1] = the number of new defects
added between steps i-1 and i in a manufacturing process. Most of the
time, d[i] is nonnegative. However, in some cases, it can be
negative, either because of metrology errors in X[i] or because of
defect removal between steps i-1 and i. Comments?
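To make the first model concrete, here is a minimal simulation sketch of a count driven by a sum of k independent Poissons, each with a log-linear rate. It is in Python/NumPy rather than R, and the values of k, n, the single covariate x, and the coefficients b0, b1 are all made up for illustration; they are not from the actual model Sundar and I fit.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 3                    # hypothetical number of defect sources
n = 200                  # hypothetical number of observed units
x = rng.normal(size=n)   # one made-up covariate

# hypothetical coefficients: log(defectRate[i]) = b0[i] + b1[i] * x
b0 = np.array([-1.0, -0.5, 0.2])
b1 = np.array([0.3, -0.2, 0.1])

# per-source rates, shape (k, n)
rates = np.exp(b0[:, None] + b1[:, None] * x[None, :])

# each observed count is the sum of k independent Poisson draws,
# so the total is itself Poisson with rate = sum of the k rates
counts = rng.poisson(rates).sum(axis=0)
```

Note that the sum of independent Poissons is again Poisson; the identifiability trouble (and hence the appeal of informative priors) comes from trying to attribute the total to the individual sources through their separate linear predictors.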
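On the difference question: if X[i-1] and X[i] were independent Poissons, d[i] = X[i] - X[i-1] would follow a Skellam distribution, which puts mass on negative values even when the means differ. A tiny sketch (again Python/NumPy, with made-up rates lam1 and lam2) shows how readily negative differences appear:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# hypothetical rates for X[i-1] and X[i]
lam1, lam2 = 5.0, 6.5

# d = X[i] - X[i-1] for independent Poissons is Skellam-distributed
d = rng.poisson(lam2, n) - rng.poisson(lam1, n)

frac_negative = (d < 0).mean()  # sizable even though E[d] = 1.5 > 0
```

Of course, in the manufacturing setting X[i] and X[i-1] are cumulative counts on the same unit and far from independent, so the Skellam is at best a reference point; the negative d[i] from metrology error versus genuine defect removal would presumably need separate terms in the model.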