Proof that sparsity of effects is not a good assumption?



As a dues-paying member of the American Institute of Chemical Engineers (AIChE), I got my Chemical Engineering Progress (CEP) magazine today, the Feb 2006 issue. In the article "Designing Experiments for the Modern Micro Industries," author Phillip H. Williams claims that in his semiconductor industry the processes are so complex that engineers canNOT assume the sparsity of effects principle* rules. He supports this contention by showing numerous three-factor interactions (3FIs) from a Minitab software analysis of a full 32-run two-level factorial on 5 factors.

After realizing that Table 1, which shows the response data, is out of standard order (1, 2, 3, …, 19, 20, 21, 24, 22, 23, 25, 26, 27, …, 32), I got Design-Expert version 7 software to agree with Williams' results. However, from the Box-Cox plot it is evident that an inverse square root transformation helps. Oh, and by the way, Williams uses the relatively risky p-value of 0.1 as the cut-off for significance. As a practical matter I would say that main effects predominate, as predicted by sparsity of effects. However, it appears Williams does have some basis for saying that this principle falls down in his case, which produces a number of apparently significant 3FIs.

Nevertheless, I am not swayed (as he is) from the advice (quoted from the article) that "if you have five factors … never do the full factorial since the 2^(5-1) is a resolution-five design." This still makes sense to me: why do 32 runs when 16 will normally do? I will email my DX7 file to anyone interested in playing with this data.
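To make the half-fraction argument concrete, here is a minimal sketch in plain Python (not the Design-Expert or Minitab workflow, just an illustration under my own assumptions) that builds the 2^(5-1) design using the standard generator E = ABCD. Because the defining relation is I = ABCDE, the design is resolution V: main effects and two-factor interactions are aliased only with three-factor or higher interactions.

```python
# A minimal sketch (plain Python, no DOE package) of the 16-run
# half fraction: a full 2^4 factorial in A, B, C, D with the fifth
# factor generated as E = ABCD.  The defining relation I = ABCDE
# makes this a resolution-V design, so main effects and two-factor
# interactions are clear of one another.
from itertools import product

runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d              # generator E = ABCD
    runs.append((a, b, c, d, e))

print(len(runs), "runs")           # 16 runs instead of the full 32
for run in runs:
    print(run)
```

Every alias chain in this layout pairs a two-factor interaction with a three-factor interaction (for example, AB with CDE), which is exactly the trade the sparsity of effects principle says is safe to make.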

*From Wikipedia: “The sparsity of effects principle states that a system is usually dominated by main effects and low-order interactions. Thus it is most likely that main (single factor) effects and two-factor interactions are the most significant responses. In other words, higher order interactions such as three-factor interactions are very rare.”
