2018-10-29

These results were obtained in spite of choosing one balancing variable poorly. In our piloting, we observed that water treatment practices varied substantially with whether the compound had a connection to the municipal gas supply. A gas connection also appeared to be a useful proxy for better overall socio-economic status. However, gas coverage in our study area turned out to be much higher than in the pilot area (even though the pilot area was nearby and similar in many other respects) and was in fact nearly universal. Despite this unhelpful balancing variable, the algorithm produced a sample that was well balanced on the other balancing variable and on our main SES variable, household monthly income. We view the robustness of the method as encouraging.
Discussion and conclusion

Given the theoretical optimality of the D-optimal method, and the evidence in its favor from our simulations and other simulation studies (Atkinson, 2002; Senn et al., 2010), it is somewhat puzzling that this free lunch has previously gone unclaimed, both in economics and more broadly. We consider this situation in two ways: first, by comparing D-optimal allocation to the two most popular covariate-adaptive allocation methods, block randomization and minimization (McEntegart, 2003; Taves, 2010; Pond et al., 2010); second, by addressing concerns that have inhibited all covariate-adaptive methods.

Block randomization, described in Section 5.1.2, is intuitively appealing, simple to implement, and allows for randomization inference through re-randomization within blocks. However, it can only be used with discrete (or discretized) covariates, and as the number of blocks (the number of cells formed by fully interacting the balancing covariates) grows relative to the overall sample size, a “remainder problem” can arise in which marginal balance (overall balance in the number of units per treatment) must be traded off against balance within blocks. Minimization (Taves, 1974; Pocock and Simon, 1975), in which each unit is assigned so as to minimize the sum of absolute (or squared) imbalances across the balancing variables (see McEntegart, 2003 for an extended discussion and examples), is simple to implement and can be used when there are many balancing covariates. As with block randomization, it requires discrete or discretized balancing variables.

Relative to block randomization and minimization, we suspect that the primary drawback of D-optimal allocation is its complexity. McEntegart (2003) notes that “the method is difficult to explain to nonstatisticians and it is probably for this reason that it has never been used in practice.” Furthermore, the algorithm is not trivial to program, and implementation requires rapid communication between the field and a centralized database. However, mobile computing power and communication technology continue to improve, which will reduce these barriers.

All covariate-adaptive methods have faced academic and regulatory resistance, chiefly on two grounds: predictability and inference. Predictability arises when the implementer can anticipate, either with certainty or with high probability, the treatment to which the next unit with a given set of covariates will be allocated. The implementer may then be tempted to manipulate the order in which units are processed in a way that is correlated with potential outcomes. This manipulation is clearly a concern in minimization, because the algorithm's simplicity can make it easily predictable. Block randomization is also subject to predictability if the implementer knows the number of treatments and the block length: as the trial moves towards the end of each successive block, the implementer knows which treatments remain, and therefore has at least some information about which treatment the next unit may receive. Both minimization and block randomization can be adapted to reduce predictability by adding a probabilistic element similar to the “biased coin” approach we describe in Section 3.7, at the cost of some increase in operational complexity and some loss of balance. D-optimal allocation is also subject to predictability, although in this case its relative complexity may be a virtue, because it makes prediction difficult.
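To make the biased-coin idea concrete, here is a minimal sketch of a minimization rule with a probabilistic element, in Python. The names (`minimization_assign`, `p_best`, and so on) are hypothetical, and this is not code from the paper or from any particular package: it assigns each incoming unit to the arm that minimizes summed marginal imbalance across the balancing variables, except that with some probability it deliberately does not, which is what blunts predictability.

```python
import random
from collections import defaultdict

def minimization_assign(unit, allocated, arms=("control", "treatment"), p_best=0.8):
    """Assign `unit` (a dict mapping balancing variable -> level) to an arm.

    `allocated` is a list of (covariates, arm) pairs for units already assigned.
    With probability `p_best` the unit goes to the arm that minimizes summed
    marginal imbalance; otherwise it goes to one of the other arms at random.
    """
    # Count units at each (variable, level) within each arm.
    counts = defaultdict(int)
    for covs, arm in allocated:
        for var, level in covs.items():
            counts[(var, level, arm)] += 1

    def imbalance_if(arm):
        # Imbalance that would result from placing the new unit in `arm`:
        # for each balancing variable, the spread of counts across arms at
        # this unit's level, summed over variables.
        total = 0
        for var, level in unit.items():
            hypothetical = [counts[(var, level, a)] + (a == arm) for a in arms]
            total += max(hypothetical) - min(hypothetical)
        return total

    ranked = sorted(arms, key=imbalance_if)
    best, others = ranked[0], ranked[1:]
    # Biased coin: usually, but not always, take the balance-minimizing arm.
    if not others or random.random() < p_best:
        return best
    return random.choice(others)
```

Setting `p_best = 1` recovers deterministic Taves-style minimization; lowering it trades some balance for unpredictability.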
As mentioned in Section 6, predictability can be made virtually impossible without resorting to a biased coin by including “placebo” covariates in the intake form and not telling implementers which covariates are operative.
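For readers who want to see the mechanics of the D-optimal method itself, the following is a rough sketch under a simplified linear model in which each allocated unit contributes a design row consisting of an intercept, a treatment indicator, and its balancing covariates; the next unit is then assigned to whichever arm most increases det(X′X). The names are hypothetical, and the sketch deliberately omits the biased-coin element of Section 3.7 (Atkinson-style designs assign the determinant-maximizing arm only with some probability), so treat it as an illustration rather than the paper's implementation.

```python
import numpy as np

def d_optimal_assign(x_new, history_rows, arms=(0, 1), rng=None):
    """Pick the arm for a new unit so as to maximize det(X'X).

    Each row of X is [1, arm, covariates...]; `history_rows` holds the rows of
    already-allocated units and `x_new` is the new unit's covariate vector.
    Hypothetical sketch only: no biased coin, treatment coded 0/1.
    """
    rng = rng or np.random.default_rng()
    if not history_rows:              # nothing allocated yet: randomize
        return int(rng.choice(arms))
    scores = []
    for arm in arms:
        row = np.concatenate(([1.0, float(arm)], np.asarray(x_new, dtype=float)))
        x = np.vstack(history_rows + [row])
        # log|X'X| via slogdet for numerical stability; a singular X'X
        # (too few units so far) simply scores -inf.
        sign, logdet = np.linalg.slogdet(x.T @ x)
        scores.append(logdet if sign > 0 else -np.inf)
    if np.all(np.isneginf(scores)):   # cannot distinguish arms yet: randomize
        return int(rng.choice(arms))
    return arms[int(np.argmax(scores))]
```

As written, the argmax rule is deterministic given the allocation history and the new unit's covariates, which is exactly why a biased-coin element or the “placebo” covariates described above matter in practice.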