3 Continuous and integer sample size
The gsDesign package has historically used continuous values for sample size and event counts at the time of design. This has the advantage that the timing of analyses can be specified precisely as a fraction of the final sample size (normal and binary endpoints) or a fraction of targeted events (time-to-event outcomes). Disadvantages include ambiguity when updating designs at the time of analysis based on integer sample size or event counts, as will be demonstrated below. We illustrate basic implementation of integer-based sample size in this chapter and provide further examples throughout the book. The `toInteger()` function converts designs to integer-based sample size. For time-to-event designs created using the `gsSurv()` or `gsSurvCalendar()` functions, the `toInteger()` conversion also produces integer-based event counts.
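As a brief illustration of the time-to-event case, a `gsSurv()` design can be passed through `toInteger()` in the same way; the enrollment, event rate, and follow-up assumptions below are purely illustrative and not taken from this chapter:

```r
library(gsDesign)

# Illustrative time-to-event design: exponential control with median 12,
# target hazard ratio 0.7, one interim at 60% of targeted events
xs <- gsSurv(
  k = 2, test.type = 1, alpha = 0.025, beta = 0.1, timing = 0.6,
  lambdaC = log(2) / 12, hr = 0.7, # control event rate, hazard ratio
  eta = 0.001,                     # dropout rate
  gamma = 10, R = 16,              # enrollment rate and duration
  T = 36, minfup = 20              # study duration, minimum follow-up
)
toInteger(xs)$n.I # integer targeted event counts at each analysis
```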
We consider a simple design with a user-defined endpoint and a fixed design sample size of `n.fix = 1000`, with default 1-sided Type I error \(\alpha = 0.025\) and 90% power. We add a single interim analysis after 60% of the trial is available for analysis, using only a superiority bound at the interim (`test.type = 1`), and apply a Hwang, Shih, and De Cani (1990) spending function (`sfu = sfHSD`) with spending parameter \(\gamma = -3\) (`sfupar = -3`). We see immediately that the derived design does not have integer sample sizes at analyses.
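A design along these lines can be derived as follows; this is a sketch reconstructed from the parameters stated above, with `x` denoting the design object used in the conversions below:

```r
library(gsDesign)

# Group sequential design matching the description above:
# 1 interim at 60% of observations, efficacy bound only (test.type = 1),
# HSD spending with gamma = -3, fixed-design sample size of 1000
x <- gsDesign(
  k = 2, test.type = 1, alpha = 0.025, beta = 0.1,
  timing = 0.6, sfu = sfHSD, sfupar = -3,
  n.fix = 1000
)
x$n.I # continuous (non-integer) sample sizes at analyses
```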
The integer-based sample size for this design is obtained as follows:
y <- toInteger(x)
y$n.I
#> [1] 611 1019
The interim sample size above has simply been rounded, while the final sample size has been rounded up. The `ratio` parameter in the `toInteger()` function controls how the total trial sample size is rounded. The value `ratio = 1` assumes 1:1 randomization and, thus, an even total sample size.
y <- toInteger(x, ratio = 1)
y$n.I
#> [1] 611 1020
In the `toInteger()` function, `ratio` specifies that the total sample size of the input design is rounded up to the next multiple of `ratio + 1`. Thus, if the randomization ratio were 5:2, we might want the sample size to be a multiple of 7 and would specify:
toInteger(x, ratio = 6)$n.I
#> [1] 611 1022
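The rounding rule for the final analysis can be written out directly in base R: take the continuous final sample size up to the next multiple of `ratio + 1`. The sketch below uses a hypothetical continuous final sample size of 1018.3, not the exact value from the design above:

```r
n_final <- 1018.3 # hypothetical continuous final sample size

# Round n up to the next multiple of ratio + 1
round_up_to_multiple <- function(n, ratio) {
  ceiling(n / (ratio + 1)) * (ratio + 1)
}

round_up_to_multiple(n_final, ratio = 0) # next integer -> 1019
round_up_to_multiple(n_final, ratio = 1) # multiple of 2 -> 1020
round_up_to_multiple(n_final, ratio = 6) # multiple of 7 -> 1022
```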
There is also a parameter `roundUpFinal` with a default value of `TRUE`. If `FALSE`, the final value is simply rounded rather than rounded up. In our example above, this makes no difference.
toInteger(x, ratio = 2, roundUpFinal = FALSE)$n.I
#> [1] 611 1020
For the design with 1:1 randomization and an even total sample size, rather than the input information fraction of 0.6 at analysis 1, we have a slightly smaller value:
y$n.I[1] / y$n.I[2]
#> [1] 0.5990196
Also, rather than the originally targeted power of 90%, we have a total power of:
100 * sum(y$upper$prob[, 2])
#> [1] 90.0303
Now we update the design assuming that, instead of 2 analyses after 611 and 1020 observations, we have 3 analyses as shown in the code below. Most of the code in the `gsDesign()` call copies parameters from the integer-based sample size design `y` defined above. We note the Z-value bounds for efficacy under the asymptotic distributional assumptions of the previous chapter.
yu <- gsDesign(
# Assume 3 analyses actually done
k = 3, n.I = c(400, 700, 1100),
# Remaining parameters copied from original design
maxn.IPlan = y$n.I[y$k],
test.type = y$test.type,
alpha = y$alpha, beta = y$beta, astar = y$astar,
sfu = y$upper$sf, sfupar = y$upper$param,
sfl = y$lower$sf, sflpar = y$lower$param,
delta = y$delta, delta1 = y$delta1, delta0 = y$delta0
)
yu$upper$bound
#> [1] 2.754625 2.444650 2.038284
A key parameter above, and one that leads to some ambiguity in the case of a continuous sample size, is `maxn.IPlan`: the planned final sample size or, for time-to-event outcomes, the planned final analysis event count. The prescribed approach is as follows, identical to the coding approach above for `yu` but replacing the integer sample size design `y` with the continuous sample size design `x`:
xu <- gsDesign(
# Assume 3 analyses actually done
k = 3, n.I = c(400, 700, 1100),
# Remaining parameters copied from original design
maxn.IPlan = x$n.I[x$k],
test.type = x$test.type,
alpha = x$alpha, beta = x$beta, astar = x$astar,
sfu = x$upper$sf, sfupar = x$upper$param,
sfl = x$lower$sf, sflpar = x$lower$param,
delta = x$delta, delta1 = x$delta1, delta0 = x$delta0
)
xu$upper$bound
#> [1] 2.754051 2.443665 2.038577
This gives a slightly different result than if we specify the rounded (integer) sample size from the original design in `maxn.IPlan`, which is the only change in the following code.
xu <- gsDesign(
# Assume 3 analyses actually done
k = 3, n.I = c(400, 700, 1100),
# Remaining parameters copied from original design
maxn.IPlan = 1020,
test.type = x$test.type,
alpha = x$alpha, beta = x$beta, astar = x$astar,
sfu = x$upper$sf, sfupar = x$upper$param,
sfl = x$lower$sf, sflpar = x$lower$param,
delta = x$delta, delta1 = x$delta1, delta0 = x$delta0
)
xu$upper$bound
#> [1] 2.754625 2.444650 2.038284
With the integer-based design in `y`, `maxn.IPlan` will be 1020 whether obtained from `y$n.I[y$k]` or entered directly from a summary table. This removes any ambiguity about how bounds should be updated at the time of study analyses, when realized event counts or sample sizes will generally differ from the original plan simply due to logistical considerations.