Data-structure dynamization is a general approach for making static data
structures dynamic. It is used extensively in geometric settings and in the
guise of so-called merge (or compaction) policies in big-data databases such as
Google Bigtable and LevelDB (our focus). Previous theoretical work is based on
worst-case analyses for uniform inputs: insertions of one item at a time and a
constant read rate. In practice, merge policies must not only handle batch
insertions and varying read/write ratios, but can also exploit such
non-uniformity to reduce cost on a per-input basis.
To model this, we initiate the study of data-structure dynamization through
the lens of competitive analysis, via two new online set-cover problems. For
each, the input is a sequence of disjoint sets of weighted items. The sets are
revealed one at a time. The algorithm must respond to each with a set cover
of all items revealed so far. It obtains the cover incrementally from
the previous cover by adding one or more sets and optionally removing existing
sets. For each new set, the algorithm incurs a build cost equal to the total
weight of the items in the set. In the first problem, the objective is to minimize total
build cost plus total query cost, where the algorithm incurs a query cost at
each time $t$ equal to the current cover size (the number of sets in the
cover). In the second problem, the
objective is to minimize the build cost while keeping the query cost from
exceeding $k$ (a given parameter) at any time. We give deterministic online
algorithms for both variants, with competitive ratios of $\Theta(\log^* n)$ and
$k$, respectively. The latter ratio is optimal.
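To make the cost model concrete, the following is a minimal Python sketch of the accounting in the first problem. The policy `merge_all_if_large` and its `threshold` parameter are hypothetical illustrations chosen for brevity, not the algorithms analyzed here; the sketch only shows how build cost (the weight of each newly built set) and query cost (the cover size at each time) accumulate.

```python
from dataclasses import dataclass, field

@dataclass
class CoverState:
    # Each cover set is represented by its total item weight; since the
    # revealed sets are disjoint, the items themselves do not matter for
    # the cost accounting.
    sets: list = field(default_factory=list)
    build_cost: float = 0.0
    query_cost: float = 0.0

def step(state, batch_weight, policy):
    """Reveal one set of items with total weight `batch_weight`.

    `policy` picks which existing cover sets to merge with the new items;
    the merged result is built as one new set, incurring a build cost equal
    to its total weight. Then a query cost equal to the cover size accrues.
    """
    merged = policy(state.sets, batch_weight)   # indices of sets to merge away
    new_weight = batch_weight + sum(state.sets[i] for i in merged)
    state.sets = [w for i, w in enumerate(state.sets) if i not in merged]
    state.sets.append(new_weight)
    state.build_cost += new_weight              # build cost = weight of new set
    state.query_cost += len(state.sets)         # query cost = current cover size

def merge_all_if_large(sets, batch_weight, threshold=4):
    # Toy policy (not the paper's algorithm): rebuild everything into one
    # set whenever the cover would otherwise exceed `threshold` sets.
    return list(range(len(sets))) if len(sets) + 1 > threshold else []

state = CoverState()
for w in [3, 1, 5, 2, 2, 7, 1]:                 # weights of arriving sets
    step(state, w, merge_all_if_large)
print(state.build_cost + state.query_cost)      # objective in the first problem
```

Under this accounting, the second problem instead fixes the constraint `len(state.sets) <= k` after every step and asks the policy to minimize `state.build_cost` alone.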