I am a transdisciplinary statistician and data scientist who develops integrative, holistic statistical methodologies for multi-faceted problems, mainly from the environmental and biological sciences. Under an integrative methodological framework, one can formulate a single (often complex) model to address multiple scientific questions simultaneously in a unified manner. A statistical framework addresses research questions through statistical inference (data-based estimation of parameters and their uncertainty). Thus, an integrative statistical methodology is a single statistical inference framework that unifies multi-faceted evidence-based research, with valid uncertainty propagation across facets. Contrast this with a series of standalone statistical analyses (e.g. via off-the-shelf software packages), in which the parameter and uncertainty estimates from each analysis correspond to only one narrow aspect of the overarching research, neglecting altogether the uncertainty contributed by the other aspects.
I have coined names for several of my methodologies, e.g. the bent-cable regression approach, the latent health factor index (LHFI) modeling framework, the assessment of similarity in preference between networked individuals, and the multiresolution heritability measure.
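As a concrete illustration, here is a minimal sketch of the bent-cable mean function: two linear segments joined smoothly by a quadratic bend. The parameterization below (intercept b0, incoming slope b1, slope change b2, bend centre tau, bend half-width gamma) follows the usual presentation of bent-cable regression; the specific parameter values are hypothetical, chosen only to show the behaviour far from the bend.

```python
import numpy as np

def bent_cable(t, b0, b1, b2, tau, gamma):
    """Bent-cable mean function: an incoming line with slope b1,
    an outgoing line with slope b1 + b2, joined by a quadratic
    bend of half-width gamma centred at tau."""
    t = np.asarray(t, dtype=float)
    q = np.where(
        t < tau - gamma, 0.0,                        # before the bend
        np.where(t > tau + gamma, t - tau,           # after the bend
                 (t - tau + gamma) ** 2 / (4.0 * gamma)),  # inside the bend
    )
    return b0 + b1 * t + b2 * q

# Far left of the bend the function is the incoming line b0 + b1*t;
# far right it follows the outgoing line with slope b1 + b2.
y_left = bent_cable(-10.0, b0=1.0, b1=0.5, b2=-0.8, tau=0.0, gamma=1.0)
y_right = bent_cable(10.0, b0=1.0, b1=0.5, b2=-0.8, tau=0.0, gamma=1.0)
```

In a regression setting the five parameters, notably the bend location tau and half-width gamma, are estimated from data along with their uncertainty, rather than fixed as above.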
I develop most of my work under the Bayesian inference paradigm: its flexibility facilitates the proper modeling of complex natural phenomena, typically coupled with complex processes under which data are observed, thereby giving rise to convoluted “big-data” structures. An extreme example of such a structure is data on the temporal evolution of the 3-dimensional movement of marine plankton, monitored by a coordinated network of sources (e.g. telemetry data, field observations, output from computer simulation models) that are mutually dependent in space, in time, and across the network. A key challenge in this setting is that implementing such a methodology requires highly advanced computational algorithms and hardware, even for “small data.”
Complexity can also arise when subject-matter theoretical models (e.g. process models, computer simulation models) exist alongside data. I have been employing Bayesian melding to integrate the theoretical model into the statistical framework as a form of strong prior. Bayesian melding is typically highly computationally intensive.
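The following toy sketch illustrates the melding idea on a grid, in the logarithmic-pooling style of Poole and Raftery's formulation. Everything here is hypothetical: a linear function stands in for a real deterministic theoretical model, and the normal priors and pooling weight are illustrative only.

```python
import numpy as np

def normal_pdf(x, mu, sd):
    # density of N(mu, sd^2), evaluated pointwise
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def M(theta):
    # hypothetical deterministic "theoretical model" mapping input to output
    return 2.0 * theta + 1.0

theta = np.linspace(-5.0, 5.0, 2001)   # grid over the model input
d = theta[1] - theta[0]

q1_theta = normal_pdf(theta, 0.0, 1.0)  # prior on the input theta
phi = M(theta)                          # model output along the grid
q1_star = normal_pdf(phi, 1.0, 2.0)     # prior on phi induced by q1 (N(1, 2^2), since M is linear)
q2_phi = normal_pdf(phi, 0.0, 1.5)      # independent "expert" prior placed directly on phi

# Logarithmic pooling with weight alpha reconciles the induced and direct
# output priors; the melded prior on theta reweights q1 accordingly.
alpha = 0.5
melded = q1_theta * (q2_phi / q1_star) ** (1.0 - alpha)
melded /= melded.sum() * d              # renormalise on the grid
melded_mean = float(np.sum(theta * melded) * d)
```

Here the direct output prior pulls the melded input prior away from q1 (the melded mean is negative rather than zero); with alpha = 1 the melded prior reduces to q1 alone. In realistic applications M is an expensive simulator rather than a one-line function, which is the main source of the computational burden.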
Read more about Assoc Prof Grace Chiu on her personal website