Mathematics and Metrology at NIST: Applications for Microfluidics and Biological Systems

Seminar
06/03/2019

2:30 pm
INRIM, Building M
Conference Hall

Paul Patrone
Applied and Computational Mathematics Division
National Institute of Standards and Technology (NIST), Gaithersburg


Abstract

In this talk, I will survey two research areas at NIST in which collaboration between applied mathematicians and bioengineers has led to advances that would not have otherwise been possible.

The first area concerns the development of microfluidic devices for measuring properties of individual cells and groups thereof. In principle, such devices enable one to control the motion of fluids and objects in flow down to scales of tens of microns. As a measurement platform, microfluidic systems therefore make it easier to reduce uncertainties by (i) bringing the objects under study (e.g. cells) closer to the instrumentation and (ii) controlling local environmental conditions. However, much of the theory and engineering needed to use microfluidic systems as metrology tools does not yet exist. In this context, I will discuss our recent discovery of scaling methods as a means to determine quantities such as flow rates, which inform the operating conditions of microfluidic devices. The underlying theory has a rich basis in the analysis of non-linear partial differential equations, although the key ideas can be reduced to simple mathematical statements for measurement purposes. I will also discuss some of our initial attempts to extend these tools to problems such as cell counting.
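To make the idea of a scaling collapse concrete, the following minimal Python sketch is an illustrative toy, not the specific theory developed at NIST: it assumes a signal of the hypothetical form s(t; Q) = S(Q t), so that rescaling the time axis of each trace collapses the data onto a single master curve, and the best-fit scale factor recovers the flow rate relative to a reference trace. All names and the synthetic signal are assumptions for illustration only.

import numpy as np

def collapse_scale(t_ref, s_ref, t, s, scales):
    """Return the time-scale factor that best maps (t, s) onto (t_ref, s_ref)."""
    best_a, best_err = scales[0], np.inf
    for a in scales:
        s_rescaled = np.interp(t_ref, a * t, s)   # trace replotted against a*t
        err = np.mean((s_rescaled - s_ref) ** 2)  # mismatch to the reference trace
        if err < best_err:
            best_a, best_err = a, err
    return best_a

# Synthetic example: master curve S(x) = 1 - exp(-x) sampled at two "flow rates"
t = np.linspace(0.0, 5.0, 200)
Q_ref, Q2 = 1.0, 2.3
s_ref = 1.0 - np.exp(-Q_ref * t)
s2 = 1.0 - np.exp(-Q2 * t)

a_hat = collapse_scale(t, s_ref, t, s2, np.linspace(0.5, 5.0, 451))
print("recovered Q2/Q_ref:", a_hat)   # expect approximately 2.3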

In the second half of my talk, I will consider the roles of data analysis and optimization as they pertain to uncertainty quantification (UQ) of fluorescence-based melt curves for DNA. In many contexts, these data can be extracted from real-time polymerase chain reaction machines as a means of detecting topological changes and/or mutations in DNA samples. However, small variations in experimental conditions can lead to O(1) differences in fluorescence counts, complicating attempts to reproduce and validate results. Moreover, background signals and the baseline behavior of fluorophores can confound attempts to extract the “true” measurement signal associated with an underlying physical process of interest. In this context, I will discuss convex optimization methods we have developed to: (i) determine when a collection of melt curves represents realizations of the same physical process, differing only in the scale of the raw data (e.g. due to experimental variation); and (ii) subsequently transform all datasets onto a master curve to facilitate downstream analysis and UQ. The main idea behind our approach is to model the data-collection process in terms of affine transformations, which then serve as inputs to the optimization. Constraints can also be applied to test whether data collapse can be achieved in a physically meaningful way. Applications to Förster resonance energy transfer experiments demonstrate the usefulness of this approach.
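As a rough sketch of the affine data-collapse idea (a simplified stand-in for the convex optimization formulation, with hypothetical function names and synthetic curves), the Python snippet below models each melt curve as y_i = a_i m + b_i for a shared master curve m, fits each (a_i, b_i) by ordinary least squares against a reference curve, and inverts the affine map to collapse the data.

import numpy as np

def affine_collapse(curves, ref=0):
    """Map every curve onto curves[ref] via y -> (y - b) / a."""
    m = curves[ref]
    collapsed, params = [], []
    for y in curves:
        # Solve min_{a,b} || a*m + b - y ||^2, a convex linear least-squares problem
        A = np.column_stack([m, np.ones_like(m)])
        (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        collapsed.append((y - b) / a)   # invert the fitted affine map
        params.append((a, b))
    return np.array(collapsed), params

# Synthetic example: sigmoidal "melt curves" differing only in scale and offset
T = np.linspace(60.0, 95.0, 300)                 # temperature grid (degrees C)
m_true = 1.0 / (1.0 + np.exp((T - 78.0) / 1.5))  # shared underlying signal
rng = np.random.default_rng(0)
curves = [a * m_true + b + 0.01 * rng.standard_normal(T.size)
          for a, b in [(1.0, 0.0), (2.4, 0.3), (0.7, -0.1)]]

collapsed, params = affine_collapse(curves)
print("fitted (a, b) per curve:", [(round(a, 2), round(b, 2)) for a, b in params])

Fixing one curve as the reference removes the overall scale ambiguity of the unknown master curve; in a fuller treatment, the reference and additional constraints would be chosen so that the collapse remains physically meaningful.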


Biographical sketch

Paul Patrone is a staff scientist in the Applied and Computational Mathematics Division at NIST, Gaithersburg. He received a master's degree in applied mathematics and a Ph.D. in physics, both from the University of Maryland, College Park. His main research interests include partial differential equations, asymptotics and perturbation methods, coarse-graining, and uncertainty quantification.
