Research Training Group
Contact Information
General contact information for the School of Mathematical and Statistical Sciences, including mailing address, can be found here.
Specific questions regarding the RTG can be sent to:
Rodrigo B. Platte
rbp@asu.edu
480-727-8545
GWC 634
Research Experiences for Undergraduates
Undergraduate students are welcome to join the RTG. The most common ways to engage in research under the supervision of one of the RTG faculty are honors theses and research assistantships. The latter are most likely to take place during the summer. Limited NSF funding is available for qualifying US citizens and permanent residents. If you are interested, please email one of the faculty members.
Possible projects are listed below.
 Sampling strategies in function approximation
 Medical and radar imaging
 Experiment designs in functional MRI
 Signal processing
 Data Mining
 Numerical solution of partial differential equations
Research Training Group:
Data-Oriented Mathematical and Statistical Sciences
Acknowledgment of the challenges of extracting useful information from ever-growing torrents and oceans of raw data has become nearly ubiquitous over the past decade. Mathematical and statistical reasoning are central to addressing these challenges, and the mathematical sciences have established an impressive track record in providing methodology for “big data” problems as they have emerged in recent decades. The ASU Research Training Group (RTG) program is sponsored by the National Science Foundation to keep pace with these challenges. The program includes training in three areas:
 Statistics is by its nature concerned with the analysis of data. Concepts like the development of sufficient statistics for hypothesis tests and the identification of estimators for critical model parameters that make efficient use of collected data remain among the most powerful in the modern arsenal.
 Computational Mathematics has been primarily responsible for algorithmic speedups that have rivaled Moore's law advances in processing technology in enabling meaningful processing of data. It also provides a bridge between “exact” solutions and heuristic algorithms by providing rigorous approximate solutions with certificates of fidelity and complexity.
 Harmonic Analysis has underpinned most of the advances in data compression over the past thirty years, providing mechanisms for dimensionality reduction through parsimonious representation of high-dimensional data in judiciously chosen bases or frames. More recently, this area of mathematics has been instrumental in advancing ways to identify and exploit compressibility, not just through low-dimensional subspaces of linear spaces but also by capitalizing on other kinds of low-dimensional structure.
The RTG program fosters integration across these areas to cultivate mathematical scientists who have skills in all three of them and can furthermore understand how to draw on concepts from multiple areas in addressing data-oriented problems. Examples of research questions to be addressed by the synergy of these disciplines include (but are not limited to):
 finding and analyzing efficient and adaptive data collection strategies in sequential experimental design
 reconstructing signals and/or images from incomplete and/or noisy data sources
 devising measurement and other data collection strategies that optimize the value of the data in subsequent statistical tests or estimators
All ASU undergraduate students, graduate students, and postdoctoral fellows are welcome to participate in the RTG seminar, which will include both research and professional development components.
Undergraduate students, graduate students, and postdoctoral fellows participating in the RTG program will have the opportunity to complete some research activity at an off-site location, typically during the summer at a national research laboratory or medical center. This will give participants a chance to collaborate with researchers from diverse backgrounds and other scientific disciplines on real data-oriented problems.
Those interested in participating should contact Rodrigo Platte.
Funding is provided by the National Science Foundation and the School of Mathematical and Statistical Sciences.
Faculty affiliated with the RTG include:
Al Boggess
boggess@asu.edu
480-965-5892
Doug Cochran
cochran@asu.edu
480-965-7409
Anne Gelb
(Now at Dartmouth College)
Ming-Hung (Jason) Kao
minghung.kao@asu.edu
480-965-3466
Rodrigo Platte
Rodrigo.Platte@asu.edu
480-727-8545
John Stufken
John.Stufken@asu.edu
Postdoctoral Fellows
David Kaspar
Probability, PDEs,
Statistical Mechanics
(Current)
Toby Sanders
Inverse Problems, Tomography,
Radar Imaging
(Past)
Graduate Students
 Lauren Crow, PhD Statistics (NSF fellow)
 Michael Culp, PhD Applied Mathematics
 Victoria Dollar, PhD Applied Mathematics (NSF fellow)
 Miandra Ellis, PhD Applied Mathematics (NSF fellow)
 Genesis Islas, PhD Applied Mathematics (NSF fellow)
 Tony Liu, PhD Applied Mathematics (NSF fellow)
 Camille Moyer, PhD Applied Mathematics (NSF fellow)
 Abigael Nachtsheim, PhD Applied Mathematics (NSF fellow)
 Joe Sadow, PhD Applied Mathematics
 Theresa Scarnati, PhD Applied Mathematics
 John Stockton, PhD Statistics (NSF fellow)
Undergraduate Students
 Alyssa Burgueno, REU/MCTP 2017
 Matthew Kinsinger, REU 2017
 Megan Sopa, Honors Thesis 2017
 Courtney Page-Bottorff, Honors Thesis 2016
 Alexander Reynolds, REU 2016
Seminars
MAT/STP 591 Topic: Data-Oriented Mathematical and Statistical Sciences
Schedule: Mondays 1:30-2:30pm in WXLR 021 (lower level)
Description: This seminar series is part of the NSF-RTG Data-Oriented Mathematical and Statistical Sciences program. Seminar speakers will include ASU faculty and postdocs, outside visitors, and students. The RTG seminar will focus on both research and professional development. Topics of interest include mathematical and statistical challenges related to data problems that have emerged in recent years.
The seminar is open to all ASU students and faculty. In addition, students may register for 1 credit hour (pass/fail) or 3 credit hours (standard grading). Students registering for 1 credit must attend all talks. Students registering for 3 credits must attend all talks and present two regular-length seminar talks on pre-approved topics (or two parts of the same topic). Under special circumstances, the course instructor may propose a different set of requirements. RTG fellows are required to register for three credit hours.
Prerequisite: Degree or non-degree-seeking graduate student. Registration for three credit hours requires instructor approval.
RTG Seminar - Fall 2018
The RTG seminar is open to everyone. ASU students may register for 1 or 3 credits. Further information is available here.
Course syllabus
The seminars are at 1:30pm in Wexler 021.
 Aug 27, Rodrigo Platte
Function approximation, the curse of dimensionality, and sampling strategies
 Sep 10, Rodrigo Platte
Function approximation, the curse of dimensionality, and sampling strategies
 Sep 24, Robert Skeel
Pattern Recognition and Machine Learning
 Oct 1, Robert Skeel
Pattern Recognition and Machine Learning
 Oct 15, Milan Stehlik
Information theory approach to Machine learning, Neural Computing, and Artificial Intelligence: a new perspective for Statistical inference and optimal design?
 Oct 22, Milan Stehlik
Information theory approach to Machine learning, Neural Computing, and Artificial Intelligence: a new perspective for Statistical inference and optimal design?
 Oct 29, Sharon Crook
Evaluating the Data-driven Model
 Nov 5, Toby Sanders
Some Thoughts on Data-oriented Math and Stats through a Parameter Selection Problem
 Nov 19, Student Presentations: Demetrios Papakostas and Camille Moyer
 Nov 26, Student Presentations: Casey Smith and Steven Reed
 Dec 03, Student Presentations: Esther Boyle
RTG Seminar - Spring 2018
The RTG seminar is open to everyone. ASU students may register for 1 or 3 credits. Further information is available here.
 Jan 8, Dave Kaspar
Organizational meeting
 Jan 22, Doug Cochran
Distributed decision problems, Part I.
 Jan 29, Lauren Crider
Distributed decision problems, Part II.
 Feb 5, Adeline Kornelus
Version control with git
 Feb 12, Dave Kaspar
Pattern theory, Part I.
 Feb 19, Dave Kaspar
Pattern Theory, Part II.
 Feb 26, Richard Hahn
Uncertainty assessment via iterated simulated learning
 Mar 12, Toby Sanders
An introduction to inverse problems and synthetic aperture radar imaging
 Mar 26, Jason Kao
Computer experiments
 Apr 2, Kevin Lin
Discrete-time approach to stochastic parametrization of spatiotemporal chaos
 Apr 9, Rob McCulloch
A general approach to variable selection in nonlinear models
 Apr 16, Victoria Dollar, student
African Easterly Waves in current and future climates
 Apr 16, Tony Liu, student
Optimal sampling for polynomial data fitting on complex regions
 Apr 23, Abigael Nachtsheim, student
Nonparametric subsampling for big data
 Apr 23, Bechir Amdouni, student
Patterns of dropouts and the role of sociodemographic and perception factors for middle school students
 Apr 23, Miandra Ellis, student
A multiresolution approach for Superparamagnetic Relaxometry data
RTG Seminar - Fall 2017
The RTG seminar is open to everyone. ASU students may register for 1 or 3 credits. Further information is available here.
 Aug 21, Toby Sanders
Introduction
 Aug 28, Rodrigo Platte
Function approximation from discrete data
 Sep 11, Toby Sanders
More data with more noise, or less data with less noise: in the context of image reconstruction and electron microscopy
 Sep 18, Joe Chen
X-ray diffractive imaging of finite crystals
 Sep 25, Jason Kao
Functional brain imaging and some of its design issues
 Oct 2, Dieter Armbruster, Esma Gel
Data analytics to support efficient soybean variety development
 Oct 16, Yang Kuang
Models of hormone treatment for prostate cancer: can mathematical models predict the outcomes?
 Oct 23, John Fricks
Motor-cargo complexes and stochastic simulation
 Oct 30, Various Students
Student reports on summer internships
 Nov 6, Stefano Boccaletti
Parenclitic Networks: How to uncover new functions and structural information in biological data
 Nov 13, Tony Liu, student
Optimal sampling for polynomial data fitting on complex regions
 Nov 13, Michael Byrne, student
Image processing tools for energy dispersive X-ray (EDX) imaging
 Nov 20, Lauren Crow, student
Modeling motor-cargo complexes through particle filtering and the EM algorithm
 Nov 27, Miandra Ellis, student
Methods for handling imbalanced datasets
 Nov 27, Tin Phan, student
Visceral Leishmaniasis
 Nov 27, Abigael Nachtsheim, student
Augmenting definitive screening designs for prediction under second-order models
Other Talks
Seminar and Conference Talks by Students and Postdocs
2018
 Abigael Nachtsheim, Augmenting Definitive Screening Designs for Estimating Second-Order Models, Joint Research Conference, Santa Fe, NM, June 2018.
 Lauren Crow, Inferring Multimotor Dynamics Through Cargo Tracking (poster), Joint Statistical Meetings, Vancouver, Canada, July 2018.
 Adeline Kornelus, Higher Order Total Variation for Regularizing Partial Differential Equations, 13th World Congress in Computational Mechanics, New York, NY, June 2018.
 Theresa Scarnati, Reducing the Effects of Bad Data Using Variance Based Joint Sparsity Recovery, SIAM Conference on Imaging Science (IS18), Bologna, Italy, June 2018.
 Alyssa E. Burgueno (undergrad), Magnetic Resonance Recovery from Single-Shot Time-Dependent Data, Joint Mathematics Meetings, San Diego, CA, January 2018.
2017
 Toby Sanders, Higher Order Total Variation, Multiscale Generalizations, and Applications to Inverse Problems (poster), Foundations of Computational Mathematics, Barcelona, Spain, July 2017.
 Theresa Scarnati, C.R. Paulson, E.G. Zelnio, Exploiting the sparsity of edge information in SAR image formation, SPIE Commercial + Scientific Sensing and Imaging, Anaheim, CA, Apr 2017.
 Toby Sanders, Theresa Scarnati, Combination of correlated phase error correction and sparsity models for SAR, SPIE Commercial + Scientific Sensing and Imaging, Anaheim, CA, Apr 2017.
 Toby Sanders, Multiscale Higher Order TV Operators for l1 Regularization, ASU Computational and Applied Math Seminar, Feb 2017.
2016
 Toby Sanders, Imaging Techniques for Synthetic Aperture Radar, ASU Postdoc Seminar Series, Sep 2016.
 Theresa Scarnati, Exploiting Sparsity in PDEs with Discontinuous Solutions, SIAM Conf. on Imaging Science, Albuquerque, NM, May 2016.
 Toby Sanders, Special Regularization Techniques for Synthetic Aperture Radar, SIAM Conf. on Imaging Science, Albuquerque, NM, May 2016.
Journal Publications
 T. Sanders. Phase-Based Alignment and Improved Projection Matching of Parallel Beam Tomography Data. IEEE Transactions on Computational Imaging (2018) vol. 4, no. 3, pp. 395-405.
 T. Scarnati, A. Gelb, R.B. Platte. Using l1 Regularization to Improve Numerical Partial Differential Equation Solvers. Journal of Scientific Computing (submitted).
 T. Sanders, C. Dwyer. Subsampling and Inpainting Strategies for Electron Tomography. Ultramicroscopy 182 (2017): 292-302.
 T. Sanders, I. Arslan. Improved 3D Resolution of Electron Tomograms using Robust Mathematical Data Processing Techniques. Microscopy and Microanalysis 19. doi:10.1017/S1431927617012636
 T. Sanders. Parameter Selection for HOTV Regularization. Applied Numerical Mathematics (accepted for publication).
 B. Adcock, R.B. Platte, A. Shadrin. Optimal sampling rates for approximating analytic functions from pointwise samples. IMA Journal of Numerical Analysis (submitted).
 T. Sanders, R.B. Platte. Multiscale Higher Order TV Operators for l1 Regularization and Their Relationship to Daubechies Wavelets. Jour. on Inv. Prob. (submitted).
 T. Sanders, A. Gelb, R.B. Platte. Composite SAR Imaging Using Sequential Joint Sparsity. J. Comput. Phys., 338 (2017) 357-370.
 T. Sanders, A. Gelb, R.B. Platte, I. Arslan, K. Landskron. Recovering Fine Details from Under-Resolved Electron Tomography Data using Higher Order Total Variation l1 Regularization. Ultramicroscopy, 174 (2017) 97-105.
Proceedings
 T. Sanders, T. Scarnati. Combination of correlated phase error correction and sparsity models for SAR. Proc. SPIE 10222, Computational Imaging II, 102220E (2017). doi:10.1117/12.2262861.
 T. Scarnati, E. Zelnio, C. Paulson. Exploiting the sparsity of edge information in synthetic aperture radar imagery for speckle reduction. Proc. SPIE 10201, Algorithms for Synthetic Aperture Radar Imagery XXIV, 102010C (2017). doi:10.1117/12.2267790.
Student Internships and Research Experiences
Summer 2018
 Abigael Nachtsheim, PhD student, Statistical Sciences Group at Los Alamos National Laboratory, Los Alamos, NM.
Summer 2017
 Genesis Islas, PhD student, Wright-Patterson Air Force Research Laboratory, Dayton, OH.
 Lauren Crow, PhD student, Oak Ridge National Laboratory, Oak Ridge, TN.
 Joe Sadow, PhD student, MIT Lincoln Laboratory, Boston, MA.
 Megan Sopa, undergraduate student, State Farm Insurance Company, Atlanta, GA.
Summer 2016
 Theresa Scarnati, PhD student, Wright-Patterson Air Force Research Laboratory, Dayton, OH.
 Alexander Reynolds, undergraduate student, Wright-Patterson Air Force Research Laboratory, Dayton, OH.
Current and Past Student Projects
African Easterly Waves in Current and Future Climates
Victoria Dollar, RTG Seminar Project, Fall 2017, Spring 2018, Mentor: Moustaoui
The African Easterly Waves (AEWs) activity during the most recent decade (2008-2015) is reported and analyzed, and the same methodology is applied to predictions for a decade at the end of the century (2090-2099). The data utilized are obtained from assimilated analyses of the National Centers for Environmental Prediction (NCEP) and climate projections from the Community Earth System Model (CESM). The power spectral density, computed by the multitaper spectral analysis method and averaged over West Africa and over both decades, shows the dominance of waves with periods in the 3-5 day window. The spectrum of AEWs in the future climate shows a shift towards low frequencies. The role of the intensity of the jet on the wave activity is supported by idealized simulations.
A Multiresolution Approach for Superparamagnetic Relaxometry Data
Miandra Ellis, RTG Seminar Project, Spring 2018, Mentor: Renaut
Superparamagnetic Relaxometry (SPMR) is a novel technique which uses antigenbound nanoparticles to assist in early cancer detection. A challenge of translating this technique to mainstream clinical applications is the reconstruction of the bound particle signal. The primary focus of this semester’s work was to determine if a multiresolution approach could be used to accurately reconstruct the signal, including the position and magnitude of a source. By reducing the search space we hoped for a method which would be less computationally intensive. From our results it appears that the multiresolution approach is promising for accurately localizing the bound particles.
Patterns of dropouts and the role of sociodemographic and perception factors for middle school students
Bechir Amdouni, RTG Seminar Project, Spring 2018, Mentor: Mubayi
Numerous studies have found an impact of gender, race, socioeconomic status (SES), school achievement, school engagement, and academic ability on academic achievement, but few have looked at more than one factor together, and none has combined all of these factors and examined their joint impact on academic achievement. In this paper, we first analyze data gathered at multiple time points using ordered multinomial logistic regression (OMLR) to identify the main factors of academic achievement. Second, we build a discrete-time Markov chain (DTMC) model using the findings from the OMLR.
Optimal Sampling for Polynomial Data Fitting on Complex Regions
Tony Liu, RTG Seminar Project, Spring 2017, Fall 2017, Spring 2018, Mentor: Platte
It is well-known that polynomial interpolation using equispaced points in one dimension is unstable. On the other hand, using Chebyshev nodes in one dimension provides both stable and highly accurate points for polynomial interpolation. In higher-dimensional complex regions, optimal interpolation points are not well understood. The goals of this project are to find nearly optimal sampling points in one- and two-dimensional domains for interpolation, least-squares fitting, and finite difference approximations. The optimality of sampling points is investigated using the Lebesgue constant.
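The gap between equispaced and Chebyshev nodes is easy to demonstrate numerically. The sketch below (an illustration only, not the project's code) interpolates the Runge function 1/(1+25x^2) at 21 nodes of each type and compares worst-case errors on [-1, 1]:

```python
import numpy as np

def runge(x):
    # Classic example on which equispaced interpolation diverges.
    return 1.0 / (1.0 + 25.0 * x**2)

def max_interp_error(nodes):
    # Fit the unique degree len(nodes)-1 polynomial through the nodes
    # and measure the worst-case error on a fine grid.
    coeffs = np.polyfit(nodes, runge(nodes), len(nodes) - 1)
    xt = np.linspace(-1.0, 1.0, 2001)
    return np.max(np.abs(np.polyval(coeffs, xt) - runge(xt)))

n = 21
equi = np.linspace(-1.0, 1.0, n)
cheb = np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n))  # Chebyshev nodes

err_equi = max_interp_error(equi)   # blows up (Runge phenomenon)
err_cheb = max_interp_error(cheb)   # small, and shrinks as n grows
```

Increasing n makes the equispaced error grow without bound while the Chebyshev error decays geometrically, which is exactly the instability the Lebesgue constant quantifies.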
Nonparametric Subsampling for Big Data
Abigael Nachtsheim, RTG Seminar Project, Spring 2018, Mentor: Stufken
The desire to build predictive models based on datasets with tens of millions of observations is not uncommon today. However, with large datasets, standard statistical methods for analysis and model building can become infeasible due to computational limitations. One approach is to take a subsample from the full dataset. Standard statistical methods can then be applied to build predictive models using only the subdata. Existing approaches to data reduction often rely on the assumption that the full data follow a specified model (Wang et al., 2017). However, such assumptions are not always applicable, particularly in the big data context. We explore two new methods of subdata selection that do not require model assumptions. These proposed approaches use k-means clustering and space-filling designs in an attempt to spread the subdata uniformly throughout the region of the full data. We perform a simulation study and an analysis of real data to investigate the efficacy of the predictive models that result from these methods.
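The k-means idea can be sketched on synthetic data (a minimal illustration with assumed Gaussian data, not the authors' implementation): cluster the full data and keep the observation nearest each centroid, so the subdata spread across the region occupied by the full data.

```python
import numpy as np

rng = np.random.default_rng(0)
full = rng.normal(size=(10000, 2))   # stand-in for the full dataset
k = 50                               # subdata budget

# Lloyd's algorithm, kept deliberately minimal.
centroids = full[rng.choice(len(full), size=k, replace=False)]
for _ in range(20):
    dists = np.linalg.norm(full[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    for j in range(k):
        members = full[labels == j]
        if len(members):
            centroids[j] = members.mean(axis=0)

# Keep the real observation closest to each centroid as the subdata.
dists = np.linalg.norm(full[:, None, :] - centroids[None, :, :], axis=2)
subdata = full[np.unique(dists.argmin(axis=0))]
```

Standard model fitting can then be run on `subdata` alone; a space-filling design would instead choose target sites on a grid or Latin hypercube before matching them to observations.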
Modeling motor-cargo complexes through particle filtering and the EM algorithm
Lauren Crow, RTG Seminar Project, Fall 2017, Mentor: Fricks
Movement of proteins is a biophysical process involving transient binding of particles to a microtubule. Specifically, different types of motors aid in the transport of cargo, such as vesicles and organelles. The movement is modeled as a series of switches, based on a Poisson process, between two possible states: random diffusion or Brownian directed movement. Using observed data that is obscured by assumed Gaussian error, the true movement of the cargo and regime switches are predicted. The predictions are based on the stochastic ExpectationMaximization (EM) algorithm, implementing a particle filter and maximum likelihood estimation. The results are first tested through a simulation study and then applied to real data.
Methods for Handling Imbalanced Datasets
Miandra Ellis, RTG Seminar Project, Fall 2017, Mentor: Swanson
Motivated by a comparison between classifiers built using balanced and imbalanced datasets, this project aimed to address issues with imbalance in training data when using the soft margin Support Vector Machine. Oversampling and Synthetic Minority Oversampling were used to balance the training dataset to illustrate how these resampling techniques could be used to alleviate problems arising from imbalance. This allowed us to conclude that both of these resampling based approaches could increase the specificity of a classifier.
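Plain random oversampling, the simpler of the two resampling schemes mentioned, can be sketched as follows (synthetic two-class data; SMOTE would instead synthesize new points by interpolating between minority-class neighbors):

```python
import numpy as np

rng = np.random.default_rng(0)
# Imbalanced training set: 900 majority (label 0) vs 100 minority (label 1).
X = np.vstack([rng.normal(0.0, 1.0, (900, 2)), rng.normal(2.0, 1.0, (100, 2))])
y = np.array([0] * 900 + [1] * 100)

# Random oversampling: resample minority rows with replacement until
# the two classes are the same size.
minority = np.where(y == 1)[0]
extra = rng.choice(minority, size=900 - 100, replace=True)
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
```

Training a soft-margin SVM on `(X_bal, y_bal)` rather than `(X, y)` shifts the decision boundary toward the majority class, which is the mechanism behind the specificity gains reported above.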
Assorted Methods for Predicting Superior Soybean Varieties
Camille Moyer, RTG Seminar Project, Fall 2017, Mentor: Armbruster
While genetic modification in soybeans has allowed farmers to increase their yield over the years, models for predicting which genetic strain could be the most successful in particular regions have fallen behind. This project uses three different methods to construct viable prediction models for newly created soybean varieties: clustering methods, Kalman filtering, and parenclitic networks.
Augmenting Definitive Screening Designs for Prediction Under Second-Order Models
Abigael Nachtsheim, RTG Seminar Project, Fall 2017, Mentor: Stufken
Jones and Nachtsheim (2011) introduced a class of three-level screening designs called definitive screening designs (DSDs). The structure of these designs results in the statistical independence of main effects and two-factor interactions; the absence of complete confounding among two-factor interactions; and the ability to estimate all quadratic effects. In this paper we explore the construction of a series of augmented designs, moving from the starting DSD to designs comparable in sample size to central composite designs. We perform a simulation study to calculate the predictive mean square error for each design to determine the number of augmented runs necessary to effectively fit the correct second-order model.
Regularization with Shot Noise: A Bayesian Approach
Joe Sadow, RTG Seminar Project, Fall 2017, Mentor: Sanders
In this paper we consider inverse problems in the presence of Poisson noise. A probabilistic treatment of the noisy regularization problem allows for a more comprehensive quantification of uncertainty in the problem. The Bayesian framework for optimization is explored by adding dataoriented terms to the image reconstruction problem and comparing with the classic function space optimization techniques. The reconstruction effort is described and implemented for image data containing Poisson noise, a situation relevant to many particlecounting imaging problems.
Image processing tools for energy dispersive X-ray (EDX) imaging
Michael Byrne, RTG Seminar Project, Fall 2017, Mentor: Sanders
Energy dispersive X-ray (EDX) spectroscopy is a technique used to determine the chemical composition of a sample. The sample is exposed to an excitation energy, triggering atomic reactions that result in X-ray emission. The number of emitted X-rays is recorded at each energy level, and the result is a spectrum indicating peaks for different elements at particular energy levels. From the series of spectrum data, an image representation of the density of each element in the sample may be recovered. While EDX spectroscopy offers the power to resolve the densities of each element in the sample, the process of generating images for each element is nontrivial. In this paper we explore various image processing tools, such as low-pass filters and principal component analysis, that can be used to produce improved images from EDX spectroscopy data. Once we understand how these tools affect the resulting images, we hope to implement more advanced image reconstruction tools to improve the image formation.
Function Approximation on Spherical Domains
Genesis Islas, RTG Seminar Project, Spring 2017, Mentor: Platte
This project investigates a gridding technique for function approximation on a spherical domain. This work is motivated by problems that arise in atmospheric research. The goal is to study the discretization based on the cubed sphere domain decomposition. This method decomposes the sphere into six identical regions where uniformly distributed nodes map onto nearly uniformly distributed nodes on the cube. We contrast this to the latitude and longitude discretization where the uniformity of the node distributions is completely lost by the change of coordinates and results in oversampling near the poles. The effect of using different sampling distributions for function approximation is explored.
(slides)
Tomography and Sampling
Joe Sadow, RTG Seminar Project, Spring 2017, Mentor: Sanders
The purpose of this project is to motivate and develop the general tomographic imaging problem. The Radon transform and its intimate connection with the classic Fourier transform will be established. The inverse problem will be defined, along with an exploration of related iterative reconstruction schemes. The optimal use of sampling patterns is also explored.
(slides)
Model Selection and Data with Asymmetric Distribution Testing Using the IBOSS Approach
John Stockton, RTG Seminar Project, Fall 2016 and Spring 2017, Mentor: Stufken
With the increasing need to analyze data sets with potentially billions of entries and thousands of predictor variables, many methods have been proposed to study these so-called “big data” sets in a computationally efficient way; in particular, a recently proposed method called the Information-Based Optimal Subdata Selection (IBOSS) method. Preliminary studies have demonstrated the effectiveness of this method over previously introduced methods, such as the Uniform Sampling and Leverage-Based Sampling methods, with regard to the linear regression equation constructed from the subdata selected by each method, using a variety of simulated data sets and some real data sets. In the Fall of 2016, I conducted preliminary studies regarding the distribution of simulated data sets and found that the process succeeds when the distribution used to generate the covariates is generally symmetric, though in all cases the responses in each data set were constructed using a linear model, and a linear model was fit to the subdata. Naturally, this raises questions about how well the IBOSS algorithm would perform in basic model selection. In this project, I study how model selection performs when using IBOSS with two-factor interaction terms. Additionally, I explore the effects that skewed predictor data has on subdata selection methods.
(slides)
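The core IBOSS selection rule can be sketched in a few lines (a simplified illustration on synthetic covariates, not the project's code): for each covariate in turn, keep the rows with the most extreme values among those not yet selected.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100000, 5, 1000            # full data size, covariates, subdata budget
X = rng.normal(size=(n, p))

# IBOSS-style rule: for each covariate in turn, keep the k/(2p) rows with
# the smallest and the k/(2p) rows with the largest values among the rows
# not yet selected.
r = k // (2 * p)
selected = np.zeros(n, dtype=bool)
for j in range(p):
    avail = np.where(~selected)[0]
    order = avail[np.argsort(X[avail, j])]
    selected[order[:r]] = True       # r smallest on covariate j
    selected[order[-r:]] = True      # r largest on covariate j

subdata = X[selected]
```

Fitting ordinary least squares on the 1000-row `subdata` rather than all 100000 rows is the computational saving; the skewed-covariate question studied above amounts to replacing the normal draws with an asymmetric distribution and repeating the selection.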
Leverage Subsampling in Multivariate, Multinormally-Distributed Data
Lauren Crow, RTG Seminar Project, Fall 2016 and Spring 2017, Mentors: Stufken and Cochran
Big data analysis has been on the rise and with it, a need for new research methods. One area of focus is subdata selection. In this project, several types of subdata selection methods are discussed and compared, including basic leverage sampling (BLEV), shrinkage leverage sampling (SLEV), unweighted leverage sampling (LEVUNW), and uniform sampling (UNIF). After an in-depth comparison using mean squared error on simulated data as the criterion, it was determined that the unweighted leverage sampling method resulted in the most accurate estimation of the true parameters among these four methods, making leverage-based subsampling a valuable tool for modeling big data. However, this was only determined under the assumption that an ordinary linear model with one response was being used and that the errors were independent and followed a normal distribution. To see if the results still held in other circumstances, three new models were proposed, all involving multivariate, multinormally-distributed data. The three models had ten parameters and two responses, although they could be generalized to more responses or a different number of parameters. In the first, the errors were independent and identically distributed. In the second, the errors remained independent but had different levels of variance for each response. Finally, the third model had different levels of dependence among the errors, causing correlation among both the errors and the responses. Leverage sampling proved to perform well on multivariate data with and without the assumption of independence and identical distributions, with unweighted leverage sampling consistently performing the best. That is, the previous results extend to these new types of models. Although the methods were implemented using manageable-sized data, they can be applied in multivariate systems of a much larger size and on real data instead of simulated data.
(slides)
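The leverage-score machinery behind BLEV and LEVUNW can be sketched as follows (a toy illustration on synthetic heavy-tailed data, not the project's code). The scores are the diagonal of the hat matrix, computed here via a thin QR factorization so no n-by-n matrix is ever formed:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5000, 10
X = rng.standard_t(df=3, size=(n, p))    # heavy-tailed covariates
beta = np.ones(p)
y = X @ beta + rng.normal(size=n)        # true model: y = X beta + noise

# Leverage scores h_ii: diagonal of H = X (X'X)^{-1} X', via thin QR.
Q, _ = np.linalg.qr(X)
lev = np.sum(Q**2, axis=1)               # the h_ii sum to p

# BLEV-style draw: sample rows with probability proportional to leverage,
# then fit unweighted OLS on the subsample (as in LEVUNW).
m = 500
idx = rng.choice(n, size=m, replace=True, p=lev / lev.sum())
beta_hat, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
```

Because high-leverage rows carry the most information about the regression coefficients, the 500-row fit recovers `beta` closely; BLEV would additionally reweight each sampled row by its inverse sampling probability.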
Nonuniform Fast Fourier Transforms
Tony Liu, RTG Seminar Project, Fall 2016, Mentors: Sanders and Platte
The Fast Fourier Transform (FFT) allows for the efficient computation of the Discrete Fourier Transform (DFT), decomposing a set of values into its frequency components. The FFT, along with its inverse, is widely used in many applications in science, engineering, mathematics, and medicine. The FFT reduces the computational workload of the DFT from O(n^2) down to O(n log n); however, in order to implement the FFT, a uniformly spaced set of data is required in either the time or frequency domain. In many applications, samples are nonuniform and multiple iterations of Fourier transforms are required. To overcome computational limitations, Nonuniform FFTs (NUFFTs) are often used. In recent years, a number of algorithms have been developed to solve this type of problem. These NUFFTs are derived by combining interpolation with the use of the traditional FFT on an oversampled uniform space. This project addresses the basics of the Fourier transform as well as the DFT, the derivation of the FFT, motivation for NUFFTs, and the derivation of one NUFFT algorithm.
(slides)
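The O(n^2) direct transform that the FFT accelerates can be written out explicitly and checked against a library FFT (an illustrative sketch, not project code):

```python
import numpy as np

def dft(x):
    # Direct O(n^2) DFT: multiply by the full n-by-n transform matrix.
    n = len(x)
    k = np.arange(n)
    W = np.exp(-2j * np.pi * np.outer(k, k) / n)
    return W @ x

rng = np.random.default_rng(0)
x = rng.normal(size=256)
X_direct = dft(x)          # n^2 = 65536 complex multiplies
X_fast = np.fft.fft(x)     # O(n log n) work, identical result
```

A NUFFT replaces the uniform grid assumed by `np.fft.fft` with interpolation onto an oversampled uniform grid, after which this same fast transform applies.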
Signal Reconstruction using Least Absolute Errors
Genesis Islas, RTG Seminar Project, Fall 2016, Mentors: Sanders and Platte
This project compares the l1 and l2 norms for signal reconstruction from noisy measurements. Suppose f is our unknown (an n×1 vector). We would like to recover f from a given data vector b, where f and b are related by Af + e = b. Here, A is m×n and e is an unknown vector of errors. Then f can be approximated by solving the minimization problem min_f ||Af − b||. A popular method of solving this problem is least squares, which minimizes the l2 norm. However, the least squares method can perform poorly when the errors on the signal have large magnitude, even if they are few. This provides the motivation for solving the minimization problem with the l1 norm. It has been shown that if certain conditions are met on both A and e, solving the minimization problem with the l1 norm is equivalent to solving it with l0. In this project, we explore numerical examples to illustrate the effectiveness of recovering a signal using the l1 norm.
(slides)
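In the simplest case, where A is a column of ones (recovering a constant signal), the l2 solution is the mean of b and the l1 solution is the median, which makes the robustness of l1 to a few large errors easy to see (a toy sketch, not the project's code):

```python
import numpy as np

rng = np.random.default_rng(0)
b = 3.0 + rng.normal(0.0, 0.01, size=100)   # clean measurements of f = 3
b[:5] += 50.0                               # a few large-magnitude errors

# With A a column of ones, min ||Af - b||_2 is the sample mean and
# min ||Af - b||_1 is the sample median.
f_l2 = b.mean()       # dragged far from 3 by the five outliers
f_l1 = np.median(b)   # essentially unaffected by them
```

The mean shifts by roughly 5·50/100 = 2.5 while the median stays near 3, mirroring the general behavior of l2 versus l1 data fitting under sparse, large errors.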
Deep Learning on 3D Geometries
Hope Yao, RTG Seminar Project, Fall 2016, Mentor: Sanders
This project extends the traditional 2D convolutional neural network to 3D. Fourier convolution is investigated to deal with the increased computational cost brought by the extra dimension. Numerical results show that our model is able to achieve nine percent testing error on the ModelNet10 dataset, which is comparable to the best result reported in 2015.
(slides)
Bootstrapping in the Context of Big Data
Shantrue John Chang, RTG Seminar Project, Fall 2016, Mentor: Cochran
The bootstrap provides a simple but powerful way of assessing the quality of estimators. However, when working with big or massive data sets, most computers cannot keep up with the computationally demanding resampling required by the bootstrap. Variants of the bootstrap have been developed to deal with these computational costs. This project explores the Bag of Little Bootstraps, a bootstrap technique proposed for big and massive data.
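The Bag of Little Bootstraps idea can be sketched as follows (a simplified illustration of the scheme as commonly described, estimating the standard error of a mean on synthetic data): resample within small subsets, but with multinomial weights that sum to the full sample size, so each cheap resample behaves statistically like a full-size bootstrap replicate.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(size=100000)     # large sample; true SE of mean ~ 1/sqrt(n)

n = len(data)
b = int(n ** 0.6)                       # small-subset size (here 1000)
s, reps = 10, 50
subset_ses = []
for _ in range(s):
    subset = rng.choice(data, size=b, replace=False)
    means = []
    for _ in range(reps):
        # A full-size (n-draw) resample expressed as multinomial counts
        # over the b distinct atoms of the subset -- never materialized.
        w = rng.multinomial(n, np.ones(b) / b)
        means.append(np.average(subset, weights=w))
    subset_ses.append(np.std(means))
se_blb = np.mean(subset_ses)            # averaged BLB estimate of the SE
```

Each inner loop touches only b points rather than n, which is what makes the procedure tractable on massive data while still targeting the full-sample standard error.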
RTG Applied Mathematics Curriculum
Semester 1:
Semester 2: APM 506 Computational Methods
Summer 1:
Semester 3: APM 501 Differential Equations 1
Semester 4:
Summer 2:
RTG Statistics Curriculum
Semester 1:
Semester 2:
Summer 1:
Semester 3: APM 505 Applied Linear Algebra
Semester 4: APM 506 Computational Methods
Summer 2: