A statistical method is proposed to learn the diffusion coefficient at any point in space on a cell membrane. The method uses Bayesian nonparametrics to infer this spatially varying value. Learning the diffusion coefficient may be useful for understanding more about cellular dynamics.
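The abstract leaves the model unspecified; as one hedged illustration of the general idea, a Gaussian process (a Bayesian nonparametric prior) could be fit to pointwise diffusivity estimates obtained from simulated single-particle displacements. The simulation, kernel, and estimator below are assumptions for illustration, not the thesis's actual construction.

```python
# Illustrative sketch only: infer a spatially varying diffusion coefficient D(x)
# from particle displacements with Gaussian-process regression.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
dt = 0.01                                    # time between observations

def true_D(x):                               # ground-truth D(x), used only to simulate data
    return 0.5 + 0.4 * np.sin(x)

# Simulate a 1-D particle trajectory with position-dependent diffusivity.
x = np.zeros(5000)
for i in range(1, x.size):
    x[i] = x[i - 1] + np.sqrt(2 * true_D(x[i - 1]) * dt) * rng.standard_normal()

# Local, noisy estimates of D at each visited position:
# E[(x_{i+1} - x_i)^2] = 2 * D(x_i) * dt, so D_hat = dx^2 / (2 * dt).
positions = x[:-1]
d_hat = np.diff(x) ** 2 / (2 * dt)

# Nonparametric (GP) smoothing of the pointwise estimates over space.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
subset = rng.choice(positions.size, 500, replace=False)      # keep the fit cheap
gp.fit(positions[subset].reshape(-1, 1), d_hat[subset])

grid = np.linspace(positions.min(), positions.max(), 100).reshape(-1, 1)
D_mean, D_std = gp.predict(grid, return_std=True)            # posterior mean and spread of D(x)
```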
We attempt to analyze the effect of fatigue on free throw efficiency in the National Basketball Association (NBA) using play-by-play data from regular-season, regulation-length games in the 2016-2017, 2017-2018, and 2018-2019 seasons. Using both regression and tree-based statistical methods, we analyze the effect of total minutes played and of minutes played continuously at the time of a free throw attempt on a player's odds of making the attempt, while controlling for prior free throw shooting ability, longer-term fatigue, and other game factors. Our results offer strong evidence that short-term activity after periods of inactivity positively affects free throw efficiency, while longer-term fatigue has no effect.
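The abstract does not reproduce the model specification; a minimal sketch of the kind of logistic regression it describes might look like the following, where the file name and column names are hypothetical placeholders for parsed play-by-play data.

```python
# Hypothetical sketch: logistic regression of free-throw makes on short- and
# long-term fatigue measures, controlling for prior ability and game context.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per free-throw attempt (placeholder file and columns).
attempts = pd.read_csv("free_throw_attempts.csv")

model = smf.logit(
    "made ~ minutes_played_total + minutes_played_continuously"
    " + career_ft_pct + back_to_back + score_margin + period",
    data=attempts,
)
result = model.fit()
print(result.summary())          # coefficients are on the log-odds scale
print(np.exp(result.params))     # odds ratios for each covariate
```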
Using object-oriented programming in MATLAB, a collection of functions, named Fourfun, has been created to allow quick and accurate approximations of periodic functions with Fourier expansions. To increase efficiency and reduce the number of computations of the Fourier transform, Fourfun automatically determines the number of nodes necessary for representations that are accurate to nearly machine precision. Common MATLAB functions have been overloaded to keep the syntax of the Fourfun class as consistent as possible with standard MATLAB syntax. We show that the system can be used to efficiently solve several differential equations. Comparisons with Chebfun, a similar system based on Chebyshev polynomial approximations, are provided.
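Fourfun itself is a MATLAB class; the Python sketch below only illustrates the adaptive idea described above (doubling the number of nodes until the trailing Fourier coefficients are negligible), not Fourfun's actual implementation.

```python
# Adaptive Fourier resolution: double the number of sample points until the
# highest-frequency coefficients fall below a tolerance near machine precision.
import numpy as np

def adaptive_fourier_coeffs(f, tol=1e-14, max_n=2**16):
    """Return FFT coefficients of f on [0, 2*pi) using just enough nodes."""
    n = 16
    while n <= max_n:
        x = 2 * np.pi * np.arange(n) / n
        c = np.fft.fft(f(x)) / n
        # The highest frequencies sit in the middle of the FFT output;
        # if they are negligible, the representation has resolved f.
        tail = np.abs(c[n // 2 - 2 : n // 2 + 3])
        if tail.max() <= tol * max(np.abs(c).max(), 1.0):
            return c
        n *= 2
    raise RuntimeError("function not resolved to tolerance")

coeffs = adaptive_fourier_coeffs(lambda x: np.exp(np.sin(x)))
print(len(coeffs))   # number of nodes deemed sufficient
```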
In applications such as Magnetic Resonance Imaging (MRI), data are acquired as Fourier samples. Since the underlying images are only piecewise smooth, standard reconstruction techniques will yield the Gibbs phenomenon, which can lead to misdiagnosis. Although filtering will reduce the oscillations at jump locations, it can often have the adverse effect of blurring at these critical junctures, which can also lead to misdiagnosis. Incorporating prior information into reconstruction methods can help reconstruct a sharper solution. For example, compressed sensing (CS) algorithms exploit the expected sparsity of some features of the image. In this thesis, we develop a method to exploit the sparsity in the edges of the underlying image. We design a convex optimization problem that exploits this sparsity to provide an approximation of the underlying image. Our method successfully reduces the Gibbs phenomenon with only minimal "blurring" at the discontinuities. In addition, we see a high rate of convergence in smooth regions.
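The thesis's exact optimization problem is not reproduced here; the sketch below illustrates a generic edge-sparsity formulation of the same flavor, reconstructing a 1-D piecewise-constant signal from a limited band of Fourier samples by minimizing the l1 norm of its discrete derivative. The discretization and solver setup are assumptions for illustration.

```python
# Generic edge-sparsity reconstruction from banded Fourier samples.
import numpy as np
import cvxpy as cp

n = 128
x = np.linspace(-1, 1, n, endpoint=False)
f_true = np.where(np.abs(x) < 0.5, 1.0, 0.0)        # piecewise-constant test signal

k = np.fft.fftfreq(n, d=1.0 / n)                    # integer frequencies in FFT order
keep = np.abs(k) <= 16                              # only low modes are "acquired"
F = np.fft.fft(np.eye(n))[keep] / n                 # partial DFT matrix
f_hat = F @ f_true                                  # observed Fourier samples

# Truncating the series and inverting directly would show Gibbs ringing; instead,
# minimize the l1 norm of the discrete derivative subject to matching the data.
D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)       # periodic first difference
A = np.vstack([F.real, F.imag])
b = np.concatenate([f_hat.real, f_hat.imag])

f = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm1(D @ f)),
                     [cp.norm(A @ f - b, 2) <= 1e-6])
problem.solve()
recon = f.value                                     # jumps recovered with little Gibbs ringing
```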
The reconstruction of piecewise smooth functions from non-uniform Fourier data arises in sensing applications such as magnetic resonance imaging (MRI). This thesis presents a new polynomial-based resampling method (PRM) for one-dimensional problems which uses edge information to recover the Fourier transform at its integer coefficients, thereby enabling the use of the inverse fast Fourier transform algorithm. By minimizing the error of the PRM approximation at the sampled Fourier modes, the PRM can also be used to improve initial edge location estimates. Numerical examples show that using the PRM to improve initial edge location estimates and then using the PRM approximation of the integer-frequency Fourier coefficients is a viable way to reconstruct the underlying function in one dimension. In particular, the PRM is shown to converge more quickly and to be more robust than current resampling techniques used in MRI, and is particularly amenable to highly irregular sampling patterns.
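As a heavily simplified stand-in for the pipeline described above, the sketch below resamples non-uniform Fourier samples onto the integer frequencies by plain interpolation and then applies the inverse FFT; the PRM's polynomial models and use of edge information are not reproduced.

```python
# Simplified stand-in: non-uniform Fourier samples -> integer frequencies -> iFFT.
import numpy as np

def f(x):                                              # piecewise-smooth test function
    return np.where(x < 0, np.cos(2 * x), np.sin(5 * x))

def fourier_sample(omega, m=4096):
    """Quadrature approximation of (1/2pi) * integral of f(x) e^{-i omega x} over [-pi, pi)."""
    x = np.linspace(-np.pi, np.pi, m, endpoint=False)
    return np.sum(f(x) * np.exp(-1j * omega * x)) / m

n = 64
rng = np.random.default_rng(1)
omegas = np.sort(rng.uniform(-n / 2 - 1, n / 2 + 1, 4 * n))   # non-uniform frequencies
samples = np.array([fourier_sample(om) for om in omegas])

# Resample to integer frequencies k = -n/2, ..., n/2 - 1 (linear interpolation here;
# the PRM would use polynomial, edge-informed models instead).
k = np.arange(-n // 2, n // 2)
fhat_int = np.interp(k, omegas, samples.real) + 1j * np.interp(k, omegas, samples.imag)

# Integer-frequency coefficients feed directly into the inverse FFT.
recon = np.fft.ifft(np.fft.ifftshift(fhat_int)) * n   # periodic extension of f on a grid of [0, 2pi)
```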
Analytic research on basketball games is growing quickly, particularly in the National Basketball Association. This paper explored the development of this research and found a focus on individual player metrics and a dearth of quantitative team characterizations and evaluations. Consequently, it continued the exploratory research of Fewell and Armbruster's "Basketball teams as strategic networks" (2012), which modeled basketball teams as networks and used metrics to characterize team strategy in the NBA's 2010 playoffs: individual players and outcomes were the nodes, and passes and actions were the links. This paper used data recorded from playoff games of the two 2012 NBA finalists, the Miami Heat and the Oklahoma City Thunder. The metrics that Fewell and Armbruster used were explained and then calculated from these data. The offensive networks of the two teams during the playoffs were analyzed and interpreted using other data and qualitative characterizations of the teams' strategies; the calculated metrics largely matched our qualitative characterizations of the teams. The validity of the metrics in this paper and in Fewell and Armbruster's paper was then discussed, and modeling basketball teams as multiple-order Markov chains rather than as networks was explored.
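A toy version of the network representation described above, with made-up pass counts, might be computed as follows; the metrics shown are of the general kind used to characterize team strategy, not the paper's exact calculations.

```python
# Toy passing network: players and shot outcomes as nodes, passes/actions as
# weighted directed edges. Counts are hypothetical, not 2012 playoff data.
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("PG", "SG", 40), ("PG", "SF", 25), ("SG", "PG", 30),
    ("SF", "PF", 15), ("PF", "C", 10), ("C", "shot", 12),
    ("SG", "shot", 22), ("SF", "shot", 18), ("PG", "shot", 8),
])

# Network metrics of the kind used to characterize team strategy:
print(nx.degree_centrality(G))                              # who is most involved in ball movement
print(nx.clustering(G.to_undirected(), weight="weight"))    # local interconnectedness of passing
print(nx.betweenness_centrality(G))                         # reliance on particular distributors
```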
The recovery of edge information in the physical domain from non-uniform Fourier data is of importance in a variety of applications, particularly in the practice of magnetic resonance imaging (MRI). Edge detection can be important as a goal in and of itself, for example in identifying tissue boundaries such as those defining the locations of tumors. It can also be an invaluable tool in ameliorating the negative effects of the Gibbs phenomenon on reconstructions of functions with discontinuities, or of multi-dimensional images with internal edges. In this thesis we develop a novel method for recovering edges from non-uniform Fourier data by adapting the "convolutional gridding" method of function reconstruction. We analyze the behavior of the method in one dimension and then extend it to two dimensions, demonstrating it on several examples.
Dividing the plane in half leaves every border point of one region a border point of both regions. Can we divide the plane into three or more regions such that any point on the boundary of one region is on the boundary of all the regions? In fact, it is possible to design a dynamical system whose basins of attraction have this Wada property. In certain circumstances, both the Hénon map, a simple system, and the forced damped pendulum, a physical model, produce Wada basins.
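A typical way such basin pictures are computed is sketched below for the forced damped pendulum: integrate each initial condition for many forcing periods and label it by the attractor it settles onto. The parameter values are illustrative placeholders, not necessarily those studied in the thesis.

```python
# Basin-of-attraction sketch for the forced damped pendulum
# theta'' + b*theta' + sin(theta) = F*cos(t), classified by net rotation per period.
import numpy as np
from scipy.integrate import solve_ivp

b, F = 0.2, 1.66                      # damping and forcing amplitude (assumed values)

def pendulum(t, y):
    theta, v = y
    return [v, -b * v - np.sin(theta) + F * np.cos(t)]

def classify(theta0, v0, transient=50, measure=20):
    """Average winding number per forcing period after transients."""
    T = 2 * np.pi
    sol = solve_ivp(pendulum, [0, (transient + measure) * T], [theta0, v0],
                    t_eval=[transient * T, (transient + measure) * T], rtol=1e-6)
    winding = (sol.y[0, 1] - sol.y[0, 0]) / (2 * np.pi * measure)
    return int(np.round(winding))     # e.g. -1, 0, +1 label different attractors

# Coarse grid of initial conditions (theta0, v0) to keep runtime modest.
thetas = np.linspace(-np.pi, np.pi, 40)
vels = np.linspace(-2.5, 2.5, 40)
basins = np.array([[classify(th, v) for th in thetas] for v in vels])
```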
This paper analyzes the Phoenix Suns' shooting patterns in real NBA games and compares them to the Suns' shooting patterns in "NBA 2k16". Data were collected from the Suns' first five games of the 2015-2016 season and from the same games played in "NBA 2k16". The findings indicate that "NBA 2k16" uses statistical findings to model its gameplay, and that it modeled the Suns' shooting patterns in those five games very closely. Both the real Suns' games and the "NBA 2k16" games showed a higher probability of success for shots taken in the first eight seconds of the shot clock than in the last eight seconds. Similarly, both game types showed that the probability of success increases the longer a player holds the ball. This result was not expected for either game type; nevertheless, "NBA 2k16" modeled it consistently with the real Suns' games. The video game modeled the Suns with significantly more passes per possession than the real games, and it showed a significant effect of passes per possession on the outcome of the shot. This trend was not present in the real Suns' games, although the literature supports the finding. In addition, "NBA 2k16" did not correctly model the allocation of team shots for each player, though the differences were found only among bench players. Lastly, "NBA 2k16" did not correctly allocate shots across the seven court regions for Eric Bledsoe, but there was no evidence that it incorrectly modeled the allocation of shots for the other starters or the probability of success across the regions.
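One comparison of the kind reported above can be illustrated with a chi-square test of shot allocation across court regions; the counts below are placeholders, not the study's data.

```python
# Sketch: test whether shot allocation across court regions differs between
# real games and "NBA 2k16". All counts here are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

regions = ["restricted area", "paint", "mid-range", "left corner 3",
           "right corner 3", "above-the-break 3", "other"]
real_counts = [28, 17, 22, 6, 5, 19, 3]     # placeholder shots per region, real games
game_counts = [25, 20, 18, 8, 7, 23, 4]     # placeholder shots per region, NBA 2k16

chi2, p, dof, expected = chi2_contingency(np.array([real_counts, game_counts]))
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")    # a large p-value gives no evidence the allocations differ
```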
The detection and characterization of transients in signals is important in many wide-ranging applications, from computer vision to audio processing. Edge detection on images is typically realized using small, local, discrete convolution kernels, but this is not possible when samples are measured directly in the frequency domain. The concentration factor edge detection method was therefore developed to realize an edge detector directly from spectral data. This thesis explores the possibilities of detecting edges from the phase of the spectral data alone, that is, without the magnitude of the sampled spectral data. Prior work has demonstrated that the spectral phase contains particularly important information about underlying features in a signal. Furthermore, the concentration factor method yields some insight into the detection of edges in spectral phase data. An iterative design approach was taken to realize an edge detector using only the spectral phase data, also allowing for the design of an edge detector when phase data are intermittent or corrupted. Problem formulations showing the power of the design approach are given throughout. A post-processing scheme relying on the difference of multiple edge approximations yields a strong edge detector which is shown to be resilient under noisy, intermittent phase data. Lastly, a thresholding technique is applied to give an explicit enhanced edge detector ready to be used. Examples throughout are demonstrated on both signals and images.
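For reference, a minimal numerical illustration of the standard concentration factor edge detector named above (using both magnitude and phase of the spectral data; the thesis's phase-only variants are not reproduced here): peaks of the concentration sum locate the jumps and approximate their heights.

```python
# Concentration factor edge detection from Fourier coefficients:
# S_N[f](x) = i * sum_k sgn(k) * sigma(|k|/N) * f_hat(k) * e^{ikx},
# with the first-order polynomial factor sigma(eta) = pi * eta.
import numpy as np

N = 64
x = np.linspace(-np.pi, np.pi, 512, endpoint=False)
f = np.where(np.abs(x) < 1.5, 1.0, -0.5)                  # jumps of +1.5 at x = -1.5 and -1.5 at x = 1.5

# Fourier coefficients f_hat(k) = (1/2pi) * int f(x) e^{-ikx} dx, by quadrature.
k = np.arange(-N, N + 1)
fhat = (f[None, :] * np.exp(-1j * np.outer(k, x))).sum(axis=1) / x.size

sigma = np.pi * np.abs(k) / N
S_N = 1j * np.exp(1j * np.outer(x, k)) @ (np.sign(k) * sigma * fhat)

# The strongest responses in each half of the domain sit at the two jumps.
left = np.abs(S_N.real[x < 0]).argmax()
right = np.abs(S_N.real[x >= 0]).argmax()
print(x[x < 0][left], x[x >= 0][right])                   # approximately -1.5 and 1.5
```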