PhD student, University of California, Irvine
Although I have many academic interests, my primary research area is cognitive psychometrics. I develop probabilistic models that treat observable behavior as a stochastic system, with the probabilities of particular outcomes defined by unknown properties of individuals' cognitive processes and of the experimental stimuli. Given these models and behavioral data, I use approaches borrowed from modern computational statistics and machine learning to infer the unknown properties of interest through their probabilistic connection to the observable data. For example, I often work with paired comparison data. This kind of behavioral task often takes the form of preference questions, where people are asked whether item a or item b is preferable to them. Given a set of items and different people's stated preferences for each pair of items, one can derive quantitative estimates of properties such as the value of each item to the group, individual people's deviations from the average group preference, and the level of disagreement about each item. Making these sorts of theory-based measurements of cognitive phenomena that are not directly observable is the nature of my work.

For the conference, I'm presenting my application of this kind of methodology to the study of probabilistic and temporal monetary discounting. Temporal discounting refers to the devaluation of a monetary reward when there is a known time delay, whereas probabilistic discounting refers to the devaluation of a monetary reward when the known probability of receiving it gets smaller. Questions such as "Would you prefer $1000 now or $1200 in a year?" or "Would you prefer $1000 for sure or a 50% chance to win $2100?" can be used to study these phenomena. A popular theory holds that temporal and probabilistic discounting are related in that both can be thought of as a choice between two different frequencies of monetary reward. Using new methodology and probabilistic modeling, I've worked out a way to efficiently measure discounting functions for both time and probability across a wide range of monetary amounts in the same people, along with people's difficulty in assessing value across monetary amounts, probabilities, and time delays. Doing so in the way I've formulated can further enhance our understanding of the connection between the two discounting domains. I will present my results at the conference.
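As a rough, self-contained illustration of these two pieces of machinery (a simplified sketch, not the actual models from my work; the Thurstone Case V calculation, the hyperbola-like functional forms, and all parameter values here are stand-ins chosen for illustration):

```python
# Simplified sketch, not the actual models from my work; all parameter values
# are made up for illustration.
import numpy as np
from scipy.stats import norm

def thurstone_case_v(win_counts):
    """Group-level item values from paired-comparison counts (Thurstone Case V).
    win_counts[i, j] = number of times item i was preferred to item j."""
    n = win_counts + win_counts.T                     # choices observed per pair
    p = np.where(n > 0, win_counts / np.maximum(n, 1), 0.5)
    p = np.clip(p, 0.01, 0.99)                        # keep z-scores finite
    return norm.ppf(p).mean(axis=1)                   # mean pairwise z-score per item

def delay_discounted_value(amount, delay_days, k=0.01, s=1.0):
    """Hyperbola-like subjective value of a reward delayed by delay_days."""
    return amount / (1.0 + k * delay_days) ** s

def probability_discounted_value(amount, p_win, h=1.0, s=1.0):
    """Hyperbola-like subjective value of a reward won with probability p_win,
    discounted over the odds against winning, (1 - p_win) / p_win."""
    return amount / (1.0 + h * (1.0 - p_win) / p_win) ** s

# "Would you prefer $1000 now or $1200 in a year?"
print(delay_discounted_value(1200, delay_days=365))
# "Would you prefer $1000 or a 50% chance to win $2100?"
print(probability_discounted_value(2100, p_win=0.5))
```

In the actual work, quantities like these are not plugged in by hand but inferred from behavioral data, along with individual-level deviations and measures of disagreement.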
Abstract: Observers discriminated the numerical proportion of two sets of elements (N = 9, 13, 33, and 65) that differed either by color or orientation. According to the standard Thurstonian approach, the accuracy of proportion discrimination is determined by irreducible noise in the nervous system that stochastically maps the number of presented visual elements onto a continuum of psychological states representing numerosity. As an alternative to this customary approach, we propose a Thurstonian-binomial model, which assumes discrete perceptual states, each of which is associated with a certain visual element. It is shown that the probability β with which each visual element can be noticed and registered by the perceptual system can explain numerical proportion discrimination data at least as well as the continuous Thurstonian-Gaussian model, and better if the greater parsimony of the Thurstonian-binomial model is taken into account using AIC model selection. We conclude that the Gaussian and binomial models represent two different fundamental principles (internal noise vs. the use of only a fraction of the available information), both of which are plausible descriptions of visual perception.
Pub.: 28 Sep '16, Pinned: 03 Jul '17
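For concreteness, the two decision rules contrasted in this abstract might be computed as follows (a sketch under assumed functional forms, not the authors' code; the tie rule and parameter values are illustrative choices):

```python
# Sketch of the two decision rules contrasted in the abstract (assumed forms,
# not the authors' code; the tie rule and parameter values are illustrative).
import numpy as np
from scipy.stats import binom, norm

def p_choose_A_binomial(n_A, n_B, beta, tie_rule=0.5):
    """Thurstonian-binomial: each element registers independently with
    probability beta; respond 'A' if more A-elements than B-elements register,
    and guess on ties."""
    a = np.arange(n_A + 1)
    b = np.arange(n_B + 1)
    joint = np.outer(binom.pmf(a, n_A, beta), binom.pmf(b, n_B, beta))
    p_more = joint[a[:, None] > b[None, :]].sum()
    p_tie = joint[a[:, None] == b[None, :]].sum()
    return p_more + tie_rule * p_tie

def p_choose_A_gaussian(n_A, n_B, sigma):
    """Thurstonian-Gaussian: each numerosity maps to a Gaussian internal state
    centered on the element count with common noise SD sigma."""
    return norm.cdf((n_A - n_B) / (sigma * np.sqrt(2)))

# A 13-element display split 7 vs. 6 between the two colors/orientations:
print(p_choose_A_binomial(7, 6, beta=0.6))
print(p_choose_A_gaussian(7, 6, sigma=2.0))
```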
Abstract (Kentaro Katahira, Journal of Mathematical Psychology, Volume 73, August 2016): Computational models have been used to analyze data from behavioral experiments. One objective of using computational models is to estimate model parameters or internal variables for individual subjects from behavioral data. The estimates are often correlated with other variables that characterize subjects in order to investigate which computational processes are associated with specific personal or physiological traits. Although the accuracy of the estimates is important for these purposes, parameter estimates obtained from individual-subject data are often unreliable. To solve this problem, researchers have begun to use hierarchical modeling approaches to estimate parameters of computational models from multiple-subject data. It is widely accepted that hierarchical models provide more reliable estimates than non-hierarchical approaches. However, how and under what conditions hierarchical models provide better estimates has yet to be systematically investigated. This study investigates these issues, focusing on two measures of estimation accuracy: the correlation between estimates of individual parameters and subject trait variables, and the absolute error of the estimates (root mean squared error, RMSE). An analytical calculation based on a simple Gaussian model clarifies how the hierarchical model improves the point estimates on these two measures. We also performed simulation studies employing several realistic computational models on synthesized data to confirm that the theoretical properties hold in realistic situations.
Pub.: 11 May '16, Pinned: 03 Jul '17
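A toy version of the simple Gaussian setting the abstract mentions makes the intuition concrete (a sketch with made-up variances and sample sizes, not the paper's derivation): shrinking noisy per-subject estimates toward the group mean reduces RMSE.

```python
# Toy illustration (made-up numbers) of why hierarchical point estimates can
# beat independent per-subject estimates in a simple Gaussian model.
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_trials = 50, 10
tau, sigma = 1.0, 3.0                        # between-subject SD, trial-level noise SD

theta = rng.normal(0.0, tau, n_subj)         # true subject-level parameters
data = theta[:, None] + rng.normal(0.0, sigma, (n_subj, n_trials))

mle = data.mean(axis=1)                      # independent per-subject estimates
se2 = sigma ** 2 / n_trials                  # sampling variance of each subject mean
w = tau ** 2 / (tau ** 2 + se2)              # shrinkage weight (variances taken as known)
hier = w * mle + (1 - w) * mle.mean()        # shrink toward the group mean

def rmse(est):
    return np.sqrt(np.mean((est - theta) ** 2))

print("RMSE, per-subject estimates:", rmse(mle))
print("RMSE, hierarchical (shrunken) estimates:", rmse(hier))
```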
Abstract: Multidimensional scaling models of stimulus domains are widely used as a representational basis for cognitive modeling. These representations associate stimuli with points in a coordinate space that has some predetermined number of dimensions. Although the choice of dimensionality can significantly influence cognitive modeling, it is often made on the basis of unsatisfactory heuristics. To address this problem, a Bayesian approach to dimensionality determination, based on the Bayesian Information Criterion (BIC), is developed using a probabilistic formulation of multidimensional scaling. The BIC approach formalizes the trade-off between data-fit and model complexity implicit in the problem of dimensionality determination and allows for the explicit introduction of information regarding data precision. Monte Carlo simulations are presented indicating that, with this approach, the determined dimensionality is likely to be accurate if either a significant number of stimuli are considered or a reasonable estimate of precision is available. The approach is demonstrated using an established data set involving the judged pairwise similarities between a set of geometric stimuli.
Pub.: 17 Feb '01, Pinned: 03 Jul '17
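The kind of BIC computation the abstract describes could be sketched roughly as follows, assuming a Gaussian error model with a known data precision s and an off-the-shelf metric MDS fit (the paper's own probabilistic formulation differs in detail):

```python
# Sketch of BIC-guided dimensionality selection for MDS (assumed Gaussian error
# with known precision s; uses an off-the-shelf metric MDS fit, so the details
# differ from the paper's probabilistic formulation).
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

def bic_for_dimensionality(dissim, m, s):
    """BIC for an m-dimensional MDS solution of the dissimilarity matrix dissim,
    treating each observed dissimilarity as having Gaussian error with SD s."""
    n = dissim.shape[0]
    coords = MDS(n_components=m, dissimilarity='precomputed',
                 random_state=0).fit_transform(dissim)
    fitted = squareform(pdist(coords))
    iu = np.triu_indices(n, k=1)
    sse = np.sum((dissim[iu] - fitted[iu]) ** 2)
    n_data = len(iu[0])                       # number of distinct stimulus pairs
    n_params = n * m                          # one coordinate per stimulus per dimension
    return sse / s ** 2 + n_params * np.log(n_data)

# Choose the dimensionality that minimizes BIC, e.g. for a matrix D and precision 0.1:
# best_m = min(range(1, 6), key=lambda m: bic_for_dimensionality(D, m, s=0.1))
```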
Abstract: When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach.
Pub.: 16 Sep '04, Pinned: 03 Jul '17
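The common analytical approach the review advocates amounts to fitting the same hyperbola-like form to indifference points from both domains; a minimal sketch with made-up indifference-point data might look like this:

```python
# Sketch of the common analytical approach: fit the same hyperbola-like form to
# indifference points from both domains (the indifference-point data are made up).
import numpy as np
from scipy.optimize import curve_fit

def hyperbola_like(x, k, s):
    """Relative subjective value 1/(1 + k*x)**s, where x is the delay for
    temporal discounting or the odds against winning for probability discounting."""
    return 1.0 / (1.0 + k * x) ** s

delays = np.array([1.0, 7.0, 30.0, 180.0, 365.0])               # days
rel_value_delay = np.array([0.95, 0.85, 0.65, 0.40, 0.30])      # indifference point / amount

odds_against = np.array([0.11, 0.33, 1.0, 3.0, 9.0])            # (1 - p) / p
rel_value_prob = np.array([0.90, 0.75, 0.55, 0.30, 0.15])

(k_d, s_d), _ = curve_fit(hyperbola_like, delays, rel_value_delay,
                          p0=[0.01, 1.0], bounds=(0, np.inf))
(k_p, s_p), _ = curve_fit(hyperbola_like, odds_against, rel_value_prob,
                          p0=[1.0, 1.0], bounds=(0, np.inf))
print(k_d, s_d)   # delay-discounting parameters
print(k_p, s_p)   # probability-discounting parameters
```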
Abstract: The present experiments extend the temporal discounting paradigm from choice between an immediate and a delayed reward to choice between 2 delayed rewards: a smaller amount of money available sooner and a larger amount available later. Across different amounts and delays, the data were consistently well described by a hyperbola-like discounting function, and the degree of discounting decreased systematically as the delay to the sooner reward increased. Three theoretical models (the elimination-by-aspects, present-value comparison, and common-aspect attenuation hypotheses) were evaluated. The best account of the data was provided by the common-aspect attenuation hypothesis, according to which the common aspect of the choice alternatives (i.e., the time until the sooner reward is available) receives less weight in the decision-making process.
Pub.: 27 Oct '05, Pinned: 03 Jul '17
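For concreteness, the present-value comparison hypothesis named in the abstract (one of the three candidate models, and not the account the data favored) can be sketched as follows, assuming a simple hyperbolic discount function and an illustrative discount rate:

```python
# Sketch of the present-value comparison rule (hyperbolic form and discount
# rate k are illustrative assumptions, not values from the paper).
def present_value(amount, delay, k=0.01):
    """Hyperbolically discounted value of a reward received after `delay`."""
    return amount / (1.0 + k * delay)

def prefer_larger_later(sooner, t_sooner, later, t_later, k=0.01):
    """True if the larger-later reward has the higher discounted present value."""
    return present_value(later, t_later, k) > present_value(sooner, t_sooner, k)

# $500 in 30 days vs. $650 in 180 days:
print(prefer_larger_later(500, 30, 650, 180))
```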
Abstract: Cultural Consensus Theory (CCT) models have been applied extensively across research domains in the social and behavioral sciences in order to explore shared knowledge and beliefs. CCT models operate on response data, in which the answer key is latent. The current paper develops methods to enhance the application of these models by developing the appropriate specifications for hierarchical Bayesian inference. A primary contribution is the methodology for integrating the use of covariates into CCT models. More specifically, both person- and item-related parameters are introduced as random effects that can respectively account for patterns of inter-individual and inter-item variability.
Pub.: 12 Dec '13, Pinned: 03 Jul '17
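A minimal sketch of the kind of response probability at the core of CCT models for true/false data, assuming the General Condorcet Model form with person ability, item difficulty, and guessing bias (the paper's full hierarchical specification with covariates builds on person- and item-level parameters like these):

```python
# Sketch of a CCT response probability for true/false data, assuming the General
# Condorcet Model form with person ability theta_i, item difficulty delta_k,
# guessing bias g_i, and a latent answer key z_k.
def p_respond_true(theta_i, delta_k, g_i, z_k):
    """Probability that person i answers 'true' to item k."""
    # Probability that the person actually knows the answer (Rasch-like mix of
    # ability and item difficulty).
    d_ik = (theta_i * (1 - delta_k)
            / (theta_i * (1 - delta_k) + delta_k * (1 - theta_i)))
    # If the answer is known it is reported; otherwise the person guesses 'true'
    # with probability g_i.
    return z_k * (d_ik + (1 - d_ik) * g_i) + (1 - z_k) * (1 - d_ik) * g_i

# A competent respondent (theta = 0.8) on a moderately hard true item:
print(p_respond_true(theta_i=0.8, delta_k=0.4, g_i=0.5, z_k=1))
```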