Abstract
We propose a parsimonious topic model for text corpora. In related models such as Latent Dirichlet Allocation (LDA), all words are modeled topic-specifically, even though many words occur with similar frequencies across different topics. Our model instead identifies the salient words for each topic, which receive topic-specific probabilities, with the remaining words explained by a universal shared model. Further, whereas in LDA all topics are in principle present in every document, our model yields a sparse topic representation, determining the (small) subset of topics relevant to each document. We derive a Bayesian Information Criterion (BIC) that balances model complexity against goodness of fit. Here, interestingly, we identify an effective sample size, and a corresponding penalty, specific to each parameter type in our model. We minimize BIC to jointly determine the entire model (the topic-specific words, the document-specific topics, all model parameter values, {\it and} the total number of topics) in a wholly unsupervised fashion. Results on three text corpora and an image dataset show that our model achieves higher test-set likelihood and better agreement with ground-truth class labels, compared with both LDA and a model designed to incorporate sparsity.
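For orientation, a minimal sketch of such a criterion, under the assumption that the penalty decomposes over parameter types (the notation $k_t$, $n_t$ is ours, for illustration only): letting $\hat{L}$ denote the maximized likelihood, $k_t$ the number of free parameters of type $t$, and $n_t$ the effective sample size identified for that type,
\[
\mathrm{BIC} = -2 \ln \hat{L} + \sum_{t} k_t \ln n_t .
\]
Setting every $n_t$ to the overall sample size $n$ recovers the classical criterion $-2 \ln \hat{L} + k \ln n$, with $k = \sum_t k_t$ the total parameter count.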
Biography
David J. Miller received the B.S.E. degree from Princeton University, Princeton, NJ, in 1987, the M.S.E. degree from the University of Pennsylvania, Philadelphia, in 1990, and the Ph.D. degree from the University of California, Santa Barbara, in 1995, all in electrical engineering. From January 1988 to January 1990, he was with General Atronics Corporation, Wyndmoor, PA. In 1995, he joined the Pennsylvania State University, University Park, as an Assistant Professor, and since 2008 he has been a Professor of electrical engineering at Penn State. His research interests include statistical pattern recognition, machine learning, source coding, bioinformatics, and network security. Dr. Miller was a member of the Machine Learning for Signal Processing Technical Committee of the IEEE Signal Processing Society until 2010. He was General Chair of the 2001 IEEE Workshop on Neural Networks for Signal Processing and, from 2004 to 2007, an Associate Editor of the IEEE TRANSACTIONS ON SIGNAL PROCESSING. He received the National Science Foundation CAREER Award in 1996.