The Attraction Indian Buffet Distribution (AIBD)
Abstract by Jeremy Meyer
Latent feature models seek to uncover hidden categorical variables that explain observed data. A popular prior distribution for latent feature models is the Indian buffet process (IBP). The IBP is a distribution over binary feature matrices with an unbounded number of columns (one for each possible feature) and a fixed number of rows (one for each observation). The IBP assumes that the observations are exchangeable. In many situations, however, exchangeability is neither reasonable nor desirable when pairwise similarity information between observations is available. We propose the attraction Indian buffet distribution (AIBD), a distribution over binary feature matrices that is influenced by pairwise similarity. Our formulation preserves many of the properties of the original IBP. The probability mass function can be written explicitly and has a tractable normalizing constant, making posterior inference on hyperparameters straightforward using standard MCMC methods. We demonstrate the feasibility and performance of our method in simulations and an example.
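To make the IBP concrete, the sketch below draws a binary feature matrix from the standard IBP prior using its familiar "restaurant" construction: customer i takes each previously sampled dish k with probability m_k / i (where m_k is the number of earlier customers who took dish k), then samples Poisson(alpha / i) new dishes. This is a minimal illustration of the baseline IBP described above, not the AIBD itself; the function name `sample_ibp` and its interface are our own for illustration.

```python
import numpy as np

def sample_ibp(n, alpha, rng=None):
    """Draw an n-row binary feature matrix from the IBP prior.

    n     : number of observations (rows / customers)
    alpha : IBP mass parameter controlling the expected number of features
    """
    rng = np.random.default_rng(rng)
    dishes = []  # each entry is one column: a 0/1 history over customers so far
    for i in range(1, n + 1):
        # existing dishes: take dish k with probability m_k / i
        for col in dishes:
            m_k = sum(col)
            col.append(1 if rng.random() < m_k / i else 0)
        # new dishes: Poisson(alpha / i) columns, first taken by customer i
        for _ in range(rng.poisson(alpha / i)):
            dishes.append([0] * (i - 1) + [1])
    if not dishes:
        return np.zeros((n, 0), dtype=int)
    return np.array(dishes, dtype=int).T  # shape (n, K)

# Example: one prior draw for 5 observations
Z = sample_ibp(5, alpha=2.0, rng=1)
```

Under the AIBD, the probability m_k / i would instead be reweighted by the pairwise similarities, pulling similar observations toward sharing features; the IBP draw above is the exchangeable special case.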