Efficient Variational Inference in miniKanren with Weighted Model Counting (Virtual, Live)
We extend miniKanren with a collection of primitives for describing probabilistic generative models and describe modifications to the language's stream-based implementation that permit the efficient variational learning of such models via weighted model counting, with runtimes within an order of magnitude of manual implementations. We begin with a naive implementation that requires minimal changes to the core miniKanren implementation, and then describe two modifications that achieve practical levels of efficiency. The first alters the search to factorize conditionally independent conjuncts, avoiding unnecessary combinatorial explosion. The second modifies tabling to recover standard probabilistic dynamic programming algorithms such as Viterbi, forward-backward, and Baum-Welch. The end result is a simple extension to miniKanren that is nevertheless efficient enough to be of use in writing practical probabilistic relational programs.
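To make concrete the kind of probabilistic dynamic program the abstract says tabling recovers, here is a sketch (in Python, not miniKanren; the model and all names are illustrative, not from the paper) of the forward algorithm for a hidden Markov model. Tabled weighted model counting recovers exactly this pattern: summing weights over shared subgoals instead of enumerating every path.

```python
# Illustrative sketch only: the forward algorithm for an HMM, the kind of
# probabilistic dynamic program that tabled weighted model counting can
# recover automatically from a relational specification.

def forward(obs, states, start, trans, emit):
    """Total probability of an observation sequence under an HMM."""
    # alpha[s] = weight of all paths explaining the prefix so far,
    # ending in state s -- the "table" a tabled search would memoize.
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans[r][s] for r in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

# A toy two-state weather model (hypothetical parameters).
states = ("Rain", "Sun")
start = {"Rain": 0.5, "Sun": 0.5}
trans = {"Rain": {"Rain": 0.7, "Sun": 0.3},
         "Sun":  {"Rain": 0.4, "Sun": 0.6}}
emit = {"Rain": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sun":  {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

prob = forward(["walk", "shop"], states, start, trans, emit)
```

Without tabling, a naive relational search would enumerate every state path separately, which is exponential in the sequence length; the memoized `alpha` table is what makes the count linear.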
Thu 15 Sep (displayed time zone: Belgrade, Bratislava, Budapest, Ljubljana, Prague)
14:00 - 15:30

14:00 (30m, Talk): Efficient Variational Inference in miniKanren with Weighted Model Counting (Virtual, Live) — miniKanren — Pre-print, File Attached

14:30 (30m, Talk): Some criteria for implementations of conjunction and disjunction in microKanren (Virtual, Live) — miniKanren — Pre-print

15:00 (30m, Talk): Fail Fast and Profile On: Towards a miniKanren Profiler (Virtual, Live) — miniKanren — Sloan Chochinov (presenter, University of Toronto Mississauga), Daksh Malhotra (presenter, University of Toronto Mississauga), Gregory Rosenblatt (University of Alabama at Birmingham), Matthew Might (University of Alabama at Birmingham | Harvard Medical School), Lisa Zhang (University of Toronto Mississauga) — Pre-print