Many people believe in the "Law of Small Numbers," exaggerating the degree to which a small sample resembles the population from which it is drawn. To model this, I assume that a person exaggerates the likelihood that a short sequence of i.i.d. signals resembles the long-run rate at which those signals are generated. Such a person believes in the "gambler's fallacy," thinking that early draws of one signal increase the odds that subsequent draws will be of other signals. When uncertain about the rate, the person over-infers from short sequences of signals and is prone to think the rate is more extreme than it is. When the person makes inferences about the frequency at which rates are generated by different sources -- such as the distribution of talent among financial analysts -- based on few observations from each source, he tends to exaggerate how much variance there is in those rates. Hence, the model predicts that people may pay for financial advice from "experts" whose expertise is entirely illusory. Other economic applications are discussed.
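
To make the mechanism concrete, the following is a minimal sketch, not the paper's own formalization (which the abstract does not spell out): it assumes the belief in the Law of Small Numbers can be represented as drawing without replacement from a small hypothetical urn whose composition matches the long-run rate, while the true process is i.i.d. The names and parameters (`urn_size`, the candidate rates, the two-signal sequence "aa") are illustrative choices, not taken from the paper. The sketch exhibits the two effects described above: the gambler's fallacy (after one "a," the believer expects another "a" less often than the true rate) and over-inference (after a short sequence, the believer's posterior on the high rate is more extreme than the correct Bayesian posterior).

```python
# Illustrative sketch only: the "urn without replacement" representation of the
# Law of Small Numbers is an assumption made for this example, not necessarily
# the model used in the paper.

from fractions import Fraction


def believer_prob_a_after_a(theta, urn_size):
    """Believer's conditional probability of a second 'a' after one 'a':
    one 'a' is treated as used up, so the odds shift toward 'b'
    (the gambler's fallacy). The true i.i.d. probability stays theta."""
    n_a = theta * urn_size              # 'a' signals the urn starts with
    return (n_a - 1) / (urn_size - 1)


def iid_likelihood(theta, data):
    """Correct likelihood of a short sequence under an i.i.d. rate theta."""
    p = Fraction(1)
    for s in data:
        p *= theta if s == "a" else 1 - theta
    return p


def urn_likelihood(theta, data, urn_size=10):
    """Believer's likelihood: sequential draws without replacement from an
    urn of urn_size signals containing theta * urn_size 'a' signals."""
    n_a, n_b = theta * urn_size, (1 - theta) * urn_size
    p = Fraction(1)
    for i, s in enumerate(data):
        remaining = urn_size - i
        if s == "a":
            p *= n_a / remaining
            n_a -= 1
        else:
            p *= n_b / remaining
            n_b -= 1
    return p


def posterior_high(likelihood, thetas, data):
    """Posterior on the higher candidate rate, uniform prior over the two."""
    lo, hi = sorted(thetas)
    l_hi, l_lo = likelihood(hi, data), likelihood(lo, data)
    return l_hi / (l_hi + l_lo)


if __name__ == "__main__":
    theta, urn = Fraction(7, 10), 10

    # Gambler's fallacy: believer expects 'a' less often than the true rate 0.7.
    print("P(a | a)  true     :", float(theta))
    print("P(a | a)  believer :", float(believer_prob_a_after_a(theta, urn)))

    # Over-inference: after observing "aa", the believer's posterior on the
    # high rate (0.875) exceeds the correct Bayesian posterior (about 0.845).
    thetas, data = (Fraction(3, 10), Fraction(7, 10)), "aa"
    print("P(high | aa)  Bayesian :", float(posterior_high(iid_likelihood, thetas, data)))
    print("P(high | aa)  believer :",
          float(posterior_high(lambda t, d: urn_likelihood(t, d, urn), thetas, data)))
```

Run as a script, the sketch prints a believer's continuation probability of 2/3 against a true rate of 0.7, and a believer's posterior on the high rate of 0.875 against a Bayesian's roughly 0.845, illustrating the gambler's fallacy and over-inference from short sequences under the stated assumptions.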