We investigate the origin of Zipf's law for words in written texts by means of a stochastic dynamic model of text generation. The model incorporates both features related to the general structure of languages and memory effects inherent in the production of long coherent messages in the communication process. It is shown that the multiplicative dynamics of our model lead to rank-frequency distributions in quantitative agreement with empirical data. Our results support the linguistic relevance of Zipf's law in human language.
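The abstract does not specify the model's details, but the multiplicative ("rich-get-richer") dynamics it refers to can be illustrated with a minimal Simon-style sketch: at each step a new word is introduced with probability `alpha`, otherwise an already-used word is repeated with probability proportional to its current frequency. The parameters `alpha`, `n_tokens`, and the function names are illustrative assumptions, not the authors' actual model.

```python
import random
from collections import Counter

def generate_text(n_tokens, alpha=0.05, seed=42):
    """Illustrative Simon-style multiplicative process (an assumption,
    not the paper's exact model): with probability alpha emit a brand-new
    word; otherwise repeat a uniformly chosen past token, which selects an
    existing word with probability proportional to its frequency."""
    rng = random.Random(seed)
    tokens = []
    next_word_id = 0
    for _ in range(n_tokens):
        if not tokens or rng.random() < alpha:
            tokens.append(next_word_id)  # introduce a new word type
            next_word_id += 1
        else:
            tokens.append(rng.choice(tokens))  # frequency-proportional reuse
    return tokens

tokens = generate_text(20000)
# Rank-frequency distribution: counts[0] is the most frequent word's count.
counts = sorted(Counter(tokens).values(), reverse=True)
print(len(counts), counts[:5])
```

Such multiplicative reuse produces a heavy-tailed rank-frequency curve: a few word types accumulate most occurrences while the bulk appear only a handful of times, which is the qualitative signature of Zipf's law.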