This thesis introduces new unsupervised machine learning algorithms for complex event data. Event data---records of things that happen---is both ubiquitous and abundant in the modern world. Many types of event occur when two or more distinct entities interact. If these entities are also capable of interacting with other entities, the result is typically a dynamic, complex network of interactions. One very general statistical model of events is the Poisson process. In this work, Gaussian processes---another statistical model---are used to modulate the intensity of Poisson processes, allowing dynamic patterns of interaction to be captured. This dynamic model can then be further extended with latent variables that represent the missing information needed to unpick the complex interactions present in the data. Learning Bayesian posterior distributions over the latent variables reveals previously hidden structure and also allows statistical strength to be shared across multiple distinct event streams, improving predictive performance. Efficient inference in these models is challenging, however, and a significant contribution of this thesis is an efficient Bayesian inference algorithm for Gaussian process modulated Poisson processes. This scheme scales linearly in the number of observed events and requires neither the discretisation of the input domain nor the expensive Monte Carlo simulation that previous posterior inference algorithms for this model relied on. Latent variable adaptations of this model are presented, and the resulting algorithms are applied to data from a variety of different application domains.
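As a concrete illustration of the core generative model described above---a Poisson process whose intensity is modulated by a Gaussian process---the following is a minimal simulation sketch, not the thesis's inference algorithm. It draws a GP sample on a grid with an RBF kernel, maps it to a non-negative intensity via a squared link, and samples events by thinning; the kernel, lengthscale, link function, and all variable names here are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a Gaussian process sample f on a fine grid over [0, T]
# using an RBF kernel (lengthscale 1.0 is an arbitrary choice).
T, n = 10.0, 200
grid = np.linspace(0.0, T, n)
K = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2)
f = rng.multivariate_normal(np.zeros(n), K + 1e-8 * np.eye(n))

# Modulate the Poisson intensity with the GP via a squared link,
# lambda(t) = f(t)**2, which keeps the intensity non-negative.
lam = f**2

# Sample events by thinning: propose homogeneous Poisson events at
# the maximum rate, then keep each with probability lambda(t)/lam_max.
lam_max = lam.max()
n_prop = rng.poisson(lam_max * T)
proposals = rng.uniform(0.0, T, n_prop)
keep = rng.uniform(0.0, lam_max, n_prop) < np.interp(proposals, grid, lam)
events = np.sort(proposals[keep])
print(len(events), "events sampled")
```

Thinning avoids discretising time when sampling, mirroring the abstract's point that inference, too, can avoid discretisation of the input domain.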