Time delays are inherently present in any physical or biological network. However, the role of delays in echo state networks (ESNs) has received little attention. In recent years, local plasticity has been explored in reservoir computing, and specifically in ESNs. In this paper, we investigate the role of distance-dependent inter-neuron delays in adaptive reservoirs. We introduce a novel ESN design, the adaptive distance-based delay network (ADDN), which combines inter-neuron delays with local synaptic plasticity of the reservoir weights, using a delay-sensitive version of the Bienenstock-Cooper-Munro (BCM) rule. We show that ADDNs outperform standard ESNs, regular distance-based delay networks, and ESNs with conventional BCM connections on prediction tasks. We optimize the hyperparameters of ADDNs and each baseline model using the covariance matrix adaptation evolution strategy (CMA-ES). We demonstrate that, with ADDNs, a single evolved set of hyperparameters can generate networks which, after unsupervised adaptation, achieve good performance on Mackey-Glass sequences with a range of different time constants. By adapting its reservoir weights to the dynamics of the input data, an ADDN can generalize across versions of the same “class” of task.
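The core mechanism described above — recurrent connections whose delays grow with inter-neuron distance, adapted by a delay-sensitive BCM rule — can be illustrated with a minimal sketch. This is not the paper's exact formulation: the network size, delay discretization, classical BCM form (dw_ij ∝ y_i (y_i − θ_i) x_j(t − d_ij) with a sliding threshold θ_i ≈ ⟨y_i²⟩), and all parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 200                         # reservoir size, number of steps (assumed)
eta, tau_theta = 1e-3, 50.0            # BCM learning rate, threshold time constant

# Random neuron positions -> integer delays proportional to distance (1..max_delay).
pos = rng.uniform(0, 1, size=(N, 2))
dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
max_delay = 5
delays = np.clip(np.ceil(dist / dist.max() * max_delay), 1, max_delay).astype(int)

W = rng.normal(0, 0.1, size=(N, N))    # recurrent reservoir weights
w_in = rng.normal(0, 0.5, size=N)      # input weights
buf = np.zeros((max_delay + 1, N))     # ring buffer of past reservoir states
theta = np.ones(N)                     # BCM sliding thresholds, one per neuron

u = np.sin(0.1 * np.arange(T))         # toy scalar input signal
for t in range(T):
    # Delayed presynaptic states: x_delayed[i, j] = x_j(t - d_ij).
    x_delayed = buf[(t - delays) % (max_delay + 1), np.arange(N)[None, :]]
    y = np.tanh((W * x_delayed).sum(axis=1) + w_in * u[t])
    # Delay-sensitive BCM update: each synapse sees its own delayed input.
    W += eta * (y * (y - theta))[:, None] * x_delayed
    # Sliding threshold tracks the running average of squared activity.
    theta += (y**2 - theta) / tau_theta
    buf[t % (max_delay + 1)] = y
```

The point of the sketch is the per-synapse delay lookup: because each weight w_ij pairs the postsynaptic activity y_i(t) with the presynaptic state x_j(t − d_ij), the plasticity rule must use the same delayed signal as the forward pass, which is what distinguishes it from a conventional BCM update.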