We present an embodied neural model for haptic object classification through active haptic exploration with the humanoid robot NICO. When NICO's newly developed robotic hand closes around an object, multiple sensory readings from a tactile fingertip sensor, motor positions, and motor currents are recorded. We created a haptic dataset of 83,200 haptic measurements, based on 100 samples of each of 16 different objects, with each sample containing 52 measurements. First, we analyze neural classification models with regard to isolated haptic sensory channels for object classification. Building on this analysis, we develop a series of neural models (MLP, CNN, LSTM) that integrate the haptic sensory channels to classify the explored objects. As an initial baseline, our best model achieves a classification accuracy of 66.6% over the 16 objects. We show that this result stems from the network's ability to integrate the haptic data both over the time domain and across the different haptic sensory channels. Furthermore, we make the dataset publicly available to address the issue of sparse haptic datasets in machine learning research.