Plasticity of synaptic weights is usually considered the foundation of learning and long-term memory in biological neural networks, and mathematical models of both biological and artificial neural networks reflect this view. Little attention is paid to the role that spike propagation delays play in information processing and learning. We propose a model of myelin plasticity that controls the efficiency of spike propagation along axons: a neuron modifies the myelin sheath thickness of its input axons to achieve better synchrony of incoming spikes. Synchronous input spikes evoke a stronger postsynaptic response, which increases the probability of spike generation. We show that this axonal delay plasticity model can be used to train a network to recognize input patterns even when synaptic weights remain fixed. The delay plasticity approach may be a useful augmentation of spiking neural networks used in neuromorphic computing.
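
The core idea, adjusting axonal delays so that input spikes arrive in synchrony, can be illustrated with a minimal sketch. The update rule below (`update_delays`, pulling each arrival time toward the mean arrival time) is a hypothetical stand-in for illustration, not the paper's actual plasticity rule; the learning rate and delay bounds are likewise assumed values.

```python
import numpy as np

def update_delays(spike_times, delays, lr=0.2, d_min=0.1, d_max=10.0):
    """Hypothetical delay plasticity rule: shift each axonal delay so its
    spike arrives closer to the mean arrival time, mimicking adaptation of
    myelin sheath thickness (thicker myelin -> faster conduction -> shorter delay).

    spike_times: presynaptic emission times (ms)
    delays:      current axonal conduction delays (ms)
    """
    arrivals = spike_times + delays
    target = arrivals.mean()                  # assumed synchrony reference
    delays = delays + lr * (target - arrivals)
    return np.clip(delays, d_min, d_max)      # myelin thickness is bounded

# Example: three axons whose spikes initially arrive spread out in time.
spike_times = np.array([0.0, 1.0, 3.0])
delays = np.array([5.0, 5.0, 5.0])
for _ in range(50):
    delays = update_delays(spike_times, delays)
arrivals = spike_times + delays
print(arrivals.std())  # arrival jitter shrinks toward zero
```

Under this toy rule the arrival times converge to their common mean, so a repeated input pattern produces a tightly synchronized volley and hence a stronger postsynaptic response, which is the effect the abstract describes.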