The ability to detect that a grasped object is slipping from the gripper is a crucial skill for autonomous robotic manipulation. However, current solutions for automatic slip detection do not perform well in real-world unstructured settings, in which a wide variety of gripper-object interactions can occur. Tactile and force sensing are the sensory modalities best suited to detecting such events, and recent technological advances in the field are opening up novel and interesting opportunities. In this work, we propose a data-driven method for automatic slip detection that leverages a novel sensor combining the advantages of tactile and force sensing, i.e. distributed measurements of normal and shear contact forces. Notably, our model is trained (and tested) exclusively on data obtained during routine robot operations (i.e. in the wild), rather than during a controlled data-collection procedure. We compare different sets of tactile/force features to highlight the advantages provided by each sensory modality, and we report results showing good detection performance on our in-the-wild dataset, which we make publicly available.