Visibility is critical for transportation systems such as vehicles, airplanes, drones, and ships. Weather, atmospheric, and artificial conditions, e.g., rain, fog, and airborne chemical components, determine visibility levels and distances. Dedicated visibility sensors have been used for this purpose. Meanwhile, camera image-based approaches have focused on enhancing image quality, i.e., visibility, in foggy and rainy scenes. However, it remains hard to obtain the ground truth of visibility distance from a camera image, particularly at far distances. This paper proposes DeepViss, a novel camera-based visibility estimation method that combines an atmospheric chemical sensor with Deep Learning on camera images. A transformer-based regression model is trained on paired sensor-image datasets captured in different directions. Experimental results demonstrate the effectiveness of DeepViss at far distances of up to a few hundred kilometers. This approach can help replace expensive visibility sensors with low-cost cameras, e.g., mounted on drones and ships.
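The abstract does not detail DeepViss's architecture; the sketch below only illustrates the general idea of a transformer-style regression that fuses image-patch embeddings with an atmospheric-sensor reading into a scalar visibility distance. All names, shapes, the fusion scheme, and the (untrained, random) weights are hypothetical assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence."""
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = softmax(Q @ K.T / np.sqrt(Q.shape[-1]), axis=-1)
    return scores @ V

def predict_visibility_km(patch_embeddings, sensor_reading, p):
    """Fuse image-patch tokens with a chemical-sensor token, regress a distance.

    Hypothetical sketch: the real DeepViss model is not specified in the abstract.
    """
    # Hypothetical fusion: scale a learned vector by the scalar sensor reading
    sensor_token = sensor_reading * p["w_sensor"]
    # Prepend the sensor token to the image-patch token sequence
    tokens = np.vstack([sensor_token, patch_embeddings])
    attended = self_attention(tokens, p["Wq"], p["Wk"], p["Wv"])
    pooled = attended.mean(axis=0)          # mean-pool the attended sequence
    return float(pooled @ p["w_head"])      # scalar visibility distance (untrained)

rng = np.random.default_rng(0)
d = 16  # hypothetical embedding width
p = {k: rng.normal(size=(d, d)) for k in ("Wq", "Wk", "Wv")}
p["w_sensor"] = rng.normal(size=d)
p["w_head"] = rng.normal(size=d)

patches = rng.normal(size=(8, d))  # stand-in for 8 image-patch embeddings
dist = predict_visibility_km(patches, sensor_reading=0.3, p=p)
```

A real system would train such a model end-to-end on sensor-image pairs with ground-truth distances; with random weights the output here is meaningless and serves only to show the data flow.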