Automated anomaly detection and localization in musculoskeletal radiographs are essential for large-scale screening in the radiology workflow. However, anomalies are localized patterns whose detection can be confounded by irrelevant surrounding areas, and the ambiguous, subtle appearance of abnormal regions makes them difficult to detect. To tackle these two problems, we propose a deep guided context-aware network (DR-Net) for anomaly detection in musculoskeletal X-rays. Specifically, we design a positional guide module that embeds spatial positional information as prior knowledge, guiding the network to enhance the feature representation of specific regions. To detect subtle anomalies, we then construct a contextual relation module, which obtains context-aware features by capturing the spatial dependence between any two positions in the entire X-ray image. It combines contextual appearance information and selects more discriminative features along both the spatial and channel dimensions, producing a detailed visualization of the anomalous region. Notably, only image-level labels are required. Extensive experiments on two radiograph datasets show that DR-Net achieves promising performance in anomaly detection and localization.
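To illustrate the idea of capturing the spatial dependence between any two positions of a feature map, the following NumPy sketch implements a generic non-local (self-attention) aggregation. This is not the authors' implementation; the function name, the dot-product similarity, and the absence of learned projections are all simplifying assumptions for illustration.

```python
import numpy as np

def spatial_self_attention(x):
    """Aggregate context over a feature map x of shape (C, H, W).

    Each output position becomes a weighted sum of ALL positions, with
    weights given by pairwise feature similarity -- a minimal sketch of
    how a contextual relation module can model the dependence between
    any two spatial locations. (Illustrative only; no learned weights.)
    """
    C, H, W = x.shape
    N = H * W
    feats = x.reshape(C, N)                  # one column per spatial position
    # Pairwise similarity between every pair of positions (N x N).
    sim = feats.T @ feats / np.sqrt(C)
    # Row-wise softmax: attention weights over all positions per query.
    sim = sim - sim.max(axis=1, keepdims=True)
    attn = np.exp(sim)
    attn /= attn.sum(axis=1, keepdims=True)
    # Each position aggregates features from the whole map.
    out = feats @ attn.T                     # (C, N)
    return out.reshape(C, H, W)
```

In the full model, such context-aware features would be combined with the positionally guided features and refined along the channel dimension before producing the anomaly localization map.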