This paper presents a scheme for retrieving wind direction over the sea, free of the 180° ambiguity, based solely on Synthetic Aperture Radar (SAR) images. The dataset used to train, validate, and test the wind direction estimation deep neural network consists of 19,210 spatiotemporal match-ups of Sentinel-1 sub-images from the Google Earth Engine (GEE) platform and wind direction records from National Data Buoy Center (NDBC) stations, spanning 2014 to 2023. Two evaluation metrics, Line Angle Error (LAE) and Wind Direction Ambiguity accuracy (WDA accuracy), are proposed to facilitate modelling and to assess model performance in various conditions, whether open sea or not, and under different marine atmospheric boundary layer (MABL) stratification states. A post-processing method is devised to derive the final, robust wind direction field. Stratification is quantified by a bulk Richardson number, Ri. With a wide spatiotemporal distribution, 58 full-frame Sentinel-1 images satisfying -0.016 < Ri < +0.001 constitute an independent dataset for validating the entire wind direction retrieval process. Statistical analysis comparing the SAR-derived wind directions with those from COSMO-REA6, ASCAT, and ERA5 reports bias < 1.4° and centered root mean square difference (CRMSD) < 17.3°. Wind streaks (WS) and shallow convective cells (MC) are common coherent structures in these images, both of which contain wind direction information. The scheme shows some ability to counteract contamination from rain cells, internal waves, and atmospheric gravity waves, and also demonstrates great potential for high-resolution wind direction retrieval of tropical cyclones (TC). Additionally, we introduce Gradient-weighted Class Activation Mapping (Grad-CAM) to highlight the significance of land-shadow features for wind direction inference.
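For orientation, one common bulk formulation of the Richardson number used in air-sea interaction studies is sketched below; the paper's exact definition (reference height, temperature variables) may differ from this assumed form:

```latex
% A common bulk Richardson number at reference height z (assumed form,
% not necessarily the paper's exact definition):
\[
  Ri \;=\; \frac{g\, z \,\bigl(\theta_{v,\mathrm{air}} - \theta_{v,\mathrm{sea}}\bigr)}
                {\overline{\theta}_v \, U^2},
\]
% g: gravitational acceleration; z: reference height;
% \theta_v: virtual potential temperature; U: wind speed at height z.
```

Under this sign convention, Ri < 0 indicates unstable (convective) MABL stratification and Ri > 0 stable stratification, so the validation constraint -0.016 < Ri < +0.001 selects near-neutral conditions.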