Although I’m a “newbie”, I’m currently looking into the first one. I’ll post how I did it here; feedback is much appreciated xD
I used wget -m -p -E -k -np https://www.nrcs.usda.gov/conservation-basics/natural-resource-concerns/soil/climate-data-tools as a starting point (as suggested by @markwyner).
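For anyone unfamiliar with the options, here is the same mirror command with a short note on what each flag does (this is just my reading of the standard wget flags):

```bash
# -m   mirror: recursive download with timestamping
# -p   page requisites: also fetch images/CSS needed to render pages
# -E   adjust extensions: save HTML/CSS files with proper suffixes
# -k   convert links in downloaded files to point at the local copies
# -np  no parent: don't ascend above the starting directory
wget -m -p -E -k -np https://www.nrcs.usda.gov/conservation-basics/natural-resource-concerns/soil/climate-data-tools
```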
Henry Dataset
For the first entry in the list, the “Henry” database, I also scraped the website via wget -m -p -E -k -np http://soilmap2-1.lawr.ucdavis.edu/henry/. As there is no convenient way to download the whole dataset at once (besides maybe via R), I am now downloading the data manually by looping over the sensor IDs: for i in `seq 1 2000`; do wget https://soilmap2-1.lawr.ucdavis.edu/henry/get_sensor_graph_data.php?sensors=$i; done (I chose 2000 because the website shows ~1600 sensors and they appear to be numbered sequentially). I got the associated list of sensors via wget https://soilmap2-1.lawr.ucdavis.edu/henry/get_sensors.php. A slightly tidied-up version of the loop is sketched below.
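The sketch quotes the URL (so the ? and = aren’t touched by the shell), writes each response to a numbered file, and waits briefly between requests to go easy on the server. The output file names, the .json extension, and the 1-second delay are my own assumptions, not anything the site specifies:

```bash
#!/usr/bin/env bash
# Fetch graph data for Henry sensors 1..2000 into henry_sensor_data/
# (assumes IDs are sequential; non-existent IDs just return small/empty responses)
mkdir -p henry_sensor_data
for i in $(seq 1 2000); do
  wget -q -O "henry_sensor_data/sensor_${i}.json" \
    "https://soilmap2-1.lawr.ucdavis.edu/henry/get_sensor_graph_data.php?sensors=${i}"
  sleep 1   # small pause between requests
done
```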
Drought Monitor
For the Drought Monitor I downloaded the yearly shapefile archives via: for i in `seq 2000 2025`; do wget https://droughtmonitor.unl.edu/data/shapefiles_m//$i\_USDM_M.zip ; done
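The same loop written out a bit more explicitly, using ${i} instead of the escaped underscore and dropping the doubled slash; the behaviour should be identical, this is purely cosmetic:

```bash
# Download the U.S. Drought Monitor zip archives for 2000 through 2025
for year in $(seq 2000 2025); do
  wget "https://droughtmonitor.unl.edu/data/shapefiles_m/${year}_USDM_M.zip"
done
```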