Deep learning downscaling of climate variables to convection-permitting scales

dc.contributor.author: Daust, Kiri Shea
dc.contributor.supervisor: Monahan, Adam Hugh
dc.date.accessioned: 2024-11-12T21:24:04Z
dc.date.available: 2024-11-12T21:24:04Z
dc.date.issued: 2024
dc.degree.department: School of Earth and Ocean Sciences
dc.degree.level: Master of Science (MSc)
dc.description.abstract: Adapting to the changing climate requires accurate local climate information, a computationally challenging problem. Recent studies have used Generative Adversarial Networks (GANs), a type of deep learning, to learn complex distributions and downscale climate variables efficiently. Capturing variability while downscaling is crucial for estimating uncertainty and characterising extreme events, both critical for climate adaptation. Since downscaling is an underdetermined problem, many fine-scale states are physically consistent with a given coarse-resolution state. To address this ill-posed problem, downscaling techniques should be stochastic, able to sample realisations from a high-resolution distribution conditioned on low-resolution input. Previous stochastic downscaling attempts have found substantial underdispersion, with models failing to represent the full distribution. I propose approaches to improve the stochastic calibration of GANs in three ways: a) injecting noise inside the network, b) adjusting the training process to explicitly account for the stochasticity, and c) using a probabilistic loss metric. I tested models first on a synthetic dataset with known distributional properties, and then on a realistic downscaling scenario, predicting high-resolution wind components from low-resolution climate covariates. Injecting noise, on its own, substantially improved the quality of conditional and full distributions in tests with synthetic data, but performed less well for wind field downscaling, where models remained underdispersed. For wind downscaling, I found that adjusting the training method and including the probabilistic loss improved calibration. The best model, with all three changes, showed much improved skill at capturing the full variability of the high-resolution distribution and thus at characterising extremes.
Investigating the stochastic GAN framework with other variables, I show that it successfully downscales temperature, specific humidity, and precipitation. I also find that the stochastic framework substantially improves the downscaling of extreme precipitation. Next, I find that while multivariate downscaling can improve dependence structures between downscaled variables, it leads to blurry downscaling of individual variables. I demonstrate that including high-resolution topography as an input improves spatial structure for most variables. Finally, I test the generalisability of the GAN framework to a new location with a different climate, and show that while the GAN performs well for temperature and humidity, it fails for precipitation due to mismatches between the low- and high-resolution data. These results contribute techniques and insights that move GAN-based downscaling towards operational use.
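As an illustrative aside (not from the thesis itself), the noise-injection idea described in the abstract, sampling different fine-scale realisations from a single coarse input by adding noise at an internal layer, can be sketched with a toy NumPy "generator". All names, shapes, and weights here are hypothetical, standing in for a trained GAN generator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed weights for a toy two-layer "generator"
# (a real GAN generator would learn these during training).
W1 = rng.standard_normal((8, 4)) * 0.5   # coarse input (4) -> hidden (8)
W2 = rng.standard_normal((16, 8)) * 0.5  # hidden (8) -> fine output (16)

def generate(x_lowres, noise_std=1.0):
    """Map one coarse-resolution state to one plausible fine-scale realisation.

    Noise is injected at the hidden layer, so repeated calls with the same
    input draw different members of the conditional distribution.
    """
    h = np.tanh(W1 @ x_lowres)
    h = h + noise_std * rng.standard_normal(h.shape)  # internal noise injection
    return W2 @ h

x = np.ones(4)  # a single coarse-resolution "state"
ensemble = np.stack([generate(x) for _ in range(100)])  # 100 realisations
print(ensemble.shape)  # (100, 16)
```

The point of the sketch is only the sampling mechanism: because the noise enters inside the network rather than being concatenated to the input, the ensemble spread (`ensemble.std(axis=0)`) is nonzero even though every member was conditioned on the same coarse state.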
dc.description.scholarlevel: Graduate
dc.identifier.uri: https://hdl.handle.net/1828/20740
dc.language: English
dc.language.iso: en
dc.rights: Available to the World Wide Web
dc.subject: Climate downscaling
dc.subject: Deep learning
dc.subject: Statistical downscaling
dc.subject: Ensemble forecasts
dc.title: Deep learning downscaling of climate variables to convection-permitting scales
dc.type: Thesis

Files

Original bundle
Name: Daust_Kiri_MSc_2024.pdf
Size: 35.64 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.62 KB
Format: Item-specific license agreed upon to submission