
Creating My Own NWP Model

wx_guy

Member
Messages
1,033
Location
United States
HAM Callsign
KO4ZGH
Special Affiliations
  1. SKYWARN® Volunteer
  2. ARRL Member
Hello everyone!


I've recently undertaken a very tough (admittedly tougher than I imagined) endeavor: getting a customized version of the WRF-ARW model running on my own computer. After 50-75 hours of painstaking troubleshooting, I've had some initial success! As you can see below, I still have some polishing to do (removing the weird double legend, probably an easy fix since I just mis-coded it; making the labels easier to read; adding timestamps; etc.), but I'm excited to share results so far. The static image below shows temperatures at the "test time" I've been using, the 03/19/25 0z run. The GIF shows dewpoint temperatures.
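On the double-legend issue mentioned above: a common cause in matplotlib animations is calling `fig.colorbar()` once per frame, which stacks a new colorbar each time. A rough sketch of the usual fix, creating the colorbar once and only updating the mesh data (all data below is synthetic, not the thread's actual code):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
import numpy as np

# Fake stack of temperature frames (time, y, x) for illustration
temps = np.random.uniform(-10, 35, size=(5, 50, 60))

fig, ax = plt.subplots()
mesh = ax.pcolormesh(temps[0], cmap="turbo", vmin=-10, vmax=35)
# Create the colorbar exactly once, outside the frame loop
cbar = fig.colorbar(mesh, ax=ax, label="2 m temperature (°C)")

for frame in temps[1:]:
    mesh.set_array(frame.ravel())  # update the data, not the colorbar
    fig.canvas.draw()

print(len(fig.axes))  # main axes plus a single colorbar axes
```

If the colorbar call sits inside the loop instead, each pass adds another axes, which is exactly the stacked/double legend symptom.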

I'm working on adding radar and lots of derived fields (especially for severe weather), and possibly integrating live data into it similar to the HRRR.

It currently uses GFS initial conditions downloaded from NOMADS.
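For anyone wanting to reproduce the data pull, NOMADS exposes the 0.25-degree GFS GRIB2 files under a predictable URL layout. The pattern below matches NOMADS' public directory structure as I understand it, but double-check it against the site before relying on it:

```python
from datetime import datetime


def gfs_nomads_url(cycle: datetime, fhour: int) -> str:
    """Build a direct download URL for a 0.25-degree GFS GRIB2 file.

    cycle: model initialization time (e.g., 2025-03-19 00z)
    fhour: forecast hour (0, 3, 6, ...)
    """
    ymd = cycle.strftime("%Y%m%d")
    hh = cycle.strftime("%H")
    return (
        "https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/"
        f"gfs.{ymd}/{hh}/atmos/gfs.t{hh}z.pgrb2.0p25.f{fhour:03d}"
    )


url = gfs_nomads_url(datetime(2025, 3, 19, 0), 6)
print(url)
```

Note that NOMADS only keeps recent cycles online; older runs have to come from an archive.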

I'm excited at all the possible uses here! I'll definitely be sharing output for weather events! I can even customize it to create my own hurricane model, so very excited to try that during hurricane season.

Happy to answer any questions anyone has. Everything I've used is open source and publicly available. Let's go!

[Attachments: dewpoint animation (temp19.gif) and static temperature map (PNG)]
 
First of all, great work. This is incredible. Would you be able to load historical environment/reanalysis data? I'd be interested in seeing recreations of some of our most historic outbreaks.
 
Thanks! As long as the data still resides in raw model format (GFS or otherwise), it should be possible, yeah.
 
Great work!

What horizontal resolution is it running at? Vertical resolution? What are the specs of the machine you’re running it on?

Are you able to compute derived parameters in-model, or does it need to be done in post-processing? Fair warning (in case you're not aware): if you have to calculate those derived parameters in post-processing with the MetPy/Python stack, it is EXTREMELY slow. Some users on GitHub report that calculating fields like SRH takes a couple of hours. I'm working on a C++ library to compute many of those thermodynamic/kinematic parameters that should be significantly faster. I'll be happy to share it when I'm done if you're interested.
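On the SRH point: the calculation itself is a line integral around the storm-relative hodograph, so the standard segment-wise discretization vectorizes cleanly in NumPy for a single profile. A rough sketch (the function and argument names are made up, and a production version should interpolate the exact layer top rather than just masking):

```python
import numpy as np


def storm_relative_helicity(u, v, z, storm_u, storm_v, depth=3000.0):
    """Approximate 0-to-depth SRH (m^2/s^2) for one profile.

    u, v: wind components in m/s, 1-D arrays ascending with height
    z: height AGL in m, same shape
    storm_u, storm_v: storm motion components in m/s
    """
    mask = z <= depth          # crude layer cut; real code interpolates the top
    su = u[mask] - storm_u     # storm-relative winds
    sv = v[mask] - storm_v
    # Sum of per-segment cross products; positive for veering hodographs
    return float(np.sum(su[1:] * sv[:-1] - su[:-1] * sv[1:]))


# Toy check: southerly wind veering to westerly, stationary storm
u = np.array([0.0, 10.0])
v = np.array([10.0, 0.0])
z = np.array([0.0, 1000.0])
srh = storm_relative_helicity(u, v, z, 0.0, 0.0)
print(srh)
```

Doing this over a whole 2-D grid is then just broadcasting the same expression along the vertical axis, which is where the big speedups over per-point Python loops come from.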
 
Thanks! I have three grids set up: one at 15 km, one at 5 km, and one at 1 km. I don't remember the vertical resolution offhand; I'd have to look. I'm running this on a fairly high-end MacBook Pro (which in and of itself caused a lot of setup issues). I'm having to compute most derived parameters myself. wrf-python has pretty robust code for a lot of it, but some I still do manually in Python. CAPE takes a good while to compute for me, I know.
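For what it's worth, the expensive part of CAPE is lifting the parcel along a moist adiabat (the step wrf-python/MetPy handle). Once you have environment and parcel virtual temperature profiles, the integration itself is just a cheap positive-area sum. An illustrative sketch, with the parcel computation assumed already done:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2


def cape_from_profiles(z, tv_env, tv_parcel):
    """Integrate positive buoyancy to get CAPE in J/kg.

    z: heights in m, ascending 1-D array
    tv_env, tv_parcel: environment and lifted-parcel virtual
    temperatures in K on the same levels (parcel step assumed done).
    """
    buoy = G * (tv_parcel - tv_env) / tv_env
    buoy = np.clip(buoy, 0.0, None)  # CAPE counts positive area only
    # Trapezoidal integration over height
    layers = 0.5 * (buoy[1:] + buoy[:-1]) * np.diff(z)
    return float(np.sum(layers))


# Toy check: a constant 3 K virtual temperature excess over 1 km
z = np.array([0.0, 1000.0])
tv_env = np.array([300.0, 300.0])
tv_parcel = np.array([303.0, 303.0])
cape = cape_from_profiles(z, tv_env, tv_parcel)
print(cape)  # ~98 J/kg
```

So if CAPE is slow, the moist-adiabat lift is almost certainly the bottleneck rather than the integral itself.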
 
Running the first FULL run (with all 3 domains), which I think will take 6-10 hours total. I started it at about 9:30 this morning and it's been running for ~6 hours so far, so we'll see how long it takes. Excited to test out some of the results. I'll post here, assuming all goes well.
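For context, the three telescoping domains described earlier in the thread (15 km / 5 km / 1 km) would be declared in WRF's `namelist.input` along these lines. Every value below is illustrative, not the poster's actual configuration:

```
&domains
 max_dom             = 3,
 dx                  = 15000,  5000,  1000,
 dy                  = 15000,  5000,  1000,
 parent_id           = 1, 1, 2,
 parent_grid_ratio   = 1, 3, 5,
 i_parent_start      = 1, 30, 40,
 j_parent_start      = 1, 30, 40,
 e_we                = 150, 181, 201,
 e_sn                = 150, 181, 201,
 e_vert              = 45, 45, 45,
 feedback            = 1,
/
```

Here `feedback = 1` enables two-way nesting (the inner grids feed back to their parents), while `feedback = 0` runs the nests one-way; that choice is one of the knobs that trades accuracy against runtime.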
 
It took about 10 hours to run the full three-domain model, significantly longer than the two-domain run (which took about 3 hours). Except in rare circumstances, I'm not sure the third domain is worth the extra time. Anyway, here are some of the output results:

[Attachments: radar_dom1_032325.gif, radar_dom2_032325.gif, radar_dom3_032325.gif, DWPT_dom2_032325.gif, WND_dom2_032325.gif]
I definitely plan to use this during severe weather season and hurricane season. For hurricane season, there's an option to create a vortex-following nest similar to what you see in the dedicated hurricane models that track the cyclone, so that'll be cool! All in all, this is proving to be a really cool experience, despite the hours of frustration getting it to work on a MacBook. Excited about the possibilities, though!
 
Just an update: I've paused work on the STP, SCP, and VTP calculations for now; doing them the correct way was taking hours and hours on my computer for each frame of output, so I'll leave those to the supercomputers and high-performance clusters for the moment, haha. But I have been building a lot of cool new features with the existing data, like this hodograph feature I just got working. Super excited about a few other things I'm working on.
[Attachment: hodograph plot (test590.png)]
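A hodograph like this can be drawn with plain matplotlib (MetPy also ships a dedicated `Hodograph` helper). The sketch below uses made-up winds purely for illustration, not the thread's actual data:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
import numpy as np

# Fabricated wind profile (m/s), ordered from the surface upward
u = np.array([0.0, 5.0, 12.0, 18.0, 22.0])
v = np.array([8.0, 12.0, 10.0, 6.0, 2.0])

fig, ax = plt.subplots(figsize=(5, 5))
ax.plot(u, v, marker="o")  # trace the tips of the wind vectors with height
for r in (10, 20, 30):     # dotted speed range rings
    ax.add_patch(plt.Circle((0, 0), r, fill=False, linestyle=":"))
ax.axhline(0, lw=0.5)
ax.axvline(0, lw=0.5)
ax.set_xlim(-5, 35)
ax.set_ylim(-5, 35)
ax.set_aspect("equal")     # equal axes so curvature isn't distorted
ax.set_xlabel("u (m/s)")
ax.set_ylabel("v (m/s)")
```

The equal aspect ratio matters: hodograph curvature (and hence the visual read on helicity) is only meaningful if u and v are plotted at the same scale.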
 