I was in third grade when the Blizzard of '78 hit. We watched every weather forecast that came on in northeastern Connecticut; we lived in Moosup at the time.
The family favorites were Hilton Kaderli in Hartford and John Ghiorse in Providence. It was my first introduction, that I remember, to the world of weather forecasting. It stuck with me over the years, and I eventually picked a college that would let me study a little weather. I was still certain I'd play pro baseball, so I wasn't actually considering it as a career just yet.
The basics were what I expected in Intro Meteorology at Cornell University. Still, there was talk about computer models. Since it was an intro course, there was no in-depth discussion of the thermodynamics that went into those models or the calculus involved. Deeper into my meteorology studies, I'd learn all there was to know about the computer models of the day. Not so much that I could have programmed the computers, but as much as I needed to make forecasts and understand the underlying principles.
The Limited Fine Mesh (LFM) model was still around when I began college in 1987, but it was on its way out. It was the model that correctly predicted the Blizzard of '78 two days out, an amazing feat at the time. We'd take the LFM's forecast precipitation amounts and routinely cut them in half. That's where we would start.
Taking over was the Nested Grid Model (NGM), which was essentially an improvement on the LFM. It was a wonder in the world of weather at the time, and I was getting into forecasting just as it arrived. It had better resolution, meaning more data going into it. Every computer model has grids overlaying the globe. Calculations are done for points on the grid. Less space between points means more points and more calculations. That, theoretically, leads to better forecasts.
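The grid idea above can be sketched with a toy calculation. The spacings below are purely illustrative, not the LFM's or NGM's actual grids; the point is just how quickly the number of grid points (and therefore the amount of computation) grows when the spacing shrinks:

```python
# Toy sketch: count points on a simple global latitude/longitude grid.
# Hypothetical spacings, chosen only to illustrate the scaling.

def grid_points(spacing_deg):
    """Number of points on a lat/lon grid at the given spacing (degrees)."""
    n_lat = int(180 / spacing_deg) + 1  # latitudes from -90 to +90, inclusive
    n_lon = int(360 / spacing_deg)      # longitudes wrap around, so no extra point
    return n_lat * n_lon

coarse = grid_points(2.5)    # 10,512 points
fine = grid_points(1.25)     # 41,760 points -- halve the spacing...
print(fine / coarse)         # ...and the point count roughly quadruples
```

Halving the horizontal spacing roughly quadruples the points on the surface, and a real model multiplies that again by its vertical levels and time steps, which is why finer grids were such a computational leap.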