Wind speed depends on the time "average" used
The big question is: what do the numbers in the forecast model actually mean?
Let's say it forecasts 15 knots from the SW at 2 pm. What does that mean?
Will the wind average 15 knots over a full hour? Or will it reach 15 knots at least once during a 2-minute sample period?
From what I've gathered, wind speed averages are not all created equal. If you compare a 1-minute average against a 10-minute average of the same wind, the highest 1-minute average will be roughly 14% stronger than the 10-minute average.
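To put a number on that, here is a minimal sketch of the conversion. The 1.14 factor is an assumption taken from the "about 14%" figure above; real conversion factors vary with terrain and exposure, so treat this as illustrative only.

```python
# Assumed conversion (gust) factor: highest 1-min mean / 10-min mean.
# 1.14 matches the "~14% stronger" figure above; real values vary by site.
GUST_FACTOR_1MIN_OVER_10MIN = 1.14

def peak_1min_from_10min(mean_10min_knots: float) -> float:
    """Estimate the highest 1-minute average wind within a 10-minute period."""
    return mean_10min_knots * GUST_FACTOR_1MIN_OVER_10MIN

print(peak_1min_from_10min(15.0))  # 15 kt (10-min) -> about 17.1 kt
```

So a station reporting a 15-knot 10-minute average and one reporting a 17-knot 1-minute average may be seeing essentially the same wind.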
And whenever you look at a map full of weather stations, they use different averaging periods, which means you can't really compare the wind speed readings from one station to another.
The good news is that airports around the world use 2-minute averages for wind. But outside of airports, Europe uses 10-minute averages, the US uses 2-minute averages, and WeatherFlow stations use 1-minute averages.
Does anyone have any thoughts on the above? How big of a deal is this, really?
You are right, there are different ways of averaging the wind. In most places in the world, wind is averaged over 10 minutes, except in the US and at airports, which use the US standard of 2 minutes. A 1-minute average is only used for the Saffir-Simpson scale applied by the NHC to tropical cyclones. It is safe to say that most weather models are not calibrated to give 1-minute wind output.
For a long time I tried to find out which wind average each weather model is supposed to represent. I have never found an answer. The wind speeds reported by weather stations are averaged over 10 minutes in some parts of the world and over 2 minutes in others, and all of these data are used during the assimilation phase of each weather model. So what more can we say....
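To make the effect concrete, here is a small sketch using hypothetical 1 Hz wind samples. It computes the overall 10-minute average and the highest 1-minute moving average of the same series; the two "readings" differ even though the underlying wind is identical. The data are synthetic (random noise around 15 kt), so the exact numbers are not meaningful, only the gap between them.

```python
import random

# Hypothetical data: 10 minutes of 1 Hz wind samples, gusty around 15 kt.
random.seed(1)
samples = [15.0 + random.gauss(0, 3) for _ in range(600)]

# One station's "reading": the mean over the whole 10 minutes.
avg_10min = sum(samples) / len(samples)

# Another station's "reading": the highest 1-minute (60-sample) moving average.
peak_1min = max(
    sum(samples[i:i + 60]) / 60 for i in range(len(samples) - 59)
)

print(f"10-min average:        {avg_10min:.1f} kt")
print(f"highest 1-min average: {peak_1min:.1f} kt")
```

The highest 1-minute average is always at least as large as the 10-minute average, so a 1-minute station will systematically read higher than a 10-minute station in the same wind.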