Yeah, right.
Anyway, I'll explain it more fully for you, as you missed the point.
This year we had a remarkably well-organised polar vortex. In the weeks running up to Christmas, the models unwaveringly showed a zonal setup with a roaring jet across the Atlantic.
The models handle this very well, as they're designed to - it is, after all, the default scenario.
You would no doubt have noticed that this winter the models have shown virtually nothing when it comes to blocking - even in the ensembles, -10°C 850 hPa temperatures and easterlies have been almost entirely absent.
In such a scenario, with such high confidence in the (zonal) outcome, losing even 15% of the data wouldn't make much difference - it might mean that instead of zonality being, say, 90-95% likely, it would be 85-100% likely (i.e. a widening of the envelope, but not by enough to make an appreciable difference).
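To put toy numbers on that (a back-of-the-envelope sketch, not real model output - the "flow index", the zonal threshold, the normal distribution and the assumption that analysis error grows as 1/sqrt(data fraction) are all mine, purely for illustration):

```python
from math import erf, sqrt

def p_zonal(mu, sigma, threshold=0.0):
    """P(flow index > threshold) for a Normal(mu, sigma) forecast,
    via the standard normal CDF."""
    z = (mu - threshold) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# High-confidence zonal regime: mean flow index well clear of the threshold.
mu, sigma = 2.0, 1.0
p_full = p_zonal(mu, sigma)               # all observations assimilated
p_cut  = p_zonal(mu, sigma / sqrt(0.85))  # 15% of the data withheld

print(f"full data: P(zonal) = {p_full:.1%}")  # ~97.7%
print(f"-15% data: P(zonal) = {p_cut:.1%}")   # ~96.7%
```

With the mean that far clear of the threshold, inflating the spread barely dents the headline probability - the envelope widens a touch and that's all.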
On the other hand, the previous few years had much lower confidence in a given outcome - higher "Shannon entropy", as it was called by some. They still leaned towards zonality (the models will, after all, generally revert towards the climatic mean), but the chance of that was much lower than it was in the run-up to this Christmas. In all those years, -10s and so forth were far more common in the ensemble suites.
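If you want the entropy bit made concrete (toy probabilities, picked purely to illustrate - not taken from any actual suite):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Higher means a less decided forecast distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical outcome classes: [zonal, mixed, blocked]
this_winter  = [0.90, 0.08, 0.02]   # decided, low-entropy suite
past_winters = [0.50, 0.30, 0.20]   # undecided, high-entropy suite

print(f"this winter:  {shannon_entropy(this_winter):.2f} bits")   # ~0.54
print(f"past winters: {shannon_entropy(past_winters):.2f} bits")  # ~1.49
# The maximum possible for three outcomes is log2(3), about 1.58 bits.
```

Same three outcomes either way, but the second distribution carries nearly three times the entropy - that's the "much lower confidence" expressed as a number.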
So, when you remove data, you change a relatively uncertain scenario into a very uncertain one. One effect of this is that the jet stream is modelled even more poorly, and hence you get highs and blocking popping up where they won't actually occur in reality. As the data comes back and confidence increases, the more outlandish outcomes disappear again.
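A crude sketch of that effect (again all toy numbers: the marginal mean, the blocking threshold and the noise inflation for missing data are assumptions of mine, not anything from the actual assimilation system):

```python
import random
from math import sqrt

random.seed(42)

def blocked_fraction(mu, sigma, trials=200_000, block_threshold=-1.5):
    """Fraction of simulated ensemble members whose flow index falls
    below the blocking threshold, i.e. deep-block solutions."""
    hits = sum(1 for _ in range(trials)
               if random.gauss(mu, sigma) < block_threshold)
    return hits / trials

# Marginal, low-confidence regime: the mean only leans zonal,
# so the tail of the distribution is where the blocking lives.
mu, sigma = 0.5, 1.0
f_full = blocked_fraction(mu, sigma)               # all observations in
f_cut  = blocked_fraction(mu, sigma / sqrt(0.60))  # a heavy data outage

print(f"full data:    {f_full:.1%} of members blocked")  # ~2.3%
print(f"reduced data: {f_cut:.1%} of members blocked")   # ~6.1%
```

Nearly a tripling of blocked solutions from nothing but extra spread - in a 30-odd member suite, that's the difference between the odd rogue member and ones you actually notice, and they vanish again as soon as the spread tightens back up.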
You would also have seen this, to a limited extent, this Christmas. There was a (feeble) attempt at blocking on the GFS 6z run on Christmas Day - and again today, also on the 6z run. You'll brush it off as coincidence, of course, but it's damned funny how it keeps on happening at times of lower data input!