Depends on the situation, really. For example: how often do changes to the data trigger recalculations, and what kind of "long time" are we talking about? Option 2 is certainly a good idea in many cases, but it may also happen that the bottleneck is something that could easily be avoided with some minor reorganization, like this.
TL;DR: each case needs to be reviewed individually.
Re option 2, here's what I'm doing, for example. There's a report that takes quite some time to generate (several minutes), so I encourage generating it only on demand, but provide an indication of whether a refresh is required (based on whether the ModifiedOn.Max of the source tables is earlier or later than that of the generated table).
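To make that freshness check concrete, here's a minimal sketch in Python. The table names, the dict layout, and the `refresh_needed` helper are all illustrative assumptions; in practice the timestamps would come from the database's own metadata.

```python
from datetime import datetime

def refresh_needed(source_tables: list, generated: dict) -> bool:
    """Return True if any source table changed after the report was generated.

    Each table is represented as {"name": ..., "modified_on": datetime} —
    a stand-in for the real ModifiedOn column.
    """
    latest_source = max(t["modified_on"] for t in source_tables)
    return latest_source > generated["modified_on"]

# Hypothetical example data:
sources = [
    {"name": "orders", "modified_on": datetime(2024, 5, 1, 10, 0)},
    {"name": "customers", "modified_on": datetime(2024, 5, 2, 9, 30)},
]
report = {"name": "monthly_report", "modified_on": datetime(2024, 5, 1, 12, 0)}

# "customers" was modified after the report was generated,
# so the UI would flag the report as stale.
print(refresh_needed(sources, report))
```

The nice part of this approach is that the check itself is cheap (a metadata query), so you can run it on every page load without paying the several-minute regeneration cost.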