Hello,
I often have to work with very large logs, typically a million records or more and gigabytes in file size.
I usually merge these files and then start drilling down using filters, such as level filters and text filters (sometimes I also combine them using the boolean filter logic):

(see attached example.png)
To my surprise, when I add nested filters, the search still takes quite a while (several seconds), even though the outer filter matches only a few entries (roughly 1-100).
My guess is that the inner filters still run against the whole data set, and the result is then merged with the outer filter's results.
My expectation, though, is that the inner filters would run only against the already filtered outer results, which in theory should lead to much shorter runtimes.
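To illustrate what I mean, here is a rough Python sketch of the two strategies (this is just my mental model, not your actual implementation; all names are made up):

```python
# Hypothetical data set and filters, purely for illustration.
records = [{"level": lvl, "text": f"msg {i}"}
           for i, lvl in enumerate(["INFO", "ERROR"] * 5000)]

def outer(r):  # e.g. a level filter combined with a text filter
    return r["level"] == "ERROR" and "42" in r["text"]

def inner(r):  # a nested text filter
    return "msg 4" in r["text"]

# Suspected behavior: the inner filter scans the FULL data set again,
# and its hits are then merged with the outer filter's hits.
outer_hits = [r for r in records if outer(r)]
inner_hits = [r for r in records if inner(r)]   # second full scan
merged = [r for r in outer_hits if r in inner_hits]

# Expected behavior: the inner filter scans only the (tiny) outer
# result set, so it should be nearly instant.
nested = [r for r in outer_hits if inner(r)]
```

Both strategies produce the same entries, but the second one only has to look at the handful of rows the outer filter already selected.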
Is there something I or you can do to improve the performance of nested filters?
Currently, I am on Beta 3.1.18, but this was already an issue in older versions.
Thanks and best regards,
Daniel