An ideal literature search would retrieve every relevant article and omit every irrelevant one, using the fewest search terms possible. In practice, a strategy that is too focused will miss some relevant articles, while broadening the search to capture them all comes at the expense of retrieving irrelevant articles that must then be screened out. This is probably not a big issue if the total number of results is manageable (a few hundred?), but it becomes very laborious once there are thousands.
Are there any rules of thumb, or published research/discussion, on the optimal ratio of hits to misses to aim for in literature searches of different sizes?
For example, suppose 20 papers were published about an event and a focused search finds 15 of them with no irrelevant results. To find the other 5, it may be necessary to widen the search until it returns 40 results (1:1 hits to misses), 100 results (1:4), or 200 results (1:9). If instead there were 50 relevant papers, a search with 1:9 hits to misses would mean picking out 50 relevant papers from among 500 results, which is a much more arduous task, so it may be better to aim for 1:4 at the risk of losing a few relevant results.
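In information-retrieval terms, the hits-to-misses ratio is essentially the precision of the search, and the fraction of the 20 (or 50) relevant papers that it finds is its recall. Here is a minimal Python sketch of the arithmetic in my example, just to make the screening burden explicit (the function name and the numbers are purely illustrative):

```python
# Screening burden implied by a target hits-to-misses ratio.
# Numbers are the hypothetical ones from the example above.

def screening_burden(relevant_total, recall, precision):
    """Return (results to screen, irrelevant results) for a search that
    retrieves `recall` of the relevant papers at the given precision."""
    hits = relevant_total * recall          # relevant papers actually retrieved
    total_retrieved = hits / precision      # everything that must be screened
    return total_retrieved, total_retrieved - hits

for relevant_total in (20, 50):
    for ratio in (1, 4, 9):                 # "1:ratio" hits to misses
        precision = 1 / (1 + ratio)
        total, junk = screening_burden(relevant_total, recall=1.0,
                                       precision=precision)
        print(f"{relevant_total} relevant papers, 1:{ratio} -> "
              f"screen {total:.0f} results ({junk:.0f} irrelevant)")
```

At a fixed ratio, the screening burden grows in proportion to the number of relevant papers, which is why the 50-paper case feels so much worse than the 20-paper case.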