If you have read the "Keywords" FAQ, you already know that certain kinds of keywords can cause issues during scraping. This is especially true when scraping Google.
In the large majority of cases scraping goes perfectly fine, but sometimes Google might spot unusual activity.
The use of special search operators often does not work in bulk and can result in the whole group of keywords being banned from Google (or other services).
We have noticed two sorts of bans with Google: a global ban, where Google has learned to detect the keywords themselves, and a "leaky ban", where some results still get through.
Our system will do its best to retrieve as many results as possible.
When it faces detection, our scraping backend reacts intelligently: it skips the offending keyword(s) and reduces the scraping speed for the job. The scraping servers keep reducing speed and skipping keywords until the job is either finished or has to be aborted.
Jobs with high priority are aborted later than jobs with low priority.
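The reaction described above (skip the flagged keyword, slow everything down, abort once the slowdown passes a priority-dependent limit) can be sketched roughly as follows. All names and thresholds here are illustrative assumptions, not the actual backend implementation:

```python
def run_job(keywords, priority, fetch, detected):
    """Hypothetical sketch of the adaptive scraping loop.

    fetch(kw)    -> results for a keyword
    detected(kw) -> True if the request was flagged as unusual activity
    """
    delay = 1.0                                      # pacing between requests
    abort_at = 32.0 if priority == "high" else 8.0   # high priority holds out longer
    results, skipped = {}, []
    for kw in keywords:
        if detected(kw):
            skipped.append(kw)   # skip the offending keyword
            delay *= 2           # reduce scraping speed for the rest of the job
            if delay > abort_at:
                return results, skipped, "aborted"
            continue
        results[kw] = fetch(kw)
    return results, skipped, "finished"
```

With a high priority the delay limit is larger, so more detections are tolerated before the job is given up.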
In the case of a global ban there is nothing we can do except wait, because Google has learned how to detect your keywords.
In the case of a "leaky ban", a higher priority ensures that more results are scraped, which is often enough to finish a difficult job.
In all such cases we recommend using simpler, more natural keywords that look more like organic searches.
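As a quick pre-flight check, you could scan your keyword list for the kind of special search operators that tend to trigger detection. This is only a sketch; the operator list below is an assumption for illustration, not a catalogue of everything Google flags:

```python
import re

# Common special operators (illustrative, not exhaustive); a bare double
# quote also indicates an exact-phrase search rather than an organic query.
OPERATORS = re.compile(r'(?:^|\s)(?:site:|inurl:|intitle:|intext:|filetype:)|"')

def looks_organic(keyword):
    """True if the keyword contains none of the operators above."""
    return OPERATORS.search(keyword) is None
```

Keywords that fail this check are candidates for rewriting into a plainer form before submitting the job.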