Massive-scale data deduplication (dedup) - feapder-document #10
Comments
Nice, very slick~
Love it
I inserted the data once, then cleared the database and tried to insert it again, but it keeps reporting duplicate data. I deleted the redis dedup key, but that didn't help either.
The dedup data is only stored in redis.
That's what's strange: when writing items it always reports the data as duplicates and writes 0 rows. I switched to a different redis instance and dropped the table too; the only thing left to try is rebooting the machine.
The BloomFilter has a bug
With dedup enabled, the corresponding keys have to be deleted manually; using delete_keys="*" has no effect.
That's because the dedup store is shared by default: multiple spiders deduplicate against the same pool, in order to save space.
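The behavior described above can be illustrated with a minimal in-memory Bloom filter sketch. This is not feapder's actual implementation (feapder keeps the fingerprint bits in redis, which is why clearing the data table alone does not reset dedup); all names here are illustrative:

```python
import hashlib


class BloomFilter:
    """Minimal in-memory Bloom filter sketch.

    Illustrative only: feapder's real dedup stores these bits in a
    shared redis key, so every spider using the same pool sees the
    same fingerprints.
    """

    def __init__(self, size=1 << 16, hash_count=4):
        self.size = size                    # number of bits
        self.hash_count = hash_count        # hash functions per item
        self.bits = bytearray(size // 8)    # bit array backing store

    def _positions(self, item):
        # Derive hash_count bit positions from salted md5 digests.
        for i in range(self.hash_count):
            digest = hashlib.md5(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all((self.bits[pos // 8] >> (pos % 8)) & 1
                   for pos in self._positions(item))


# A shared dedup pool: two "spiders" checking against the same filter.
shared_pool = BloomFilter()
shared_pool.add("item-1")            # spider A records item-1

# Clearing the *data* store does nothing to this filter, so a
# re-crawl still sees item-1 as a duplicate and writes 0 rows:
print("item-1" in shared_pool)       # True  -> treated as duplicate
print("item-2" in shared_pool)       # False -> would be written
```

This also shows why the dedup keys must be removed from redis itself (not just the destination table) before re-ingesting the same data, and why deleting a shared pool's key affects every spider that uses it.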
https://boris.org.cn/feapder/#/source_code/dedup