Blocking spam crawlers with robots.txt, or by matching their UA with nginx $http_user_agent

1. Block the spam crawlers in the robots.txt file in the site's root directory:

User-agent: SemrushBot
Disallow: /
User-agent: DotBot
Disallow: /
User-agent: MJ12bot
Disallow: /
User-agent: AhrefsBot
Disallow: /
User-agent: MauiBot
Disallow: /
User-agent: MegaIndex.ru
Disallow: /
User-agent: BLEXBot
Disallow: /
User-agent: ZoominfoBot
Disallow: /
User-agent: ExtLinksBot
Disallow: /
User-agent: hubspot
Disallow: /
User-agent: leiki
Disallow: /
User-agent: webmeup
Disallow: /

# Note: the entries below block mainstream search-engine crawlers as well;
# only keep them if you do not want Google or Yahoo to index the site at all.
User-agent: Googlebot
Disallow: /
User-agent: googlebot-image
Disallow: /
User-agent: googlebot-mobile
Disallow: /
User-agent: yahoo-mmcrawler
Disallow: /
User-agent: yahoo-blogs/v3.9
Disallow: /
User-agent: Slurp
Disallow: /
User-agent: twiceler
Disallow: /

User-agent: psbot
Disallow: /
User-agent: YandexBot
Disallow: /
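
After uploading the file, it is worth confirming that the deployed robots.txt actually contains the rules. A minimal check from the command line (example.com stands in for your own domain):

# Fetch the live robots.txt and print the SemrushBot group (match plus the following line)
curl -s https://example.com/robots.txt | grep -i -A1 'User-agent: SemrushBot'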

Here is an example for a WordPress (WP) blog:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-login.php?redirect_to=*
Disallow: /go?_=*
Allow: /wp-admin/admin-ajax.php
User-agent: MJ12bot
Disallow: /
User-agent: YisouSpider
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: SemrushBot-SA
Disallow: /
User-agent: SemrushBot-BA
Disallow: /
User-agent: SemrushBot-SI
Disallow: /
User-agent: SemrushBot-SWA
Disallow: /
User-agent: SemrushBot-CT
Disallow: /
User-agent: SemrushBot-BM
Disallow: /
User-agent: SemrushBot-SEOAB
Disallow: /
User-agent: AhrefsBot
Disallow: /
User-agent: DotBot
Disallow: /
User-agent: Uptimebot
Disallow: /
User-agent: MegaIndex.ru
Disallow: /
User-agent: ZoominfoBot
Disallow: /
User-agent: Mail.Ru
Disallow: /
User-agent: BLEXBot
Disallow: /
User-agent: ExtLinksBot
Disallow: /
User-agent: aiHitBot
Disallow: /
User-agent: Researchscan
Disallow: /
User-agent: DnyzBot
Disallow: /
User-agent: spbot
Disallow: /
User-agent: YandexBot
Disallow: /
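
Keep in mind that robots.txt is purely advisory: well-behaved crawlers honor it, but many spam bots ignore it completely. You can see this yourself by requesting a disallowed path with a blocked UA (example.com is a placeholder); the server still answers, because robots.txt itself rejects nothing:

curl -s -o /dev/null -w "%{http_code}\n" -A "MJ12bot" https://example.com/wp-login.php

A 200 here is exactly why the nginx rules in section 2 below are still worth adding.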

2. Add the following to the nginx configuration file to deny access based on the User-Agent header:

# Block scraping tools such as Scrapy, curl, and HttpClient
if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
    return 403;
}
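
Note that ~* makes the match case-insensitive, so this rule also rejects ordinary curl requests, whose default UA ("curl/x.y.z") matches "Curl". When testing your own site, override the UA explicitly (example.com is a placeholder):

curl -I https://example.com/                    # default UA contains "curl" -> 403
curl -I -A "Mozilla/5.0" https://example.com/   # browser-like UA passes this rule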

# Block the specified user agents (a fairly comprehensive list)
if ($http_user_agent ~* "SiteAuditBot|PetalBot|WinHttp|WebZIP|FetchURL|node-superagent|Jullo|Apache-HttpAsyncClient|oBot|EasouSpider|BOT/0.1|FlightDeckReports|Linguee Bot|MegaIndex.ru|BLEXBot|ZoominfoBot|ExtLinksBot|hubspot|leiki|webmeup|MauiBot|DotBot|FeedDemon|Indy Library|Alexa|Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|YandexBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|Bytespider|Ezooms|Googlebot|JikeSpider|SemrushBot") {
    return 403;
}
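
The if approach works, but nginx also lets you keep the blacklist in a map, which is declared once in the http{} block and is easier to maintain. A sketch with a shortened list ($block_ua is our own variable name, not an nginx built-in):

map $http_user_agent $block_ua {
    default 0;
    ~*(SemrushBot|AhrefsBot|MJ12bot|DotBot|MegaIndex|BLEXBot) 1;
}

# then, inside the server{} block:
if ($block_ua) {
    return 403;
}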

# Block request methods other than GET, HEAD, and POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
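
To confirm the method filter works, send a request with a verb outside the whitelist (example.com is a placeholder); it should be refused:

curl -s -o /dev/null -w "%{http_code}\n" -X DELETE https://example.com/   # expect 403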


How do you find out which user agents are hitting the site most often? You can tell by analyzing the nginx access log:

tail -n 1000 /usr/local/nginx/logs/access.log | awk -F\" '{A[$(NF-1)]++}END{for(k in A)print A[k],k}' | sort -n | tail
This counts requests per user agent: splitting each log line on double quotes makes $(NF-1) the User-Agent field in the default combined log format.
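
The same idea works for spotting abusive client IPs; this variant of the one-liner counts requests per IP (field 1 in the default log format) instead of per user agent:

tail -n 1000 /usr/local/nginx/logs/access.log | awk '{A[$1]++} END {for (k in A) print A[k], k}' | sort -rn | head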

Test it (YisouSpider is only listed in robots.txt, so to see the nginx 403 use a UA from the regex above, e.g. MJ12bot):

curl -I -A "YisouSpider" www.ittce.com   # not in the nginx list; should still get through
curl -I -A "MJ12bot" www.ittce.com       # matches the block list; expect HTTP/1.1 403 Forbidden
