Blocking Crawlers on the Server

2022-09-26 17:05 By "Powerless"

【Blocking crawler access in Nginx】

if ($http_user_agent ~* "Scrapy|Baiduspider|Curl|HttpClient|Bytespider|FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|YisouSpider|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
    return 403;
}
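nginx's `~*` operator performs a case-insensitive regular-expression match against the User-Agent header, and the trailing `^$` alternative catches clients that send an empty User-Agent. The matching logic can be sketched in Python (a shortened version of the blocklist pattern, for illustration only):

```python
import re

# Shortened version of the blocklist above; `^$` matches an empty User-Agent.
BLOCKED_UA = re.compile(r"Scrapy|Baiduspider|Curl|HttpClient|Python-urllib|^$",
                        re.IGNORECASE)

def is_blocked(user_agent: str) -> bool:
    """Return True if this User-Agent would hit the 403 rule."""
    return BLOCKED_UA.search(user_agent) is not None

print(is_blocked("Mozilla/5.0 (compatible; Baiduspider/2.0)"))  # True
print(is_blocked("curl/7.79.1"))  # True: case-insensitive match on "Curl"
print(is_blocked(""))             # True: empty User-Agent
print(is_blocked("Mozilla/5.0 (Windows NT 10.0) Firefox/105.0"))  # False
```

Because `~*` is a substring search, a token like "Java" will also match any User-Agent that merely contains it, so keep the list specific.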

To redirect these clients to another page instead, simply replace `return 403` with the target address:

if ($http_user_agent ~* "Scrapy|Baiduspider|Curl|HttpClient|Bytespider|FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|YisouSpider|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
    return 301 https://yoursite.com;
}

To block requests from specific referrers:

if ($http_referer ~ "baidu\.com|google\.net|bing\.com")  {
  return 403;
}
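Note that `~` (unlike `~*`) is a case-sensitive match, and `\.` escapes the dot so it matches a literal dot rather than any character. A quick Python sketch of the referrer check (hypothetical referrer values for illustration):

```python
import re

# Case-sensitive pattern, mirroring nginx's `~` operator.
BLOCKED_REFERER = re.compile(r"baidu\.com|google\.net|bing\.com")

def referer_blocked(referer: str) -> bool:
    """Return True if this Referer header would hit the 403 rule."""
    return BLOCKED_REFERER.search(referer) is not None

print(referer_blocked("https://www.baidu.com/s?wd=test"))  # True
print(referer_blocked("https://www.example.com/"))         # False
print(referer_blocked("https://www.Bing.com/"))            # False: `~` is case-sensitive
```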

To allow only GET, HEAD, and POST requests:

# forbid access with methods other than GET, HEAD, or POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
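Here `!~` negates the match, and the `^...$` anchors require the method to be exactly one of the whitelisted names. The effect in Python:

```python
import re

# Anchored whitelist: the method must be exactly GET, HEAD, or POST.
ALLOWED_METHOD = re.compile(r"^(GET|HEAD|POST)$")

def method_allowed(method: str) -> bool:
    """Return True if the request method passes the whitelist."""
    return ALLOWED_METHOD.match(method) is not None

print(method_allowed("GET"))     # True
print(method_allowed("DELETE"))  # False
print(method_allowed("GETX"))    # False: anchors require an exact match
```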


【Blocking crawlers in Apache】

With the mod_rewrite module enabled, add the following to your .htaccess file or the relevant .conf file:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (^$|FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms) [NC]
RewriteRule . - [R=403,L]

