author    Cyril Jaquier <cyril.jaquier@fail2ban.org>  2007-03-07 21:32:33 +0000
committer Cyril Jaquier <cyril.jaquier@fail2ban.org>  2007-03-07 21:32:33 +0000
commit    64226d09c0a81fecee68e9e7602b054ebb701e39 (patch)
tree      948faf6696d41ae43f02674fe91e779d7b0364e4
parent    7f92be5f7f2bf115ec5c4984885531ad553d3632 (diff)
download  fail2ban-64226d09c0a81fecee68e9e7602b054ebb701e39.tar.gz
- Improved failregex a bit
- Added TrackBack/1.02

git-svn-id: https://fail2ban.svn.sourceforge.net/svnroot/fail2ban/trunk@558 a942ae1a-1317-0410-a47c-b1dcaea8d605
-rw-r--r--  config/filter.d/apache-badbots.conf | 11
1 file changed, 6 insertions(+), 5 deletions(-)
diff --git a/config/filter.d/apache-badbots.conf b/config/filter.d/apache-badbots.conf
index 4feb70d7..a700c863 100644
--- a/config/filter.d/apache-badbots.conf
+++ b/config/filter.d/apache-badbots.conf
@@ -5,18 +5,19 @@
#
# Author: Yaroslav Halchenko
#
-#
[Definition]
-badbotscustom = EmailCollector|WebEMailExtrac
+
+badbotscustom = EmailCollector|WebEMailExtrac|TrackBack/1\.02
badbots = atSpider/1\.0|autoemailspider|China Local Browse 2\.6|ContentSmartz|DataCha0s/2\.0|DataCha0s/2\.0|DBrowse 1\.4b|DBrowse 1\.4d|Demo Bot DOT 16b|Demo Bot Z 16b|DSurf15a 01|DSurf15a 71|DSurf15a 81|DSurf15a VA|EBrowse 1\.4b|Educate Search VxB|EmailSiphon|EmailWolf 1\.00|ESurf15a 15|ExtractorPro|Franklin Locator 1\.8|FSurf15a 01|Full Web Bot 0416B|Full Web Bot 0516B|Full Web Bot 2816B|Industry Program 1\.0\.x|ISC Systems iRc Search 2\.1|IUPUI Research Bot v 1\.9a|LARBIN-EXPERIMENTAL \(efp@gmx\.net\)|LetsCrawl\.com/1\.0 +http\://letscrawl\.com/|Lincoln State Web Browser|LWP\:\:Simple/5\.803|Mac Finder 1\.0\.xx|MFC Foundation Class Library 4\.0|Microsoft URL Control - 6\.00\.8xxx|Missauga Locate 1\.0\.0|Missigua Locator 1\.9|Missouri College Browse|Mizzu Labs 2\.2|Mo College 1\.9|Mozilla/2\.0 \(compatible; NEWT ActiveX; Win32\)|Mozilla/3\.0 \(compatible; Indy Library\)|Mozilla/4\.0 \(compatible; Advanced Email Extractor v2\.xx\)|Mozilla/4\.0 \(compatible; Iplexx Spider/1\.0 http\://www\.iplexx\.at\)|Mozilla/4\.0 \(compatible; MSIE 5\.0; Windows NT; DigExt; DTS Agent|Mozilla/4\.0 efp@gmx\.net|Mozilla/5\.0 \(Version\: xxxx Type\:xx\)|MVAClient|NASA Search 1\.0|Nsauditor/1\.x|PBrowse 1\.4b|PEval 1\.4b|Poirot|Port Huron Labs|Production Bot 0116B|Production Bot 2016B|Production Bot DOT 3016B|Program Shareware 1\.0\.2|PSurf15a 11|PSurf15a 51|PSurf15a VA|psycheclone|RSurf15a 41|RSurf15a 51|RSurf15a 81|searchbot admin@google\.com|sogou spider|sohu agent|SSurf15a 11 |TSurf15a 11|Under the Rainbow 2\.2|User-Agent\: Mozilla/4\.0 \(compatible; MSIE 6\.0; Windows NT 5\.1\)|WebVulnCrawl\.blogspot\.com/1\.0 libwww-perl/5\.803|Wells Search II|WEP Search 00
# Option: failregex
-# Notes.: Regexp to catch known spambots and software alike. Please verify that
-# it is your intent to block IPs which were driven by abovementioned bots
+# Notes.: Regexp to catch known spambots and software alike. Please verify
+# that it is your intent to block IPs which were driven by
+# abovementioned bots.
# Values: TEXT
#
-failregex = ^(?P<host>\S*) -.*"GET.*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
+failregex = ^<HOST> -.*"(GET|POST).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
# Option: ignoreregex
# Notes.: regex to ignore. If this regex matches, the line is ignored.
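
Below is a minimal Python sketch (not fail2ban's own code) of how the updated failregex behaves once the %(badbots)s and %(badbotscustom)s options are interpolated. fail2ban substitutes <HOST> with its internal host-matching group; here a plain IPv4 group stands in for illustration, the badbots list is hand-truncated for brevity, and the sample access-log line is an assumption, not part of the commit.

    import re

    # Interpolate the filter options the way fail2ban's config reader would.
    badbotscustom = r"EmailCollector|WebEMailExtrac|TrackBack/1\.02"
    badbots = r"atSpider/1\.0|autoemailspider"  # truncated list for brevity

    failregex = (
        r'^(?P<host>\d{1,3}(?:\.\d{1,3}){3}) -.*"(GET|POST).*HTTP.*"'
        r'(?:%(badbots)s|%(badbotscustom)s)"$'
    ) % {"badbots": badbots, "badbotscustom": badbotscustom}

    # Hypothetical access-log line using the newly added TrackBack/1.02
    # user agent; the old regex only accepted GET requests, so a POST like
    # this would have slipped through.
    line = ('192.0.2.1 - - [07/Mar/2007:21:32:33 +0000] '
            '"POST /trackback/ HTTP/1.1" 200 512 "-" "TrackBack/1.02"')

    match = re.search(failregex, line)
    if match:
        print("banned host:", match.group("host"))  # -> banned host: 192.0.2.1

Switching the host capture to <HOST> and accepting POST as well as GET is what lets a TrackBack/1.02 ping like the one above be caught by the filter.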