
Textpattern CMS support forum


#1 2006-04-07 15:57:21

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Blocking crawler logging

Anybody know how to keep crawler visits out of the log?
I don’t want to view just referrers.

I don’t mind tweaking the code, but I’m not sure what would do it.

Thanks!
~joe

Offline

#2 2006-04-07 16:15:52

els
Moderator
From: The Netherlands
Registered: 2004-06-06
Posts: 7,458

Re: Blocking crawler logging

In this thread you’ll find directions on how to modify log.php to do this. At the time I couldn’t get it working, but I’m not a coder. So if the thread helps you find a way to do it, I’d love to hear about it ;)

Offline

#3 2006-04-08 03:38:31

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Thanks, Els. That’s not quite what I was after anyway, but it does make it obvious what I need to do.

I’m going to code up a fix that actually prevents the logging of designated IPs and domains, allowing you to specify partial IPs and partial domains (and avoiding wasted clock cycles on domain lookups and DB writes).

It’s pretty straightforward. When I’m sure it works, I’ll post it here.

~joe

Offline

#4 2006-04-08 05:08:36

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Well, I’ve finished it and have it working, but I just spent the past 45 minutes trying to figure out how to get Textile to post the code to this forum. Putting the code between pre and code tags didn’t work. It would still interpret some of the code and — amazingly — even remove whole sections of code.

Google can find an article on posting code to the forum, but that article is gone. How do I do this???!!!!! Thanks!

Offline

#5 2006-04-08 05:11:45

KurtRaschke
Plugin Author
Registered: 2004-05-16
Posts: 275

Re: Blocking crawler logging

http://textpattern.com/faq/43/how-do-i-post-tags-and-code-on-the-forum

or use pastebin.

-Kurt


kurt@kurtraschke.com

Offline

#6 2006-04-08 05:22:55

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Okay, I have something that appears to work. It’s a hack for TxP 4.0.3. Two steps:

(1) Create a file called ignore.php and put it in textpattern/, the same directory that has config.php. Put this code in the file, customizing it as needed:

<pre><code>
<?php

// IP addresses not to log (or front-ends thereof if ends with dot).
// List an IP address if you can, as doing so eliminates DNS lookups.

$ignore_ips = array(
	'192.168.1.'
);

// Domains not to log (or tail-ends thereof if begins with dot).
// List domains in any letter case; they are not case-sensitive.

$ignore_domains = array(
	'.googlebot.com',
	'.inktomisearch.com'
);

?>
</code></pre>

Be sure to put commas between your quoted IPs or domains, as PHP requires.

(2) In textpattern/publish/log.php, add all of the lines marked /*JTL*/ or found between /*BEGIN JTL*/ and /*END JTL*/ in the following code (or just replace the existing logit() function with everything here):

Code for log.php
(Sorry, I had to use pastebin. Textile does not seem to have a working syntax-escape. Thanks Kurt for pointing me there.)
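
In case that pastebin copy ever disappears, the missing piece is just getting ignore.php included so that logit() can see $ignore_ips and $ignore_domains as globals. A rough sketch of what that might look like near the top of textpattern/publish/log.php (this is a sketch, not the exact pastebin contents, and it assumes the txpath constant points at the textpattern/ directory):

<pre><code>
// Sketch only: pull in the ignore lists so logit() sees them as globals.
// Assumes txpath points at the textpattern/ directory (where config.php lives).
if (is_readable(txpath.'/ignore.php')) {
	include_once txpath.'/ignore.php';   // defines $ignore_ips and $ignore_domains
} else {
	$ignore_ips = array();               // nothing to ignore; log everything
	$ignore_domains = array();
}
</code></pre>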

This does not help people with dynamic IPs who want to ignore their own visits. I’m not quite sure how to deal with that.

Lemme know if you have any problems.

~joe

Offline

#7 2006-04-08 06:00:25

zem
Developer Emeritus
From: Melbourne, Australia
Registered: 2004-04-08
Posts: 2,579

Re: Blocking crawler logging

Joe, grep publish.php for $nolog. You can do this in a plugin.


Alex

Offline

#8 2006-04-08 06:23:12

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Hi Alex. I must be missing something. I don’t see the hook, but that does look like a good place for a callback. ~joe

Offline

#9 2006-04-08 06:55:15

zem
Developer Emeritus
From: Melbourne, Australia
Registered: 2004-04-08
Posts: 2,579

Re: Blocking crawler logging

Plugin code loads early on, before that block. You could do your check at plugin load time, and set $nolog if it’s a hit that shouldn’t be logged.
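
Something along these lines (untested sketch; the prefixes and the $my_ignore_ips name are just examples):

<pre><code>
// Untested sketch of a plugin body. Plugins are loaded before publish.php
// reaches its $nolog check, so setting $nolog here suppresses logging for this hit.
global $nolog;

$my_ignore_ips = array('192.168.1.', '66.249.'); // example prefixes only

$ip = serverSet('REMOTE_ADDR');
foreach ($my_ignore_ips as $prefix) {
	if (strncmp($ip, $prefix, strlen($prefix)) === 0) {
		$nolog = 1; // don't log this hit
		break;
	}
}
</code></pre>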

Last edited by zem (2006-04-08 06:55:35)


Alex

Offline

#10 2006-04-08 07:15:31

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Ah, I see. Hmmm. That works well for the IP check, but if I do the DNS check there, we’d do two DNS lookups per logged visit (or one lookup and one “cache” retrieval, or two “cache” retrievals). It works, but it’s not exactly desirable.

I’d need to hijack logit() altogether, but I’m not sure I can do that at plugin load time. If I can, I’d be duplicating TxP code (logit) and trying to keep it current with the latest TxP.

I’m not sure what the right answer is.

Offline

#11 2006-04-19 00:19:49

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Is anybody using this hack?

I’ve just realized that it has a drawback: since ignored domains don’t make it into the log file, and since the log file is the domain cache, every visit by an ignored domain suffers a DNS lookup.

I think the right thing to do is to have a proper domain cache (which would be a faster lookup anyway, with fewer lookups overall). But perhaps an intermediate fix is to have ignored domains logged only once: if the domain already appears in a log entry, it is ignored; otherwise it is logged. That reduces log clutter, anyway, and lets you know that your site is being visited by the search bots.

Would be nice to come up with a solution we can put into TxP proper. Perhaps implementing a domain name cache would allow a plugin to implement this ignore feature and communicate the DNS lookup to the logger without resulting in a second lookup.
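
Just to make the shape of that concrete, here’s a rough sketch of the kind of helper I have in mind, using a made-up txp_dns_cache table (nothing like it exists in TxP today):

<pre><code>
// Sketch only: a dedicated rDNS cache instead of abusing txp_log as the cache.
// The txp_dns_cache table (ip, host columns) is hypothetical.
function cached_host($ip)
{
	$host = safe_field('host', 'txp_dns_cache', "ip = '".doSlash($ip)."'");

	if (!$host) {
		$host = @gethostbyaddr($ip);
		if (!$host) $host = $ip;
		safe_insert('txp_dns_cache',
			"ip = '".doSlash($ip)."', host = '".doSlash($host)."'");
	}

	return $host;
}
</code></pre>

With something like that in place, the ignore check and the logger would both read from the same cache, so there would never be a second lookup for the same visitor.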

~joe

Offline

#12 2006-04-19 01:46:56

jtlapp
Member
From: Austin, TX
Registered: 2006-03-19
Posts: 59
Website

Re: Blocking crawler logging

Grrr. My old pastebin code seems to have disappeared from the pastebin site. Never mind. I came up with a way to post arbitrary code through Textile. I’ll provide a separate post explaining how it’s done.

Here’s the new logit() function, which ignores all but one visit per ignored domain. That is, if an ignored domain does not yet have an entry in the log file, its first visit is logged. Subsequent visits are not logged until that previously logged visit expires from the log file. Reasoning is per my previous post. (This is a temporary hack until TxP gets a real domain name cache.)

Replace the logit() function in publish/log.php:

<pre><code>
function logit($r='')
{
	global $ignore_ips, $ignore_domains; /*JTL*/
	global $siteurl, $prefs, $pretext;

	$mydomain = str_replace('www.','',preg_quote($siteurl,"/"));
	$out['uri'] = @$pretext['request_uri'];
	$out['ref'] = clean_url(str_replace("http://","",serverSet('HTTP_REFERER')));
	$host = $ip = serverSet('REMOTE_ADDR');

	/*BEGIN JTL*/
	foreach ($ignore_ips as $ignore) {
		$max_len = strlen($ignore);
		if ($ignore[$max_len - 1] == '.') { // don't want .1 matching .100
			if (!strncmp($ignore, $ip, $max_len)) return;
		} elseif (!strcmp($ignore, $ip)) {
			return;
		}
	}
	/*END JTL*/

	if (!empty($prefs['use_dns'])) {
		// A crude rDNS cache
		if ($h = safe_field('host', 'txp_log', "ip='".doSlash($ip)."' limit 1")) {
			$host = $h;
			/*BEGIN JTL*/
			$max_len = strlen($host);
			foreach ($ignore_domains as $ignore) {
				if (strlen($ignore) <= $max_len) {
					if ($ignore[0] == '.') { // don't want bop.com matching beebop.com
						if (!strcasecmp($ignore, substr($host, $max_len - strlen($ignore)))) return;
					} elseif (!strcasecmp($ignore, $host)) {
						return;
					}
				}
			}
			/*END JTL*/
		} else {
			// Double-check the rDNS
			$host = @gethostbyaddr(serverSet('REMOTE_ADDR'));
			if ($host != $ip and @gethostbyname($host) != $ip)
				$host = $ip;
		}
	}

	$out['ip'] = $ip;
	$out['host'] = $host;
	$out['status'] = 200; // FIXME
	$out['method'] = serverSet('REQUEST_METHOD');

	if (preg_match("/^[^\.]*\.?$mydomain/i", $out['ref']))
		$out['ref'] = "";

	if ($r == 'refer') {
		if (trim($out['ref']) != "") {
			insert_logit($out);
		}
	} else {
		insert_logit($out);
	}
}
</code></pre>

Offline
