Recently, Googlebot's algorithm seems to have changed. (This is just my guess.)
It now accesses piCal's links very frequently.
As you know, piCal has a great many internal links.
If Googlebot tries to crawl all of piCal's links, it puts a high load on the server.
So I've just modified piCal to output a meta header like this:
<meta name="robots" content="index,nofollow" />
In addition, crawlers can be blocked site-wide with a robots.txt like the following (replace (xoops) with the path to your XOOPS root):
User-agent: *
Disallow: /(xoops)/cgi-bin/
Disallow: /(xoops)/tmp/
Disallow: /(xoops)/cache/
Disallow: /(xoops)/class/
Disallow: /(xoops)/images/
Disallow: /(xoops)/include/
Disallow: /(xoops)/install/
Disallow: /(xoops)/kernel/
Disallow: /(xoops)/language/
Disallow: /(xoops)/templates_c/
Disallow: /(xoops)/themes/
Disallow: /(xoops)/uploads/
Disallow: /(xoops)/modules/piCal/
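As a quick sanity check, rules like the ones above can be tested with Python's standard urllib.robotparser before deploying them. This is only a sketch: it uses a trimmed rule set and a hypothetical example.com host, with the (xoops) prefix assumed to be the site root.

```python
from urllib.robotparser import RobotFileParser

# Trimmed version of the rules above; (xoops) is assumed to be "/" here.
rules = """\
User-agent: *
Disallow: /templates_c/
Disallow: /modules/piCal/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Calendar pages are blocked for compliant crawlers (matches "User-agent: *")...
print(rp.can_fetch("Googlebot", "http://example.com/modules/piCal/index.php"))
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/index.php"))
```

Note the trade-off versus the meta tag: Disallow stops the calendar pages from being crawled at all, whereas "index,nofollow" lets each page be indexed while telling the crawler not to follow its internal links.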
The theme I've been using has the line:
<meta name="robots" content="<{$xoops_meta_robots}>" />
And I'm using the latest piCal index.php, which attempts to assign the "nofollow" directive:
$xoopsTpl->assign( "xoops_meta_robots" , $meta_robots ) ;
But for some reason the page is compiled with the robots rule set in the main XOOPS preferences. For now I'm blocking the calendar with "Disallow" in robots.txt, but I'd rather use "nofollow" so that more of the calendar gets indexed.
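This symptom is consistent with the XOOPS core assigning xoops_meta_robots after piCal's index.php does: with Smarty, the last assign() for a variable before the template compiles is the one that gets rendered. To see which value actually reached the output, you can scan the rendered page's <head> for the robots meta tag. A minimal sketch using Python's standard html.parser (the sample HTML string is hypothetical; paste in your page source):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content attribute of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.values = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.values.append(a.get("content", ""))

# Replace this sample with the <head> of the page as served by XOOPS.
html = '<head><meta name="robots" content="index,follow" /></head>'
finder = RobotsMetaFinder()
finder.feed(html)
print(finder.values)  # e.g. ['index,follow'] if the core preference won
```

If the output shows the value from the XOOPS preferences rather than piCal's "nofollow", the module's assign() is being overwritten later in the request, and the fix would be to make piCal's assignment happen after the core's.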