Hi all,
I have enjoyed reading the responses in this thread to my initial question.
jpadie - Thanks for the code you posted earlier. I tested it with a database and I agree that this is far better than using a flat file. When I ran the script, I received one error:
Warning: mysql_fetch_assoc(): supplied argument is not a valid MySQL result resource in /home/user/public_html/logging/logging5.php on line 63
Line 63 is:
Code:
$row = mysql_fetch_assoc($result);
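That warning usually means the query itself failed, so mysql_fetch_assoc() received false instead of a result resource. Perhaps testing the result before fetching would show the underlying failure - a debugging sketch, assuming the query string is held in a variable called $sql (your script may name it differently):
Code:
$result = mysql_query($sql);
if ($result === false) {
    // mysql_error() explains why the query failed, instead of the
    // vague "not a valid MySQL result resource" warning
    die('Query failed: ' . mysql_error());
}
$row = mysql_fetch_assoc($result);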
Also, the table columns "Time" and "User Agent" were empty in the page output. The "visitDate" column contains entries in the database, but the "visitAgent" column has no entries.
Reloading the script many times didn't give the message "Too many visits today". I only received that message when I changed
Code:
if ($log->numRecentVisits() > 20){
to
Code:
if ($log->numRecentVisits() >= 0){
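Since >= 0 is always true, seeing the message only then suggests that numRecentVisits() is returning 0 on every request, even though the visitDate column shows that rows are being inserted - perhaps the date comparison in its query is the culprit. To check, I could echo the count before the test, using the $log object from your posted script:
Code:
$count = $log->numRecentVisits();
// if this always prints 0, the count query is not matching the inserted rows
echo 'Recent visits counted: ' . $count;
if ($count > 20){
    echo 'Too many visits today';
}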
jpadie
But we have not really been given enough information by the OP to determine whether multi-hit limitations are the right approach to resolving whatever issue he faces.
I am building a website containing a great deal of proprietary information (it is a type of directory). I would like to give visitors access to this information, but I need to protect it from people who may try to download too much of it, or even the entire site, using WGET or some other spidering program. The actual number of pages that a visitor may access before being blocked can be decided at a later stage, and the restriction would not be applied to every page on the website - only to the "valuable" pages. Perhaps I am not taking the right approach, but I am exploring the possibilities.
psymonj
I assume you'll be looking to make allowances for search engines and the like?
I do wish to allow search engines to spider the site, but this is not the most important consideration.
I have been reconsidering my initial concept, and I feel that I may use a system that requires visitors to register and log in before they have access to the detailed information. The basic information would then still be available to the search engines.
I have been looking today at a system called SOBI for Joomla that seems as though it will suit my purposes, allowing access to the detailed information only to logged-in users.
I now need to look into incorporating a user-logging script that will prevent logged-in users from accessing too many pages on the site within a 24-hour period.
Joomla contains a small piece of code for checking that a user is logged in:
Code:
$user = &JFactory::getUser();
// gid (group id) is 0 for guests and non-zero for authenticated users
if ($user->get('gid')) echo 'logged in';
so now I shall try to incorporate a logging script at the top of the page.
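Something along these lines is what I have in mind - a rough sketch only, where VisitLogger and addVisit() are my own placeholder names standing in for the logging class from your earlier post, not part of Joomla:
Code:
$user = &JFactory::getUser();
if (!$user->get('gid')) {
    echo 'Please log in to view this page.';
    return;
}
// record this hit against the userID, then refuse the page
// once the daily limit is reached
$log = new VisitLogger($user->get('id')); // hypothetical logger class
$log->addVisit();
if ($log->numRecentVisits() > 20){
    die('Too many visits today');
}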
jpadie
... and it may well be that a simple session counter would work equally well.
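I can see how that would work - a minimal sketch of a session counter, as I understand the suggestion:
Code:
session_start();
if (!isset($_SESSION['pageViews'])) {
    $_SESSION['pageViews'] = 0;
}
$_SESSION['pageViews']++;
if ($_SESSION['pageViews'] > 20){
    die('Too many visits today');
}
My worry is that a spidering program that discards the session cookie would get a fresh counter on every request, so I still lean towards recording visits in the database.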
ingresman
Are you sure you want to do it on IP?
I would prefer to make the check based on IP address and user ID, so that many people could access the detailed information from the same IP address provided that they have logged in separately. The logging system would need to record the userID so that the count survives after the visitor has logged out and the session has ended.
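For the table itself I am picturing something like this - the table and column names here are my own guesses, not taken from jpadie's script:
Code:
// one row per page view; the userID persists after logout, unlike the session
mysql_query("CREATE TABLE IF NOT EXISTS visitLog (
    userID INT NOT NULL,
    visitIP VARCHAR(15) NOT NULL,
    visitDate DATETIME NOT NULL
)");
// count this user's hits over the last 24 hours
$result = mysql_query(sprintf(
    "SELECT COUNT(*) AS hits FROM visitLog
     WHERE userID = %d AND visitDate > NOW() - INTERVAL 1 DAY",
    (int) $userID
));
$row = mysql_fetch_assoc($result);
echo $row['hits'];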
I presume that the logging table in the database can be cleared out each day using cron?
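If so, a short script run daily from cron should do it - again using my guessed table name, with placeholder credentials:
Code:
<?php
// cleanup.php - run once a day from cron, e.g.
// 0 3 * * * /usr/bin/php /home/user/cleanup.php
mysql_connect('localhost', 'db_user', 'db_password'); // your credentials here
mysql_select_db('logging');
// remove log entries older than 24 hours
mysql_query("DELETE FROM visitLog WHERE visitDate < NOW() - INTERVAL 1 DAY");
?>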
jpadie
i've also rewritten the database based code to work on a flat file. i have a feeling that the performance would be terrible though (for anything over a few users). although, actually, i've just thought of a better way to do this... i'll post back if the OP is genuinely after a flat-file based solution...
I am using this and other projects to improve my understanding of PHP - the flat-file solution was the first thing that came to mind after the simple tracking script that I had. I'm not specifically after a flat-file solution, but I would be interested to see it if you have already prepared it.