The best solution is to move the .db file into a separate, password-protected directory. Using a spider program, an attacker could easily find the .db file, and since it does not sit in a password-protected directory, they could download it directly through a web browser or gain access to it by brute force over telnet/FTP.
Just move the .db file into a separate password-protected directory, then update the paths in the default.cfg file to match.
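If your host runs Apache (an assumption — check with your provider), a minimal .htaccess along these lines would password-protect that directory. The path to the .htpasswd file here is only a placeholder; substitute your real account path:

```apache
# .htaccess placed in the directory holding the .db file (Apache assumed)
AuthType Basic
AuthName "Restricted Area"
# Placeholder path -- point this at your actual .htpasswd file
AuthUserFile /home/youraccount/.htpasswd
Require valid-user
```

You would then create the .htpasswd file with the htpasswd utility that ships with Apache.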
Then, you should add the following entry to your robots.txt file:
Code:
User-agent: *
Disallow: /cgi-bin/

You could also add META tags in your subs in html.pl to disallow indexing via the robots META tag.
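For reference, the tag those subs would need to print looks like this — it belongs inside the <head> section of each generated page (the exact sub to edit varies between script versions):

```html
<meta name="robots" content="noindex, nofollow">
```

Note that well-behaved spiders honor this tag, but it is advisory only — the .htaccess protection is what actually blocks access.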
Regards,
Eliot Lee