Here are some basic security tips for a shared hosting server:
* It is important to ensure that your local machine is secure. Use reliable, up-to-date antivirus software such as Microsoft Security Essentials.
* Update all your web applications on a regular basis, including any components, modules and add-ons you have integrated.
* Select strong, complex passwords for your cPanel account and for your FTP, MySQL and mail users, and avoid reusing the same password across users. Also make sure that your cPanel username and password are not saved in any file within your account.
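A simple way to generate a strong, unique password for each of these users is OpenSSL's random generator, which is available on most Linux machines (a sketch; the 18-byte length is just a reasonable choice):

```shell
# Generate a random 18-byte password, base64-encoded (24 characters).
# Run it once per user (cPanel, FTP, MySQL, mail) so no password is reused.
openssl rand -base64 18
```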
* Do not use directories with permissions above 755. If one of your applications warrants such directories, place them outside your web root (public_html), or place a .htaccess file containing 'deny from all' in them to restrict public access to those files.
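You can locate directories with overly loose permissions using find. A sketch, demonstrated on a scratch directory; on a real account, point WEBROOT at your own web root (e.g. $HOME/public_html):

```shell
# Demonstration layout with one deliberately loose directory.
WEBROOT=$(mktemp -d)
mkdir -p "$WEBROOT/uploads"
chmod 777 "$WEBROOT/uploads"    # looser than 755

# -perm /022 matches directories that are group- or world-writable,
# i.e. looser than 755. Review each result before tightening it:
find "$WEBROOT" -type d -perm /022

# Once confirmed, reset a directory with: chmod 755 <directory>
```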
* Make use of secure and encrypted connections while logging into cPanel yourdomain(dot)com/cpanel.
* Tweak the local PHP settings for improved security by disabling unnecessary options and functions. Some sample recommendations:
allow_url_fopen = Off
disable_functions = proc_open, popen, disk_free_space, set_time_limit, leak, tmpfile, exec, system, shell_exec, passthru
Note that the directives mentioned above can break your code's functionality. They must be added to a php.ini file in every directory where you would like them to apply.
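On many shared hosts (suPHP/CGI setups) PHP only reads the php.ini in the executing script's own directory, so the file has to be copied into every directory containing PHP scripts. A hedged sketch, demonstrated on a scratch layout; on a real account, HARDENED_INI would be your tuned php.ini and WEBROOT would be $HOME/public_html:

```shell
# Demo layout standing in for a real account.
WEBROOT=$(mktemp -d)
mkdir -p "$WEBROOT/blog" "$WEBROOT/shop"
HARDENED_INI=$(mktemp)
printf 'allow_url_fopen = Off\n' > "$HARDENED_INI"

# Copy the hardened php.ini into every directory that lacks one.
find "$WEBROOT" -type d ! -exec test -e '{}/php.ini' \; \
  -exec cp "$HARDENED_INI" '{}/php.ini' \;
```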
* Deny Perl and other bots access to your website. This is easily done by adding the following rules to your .htaccess:
SetEnvIfNoCase User-Agent libwww-perl bad_bots
Order Deny,Allow
Deny from env=bad_bots
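With these rules in place, blocked requests show up in your access log with a 403 status. A quick way to confirm the rule is firing (the log line below is a made-up sample; real log paths and formats vary by host):

```shell
# Create a sample access log with one blocked bot request and one
# normal browser request.
cat > access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 403 199 "-" "libwww-perl/6.05"
5.6.7.8 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Count requests from libwww-perl, case-insensitively.
grep -ci "libwww-perl" access.log
```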
* If you are not using Perl scripts, you can add a bogus handler for these files. Create a .htaccess file in your home directory with the content below:
##Deny access to all CGI, Perl, Python and text files
<FilesMatch "\.(cgi|pl|py|txt)$">
Deny from all
</FilesMatch>
## If you are using a robots.txt file, remove the # sign
## from the following 3 lines to allow access to the robots.txt file only:
#<FilesMatch robots.txt>
#Allow from all
#</FilesMatch>
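FilesMatch takes an extended regular expression, so you can preview which filenames a pattern covers with grep -E before deploying it. The pattern below is a correctly anchored version (dot escaped, extension pinned to the end of the name):

```shell
# Preview which filenames the FilesMatch pattern would block.
for f in script.pl hack.cgi app.py notes.txt index.php style.css; do
  if echo "$f" | grep -Eq '\.(cgi|pl|py|txt)$'; then
    echo "blocked: $f"
  else
    echo "allowed: $f"
  fi
done
```

The anchoring matters: an unescaped, unanchored `.(cgi|pl|py|txt)` would also match names like `index.php.txt.bak` variants or any name containing those letter sequences after any character.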
* Change the SSH port number from the default 22 to another port of your choice.
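On servers where you have root access (typically a VPS rather than a purely shared account, since shared hosts rarely allow this), the port is set in /etc/ssh/sshd_config; the port number 2200 below is only an example:

```
# /etc/ssh/sshd_config
# Pick any unused port; 2200 is just an example.
Port 2200
```

After editing, restart the SSH daemon (e.g. systemctl restart sshd) and confirm you can log in on the new port before closing your current session.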
Note that the bogus handler above will prevent Perl scripts from being executed. Many exploits and backdoors are written in Perl, and with the settings above they will be blocked from running. The directive applies to all of your subdirectories as well.
Remember, once your web hosting account has been compromised, the intruder may well leave a backdoor to regain easy access at a later point in time. This is why it is not enough to simply fix the vulnerable code. Detecting the backdoor can be time-consuming and expensive, as it may require a professional developer. It is therefore important to follow the security guidelines above.