Find out Files modified within the Last x Days in Linux

Just use this simple command to recursively find the files in the current directory that have been modified within the last 6 days:

find . -type f -mtime -6


To find modified files in the current directory but to not look in certain sub-directories such as Maildir or logs:

find . -type f -mtime -6 | grep -v "/Maildir/" | grep -v "/logs/"


This comes in very handy for:

  1. Finding files that have been hacked or maliciously uploaded.
  2. Finding files you modified or updated in the last few days, for backup, recovery or simply synchronization.
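The grep exclusions above can also be handled by find itself with -not -path, which saves a couple of extra processes; the directory and file names below are made up just for the demo:

```shell
# Illustrative sandbox (names fabricated for this demo)
mkdir -p sandbox/logs sandbox/src
touch sandbox/src/app.conf sandbox/logs/error.log

# Files modified in the last 6 days, skipping anything under a logs/ directory
find sandbox -type f -mtime -6 -not -path "*/logs/*"
```

Only sandbox/src/app.conf shows up; the log file under logs/ is excluded.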


PHP Memory Exhaustion | A common cause of the WSOD (White Screen of Death)

If you are experiencing issues with your PHP-based website in Cloud Sites and are seeing the “White Screen of Death”, the first thing to check is the php_errors.log file, located in the logs directory of your website in FTP.

By default, php errors are logged to that file unless otherwise specified by your application.

A very common PHP error is the “PHP Fatal error: Allowed memory size of XXXXX bytes exhausted” error.
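A quick way to confirm this is the error you are hitting is to grep the log for that message; the sample log file below is fabricated for illustration, and on Cloud Sites you would point grep at logs/php_errors.log instead:

```shell
# Fabricated stand-in for a php_errors.log file
cat > php_errors.log <<'EOF'
[01-Jan-2024 00:00:00] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted
EOF

# Count memory-exhaustion errors in the log
grep -c "Allowed memory size" php_errors.log
```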

Fortunately this is a relatively simple issue to fix.

Rackspace Cloud’s default PHP memory is set to 32M. Some plugins or themes may require more memory than this to process.

To increase the PHP memory you will need to edit the .htaccess file within your /web/content/ directory (or create a .htaccess file if it does not already exist) and add the following line of code towards the top of the file:

php_value memory_limit ?M

Replace the ? with the amount of php memory you wish to allocate. We recommend increasing it to 64M unless more is necessary.

Keep in mind that more PHP memory will NOT make your website run faster; too much PHP memory may actually make it slower. The memory limit is the most memory your application is allowed to use while loading each webpage.

So as you can imagine, if your application requires upwards of 128M of memory just to load a single webpage, it probably won’t load very quickly. 😀

Get Server CPU Usage of A Specific User

Suppose your user name is jimgreen. You can run the following command to get its CPU usage in real time:

top -b -n 1 -u jimgreen | awk 'NR>7 { sum += $9; } END { print sum; }'

You can go even further and capture the metric in PHP for conditional actions (inside a double-quoted PHP string, $9 is left alone because variable names cannot start with a digit):

$usage = shell_exec("top -b -n 1 -u jimgreen | awk 'NR>7 { sum += $9; } END { print sum; }'");

Now $usage contains the CPU usage of the user jimgreen.
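The awk part of that pipeline simply skips top’s 7 header lines and sums column 9 (%CPU). Here it is run on two fabricated top-style rows so you can see the arithmetic:

```shell
# 7 dummy header lines, then two fake process rows
# whose 9th column (%CPU) holds 1.5 and 2.5
printf 'h\nh\nh\nh\nh\nh\nh\n1 u 20 0 0 0 0 S 1.5 0.1 0:00 a\n2 u 20 0 0 0 0 S 2.5 0.1 0:00 b\n' \
  | awk 'NR>7 { sum += $9 } END { print sum }'
# prints 4
```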

How to count the number of files in a tar archive?

So you have a tar archive file and you want to know the number of files in it without extracting it. You know the line-counting program ‘wc’? It counts the number of lines in its input:

wc -l file.txt

You know how to list the contents (files and directories) of a tar archive? Note that tar needs -f to read from an archive file rather than the default tape device:

tar --list -f example.tar

Now just combine the two to count the lines of the archive’s file listing:

tar --list -f example.tar | wc -l

And you have the number of entries in the tar file example.tar (directories included). The pipe relays the output of the previous command to the next command as input.
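Putting it together as a self-contained run (file names fabricated):

```shell
# Build a tiny archive and count its entries
touch a.txt b.txt
tar -cf example.tar a.txt b.txt
tar -tf example.tar | wc -l   # two entries
```

tar -tf is simply the short form of tar --list -f.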

How to find out the number of users on your hosting server?

How many user accounts are there on your hosting server? You can look up the number of user accounts with the following command via SSH:

wc -l /etc/passwd

And it may output something like this:

76 /etc/passwd

Which means there are a total of 76 accounts on the system. However, the actual number of human users should be lower than that, because some are system accounts created to carry out specific tasks.

Multiply that by 5 and you may get a very rough estimate of the number of websites hosted on your server.
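Regular human accounts on most modern Linux distros get UIDs of 1000 and up (500 on some older ones), so a closer human-user count can be pulled from the same file; the threshold and the sample data below are assumptions for illustration:

```shell
# Fabricated passwd-style sample
cat > passwd.sample <<'EOF'
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
alice:x:1000:1000::/home/alice:/bin/bash
bob:x:1001:1001::/home/bob:/bin/bash
EOF

# Count entries whose UID (field 3) is 1000 or higher
awk -F: '$3 >= 1000' passwd.sample | wc -l
```

Against the real /etc/passwd, just swap in that path for passwd.sample.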

Create a local static mirror of your WordPress blog by SSH command wget

The wget command should be available at most hosting companies that offer SSH access to your hosting account. It is usually used to download files from a remote server, for example (the URL is just a placeholder):

wget http://www.example.com/file.zip

However, there’s yet another hidden trick of wget that enables you to make a mirror backup of a website. Well, not quite any website; wget handles some sites better than others, and WordPress blogs are perfect candidates for it to mirror. Mirroring a WordPress blog takes a very simple switch of the wget command through SSH:

wget -mk http://www.example.com/

Replace the URL with your blog’s address. All document relationships and HTML links will be rewritten so that browsing the mirrored copy locally works without a problem.

Change and Increase the Max PHP File Uploading Limit

The default PHP configuration comes with a hard cap of 2MB on the size of an uploaded file, determined by the php.ini directive upload_max_filesize in conjunction with post_max_size; the effective maximum is the lower of the two. Therefore, to raise the upload limit, you will need to edit both directives in php.ini.

The location of php.ini varies from distribution to distribution. In this example, on Ubuntu 9.04 Jaunty, php.ini is located at /etc/php5/apache2/php.ini, so:

sudo vi /etc/php5/apache2/php.ini

Press / to search for upload_max_filesize and change it to, say, 8M:

upload_max_filesize = 8M

Press ESC, then type :wq and press Enter. Now php.ini is saved with the new upload limit. Reload apache2 to read the new configuration:

sudo /etc/init.d/apache2 reload

Now you should be able to upload files up to 8MB in size. If you need a larger limit, say 16M, then in addition to changing upload_max_filesize you must also set post_max_size to at least 16M, because file uploads are processed through the HTTP POST method.
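Since the effective cap is the lower of the two directives, it helps to print both at once; the sample file below stands in for your real php.ini:

```shell
# Stand-in for /etc/php5/apache2/php.ini
cat > sample-php.ini <<'EOF'
upload_max_filesize = 16M
post_max_size = 16M
EOF

# Show both upload-related directives side by side
grep -E '^(upload_max_filesize|post_max_size)' sample-php.ini
```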

Best Books about Apache mod_rewrite Module, .htaccess Books

mod_rewrite is universally acknowledged as one of the best modules of Apache, simple yet powerful. It’s one of the top reasons why Apache is the best web server. These are some books found at Amazon on Apache mod_rewrite and how to use it in .htaccess directives.

Web Developers / Designers’ Books:

  1. Best HTML Books
  2. Best CSS Books
  3. Best JavaScript Books
  4. Best PHP Books
  5. Best MySQL Books
  6. Best Linux Books
  7. Best Apache Books (mod_rewrite Books)
  8. Best Web Hosting Books

The Definitive Guide to Apache mod_rewrite


How to know the physical location of a website domain or tell where a site is hosted?

To trace where a website is hosted physically (the server location), you first get its IP address and then query the IP for its geographic location. Many online tools and IP databases offer such free services, such as IP to Location. It not only converts the IP to a physical location but also tries to identify the institution (company, data center, national organization, etc.) responsible for the IP address based on IP range allocation tables, with an option to map the IP geographically.

Oh wait, I almost forgot how to get the IP address of a website in the first place. After all, you can ping the domain name from the command line to get the site’s IP. Or you can use a domain IP lookup tool such as this one. It not only gives you the IP address of the particular domain but also pokes around the server and guesses what other sites may be hosted on that IP.

Actually, there’s a much simpler one to consult when you need to know the physical location of a site and where the website is hosted: Domain Tools. By querying a domain, it gives you the IP where the domain / site is currently hosted and also the organization or data center (might be a hosting company) that’s administering the IP. Very good tool to know where a site or domain is hosted.

Backup and synchronize files between 2 or more hosting servers

This kind of redundancy is encouraged to protect against the potential loss of important data, such as essential website programs and databases.

After installing rsync on all peer hosting servers you own, you can easily backup stuff and synchronize them among the servers for a safe data redundancy.

rsync -e 'ssh -p 25000' -avl --delete --stats --progress [email protected]:/home/user1 /backup

This simple command takes care of everything for you: rsync connects to the remote server as user1 and backs up or synchronizes everything in /home/user1 on the remote server to the local directory /backup. The --delete switch means that files that previously existed in /home/user1 on the remote server but no longer do will also be deleted from /backup on the local server.

‘ssh -p 25000’ tells rsync to connect via SSH on port 25000.

vi code highlighting: change the default comments color from dark blue to light blue

The default color for comments (text in /* */ or following // or #, …) in vi code highlighting is a little too dark. Ever wanted to make it more legible in an SSH console?

Find and edit /etc/vim/vimrc with vi:

vi /etc/vim/vimrc

And add in this line:

colorscheme desert

Wherein desert is one of the color schemes vim ships with. Now we need to edit the actual color scheme file and change the highlighting color (the path varies with the vim version; on Debian/Ubuntu it is typically /usr/share/vim/vim*/colors/desert.vim). Find the line:

hi Comment ctermfg=darkcyan

And change it to:

hi Comment ctermfg=blue

Save the change and exit. Run:

source /etc/vim/vimrc

And the changes will now take effect.
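If you prefer doing the swap non-interactively, a sed one-liner performs the same edit; the file below is a local stand-in, since the real path (something like /usr/share/vim/vim*/colors/desert.vim) varies by vim version:

```shell
# Local stand-in for the desert color scheme file
cat > desert.vim <<'EOF'
hi Comment ctermfg=darkcyan
EOF

# Change the comment color from darkcyan to blue in place
sed -i 's/ctermfg=darkcyan/ctermfg=blue/' desert.vim
cat desert.vim
```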

The default directory color of ls --color is also too dark; you can learn how to change the default directory color of ls --color.

Use Shell Environment Variable LS_COLORS to Change Directory Listing Colors of ls --color

After you have enabled the color switch of the ls command in your shell console, it’s nice, but some may complain that the deep blue of directories is too dark to read at times. Let’s change that.

Just open up the .profile or .bash_profile file under your home directory and put this line in it:

export LS_COLORS='di=01;34'

Done! Now the color of directory entries in ls listings is much lighter and easier to read. There’s also a tip on changing the default dark comment color in the vi text editor.

Colorful ls, SSH Console and Command Prompt

Add the following snippet in the .profile or .bash_profile under your home directory:

export PS1='[\[\e[1;31m\]\u\[\e[0m\] - \[\e[32m\]\w\[\e[0m\]]$ '
export LS_COLORS='di=01;34'
alias ls='ls --color -l'

If you are ‘supergirl’, your Linux home directory would be located at: /home/supergirl, and the file you should add the above lines to is: /home/supergirl/.profile or /home/supergirl/.bash_profile.

What is LS_COLORS doing here? It tells ls how to color each file type; the di=01;34 entry styles directory names.
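The 01;34 pair is just an ANSI escape sequence, 01 for bold and 34 for blue, and PS1 above uses the same codes for the prompt. You can preview the effect directly in any terminal:

```shell
# Print the word "directory" in bold blue, then reset the color
printf '\033[01;34mdirectory\033[0m\n'
```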

Linux SSH commands to show and monitor server resources and real-time performance: memory, swap, disk usage, CPU usage and I/O …

Below are a few general commands found in most popular Linux distros which you can use via SSH to check the status of your hosting server.

To show used and available RAM memory and swap space usage:

free -m

To show current disk storage usage by mounted device:

df -h

To show disk usage statistics of the current directory by directories and files:

du -a

To show the hard disk space a directory or a file takes up:

du filename

To show the length of time this server has been up and the server loads in the past 1 minute, 5 minutes and 15 minutes:

uptime

To display a real-time updated server resource usage including: server uptime, users logged on, load average, current tasks, CPU usage, memory usage and swap usage:

top

To display a snapshot list of currently active or sleeping processes your server is up to:

ps aux

To show some information about the current status of virtual memory, CPU usage and I/O usage:

vmstat

This is also a good tool to find out system performance bottlenecks.

To display currently logged on users on the system and what they are doing:

w

To print a full screen text graph of the server load refreshed every few seconds:

tload

If you are on shared hosting, chances are your server usage is subject to hard limits such as the largest number of files / directories allowed and a hard storage limit. View them by:

quota -s

Quick tip: 256 MB VPS helps you no more than shared hosting

256MB, being the startup plan at most VPS providers, will serve you no better than a shared hosting plan from an affordable host. 384MB may look like a 128MB bonus but in practice is only slightly better.

Due to the nature of a VPS, an entire operating system (a Linux distribution such as Ubuntu, Debian or CentOS) resides in it along with a complete web server stack: Apache, PHP, MySQL and potentially a lot of necessary modules and extensions. That makes a mere 256MB slice a frugal budget just to cover the overheads, much like a sunk cost in economics. Only past the 256MB threshold does every additional MB of RAM you purchase get consumed by your own websites rather than by the system. Well, not precisely all of the 256MB goes to the system, but after installing everything WWW and getting your slice ready for websites, you are left with perhaps 50MB – 100MB of the whole 256MB pie.

So if you are going to switch to a VPS, make sure you start with at least 512MB of memory or it won’t be worth the while; it may be just as good to spread your sites across various shared hosting plans (preferably from various unrelated hosting companies) for some SEO advantage.