Create a local static mirror of your WordPress blog with the wget command over SSH

The wget command should be available at most hosting companies that offer SSH access to your hosting account. It is usually used to download files from a remote server, for example:

wget http://www.google.com/money.zip

However, wget has yet another hidden trick that enables you to make a mirror backup of a website. Well, not actually any website, as wget feels more comfortable with certain sites, and WordPress blogs are perfect candidates. Mirroring a WordPress blog can be done with a very simple switch of the wget command through SSH:

wget -mk http://www.example.com

The -m switch mirrors the site recursively, and -k converts the links in the downloaded documents, so all document relationships and HTML links are taken care of and browsing the mirrored copy locally is no problem at all.
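If you prefer the self-documenting long options, the same command can be spelled out as in this sketch (www.example.com again standing in for your own blog):

# --mirror is shorthand for -r -N -l inf --no-remove-listing
# --convert-links rewrites links in the saved pages for local browsing
# --page-requisites additionally grabs the images, CSS and JS the pages need
wget --mirror --convert-links --page-requisites http://www.example.com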

Quick News: RackSpace Cloud finally launched!


Visit RackspaceCloud.com »

Mosso has been the pioneer of cloud hosting. One and a half years after the acquisition of Mosso by RackSpace, RackSpace has finally come out with its own branded cloud hosting based on what Mosso achieved: RackSpace Cloud.

You can find a slightly more detailed report here. Should you want to know more, here are some Mosso and RackSpace cloud reviews and a Cloud Sites coupon code: (Code Hidden), which you can use to sign up for the Cloud Sites plan at $25 off.

Change and Increase the Maximum PHP File Upload Limit

The default PHP configuration comes with a hard cap of 2MB on the size of uploaded files, determined by the php.ini directive upload_max_filesize in conjunction with post_max_size; the effective maximum upload size is the lower of the two. Therefore, to raise the upload limit, you will need to edit both directives in php.ini.

The location of php.ini varies from distribution to distribution. In this example, on Ubuntu 9.04 Jaunty, php.ini is located at /etc/php5/apache2/php.ini, so:

sudo vi /etc/php5/apache2/php.ini

Press / to search for upload_max_filesize and change it to, say, 8M:

upload_max_filesize = 8M

Press Esc, then type :wq and press Enter. Now php.ini is saved with the new upload limit. Reload Apache to read the new configuration:

sudo /etc/init.d/apache2 reload

Now you should be able to upload any file up to 8MB in size. In case you need a larger limit, say 16M, then in addition to changing upload_max_filesize you must also set post_max_size to at least 16M, because file uploads are processed through the HTTP POST method.
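For instance, a minimal sketch of the relevant part of php.ini for a 16MB limit would look like this (remember to reload Apache afterwards):

; the effective upload cap is the lower of these two directives
upload_max_filesize = 16M
post_max_size = 16M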

Best Linux Server Administration Books for Learning Linux

In the web hosting industry, Linux is undoubtedly the No. 1 server OS, the most used and talked about. Across the many differently branded distributions such as Ubuntu, Debian and CentOS, the basics have always been the same: the file system, the open source packages, the commands, shell programming and so forth. It's a hard nut to crack, but with a few good guides and books, you will soon be on your way to becoming a professional server administrator.

Web Developers / Designers’ Books:

  1. Best HTML Books
  2. Best CSS Books
  3. Best JavaScript Books
  4. Best PHP Books
  5. Best MySQL Books
  6. Best Linux Books
  7. Best Apache Books (mod_rewrite Books)
  8. Best Web Hosting Books

A Practical Guide to Linux(R) Commands, Editors, and Shell Programming

Linux Pocket Guide

Linux in a Nutshell, 5th Edition

Linux Administration: A Beginner’s Guide, Fifth Edition

Practical Guide to Ubuntu Linux (Versions 8.10 and 8.04), A (2nd Edition)

How Linux Works

Linux Administration Handbook (2nd Edition)

Ubuntu Linux Toolbox: 1000+ Commands for Ubuntu and Debian Power Users

Best Books about Apache mod_rewrite Module, .htaccess Books

mod_rewrite is universally acknowledged as one of the best Apache modules, simple yet powerful; it's one of the top reasons why Apache is the best web server. These are some books found on Amazon about Apache mod_rewrite and how to use it in .htaccess directives.
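For a quick taste of what these books cover, here's a minimal .htaccess sketch of my own (not taken from any of the books): the classic front-controller rule that WordPress itself uses for pretty permalinks.

RewriteEngine On
# send any request that doesn't match an existing file or directory
# to index.php, which then decides what page to render
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]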


The Definitive Guide to Apache mod_rewrite

Best Web Hosting Books to Learn about Web Hosting

Web hosting and similar IT infrastructure outsourcing can be intimidating to anyone who isn't tech-savvy enough. These are some of the best readings recommended by readers on Amazon about web hosting and the related decision-making process. Some books teach you how to use web hosting and some tell you how to run a web hosting business.


Strategies for Web Hosting and Managed Services

Web Host Manager Administration Guide: Run your web host with the popular WebHost Manager software

Web Hosting: A complete strategy for delivering high quality Web hosting services

How to Host your own Web Server

The Complete Web Hosting Kit Professional

The Web Hosting Manager

Understanding Linux Web Hosting

Actually, there are not many books dealing directly with the topic of website hosting, but there are a lot on its cousin areas such as Linux, Windows, Apache, PHP and MySQL.

How to find the physical location of a website domain, or tell where a site is hosted?

To trace where a website is physically hosted (the server location), you first get its IP address and then query the IP for its geographic location. Many online tools and IP databases offer such free services, such as IP to Location. It not only converts the IP to a physical location but also tries to identify the institution (company, data center, national organization, etc.) responsible for the IP address, based on an IP range distribution table, with an option to map the IP geographically.

Oh wait, I almost forgot how to get the IP address of a website in the first place. After all, you can ping the domain name from the command line to get the IP address of the site. Or, you can use a domain IP lookup tool such as this one, which not only gives you the IP address of the particular domain URL but also pokes around the server and guesses what other sites may be hosted on that IP.
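From a shell, for example, either of these will do (example.com standing in for the domain you're curious about):

# a single ping reveals the IP the domain resolves to
ping -c 1 example.com
# or, query DNS directly
dig +short example.com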

Actually, there's a much simpler tool to consult when you need to know the physical location of a site and where it is hosted: Domain Tools. Querying a domain gives you the IP where the domain / site is currently hosted and also the organization or data center (possibly a hosting company) administering that IP. A very good tool for finding out where a site or domain is hosted.

A Review of a RackSpace Review

Disclaimer: I’ve never been with RackSpace hosting (except their cloud offering, Rackspace Cloud) and everything here is taken as-is from this review at WHT. I’m just a little surprised that a host like RackSpace could ever make such a mistake as to introduce 8 hours of downtime to its clients (that is, of course, if what the reviewer said was true and accurate).

The original RackSpace review can be found here. I tried to reply to the post but it is too old.

But pardon me, 8 hours of downtime? Whatever the reason and no matter how many times the CEO apologizes for it (it could very well be an employee assigned to apologize on behalf of the CEO, or an automated mass email), it’s simply unacceptable. I don’t know about the other sites you may have with them, but officedebo.com doesn’t seem to be a mission-critical one with tons of traffic that’s probably making a dollar every second. If it were, I doubt you would remain calm about the 8 hours of downtime.

8 hours of downtime is barely acceptable even for ordinary hosting companies, let alone RackSpace; it’s absolutely way too MUCH for a host like them. Support comes a distant second to uptime, no matter how good it is.

Imagine a host that could miraculously guarantee 120% uptime for all of your sites but offered no support at all. Would you pay as much as RackSpace charges for it? I think the answer is pretty obvious.

Again, I’m just discussing the issue as reported by the original reviewer. 8 hours of downtime is indeed unacceptable; however, personally, this incident hardly sways my belief in RackSpace. They are still one of the top hosting providers out there that’s worth your trust. View more reviews of Rackspace.

Also, for Rackspace Cloud, their newly launched cloud hosting service, here are some customer reviews.

What is Cloud Hosting & Cloud Computing? Cloud Web Hosting Defined

Cloud computing has been the latest buzzword since 2008, promoted by large information technology companies such as Google, Microsoft and IBM. It’s very similar in nature to grid computing (an arrangement of individual computers virtually serving as a single one, providing combined computing resources), though more focused on the manner of consumption than on the infrastructure itself.

The biggest cloud is the Internet itself. Cloud computing is a form of computing that supplies resources as a service to consumers (enterprises or individuals) and is paid for in the manner of a utility such as electricity. This way, the traditional sunk cost of building a huge self-supporting IT infrastructure can be avoided: users pay only for what they use as they go, and can exit freely. The actual physical infrastructure is created and managed by experienced hands, who run and support it on behalf of the computing buyers.

It’s much like on-demand allocation of computing resources such as CPU time, network capacity and digital storage, with flexibility in scaling. The whole system frees users from having to learn and manage all this stuff.

Cloud hosting, or cloud web hosting, is similarly backed by a large cloud of computing devices or a cluster of servers; it supplies hosting resources such as data storage, network bandwidth and CPU usage, and bills the customer like a utility. With a cloud host, we only pay for what we use and never have to worry about how this cloud stuff is accomplished backstage. We don’t have to worry about technical aspects such as load balancing and security, nor hardware performance (CPU, RAM, hard drive, etc.), nor chores such as device repair or replacement. We just consume and enjoy the results – the hosting resources – and use them to feed our websites and applications.

In simple words, any website hosted on a hosting cloud or cloud-based hosting has the potential to spread its computing needs across the entire cloud, which introduces the following benefits:

  1. Instantly available resources from the pool (not restricted by any single physical server at all) to be allocated to where it’s most needed.
  2. High scalability enables easy and fast scaling in resource consumption, either manually or automatically.
  3. Sudden website usage surge will not be a problem at all because the burden is evenly spread across the entire network.
  4. High availability and greatly reduced failure risk, because the cloud works as a whole and no single piece of hardware in it is indispensable. The failure of any one of them will not affect any of the hosted websites at all.

One of the most prominent players in the current cloud hosting market is Rackspace Cloud.

Unmanaged Hosting Server Installation & Initial Configuration for Dummies

Unmanaged hosting often comes at a very competitive price compared to a managed hosting environment; the drawback, however, is the steep learning curve for server noobs, a group that unfortunately includes me. It’s meant for the technically proficient, after all.

This simple tutorial will walk you through the steps needed to set up a working web server, ready to serve websites, from a bare-bones Linux distro (Ubuntu, in this case). It serves as a survival guide to unmanaged hosting for novice Linux server administrators, while at the same time documenting my own findings and tips.

We will take Ubuntu 9.04 (Jaunty) as the example. Though everything here is done on a Mosso 256 MB Cloud Server, 99% of it should work without a problem at other unmanaged hosting providers as long as the Linux distribution is identical. After this tutorial, you will have a working VPS or dedicated hosting server with up-to-date software and basic security, in addition to the necessary packages to run and manage a LAMP web server: FTP, Apache 2, PHP 5.2.6 and MySQL.

Here we go. Suppose you have decided to go with one of the Linux distros (I’d suggest Ubuntu) and installed a plain version of it from the hosting control panel. Now:

  1. You will be given a bare IP address, of course, when you have finished installing the Linux distro from the control panel provided by your hosting company. Download PuTTY and set it up to connect to your hosting server at that IP as root via SSH. The root password should also have been revealed or emailed to you.
  2. Change the root password to a new one.
  3. Change the default SSH listening port of 22 to a custom one (see the sketch after this list).
  4. Build up the necessary iptables firewall rules (also sketched after this list).
  5. Customize the shell environment and make the prompt and ls command listings a little more colorful, so they’re more readable.
  6. Enable vi code highlighting (enabled by default in Ubuntu) and change the dark blue color for comments to a lighter blue.
  7. Update the software source lists in Ubuntu:
    aptitude update
    And set the proper locales:
    locale-gen en_US.UTF-8
  8. Upgrade the current distro to its latest:
    aptitude safe-upgrade
    Followed by:
    aptitude full-upgrade
  9. Install the essential tools and packages for development, the build tools:
    aptitude install build-essential
  10. Install MySQL:
    aptitude install mysql-server mysql-client
    The installation will prompt you twice for a MySQL root password.
  11. Install Apache:
    aptitude install apache2 apache2.2-common apache2-mpm-prefork apache2-utils libexpat1 ssl-cert
    Change the server name in the Apache configuration:
    vi /etc/apache2/apache2.conf
    And add the following directive at the end of the file:
    ServerName king
    Change ‘king’ to whatever you’ll name your own server. For beginner SSH users, nano would be a better choice for its intuitive editing. Then restart the Apache web server gracefully, without interrupting connected clients:
    apache2ctl graceful
    You should be able to view the demo web page at http://(your server IP address).
  12. Install PHP5:
    aptitude install libapache2-mod-php5 php5 php5-common php5-curl php5-dev php5-gd php5-imagick php5-mcrypt php5-memcache php5-mhash php5-mysql php5-pspell php5-snmp php5-sqlite php5-xmlrpc php5-xsl
  13. Turn off the server signature:
    nano /etc/apache2/conf.d/security
    And change:
    ServerTokens Full
    To:
    ServerTokens Prod
  14. Change the hostname of your server. Open and edit /etc/hostname:
    vi /etc/hostname
    Set it to a top level domain you have registered for your website. And add it in /etc/hosts:
    vi /etc/hosts
    In this way:
    127.0.0.1 example.com
  15. Reboot:
    shutdown -r now
  16. Install the mail module for your hosting server, Postfix, so that the PHP mail() function works:
    aptitude install postfix telnet mailx
    Just choose ‘Internet Site’.
  17. Install an FTP daemon so you can FTP files to your server as any user.
  18. Install rsync so that you can easily synchronize and back up files between 2 remote hosting servers:
    aptitude install rsync
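As promised in steps 3 and 4, here’s a minimal sketch of changing the SSH port and building basic firewall rules. I’m assuming the custom port 25000 that the rsync example later in this guide also uses; pick your own:

# step 3: change the SSH listening port in /etc/ssh/sshd_config
# (replaces the default "Port 22" line), then restart the SSH daemon;
# keep your current session open and test the new port before logging out
sed -i 's/^Port 22$/Port 25000/' /etc/ssh/sshd_config
/etc/init.d/ssh restart

# step 4: basic iptables rules: allow loopback, established connections,
# the new SSH port and web traffic, then drop all other inbound packets
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp --dport 25000 -j ACCEPT
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
iptables -P INPUT DROP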

Backup and synchronize files between 2 or more hosting servers

This kind of redundancy is encouraged to protect against the potential loss of important data, such as essential website programs and databases.

After installing rsync on all the peer hosting servers you own, you can easily back up files and synchronize them among the servers for safe data redundancy.

rsync -e 'ssh -p 25000' -avl --delete --stats --progress user1@123.45.67.890:/home/user1 /backup

This simple command takes care of everything for you: rsync connects to the remote server 123.45.67.890 as user1 and backs up or synchronizes everything in /home/user1 on the remote server to the local directory /backup. The --delete switch means files that previously existed in /home/user1 on the remote server but no longer do will also be deleted from /backup on the local server.

‘ssh -p 25000’ tells rsync to connect via SSH on port 25000 (the custom port set up earlier).
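To make the backup automatic, you could run the same command from cron. A hypothetical crontab entry (added via crontab -e) that syncs nightly at 3:00 AM, assuming passwordless key-based SSH authentication is set up so no password prompt blocks the job:

# minute hour day-of-month month day-of-week command
0 3 * * * rsync -e 'ssh -p 25000' -avl --delete user1@123.45.67.890:/home/user1 /backup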

Installing FTP (vsFTPd) Service on Ubuntu Server

FTP is an indispensable feature of servers that host and serve websites, as it lets us easily upload files to the remote server. On an Ubuntu server, with a little help from the package manager, you can install one of the simplest yet most common FTP daemons for your server: vsFTPd.

apt-get install vsftpd

It is started automatically after a successful installation. Stop it:

/etc/init.d/vsftpd stop

So that you can customize the configuration file:

vi /etc/vsftpd.conf

And make it look like:

# enable passive mode and restrict it to a small port range
# (these ports must also be open in your firewall)
pasv_enable=YES
pasv_max_port=8010
pasv_min_port=8001

# no anonymous logins; local system users may log in and write,
# with new files created as 644 and directories as 755
anonymous_enable=NO
local_enable=YES
write_enable=YES
local_umask=022

# allow idle sessions for up to an hour
idle_session_timeout=3600

# jail local users inside their home directories
chroot_local_user=YES

pam_service_name=ftp

Restart the FTP service:

/etc/init.d/vsftpd start

Now you can try connecting to the FTP and transferring some stuff.
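One caveat worth noting: with the firewall rules sketched earlier in this guide, clients won’t be able to connect until FTP’s control port 21 and the passive port range configured above (8001-8010) are allowed through:

iptables -A INPUT -p tcp --dport 21 -j ACCEPT
iptables -A INPUT -p tcp --dport 8001:8010 -j ACCEPT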

vi code highlighting: change the default comment color from dark blue to light blue

The default color for comments (text inside /* */ or following // or #, etc.) in vi code highlighting is a little too dark. Ever wanted to make it more recognizable in an SSH console?

Find and edit /etc/vim/vimrc with vi:

vi /etc/vim/vimrc

And add in this line:

colorscheme desert

Wherein desert is one of the available color schemes vim comes with. Now we need to edit the actual color scheme file and change the highlighting colors:

vi /usr/share/vim/vimcurrent/colors/desert.vim

Change:

hi Comment ctermfg=darkcyan

To:

hi Comment ctermfg=blue

Save the change and exit. The new colors take effect the next time you start vim; alternatively, run :source /etc/vim/vimrc from within an open vim session to apply them immediately.

The default directory color of ls --color is also too dark; you can learn how to change the default directory color of ls --color in the next tip.

Use Shell Environment Variable LS_COLORS to Change Directory Listing Colors of ls --color

After you have enabled the color switch of the ls command in a shell console, it’s nice, but some may complain that the deep blue color of directories is too dark to recognize sometimes. Let’s change that.

Just open up the .profile or .bash_profile file under your home directory and put this line in it:

export LS_COLORS='di=01;34'

Done! Now the color of ls directory listings is much lighter and easier to recognize. There’s also a tip on how to change the default dark color for comments in the vi text editor.
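LS_COLORS can hold more than one entry, separated by colons. A hypothetical extension that also colors symbolic links cyan and executables green:

# di = directories, ln = symlinks, ex = executables; 01 = bold
export LS_COLORS='di=01;34:ln=01;36:ex=01;32'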

Colorful ls, SSH Console and Command Prompt

Add the following snippet in the .profile or .bash_profile under your home directory:

# prompt: bold red username, then the working directory in green
export PS1='[\[\e[1;31m\]\u\[\e[0m\] - \[\e[32m\]\w\[\e[0m\]]$ '
# bold blue directories in listings
export LS_COLORS='di=01;34'
# always colorize ls and use the long listing format
alias ls='ls --color -l'

If you are ‘supergirl’, your Linux home directory would be located at /home/supergirl, and the file you should add the above lines to is /home/supergirl/.profile or /home/supergirl/.bash_profile.

What is LS_COLORS doing here? It’s the environment variable GNU ls consults for its color scheme: di=01;34 tells ls to render directory names in bold (01) blue (34).