www.gmail.com – Tips

by Yang Yang on February 20, 2011

You can log in to Gmail here: http://gmail.google.com/

I have been a Gmail user for almost 6 years, and I have some tips on using Gmail.com to share with you that may make your life easier.

You can sign up for a Gmail account here: http://www.gmail.com. It’s completely free.

Enable keyboard shortcuts

When you are logged in to your Gmail account, press Shift + / (the Shift key and the / key, no plus key) and a very nice semi-transparent dialog will open showing the full list of keyboard shortcuts available to you in Gmail.

Gmail keyboard shortcuts

All the shortcuts are disabled by default. Just click the “Enable” link to turn them all on. You’ll then have all the nice productivity boosters such as:

  1. “c” for quick compose
  2. “p” for previous message in a conversation, and “n” for next message
  3. “*” then “u” to select all unread conversations; “Shift + i” to mark them as read
  4. “Tab” then “Enter” to immediately send the message
  5. “#” to move to trash
  6. “r” to reply
  7. “f” to forward
  8. etc.

How cool is that!

Make messages from a certain sender skip the inbox

If you use VPS web hosting, you may receive frequent emails about the status (security, load, etc.) of your VPS box. If these messages pile up in the inbox, it’s a nightmare: there are just too many of them, finding a particular message becomes much harder, and you are more likely to miss a legitimate or personal message.

So naturally, we want those messages received and dropped into another, custom inbox rather than the main one. For example, say all these messages come from root@host.example.com:

  1. Click the settings cog in the top right corner –>
  2. Mail settings –>
  3. Filters –>
  4. Create a new filter –>
  5. From: root@host.example.com –>
  6. Next Step
  7. Check “Skip the Inbox (Archive it)”, check “Apply the label: New label…” and create a new label (such as “Alerts”), which will serve as a custom inbox storing all messages coming from root@host.example.com –>
  8. Create Filter (done)

Now all messages coming from root@host.example.com will be automatically stored in the custom inbox “Alerts” rather than the main inbox.

There’s a lot more you can do with filters, which can automatically process or take care of messages matching certain criteria.

Use search operators to quickly find messages – by sender, recipient, subject, attachment, label, etc.

Just like with the Google.com web search engine, you can use search operators with Gmail.com as well. Some of my most frequently used operators in Gmail are:

  • from:jane@hotmail.com —- Find all messages / communications from jane@hotmail.com
  • to:michael@me.com —- Find all messages you sent to michael@me.com
  • has:attachment subject:samples —- Find all messages that have an attachment and ‘samples’ in the subject
  • filename:office.jpg label:photos is:starred —- Find all starred messages with label ‘photos’ that have the file office.jpg attached

Get to know more operators and examples here: http://mail.google.com/support/bin/answer.py?answer=7190&hl=en. They will prove very handy.

Alias email addresses – you have unlimited sub-emails!

If your email address is ada.monroe@gmail.com, messages delivered to addresses such as:

  • ada.monroe+shopping@gmail.com or ada.monroe+news@gmail.com (anything after a “+” is ignored)
  • adamonroe@gmail.com or a.d.a.monroe@gmail.com (dots are ignored)

are all directed to ada.monroe@gmail.com. With one Gmail account, you have a literally unlimited number of sub-emails.

Combined with filters, this becomes even more powerful, because you can set up filters to automatically move, star, forward or apply labels to messages sent to a particular alias address.

Pre-saved messages / responses to be re-used by a click

If you find yourself typing the same (or almost the same) message over and over again, you may want to save it as a canned response. Canned Responses is an experimental module you can turn on for your Gmail account in the Labs (Mail settings -> Labs).

Combined with filters, this gives you a highly automated email system: messages matching certain criteria (such as coming from a specific sender or containing a particular phrase) are automatically replied to with one of your pre-saved responses.

Accessing multiple email accounts (such as from hotmail.com and yahoo.com) within your Gmail account

Other than Gmail itself, you can also set up 3rd-party email services so that you can access them all from within Gmail. This definitely makes things simpler and saves time if you have quite a few inboxes from different providers to check every day. All you need to do is enter the POP3 server details and give Gmail permission to download the messages. Follow this guide to set it up; it’s quite easy: http://mail.google.com/support/bin/answer.py?hl=en&answer=21288

Shift-click to instantly select multiple messages

It’s just like what you would do in Windows. If there is a long run of messages you want selected, just click the first one and then hold the Shift key while clicking the last. So simple and so intuitive.


MySQL: How to export a database / table to XML?

by Yang Yang on February 17, 2011

You can export any MySQL database or table into an XML file using the export capabilities of phpMyAdmin, the web interface for MySQL. The problem with this approach, however, is that with large tables you may have to export the table in sections, producing several sequential XML files.

A better approach is the native MySQL command line, which creates and stores all entries of the table in a single XML file:

mysql -u db_user -p db_name --xml -e "SELECT * FROM table_name" > table_name.xml

Here, db_user is your MySQL user, db_name is the name of the database and table_name is the name of the table you would like exported to XML. The resulting XML will be stored in table_name.xml.

Note that this is different from mysqldump in that it basically executes a query via the -e switch and outputs the results as XML (--xml). The result is then redirected to the output file rather than displayed on the terminal screen.


To delete all content in any directory, including all sub-directories and files, I’ve been using this:

rm -rf somedir/*

If it is to delete all content of the current directory:

rm -rf *

However, it turns out ‘rm -rf *’ doesn’t remove hidden files such as .htaccess (files whose names start with a dot are hidden in Linux), because the shell glob * doesn’t match them. To delete all the hidden files as well, I have to run a 2nd command:

rm -rf .??*
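The .??* pattern deliberately avoids matching . and .., but note that it also misses two-character hidden names such as .a. A single command that removes everything inside a directory, dotfiles included, is find with -delete; here is a sketch, assuming a find that supports the -mindepth and -delete options (GNU and BSD find both do):

```shell
# Set up a scratch directory with a visible file, a hidden file and a subdirectory
dir=$(mktemp -d)
touch "$dir/readme.txt" "$dir/.htaccess"
mkdir "$dir/sub" && touch "$dir/sub/notes"

# Delete everything inside $dir (but not $dir itself), hidden files included;
# -delete implies depth-first order, so subdirectory contents go first
find "$dir" -mindepth 1 -delete

ls -A "$dir"   # prints nothing: the directory is now empty
rmdir "$dir"
```

Unlike the two-glob approach, this needs no second command and won’t quietly skip odd names.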


Linux: How to ‘find’ and search ONLY text files?

by Yang Yang on January 22, 2011

The ‘find’ command in Linux searches through a directory and returns files that satisfy certain criteria. For instance, to find files that contain the string ‘needle text’ in the ‘mydocs’ directory:

find mydocs -type f -exec grep -l "needle text" {} \;

The problem with this approach is that it searches through ALL files in the directory, including binary ones such as images, executables and zip packages. Sensibly, we only want to search text files for the string. If there are too many binary files present, it’s a significant waste of CPU time to get what you want, because going through the binary files is totally unnecessary.

To achieve this, use this version of the above command:

find mydocs -type f -exec grep -l "needle text" {} \; -exec file {} \; | grep text | cut -d ':' -f1

I asked the question at stackoverflow.com and peoro came up with this solution. It works great.

Basically, the appended part (-exec file {} \; | grep text | cut -d ':' -f1) runs the ‘file’ command on each file and keeps only files whose type description contains ‘text’. According to the Linux ‘file’ command manual, we can be fairly sure that files with ‘text’ in their type description are text files AND that all text files have ‘text’ in their type description.

Thus far, the best way to do this:

find -type f -exec grep -Il . {} \;

Or for a particular needle text:

find -type f -exec grep -Il "needle text" {} \;

The -I option tells grep to immediately ignore binary files, and in the first form the pattern ‘.’ (which matches any character) together with -l makes grep report each text file as soon as any non-empty line matches, so it goes very fast.
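To see the effect, you can try the command on a scratch directory holding one text file and one binary file (a small sketch; grep’s -I detection treats a file containing NUL bytes near its start as binary):

```shell
dir=$(mktemp -d)
printf 'needle text here\n' > "$dir/notes.txt"

# Build a file with NUL bytes so grep treats it as binary
{ printf 'needle text '; head -c 8 /dev/zero; } > "$dir/blob.bin"

# Only the text file is reported; blob.bin is skipped by -I
# even though it also contains the search string
find "$dir" -type f -exec grep -Il "needle text" {} \;

rm -r "$dir"
```

Note that the binary file contains the needle too, yet -I suppresses it, which is exactly the behavior we want here.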


One of my old Internet friends, Brad, made a very nice online slide deck introducing some of the exciting new features of PHP 5.3. I’m most interested in namespaces, which make coding in a large project and reusing code much easier, especially for teams that find keeping naming conventions a challenge.

Here’s the original post and slide: http://bradley-holt.com/2010/11/new-features-in-php-53/

All the new stuff and concepts are presented in a practical, extremely straightforward manner. Programmers should find them a breeze to digest.


I was doing something with a regular expression and, very oddly, the connection kept being reset every time I refreshed the web page.

I tried to narrow down the problematic line by removing the code in functional chunks. It finally came down to a preg_match() call whose regular expression contained a small, accidental typo: a second plus sign doubled after a quantifier.

I got rid of the second plus sign, and everything was all right again.


MySQL: Export Table to CSV Text Files for Excel

by Yang Yang on November 18, 2010

MySQL tables can be exported to an SQL dump file, which is basically a text file of SQL queries that can be used to import the table back into a database. To export MySQL tables into other formats such as CSV, phpMyAdmin proves very handy, provided you have changed the execution timeout to zero seconds (so it never times out); otherwise it won’t work with large tables. However, there’s another way to do this with a native SQL query, and it works to the end of even a very large table:

SELECT * FROM mytable INTO OUTFILE "c:/mytable.csv"
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'

The INTO OUTFILE clause in this MySQL query stores all rows selected from the table mytable in the text file c:/mytable.csv, one row per line with fields separated by commas and enclosed in double quotes. (Without the FIELDS TERMINATED BY clause, MySQL defaults to tab-separated fields, which is not true CSV.)
Now you can open and use the CSV file in another application such as Excel and manipulate it any way you like.

If you need another format, just change the query accordingly.

Add column / field headers at the beginning of the CSV file

To add column names at the beginning of the CSV file so each field is identified, use this SQL snippet:

SELECT 'id', 'name', 'age' UNION
SELECT * FROM mytable INTO OUTFILE "c:/mytable.csv"

That is, prepending “SELECT 'id', 'name', 'age' UNION” to the query. The first line of the file will then contain the column names.
The awkward thing is that you have to add the field names manually, one by one. For now I know of no other way to put the column names before the records when dumping CSV files from MySQL, except with phpMyAdmin.


A few SEO tips

by Yang Yang on November 10, 2010

Haven’t come up with any solidly helpful posts recently, so I thought I’d throw in this one. These are some of the things I learned the hard way over the past few years ranking my websites in Google. Hopefully you’ll find something new in here.

I rely on SEO to get traffic – in most cases it yields the best traffic of all possible sources. With good SEO (both on-site content optimization and off-site reputation / link building), it’s hard not to make money. If you are also an expert in Internet marketing (niche research, reputation building & management, consumer psychology, landing page tuning, blah blah blah…), it’s even harder not to be rich. Making good money is easy; you just need time.

Don’t use a host that’s POPULAR and CHEAP.

Really popular hosts like hostgator and dreamhost have millions of domains hosted with them. Because they are cheap, spammers like them, and Google knows it. I frequently launch new sites, and from my experience with dreamhost, after submitting a new site to Google without building any backlinks, it typically takes a week or more to get it indexed.

With hawkhost and wiredtree, however, it’s a totally different situation. Without any initial backlinks, new sites can be indexed in Google 1 day after submission, even when it’s just a blank site with an empty Apache index page. Sometimes I didn’t even have to submit the site manually and it magically got into Google’s index.

Sites hosted with hawkhost also tend to be more stable in Google’s index, whereas it’s hard to keep a new site (with barely any content) in Google’s index if it is hosted with dreamhost (and similarly popular hosts with cheap shared plans). Google will soon drop your new site if you don’t keep working on it.

Have a 4-year-old website.

Adsense is one of my favorite money makers and my most steady stream of Adsense income comes from a site I built in 2006. I created some nice content (very nice and very original) back then and I just left it there.

I made only $10 a month from the site in the first year, and after some very frustrating ups and downs it gradually climbed. Now, 4 years later, it’s averaging $600 a month. To be honest, I never spent much time on it at all. No link building, no frequent content updates, nothing, and it’s now making me 600 bucks a month. Not much, but still.

Not only does it receive a large amount of steady traffic, but new content is generally positioned very well in search engine results. The older the site, the more authoritative it becomes from a search engine’s point of view.

Time is the ultimate distinguisher between builders and spammers. Spammers come and go, hit and run. They are always impatient, looking to make the quick buck with a spammy site. Once they find it’s not profitable, they’ll stop renewing the domain after the 1st year. Google knows this too well.

So most of your sites won’t actually start performing in terms of search engine traffic until at least a year after domain registration. Yet most people are too obsessed with quick results to wait that long. They kill their sites just before those sites could make them decent money.

Be natural.

Google is becoming smarter and smarter. I would never go against them by challenging their intelligence and their capability to identify spam (or partial spam).

Sites whose titles, descriptions, content keywords and off-site link anchor texts I intentionally optimize never seem to get anywhere substantial. It’s boring, it’s a chore, and it’s not worth it. I can spend the same time and money creating content that’s useful and exciting. Best of all, that’s much more fun, which will keep you going!

Duplicate content is a myth.

While being original is absolutely a must in ultra saturated / competitive niches, duplicate content isn’t that big a deal in most niches.

Forget SEO. Start making friends and never stop creating stuff.

I’ve been doing SEO for 4 years and I can finally say, this is the ultimate SEO tip.


Argophilia – Eastern Europe Travel Portal

by Yang Yang on November 2, 2010

Phil is one of my best friends on the web, and he was very kind and helpful when I was just starting out. A few months back he invited me to work on a travel project that eventually launched as Argophilia. It’s a shame I was occupied with a lot of chores then and never contributed anything substantial to it.

Now you can see it’s becoming something real: http://argophilia.com/

And the news portal: http://www.argophilia.com/news/

They are simply awesome! Both the design and the content are exceptionally good. Hopefully it takes off soon, as it is currently the only travel portal targeting Eastern Europe.


I have WAMP server installed on my local computer for PHP and MySQL development. After a while, the MySQL installation folder had accumulated some seriously huge log files taking up an enormous amount of disk space – we are talking tens of GB here, in files with names like ibdata1, mysql-bin.000017, etc. It appears this is because I frequently move large databases in and out with the ‘mysql’ and ‘mysqldump’ commands, building up large binary logs kept for data rollback and recovery.

Simply deleting the log files may result in problems such as being unable to start MySQL again. I tried that and had to reinstall MySQL to make it work again.

After some searching, I found this query to solve the problem. Just execute it after logging in to MySQL on the command line – you will need root privileges:

PURGE BINARY LOGS BEFORE '2036-12-12 12:12:12';

Something like that purges all the huge binary logs and frees the disk. Just make the date as sci-fi as possible so that all log files are purged. (Note that ibdata1 is the InnoDB system tablespace rather than a log file; purging binary logs will not shrink it.)
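To keep the binary logs from piling up again, you can also cap their retention in the server configuration. A sketch for the my.cnf / my.ini of a MySQL 5.x server, where the setting is named expire_logs_days (the 7-day value is just an example; pick a retention that suits you):

```
[mysqld]
# Automatically purge binary logs older than 7 days
expire_logs_days = 7
```

With this in place, the one-off PURGE query above shouldn’t be needed again.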
