WordPress Optimization: Description and Keywords

Many readers like to install plug-ins such as All in One SEO Pack, but in fact all this kind of WordPress plug-in does is optimize the Title, Meta Description and Keywords tags. Many free themes, especially early ones, ship with no Description or Keywords optimization at all — for example Geeky, the free WordPress geek theme this blog currently uses.

1. Title optimization

Title optimization is very simple: just replace the default title code in your theme's header.php file with the following:

<?php $the_title = wp_title('', false); if ($the_title != '') : ?>
    <title><?php echo $the_title; ?> - <?php bloginfo('name'); ?></title>
<?php else : ?>
    <title><?php bloginfo('name'); ?></title>
<?php endif; ?>
2. Description and Keywords Optimization

Replace the default description and keywords code in header.php with the following:

<?php
if (is_home()) {
    $description = "Enter your home page description here";
    $keywords = "Enter your home page keywords here";
} elseif (is_category()) {
    $description = category_description();
    $keywords = single_cat_title('', false);
} elseif (is_tag()) {
    $description = tag_description();
    $keywords = single_tag_title('', false);
} elseif (is_single()) {
    if ($post->post_excerpt) {
        $description = $post->post_excerpt;
    } else {
        $description = mb_substr(strip_tags($post->post_content), 0, 220);
    }
    $keywords = "";
    $tags = wp_get_post_tags($post->ID);
    foreach ($tags as $tag) {
        $keywords .= $tag->name . ", ";
    }
    $keywords = rtrim($keywords, ", ");
}
?>
<meta name="keywords" content="<?php echo $keywords; ?>" />
<meta name="description" content="<?php echo $description; ?>" />

With this method, Keywords are taken from the article's tags and Description from the excerpt you entered when publishing; if you did not add an excerpt, the first 220 characters of the article are used as the summary instead.

Removing duplicate posts from your WordPress database

The first way to delete duplicates is with a plug-in called Delete Duplicate Posts, which you can download from its official site.

The second way is to log in to the database and delete them with SQL statements, specifically the following:

CREATE TABLE my_tmp AS SELECT MIN(ID) AS col1 FROM wp_posts GROUP BY post_title;
DELETE FROM wp_posts WHERE ID NOT IN (SELECT col1 FROM my_tmp);
DROP TABLE my_tmp;

But plug-ins easily slow WordPress down, so I gave that up; and operating on the database directly means logging in each time, which is too much trouble. So I wrapped the statements above into a PHP file, with the following code:

<?php
require('./wp-load.php');
// cd_ was this author's table prefix; replace cd_posts with your own prefix (usually wp_posts)
$strsql  = "create table my_tmp as select min(ID) as col1 from cd_posts group by post_title";
$strsql1 = "delete from cd_posts where ID not in (select col1 from my_tmp)";
$strsql2 = "drop table my_tmp";
$result = mysql_query($strsql);
$result = mysql_query($strsql1);
$result = mysql_query($strsql2);
?>

Save the code above as delete.php, put it in the site's root directory, then access the file once in your browser. Delete it afterwards.


A guide to cleaning up the WordPress wp_postmeta table

The wp_postmeta table stores metadata associated with posts. Apart from a few specific cases, most of the data in it is useless once the post is written; I cleaned mine up before writing this article. Here I simply give the statements for this table — for cleaning up other tables, see WordPress database cleaning.

The statements given below will make your wp_postmeta table very clean.

DELETE FROM wp_postmeta WHERE meta_key = '_edit_lock';
DELETE FROM wp_postmeta WHERE meta_key = '_edit_last';

These two statements were tested beforehand and are safe to run, with no risk.

DELETE FROM wp_postmeta WHERE meta_key = '_wp_old_slug';
DELETE FROM wp_postmeta WHERE meta_key = '_revision-control';
DELETE FROM wp_postmeta WHERE meta_value = '{{unknown}}';

These three statements are a newer attempt; the first two are not dangerous. What the last one does is not yet clear to me, but nothing abnormal happened after the deletion. Running all five statements deleted more than 95% of the rows, which can be considered close to the limit of optimization. Given that this table is not very important, fastidious users can even try emptying it entirely — although when I tested emptying the table, some original data was lost.


What is .htaccess?

Unix and Linux systems, and any version of the Apache web server, support .htaccess, but some hosting providers may not allow you to customize your own .htaccess files. Most domestic shared hosts do not offer this feature.

To enable .htaccess, you need to edit httpd.conf and turn on AllowOverride; AllowOverride can also be used to restrict which directives are allowed. If you want to use a file name other than .htaccess, change it with the AccessFileName directive. For example, to use .config, add the following to the server configuration file: AccessFileName .config.
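As a sketch (the directory path here is just an example), the relevant httpd.conf fragment might look like this:

```apache
# Let .htaccess files under this tree override the main configuration
<Directory "/var/www/html">
    # "All" permits every directive group; a stricter list such as
    # "AuthConfig Indexes" would limit what .htaccess may change
    AllowOverride All
</Directory>

# Optional: use .config instead of the default .htaccess
AccessFileName .config
```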

Generally speaking, .htaccess can help us achieve things such as: password protection for folders, automatic user redirection, custom error pages, changed file extensions, banning users with specific IP addresses, allowing only a specific list of IP addresses, disabling directory listings, using a different document as the index file, and some other features.

The following describes the specific .htaccess file function settings

How to create a .htaccess file

From Windows' point of view, .htaccess is a weird file name: it has no base name, only an 8-letter extension. In fact it is a perfectly normal Linux name, like many things under Linux. On Windows you cannot create it directly through the "New File" menu, but you can do it from cmd with copy, for example copy sample.txt .htaccess. You can also create an htaccess.txt, upload it to the server by FTP, and rename the file through FTP.
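On a Unix shell (for example over SSH) creating the file is trivial; a minimal sketch, which also sets the 644 permissions recommended later in this article:

```shell
# Create an empty .htaccess; "touch" has no trouble with the dot-only name
touch .htaccess
# 644 is the commonly recommended permission for .htaccess
chmod 644 .htaccess
ls -l .htaccess
```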

How to customize error pages

One application of .htaccess is custom error pages, which let you serve your own personalized error pages (for example, when a file is not found) instead of your provider's default error page, or no page at all. This will make your site look more professional when errors occur. You can also use scripts to notify you when an error occurs (for example, an automatic email when a page is not found). Any page error code (for example, 404 Page Not Found) can be turned into a custom page by adding the following line to the .htaccess file:

ErrorDocument errornumber /file.html

For example, if there is a notfound.html file in my root directory and I want to use it as the 404 error page:

ErrorDocument 404 /notfound.html

If the file is not in the root directory of the website, you only need to set the path accordingly:

ErrorDocument 500 /errorpages/500.html

The following are some of the most common error codes:
400 - Bad Request
401 - Authorization Required
403 - Forbidden
404 - Not Found
500 - Internal Server Error

Next, create the files to be displayed when an error occurs, then upload them together with .htaccess.

Frequently used .htaccess commands

First, disable directory listings
Sometimes, for one reason or another, there is no index file in a directory. That means that when someone types the directory path into the browser's address bar, all the files in that directory are displayed, which creates a security risk for your website. To avoid this situation (without having to create a bunch of new index files), you can add the following command to your .htaccess file to stop the directory listing from being shown:

Options -Indexes

Second, block/allow specific IP addresses
In some cases you may want to allow only users from specific IP addresses to access your website (for example, allowing only users of a particular ISP into a directory), or you may want to ban certain IP addresses (for example, keeping low-grade users out of your message boards). Of course, this only works if you know the IP addresses you want to block, and since most users on the network now use dynamic IP addresses, it is not a common way to limit access.

You can block an IP address by using:

deny from 210.10.56.32

Here 210.10.56.32 stands for the banned IP address (an example — substitute the real one). If you specify only part of an address, you block the entire network segment: entering 210.10.56. blocks every address from 210.10.56.0 to 210.10.56.255.

You can use the following command to allow an IP address to access the site:

allow from 210.10.56.32

Again the address is an example, and allowing a partial address allows the entire network segment.

If you want to block all access to the directory, you can use:

deny from all

This does not, however, prevent scripts from using the documents in that directory.
The .htaccess file (or "distributed configuration file") provides a way to change configuration per directory: a file containing one or more directives is placed in a particular document directory, and the directives apply to that directory and all of its subdirectories. As a user, the commands you may use are restricted; the administrator controls this with Apache's AllowOverride directive.

– Directives in subdirectories override directives in higher-level directories or in the main server configuration file.
– .htaccess must be uploaded in ASCII mode, preferably with permissions set to 644.

Location of error documents

Commonly used client-request error return codes:
401 Authorization Required
403 Forbidden
404 Not Found
405 Method Not Allowed
408 Request Timed Out
411 Content Length Required
412 Precondition Failed
413 Request Entity Too Large
414 Request URI Too Long
415 Unsupported Media Type

Commonly returned server error codes:
500 Internal Server Error

Users can use .htaccess to point to their own pre-made error reminder pages. In general, it is best to set up a dedicated directory for these pages, such as errors, and then add directives like the following to .htaccess:

ErrorDocument 404 /errors/notfound.html
ErrorDocument 500 /errors/internalerror.html

One directive per line. The first directive means: for error 404, that is, when the requested document is not found, display the notfound.html page in the /errors directory. The syntax is easy to see:
ErrorDocument errorcode /directoryname/filename.extension
If the message you need to show is short and does not warrant a dedicated page, you can put HTML directly in the directive, for example:
ErrorDocument 401 "You do not have permission to access this page, please give up!"

Password protection for document access

To use .htaccess to set up users and passwords for access to a directory of documents, the first thing to do is generate a .htpasswd text file, for example:

zheng:y4E7Ep8e7EYV

The password here is encrypted; you can find tools that encrypt passwords in the encoding .htaccess supports. It is best not to put this file inside the www directory; placing it outside the web root directory tree is more secure.

With the authorized-user file in place, you can add the following directives to the .htaccess file:

AuthUserFile /full/server/path/to/.htpasswd (the server path to the password file)
AuthGroupFile /dev/null (no group file is needed)
AuthName EnterPassword
AuthType Basic (the authorization type)
require user wsabstract (the user allowed access; to allow every user listed in the file, use require valid-user)

Note: the parts in parentheses are comments I added while learning.
Deny access from an IP

If I do not want a certain government department to access my site's content, I can add that department's IP addresses to .htaccess and reject them.

For example:
order allow,deny
deny from 210.10.56.32
deny from 219.5.45.
allow from all

The second line rejects a single IP address, and the third line rejects an entire IP segment, that is, 219.5.45.0 through 219.5.45.255.

Want to reject everyone? Just use deny from all. And you are not limited to IP addresses; you can also use domain names.
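A sketch of blocking by domain name (badsite.com is a made-up example; note that matching host names forces the server to do DNS lookups on each request):

```apache
order allow,deny
# Deny any client whose resolved host name ends in this domain
deny from .badsite.com
allow from all
```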

Protect the .htaccess file
When you use .htaccess to set password protection for a directory, the file contains the path to the password file. From a security point of view it is necessary to protect .htaccess itself so that others cannot see its contents. Although this can be done in other ways, such as file permissions, .htaccess can do it by itself; just add the following directives (the <Files> wrapper matters — without it you would deny access to the whole directory):
<Files .htaccess>
order allow,deny
deny from all
</Files>
URL redirects

We may redesign the site, move documents, or change directories. When that happens, visits coming from search engines or links on other sites may hit errors. In that case, the old URL can be automatically redirected to the new address with the following command:

Redirect /olddirectory/oldfile.html http://yourdomain.com/newdirectory/newfile.html

Or redirect an entire directory:

Redirect /olddirectory http://yourdomain.com/newdirectory
Change the default home page file

In general, the default home page file is named default, index and so on. Sometimes, however, a directory has no default file, only a file with a specific name, such as pmwiki.php in a PmWiki installation. In that case, making users remember the file name in order to visit is a lot of trouble. With .htaccess you can easily set a new default file name:

DirectoryIndex pmwiki.php

You can also list several file names; the order determines their priority, for example:
DirectoryIndex filename.html index.cgi index.pl default.htm

Prevent hotlinking

If you do not want other people's web pages linking directly to your own pictures or documents, that too can be done with .htaccess directives.

The required directives are as follows:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com/.*$ [NC]
RewriteRule \.(gif|jpg)$ - [F]

If you feel that the broken-image hole this punches in other people's pages looks bad, you can substitute a picture of your own:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com/.*$ [NC]
RewriteRule \.(gif|jpg)$ http://www.mydomain.com/replacement.gif [R,L]

(replace replacement.gif with the file name of the picture you want served instead)
source: http://wsabstract.com/howt…

Using .htaccess to standardize your website URLs

URL standardization is a fairly important part of SEO: when different versions of a URL exist at the same time, it may not only cause duplicate content, it also prevents link weight from being concentrated correctly.

Most sites today have their domain bound both with and without the www prefix, and many sites even bind several domain names at once. After processing (redirects, CNAME records, and so on) this may make no difference to visitors, who see the same content either way; but search engines cannot determine which URL is the real home page.

Among the controllable factors, links within the site should stick to one version — for example, every "return to home page" link across the site should point to one fixed address. But there is an uncontrollable factor: the URLs other people use when linking to your site. With many spontaneous or exchanged links, others may omit the www, or use another of your bound domain names.

What we cannot control externally, we can control from within the site: using Apache .htaccess and a 301 redirect, we can standardize the site's URL.
Create a new empty file named .htaccess, fill in the redirection rules, and upload it to the site's root directory.

Use a 301 redirect to point the domain without www at the www version:

rewriteEngine on
rewriteCond %{HTTP_HOST} ^379.cc [NC]
rewriteRule ^(.*)$ http://www.379.cc/$1 [R=301,L]

Use a 301 redirect to point multiple bound domain names at the main domain:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^379.cc$ [OR]
RewriteCond %{HTTP_HOST} ^bbs.379.cc$ [OR]
RewriteCond %{HTTP_HOST} ^luoyang.cc$ [OR]
RewriteCond %{HTTP_HOST} ^www.luoyang.cc$ [OR]
RewriteCond %{HTTP_HOST} ^bbs.luoyang.cc$
RewriteRule ^(.*)$ http://www.379.cc/$1 [R=301,L]

Of course, you can also extend this to redirect index.html, index.php and so on:

RewriteEngine on
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php\ HTTP/
RewriteRule ^index\.php$ http://www.379.cc/ [R=301,L]
Enabling SSI via .htaccess

To enable SSI (Server Side Includes) functionality through .htaccess:

AddType text/html .shtml
AddHandler server-parsed .shtml
Options Indexes FollowSymLinks Includes
DirectoryIndex index.shtml index.html
Blocking users or sites by referrer

This requires the mod_rewrite module.

Case 1. Blocking a single referrer, badsite.com:
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite\.com [NC]
RewriteRule .* - [F]

Case 2. Blocking multiple referrers, badsite1.com and badsite2.com:
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} badsite2\.com [NC]
RewriteRule .* - [F]
[NC] - case-insensitive
[F] - return 403 Forbidden

Note that the "Options +FollowSymlinks" line above is commented out. If the server does not have FollowSymLinks set in the relevant section of httpd.conf, you need to add it, otherwise you will get a "500 Internal Server Error".

Blocking bad bots and site rippers (a.k.a. offline browsers)

This requires the mod_rewrite module. What counts as a bad bot? Things like email-address-harvesting crawlers and crawlers that do not comply with robots.txt (such as Baidu?). They can be identified by HTTP_USER_AGENT — although some crawlers set their agent string to "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)", and then nothing can be done.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]
[F] - return 403 Forbidden
[L] - last rule, stop processing further rules
The Discuz! forum static URL rules are as follows:
# Turn on the rewrite engine
RewriteEngine On
# In the line below, if the program is not in the root directory, change / to your forum's directory (e.g. /discuz)
RewriteBase /
# Do not modify the rewrite rules below
RewriteRule ^archiver/((fid|tid)-[\w\-]+\.html)$ archiver/index.php?$1
RewriteRule ^forum-([0-9]+)-([0-9]+)\.html$ forumdisplay.php?fid=$1&page=$2
RewriteRule ^thread-([0-9]+)-([0-9]+)-([0-9]+)\.html$ viewthread.php?tid=$1&extra=page\%3D$3&page=$2
RewriteRule ^space-(username|uid)-(.+)\.html$ space.php?$1=$2
RewriteRule ^tag-(.+)\.html$ tag.php?name=$1

WordPress site performance optimization: reduce load time, improve the user experience

I recently attended the D2 forum and came away with a deeper understanding of how important performance is to the user experience.

500 ms slower = 20% drop in Google visits

400 ms slower = 5-9% drop in Yahoo! traffic

100 ms slower = 1% drop in Amazon sales

After the meeting, I could not help wanting to see how my own site performed. Since the site moved overseas, I was a bit worried, and Google Webmaster Tools confirmed it in astonishing fashion: "Your site's average page load time is 20.2 seconds. The site is slower than 99.7% of sites."

Time to get optimizing. First I installed the site performance testing tools Google Page Speed and YSlow in my Firefox browser. Using YSlow to check the size of the home page showed a total of 257.3KB.

Then I used Google Page Speed to see what could be improved. Judging by the problems it found, there were still many places that could be optimized.

Due to limited technical ability and time, only simple optimizations were possible. I made the following changes:

First, compress the theme's style images

I did not use any advanced compression tool, just Fireworks CS4 batch processing on the images (without changing the file types). The theme's style images originally totalled 195KB; optimization shaved off 54.94KB, and the optimized images are indistinguishable to the naked eye (at least I could not see a difference).

Second, enable gzip compression

Because I am on shared hosting, I have no permission to configure gzip at the server level, so I could only rely on the WordPress plug-in GZIP Output. The home page size (HTML/text) dropped by 18.9KB.

However, GZIP Output can only compress PHP output; it cannot handle the CSS and JS files that make up the bulk of the page. I also downloaded and installed the WP CSS and WP JS plug-ins to compress the CSS and JS files separately. I never worked out how to use WP JS, but WP CSS applied successfully: the theme's stylesheet shrank from the original 23.5KB to 8.1KB, another 15.4KB off the home page.
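For hosts that do allow it, gzip can also be switched on from .htaccess itself, with no plug-in at all; a sketch assuming Apache's mod_deflate module is loaded:

```apache
<IfModule mod_deflate.c>
    # Compress text-based responses, including the CSS and JS files
    # that plug-ins like GZIP Output cannot reach
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript text/javascript
</IfModule>
```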

Third, remove unnecessary page elements

Looking at the JS loaded on the page, I had two scripts doing visitor statistics: one was Google Analytics, the other Piwik. Weighing them up, I removed the Piwik tracking code, reducing the home page by 8.3KB.

In addition, the site footer had a picture that served no purpose. Removing it cut another 5.66KB from the home page. This not only reduces the page size but also removes an unnecessary HTTP request.

In conclusion

You never know until you try: the optimization results were a pleasant surprise. The home page shrank from the original 257.3KB to 154.0KB, a total saving of 103.3KB, or 40.15% of the original page, and home page HTTP requests dropped from 25 to 22.

So after all this optimization, what effect does it have on page load time? Let's test with Web Page Analyzer, which can simulate page load times on different kinds of network connections. Comparing before and after, the effect is quite good: at mainstream user bandwidths, load time dropped from 20.37 seconds to under 3 seconds.

This optimization work does not only affect the home page: because the entire site uses one theme style, the other pages were optimized along with it. As for overall site performance, I checked again in January with the Google Webmaster Tools mentioned at the start of this article, and was finally satisfied with the result: "Your site's average page load time is 2.6 seconds (updated 2010-1-9). The site is faster than 57% of sites."

Article source: http://www.2beusable.com/website-optimization-practice.html