Some readers may scoff at the idea of WordPress optimization: is there really anything to say about it? Is it any different from optimizing a site built with any other software? Admittedly, not much. But since I am writing this, there must be a reason, and the platform's sheer popularity makes the topic worth covering.

More and more webmasters build their sites on WordPress because it is easy to understand and easy to use. There is no need to write every function yourself or wrestle with complex parameters; the platform already handles those problems, and often there is a matching optimization plugin as well. Even someone who knows no PHP can put together a simple website with only a little time, which is why so many people use it for everything from personal blogs to company sites. But precisely because it is so simple and convenient, optimizing it demands little skill from the webmaster, and small details are often overlooked. Below are a few of those small problems as I have sorted them out myself; I hope you will add anything I have missed:

1. Keywords

These details show up in many places. The first is the title: an article's title should contain the article's key keyword rather than some meaningless phrase, so webmasters need to put real thought into it. The second is the article body: do not repeat keywords too often. Some webmasters believe that the more a keyword appears in an article, the more it helps the ranking. The starting point of that idea is right, but everything has its limits, and if the keyword density climbs too high, the quality of the article simply collapses.
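To make the idea of "a degree to all things" concrete, here is a minimal Python sketch that measures what share of an article's words a single keyword occupies. The exact-word matching and the notion that density alone is the signal are my own simplifying assumptions for illustration, not a fixed SEO rule:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all words in the text, as a fraction."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

article = "WordPress tips: optimize your WordPress site with careful WordPress settings."
# 3 occurrences out of 10 words
print(round(keyword_density(article, "wordpress"), 3))
```

A webmaster could run this over a draft and, if the figure looks excessive compared to the rest of the site's articles, rework the wording before publishing.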

2. The robots.txt file

As for the role of the robots file, I won't repeat it here. Some new webmasters tend to overlook this small detail, thinking it has little effect and is optional. Placing a robots.txt file in the blog's root directory tells search engines to index only the content you specify. In WordPress there are addresses that search engines should not index, such as the admin backend, log files, and the feed URLs, so creating a robots file really is necessary. You can write one by hand or use a suitable plugin to generate it.
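As a sketch, a robots.txt for a default WordPress install often looks something like the following. The exact paths to block are a judgment call (the domain and the sitemap line are placeholders, and some webmasters deliberately leave more open so crawlers can fetch theme assets):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /feed/
Disallow: /trackback/
Sitemap: https://example.com/sitemap.xml
```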

3. The sitemap

In fact, this is the item most webmasters overlook, yet its role is as important as robots.txt. A sitemap is a map of the site, a compass of sorts: with one, a search engine such as Google can index the site more efficiently. The same goes for Baidu, except that a Baidu sitemap has traditionally used a .txt suffix while Google's uses .xml.
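In practice a WordPress plugin usually generates the sitemap, but to show how little is in the XML variant, here is a minimal Python sketch that builds one from a list of page URLs (the URLs are placeholders; real sitemaps often also carry optional fields like `lastmod`):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about/"]))
```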

4. Content updates

The hardest part of running a site is persistence. Optimization is a long-term process with no visible short-term results, and a significant share of webmasters who fail to see the effect they hoped for after a while simply give up. That is why we often see sites whose last update was months or even more than a year ago. For WordPress optimization, update frequently and write quality content. This is the heart of SEO: high-quality articles make the goal far easier to reach.

5. Enable GZIP compression

Though WordPress is simple and its functionality is good, it has its shortcomings: heavy use of plugins can make a site slow to load (a common fault of any such program, not limited to WordPress). There is no cure for this problem, only measures that relieve it. Careful webmasters will have noticed, when looking at someone else's site analysis data, that GZIP compression is often turned on. It is a small detail, but the difference between enabling it and not is real: barely noticeable when the site transfers little data, and much more obvious when it transfers a lot.
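On an Apache server, one common way to enable GZIP is a fragment like this in the site's .htaccess file. This is a sketch that assumes the host has the mod_deflate module loaded; the list of MIME types to compress is a typical choice, not a fixed standard:

```
<IfModule mod_deflate.c>
  # Compress the text-based responses that make up most WordPress pages
  AddOutputFilterByType DEFLATE text/html text/css text/xml application/javascript
</IfModule>
```

On nginx-based hosting the equivalent would be the `gzip on;` family of directives, usually set by the host rather than the webmaster.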

6. Merging JS and CSS files

Merging JS and CSS files reduces the number of requests the browser sends to the server on each visit, easing the load on the server and correspondingly improving the site's access speed. It is only a very small detail, but not one to ignore.
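At its simplest, merging is just concatenation. The Python sketch below shows the idea (file names are placeholders; in a real WordPress setup a caching or minification plugin would normally do this, and naive concatenation assumes the scripts do not depend on load order tricks):

```python
from pathlib import Path

def merge_assets(paths, out_path):
    """Concatenate several JS (or CSS) files into a single bundle file."""
    bundle = "\n".join(Path(p).read_text() for p in paths)
    Path(out_path).write_text(bundle)
    return bundle
```

The page then references one `bundle.js` (or `bundle.css`) instead of many separate files, turning several HTTP requests into one.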