
For effective interaction with their communities, many large organizations and companies use the long-established forum format. Unlike the popular social network format, a forum consolidates the community at a qualitatively different level thanks to better data structuring, powerful tools for finding information, advanced rating and gamification systems, and moderation and anti-spam facilities.
The purpose of this article is to describe how to install the modern XenForo forum engine with memcached caching and the powerful ElasticSearch search engine. These services will run inside Docker containers deployed and managed through the Plesk interface.
In addition, the article touches on broader uses of the Elastic Stack (ElasticSearch + Logstash + Kibana) in the context of Plesk for data analysis, for example, analyzing search queries on the forum and analyzing server logs.
Let's start by deploying a forum based on the XenForo engine on a Plesk domain:
- Create a subscription in Plesk for the domain forum.domain.tld
- In the PHP Settings for this domain, select the latest available PHP version, for example PHP 7.1.10
- Go to the File Manager and, in the site's httpdocs directory, delete everything except, perhaps, the favicon.ico file
- Using the File Manager and its Upload button, upload the zip file of the XenForo distribution to the httpdocs directory, for example xenforo_1.5.15a_332013BAC9_full.zip
- Extract the file with the Extract Files button. The archive contents will be unpacked into the upload directory. Enter it, select everything, and use the Move button to transfer it all to the httpdocs directory. After that, the upload directory and the xenforo_1.5.15a_332013BAC9_full.zip archive can be deleted.
- In the forum.domain.tld subscription, go to the Databases section and create a database for the future forum. Choose the name, user name, and password at your discretion. Like this:

For security, it is important to set Access Control to "Allow local connections only".
- Begin the installation of the forum: go to forum.domain.tld, and the XenForo installation menu should appear. Follow the instructions and, at one of the steps, specify the name of the created database along with its user name and password. Then you will need to create an admin account for the forum. After the installation completes successfully, you can enter your forum's administrative panel and make the final settings.
- To significantly speed up your forum, we will connect memcached caching to it, using the corresponding docker container from the Plesk Docker Extension. But before installing the container, we need to build the memcached module for the PHP version used on the site.
- Compile the memcached PHP module, shown here on a Debian/Ubuntu Plesk server. Install all the necessary packages:
Compile the module:
Install the compiled module:
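The three steps above might look like the following sketch. The package names and paths are assumptions for a Debian/Ubuntu server with Plesk's bundled PHP 7.1 under /opt/plesk/php/7.1; adjust them to your distribution and PHP version.

```shell
# Install build tools and the libmemcached headers
apt-get install -y gcc make autoconf libc-dev pkg-config \
    plesk-php71-dev zlib1g-dev libmemcached-dev

# Compile the memcached extension with the PECL tool bundled with Plesk PHP 7.1
/opt/plesk/php/7.1/bin/pecl install memcached

# Register the compiled module and have Plesk re-read the PHP handler settings
echo "extension=memcached.so" > /opt/plesk/php/7.1/etc/php.d/memcached.ini
plesk bin php_handler --reread
```
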
- In the installed Plesk Docker Extension, find the memcached docker image, then install and run it with these settings:

- After that, port 11211 should be available on your Plesk server. You can check that it works with the command:
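For example, one way to query the memcached stats interface, assuming the netcat (nc) utility is available:

```shell
# Send the "stats" command to memcached on its default port;
# a running instance replies with a list of STAT lines
echo "stats" | nc -q 2 localhost 11211
```
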
- Connect memcached caching to the forum. To do this, go to the File Manager and open the file forum.domain.tld/httpdocs/library/config.php in the Code Editor. Add these lines to its end:
$config['cache']['enabled'] = true;
$config['cache']['frontend'] = 'Core';
$config['cache']['frontendOptions']['cache_id_prefix'] = 'xf_';

// Memcached
$config['cache']['backend'] = 'Libmemcached';
$config['cache']['backendOptions'] = array(
    'compression' => false,
    'servers' => array(
        array(
            'host' => 'localhost',
            'port' => 11211,
        )
    )
);
- Check that the forum is working correctly. The operation of the connected caching itself can be checked with the command:
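For instance, you can watch the cache counters grow while browsing the forum (a sketch, again assuming nc is installed):

```shell
# Hit/miss and item counters should increase as forum pages are cached
echo "stats" | nc -q 2 localhost 11211 | grep -E "get_hits|get_misses|curr_items"
```
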
- You can connect the powerful ElasticSearch search engine to the XenForo forum. To do this, you will need to install the XenForo plugin called XenForo Enhanced Search and the elasticsearch docker container. This docker container requires a significant amount of memory, so your server should have enough of it. The XenForo Enhanced Search plugin can likewise be installed by downloading and unpacking its zip file through the Plesk File Manager; installation details for XenForo plugins can be found in the relevant documentation. As a result, in your forum's admin panel you should see the following settings for the ElasticSearch search engine:


- For the search to work, you must install the elasticsearch docker container in the Plesk Docker Extension with the following settings:

and then verify that port 9200 is open for connections with the command:
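For example, a simple check with curl; a running ElasticSearch instance answers with a JSON banner describing the cluster:

```shell
# A healthy node responds with JSON containing the cluster name and version
curl http://localhost:9200/
```
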
- After that, in the forum administration panel, make sure that ElasticSearch is connected and create a search index:


- As a result, you have a forum on a modern XenForo engine with a powerful search engine, accelerated by memcached-based caching. As a further improvement, you can analyze the search queries made on your forum using Kibana. To do this, use either a separate kibana docker container, or a combined elasticsearch + kibana container, together with a patch for the XenForo Enhanced Search plugin that creates a separate ElasticSearch index storing search queries, which can then be analyzed with Kibana, for example, to get statistics like these on the keywords used in search queries:


- The patched file for version 1.1.6 of the XenForo Enhanced Search plugin can be taken here: ElasticSearch.php. It should replace the original file in the httpdocs > library > XenES > Search > SourceHandler directory. In addition to the search index, a separate ElasticSearch index named saved_quiries will be created for storing search queries; this is the index to use for analysis in Kibana.
An interesting application is replacing Plesk's standard analysis systems, Awstats and Webalizer, with the more efficient and flexible analysis available in Kibana. There are several options for sending vhost logs to ElasticSearch. For example:
- using the third Elastic Stack component, Logstash
- using the rsyslog service with its omelasticsearch.so plugin ( yum install rsyslog-elasticsearch ) installed, which can send log data directly to ElasticSearch. This is very convenient, as it does not require an extra link in the form of Logstash.
The problem here is that, for the logs to be stored properly in ElasticSearch and parsed with Kibana, they must be in JSON format, and at the vhost level it is not possible to change the nginx log_format parameter. One possible solution is to use the Filebeat service, which takes the usual nginx, Apache, or other service logs, parses them into the required format, and passes them on. It also allows collecting logs from different servers. In general, the field for experimentation seems huge.
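A minimal Filebeat configuration for shipping vhost access logs straight to ElasticSearch could look like this sketch (Filebeat 5.x syntax; the log path and host are assumptions to adapt to your server):

```conf
filebeat.prospectors:
- input_type: log
  paths:
    # Plesk vhost access logs; adjust the glob to your layout
    - /var/www/vhosts/*/logs/access_log
output.elasticsearch:
  hosts: ["localhost:9200"]
```
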
For example, using rsyslog you can send any other system logs to ElasticSearch for analysis in Kibana, and this works quite well. You can use a configuration file /etc/rsyslog.d/syslogs.conf, which will send the local syslog logs to ElasticSearch in a format convenient for Logstash/Kibana after the rsyslog service is started with the rsyslogd -dn command:
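Such a configuration file might look like the following sketch, based on the standard omelasticsearch example from the rsyslog documentation; verify that your rsyslog version supports this syntax:

```conf
module(load="omelasticsearch")

# Build a JSON document from each syslog message
template(name="plain-syslog" type="list") {
    constant(value="{")
    constant(value="\"@timestamp\":\"")  property(name="timereported" dateFormat="rfc3339")
    constant(value="\",\"host\":\"")     property(name="hostname")
    constant(value="\",\"severity\":\"") property(name="syslogseverity-text")
    constant(value="\",\"facility\":\"") property(name="syslogfacility-text")
    constant(value="\",\"tag\":\"")      property(name="syslogtag" format="json")
    constant(value="\",\"message\":\"")  property(name="msg" format="json")
    constant(value="\"}")
}

# Name the index logstash-YYYY.MM.DD so Kibana treats it like a Logstash index
template(name="logstash-index" type="list") {
    constant(value="logstash-")
    property(name="timereported" dateFormat="rfc3339" position.from="1" position.to="4")
    constant(value=".")
    property(name="timereported" dateFormat="rfc3339" position.from="6" position.to="7")
    constant(value=".")
    property(name="timereported" dateFormat="rfc3339" position.from="9" position.to="10")
}

action(type="omelasticsearch"
       server="localhost"
       serverport="9200"
       template="plain-syslog"
       searchIndex="logstash-index"
       dynSearchIndex="on")
```
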
You can see that the ElasticSearch index logstash-2017.10.10 was successfully created and is ready for use and analysis with Kibana:
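The list of indices can be checked, for example, with:

```shell
# List all ElasticSearch indices with document counts and sizes
curl 'http://localhost:9200/_cat/indices?v'
```
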
As a result, you can create Dashboards in Kibana with the necessary data and build your own visualizations from them. For example, something like this Dashboard:

And for analysis of the nginx access.log, you can build a Dashboard like this:

As a result, we get a modern platform for working with the community, with the additional ability to collect all sorts of statistical data and analyze it.
Of course, this article is general and advisory. It does not cover many important aspects, for example security issues; it only gives directions, describing possible scenarios and ways to implement them. Experienced administrators can work out the finer details on their own. The Elastic Stack is a tool, a construction kit, with which you can get the result you want. It is only important to feed it the necessary, correct data that you want to work with and analyze in the future.