
Running Vanilla from Docker in production

I've been using Vanilla for about five years, and along the way I developed a maintenance process that has now broken down. My goal was to version control our setup (customisations, theme, plugins) and allow for automated deployments like the rest of our stack. I did this by forking the repo and adding to it: for plugins that were on GitHub, I added them as submodules; when they weren't on GitHub, I added their contents to our repo.

This worked for the most part, but upgrading the forums was always a bit of a pain, as we had to rebase and deal with any merge conflicts. We now have a huge number of merge conflicts, to the point that I'd like to start over and make a better setup.

The goal is to create a version controlled forum setup, with reproducible builds, that can run on an ephemeral filesystem (even heroku) so we can load balance it and even blow it away and reconstruct it if necessary. The twelve-factor app provides some background and general principles for this.

To do this, I figure I'll need to do the following (a rough Dockerfile sketch follows the list):

  1. Start from a specific Vanilla release: either a distributed package (zip) or the GitHub codebase (in which case I'd need to run composer).
  2. Fill in config.php with settings and secrets, without putting secrets into version control.
  3. Pull in our theme, either from a separate git repo or from a directory within this repo, and move it to the themes directory of the Vanilla codebase.
  4. Do the same for 10+ plugins, some of which are on GitHub and others of which are just .zip files on the addons website.
  5. Set the `conf`, `cache`, and `uploads` directories to be writable by PHP.
  6. Somehow make `uploads` a separate volume (like S3) so it can be shared across multiple instances and can persist (and even be backed up).
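
Roughly, here's the shape I have in mind (the release URL, version, and paths below are placeholders, not a working build yet):

    # Sketch only: release URL and paths are placeholders.
    FROM php:7.3-apache

    RUN apt-get update && apt-get install -y --no-install-recommends unzip \
     && rm -rf /var/lib/apt/lists/*

    # 1. Start from a specific Vanilla release (zip distribution).
    ARG VANILLA_VERSION=3.3
    RUN curl -fsSL -o /tmp/vanilla.zip "https://github.com/vanilla/vanilla/releases/download/Vanilla_${VANILLA_VERSION}/vanilla-${VANILLA_VERSION}.zip" \
     && unzip -q /tmp/vanilla.zip -d /var/www/html \
     && rm /tmp/vanilla.zip

    # 2. Config that reads secrets from the environment at runtime (see "Where I need help" below).
    COPY conf/config.php /var/www/html/conf/config.php

    # 3-4. Theme and plugins get pulled in here (see "Pulling in the theme and plugins").

    # 5. Make conf, cache, and uploads writable by PHP.
    RUN mkdir -p /var/www/html/conf /var/www/html/cache /var/www/html/uploads \
     && chown -R www-data:www-data /var/www/html/conf /var/www/html/cache /var/www/html/uploads

    # 6. uploads lives on a volume (or a remote store like S3) so it survives the container.
    VOLUME /var/www/html/uploads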

I've started implementing this in a Dockerfile. You can see my progress so far at this repo.

Where I need help

1. config.php

I'm having some trouble with config.php because Vanilla writes to it whenever an admin makes a change in the admin dashboard: it replaces `getenv(...)` with the plaintext value. So if you want to commit changes made in the dashboard back to source control (say, by mounting config.php as a volume and then making the changes), you'd be committing secrets unless you remember to go back in and change them back to `getenv`. The solutions I've thought of so far are: (a) make the file read-only, so the application cannot change it, or (b) treat it as a secrets file that lives outside source control. But then how would you do an automated deployment? Is there a way to stop Vanilla from writing to config.php? I would think the database would be better suited for some of these settings, no?
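
To illustrate the problem (simplified, not our actual config file):

    // conf/config.php, as committed: the secret comes from the environment.
    $Configuration['Database']['Password'] = getenv('DB_PASSWORD');

    // After saving any setting in the dashboard, Vanilla rewrites the file and the
    // same line comes back with the resolved plaintext value:
    // $Configuration['Database']['Password'] = 'the-actual-password';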

2. Pulling in the theme and plugins

The simplest approach here is for the Dockerfile to just curl and unzip the theme and plugins. I can either put them in the appropriate place in the Vanilla codebase, or symlink them; it doesn't make much difference, I don't think. But it starts to feel like the job of a dependency manager. I could use composer here, but I'd have to either move the installed directory or symlink it to the appropriate place, and I don't think composer can do that, can it? Does anything else come to mind here?
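
For example, something like this in the Dockerfile would do it (the theme path and plugin URL are made up for illustration):

    # Theme from a directory in this repo; plugin from a zip (placeholder URL).
    COPY theme/ /var/www/html/themes/our-theme/
    RUN curl -fsSL -o /tmp/SomePlugin.zip "https://example.com/addons/SomePlugin.zip" \
     && unzip -q /tmp/SomePlugin.zip -d /var/www/html/plugins/ \
     && rm /tmp/SomePlugin.zip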

Let me know if there's been any prior work on this, or if anyone wants to collaborate. I'm sure there are ways we can make the repo generic.

Comments

  • charrondev · Developer Lead (PHP, JS) · Montreal · Vanilla Staff
    edited January 2020

    But then how would you do an automated deployment? Is there a way to stop Vanilla from writing to config.php? I would think the database would be better suited for some of these settings, no?

    You can make a bootstrap.early.php and put it in your config folder. It can write in-memory values into the config and is never touched by the application.

    Check out the example here for conditionally configuring memcached.

    Notice the third parameter in saveToConfig: it applies the config value only in memory for the current request and does not write it to disk.

  • @charrondev Interesting! So I could leave things like the DB settings, cookie salt, update token, SMTP creds, etc. blank in config.php and then make a bootstrap.early.php file that sets them?

    `saveToConfig('Database.Host', getenv('DB_HOST'), false);`

    `saveToConfig('Garden.Cookie.Salt', getenv('COOKIE_SALT'), false);`

    etc.? And will changes made in the admin dashboard avoid writing the database credentials, update token, etc. into config.php from the in-memory values?

  • Hmm, perhaps I'm doing something wrong, but I'm getting errors that suggest Vanilla is running before bootstrap.early.php has a chance to set the values.

    • PHP Fatal error: Uncaught Exception: Cookie salt is empty. in /var/www/html/library/core/class.cookieidentity.php:486
    • An error occurred while attempting to connect to the database|Gdn_Database|Connection|SQLSTATE[HY000] [2002] php_network_getaddresses: getaddrinfo failed: Name or service not known in /var/www/html/library/database/class.database.php:172

    My conf/bootstrap.early.php file looks like this: https://gist.github.com/wilson29thid/8f0dfbd5006b5fe2040a013c27921001

  • Ah, I was leaving empty values in config.php (e.g. `$Configuration['Database']['Host'] = ''`) and those were overwriting what was set in bootstrap.early.php. I commented them out and it seems to be working: I can edit things in the admin dashboard and Vanilla doesn't put the credentials back in, as I'd hoped. Thanks @charrondev! I'll update here when I figure something out for putting the plugins/theme in.
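
    In case it helps anyone else, the working file boils down to something like this (the environment variable names are just ours):

    <?php
    // conf/bootstrap.early.php
    // Note: don't leave empty placeholders like $Configuration['Database']['Host'] = ''
    // in config.php, or they will overwrite these in-memory values.
    saveToConfig('Database.Host', getenv('DB_HOST'), false);
    saveToConfig('Database.Name', getenv('DB_NAME'), false);
    saveToConfig('Database.User', getenv('DB_USER'), false);
    saveToConfig('Database.Password', getenv('DB_PASSWORD'), false);
    saveToConfig('Garden.Cookie.Salt', getenv('COOKIE_SALT'), false);
    // ...and the same pattern for the update token and SMTP credentials.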

  • I've got it working! I even got it running on Heroku (I had to run some magical Heroku Docker command, though). You can see the Dockerfile at https://github.com/29th/forums

    The last thing to figure out is how to make /uploads use an external volume or remote service (like S3). Has this been done already? I would imagine there would be plugins to route file uploads to an S3 bucket, and to use that S3 bucket's URL when rendering them, but I don't see any. For testing purposes, I could at least change the URL path when rendering them, to point to my production server's public-facing /uploads folder, but I'd definitely need to override the upload behaviour when using this in production.

  • charrondev · Developer Lead (PHP, JS) · Montreal · Vanilla Staff

    It's definitely been done before (hosting images over on S3). We do it ourselves for Vanilla Cloud, but unfortunately that's a cloud-only integration.

    You can essentially control the saving/moving/deleting of files entirely through a custom plugin, though.

    I can't share the code with you, but I can share some of the plugin method signatures that we use to handle the events.

    abstract class UploadPlugin extends Gdn_Plugin {

        /**
         * Adds the appropriate url prefixes for the various cloud files.
         *
         * @param Gdn_Upload|mixed $sender The upload object doing the manipulation.
         * @param array $args Arguments useful for manipulating the URLs.
         *
         * @throws \Exception
         */
        abstract public function gdn_upload_getUrls_handler(Gdn_Upload $sender, array $args);

        /**
         * Copy a file locally so that it can be manipulated by PHP.
         *
         * @param Gdn_Upload|mixed $sender The upload object doing the manipulation.
         * @param array $args Arguments useful for copying the file.
         *
         * @throws Exception Throws an exception if there was a problem copying the file for local use.
         */
        abstract public function gdn_upload_copyLocal_handler(Gdn_Upload $sender, array $args);

        /**
         * Override file deletes for files stored in S3.
         *
         * @param Gdn_Upload|mixed $sender The upload object performing the delete.
         * @param array $args The upload arguments.
         *
         * @throws Exception Throws an exception if there was a problem deleting the file.
         */
        abstract public function gdn_upload_delete_handler(Gdn_Upload $sender, array $args);

        /**
         * Override file uploading to save to S3.
         *
         * @param Gdn_Upload|mixed $sender The upload object doing the upload.
         * @param array $args The arguments for the upload.
         *
         * @throws Exception Throws an exception if there is an error saving the files.
         */
        abstract public function gdn_upload_saveAs_handler(Gdn_Upload $sender, array $args);
    }
    

    Turn that into an implemented class and you can put your files anywhere you want and control it all through PHP.
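
    As a very rough sketch of what an implementation could look like (the `$args` keys and the aws/aws-sdk-php calls here are illustrative guesses, not our actual handling; verify them against Gdn_Upload before relying on them):

    <?php
    // Sketch only. Assumes the aws/aws-sdk-php package is installed.
    use Aws\S3\S3Client;

    class S3UploadsPlugin extends Gdn_Plugin {

        /** @var S3Client */
        private $s3;

        public function __construct() {
            parent::__construct();
            $this->s3 = new S3Client([
                'version' => 'latest',
                'region' => getenv('AWS_REGION'),
            ]);
        }

        /**
         * Send the uploaded file to the bucket instead of leaving it on local disk.
         */
        public function gdn_upload_saveAs_handler($sender, array $args) {
            // 'Path' (the local file Vanilla just wrote) is an assumed key.
            $this->s3->putObject([
                'Bucket' => getenv('S3_BUCKET'),
                'Key' => basename($args['Path']),
                'SourceFile' => $args['Path'],
            ]);
        }

        // gdn_upload_getUrls_handler, gdn_upload_copyLocal_handler and
        // gdn_upload_delete_handler would follow the same pattern, using the
        // bucket URL, getObject() and deleteObject() respectively.
    }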

  • Excellent, thanks @charrondev!

  • Just wanted to share that we now have Vanilla running in Docker in production :D Here's our Dockerfile: https://github.com/29th/forums

    It's orchestrated by a docker-compose.yml file alongside other apps, which you can see in this repo: https://github.com/29th/personnel

    There are improvements I'd like to make, such as using a non-root user inside the container and using ADD instead of COPY so I don't need curl, but overall it's working pretty well. With a few tweaks it could be generic enough to be reused (at the moment it does a few things that only we need, like installing our theme), but for now it can serve as a reference for anyone else interested.

    Certainly open to suggestions on it! Thanks for the help in this thread.
