One for the Vanilla Team - Is there a way to "drive" Vanilla from a CLI?
One of my clients is using Vanilla 2.0 (and will be moving to 2.1 as soon as we finish verifying the plugins), and asked me if there is a way to simplify deployment by using a script of some sort. Such a script would have to perform the whole installation and enable the required plugins, one by one, without user interaction.
This may sound simple but, apparently, it isn't. It may be possible to complete the installation by populating the database manually, but enabling the plugins is more of a PITA. I thought I could just use curl, or something similar, to call the URLs that enable the plugins, but such URLs require the transient key, which, obviously, is not public.
Before I start coming up with ideas that would lead me to reinvent the wheel, I was wondering if there is an official way to send commands to Vanilla 2.0 and 2.1 from a CLI script, possibly in a non-hacky way.
Trivia: a very junior developer told me that "reinventing the wheel is what software development is all about", but, after 20 years in the field, I'm still not convinced.
Comments
Devil's advocate here.
I suppose the first thing you would have to figure out is whether each plugin actually works with the version you want. And since plugins are scattered throughout the land, with no version requirement, approval process, or anything of that nature, it's a no-win situation at this point.
Or do you mean once the plugins you want are in place and actually downloaded?
I'm just curious: wouldn't it be more costly, from a client's point of view, to pay for a tool (and the time to develop it) that might only be used once, than the 15 minutes it would take the developer to do an upgrade manually?
But it would be interesting to see the mechanics. Hope you get some good answers.
have you seen this:
http://vanillaforums.org/discussion/23877/1-step-vanilla-dev-sites
You might do something like this: run a few shell commands, move folders around, and dump and import as well.
http://vanillaforums.org/discussion/comment/187296/#Comment_187296
P.S. As you know, I am not on the Vanilla team. Just a fan drinking beer up in the bleachers.
I may not provide the completed solution you might desire, but I do try to provide honest suggestions to help you solve your issue.
Very interesting question... The URL will call a couple of functions, right?
If there are no database tables involved, isn't it just adding the plugin line to conf/config.php?
If you circumvent the transient key, you're circumventing part of the security.
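For a plugin that really has no Setup() work, that config edit is easy to script. A minimal sketch, assuming a hypothetical plugin named MyPlugin and a demo config path (for a real install, CONFIG would point at the site's conf/config.php):

```shell
#!/bin/sh
# Flag a plugin as enabled directly in Vanilla's config file.
# CONFIG path and plugin name are placeholders, not a real install.
CONFIG="${CONFIG:-/tmp/demo1/conf/config.php}"
PLUGIN="MyPlugin"

mkdir -p "$(dirname "$CONFIG")"
[ -f "$CONFIG" ] || printf '<?php if (!defined("APPLICATION")) exit();\n' > "$CONFIG"

# Idempotent: add the line only if it is not already there.
grep -qF "EnabledPlugins']['$PLUGIN']" "$CONFIG" || \
  printf "\$Configuration['EnabledPlugins']['%s'] = TRUE;\n" "$PLUGIN" >> "$CONFIG"
```

Running it twice leaves a single entry, so it is safe to re-run as part of a deployment script.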
Thanks @peregrine and @underdog. To answer your questions:
The ideal would be a single command, something like:

    deploy forum
    -> Deploying... -> Done.

The 1-step Vanilla dev sites won't do, as sites are usually installed on brand new servers (and, during preparation, on brand new VMs on developers' PCs). Symlinking everything every time would not be any better than enabling the plugins manually.
Regarding the database, we discarded the dump & import process, as we prefer to start with a clean slate, without risking importing users from somewhere with the wrong permissions, or "leftover" content that someone mistakenly left around.
Most plugins perform real work in their Setup() step (creation of tables, views, indexes, pre-population of data, etc.). Just flagging them as "enabled" won't work; they have to be enabled as if the Admin did it manually. To make a comparison, we would need something like drush. We don't need the same flexibility right off the bat, but that's the idea.
Note: a drush-like tool was precisely what I was thinking of implementing. I just wanted to make sure that there isn't something similar already built in, before doing a lot of redundant work.
My shop | About Me
@businessdad I would control it from the outside, rather than trying to make a CLI for Vanilla, which would be a lot of work.
I would recommend using a toolkit like fabric: you can have a fabfile.py on the client side and a server-hosted git repo (origin), which you can push to over ssh. As part of the deploy, a special git repo representing the live site would pull with the -f option (all of which can be virtualized and sandboxed if you like). Pity there is no PHP solution like virtualenv(wrapper).
For stuff like enabling plugins, you would use fabric again: the plugin would be added locally (you could script that) and then deployed normally; enabling can either be done normally, or you can create an option in your fabric script that triggers it.
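The push-to-deploy mechanics can be mocked locally with plain git. A sketch, with repo paths and the commit message as stand-ins (in the real setup the bare origin and the live checkout would sit on the server, reached over ssh):

```shell
#!/bin/sh
# Local mock of a push-to-deploy flow. In a real deployment the bare
# "origin" repo and the live checkout live on the server, and the last
# step runs over ssh, e.g.: ssh host "cd /var/www/forum && git pull -f"
set -e
BASE=$(mktemp -d)

# A bare repo acting as the server-hosted origin.
git init -q --bare "$BASE/origin.git"

# The developer's working copy: commit and push.
git clone -q "$BASE/origin.git" "$BASE/work" 2>/dev/null
cd "$BASE/work"
git symbolic-ref HEAD refs/heads/master
git -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "deploy: enable plugins"
git push -q origin master

# The live checkout force-pulls whatever was pushed.
mkdir "$BASE/live" && cd "$BASE/live"
git init -q
git remote add origin "$BASE/origin.git"
git pull -qf origin master
git log -1 --format=%s
```

The `-f` makes the fetch overwrite the live refs even when they have diverged, which is the point of treating the live checkout as a disposable mirror of origin.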
Basically you are moving the control to the client side, and the interaction is done over ssh. You can also have a custom sudoers file, so that you just upload the public key once and aren't continually prompted for credentials.
Personally, I hate that so many web applications/frameworks try to control the file system; it encourages weak setups. It smacks of the tail wagging the dog, but people only care about convenience these days.
grep is your friend.
Thanks @x00. Deploying a "pure" Vanilla site, as in copying the framework, plugins, applications and so on, is not a big deal. There are many ways to do it from outside the framework (any automation tool could be used for that). The same goes for updates: most of the plugins we use can simply be deleted and replaced, without having to be disabled/enabled every time. The few that don't auto-update will be modified to do so.
Plugins and applications are one of the main obstacles: we don't want users to click on Enable, wait, Enable, wait, and so on; therefore, being able to activate plugins automatically is critical. However, it cannot be done by just calling URLs, due to the lack of the Transient Key, which is there for added security but becomes a hindrance in our case.
I reckon that we could leverage the Vanilla API (after rewriting it to support 2.0 and 2.1, as we won't move to 2.2 any time soon) to perform the few administrative tasks we need. It's true that Vanilla API is also an application that has to be enabled, but we should be able to get away with just adding some lines to config.php, since it doesn't do anything complex in the setup() method. A workflow where the API is enabled via config.php should cover our needs: in such a case, we should only have to add the API commands we need, and we should be ready to go.
You can get the TransientKey; this isn't a CSRF context. It's just that using an API is the better solution.
Then I still have to find out how... -_-
Well, you run curl with sessions, then you log in, then you scrape a page for the TransientKey. Fugly, but it will work.
With CSRF, the context is someone else's session; here it is your own session.
I meant curl with a cookie jar.
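A sketch of that cookie-jar approach. The sign-in URL and form field names are assumptions based on Vanilla's stock /entry/signin form, so verify them (and any hidden fields the form may require) against the actual install; only the scraping step is demonstrated live, on canned markup:

```shell
#!/bin/sh
# Sketch: drive Vanilla with curl + a cookie jar, then scrape the
# TransientKey out of the markup. URLs and credentials are placeholders.
BASE="http://forum.example.com"
JAR="cookies.txt"

# The key shows up in hidden inputs and in dashboard links.
extract_key() {
  grep -o 'TransientKey=[A-Za-z0-9]*' | head -n1 | cut -d= -f2
}

# 1. Log in, storing the session cookie in the jar:
#    curl -s -c "$JAR" -d "Email=admin@example.com" -d "Password=secret" \
#         "$BASE/entry/signin" > /dev/null
# 2. Fetch a dashboard page and scrape the key:
#    KEY=$(curl -s -b "$JAR" "$BASE/dashboard/settings" | extract_key)
# 3. Replay the key on the URL copied from the plugin's Enable link.

# Demo of the scraping step on canned markup:
printf '<a href="/x/toggle?TransientKey=abc123XYZ">Enable</a>\n' | extract_key
```

The demo prints `abc123XYZ`; against a live site the same pipe runs on the page curl fetched with the cookie jar.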
A well-constructed plugin should have any database changes within the Structure() method, which can be triggered by a call to /utility/update, bypassing the need to manually enable each plugin. I might start by patching/requesting patches for the plugins you're using as necessary.
Our hosted environment doesn't auto-enable any complex plugins during spawn anyway, so it's a non-issue for us. We do, however, call /utility/update as part of the spawn process. I'm not aware of any special CLI commands during spawn, but it's not my area.
Now that I think about it, none of our plugins or applications use the Structure() method, as I moved everything related to the plugin's setup into its own separate library (I didn't like the mixup of DDL and DML stuff in the main plugin class). However, it should be simple enough to call the same library from Structure() as well; it's just something like PluginInstall::Run().
That could be a workaround to enable plugins "from the outside": enable the plugin via config.php, then trigger the creation of tables using /utility/update. I will just have to test it thoroughly, as some plugins may assume that, if they are enabled, then the tables are already in place (which is not an incorrect assumption, actually).
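A minimal sketch of that workaround, with placeholder plugin names and a demo config path; the final /utility/update request is only printed here rather than executed, and depending on version it may require an admin session:

```shell
#!/bin/sh
# Workaround sketch: flag plugins as enabled in config.php, then hit
# /utility/update so each plugin's Structure() creates its tables.
# CONFIG, BASE and the plugin names are placeholders for a real install.
CONFIG="${CONFIG:-/tmp/demo2/conf/config.php}"
BASE="${BASE:-http://forum.example.com}"

mkdir -p "$(dirname "$CONFIG")"
[ -f "$CONFIG" ] || printf '<?php if (!defined("APPLICATION")) exit();\n' > "$CONFIG"

for PLUGIN in MyPlugin AnotherPlugin; do
  # Idempotent append of the "enabled" flag for each plugin.
  grep -qF "EnabledPlugins']['$PLUGIN']" "$CONFIG" || \
    printf "\$Configuration['EnabledPlugins']['%s'] = TRUE;\n" "$PLUGIN" >> "$CONFIG"
done

# Against the live site, the last step would be:  curl -s "$BASE/utility/update"
echo "Would call: $BASE/utility/update"
```

This only works for plugins whose table definitions live in Structure(), which is exactly the caveat discussed above.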
As long as the proper table definitions are being called/triggered from Structure(), they should not care whether the plugin was previously enabled or not. It will attempt to make any changes needed to bring it in line with the Structure() definitions.
Sorry, I was not clear. What I meant is that the plugin class assumes that all required tables exist from the moment it's instantiated.
That should not be an issue, as I don't recall adding any business logic to the constructor, but I will have to double check, just to be sure.
Oh, that's true. It seems like querying from the __construct() would be a bad idea for performance anyway.