
Vanilla 2.6 Installation Issue(s)


Comments

  • @x00 said:
    because the web process owner / group shouldn't have ownership over files it can potentially write / execute, except in rare sandbox scenarios where you would want the framework to directly control file management through scripting, or certain PHP setups where the process user typically isn't www-data. It is one further step in the wrong direction. You don't have to use root, however.

    Yes, you are right, the other command was not good; and besides, you know how to use find.

    Wildcards don't match hidden files by default.

    @x00 I beg to differ.. in many cases you do want the web process to write to a file. And you are talking about 90% of the web installations out there that use the permissions I suggested. Bottom line, he had root as owner:group on many files and directories, without executable permissions at all.. without going into a diatribe on Linux permissions, if you have:

    drwxr-x--- root root parentdirectory
    and
    -rw-r--r-- httpowner httpowner childfile

    That file will never be read (or written to) by Apache. Do you think that is good?

    That's what I was potentially fixing... check his very first output of perms and you'll see what I mean.

    The only reason I am beating a dead horse here is that you went off so "loudly", and you didn't even care to read the entire thread. You said I was giving files executable perms. You implied I was suggesting permissions that were a security problem. Neither is true.

    Could one harden the box more? Sure.. but 90% of web server environments are set up with:
    directories: 755
    files: 644
    For a reason.. it's versatile and it ensures operation... which is what I was trying to do.. to help this guy get his forum operational.. you can always harden afterwards (to your paranoia).

    Enough said I think :-)
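
    A sketch of the 755/644 scheme and the find usage mentioned above (the /var/www/forum path is an assumption, not from the thread; ownership is left out since that is the point of dispute). Note that find, unlike a shell wildcard, also matches hidden files:

    # Directories 755 (rwxr-xr-x), files 644 (rw-r--r--):
    sudo find /var/www/forum -type d -exec chmod 755 {} +
    sudo find /var/www/forum -type f -exec chmod 644 {} +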

  • x00 MVP
    edited June 2018

    @donovanb said:
    @x00 I beg to differ.. in many cases you do want the web process to write to a file. And you are talking about 90% of the web installations out there that use the permissions I suggested. Bottom line, he had root as owner:group on many files and directories, without executable permissions at all.. without going into a diatribe on Linux permissions, if you have:

    It is an anti-pattern for a framework to control the entire codebase's file management through scripting, hence the "tail wagging the dog" remark. Yes, frameworks like WordPress do this, and it has always been controversial even within the organisation, to the extent that you do have alternative deployment methods.

    Of course there are folders and files for which you would selectively do it. I never said otherwise; I was commenting on the wildcard approach.

    The point I was making is that it is better not to put all of the codebase under the web process user. Instead it is better to put it under the user/group of the maintainer, or even nobody.

    Then you put only those folders you do need the web process user to control under it (see the sketch below).
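
    A sketch of that ownership split, assuming a typical Vanilla layout (the /var/www/forum path, the "deploy" maintainer user, and the cache and uploads directory names are assumptions, not from the thread):

    # Codebase owned by the maintainer, not the web process user:
    sudo chown -R deploy:deploy /var/www/forum
    # Give the web process user (www-data here) write access only where the
    # application genuinely needs it, e.g. cache and uploads:
    sudo chown -R deploy:www-data /var/www/forum/cache /var/www/forum/uploads
    sudo chmod -R g+w /var/www/forum/cache /var/www/forum/uploads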

    grep is your friend.

  • I'm sorry, it was an annoyingly snobby (and inaccurate) post you made that didn't help the OP.

  • x00 MVP
    edited June 2018

    @donovanb said:
    Could one harden the box more? Sure.. but 90% of web server environments are set up with:
    directories: 755
    files: 644
    For a reason.. it's versatile and it ensures operation... which is what I was trying to do.. to help this guy get his forum operational.. you can always harden afterwards (to your paranoia).

    Enough said I think :-)

    I never suggested these permissions weren't adequate, just not with the web process user as owner.

    We can agree to disagree, but you will be giving file write permission to the web process user when it is not needed or advisable. That is it, nothing else, nothing more.

    grep is your friend.

  • @donovanb said:
    I'm sorry, it was an annoyingly snobby (and inaccurate) post you made that didn't help the OP.

    I was just pointing something out; you chose to take offence at it.

    grep is your friend.

  • Okay, I do agree with you there on 'Linux permission best practices' @x00 .. we can move on.

  • x00 MVP
    edited June 2018

    no problem.

    Some advice to the OP: you can check that a server rule is being reached by putting some visible effect there, or by actual debugging, and you can check internal redirects by switching to 302 redirects (see the sketch below).

    http://httpd.apache.org/docs/2.4/mod/mod_rewrite.html#logging

    This assumes the OP has enabled AllowOverride so the .htaccess file can be used.
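
    A sketch of that debugging approach, assuming Apache 2.4 on a Debian-style layout (the forum.conf path and the rewrite rule shown are assumptions, not taken from the thread):

    # Enable mod_rewrite trace logging (Apache 2.4 syntax) in the vhost:
    echo 'LogLevel alert rewrite:trace3' | sudo tee -a /etc/apache2/sites-available/forum.conf
    # To check that a rule is actually reached, temporarily make it an external
    # 302 redirect so the target URL shows up in the browser's address bar:
    #   RewriteRule ^(.+)$ index.php?p=$1 [QSA,R=302,L]
    sudo systemctl restart apache2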

    grep is your friend.

  • It's ok, lol. I've already solved this problem, don't worry. AllowOverride All isn't being declared in apache2.conf anymore. I've made a conf specific to the website and declared it under that instead.

  • Setting $Configuration['Garden']['RewriteUrls'] = false; works. However, the impact is that the URL is not friendly; it has index.php?p=/ after the base URL. Setting it to true makes the URL friendly, but then the page is not found on the server. Can it be fixed in .htaccess, or with a plugin?

  • R_J Ex-Fanboy Munich Admin

    @martin28 said:
    Setting $Configuration['Garden']['RewriteUrls'] = false; works. However, the impact is that the URL is not friendly; it has index.php?p=/ after the base URL. Setting it to true makes the URL friendly, but then the page is not found on the server. Can it be fixed in .htaccess, or with a plugin?

    When you look through this discussion, you will find at least two alternative .htaccess files. Try them and that should solve the problem (a generic version is sketched below).
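
    For reference, a generic front-controller rewrite of the kind those files contain; this is a sketch, not a verbatim copy of any file posted in the thread, and the web root path is an assumption:

    # Write a minimal Vanilla-style front-controller rewrite into the web root:
    printf '%s\n' \
      'RewriteEngine On' \
      'RewriteCond %{REQUEST_FILENAME} !-f' \
      'RewriteCond %{REQUEST_FILENAME} !-d' \
      'RewriteRule ^(.+)$ index.php?p=$1 [QSA,L]' \
      > /var/www/forum/.htaccess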

  • @R_J , I tried the barebones .htaccess file you provided and set RewriteUrls back to true. I also set the virtual host directory to AllowOverride All (in httpd.vhost.conf). After restarting Apache it's back to the same issue: only the home route is working; the discussions, activity and other pages show "The requested URL /discussions was not found on this server." Should I set AllowOverride All in httpd.conf rather than httpd.vhost.conf? My worry is the security.
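
    On the security worry: AllowOverride can stay in the vhost file, scoped to the one site's directory, rather than being enabled globally in httpd.conf. A sketch (the directory path is an assumption; the vhost file name follows the OP's):

    # Scope AllowOverride to this one site, inside its vhost file:
    printf '%s\n' \
      '<Directory "/var/www/forum">' \
      '    AllowOverride All' \
      '    Require all granted' \
      '</Directory>' \
      | sudo tee -a httpd.vhost.conf
    sudo apachectl restart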

  • I managed to get it working with friendly URLs by: 1. enabling rewrite_module in Apache's httpd.conf, 2. using the original .htaccess from .htaccess.dist, and 3. setting $Configuration['Garden']['RewriteUrls'] = true; (see the sketch below).
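
    The same three steps as shell commands, assuming a stock httpd.conf layout (all paths are assumptions; adjust to your install):

    # 1. Enable mod_rewrite by uncommenting its LoadModule line:
    sudo sed -i 's/^#\(LoadModule rewrite_module\)/\1/' /etc/httpd/conf/httpd.conf
    # 2. Use the rewrite rules Vanilla ships with:
    cp /var/www/forum/.htaccess.dist /var/www/forum/.htaccess
    # 3. In conf/config.php: $Configuration['Garden']['RewriteUrls'] = true;
    sudo apachectl restart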

  • x00 MVP
    edited June 2018

    Just to educate, not to cause animosity: WordPress's own advice is not to use the process user for all the files, even though they have long used the framework to manage its own files.

    https://codex.wordpress.org/Changing_File_Permissions

    All files should be owned by the actual user's account, not the user account used for the httpd process

    This applies to suexec setups, which applies to most popular configurations. You can set up access to the files through configuration to ensure updates work.

    Despite this, many installations are still done wrong.

    grep is your friend.

  • @x00 said:
    Just to educate, not to cause animosity: WordPress's own advice is not to use the process user for all the files, even though they have long used the framework to manage its own files.

    https://codex.wordpress.org/Changing_File_Permissions

    All files should be owned by the actual user's account, not the user account used for the httpd process

    This applies to suexec setups, which applies to most popular configurations. You can set up access to the files through configuration to ensure updates work.

    Despite this, many installations are still done wrong.

    I guess if you don't want animosity.. you shouldn't start with "Holy crap, don't do that.."... ;-)

    Yes, you are right that one could create a user to manage (and own) files to be more secure. This person had not done that... he (or she) had been editing files as root. Instead of taking up his time asking him to create a user to edit files, or spending the time to install suexec and reconfigure everything, I was addressing the actual problem.

    Also, who listens to WordPress???.. the most widely hacked platform on the planet. :-)

    tldr;
    Contrary to your conclusion about what WordPress was recommending.. they are simply saying that files need to be owned by the account's user in a suexec environment because.. well, it just wouldn't work right otherwise, right?... editing, running, and accessing would be potentially broken and/or frustrating.

    You also said that suexec setups are the most popular "configurations"... sorry, not sure I agree with that. First, Microsloth has recently overtaken Apache as the number 1 web server. Second, though I don't know the stats these days.. with Amazon being the number 1 host provider, I'm guessing shared hosting environments still outweigh other Linux environments.. and many of them have loose permissions. Not that that is good, but just correcting your statement.

    If you look right above your quote in that article, WordPress recommends exactly what I recommended to this person:

    "For these systems, the php process runs as the owner of the php files themselves, allowing for a simpler configuration and a more secure environment for the specific case of shared hosting."

    Ultimately, I do agree with you on your base premise. You are right that setting up a user to own files, or running the environment as suexec, is more secure. However, I wasn't about to ask the OP to do that.. and ultimately, security has many facets that differ with each owner. This was not a "Linux best practices" post... and not every hosting environment needs a suexec setup.

    I would say your comments in this thread would make a great "Linux best practices" post. :-)

  • xDrac New
    edited July 2018

    I'm having the same issue as described in the first post. Did you get around to fixing this?
    Edit: Never mind, renaming '.htaccess.dist' to '.htaccess' actually worked.
