
No links in title

x00 MVP

@Linc @R_J

There has been a lot of manual spam, especially from India, with links in the title.

This is what I use against it:

    public function discussionModel_beforeSaveDiscussion_handler($sender, $args) {
        // Reject discussion titles that contain URLs or bare domain names.
        $sender->Validation->addRule('NoLinks', 'regex:/^(?:(?!\b(((https?:\/\/|www\.)[\S]+)|([a-z0-9\-\.]{3,}\.(com|org|net|int|edu|gov|mil|co|ca|de|jp|fr|au|us|ru|ch|it|nl|se|no|es|co\.[a-z]{2})(\/[\S]*)?))\b).)*$/');
        $sender->Validation->applyRule('Name', 'NoLinks', 'Title cannot contain links');
    }

You can also limit links in the body based on post count (untested):

    public function discussionModel_beforeSaveDiscussion_handler($sender, $args) {
        // Titles may never contain links.
        $sender->Validation->addRule('NoLinksTitle', 'regex:/^(?:(?!\b(((https?:\/\/|www\.)[\S]+)|([a-z0-9\-\.]{3,}\.(com|org|net|int|edu|gov|mil|co|ca|de|jp|fr|au|us|ru|ch|it|nl|se|no|es|co\.[a-z]{2})(\/[\S]*)?))\b).)*$/');
        $sender->Validation->applyRule('Name', 'NoLinksTitle', 'Title cannot contain links');

        // Users without moderation permission or the Verified flag, and with at most
        // NoLinksBody.NumPosts posts, may not put links in the body either.
        if (!checkPermission('Garden.Moderation.Manage') && !Gdn::session()->User->Verified && Gdn::session()->User->CountDiscussions + Gdn::session()->User->CountComments <= c('NoLinksBody.NumPosts', 5)) {
            $sender->Validation->addRule('NoLinksBody', 'regex:/^(?:(?!\b(((https?:\/\/|www\.)[\S]+)|([a-z0-9\-\.]{3,}\.(com|org|net|int|edu|gov|mil|co|ca|de|jp|fr|au|us|ru|ch|it|nl|se|no|es|co\.[a-z]{2})(\/[\S]*)?))\b).)*$/');
            $sender->Validation->applyRule('Body', 'NoLinksBody', 'You can\'t post links yet');
        }
    }


    public function commentModel_beforeSaveComment_handler($sender, $args) {
        // Same body rule for comments from new, unverified users.
        if (!checkPermission('Garden.Moderation.Manage') && !Gdn::session()->User->Verified && Gdn::session()->User->CountDiscussions + Gdn::session()->User->CountComments <= c('NoLinksBody.NumPosts', 5)) {
            $sender->Validation->addRule('NoLinksBody', 'regex:/^(?:(?!\b(((https?:\/\/|www\.)[\S]+)|([a-z0-9\-\.]{3,}\.(com|org|net|int|edu|gov|mil|co|ca|de|jp|fr|au|us|ru|ch|it|nl|se|no|es|co\.[a-z]{2})(\/[\S]*)?))\b).)*$/');
            $sender->Validation->applyRule('Body', 'NoLinksBody', 'You can\'t post links yet');
        }
    }
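
If you use the post-count variant, the threshold read by the c() call above can be overridden in conf/config.php, e.g.:

    // conf/config.php – raise the default threshold of 5 posts
    $Configuration['NoLinksBody']['NumPosts'] = 10;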

grep is your friend.

Comments

  • K17 Français / French Paris, France ✭✭✭

    Really interesting, because we can't do anything against manual spamming... This will limit the possibilities.

  • R_J Ex-Fanboy Munich Admin

    I'm not the one to implement anything like that here. I know that actions are being taken to reduce that spam, and those actions should prevent spammers from registering.

    From my point of view, that is the best approach. You only need to monitor the registration process, not every post, which is better for performance. It will also prevent a myriad of spam-bot users.

    If I had a community and I were facing spammers getting through the registration process, I would tend towards using your second approach.
    But if I had time for a complex solution, I would try a different approach, one that makes use of the "Verified" flag, the SpamModel and the human brain in the form of community interaction:

    1. On setup(), set the "Verified" flag for all users with CountDiscussions + CountComments > 20 and Banned/Deleted = 0. Those will be active users, and their posts will no longer have to be spam filtered.
    2. Hook into the spam model (CheckSpam handler) and mark every discussion/comment with a link as spam.
    3. Allow "Verified" users to "trust" other users by adding a "Nominate Verified" reaction below each user who does not have the "Verified" flag.
    4. Log the UserIDs of those who trusted a user, to be able to track down abuse.
    5. Users who have been marked "trusted" by several users (configurable value) get the "Verified" flag set, so that their posts pass the spam filter.

    As a result, I would hope to only find links from users, not from spammers. A rough sketch of the first two steps follows below.
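
    An untested sketch of steps 1 and 2. The CheckSpam event name, the handler signature and the SQL-builder calls should be checked against your Vanilla version; the link regex is just a stripped-down version of @x00's rule above.

        public function setup() {
            // Step 1: flag established, non-banned, non-deleted users as Verified
            // so the spam check below no longer applies to them.
            Gdn::sql()->update('User')
                ->set('Verified', 1)
                ->where('Banned', 0)
                ->where('Deleted', 0)
                ->where('CountDiscussions + CountComments >', 20, false)
                ->put();
        }

        public function base_checkSpam_handler($sender, $args) {
            // Step 2: treat any discussion or comment containing a link as spam,
            // unless the author already carries the Verified flag.
            if (!in_array($args['RecordType'], ['Discussion', 'Comment'])) {
                return;
            }
            $user = Gdn::session()->User;
            if ($user && $user->Verified) {
                return;
            }
            $body = val('Body', $args['Data'], '');
            if (preg_match('/\b(https?:\/\/|www\.)\S+/i', $body)) {
                $sender->EventArguments['IsSpam'] = true;
            }
        }

    Steps 3-5 would need the Reactions mechanism plus some storage for the nominations, so they are left out of this sketch.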

  • Manual spammers do not have any issue getting through registration. There is cheap labour from India and China.

    Validation rules have a minimal performance cost.

    You could use the spam model, but that creates more work for moderators. Things like Stop Forum Spam obviously have a performance cost.

    There are pros and cons to either system.

    The fact that these links are nofollow should be deterrent enough; however, these spammers are just paid to do a job, and it doesn't require much understanding. Making spamming less attractive in a more obvious way means the site is less valuable to them.

    grep is your friend.

  • One way is to not make new users' posts public until they are approved or a time period has passed.
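
    Untested sketch of that idea, reusing the CheckSpam event: anything posted by an account younger than a configurable number of days goes to the spam queue, so a moderator has to release it before it becomes public. The config key and the 3-day default are made up for the example.

        public function base_checkSpam_handler($sender, $args) {
            $user = Gdn::session()->User;
            // Moderators and verified members are never held back.
            if (!$user || $user->Verified || checkPermission('Garden.Moderation.Manage')) {
                return;
            }
            // Hold posts from accounts younger than the configured number of days.
            $accountAgeDays = (time() - strtotime($user->DateInserted)) / 86400;
            if ($accountAgeDays < c('HoldNewUsers.Days', 3)) {
                $sender->EventArguments['IsSpam'] = true;
            }
        }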

    grep is your friend.

  • R_J Ex-Fanboy Munich Admin

    @x00 said:
    Validation rules have a minimal performance cost.

    Yes, that was more a thought from me than a real statement. I totally agree that adding some validation to the BeforeSave event doesn't count.

    @x00 said:
    Manual spammers do not have any issue getting through registration. There is cheap labour from India and China.
    (...)
    One way is to not make new users' posts public until approved

    That's what I hadn't really considered until now: humans are able to bypass all registration hurdles, but paid spammers need a motivation to do so. In order to get rid of them, it might indeed be necessary to pre-moderate all posts of new members.

    Moderation effort is needed anyway, and I would feel better verifying 2 or 3 real discussions per day than deleting 30-40 spammers.

  • rbrahmson "You may say I'm a dreamer / But I'm not the only one" NY ✭✭✭

    Well, yes, but I do see value in using some automation to reduce the moderator's burden. I really like the idea of letting spammers be the only ones who see their own creations without letting them know that they get special treatment. I think it was on your evil plugin list...

  • rbrahmson "You may say I'm a dreamer / But I'm not the only one" NY ✭✭✭

    Variation: after analysis, internally mark both the spammers and their discussions/comments as "Spam"; thereafter, only users marked as spammers can see spam-marked discussions/comments, while other users cannot see them. Let the spammers enjoy their handiwork. After a while (when enough spam has accumulated), let the spammers see only spam (so they can only comment on spam). Now, that's more evil ;-)
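
    Untested sketch of the first half of that, assuming DiscussionModel still fires BeforeGet so the list query can be extended, and using a made-up config list of flagged UserIDs. Flagged spammers keep seeing everything (including their own posts); everyone else never sees discussions started by them. A complete version would need the same treatment for comments and for direct discussion URLs.

        public function discussionModel_beforeGet_handler($sender, $args) {
            // Hypothetical list of flagged spammer UserIDs kept in the config.
            $spammerIDs = c('EvilPlugin.SpammerIDs', []);
            if (empty($spammerIDs)) {
                return;
            }
            if (!in_array(Gdn::session()->UserID, $spammerIDs)) {
                // Normal users: silently drop discussions started by flagged spammers.
                $sender->SQL->whereNotIn('d.InsertUserID', $spammerIDs);
            }
        }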

  • In all honesty, they don't care that much. They just move from forum to forum; the quicker you make them move on, the better.

    It is just a job to them. If they can earn a few bob doing this to get something to eat, they don't really care what they have been told to do.

    grep is your friend.

  • rbrahmson "You may say I'm a dreamer / But I'm not the only one" NY ✭✭✭

    True, so add to the evil plugin notion some artificial delay; slow them down so they won't move that fast to the next site.

    But let's think about what the process is. I imagine they work off a list of websites. How do you get off that list? A fake "down" state?

  • x00 MVP
    edited November 2017

    It is not entirely manual; it is semi-automated, hence the generated content.

    If you were able to fake a downed site you could block them effectively, but this is not going to be a good move, as it affects the site's reputation.

    You could ban IPs from India, but even I have Indian clients. It is not really that fair, and it is easily got around.

    The only way is to make the site less attractive to spammers and make their lives more difficult.

    grep is your friend.

  • rbrahmson "You may say I'm a dreamer / But I'm not the only one" NY ✭✭✭

    I haven't checked whether spam IDs are reused or not. If they are, then the evil plugin would create a black hole ;-)

    Also, I concur that blocking a country is over the top (and bigoted, IMHO), but could the plugin build a more refined blacklist of IP ranges that get the "site down" treatment?

  • You just refuse the request. You can keep them hanging, but at your expense.
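
    Bare-bones sketch of that "site down" treatment for specific ranges. The early dispatcher hook name and the config key are assumptions (check the event against your Vanilla version); the CIDR check itself is plain PHP and IPv4-only.

        public function gdn_dispatcher_beforeDispatch_handler($sender, $args) {
            // CIDR ranges that get a fake outage instead of the forum.
            $blocked = c('EvilPlugin.BlockedRanges', []); // e.g. ['203.0.113.0/24']
            $ip = ip2long(Gdn::request()->ipAddress());
            if ($ip === false) {
                return; // IPv6 or unknown address, let it through.
            }
            foreach ($blocked as $cidr) {
                list($subnet, $bits) = explode('/', $cidr);
                $mask = -1 << (32 - (int)$bits);
                if ((ip2long($subnet) & $mask) === ($ip & $mask)) {
                    header('HTTP/1.1 503 Service Unavailable');
                    header('Retry-After: 86400');
                    exit('Down for maintenance.');
                }
            }
        }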

    grep is your friend.

  • Most of these spammers are given software which already uses proxies, so banning IPs is not going to do much to stop them; that is pretty much the first thing they think of.

    grep is your friend.

  • @Todd @Linc

    Can someone do something about this specific spamming campaign at least? It is obviously the same people each day. I no index the discussions every day for you.

    grep is your friend.

  • @x00, the team has been working on an addon to manage a lot of the spam registrations here. Hope to see it deployed within the next week.

    Add Pages to Vanilla with the Basic Pages app

  • rbrahmson "You may say I'm a dreamer / But I'm not the only one" NY ✭✭✭

    Hi @Shadowdare - hope this addon will be placed in the addon repository.

    @x00 - I wonder whether the purpose of the recent group of spammers is to affect Google ranking or to gain actual clicks. I know they are working off a script, but what's the real strategy here?

  • rbrahmson "You may say I'm a dreamer / But I'm not the only one" NY ✭✭✭

    I actually looked into one of these spams (getnutritionshelp.com). Whois says their domain registration is with "WhoisGuard, Inc." in Panama, with the client name hidden. I then did a Google search for "get nutritions help" and found the website on top, with many other entries representing spammed forums. So clearly their method is to attack open forums. Thus, a solution to the problem on this site would be beneficial to many Vanilla users.

    If the solution is more like the "evil" plugin, I'd add nofollow/noindex to the specific page ;-) but I assume that's not what you are working on.
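
    For the nofollow/noindex part, something along these lines might do it (untested; assumes the discussion record and the Head module are available on the controller):

        public function discussionController_render_before($sender) {
            $discussion = $sender->data('Discussion');
            if (!$discussion || !is_object($sender->Head)) {
                return;
            }
            // Ask search engines to ignore discussions started by unverified users.
            $author = Gdn::userModel()->getID(val('InsertUserID', $discussion));
            if ($author && !val('Verified', $author)) {
                $sender->Head->addTag('meta', ['name' => 'robots', 'content' => 'noindex, nofollow']);
            }
        }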

  • x00 MVP
    edited November 2017

    @rbrahmson said:
    Hi @Shadowdare - hope this addon will be placed in the addon repository.

    @x00 - I wonder whether the purpose of the recent group of spammers is to affect Google ranking or to gain actual clicks. I know they are working off a script, but what's the real strategy here?

    They are paid lackeys; they have little to no knowledge of SEO, and most of the strategy they are told to follow is based on obsolete black-hat SEO techniques.

    This is what happens when people answer those emails claiming they can improve SEO rank. Many of the companies involved don't even realise they have paid for this spam.

    I have been to India; they struggle with corruption, and they are starting to do something about it, but there is a long way to go. In a culture of corruption, many people don't care about shitting on their own doorstep or long-term reputation, so long as there is some short-term gain, regardless of whether it is at someone else's expense.

    grep is your friend.
