Feed Publisher
This discussion has been closed.
Comments
It's just there to let you use SimpleCache; we needed to prevent FP from halting Vanilla's execution.
Thanks to Klod for his tests.
Hi chris, could you try changing a line in the ReturnFeedItem() function of the functions.php file?
Replace
<title>' . $Properties[ 'Title' ] . '</title>
by
<title>' . htmlspecialchars( $Properties[ 'Title' ]) . '</title>
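For context, the htmlspecialchars() call escapes HTML metacharacters in the feed title, so a title containing markup cannot break the XML or be interpreted by a feed reader. A minimal sketch of the difference ($Properties and the surrounding <title> markup come from the extension's ReturnFeedItem(); the sample title itself is made up):

```php
<?php
// Hypothetical feed title containing HTML metacharacters.
$Properties = array('Title' => 'Tips & tricks <script>alert(1)</script>');

// Unescaped: the raw markup lands inside the <title> element as-is.
$unsafe = '<title>' . $Properties['Title'] . '</title>';

// Escaped: &, < and > become entities, so the feed XML stays well-formed.
$safe = '<title>' . htmlspecialchars($Properties['Title']) . '</title>';

echo $safe . "\n";
// <title>Tips &amp; tricks &lt;script&gt;alert(1)&lt;/script&gt;</title>
?>
```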
link to the fix.
I guess that in categories_config.php, on lines 41, 42, 62 and 63, there should be something like this
<a href="'.$Configuration['WEB_ROOT'].'search.php?PostBackAction...
instead of
<a href="/search.php?PostBackAction...
Otherwise it causes a 404 error when Vanilla is installed in a subfolder and you access the discussion board feed or the general discussion feed.
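To illustrate the suggestion above as a sketch (the WEB_ROOT value is a hypothetical example, and the query string is abbreviated in the thread, so a shortened stand-in is used here):

```php
<?php
// Hypothetical value; in a real install this comes from Vanilla's configuration.
$Configuration = array('WEB_ROOT' => 'http://example.com/forum/');

// Hard-coded root-relative link: breaks when Vanilla lives in a subfolder.
$hardcoded = '<a href="/search.php?PostBackAction=Search">feed</a>';

// Prefixing with WEB_ROOT keeps the link valid regardless of the install path.
$portable = '<a href="' . $Configuration['WEB_ROOT'] . 'search.php?PostBackAction=Search">feed</a>';

echo $portable . "\n";
?>
```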
The code in this file has been separated because it's pure personal config, so you can easily update FP without touching your settings.
I know I sometimes make things trickier than they should be, but this time I made them simpler (or maybe it was just Dan39!).
So feel free to tweak this file as you wish.
You are the first to report this. If you think that adding $Configuration['WEB_ROOT'] will help people understand this file and put working code in it, please try to convince me again, because I'd prefer to leave it as is. Maybe the comments are not clear enough.
Thanks for sharing your idea. I don't want to discourage anyone's willingness to report issues; it's really kind of you, and we all love our community.
Now I'm gonna release this new version...
sure, no problem, it is completely up to you
I just went through the config file again and now I maybe understand better what you mean.
I was just thinking that it should work with the default config on a default installation (even when Vanilla is installed in a subfolder).
I think other configurations will be rare.
If this is not going to be changed, then take my comments as just an example configuration.
And I just want to say I realised there are more /search.php links there than I mentioned in my previous comment.
So I changed all occurrences of /search.php to '.$url.'search.php
and put $url = '/'; at the beginning of all the code, so I have only one place to change if I need to change the root folder.
Actually, I have $url = $Configuration['WEB_ROOT']; there.
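The approach described above, as a sketch (the full query string of the search link is abbreviated in this thread, so the one below is a hypothetical stand-in):

```php
<?php
// Hypothetical stand-in for Vanilla's configuration array.
$Configuration = array('WEB_ROOT' => 'http://example.com/forum/');

// Define the base path once, at the top of the config file...
$url = $Configuration['WEB_ROOT'];

// ...then reuse it for every search link instead of hard-coding "/search.php".
$link = '<a href="' . $url . 'search.php?PostBackAction=Search&amp;Feed=RSS2">RSS</a>';

echo $link . "\n";
?>
```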
anyway, thanks for this extension
2) you can use searches to track discussions with feeds, try
http://lussumo.com/community/search/?PostBackAction=Search&Keywords=&Type=Topics&btnSubmit=Search&Feed=RSS2
you should get what you want
OK, let me repeat this to see if I understood:
On the home page of lussumo, we have this link
http://lussumo.com/community/discussion/7410/open-sourcing-our-addons/#Item_43
In the feed for this home page, we have this corresponding link
http://lussumo.com/community/discussion/7410/
On the home page of my local test site (FeedPublisher installed), I have this link
http://feedpublisher.local/comments.php?DiscussionID=6&page=1#Item_2
In the feed for this home page, we have this corresponding link
http://feedpublisher.local/comments.php?DiscussionID=6&Focus=37#Comment_37
So in each case, we have two URLs that lead us to the same content but with slightly different URLs. Search engines consider this to be duplicate content (different URLs = different pages). My questions:
1) I guess the GET string is part of the URL, so different GET strings are considered different pages, am I right? People who don't use pretty URLs have the same problem as everyone else.
2) What about the anchor value? Is it considered a different request? My guess is no (sorry, this is only a very small part of the problem, but I wanted to know).
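On question 2: the part after # (the fragment) is resolved by the browser and is never sent to the server, so two URLs differing only in the anchor are the same request. A quick sketch using PHP's parse_url(), applied to the feed URL quoted above:

```php
<?php
$url = 'http://feedpublisher.local/comments.php?DiscussionID=6&page=1#Item_2';

$parts = parse_url($url);

// The query string is part of the request the server sees...
echo $parts['query'] . "\n";    // DiscussionID=6&page=1

// ...but the fragment stays client-side and never reaches the server.
echo $parts['fragment'] . "\n"; // Item_2
?>
```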
I'm not an SEO guru, but I have an idea: would a robots.txt do the trick?
EDIT: I investigated a bit, but it doesn't seem feasible with simple skills only. Any robots.txt gurus here?
I see no solution other than building exactly the same URL in both places.
User-agent: *
Disallow: /*&Focus=
Disallow: /*?Focus=
You can test it out in the Google Webmaster Tools. Google (and other search engines) will ignore any URL that has a Focus parameter in it.