Can a urlcode contain "number signs" (#) ?
xDaizu
Using branch master from... a week ago. (Sorry, not my fault it doesn't have a proper name! Or I can't find it, at least.)
Today I was tinkering with the API and I provided the name 1234#test as the URLCode for a Category. It didn't throw any error, like it did when I tried slashes or quotes, but when I click on the category, it throws a "Whoops! Page not found." error. Is this normal behaviour?
Comments
http://en.wikipedia.org/wiki/Fragment_identifier
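To illustrate (a quick Python sketch, not Vanilla's code): the browser treats everything after # as a fragment identifier and never sends it to the server, so the category lookup only ever sees the part before the #.

```python
from urllib.parse import urlparse

# The browser splits the URL at '#'; only the path is sent to the server.
url = "http://example.com/categories/1234#test"
parts = urlparse(url)

print(parts.path)      # /categories/1234  <- what the server receives
print(parts.fragment)  # test              <- kept client-side only
```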
grep is your friend.
Maybe you can URL-encode it? "1234#test" -> "1234%23test"
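Something like this (Python shown just for illustration): # is a reserved character, so percent-encoding it lets it survive inside a path segment.

```python
from urllib.parse import quote

# '#' is reserved in URLs; percent-encode it so it isn't treated as a fragment.
encoded = quote("1234#test", safe="")
print(encoded)  # 1234%23test
```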
You should not name your category just a number either, as you will get weird behaviour too. If you want your category to have strange characters, make sure the slug is clean (you can save it separately).
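A minimal slug cleaner along those lines might look like this (an illustrative Python sketch, not Vanilla's actual implementation; the `slugify` name and the "c-" prefix are made up here):

```python
import re

def slugify(name: str) -> str:
    """Keep letters, digits and hyphens; collapse everything else to '-'."""
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    # Prefix purely numeric slugs so they aren't mistaken for numeric IDs.
    if slug.isdigit():
        slug = "c-" + slug
    return slug

print(slugify("1234#test"))  # 1234-test
print(slugify("1234"))       # c-1234
```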
It's not that I want # in the URLs; it's just that I'm testing the limits of the API before opening it up to the dumb users.
Then if, by design, a category shouldn't include # or be just a number, it should throw an exception like it does when you try to include quotes or slashes, right?
I wouldn't open up the api to users. Why are users creating categories anyway?
Not actually users. Just managers, which are like users, but fewer... haha. Never mind why they're creating categories; they have to, because it's the business model. Besides, that's not the point.
The point is that the "wrong name" exception has holes. It should be stricter: at least as strict as the set of valid URLCodes, shouldn't it?
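For example, a stricter check could reject # and digits-only names up front, the same way quotes and slashes are rejected (a hypothetical Python sketch; the validator name and regex are assumptions, not the core's actual rules):

```python
import re

# Letters, digits and hyphens only, and not purely numeric.
VALID_URLCODE = re.compile(r"^(?!\d+$)[A-Za-z0-9-]+$")

def validate_url_code(url_code: str) -> str:
    """Raise early on slugs that would later produce a broken URL."""
    if not VALID_URLCODE.match(url_code):
        raise ValueError(f"Invalid URLCode: {url_code!r}")
    return url_code

validate_url_code("my-category")  # accepted
try:
    validate_url_code("1234#test")
except ValueError as e:
    print(e)  # Invalid URLCode: '1234#test'
```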
@xDaizu it works correctly in the core.
Kasper specifically asked for issues with the API to be reported on GitHub:
https://github.com/kasperisager/vanilla-api/issues
I suspect it would be your responsibility to URL-encode the slug.
Oh, OK. I was literally told that the API didn't do any of the "heavy lifting" of the requests (which I assumed included validation), so I thought it was a core problem.