I would actually classify his version as pretty slow - but we've got some indexes on it now that are helping quite a bit. we'll post the results when finished.
Oh, I don't know.
They've got whispers enabled, so that definitely slows things down on the discussion list. When they turned whispers off it sped up like nobody's business - but they really like whispers, so they turned it back on.
right now it's slower than the old software, but there are a lot of other factors that have changed, so there can't be an apples:apples comparison. the other software was a lot less feature-complex than vanilla and i think it was on a different host. what host did you use, denied?
the idea of tw with whispers makes my head feel funny. that's actually quite frightening...
and the old code was super lightweight, so vanilla probably won't ever get faster than it (based on a similar server set-up, that is. on a honking dedicated box, that's a different question). but i'm not giving up the old tw code, so they're shit outta luck!
but, really, it's better for them to be on vanilla.
itchy - jonezy was an active member on tw, not like the bastard ever went there, or anything, but you can let him into the new one.
and, really, that much jibber jabber (almost half a million posts) in about 13-14 months from a heavily active user base of, say, 50 people, plus a further 80 or so less-frequent users who weren't banned, is something of a marvel.
Hmm. If someone gives me a dataset, I'll hook the thing up to DTrace on Solaris and tell you exactly where the thing is spending its time, whether it be in PHP functions, MySQL, or something else.
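For the curious, that kind of DTrace session might start with one-liners along these lines — a sketch only, assuming a Solaris box with root access and a PHP build that includes the DTrace provider (the `<httpd-pid>` placeholder is whatever Apache child you attach to):

```
# which PHP functions are called most often (needs the PHP DTrace provider)
dtrace -n 'php*:::function-entry { @calls[copyinstr(arg0)] = count(); }'

# where a given process spends its time at the system-call level
dtrace -n 'syscall:::entry /pid == $target/ { @[probefunc] = count(); }' -p <httpd-pid>
```

From there you can aggregate on timestamps instead of counts to get actual time-in-function, which is what tells you whether PHP, MySQL, or the kernel is the bottleneck.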
That would be swell - would the data from this community site be enough, or would you really need a big hunk of data like ithcy's dealing with to get some good results?
Another good way to index your tables when you have an existing system like this is to turn on query logging, let it run for a while, then look at the most frequent queries. Now place indexes on the columns those queries search on (where w=x and y=z). If you do a lot of searching on tables that become large, this can help tremendously.
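The before/after effect of that kind of index is easy to see with an EXPLAIN. Here's a minimal sketch using Python's built-in sqlite3 rather than MySQL (the table and column names are made up for illustration, not taken from Vanilla's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE comment ("
    "id INTEGER PRIMARY KEY, discussion_id INTEGER, user_id INTEGER, body TEXT)"
)

# Pretend this turned up as one of the most frequent queries in the log.
query = "SELECT * FROM comment WHERE discussion_id = ? AND user_id = ?"

# Without an index, the planner has to scan the whole table.
before = cur.execute("EXPLAIN QUERY PLAN " + query, (7, 3)).fetchall()[0][-1]
print(before)  # e.g. "SCAN comment"

# Index the columns that appear in the WHERE clause of the frequent query.
cur.execute(
    "CREATE INDEX ix_comment_disc_user ON comment (discussion_id, user_id)"
)

after = cur.execute("EXPLAIN QUERY PLAN " + query, (7, 3)).fetchall()[0][-1]
print(after)  # e.g. "SEARCH comment USING INDEX ix_comment_disc_user ..."
```

The same idea carries over to MySQL: `EXPLAIN SELECT ...` before and after `CREATE INDEX` shows whether the query switched from a full table scan to an index lookup.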
Just talked to franklinmint, and he's going on holidays for the next five days - back next Monday. So, unless anyone else wants to do some DTrace-ing... we'll have to wait and see on this one.
Interestingly enough, I just got an email from a guy using Vanilla who says that when there are 50 or more people using his installation of Vanilla, he sees his Apache usage go up 10-15%, and it's slowing everything else down.
I haven't been witness to anything like that at all - anyone else?