As you may know, last night our webhost temporarily shut us down due to extremely high CPU usage after the Summersend spam wars game opened up. In the few hours it was open, the game produced 1,400 posts, and at the time we were shut down the site was using 80% of the server's CPU and around 120MB of bandwidth per minute.
Since then, I have been monitoring our CPU usage, and even with the game closed our usage has been borderline, as you can see here:
Because of this, I've started looking at contributing factors, and I've found that the issue likely wasn't the spam game itself, but the heavy load required to display the topic pages, which only became apparent during the spam game because pages were being loaded so frequently.
Specifically, I've found three issues in particular:
1)
HTTP Images: When we switched to https, we started using SMF's image proxy to make sure all images are served over https. This means that for images in avatars, posts, and signatures, if an image uses an http URL, the site downloads it, caches it, and serves it to you over https. However, this creates overhead depending on the size of the image, and some images were using upwards of 4MB of bandwidth per page load each.
The solution is simply to switch those http URLs to https, and I've already done so for some avatars and sigs that I found to be particularly large. It's likely that in the future http image links will no longer be accepted for avatars or [img] tags, to minimize use of the proxy.
This issue will probably need to be revisited sometime in the future.
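If you want to fix your own signature or post images in the meantime, the change is just the URL scheme; for example (the address below is only a placeholder):

    [img]http://example.com/signature.png[/img]   <- fetched, cached, and re-served through the image proxy
    [img]https://example.com/signature.png[/img]  <- served directly, no proxy work needed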
2)
Notifications: Every time you load a page on the website, it pulls the full list of notifications you haven't dismissed along with the graphic for dismissing them, and for some people (including me) this amounts to hundreds of notifications. I have dismissed all my notifications personally, but in the future I will make it so only the latest 25 or so notifications will appear, with a link to see all of them on a separate page.
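To sketch what that change looks like (the table and column names below are made up for illustration and are not SMF's actual schema), the per-page query would simply cap the number of rows it returns instead of fetching everything:

    -- Hypothetical example only; not SMF's real tables or columns.
    -- Today the equivalent lookup has no LIMIT, so it returns every undismissed notification.
    SELECT id, body, created_at
    FROM notifications
    WHERE member_id = ? AND dismissed = 0
    ORDER BY created_at DESC
    LIMIT 25;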
3)
Bots: At any given time, you will see that we usually have many more guests than registered members on the forums, and most of them are usually bots such as search-engine crawlers. These bots use as many resources as an actual person viewing the website, and having many of them crawling around the forums can create quite a drain even when actual forum activity is low. I have instituted a robots.txt file limiting crawlers to one request every 20 seconds. If this doesn't help, I may have to look into banning certain bots by IP address.
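For the curious, that kind of rate limit is normally expressed with the Crawl-delay directive (I'm assuming that's what our file uses; not every crawler honors it, which is why banning by IP address remains the fallback):

    # Applies to every crawler that respects robots.txt
    User-agent: *
    # Ask crawlers to wait 20 seconds between requests
    Crawl-delay: 20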
It will be several hours before I can tell from our webhost's tools whether the preliminary measures I've taken are having an effect, but if they are, I will try to implement the other measures tonight and have Spam Wars back up tomorrow.