Unfortunately, our R1Soft backups on the Kobold server didn't trigger until 11 AM yesterday and roughly 12:40 PM today due to delays/slowness on the backup server itself. Normally this would be fine, as our servers have good hardware and plenty of I/O, but the R1Soft backup triggers several extremely intensive queries against MySQL as part of its MySQL backup process, and this is causing slowness and instability.
This was exacerbated by a few accounts using more resources than they should to begin with; we suspended and notified those users yesterday. Upon investigation today, we found numerous 'stuck' queries running in the MySQL server:
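For anyone curious, identifying and killing stuck threads like these is typically done along the following lines (a sketch only; the thread ID shown is hypothetical, and these commands assume MySQL admin privileges):

```shell
# List all running queries along with their thread IDs and run time
mysql -e "SHOW FULL PROCESSLIST;"

# Ask MySQL to terminate a specific stuck thread by its ID
# (12345 is an illustrative ID, not one from our server)
mysql -e "KILL 12345;"
```

Note that `KILL` only flags the thread for termination; a query that is mid-operation (e.g. rolling back a large transaction) can continue to show as "Killed" in the process list for a long time rather than exiting immediately.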
We did attempt to stop/kill those threads, as you can see indicated by "Killed" at the beginning of each query; however, they failed to close out. As a result, we were forced today, just as we were yesterday, to force-quit MySQL.
The result is that when MySQL starts back up, it will take a few minutes to go over the databases and make sure they're complete and repaired. After this is done, we will perform a manual check that will take a couple of hours but should have little to no impact.
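The manual check described above would typically look something like the following (a sketch; exact flags and credentials depend on the server's configuration, and repairs only apply to storage engines that support them):

```shell
# Check every table in every database and automatically repair
# any that are found to be corrupted
mysqlcheck --all-databases --check --auto-repair
```

This runs table-by-table, which is why it can take a couple of hours on a busy server, but it reads and repairs incrementally rather than locking everything at once, keeping the impact low.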
We apologize for any trouble this may have caused or may be causing you, and I can assure you we're doing everything we can both to resolve the current issue and to avoid it in the future. We did stop today's R1Soft backup and killed these MySQL threads; however, manual administrative intervention was still required.
We'll update this thread if we have anything new to report. We're currently working on restoring MySQL and expect it to be back online within roughly 5 minutes.