Wiki db size

Problem:

  1. The wiki db, specifically the wiki_text table, is quite big - backups are 75 MB uncompressed
  2. Less important but nice to tidy up: there are a load of tables we no longer need from the group that had the wiki before the hackspace (see the second query below)
MariaDB [wiki]> SELECT
    ->     table_name AS `Table`,
    ->     round(((data_length + index_length) / 1024 / 1024), 2) `Size in MB`
    -> FROM information_schema.TABLES
    -> WHERE table_schema = "wiki"
    ->     AND table_name = "wiki_text";
+-----------+------------+
| Table     | Size in MB |
+-----------+------------+
| wiki_text |      53.52 |
+-----------+------------+
1 row in set (0.001 sec)
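
To track down the leftover tables from point 2, the same information_schema approach works across the whole schema (a sketch - it assumes the database is still called "wiki"):

SELECT
    table_name AS `Table`,
    round(((data_length + index_length) / 1024 / 1024), 2) AS `Size in MB`
FROM information_schema.TABLES
WHERE table_schema = "wiki"
ORDER BY (data_length + index_length) DESC;

Anything without the wiki_ prefix is probably left over from before the hackspace and a candidate for DROP TABLE.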

I have deleted a LOT of old revisions. MediaWiki saves the whole page text for every revision, so you can imagine what happened with the blog being updated over the years.
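
A quick way to see what's actually left in the table - a sketch assuming the standard text table columns (old_id, old_text, old_flags) behind the wiki_ prefix:

SELECT COUNT(*) AS `Rows`,
    round(SUM(LENGTH(old_text)) / 1024 / 1024, 2) AS `Raw text in MB`
FROM wiki_text;

If the row count is much higher than the number of surviving revisions, the old text blobs are still sitting in the table even though the revisions themselves are gone.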

A MariaDB OPTIMIZE TABLE doesn't reduce the size of the table.
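
For reference, these are the statements I'd expect to reclaim the space (a sketch - on InnoDB, OPTIMIZE is silently mapped to a table rebuild, and the disk space only comes back to the OS if innodb_file_per_table was on when the table was created):

OPTIMIZE TABLE wiki_text;
-- or the equivalent explicit rebuild:
ALTER TABLE wiki_text ENGINE=InnoDB;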

Possible causes - I've a couple of guesses:

  1. Previous hacking - if you look at the blocked users, Ian accidentally opened the wiki to the world at one point and there was a large injection of spam
  2. Compression of previous edits means they're not being removed - https://www.mediawiki.org/wiki/Manual:CompressOld.php
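
The compression guess is easy to check - the text table's old_flags column records how each blob is stored, so (a sketch):

SELECT old_flags, COUNT(*) AS `Rows`
FROM wiki_text
GROUP BY old_flags;

Rows that have been through compressOld.php should show gzip in their flags; plain edits usually just show utf-8.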


Suggestion: I'm very tempted to just start with a new db and import the XML dump. It'll mean I need to recreate users, but there are only a handful regularly using it, and then we know we're starting fresh.
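
If we go that route, the database side is only a couple of statements (a sketch - "wiki2" and the 'wikiuser'@'localhost' account are placeholder names, not what we actually use); the XML dump itself then goes in via MediaWiki's importDump.php maintenance script:

CREATE DATABASE wiki2;
GRANT ALL PRIVILEGES ON wiki2.* TO 'wikiuser'@'localhost';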

But I'm also interested to know what you can find and whether either of my guesses is right - mainly how to prevent this in the future.