Wiki db size
Problem:
- The wiki db, specifically the wiki_text table, is quite big; backups are 75 MB uncompressed
- Less important, but nice to tidy up: there are a load of tables we no longer need, left over from the group before the hackspace (a query to spot them follows the session below)
<pre>
MariaDB [wiki]> SELECT
    ->     table_name AS `Table`,
    ->     round(((data_length + index_length) / 1024 / 1024), 2) `Size in MB`
    -> FROM information_schema.TABLES
    -> WHERE table_schema = "wiki"
    ->     AND table_name = "wiki_text";
+-----------+------------+
| Table     | Size in MB |
+-----------+------------+
| wiki_text |      53.52 |
+-----------+------------+
1 row in set (0.001 sec)
</pre>
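On the tidy-up point: a variant of the same query lists every table in the schema by size, which should make the leftover tables easy to spot. A minimal sketch - the guess that the pre-hackspace tables lack the wiki_ prefix is mine:

<pre>
# List every table in the wiki schema, largest first; anything without
# the wiki_ prefix is presumably left over from the old group (assumption)
mariadb wiki -e "
  SELECT table_name,
         ROUND((data_length + index_length) / 1024 / 1024, 2) AS size_mb
  FROM information_schema.TABLES
  WHERE table_schema = 'wiki'
  ORDER BY size_mb DESC;"
</pre>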
- However, an XML dump via https://www.mediawiki.org/wiki/Manual:DumpBackup.php comes out at only 1.9 MB (command sketched below)
- Note this should be just the current page text: no users, no history, etc.
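For reference, the dump was presumably produced along these lines (the install path is an assumption; --full instead of --current would include all history):

<pre>
cd /var/www/mediawiki    # install path is an assumption
# Dump only the current revision of each page, no users or history
php maintenance/dumpBackup.php --current --quiet > pages_current.xml
</pre>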
I have deleted a LOT of old revisions. MediaWiki saves the whole page text for every revision, so you can imagine what happened with the blog being updated over the years. The scripts used (invocation sketched after the list):
- https://www.mediawiki.org/wiki/Manual:DeleteOldRevisions.php
- https://www.mediawiki.org/wiki/Manual:DeleteArchivedRevisions.php
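Invoking those two scripts looks roughly like this; both only report what they would do unless --delete is passed:

<pre>
cd /var/www/mediawiki    # install path is an assumption
php maintenance/deleteOldRevisions.php --delete       # drop all non-current revisions
php maintenance/deleteArchivedRevisions.php --delete  # purge revisions of deleted pages
</pre>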
Running a MariaDB OPTIMIZE TABLE doesn't reduce the size of the table (see below).
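The optimize in question was presumably just this. One hedge: if the table lives in the shared ibdata1 tablespace (innodb_file_per_table=OFF), the rebuild reclaims space inside the file but never shrinks it on disk; however, data_length in the query above should still drop if the rows were really gone, so rows probably remain.

<pre>
# InnoDB implements OPTIMIZE as a full table rebuild plus ANALYZE
mariadb wiki -e "OPTIMIZE TABLE wiki_text;"
</pre>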
Possible causes - I've a couple of guesses:
- Previous hacking: if you look at the blocked users, Ian accidentally opened the wiki to the world and there was a large injection of spam
- Compression of previous edits means they're not being removed by the delete scripts - https://www.mediawiki.org/wiki/Manual:CompressOld.php (a quick check on both guesses follows below)
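A quick row-count comparison bears on both guesses (table names assume the wiki_ prefix shown above). As far as I know, deleteOldRevisions.php removes rows from the revision table but leaves the text blobs behind, so a text_rows count far above revision_rows would point at orphaned blobs rather than spam or compression:

<pre>
mariadb wiki -e "
  SELECT (SELECT COUNT(*) FROM wiki_text)     AS text_rows,
         (SELECT COUNT(*) FROM wiki_revision) AS revision_rows,
         (SELECT COUNT(*) FROM wiki_archive)  AS archive_rows;"

# If text_rows dominates, orphaned blobs can be cleared with
# (it only reports, not deletes, unless --purge is passed):
# php maintenance/purgeOldText.php --purge
</pre>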
Suggestion:
I'm very tempted to just start with a new db and import the XML. It'll mean I need to recreate users, but there are only a handful regularly using it, and then we know we're starting fresh.
But I'm also interested to know what you can find, and whether either of my guesses is right - mainly how to prevent this in the future.
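For completeness, a rough sketch of that rebuild route, assuming a fresh database named wiki_fresh (the name and paths are my assumptions; the MediaWiki installer would be re-run against the new db before importing):

<pre>
# 1. Dump current pages from the old wiki (as above)
php maintenance/dumpBackup.php --current --quiet > pages_current.xml

# 2. Create the empty database, re-run the installer against it,
#    then point $wgDBname in LocalSettings.php at it
mariadb -e "CREATE DATABASE wiki_fresh;"

# 3. Import the XML into the fresh wiki and rebuild derived data
php maintenance/importDump.php pages_current.xml
php maintenance/rebuildrecentchanges.php
php maintenance/initSiteStats.php --update
</pre>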