Wiki db size


Problem:

1) The wiki db, specifically the wiki_text table, is quite big; backups are 75 MB uncompressed.
2) Less important, but nice to tidy up: there are a load of tables we no longer need, left over from the group that preceded the hackspace.

MariaDB [wiki]> SELECT
    ->     table_name AS `Table`,
    ->     round(((data_length + index_length) / 1024 / 1024), 2) `Size in MB`
    -> FROM information_schema.TABLES
    -> WHERE table_schema = "wiki"
    ->     AND table_name = "wiki_text";
+-----------+------------+
| Table     | Size in MB |
+-----------+------------+
| wiki_text |      53.52 |
+-----------+------------+
1 row in set (0.001 sec)
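Dropping the table_name filter and sorting by size makes it easy to spot the leftover tables; a minimal variant of the query above:

 -- every table in the wiki schema, biggest first
 SELECT
     table_name AS `Table`,
     round(((data_length + index_length) / 1024 / 1024), 2) AS `Size in MB`
 FROM information_schema.TABLES
 WHERE table_schema = "wiki"
 ORDER BY (data_length + index_length) DESC;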

However, an XML dump taken with https://www.mediawiki.org/wiki/Manual:DumpBackup.php (page content only, no database overhead) comes out much smaller.
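For reference, a typical invocation of the dump script (file names illustrative; run from the MediaWiki root):

 # dump every page with full revision history to XML
 php maintenance/dumpBackup.php --full > wiki-dump.xml
 
 # or only the current revision of each page
 php maintenance/dumpBackup.php --current > wiki-current.xml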

Possible causes: I've a couple of guesses.


Options: I'm half tempted to just start with a new db and import the xml, as sketched below. It'll mean I need to recreate the users, but there are only a handful of them.
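A sketch of that route, assuming a fresh MediaWiki install pointed at a new database, the XML dump from above, and placeholder account names:

 # import the XML dump into the fresh wiki
 php maintenance/importDump.php < wiki-dump.xml
 
 # rebuild derived data after the import
 php maintenance/rebuildrecentchanges.php
 php maintenance/initSiteStats.php
 
 # recreate each of the handful of accounts (name/password are placeholders)
 php maintenance/createAndPromote.php --sysop ExampleUser ExamplePassword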

But I'm also interested to know whether either of my suspicions is right.