Hi,
Not really sure if anyone will have any ideas, but I thought it was worth asking =)
I'm trying to import the Wikipedia dumps, but I'm having problems with the "limits" in place.
For example, running this command:
java -server -jar mwdumper.jar --format=sql:1.5 20051127_pages_articles.xml.bz2 > dump.sql
...gives:
user@undevmac wiki $ java -server -jar mwdumper.jar --format=sql:1.5 20051127_pages_articles.xml.bz2 > dump.sql
1,000 pages (231.803/sec), 1,000 revs (231.803/sec)
2,000 pages (260.146/sec), 2,000 revs (260.146/sec)
Exception in thread "main" java.io.IOException: Parser has reached the entity expansion limit "64,000" set by the Application.
at org.mediawiki.importer.XmlDumpReader.readDump(Unknown Source)
at org.mediawiki.dumper.Dumper.main(Unknown Source)
user@undevmac wiki $
Has anyone had any experience with this, and any idea how to solve it or raise the limit?
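(One thing I haven't tried yet, just a guess from skimming the Sun JAXP notes, is raising the parser's limit via the entityExpansionLimit system property, with some suitably large value, e.g.:

java -server -DentityExpansionLimit=100000000 -jar mwdumper.jar --format=sql:1.5 20051127_pages_articles.xml.bz2 > dump.sql

No idea whether the parser mwdumper uses actually honours that property, though.)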
TIA!
Andy