I'm writing a parser function extension that outputs about 5000 lines of text (an organizational chart of a company) as a nested, bulleted list.
* Bob the CEO
** Jane Jones
** Mike Smith
*** etc.
It takes about 3 seconds (real time) for MediaWiki to render this list, which is acceptable. However, if I make it a list of links, which is more useful:
* [[User:Bob | Bob the CEO]]
** [[User:Jane | Jane Jones]]
** [[User:Mike | Mike Smith]]
the rendering time more than doubles to 6-8 seconds, which users perceive as too slow.
Is there a faster implementation for rendering a large number of links, rather than returning the wikitext list and having MediaWiki render it?
Thanks, DanB
Probably the fastest thing would be to manually create the <ul><li> etc and wrap them around a loop calling the linker functions (Linker::link).
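Roughly, that could look something like the following untested sketch. buildOrgChartHtml and the $people array (rows like array( 'user' => 'Bob', 'label' => 'Bob the CEO' )) are made-up names, and nesting the <ul> elements by depth is omitted for brevity:

function buildOrgChartHtml( array $people ) {
	$html = '<ul>';
	foreach ( $people as $person ) {
		$title = Title::makeTitleSafe( NS_USER, $person['user'] );
		if ( !$title ) {
			continue;
		}
		// Linker::link() does not escape the link text for us, so escape the label here.
		$html .= '<li>' . Linker::link( $title, htmlspecialchars( $person['label'] ) ) . '</li>';
	}
	return $html . '</ul>';
}

Note that Linker::link() still checks whether each target page exists (for red links), so the per-title lookups can remain a cost unless they are batched.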
-- brion
On Tue Jan 27 2015 at 1:37:36 PM Brion Vibber [email protected] wrote:
Probably the fastest thing would be to manually create the <ul><li> etc and wrap them around a loop calling the linker functions (Linker::link).
https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045
Another option could be using LinkBatch.
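For example (untested sketch, reusing the hypothetical $people array from the earlier example), LinkBatch can preload the page-existence data in one query, so the later Linker::link() calls read from LinkCache instead of hitting the database once per title:

// Preload existence info for all the user pages in a single query.
$batch = new LinkBatch();
foreach ( $people as $person ) {
	$title = Title::makeTitleSafe( NS_USER, $person['user'] );
	if ( $title ) {
		$batch->addObj( $title );
	}
}
$batch->execute();
// ...then build the <ul>/<li> markup with Linker::link() as sketched above.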
-Chad
You should be able to return something like this to make your parser function output raw HTML instead of WikiText.
return array( $output, 'noparse' => true, 'isHTML' => true );
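For instance, a minimal (untested) registration and callback might look like this; 'orgchart', renderOrgChart, buildOrgChartHtml and getOrgChartData are hypothetical names, and the 'orgchart' magic word would need to be defined in your extension's magic-words file:

$wgHooks['ParserFirstCallInit'][] = function ( Parser $parser ) {
	$parser->setFunctionHook( 'orgchart', 'renderOrgChart' );
	return true;
};

function renderOrgChart( Parser $parser ) {
	// Build the list as raw HTML and hand it back unparsed.
	$output = buildOrgChartHtml( getOrgChartData() );
	return array( $output, 'noparse' => true, 'isHTML' => true );
}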
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
As of https://gerrit.wikimedia.org/r/#/c/29879/2/utils/MessageTable.php,cm, Linker::link took 20 KiB of memory per call; cf. http://laxstrom.name/blag/2013/02/01/how-i-debug-performance-issues-in-media... I don't know whether such bugs/unfeatures and the related best practices have been written down anywhere.
Nemo