Welcome to ikiwiki's todo list. Link items to done when done.
Also see the Debian bugs.
Suggestions of ideas for plugins:
list of registered users - tricky because it sorta calls for a way to rebuild the page when a new user is registered. Might be better as a cgi?
At best, this could only show the users who have logged in, not all permitted by the current auth plugin(s). HTTP auth would need web-server-specific code to list all users, and openid can't feasibly do so at all. --JoshTriplett
sigs ?
Support RecentChanges as a regular page containing a plugin that updates each time there is a change, and statically builds the recent changes list. (Would this be too expensive/inflexible? There might be other ways to do it as a plugin, like making all links to RecentChanges link to the cgi and have the cgi render it on demand.)
Or using an iframe to inline the cgi, although firefox seems to render that nastily with nested scroll bars.
Or just link to the equivalent in the version control system, if available; gitweb's shortlog or summary view would work nicely as a RecentChanges. --JoshTriplett
Why not fork the process? We wouldn't have to wait around for a response since we would assume the recent changes page was being generated correctly.
It would be nice to be able to have a button to show "Differences" (or "Show Diff") when editing a page. Is that an option that can be enabled? Using a plugin?
For PlaceWiki I want to be able to do some custom plugins, including one that links together subpages about the same place created by different users. This seems to call for a plugin that applies to every page w/o any specific marker being used, and pre-or-post-processes the full page content. It also needs to update pages when related pages are added, so it needs to register dependencies pre-emptively between pages, or something. It's possible that this is a special case of backlinks and is best implemented by making backlinks a plugin somehow. --Joey
random page (cgi plugin; how to link to it easily?)
How about an event calendar? Events could be sub-pages with embedded code detailing recurrence and/or event date/time.
rcs plugin (JeremyReed has one he has been using for over a month, with over 850 web commits from 13 users, each with over ten commits.)
asciidoc or txt2tags format plugins
Should be quite easy to write; the otl plugin is a good example of a similar formatter.
Isn't there a conflict between ikiwiki using [[ ]] and asciidoc using the same? There is a start of an asciidoc plugin at http://www.mail-archive.com/asciidoc-discuss@metaperl.com/msg00120.html -- KarlMW
- manpage plugin: convert "ls(1)" style content into Markdown like [ls(1)](http://example.org/man.cgi?name=ls&sect=1) or into HTML directly.
Posted Fri Jun 15 21:57:57 2007
With a full installation of groff available, man offers HTML output. Might take some fiddling to make it fit into the ikiwiki templates, and you might or might not want to convert pages in the SEE ALSO as well. --JoshTriplett
Ikiwiki's preprocessor parser cannot deal with arbitrarily nested preprocessor directives. It's possible to nest a directive with single-quoted values inside a triple-quoted value of a directive, but that's all.
It's not possible to unambiguously parse nested quotes, so to support nesting, a new syntax would be needed. Maybe something xml-like?
Posted Sun Jun 3 02:59:55 2007
You can, however, unambiguously parse nested square brackets, and I think that would solve the problem, as long as you never allow the contents of a directive to contain a partial directive, which seems reasonable to me.
For example, I think you can unambiguously parse the following:
[[if test="enabled(template) and templates/foo" then=""" [[template id=foo content="""Flying Purple People Eater"""]] """]]
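As a proof of concept only (this is not ikiwiki's parser), Perl's recursive regexes can match balanced double brackets, which is what makes the square-bracket approach unambiguous:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# (?R) recurses into the whole pattern, so nested [[...]] directives
# match as a single balanced unit.
my $directive = qr/\[\[ (?: [^\[\]]++ | (?R) )* \]\]/x;

my $text = '[[if test="enabled(template)" then="[[template id=foo]]"]]';
print "matched\n" if $text =~ /^$directive$/;
```

This requires Perl 5.10 or later for the recursive-pattern and possessive-quantifier syntax.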
It'd be nice to be able to specify an alternate template file to be used for some pages. For example, I'd like most of my pages to use page.tmpl, but I want my front page to be formatted in some unique way, so I'd like it to use a separate front.tmpl template instead.
I'm not sure what syntax to use for this (template seems to be taken for another purpose already). Perhaps something like [[page-template front]]?
Joey provided a nice suggestion for implementing this feature ("I would probably add a hook that allowed overriding the default template construction and returning a template object"). I did start looking into that, but finally I wimped out and just put the following hack into the genpage() function in Render.pm:

    if ($page eq 'index') {
        $template->param(suppresstitle => 1);
    }
That lets me use a <TMPL_UNLESS SUPPRESSTITLE> in my template to get the effect I want. I don't think that's anything that upstream should pick up as-is (maybe with an appropriate configuration option, but then again allowing for per-page template selection would be more powerful anyway). But I'm happy enough now that I probably won't pursue implementing this feature further myself.
But I'd still happily switch to using this feature if someone were to implement it.
Posted Wed May 30 19:01:08 2007
passwordauth could support an "account creation password", as a simplistic anti-spam measure. (Some wikis edited by a particular group use an account creation password as an "ask an existing member to get an account" system.) --JoshTriplett
Posted Thu May 24 18:12:25 2007
How about a plugin adding a preprocessor directive to render some given LaTeX and include it in the page? This could either render the LaTeX as a PNG via dvipng and include the resulting image in the page, or perhaps render via HeVeA, TeX2page, or similar. Useful for mathematics, as well as for stuff like the LaTeX version of the ikiwiki logo.
ikiwiki could also support LaTeX as a document type, again rendering to HTML.
Conversely, how about adding a plugin to support exporting to LaTeX?
I did some tests with using Markdown and a customized HTML::Latex and html2latex and it appears it will work for me now. (I hope to use ikiwiki for many to collaborate on a printed book that will be generated at least once per day in PDF format.)
--JeremyReed
Have a look at pandoc. It can make PDFs via pdflatex. --roktas
here is a first stab at a latex plugin. Examples here. Currently without image support for hevea. And the latex2html output has the wrong charset and no command line switch to change that. Dreamland.
Posted Mon May 21 19:29:03 2007
The pages in the basewiki should be fully self-documenting as far as what users need to know to edit pages in the wiki. HelpOnFormatting documents the basics, but doesn't include every preprocessor directive.
Note that there's a distinction between being self-documenting for users and being complete documentation for ikiwiki. The basewiki is not intended to be the latter, so it lacks the usage page, all the plugin pages, etc.
I've made some progress toward making the basewiki self-documenting by moving the docs about using templates, shortcuts, and blogs from the plugin pages, onto the pages in the basewiki.
Here are some of the things that are not documented in full in the basewiki:
joey@kodama:~/src/ikiwiki/doc/plugins>grep usage *
aggregate.mdwn:## usage
graphviz.mdwn:page. Example usage:
graphviz.mdwn:amounts of processing time and disk usage.
img.mdwn:## usage
linkmap.mdwn:set of pages in the wiki. Example usage:
map.mdwn:This plugin generates a hierarchical page map for the wiki. Example usage:
postsparkline.mdwn:# usage
sparkline.mdwn:# usage
sparkline.mdwn:more detail in [its wiki](http://sparkline.wikispaces.com/usage).
table.mdwn:## usage
Meta is another one.
The holdup on documenting these in full in the basewiki is that I'm not sure where to put the docs. HelpOnFormatting should stay as simple as possible and just give examples, not full lists of available parameters, etc. And it's bad enough that blog uses that toplevel namespace, without adding lots more toplevel pages to the basewiki. (blog really needs to be moved.. I have several wikis that override it with their actual blog content).
Maybe the thing to do would be to make a meta/ or usage/ or wiki/ or something directory in the basewiki, and put new pages documenting how to use preprocessor directives in there.
Actually, if we look at the basewiki contents:
blog.mdwn@ pagespec.mdwn@ subpage@
favicon.ico@ preprocessordirective.mdwn@ subpage.mdwn@
helponformatting.mdwn@ sandbox.mdwn@ templates/
index.mdwn@ shortcuts.mdwn@ templates.mdwn@
local.css@ smileys@ wikiicons@
markdown.mdwn@ smileys.mdwn@ wikilink.mdwn@
openid.mdwn@ style.css@
Most of this is meta stuff. Only index.mdwn, local.css, favicon.ico, smileys, wikiicons, shortcuts, and templates are really content/configs that are used as the base of a wiki. The rest is documentation.
Moving a lot of these pages could be hard though.. Lots of wikis probably link to them. Maybe the directory they're moved to could be in the search path, like the userdir is, so that simple links keep working.
See also: Conditional Underlay Files
Posted Sun May 20 00:36:24 2007
The edit form could include a checkbox "subscribe to this page", allowing a user to add a page to their subscribed list while editing. This would prove particularly useful for todo and bug items, to allow users to receive notifications for activity on their reports.
Posted Fri May 18 04:54:52 2007
Git does not support checking out a subdirectory of a repository as a repository. In order to allow a software project managed with Git to keep its wiki, bug-tracker, TODO-list, and stuff in a subdirectory of the same repository (rather than a parallel foo-wiki.git repository, which does not stay in sync with the code), ikiwiki should support checking out a repository but only using a subdirectory of that repository. --JoshTriplett
Posted Fri May 18 04:54:52 2007
This seems to be a mandatory feature. I'll start working to implement it as soon as possible --Roktas
Thanks! --JoshTriplett
ikiwiki should supply an example .htaccess file for use with HTTP authentication (perhaps as a tip), showing how to authenticate the user for edits without requiring authentication for the entire wiki. (Ideally, recentchanges should work without authentication as well, even though it goes through the CGI.) --JoshTriplett
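A minimal sketch of such a tip, assuming Apache with Basic auth and example paths only. This protects just the CGI, so static pages stay public; note that, as mentioned above, recentchanges would then also require authentication, since it goes through the CGI:

```
# In the .htaccess of the directory containing ikiwiki.cgi.
# AuthUserFile path and AuthName are placeholders.
<Files "ikiwiki.cgi">
    AuthType Basic
    AuthName "wiki edits"
    AuthUserFile /srv/wiki/.htpasswd
    Require valid-user
</Files>
```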
Posted Thu May 17 07:49:48 2007
The account creation process in the default passwordauth plugin could support account moderation by an administrator. New account signups would go into a queue for approval by an administrator.
(Random, potentially infeasible idea: save their edits and apply them if the account gets approved.)
Posted Wed May 9 20:01:25 2007
ikiwiki currently stores some key data in .ikiwiki/index. Some plugins need a way to store additional data, and ideally it would be something managed by ikiwiki instead of ad-hoc because:
- consistency is good
- ikiwiki knows when a page is removed and can stop storing data for that page; plugins have to go to some lengths to track that and remove their data
- it's generally too much code and work to maintain a separate data store
The aggregate plugin is a use-case: of 324 lines, 70 are data storage and another 10 handle deletion. Also, it's able to use a format very like ikiwiki's, but it does need to store some lists in there, which complicates it some and means that a very naive translation between a big per-page hash and the .index won't be good enough.
The current ikiwiki index format is not very flexible, although it is at least fairly easy and inexpensive to parse as well as hand-edit.
Would this do?
- Plugins can register savestate and loadstate hooks. The hook id is the key used in the index file that the hook handles.
- loadstate hooks are called and passed the page name and a list of all stored values for that page for the registered key, and should store the data somewhere
- savestate hooks are called and passed a page, and should return a list of all values for that key for that page
- If they need anything more complex than a list of values, they will need to encode it somehow in the list.
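A minimal sketch of what a plugin using these proposed hooks might look like. The savestate/loadstate hook types don't exist yet; this is illustrative only:

```perl
#!/usr/bin/perl
package IkiWiki::Plugin::example;

use warnings;
use strict;
use IkiWiki '1.02';

my %state; # page => [ values for our key ]

sub import {
    # Hook id "example" doubles as the key used in .ikiwiki/index.
    hook(type => "loadstate", id => "example", call => \&loadstate);
    hook(type => "savestate", id => "example", call => \&savestate);
}

# Called per page at load time with the values stored for our key.
sub loadstate {
    my ($page, @values) = @_;
    $state{$page} = [ @values ];
}

# Called per page at save time; returns the values to store.
sub savestate {
    my ($page) = @_;
    return exists $state{$page} ? @{$state{$page}} : ();
}

1;
```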
Hmm, that's potentially a lot of function calls per page load/save though.. For fewer function calls, only call each hook once per load/save, and have it passed/return a big hash of pages and the values for each page. (Which probably means %state=@_ for load and return %state for save.)

It may also be better to just punt on lists, and require plugins that need even lists to encode them. Especially since in many cases, join(" ", @list) will do. Er hmm, if I do that though, I'm actually back to a big global %page_data that plugins can just toss data into, aren't I? So maybe that's the right approach after all, hmm.. Except that needing to decode/encode list data all the time when using it would quite suck, so no, let's not do that.
Note that for the aggregate plugin to use this, it will need some changes:
- guid data will need to be stored as part of the data for the page that was aggregated from that guid. Except, expired pages don't exist, but still have guid data to store. Hmm. I suppose the guid data could be considered to be associated with the page that contains the aggregate directive then.
- All feeds will need to be marked as removable in loadstate, and only unmarked if seen in preprocess. Then savestate will need to not only remove any feeds still marked as such, but do the unlinking of pages aggregated from them too.
If I do this, I might as well also:
- Change the link= link= stuff to just links=link+link etc.
- Change the delimiter from space to comma; commas are rare in index files, so there would be fewer ugly escaped delimiters to deal with.
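For illustration (the field names here are made up), one page's index entry might change roughly like this:

```
# current: repeated, space-delimited key=value pairs
src=foo.mdwn link=index link=sandbox

# proposed: one key, comma-delimited values
src=foo.mdwn links=index,sandbox
```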
I would love to see more traditional support for comments in ikiwiki. One way would be to structure data on the discussion page in such a way that a "comment" plugin could parse it and yet the discussion page would still be a valid and usable wiki page.
For example if the discussion page looked like this:
# Subject of First Comment
Posted by [Adam Shand](http://adam.shand.net/) at 10:34PM on 14/04/2007
Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Morbi consectetuer nunc quis
magna. Etiam non est eget sapien vulputate varius. Vivamus magna. Sed justo. Donec
pellentesque ultrices urna.
# Subject of the Second Comment
Posted by [Foo Bar](http://foobar.net/) at 11:41PM on 14/04/2007
Quisque lacinia, lorem eget ornare facilisis, enim eros iaculis felis, id volutpat nibh
mauris ut felis. Vestibulum risus nibh, adipiscing volutpat, volutpat et, lacinia ut,
pede. Maecenas dolor. Vivamus feugiat volutpat ligula.
Each header marks the start of a new comment, and the line immediately following is the comment's metadata (author, email/URL, datestamp). Hopefully you could structure it in such a way that the scope
This would allow:
- A comment plugin to render the comments in "traditional blog" format.
- Possibly even support nesting comments by the header level?
- A comment plugin to create a form at the bottom of the page for people to add comments in the appropriate format to the discussion page
- Still remain usable and readable by people who work via svn.
- When there is ACL support you could mark the discussion page as read only so it could only be updated by the comment plugin (if that's what you wanted)
Is this simple enough to be sensible?
-- AdamShand
Well, if it's going to look like a blog, why not store the data the same way ikiwiki stores blogs, with a separate page per comment? As already suggested in "discussion page as blog", though, there are some things to be worked out, also discussed there. --Joey
I certainly won't be fussy about how it gets implemented, I was just trying to think of the lightest weight most "wiki" solution.
-- Adam.
As a side note, the feature described above (having a form not to add a page but to expand it in a formatted way) would be useful for other things when the content is short (timetracking, sub-todo list items, etc.) --hb
I've been looking into this. I'd like to implement a "blogcomments" plugin. Looking at the code, I think the way to go is to have a formbuilder_setup hook that uses a different template instead of the standard editpage one. That template would not display the editcontent field. The problem that I'm running into is that I need to append the new content to the old one.
Anything I can do to help? --Joey
Figured it out. Can you comment on the code below? Thanks. -- MarceloMagallon
So, I have some code, included below. For some reason that I don't quite get it's not updating the wiki page after a submit. Maybe it's something silly on my side...
What I ended up doing is write something like this to the page:
\[[blogcomment from="""Username""" timestamp="""12345""" subject="""Some text""" text="""the text of the comment"""]]
Each comment is processed to something like this:
<div>
<dl>
<dt>From</dt><dd>Username</dd>
<dt>Date</dt><dd>Date (needs fixing)</dd>
<dt>Subject</dt><dd>Subject text</dd>
</dl>
<p>Text of the comment...</p>
</div>
This way the comments can be styled using CSS.
Code
    #!/usr/bin/perl
    package IkiWiki::Plugin::comments;

    use warnings;
    use strict;
    use IkiWiki '1.02';

    sub import { #{{{
        hook(type => "formbuilder_setup", id => "comments",
            call => \&formbuilder_setup);
        hook(type => "preprocess", id => "blogcomment",
            call => \&preprocess);
    } # }}}

    sub formbuilder_setup (@) { #{{{
        my %params=@_;
        my $cgi = $params{cgi};
        my $form = $params{form};
        my $session = $params{session};

        my ($page)=$form->field('page');
        $page=IkiWiki::titlepage(IkiWiki::possibly_foolish_untaint($page));

        # XXX: This needs something to make it blog specific
        unless ($page =~ m{/discussion$} &&
                $cgi->param('do') eq 'edit' &&
                ! exists $form->{title})
        {
            return;
        }

        if (! $form->submitted)
        {
            $form->template(IkiWiki::template_file("makeblogcomment.tmpl"));
            $form->field(name => "blogcomment", type => "textarea", rows => 20,
                cols => 80);
            return;
        }

        my $content="";
        if (exists $pagesources{$page}) {
            $content=readfile(srcfile($pagesources{$page}));
            $content.="\n\n";
        }
        my $name=defined $session->param('name') ?
            $session->param('name') : gettext('Anonymous');
        my $timestamp=time;
        my $subject=defined $cgi->param('comments') ?
            $cgi->param('comments') : '';
        my $comment=$cgi->param('blogcomment');

        $content.=qq{\[[blogcomment from="""$name""" timestamp="""$timestamp""" subject="""$subject""" text="""$comment"""]]\n\n};
        $content=~s/\n/\r\n/g;
        $form->field(name => "editcontent", value => $content, force => 1);
    } # }}}

    sub preprocess (@) { #{{{
        my %params=@_;
        my ($text, $date, $from, $subject, $r);

        $text=IkiWiki::preprocess($params{page}, $params{destpage},
            IkiWiki::filter($params{page}, $params{text}));
        $from=exists $params{from} ? $params{from} : gettext("Anonymous");
        $date=localtime($params{timestamp}) if exists $params{timestamp};
        $subject=$params{subject} if exists $params{subject};

        $r = qq{<div class="blogcomment"><dl>\n};
        $r .= '<dt>' . gettext("From") . "</dt><dd>$from</dd>\n" if defined $from;
        $r .= '<dt>' . gettext("Date") . "</dt><dd>$date</dd>\n" if defined $date;
        $r .= '<dt>' . gettext("Subject") . "</dt><dd>$subject</dd>\n"
            if defined $subject;
        $r .= "</dl>\n" . $text . "</div>\n";

        return $r;
    } # }}}

    1;
Posted Tue May 8 21:03:22 2007
In SVN commits 3478, 3480, 3482, and 3485, I added a fieldset around the passwordauth fields, and some additional documentation. However, this needs some additional work to work correctly with the registration part of the form, as well as the buttons. It may also need some CSS love, and some means to style multiple formbuilder fieldsets differently. I reverted these four commits to avoid regressions before the 2.0 release; after the release, we should look at it again. --JoshTriplett
FormBuilder forms can be made much more amenable to styling by passing these parameters:
name => "signin",
template => {type => 'div'},
This results in a form that uses div instead of a table for layout, and adds
separate id attributes to every form element, including the fieldsets, so that
different forms can be styled separately. The only downside is that it doesn't
allow creating a custom template for the form, but a) nobody has done that and
b) stylesheets are much easier probably. So I think this is the way to go, we
just have to get stylin'.
.fb_submit {
float: left;
margin: 2px 0;
}
#signin_openid_url_label {
float: left;
margin-right: 1ex;
}
#signin_openid {
padding: 10px 10px;
border: 1px solid #aaa;
background: #eee;
color: black !important;
}
That looks pretty good.. putting the passwordauth part in a box of its own with the submit buttons I don't know how to do.
--Joey
Posted Mon Apr 30 21:34:15 2007
Need to re-evaluate the contents of goodstuff based on external dependencies.
- Possibly drop img due to PerlMagick dependency, and otl due to vimoutliner dependency.
- Add favicon and more due to lack of external dependencies.
Alternatively, if including items that have minor external dependencies:
- Possibly add table.
I'd like to see some way to include certain files from the underlay only when the wiki has certain plugins enabled. For example:
- Only include smileys.mdwn and the smileys subdirectory if you enable the smiley plugin.
- Exclude openid.mdwn if you disable the openid plugin.
- Include shortcuts.mdwn only if you enable the shortcut plugin.
- Include blog.mdwn only if you don't disable the inline plugin.
- Include favicon.ico only if you enable the favicon plugin.
- Include wikiicons/diff.png (and the wikiicons directory) only if you enable the CGI.
- Include a hypothetical restructuredtexthelp.rst or similar for other formats only with those formats enabled.
I can see two good ways to implement this. Ideally, with conditional text based on ikiwiki features available, ikiwiki could parse a page like conditionalpages.mdwn, which could contain a set of conditional-wrapped page names; that seems like the most elegant and ikiwiki-like approach. Alternatively, ikiwiki.setup could contain a Perl-generated exclude option by default; that would work, but it seems hackish.
Posted Mon Apr 30 04:41:31 2007
Another way might be to have a third directory of source files where plugins could drop in pages, and only build the files from there if their plugins were enabled.
Using the conditionals in a page to control what other pages get built feels complex to me. --Joey
(I've written a proposal for this feature --Ben).
Support for uploading files is useful for many circumstances:
- Uploading images.
- Uploading local.css files (admin only).
- Uploading mp3s for podcasts.
- Etc.
ikiwiki should have an easy to use interface for this, but the real meat of the work is in securing it. Several classes of controls seem appropriate:
- Limits to size of files that can be uploaded. Prevent someone spamming the wiki with CD isos..
- Limits to the type of files that can be uploaded. To prevent uploads of virii, css, raw html etc, and avoid file types that are not safe. Should default to excluding all file types, or at least all except a very limited set, and should be able to open it up to more types.
Would checking for file extensions (.gif, .jpg, etc.) be enough? Some browsers are probably too smart for their own good and may ignore the extension / mime info and process as the actual detected file type. It may be necessary to use the file command to determine a file's true type.
- Optional ability to test a file using a virus scanner like clamav.
- Limits to who can upload what type of files.
- Limits to what files can be uploaded where.
It seems that for maximum flexibility, rules should be configurable by the admin to combine these limits in different ways. If we again extend the pagespec for this, as was done for conditional text based on ikiwiki features, the rules might look something like this:
( maxsize(30kb) and type(webimage) ) or
( user(joey) and maxsize(1mb) and (type(webimage) or *.mp3) ) or
( user(joey) and maxsize(200mb) and (*.mov or *.avi) and videos/*)
With a small extension, this could even be used to limit the max sizes of normal wiki pages, which could be useful if someone was abusing an open wiki as a wikifs. Maybe.
( type(page) and maxsize(32k) )
And if that's done, it can also be used to lock users from editing a page or the whole wiki:
!(( user(spammer) and * ) or
( user(42.12.*) and * ) or
( user(http://evilopenidserver/*) and * ) or
( user(annoying) and index) or
( immutable_page ))
That would obsolete the current simple admin prefs for banned users and locked pages. Suddenly all the access controls live in one place. Wonderbar!
(Note that pagespec_match will now return an object that stringifies to a message indicating why the pagespec matched, or failed to match, so if a pagespec lock like the above prevents an edit or upload from happening, ikiwiki could display a reasonable message to the user, indicating what they've done wrong.)
Posted Fri Apr 27 08:44:24 2007
The aggregate plugin's handling of HTTP 301 (moved permanently) could be improved. Per RFC 1945:
The requested resource has been assigned a new permanent URL and any future references to this resource should be done using that URL.
So ideally aggregate would notice the 301 and use the new url henceforth.
It's a little tricky because the aggregate plugin can't just edit the page and change the url in the preprocessor directive. (Because committing such an edit would be .. hard.) Also, aggregate directives may also include a separate url for the site, which may also have changed. Perhaps the thing to do is record the new url in the aggregate plugin's state file, and change the message to "Processed ok (new url http://..)", and let the user deal with updating the page later.
Posted Fri Apr 27 00:14:54 2007
We might want some kind of abbreviation and acronym plugin. --JoshTriplett
Posted Thu Apr 26 19:14:56 2007
We should support SVG. In particular:
We could support rendering SVGs to PNGs when compiling the wiki. Not all browsers support SVG yet.
We could support editing SVGs via the web interface. SVG can contain unsafe content such as scripting, so we would need to whitelist safe markup.
We could support web-based image editing, using something like Snipshot. Several comparisons of web-based image editors exist; we would need to choose which one(s) to support. --JoshTriplett
Posted Thu Apr 26 19:14:56 2007
Ikiwiki could optionally use rel=nofollow on all external links, or on all those from a configurable subset of pages (such as */discussion if using opendiscussion). --JoshTriplett
Posted Tue Apr 24 20:58:07 2007
ikiwiki could provide one or more plugins that provide "add to" links for popular feed readers, such as Google Reader, Bloglines, My Yahoo!, or Netvibes.
Potentially less useful given an implementation of integration with Firefox and Iceweasel feed subscription mechanism. --JoshTriplett
Posted Sun Apr 22 17:43:47 2007
ikiwiki could support grabbing the /topic from an IRC channel and putting the result in a page. See http://wiki.debian.org/TopicDebianDevel for an example. Like aggregate, the page and its updates should not go in version control. --JoshTriplett
A simple script should be able to poll for the irc topic and update a page, then run ikiwiki -refresh to update the wiki. No need to put that in ikiwiki or a plugin, though. --Joey
Posted Sun Apr 22 02:48:33 2007
Firefox and Iceweasel, when encountering a news feed, display a page that allows the user to subscribe to the feed, using Live Bookmarks, Google Reader, Bloglines, My Yahoo!, or an external reader program. The list of available applications comes from URIs and titles in the preferences, under browser.contentHandlers.types.*. For the benefit of people who use aggregate as their feed reader, the ikiwiki CGI could expose a URI to directly add a new feed to the aggregated list; this would allow users to configure their browser to subscribe to feeds via aggregate running on their site. We could then provide the manual configuration settings as a tip, and perhaps provide an extension or other mechanism to set them automatically. --JoshTriplett
Perhaps ikiwiki should support XML-RPC-based blogging, using the standard MetaWeblog protocol. This would allow the use of applets like gnome-blog to post to an ikiwiki blog. The protocol supports multiple blog names, so one standard URL with page names as blog names would work. --JoshTriplett
Posted Sat Apr 14 20:11:05 2007
This would be a great thing to add a plugin for. (Probably using the cgi hook to let ikiwiki act as an RPC server.) --Joey
I'd love to see support for this and would be happy to contribute towards a bounty (say US$100) :-). PmWiki has a plugin which implements this in a way which seems fairly sensible as an end user. --AdamShand
ikiwiki should have a consistent set of smileys. We could fix the current smileys, or we could grab a new set of consistent smileys, such as the Tango emotes from gnome-icon-theme (GPL).
Posted Sat Apr 14 20:11:05 2007
Pages could have a linktitle (perhaps via meta), and wikilinks could use that title by default when linking to the page. That would allow pages to have a simple, easily linkable name (without spaces, for instance), but use the proper title for links. For example, PreprocessorDirective could use the linktitle "preprocessor directive", and pages for users could have linktitles that put spaces in their names.

Ideally, perhaps two versions of the title could exist, one for general use, and an optional one for if the case in the actual link starts with an uppercase letter. That would allow preprocessordirective to use the link text "preprocessor directive", but PreprocessorDirective to use the link text "Preprocessor Directive", for use at the beginnings of sentences. If the second version did not exist, the first version would apply to both cases. However, that also seems like potential overkill, and less important than the basic functionality of linktitle.

--JoshTriplett
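If done via meta, the proposed (not yet implemented) directive might look like this on the PreprocessorDirective page:

```
[[meta linktitle="preprocessor directive"]]
```

A [[PreprocessorDirective]] wikilink elsewhere would then render with the link text "preprocessor directive".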
When run with the Git backend, ikiwiki should use GIT_AUTHOR_NAME and GIT_AUTHOR_EMAIL rather than munging the commit message. Depending on the semantics you want to imply (does a web edit constitute a commit by the user or by the script?), it could also set GIT_COMMITTER_NAME and GIT_COMMITTER_EMAIL to the same values. --JoshTriplett
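A sketch of the intended behavior in a throwaway repository (the user name, email, and commit message are examples only): the author fields carry the web user's identity, while the committer remains the wiki script.

```shell
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
echo 'web edit' > page.mdwn
git add page.mdwn
# Author = the web user; committer = the ikiwiki script.
GIT_AUTHOR_NAME='WebUser' GIT_AUTHOR_EMAIL='webuser@example.org' \
    git -c user.name='ikiwiki' -c user.email='ikiwiki@localhost' \
    commit -q -m 'web commit by WebUser'
git log -1 --format='%an <%ae> / %cn'
# prints: WebUser <webuser@example.org> / ikiwiki
```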
ikiwiki could support rendering and editing po files via the web. Run against a software repository, ikiwiki would make for an interesting translation-management tool. --JoshTriplett
Posted Tue Apr 10 01:27:49 2007
ikiwiki could support a pastebin (requested by formorer on #ikiwiki for http://paste.debian.net/).
Desired features:
- expiration
- syntax highlighting with line numbering
- Password protection?
-- JoshTriplett
Posted Mon Apr 9 21:12:20 2007
How about a plugin providing a preprocessor directive to render a graphviz file as an image via one of the graphviz programs ("dot" by default) and include the resulting image on the page, using the "cmapx" image map format? graphviz files themselves could also render the same way into an HTML file with the same basename as the graphviz file; format and program could come either from an ikiwiki configuration option or comments/directives in the file. (For example, "digraph" could imply "dot", and "graph" could imply "neato".)
To complement this, ikiwiki could support creating and editing graphviz files through the CGI interface, as a new page type; preview could render the file. It would also help to have some sort of graphviz extension attribute for linking to a wiki page, which would become a standard href or URL attribute in the input passed to the particular graphviz program.
Posted Mon Apr 9 17:37:14 2007
Editing graphviz files safely online might be tricky. Graphviz would need to be audited. --Joey
I've added a graphviz plugin which adds a preprocessor directive to render inline graphviz graphs, addressing part of this todo item. It doesn't yet support graphviz files as a separate page type, image maps, or wikilinks.--JoshTriplett
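With the plugin enabled, an inline graph is written as a preprocessor directive along these lines (parameter names here are an assumption; see the plugin page for the exact syntax):

```
[[graph src="""
a -> b -> c;
b -> d;
"""]]
```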
Random idea: create an ikiwiki IRC bot, which notices the use of ikiwiki syntax in the channel and translates. This would work nicely for "frequently-given answer" bots like dpkg on #debian, since you could give answers by linking to wiki pages. ikibot could also excerpt page content.
<newikiuser> How do I set up ikiwiki with Git?
<ikihacker> setup
<ikibot> http://ikiwiki.info/setup.html: "This tutorial will walk you through setting up a wiki with ikiwiki. ..."
Posted Mon Apr 9 17:37:07 2007
I'd like the ability to block external links from anonymous users, or from untrusted users. This could work by generating the HTML for the new page and comparing it to the HTML for the old page, looking for any new <a> tags with href values that didn't exist in the old page and don't start with the URL of the wiki. Comparing the HTML, rather than the input, allows usage with various types of input formats, and ensures that a template, shortcut, or some new plugin will not bypass the filter.
This would probably benefit from a whitelist of acceptable external URLs.
This may actually form a subset of the general concept of content policies, described at fileupload.
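The HTML-comparison approach described above could be prototyped like this. It's a Python sketch with invented function names, including the whitelist idea; the real implementation would hook into ikiwiki's (Perl) rendering pipeline.

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collect href values from all <a> tags in a rendered page."""
    def __init__(self):
        super().__init__()
        self.hrefs = set()
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.add(value)

def hrefs(html):
    collector = HrefCollector()
    collector.feed(html)
    return collector.hrefs

def new_external_links(old_html, new_html, wiki_url, whitelist=()):
    """Links present in the new rendering but not the old, which are
    neither internal to the wiki nor on the whitelist."""
    added = hrefs(new_html) - hrefs(old_html)
    allowed_prefixes = (wiki_url,) + tuple(whitelist)
    return {h for h in added if not h.startswith(allowed_prefixes)}
```

An edit would be rejected (or held for review) when this set is non-empty for an untrusted user.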
Posted Fri Apr 6 20:46:12 2007
search could provide OpenSearch metadata. Various software supports OpenSearch (see the Wikipedia article on OpenSearch); in particular, browsers like Firefox and Iceweasel will automatically discover an OpenSearch search and offer it in the search box.
More specifically, we want to follow the OpenSearch Description Document standard, by having a link with rel="search" and type="application/opensearchdescription+xml" in the headers of HTML, RSS, and Atom pages. The href of that link should point to an OpenSearchDescription XML file with contents generated based on the information in ikiwiki.setup, and the title attribute of the link should contain the wiki title from ikiwiki.setup.
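Generating the description document and the discovery link could look roughly like this. This is a Python sketch with hypothetical function names; the URL template assumes a search CGI that takes the query in a "phrase" parameter, as ikiwiki's existing search CGI does.

```python
from xml.sax.saxutils import escape

def opensearch_description(wiki_title, cgiurl):
    """Build an OpenSearchDescription document pointing at the search CGI."""
    return """<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>{title}</ShortName>
  <Description>Search {title}</Description>
  <Url type="text/html" template="{cgi}?phrase={{searchTerms}}"/>
</OpenSearchDescription>
""".format(title=escape(wiki_title), cgi=escape(cgiurl))

def discovery_link(wiki_title, osdd_url):
    # The <link> to place in the <head> of each HTML, RSS, and Atom page.
    return ('<link rel="search" type="application/opensearchdescription+xml" '
            'href="%s" title="%s"/>' % (escape(osdd_url, {'"': '&quot;'}),
                                        escape(wiki_title, {'"': '&quot;'})))
```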
Admins need the ability to block IP ranges. They can already ban users.
See fileupload for a proposal that grew to encompass the potential to do this.
Posted Fri Apr 6 17:13:38 2007
ikiwiki could support manpages (or general groff input files) and convert them to HTML. --JoshTriplett
Posted Fri Apr 6 17:13:38 2007
Either template or shortcut should support some form of very simple text parsing or regex application, to make it possible to write shortcuts like these:
\[[mmlist listname@lists.example.org]] -> <listname@lists.example.org> ([mailman page](http://lists.example.org/mailman/listinfo/listname))
\[[debcl packagename]] -> [packagename changelog](http://packages.debian.org/changelogs/pool/main/p/packagename/current/changelog)
For shortcut definitions, a match parameter could supply a regex, and then the url and desc parameters could make use of the named or numbered groups from the match.
I'm not comfortable with exposing regexps to web editing. At the very least it's trivial to construct regexps that take indefinitely long to match certain strings, which could be used to DOS ikiwiki. At worst, perl code can be embedded in regexps in a variety of ways that are painful to filter out, and perl's regexp engine could also potentially have bugs that could be exploited by user-supplied regexps.
It seems that a better place to put this kind of text munging is in special-purpose plugins. It should be very simple to write plugins for the above two examples, that look identical to the user as what you described.
--Joey
Fair enough. I only proposed regexes for the purposes of generality. That said, some simple text substitution mechanisms might handle many of these cases without the need for a specialized plugin beyond shortcut. For instance, substring extraction would suffice for the debcl shortcut, and something like a split function would work for the mmlist shortcut.
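The split-and-substring idea above could be sketched as the following pair of transforms. This is a Python sketch of the two example shortcuts (the real plugins would be Perl); the lib-prefix rule in debcl reflects Debian's pool layout and is an assumption beyond the original example.

```python
def mmlist(addr):
    """The mmlist example: split listname@host and link the mailman page."""
    listname, host = addr.split("@", 1)
    return '<%s> ([mailman page](http://%s/mailman/listinfo/%s))' % (
        addr, host, listname)

def debcl(package):
    """The debcl example: substring extraction gives the pool subdirectory
    (first letter, or the first four letters for lib* packages)."""
    pool = package[:4] if package.startswith("lib") else package[0]
    return ('[%s changelog](http://packages.debian.org/changelogs/pool/main/'
            '%s/%s/current/changelog)' % (package, pool, package))
```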
A plugin to generate calendars using pal.
Main issues:
- pal's HTML output is not valid (fixed in pal SVN)
- make sure it's secure
- calendars change with time, so ikiwiki would need to be run from cron daily to update them, and the plugin would need to somehow mark the page as needing a rebuild after time had passed. Similar to aggregate.
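The staleness check behind the last point could be as simple as this. A Python sketch under the assumption that a calendar page's content only changes when the (UTC) day rolls over; the real plugin would need an ikiwiki hook to mark the page as needing a rebuild.

```python
import time

def needs_rebuild(last_rendered, now=None):
    """A calendar page changes when the day rolls over, so rebuild it
    whenever it was last rendered on an earlier (UTC) day than today."""
    now = time.time() if now is None else now
    day = lambda t: time.strftime("%Y-%m-%d", time.gmtime(t))
    return day(last_rendered) < day(now)
```

A daily cron run of ikiwiki would then pick up exactly the pages for which this returns true.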
Create some nice(r) stylesheets.
Should be doable w/o touching a single line of code, just editing the wikitemplates and/or editing style.css.
Posted Sat Mar 31 06:16:59 2007
Supporting or switching to MultiMarkdown would take care of a few of the outstanding feature requests. Quoting from the MultiMarkdown site:
MultiMarkdown is a modification of John Gruber's original Markdown.pl file. It uses the same basic syntax, with several additions:
I have added a basic metadata feature, to allow the inclusion of metadata within a document that can be used in different ways based on the output format.
I have allowed the automatic use of cross-references within a Markdown document. For instance, you can easily jump to [the Introduction][Introduction].
I have incorporated John's proposed syntax for footnotes. Since he has not determined the output format, I created my own. Mainly, I wanted to be able to add footnotes to the LaTeX output; I was less concerned with the XHTML formatting.
Most importantly, however, I have changed the way that the processed output is created, so that it is quite simple to export Markdown syntax to a variety of outputs. By setting the Format metadata to complete, you generate a well-formed XHTML page. You can then use XSLT to convert to virtually any format you like.
MultiMarkdown would solve the BibTex request and the multiple output formats would make the print_link request an easy fix. MultiMarkdown is actively developed and can be found at:
Posted Wed Mar 28 17:45:51 2007
I don't think MultiMarkdown solves the BibTeX request, but it might solve the request for LaTeX output. --JoshTriplett
Unless there's a way to disable a zillion of the features, please no. Do not switch to it. One thing that I like about markdown, as opposed to most other ASCII markup languages, is that it has at least a bit of moderation on the syntax (although it could be even simpler). There's not yet another reserved character lurking behind every corner. Not so in multimarkdown anymore. Footnotes, bibliography, and internal references I could use, and they do not add any complex syntax: it's all inside the already reserved sequences of bracketed stuff. (If you can even say that ASCII markup languages have reserved sequences, as they randomly decide to interpret stuff, never actually failing on illegal input, like a proper language to write any serious documentation in would do.) But tables, math, and so on, no thanks! Too much syntax! Syntax overload! Bzzzt! I don't want mischievous syntaxes lurking behind every corner, out to get me. --tuomov
ikiwiki already supports MultiMarkdown, since it has the same API as Markdown. So if you install it as Markdown.pm (or as /usr/bin/markdown), it should Just Work. It would also be easy to support some other extension such as mmdwn to use multimarkdown installed as MultiMarkdown.pm, if someone wanted to do that for some reason -- just copy the mdwn plugin and lightly modify. --Joey
It would be useful if ikiwiki was able to create google sitemap files to allow easy indexing.
Sitemaps are particularly beneficial when users can't reach all areas of a website through a browseable interface. (Generally, this is when users are unable to reach certain pages or regions of a site by following links). For example, any site where certain pages are only accessible via a search form would benefit from creating a Sitemap and submitting it to search engines.
What I don't get is exactly how ikiwiki, as a static wiki that's quite deeply hyperlinked, benefits from a sitemap. The orphans plugin can produce a map of pages that other pages do not link to, if you're worried about having such pages not found by web spiders.
--Joey
While pages are very interlinked, most people use ikiwiki for blogging. Blogging produces pages at random intervals and google apparently optimizes their crawls to fit the frequency of changes. For me it's not so often that the contents of my blog changes, so google indexes it quite infrequently. Sitemaps are polled more often than other content (if one exists) so it's lighter for the site and for search engines (yes, google) to frequently poll it instead. So it's not that pages can't be found, but it's lighter for the site to keep an up to date index.
-- Sami
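Generating a sitemap from ikiwiki's list of rendered pages is straightforward. A minimal Python sketch following the sitemaps.org protocol, with a hypothetical function name; the real feature would be a Perl plugin with access to ikiwiki's page mtimes.

```python
import time
from xml.sax.saxutils import escape

def sitemap(pages, base_url):
    """pages: iterable of (path, mtime) pairs for rendered files.
    Returns sitemap.xml content per the sitemaps.org protocol."""
    entries = []
    for path, mtime in pages:
        lastmod = time.strftime("%Y-%m-%d", time.gmtime(mtime))
        entries.append("  <url>\n    <loc>%s/%s</loc>\n"
                       "    <lastmod>%s</lastmod>\n  </url>"
                       % (escape(base_url.rstrip("/")), escape(path), lastmod))
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries) + "\n</urlset>\n")
```

The lastmod field is what lets crawlers skip pages that haven't changed, which is the benefit Sami describes.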
Posted Wed Mar 28 17:45:51 2007
I see that ikiwiki already has some bugs stored on the wiki, but adding better support for bug tracking would really make it a good project management system for small projects. Storing the source code, wiki, developer blog, and issue tracker information under the same revision control system really makes sense. At the moment the only part missing from ikiwiki is the bug tracker plugin.
The support would not need to be anything fancy; assignment of bug numbers is perhaps the biggest thing missing when compared to a plain wiki page. Integration with the revision control system a la scmbug would be really neat though, so that bug tracker commands like (closes: #nnn) could be embedded in source code repository commit messages.
Posted Sat Mar 24 15:04:32 2007
A while back I posted some thoughts in my blog about using a wiki for issue tracking. Google's BTS also has some interesting developments along the lines of free-form search-based bug tracking, a style that seems a better fit for wikis than the traditional rigid data of a BTS.
I sorta take your point about bug numbers. It can be a pain to refer to 'using_a_wiki_for_issue_tracking' as a bug name in a place like a changelog.
OTOH, I don't see a need for specially formatted commit messages to be used to close bugs. Instead, if your BTS is kept in an ikiwiki wiki in an RCS along with your project, you can do like I do here, and just edit a bug's page, tag it done, and commit that along with the bug fix. --Joey
I think a little bit more structure than in a normal wiki would be good to have for bug tracking. Bug numbers, automatic timestamps on comments and maybe an email interface would be nice. The resulting page may not look like a wikipage anymore, but rather something like the Debian BTS web-interface, but it would still benefit from wikilinks to the documentation in the wiki etc.
About the commit message interface: I was thinking about a project architecture where the code is kept in a separate revision control system branch than the metadata (blog, wiki, BTS). This would IMHO be a cleaner solution for distributing the source and making releases etc. For this kind of setup, having the BTS scan the messages of the source branch (by a commit-hook for example) would be useful.
By Google BTS, do you mean the issue tracker in the Google code project hosting?
--Teemu
inline could have a pagespec-specific show=N option, to say things like "10 news items (news/), but at most 3 news items about releases (news/releases/)".
This should eliminate the need for wikiannounce to delete old news items about releases.
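The per-pagespec cap could work like this. A Python sketch using shell-style globs as a stand-in for ikiwiki's much richer pagespecs; the function name and rule format are made up for illustration.

```python
from fnmatch import fnmatch

def select_items(pages, rules):
    """pages: page names, assumed already sorted newest-first.
    rules: list of (glob, limit) pairs. Each page counts against every
    rule it matches; a page is dropped once any matching rule is full."""
    counts = {glob: 0 for glob, _ in rules}
    shown = []
    for page in pages:
        matching = [(g, n) for g, n in rules if fnmatch(page, g)]
        if any(counts[g] >= n for g, n in matching):
            continue
        for g, _ in matching:
            counts[g] += 1
        shown.append(page)
    return shown
```

With rules [("news/releases/*", 3), ("news/*", 10)] this yields at most 10 news items, of which at most 3 are about releases, matching the example above.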
Posted Fri Mar 23 20:07:03 2007
Consider the "All files in this package" search on http://packages.debian.org. The URL for such a search looks like this:
http://packages.debian.org/cgi-bin/search_contents.pl?word=packagename&searchmode=filelist&case=insensitive&version=unstable&arch=i386
To create a "debfiles" shortcut that takes a package name, you could just hardcode the architecture and distribution:
[[shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=unstable&arch=i386"]]
[[debfiles ikiwiki]]
But what if you could have them as optional parameters instead? The syntax for the invocation should look like this:
[[debfiles ikiwiki dist=testing]]
Some possible syntax choices for the shortcut definition:
[[shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=%(dist)s&arch=%(arch)s" dist="unstable" arch="i386"]]
[[shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=%(dist=unstable)s&arch=%(arch=i386)s"]]
[[shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=%{dist=unstable}&arch=%{arch=i386}"]]
[[shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=$*&searchmode=filelist&case=insensitive&version=${dist=unstable}&arch=${arch=i386}"]]
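The third syntax proposal above (%{name=default}) could be implemented with a small substitution pass. A Python sketch with an invented function name; the real feature would live in the (Perl) shortcut plugin.

```python
import re

def expand(url_template, query, **params):
    """Expand %s with the query, and %{name=default} with a supplied
    parameter or its default, per the proposed shortcut syntax."""
    def repl(match):
        name, default = match.group(1), match.group(2)
        return params.get(name, default)
    expanded = re.sub(r"%\{(\w+)=([^}]*)\}", repl, url_template)
    return expanded.replace("%s", query)
```

So [[debfiles ikiwiki dist=testing]] would expand the template with query "ikiwiki" and dist="testing", while plain [[debfiles ikiwiki]] falls back to the defaults baked into the definition.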
Well, you can already do this kind of thing with templates. Invocation does look different:
[[template id=debfiles package=ikiwiki dist=testing]]
--Joey
Posted Wed Mar 21 08:12:48 2007
I think I would find templates sufficient, if:
- I could use the name of the template as a preprocessor directive ([[templatename ...]]), rather than using the template directive with an id argument ([[template id=templatename]]).
- Template invocation allowed bare values in addition to key=value arguments, and template definition supported some means to access the value. This would allow [[debfiles ikiwiki]] rather than [[debfiles package=ikiwiki]].
- I could use ikiwiki syntax in the template, not just HTML and HTML::Template. (If I can already do that, then template should make that more clear.)
- Need to get the post-commit hook working (or an example of how to use it).
- rcs_notify is not implemented
- Is the code sufficiently robust? It just warns when mercurial fails.
- When rcs_commit is called with a $user that is an openid, it will be passed through to mercurial -u. Will mercurial choke on this?
- The rcs_commit implementation seems not to notice if the file has been changed since a web edit started. All the other frontends use the rcstoken to detect if the web commit started editing an earlier version of the file, and if so, merge the two sets of changes together. It seems that with the current mercurial commit code, it will always blindly overwrite the current file with the web-edited version, losing any other changes.
creating a gallery of a bunch of images:
- Display Exif information
- Display image information (like size, date, resolution, compression...)
- Create CSS data for customizing
- Create Thumbnails (maybe in more than one size, eg: full,1024x768,800x600,640x480)
- Descriptions for every image
- Comments
- Ratings
- Watermarks
- Some javascript for easy navigation (see photon for a good example)
It should be possible to disable every feature for every directory.
This could be split into two distinct projects. One would be to modify the img plugin to support some of these ideas for extracting and using information such as exif out of images. The other project would be to design something that handles setting up a gallery, which could be just some regular wiki pages using the img plugin (and perhaps some other custom plugins for things like ratings and javascript), and adding new images to a gallery as they are added to the wiki.
That's one way to do it, and it has some nice benefits, like being able to edit the gallery pages like any wiki page, to add comments about images, links, etc. An example of ikiwiki being used like that: http://kitenet.net/~family/pics/guaimaca.html (still room for improvement, clearly).
--Joey
Posted Fri Mar 16 19:19:01 2007
Add an option to use the Showdown GUI for editing or adding content. It is BSD-licensed javascript that shows the rendered Markdown (or HTML) while editing.
A demo is at http://www.attacklab.net/showdown-gui.html
(I read about this on the markdown mailing list.)
Posted Fri Mar 2 01:14:33 2007
Wikiwyg can also provide a nice GUI for editing, although it would need to be expanded to support markdown. The benefit compared to Showdown is that it provides a GUI for editing with widgets for italics, etc., whereas Showdown still leaves input in markdown and seems more geared to a fast preview of the HTML. --Joey
Some inconsistencies around the toplevel index page:
- ikiwiki is a separate page; links to ikiwiki should probably go to the index instead.
- The toplevel Discussion page has some weird parentlinks behavior. This could be special cased around with the following patch. However, I'm unsure if I like the idea of more special cases around this. It would be better to find a way to make the toplevel index page not be a special case at all.
Here is a patch:
--- IkiWiki/Render.pm (revision 1187)
+++ IkiWiki/Render.pm (working copy)
@@ -71,6 +71,7 @@
my $path="";
my $skip=1;
return if $page eq 'index'; # toplevel
+ $path=".." if $page=~s/^index\///;
foreach my $dir (reverse split("/", $page)) {
if (! $skip) {
$path.="../";
Posted Thu Mar 1 22:52:05 2007
I would like to suggest another tack, namely a bigger, better special case. The basic idea is that all indices of the form foo/bar/index get the wiki path foo/bar. This makes some things more elegant:
- All files having to do with foo/bar are in the foo/bar directory, rather than the (admittedly minor) wart of having the index be in foo/.
- This sort of addresses broken parentlinks in that example/ is guaranteed to be a valid path. (There might be no index there, though.)
- This is more in line with standard HTML practice, as far as I understand it, namely that linking to a/b means a/b/index.html rather than a/b.html.
This would change the inline plugin in strange ways -- I think if foo/index.html contains [[inline "* and !*/Discussion"]], it should skip inlining foo/index.html explicitly, but would inline index pages in child directories foo/bar/baz/index.html as bar/baz.
It always bothers me that foo/bar/ files need a foo/bar.html in front of them, rather than a foo/bar/index.html, as is (to my mind) traditional.
Ethan
Hmm, now I've had time to think about this, and this does conflict pretty hard with foo.html/Discussion pages. Well, back to the drawing board.
Well, it seems unlikely that you'll have both foo/bar.html and foo/bar/index.html, so why not accept either as foo/bar? This would both preserve backwards compatibility, as well as allow foo/bar/Discussion.
Ethan
No, in order for this to work, the wiki path foo/bar/baz could be any of:
- foo/bar/baz.html
- foo/index/bar/index/baz.html
- foo/bar/index/baz.html
- foo/bar/index/baz/index.html
Or many others. Which is probably even hackier than having both foo.html and foo/.
Ethan
I thought I'd draw attention to a desire of mine for ikiwiki. I'm no power-user, and mostly I do fairly simple stuff with my wiki.
However, I would like the ability (now) to rename/move/delete pages. As part of having a genealogy wiki, I've put name and dates of birth/death as part of the title of each article (so to avoid cases where people have the same name, but are children/cousins/etc of others with that name). However, some of this information changes. For instance, I didn't know a date of death and now I do, or I had it wrong originally, or it turns out someone is still alive I didn't know about. All of these cases leave me with bad article titles.
So, I can go ahead and move the file to a new page with the correct info, orphan that page, provide a link for the new page if desired, and otherwise ignore that page. But then, it clutters up the wiki and serves no useful purpose.
Anyway to consider implementing rename/move/delete ? I certainly lack the skills to appreciate what this would entail, but feel free to comment if it appears impossible, and then I'll go back to the aforementioned workaround. I would prefer simple rename, however.
Thanks again to Joey for putting ikiwiki together. I love the program.
Kyle
The MediaWiki moving/renaming mechanism is pretty nice. It's easy to get a list of pages that point to the current page. When renaming a page it sticks a forwarding page in the original place. The larger the size of the wiki the more important organization tools become.
I see the need for:
- a new type of file to represent a forwarding page
- a rename tool that can
- move the existing page to the new name
- optionally drop a forwarding page
- optionally rewrite incoming links to the new location
Brad
This could be implemented through the use of an HTTP redirect to the new page, but this has the downside that people may not know they're being redirected.
This could also be implemented using a combination of raw inline and meta to change the title (adding a "redirected from etc." note). This could be done with a plugin. A redirect page would be [[redirect page="newpage"]]. But then if you click "edit" on this redirect page, you won't be able to edit the new page, only the call to redirect. --Ethan
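A forwarding page along these lines could be generated as a meta-refresh stub with a visible link, which also addresses the downside that readers might not know they are being redirected. A Python sketch with a made-up function name; a real implementation would be a Perl plugin emitting this through ikiwiki's templates.

```python
def redirect_stub(new_url, old_title):
    """HTML for a forwarding page: an instant meta refresh plus a visible
    link, so readers can see where they are being sent."""
    return """<html><head>
<meta http-equiv="refresh" content="0; url={url}">
<title>{title} (moved)</title>
</head><body>
<p>This page has moved to <a href="{url}">{url}</a>.</p>
</body></html>""".format(url=new_url, title=old_title)
```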
Note that there is a partial implementation in the patchqueue.
Posted Fri Feb 23 19:21:32 2007
(moved from an item on the main discussion page.)
ikiwiki could have a CVS backend.
Original discussion:
Posted Wed Feb 21 01:03:25 2007
Any examples of using ikiwiki with cvs?
No, although the existing svn backend could fairly easily be modified into a CVS backend, by someone who doesn't mind working with CVS. --Joey
When editing a page, it would help to have a "preview changes" or "show diff" button, which brings up a diff from the current page content to the proposed new page content. --JoshTriplett
Some discussion from the main discussion page:
Posted Wed Feb 21 01:03:25 2007
It would be nice to be able to have a button to show "Differences" (or "Show Diff") when editing a page. Is that an option that can be enabled?
It's doable, it could even be done by a plugin, I think. --Joey
Given network access when building the wiki, ikiwiki could retrieve information from bug-tracking systems and apply that information to BTS links. For instance, consider how links from one bugzilla bug to another use strikeout formatting for links to fixed bugs, and use the status and summary of the bug to the link as a title.
This seems somewhat difficult, as ikiwiki would need to maintain a cache of the remote BTS information, and know how to talk to various types of BTS. CPAN modules exist to solve this problem for some types of BTS.
scmbug might help here. --JoshTriplett
Posted Wed Feb 21 01:03:25 2007
After using it for a while, my feeling is that HyperEstraier, as used in the search plugin, is not robust enough for ikiwiki. It doesn't upgrade well, and it has a habit of crashing with signal 11 on certain input from time to time.
So some other engine should be found and used instead.
Enrico had one that he was using for debtags stuff that looked pretty good. That was Xapian, which has perl bindings in libsearch-xapian-perl. The nice thing about xapian is that it does a ranked search so it understands what words are most important in a search. (So does Lucene..) Another nice thing is it supports "more documents like this one" kind of search. --Joey
I've done a bit of prototyping on this. The current hip search library is Lucene. There's a Perl port called Plucene. Given that it's already packaged, as libplucene-perl, I assumed it would be a good starting point. I've written a very rough patch against IkiWiki/Plugin/search.pm to handle the indexing side (there's no facility to view the results yet, although I have a command-line interface working). That's below, and should apply to SVN trunk.
Of course, there are problems.
- Plucene throws up a warning when running under Taint mode. There's a patch on the mailing list, but I haven't tried applying it yet. So for now you'll have to build IkiWiki with NOTAINT=1 make install.
- If I kill ikiwiki while it's indexing, I can screw up Plucene's locks. I suspect that this will be an easy fix.
There is a C++ port of Lucene which is packaged as libclucene0. The Perl interface to this is called Lucene. This is supposed to be significantly faster, and presumably won't have the taint bug. The API is virtually the same, so it will be easy to switch over. I'd use this now, were it not for the lack of a package. (I assume you won't want to make core functionality depend on installing a module from CPAN.) I've never built a Debian package before, so I can either learn and then try building this, or somebody else could do the honours.
If this seems a sensible approach, I'll write the CGI interface, and clean up the plugin. -- Ben
The weird thing about lucene is that these are all reimplementations of it. Thank you java.. The C++ version seems like a better choice to me (packages are trivial). --Joey
Might I suggest renaming the "search" plugin to "hyperestraier", and then creating new search plugins for different engines? No reason to pick a single replacement. --JoshTriplett
Index: IkiWiki/Plugin/search.pm
===================================================================
--- IkiWiki/Plugin/search.pm	(revision 2755)
+++ IkiWiki/Plugin/search.pm	(working copy)
@@ -1,33 +1,55 @@
 #!/usr/bin/perl
-# hyperestraier search engine plugin
 package IkiWiki::Plugin::search;
 
 use warnings;
 use strict;
 use IkiWiki;
+use Plucene::Analysis::SimpleAnalyzer;
+use Plucene::Document;
+use Plucene::Document::Field;
+use Plucene::Index::Reader;
+use Plucene::Index::Writer;
+use Plucene::QueryParser;
+use Plucene::Search::HitCollector;
+use Plucene::Search::IndexSearcher;
+
+#TODO: Run the Plucene optimiser after a rebuild
+#TODO: CGI query interface
+
+my $PLUCENE_DIR;
+# $config{wikistatedir} may not be defined at this point, so we delay
+# setting $PLUCENE_DIR until a subroutine actually needs it.
+sub init () {
+	error("Plucene: Statedir <$config{wikistatedir}> does not exist!")
+		unless -e $config{wikistatedir};
+	$PLUCENE_DIR = $config{wikistatedir}.'/plucene';
+}
 
 sub import { #{{{
-	hook(type => "getopt", id => "hyperestraier",
-		call => \&getopt);
-	hook(type => "checkconfig", id => "hyperestraier",
+	hook(type => "checkconfig", id => "plucene",
 		call => \&checkconfig);
-	hook(type => "pagetemplate", id => "hyperestraier",
-		call => \&pagetemplate);
-	hook(type => "delete", id => "hyperestraier",
+	hook(type => "delete", id => "plucene",
 		call => \&delete);
-	hook(type => "change", id => "hyperestraier",
+	hook(type => "change", id => "plucene",
 		call => \&change);
-	hook(type => "cgi", id => "hyperestraier",
-		call => \&cgi);
 } # }}}
 
-sub getopt () { #{{{
-	eval q{use Getopt::Long};
-	error($@) if $@;
-	Getopt::Long::Configure('pass_through');
-	GetOptions("estseek=s" => \$config{estseek});
-} #}}}
+sub writer {
+	init();
+	return Plucene::Index::Writer->new(
+		$PLUCENE_DIR, Plucene::Analysis::SimpleAnalyzer->new(),
+		(-e "$PLUCENE_DIR/segments" ? 0 : 1));
+}
+
+#TODO: Better name for this function.
+sub src2rendered_abs (@) {
+	return map { Encode::encode_utf8($config{destdir}."/$_") }
+		map { @{$renderedfiles{pagename($_)}} }
+		grep { defined pagetype($_) } @_;
+}
 
 sub checkconfig () { #{{{
 	foreach my $required (qw(url cgiurl)) {
 		if (! length $config{$required}) {
@@ -36,112 +58,55 @@
 	}
 } #}}}
 
-my $form;
-sub pagetemplate (@) { #{{{
-	my %params=@_;
-	my $page=$params{page};
-	my $template=$params{template};
+#my $form;
+#sub pagetemplate (@) { #{{{
+#	my %params=@_;
+#	my $page=$params{page};
+#	my $template=$params{template};
+#
+#	# Add search box to page header.
+#	if ($template->query(name => "searchform")) {
+#		if (! defined $form) {
+#			my $searchform = template("searchform.tmpl", blind_cache => 1);
+#			$searchform->param(searchaction => $config{cgiurl});
+#			$form=$searchform->output;
+#		}
+#
+#		$template->param(searchform => $form);
+#	}
+#} #}}}
 
-	# Add search box to page header.
-	if ($template->query(name => "searchform")) {
-		if (! defined $form) {
-			my $searchform = template("searchform.tmpl", blind_cache => 1);
-			$searchform->param(searchaction => $config{cgiurl});
-			$form=$searchform->output;
-		}
-
-		$template->param(searchform => $form);
-	}
-} #}}}
-
 sub delete (@) { #{{{
-	debug(gettext("cleaning hyperestraier search index"));
-	estcmd("purge -cl");
-	estcfg();
+	debug("Plucene: purging: ".join(',',@_));
+	init();
+	my $reader = Plucene::Index::Reader->open($PLUCENE_DIR);
+	my @files = src2rendered_abs(@_);
+	for (@files) {
+		$reader->delete_term(
+			Plucene::Index::Term->new({ field => "id", text => $_ }));
+	}
+	$reader->close;
 } #}}}
 
 sub change (@) { #{{{
-	debug(gettext("updating hyperestraier search index"));
-	estcmd("gather -cm -bc -cl -sd",
-		map {
-			Encode::encode_utf8($config{destdir}."/".$_)
-				foreach @{$renderedfiles{pagename($_)}};
-		} @_
-	);
-	estcfg();
+	debug("Plucene: updating search index");
+	init();
+	#TODO: Do we want to index source or rendered files?
+	#TODO: Store author, tags, etc. in distinct fields; may need new API hook.
+	my @files = src2rendered_abs(@_);
+	my $writer = writer();
+
+	for my $file (@files) {
+		my $doc = Plucene::Document->new;
+		$doc->add(Plucene::Document::Field->Keyword(id => $file));
+		my $data;
+		eval { $data = readfile($file) };
+		if ($@) {
+			debug("Plucene: can't read <$file> - $@");
+			next;
+		}
+		debug("Plucene: indexing <$file> (".length($data).")");
+		$doc->add(Plucene::Document::Field->UnStored('text' => $data));
+		$writer->add_document($doc);
+	}
 } #}}}
-
-sub cgi ($) { #{{{
-	my $cgi=shift;
-
-	if (defined $cgi->param('phrase') || defined $cgi->param("navi")) {
-		# only works for GET requests
-		chdir("$config{wikistatedir}/hyperestraier") || error("chdir: $!");
-		exec("./".IkiWiki::basename($config{cgiurl})) || error("estseek.cgi failed");
-	}
-} #}}}
-
-my $configured=0;
-sub estcfg () { #{{{
-	return if $configured;
-	$configured=1;
-
-	my $estdir="$config{wikistatedir}/hyperestraier";
-	my $cgi=IkiWiki::basename($config{cgiurl});
-	$cgi=~s/\..*$//;
-
-	my $newfile="$estdir/$cgi.tmpl.new";
-	my $cleanup = sub { unlink($newfile) };
-	open(TEMPLATE, ">:utf8", $newfile) || error("open $newfile: $!", $cleanup);
-	print TEMPLATE IkiWiki::misctemplate("search",
-		"\n\n\n\n\n\n",
-		baseurl => IkiWiki::dirname($config{cgiurl})."/") ||
-			error("write $newfile: $!", $cleanup);
-	close TEMPLATE || error("save $newfile: $!", $cleanup);
-	rename($newfile, "$estdir/$cgi.tmpl") ||
-		error("rename $newfile: $!", $cleanup);
-
-	$newfile="$estdir/$cgi.conf";
-	open(TEMPLATE, ">$newfile") || error("open $newfile: $!", $cleanup);
-	my $template=template("estseek.conf");
-	eval q{use Cwd 'abs_path'};
-	$template->param(
-		index => $estdir,
-		tmplfile => "$estdir/$cgi.tmpl",
-		destdir => abs_path($config{destdir}),
-		url => $config{url},
-	);
-	print TEMPLATE $template->output || error("write $newfile: $!", $cleanup);
-	close TEMPLATE || error("save $newfile: $!", $cleanup);
-	rename($newfile, "$estdir/$cgi.conf") ||
-		error("rename $newfile: $!", $cleanup);
-
-	$cgi="$estdir/".IkiWiki::basename($config{cgiurl});
-	unlink($cgi);
-	my $estseek = defined $config{estseek} ? $config{estseek} : '/usr/lib/estraier/estseek.cgi';
-	symlink($estseek, $cgi) || error("symlink $estseek $cgi: $!");
-} # }}}
-
-sub estcmd ($;@) { #{{{
-	my @params=split(' ', shift);
-	push @params, "-cl", "$config{wikistatedir}/hyperestraier";
-	if (@_) {
-		push @params, "-";
-	}
-
-	my $pid=open(CHILD, "|-");
-	if ($pid) {
-		# parent
-		foreach (@_) {
-			print CHILD "$_\n";
-		}
-		close(CHILD) || print STDERR "estcmd @params exited nonzero: $?\n";
-	}
-	else {
-		# child
-		open(STDOUT, "/dev/null"); # shut it up (closing won't work)
-		exec("estcmd", @params) || error("can't run estcmd");
-	}
-} #}}}
-
-1
+1;
Posted Wed Feb 21 00:12:53 2007
What do you think about refreshing the RecentChanges page (via a Meta Refresh tag)? It can be useful for users like me, who would rather watch the last changes in a WWW browser tab than subscribe to the page. --Pawel
Posted Tue Feb 20 00:27:48 2007
Depends; if it were done, the time period should be made configurable. Unwanted server load due to refreshing could be a problem for some. --Joey
Yes, it should be configurable by the ikiwiki admin. I believe he's not stupid, and he won't set a refresh period so short that it kills his server.
I propose to add a recentchanges_refresh variable to the ikiwiki setup for setting the refresh period. If it's not defined, then ikiwiki doesn't put the refresh meta tag into recentchanges.tmpl. Do you like it? --Pawel
Seems reasonable --Joey
Sounds like a client-side issue, not an ikiwiki issue. Grab the ReloadEvery extension for Firefox/Iceweasel, and use that to periodically refresh any page you want. --JoshTriplett
A speel chek plug-in woold be fantaztik. Anyone working on this?
Knot adz fair ass eye no --Joey
Firefox 2 (or whatever it will be in Debian) does this for you, and then there's the mozex extension
Perhaps I'm just too stupid to find the proper way to do this, but how would I add a new page to the wiki without selecting to edit an already installed one and frobbing the URL to direct to the to-be-created page? --ThomasSchwinge
Good point. Of course one way is to start with creating a link to the page, which also helps prevent orphans. But other wikis based on CGI do have this a bit easier, since they can detect an attempt to access a nonexistent page and show an edit page. Ikiwiki can't do that (unless its web server is configured to do smart things on a 404, like maybe call ikiwiki.cgi, which could be modified to work as a smart 404 -> edit handler).
Some wikis also provide a UI means for creating a new page. If we can find something good, that can be added to ikiwiki's UI. --Joey
Hmm, maybe just a preprocessor directive that creates a form inside a page, like is used for blog posting already would suffice? Then the main page of a wiki could have a form for adding new pages, if that directive were included there. Won't work for subpages though, unless the directive were added to the parent page. However, unconnected subpages are surely an even rarer thing to want than unconnected top level pages. --Joey
Here is a simple plugin that does that. Perhaps options could be added to it, but I couldn't really think of any. http://jameswestby.net/scratch/create.diff -- JamesWestby
Maybe a very simple PHP frontend for serving the statically generated pages, that would display a page editing form or something like that for non-existent pages, wouldn't be too bad a thing and resource hog? Just a thought... --Tuomov
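To sketch the 404-handler idea mentioned above: with Apache, the server side could be a single directive (the `ErrorDocument` directive is standard Apache; the edit-on-404 behaviour inside ikiwiki.cgi does not exist yet and would have to be written):

```
# Hand 404s to the wiki CGI. A modified ikiwiki.cgi could inspect the
# REDIRECT_URL environment variable Apache sets for error documents and
# present a "create this page" edit form -- that handler is hypothetical.
ErrorDocument 404 /ikiwiki.cgi
```

This keeps all existing pages fully static; only requests for missing pages would ever hit the CGI.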
Wishlist: optionally use the syntax plugin automatically on source code files in the repository with recognized extensions or shebangs, and render them as though they consisted of an .mdwn page containing nothing but a single call to the syntax plugin with the file contents as the text argument and the recognized type as the type argument.
Together with the ability to have wiki-formatted comments, this would allow the use of ikiwiki for literate programming.
I would love to see a plugin that lets you create one or more BibTeX-formatted bibliography pages and add citations to other pages. The plugin could also render the bibliographies themselves using a chosen BibTeX style and an HTML formatter for LaTeX (such as HeVeA).
- Need to get post-commit hook code working.
- Need some example URLs for web-based diffs.
Wishlist item: I'd love to see the ability to optionally switch back to wiki syntax within the comments of code pretty-printed with the syntax plugin. This would allow the use of links and formatting in comments.
To avoid the two-step posting process of typing a page name, hitting "Edit", entering content, and hitting "Save Page", how about optionally including a post content field, save button, and preview button directly on the page with the inline? This would particularly help when using an inline directive for a comment form at the bottom of a blog post; with these added fields, the post form becomes exactly like the typical blog comment form.
I agree that having this as an option is reasonable, although it would take a fair amount of work. --Joey
How about a direct link from the page header to the source of the latest version, to avoid the need to either use edit or navigate to the current version via the history link?
Commit messages should allow wiki syntax, and RecentChanges should format them accordingly.
That's a neat idea! It would probably have to be only the simpler bits, without preprocessor directives -- wouldn't want a commit message inlining a whole page into RecentChanges. Of course, it could only use one of the available markups, ie the default markdown. --Joey
To go along with this, the preview should show the formatted commit message. --JoshTriplett
I'd like the ability to use a shortcut, but declare an explicit link text rather than using the link text defined on shortcuts. For example, if I create a shortcut `protogit` pointing to files in the xcb/proto.git gitweb repository, I don't always want to use the path to the file as the link text; I would like to link to src/xcb.xsd, but use the link text "XML Schema for the X Window System protocol". --JoshTriplett
If I understand you correctly, you can use Markdown: `[your link text](the path or URL)`. Using your example: XML Schema for the X Window System protocol.
If I don't understand this, can you give an HTML example? --JeremyReed
The problem is that shortcuts don't let you specify link text, and don't escape from Markdown. We would like to use the shortcuts plugin but add a descriptive text -- in this case [[xcbgit src/xcb.xsd|XML Schema...]]. The file src/xcb.xsd could be any URL, and the point of shortcuts is that you get to shorten it. --Ethan
Some clarifications: You can always write something like

    [XML Schema for the X Window System Protocol](http://gitweb.freedesktop.org/?p=xcb/proto.git;a=blob;hb=HEAD;f=src/xcb.xsd)
to get XML Schema for the X Window System Protocol. However, I want to define a shortcut to save the typing. If I define something like `protogit` pointing to `http://gitweb.freedesktop.org/?p=xcb/proto.git;a=blob;hb=HEAD;f=%s`, then I can write `[[protogit src/xcb.xsd]]`; however, I then can't change the link text to anything other than what the shortcut defines as the link text. I want to write something like `[[XML Schema for the X Window System Protocol|protogit src/xcb.xsd]]`, just as I would write a wikilink with the link text first to get the shortcuts on this wiki. (The order you suggest, with the preprocessor directive first, seems quite confusing since wikilinks work the other way around.) --JoshTriplett

How about `[xcbgit XML_Schema|src/xcb.xsd]`? That's the same way round as a wikilink, if you look at it the right way. The syntax Josh suggests is not currently possible in ikiwiki.
However.. Short wikilinks has some similar objectives in a way, and over there a similar syntax to what Josh proposes was suggested. So maybe I should modify how ikiwiki preprocessors work to make it doable. Although, I seem to have come up with a clear alternative syntax over there. --Joey
One possible alternative would be a general `[[url ...]]` scheme for all kinds of links. As mentioned in Short wikilinks, I have wanted a way to enter links to the wiki with markdown-style references, specifying the actual target elsewhere from the text, with just a short reference in the text. To facilitate automatic conversion from an earlier (already markdownised) "blog", I finally ended up writing a custom plugin that simply gets the location of a wiki page, and uses markdown mechanisms:
    Here [is][1] a link.

    [1]: \[[l a_page_in_the_wiki]]
Obviously [this](\[[l another_page]]) also works, although the syntax is quite cumbersome.
So the 'l' plugin inserts the location of the page there, and markdown does the rest. My plugin currently fails if it can't find the page, as that is sufficient for my needs. Differing colouring for non-existing pages is not doable in a straightforward manner with this approach.
For external links, that is no concern, however. So you could define for each shortcut an alternative directive that inserts the URL. Perhaps `[[url shortcutname params]]` or `\[[@shortcutname params]]` (if the preprocessor supported the @), and this could be extended to local links in an obvious manner: `[[url page]]` or `@page`. Now, if you could just get rid of the parentheses for markdown, for the short inline links... --tuomov (who'd really rather not have two separate linking mechanisms: ikiwiki's heavy syntax and markdown's lighter one)
Markdown supports nice short links to external sites within body text by references defined elsewhere in the source:
    foo [bar][ref]

    [ref]: http://example.invalid/
It would be nice to be able to do this or something like this for wikilinks as well, so that you can have long page names without the links cluttering the body text. I think the best way to do this would be to move wikilink resolving after HTML generation: parse the HTML with a proper HTML parser, and replace relative links with links to the proper files (plus something extra for missing pages).
That's difficult to do and have reasonable speed as well. Ikiwiki needs to know all about all the links between pages before it can know what pages it needs to build, so it can update backlink lists, update links to point to new/moved pages, etc. Currently it accomplishes this with a first pass that scans new and changed files, and quickly finds all the wikilinks using a simple regexp. If it had to render the whole page before it was able to scan for hrefs using an html parser, this would make it at least twice as slow, or would require it to cache all the rendered pages in memory to avoid re-rendering. I don't want ikiwiki to be slow or use excessive amounts of memory. YMMV. --Joey
Or you could disk cache the incomplete page containing only the body text, which should often not need re-rendering, as most alterations consist of changing exactly the link targets, and we can know which pages exist before rendering a single page. Then, after backlinks have been resolved, it would suffice to feed this body text from the cache file to the template. However, e.g. the inline plugin would demand extra rendering after the depended-upon pages have been rendered, but such pages should usually not be that frequent, or contain that many other pages in full. (And for 'archive' pages we don't need to remember that much information from the semi-inlined pages.) It would help if you could get data structures instead of HTML text from the HTMLizer, and then simply cache these data structures in some quickly-loadable form (which I suppose perl itself has support for). Regexp hacks are so ugly compared to actually parsing a properly-defined syntax...
A related possibility would be to move a lot of "preprocessing" after HTML generation as well (thus avoiding some conflicts with the htmlifier), by using special tags for the preprocessor stuff. (The old preprocessor could simply replace links and directives with appropriate tags, that the htmlifier is supposed to let through as-is. Possibly the htmlifier plugin could configure the format.)
Or using postprocessing, though there are problems with that too and it doesn't solve the link scanning issue.
Other alternatives would be:

- to understand the source format, but this seems too much work with all the supported formats; or
- something like the shortcut plugin for external links, with additional support for specifying the link text, but the syntax would be much more cumbersome then.
I agree that a plugin would probably be more cumbersome, but it is very doable. It might look something like this:
    [[link bar]]
    [[link bar=VeryLongPageName]]
This is, however, still missing specifying the link text, and adding that option would seem to me to complicate the plugin syntax a lot, unless support is added for the |-syntax for specifying a particular parameter to every plugin.
Well, the link text in my example above is "bar". It's true that if you want to use the same link text for multiple distinct links, or different link texts for the same link, this is missing a useful layer of indirection; it's optimised for the (probably) more common case. It could be done as a degenerate form of the syntax I propose below, BTW. --Joey
... Returning to this, the syntax in fact wouldn't be so bad with the |-syntax, given a short name for the plugin:

    [[whatever|ref 1]]
    [[ref 1=page_with_long_name]]
A way to do this that doesn't need hacking at the preprocessor syntax follows: --Joey
    [[link bar=1]]
    [[dest 1=page_with_long_name]]
But this doesn't work so well for links that aren't valid keys. Such as stuff with spaces in it. I'd like to be able to write any kind of links conveniently, not just something that looks like a wikilink.
You're right, and to fix that it could be turned around: --Joey
    [[link 1=bar]]
    [[dest 1=page_with_long_name]]
It also shouldn't be difficult to support non-wiki links in this same way, so that you could still link everywhere in a uniform manner, as the (still preferred by me) HTML processing approach would provide. Perhaps a plugin call wouldn't even be necessary for the links themselves: what about aliases for the normal link mechanism? Although the 'ref' call may in fact be cleaner, and adding that |-syntax for plugins could offer other possibilities for other plugins.
I agree, it should be easy to make it support non-wiki links too. We seem to have converged at something we can both live with that's reasonable to implement.. --Joey
Along the same lines as having a default name for new posts, an option to include default content in a new inline post would help with tasks like using an inline for a comment form on each new blog post. --JoshTriplett
No, it would only help if the new blog post were being made via the form. If you're editing it in vi, and committing, it doesn't help.
This is another reason why I prefer the approach in discussion page as blog --Joey
This feature would also allow the automatic inclusion of a given template in every new post, which could help for plugins (automatically use the plugin template), or for bugs and todo items (automatically use a template that appends "(done)" to the title if the page links to "done"). --JoshTriplett
Stuff still needing to be done with tags:
It's unfortunate that the rss category (tag) support doesn't include a domain="" attribute in the category elements. That would let readers know how to follow back to the tag page in the wiki. However, the domain attribute is specified to be the base url, to which the category is just appended. So there's no way to add ".html", so the url won't be right.
This is one good argument for changing ikiwiki so that pages are all dir/index.html, then a link to just "dir" works.
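For illustration of the problem above (URLs hypothetical): per the RSS 2.0 spec, the `domain` attribute identifies the categorization taxonomy, and readers that reconstruct a link typically just append the category value to it, so a tag page's ".html" suffix can't be expressed:

```xml
<!-- a reader joining domain + value derives
     http://example.com/tags/mytag
     but the wiki page actually lives at
     http://example.com/tags/mytag.html -->
<category domain="http://example.com/tags/">mytag</category>
```

With dir/index.html-style pages, the derived URL `http://example.com/tags/mytag` would resolve correctly on its own.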
Ikiwiki has already been optimised a lot, however..
Look at splitting up CGI.pm. But note that too much splitting can slow perl down.
The backlinks calculation code is still O(N^2) on the number of pages. If backlinks info were stored in the index file, it would go down to constant time for iterative builds, though still N^2 for rebuilds.
Since `/` now works in WikiLinks to anchor links to the root of the site (for instance, `/index`), `/` alone should work as a reference to the top-level index page (for instance, a link named "Home"). --JoshTriplett
Any way to use inline
but point the feed links to a different feed on the
same site? I have news in news/*, a news archive in news.mdwn, and the
first few news items on index.mdwn, but I don't really want two separate
feeds, one with all news and one with the latest few articles; I'd rather
point the RSS feed links of both to the same feed. (Which one, the one
with all news or the one with the latest news only, I don't know yet.)
Not currently. It could be implemented, or you could just turn off the rss feed for the index page, and manually put in a wikilink to the news page and rss feed. --Joey
That wouldn't use the same style for the RSS and Atom links, and it wouldn't embed the feed link into
<head>
so that browsers can automatically find it.
Feature idea: I'd like to be able to tag pages in an ikiwiki blog with a publication date, and have the option of building a blog that excludes publication dates in the future. (meta pubdate= ?)
I'm using ikiwiki on git for a "tip of the day" RSS feed, and I'd like to be able to queue up a bunch of items instead of literally putting in one tip per day. In the future I think this will come in handy for other Mainstream Media-oriented requirements such as "embargo dates" and "editor on vacation".
The problem with implementing a feature like this is that, since ikiwiki is a wiki compiler, if something causes content to change based on the date, then the wiki needs to be rebuilt periodically. So you'd need a cron job or something.
Implementing this feature probably needs plugin dependency calculation to be implemented. --Joey
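A sketch of the cron-job half of this (paths hypothetical): `ikiwiki --setup ... --refresh` is the normal way to rebuild only what changed, so a daily refresh would publish any queued post whose pubdate has arrived, assuming the pubdate feature made such pages register a dependency on the current date:

```
# hypothetical crontab entry: refresh the wiki once a day at 04:00
0 4 * * *    ikiwiki --setup /home/wiki/ikiwiki.setup --refresh
```

How often to run it bounds how stale the "embargoed" posts can be; hourly would work the same way at higher cost.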
It would help to allow filtering of content when inlining pages. For example, given some way to filter out the display of inlines within other inlines, a blog post could allow easy inline commenting by putting an inline directive with post form at the bottom of the post.
That's trying to do the same thing as the todo item discussion page as blog. The difference is that you're suggesting displaying the comments in the blog post that they comment on, instead of on the separate discussion page. Which leads to the problem of those comments showing up inlined into the blog.
I know there are benefits to having the comments on the same page and not a separate discussion page, but it does add complications, and ikiwiki already has discussion pages, so I'm more likely to go the route described in discussion page as blog. --Joey
If a wikilink does not show the name of the page, because it's been overridden to show something else, it could put a title="pagename" in the link. This way users mousing over the wikilink would get a nice tooltip with some extra info.
How about adding ACLs, so that you can control which users are allowed to read or write certain pages? The moinmoin wiki has that, and it is something that I think is very valuable.
ikiwiki currently has only the most rudimentary access controls: pages can be locked, or unlocked, and only the admin can edit locked pages. That could certainly be expanded on, although it's not an area that I have an overwhelming desire to work on myself right now. Patches appreciated and I'll be happy to point you in the right directions. --Joey
I'm really curious how you'd suggest implementing ACLs on reading a page. It seems to me the only way you could do it is .htaccess DenyAll or something, and then route all page views through ikiwiki.cgi. Am I missing something? --Ethan
Or you could just use apache or whatever and set up the access controls there. Of course, that wouldn't integrate very well with the wiki, unless perhaps you decided to use http basic authentication and the httpauth plugin for ikiwiki that integrates with that.. --Joey
Which would rule out openid, or other fun forms of auth. And routing all access through the CGI sort of defeats the purpose of ikiwiki. --Ethan
Another useful feature might be to be able to choose a different template file for some pages; blog pages would use a template different from the home page, even if both are managed in the same repository, etc.
Well, that would probably be fairly easy to add if it used pagespecs to specify which pages use the non-default template.
Hmm, I think the pagetemplate hook should allow one to get close enough to this in a plugin now.
I am serving notice that I am starting work on a calendar plugin inspired by Blosxom's calendar plugin. The current plan is to create a plugin that looks through all the source files matching a certain pagespec, and optionally spits out a month view for the specified month (defaulting to the current month), or a year view for a given year (defaulting to the current year), or a list of years with posts in them. The output would be a table, with the same CSS directives that the Blosxom plugin used to use (so that I can just reuse my css file). The links would point to a $config{archivedir}/$year or $config{archivedir}/$year-$month file, which can just have
    [[inline pages="blog/* and !*/Discussion and creation_year($year) and creation_month($month)" rss="no" atom="no" show="0"]]
or something similar to generate an archive of postings.
Roland Mas suggested a separate cron job to generate these archive indices automatically, but that is another thread.
ManojSrivastava
Need a way to sign name in page that's easier to type than "--Joey" and that includes the date.
What syntax do other wikis use for this? I'm considering "--" (with spaces removed) as it has a nice mnemonic.
OTOH, adding additional syntax for this would be counter to one of the design goals for ikiwiki: keeping as much markup as possible out of the wiki and not adding nonstandard markup. And it's not significantly hard to type "--Joey", and as to the date, we do have page history.
I'm also unsure how to possibly implement this. Seems ikiwiki would need to expand the rune to the user's name when a page is saved, but that leaves out svn commits.
Alternate idea: Make a sig plugin, which would expand `--Name` to a signature link such as `--[[user/Name]]` (the "user/" bit would be configurable). This would be very easy to do, although it would need to try to avoid false positives, such as `--foo` in C code.
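To illustrate the false-positive concern, here is a rough sketch of the kind of substitution such a plugin might apply: only expand a signature that follows whitespace and ends a line, so `--foo` in the middle of code is left alone. (Shown with sed for brevity rather than as an actual ikiwiki plugin; the `user/` prefix follows the configurable default suggested above.)

```shell
# expand a trailing "--Name" into a user-page link
echo 'Looks good to me. --Joey' |
  sed -E 's/ --([A-Za-z]+)$/ -- [[user\/\1]]/'
# → Looks good to me. -- [[user/Joey]]
```

A real plugin would hook into ikiwiki's page-saving path instead, and would need more care (e.g. skipping preformatted blocks).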
Currently, shortcuts must have the `url` parameter, and can optionally have the `desc` parameter. If the shortcut directive instead required at least one of `url` or `desc`, then shortcuts could just supply a description without a URL. Since desc can contain arbitrary wiki markup, this would allow shortcuts with multiple links, such as the mmlist shortcut proposed on simple text parsing or regex in template or shortcut, or a comprehensive Debian package shortcut which linked to the package page and parenthetically to the BTS and PTS.
It sounds like you're looking for templates, not shortcuts. --Joey
Perhaps true (see my issues with template syntax on shortcut optional parameters), but allowing a shortcut without a `url` still seems reasonable, and simple. You could also use such shortcuts without markup at all, as an abbreviation mechanism:

    [[shortcut name=spi desc="Software in the Public Interest, Inc."]]
    [[shortcut name=sosp desc="Symposium on Operating System Principles"]]
    [[shortcut name=cacm desc="Communications of the ACM"]]
How about an option in inline for providing a default post name in the form, consisting of the current date in ISO format?
Wikiwyg is a WYSIWYG editor written in javascript for wikis. It allows editing in a gui or in wikitext and converts edits back to wiki format to be saved to the wiki.
It would be awesome to use this in ikiwiki, but to take full advantage of it with ikiwiki, it would need to know about MarkDown. Wikiwyg does allow defining the text that is stuck on each side of a given html element to make it wikified, for example, it can add "# " for a h1, "[[" and "]]" for a link, etc. This seems easily doable.
The other thing that would need doing is that a `saveChanges` function would need to be implemented that saves the text back to ikiwiki.
http://svn.wikiwyg.net/code/trunk/wikiwyg/share/Kwiki/lib/Wikiwyg/Kwiki.js
seems like a good starting point for building a submit form on the fly.
One other problem: Wikiwyg works by parsing html from a div, turning it back into the wiki markup, and editing/saving that. That seems to assume that there's a way of parsing a page's html and getting back to the underlying wiki markup, which is not always the case in ikiwiki. Unless there's some other way to feed it the actual source for a page, this seems like a problem. According to the developers, it is possible to do that, and start off in WikiText mode.
Why isn't it statically generated, instead of being generated dynamically by CGI? It seems like it could be beneficial to have it rendered in the post-commit hook, just like everything else in the wiki.
I hope to statically generate it eventually; currently the problem is that it takes at least several seconds to generate the recentchanges page, and adding several seconds to every page edit is not desirable. If the time can be reduced it could be done. I'm also not averse to adding an optional way to statically render it even at the current speed. --Joey
Also, is it planned/desired that recent changes generate the same information in RSS feed format? This seems like it could be a useful way to keep track of the wiki as a whole.
This is used by various interwiki type things, I think, so should be done.. --Joey
Lastly, would it be possible to use the recent changes code with a pagespec? I understand this sort of infringes on territory covered by the inline plugin, but the inline plugin only puts a page in the RSS feed once, when it's created, and I imagine some people -- some deranged, obsessive-compulsive people like myself -- would like to know about the changes made to existing pages as well as newly-created pages.
That would work rather well for pages like todo and bugs, where you want to know about any updates, not just initial creation. --JoshTriplett
Of course you can use email subscriptions for that too.. --Joey
I have more thoughts on this topic which I will probably write tomorrow. If you thought my other patches were blue-sky, wait until you see this. --Ethan
OK, so here's how I see the RecentChanges thing. I write blog posts and the inline plugin generates RSS feeds. Readers of RSS feeds are notified of new entries but not changes to old entries. I think it's rude to change something without telling your readers, so I'd like to address this. To tell the user that there have been changes, we can tell the user which page has been changed, the new text, the RCS comment relating to the change, and a diff of the actual changes. The new text probably isn't too useful (I have a very hard time rereading things for differences), so any modifications to inline to re-inline pages probably won't help, even if it were feasible (which I don't think it is). So instead we turn to creating diffs automatically and (maybe) inlining them.
I suggest that for every commit, a diff is created automagically but not committed to the RCS. The page containing this diff would be a "virtual page", which cannot be edited and is not committed. (Committing here would be bad, because then it would create a new commit, which would need a new diff, which would need to be committed, etc.) Virtual pages would "expire" and be deleted if they were not depended on in some way.
Let's say these pages are created in edits/commit_%d.mdwn. RecentChanges would then be a page which did nothing but inline the last 50 `edits/*` pages.
This would give static generation and RSS/Atom feeds. The inline
plugin could be optionally altered to inline pages from edits/*
that match any pages in its pagespec, and through this we could get
a recent-changes+pagespec thing. You could also exclude edits that have
"minor" in the commit message (or some other thing that marks them as
unremarkable).
You could make an argument that I care way too much about what amounts to edits anyhow, but like Josh says, there are use cases for this. While this could be done with mail subscriptions, I can think of sites where you might want to disable all auth so that people can't edit your pages. --Ethan
Currently, the page title (either the name of the page or the title specified with `[[meta title="..."]]`) shows up in a `<div class="header">`. I tend to follow the w3c guideline recommending the use of h1 for the title; for this purpose, how about an option to make the page title an `<h1 class="header">`, and shift the markdown headings down by one (making # an h2, ## an h3, etc; or alternatively making # equivalent to `[[meta title="..."]]`)?
The reason I don't use an h1 for the navbar is that while it incorporates the page title, it's not just a page title; it has the links to parent pages. I also don't want to get into the business of munging up markdown's semantics. This way, # is reserved for h1 if you choose to use headers in your page. --Joey
A few plugins need more complex dependency calculations than ikiwiki can do on its own:
- Use of a version plugin should only make the page rebuild when it's built with a new version of ikiwiki.
- The sidebar plugin should make any page get rebuilt if a sidebar is created "closer" to it than the current sidebar.
- Some plugin might want to always rebuild the page that uses it.
- If backlinks were turned into a plugin, it would need to make a page rebuild when its backlinks changed.
These suggest there should be a way for plugins to have hooks that tweak the list of pages to rebuild.
Which in turn suggests that there should be a list of pages to rebuild; currently there's not, and the best such an interface could do would be to rebuild the pages even if they were already going to be rebuilt for some other reason. (See optimisations.)
It also suggests that plugins will want to examine pages and/or store data about them to use in the dependency calculations. For example, the version plugin would need to store info about what pages use it.