Courses/Wikis-Publishing Platforms

From Publication Station

Latest revision as of 12:28, 30 March 2020
<slidy theme="a"/>


Wikis as Publishing Platforms - a 3 × 3-hour course


[[Wikis-Publishing_Platforms/synopsis and goals]]


=Course Description=
== Title ==
Wikis as Publishing Platforms
== Instructor ==
André Castro
== Station ==
Publication Station
== About ==
Wikis form a family of web applications dedicated to collaborative content creation. In a wiki, anyone can easily become an editor and start creating content. Compared to other content management systems, wikis impose few rules on their users, allowing them to develop workflows and structures based on the necessities and specificities of each project. This course will be based on the experience of devising a hybrid publishing workflow for the [http://beyond-social.org/ Beyond Social] magazine, centered around a [http://beyond-social.org/wiki/index.php/Main_Page Wiki]. The tools, methods, and protocols employed in Beyond Social will be revealed and explored, with the intention of finding ways in which they can be applied and extended to other hybrid publishing projects within WdKA.
== Planning ==
* getting to know wikis: their origins, goals, and relevance in today's web, but also their syntax and structure
* converting wiki pages into other formats using Pandoc: website (HTML pages), EPUB, ICML (InDesign)
* MediaWiki API - accessing content through the MediaWiki API and, in conjunction with Pandoc, developing scripts that automate publishing tasks; experimenting with retrieving and applying normally hidden information, such as a page's history.
== References ==
* [http://upload.wikimedia.org/wikipedia/commons/3/31/Ward_Cunningham%2C_Inventor_of_the_Wiki.webm Interview with Ward Cunningham] - inventor of the first wiki
* Koren, Yaron. 2012. Working with MediaWiki. San Francisco: WikiWorks Press.
* Cramer, Florian. 2011. “A Brechtian Media Design: Annemieke van Der Hoek’s Epicpedia.” In Critical Point of View: A Wikipedia Reader, 221–26. Amsterdam: Institute of Network Cultures. http://networkcultures.org/blog/publication/critical-point-of-view-a-wikipedia-reader/
== Publishing and Art Projects Based on Wikis ==
* [http://beyond-social.org/ Beyond Social] and its [http://beyond-social.org/wiki/index.php/Main_Page wiki]
* [http://toneelstof.be/ Toneelstof] and its [http://toneelstof.be/w wiki]
* [http://www.epicpedia.org/ Epicpedia] by Annemieke van der Hoek
* [http://p-dpa.net/work/iraq-war-wikihistoriography/ The Iraq War: A History of Wikipedia Changelogs] by James Bridle
== For whom ==
For anyone interested in finding simple and experimental ways to publish content on the web and on other media.
= Course Goals =
By the end of this course participants should have:
* developed an understanding of wikis as flexible, customizable, and open (read-write) publishing environments on the web.
* familiarized themselves with the cultural context in which wikis developed, as well as the motivations that led to their development and popularity.
* familiarized themselves with the conventions of wikis, such as syntax, page structure, revisions, categories, and namespaces.
* understood and explored the wiki as an inherently hybrid publishing environment, by:
** comprehending wiki syntax as a structured language - a markup language - compatible with HTML and other markups, which allows easy and lossless translation to many publishing formats and forms: websites, EPUBs, paper (through Scribus/InDesign), visual maps (SVG).
** exploring the possibilities of the wiki's API as a way to programmatically access information from the wiki, opening up the possibility of creating generative wikis, constantly informed by updates on the wiki.
__TOC__


=Day 1=
''Getting to know wikis: what wikis are, wiki syntax, pages, categories, extensions.''


==Wiki Intro - What are wikis?==


== Context: wikis as (hybrid) publishing platforms ==
* Publishing
** what is written on wikis is public;
** low threshold for publishing - click the edit button
* (Hybrid)
** wiki content can easily be published in different formats or contexts.
** Examples: Wikipedia's Export book function; [http://toneelstof.be/ Toneelstof], [http://beyond-social.org/ Beyond Social], or this presentation (both wiki content and slide show)

== Wiki Origins ==
In 1995 Ward Cunningham created the first wiki, naming it Wiki Wiki Web ("wiki" means quick in Hawaiian!).
* [http://upload.wikimedia.org/wikipedia/commons/3/31/Ward_Cunningham%2C_Inventor_of_the_Wiki.webm Interview with Ward Cunningham]
** links between pages, as ways to connect information
** links to non-existing pages encourage the creation of new pages
** read-write system - writing is as natural and expected as reading
** the wiki territory expands as a result of contributions
** anonymous editing - collaborative writing.

== Wikipedia, Wikimedia Foundation, MediaWiki ==
* 2000 - Jimmy Wales attempts to create a free online encyclopedia: Nupedia.
* Nupedia fails.
* 2001 - After hearing about wikis, Wales and Larry Sanger decide to rebuild Nupedia on one, calling it Wikipedia.
* Wikipedia was first built using software called UseModWiki (a sibling of Wiki Wiki Web). UseModWiki was slow and didn't support the scale Wikipedia was growing into.
* New wiki software for Wikipedia was developed, based on more efficient and scalable web technology: the PHP programming language and the MySQL database. The software was named MediaWiki.
* 2002 - Wikipedia switches to MediaWiki.
* 2003 - The Wikimedia Foundation is created.
* MediaWiki currently powers many wikis, including Wikipedia and its subsidiary projects, as well as others such as this one.

== Wiki software ==
* MediaWiki - PHP, MySQL, webserver
* DokuWiki - similar to MediaWiki
* MoinMoin - Python, relies on plain-text files & folders
* Zim - desktop wiki, relies on plain-text files & folders
* TiddlyWiki - another desktop wiki, based on HTML files and Javascript
See also Wikipedia's comparison of wiki software.

== Running a wiki: 2 options ==
Install and set up a wiki on your own PC or server.

OR

Use a wiki farm (wiki host), such as Wikia. Wikipedia keeps a list of several wiki farms.

== Wiki conventions: Users ==
To edit most wikis you need to register as a user.

Different user groups have different privileges:
* user: can edit and create pages
* administrators: can delete pages, move other users into a different group
* bot

== Wiki markup language ==
Wiki markup: structured text, a simplified version of HTML, easily converted to HTML.

== Page creation ==
Pages can be created by:
* creating a link to an empty page and clicking it to start editing: <nowiki>[[my empty page]]</nowiki> translates into a link to "my empty page"
* adding the page name to the URL bar; you'll be taken directly to the empty page and can start editing: http://publicationstation.wdka.hro.nl/wiki/index.php/my_empty_page
* Red link = previously non-existing page.
* Blue link = existing page.

== Talk Pages ==
Every page has its respective Discussion or Talk page.

Talk pages are the site of the discussions and conflicts that go into the making of an article.

An example: the Talk page of Wikipedia's Invisible Pink Unicorn article.

== User Pages ==

== Experiment:Edit User Pages ==
Create and edit your user page.

Include internal links (to pages on this wiki) and external links (to web pages outside this wiki).

Include an internal and an external image.


==Files and Images==
To include non-text media (images, PDFs, audio, video) on the wiki, a File page needs to be created as a placeholder for the file:
* <nowiki>[[File:myimage.png]]</nowiki>
* the page must be saved
* click on the red link of the file you want to upload
* follow the instructions to upload your file


==Pages' Revision History==
The history of the versions or revisions of a page is stored.

It can be accessed via "View History".

Each revision shows its author, time, action, and at times a summary.

'''Revisions can be compared, edited and undone.'''


==Artworks exploring Wikipedia's Revision History==
[http://p-dpa.net/work/iraq-war-wikihistoriography/ The Iraq War: A History of Wikipedia Changelogs] by James Bridle collects the history of changes to the Wikipedia article on the Iraq War between 2004 and 2009.

[http://www.epicpedia.org/ Epicpedia] by Annemieke van der Hoek transforms Wikipedia's revision history into a theater script.


Cramer, Florian. 2011. “A Brechtian Media Design: Annemieke van Der Hoek’s Epicpedia.” In Critical Point of View: A Wikipedia Reader, 221–26. Amsterdam: Institute of Network Cultures. http://networkcultures.org/blog/publication/critical-point-of-view-a-wikipedia-reader/

Works that:
* present other interfaces to Wikipedia,
* place the revision history in the foreground,
* question the neutral point-of-view and unified nature of Wikipedia articles.


==Experiment:History==
Rewind the history of edits of your user page to a previous stage, and then back to the current one.
== Structuring a wiki ==
What situations require an overall structure more elaborate than the wiki's "flat" system of pages?
==Subpages: simple hierarchy==
Subpages are useful for organizing information hierarchically, as parent/child/grandchild.
A subpage of a given main page is created by adding a slash, followed by the subpage's name, to the main page's name:
<nowiki>[[main page/subpage/sub-subpage]]</nowiki>
Example: this very page, [[Courses/Wikis-Publishing Platforms]], is a subpage of the [[Courses]] page.
<nowiki>{{Special:PrefixIndex/{{FULLPAGENAME}}/}}</nowiki> can be placed on a parent page to display all its children.
== Namespaces: separate spaces ==
Namespaces are containers for specific content.
* Example: the namespace Wrecks in this wiki was created for material dedicated to the Wrecks project, which is not part of the Publication Station: [[Wrecks:Main_Page]] and [[FIN:Main_Page]]
* Note: namespaces need to be created in the wiki's configuration.
== Categories: Tags, Hierarchies, Spaces, States ==
'''Categories''' are the (only) way to tag content.
<nowiki>[[Category:courses]]</nowiki> adds a page to the courses category.
Problems: there is no limit to the number of categories that can be added to a page.
==For next week==
Install [http://pandoc.org/installing.html Pandoc] on your machines. We'll use it to convert wiki content into other formats.
=Day 2=
==The Importance of Being Markup==
''Converting wiki content into other formats''.
==Markups==
Wiki syntax is a [http://en.wikipedia.org/wiki/Markup_language markup language]. Markup languages are exclusive to plain-text formats (as opposed to rich-text or binary formats). In a plain-text file, formatting is applied to the text through a series of "marks" or tags. The marks or tags are instructions on how each marked segment of text should be interpreted or rendered.
==Markups: write/interpret==
You have been using wiki markup during this workshop.
You've probably noticed how, when in writing mode, symbols such as <code>=heading= '''bold''' [[Page]]</code> '''are interpreted and given a specific meaning or style when the wiki page is rendered''', in other words, when it is in read mode.
For example, the markup of the following segment produces a styled and structured [[Revenge of the Text|page]]:
<pre>
= Revenge of the Text =
There is a room in the '''Musée d’Orsay''' that I call the ''room of possibilities''.
That room contains:
* a snow flake
* the end of a cloud
* a bit of nothing
</pre>
==Markup: explicit structure==
One of the advantages of markup languages is that they imply an '''explicit text structure'''.
In other words, markups force you to declare:
* "this sentence is a section heading": <code>=section heading=</code>
* "this word is bold": <code>'''bold'''</code>
* "this hyperlink has x as its URL": <code>[x-URL title]</code>
You cannot give meaning or structure to the text by styling it visually (as you would in InDesign or Word).
'''Meaning or structure can only be given to the text semantically'''. That means adding ''marks'' to the segments of text we want to format.
==Markups: interchangeable==
One advantage of this explicit semantic formatting is that conversions between different markups (Wiki, HTML, Markdown, LaTeX, etc.) are made easy.
In most cases markups have HTML as a reference, allowing only formatting options that are available in HTML.
Consequently, converting between markups is easy.
A markup converter only has to know the meaning of all the '''meta-characters''' in one markup and the corresponding '''meta-characters''' in the other.
If the software knows that, it can convert <pre>=title=</pre> to <pre><h1>title</h1></pre>, <pre># title</pre>, <pre>\section{title}</pre>
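To make this concrete, here is a toy sketch of such a meta-character mapping in Python. It is not how Pandoc works internally (real converters parse the text into a tree first), and it only handles headings, bold, and italics:

```python
import re

# Toy mapping of a few MediaWiki meta-characters to their HTML
# counterparts. Order matters: ''' (bold) must be tried before '' (italic).
RULES = [
    (re.compile(r"^= (.+?) =$", re.M), r"<h1>\1</h1>"),
    (re.compile(r"'''(.+?)'''"), r"<strong>\1</strong>"),
    (re.compile(r"''(.+?)''"), r"<em>\1</em>"),
]

def wiki_to_html(text):
    """Convert a tiny subset of wiki markup to HTML."""
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text
```

For instance, <code>wiki_to_html("= Revenge of the Text =")</code> yields <code><nowiki><h1>Revenge of the Text</h1></nowiki></code>.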
==Examples: the same text in different Markup ==
[http://en.wikipedia.org/wiki/HTML HTML]:
<source lang="html5">
<h1>Revenge of the Text</h1>
<p>There is a room in the <strong>Musée d’Orsay</strong> that I call the <em>room of possibilities</em>.</p>
<p>That room contains:</p>
<ul>
  <li>a snow flake</li>
  <li>the end of a cloud</li>
  <li>a bit of nothing</li>
</ul>
</source>
[http://en.wikipedia.org/wiki/Wiki_markup Wiki markup]:
<source lang="text">
= Revenge of the Text =
There is a room in the '''Musée d’Orsay''' that I call the ''room of possibilities''.
That room contains:
* a snow flake
* the end of a cloud
* a bit of nothing
</source>
[http://en.wikipedia.org/wiki/Markdown Markdown]:
<source lang="text">
# Revenge of the Text
There is a room in the **Musée d’Orsay** that I call the *room of possibilities*.
That room contains:
* a snow flake
* the end of a cloud
* a bit of nothing
</source>
[http://en.wikipedia.org/wiki/LaTeX LaTeX]
<source lang="latex">
\section{Revenge of the Text}\label{revenge-of-the-text}
There is a room in the \textbf{Musée d'Orsay} that I call the \emph{room
of possibilities}.
That room contains:
\begin{itemize}
\itemsep1pt\parskip0pt\parsep0pt
\item
  a snow flake
\item
  the end of a cloud
\item
  a bit of nothing
\end{itemize}
</source>
== Pandoc ==
One popular piece of software for converting between markups is [http://johnmacfarlane.net/pandoc '''Pandoc''']
[[File:pandoc_diagram.png]]
==Working with Pandoc==
Pandoc is a command-line tool. This means that instead of a GUI it uses a [http://en.wikipedia.org/wiki/Command-line_interface command-line interpreter] or shell, where interaction is based on text commands.
    
    
There are a few web interfaces to Pandoc, such as http://pandoc.org/try/, but they are not as flexible as Pandoc's command-line interface.
 
==Pandoc conversions: wiki to HTML==
Convert a file containing MediaWiki syntax into an HTML file:
 
pandoc --from mediawiki --to html5 --standalone input.wiki --output=output.html
 
'''pandoc''' - the program, dedicated to conversion between different markups;
 
'''--from''' - followed by the input format;
 
'''--to''' - followed by the output format;
 
'''--standalone''' - produces output with an appropriate header and footer;
 
'''--output''' - followed by the output file;
 
'''input.wiki''' - the plain-text file with wiki syntax - replace it with your actual file name
 
==Pandoc conversions: HTML to wiki==
pandoc --from html --to mediawiki --standalone input.html --output=output.wiki
 
==Pandoc conversions: wiki to ICML==
ICML files can be imported into inDesign. See [[Hybrid publishing/publishing resources#working_with_ICML_files ]] for more information.
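A sketch of the corresponding command, following the same pattern as the wiki-to-HTML conversion (assuming your Pandoc version includes the ICML writer):

```shell
# Convert MediaWiki syntax to an ICML file that InDesign can place
pandoc --from mediawiki --to icml --standalone input.wiki --output=output.icml
```

The resulting <code>output.icml</code> can then be placed into an InDesign document.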
 
 
==Pandoc conversions: wiki to EPUB==
It is also possible to use Pandoc to produce an EPUB from a single <code>.wiki</code> file, or from any other markup accepted by Pandoc.
 
A few extra resources, such as the cover image, metadata, and stylesheet, should be included, as they strongly influence the EPUB's outcome.
 
pandoc --from mediawiki --to epub3 --self-contained --epub-chapter-level=1 --epub-stylesheet=styles.epub.css  --epub-cover-image=cover.jpg --epub-metadata=metadata.xml --toc-depth=1 --output=book.epub    book.wiki
 
New options:
* '''--epub-chapter-level=1''' - the heading level at which chapters are split: heading 1
* '''--epub-stylesheet=styles.epub.css''' - the CSS stylesheet for the EPUB
* '''--epub-cover-image=cover.jpg''' - the EPUB cover image
* '''--epub-metadata=metadata.xml''' - the EPUB metadata file
* '''--toc-depth=1''' - the depth of the table of contents
 
Example of metadata.xml
<source lang="xml">
<dc:title id="epub-title-1">MAIN TITLE</dc:title>
<meta refines="#epub-title-1" property="title-type">main</meta>
<dc:publisher>Your Publisher</dc:publisher>   
<dc:date id="epub-date">2015</dc:date>
<dc:language>en-US</dc:language>
<dc:creator id="epub-creator-0">AUTHOR'S NAME</dc:creator>
<meta refines="#epub-creator-0" property="role" scheme="marc:relators">aut</meta>
<dc:contributor id="epub-contributor-1">EDITOR'S NAME</dc:contributor>
<meta refines="#epub-contributor-1" property="role" scheme="marc:relators">edt</meta>
<dc:contributor id="epub-contributor-5">DESIGNER'S NAME</dc:contributor>
<meta refines="#epub-contributor-5" property="role" scheme="marc:relators">dsr</meta>
<dc:subject>TAGS, SEPARATED BY COMMAS</dc:subject>
<dc:description>PUBLICATION SYNOPSIS</dc:description>
</source>
 
 
==Conclusion: wikis as hybrid publishing tools==
Hopefully these examples make clearer what I stated on day #1: wikis are inherently hybrid publishing tools, as
* wikis are about publishing quickly and easily
* wiki content can be easily converted to other formats, since it uses a markup language.
 
As a result, many derived works, or re-mediations, of wiki content exist: from more pragmatic examples, such as Wikipedia's Export book function, [http://toneelstof.be/ Toneelstof], [http://beyond-social.org/ Beyond Social], or this presentation (both wiki content and slide show), to more artistic interventions such as [http://p-dpa.net/work/iraq-war-wikihistoriography/ The Iraq War: A History of Wikipedia Changelogs] by James Bridle or [http://www.epicpedia.org/ Epicpedia] by Annemieke van der Hoek.
 
 
We'll continue looking at re-mediations of wiki content. On the next day we'll focus on the MediaWiki API, an interface to MediaWiki wikis which allows retrieving and editing content programmatically.
 
=Day 3: Remediation - MediaWiki API=
''Appropriating and repurposing wikis' content''.
 
 
==How to get content?==
Although there is nothing wrong with copying the content of wiki pages in order to turn them into other publishing objects, it is a slightly tedious process, especially when you want to work with large quantities of data. Such tedious tasks can be handled more easily by software.
 
MediaWiki presents a solution to this problem by providing a web API, which functions as a programmatic interface (computer-to-computer instead of human-to-computer) to MediaWiki installations.
 
==APIs==
An API, or application programming interface, is an interface that allows interaction via other software. It allows, for instance, apps to be built on top of existing services, where the apps use the API to interact with the service.
 
==Web APIs==
The MediaWiki API is a [http://en.wikipedia.org/wiki/Application_programming_interface#Web_APIs Web API].
 
Essentially this means that the API is accessed through '''HTTP requests''' and responds with '''JSON or XML objects'''.
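In practice that looks like the following Python sketch (standard library only; the <code>api_url</code> helper is our own name, not part of MediaWiki). It assembles such an HTTP request; actually fetching it is left commented out, since it needs network access:

```python
import json
import urllib.parse
import urllib.request

def api_url(endpoint, **params):
    """Build a MediaWiki API request URL from keyword parameters."""
    params.setdefault("format", "json")
    return endpoint + "?" + urllib.parse.urlencode(params)

url = api_url("https://en.wikipedia.org/w/api.php",
              action="query", titles="Main_Page",
              prop="revisions", rvprop="content")

# Uncomment to perform the request (requires network access):
# with urllib.request.urlopen(url) as response:
#     data = json.loads(response.read().decode("utf-8"))
#     print(data["query"]["pages"])
```

The same helper works against any MediaWiki installation by swapping in its own <code>api.php</code> endpoint.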
 
==MediaWiki API==
The [http://www.mediawiki.org/wiki/API:Main_page MediaWiki API] allows many, often intricate, requests for information from (and edits to) MediaWiki installations.
 
A few examples follow, which you can try from your web browser. Consider installing a JSON pretty-printing extension in your browser, as it will help you read the API's responses.
 
 
 
==Example: Page basic info==
https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main_Page&prop=info English Wikipedia's Main Page
 
http://beyond-social.org/wiki/api.php?format=json&action=query&titles=Colophon&prop=info Beyond Social's Colophon page
 
==Example: Page content==
https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main_Page&prop=revisions&rvprop=content English Wikipedia's Main Page
 
http://beyond-social.org/wiki/api.php?format=json&action=query&titles=Colophon&prop=revisions&rvprop=content Beyond Social's Colophon page
 
==Decomposing an API request==
http://beyond-social.org/wiki/api.php? format=json & action=query & titles=Colophon & prop=revisions & rvprop=content
 
* <code>http://beyond-social.org/wiki/api.php?</code> - '''endpoint''' - the home page of the MediaWiki web service. It can be found by going to the [[Special:Version]] page of a wiki and searching for the API entry point
* <code>format=json</code> - the format of the output. It can be either JSON or XML, although XML is being phased out
* <code>action=query</code> - the action you want to perform. [http://www.mediawiki.org/wiki/API:Query query] is the most used, but there are many more available options, such as editing and deleting pages.
'''Then come the action-specific parameters:'''
* <code>titles=Colophon</code> - the page queried
* <code>prop=revisions</code> - the [http://www.mediawiki.org/wiki/API:Properties properties] you want from a page: revisions. As no particular revision is specified, the latest one is returned
* <code>rvprop=content</code> - the specific properties you want from the revision in question: its content. You could also ask for the <code>user</code> who created the last revision, or the <code>comment</code> left by the user. These properties can be combined in a single request: <code>rvprop=content|user|comment</code>
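The JSON that comes back nests the revision content under a page ID you don't know in advance. A small helper can dig it out; the sample below is a trimmed, illustrative response shape (not real page data), following the API's classic JSON format where the revision text sits under the <code>"*"</code> key:

```python
import json

# Trimmed, illustrative sample of an action=query & prop=revisions
# & rvprop=content response (the page id "12345" is made up).
sample = json.loads("""
{"query": {"pages": {
    "12345": {"title": "Colophon",
              "revisions": [{"*": "== Colophon ==\\nSome wikitext."}]}
}}}
""")

def latest_wikitext(response):
    """Return the latest revision's wikitext for the first page queried."""
    pages = response["query"]["pages"]
    page = next(iter(pages.values()))  # the page id is unknown in advance
    return page["revisions"][0]["*"]   # "*" holds the revision content
```

<code>latest_wikitext(sample)</code> then returns the raw wikitext, ready to be passed on to Pandoc.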
 
==Example: Page revisions==
??
 
==Example: Images in a page==
https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=images
 
==Example:  categories a page belongs to==
https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=categories
 
==Example: Contributors to a page==
https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=contributors
 
http://beyond-social.org/wiki/api.php?format=json&action=query&titles=Colophon&prop=contributors
 
==Example: Users of a wiki==
http://beyond-social.org/wiki/api.php?format=json&action=query&list=allusers&aulimit=100
 
limits the response to 100 users (<code>aulimit=100</code>)
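As a sketch of handling such a list response (the sample below is illustrative, not real data), the user names sit under <code>query.allusers</code>:

```python
import json

# Illustrative sample of an action=query & list=allusers response.
sample = json.loads("""
{"query": {"allusers": [
    {"userid": 1, "name": "Admin"},
    {"userid": 7, "name": "AndreCastro"}
]}}
""")

# Collect the user names, e.g. to feed a map of a wiki's users.
names = [user["name"] for user in sample["query"]["allusers"]]
```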




==Create something with API results==
Getting a wiki's information (content, users, revisions, images, etc.) is as easy as this.

Yet turning it into something else can be challenging, and involves writing scripts to sort the information received from the API calls.

To make life slightly easier for you, I'll add a few Python scripts that will create:
* a map of all the users of a wiki
* a history of the recent edits to an article
* ????

All these scripts have parameters that affect their behavior, such as the queried wiki and wiki page, dates, etc.

All the results are HTML pages, derived from a template HTML page, a sort of shell container for the content collected by the API.

You can change the template's CSS stylesheet as well as its structure.

Just be aware not to change elements that have <nowiki><!-- container --></nowiki> written inside them.

=Notes on scripts=
Students should not have to understand Python to generate "derivates" from wiki content.

Instead they can be given a script that allows them to play with different arguments, allowing different "derivates" to emerge from the same content.

Some guidelines:
* each script deals with only one type of content: users, text, revisions, images, subpages
* the principle should be similar to epubtrailer, where the user can choose: background-color, color, source (wiki, page)

Files and Images

To include non-text media (images,pdfs,audio-video) on the wiki, a File page needs to be created as a place-holder for the file

  • [[File:myimage.png]]
  • the page must be saved
  • click on the red link, of the file you want to upload
  • Follow the instruction to upload your file

Pages' Revision History

The history of versions or revisions of a page is stored.

They can be accessible via "View History"

Each revision shows its author, time, action, and at times summary.

Revisions can be compared, edited and undone.

Artworks exploring Wikipedia's Revision History

The Iraq War: A History of Wikipedia Changelogs by James Bridle, collects the history of changes from the Wikipedia's article on The Iraq War between 2004 and 2009.

Epicpedia by Annemieke van der Hoek transforms Wikipedia's revision history into a theater script.

Cramer, Florian. 2011. “A Brechtian Media Design: Annemieke van Der Hoek’s Epicpedia.” In Critical Point of View: A Wikipedia Reader, 221–26. Amsterdam: Institute of Network Cultures. http://networkcultures.org/blog/publication/critical-point-of-view-a-wikipedia-reader/

Works that:

  • present other interfaces to Wikipedia,
  • placing in the foreground the revision history:
  • question the neutral point-of-view and unified nature of Wikipedia articles.

Experiment:History

Rewind the history of edits from your user page to a previous stage, and back to the current.

Structuring a wiki

What situations require an overall structure, more elaborate than its "flat" systems of pages?

Subpages: simple hierarchy

Subpages are useful for organizing information hierarchically, as parent/child/grandchild

A subpage of a given main page, is create by adding a slash, followed by the sub-page's name, to the main page name.

[[main page/subpage/sub-subpage]]

Example: this very page Courses/Wikis-Publishing Platforms is a sub page from the Courses page

{{Special:PrefixIndex/{{FULLPAGENAME}}/}} can be placed on a parent page to display all its children.

Namespaces: separate spaces

Namespaces are container for specific content.

  • Example: the Namespace Wrecks in this wiki was created for material dedicate to Wrecks project and not a part of the Publication Station Wrecks:Main_Page and FIN:Main_Page
  • Note: Namespaces need to be created in the configurations of the wiki.


Categories: Tags, Hierarchies, Spaces, States

Categories are (the only) ways to tag content.

[[Category:courses]] a page to the courses category

Problems: No limit to the categories that can be added to a page.

For next week

Install Pandoc on your machines. We'll use it to convert wiki content on other formats.

Day 2

The Importance of Being Markup

Converting wiki content into other formats.

Markups

Wiki syntax is a markup language. Markup languages are exclusive to plain text formats (as opposed to rich text or binary formats). In plain-text file text-formating is applied to the text through a series of "marks" or tags. The marks or tags are like instructions, on how each marked segment of text should be interpreted or rendered.

Markups: write/interpret

You have be doing been using the wiki Markup during this workshop. You've probably noticed how, when in writing mode, symbols as =heading= bold Page are interpreted and given a specific meaning or style when the wiki page is rendered, or in other words, is in read mode.

The markup of the following segment, for example, produces a styled and structured page:

= Revenge of the Text =

There is a room in the '''Musée d’Orsay''' that I call the ''room of possibilities''.

That room contains:
* a snow flake
* the end of a cloud
* a bit of nothing

Markup: explicit structure

One of the advantages of markup languages is that they imply an explicit text structure.

In other words, markups force you to declare:

  • "this sentence is a section heading": =section heading=
  • "this word is bold": '''bold word'''
  • "this hyperlink has x as its URL and this title": [x title]

You cannot give meaning or structure to the text by styling it visually (as you would in InDesign or Word).

Meaning or structure can only be given to the text semantically. That means adding marks to the segments of text we want to format.

Markups: interchangeable

One advantage of this explicit semantic formatting is that conversions between different markups (wiki, HTML, Markdown, LaTeX, etc.) are made easy.

In most cases markups have HTML as a reference, allowing only formatting options that are available in HTML.

Consequently, converting between markups is easy: a markup converter only has to know the meaning of all the meta-characters in one markup and the corresponding meta-characters in another.

If the software knows these correspondences, it can convert

=title=

to

<h1>title</h1>

or

# title

or

\section{title}
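As a sketch of this idea, a toy converter only needs a table mapping one markup's meta-characters to another's. The names RULES and convert_heading below are hypothetical, not part of any real converter:

```python
import re

# Toy illustration: a converter only needs to map meta-characters
# between markups. Here only level-1 headings are handled.
RULES = {
    "html": ("<h1>", "</h1>"),
    "markdown": ("# ", ""),
    "latex": ("\\section{", "}"),
}

def convert_heading(wiki_line, target):
    """Convert a MediaWiki level-1 heading (=title=) into another markup."""
    match = re.fullmatch(r"=\s*(.+?)\s*=", wiki_line.strip())
    if not match:
        return wiki_line  # not a heading: pass the line through unchanged
    open_mark, close_mark = RULES[target]
    return open_mark + match.group(1) + close_mark

print(convert_heading("=title=", "html"))      # <h1>title</h1>
print(convert_heading("=title=", "markdown"))  # # title
print(convert_heading("=title=", "latex"))     # \section{title}
```

Real converters such as Pandoc work on a full parse tree rather than on line-by-line substitutions, which is what makes round-tripping between many formats possible.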

Examples: the same text in different Markup

HTML:

<h1>Revenge of the Text</h1>
 <p>There is a room in the <strong>Musée d’Orsay</strong> that I call the <em>room of possibilities</em>.</p>
 <p>That room contains:</p>
 <ul>
  <li>a snow flake</li>
  <li>the end of a cloud</li>
  <li>a bit of nothing</li>
 </ul>

Wiki markup:

= Revenge of the Text =
There is a room in the '''Musée d’Orsay''' that I call the ''room of possibilities''.

That room contains:
* a snow flake
* the end of a cloud
* a bit of nothing

Markdown:

# Revenge of the Text
There is a room in the **Musée d’Orsay** that I call the *room of possibilities*.

That room contains:
* a snow flake
* the end of a cloud
* a bit of nothing

LaTeX

\section{Revenge of the Text}\label{revenge-of-the-text}

There is a room in the \textbf{Musée d'Orsay} that I call the \emph{room
of possibilities}.

That room contains:

\begin{itemize}
\itemsep1pt\parskip0pt\parsep0pt
\item
  a snow flake
\item
  the end of a cloud
\item
  a bit of nothing
\end{itemize}

Pandoc

One popular tool for converting between markups is Pandoc.

Pandoc diagram.png

Working with Pandoc

Pandoc is a command-line tool. This means that instead of a GUI it uses a command-line interpreter, or shell, where interaction is based on text commands.

There are a few web interfaces to Pandoc, such as http://pandoc.org/try/, but they are not as flexible as the command-line interface.

Pandoc conversions: wiki to HTML

Convert a file containing MediaWiki syntax into an HTML file:

pandoc --from mediawiki --to html5 --standalone input.wiki --output=output.html

pandoc - the program dedicated to conversion between different markups;

--from - the input format;

--to - the output format;

--standalone - produce output with an appropriate header and footer (a complete document, not a fragment);

--output - the output file;

input.wiki - the plain-text file with wiki syntax - replace it with the actual file name.
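Conversions like this can also be automated, for instance to convert a whole folder of wiki pages at once. A minimal sketch in Python, assuming pandoc is installed and on your PATH (file names are illustrative):

```python
import subprocess
from pathlib import Path

def pandoc_command(src, dst, to_fmt="html5"):
    """Build the same pandoc invocation shown above, as an argument list."""
    return ["pandoc", "--from", "mediawiki", "--to", to_fmt,
            "--standalone", str(src), "--output=" + str(dst)]

# Convert every .wiki file in the current directory to HTML.
for src in Path(".").glob("*.wiki"):
    cmd = pandoc_command(src, src.with_suffix(".html"))
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually convert
```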

Pandoc conversions: HTML to wiki

pandoc --from html --to mediawiki input.html --output=output.wiki

Pandoc conversions: wiki to ICML

pandoc --from mediawiki --to icml --standalone input.wiki --output=output.icml

ICML files can be imported into InDesign. See Hybrid publishing/publishing resources#working_with_ICML_files for more information.


Pandoc conversions: wiki to EPUB

It is also possible to use Pandoc to produce an EPUB from a single

.wiki

file, or any other markup accepted by Pandoc.

A few extra resources, such as the cover image, metadata, and stylesheet, should be included, as they strongly influence the EPUB's outcome.

pandoc --from mediawiki --to epub3 --self-contained --epub-chapter-level=1 --epub-stylesheet=styles.epub.css --epub-cover-image=cover.jpg --epub-metadata=metadata.xml --toc-depth=1 --output=book.epub book.wiki

New options:

  • --epub-chapter-level=1 - at which heading level chapters are split: heading 1
  • --epub-stylesheet=styles.epub.css - CSS stylesheet for the EPUB
  • --epub-cover-image=cover.jpg - EPUB cover image
  • --epub-metadata=metadata.xml - XML file with the EPUB's Dublin Core metadata
  • --toc-depth=1 - number of heading levels included in the table of contents

Example of metadata.xml

<dc:title id="epub-title-1">MAIN TITLE</dc:title>
<meta refines="#epub-title-1" property="title-type">main</meta>
<dc:publisher>Your Publisher</dc:publisher>    
<dc:date id="epub-date">2015</dc:date>
<dc:language>en-US</dc:language>
<dc:creator id="epub-creator-0">AUTHOR'S NAME</dc:creator>
<meta refines="#epub-creator-0" property="role" scheme="marc:relators">aut</meta>
<dc:contributor id="epub-contributor-1">EDITOR'S NAME</dc:contributor>
<meta refines="#epub-contributor-1" property="role" scheme="marc:relators">edt</meta>
<dc:contributor id="epub-contributor-5">DESIGNER'S NAME</dc:contributor>
<meta refines="#epub-contributor-5" property="role" scheme="marc:relators">dsr</meta>
<dc:subject>TAGS, SEPARATED BY, COMAS</dc:subject>
<dc:description>PUBLICATION SYNOPSIS</dc:description>


Conclusion: wiki as hybrid publishing tools

Hopefully these examples make clearer what I stated on day 1: wikis are inherently hybrid publishing tools, as

  • wikis are about publishing quickly and easily
  • wiki content can be easily converted to other formats, since it uses a markup language.

As a result, many derived works, or re-mediations, of wiki content exist: from more pragmatic examples, such as Wikipedia's book export function, Toneelstof, Beyond Social, or this presentation (both wiki content and slide show), to more artistic interventions such as The Iraq War: A History of Wikipedia Changelogs by James Bridle or Epicpedia by Annemieke van der Hoek.


We'll continue looking at re-mediations of wiki content. On the next day we'll focus on the MediaWiki API, an interface to MediaWiki wikis, which allows retrieving and editing content programmatically.

Day 3: Remediation - Mediawiki API

Appropriating and repurposing wikis' content.


How to get content?

Although there is nothing wrong with copying the content of wiki pages in order to turn them into other publishing objects, it is a slightly tedious process, especially when you want to work with large quantities of data. Such tedious tasks can be handled more easily by software.

MediaWiki wikis present a solution to this problem by providing a Web API, which functions as a programmatic interface (computer-to-computer instead of human-to-computer) to MediaWiki installations.

APIs

An API, or application programming interface, is an interface that allows interaction with a piece of software from other software. It allows, for instance, apps to be built on top of existing services, where the apps use the API to interact with the service.

Web APIs

The MediaWiki API is a Web API.

Essentially this means that the API is accessed through HTTP requests and responds with JSON or XML objects.

Mediawiki API

The MediaWiki API allows many, and very intricate, requests for information from (and edits to) MediaWiki installations.

A few examples follow, which you can try using your web browser. Install a JSON pretty-print extension in your browser, as it will help you read the API's responses.


Example: Page basic info

https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main_Page&prop=info English Wikipedia's Main Page

http://beyond-social.org/wiki/api.php?format=json&action=query&titles=Colophon&prop=info Beyond Social's Colophon page

Example: Page content

https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main_Page&prop=revisions&rvprop=content English Wikipedia's Main Page

http://beyond-social.org/wiki/api.php?format=json&action=query&titles=Colophon&prop=revisions&rvprop=content Beyond Social's Colophon page

Decomposing an API request

http://beyond-social.org/wiki/api.php? format=json & action=query & titles=Colophon & prop=revisions & rvprop=content 
  • http://beyond-social.org/wiki/api.php? - the endpoint, the home page of the MediaWiki web service. It can be found by going to a wiki's Special:Version page and searching for the API entry point
  • format=json - the format of the output. Can be either JSON or XML, although XML is being phased out
  • action=query - what action you want to perform. query is the most used, but there are many more options available, such as editing and deleting pages.

Then come the action-specific parameters:

  • titles=Colophon - the page being queried
  • prop=revisions - which properties you want from the page: its revisions. As no particular revision is specified, the latest will be returned
  • rvprop=content - which specific properties you want from the revision in question: its content. You could also ask for the user who created the last revision, or the comment left by the user. These properties can be combined in a single request: rvprop=content|user|comment
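The request decomposed above can also be assembled programmatically. A minimal sketch in Python, using only the standard library (the function name is illustrative):

```python
from urllib.parse import urlencode
# from urllib.request import urlopen  # for actually fetching the URL

ENDPOINT = "http://beyond-social.org/wiki/api.php"

def page_content_url(title, endpoint=ENDPOINT):
    """Build the API request that returns a page's latest wikitext."""
    params = {"format": "json", "action": "query", "titles": title,
              "prop": "revisions", "rvprop": "content"}
    return endpoint + "?" + urlencode(params)

print(page_content_url("Colophon"))
# import json; data = json.load(urlopen(page_content_url("Colophon")))
```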

Example: Page revisions

https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=revisions&rvlimit=10&rvprop=timestamp|user|comment

Example: Images in a page

https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=images

Example: categories a page belongs to

https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=categories

Example: Contributors to a page

https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Willem_de_Kooning&prop=contributors

http://beyond-social.org/wiki/api.php?format=json&action=query&titles=Colophon&prop=contributors

Example: Users of a wiki

http://beyond-social.org/wiki/api.php?format=json&action=query&list=allusers&aulimit=100

The parameter aulimit=100 limits the response to 100 users.
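When a wiki has more users than fit in one response, the API returns a continue object pointing at the next batch. A sketch of how a script merges such batches; the JSON strings below are hand-made, abridged imitations of real allusers responses:

```python
import json

# Hand-made, abridged imitation of two allusers response batches.
batch1 = '{"continue": {"aufrom": "M"}, "query": {"allusers": [{"name": "Alice"}]}}'
batch2 = '{"query": {"allusers": [{"name": "Mona"}]}}'

def merge_user_batches(raw_batches):
    """Collect user names across batches, stopping when no continue is left."""
    users = []
    for raw in raw_batches:
        data = json.loads(raw)
        users += [user["name"] for user in data["query"]["allusers"]]
        if "continue" not in data:
            break  # last batch reached
    return users

print(merge_user_batches([batch1, batch2]))  # ['Alice', 'Mona']
```

In a real script, each batch would be fetched by repeating the request with the parameters from the previous response's continue object.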


Create something with API results

Getting a wiki's information (content, users, revisions, images, etc.) is as easy as this. Yet turning it into something else can be challenging, and involves writing scripts to sort the information received from the API calls.
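As a small example of such sorting, contributor names can be pulled out of a prop=contributors response. The JSON below is a hand-made, abridged sample following the API's response shape:

```python
import json

# Hand-made, abridged sample of a prop=contributors query response.
sample = '{"query": {"pages": {"736": {"title": "Colophon", "contributors": [{"userid": 1, "name": "Andre"}, {"userid": 2, "name": "Eric"}]}}}}'

def contributor_names(raw_json):
    """Collect contributor names from an action=query response."""
    pages = json.loads(raw_json)["query"]["pages"]
    names = []
    for page in pages.values():
        names += [c["name"] for c in page.get("contributors", [])]
    return names

print(contributor_names(sample))  # ['Andre', 'Eric']
```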

To make life slightly easier for you, I'll add a few Python scripts that will create:

  • a map of all the users of a wiki
  • a history of the recent edits to an article
  • ????

All these scripts have parameters that affect their behavior, such as the queried wiki and wiki page, dates, etc.

All the results are HTML pages, which are derived from a template HTML page, a sort of shell container for the content collected by the API.

You can change the template's CSS stylesheet as well as its structure.

Just be careful not to change elements that have <!-- container --> written inside them.


Notes on scripts

Students should not have to understand Python to generate "derivatives" from wiki content.

Instead they can be given a script that allows them to play with different arguments, letting different "derivatives" emerge from the same content.

Some guidelines:

  • each script deals with only one type of content: users, text, revisions, images, subpages

The principle should be similar to epubtrailer, where the user can choose: background-color, color, source (wiki, page).