Thinking in bits (not atoms)

During a break at EMWCon, I participated in a conversation with several people about the relative advantages and disadvantages of requiring people to use wikitext markup in MediaWiki (instead of providing them a visual editor). During the conversation, Lex brought up examples of documents with their content locked up as binary files compared to wiki pages with the text readily available and accessible. I mentioned the idea of “thinking in bits” as part of the conversation.

Reflecting on the conversation later, I realized that I have written here and there about the concept, but don’t really have anything pulling all the thoughts together. So here you go.

I first came across the idea of thinking in bits in Nicholas Negroponte's 1995 book Being Digital. In the book, Negroponte talks about the limitations, the cost, of moving information around as atoms – paper books, CDs, DVDs, snail mail, you get the idea – and how information would soon be converted from atoms to bits. The immediately obvious implication is that it now becomes essentially free to move and share information as bits.

The less obvious, but much more important, implication is that bits change the way you can think about the information. How you can manipulate and repurpose the information. How you can do things that were impossible with the information locked up in atoms. The obvious applications have come to fruition. Email instead of snail mail. Music downloads instead of CDs, and now streaming instead of downloads. The same with video.

And yet…

And yet, the way this digitized information, these bits, is handled is still in many ways tied to the way atoms were handled. Some of this, such as in the music and movie industries, is purely for commercial reasons. Digital rights management systems are deployed so that the company can benefit from the freedom (as in beer) of distributing their content while at the same time restricting the freedom (as in speech) of the consumers of that content. They are shipping in bits, but they are not thinking in bits.

Even from a creative perspective, as opposed to the commercial, this thinking in atoms prevents them from seeing new possibilities for providing engaging and individual experiences to their customers. For example, consider how labels distribute music, how they release the same tracks in the same order on both CD and on services like iTunes or Google Play. This is thinking in atoms at its finest (worst?).

Imagine if they were thinking in bits instead. They could offer an “album” that includes songs from the setlist the band played in your town, or edit the songs at the disc breaks so they didn’t fade out / fade in. Along those lines, for individual song downloads they could edit the track so you didn’t catch the introduction to the next song at the end of the one you’re listening to.

The same is true, albeit for different reasons, inside many organizations. Yes, nearly everything is in bits, stored on shared drives, in SharePoint or email, or whatever system your organization uses to “manage” documents.

And yet….

And yet most of these bits are locked up in digital representations of atoms. We are using bits, but again we are not thinking in bits.

Part of the challenge, of course, is a need to accommodate the lowest common denominator. In the case of many corporate processes, that lowest common denominator is the requirement to print. So the templates and processes are designed based on what is expected in the final, printed outcome. Of course, once something is printed, there isn’t a whole lot you can do with it except read it and manually extract the info you need. If you have the digital file that was printed, you can at least search the content. But this is really just a faster way of “reading” the document to get to the “good part”.

What if, on the other hand, the document (whatever it might be) was designed and created based on the expectation that it would be used primarily in a digital format, with the printed product a secondary feature? Or that you don’t even know what the final format needs to be?

As an example (since I was inspired to write this by a conversation at EMWCon), consider creating your contract proposals as semantic wiki entries. The proposal can be collaboratively developed and reviewed, and when ready can be exported into the end format that you need. This will likely be some sort of MS Office or PDF file that can be easily sent to the potential client, but it could just as easily be shared with them as bits and negotiations conducted against that.

I say “just as easily”. This isn’t to say that work wouldn’t be involved; there would be a lot of work required. Designing, implementing, transitioning, executing. Cultural challenges galore. But, as Lex explained in his story about bikes, cars, and messenger services, the marginal cost of making this change can be far exceeded by the benefits you can gain from it.


EMWCon 2016 – some notes (create camp)


Spent some time this morning discussing various ideas for projects to work on. Including:

  • HTML2Wiki
  • Semantic Form Themes
  • Mermaid
  • Make site faster
  • Extension certification
    • AD / Vagrant roles
    • BPM Setup for an extension cert service
  • Extension manager
    • Extension interdependency management
  • Extension screenshots and working links to examples on
  • Reification / provenance in SMW
  • Semantic forms validation

Most are somewhat technical (definitely beyond my skill level with MW), but many of those do require some non-technical participation. And some are longer term ideas (screenshots of extensions, for example) that can continue to be worked on over time.

The one that most appealed to me was Lex’s presentation on creating an “Extension Certification” process for MediaWiki extensions. It would tie in with the potential Enterprise MediaWiki Foundation (EMF?) that we discussed on Day 1.

Basic process is straightforward (developer creates extension, runs acceptance tests, submits to EMF for review, they certify), but the implementation is a bit less so. Quite involved on the developer side, somewhat automatic at the review level. End result would be the “EMF Seal of Approval” for the extension, showing which core versions the extension has been tested against.

This type of process would go a long way for Enterprise users, especially when trying to convince management, IA, etc. that the extension can be trusted and presents relatively low risk in implementation.

You can keep up with progress on the EMWCon 2016 page.


EMWCon 2016 – some notes (day 2)

Live-blogging (kind of) again, day 2. Also, livestream.

Notifications in MediaWiki


Yaron discussing the current status of notifications (poor), and what would make an ideal notification system for MediaWiki. Page creation, edits, etc. in different sets of pages, such as all pages or pages in a namespace, with certain people notified, e.g. users in a user group, a specified list, users signed up to be notified, maybe external users via email.

Notification might be done by email, Extension:Echo, or a custom webhook of some kind. Echo is planned to be added into MW core. Several other extensions rely on Echo.

Many potential pitfalls. Too few options not helpful, too many confusing. Regular users vs. admins vs. “superadmins”.

“Looked at SharePoint for bad notification design, and SharePoint didn’t disappoint.”

Google Summer of Code, Yaron is co-mentoring a project to create a “page notifications extension”, started this past Monday. Goal is to create a framework that covers all the possible options, to make notifications in MediaWiki useful.

Jason mentioned Extension:Notificator, which allows users to set up notifications for others. NASA’s Enterprise MediaWiki team released Extension:WatchAnalytics and other extensions that provide some insight into watching.

Inside Wiki Sales


Angelika Müller from Hallo Welt! talking about selling wiki services, especially BlueSpice, their enterprise wiki distro. Covered the basic lifecycle of sales, from contact to requirements to estimates to doing the work and shipping the final product. Challenges are both technical and organizational.

All development, as much as possible, should be customer driven. After all, they are the ones who know what they want.

Discussed some of their big use cases.

  • Wiki farms
  • Document management – they first ask, why not use a document management tool, why a wiki? Things to address – setting rights for files, managing and editing
  • Quality management

In many ways, the same challenges any design / development shop has, but some uniqueness based on the product (MediaWiki based).

A big topic of interest is how they are using WebDAV with MS Office to allow their users to edit / save MS Office documents in-place inside the wiki, instead of having to download – edit – save – upload the file.  This was developed as custom work for a group of customers who came together to fund it, with a restriction that the code for doing it could not be shared for one year (which is almost up).

Anja offered that their WebDAV developer will have a webinar next week to discuss with those who are interested.

Lightning Talks

CWIX Wiki for Interoperability Testing

An overview of the CWIX 2016 wiki.

Making Cool Directories with MediaWiki

Yaron showed some examples of “cool” directory sites. Can you do this with MediaWiki? Functionally, yes. Visually / UI, no.

Question: how can we make our display as cool as theirs? Important for external facing wikis, but can be useful and important internally as well.

Check out migadv and MITRE’s Gestalt.

Another option is to leverage the MediaWiki API to use MediaWiki as the back end and build a custom front end using JavaScript, etc.
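The API-as-back-end approach boils down to fetching JSON from api.php and rendering it yourself. A minimal sketch in Python (the endpoint URL is a placeholder for whatever wiki you run; the canned response just mirrors the shape the standard list=search module returns, so the sketch runs offline):

```python
import json
from urllib.parse import urlencode

API_ENDPOINT = "https://example.org/w/api.php"  # placeholder; point at your own wiki

def build_search_url(term, limit=10):
    """Build a MediaWiki action API URL using the standard list=search module.

    A custom front end would fetch this URL (e.g. with fetch() in JavaScript)
    and render the results however it likes.
    """
    params = {
        "action": "query",
        "list": "search",
        "srsearch": term,
        "srlimit": limit,
        "format": "json",
    }
    return API_ENDPOINT + "?" + urlencode(params)

def extract_titles(response_text):
    """Pull the page titles out of a list=search JSON response."""
    data = json.loads(response_text)
    return [hit["title"] for hit in data["query"]["search"]]

# Canned response in the shape the API returns, so this runs without a network.
sample = json.dumps(
    {"query": {"search": [{"title": "Main Page"}, {"title": "Help:Editing"}]}}
)
print(build_search_url("wiki farm"))
print(extract_titles(sample))  # ['Main Page', 'Help:Editing']
```

The same two steps – build the query URL, pick fields out of the JSON – are all a JavaScript front end would do; everything else is presentation.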

Social Semantic


Jason Bock sharing some of the things we’ve been working on using SMW for bringing some social features to MediaWiki. Started with a brief overview and history of milWiki (see milSuite article on Wikipedia) and installing SMW.

If you are going to use SMW, you should definitely use Semantic Forms. “Has default form” and “has alternate form” are crucial to make sure people are using forms to enter semantic data.

Creating templates based on queries means the user doesn’t need to understand how to create queries; they can just use them. Browse pages are the most popular click from the homepage, a fact of life in a hierarchical-minded organization. Query forms, with filters, provide an “Amazon”-like search / filter experience.
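Under the hood, that kind of filter experience is just assembling SMW’s #ask syntax – [[Property::value]] conditions plus |? printouts – from whatever the user ticked. A sketch of that assembly step (the property names “Has status”, “Has lead”, etc. are made up for illustration; this is not code from the talk):

```python
def build_ask_query(category, filters, printouts, limit=20):
    """Assemble a Semantic MediaWiki #ask query string from UI-style filters.

    Each (property, value) pair becomes a [[Property::value]] condition,
    mirroring the drill-down filtering a query form offers.
    """
    parts = ["[[Category:%s]]" % category]
    parts += ["[[%s::%s]]" % (prop, val) for prop, val in filters]
    query = "".join(parts)
    query += "".join("|?" + p for p in printouts)   # columns to display
    query += "|limit=%d" % limit
    return "{{#ask: %s }}" % query

q = build_ask_query(
    "Projects",
    [("Has status", "Active"), ("Has lead", "Jason")],
    ["Has status", "Has due date"],
)
print(q)
# {{#ask: [[Category:Projects]][[Has status::Active]][[Has lead::Jason]]|?Has status|?Has due date|limit=20 }}
```

Because the generated string is ordinary wikitext, it can be dropped into a template once and reused by people who never learn the #ask syntax themselves.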

A 12-step training guide for general users to learn how to use SMW to create their own projects covers everything from a basic overview through the more complex concepts. Crucial if you want your users to be able to create their own solutions.

Three examples of user created semantic projects: IT ticketing system, app store, unit training.


Anatomy of a Cyber Taxonomy Development Wiki


Cindy Cicalese from MITRE (Extensions by MITRE). Demonstrated live on the Malware Attribute Enumeration and Characterization (MAEC) wiki.

Purpose of the wiki is to help generate an ontology for describing malware. Includes consistent iconography to provide better navigation and visual indicators of what you’re looking at. Built around hierarchies (hierarchy builder?). Uses a combination of Cargo and SMW, but the only SMW still in use will ultimately be converted to Cargo. It is an example of how the two can coexist, using each where best suited. (nts: Cargo vs. SMW?)

The “Graph this page” option uses MITRE’s VIKI extension. Very nice, reminds me of The Brain.

Cindy walked through some key aspects of the site, how it’s structured and configured. Went down the rabbit hole of walking through some of the code. (I stood at the top of the hole and watched her walk around; a bit beyond my own current knowledge 🙂)

Contracting with the GPL


Greg Rundlett talking about the challenges – and joys? – of doing MediaWiki work for large enterprise clients.

Challenge 1 – Lead time: Make sure lead time and discovery are in your pricing model

Challenge 2 – Payment: Large enterprises have set processes that may not be in your best interest. Negotiate for the best terms you can get (net 14 would be good).

Challenge 3 – Insurance: E&O (errors and omissions) and general liability

Challenge 4-6 – “All Your Base Are Belong to Us”. The challenge is how to convince the client to allow you to publish the custom work as GPL.

The “Master Services Agreement” is a standard contract, typically a take-it-or-leave-it thing. But, you should try to write details into the agreement where details are allowed. This is especially needed for the code, how much can you “keep” for reuse or publishing to open repos.

Collaboration and cooperation: more is needed among consultants in the EMW space. We aren’t competing against each other, we are competing against the Microsofts.

Comment from Mark – we need more consultants like freephile as members of the MediaWiki stakeholders.

A bit of discussion – and disagreement – around GPL and how it applies to custom/proprietary code development.

Lightning Talk

Me – MediaWiki as part of larger Enterprise Social Network

Technical Collaboration – WMF


Chris Koerner providing some insight into the technical collaboration team at WikiMedia Foundation.

The team primarily focuses on supporting the WMF and its official efforts, but they do take input from third-party users and work on things that provide value to them as well.

Phabricator – bug tracking. Look at in more detail later.

The team hosts several events – developer summit, hackathons, Wikimania, and others. If you are doing good things with MediaWiki, let them know so they can get the word out.

WMF – the ongoing saga

General discussion of topics related to the WikiMedia Foundation. Didn’t catch it all, so no real notes here.

Rethinking work

Dave Coplin, Chief Envisioning Officer at Microsoft, imagines what might be possible if more organisations embraced the full, empowering potential of technology & encouraged an open, collaborative & flexible working culture.

EMWCon 2016 – some notes (day 1)

A live blog (of sorts) for Enterprise MediaWiki Conference (EMWCon) 2016. The conference is also being livestreamed.

Panel – Towards a MediaWiki Foundation


Cindy Cicalese, Anja Ebersbach, Mark Hershberger, Chris Koerner, Yaron Koren

Panel discussion to address some of the challenges around the development, maintenance, and use of MediaWiki and related software (extensions). Not a discussion of separating Wikipedia ops from MediaWiki core (which would be something similar to how WordPress is set up).

Yaron presented a case for a MediaWiki foundation that would help fund “unfunded” software. Specifically, funnel money from the users of the software to the developers of the software. Similar to other open source “foundations”, example given was the Linux Foundation. aka pay to play.

Part of the challenge is that the Wikimedia Foundation is focused almost exclusively on Wikipedia and their other projects. The desire is to make sure that the MediaWiki software remains usable by third-party users, and actually includes development specifically to meet the needs of those third-party users. For example, the MediaWiki Stakeholders Feature Wishlist.

An interesting mess.

Social Semantic


Jason Bock talking about applying SMW to achieve social objectives on MediaWiki: standard profiles, rate/review articles, user point system, user badges. Shared some of the specific extensions used.

It is possible to create a template for userpages and have it pushed automatically, need to get user feedback on whether they like this or not. Some people prefer freeform userpage. Balance between user needs / desires and the enterprise’s needs.

All of it uses existing extensions and doesn’t require any development, just “front end” SMW work. Walked through examples of the “code” for each. This has the potential to cause significant performance issues; they will work to have some of this turned into extensions.

Questions about gamification: how do you handle cheating (don’t really), has it helped with adoption (with some people).

Improving Enterprise Findability with SMW

Laurent looking at how SMW can be used to help people find answers to their questions.

“Searching doesn’t give you answers, it gives you search results.”

Use the wiki as a wiki; keep it lean and fast. Use APIs to access other systems, such as WordPress.

Lightning Talks

MediaWikiFarm extension

Nicolas Nallet talking about the MediaWikiFarm extension. Slides from the talk.

US Federal Government MediaWikis

Peter Meyer talking about MediaWiki installations in the U.S. Federal Government. Full notes. Something to follow up on – the Federal MediaWiki Demonstration and Discussion Group, a monthly gathering to share ideas, practices, and demos. A call for a more consistent, coherent wiki policy across the Federal Government.

“Knowledge management is not a curse word.” — Peter Woudsma


Wiki Farms (again)

Peter Woudsma. Reasons for multiple wikis: ownership, access, control, processing. Defined: a MediaWiki server install that includes core and some extensions and supports multiple otherwise independent wikis. Went through an example putting three wikis on one MediaWiki server. Reasons to use common elements: data re-use, template re-use, branding, data processing (inter-wiki data exchange, queries, transposition).

Question: how much is common to all wikis in the farm? Many possible options.

Discussion among the group about approaches they have taken.


SMW Factory

Lex Sulzer spoke about SMW Factory. Offline, secure, encrypted, backup (and a bunch of other terms). Duplicity, Vagrant, Ansible (executable documentation). Details in the link. An easy way to “clone” a wiki so you can work on it offline, then push changes back to production. (Meant for dev, not content.)

Ultimate goal – ability to have “offline” wikis that can be updated and merged back in with the main wiki. This would be huge. A project for Friday.

Single enterprise wiki


What’s new in Semantic Forms

Yaron giving an update.

Change 1 – Spreadsheet display. “display = spreadsheet” – each entry becomes a separate instance of the template on the page. Less work for admins, and allows for better HTML, such as <label> (important for accessibility).

Change 2 – Some input types moved from the Semantic Forms Inputs extension to Semantic Forms, continuing a process started with other input types. Fewer extensions, less work for admins and developers.

Semantic Forms is now its own thing, no longer requires Semantic MediaWiki.

Change 3 – removed some of the long-deprecated parameters.

Thinking of changing the name, to reduce confusion with other “semantic” things.

Important takeaway – cleaning up Semantic Forms to make it better, easier for developers and admins.

Also discussed changes to Cargo (an alternative to SMW). Key feature of note (to me) – text search of articles.

Opening Session


Peter Woudsma gave a quick overview of the history of MediaWiki in the enterprise and some of the challenges of taking a tool designed as an easy way to update content and making it useful and valuable at the enterprise level. A nice summary of Enterprise aspects of previous conferences, and the need for balance in discussing the non-technical aspects along with the technical.

He gave us quite a few questions, homework if you will, to consider as the conference progresses. Not just in how we use the tools, but how we influence the future development of the tools, and the support infrastructure.

Good discussion around the philosophy and implementation. Differences between what’s going on with MediaWiki software and WikiMedia. Need to show off who is using MediaWiki, and how they are using it. WikiReport.

Lex – SMW power user is critical to success. SMW is a digital co-worker.


Mix of corporate, government, and non-profit interests represented. People from several places in the US and Europe (France, Germany, Switzerland).

An almost uneventful TSA experience

The TSA is in the news a lot lately, not much of it good. Long lines, senior people getting fired, and the generally negative perception and frustration with the agency. So, I thought I’d write something about them that is not so negative.

To tell the truth, I don’t think I’ve ever had any real issues with the TSA at my home airport, Lambert International. Sure there have been long lines, and the physical design of the various checkpoints leaves much to be desired, but when it comes to the people of TSA at Lambert I’ve never had an issue. (I can’t say the same about other airports; the experience at Orlando earlier this year was horrendous.) No issue with the people today, either. So why the “almost” in my “almost uneventful” title?

As I was waiting for my stuff to come through the X-ray machine, the agent monitoring the scans held up my laptop bag and asked who owned it. Of course I said, “That’s mine, what’s up?” (Actually, just the “that’s mine”.)

“Do you have anything sharp or fragile? Do you mind if I open the bag and take a look?”

“I don’t think so,” and, “No, go ahead.”

As he opened the bag, he glanced up at the scan and then back at the bag. Then the screen. Then I looked in the bag at what he was looking at.

“I do have a fountain pen in there.” A nice Levenger L-Tech, to be exact. In retrospect, I can see how it might look like an X-Acto knife or something on the scan.

The agent, having figured out what he was looking for from the scan, pulled the pen part of the way out of its slot in the bag and commented, “We don’t see many like this one.” Then he slid it back into its slot, handed me the bag and wished me a good day.

Which is about as eventful as I like my encounters with the TSA to be.

Congrats to Moodle team – 3.1 now available

Congrats to the Moodle team and everyone involved in the Moodle 3.1 release, including an update to Moodle Mobile. From the announcement:

Moodle 3.1 wraps a lot of new features together with hundreds of fixes and improvements into a package that we’ll be supporting for the next three years, 1.5 years longer than most releases.

I am proud to announce that the most important new feature in this release is the new core support for competencies, which is something we’ve been talking about and developing in the community for many years.  These help when Moodle is used for competency-based education (CBE), “mastery learning” and any technique that involves learning plans based on the things students know, and the things they are yet to know.