Module:Performances
This module provides access to the list of performances by Angelina. At its heart is a very generic, configurable table generator that can also be used in other contexts.
Usage
Overview
{{#invoke: Performances | createTable -- createTable is the generic table generator
|page=Data:Performances.json -- link to page with JSON data
|supplements=Data:VideoMetaData.json;url -- supplemental data sources
|headers=Song,Date,Type,With,Video -- List of titles used in the group section headers
|keys=[[<song>]],{{d|<date>}},<type>,<with>,[<url> <song> - <event>] -- computed data items corresponding to those titles
|sort=<date> -- computed sort keys for sorting outer table
|sort1=<date> -- (optional): outer sortkey, same as inner if omitted
|filters=date:2018,event:Kongsberg -- filters to select what items are displayed
|char_limit=7 -- The number of characters of the sort key considered for grouping the sections
|char_limit1=4 -- (optional): outer char limit, no outer grouping if omitted
|caption=Kongsberg 2018 by Month -- Title displayed in Header of outer Table
|group_sort=<date> ! <type> ! <pos> -- how the group section sub-tables are sorted
|id=1 -- Id needed if multiple tables are generated within a page
}}
Options
page: The page that contains the JSON-formatted input data.
cargo_query: Alternative way to access data via Cargo query (see below for details)
supplements (optional): comma-separated list of supplemental JSON data source entries of the form "supplementalDataPage;keyName:charLimit;targetKeyName", with charLimit and targetKeyName being optional. Data from the supplemental source is matched on the key (if the target's key name is omitted, the same name is used). The charLimit option shortens the primary key value to that number of characters before matching. The target data may also be structured so that the matching key values are used as table keys themselves instead of as the value of a "key" field.
headers: Comma-separated list of titles used in the group section headers.
keys: Comma-separated list of computed fields used for the columns in the group sections defined by the corresponding headers. The number of entries should match the number of headers.
Key names representing data are enclosed in "<", ">" brackets. Each field can reference multiple keys. Other text is still interpreted, e.g. as templates. For a key's value, substitutions can be defined by appending them with colons. A pair of strings is interpreted as the search and replacement strings for a gsub() substitution with Lua pattern matching. The special keywords "toUpper", "toLower", and "toTitleCase" perform the corresponding conversions, and "limit:n" limits the result to n characters. Several substitutions can be concatenated (see the sketch below the option list).
sort: Computed sort field for sorting the outer table and for labeling the section names. Same syntax as for keys.
group (optional): Definition of how to label section names differently from the 'sort' definition.
char_limit (optional): The number of characters of the sort key considered for grouping the sections. This also affects the displayed section name if 'group' is not provided.
filters: Comma-separated list of filters that select which items are displayed. A filter is written as "keyname:values". Alternatives within a filter value can be separated with "/". Negations can be written as "!( ... )".
sort1 (optional): Outer sort key. If omitted, the inner sort key is used.
group1 (optional): Definition of how to label outer section names differently from the 'sort1' definition.
char_limit1 (optional): The number of characters of the sort key considered for the outer grouping. This also affects the displayed section name if 'group1' is omitted. If char_limit1 is omitted, no outer grouping is performed.
id: A unique Id for each table is needed if multiple tables are generated within a page, to provide a basis for unique section ids.
caption (optional): Title displayed in the header of the outer table
group_sort: Computed field that determines how the group section sub-tables are sorted. Same syntax as for keys.
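As a minimal sketch (for example in the Scribunto debug console), this is how the key substitutions and filters described above behave; the item fields and values below are made up, and the module is assumed to be loadable as Module:Performances (the page this documentation belongs to):
local perf = require('Module:Performances')
local item = { song = 'An Example Song', date = '2018-06-23', type = 'live' }
-- "<date:limit:4>" keeps the first 4 characters, "<type:toUpper>" upper-cases the value,
-- and "<song: :_>" runs gsub(" ", "_") on the value
mw.log(perf.computeField(item, '<date:limit:4>'))   --> 2018
mw.log(perf.computeField(item, '<type:toUpper>'))   --> LIVE
mw.log(perf.computeField(item, '<song: :_>'))       --> An_Example_Song
-- filters: a plain value match, and a negation written as "!( ... )"
local data = { item, { song = 'Other Song', date = '2019-01-01', type = 'studio' } }
mw.log(#perf.filterData(data, { date = '2018' }))        --> 1
mw.log(#perf.filterData(data, { type = '!(studio)' }))   --> 1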
Features
The generated table may have one or two levels of grouping and up to three levels of sorting. It has the following structure, with the outer grouping being optional (click on the [n Items] buttons to reveal the inner collapsed tables):
| Caption (provided as parameter) | |||
|---|---|---|---|
| 1 (outer grouping level seperator) | |||
| A (inner grouping level header) [2 Items] | |||
| B (inner grouping level header) [1 Item] | |||
| 2 (outer grouping level seperator) | |||
| C (inner grouping level header) [1 Item] | |||
The inner tables are enclosed in
<div class="youtube-player-placeholder"> ... </div>
tags, which can optionally be used by JavaScript code contained in a gadget (MediaWiki:Gadget-embeddedYouTubePlayer.js) to attach an embedded YouTube player to each section, loaded with a playlist of all YouTube video links contained in that section. Clicking on one of those links will then also load and start the linked video in that player.
Each grouping separator/header sets an anchor id, so that the table rows can be jumped to with internal references, e.g. C.
There is also code available that expands the inner table of a jump target when it is jumped to.
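The anchor id of a group header is simply the computed group key (the 'sort'/'sort1' value, shortened to the char limit and stripped of brackets), as set by groupData() and renderTable() in the Code section below. A minimal sketch of the derivation, using a made-up sort value:
local sortValue = '[[Allsang på Grensen]]'   -- computed "sort" field of an item (made-up example)
local charLimit = 20                         -- example char_limit
local anchorId = mw.ustring.sub(sortValue:gsub('[%(%)%[%]%{%}]', ''), 1, charLimit)
mw.log(anchorId)   --> Allsang på Grensen  (link to that section with [[#Allsang på Grensen]])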
Use of Cargo queries for input
This was recently added as an alternative source for input. It supports:
- Multiple Cargo tables with configurable fields
- Nested child tables (1–3+ levels)
- List field splitting (# or ,)
- Optional pretty-printed JSON
- Flexible configuration in JSON or Lua-like syntax
The getJSON() function generates nested JSON structures from Cargo tables. It is meant to verify that queries produce the right structures before using them in the createTable() function.
All table and field configuration is contained in a single parameter: cargo_query.
{{#invoke:CargoQueryTest|getJSON
|cargo_query=<JSON or Lua-like table>
|pretty=true
}}
- cargo_query – required; configuration of tables, fields, nesting, and root table.
- pretty – optional; if "true", outputs indented JSON (requires mw.text.JSON_PRETTY).
cargo_query Parameters
| Parameter | Type | Description |
|---|---|---|
| tables | array of strings | List of Cargo table names (CamelCase). Order does not determine root. |
| fields | table | Map of tableName → comma-separated list of field names to return. Fields can be lowercase. |
| where | table | Map of tableName → Cargo WHERE clause for filtering rows. Empty string for no filter. |
| nest | array of tables | Each table describes a parent-child nesting rule: { parent = "ParentTable", child = "ChildTable", parentKey = "ParentField", childKey = "ChildField", as = "ChildLabel" } |
| listFields | array of strings | Field names that should be returned as arrays (split on # or ,). |
| root | string | The Cargo table to use as the top-level JSON object. |
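For illustration, this is essentially what a single nest rule does; it mirrors the nesting step of runCargoQuery() in the Code section below, and the row data here is made up:
local parents  = { { perfID = 'p1', song = 'Song A' }, { perfID = 'p2', song = 'Song B' } }
local children = { { perfID = 'p1', url = 'https://example.org/v1' } }
local rule = { parent = 'PerformancesDevel', child = 'VideosDevel',
               parentKey = 'perfID', childKey = 'perfID', as = 'videos' }
-- index the child rows by childKey, then attach them to each parent row under rule.as
local index = {}
for _, child in ipairs(children) do
    index[child[rule.childKey]] = index[child[rule.childKey]] or {}
    table.insert(index[child[rule.childKey]], child)
end
for _, parent in ipairs(parents) do
    parent[rule.as] = index[parent[rule.parentKey]] or {}
end
-- parents[1].videos now holds the single video row; parents[2].videos is an empty table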
Field Splitting Rules
- # delimiter → splits Cargo lists into an array, preserves empty entries.
Example: "Concert##US Concert" → ["Concert","","US Concert"]
- , delimiter → splits only if there are no spaces around the comma (see the sketch below).
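A minimal sketch of this splitting rule; it mirrors splitListField() inside runCargoQuery() in the Code section below:
local function splitListField(value)
    if value:find('#') then
        return mw.text.split(value, '#', true)    -- plain split on '#', keeps empty entries
    elseif value:find(',') and not (value:match('%s,') or value:match(',%s')) then
        return mw.text.split(value, ',', true)    -- split on ',' only when no spaces surround it
    end
    return value                                  -- otherwise left unchanged
end
mw.logObject(splitListField('Concert##US Concert'))  --> { 'Concert', '', 'US Concert' }
mw.logObject(splitListField('a,b,c'))                --> { 'a', 'b', 'c' }
mw.logObject(splitListField('Smith, Jones'))         --> 'Smith, Jones' (not split)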
Example 1 — Two-level nesting (Performances → Videos)
{{#invoke:CargoQueryTest|getJSON
|cargo_query={
"tables": ["PerformancesDevel","VideosDevel"],
"fields": {
"PerformancesDevel": "song,event,context,date,type,pos,partners,comment,perfID",
"VideosDevel": "perfID,url,duration,quality"
},
"where": {
"PerformancesDevel": "",
"VideosDevel": ""
},
"nest": [
{ "parent": "PerformancesDevel", "child": "VideosDevel", "parentKey": "perfID", "childKey": "perfID", "as": "videos" }
],
"listFields": ["context","partners"],
"root": "PerformancesDevel"
}
|pretty=true
}}
Lua-like table style (sanitized)
{{#invoke:CargoQueryTest|getJSON
|cargo_query={
tables = ['PerformancesDevel','VideosDevel'],
fields = {
PerformancesDevel = 'song,event,context,date,type,pos,partners,comment,perfID',
VideosDevel = 'perfID,url,duration,quality'
},
nest = {
{ parent = 'PerformancesDevel', child = 'VideosDevel', parentKey = 'perfID', childKey = 'perfID', as = 'videos' }
},
listFields = ['context','partners'],
root = 'PerformancesDevel'
}
|pretty=true
}}
Example 2 — Three-level nesting (Songs → Performances → Videos)
{{#invoke:CargoQueryTest|getJSON
|cargo_query={
tables = ['SongsDevel','PerformancesDevel','VideosDevel'],
fields = {
SongsDevel = 'songID,title,artist',
PerformancesDevel = 'song,event,context,date,type,pos,partners,comment,perfID',
VideosDevel = 'perfID,url,duration,quality'
},
nest = {
{ parent = 'SongsDevel', child = 'PerformancesDevel', parentKey = 'songID', childKey = 'song', as = 'performances' },
{ parent = 'PerformancesDevel', child = 'VideosDevel', parentKey = 'perfID', childKey = 'perfID', as = 'videos' }
},
listFields = ['context','partners'],
root = 'SongsDevel'
}
|pretty=true
}}
Tips
- Always specify root in cargo_query to avoid ambiguity.
- Use listFields for any Cargo field that should become a Lua/JSON array.
- pretty=true is optional but recommended for debugging and readability.
- Lua-table style is allowed inline — single quotes ' are fine; the module sanitizes them into valid JSON (see the sketch after these tips).
- Multi-level nesting must specify both parent and child explicitly in each rule.
- Use the AJW:Cargo_query_test page to develop your query and to check that it delivers the expected JSON structure.
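The sanitization mentioned above amounts to a few gsub() calls (see runCargoQuery() in the Code section below); a minimal sketch with a made-up inline config:
local cfgText = "{ tables = ['PerformancesDevel'], root = 'PerformancesDevel', }"
local safe = cfgText
    :gsub("(%w+)%s*=", '"%1":')   -- bareword keys -> quoted JSON keys
    :gsub("'", '"')               -- single quotes -> double quotes
    :gsub(",%s*}", "}")           -- drop trailing commas before } and ]
    :gsub(",%s*]", "]")
mw.log(safe)   --> { "tables": ["PerformancesDevel"], "root": "PerformancesDevel"}
local cfg = mw.text.jsonDecode(safe)   -- now valid JSON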
ToDo
- Make sorting more flexible, e.g. by applying regex substitutions to data values before they are sorted.
- Add an option to not start collapsed (useful for small tables).
- Implement access to sublists below keys.
- Provide default values for empty fields of computed fields, e.g. like <keyName=defaultValue>.
- Provide a means to print additional data behind the sort field that does not affect the id anchor.
Examples
Multiple Filters, combined as AND
{{#invoke: Performances | createTable
|page=Data:Performances.json
|headers=Song,Date,Type,With,Video
|keys=[[<song>]],{{d|<date>}},<type>,<with>,[<url> <song> - <event>]
|sort=<date>
|filters=date:2018,event:Kongsberg
|char_limit=7
|caption=Kongsberg 2018 by Month
|group_sort=<date> ! <type> ! <pos>
|id=1
}}
| Kongsberg 2018 by Month | ||||
|---|---|---|---|---|
| 2018-06 [12 Items] | ||||
| 2018-07 [22 Items] | ||||
Single Event with Supplemental Video Metadata
{{#invoke: Performances | createTable
|page=Data:Performances.json
|supplements=Data:VideoMetaData.json;url:43
|headers=Song,Date,Type,Video
|keys=[[<song>]],{{d|<date>}},<type>,[<url> <url-title>; <url-User>]
|sort=<event>
|group_sort=<date> ! <type> ! <pos> ! <song>
|filters=event:Bjerke
|id=2
}}
| Bjerke Lydstudio [8 Items] | |||
|---|---|---|---|
Alternatives in Filters, combined as OR
{{#invoke: Performances | createTable
|page=Data:Performances.json
|headers=Song,Date,Type,Comment
|keys=[[#<song>|<song>]],{{d|<date>}}: <pos>,<type> <duration> [<url> play],<comment>; <with>
|sort=[[<event>]]
|caption=Repetitive Live Events
|group_sort=<date> ! <type> ! <pos> ! <song>
|filters=event:Allsang på Grensen/TV 2's Artist Gala
|id=3
}}
| Repetitive Live Events | |||
|---|---|---|---|
| Allsang på Grensen [7 Items] | |||
| TV 2's Artist Gala [2 Items] | |||
Demonstrating Different Outer Sorting/Grouping and Negation in Filters
In the following example, the highest level (sort1) is sorted by year, then the middle one (sort) by date, as is the inner table, which also features secondary sorting keys (type, pos, song). The separator "!" chosen for these has no special meaning, but it is the lowest character encoding value above a space, which should ensure that the correct sorting is applied.
"duration:!(fragment)" is used to filter for those videos that are not fragments.
We also apply substitutions to the computed "url" field.
{{#invoke: Performances | createTable
|page=Data:Performances.json
|headers=Song,Date,Type,Comment,Video
|keys=[[#<song>|<song>]],{{d|<date>}}; <pos>,<type> <duration>, <comment>; <with>,[<url> <url:.*www.: :.com.*: >]
|sort=[[<event>]]
|sort1=<date>
|char_limit1=4
|filters=type:live,duration:!(fragment)
|caption=Live Events by Year, excluding fragments
|group_sort=<date> ! <type> ! <pos> ! <song>
|id=4
}}
Demonstrating Same Type Sorting, with Different Grouping Levels, and Supplemental Data
Here, the different char limits on the song sort key lead to two different grouping levels: by first character and by unique title.
{{AtoZ}}
{{#invoke: Performances | createTable
|page=Data:Performances.json
|supplements=Data:Songs.json;song;title,Data:VideoMetaData.json;url:43 |headers=Event,Date,Type,Video,Pos,With,Comment
|keys=[[#<event>|<event>]],{{d|<date>}},<type> <duration>,[<url> play <song-type>],<pos>,<with>,<comment>; <url-channelName>
|sort=[[<song>]]
|char_limit1=1
|caption=Live Songs, excluding fragments
|group_sort=<date> ! <type> ! <pos>
|filters=duration:!(fragment),type:live
|id=5
}}
{{AtoZ}}
A · B · C · D · E · F · G · H · I · J · K · L · M · N · O · P · Q · R · S · T · U · V · W · X · Y · Z
Code
local p = {}
local mw_text = mw.text
local mw_title = mw.title -- built-in title library, available on the mw table
-- Fetch JSON from a page
function p.fetchJSONFromPage(pageName)
local title = mw_title.new(pageName)
if not title then return nil, 'Invalid page name' end
local content = title:getContent()
if not content then return nil, 'Page not found or empty' end
return mw_text.jsonDecode(content)
end
-- Limit key length
local function limitKeyLength(key, maxChars)
if maxChars then return string.sub(key, 1, tonumber(maxChars)) end
return key
end
-- Find match in secondary table
local function findInSecondary(secondaryData, key, secondaryFieldName)
if type(secondaryData) == "table" then
if secondaryData[key] then return secondaryData[key] end
for _, item in pairs(secondaryData) do
if item[secondaryFieldName] == key then return item end
end
end
return nil
end
-- Merge secondary data into primary
local function enrichPrimaryWithSecondary(primaryData, secondaryPage, primaryFieldName, maxChars, secondaryFieldName)
secondaryFieldName = secondaryFieldName or primaryFieldName
local secondaryData, err = p.fetchJSONFromPage(secondaryPage)
if not secondaryData then return nil, "Failed to load JSON from "..secondaryPage.." ("..err..")" end
for _, item in ipairs(primaryData) do
local primaryFieldValue = item[primaryFieldName]
if primaryFieldValue then
local limitedPrimaryFieldValue = limitKeyLength(primaryFieldValue, maxChars)
local secondaryMatch = findInSecondary(secondaryData, limitedPrimaryFieldValue, secondaryFieldName)
if secondaryMatch then
for k, v in pairs(secondaryMatch) do
if k ~= secondaryFieldName then
local newKey = primaryFieldName.."-"..k
item[newKey] = v
end
end
end
end
end
return primaryData
end
-- Enrich primary from supplements (legacy JSON or Cargo)
local function enrichFromSupplements(primaryData, supplements, supplements_query)
-- Legacy JSON supplements
if supplements then
local supplementsList = mw_text.split(supplements, ",")
for _, supplement in ipairs(supplementsList) do
local parts = mw_text.split(supplement,";")
local secondaryPage = parts[1]
local primaryFieldParts = mw_text.split(parts[2],":")
local primaryFieldName = primaryFieldParts[1]
local maxChars = primaryFieldParts[2]
local secondaryFieldName = parts[3] or nil
local enrichedData, err = enrichPrimaryWithSecondary(primaryData, secondaryPage, primaryFieldName, maxChars, secondaryFieldName)
if not enrichedData then return nil, "Error: "..err end
end
end
-- Cargo supplements
if supplements_query then
for _, supp in ipairs(supplements_query) do
local secData = supp.query
-- If string, try decoding JSON or treat as page name
if type(secData) == "string" then
local ok, tbl = pcall(mw.text.jsonDecode, secData)
if ok and type(tbl) == "table" then
secData = tbl
else
-- Fallback: treat as page
secData = p.fetchJSONFromPage(secData)
if not secData then return nil, "Error fetching supplements_query page: "..tostring(supp.query) end
end
end
local primaryField = supp.primaryField
local secondaryField = supp.secondaryField or primaryField
local mode = supp.mode or "single"
local prefix = supp.prefix or ""
local fields = supp.fields or {}
local maxChars = supp.maxChars
for _, item in ipairs(primaryData) do
local keyValue = item[primaryField]
if keyValue then
keyValue = limitKeyLength(keyValue, maxChars)
for _, secItem in ipairs(secData) do
if secItem[secondaryField] == keyValue then
for _, f in ipairs(fields) do
if mode=="array" then
item[prefix..f] = item[prefix..f] or {}
table.insert(item[prefix..f], {url=secItem["URL"], duration=secItem["Duration"], quality=secItem["Quality"]})
else
item[prefix..f] = secItem[f]
end
end
end
end
end
end
end
end
return primaryData
end
-- Simple filter
local function matchesFilter(value, pattern)
local luaPattern = pattern:gsub("%%", "%%%%"):gsub("%*", ".*")
local isGroupNegated = luaPattern:match("^%!%b()$")
if isGroupNegated then luaPattern = luaPattern:sub(3,-2) end
local alternatives = {}
for alt in luaPattern:gmatch("[^/]+") do alt = alt:gsub("([%^%$%(%)%%%.%[%]%*%+%-%?])","%%%1"); table.insert(alternatives, alt) end
local subMatch = false
for _, alt in ipairs(alternatives) do
if tostring(value):match(alt) then subMatch = true; break end
end
if (not isGroupNegated and subMatch) or (isGroupNegated and not subMatch) then return true end
return false
end
-- Resolve paths
local INVISIBLE_SEPARATOR = " "
local function resolvePath(tbl, path)
local current = tbl
for segment in path:gmatch("[^.]+") do
local key, index = segment:match("^(%w+)%[(%d+)%]$")
if key then
if current and current[key] then
local idx = tonumber(index)
if current[key][idx] ~= nil then current = current[key][idx] else return INVISIBLE_SEPARATOR end
else return "" end
else
current = current and current[segment]
if current==nil then return "" end
end
end
return current
end
-- Compute field value
function p.computeField(item, formula)
return (formula:gsub("(%b<>)", function(placeholder)
local content = placeholder:sub(2,-2)
local fieldPart, filterPart = content:match("([^|]+)|?(.*)")
filterPart = filterPart ~= "" and filterPart or nil
local parts = {}
for part in fieldPart:gmatch("[^:]+") do table.insert(parts, part) end
local fieldPath = parts[1]
local value = resolvePath(item, fieldPath)
value = tostring(value or "")
if filterPart and not matchesFilter(value, filterPart) then return "" end
if #parts>1 then
for i=2,#parts,2 do
local pattern = parts[i]
local replacement = parts[i+1]
if pattern=="toUpper" then value=value:upper()
elseif pattern=="toLower" then value=value:lower()
elseif pattern=="toTitleCase" then value=value:gsub("(%a)(%w*)",function(f,r) return f:upper()..r end)
elseif pattern=="limit" and replacement then
local n=tonumber(replacement)
if n then value=mw.ustring.sub(value,1,n) end
elseif replacement then
value=value:gsub(pattern,replacement)
end
end
end
return value
end))
end
-- Split string
local function splitString(str, delimiter)
local result={}
for match in (str..delimiter):gmatch("(.-)"..delimiter) do table.insert(result,match) end
return result
end
-- Function to filter data based on provided filters with support for alternatives and negation
function p.filterData(data, filters)
local filteredData = {}
for _, item in ipairs(data) do
local match = true
for key, pattern in pairs(filters) do
-- Replace '*' with '.*' for wildcard matching
local luaPattern = pattern:gsub('%*', '.*')
-- Check for group negation (e.g., "!(pattern1/pattern2)")
local isGroupNegated = luaPattern:match("^%!%b()$")
if isGroupNegated then
luaPattern = luaPattern:sub(3, -2) -- Remove the "!(" and ")" around the group
end
-- Split the pattern by '/', treating each part as an alternative
local alternatives = splitString(luaPattern, '/')
-- Check if the item's value matches any of the alternatives
local subMatch = false
for _, alt in ipairs(alternatives) do
-- Escape any Lua pattern magic characters in the alternative
alt = alt:gsub("([%^%$%(%)%%%.%[%]%*%+%-%?])", "%%%1")
if tostring(item[key]):match(alt) then
subMatch = true
break
end
end
-- If group negated, we invert the logic for the entire group of alternatives
if (not isGroupNegated and not subMatch) or (isGroupNegated and subMatch) then
match = false
break
end
end
if match then
table.insert(filteredData, item)
end
end
return filteredData
end
-- Sort data
function p.sortData(data, sortKeyFormula)
table.sort(data, function(a,b)
local aKey = p.computeField(a, sortKeyFormula):gsub('[^%w%?%!]', ''):lower():gsub('%?', '#')
local bKey = p.computeField(b, sortKeyFormula):gsub('[^%w%?%!]', ''):lower():gsub('%?', '#')
return tostring(aKey) < tostring(bKey)
end)
return data
end
-- Group data
function p.groupData(data, keyFormula, labelFormula, charLimit)
local groupedData = {}
local currentKey = nil
for _, item in ipairs(data) do
local keyValue = p.computeField(item, keyFormula)
if charLimit and tonumber(charLimit) then
keyValue = mw.ustring.sub(keyValue:gsub('[%(%)%[%]%{%}]',''),1,tonumber(charLimit))
end
local headerValue = labelFormula and p.computeField(item, labelFormula) or keyValue
if keyValue ~= currentKey then
currentKey = keyValue
groupedData[#groupedData+1] = {key=keyValue, header=headerValue, items={}}
end
table.insert(groupedData[#groupedData].items, item)
end
return groupedData
end
-- Render table headers
function p.renderHeaders(headers)
return '! ' .. table.concat(headers, ' !! ') .. '\n'
end
-- Render table rows
function p.renderRows(items, computedKeys)
local rows = ''
for _, item in ipairs(items) do
rows = rows .. '|-\n'
local row = {}
for _, keyFormula in ipairs(computedKeys) do
row[#row+1] = p.computeField(item, keyFormula) or ''
end
rows = rows .. '| ' .. table.concat(row, ' || ') .. '\n'
end
return rows
end
-- Render full table
function p.renderTable(data, headers, computedKeys, sortKeyFormula, groupLabelFormula, charLimit, sortKeyFormula1, groupLabelFormula1, charLimit1, groupSortKeyFormula, caption, tableId)
local outerTable = '{| class="wikitable" style="width:100%; margin:0;"\n'
outerTable = outerTable .. '|-\n! colspan="'..#headers..'" style="text-align:left;" | '..caption..'\n'
local sortedData1 = p.sortData(data, sortKeyFormula1)
local groupedData1 = p.groupData(sortedData1, sortKeyFormula1, groupLabelFormula1, charLimit1)
for groupIndex1, group1 in ipairs(groupedData1) do
local sortedData = p.sortData(group1.items, sortKeyFormula)
local groupedData = p.groupData(sortedData, sortKeyFormula, groupLabelFormula, charLimit)
outerTable = outerTable .. '|-\n| style="text-align:center;" | <span id="'..group1.key:gsub('[%[%]]','')..'">'..group1.header..'</span>\n'
for groupIndex, group in ipairs(groupedData) do
if groupSortKeyFormula then p.sortData(group.items, groupSortKeyFormula) end
local groupId = tableId..'-'..groupIndex1..'-'..groupIndex
local itemCount = #group.items
local toggleLabel = '['..itemCount..' Item'..(itemCount>1 and 's' or '')..']'
local toggleSpan = '<span class="mw-customtoggle-'..groupId..'" style="cursor:pointer; color:blue;">'..toggleLabel..'</span>'
outerTable = outerTable .. '|-\n! colspan="'..#headers..'" style="background-color:#f5f5f5; text-align:left;" | <span id="'..group.key:gsub('[%[%]]','')..'">'..group.header..'</span> '..toggleSpan..'\n'
outerTable = outerTable .. '<tr class="mw-collapsible mw-collapsed" id="mw-customcollapsible-'..groupId..'" style="display:none">\n'
outerTable = outerTable .. '| <div class="youtube-player-placeholder">\n'
outerTable = outerTable .. '{| class="wikitable sortable" style="width:100%; margin:0;"\n'
outerTable = outerTable .. '|-\n'..p.renderHeaders(headers)
outerTable = outerTable .. p.renderRows(group.items, computedKeys)
outerTable = outerTable .. '|}\n</div>\n'
outerTable = outerTable .. '</tr>\n'
end
end
outerTable = outerTable .. '|}\n'
return outerTable
end
-- ================================================
-- New: runCargoQuery - returns table of rows
-- ================================================
function p.runCargoQuery(cfg)
-- Accept string JSON or Lua-like table or actual table
if type(cfg) == "string" then
local ok, tbl = pcall(mw.text.jsonDecode, cfg)
if ok and type(tbl) == "table" then
cfg = tbl
else
-- try simple Lua-like sanitization (convert single quotes to double, bare keys)
local safe = cfg
:gsub("(%w+)%s*=", '"%1":') -- bareword keys -> JSON keys
:gsub("'", '"') -- single -> double quotes
:gsub(",%s*}", "}") -- remove trailing commas before }
:gsub(",%s*]", "]")
local ok2, tbl2 = pcall(mw.text.jsonDecode, safe)
if ok2 and type(tbl2) == "table" then
cfg = tbl2
else
return nil, "Failed to parse cargo_query (not valid JSON or sanitized Lua-like)"
end
end
elseif type(cfg) ~= "table" then
return nil, "cargo_query must be table or JSON string"
end
local cargo = mw.ext.cargo
if not cargo then return nil, "Cargo extension not available" end
local tables = cfg.tables or {}
local fields = cfg.fields or {}
local where = cfg.where or {}
local nest = cfg.nest or {}
local listFields = cfg.listFields or {}
local root = cfg.root or tables[1]
local limit = cfg.limit or "max"
if type(tables) ~= "table" or #tables == 0 then
return nil, "No tables specified in cargo_query"
end
-- Query each table and convert rows to Lua tables
local tableData = {}
-- helper to ensure fields string
local function fieldsForTable(t)
local f = fields[t]
if not f then return "*" end
if type(f) == "table" then
return table.concat(f, ",")
else
return tostring(f)
end
end
for _, t in ipairs(tables) do
local fstr = fieldsForTable(t)
local ok, rows = pcall(mw.ext.cargo.query, t, fstr, { where = where[t] or "", limit = limit })
if not ok then return nil, "Cargo query failed for " .. t .. ": " .. tostring(rows) end
tableData[t] = rows or {}
end
-- convert list fields into arrays when requested (or heuristically)
local listFieldSet = {}
for _, lf in ipairs(listFields) do listFieldSet[lf] = true end
local function splitListField(value)
if not value then return nil end
if value:find("#") then
return mw_text.split(value, "#", true)
elseif value:find(",") and not (value:match("%s,") or value:match(",%s")) then -- split on "," only when no spaces surround it
return mw_text.split(value, ",", true)
end
return value
end
for _, t in pairs(tableData) do
for _, row in ipairs(t) do
for k, v in pairs(row) do
if type(v) == "string" then
if listFieldSet[k] then
row[k] = splitListField(v)
elseif v:find("#") or (v:find(",") and not (v:match("%s,") or v:match(",%s"))) then
row[k] = splitListField(v)
end
end
end
end
end
-- Apply explicit nesting rules (nest is an array of { parent=..., child=..., parentKey=..., childKey=..., as=... })
for _, def in ipairs(nest) do
local parentName = def.parent
local childName = def.child
local parentKey = def.parentKey
local childKey = def.childKey
local label = def.as or childName
if tableData[parentName] and tableData[childName] then
-- index children by childKey
local index = {}
for _, child in ipairs(tableData[childName]) do
local key = child[childKey]
if key then
index[key] = index[key] or {}
table.insert(index[key], child)
end
end
-- attach
for _, parent in ipairs(tableData[parentName]) do
local k = parent[parentKey]
parent[label] = index[k] or {}
end
-- remove child table to avoid circular references
tableData[childName] = nil
end
end
return tableData[root] or {}, nil
end
-- getJSON: returns pretty JSON for debugging
function p.getJSON(frame)
-- allow both direct and parent-frame args (direct args take precedence)
local args = {}
for k, v in pairs(frame:getParent().args) do args[k] = v end
for k, v in pairs(frame.args) do args[k] = v end
local cargo_query = args["cargo_query"]
if not cargo_query then return "Error: cargo_query parameter required" end
local tbl, err = p.runCargoQuery(cargo_query)
if not tbl then return "Error: "..tostring(err) end
local pretty = true
if type(cargo_query) == "table" and cargo_query.pretty ~= nil then
pretty = cargo_query.pretty and true or false
elseif type(cargo_query) == "string" then
-- try detect pretty in JSON string
local ok, parsed = pcall(mw.text.jsonDecode, cargo_query)
if ok and type(parsed) == "table" and parsed.pretty ~= nil then
pretty = parsed.pretty and true or false
end
end
if pretty and mw.text.JSON_PRETTY then
return mw.text.jsonEncode(tbl, mw.text.JSON_PRETTY)
else
return mw.text.jsonEncode(tbl)
end
end
-- Main createTable entry
function p.createTable(frame)
local args = {}
for k, v in pairs(frame:getParent().args) do args[k] = v end
for k, v in pairs(frame.args) do args[k] = v end
local pageName = args['page']
local dataArg = args['data']
local cargo_query = args['cargo_query']
local supplements = args['supplements']
local supplements_query = args['supplements_query']
if not args['headers'] or not args['keys'] or not args['sort'] then
return 'Error: headers, keys, and sort arguments are required.'
end
local headers = mw_text.split(args['headers'],',')
local computedKeys = mw_text.split(args['keys'],',')
local sortKeyFormula = args['sort']
local groupLabelFormula = args['group']
local charLimit = args['char_limit']
local sortKeyFormula1 = args['sort1'] or sortKeyFormula
local groupLabelFormula1 = args['group1']
local charLimit1 = args['char_limit1'] or 0
local groupSortKeyFormula = args['group_sort']
local caption = args['caption'] or ''
local tableID = args['id']
local filters={}
local filterString=args['filters']
if filterString then
for key,pattern in filterString:gmatch('([^:]+):([^,]+),?') do filters[key]=pattern end
end
-- Determine primaryData from page, data, or cargo_query
local primaryData
if dataArg then
local ok,tbl = pcall(mw.text.jsonDecode,dataArg)
if ok and type(tbl)=="table" then primaryData=tbl else return "Error decoding data JSON" end
elseif cargo_query then
local cfg
if type(cargo_query) == "string" then
local ok, tbl = pcall(mw.text.jsonDecode, cargo_query)
if ok and type(tbl) == "table" then cfg = tbl
else
-- try sanitize Lua-like inline (bare keys and single quotes)
local safe = cargo_query
:gsub("(%w+)%s*=", '"%1":')
:gsub("'", '"')
:gsub(",%s*}", "}")
:gsub(",%s*]", "]")
local ok2, tbl2 = pcall(mw.text.jsonDecode, safe)
if ok2 and type(tbl2) == "table" then cfg = tbl2
else return "Error decoding cargo_query JSON"
end
end
elseif type(cargo_query) == "table" then
cfg = cargo_query
else
return "Error: cargo_query must be a table or JSON string"
end
local result, err = p.runCargoQuery(cfg)
if not result then return "Cargo query error: "..tostring(err) end
primaryData = result
elseif pageName then
local err
primaryData, err = p.fetchJSONFromPage(pageName)
if not primaryData then return 'Error fetching JSON: '..err end
else
return 'Error: page, data, or cargo_query argument required.'
end
local err
primaryData, err = enrichFromSupplements(primaryData, supplements, supplements_query)
if not primaryData then return err end
local data = p.filterData(primaryData, filters)
return p.renderTable(data, headers, computedKeys, sortKeyFormula, groupLabelFormula, charLimit, sortKeyFormula1, groupLabelFormula1, charLimit1, groupSortKeyFormula, caption, tableID)
end
-- remove HTML encodings from string
function p.clean(frame)
-- Get first argument (or empty string if missing)
local s = frame.args[1] or ""
-- Replace HTML entities such as &#39; or &amp; with an underscore
s = s:gsub("&[#%w]+;", "_")
-- Replace anything that is not alphanumeric, underscore, hyphen, or dot with an underscore
s = s:gsub("[^%w_%-%.]", "_")
return s
end
function p.decode(frame)
-- Get first argument (or empty string if missing)
local s = frame.args[1] or ""
-- Decode HTML entities such as &amp; back into their characters
s = mw.text.decode(s)
return s
end
function p.urlencode(frame)
-- Get first argument (or empty string if missing)
local s = frame.args[1] or ""
-- Encode safely for URLs or storage
s = mw.uri.encode(s, "WIKI")
return s
end
-- prints substrings 3, 4 and 5 of a context string separated by '#'s, in the form "3 4, 5"
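-- e.g. (schematic) "p1#p2#p3#p4#p5" -> "p3 p4, p5"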
function p.eventFromContextString(frame)
local s = frame.args[1] or ""
-- 1. Replace 4th hash with ", "
local count = 0
s = s:gsub("#", function()
count = count + 1
return (count == 4) and ", " or "#"
end)
-- 2. Remove everything up to and including 2nd hash
count = 0
s = s:gsub("^[^#]*#[^#]*#", function(match)
count = 2
return ""
end, 1)
-- 3. Replace remaining # with spaces
s = s:gsub("#", " ")
-- Trim any excess spaces
s = s:gsub("^%s+", ""):gsub("%s+$", ""):gsub("%s%s+", " ")
return s
end
-- Simple test harness
function p.test()
local test_cases = {
"O'Brien & Sons",
"Smith & Wesson",
"A&B<C>D\"E'F",
"Rock 'n' Roll",
"Müller & Co.",
"Test_Page-1"
}
local results = {}
for _, input in ipairs(test_cases) do
local frame = { args = { input } }
local output = p.urlencode(frame)
table.insert(results, string.format("Input: %s -> Output: %s", input, output))
end
return table.concat(results, "\n")
end
return p