Module:Performances-devel

NOT FOR PRODUCTION USE!

This is the development version of Module:Performances, which you should use instead of this one.


Work in Progress!!

This module is intended to provide access to the list of Angelina's performances. At its heart is a very generic, configurable table generator that could also be used in other contexts.

Usage

Overview

{{#invoke: Performances | createTable       -- createTable is the generic table generator
  |page=Data:Performances.json              -- link to page with JSON data
  |supplements=Data:VideoMetaData.json;url  -- supplemental data sources
  |headers=Song,Date,Type,With,Video        -- List of titles used in the group section headers
  |keys=[[<song>]],{{d|<date>}},<type>,<with>,[<url> <song> - <event>] -- computed data items corresponding to those titles
  |sort=<date>                              -- computed sort keys for sorting outer table
  |sort1=<date>                             -- (optional): outer sortkey, same as inner if omitted
  |filters=date:2018,event:Kongsberg        -- filters to select what items are displayed
  |char_limit=7                             -- Number of characters of the sort key considered for grouping the sections
  |char_limit1=4                            -- (optional): outer char limit, no outer grouping if omitted
  |caption=Kongsberg 2018 by Month          -- Title displayed in the header of the outer table
  |group_sort=<date> ! <type> ! <pos>       -- how the group section sub-tables are sorted
  |id=1                                     -- Id needed if multiple tables are generated within a page
}}

Options

page: The page that contains the JSON-formatted input data.

supplements (optional): Comma-separated list of supplemental data source entries of the form "supplementalDataPage;keyName:charLimit;targetKeyName", with charLimit and targetKeyName being optional. Data from the supplemental source is matched on the keys (if the target key name is omitted, the same name is used). The charLimit option shortens the primary key value to that number of characters before matching. The target data may also be structured so that the matching key values are used as table keys themselves instead of being stored as the value of a "key" field.
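
For illustration, the supplements value from the last example on this page combines both kinds of entries:

  |supplements=Data:Songs.json;song;title,Data:VideoMetaData.json;url:43

Here the song value of each performance is matched against the title field of Data:Songs.json (adding fields such as song-type), and the url value, truncated to its first 43 characters, is matched against the url field of Data:VideoMetaData.json (adding fields such as url-title and url-channelName).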

headers: Comma-separated list of titles used in the group section headers.

keys: Comma-separated list of computed fields used for the columns in the group sections, defined by the corresponding headers. The number of entries should match the number of headers.

Key names representing data are enclosed in "<",">" brackets. Each field can reference multiple keys. Other text is still interpreted, e.g. as templates. Substitutions can be applied to a key's value by appending them, separated by colons. A pair of strings is interpreted as the search and replacement strings for a gsub() substitution with Lua pattern matching. The special keywords "toUpper", "toLower", and "toTitleCase" perform the corresponding conversions, and "limit:n" limits the result to n characters. Several substitutions can be concatenated, as illustrated below.
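
Some illustrative key fields with substitutions (the last one is taken from the examples further down; the others only show the syntax):

  <song:toUpper>                      -- the song name converted to upper case
  <date:limit:4>                      -- only the first 4 characters of the date, i.e. the year
  [<url> <url:.*www.: :.com.*: >]     -- a link labeled with just the core domain name: the parts up to "www." and from ".com" on are each replaced by a space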

sort: Computed sort field for sorting the outer table and for labeling the section names. Same syntax as for keys.

group (optional): Defines how to label the section names differently from the 'sort' definition.

char_limit (optional): The number of characters of the sort key considered for grouping the sections. This also affects the displayed section name if 'group' is not provided.
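
For example, with sort=<date> and char_limit=7, all performances from June 2018 fall into one section labeled "2018-06", as in the first example below.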

filters: Comma-separated list of filters that select which items are displayed. A filter is written as "keyname:value". Alternatives within a filter value can be separated with "/". Negations can be expressed by wrapping the value as "!(value)". See the illustrations below.
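
Two filter values taken from the examples further down:

  |filters=event:Allsang på Grensen/TV 2's Artist Gala    -- items whose event matches either alternative (OR)
  |filters=type:live,duration:!(fragment)                 -- live items whose duration does not contain "fragment" (filters combined as AND)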

sort1 (optional): Outer sort key. If omitted, the inner sort key is used.

group1 (optional): Defines how to label the outer section names differently from the 'sort1' definition.

char_limit1 (optional): The number of characters of the sort key considered for the outer grouping. This also affects the displayed outer section name if 'group1' is omitted. If char_limit1 itself is omitted, no outer grouping is performed.
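
In the live-events example further down, sort1=<date> together with char_limit1=4 groups the outer level by year (2017, 2018, …).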

id: A unique id for each table is needed if multiple tables are generated within a page, to provide a basis for unique section ids.

caption (optional): Title displayed in the header of the outer table

group_sort: Computed field that determines how the group section sub-tables are sorted. Same syntax as for keys.

Features

The generated table may have one or two levels of grouping and up to three levels of sorting. It has the following structure, with the outer grouping being optional (click the [n Items] buttons to reveal the collapsed inner tables):

Caption (provided as parameter)
1 (outer grouping level separator)
A (inner grouping level header) [2 Items]
B (inner grouping level header) [1 Item]
2 (outer grouping level separator)
C (inner grouping level header) [1 Item]

The inner tables are enclosed in

<div class="youtube-player-placeholder"> ... </div>

tags, which can optionally be used by JavaScript code contained in a gadget (MediaWiki:Gadget-embeddedYouTubePlayer.js) to attach an embedded YouTube player to each section, loaded with a playlist of all YouTube video links contained in that section. Clicking one of those links then also loads and starts the linked video in that player.

Each grouping separator/header sets an anchor id, so that the table rows can be jumped to with internal references, e.g. C.
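
For example, an internal link such as [[#C]] on the same page (or [[SomePage#C|C]] from another page; page name hypothetical) jumps directly to the group whose header is C.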

There is also code available that expands the inner table of a jump target when it is jumped to.

Use of Cargo queries for input

This was recently added as an alternative source for input. It supports:

  • Multiple Cargo tables with configurable fields
  • Nested child tables (1–3+ levels)
  • List field splitting (# or ,)
  • Optional pretty-printed JSON
  • Flexible configuration in JSON or Lua-like syntax

The getJSON() function generates nested JSON structures from Cargo tables. It is meant to verify that queries produce the right structures before using them in the createTable() function.

All table and field configuration is contained in a single parameter: cargo_query.

{{#invoke:CargoQueryTest|getJSON
 |cargo_query=<JSON or Lua-like table>
 |pretty=true
}}
  • cargo_query – required; configuration of tables, fields, nesting, and root table.
  • pretty – optional; if "true", outputs indented JSON (requires mw.text.JSON_PRETTY).

cargo_query Parameters

  • tables (array of strings): List of Cargo table names (CamelCase). Order does not determine the root.
  • fields (table): Map of tableName → comma-separated list of field names to return. Fields can be lowercase.
  • where (table): Map of tableName → Cargo WHERE clause for filtering rows. Empty string for no filter.
  • nest (array of tables): Each entry describes a parent-child nesting rule: { parent = "ParentTable", child = "ChildTable", parentKey = "ParentField", childKey = "ChildField", as = "ChildLabel" }
  • listFields (array of strings): Field names that should be returned as arrays (split on # or ,).
  • root (string): The Cargo table to use as the top-level JSON object.

Field Splitting Rules

  • # delimiter → splits Cargo lists into an array, preserves empty entries.
 Example: "Concert##US Concert" → ["Concert","","US Concert"]  
  • , delimiter → splits only if there are no spaces around the comma.
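 Example (illustrative): "Oslo,Bergen" → ["Oslo","Bergen"], while "Oslo, Norway" is kept as a single value.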

Example 1 — Two-level nesting (Performances → Videos)

{{#invoke:CargoQueryTest|getJSON
 |cargo_query={
   "tables": ["PerformancesDevel","VideosDevel"],
   "fields": {
     "PerformancesDevel": "song,event,context,date,type,pos,partners,comment,perfID",
     "VideosDevel": "perfID,url,duration,quality"
   },
   "where": {
     "PerformancesDevel": "",
     "VideosDevel": ""
   },
   "nest": [
     { "parent": "PerformancesDevel", "child": "VideosDevel", "parentKey": "perfID", "childKey": "perfID", "as": "videos" }
   ],
   "listFields": ["context","partners"],
   "root": "PerformancesDevel"
 }
 |pretty=true
}}

Lua-like table style (sanitized)

{{#invoke:CargoQueryTest|getJSON
 |cargo_query={
   tables = ['PerformancesDevel','VideosDevel'],
   fields = {
     PerformancesDevel = 'song,event,context,date,type,pos,partners,comment,perfID',
     VideosDevel = 'perfID,url,duration,quality'
   },
   nest = {
     { parent = 'PerformancesDevel', child = 'VideosDevel', parentKey = 'perfID', childKey = 'perfID', as = 'videos' }
   },
   listFields = ['context','partners'],
   root = 'PerformancesDevel'
 }
 |pretty=true
}}

Example 2 — Three-level nesting (Songs → Performances → Videos)

{{#invoke:CargoQueryTest|getJSON
 |cargo_query={
   tables = ['SongsDevel','PerformancesDevel','VideosDevel'],
   fields = {
     SongsDevel = 'songID,title,artist',
     PerformancesDevel = 'song,event,context,date,type,pos,partners,comment,perfID',
     VideosDevel = 'perfID,url,duration,quality'
   },
   nest = {
     { parent = 'SongsDevel', child = 'PerformancesDevel', parentKey = 'songID', childKey = 'song', as = 'performances' },
     { parent = 'PerformancesDevel', child = 'VideosDevel', parentKey = 'perfID', childKey = 'perfID', as = 'videos' }
   },
   listFields = ['context','partners'],
   root = 'SongsDevel'
 }
 |pretty=true
}}

Tips

  1. Always specify root in cargo_query to avoid ambiguity.
  2. Use listFields for any Cargo field that should become a Lua/JSON array.
  3. pretty=true is optional but recommended for debugging and readability.
  4. Lua-table style is allowed inline — single quotes ' are fine; the module sanitizes them into valid JSON.
  5. Multi-level nesting must specify both parent and child explicitly in each rule.
  6. Use the AJW:Cargo_query_test page to develop your query and check whether it delivers the expected JSON-equivalent structure.

ToDo

  • Make sorting more flexible, e.g. by applying regex substitutions to data values before they are sorted.
  • Add an option to not start collapsed (useful for small tables).
  • Implement access to sublists below keys.
  • Provide default values for empty fields of computed fields, e.g. <keyName=defaultValue>.
  • Provide a means to print additional data behind the sort field that does not affect the id anchor.

Examples

Multiple Filters, combined as AND

{{#invoke: Performances-devel | createTable
  |page=Data:Performances.json
  |headers=Song,Date,Type,With,Video
  |keys=[[<song>]],{{d|<date>}},<type>,<with>,[<url> <song> - <event>]
  |sort=<date>
  |filters=date:2018,event:Kongsberg
  |char_limit=7
  |caption=Kongsberg 2018 by Month
  |group_sort=<date> ! <type> ! <pos>
  |id=1
}}
Kongsberg 2018 by Month
2018-06 [12 Items]
2018-07 [22 Items]


Single Event with Supplemental Video Metadata

{{#invoke: Performances-devel | createTable
  |page=Data:Performances.json
  |supplements=Data:VideoMetaData.json;url:43
  |headers=Song,Date,Type,Video
  |keys=[[<song>]],{{d|<date>}},<type>,[<url> <url-title>; <url-User>]
  |sort=<event>
  |group_sort=<date> ! <type> ! <pos> ! <song>
  |filters=event:Bjerke
  |id=2
}}

Alternatives in Filters, combined as OR

{{#invoke: Performances-devel | createTable
  |page=Data:Performances.json
  |headers=Song,Date,Type,Comment
  |keys=[[#<song>|<song>]],{{d|<date>}}: <pos>,<type> <duration> [<url> play],<comment>; <with>
  |sort=[[<event>]]
  |caption=Repetitive Live Events
  |group_sort=<date> ! <type> ! <pos> ! <song>
  |filters=event:Allsang på Grensen/TV 2's Artist Gala
  |id=3
}}
Repetitive Live Events
Allsang på Grensen [7 Items]
TV 2's Artist Gala [2 Items]

Demonstrating Different Outer Sorting/Grouping and Negation in Filters

In the following example, the highest level (sort1) is sorted by year, the middle level (sort) by date, as is the inner table, which additionally uses secondary sort keys (type, pos, song). The separator "!" chosen for these has no special meaning, but it is the lowest character code above a space, which ensures that the intended sort order is applied.

"duration:!(fragment)" is used for filtering for those videos, that are not fragments.

We also apply substitutions to the computed "url" field:

{{#invoke: Performances-devel | createTable
  |page=Data:Performances.json
  |headers=Song,Date,Type,Comment,Video
  |keys=[[#<song>|<song>]],{{d|<date>}}; <pos>,<type> <duration>, <comment>; <with>,[<url> <url:.*www.: :.com.*: >]
  |sort=[[<event>]]
  |sort1=<date>
  |char_limit1=4
  |filters=type:live,duration:!(fragment)
  |caption=Live Events by Year, excluding fragments
  |group_sort=<date> ! <type> ! <pos> ! <song>
  |id=4
}}
Jump to… 2013 – 2014 – 2015 – 2016 – 2017 – 2018 – 2019 – 2020 – 2021 – 2022 – 2023 – 2024
Live Events by Year, excluding fragments
2017
Christmas concert tour 2017 [4 Items]
NRK Radio Interview 2017 [2 Items]
2018
Lyden av Norge, P4 Radio [1 Item]
Quincy Jones 85th birthday concert [1 Item]
2024
St. Pancras station, London [4 Items]
Jump to… 2013 – 2014 – 2015 – 2016 – 2017 – 2018 – 2019 – 2020 – 2021 – 2022 – 2023 – 2024

Demonstrating Same Type Sorting, with Different Grouping Levels, and Supplemental Data

Here, the different char_limits applied to the song sort key lead to two different grouping levels: by first character (outer) and by unique title (inner).

{{AtoZ}}
{{#invoke: Performances-devel | createTable
  |page=Data:Performances.json
  |supplements=Data:Songs.json;song;title,Data:VideoMetaData.json;url:43
  |headers=Event,Date,Type,Video,Pos,With,Comment
  |keys=[[#<event>|<event>]],{{d|<date>}},<type> <duration>,[<url> play <song-type>],<pos>,<with>,<comment>; <url-channelName>
  |sort=[[<song>]]
  |char_limit1=1
  |caption=Live Songs, excluding fragments
  |group_sort=<date> ! <type> ! <pos>
  |filters=duration:!(fragment),type:live
  |id=5
}}
{{AtoZ}}

A · B · C · D · E · F · G · H · I · J · K · L · M · N · O · P · Q · R · S · T · U · V · W · X · Y · Z

Live Songs, excluding fragments
A
All I Want for Christmas Is You [5 Items]
B
Back to Black [1 Item]
Bohemian Rhapsody [1 Item]
C
Crazy (Willie Nelson song) [1 Item]
D
Dream a Little Dream of Me [1 Item]
F
Fly Me to the Moon [1 Item]
L
Love Don't Let Me Go [1 Item]
W
When We Were Young [1 Item]

A · B · C · D · E · F · G · H · I · J · K · L · M · N · O · P · Q · R · S · T · U · V · W · X · Y · Z

Code

local p = {}

local mw_text = mw.text
local mw_title = mw.title -- Scribunto title library (available as mw.title, not via require)

-- Fetch JSON from a page
function p.fetchJSONFromPage(pageName)
    local title = mw_title.new(pageName)
    if not title then return nil, 'Invalid page name' end
    local content = title:getContent()
    if not content then return nil, 'Page not found or empty' end
    return mw_text.jsonDecode(content)
end

-- Limit key length
local function limitKeyLength(key, maxChars)
    if maxChars then return string.sub(key, 1, tonumber(maxChars)) end
    return key
end

-- Find match in secondary table
local function findInSecondary(secondaryData, key, secondaryFieldName)
    if type(secondaryData) == "table" then
        if secondaryData[key] then return secondaryData[key] end
        for _, item in pairs(secondaryData) do
            if item[secondaryFieldName] == key then return item end
        end
    end
    return nil
end

-- Merge secondary data into primary
local function enrichPrimaryWithSecondary(primaryData, secondaryPage, primaryFieldName, maxChars, secondaryFieldName)
    secondaryFieldName = secondaryFieldName or primaryFieldName
    local secondaryData, err = p.fetchJSONFromPage(secondaryPage)
    if not secondaryData then return nil, "Failed to load JSON from "..secondaryPage.." ("..err..")" end

    for _, item in ipairs(primaryData) do
        local primaryFieldValue = item[primaryFieldName]
        if primaryFieldValue then
            local limitedPrimaryFieldValue = limitKeyLength(primaryFieldValue, maxChars)
            local secondaryMatch = findInSecondary(secondaryData, limitedPrimaryFieldValue, secondaryFieldName)
            if secondaryMatch then
                for k, v in pairs(secondaryMatch) do
                    if k ~= secondaryFieldName then
                        local newKey = primaryFieldName.."-"..k
                        item[newKey] = v
                    end
                end
            end
        end
    end
    return primaryData
end

-- Simple filter
local function matchesFilter(value, pattern)
    local luaPattern = pattern:gsub("%%", "%%%%"):gsub("%*", ".*")
    local isGroupNegated = luaPattern:match("^%!%b()$")
    if isGroupNegated then luaPattern = luaPattern:sub(3,-2) end
    local alternatives = {}
    for alt in luaPattern:gmatch("[^/]+") do alt = alt:gsub("([%^%$%(%)%%%.%[%]%*%+%-%?])","%%%1"); table.insert(alternatives, alt) end
    local subMatch = false
    for _, alt in ipairs(alternatives) do
        if tostring(value):match(alt) then subMatch = true; break end
    end
    if (not isGroupNegated and subMatch) or (isGroupNegated and not subMatch) then return true end
    return false
end

-- Resolve paths
local INVISIBLE_SEPARATOR = " "
local function resolvePath(tbl, path)
    local current = tbl
    for segment in path:gmatch("[^.]+") do
        local key, index = segment:match("^(%w+)%[(%d+)%]$")
        if key then
            if current and current[key] then
                local idx = tonumber(index)
                if current[key][idx] ~= nil then current = current[key][idx] else return INVISIBLE_SEPARATOR end
            else return "" end
        else
            current = current and current[segment]
            if current==nil then return "" end
        end
    end
    return current
end

-- Compute field value
function p.computeField(item, formula)
    return (formula:gsub("(%b<>)", function(placeholder)
        local content = placeholder:sub(2,-2)
        local fieldPart, filterPart = content:match("([^|]+)|?(.*)")
        filterPart = filterPart ~= "" and filterPart or nil
        local parts = {}
        for part in fieldPart:gmatch("[^:]+") do table.insert(parts, part) end
        local fieldPath = parts[1]
        local value = resolvePath(item, fieldPath)
        value = tostring(value or "")
        if filterPart and not matchesFilter(value, filterPart) then return "" end
        if #parts>1 then
            for i=2,#parts,2 do
                local pattern = parts[i]
                local replacement = parts[i+1]
                if pattern=="toUpper" then value=value:upper()
                elseif pattern=="toLower" then value=value:lower()
                elseif pattern=="toTitleCase" then value=value:gsub("(%a)(%w*)",function(f,r) return f:upper()..r end)
                elseif pattern=="limit" and replacement then
                    local n=tonumber(replacement)
                    if n then value=mw.ustring.sub(value,1,n) end
                elseif replacement then
                    value=value:gsub(pattern,replacement)
                end
            end
        end
        return value
    end))
end

-- Split string
local function splitString(str, delimiter)
    local result={}
    for match in (str..delimiter):gmatch("(.-)"..delimiter) do table.insert(result,match) end
    return result
end

-- Filter data
function p.filterData(data, filters)
    local filteredData={}
    for _, item in ipairs(data) do
        local match=true
        for key, pattern in pairs(filters) do
            if item[key]==nil or not matchesFilter(item[key], pattern) then match=false; break end
        end
        if match then table.insert(filteredData,item) end
    end
    return filteredData
end

-- Sort data
function p.sortData(data, sortKeyFormula)
    table.sort(data, function(a,b)
        local aKey = p.computeField(a, sortKeyFormula):gsub('[^%w%?%!]', ''):lower():gsub('%?',"%#")
        local bKey = p.computeField(b, sortKeyFormula):gsub('[^%w%?%!]', ''):lower():gsub('%?',"%#")
        return tostring(aKey) < tostring(bKey)
    end)
    return data
end

-- Group data
function p.groupData(data, keyFormula, labelFormula, charLimit)
    local groupedData = {}
    local currentKey = nil
    for _, item in ipairs(data) do
        local keyValue = p.computeField(item, keyFormula)
        if charLimit and tonumber(charLimit) then
            keyValue = mw.ustring.sub(keyValue:gsub('[%(%)%[%]%{%}]',''),1,tonumber(charLimit))
        end
        local headerValue = labelFormula and p.computeField(item, labelFormula) or keyValue
        if keyValue ~= currentKey then
            currentKey = keyValue
            groupedData[#groupedData+1] = {key=keyValue, header=headerValue, items={}}
        end
        table.insert(groupedData[#groupedData].items, item)
    end
    return groupedData
end

-- Render table headers
function p.renderHeaders(headers)
    return '! ' .. table.concat(headers, ' !! ') .. '\n'
end

-- Render table rows
function p.renderRows(items, computedKeys)
    local rows = ''
    for _, item in ipairs(items) do
        rows = rows .. '|-\n'
        local row = {}
        for _, keyFormula in ipairs(computedKeys) do
            row[#row+1] = p.computeField(item, keyFormula) or ''
        end
        rows = rows .. '| ' .. table.concat(row, ' || ') .. '\n'
    end
    return rows
end

-- Render full table
function p.renderTable(data, headers, computedKeys, sortKeyFormula, groupLabelFormula, charLimit, sortKeyFormula1, groupLabelFormula1, charLimit1, groupSortKeyFormula, caption, tableId)
    local outerTable = '{| class="wikitable" style="width:100%; margin:0;"\n'
    outerTable = outerTable .. '|-\n! colspan="'..#headers..'" style="text-align:left;" | '..caption..'\n'
    local sortedData1 = p.sortData(data, sortKeyFormula1)
    local groupedData1 = p.groupData(sortedData1, sortKeyFormula1, groupLabelFormula1, charLimit1)
    for groupIndex1, group1 in ipairs(groupedData1) do
        local sortedData = p.sortData(group1.items, sortKeyFormula)
        local groupedData = p.groupData(sortedData, sortKeyFormula, groupLabelFormula, charLimit)
        outerTable = outerTable .. '|-\n| style="text-align:center;" | <span id="'..group1.key:gsub('[%[%]]','')..'">'..group1.header..'</span>\n'
        for groupIndex, group in ipairs(groupedData) do
            if groupSortKeyFormula then p.sortData(group.items, groupSortKeyFormula) end
            local groupId = tableId..'-'..groupIndex1..'-'..groupIndex
            local itemCount = #group.items
            local toggleLabel = '['..itemCount..' Item'..(itemCount>1 and 's' or '')..']'
            local toggleSpan = '<span class="mw-customtoggle-'..groupId..'" style="cursor:pointer; color:blue;">'..toggleLabel..'</span>'
            outerTable = outerTable .. '|-\n! colspan="'..#headers..'" style="background-color:#f5f5f5; text-align:left;" | <span id="'..group.key:gsub('[%[%]]','')..'">'..group.header..'</span> '..toggleSpan..'\n'
            outerTable = outerTable .. '<tr class="mw-collapsible mw-collapsed" id="mw-customcollapsible-'..groupId..'" style="display:none">\n'
            outerTable = outerTable .. '| <div class="youtube-player-placeholder">\n'
            outerTable = outerTable .. '{| class="wikitable sortable" style="width:100%; margin:0;"\n'
            outerTable = outerTable .. '|-\n'..p.renderHeaders(headers)
            outerTable = outerTable .. p.renderRows(group.items, computedKeys)
            outerTable = outerTable .. '|}\n</div>\n'
            outerTable = outerTable .. '</tr>\n'
        end
    end
    outerTable = outerTable .. '|}\n'
    return outerTable
end

-- Enrich primary data from supplemental sources (legacy or cargo query)
local function enrichFromSupplements(primaryData, supplements, supplements_query)
    -- Legacy supplements (JSON page)
    if supplements then
        local supplementsList = mw_text.split(supplements, ",")
        for _, supplement in ipairs(supplementsList) do
            local parts = mw_text.split(supplement,";")
            local secondaryPage = parts[1]
            local primaryFieldParts = mw_text.split(parts[2],":")
            local primaryFieldName = primaryFieldParts[1]
            local maxChars = primaryFieldParts[2]
            local secondaryFieldName = parts[3] or nil
            local enrichedData, err = enrichPrimaryWithSecondary(primaryData, secondaryPage, primaryFieldName, maxChars, secondaryFieldName)
            if not enrichedData then return nil, "Error: "..err end
        end
    end

    -- New supplements_query (Cargo query result, JSON string, or page name)
    if supplements_query then
        for _, supp in ipairs(supplements_query) do
            local secData = supp.query
            local err
            if type(secData) == "string" then
                -- try decode JSON first (covers #cargo_query output)
                local ok, tbl = pcall(mw.text.jsonDecode, secData)
                if ok and tbl then
                    secData = tbl
                else
                    -- fallback: treat as a page name (legacy)
                    secData, err = p.fetchJSONFromPage(secData)
                    if not secData then return nil, "Error: failed to fetch supplement page "..tostring(supp.query).." ("..err..")" end
                end
            elseif type(secData) ~= "table" then
                return nil, "Error: invalid supplement.query; expected table or JSON string or page name"
            end

            local primaryField = supp.primaryField
            local secondaryField = supp.secondaryField or primaryField
            local mode = supp.mode or "single"
            local prefix = supp.prefix or ""
            local fields = supp.fields or {}

            for _, item in ipairs(primaryData) do
                local keyValue = item[primaryField]
                if keyValue then
                    keyValue = limitKeyLength(keyValue, supp.maxChars)
                    for _, secItem in ipairs(secData) do
                        if secItem[secondaryField] == keyValue then
                            for _, f in ipairs(fields) do
                                if mode == "array" then
                                    item[prefix..f] = item[prefix..f] or {}
                                    table.insert(item[prefix..f], secItem[f])
                                else
                                    item[prefix..f] = secItem[f]
                                end
                            end
                        end
                    end
                end
            end
        end
    end

    return primaryData
end

-- Main function
function p.createTable(frame)
    -- Copy arguments into a fresh table; direct #invoke arguments override template arguments
    local args = {}
    for k,v in pairs(frame:getParent().args) do args[k]=v end
    for k,v in pairs(frame.args) do args[k]=v end

    local pageName = args['page']
    local dataArg = args['data']
    local cargo_query = args['cargo_query']
    local supplements = args['supplements']
    local supplements_query = args['supplements_query']

    if not args['headers'] or not args['keys'] or not args['sort'] then
        return 'Error: headers, keys, and sort arguments are required.'
    end

    local headers = mw_text.split(args['headers'],',')
    local computedKeys = mw_text.split(args['keys'],',')
    local sortKeyFormula = args['sort']
    local groupLabelFormula = args['group']
    local charLimit = args['char_limit']
    local sortKeyFormula1 = args['sort1'] or sortKeyFormula
    local groupLabelFormula1 = args['group1']
    local charLimit1 = args['char_limit1'] or 0
    local groupSortKeyFormula = args['group_sort']
    local caption = args['caption'] or ''
    local tableID = args['id']

    local filters={}
    local filterString=args['filters']
    if filterString then
        for key,pattern in filterString:gmatch('([^:]+):([^,]+),?') do filters[key]=pattern end
    end

    -- Determine primaryData from page, data, or cargo_query
    local primaryData
    if dataArg then
        local ok,tbl = pcall(mw.text.jsonDecode,dataArg)
        if ok and type(tbl)=="table" then primaryData=tbl else return "Error decoding data JSON" end
    elseif cargo_query then
        if type(cargo_query)=="string" then
            local ok,tbl=pcall(mw.text.jsonDecode,cargo_query)
            if ok and type(tbl)=="table" then primaryData=tbl
            else return "Error decoding cargo_query JSON"
            end
        elseif type(cargo_query)=="table" then
            primaryData=cargo_query
        else return "Error: cargo_query must be a table or JSON string" end
    elseif pageName then
        local err
        primaryData, err = p.fetchJSONFromPage(pageName)
        if not primaryData then return 'Error fetching JSON: '..err end
    else
        return 'Error: page, data, or cargo_query argument required.'
    end

    local err
    primaryData, err = enrichFromSupplements(primaryData, supplements, supplements_query)
    if not primaryData then return err end

    local data = p.filterData(primaryData, filters)
    -- Render table using your existing renderTable function...
    return p.renderTable(data, headers, computedKeys, sortKeyFormula, groupLabelFormula, charLimit, sortKeyFormula1, groupLabelFormula1, charLimit1, groupSortKeyFormula, caption, tableID)
end

return p