Defining multiarray DS json

Issue #3 resolved
Teemu Halmela created an issue

I'm trying to test db2sock with our system, but I have a problem defining JSON that contains DS arrays.

For example with a program like this.

     H AlwNull(*UsrCtl)

       dcl-ds itemDS qualified;
            field1 char(5);
            field2 char(5);
            field3 char(5);
            field4 char(5);
       end-ds;

       dcl-pr Main extpgm;
         rows zoned(5:0);
         items likeds(itemDS) dim(20);
         last char(10);
       end-pr;

       dcl-pi Main;
         rows zoned(5:0);
         items likeds(itemDS) dim(20);
         last char(10);
       end-pi;

         dcl-s i int(10);
         for i = 1 to rows;
           items(i).field4 = items(i).field1;
           items(i).field3 = items(i).field2;
         endfor;
         last = 'TEST';

       return;

This json doesn't crash and gives almost the right results.

{"pgm":[
    {"name":"TPGM",  "lib":"DB2JSON"},
    {"s": {"name":"rows", "type":"5s0", "value":2}},
    {"ds": [
        {"s":[
            {"name":"field1", "type":"5a", "value":"ff1"},
            {"name":"field2", "type":"5a", "value":"ff2"},
            {"name":"field3", "type":"5a", "value":""},
            {"name":"field4", "type":"5a", "value":""}
        ]},
        {"s":[
            {"name":"field1", "type":"5a", "value":"gg1"},
            {"name":"field2", "type":"5a", "value":"gg2"},
            {"name":"field3", "type":"5a", "value":""},
            {"name":"field4", "type":"5a", "value":""}
        ]}
    ]},
    {"s": {"name":"last", "type":"10a", "value":""}}
]}
output(201): {"script":[{"pgm":["TPGM","DB2JSON",{"rows":2},{"":{"field1":"TEST"},{"field2":{}},{"field3":"ff2"},{"field4":"ff1"},{"field1":"gg1"},{"field2":"gg2"},{"field3":"gg2"},{"field4":"gg1"}},{"last":{}}]}]}

The weird thing is that the value TEST goes into DS field1.
Should I define the ds name and dim somewhere? db2sock is still a work in progress, but is there currently a way to get this working?

I am testing this on IBM i 7.3 using the newest db2sock, built by hand.

Comments (66)

  1. Former user Account Deleted

    status ...

    This project is still under construction (see the overview page warning). However, cool, thanks for trying out the toolkit with the new db2 driver. I will make an effort to respond to your findings as quickly as possible (I do sleep, however).

    current work ..

    I am trying to work on another 'stand-alone' area of the driver (removing the old PASE libdb400.a). That is, we do not really have the base db2 driver fully tested yet.

    proposed methodology to work together ...

    Save time (mine and yours) ... this looks similar to another fix from a while ago ('fixed' is relative: hours, not days). I would like to put our joint efforts on the same binary level when you test. Can you please download the released (pre-compiled) versions on Yips?

  2. Former user Account Deleted

    BTW -- Thanks for the cut/paste of both the RPG and the JSON; this will be very helpful.

    outlook ...

    I forgot to add some 'forward looking' commentary (the db2sock quarterly/hourly stock report). As with any 'forward looking' statement, mine comes with the usual 'doesn't mean anything' legal disclaimer.

    So, my belief: if we (you and me) try a bunch of standard JSON parameter configuration combinations, eventually we will hit the 80% 'use case' for common RPG programs. I would encourage a focus on the likely cases for now (your example is likely), and we can try the exotic after we get the basics running.

  3. Former user Account Deleted

    So, I looked quickly at your test ... you have the json wrong ... it would look more like this (but dim is not working yet).

           dcl-pr Main extpgm;
             rows zoned(5:0);
             items likeds(itemDS) dim(20);
             last char(10);
           end-pr;
    
    {"pgm":[
        {"name":"HAMELA01",  "lib":"DB2JSON"},
        {"s": {"name":"rows", "type":"5s0", "value":2}},
        {"ds": [{"name":"items","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    

    In this case the 'ds' structure named 'items' has a dim(20) qualifier. The json above would set default values for all 20 items (if working correctly).

    items(1).field1 = 'ff1  ';
    items(1).field2 = 'ff2  ';
    items(1).field3 = '     ';
    items(1).field4 = '     ';
    
    items(2).field1 = 'ff1  ';
    items(2).field2 = 'ff2  ';
    items(2).field3 = '     ';
    items(2).field4 = '     ';
    :
    items(20).field1 = 'ff1  ';
    items(20).field2 = 'ff2  ';
    items(20).field3 = '     ';
    items(20).field4 = '     ';
    
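    The default-propagation rule described above can be sketched in Python. This is a hypothetical model of the behavior, not db2sock code; the function and variable names are illustrative only.

```python
# Hypothetical model of the 'ds' default-propagation rule: the single set
# of "s" field values acts as the default for EVERY element of the dim(N)
# array. Names here are illustrative, not db2sock internals.

def expand_ds(defaults, dim):
    """Replicate one record of default field values across all dim elements."""
    return [dict(defaults) for _ in range(dim)]  # independent copies

defaults = {"field1": "ff1", "field2": "ff2", "field3": "", "field4": ""}
items = expand_ds(defaults, 20)

print(len(items))            # 20 records, all with the same defaults
print(items[0]["field1"])    # ff1
print(items[19]["field2"])   # ff2
```

Note each record is an independent copy, so the called program can overwrite one element without disturbing the others.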

    Hypothetically (art of possible), your settings might look something like this ...

    Before we go on ... this whole idea is completely wrong for setting individual values in an input array ... but I am trying to lead you to the truth (you made an error).

    {"pgm":[
        {"name":"TPGM",  "lib":"DB2JSON"},
        {"s": {"name":"rows", "type":"5s0", "value":2}},
        {"ds": [{"name":"items","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]},
            {"s":[
                {"name":"field1", "type":"5a", "value":"gg1"},
                {"name":"field2", "type":"5a", "value":"gg2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    

    Wherein array elements 1 and 2 were described in your json, but the remaining elements would either 'continue' with the last default values ... or perhaps revert to *BLANKS as the default

    items(1).field1 = 'ff1  ';
    items(1).field2 = 'ff2  ';
    items(1).field3 = '     ';
    items(1).field4 = '     ';
    
    items(2).field1 = 'gg1  ';
    items(2).field2 = 'gg2  ';
    items(2).field3 = '     ';
    items(2).field4 = '     ';
    :
    perhaps carry last value forward (seems wrong)
    :
    items(20).field1 = 'gg1  ';
    items(20).field2 = 'gg2  ';
    items(20).field3 = '     ';
    items(20).field4 = '     ';
    :
    perhaps *BLANK (seems more correct)
    :
    items(20).field1 = '     ';
    items(20).field2 = '     ';
    items(20).field3 = '     ';
    items(20).field4 = '     ';
    
  4. Former user Account Deleted

    Mmm ... we are not far enough along to document the json interface. So, really, all you have is the few samples in the tests_c directory to run with test1000_sql400json32 (test1000_sql400json64).

    The json interface will be for toolkit writers mostly (php, node, ruby, python, etc.). However, you will also be able to call directly using json over some other protocol IPC (fastcgi, ILE CGI, socket, whatever).

    Are you a toolkit writer??? What language?

  5. Former user Account Deleted

    Ok, I have a fix for your issue.

    Yips Super Driver -- compiled with all ILE parts (DB2JSON *savf)

        libdb400-1.0.5-sg4.zip - (20170927 14:43)
            fix for ds dim(20) - hamela01
            added tests_c compiled tests to zip file (dir included)
            version test
    
            bash-4.3$ ./tests_c/test9999_driver_version32
            run (trace=on)
            version (1.0.5-sg4)
            success (0)
    

    changed your test just a little ..

    bash-4.3$ ./test1000_sql400json32 j0160_pgm_hamela01-ds
    input(4096):
    {"pgm":[
        {"name":"HAMELA01",  "lib":"DB2JSON"},
        {"s": {"name":"rows", "type":"5s0", "value":20}},
        {"ds": [{"name":"itemDS","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    
    output(1408):
    {"script":[{"pgm":["HAMELA01","DB2JSON",{"rows":20},
    {"itemDS":[
    {"field1":"a1"},{"field2":"b1"},{"field3":"c1"},{"field4":"d1"},
    {"field1":"a2"},{"field2":"b2"},{"field3":"c2"},{"field4":"d2"},
    {"field1":"a3"},{"field2":"b3"},{"field3":"c3"},{"field4":"d3"},
    {"field1":"a4"},{"field2":"b4"},{"field3":"c4"},{"field4":"d4"},
    {"field1":"a5"},{"field2":"b5"},{"field3":"c5"},{"field4":"d5"},
    {"field1":"a6"},{"field2":"b6"},{"field3":"c6"},{"field4":"d6"},
    {"field1":"a7"},{"field2":"b7"},{"field3":"c7"},{"field4":"d7"},
    {"field1":"a8"},{"field2":"b8"},{"field3":"c8"},{"field4":"d8"},
    {"field1":"a9"},{"field2":"b9"},{"field3":"c9"},{"field4":"d9"},
    {"field1":"a10"},{"field2":"b10"},{"field3":"c10"},{"field4":"d10"},
    {"field1":"a11"},{"field2":"b11"},{"field3":"c11"},{"field4":"d11"},
    {"field1":"a12"},{"field2":"b12"},{"field3":"c12"},{"field4":"d12"},
    {"field1":"a13"},{"field2":"b13"},{"field3":"c13"},{"field4":"d13"},
    {"field1":"a14"},{"field2":"b14"},{"field3":"c14"},{"field4":"d14"},
    {"field1":"a15"},{"field2":"b15"},{"field3":"c15"},{"field4":"d15"},
    {"field1":"a16"},{"field2":"b16"},{"field3":"c16"},{"field4":"d16"},
    {"field1":"a17"},{"field2":"b17"},{"field3":"c17"},{"field4":"d17"},
    {"field1":"a18"},{"field2":"b18"},{"field3":"c18"},{"field4":"d18"},
    {"field1":"a19"},{"field2":"b19"},{"field3":"c19"},{"field4":"d19"},
    {"field1":"a20"},{"field2":"b20"},{"field3":"c20"},{"field4":"d20"}]},
    {"last":"TEST"}]}]}
    
    result:
    success (0)
    

    The above is valid json returned, but I don't like the format. That is, the "ds" structure should probably return an array of array records to ease parsing on the client side. Another small thing to add to the list for the json parser and output controller (toolkit-parser-json).

    BTW -- you can replace the json parser with your own version and call toolkit-base on your own. However, again, the interface is not complete, so it's probably best to wait until I finish my default included project parsers (json, xml, etc.).

    bash-4.3$ cat ../tests_ILE-RPG/hamela01.rpgle 
         H AlwNull(*UsrCtl)
    
           dcl-ds itemDS qualified;
                field1 char(5);
                field2 char(5);
                field3 char(5);
                field4 char(5);
           end-ds;
    
           dcl-pr Main extpgm;
             rows zoned(5:0);
             items likeds(itemDS) dim(20);
             last char(10);
           end-pr;
    
           dcl-pi Main;
             rows zoned(5:0);
             items likeds(itemDS) dim(20);
             last char(10);
           end-pi;
    
             dcl-s i int(10);
             for i = 1 to rows;
               items(i).field1 = 'a' + %char(i);
               items(i).field2 = 'b' + %char(i);
               items(i).field3 = 'c' + %char(i);
               items(i).field4 = 'd' + %char(i);
             endfor;
             last = 'TEST';
    
           return;
    
  6. Former user Account Deleted

    FYI -- There is a lot more work to do with arrays. Specifically, like xmlservice, we will have to add the programming convention of MAX and COUNT to limit the json returned (a popular RPG convention).

    Something like the json below, where "max" determines the maximum array elements to return, and "many" will be set to the count returned (by RPG). Therein, only up to 10 rows are returned in this example ("enddo":"many"), with no json at all for array elements 11-20. (XMLSERVICE already works this way for the old toolkits.)

    {"pgm":[
        {"name":"HAMELA01",  "lib":"DB2JSON"},
        {"s": {"name":"max", "type":"5s0", "value":10}},
        {"s": {"name":"many", "type":"5s0", "value":0}},
        {"ds": [{"name":"itemDS","dim":20, "enddo":"many"},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
  7. Former user Account Deleted

    Ok, i added array of array of records for easier json client parsing.

    bash-4.3$ ./test1000_sql400json32 j0160_pgm_hamela01-ds
    input(4096):
    {"pgm":[
        {"name":"HAMELA01",  "lib":"DB2JSON"},
        {"s": {"name":"rows", "type":"5s0", "value":20}},
        {"ds": [{"name":"items","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    
    output(1447):
    {"script":[{"pgm":["HAMELA01","DB2JSON",{"rows":20},{"items":[
    [{"field1":"a1"},{"field2":"b1"},{"field3":"c1"},{"field4":"d1"}],
    [{"field1":"a2"},{"field2":"b2"},{"field3":"c2"},{"field4":"d2"}],
    [{"field1":"a3"},{"field2":"b3"},{"field3":"c3"},{"field4":"d3"}],
    [{"field1":"a4"},{"field2":"b4"},{"field3":"c4"},{"field4":"d4"}],
    [{"field1":"a5"},{"field2":"b5"},{"field3":"c5"},{"field4":"d5"}],
    [{"field1":"a6"},{"field2":"b6"},{"field3":"c6"},{"field4":"d6"}],
    [{"field1":"a7"},{"field2":"b7"},{"field3":"c7"},{"field4":"d7"}],
    [{"field1":"a8"},{"field2":"b8"},{"field3":"c8"},{"field4":"d8"}],
    [{"field1":"a9"},{"field2":"b9"},{"field3":"c9"},{"field4":"d9"}],
    [{"field1":"a10"},{"field2":"b10"},{"field3":"c10"},{"field4":"d10"}],
    [{"field1":"a11"},{"field2":"b11"},{"field3":"c11"},{"field4":"d11"}],
    [{"field1":"a12"},{"field2":"b12"},{"field3":"c12"},{"field4":"d12"}],
    [{"field1":"a13"},{"field2":"b13"},{"field3":"c13"},{"field4":"d13"}],
    [{"field1":"a14"},{"field2":"b14"},{"field3":"c14"},{"field4":"d14"}],
    [{"field1":"a15"},{"field2":"b15"},{"field3":"c15"},{"field4":"d15"}],
    [{"field1":"a16"},{"field2":"b16"},{"field3":"c16"},{"field4":"d16"}],
    [{"field1":"a17"},{"field2":"b17"},{"field3":"c17"},{"field4":"d17"}],
    [{"field1":"a18"},{"field2":"b18"},{"field3":"c18"},{"field4":"d18"}],
    [{"field1":"a19"},{"field2":"b19"},{"field3":"c19"},{"field4":"d19"}],
    [{"field1":"a20"},{"field2":"b20"},{"field3":"c20"},{"field4":"d20"}]]},
    {"last":"TEST"}]}]}
    
    result:
    success (0)
    
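    The array-of-array format above is straightforward to consume on the client side. A parsing sketch in Python (field names taken from the example output; the record-merging approach is my own, not part of db2sock):

```python
import json

# Client-side parsing sketch for the array-of-array 'ds' format shown
# above: each inner array is one record of single-key {name: value}
# pairs, which merge naturally into one dict per record.
# Only the first two records of the sample output are reproduced here.

output = json.loads('''
{"script":[{"pgm":["HAMELA01","DB2JSON",{"rows":20},{"items":[
 [{"field1":"a1"},{"field2":"b1"},{"field3":"c1"},{"field4":"d1"}],
 [{"field1":"a2"},{"field2":"b2"},{"field3":"c2"},{"field4":"d2"}]]},
 {"last":"TEST"}]}]}
''')

pgm = output["script"][0]["pgm"]
items = next(p["items"] for p in pgm if isinstance(p, dict) and "items" in p)

records = []
for rec in items:
    merged = {}
    for pair in rec:        # each pair is a one-key dict like {"field1":"a1"}
        merged.update(pair)
    records.append(merged)

print(records[0]["field1"])  # a1
print(records[1]["field4"])  # d2
```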
  8. Teemu Halmela reporter

    Thank you for your speedy help. Things are starting to look better.

    I'll try to make some more example programs that at least fit our needs. I'll move to our real programs once we get these smaller ones working. And if the performance is good, this could replace XMLSERVICE, which struggles with large arrays of data.

    The end goal for us is to call RPG programs from the PHP side, so I should be able to help with that.

  9. Teemu Halmela reporter

    There still seems to be something weird going on with the dim value, and I'm not sure if it is intended.

    I've modified the test program a little bit.

         H AlwNull(*UsrCtl)
    
           dcl-ds inputDS qualified;
                in1 varchar(5:2);
                in2 varchar(5:2);
           end-ds;
    
           dcl-ds outDS qualified;
                out1 varchar(5:2);
                out2 varchar(5:2);
                out3 varchar(10:2);
           end-ds;
    
           dcl-pr Main extpgm;
             inCount int(10);
             input likeds(inputDS) dim(20);
             outCount int(10);
             output likeds(outDS) dim(20);
             last char(10);
           end-pr;
    
           dcl-pi Main;
             inCount int(10);
             input likeds(inputDS) dim(20);
             outCount int(10);
             output likeds(outDS) dim(20);
             last char(10);
           end-pi;
    
             dcl-s i int(10);
             for i = 1 to inCount;
                output(i).out1 = input(i).in1;
                output(i).out2 = input(i).in2;
                output(i).out3 = input(i).in1 + input(i).in2;
             endfor;
             last = 'TEST';
             outCount = i - 1;
           return;
    

    With this json the results are correct, but the input array makes a horrible mess. It replicates our input to fill out the dim.

    input(1000):
    {"pgm":[
        {"name":"TPGM",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"input", "dim": 5},
            {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output", "dim":10},
            {"s":[ {"name":"out1", "type":"5av2", "value":""}, {"name":"out2", "type":"5av2", "value":""}, {"name":"out3", "type":"10av2", "value":""}]},
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    output(1189): {"script":[
        {"pgm":["TPGM","DB2JSON",
            {"inCount":5},
            {"input":[
                [{"in1":"a1"},{"in2":"a2"},{"in1":"b1"},{"in2":"b2"},{"in1":"c1"},{"in2":"c2"},{"in1":"d1"},{"in2":"d2"},{"in1":"e1"},{"in2":"e2"}],
                [{"in1":"a1"},{"in2":"a2"},{"in1":"b1"},{"in2":"b2"},{"in1":"c1"},{"in2":"c2"},{"in1":"d1"},{"in2":"d2"},{"in1":"e1"},{"in2":"e2"}],
                [{"in1":"a1"},{"in2":"a2"},{"in1":"b1"},{"in2":"b2"},{"in1":"c1"},{"in2":"c2"},{"in1":"d1"},{"in2":"d2"},{"in1":"e1"},{"in2":"e2"}],
                [{"in1":"a1"},{"in2":"a2"},{"in1":"b1"},{"in2":"b2"},{"in1":"c1"},{"in2":"c2"},{"in1":"d1"},{"in2":"d2"},{"in1":"e1"},{"in2":"e2"}],
                [{"in1":"a1"},{"in2":"a2"},{"in1":"b1"},{"in2":"b2"},{"in1":"c1"},{"in2":"c2"},{"in1":"d1"},{"in2":"d2"},{"in1":"e1"},{"in2":"e2"}]
            ]},
            {"outCount":5},
            {"output":[
                [{"out1":"a1"},{"out2":"a2"},{"out3":"a1a2"}],
                [{"out1":"b1"},{"out2":"b2"},{"out3":"b1b2"}],
                [{"out1":"c1"},{"out2":"c2"},{"out3":"c1c2"}],
                [{"out1":"d1"},{"out2":"d2"},{"out3":"d1d2"}],
                [{"out1":"e1"},{"out2":"e2"},{"out3":"e1e2"}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}]
            ]},
            {"last":"TEST"}
        ]}
    ]}
    

    But if we leave the dim out of the input (or set it to 1), the result looks more correct to me.

    input(990):
    {"pgm":[
        {"name":"TPGM",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"input"},
            {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output", "dim":10},
            {"s":[ {"name":"out1", "type":"5av2", "value":""}, {"name":"out2", "type":"5av2", "value":""}, {"name":"out3", "type":"10av2", "value":""}]},
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    output(659): {"script":[
        {"pgm":["TPGM","DB2JSON",
            {"inCount":5},
            {"input":[
                {"in1":"a1"},{"in2":"a2"},
                {"in1":"b1"},{"in2":"b2"},
                {"in1":"c1"},{"in2":"c2"},
                {"in1":"d1"},{"in2":"d2"},
                {"in1":"e1"},{"in2":"e2"}
            ]},
            {"outCount":5},
            {"output":[
                [{"out1":"a1"},{"out2":"a2"},{"out3":"a1a2"}],
                [{"out1":"b1"},{"out2":"b2"},{"out3":"b1b2"}],
                [{"out1":"c1"},{"out2":"c2"},{"out3":"c1c2"}],
                [{"out1":"d1"},{"out2":"d2"},{"out3":"d1d2"}],
                [{"out1":"e1"},{"out2":"e2"},{"out3":"e1e2"}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}],
                [{"out1":{}},{"out2":{}},{"out3":{}}]
            ]},
            {"last":"TEST"}
        ]}
    ]}
    

    If we leave dim out of both arrays, then the output isn't correct, which is probably what should happen.

    input(980):
    {"pgm":[
        {"name":"TPGM",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"input"},
            {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output"},
            {"s":[ {"name":"out1", "type":"5av2", "value":""}, {"name":"out2", "type":"5av2", "value":""}, {"name":"out3", "type":"10av2", "value":""}]},
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    output(283): {"script":[
        {"pgm":["TPGM","DB2JSON",
            {"inCount":5},
            {"input":[
                {"in1":"a1"},{"in2":"a2"},
                {"in1":"b1"},{"in2":"b2"},
                {"in1":"c1"},{"in2":"c2"},
                {"in1":"d1"},{"in2":"d2"},
                {"in1":"e1"},{"in2":"e2"}
            ]},
            {"outCount":5},
            {"output":[
                {"out1":"a1"},{"out2":"a2"},{"out3":"a1a2"}
            ]},
            {"last":"TEST"}
        ]}
    ]}
    
  10. Former user Account Deleted

    bottom line ..

    You are making an error: wrong thinking about what a 'ds' means.

    real life ...

    First, the 'parameter marshalling' design philosophy ... generally, dimensioned parameters like s dim(20) or ds dim(999) are used as output parameters. That is to say, most 'natural' human-designed RPG interfaces take single data values as input and dimensioned arrays of 's' and 'ds' as output. In fact, in RPG, ds.elem(i).value=4 becomes so tedious that if you design your 'input' parameter interfaces this way, your consumers/users will come looking for you with pitchforks, torches, and a long rope.

    Overstated (for visual effect), the only exception to this convention is languages like C++ that use objects as passed parameters. Wherein, of course, a C++ class is simply a fancy name for a data structure with a pointer to methods. Essentially, 'encapsulation' acted upon by this->method avoids the consumer/user revolt (it's a good thing ... but we RPG compiler guys know better).

    just 'tween us geeks ..

    So, while I will continue our discussion of 'how to set array values inside a ds', this will largely be an academic discussion between compiler theory geeks.

    Most important ... to the point, you have a fundamental misunderstanding of defining a data structure. Quickly: the 's' data elements of a 'ds' are only the default values for the whole dim(20) array. To wit, this follows the same compiler function/convention as 'traditional' RPG. Therefore, you can only set ONE default for each data element, as with inz(*BLANKS) or inz(0), and the default values of the first array element are 'propagated' to all of the 'ds' dim(20) elements (RPG compiler 101).

    This is correct ...

    {"pgm":[
        {"name":"HAMELA01",  "lib":"DB2JSON"},
        {"s": {"name":"rows", "type":"5s0", "value":20}},
        {"ds": [{"name":"items","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    

    This is incorrect ...

        {"ds": [{"name":"input", "dim": 5},
            {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]}
        ]},
    

    fantasy land we go ...

    So, basically, you are proposing a json language with no calculation section. To wit, with just 'ds' and 's', how can I program in json (no rudeness intended, just geek delight)??

    We can of course use json to set interior element values of an array of 'ds', but it requires a bit of creative compiler thinking. All we have to 'know' is that RPG is a 'packed' language. That is, all the elements of an array are 'squished' back-to-back, with no gaps. Therefore we can simply surround any given 'ds' with another 'ds' and place the 's' elements at any given location.

    Wherein this correct defaults setting (common) ...

        {"ds": [{"name":"items","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
    

    Can become the 'set the 11th element value' idea (your question) ...

      {"ds": [{"name":"hack",},
        {"ds": [{"name":"items","dim":10},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"s": [
              {"name":"field1", "type":"5a", "value":"bob"},
              {"name":"field2", "type":"5a", "value":"was"},
              {"name":"field3", "type":"5a", "value":"here"},
              {"name":"field4", "type":"5a", "value":"mary"}
        ]},
        {"ds": [{"name":"items","dim":9},
            {"s":[
                {"name":"field1", "type":"5a", "value":"gg1"},
                {"name":"field2", "type":"5a", "value":"gg2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
      ]},
    
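    The 'packed storage' argument above can be checked with a little arithmetic: a wrapper of ds dim(10), one loose record, and ds dim(9) occupies exactly the same 20 record slots as ds dim(20), with the loose record landing in slot 11. A Python sketch of that slot accounting (record size and segment names are illustrative, not db2sock values):

```python
# Sketch of the packed-layout trick above: RPG array elements sit
# back-to-back, so dim(10) + 1 loose record + dim(9) fills the same
# 20 slots as dim(20), with the loose record in slot 11.

REC_SIZE = 20                       # 4 fields x char(5), illustrative

segments = [("items", 10), ("loose", 1), ("items", 9)]

slot = 1
layout = {}
for name, dim in segments:
    for _ in range(dim):
        layout[slot] = name         # which segment owns this record slot
        slot += 1

print(len(layout))                  # 20 slots total, same as dim(20)
print(layout[11])                   # loose  -> the overridden element
print((11 - 1) * REC_SIZE)          # 200    -> byte offset of the override
```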

    and another ...

    As json syntax and beauty is in the eye of the beholder ... we could also offer another RPG-like 'ds' idea, like 'overlay'.

        {"ds": [{"name":"items","dim":20},
            {"s":[
                {"name":"field1", "type":"5a", "value":"ff1"},
                {"name":"field2", "type":"5a", "value":"ff2"},
                {"name":"field3", "type":"5a", "value":""},
                {"name":"field4", "type":"5a", "value":""}
            ]}
        ]},
        {"overlay": [{"where":"items","rec":11},
                [{"name":"field1", "type":"5a", "value":"bob"},
                 {"name":"field2", "type":"5a", "value":"was"},
                 {"name":"field3", "type":"5a", "value":"here"},
                 {"name":"field4", "type":"5a", "value":"mary"}]
        ]},
    

    but not what you did ...

    Sorry, but this is just wrong thinking ...

           dcl-ds inputDS qualified;
                in1 varchar(5:2);
                in2 varchar(5:2);
           end-ds;
    
           dcl-pr Main extpgm;
             inCount int(10);
             input likeds(inputDS) dim(20);
             outCount int(10);
             output likeds(outDS) dim(20);
             last char(10);
           end-pr;
    
           dcl-pi Main;
             :
             input likeds(inputDS) dim(20);
    
        {"ds": [{"name":"input", "dim": 5},
            {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]}
        ]},
    

    BTW -- Logic error. You MUST (repeat, MUST) declare dim(20), NOT dim(5). I will 'teach' why in a later post. For now, think RPG: never pass a smaller 'aggregate' dim(5) into a bigger aggregate dim(20); it leaves junk in the elements past 5 (actually, a machine exception is likely). In the json case, dim(20) starts eating the next parameters' data to fill out the entire 20 past 5 (BOOM, weird result json).

    apologies

    Again, I have not had time to document the json.

  11. Former user Account Deleted

    I have the heart of a teacher. I don't mind an open debate. I may even be wrong at times, but not this time, I think. So, I trust you now understand why you can't just list individual elements inside a dimensioned structure ('ds' is the wrong thing). Repeat (boring): the 's' elements of a 'ds' simply set the initial values of all elements in the 'entire' array. To wit, when you added more elements inside your 'ds' dim(20), you actually got a much larger array, aka many more elements in each record of the 'ds'. Anyway, if you doubt my logic, please feel free to attempt this in RPG using only 'ds' structures ... you will end up with ds inside ds, or overlay, like my json example (trust me).

  12. Former user Account Deleted

    Side ... please continue to try new things. The exploration of ideas, accepted or not, often breeds innovation.

    However, as said before, we are really trying to focus on the 'practical' to catch the 80% RPG 'use case'. More exotic things like 'overlay' and setting individual array elements as parameter input I consider generally less likely in 'real life' (always exceptions, of course).

  13. Former user Account Deleted

    Oh boy, you also have another logic error. (Man, I really hate being the bad guy).

    In the following code snip, you have a structure 'inputDS', which is declared as the parameter input likeds(inputDS) dim(20);

           dcl-ds inputDS qualified;
                in1 varchar(5:2);
                in2 varchar(5:2);
           end-ds;
    
           dcl-pr Main extpgm;
             inCount int(10);
             input likeds(inputDS) dim(20);
             outCount int(10);
             output likeds(outDS) dim(20);
             last char(10);
           end-pr;
    

    You MUST account for all 20 elements in the input json. In your json you truncated to dim(5). This means RPG will start eating the other parameters to fill out the 20.

    This is right (the only right) ...

    {"ds": [{"name":"input", "dim": 20},
    

    This is wrong ...

    {"ds": [{"name":"input", "dim": 5},
    

    This is a logic error. You simply cannot 'truncate' dim(20) to dim(5) in a passed parameter.

    Why? Basically, your compiled RPG program is expecting input likeds(inputDS) dim(20). When your json only passed dim(5), all the values past 5 took whatever was in the memory locations following the 5 provided. In the case of the toolkit, this means your RPG program started 'eating' past 'input' and took values from the next parameters: outCount, output, last. That is, everything shifted in your program because you made an error declaring the dimension size of the arrays (made a boo boo).

    In theory, you could try this 'truncate' to dim(5) in RPG, but most likely you get a big fat exception when you touch past the first 5 elements. On IBM i (RPG heap management), pass by reference will not 'eat' into the next parameters (like the json sample here); instead you will simply walk off the end of the allocation (boom, exception). However, the IBM i platform is unique in this heap aspect; on every other platform you will simply start writing into the next memory location (aka, like our json toolkit).

    In json, all we have is 'weird errors' at runtime ... because ... well ... garbage in leads to garbage out. Actually, you are lucky your program did not blow up at runtime.
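    The 'eating' effect can be simulated with a flat byte buffer (a simplified model of the toolkit's parameter layout; the record size is hypothetical, not the real inputDS size):

```python
# simulate the flat parameter buffer: 5 records are provided, but the callee
# believes the array is dim(20), so it reads past them into the next parameter
REC = 10                                                        # hypothetical record size
records = [("rec%d" % i).encode().ljust(REC) for i in range(1, 6)]  # only dim(5) provided
last = b"TEST".ljust(10)            # the next parameter in memory
buf = b"".join(records) + last      # caller's layout: 5 records, then 'last'

# element 6 starts at offset 5*REC, which is exactly where 'last' lives --
# the oversized array 'eats' the next parameter's data
element6 = buf[5 * REC : 6 * REC]
```

This is why 'TEST' showed up inside the DS fields in the original problem report.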

  14. Former user Account Deleted

    If you are using PHP, you would be much better off using the current PHP toolkit with xmlservice. xmlservice understands all the correct rules, and the php toolkit provides an abstraction that is less likely to lead to errors in logic (although the same truncation logic-error rules apply).

    So, if you are simply among the 'go faster than xmlservice' crowd, you may want to wait until a php toolkit for this new interface is created.

    Just a thought ...

  15. Former user Account Deleted

    So one more 'lesson', well, assuming you are admirably irrepressible and will keep going (damn the torpedoes).

    Basically, we have two kinds of parameter geometries in RPG (any ILE language). The most common is pass by reference. This means a pointer to the data element 's' or the data structure 'ds'.

           dcl-pr Main extpgm;
             inCount int(10);
             input likeds(inputDS) dim(20);
             outCount int(10);
             output likeds(outDS) dim(20);
             last char(10);
           end-pr;
    
    argv[]
    pointer -> inCount int(10);
    pointer -> input likeds(inputDS) dim(20);
    pointer -> outCount int(10);
    pointer -> output likeds(outDS) dim(20);
    pointer -> last char(10);
    

    Therefore, in your quest to set the array elements of inputDS, you must be careful not to introduce an additional pointer (shifting the parameters).

    ok ... pointer to one ds 'input'
    
    {"ds": [{"name":"input", "dim": 20},
            {"s":[ 
                     {"name":"in1", "type":"5av2", "value":"a1"},
                     {"name":"in2", "type":"5av2", "value":"a2"}
            ]},
    ]},
    
    ok ... pointer to input (variations of 'ds' and 's' allowed within outer 'ds')
    {"ds": [{"name":"input"},
      {"ds": [{"name":"input1", "dim": 10},
            {"s":[ 
                     {"name":"in1", "type":"5av2", "value":"a1"},
                     {"name":"in2", "type":"5av2", "value":"a2"}
            ]},
      ]},
      {"ds": [{"name":"input2", "dim": 10},
            {"s":[ 
                     {"name":"in1", "type":"5av2", "value":"b1"},
                     {"name":"in2", "type":"5av2", "value":"b1"}
            ]},
      ]},
    ]},
    
    this BAD ...
    bad ... pointer to input1
    bad ... pointer to input2
    
      {"ds": [{"name":"input1", "dim": 10},
            {"s":[ 
                     {"name":"in1", "type":"5av2", "value":"a1"},
                     {"name":"in2", "type":"5av2", "value":"a2"}
            ]},
      ]},
      {"ds": [{"name":"input2", "dim": 10},
            {"s":[ 
                     {"name":"in1", "type":"5av2", "value":"b1"},
                     {"name":"in2", "type":"5av2", "value":"b1"}
            ]},
      ]},
    
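    The pointer-counting rule can be sketched as a tiny walker over the parameter list (a simplified model, not the toolkit's actual parser): the 'ok' form yields one argv pointer for 'input', while the 'bad' form yields two, shifting every parameter after it.

```python
# count argv pointers produced by a top-level parameter list: each element of a
# top-level 's' is one pointer; each top-level 'ds' is exactly one pointer, no
# matter how it is subdivided inside (simplified model of the toolkit rule)
def argv_count(parms):
    n = 0
    for p in parms:
        if "s" in p:
            s = p["s"]
            n += len(s) if isinstance(s, list) else 1
        elif "ds" in p:
            n += 1  # nested 'ds'/'s' inside do NOT add pointers
    return n

ok = [{"ds": [{"name": "input"}]}]                                  # one pointer
bad = [{"ds": [{"name": "input1"}]}, {"ds": [{"name": "input2"}]}]  # two pointers
```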

    The other type of parameter is pass by value. I have NOT implemented by value in the ILE procedure call yet (yes, I know how). So let's forget 'value' for the moment, as it is fairly rare in RPG programming (only real geeks).

           dcl-pr Main extpgm;
             inCount int(10) value;
             input likeds(inputDS) dim(20);
             outCount int(10) value;
             output likeds(outDS) dim(20);
             last char(10);
           end-pr;
    
    argv[]
    value -> inCount int(10) value; (in a register, not memory)
    pointer -> input likeds(inputDS) dim(20);
    value -> outCount int(10) value;  (in a register, not memory)
    pointer -> output likeds(outDS) dim(20);
    pointer -> last char(10);
    
  16. Former user Account Deleted

    Alrighty then ... I just spent a half day explaining parameter marshalling with ILE RPG samples (*). The concepts are correct. I may have made a finger error here occasionally, but I did a fair job of teaching compiler technology (in ten paragraphs or less).

    Happy experimentation.

    (*) BTW -- ILE C programs use a concept known as natural alignment for data, so unlike RPG 'packed' structures, C structures can have holes in the layout geometry. If you do not understand 'natural alignment', please stick to RPG for your examples in ILE.
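    The difference can be seen with Python's struct module, comparing standard (packed) sizes against native alignment for a 1-byte char followed by a 4-byte int (exact native size is platform dependent, so only a lower bound is claimed):

```python
import struct

# 'RPG-style' packed layout: a char then an int, back to back, no padding
packed = struct.calcsize("=ci")   # standard sizes, no alignment: 1 + 4 = 5

# natural alignment: the int is padded out to its alignment boundary,
# leaving a 'hole' after the char (typically 8 bytes total on common platforms)
natural = struct.calcsize("@ci")
```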

  17. Former user Account Deleted

    Here is your same test with corrected input json (common format).

    bash-4.3$ ./test1000_sql400json32 j0170_pgm_hamela02-ds
    input(4096):
    {"pgm":[
        {"name":"HAMELA02",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"inputDS","dim":20},
            {"s":[
                {"name":"in1", "type":"5av2", "value":"i1"},
                {"name":"in2", "type":"5av2", "value":"i2"}
            ]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"outDS","dim":20},
            {"s":[
                {"name":"out1", "type":"5av2", "value":"o1"},
                {"name":"out2", "type":"5av2", "value":"o2"},
                {"name":"out3", "type":"10av2", "value":"o3"}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":"ll"}}
    ]}
    
    
    output(1564):
    {"script":[{"pgm":["HAMELA02","DB2JSON",
    {"inCount":5},
    {"inputDS":[
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}]
    ]},
    {"outCount":5},
    {"outDS":[
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}]
    ]},
    {"last":"TEST"}]}]}
    
    result:
    success (0)
    

    The noticeable RPG changes, like "out3":"i1i2", are only in the first 5 elements, per "inCount":5. It worked.

    BTW -- As you can see, we need the "enddo":"outCount" convention to avoid returning records that were not changed in the output (if desired).

  18. Former user Account Deleted

    Warning ... I have a problem with nested 'ds' structures. I will fix it in the next release. However, until it is fixed, you will not be able to try any fancy 'ds' within 'ds' workaround for your set-array-values experiment (aka, no exotic ds work ... yet).

  19. Former user Account Deleted

    Ok, fixed the basic function of 'ds' within 'ds'. You can find a new 1.0.5-sg6 binary on yips.

    this is correct ...

    Here is a test that does the 'exotic' set-input-array-elements trick with json. Specifically, elements 1-2 will be set to the defaults "i1"/"i2", elements 3-5 will be set to "bob"/"was", "here"/"mary", "hello"/"alan", and elements 6-20 will be set to the defaults "i1"/"i2".
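    The element coverage can be checked with a quick sketch (records modeled as (in1, in2) tuples; a paraphrase of the nested-'ds' layout, not toolkit code):

```python
# the three pieces of the nested 'ds' must cover all 20 elements of dim(20)
head   = [("i1", "i2")] * 2                                     # inputDS1, dim(2)
middle = [("bob", "was"), ("here", "mary"), ("hello", "alan")]  # explicit records 3-5
tail   = [("i1", "i2")] * 15                                    # inputDS2, dim(15)
records = head + middle + tail
```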

    bash-4.3$ ./test1000_sql400json32 j0171_pgm_hamela02-ds-set_input_array
    input(512000):
    {"pgm":[
        {"name":"HAMELA02",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"inputDS"},
          {"ds": [{"name":"inputDS1","dim":2},
            {"s":[
                {"name":"in1", "type":"5av2", "value":"i1"},
                {"name":"in2", "type":"5av2", "value":"i2"}
            ]}
          ]},
          {"s":[
                {"name":"in1_3", "type":"5av2", "value":"bob"},
                {"name":"in1_3", "type":"5av2", "value":"was"}
          ]},
          {"s":[
                {"name":"in1_4", "type":"5av2", "value":"here"},
                {"name":"in1_4", "type":"5av2", "value":"mary"}
          ]},
          {"s":[
                {"name":"in1_5", "type":"5av2", "value":"hello"},
                {"name":"in1_5", "type":"5av2", "value":"alan"}
          ]},
          {"ds": [{"name":"inputDS2","dim":15},
            {"s":[
                {"name":"in1", "type":"5av2", "value":"i1"},
                {"name":"in2", "type":"5av2", "value":"i2"}
            ]}
          ]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"outDS","dim":20},
            {"s":[
                {"name":"out1", "type":"5av2", "value":"o1"},
                {"name":"out2", "type":"5av2", "value":"o2"},
                {"name":"out3", "type":"10av2", "value":"o3"}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":"ll"}}
    ]}
    
    
    output(1633):
    {"script":[{"pgm":["HAMELA02","DB2JSON",
    {"inCount":5},
    {"inputDS":[
    {"inputDS1":[[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}]]},
    {"in1_3":"bob"},{"in1_3":"was"},
    {"in1_4":"here"},{"in1_4":"mary"},
    {"in1_5":"hello"},{"in1_5":"alan"},
    {"inputDS2":[
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
    [{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}]
    ]}]},
    {"outCount":5},
    {"outDS":[
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
    [{"out1":"bob"},{"out2":"was"},{"out3":"bobwas"}],
    [{"out1":"here"},{"out2":"mary"},{"out3":"heremary"}],
    [{"out1":"hello"},{"out2":"alan"},{"out3":"helloalan"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
    [{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}]
    ]},
    {"last":"TEST"}]}]}
    
    result:
    success (0)
    

    this is not correct ...

    In case you still have doubt, the wrong thinking appears below (my opinion). That is to say, just because a name in a 'ds' structure appears more than once does not mean 'moving' to the next record. Aka, "name":"in1" appears 5 times in ONE record of 'ds' "name":"input" below, as does "name":"in2". To wit, this is just one big 'ds' record of 10 's' elements, dim(5), that happens to re-use the names "in1" and "in2" for elements. In fact, the RPG compiler would complain, but json couldn't care less about "name" until parsing on the client (or other exotic action like enddo). Also, as mentioned, an RPG program expecting input dim(20) will not work with a "dim":5 truncation. To wit, RPG is expecting 20, and RPG gets what RPG wants, or you get really weird results (the weird result in your original problem description is exactly what would happen).

    {"ds": [{"name":"input", "dim": 5},
    
    {"s":[{"name":"in1", "type":"5av2", "value":"a1"},
    {"name":"in2", "type":"5av2", "value":"a2"}]},
    
    {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, 
    {"name":"in2", "type":"5av2", "value":"b2"}]},
    
    {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, 
    {"name":"in2", "type":"5av2", "value":"c2"}]},
    
    {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, 
    {"name":"in2", "type":"5av2", "value":"d2"}]},
    
    {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, 
    {"name":"in2", "type":"5av2", "value":"e2"}]}
    
    ]},
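    To spell out why the layout above is one record, not five (a plain-Python paraphrase, not toolkit code): the repeated 's' groups concatenate into a single larger record layout.

```python
# five repeated 's' groups inside one 'ds' do not advance to the next record;
# they simply concatenate into ONE larger record layout with re-used field names
groups = [["in1", "in2"] for _ in range(5)]
one_record = [name for group in groups for name in group]
# result: one record with ten fields, "in1"/"in2" appearing five times each
```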
    

    (*) I understand you can make an argument that 'move to next element' occurs when the same name is duplicated. I will not accept this design idea, because it is much too error prone in my opinion. However, as I mentioned, you can build your own json parser and call toolkit-base any way you like, with any json syntax you choose (to infinity and beyond ... but not in my parser).

  20. Teemu Halmela reporter

    Thank you again for the in-depth explanations. Almost understood it all 😃.

    I was a little bit confused about why RPG was working when I gave it dim(5), because, like you said, RPG is expecting a dim(20) array but only got 5, so the rest of the parameters should be out of sync. But the output was correct, so I went with it to decrease the output size (I was lazy).

    We unfortunately have many programs that take input arrays, so getting this to work is beneficial. But I think this will work, at least for experimenting.

    {"pgm":[
        {"name":"TPGM",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"input"},
            {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
            {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]},
            {"ds": [{"name": "inputEmpty", "dim":15},
                {"s":[ {"name":"in1", "type":"5av2", "value":""}, {"name":"in2", "type":"5av2", "value":""}]}
            ]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output", "dim":20},
        {"s":[ {"name":"out1", "type":"5av2", "value":""}, {"name":"out2", "type":"5av2", "value":""}, {"name":"out3", "type":"10av2", "value":""}]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
  21. Teemu Halmela reporter

    Bear with me, I have more horrible examples.

    So there is this program. These are actual parameters from an existing program (names have been changed, and Arr2 is actually 500 long). I also wanted to verify there is no funny business going on when using non-free RPG, as all of our programs are like that. In this example, Arr2 is used as an input/output array.
    The good thing is this seems to work properly, except the €-char is missing, but I don't know if RPG is supposed to handle that character. Things only start to break horribly if we change Arr2 to Occurs(500).

         D Parm1           S              1A             
         D Parm2           S             18A             
         D Parm3           S              2A             
         D Parm4           S             10A             
         D Arr1Count       S              3S 0           
         D Arr1            DS                  Occurs(10)
         D  Arr1P1                        7A             
         D  Arr1P2                      132A             
         D  Arr1P3                       30A             
         D  Arr1P4                        1S 0           
         D Parm5           S              1A             
         D Parm6           S              4A             
         D Parm7           S             18A             
         D Parm8           S            100A             
         D Parm9           S              3S 0           
         D Parm10          S              3S 0           
         D Arr2Count       S              5S 0           
         D Arr2            DS                  Occurs(10)
         D  Arr2P1                        1A             
         D  Arr2P2                       30A                  
         D  Arr2P3                        4A                  
         D  Arr2P4                       18A                  
         D  Arr2P5                      100A                  
         D Parm11          S             30A                  
    
         D i               S              5P 0                
    
         C     *Entry        Plist                            
         C                   Parm                    Parm1    
         C                   Parm                    Parm2    
         C                   Parm                    Parm3    
         C                   Parm                    Parm4    
         C                   Parm                    Arr1Count
         C                   Parm                    Arr1     
         C                   Parm                    Parm5    
         C                   Parm                    Parm6    
         C                   Parm                    Parm7    
         C                   Parm                    Parm8    
         C                   Parm                    Parm9    
         C                   Parm                    Parm10               
         C                   Parm                    Arr2Count            
         C                   Parm                    Arr2                 
         C                   Parm                    Parm11               
    
         C                   For       i             = 1 To Arr2Count By 1
         C                   Eval      %occur(Arr2)  = i                  
         C                   Eval      Arr2P5        = %trim(Arr2P1) +    
         C                                             %trim(Arr2P2) +    
         C                                             %trim(Arr2P3)      
         C                   Endfor                                       
         C                                                                
         C                   Eval      Parm11        = %trim(Parm2)       
         C                                                                
         C                   Eval      %occur(Arr1)  = 1                  
         C                   Eval      Arr1P2        = %trim(Parm8)
         C                   Eval      Arr1Count     = 1           
         C                                                         
         C                   Eval      *Inlr         = '1'         
         C                   Return                                
    
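    For reference, the fixed-format loop above boils down to this (a plain-Python paraphrase, not part of the toolkit): for each of the first Arr2Count records, Arr2P5 = %trim(Arr2P1) + %trim(Arr2P2) + %trim(Arr2P3).

```python
# paraphrase of the fixed-format loop: trim and concatenate P1..P3 into P5
# for the first 'count' records (fixed-length blank padding is ignored here)
def build_p5(arr2, count):
    return [p1.strip() + p2.strip() + p3.strip() for (p1, p2, p3) in arr2[:count]]

out = build_p5([("a", "A" * 30, "ap3 "), ("b", "B" * 30, "bp3 ")], 2)
```

This matches the "Arr2P5":"aAAAA...ap3" values visible in the output below.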
    input(2591):
    {"pgm":[
        {"name":"TPGM2",  "lib":"DB2JSON"},
        {"s": [
            {"name":"Parm1", "type":"1a", "value":""},
            {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
            {"name":"Parm3", "type":"2a", "value":""},
            {"name":"Parm4", "type":"10a", "value":""},
            {"name":"Arr1Count", "type":"3s0", "value":0}
        ]},
        {"ds": [{"name":"Arr1", "dim":10},
            {"s":[
                {"name":"Arr1P1", "type":"7a", "value":""},
                {"name":"Arr1P2", "type":"132a", "value":""},
                {"name":"Arr1P3", "type":"30a", "value":""},
                {"name":"Arr1P4", "type":"1s0", "value":0}
            ]}
        ]},
        {"s": [
            {"name":"Parm5", "type":"1a", "value":""},
            {"name":"Parm6", "type":"4a", "value":""},
            {"name":"Parm7", "type":"18a", "value":""},
            {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
            {"name":"Parm9", "type":"3s0", "value":0},
            {"name":"Parm10", "type":"3s0", "value":0}
        ]},
        {"s": {"name":"Arr2Count", "type":"5s0", "value":3}},
        {"ds": [{"name":"Arr2"},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"a"},
                {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
                {"name":"Arr2P3", "type":"4a", "value":"ap3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"b"},
                {"name":"Arr2P2", "type":"30a", "value":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
                {"name":"Arr2P3", "type":"4a", "value":"bp3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"c"},
                {"name":"Arr2P2", "type":"30a", "value":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
                {"name":"Arr2P3", "type":"4a", "value":"cp3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]},
            {"ds": [{"name":"Arr2Empty", "dim": 7},
                {"s":[
                    {"name":"Arr2P1", "type":"1a", "value":""},
                    {"name":"Arr2P2", "type":"30a", "value":""},
                    {"name":"Arr2P3", "type":"4a", "value":""},
                    {"name":"Arr2P4", "type":"18a", "value":""},
                    {"name":"Arr2P5", "type":"100a", "value":""}
                ]}
            ]}
        ]},
        {"s": {"name":"Parm11", "type":"30a", "value":""}}
    ]}
    
    output(1863): {"script":[
        {"pgm":["TPGM2","DB2JSON",
            {"Parm1":{}},
            {"Parm2":"äöÄÖåÅáÁà"},
            {"Parm3":{}},
            {"Parm4":{}},
            {"Arr1Count":1},
            {"Arr1":[
                [{"Arr1P1":{}},
                    {"Arr1P2":"!#¤%&/()=?+*^_-:;@£${[]}\\<>"},
                    {"Arr1P3":{}},{"Arr1P4":0.0}
                ],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
                [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}]
            ]},
            {"Parm5":{}},
            {"Parm6":{}},
            {"Parm7":{}},
            {"Parm8":"!#¤%&/()=?+*^_-:;@£${[]}\\<>"},
            {"Parm9":0.0},
            {"Parm10":0.0},
            {"Arr2Count":3},
            {"Arr2":[
                {"Arr2P1":"a"},
                {"Arr2P2":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
                {"Arr2P3":"ap3"},
                {"Arr2P4":{}},
                {"Arr2P5":"aAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAap3"},
                {"Arr2P1":"b"},
                {"Arr2P2":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
                {"Arr2P3":"bp3"},
                {"Arr2P4":{}},
                {"Arr2P5":"bBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBbp3"},
                {"Arr2P1":"c"},
                {"Arr2P2":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
                {"Arr2P3":"cp3"},
                {"Arr2P4":{}},
                {"Arr2P5":"cCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCcp3"},
                {"Arr2Empty":[
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}]
                ]}
            ]},
            {"Parm11":"äöÄÖåÅáÁà"}
        ]}
    ]}
    
  22. Teemu Halmela reporter

    So when we change the previous program's Arr2 -> Occurs(500), things start to throw segmentation faults. But I got things working with these changes change1 change2.

    So now even this input works, which is a good sign.

    {"pgm":[
        {"name":"TPGM2",  "lib":"DB2JSON"},
        {"s": [
            {"name":"Parm1", "type":"1a", "value":""},
            {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
            {"name":"Parm3", "type":"2a", "value":""},
            {"name":"Parm4", "type":"10a", "value":""},
            {"name":"Arr1Count", "type":"3s0", "value":0}
        ]},
        {"ds": [{"name":"Arr1", "dim":10},
            {"s":[
                {"name":"Arr1P1", "type":"7a", "value":""},
                {"name":"Arr1P2", "type":"132a", "value":""},
                {"name":"Arr1P3", "type":"30a", "value":""},
                {"name":"Arr1P4", "type":"1s0", "value":0}
            ]}
        ]},
        {"s": [
            {"name":"Parm5", "type":"1a", "value":""},
            {"name":"Parm6", "type":"4a", "value":""},
            {"name":"Parm7", "type":"18a", "value":""},
            {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
            {"name":"Parm9", "type":"3s0", "value":0},
            {"name":"Parm10", "type":"3s0", "value":0}
        ]},
        {"s": {"name":"Arr2Count", "type":"5s0", "value":500}},
        {"ds": [{"name":"Arr2", "dim": 500},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"a"},
                {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
                {"name":"Arr2P3", "type":"4a", "value":"ap3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]}
        ]},
        {"s": {"name":"Parm11", "type":"30a", "value":""}}
    ]}
    

    I just need to figure out why it hangs when I do the call on the real program.

  23. Former user Account Deleted

    Please include your db2sock version in posts (from now on) ...

    I also do not know what version of db2sock you are running.

    bash-4.3$ ./tests_c/test9999_driver_version32
    run (trace=on)
    version (1.0.5-sg6)
    success (0)
    

    incorrect 'fix' ...

    Mmm ... no. A 'ds' with "dim": 500 does not introduce more pointers to the parms. So this fix is incorrect (not accepted).

    -#define ILE_PGM_MAX_ARGS 128
    +#define ILE_PGM_MAX_ARGS 4096
    

    explanation ...

    You only have 15 parms in this json (below). This will fit easily in ILE_PGM_MAX_ARGS 128.

    {"pgm":[
        {"name":"TPGM2",  "lib":"DB2JSON"},
        {"s": [
    (01)        {"name":"Parm1", "type":"1a", "value":""},
    (02)        {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
    (03)        {"name":"Parm3", "type":"2a", "value":""},
    (04)        {"name":"Parm4", "type":"10a", "value":""},
    (05)        {"name":"Arr1Count", "type":"3s0", "value":0}
        ]},
    (06)    {"ds": [{"name":"Arr1", "dim":10},
            {"s":[
                {"name":"Arr1P1", "type":"7a", "value":""},
                {"name":"Arr1P2", "type":"132a", "value":""},
                {"name":"Arr1P3", "type":"30a", "value":""},
                {"name":"Arr1P4", "type":"1s0", "value":0}
            ]}
        ]},
        {"s": [
    (07)        {"name":"Parm5", "type":"1a", "value":""},
    (08)        {"name":"Parm6", "type":"4a", "value":""},
    (09)        {"name":"Parm7", "type":"18a", "value":""},
    (10)        {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
    (11)        {"name":"Parm9", "type":"3s0", "value":0},
    (12)        {"name":"Parm10", "type":"3s0", "value":0}
        ]},
    (13)    {"s": {"name":"Arr2Count", "type":"5s0", "value":500}},
    (14)    {"ds": [{"name":"Arr2", "dim": 500},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"a"},
                {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
                {"name":"Arr2P3", "type":"4a", "value":"ap3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]}
        ]},
    (15)    {"s": {"name":"Parm11", "type":"30a", "value":""}}
    ]}
    
    
    argv[15]
    pointer->Parm1
    pointer->Parm2
    pointer->Parm3
    pointer->Parm4
    pointer->Arr1Count
    pointer->Arr1
    pointer->Parm5
    pointer->Parm6
    pointer->Parm7
    pointer->Parm8
    pointer->Parm9
    pointer->Parm10
    pointer->Arr2Count
    pointer->Arr2
    pointer->Parm11
    
         C     *Entry        Plist                            
         C                   Parm                    Parm1    
         C                   Parm                    Parm2    
         C                   Parm                    Parm3    
         C                   Parm                    Parm4    
         C                   Parm                    Arr1Count
         C                   Parm                    Arr1     
         C                   Parm                    Parm5    
         C                   Parm                    Parm6    
         C                   Parm                    Parm7    
         C                   Parm                    Parm8    
         C                   Parm                    Parm9    
         C                   Parm                    Parm10               
         C                   Parm                    Arr2Count            
         C                   Parm                    Arr2                 
         C                   Parm                    Parm11               
    

    next ...

    I don't know what went wrong, but your fork fixes are not the answer (rejected).

  24. Former user Account Deleted

    FYI -- Good find on the json parser! Here is the correct location for k->count (need to test).

    void json_grow_key(json_key_t * k, int i) {
      int g = 0;
      char * old_key = (char *) k->key;
      char * old_val = (char *) k->val;
      char * old_lvl = (char *) k->lvl;
      char * new_key = NULL;
      char * new_val = NULL;
      char * new_lvl = NULL;
      /* already big enough (add grow amount i to count) */
      if (k->max > k->count + i + 1) {
        k->count += i; /* Halmela found bug */
        return;
      }
      /* grow by blocks */
      for (g = k->max; k->max < g + i + 1; k->max += JSON400_KEY_BLOCK);
      /* realloc */
      new_key = json_new(k->max * sizeof(int));
      new_val = json_new(k->max * sizeof(char *));
      new_lvl = json_new(k->max * sizeof(int));
      memcpy(new_key,old_key,(k->count * sizeof(int)));
      memcpy(new_val,old_val,(k->count * sizeof(char *)));
      memcpy(new_lvl,old_lvl,(k->count * sizeof(int)));
      k->key = (int *) new_key;
      k->val = (char **) new_val;
      k->lvl = (int *) new_lvl;
      json_free(old_key);
      json_free(old_val);
      json_free(old_lvl);
    }
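
    The block-rounding loop in json_grow_key is easy to misread, so here is a small stand-alone sketch of the same arithmetic (JSON400_KEY_BLOCK's real value is assumed to be 4096 here; grow_max is an illustrative name, not db2sock source). It also shows why the early-return branch must bump the count itself: nothing after it runs.

    ```c
    #include <assert.h>
    #include <stdio.h>

    #define KEY_BLOCK 4096 /* stand-in for JSON400_KEY_BLOCK (assumed value) */

    /* Round max up in whole blocks until it can hold need + 1 more entries,
       mirroring the for-loop in json_grow_key. */
    static int grow_max(int max, int need) {
      int g;
      for (g = max; max < g + need + 1; max += KEY_BLOCK)
        ; /* empty body: the loop header does all the work */
      return max;
    }

    int main(void) {
      /* a small request grows by exactly one block */
      assert(grow_max(4096, 10) == 8192);
      /* a request larger than one block keeps rounding up */
      assert(grow_max(4096, 5000) == 12288);
      printf("ok\n");
      return 0;
    }
    ```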
    
  25. Former user Account Deleted

    Hey, can you please post only RPG free samples for tests? I don't want to add old-style RPG programs to tests_ILE_RPG. I mean, you are welcome to try out anything (of course), but I wish to add your tests to my collection for regression testing. That is, regression testing means your tests will be run every release (free tests for your specific needs).

  26. Former user Account Deleted

    We unfortunately have many programs that take input arrays, so getting this to work is beneficial. But I think this will work, at least for experimenting.

    Throw a dog a bone ... you can see setting input arrays using only 'ds' and 's' is difficult. Essentially we are trying to invent an RPG calculation section by rubbing two 'ds' and 's' sticks together (I made fire).

    Let's say that ...

    My daughter, back in her days as a young girl, used to play "let's say that ..." with the kid next door. Marvels of games, from 'hot lava' on the ground to 'jet planes' on the swing set. Point is, we don't have to color inside the lines. We could introduce a simple json language for modifying data specifications.

    {"pgm":[
        {"name":"HAMELA02",  "lib":"DB2JSON"},
        {"s": {"name":"inCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"inputDS","dim":20},
            {"s":[
                {"name":"in1", "type":"5av2", "value":"i1"},
                {"name":"in2", "type":"5av2", "value":"i2"}
            ]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":5}},
        {"ds": [{"name":"outDS","dim":20},
            {"s":[
                {"name":"out1", "type":"5av2", "value":"o1"},
                {"name":"out2", "type":"5av2", "value":"o2"},
                {"name":"out3", "type":"10av2", "value":"o3"}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":"ll"}}
    ]}
    

    Above is a standard RPG-like set of 'ds' and 's' data specifications. Let's say that we 'cache' this specification in the driver under the name 'HAMELA02'. Now we can create a little "eval" language to update any of the input elements (below). Of course this makes for screaming fast performance, because we never re-parse 'HAMELA02'.

    {"c":[{"eval":"HAMELA02.inputDS(1).in1","equal":"yahoo"}]}
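
    For a sense of what the driver would have to do with such an "eval" target, here is a hypothetical sketch of splitting the dotted path into its parts. The parse_eval function and its format are purely illustrative; nothing here is db2sock source or a committed syntax.

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical: split "CACHE.ds(index).field" into components using a
       scanset-based sscanf format. Returns 1 on a full match. */
    static int parse_eval(const char *path, char *cache, char *ds,
                          int *idx, char *field) {
      return sscanf(path, "%31[^.].%31[^(](%d).%31s", cache, ds, idx, field) == 4;
    }

    int main(void) {
      char cache[32], ds[32], field[32];
      int idx = 0;
      assert(parse_eval("HAMELA02.inputDS(1).in1", cache, ds, &idx, field));
      assert(strcmp(cache, "HAMELA02") == 0);
      assert(strcmp(ds, "inputDS") == 0);
      assert(idx == 1);
      assert(strcmp(field, "in1") == 0);
      printf("ok\n");
      return 0;
    }
    ```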
    

    Let's say that ...

  27. Former user Account Deleted

    Reminder "do not use for production" ...

    So, as you can see from the last post with the "let's say that ..." "eval", there is creative room for architecture change to fit both performance and 'use case' in this toolkit. To infinity and beyond, to coin a phrase.

    Now for the bad news, this will take a while to sort out, aka, maybe until the end of the year before we have json bells dinging and whistles tooting at high performance (and async, and web, and socket, and "eval", and, and ...).

  28. Former user Account Deleted

    it works 1.0.5-sg7

    Ok, the only fix needed was k->count in json_grow_key. We did not have to modify ILE_PGM_MAX_ARGS 128, which is a very good thing (... we really, really, do not want this to happen ... you will see when we do pass-by-value params).

    bash-4.3$ ./test1000_sql400json32 j0181_pgm_hamela03-ds-rpg-occurs-set-element
    input(1000000):
    {"pgm":[
        {"name":"HAMELA03",  "lib":"DB2JSON"},
        {"s": [
            {"name":"Parm1", "type":"1a", "value":""},
            {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
            {"name":"Parm3", "type":"2a", "value":""},
            {"name":"Parm4", "type":"10a", "value":""},
            {"name":"Arr1Count", "type":"3s0", "value":0}
        ]},
        {"ds": [{"name":"Arr1", "dim":10},
            {"s":[
                {"name":"Arr1P1", "type":"7a", "value":""},
                {"name":"Arr1P2", "type":"132a", "value":""},
                {"name":"Arr1P3", "type":"30a", "value":""},
                {"name":"Arr1P4", "type":"1s0", "value":0}
            ]}
        ]},
        {"s": [
            {"name":"Parm5", "type":"1a", "value":""},
            {"name":"Parm6", "type":"4a", "value":""},
            {"name":"Parm7", "type":"18a", "value":""},
            {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
            {"name":"Parm9", "type":"3s0", "value":0},
            {"name":"Parm10", "type":"3s0", "value":0}
        ]},
        {"s": {"name":"Arr2Count", "type":"5s0", "value":3}},
        {"ds": [{"name":"Arr2"},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"a"},
                {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
                {"name":"Arr2P3", "type":"4a", "value":"ap3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"b"},
                {"name":"Arr2P2", "type":"30a", "value":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
                {"name":"Arr2P3", "type":"4a", "value":"bp3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":"c"},
                {"name":"Arr2P2", "type":"30a", "value":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
                {"name":"Arr2P3", "type":"4a", "value":"cp3"},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]},
            {"ds": [{"name":"Arr2Empty", "dim": 7},
                {"s":[
                    {"name":"Arr2P1", "type":"1a", "value":""},
                    {"name":"Arr2P2", "type":"30a", "value":""},
                    {"name":"Arr2P3", "type":"4a", "value":""},
                    {"name":"Arr2P4", "type":"18a", "value":""},
                    {"name":"Arr2P5", "type":"100a", "value":""}
                ]}
            ]}
        ]},
        {"s": {"name":"Parm11", "type":"30a", "value":""}}
    ]}
    
    
    output(1866):
    {"script":[{"pgm":["HAMELA03","DB2JSON",{"Parm1":
    {}},{"Parm2":"äöÄÖåÅáÁà"},{"Parm3":{}},{"Parm4":{}},{"Arr1Count":1},
    {"Arr1":[[{"Arr1P1":{}},{"Arr1P2":"!#¤%&/()=?+*^_-:;@£${[]}\\?<>"},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
    {"Arr1P3":{}},{"Arr1P4":0.0}]]},{"Parm5":{}},{"Parm6":{}},
    {"Parm7":{}},{"Parm8":"!#¤%&/()=?+*^_-:;@£${[]}\\?<>"},
    {"Parm9":0.0},{"Parm10":0.0},{"Arr2Count":3},
    {"Arr2":[{"Arr2P1":"a"},
    {"Arr2P2":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},{"Arr2P3":"ap3"},
    {"Arr2P4":{}},{"Arr2P5":"aAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAap3"},
    {"Arr2P1":"b"},{"Arr2P2":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
    {"Arr2P3":"bp3"},
    {"Arr2P4":{}},{"Arr2P5":"bBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBbp3"},
    {"Arr2P1":"c"},{"Arr2P2":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
    {"Arr2P3":"cp3"},
    {"Arr2P4":{}},{"Arr2P5":"cCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCcp3"},
    {"Arr2Empty":[[{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},
    {"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},{"Arr2P2":{}},
    {"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},
    {"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
    [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},
    {"Arr2P5":{}}],[{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},
    {"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},{"Arr2P2":{}},
    {"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},
    {"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}]]}]},
    {"Parm11":"äöÄÖåÅáÁà"}]}]}
    
    result:
    success (0)
    

    please help with your tests

    I decided to keep your old school RPG program to show 'doubting Thomas' folks that there is nothing up my magic sleeves with RPG free (they really are the same 'ds' and 's' and 'occurs' and ... Thanks Barbara Morris IBM Toronto).

    Help I need somebody (The Beatles) ...

    Anyway, your tests are getting so complex I am having difficulty knowing what to check in the expected output .exp (j0181_pgm_hamela03-ds-rpg-occurs-set-element.exp). If you would not mind, please post .exp data with your test. Uf Da! (I save Uf Da for special occasions of my perplexed heart as an overworked programmer.)

  29. Former user Account Deleted

    BTW --- should be noted that your 'yikes dude' complex tests are not producing valid json because of all the special characters embedded in the character data.

    I am using this validator on-line ... jsonlint

    Error: Parse error on line 19:
    ... {                           "Arr1P2": "!#¤%&/()=?+*^_-:;@£
    ----------------------^
    Expecting 'STRING', 'NUMBER', 'NULL', 'TRUE', 'FALSE', '{', '[', got 'undefined'
    
  30. Former user Account Deleted

    Update ... I guess it is valid json; one weird character returned ... € ... probably some sort of odd-ball ascii<>ebcdic conversion. I have no idea what '€' means, so I am just taking this char out so tests will pass a clean json validate check.

  31. Former user Account Deleted

    To save you undue effort, I should say that I am not taking contributions to this project yet. This driver is too important to get stuck in legal tar. So until I sort out an IBM-approved vetting process I will chat with anyone, and work on issues. You may also fork if you like, aka, it is okay to fool around with a copy on your own. Just trying to set expectations.

  32. Teemu Halmela reporter

    I did some changes to the tests, and I can only run the j018x- tests with this change:

    -#define ILE_PGM_MAX_ARGS 128
    +#define ILE_PGM_MAX_ARGS 4096
    

    Without it I get a Segmentation fault (core dumped) error. Also I must compile toolkit-base, toolkit-parser-json and db2proc to make things work, so I don't know what is going on with it. I'm currently running my fork, which has all the newest changes.

    $ test9999_driver_version32
    run (trace=)
    version (1.0.5-sg7)
    success (0)
    

    I also added a big json test and it seems to work with your big k->count fix. Grab the tests from my fork if you want to include them.

    I have no idea what '€' means

    € is the symbol for the euro. It should be at least in the iso8859-15 and utf8 charsets. No idea what should be changed to get it to work. I hope we don't use it often.

  33. Former user Account Deleted

    Mmm, again, your test is completely wrong (we covered this). You can't set individual elements by listing them inside a 'ds'. This is only working by accident with your expansion of ILE_PGM_MAX_ARGS 4096 (because 'ds' structs are 'packed'). Your test is wrong (same logic error as last time).

  34. Former user Account Deleted

    Wait a moment ... so ... you are going to list all 500 elements by hand inside a single ds structure. Mmmm ... well, this might work, except parameter Arr2Count->500 has no meaning. You have only one gigantic 500-element 'ds' with repeated 's' names (500 times). Very odd (exotic), but maybe that should work. But this is still wrong (ILE_PGM_MAX_ARGS 4096) ... must be some other 'grow' operation failing.

        {"ds": [{"name":"Arr2"},
            {"s":[{"name":"Arr2P1", "type":"1a", "value":"a"},{"name":"Arr2P2", "type":"30a", "value":"BAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},{"name":"Arr2P3", "type":"4a", "value":"ap3"},{"name":"Arr2P4", "type":"18a", "value":""},{"name":"Arr2P5", "type":"100a", "value":""}]},
            {"s":[{"name":"Arr2P1", "type":"1a", "value":"a"},{"name":"Arr2P2", "type":"30a", "value":"ABAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},{"name":"Arr2P3", "type":"4a", "value":"ap3"},{"name":"Arr2P4", "type":"18a", "value":""},{"name":"Arr2P5", "type":"100a", "value":""}]},
    :
    500 times (different data in each)
    :
    

    BTW -- you miscounted in the test (only had 499 elements).

  35. Former user Account Deleted

    You may be wasting your time with the json interface for these sorts of 'exotic' tests. I appreciate any tests to work out the edges of json, but any 'big data' operation will not likely go through this interface at all. Basically, if you have these sorts of 'big array' problems you will likely use a direct memory call into the toolkit (libtk400.a). That is, there will be a formal 'c code' interface to libtk400.a that can be called directly by any language, much like db2 drivers (no json, no xml, etc.). Maybe I can give you a general idea, as the complete architecture is not all written yet.

    ===
    1) high speed, big data, exotic calls
    high speed direct interface - big data interface (memory calls)
    ===
    php->pecl (unwritten)->libtk400.a (interfaces not written)
    note: 
    a) should work either within php job (unwritten) 
    b) or as stored proc call (db2proc we also use for json, etc. below)
    
    
    ===
    2) slower web interfaces, maybe ok for 80% of simple json calls as well
    parser interface -- web abstraction interfaces json, xml, etc. (socket, fastcgi, etc.)
    ===
    php->pecl(unwritten)->libjson400.a->libtk400.a 
    php->pecl(unwritten)->libxml400.a(unwritten)->libtk400.a 
    php->pecl(unwritten)->libcvs400.a(unwritten)->libtk400.a 
    
  36. Former user Account Deleted

    BTW -- I will look into your 500-element setter test. We should not have to increase ILE_PGM_MAX_ARGS, so something else is wrong. Again, I consider this sort of test 'exotic', therein probably not a 'real life' candidate for json, xml, cvs, etc. You would really want the direct memory 'big data' interface to the toolkit (libtk400.a) ... but ... well ... it is not written yet.

    reminder (previous post) ... Now for the bad news, this will take a while to sort out, aka, maybe until the end of the year before we have json bells dinging and whistles tooting at high performance (and async, and web, and socket, and "eval", and, and ...) ... and direct memory 'big data' interface to libtk400.a (and more stuff beyond this...).

  37. Former user Account Deleted

    Also I must compile toolkit-base, toolkit-parser-json and db2proc to make things work. So I don't know what is going on with it.

    Admirably irrepressible, you are probably going to ignore my warning about ILE_PGM_MAX_ARGS, not to mention using the 'big data' interface instead of json here (damn the torpedoes). The following c structure is used both in toolkit-base and ILE-PROC. This is why you must compile both ends to make any change to ILE_PGM_MAX_ARGS.

    #define ILE_PGM_MAX_ARGS 128
    #define ILE_PGM_ALLOC_BLOCK 4096
    typedef struct ile_pgm_call_struct {
    #ifdef __IBMC__
      /* pad blob alignment */
      int blob_pad[3];
      /* ILE address (set ILE side) */
      char * argv[ILE_PGM_MAX_ARGS];
    #else
      /* pad pase alignment */
      int blob_pad[4];
      /* ILE address (untouched PASE side) */
      ILEpointer argv[ILE_PGM_MAX_ARGS];
    #endif
      int argv_parm[ILE_PGM_MAX_ARGS];
      int arg_by[ILE_PGM_MAX_ARGS];
      int arg_pos[ILE_PGM_MAX_ARGS];
      int arg_len[ILE_PGM_MAX_ARGS];
      char pgm[16];
      char lib[16];
      char func[128];
      int step;
      int max;
      int pos;
      int vpos;
      int argc;
      int parmc;
      int return_start;
      int return_end;
      char * buf;
    } ile_pgm_call_t;
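
    To see concretely why both sides must be rebuilt together, here is a minimal sketch (a simplified two-field stand-in for ile_pgm_call_t, not the real struct): every field after the argv arrays moves when ILE_PGM_MAX_ARGS changes, so a client compiled with 128 and a server compiled with 4096 would disagree about the layout of the shared blob.

    ```c
    #include <assert.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Simplified stand-ins: only an argv array whose size depends on the
       MAX_ARGS constant, plus one trailing field whose offset shifts. */
    #define MAX_ARGS_A 128
    #define MAX_ARGS_B 4096

    typedef struct { char *argv[MAX_ARGS_A]; int argc; } layout_a_t;
    typedef struct { char *argv[MAX_ARGS_B]; int argc; } layout_b_t;

    int main(void) {
      /* argc sits right after argv, so its offset is MAX_ARGS pointers in */
      assert(offsetof(layout_a_t, argc) == MAX_ARGS_A * sizeof(char *));
      assert(offsetof(layout_b_t, argc) == MAX_ARGS_B * sizeof(char *));
      printf("argc offset: A=%zu B=%zu\n",
             offsetof(layout_a_t, argc), offsetof(layout_b_t, argc));
      return 0;
    }
    ```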
    

    Specifically, we cannot/should not 'tag ILE pointers' on the client side of QSQSRVR jobs. Therefore we simply pass the number of argc elements (pointers to parms) to the db2proc ILE program on the stored procedure side. Here we can tag the ILE pointers in the correct process (QSQSRVR process).

      /* set ILE addresses based memory spill location offset */
      for (argc=0; argc < ILE_PGM_MAX_ARGS; argc++) {
        if (argc < layout->argc) {
          /*  by reference */
          if (layout->argv_parm[argc] > -1) {
            /* ILE address parm location (skip by value slots) */
            parmc = layout->argv_parm[argc];
            offset = layout->arg_pos[parmc];
            /* set ILE address to data */
            layout->argv[argc] = (char *)layout + offset;
          }
        } else {
          layout->argv[argc] = NULL;
        }
      }
    

    Also, you may note we shift spill data by 4 bytes going to/from the stored procedure 'blob' call. This will un-tag pointers on the client side, so they cannot be misused (you little hacker you, I says to nobody at all).
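
    The reason arg_pos stores offsets rather than raw addresses can be shown in a few lines: an offset into the blob survives the copy into another address space (the QSQSRVR job), while a pointer captured on the client side would not. This is only a generic illustration of base-plus-offset addressing, not db2sock code.

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
      /* "client side": place parm data at a known offset in the blob */
      char *blob = malloc(64);
      size_t off = 16;
      strcpy(blob + off, "parm data");

      /* "server side": the blob lands at a different address after copy */
      char *copy = malloc(64);
      memcpy(copy, blob, 64);

      /* re-derive the address as base + offset; still points at the data */
      assert(strcmp(copy + off, "parm data") == 0);
      printf("ok\n");
      free(blob);
      free(copy);
      return 0;
    }
    ```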

  38. Former user Account Deleted

    € is symbol for euro. Should be atleast in iso8859-15 and utf8 charsets. No idea what should be changed to get it work. I hope we don't use it often.

    I think we should open a different issue for ccsid (hell). As a suggestion, I have found that editors that actually edit in 1208 (utf-8) tend to make issues disappear. That is, your editor may be in iso8859-15, wherein the ascii<>ebcdic conversion fails on this symbol. However, if your editor was in 1208 (utf-8), everything may just work.

    Side note: Assuming you are using my little json test program test1000_sql400json, we could check the input in this little program and convert from whatever the editor is using (say iso8859-15) into utf-8. There are some handy new SQL400 interfaces that may work (SQL400ToUtf8, SQL400FromUtf8).
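
    For the euro sign specifically, the conversion the test program would need is straightforward with POSIX iconv (shown here instead of the SQL400 interfaces, since those are not documented in this thread; the encoding names assume a glibc-style iconv). In ISO8859-15 the euro is the single byte 0xA4; in UTF-8 it is the three bytes E2 82 AC.

    ```c
    #include <assert.h>
    #include <iconv.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
      char in[] = "\xa4";               /* euro sign in ISO8859-15 */
      char out[8];
      char *inp = in, *outp = out;
      size_t inleft = 1, outleft = sizeof(out);

      iconv_t cd = iconv_open("UTF-8", "ISO8859-15");
      assert(cd != (iconv_t)-1);        /* conversion pair supported */
      size_t rc = iconv(cd, &inp, &inleft, &outp, &outleft);
      assert(rc != (size_t)-1);         /* no conversion error */
      iconv_close(cd);

      /* euro became the three-byte UTF-8 sequence E2 82 AC */
      assert(sizeof(out) - outleft == 3);
      assert(memcmp(out, "\xe2\x82\xac", 3) == 0);
      printf("ok\n");
      return 0;
    }
    ```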

  39. Former user Account Deleted

    Ok, fixed your input data occurs 500 test.

    SuperDriver - version 1.0.5-sg8

    Thanks for the big test. However, I do not consider setting 500 array elements in a 'ds' a json interface style test. This is better suited to the 'big data' direct libtk400.a memory call interface (not written yet).

    ===
    1) high speed, big data, exotic calls
    high speed direct interface - big data interface (memory calls)
    ===
    php->pecl (unwritten)->libtk400.a (interfaces not written)
    note: 
    a) should work either within php job (unwritten) 
    b) or as stored proc call (db2proc we also use for json, etc. below)
    
    
    ===
    2) slower web interfaces, maybe ok for 80% of simple json calls as well
    parser interface -- web abstraction interfaces json, xml, etc. (socket, fastcgi, etc.)
    ===
    php->pecl(unwritten)->libjson400.a->libtk400.a 
    php->pecl(unwritten)->libxml400.a(unwritten)->libtk400.a 
    php->pecl(unwritten)->libcvs400.a(unwritten)->libtk400.a 
    

    BTW -- the only mistake in your tests was 499 elements, one short of the required 500.

  40. Former user Account Deleted

    BTW -- Also, we did not have to change ILE_PGM_MAX_ARGS (a really, really good thing). The problem was in ile_pgm_grow (as I suspected). You can see the source change commit along with the hamela04 occurs 500 test.

  41. Former user Account Deleted

    I ran a test of 500 occurs output in json (not input). Output is more typical to me, but still seems a bit slow ... needs some performance work in the json interface. Again, not for production yet, so we will have to look at this after getting basic conversion functions to work. Aka, we are still writing hello world programs with arrays and such that have failed, so we're just not ready for big performance tuning work yet.

    SuperDriver - version 1.0.5-sg9

    BTW -- I suspect when the high-speed direct call libtk400.a is written, it will also need performance work before all is said and done (always the case). Well, this is what you get when you watch sausage being made in the db2sock factory. If you don't like the watching, well, come back in December when it's done (hopefully).

  42. Teemu Halmela reporter

    Good thing you found the real problem with ILE_PGM_MAX_ARGS. I was just poking around and saw that making it bigger "fixed" the problem.

    I found the problem why calling my real program didn't work. The program was crashing because the LIBL wasn't correct and some needed programs weren't found. It is just weird that the call just stalls; it does not give any errors to the json client.

    This program should show the error.

         H AlwNull(*UsrCtl)
           dcl-pr Main extpgm;
             hello char(128);
           end-pr;
    
           dcl-pr noprog extpgm;
           end-pr;
    
           dcl-pi Main;
             hello char(128);
           end-pi;
             hello = 'Hello World';
             noprog();
           return;
    

    But I got my program to work by defining the correct libl like this example, j0301_cmd_pgm_hello.json.

    I also tried to use the connect parameter to give it a user that already has the correct libl set. But I don't seem to get it to work; it only gives empty output. Is this how the connect should be defined?

    {"script": [
        {"connect":{"db": "*LOCAL", "uid": "USER", "pwd": "PASSWORD"}},
        {"pgm":[
            {"name":"MYPGM"},
        ...
        ]}
    ]}
    
    output(13): {"script":[]}
    
  43. Former user Account Deleted

    Well, for *LIBL I would use a cmd to set it, to avoid using a 'connect' at all (1).

    {"script":[
      {"cmd":{"exec":"CHGLIBL LIBL(DB2JSON QTEMP) CURLIB(DB2JSON)"}},
      {"pgm":[{"name":"HELLO"},
            {"s":{"name":"char", "type":"128a", "value":"Hi there"}}
           ]}
    ]}
    

    (1) This json interface is mostly for REST calling, therefore passing a "profile" is generally an unnatural act. There is much more to this story ... fun story ... shocking ending ...

  44. Former user Account Deleted

    So, I don't encourage connect in the json interface, but connect is a "parent" to actions that follow. I recommend cmd in the previous post.

    bash-4.3$ ./test1000_sql400json32 j0701_connect_pgm_hello                    
    input(1000000):
    {"connect":[{"db":"*LOCAL","uid":"DB2","pwd":"YIKES"},
      {"pgm":[{"name":"HELLO","lib":"DB2JSON"},
            {"s":{"name":"char", "type":"128a", "value":"Hi there"}}
           ]}
    ]}
    
    
    output(63):
    {"script":[{"pgm":["HELLO","DB2JSON",{"char":"Hello World"}]}]}
    
  45. Teemu Halmela reporter

    Okay thanks, I got the connect working. Also, if the time comes, the connections will probably be made using something like SQL400Connect(). That seems like a better way to do things.

  46. Teemu Halmela reporter

    So nested arrays aren't working anymore when I make them stupidly large. Something is overflowing again, or it can't handle this nesting madness.

         H AlwNull(*UsrCtl)
    
           dcl-ds innerDS qualified;
              field1 char(10);
              field2 char(15);
              field3 char(25);
              field4 char(5);
           end-ds;
    
           dcl-ds outDS qualified;
              out1 int(10);
              out2 varchar(5:2);
              outTable likeds(innerDS) dim(30);
              out3 varchar(10:2);
           end-ds;
    
           dcl-pr Main extpgm;
             val int(10);
             outCount int(10);
             output likeds(outDS) dim(200);
             last char(10);
           end-pr;
    
           dcl-pi Main;
             val int(10);
             outCount int(10);
             output likeds(outDS) dim(200);
             last char(10);
           end-pi;
    
             dcl-s i int(10);
             dcl-s parms int(10);
             val = %parms();
             for i = 1 to %elem(output);
                output(i).out1 = val*val;
                output(i).outTable(1).field2 = 'a' + %char(i);
                output(i).outTable(2).field1 = 'b' + %char(i);
                output(i).outTable(3).field1 = 'c' + %char(i);
                output(i).outTable(4).field2 = 'd' + %char(i);
                output(i).outTable(5).field2 = 'e' + %char(i);
             endfor;
             last = 'TEST';
             outCount = i - 1;
           return;
    
    input(797):
    {"pgm":[
        {"name":"TPGM3", "lib":"DB2JSON"},
        {"s": {"name":"val", "type":"10i0", "value":10}},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output", "dim":200},
            {"s":[
                {"name":"out1", "type":"10i0", "value":0},
                {"name":"out2", "type":"5av2", "value":""}
            ]},
            {"ds": [{"name": "innerDS", "dim":30},
                {"s":[
                    {"name":"field1", "type":"10a", "value":""},
                    {"name":"field2", "type":"15a", "value":""},
                    {"name":"field2", "type":"25a", "value":""},
                    {"name":"field2", "type":"5a", "value":""}
                ]}
            ]},
            {"s": {"name":"out3", "type":"10av2", "value":""}}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    Segmentation fault (core dumped)
    
  47. Former user Account Deleted

    You have multiple items going on today. Again, thanks for testing. I will look into each.

  48. Former user Account Deleted

    Okay thanks, I got the connect working. Also if the time comes the connections will probably be made using something like SQL400Connect(). That seems like better way to do things.

    Yes. In fact, default connections are already made by SQL400Connect. Wherein, default simply means use the current active profile for DB2 operations including toolkit calls.

    Need for connection speed ...

    Eventually everybody understands the "speed need" for connection pooling. We already built connection pooling into the new libdb400.a driver with SQL400pConnect, a persistent connection hash based on db/uid/pwd/qual (a "keyed" connection).

    php->ibm_db2(newish)->libdb400.a->...
    ... SQL400Connect(db,uid,pwd)->QSQSRVR/toolkit/db2
        (stateless connection 
         -- open/close each "script")
    ... SQL400pConnect(db,uid,pwd,qual)->QSQSRVR/toolkit/db2 
        (state full connection (reuse qual) 
         -- open until force closed)
    

    The json interface to SQL400Connect includes "qual":"anykey"; this will allow connection pooling calls to SQL400pConnect with "anykey" to re-use (db/uid/pwd filled in automatically, of course).

    {"connect":[{"db":"*LOCAL","uid":"DB2","pwd":"YIKES","qual":"mykey"},
      {"pgm":[{"name":"HELLO","lib":"DB2JSON"},
            {"s":{"name":"char", "type":"128a", "value":"Hi there"}}
           ]}
    ]}
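
    The "keyed" reuse described above can be sketched in a few lines: build a key from db/uid/pwd/qual, hand back any live connection with a matching key, otherwise open a new one. The pconnect function and the linear-scan pool are purely illustrative stand-ins for SQL400pConnect's internal hash, not db2sock source.

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    #define POOL_MAX 8

    typedef struct { char key[128]; int handle; } pool_slot_t;
    static pool_slot_t pool[POOL_MAX];   /* zero-initialized: all slots free */
    static int next_handle = 1;

    /* Illustrative "persistent connect": same db/uid/pwd/qual reuses a slot */
    static int pconnect(const char *db, const char *uid,
                        const char *pwd, const char *qual) {
      char key[128];
      int i;
      snprintf(key, sizeof(key), "%s/%s/%s/%s", db, uid, pwd, qual);
      for (i = 0; i < POOL_MAX; i++)     /* reuse a matching live slot */
        if (pool[i].handle && strcmp(pool[i].key, key) == 0)
          return pool[i].handle;
      for (i = 0; i < POOL_MAX; i++)     /* otherwise open a new one */
        if (!pool[i].handle) {
          snprintf(pool[i].key, sizeof(pool[i].key), "%s", key);
          pool[i].handle = next_handle++;
          return pool[i].handle;
        }
      return -1; /* pool full */
    }

    int main(void) {
      int a = pconnect("*LOCAL", "DB2", "YIKES", "mykey");
      int b = pconnect("*LOCAL", "DB2", "YIKES", "mykey");
      int c = pconnect("*LOCAL", "DB2", "YIKES", "otherkey");
      assert(a == b);   /* same qual -> same connection */
      assert(a != c);   /* different qual -> new connection */
      printf("ok\n");
      return 0;
    }
    ```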
    

    to infinity and beyond ...

    In fact, when libdb400.a implements the new "socket interface" (ssh, traditional, web), we could even set up db2 daemons similar to MySql. Therein we can have very sophisticated connection pooling, including wild ideas like ...

    ... "private connection" -- script/user active cursors, many script invocations

    ... "abandon connections" -- detect bad user programs hanging on MSGW (toolkit)

    ... so on

  49. Former user Account Deleted

    Mmmm ... we must be out of sync with repositories as this works fine for me ...

    Input:

    {"pgm":[
        {"name":"HAMELA05", "lib":"DB2JSON"},
        {"s": {"name":"val", "type":"10i0", "value":10}},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output", "dim":200},
            {"s":[
                {"name":"out1", "type":"10i0", "value":0},
                {"name":"out2", "type":"5av2", "value":""}
            ]},
            {"ds": [{"name": "innerDS", "dim":30},
                {"s":[
                    {"name":"field1", "type":"10a", "value":""},
                    {"name":"field2", "type":"15a", "value":""},
                    {"name":"field2", "type":"25a", "value":""},
                    {"name":"field2", "type":"5a", "value":""}
                ]}
            ]},
            {"s": {"name":"out3", "type":"10av2", "value":""}}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    

    Output:

    {
        "script": [{
            "pgm": ["HAMELA05", "DB2JSON", {
                "val": 128
            }, {
                "outCount": 200
            }, {
                "output": [
                    [{
                        "out1": 16384
                    }, {
                        "out2": {}
                    }, {
                        "innerDS": [
                            [{
                                "field1": {}
                            }, {
                                "field2": "a1"
                            }, {
                                "field2": {}
                            }, {
                                "field2": {}
                            }],
                            [{
                                "field1": "b1"
                            }, {
                                "field2": {}
                            }, {
                                "field2": {}
                            }, {
                                "field2": {}
                            }],
                            [{
                                "field1": "c1"
                            }, {
                                "field2": {}
                            }, {
                                "field2": {}
                            }, {
                                "field2": {}
                            }],
                            [{
                                "field1": {}
                            }, {
                                "field2": "d1"
                            }, {
                                "field2": {}
                            }, {
                                "field2": {}
                            }],
                            [{
                                "field1": {}
                            }, {
    :
    goes on for many pages (big, big, big, return)
    :
                            }, {
                                "field2": {}
                            }]
                        ]
                    }, {
                        "out3": {}
                    }]
                ]
            }, {
                "last": "TEST"
            }]
        }]
    }
    

    possibilities...

    ... maybe update your fork repository to match mine???

    ... maybe re-compile the test_c directory??? I changed test1000_sql400json.c to have a much bigger buffer for output (a million chars, from 512k a while back).

  50. Former user Account Deleted

    So, looks like you will be going nuts with big arrays and nesting. I have decided to increase test1000_sql400json32/64 to 5 million input/output characters. You will need to recompile tests_c (make tgt32 tgt64 install). You can see the output buffer size in output (below).

    bash-4.3$ ./test1000_sql400json32 j0188_pgm_hamela05-ds-rpg-nest
    input(5000000):
    

    You did not include your invocation of test1000_sql400json32/64, so i don't know how big your in/out buffers are ... very possible you wrote off the end in that last huge nested array test.

    BTW -- again, great to test json with everything (thanks), but 'big data' interface is not available yet calling toolkit direct (without json).

  51. Teemu Halmela reporter

    Indeed, my test program had too small an output buffer. I was wondering why it was crashing somewhat differently than my other program. But now I have a bigger test program that should show some overflows :D

    I put some printf statements into ile_pgm_grow to show some values.

    diff --git a/toolkit-base/PaseTool.c b/toolkit-base/PaseTool.c
    index 8078cea..8954e1e 100644
    --- a/toolkit-base/PaseTool.c
    +++ b/toolkit-base/PaseTool.c
    @@ -1030,7 +1030,10 @@ ile_pgm_call_t * ile_pgm_grow(ile_pgm_call_t **playout, int size) {
         new_len += ILE_PGM_ALLOC_BLOCK;
       }
       /* expanded layout template */
    +  printf("size: %d, max: %d, pos: %d, delta: %d, newlen: %d\n",
    +    size, layout->max, layout->pos, delta, new_len);
       tmp = tool_new(new_len);
    +  printf("tool_new done\n");
       /* copy original data */
       if (orig_len) {
         memcpy(tmp, layout, orig_len);
    
         H AlwNull(*UsrCtl)
    
           dcl-ds innerDS qualified;
              field1 char(10);
              field2 char(15);
              field3 char(25);
              field4 char(5);
              field5 char(40);
           end-ds;
    
           dcl-ds outDS qualified;
              out1 int(10);
              out2 char(5);
              out3 zoned(9:2);
              out4 char(15);
              out5 char(50);
              outTable likeds(innerDS) dim(30);
              out6 char(7);
              out7 char(8);
              out8 char(10);
              out9 zoned(9:2);
           end-ds;
    
           dcl-ds out2DS qualified;
              o2_out1 char(10);
              o2_out2 char(100);
              o2_out3 char(25);
              o2_out4 char(30);
              o2_out5 zoned(4:2);
           end-ds;
    
           dcl-pr Main extpgm;
             val int(10);
             inCount int(10);
             input likeds(out2DS) dim(200);
             out2Count int(10);
             output2 likeds(out2DS) dim(200);
             outCount int(10);
             output likeds(outDS) dim(200);
             last char(10);
           end-pr;
    
           dcl-pi Main;
             val int(10);
             inCount int(10);
             input likeds(out2DS) dim(200);
             out2Count int(10);
             output2 likeds(out2DS) dim(200);
             outCount int(10);
             output likeds(outDS) dim(200);
             last char(10);
           end-pi;
    
             dcl-s i int(10);
             dcl-s parms int(10);
             val = %parms();
             for i = 1 to %elem(output);
                output(i).out1 = val*val;
                output(i).outTable(1).field2 = 'a' + %char(i);
                output(i).outTable(2).field1 = 'b' + %char(i);
                output(i).outTable(3).field1 = 'c' + %char(i);
                output(i).outTable(4).field2 = 'd' + %char(i);
                output(i).outTable(5).field2 = 'e' + %char(i);
             endfor;
             last = 'TEST';
             outCount = i - 1;
           return;
    

    Running with 64-bit version.

    input(1992):
    {"pgm":[
        {"name":"TPGM3", "lib":"DB2JSON"},
        {"s": {"name":"val", "type":"10i0", "value":10}},
        {"s": {"name":"inCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"input", "dim":200},
            {"s":[
                {"name":"o2_out1", "type":"10a", "value":""},
                {"name":"o2_out2", "type":"100a", "value":""},
                {"name":"o2_out3", "type":"25a", "value":""},
                {"name":"o2_out4", "type":"30a", "value":""}
            ]}
        ]},
        {"s": {"name":"outCount2", "type":"10i0", "value":0}},
        {"ds": [{"name":"output2", "dim":200},
            {"s":[
                {"name":"o2_out1", "type":"10a", "value":""},
                {"name":"o2_out2", "type":"100a", "value":""},
                {"name":"o2_out3", "type":"25a", "value":""},
                {"name":"o2_out4", "type":"30a", "value":""},
                {"name":"o2_out5", "type":"4s2", "value":0}
            ]}
        ]},
        {"s": {"name":"outCount", "type":"10i0", "value":0}},
        {"ds": [{"name":"output", "dim":200},
            {"s":[
                {"name":"out1", "type":"10i0", "value":0},
                {"name":"out2", "type":"5a", "value":""},
                {"name":"out3", "type":"9s2", "value":0},
                {"name":"out4", "type":"15a", "value":""},
                {"name":"out5", "type":"50a", "value":""}
            ]},
            {"ds": [{"name": "innerDS", "dim":30},
                {"s":[
                    {"name":"field1", "type":"10a", "value":""},
                    {"name":"field2", "type":"15a", "value":""},
                    {"name":"field3", "type":"25a", "value":""},
                    {"name":"field4", "type":"5a", "value":""},
                    {"name":"field5", "type":"40a", "value":""}
                ]}
            ]},
            {"s":[
                {"name":"out6", "type":"7a", "value":""},
                {"name":"out7", "type":"8a", "value":""},
                {"name":"out8", "type":"10a", "value":""},
                {"name":"out9", "type":"9s2", "value":0}
            ]}
        ]},
        {"s": {"name":"last", "type":"10a", "value":""}}
    ]}
    
    size: 4096, max: 0, pos: 0, delta: 0, newlen: 12288
    tool_new done
    size: 165, max: 12288, pos: 7942, delta: 26, newlen: 16384
    tool_new done
    size: 165, max: 16384, pos: 11902, delta: 162, newlen: 20480
    tool_new done
    size: 165, max: 20480, pos: 16027, delta: 133, newlen: 24576
    tool_new done
    size: 165, max: 24576, pos: 20152, delta: 104, newlen: 28672
    tool_new done
    size: 165, max: 28672, pos: 24277, delta: 75, newlen: 32768
    tool_new done
    size: 165, max: 32768, pos: 28402, delta: 46, newlen: 36864
    tool_new done
    size: 165, max: 36864, pos: 32527, delta: 17, newlen: 40960
    tool_new done
    size: 165, max: 40960, pos: 36487, delta: 153, newlen: 45056
    tool_new done
    size: 169, max: 45056, pos: 40696, delta: 40, newlen: 49152
    tool_new done
    size: 169, max: 49152, pos: 44752, delta: 80, newlen: 53248
    tool_new done
    size: 169, max: 53248, pos: 48808, delta: 120, newlen: 57344
    tool_new done
    size: 169, max: 57344, pos: 52864, delta: 160, newlen: 61440
    tool_new done
    size: 169, max: 61440, pos: 57089, delta: 31, newlen: 65536
    tool_new done
    size: 169, max: 65536, pos: 61145, delta: 71, newlen: 69632
    tool_new done
    size: 169, max: 69632, pos: 65201, delta: 111, newlen: 73728
    tool_new done
    size: 169, max: 73728, pos: 69257, delta: 151, newlen: 77824
    tool_new done
    size: 95, max: 77824, pos: 73483, delta: 21, newlen: 81920
    tool_new done
    size: 80823, max: 81920, pos: 74087, delta: 3513, newlen: 163840
    tool_new done
    size: 80823, max: 1077952576, pos: 1078033399, delta: -85143, newlen: 1078034432
    tool_new done
    size: 80823, max: 1078034432, pos: 1078114222, delta: -84110, newlen: 1078116352
    Segmentation fault (core dumped)
    

    There is some crazy stuff going on with those values. I feel my json shouldn't need 1 GB of memory, it isn't that horrible.
    The crashing ile_pgm_grow call is coming from ile_pgm_copy_ds. Something bad might be happening when those memory positions are calculated.

  52. Former user Account Deleted

    There is some crazy stuff going on with those values.

    Great! I will look into this next test.

    I feel my json shouldn't need 1gig of memory, it isn't that horrible.

    Yes. This is why i am telling you that the high speed 'big data' interface will likely not be json.

    ===
    1) high speed, big data, exotic calls
    high speed direct interface - big data interface (memory calls)
    ===
    php->pecl (unwritten)->libtk400.a (interfaces not written)
    note: 
    a) should work either within php job (unwritten) 
    b) or as stored proc call (db2proc we also use for json, etc. below)
    
    
    ===
    2) slower web interfaces, maybe ok for 80% of simple json calls as well
    parser interface -- web abstraction interfaces json, xml, etc. (socket, fastcgi, etc.)
    ===
    php->pecl(unwritten)->libjson400.a->libtk400.a 
    php->pecl(unwritten)->libxml400.a(unwritten)->libtk400.a 
    php->pecl(unwritten)->libcvs400.a(unwritten)->libtk400.a 
    
  53. Former user Account Deleted

    Crashing Ile_pgm_grow call is coming from ile_pgm_copy_ds

    Well, unfortunately, i did not have time today to figure out what got messed up with arrays. I will try to look into the issue tomorrow.

  54. Teemu Halmela reporter

    Nice, no more segfaults.

    This last test definitely needs straight binary interface because things are starting to take way too long. I have no idea how that will look so I'll wait and see. Or maybe I could explore how something like PHP modules are constructed to get a head start.

  55. Former user Account Deleted

    Nice, no more segfaults.

    Cool! More progress then (sausage in the making).

    This last test definitely needs straight binary interface because things are starting to take way too long.

    Of course, the json interface is not a 'big data' interface. We need a binary interface for 'big data' to be 'fast'.

    Or maybe I could explore how something like PHP modules are constructed to get a head start.

    Sorry. I already told you the 'big data' or 'binary' toolkit interface is not yet written. To be fair, you would most likely be wasting your time. However, education is always a good thing.

    We are still doing toolkit 101 basics with json. In fact, i offer, it is much easier using the json abstraction to organize toolkit designs. Case in point: the ease with which we are exchanging tests for your needs.

    starting to take way too long.

    Well, at this point, we are NOT ready to do performance tuning of json interface. All ends PASE/ILE are missing critical caching at this point. Not to leave you hanging ... I have already mentioned we may be able to simply cache the json 'description' of a program call and simply send the json data ... there are many more creative ideas yet to explore here.

    Also, you should set a breakpoint in your called RPG program. You will see that calling into your RPG program is already pretty fast, BUT, output of tons of json is slow. I imagine there are some output side performance tuning possibilities, but 1GB of 'string' json is a wild and crazy test to be sure. BTW -- I am very grateful for your wild tests, they will make everything better for all.

    explore how something like PHP modules are constructed to get a head start.

    Heart of a teacher will be my undoing.

    If you are serious about learning how to compile a pecl extension for php (7 or 5.6), I can help you set up a gcc environment that works. I can even provide a little template so you can hit the ground running.

  56. Former user Account Deleted

    BTW -- I was in a hurry last night, just wanted to fix this for you before evening meal. I did not fully complete ile_pgm_copy_ds replacement, i will try to clean up the stuff today. Thanks for your tests.

  57. Former user Account Deleted

    Ok. cleaned up the multi-ds code.

    I think you will be able to nest multi-array 'ds' structures to your heart's content. Again, the deeper you nest, the more output json will pop out. So, you may have to continuously increase the buffer of the little json test program test1000_sql400json.c.

    big data ...

    I won't be doing the binary 'big data' interface until after we shake out the major areas in json. We still have to add the popular RPG convention of 'enddo':'count' to trim the massive output of empty array elements. We should probably also add other 'filters' like overlays (see this range cookie cutter), holes (ignore these elements), and so on to allow server-side filtering of the ton-o-json problem you are already witnessing (too slow).

    Also, on performance: you can't really draw conclusions about performance, or even fully understand expected output, until we implement all the json filter bells and whistles, ILE resolve caching, and many other things.

    Do you understand? Aka, we have just begun to work with json; not to be confused with the endgame.

  58. Teemu Halmela reporter

    I got a little impatient so I did a little "profiling". I found a couple of problematic places and made some fixes. These made things a lot better and speed is starting to get more reasonable. The biggest difference is definitely in the nested case. Big input json seems to need more work; I'll see if I can find something.

    Before:

    $ time test1000_sql400json32 j0184_pgm_hamela04-ds-rpg-occurs-500-output
    input(5000000):
    output(70932):
    
    result:
    success (0)
    
    real    0m0.913s
    user    0m0.478s
    sys     0m0.001s
    
    
    $ time test1000_sql400json32 j0185_pgm_hamela04-ds-rpg-occurs-500
    input(5000000):
    output(69932):
    
    result:
    success (0)
    
    real    0m2.061s
    user    0m1.118s
    sys     0m0.002s
    
    
    $ time test1000_sql400json32 j0188_pgm_hamela05-ds-rpg-nest 
    input(5000000):
    output(385359):
    
    result:
    success (0)
    
    real    0m9.564s
    user    0m5.287s
    sys     0m0.001s
    

    After:

    $ time test1000_sql400json32 j0184_pgm_hamela04-ds-rpg-occurs-500-output
    input(5000000):
    output(70932):
    
    result:
    success (0)
    
    real    0m0.109s
    user    0m0.040s
    sys     0m0.001s
    
    
    $ time test1000_sql400json32 j0185_pgm_hamela04-ds-rpg-occurs-500       
    input(5000000):
    output(69932):
    
    result:
    success (0)
    
    real    0m1.314s
    user    0m0.516s
    sys     0m0.001s
    
    
    $ time test1000_sql400json32 j0188_pgm_hamela05-ds-rpg-nest      
    input(5000000):
    output(385359):
    
    result:
    success (0)
    
    real    0m0.161s
    user    0m0.070s
    sys     0m0.001s
    

    edit: So it seems that most of the time is used by SQLExecute, so there aren't big gains to be had anymore.

  59. Former user Account Deleted

    Well, i am not really ready for performance profiling. Too much is still changing in the basics. However, I think the strlen change is a good idea.

    BTW -- many of your changes in the commit seem to be no change at all. I think maybe you are using tabs while i am using spaces. I use spaces because most of the code is generated by python scripts. Also, RPG tests do not like tabs generally. So, can you please switch to spaces so i can see proposed changes easily?

  60. Former user Account Deleted

    to be clear 'big data' ...

    We want json to run the best it can performance-wise (thanks for all the help), but json is not really a 'big data' interface. To be clear, I consider nested arrayed ds structures to be 'big data', most likely needing a binary style interface (not written in db2sock).

  61. Teemu Halmela reporter

    Yeah this definitely needs straight binary interface. Those changes were just too good to pass up.

    many of your changes in commit seem to be no change at all.

    The latest commit doesn't actually change anything important, it just converts some CRLF line endings to LF. I don't know how they got there. They were just messing with my git every time I saved that file.

  62. Former user Account Deleted

    CRLF line endings to LF. Don't know how they got there.

    I believe Windows editors tend to employ CRLF. However, I have been on a Linux desktop for decades (LF), so I don't recall all the Windows eccentricities (i think CRLF).

    git crlf -- but ... seems opinion is like rain here for Windows editors
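    One common guard (an assumption on my part, not something this repo sets up) is a .gitattributes that normalizes line endings at commit time, so Windows editors can't sneak CRLF into the history:

```
# hypothetical .gitattributes: keep LF in the repository
* text=auto
*.c text eol=lf
*.rpgle text eol=lf
```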

  63. Former user Account Deleted

    Ok, I added the json outareaLen performance change to json_output_printf (slightly changed to make it work, but thanks). I included the start of the pass by value work. You cannot see the full pass by value picture/design yet, aka, I needed to test basic register loading on call (uni-size to test). More on this later ...
