• read json from file

    From Mortifis@VERT/EPHRAM to All on Sat Jun 20 11:24:37 2020
    I have a local json file that I'd like to search for a matching unknown record
    # without having to read in the entire object (the file has over 200,000 records
    spanning over 1.8 million lines) ... trying to read in the entire file errors
    with out of memory ...

    Any ideas?

    Thanks,

    ~Mortifis

    ---
     þ Synchronet þ Realm of Dispair BBS - http://ephram.synchro.net:82
  • From echicken@VERT/ECBBS to Mortifis on Sat Jun 20 16:16:15 2020
    Re: read json from file
    By: Mortifis to All on Sat Jun 20 2020 11:24:37

     Mo> I have a local json file that I'd like to search for a matching unknown record
     Mo> # without having to read in the entire object (the file has over 200,000 records

    If your file looks like this:

    {"a":0,"b":1,"c":2,"d":3}

    You basically need to parse the whole thing. Or write your own special parser.

    If your file is like this:

    {"a":0,"b":1,"c":2,"d":3}
    {"e":4,"f":5,"g":6,"h":7}

    Then those are two separate JSON strings on adjacent lines. In which case you
    can read the file line by line, and each line will work with JSON.parse.

    ---
    echicken
    electronic chicken bbs - bbs.electronicchicken.com
     þ Synchronet þ electronic chicken bbs - bbs.electronicchicken.com
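The line-per-record case echicken describes can be sketched in plain JavaScript. Under Synchronet JS the lines would come from `File.readln()` in a loop rather than from an array, but the parsing step is the same; the function name `parseLines` is illustrative only.

```javascript
// Sketch of the line-delimited case: each line is a complete JSON string,
// so JSON.parse handles one record at a time and the whole file never has
// to sit in memory at once. Lines come from an array here for illustration;
// in Synchronet JS you would pull them from File.readln() instead.
function parseLines(lines) {
    var records = [];
    for (var i = 0; i < lines.length; i++) {
        var line = lines[i].trim();
        if (line === '') continue;       // skip blank lines
        records.push(JSON.parse(line));  // one complete object per line
    }
    return records;
}

var recs = parseLines(['{"a":0,"b":1,"c":2,"d":3}', '{"e":4,"f":5,"g":6,"h":7}']);
// recs[0].a is 0, recs[1].e is 4
```

In the streaming version you would act on each record as it is parsed (test it, print it, discard it) instead of pushing it onto an array, which is what keeps memory flat.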
  • From Mortifis@VERT/EPHRAM to echicken on Sat Jun 20 21:43:56 2020
    Re: read json from file
    > By: Mortifis to All on Sat Jun 20 2020 11:24:37

    > Mo> I have a local json file that I'd like to search for a matching unknown
    > Mo> record
    > Mo> # without having to read in the entire object (the file has over 200,000
    > Mo> records

    > If your file looks like this:

    > {"a":0,"b":1,"c":2,"d":3}

    > You basically need to parse the whole thing. Or write your own special
    > parser.

    > If your file is like this:

    > {"a":0,"b":1,"c":2,"d":3}
    > {"e":4,"f":5,"g":6,"h":7}

    > Then those are two separate JSON strings on adjacent lines. In which case
    > you can read the file line by line, and each line will work with
    > JSON.parse.

    Sadly ... it looks like this:

    [
      {
        "id": 707860,
        "name": "Hurzuf",
        "country": "UA",
        "coord": {
          "lon": 34.283333,
          "lat": 44.549999
        }
      },
      {
        "id": 519188,
        "name": "Novinki",
        "country": "RU",
        "coord": {
          "lon": 37.666668,
          "lat": 55.683334
        }
      },
      {
        "id": 1283378,
        "name": "Gorkhā",
        "country": "NP",
        "coord": {
          "lon": 84.633331,
          "lat": 28
        }
      },

    .... that's the 1st three records of 209,578 records ... gotta read all
    209,578 records (1,886,213 lines)? ... bleh ... lol
  • From echicken@VERT/ECBBS to Mortifis on Sat Jun 20 22:57:44 2020
    Re: Re: read json from file
    By: Mortifis to echicken on Sat Jun 20 2020 21:43:56

     Mo> }, .... that's the 1st three records of 209,578 records ... gotta read all
     Mo> 209,578 records ... bleh ... lol

    Not necessarily, but you'll need something more than what we have on hand.

    Streaming JSON parsers for handling really large files are a thing, but I
    don't know if there's one that can be readily ported to our environment.

    JSON.parse() only wants to parse a complete JSON string. You'd need to be
    able to pre-process what you're reading from the file to be sure that
    JSON.parse() won't choke on it.

    That's tricky to do in a generic way that could handle any old JSON you
    throw at it.

    Easier if you do it as a custom job for this particular file, and if this
    file is just a flat array of objects, all with the same keys and types of
    values. It's either going to be a bit complicated but fairly solid, or
    simple and hacky and maybe not super reliable.

    ---
    echicken
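The "a bit complicated but fairly solid" custom job echicken mentions can be sketched like this: track `{`/`}` nesting depth as lines stream in, and hand each complete object to JSON.parse as soon as its closing brace arrives. This is a sketch under stated assumptions, not a general parser: the names (`makeRecordStream`, `feed`) are illustrative, and the brace counting ignores the possibility of braces inside string values, so it only suits a flat array-of-objects file like the city list quoted above.

```javascript
// Hedged sketch: incremental extraction of objects from a pretty-printed
// JSON array, one line at a time. Assumes the '[' and ']' wrapper sit on
// their own lines and that string values never contain '{' or '}'.
function makeRecordStream(onRecord) {
    var depth = 0;
    var buf = '';
    return function feed(rawLine) {
        var line = rawLine.trim();
        if (line === '[' || line === ']' || line === '') return; // wrapper / blanks
        for (var i = 0; i < line.length; i++) {
            if (line.charAt(i) === '{') depth++;
            else if (line.charAt(i) === '}') depth--;
        }
        buf += line;
        if (depth === 0 && buf !== '') {
            // strip the ',' that separates records inside the array
            if (buf.charAt(buf.length - 1) === ',') buf = buf.slice(0, -1);
            onRecord(JSON.parse(buf)); // one complete object
            buf = '';
        }
    };
}

// Usage: feed lines one at a time (in Synchronet JS, from File.readln())
var found = [];
var feed = makeRecordStream(function (rec) {
    if (rec.name.toUpperCase().indexOf('HURZUF') >= 0) found.push(rec);
});
['[', '{', '"id": 707860,', '"name": "Hurzuf",', '"country": "UA",',
 '"coord": {', '"lon": 34.283333,', '"lat": 44.549999', '}', '},', ']'
].forEach(feed);
// found[0].country is "UA"
```

Because records are handed off as soon as they complete, only one record's worth of text is ever buffered, and it doesn't matter how many lines each record spans.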
  • From Mortifis@VERT/EPHRAM to echicken on Sun Jun 21 12:32:56 2020
    Re: Re: read json from file
    > By: Mortifis to echicken on Sat Jun 20 2020 21:43:56

    > Mo> }, .... that's the 1st three records of 209,578 records ... gotta read
    > Mo> all 209,578 records ... bleh ... lol

    > Not necessarily, but you'll need something more than what we have on hand.

    > Streaming JSON parsers for handling really large files are a thing, but I
    > don't know if there's one that can be readily ported to our environment.

    > JSON.parse() only wants to parse a complete JSON string. You'd need to be
    > able to pre-process what you're reading from the file to be sure that
    > JSON.parse() won't choke on it.

    > That's tricky to do in a generic way that could handle any old JSON you
    > throw at it.

    > Easier if you do it as a custom job for this particular file, and if this
    > file is just a flat array of objects, all with the same keys and types of
    > values. It's either going to be a bit complicated but fairly solid, or
    > simple and hacky and maybe not super reliable.

    This worked:

    load("sbbsdefs.js");
    var infile = js.exec_dir + "owm-citylist.json";

    write('Enter City Name: ');
    var what = readln().toUpperCase();

    writeln('\r\n\r\nSearching for ' + what + '\r\n');

    var j = new File(infile);
    var json = "";
    var match = false;

    j.open("r");

    // note: if the file's first line is a bare '[', read and discard it
    // before this loop so the 9-line chunks line up with the records
    while (!j.eof && !match) {
        json = "";
        for (var i = 1; i < 10; i++) { // 9 lines per record { ... }
            json += j.readln();
            json = json.replace(/^\s+/g, '');
        }
        json = json.slice(0, -1); // strip trailing ',' from string
        var obj = JSON.parse(json);
        var n = obj['name'].toUpperCase().indexOf(what);
        if (n >= 0) {
            writeln('Name: ' + obj['name'] + '\r\nCountry: ' + obj['country'] + '\r\n\r\n');
            match = true;
        }
    }
    j.close();

    Currently exits on 1st match ...

    Thank you for pointing me in the right direction (again :)

    ~Mortifis
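Since the script above exits on its first match, collecting every hit is mostly a matter of dropping the `match` flag and accumulating results. The substring test the script uses can be factored out like this; the helper name `findCities` is illustrative, and `records` stands in for objects already parsed out of the file.

```javascript
// Sketch: the same case-insensitive substring match as the script, applied
// to every record so that all matching cities are returned, not just the
// first. Pure function, so it is easy to test in isolation.
function findCities(records, what) {
    what = what.toUpperCase();
    var hits = [];
    for (var i = 0; i < records.length; i++) {
        if (records[i].name.toUpperCase().indexOf(what) >= 0) {
            hits.push(records[i]);
        }
    }
    return hits;
}

var hits = findCities(
    [{ name: 'Hurzuf', country: 'UA' }, { name: 'Novinki', country: 'RU' }],
    'nov'
);
// hits.length is 1; hits[0].name is "Novinki"
```

In the streaming version you would call the same test inside the read loop and simply keep reading after a hit instead of setting the flag.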