Subj : Re: read json from file
To : echicken
From : Mortifis
Date : Sun Jun 21 2020 12:32 pm
> Re: Re: read json from file
> By: Mortifis to echicken on Sat Jun 20 2020 21:43:56
> Mo> }, .... that's the 1st three records of 209,578 records ... gotta read
> Mo> all 209,578 records (1 million, 8 hundred 86 thousand, 2 hundred 13
> Mo> lines) ? ...
> Mo> bleh ... lol
> Not necessarily, but you'll need something more than what we have on hand.
> Streaming JSON parsers for handling really large files are a thing, but I
> don't know if there's one that can be readily ported to our environment.
> JSON.parse() only wants to parse a complete JSON string. You'd need to be
> able to pre-process what you're reading from the file to be sure that
> JSON.parse() won't choke on it.
> That's tricky to do in a generic way that could handle any old JSON you
> throw at it.
> Easier if you do it as a custom job for this particular file, and if this
> file is just a flat array of objects, all with the same keys and types of
> values. It's either going to be a bit complicated but fairly solid, or
> simple and hacky and maybe not super reliable.
This worked:
load("sbbsdefs.js");
var infile = js.exec_dir + "owm-citylist.json";
write('Enter City Name: ');
var what = readln().toUpperCase();
writeln('\r\n\r\nSearching for '+what+'\r\n');
var j = new File(infile);
var json = "";
var match = false;
j.open("r");
while (!j.eof && !match) {
    json = "";
    for (var i = 1; i < 10; i++) { // 9 lines per record { ... }
        var line = j.readln();
        if (line === null) break; // guard against a short read at EOF
        json += line.replace(/^\s+/, ''); // strip leading whitespace per line
    }
    if (json.slice(-1) == ',') json = json.slice(0, -1); // strip ',' between records (last record has none)
    if (json == '') continue;
    var obj = JSON.parse(json);
    var n = obj['name'].toUpperCase().indexOf(what);
    if (n >= 0) {
        writeln('Name: ' + obj['name'] + '  Country: ' + obj['country'] + '\r\n\r\n');
        match = true;
    }
}
j.close();
currently exits on 1st match ...
Thank you for pointing me in the right direction (again :)
~Mortifis
---
■ Synchronet ■ Realm of Dispair BBS - http://ephram.synchro.net:82