Less eager JSONParser on streams #56

Open
GoogleCodeExporter opened this issue Dec 24, 2015 · 6 comments

Comments

@GoogleCodeExporter

Right now JSONParser.parse(Reader) reads the stream eagerly and throws a ParseException if one JSON document has ended and another one begins. It would be great if there were an option to return the JSON object parsed up to that point instead of throwing an exception. After that, more JSON documents could be read from the stream.

I guess a small amount of additional logic in
'case Yytoken.TYPE_RIGHT_BRACE' and 'case Yytoken.TYPE_RIGHT_SQUARE'
would do the job. (I plan to make a patch.)
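
A minimal sketch of the behaviour in question, assuming json-simple's JSONParser and ParseException (the class name and the input string are only illustrative):

```java
import java.io.StringReader;

import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

public class EagerParseDemo {
    public static void main(String[] args) throws Exception {
        // Two JSON objects back to back on one stream.
        StringReader in = new StringReader("{\"a\":1}{\"b\":2}");
        JSONParser parser = new JSONParser();
        try {
            // Ideally this would return {"a":1} and leave the reader
            // positioned at the start of the second object.
            Object first = parser.parse(in);
            System.out.println(first);
        } catch (ParseException e) {
            // Behaviour described in this issue: the parser keeps reading
            // and fails when the second object begins.
            System.out.println("ParseException at position " + e.getPosition());
        }
    }
}
```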

Original issue reported on code.google.com by [email protected] on 20 Oct 2011 at 12:14

@GoogleCodeExporter

Any movement on this? Has a patch been created for this yet? I'm just running across a case where I want to do this.

Original comment by [email protected] on 15 Nov 2011 at 11:54

@GoogleCodeExporter

I postponed creating the patch, but if you need it I can submit it within a day.

Original comment by [email protected] on 16 Nov 2011 at 1:16

@GoogleCodeExporter

I made a workaround: I basically read in data until I reach the end of the JSON object in the file, pass that into the parser, and repeat.

It would still be a nice-to-have.
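
A rough sketch of that kind of workaround, assuming the stream carries only JSON objects; it naively tracks brace depth and does not handle braces inside string literals (ChunkingReader and nextObject are illustrative names):

```java
import java.io.IOException;
import java.io.Reader;

import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

public class ChunkingReader {
    // Reads characters until the top-level braces balance out, then parses
    // just that chunk. Naive: braces inside string literals are not handled.
    public static Object nextObject(Reader in, JSONParser parser)
            throws IOException, ParseException {
        StringBuilder chunk = new StringBuilder();
        int depth = 0;
        int c;
        while ((c = in.read()) != -1) {
            chunk.append((char) c);
            if (c == '{') {
                depth++;
            } else if (c == '}') {
                depth--;
                if (depth == 0) {
                    return parser.parse(chunk.toString());
                }
            }
        }
        return null; // end of stream, no complete object left
    }
}
```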

Original comment by [email protected] on 16 Nov 2011 at 6:29

@GoogleCodeExporter

I tried to do it, but it's much harder than I originally thought because of this comment from the lexer:
"the old input stream cannot be reused (internal buffer is discarded and lost)."
The lexer reads ahead, so when I invoke the parse method again, even on a new JSONParser instance, the reader is in the wrong position, maybe at EOF.
Currently I don't have an idea for resolving this.

Original comment by [email protected] on 17 Nov 2011 at 5:21

@GoogleCodeExporter

Hi, you can define a higher-level protocol if you want to carry multiple JSON objects in a single stream. For example, define the stream as the following:
<separator> JSON 1 <separator> JSON 2 ...
Then implement your own reader which returns the JSON stream between separators.
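
For example, a sketch of that idea using a newline as the separator (any unambiguous separator would work the same way; the class name is illustrative):

```java
import java.io.BufferedReader;
import java.io.Reader;

import org.json.simple.parser.JSONParser;

public class SeparatedJsonReader {
    // Assumes the higher-level protocol is "one JSON document per line",
    // i.e. the newline character is the separator.
    public static void readAll(Reader in) throws Exception {
        BufferedReader lines = new BufferedReader(in);
        JSONParser parser = new JSONParser();
        String line;
        while ((line = lines.readLine()) != null) {
            if (line.trim().isEmpty()) {
                continue; // skip empty separators
            }
            System.out.println(parser.parse(line));
        }
    }
}
```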

Hope that helps.

Original comment by [email protected] on 17 Nov 2011 at 10:05

  • Added labels: Type-Enhancement
  • Removed labels: Type-Defect

@GoogleCodeExporter

Maybe create a parser, pass it an InputStream/Reader, and wrap that in a PushbackReader, which lets you push data back onto the buffer.
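
A sketch of how java.io.PushbackReader could be used to peek between documents, assuming the parser (or a workaround like the one above) stops reading at the end of each object (hasMore and wrap are illustrative helpers):

```java
import java.io.IOException;
import java.io.PushbackReader;
import java.io.Reader;

public class PushbackExample {
    // Skips whitespace and reports whether another JSON document follows,
    // pushing the first non-whitespace character back for the parser.
    public static boolean hasMore(PushbackReader in) throws IOException {
        int c;
        while ((c = in.read()) != -1) {
            if (!Character.isWhitespace(c)) {
                in.unread(c);
                return true;
            }
        }
        return false; // end of stream
    }

    public static PushbackReader wrap(Reader raw) {
        return new PushbackReader(raw, 64); // 64-character pushback buffer
    }
}
```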

Original comment by [email protected] on 17 Nov 2011 at 5:38
