
Consuming TCP Stream #1080

Closed
mistal-distal opened this issue May 7, 2018 · 11 comments

@mistal-distal

This is a question, not an issue. I saw various issues from around this time last year about streams. Would the stream parser be able to parse/consume a TCP stream's buffer?

For example, it parses to the end of the buffer, but the JSON isn't complete because the rest hasn't been queued yet, so it could rewind on the buffer once new data has been queued.

@nlohmann
Owner

nlohmann commented May 7, 2018

Did you find a solution?

@mistal-distal
Author

I didn't; I accidentally closed this without noticing. What would you recommend?

Right now I have a TCP stream coming into a char[] buffer.

@mistal-distal mistal-distal reopened this May 7, 2018
@nlohmann
Owner

nlohmann commented May 7, 2018

Could you describe your program a bit more? What is the structure you are parsing? What does it look like when you call the parser? How would further data be added?

@nlohmann nlohmann added the "state: needs more info" label May 7, 2018
@mistal-distal
Author

mistal-distal commented May 7, 2018

Long story short, the server is continually sending JSON over TCP, which does not guarantee everything arrives as a whole. The buffer receives the JSON message parts, and they are appended to a string (std::string).

I'd like the client to be able to parse the JSON in that string and delete what it has already parsed. For example, here's the receive code block for the TCP stream:

while ((nDataLength = recv(Socket, buffer, 10000, 0)) > 0) {
    // append whatever arrived to the accumulated std::string mentioned above
    message.append(buffer, nDataLength);
}

On a console test program I am able to parse my test strings appropriately. For example:
Test String:

{"ssid":"test", "power":1,"channel":20}

json j;
std::cin >> j;

parseMe::WiFi w {
    j["ssid"].get<std::string>(),
    j["power"].get<int>(),
    j["channel"].get<int>()
};

std::cout << w.ssid << " " << w.power << " " << w.channel << std::endl;


namespace parseMe {

	struct WiFi {
		std::string ssid;
		int power;
		int channel;

	};
}

Ignore that last edit I made.

@jaredgrubb
Contributor

Are you receiving back-to-back JSON ("{...} {...} {....}") that you need to pull apart, or are you getting one very large JSON ("[a,b,c,........]") and you want to consume "a" before you get the closing ']'?

@mistal-distal
Author

Back-to-back JSON exactly. What I've implemented right now is a size byte in between the JSON structures: read the size, buffer up a single JSON message, parse it, delete it from the buffer, rinse and repeat. It works well so far, but I'm wondering if there are cleaner solutions.
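For what it's worth, a minimal sketch of that read-size/parse/delete loop, assuming the accumulated recv() data sits in a std::string and each frame is one length byte followed by that many bytes of JSON (so messages are capped at 255 bytes); drain_frames and data are just placeholder names:

#include <cstddef>
#include <string>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

// data = everything received so far: <1 length byte><JSON payload>, repeated.
void drain_frames(std::string& data)
{
    while (!data.empty()) {
        const std::size_t len = static_cast<unsigned char>(data[0]);
        if (data.size() < 1 + len)
            break;                                  // frame incomplete, wait for more recv() data
        json j = json::parse(data.substr(1, len));  // the payload is one complete JSON value
        // ... use j here, e.g. build a parseMe::WiFi like in the console test above ...
        data.erase(0, 1 + len);                     // drop the consumed frame, rinse and repeat
    }
}

A wider length prefix (e.g. 4 bytes) works the same way if messages can exceed 255 bytes.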

@jaredgrubb
Contributor

I actually did this a couple years ago but used the "json11" library:

std::string parseError;
for (; iter != end; ) {
    size_t stopPos = 0;
    auto object = json11::Json::parse({iter, end - iter}, stopPos, parseError);
    if (!parseError.empty()) { /* ... */ break; }
    iter += stopPos;
}

I don't think nlohmann::json currently does this, but it probably would not be hard to add a form of parse that (one way to approximate it with the current API is sketched after this list):

  • allows a JSON object to end early
  • returns an iterator to the next unread position
  • allows the caller to distinguish an "early end-of-stream" failure from all the other parse errors; the former is expected while you are waiting for bytes off the socket, but you need to be able to detect the latter for when the JSON stream is really broken and you will never be able to recover.
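Nothing like that exists in nlohmann::json today, but for illustration, here is a rough approximation of those three points with the current API: scan the accumulated buffer by hand for one complete top-level {...} or [...] value (tracking strings and nesting), hand only complete chunks to json::parse, treat "no complete value yet" as the wait-for-more-bytes case, and treat a json::parse_error on a complete chunk as a genuinely broken stream. All names below are made up for the example, and bare top-level scalars are not handled:

#include <cstddef>
#include <string>
#include <vector>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

// Length of the first complete top-level {...} or [...] value in `data`,
// or 0 if it is still incomplete (i.e. wait for more bytes from the socket).
std::size_t complete_value_length(const std::string& data)
{
    int depth = 0;
    bool in_string = false, escaped = false;
    for (std::size_t i = 0; i < data.size(); ++i) {
        const char c = data[i];
        if (in_string) {
            if (escaped)        escaped = false;
            else if (c == '\\') escaped = true;
            else if (c == '"')  in_string = false;
        } else if (c == '"') {
            in_string = true;
        } else if (c == '{' || c == '[') {
            ++depth;
        } else if (c == '}' || c == ']') {
            if (--depth == 0)
                return i + 1;   // end of one top-level value
        }
    }
    return 0;
}

// Pull every complete value off the front of the buffer. If json::parse throws
// here, the chunk itself is malformed, i.e. the stream is really broken and
// waiting for more data will not help.
std::vector<json> drain_buffer(std::string& data)
{
    std::vector<json> out;
    while (std::size_t n = complete_value_length(data)) {
        out.push_back(json::parse(data.substr(0, n)));
        data.erase(0, n);
    }
    return out;
}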

@mistal-distal
Author

I've managed to get it working myself. I'm passing multiple different types of JSON structures and converting them to their respective types. All is well.

@g-arjones

@DOGFIVE Could you please share your solution?

@madf

madf commented Aug 14, 2019

> @DOGFIVE Could you please share your solution?

I also faced this problem and found out that there is only one JSON parser suitable for this task: https://github.com/lloyd/yajl

It allows partial input and multiple root objects.
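For completeness, a rough sketch of what feeding recv() chunks to yajl can look like (assuming the yajl 2.x API); the callbacks struct is left zero-initialized here, so the parser only consumes and validates the stream, and a real client would fill in the callbacks it cares about:

#include <yajl/yajl_parse.h>
#include <cstring>
#include <cstdio>

int main()
{
    yajl_callbacks cb{};   // all null: just consume/validate the stream
    yajl_handle h = yajl_alloc(&cb, nullptr, nullptr);
    yajl_config(h, yajl_allow_multiple_values, 1);   // back-to-back root objects

    // Chunks may be split anywhere, e.g. straight out of recv():
    const char* chunks[] = { "{\"ssid\":\"te", "st\"}{\"power\":1}" };
    for (const char* chunk : chunks) {
        if (yajl_parse(h, reinterpret_cast<const unsigned char*>(chunk),
                       std::strlen(chunk)) != yajl_status_ok) {
            std::puts("stream is broken");   // a real parse error, not just missing bytes
            break;
        }
    }

    yajl_complete_parse(h);
    yajl_free(h);
}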

@manuelgavidia

> I've managed to get it working myself. I'm passing multiple different types of JSON structures and converting them to their respective types. All is well.

This thread is old, but I have the same issue here today ... @mistal-distal was the solution that worked for you the one shared by @jaredgrubb?
