"Capacity problem" - or problem in Protobuf logic #26
The underlying buffer seems to be resized on the fly when data is written into the ByteBuffer. The problem is rather that ProtoBuf tries to read data that was never written into the buffer. Could this be a problem in the underlying logic of ProtoBuf?
Unsure what's going on there, as offset=642 with length/capacity=644 should not throw an exception (there are still 2 uint8s left). Are you using the latest version of ByteBuffer.js? Edit: Have you tried BB 1.3.6?
I've added a bit more information to the exception message in BB 1.3.6 (now on NPM). It now also names the actual offset being accessed.
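For reference, plain Node Buffers fail the same way when a read reaches past the end of the data. This is a minimal sketch using Node's built-in Buffer rather than ByteBuffer.js, with an arbitrary made-up payload, just to illustrate the "read past capacity" class of error being discussed:

```javascript
// A 3-byte buffer: valid read offsets for readUInt8 are 0, 1 and 2.
var buf = Buffer.from([0x08, 0x96, 0x01]);

var last = buf.readUInt8(2); // last valid offset, returns 1

var threw = false;
try {
  buf.readUInt8(3); // one past the end: out of range
} catch (e) {
  threw = e instanceof RangeError;
}

console.log(last, threw); // 1 true
```

The offset=642 / capacity=644 case in the report is different precisely because two valid bytes should still have remained, which is why it pointed away from a simple out-of-bounds read and toward corrupted input.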
I updated to the latest version (just now). The full error stack is now:
I am using Protobufjs in a loop. Typically it works for a while (or even just for the very first run), and then it crashes during another run. What is the problem with the loop?
Finally, I found that the problem is not the decoding process but the HTTP request: after a while, the remote server starts using "chunked transfer encoding", which was not handled correctly in my code. So protobufjs works fine but needs valid input data ;-)
Maybe I could add some sort of #decodeFromUrl(...) or something. Would you share your code?
I am now using this code (but I am still not sure whether it is 100% correct or not):

    var ProtoBuf = require("protobufjs");

    function readGtfs()
Hi,
One thing you could try is to reverse-engineer the Vehicles.pb to validate that it actually matches the proto definition: that would be the obvious reason for failure. If you assume a bug in ProtoBuf.js, any additional information would be useful, like the errors thrown or a breakdown of the data and proto file to a minimal failing case.
Another possible point of error is that somewhere between requesting and parsing the Vehicles.pb there is a string conversion, which is bad: it will most likely corrupt the data. See: Something like this is also required on node, like working with Buffers instead of strings when fetching the data, maybe forcing
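The corruption that string conversion causes can be shown in a few lines. This sketch uses a made-up two-byte UTF-8 sequence split across two chunks, the way a chunked HTTP response can split arbitrary binary data at any byte boundary:

```javascript
// The two bytes of UTF-8 'é' (0xC3 0xA9), arriving in separate chunks.
var chunk1 = Buffer.from([0xC3]);
var chunk2 = Buffer.from([0xA9]);

// Wrong: each half-sequence is invalid UTF-8 on its own, so toString()
// replaces each with U+FFFD and the original bytes are gone for good.
var asString = chunk1.toString('utf8') + chunk2.toString('utf8');

// Right: concatenate the raw bytes first, decode (if ever) afterwards.
var asBuffer = Buffer.concat([chunk1, chunk2]);

console.log(asBuffer.toString('utf8')); // é
console.log(asString === '\ufffd\ufffd'); // true: data was corrupted
```

Protobuf wire data routinely contains bytes that are not valid UTF-8, so any implicit Buffer-to-string round trip will mangle it this way.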
The only conversion I have is from binary to ASCII, because ProtoBuf needs a valid base64-encoded string.
I will try your test cases and let you know.
The problem is this:

    if (!data) data = chunk; else data += chunk;

This converts data from a Buffer object to a string. Never ever do this if the Buffer's data is not a utf8-encoded string! Instead try:

    function readGtfs() {
        var data = []; // List of buffers
        var req = http.request(configServer, function(res) {
            res.on('data', function (chunk) {
                data.push(chunk); // Add buffer chunk
            });
            res.on('end', function () {
                data = Buffer.concat(data); // Make one large buffer of it
                try {
                    var feed = myDecoder.decode(data); // And decode it
                    ...

This way, no string conversion happens. I've also updated the FAQ: https://github.com/dcodeIO/ProtoBuf.js/wiki/How-to-read-binary-data-in-the-browser-or-under-node.js%3F
It works perfectly!!!
You are welcome!
Running ProtoBuf.js in real life with real data sometimes results in an error message:

    Cannot read uint8 from ByteBuffer(offset=642,markedOffset=-1,length=644,capacity=644): Capacity overflow.

The problem clearly depends on the data being handled. However, the reason and how to solve it remain open, unfortunately.