How can my Oxygene service that returns an array of bytes of a png be processed by iPad app?

This isn’t exactly an Oxygene question, but here goes.

We have a desktop application written in Delphi that we’ve hired a team to convert to an iPad app. I’ve written a REST service in Oxygene that returns a structure containing an array of bytes, in JSON format, that are the bytes of a PNG image.

In my test SL app, I can call that service, convert the JSON character string to bytes, and display the PNG. It actually works really well.
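For reference, the round trip described above can be sketched like this. The `image` field name is an assumption for illustration, not the actual Oxygene service contract, and Python is used here only to show the wire format:

```python
import base64
import json

# Stand-in for real image data: the 4-byte PNG signature.
png_bytes = bytes([0x89, 0x50, 0x4E, 0x47])

# Server side: the byte array is carried in the JSON as a base64 string.
payload = json.dumps({"image": base64.b64encode(png_bytes).decode("ascii")})

# Client side: parse the JSON and decode the string back to raw bytes.
decoded = base64.b64decode(json.loads(payload)["image"])
assert decoded == png_bytes
```

The client never needs to understand the PNG format itself; it just decodes the string back to the original bytes and hands them to an image class.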

So now the iPad guys have to consume that. They are already asking me if I can change what I’m doing. But I suspect they don’t know much.

So can’t the iPad app read the stream of bytes from the REST service and put that PNG into a UIImage or something? Do they just not know what they are doing?

Or will I have to make some different interface in the REST service to accommodate them?

Not quite sure how they “get” the data, but presuming they have it parsed with the built-in JSON support in Cocoa, they can use something like:

```objc
NSData *decodedData = [[NSData alloc] initWithBase64EncodedString:base64String options:0];
UIImage *image = [UIImage imageWithData:decodedData];
```

to decode a base64-encoded string to NSData, which UIImage can then load directly.
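One caveat worth checking, as an assumption about the serializer rather than anything confirmed in this thread: some JSON serializers emit a `byte[]` as a plain array of numbers instead of a base64 string. If that’s what the service actually produces, the client-side decoding is just a number-to-byte mapping (Python sketch, hypothetical `image` field):

```python
import json

# If the serializer emits the byte array as a JSON list of numbers,
# e.g. {"image": [137, 80, 78, 71, ...]}, rather than a base64 string,
# the client maps the numbers straight back to raw bytes.
payload = '{"image": [137, 80, 78, 71]}'
png_bytes = bytes(json.loads(payload)["image"])
assert png_bytes[:4] == b"\x89PNG"  # the PNG file signature
```

Checking one sample response in a browser or with curl will settle which of the two shapes the service actually sends.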

Ok, thanks. That is what I thought from doing a quick search on the web. Just wanted to double check, since I haven’t done anything on Apple products.

Not quite sure what’s unclear about how they “get” the data :slightly_smiling:

They make a REST service call and it is in the returned result.

Right, what I meant was that I don’t know how your iOS colleagues retrieve the data from the web service, or what APIs they use. But the above should work.

Thanks.