
Sending Data over Avatar OSC

March 2022

VRC recently added OSC support for avatars. Can we use this to infil/exfil data?

Building off our previous work using video frames to encode data, I thought we could speed things up by proxying data via the new OSC-over-avatar feature VRC just dropped.

At first I thought the major issue would be that on-avatar OSC support is limited to the types and lengths of data that avatar parameters support. Not a lot of data, and no blobs! However, we can use the same kind of naive encoding as before: convert the arbitrary data to base64 and then send it over one byte at a time.
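Concretely, the send side is only a few lines. Here's a minimal sketch, assuming the python-osc package, VRChat's default OSC input port (9000), and a hypothetical int avatar parameter named DataByte:

```python
import base64
import time

from pythonosc.udp_client import SimpleUDPClient

VRC_HOST = "127.0.0.1"
VRC_IN_PORT = 9000  # VRChat listens for incoming OSC here by default
PARAM = "/avatar/parameters/DataByte"  # hypothetical int parameter

def send_payload(data: bytes, delay: float = 0.1) -> None:
    client = SimpleUDPClient(VRC_HOST, VRC_IN_PORT)
    for ch in base64.b64encode(data):
        # Iterating over bytes yields ints; each base64 character is
        # plain ASCII, so it fits VRChat's 0-255 int parameter range.
        client.send_message(PARAM, ch)
        time.sleep(delay)  # crude fixed pacing; see the ACK scheme later

send_payload(b'{"hello": "world"}')
```

Since VRChat int parameters only hold 0-255, one base64 character per message is about as dense as this naive scheme gets.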

Along the way I discovered base35 encoding, which would allow us to pack things even tighter (in fact, it's somewhat designed for the task, as a protocol meant to help port old data over 1-byte transfer pipes).
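I won't vouch for base35's exact alphabet or framing here, but the general mechanic of any base-N re-encode is the same: treat the payload as one big integer and peel off digits. A rough sketch, with an arbitrary 35-symbol alphabet chosen purely for illustration:

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxy"  # 35 symbols, arbitrary choice

def base35_encode(data: bytes) -> str:
    # Treat the whole payload as one big integer and peel off base-35
    # digits. Toy version: leading zero bytes are silently dropped.
    n = int.from_bytes(data, "big")
    digits = []
    while n:
        n, rem = divmod(n, 35)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits)) or ALPHABET[0]
```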

It (half) works!

Data encoding and import work, as you can see in the videos below! Alas, once the data has reached the avatar, there's nothing you can do with it. I had hoped it was possible to read avatar parameters from within a world (based on this project), but on closer inspection it doesn't quite work that way.

Ultimately, the purpose of this was to allow external data/API calls to impact the world (by proxying that data through the avatar). VRC has announced that they intend to provide OSC control for worlds, which would eliminate the need for this workaround entirely. Hopefully when (if?) they do, they'll also support data of arbitrary length, or at least more than one integer at a time.

Data packed into integers and ready for transport!

But Wait…

Along the way, I added a goofy blinking thing to my ear (see the videos below) so you can "see" the data flowing into the world. At the moment it simply blinks at a regular rate while data is coming across the pipe, but it got me thinking: while it seems impossible to pass data directly to the world, worlds can contain cameras, and their framebuffers can be decoded just as we did before. In theory, it should be possible to build an opto-coupler by converting the data into specific blink patterns and pointing a world camera at my ear plug. This would be fun, but insanely slow.

I've decided I have enough on my plate and am going to shelve this project for now, but I'm really hoping VRC adds proper world OSC support (or even better, network calls to Udon) soon!
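For the record, the transmit side of that opto-coupler would amount to bit-banging bytes as blink intervals, something like this sketch (again assuming python-osc, plus a hypothetical EarBlink bool parameter driving the light's emission):

```python
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)
BLINK = "/avatar/parameters/EarBlink"  # hypothetical bool parameter
BIT_TIME = 0.2  # seconds per bit; must outlast the camera's frame interval

def blink_byte(b: int) -> None:
    # Bit-bang one byte, MSB first, as light on/off intervals.
    for i in range(8):
        client.send_message(BLINK, bool((b >> (7 - i)) & 1))
        time.sleep(BIT_TIME)
    client.send_message(BLINK, False)  # idle low between bytes
    time.sleep(BIT_TIME * 2)

for byte in b"hi":
    blink_byte(byte)
```

At 0.2 s per bit, that works out to roughly half a byte per second before you even account for decode reliability. Hence "insanely slow."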

Simple test: JSON->XML over OSC in ~2 seconds
More complex case: unbounded JSON->XML with VRC-driven timing. Takes about 30 seconds to send a whole address. I use the OSC output as a "callback" and ACK.
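For reference, the ACK-driven flow in that second video boils down to: send a byte, wait for the avatar to echo an acknowledgement back over VRChat's OSC output (port 9001 by default), then send the next byte. A sketch, with hypothetical DataByte/DataAck parameter names:

```python
import base64
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

client = SimpleUDPClient("127.0.0.1", 9000)  # OSC into VRChat
payload = iter(base64.b64encode(b'{"some": "payload"}'))

def send_next() -> None:
    ch = next(payload, None)
    if ch is not None:
        client.send_message("/avatar/parameters/DataByte", ch)

def on_ack(address: str, value) -> None:
    # The avatar flips DataAck once it has latched the current byte;
    # that echo arrives here and triggers the next send.
    if value:
        send_next()

dispatcher = Dispatcher()
dispatcher.map("/avatar/parameters/DataAck", on_ack)

send_next()  # kick off the first byte, then let ACKs drive the pacing
BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher).serve_forever()
```

Letting the ACK drive the pacing avoids having to guess at a safe fixed delay for VRChat's parameter sync.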