As a result, the panoramic image is distorted at the edges
Sure, and the amount of distortion depends on the field of view of the virtual camera.
But in the context of a tracked headset, those edges fall in the peripheral part of your vision, which is not sharp anyway: you turn your head to see them better, and the distortion then disappears because the center of the image has moved with you.
I would dare to write such a viewer, but I have not yet looked into how to get the sensor data into the C/C++ software.
I suppose that you want to do it for an Android device.
If so, you can read the sensor data in Java (random example from the Net: https://www.codeproject.com/Articles/1000120/Android-Compass) and pass it to your C/C++ code through a JNI interface.
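A minimal sketch of that Java-to-JNI hand-off could look like this. The class, library, and native method names (PanoActivity, "panoviewer", nativeOnOrientation) are made up for the example; the Android API calls (SensorManager, TYPE_ROTATION_VECTOR, registerListener) are real.

```java
// Sketch only: PanoActivity, "panoviewer" and nativeOnOrientation are illustrative names.
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class PanoActivity extends Activity implements SensorEventListener {
    static { System.loadLibrary("panoviewer"); } // your C/C++ renderer, built with the NDK

    // Implemented on the C/C++ side as
    // Java_PanoActivity_nativeOnOrientation(JNIEnv*, jobject, jfloatArray)
    private native void nativeOnOrientation(float[] rotationVector);

    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        Sensor rot = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        if (rot != null) {
            // SENSOR_DELAY_GAME gives a rate suitable for head tracking
            sensorManager.registerListener(this, rot, SensorManager.SENSOR_DELAY_GAME);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds the device orientation as a rotation vector;
        // hand a copy to the native code through the JNI bridge.
        nativeOnOrientation(event.values.clone());
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not needed here */ }
}
```

The native side would turn the rotation vector into a view matrix for the virtual camera; TYPE_ROTATION_VECTOR is usually preferable to the raw compass/accelerometer pair because the system already fuses the sensors for you.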
You could probably also grab the data directly from C/C++ by generically manipulating the Java classes/objects through JNI (you would still have to go through the system Java classes/services to get the data, unless you target a specific device whose driver access is not blocked by kernel permissions), but that would be more complicated, with probably no real gain for this task.