Bluebonnet example... How is it done?

  • Hi! I was looking at the bluebonnet example in the new krpano on the Oculus Rift and I'm wondering how this was done. Could someone explain it to me?

    I can see that the panos folder has two locations, one for the left eye and the other for the right eye, and that the panos inside them are ever so slightly different (shot from slightly different angles?). The xml file has a variable that gets replaced by the correct stereolabel. I'm wondering how those panos were generated... It seems like two cameras were used, but would it be possible to do it with only one camera and shift the image to "simulate" the shot the second camera would have produced? I'm not sure if I'm explaining myself clearly enough...
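    For reference, the two-folder-plus-placeholder setup described above roughly matches how krpano configures stereo panos in XML. This is only a sketch based on krpano's documented stereo support; the exact folder names and image paths from the bluebonnet example are assumptions here:

    ```xml
    <krpano>
        <!-- stereo="true" makes krpano load a separate pano per eye;
             the %t placeholder in the url is replaced at load time by
             the matching entry from stereolabels ("l" or "r" below),
             which is how one <image> element points at both folders -->
        <image stereo="true" stereolabels="l|r">
            <!-- hypothetical paths: panos/l/... and panos/r/... -->
            <cube url="panos/%t/pano_%s.jpg" />
        </image>
    </krpano>
    ```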

    I'd really appreciate some insight into the process of creating something like that...
