Posts by odysseyjason

    Multiresolution CUBE is the most efficient projection in tile and data usage for full spherical imagery that is currently supported by krpano. Beyond 60 degrees latitude, toward the poles, the spherical format starts to get quite wasteful (from 50% waste at 60 degrees up to 100% at 90 degrees). The spherical format is best for partial panoramas, especially if they do not include much above or below 60 degrees latitude. Technically, the fractional waste is 1 - cos(angle above or below the horizon). A truly state-of-the-art projection would use the spherical projection from -60 to +60 degrees and a flat, multiresolution square projection for the caps above and below 60 degrees (each cap would be a circle projected onto a square), since even the cube gets less efficient towards the corners. I don't know if this projection exists, but I would be happy to take credit and name it.
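
    Here is a tiny sketch (just for illustration, not krpano code) that evaluates that waste estimate at a few latitudes:

    Code
    // In an equirectangular (spherical) tiling, a row of pixels at latitude "lat" covers a
    // circle whose circumference shrinks by cos(lat), so roughly 1 - cos(lat) of those
    // pixels are redundant. Function name is just for illustration.
    function sphericalWasteFraction(latitudeDegrees: number): number {
      const lat = (latitudeDegrees * Math.PI) / 180;
      return 1 - Math.cos(lat);
    }

    console.log(sphericalWasteFraction(60)); // 0.5 -> ~50% waste at 60 degrees
    console.log(sphericalWasteFraction(90)); // 1.0 -> ~100% waste at the poles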

    Hi Cestmoimahdi,

    Yes, PanoCamAdder is only needed to bake the textures into a single file. You don't need it to render out if you are using the depthmap method.

    Your code looks fine. I don't see the error there, so it must be somewhere else (maybe it's with your model .obj). Can you share a link?

    I do believe krpano can only support a single obj model at this time.

    Jason

    I am very pleased to report that the GPU on the Oculus Quest 2 was easily able to handle my complex Three.js scene, with smooth results even with the animated lighting and shadow map! The Quest 1 completely could not handle this scene. The GPU on the new Quest seems to be of a much higher caliber; I am amazed at the difference and how well a little mobile chipset is doing! Also reporting here that the krpano panorama images do indeed look sharper with this headset in VR, and while within an image I did not notice any negative effect on the black levels (although I could see the difference when looking at a black screen, since it's LCD instead of OLED). The unit does seem lighter and more comfortable too. Excited!!! *smile* *smile* *smile*

    (Now the question becomes what platform to develop for?...) *blink*

    While learning Three.js (threejs.org) and investigating adding textured 3D models created with Blender and the PanoCamAdder plugin by https://der-mische.de/, I created some Three.js scenes to explore the fun things possible, such as lighting, shadows, navigation, and adding other objects.

    Here is a scene that works great in VR on the Oculus Quest V1. It has navigation via Touch controller thumbsticks, 10 geometry objects that can be manipulated, and a simple directional light, but no shadow effects.
    Simple VR Ready Scene - Works well on Quest V1

    ******************************************

    Here is the scene above with a complex animated light and a shadow map effect added. It works great on a desktop VR setup but is far too much for the Quest 1 GPU (we will see soon about the Quest 2).
    Complex VR Scene (not for Quest 1) Forkable Codepen.io Code
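
    For anyone curious, the "complex" additions above boil down to a renderer-level shadow map plus a light that moves every frame. A minimal Three.js sketch of that setup (object names here are my own, not taken from the Codepen) looks roughly like this:

    Code
    import * as THREE from 'three';

    // Minimal sketch of the "complex" scene additions: shadow maps plus an animated light.
    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.shadowMap.enabled = true;                 // this is the expensive part on mobile GPUs
    renderer.shadowMap.type = THREE.PCFSoftShadowMap;

    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);

    const light = new THREE.DirectionalLight(0xffffff, 1);
    light.castShadow = true;
    light.shadow.mapSize.set(1024, 1024);              // bigger maps = sharper shadows = more GPU work
    scene.add(light);

    const floor = new THREE.Mesh(
      new THREE.PlaneGeometry(20, 20),
      new THREE.MeshStandardMaterial({ color: 0x888888 })
    );
    floor.rotation.x = -Math.PI / 2;
    floor.receiveShadow = true;
    scene.add(floor);

    const box = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), new THREE.MeshStandardMaterial());
    box.position.y = 1;
    box.castShadow = true;
    scene.add(box);

    // Animate the light in a circle so the shadows have to be recomputed every frame.
    renderer.setAnimationLoop((time) => {
      light.position.set(Math.sin(time / 1000) * 5, 5, Math.cos(time / 1000) * 5);
      renderer.render(scene, camera);
    });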


    ******************************************

    The shadow map effect seems to be the hardest part for mobile GPUs to compute, but shadows are really awesome when the power is available. There is so much potential in using these panorama-based texture-mapped models for realistic-looking games and real-estate tours; your imagination is the limit! Building the touch controller navigation system was a significant effort, and I hope somebody finds it inspiring. (Next step is learning Blender and PanoCamAdder.)
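
    For anyone building a similar touch controller navigation system, here is a rough sketch of one way to read the thumbsticks and move the player rig each frame (this is not my exact Codepen code, and the axis indices are an assumption based on the Quest Touch controllers):

    Code
    import * as THREE from 'three';

    // Rough sketch of thumbstick locomotion in a WebXR session. Assumes a Three.js renderer
    // with renderer.xr.enabled = true and a "player" group that holds the camera.
    function applyThumbstickMovement(
      renderer: THREE.WebGLRenderer,
      player: THREE.Group,
      camera: THREE.Camera,
      speed = 0.05
    ): void {
      const session = renderer.xr.getSession();
      if (!session) return;

      for (const source of session.inputSources) {
        const axes = source.gamepad?.axes;
        if (!axes || axes.length < 4) continue;

        if (source.handedness === 'left') {
          // Left stick: move forward/backward and strafe relative to where the user is looking.
          const forward = new THREE.Vector3();
          camera.getWorldDirection(forward);
          forward.y = 0;
          forward.normalize();
          const right = new THREE.Vector3().crossVectors(forward, new THREE.Vector3(0, 1, 0));
          player.position.addScaledVector(forward, -axes[3] * speed);
          player.position.addScaledVector(right, axes[2] * speed);
        } else if (source.handedness === 'right') {
          // Right stick: smooth turning around the vertical axis.
          player.rotation.y -= axes[2] * speed;
        }
      }
    }

    // Call applyThumbstickMovement(...) once per frame inside renderer.setAnimationLoop(...).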

    Happy Creations *rolleyes*

    Jason

    Hi Cestmoimahdi,

    Currently the object model panorama method does not support tiles. The .obj model is a description of the vertices of a 3D model, and those vertices map to coordinates on a texture atlas, which is a single image file.

    Once you have baked your model's face textures into a texture atlas with Blender, the XML looks like this:

    Code
    <display depthbuffer="true" />
    <scene name="textured_model_object_test" autoload="true">
    	<preview url="texture_preview.jpg" />
    	<image>
    		<sphere url="texture.jpg" />
    		<depthmap url="tourmodel.obj" rendermode="3dmodel" scale="100" textured="true" waitforload="true" />
    	</image>
    </scene>

    where your model is: 'tourmodel.obj' and your texture atlas is 'texture.jpg'

    I had just checked out the krpano depthmap samples Example 7 (stereo to depth) here
    https://krpano.com/releases/1.20.…hmap-images.xml

    and was searching Flickr for an indoor equirectangular panorama of a room and came across this:
    https://www.flickr.com/photos/bridevalley/5995278972/

    These two rooms are not the same (the wood even runs in a different direction on the floor), but they have the exact same layout and color scheme!

    Just thought I would share...

    The ability to use the thumbsticks on the Oculus Quest controllers to let the viewer turn or move in 3D within a depthmapped panorama is no longer possible through the Oculus Browser on the Oculus Quest after the latest Oculus Browser update, which now only supports WebXR. The thumbsticks do still work in the Firefox Reality browser on the Oculus. This krpano image works with thumbsticks in Firefox Reality, but not in the Oculus Browser: https://3d-360.com/gigakrindex.ph…=218403&fov=360

    The thumbsticks also do not work with the krpano examples through the Oculus Browser on the Quest: https://krpano.com/releases/1.20.…tour/index.html

    I have tested and WebXR itself still has the ability to use the thumbsticks in the Oculus Browser, so this is something to do with krpano's implementation. This page works in the Oculus Browser through WebXR to see controller states: https://immersive-web.github.io/webxr-samples/…ller-state.html

    This problem seems to be an issue with krpano's use of WebXR and not WebXR itself.
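
    For reference, reading the raw thumbstick values in plain WebXR (no krpano, no Three.js; the function and variable names here are my own) looks roughly like this:

    Code
    // Logs thumbstick axes each frame, given an already-running immersive WebXR session.
    function logThumbsticks(session: any): void {
      const onFrame = (_time: number, _frame: any) => {
        for (const source of session.inputSources) {
          const axes = source.gamepad?.axes;
          if (axes && axes.length >= 4) {
            // On the Quest Touch controllers the thumbstick is usually axes[2]/axes[3].
            console.log(source.handedness, 'thumbstick x/y:', axes[2], axes[3]);
          }
        }
        session.requestAnimationFrame(onFrame);
      };
      session.requestAnimationFrame(onFrame);
    }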

    Would very much like to use spherical-projection, equirectangular-style tiles with krpano WITHOUT first transforming them to a cubical projection.

    I have a set of hundreds of thousands of deep-zoom gigapixel images in a spherical projection that cannot be converted to the cubical multilevel projection, as they are mostly partial panoramas (much less than 360x180) and such a conversion would be an extreme waste of resources. This was possible using the Flash player version of krpano but not WebGL HTML5. Reprojection is seriously NOT AN OPTION! But I would very much like to view this set as not-flat.

    Any hope of supporting spherical-projection tiles with krpano WebGL HTML5 in an upcoming version *question*

    Jason *smile* *smile* *smile* *smile*

    Would really love to be able to use multilevel tilesets that are in a spherical projection with krpano HTML5 WebGL.

    We have a set of hundreds of thousands of gigapixel images in a spherical projection that cannot be converted to the cubical multilevel projection, especially because the majority are partial panoramas (much less than 360x180) and such a conversion would be an extreme waste of resources. This is possible using the Flash player version but not WebGL HTML5. It is currently possible to view these tilesets as flat panoramas, but I would very much like to view them in their spherical space (understanding that one would have to supply the angle of view and height of horizon for partial panoramas).

    Any progress or hope on this front?

    Thanks

    Jason *smile*

    Hello Klaus,

    I initially thought that krpano 1.19 supported multi-resolution spherical and partial panoramas in HTML5 WebGL, that is, tilesets that use the spherical (non-cubeface) projection.

    The XML documentation no longer mentions limitations on doing so and seems to show the example XML to produce it, but when I try it, I just get a blank screen unless the hfov is set at 1.0, just as it was with 1.17.

    Can you please confirm whether this should work or not? If so, what would the XML look like, and if not, is it scheduled for any future release?

    example: http://www.3d-360.com/gigakrindex.php?id=170636

    I curate thousands of partial panoramas that are, and will remain, in a spherical projection and cannot efficiently be transformed into cube faces, as they are all multi-gigapixel partial panoramas. I would very much like to use the krpano JavaScript viewer to view these, as was possible with the Flash krpano. Example: http://www.3d-360.com/view-remote-im…gigapans/170636

    Thank you for your hard work, time, and consideration

    Jason

    Samsung has a special web browser that works within the GearVR, and they have recently enabled WebVR to work with the GearVR's external sensors, allowing a full WebVR experience for virtual reality. Official Announcement. Just to be clear, Samsung makes two browsers called 'Internet': a regular mobile browser and a special virtual reality browser to use with the GearVR. The GearVR version arrived in December 2015 but did not have the ability to work with any WebVR pages. With an April 1st, 2016 update, it now has experimental WebVR (although it is using a deprecated version of WebVR, so most WebVR samples do not work with the browser, but I found a few that do).

    It is first necessary to turn ON WebVR mode by visiting this link within the GearVR Samsung Internet Browser: http://webvr-enable (be sure to navigate to the URL; it defaults to searching).

    Now if you visit a site with WebVR content in the normal non-GearVR Samsung Internet Browser, and then insert the phone into the GearVR, it will load the same page into the GearVR Samsung Internet Browser (they are completely different programs, and starting the GearVR browser takes a few moments). The page is presented like a movie screen, but if you press 'enter VR' it expands into a full 360 degree experience using the GearVR sensors with HTML and WebVR content.

    Here are the few sample WebVR pages I found that were still using the old WebVR API and will work with the browser; all are from three.js:

    cubes
    panorama
    rollercoaster
    video


    Unfortunately, the resolution seems to drop significantly. I do not know what window resolution the browser is reporting, but it is nowhere near satisfactory. I do not know if it is possible to change this setting; it seems to be the only hangup other than some stuttering (JavaScript frame rate) compared to native VR programs. The external sensors are being used, as well as the low persistence mode; all we need now is the resolution and a fix for the stutter (which is ridiculous, as cardboard krpano works just fine on mobile, so it's not that the phone is under-powered).


    Now, how do we get the krpano stereo interface to use the GearVR WebVR API?

    Loading a stereo krpano page into the GearVR Samsung Internet Browser is truly horrible currently. It simply does not work well in any capacity (and is not using the WebVR API of the GearVR). It does load, and you can enter a stereo 360 mode, but nothing lines up from eye to eye. Perhaps a special version could be built that uses the deprecated WebVR API found in the GearVR Samsung Internet Browser?
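
    A rough sketch of the kind of feature detection such a special version would need (these are the API names the various WebVR drafts used; I have not verified which exact draft this browser ships):

    Code
    // Sketch of detecting which WebVR flavour a browser exposes.
    const nav = navigator as any;

    if (typeof nav.getVRDisplays === 'function') {
      // WebVR 1.0/1.1 style API
      nav.getVRDisplays().then((displays: any[]) => {
        console.log('WebVR 1.x displays:', displays.length);
      });
    } else if (typeof nav.getVRDevices === 'function') {
      // Older, deprecated WebVR draft (the one most early three.js demos targeted)
      nav.getVRDevices().then((devices: any[]) => {
        console.log('legacy WebVR devices:', devices.length);
      });
    } else {
      console.log('No WebVR API found in this browser.');
    }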

    Hopefully someone can wire an example up??? And hopefully in the next version of the browser the newest API is used and higher resolution is made possible. *smile* We are at least getting closer!

    Hi,

    I am wondering if anyone has experience with the Samsung Gear VR and Samsung Note 4 with the krpano stereo 3D panorama viewing setup. I would like to get an understanding of the challenges and merits of this exciting new hardware in regards to viewing stereo panoramas. My similar experience is limited to a Nexus 5 Cardboard setup.

    The GearVR is a stereo viewing headset built by Samsung in collaboration with Oculus. A Samsung Galaxy Note 4 phone is mounted within the headset to provide a mobile Oculus Rift-like experience for virtual reality gaming and media viewing. The Note 4 features a high resolution 2560x1440 AMOLED screen with very fast response times. The GearVR headset has built-in gyroscope/accelerometer/magnetometer sensors for motion tracking (yaw/pitch/roll) and communicates with the Note 4 via a USB connection. The software applications currently available for the GearVR have motion/attitude tracking done on the GearVR's own hardware, not the phone's internal sensors, presumably because the external sensors do a better job.

    Question: How would the software connection between the GearVR hardware and the krpano JavaScript be made in the browser? The RIFT.HTML is for desktop browsers, which can use plugins to reach external hardware (the VR.JS plugin), while the Cardboard MOBILEVR.HTML uses the mobile device's internal motion hardware through JavaScript API hooks. Does the Note 4 possibly have a special web browser with JavaScript API hooks for communicating with external motion sensors? Or is it necessary to build an application through the Oculus Mobile SDK in order to link with the external GearVR hardware (which would make each stereo panorama viewing experience an 'app' rather than the open web)?
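
    As far as I know, the "JavaScript API hooks" for a phone's internal sensors are just the standard deviceorientation events, roughly like this minimal listener (illustrative only):

    Code
    // Minimal listener for the browser's built-in orientation sensor events.
    window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
      // alpha ~ compass heading (yaw), beta ~ front/back tilt (pitch), gamma ~ side tilt (roll), in degrees
      console.log('yaw:', event.alpha, 'pitch:', event.beta, 'roll:', event.gamma);
    });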

    My experience with the Samsung Tab S 8.4 tablet, which has a similarly glorious display, was that the device's built-in motion sensors were absolute garbage. When tried with the krpano MOBILEVR.HTML setup, I experienced the worst motion/attitude tracking I have ever used on a mobile device! Compared to a Nexus 5 phone, which has good sensor stability and motion tracking, the Samsung Tab S tablet jumps around everywhere and does not make for a believable experience. Does anyone have experience with the Samsung Note 4's internal motion sensors? (Are they hopefully improved over the Tab S, so that external hardware access is not necessary for satisfactory panorama viewing with head tracking?) If the Note 4 has sensors on par with the Tab S and krpano can't reach the external sensors, then this GearVR setup is a non-starter for satisfactory stereo panorama viewing.

    Any krpano user reviews of the Gear VR would be most welcome. Are the screen resolution, latency, and image stability significantly better than what is afforded by a Nexus 5 Cardboard setup (which is pretty good!)? How does it compare to the experience provided by the Oculus DK2?


    Thanks *smile*

    Jason
    Stereopan.com developer

    Would KRPano 3D work with NVIDIA 3D Vision?
    NVIDIA 3D Vision supports SBS, but I'd just like to confirm.

    Hello Ivan,

    To the best of my knowledge, full HD with 3D Vision is not possible; somebody would need to write a browser plugin to enable sending two full-width frames from a web browser to the video card for HTML content, and currently it is only possible to use half-height or half-width compressed frames. For passive 3D TVs it is a moot point: using half-height compressed frames delivers the highest 3D resolution that a passive 3D TV (like LG's) can deliver, and with this target it is definitely better to use the half-height mode, as otherwise it would decrease resolution by another 50%.

    For an active shutter glasses TV, two complete HD frames need to be delivered to the TV for each field scan. Nvidia does have a plugin to do this with browsers for video, but I am not aware of the ability to do so with HTML content. Here is the documentation page for the 3D Vision plugin: http://www.3dvisionlive.com/3dapps , where I can only find reference to video frames.

    This will work with 3D Vision, however, just not at full HD on active shutter glasses screens (but you won't notice, as it's still quite a wow effect).

    I would love to stand corrected on this issue and have been in contact with Nvidia engineers without resolution.

    Jason

    Hello Tim,

    To shoot stereoscopic panoramas you do need a method to create a view for each of the viewer's eyes, and the simplest way to do this is to shoot two panoramas with an offset. There are other, computationally challenging methods to produce stereo panoramas that solve some issues with the two-viewpoint eyeball-offset setup, but they come with their own issues that prevent one from casually dipping their toes into the subject, so the two-viewpoint method is by far the best way to get started (the advanced method requires two cameras, one in the center and one offset, used to create a depth map, and a computer graphics system to warp and synthetically create the two eye views on the fly from the obtained data...).

    It is quite possible to make fantastic, fun stereo panoramas without going the full 360 degree route, by just shooting a static panoramic scene and then moving the camera a distance and shooting again. This is not good for action or when clouds are moving and casting shadows, but otherwise it is easy and enjoyable. The stereo separation between the eyes will change from the edges to the center of the scene, as the two cameras are not rotating around a common center, but I find when viewing these that it is really not a problem (we are, of course, used to things going flat, as that is how TVs have been for ages). When shooting a distant scene for a 120 degree view, it really is not a problem. I shoot many handheld stereo panoramas with this method.

    If one wants a full 360 degree view, then the cameras need to rotate around a common center. I use a Gigapan Epic Pro robot with a custom bar that I mount two cameras on. This allows me to freeze action and clouds within the same eye view and to shoot in a full circle. I had to develop a simple circuit to fire both cameras from the Gigapan robot simultaneously. Of course, this requires two similar camera systems. The stereo separation can NOT be significant or stitching will become difficult to impossible, as each camera is not rotating around its lens's entrance pupil (again, not a problem for distant scenes, but trouble with close-up elements).

    The distance between the eye views of the cameras (the stereo base) is actually something that depends on the minimum subject distance of a scene. A scene of faraway mountains can have a much wider stereo base and still create a normal-looking stereo scene for a viewer, while one that has close-up subjects needs the camera separation to be much smaller for things to look normal, or 'ortho', stereo.

    The general rule for separation that makes things look normal to the viewer is: minimum distance to the subject / 30 = stereo base separation.
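
    A quick worked example of that rule of thumb (illustrative numbers only):

    Code
    // The 1/30 rule of thumb: stereo base = nearest subject distance / 30.
    function stereoBaseMeters(nearestSubjectDistanceMeters: number): number {
      return nearestSubjectDistanceMeters / 30;
    }

    console.log(stereoBaseMeters(2));   // 0.0667 -> about 6.7 cm for a subject 2 m away (close to human eye spacing)
    console.log(stereoBaseMeters(300)); // 10     -> a 10 m base for mountains 300 m away (hyperstereo territory)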

    There is also hyperstereo and hypostereo. Hyperstereo is the result of a larger separation between the cameras and actually results in objects in a scene appearing smaller than they should. Hypostereo is when the separation is smaller, causing a scene to appear larger than in reality. Sometimes these effects can be used creatively to your advantage, such as to bring out the stereo depth of a far mountain scene that does not have close-up elements.

    Here is a link to a poster presentation I gave at Carnegie Mellon University in 2010 on this subject: http://www.3d-360.com/gigapan/?id=64651 It explains these concepts in detail with photo examples. John Toeppen is by far the best stereoscopic panorama creator I have met, with incredible vision. I found his work so inspiring that we wrote a paper for the SPIE conference this year concerning making and displaying stereo panoramas; feel free to read it here: http://www.holographics3d.com/hg/demo/spie/I…Panoramas14.pdf

    Yes, your bar would work well for providing a stereo separation between cameras; you will need to figure out how to attach it to a panoramic tripod head.

    Hope this helps. After you shoot some stereo panoramas, I would be happy to assist in getting them aligned and presented on stereopan.com if you are interested!

    Jason Buchheim

    Hello Klaus and everyone,

    Thank you for these Oculus Rift stereo demos; they are very exciting!

    In 2010 I began working on a stereo 3D viewer using krpano's Flash viewer in the browser to create anaglyph, side-by-side, and over/under stereo panorama viewers. StereoPan.com is the resultant site dedicated to 3D stereo panoramas, with beautiful spherical and high resolution gigapixel panoramas to explore.

    I have found deep-zooming into stereo content to be most interesting and unique.

    For this site, I created a method for displaying two krpano Flash panorama views within the same parent Flash container and was able to create a communication mechanism for the two views to coordinate their movements, eliminating eye strain when zooming in.

    The Stereopan.com viewer is unique in that it coordinates each eye view as one zooms in and out, actively keeping both eyes looking within the same frame window without eye strain, even if a user zooms in all the way on a multi-gigapixel stereo 3D panorama! This coordination method uses a depth map to keep objects of interest from going out of frame (off screen) for one eye and to prevent eye strain from mis-registration of the left and right eye views when zooming into the image. Without this coordination, mis-registration as the viewer zooms in destroys the 3D effect.

    The Stereopan.com flash viewer does not work for iPhone, iPad, Android, etc. (as Flash does not work here), so I have been working on creating a Stereo-3D viewer that will work well on these systems.

    I have developed a stereo 3D panorama viewer using HTML5 flat panorama viewing technology that outperforms my experiments with stereo krpano multiresolution HTML5 and Flash setups in multiple ways:

    1. There is no lag between the left eye and right eye views; the two eyes are always perfectly in sync because the viewer script draws both to screen at the same time, not one eye after the other. Both eye views receive movement information at the same time, so one eye is not playing catch-up with the other (see the sketch after this list).

    2. It does not crash the iPhone or iPad, which I have found running two multiresolution krpano HTML5 viewers within the iOS environment definitely does within a few seconds or minutes of start-up.
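
    A simplified sketch of the single-loop idea from point 1 (the View type and its methods are illustrative stand-ins, not the actual Stereopan.com source):

    Code
    // One shared render loop drives both eye views from the same state, so neither eye lags.
    interface View {
      setOrientation(pan: number, tilt: number, zoom: number): void;
      draw(): void;
    }

    function startStereoLoop(
      leftEye: View,
      rightEye: View,
      getState: () => { pan: number; tilt: number; zoom: number }
    ): void {
      const tick = () => {
        const { pan, tilt, zoom } = getState();   // one shared source of truth for both eyes
        leftEye.setOrientation(pan, tilt, zoom);
        rightEye.setOrientation(pan, tilt, zoom); // both eyes updated in the same frame
        leftEye.draw();
        rightEye.draw();
        requestAnimationFrame(tick);
      };
      requestAnimationFrame(tick);
    }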

    My viewer is not spherical, so I would be very excited to switch my development focus for Stereopan.com to the krpano multiresolution HTML5 viewer for stereo 3D panorama viewing, but at this time, through my experimentation with multiresolution krpano in a stereo environment, I find the results unsatisfactory (there are sync delays between the views, iOS Safari crashes...).

    Viewing these panoramas on an iOS device is really amazing! An Oculus Rift is not necessary for great 3D viewing; one can use an iPhone or iPod touch with the Hasbro My3D viewer for a high resolution experience (and yes, it can use the gyroscope for head tracking). Get one for only $5 on eBay. Viewing anaglyphs on the iPad is great fun with excellent resolution (you just need 50 cent red/blue glasses). I would imagine that when the next iPad Mini comes out with retina resolution, it would be perfect for making an Oculus Rift type device for viewing panoramas with head tracking and super high resolution.

    I hope that in the future there could be some way with krpano HTML5 multiresolution to have the two eye views stay perfectly in sync with each other without update lag, and of course to somehow turn down the memory usage of krpano HTML5 multiresolution when two viewers are present in a window at the same time, so as not to crash iOS Safari. This would allow me to fully use krpano HTML5 multiresolution in a stereo environment.

    Here are links to my various Stereo-3D Viewer experiments:

    Cross Eye 3DTV krpano html5 multiresolution
    Parallel Eye 3DTV krpano html5 multiresolution
    Over/Under 3DTV krpano html5 multiresolution

    Multiple View Options krpano FLASH multiresolution

    Stereopan.com Viewer is my in-browser anaglyphing flat gigapixel 3D-stereo panorama viewer using HTML5.
    It is a synchronized flat viewer and will work without issue in iOS (anaglyphing doesn't work in Internet Explorer).

    Another for my Stereopan Deep Zoom stereo-3D viewer:
    Philipsburg Refinery by Ron Schott, can be zoomed in for incredible 3D detail!

    If you are interested, please take a moment to try the different viewing methods and technologies; everyone's feedback would be most welcome!

    Shooting 3D-Stereo panoramas is quite a lot of fun. Of course, it doubles some of the work, but the results are very satisfying, and you get a lot of questions out in the field.

    Thanks

    Jason Buchheim *smile*

    Hi Ciul,

    This is a very interesting concept and will be great as an overlay on a pano *smile*. Although I feel it is a bit misleading when you mention pages loading within distorted hotspots being in 'beta' above, as this would require the page to be loaded and rendered inside of the Flash player (not as a JavaScript overlay), and that is something the Flash player is not going to be capable of any time soon (HTML parsing and loading images from websites that do not have open crossdomain.xml files in place).

    Keep up the good work though *thumbsup* !

    Jason

    Hello,

    I would like to introduce you to 3dpan.org, a website developed for and dedicated to viewing three-dimensional (3D stereo) gigapixel panoramas with no eye strain!

    Red-Cyan anaglyph glasses (those funny glasses) are required. The anaglyphing is performed in your browser in real time using a proprietary method that minimizes ghosting and eliminates eye strain.
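
    For the curious, the plain red-cyan channel mix at the heart of any browser anaglyph (the proprietary ghosting reduction and alignment are separate and not shown here) looks roughly like this:

    Code
    // Basic red-cyan anaglyph compositing on canvas pixel data: red channel from the left
    // eye, green and blue from the right. Function and parameter names are illustrative.
    function composeAnaglyph(left: ImageData, right: ImageData, output: ImageData): void {
      const l = left.data, r = right.data, o = output.data;
      for (let i = 0; i < o.length; i += 4) {
        o[i] = l[i];         // red from the left image
        o[i + 1] = r[i + 1]; // green from the right image
        o[i + 2] = r[i + 2]; // blue from the right image
        o[i + 3] = 255;      // fully opaque
      }
    }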

    I just presented this new technology at the Gigapixel Imaging for Science conference at Carnegie Mellon University on November 13th. 3DPan.org uses new proprietary methods to align the left and right views at whichever point the viewer positions their mouse. 3DPan.org can display any image shot as a three dimensional stereo pair and provides auto-alignment between the underlying left and right images. Image pairs are uploaded to the Gigapan.org website as individual left and right images, then combined for stereo viewing at 3dpan.org.

    The underlying panorama viewing technology used is krpano ;)

    3DPan.org is still in beta testing and new features will be added soon.

    Full information on shooting and displaying in 3D can be found by viewing this Gigapan http://www.gigapan.org/gigapans/64651/

    When you add good stereo-pairs to Gigapan, I will add them to the 3dpan.org site. Non-commercial work only (as per Gigapan terms of service).

    Please provide feedback; I hope you enjoy it! Comment at 3dpan.blogspot.com

    Jason Buchheim
    Developer 3DPan.org