
Report this content

We want the Dreams community to be a safe, diverse and tolerant place for everyone, no matter their age, gender, race, sexual orientation or otherwise. If you believe this content to contradict these principles, you can file a report for our community teams to investigate.

Note that misuse of the reporting tool will not be tolerated.

Item being reported:

A forum post by TheBeardyMan

"How do I?" is a solved problem in this case - this is more of a "Should I?"

Unless a VR headset has hardware to measure the distance between the user's pupils - and as far as I am aware, the PlayStation VR headset doesn't - the separation in scene space between the VR left camera and the VR right camera must be based on an assumed average pupil separation.
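To illustrate what I mean, here's a minimal sketch in plain Python (not Dreams logic; the names and the 63 mm figure are just placeholders I picked) of how a renderer typically derives the two eye positions from a single head pose and an assumed IPD:

```python
# Minimal sketch: deriving stereo camera positions from an assumed IPD.
# The names and the 63 mm value are illustrative, not Dreams' actual internals.

ASSUMED_IPD_M = 0.063  # assumed average interpupillary distance, ~63 mm

def eye_positions(head_pos, right_axis, ipd=ASSUMED_IPD_M):
    """Offset the left/right cameras half an IPD along the head's right axis."""
    half = ipd / 2.0
    left = [h - half * r for h, r in zip(head_pos, right_axis)]
    right = [h + half * r for h, r in zip(head_pos, right_axis)]
    return left, right

# Example: head 1.6 m up at the origin, right axis = +X
left_eye, right_eye = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
print(left_eye, right_eye)  # cameras end up 63 mm apart in scene space
```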

If the user's pupil separation is wider than that assumed value, the scene is rendered as it would be for narrower-set eyes, so to them the scene will appear large.

If the user's pupil separation is narrower than that assumed value, the scene is rendered as it would be for wider-set eyes, so to them the scene will appear small.
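To put rough numbers on that (my own back-of-the-envelope, assuming the apparent scale of the world is roughly the ratio of the user's real IPD to the rendered camera separation):

```python
# Rough back-of-the-envelope: if the cameras are separated by the assumed IPD
# but the user's real IPD differs, the world's apparent scale is roughly
# real_ipd / assumed_ipd (wider-than-assumed eyes => the world looks bigger).

ASSUMED_IPD_MM = 63.0

for real_ipd_mm in (58.0, 63.0, 70.0):
    apparent_scale = real_ipd_mm / ASSUMED_IPD_MM
    print(f"real IPD {real_ipd_mm} mm -> a 1 m cube looks ~{apparent_scale:.2f} m")
```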

Considering this, should a VR dream include a way to calibrate the VR scale? For example, a 1m x 1m x 1m cube 3m in front of the camera, a Controller Sensor with a control wired via some logic to the VR scale tweak of the active camera, and an instruction to the user to adjust the VR scale until the cube *looks like* a 1m x 1m x 1m cube 3m in front of the camera?
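In code terms, the calibration step I have in mind would amount to something like the sketch below. This is just the idea translated into Python for clarity; in Dreams it would actually be a Controller Sensor wired through logic into the camera's VR scale tweak, and the button names and step size here are placeholders.

```python
# Sketch of the proposed calibration step, translated from Dreams gadgets
# into plain Python. "vr_scale", the button names and SCALE_STEP stand in
# for whatever the Controller Sensor and camera tweak actually expose.

vr_scale = 1.0      # multiplier fed into the active camera's VR scale tweak
SCALE_STEP = 0.01   # how much one button press nudges the scale

def on_controller_input(button):
    """Wired from the Controller Sensor: d-pad up/down adjusts the VR scale."""
    global vr_scale
    if button == "dpad_up":
        vr_scale += SCALE_STEP
    elif button == "dpad_down":
        vr_scale = max(0.1, vr_scale - SCALE_STEP)
    # The user keeps adjusting until the reference cube *looks like*
    # a 1 m x 1 m x 1 m cube sitting 3 m in front of the camera.
    return vr_scale
```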
