22 July 2012

NotAGenius Interactive Stereoscope ... (3D Trifle part 2)


Finally I'm prepared to reveal to the world my genius virtual reality headset concept. It's a new VR headset that will revolutionise gaming and home cinema... and it's so ingeniously simple, an idiot could've invented it. Ladies and gents, I give you... the NotAGenius-Interactive-Stereoscope!

Ok, rubbish title. But it's the idea that counts.
[Image: the new Sony* VR headset (for illustrative purposes only; my concept is nothing like this. Probably.)]

The problem with existing virtual reality systems is that no matter how good the graphics and how smooth the animation, it's still not quite as real as reality. One of the reasons for this is focal depth. In the real world, when you look at something in the distance, close-up objects in your peripheral vision go out of focus. Likewise, when you focus on an object close up, your distance vision goes out of focus. In the virtual world, either everything is in sharp focus (regardless of its depth), or the programmer/director who created the environment has decided which depth the viewer is supposed to look at, by forcing specific depths into focus (you can see this in any modern 3D movie). My new interactive VR headset removes this choice from the game/film creator and puts it back in the hands (or head) of the viewer. For example, you will be able to play a first-person VR game where you can focus at any depth, and the depths you are not looking at go out of focus. This makes the user's VR experience all the more realistic.
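To make that concrete, here's a minimal sketch in Python (NumPy plus SciPy) of the sort of depth-of-field pass the headset would drive. The frame, the per-pixel depth map and the blur strengths are all invented for illustration; the point is only that each pixel gets blurred in proportion to how far its depth sits from wherever the viewer happens to be focused.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def apply_depth_of_field(frame, depth_map, focal_depth, aperture=0.05):
        """Blur each pixel according to how far its depth is from the focal depth.

        frame       -- H x W x 3 float array, the rendered view for one eye
        depth_map   -- H x W float array, per-pixel scene depth in metres
        focal_depth -- the depth (metres) the viewer is currently looking at
        aperture    -- made-up strength factor; bigger means shallower focus
        """
        # Circle of confusion: how out-of-focus each pixel should be.
        coc = aperture * np.abs(depth_map - focal_depth) / np.maximum(depth_map, 1e-3)

        # Cheap approximation: blend between a few pre-blurred copies of the
        # frame according to each pixel's circle of confusion.
        sigmas = [0.0, 1.0, 2.0, 4.0, 8.0]          # Gaussian blur radii in pixels
        blurred = [gaussian_filter(frame, sigma=(s, s, 0)) for s in sigmas]

        level = np.clip(coc * 100.0, 0, len(sigmas) - 1)   # map CoC to a blur level
        out = np.zeros_like(frame)
        for i, img in enumerate(blurred):
            weight = np.clip(1.0 - np.abs(level - i), 0.0, 1.0)[..., None]
            out += weight * img
        return out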


So how does it work?

It's as simple as the vacant expression that currently adorns your face. Inside the headset there are two little cameras looking through the screens to watch each eye. Each camera can tell exactly which point on the screen the eye is looking at. From this input the system creates instantaneous metadata that feeds back into the output of the headset: the stereo picture you see is altered instantly to put the correct depth of field on whatever you are viewing at the time. Not only will the system know what you're looking at from the position of an individual eye, it will also take the positions of both eyes together, giving convergence data that corroborates the correct focal depth for the 3D image inside the headset. These readings will be taken at least 48 times per second. Obviously everyone's eyes are different, so a calibration needs to be done for each user. It only needs doing once though, as settings can be saved and reloaded whenever someone different uses the same headset.
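Here's a rough Python sketch of that loop, under some loudly stated assumptions: read_gaze and renderer.set_focal_depth are hypothetical stand-ins for whatever the eye cameras and the game engine actually expose, and the convergence geometry is the simplest possible (two eyes a fixed distance apart, angled inward at a common point). The per-user calibration is just saved to and reloaded from disk, as described above.

    import json
    import math
    import time

    IPD = 0.063  # interpupillary distance in metres (a typical adult average)

    class StereoscopeTracker:
        """Per-frame loop: read both eyes, estimate focal depth from convergence,
        hand that depth to the renderer, and keep per-user calibration on disk."""

        def __init__(self, renderer, ipd=IPD, rate_hz=48):
            self.renderer = renderer
            self.ipd = ipd
            self.period = 1.0 / rate_hz          # "at least 48 times per second"
            self.calibration = {"offset_left": 0.0, "offset_right": 0.0}

        def focal_depth(self, angle_left, angle_right):
            """Distance to the point both eyes converge on, given each eye's
            inward rotation (radians) from straight ahead."""
            a_left = angle_left - self.calibration["offset_left"]
            a_right = angle_right - self.calibration["offset_right"]
            vergence = a_left + a_right
            if vergence <= 1e-4:                 # eyes parallel: looking at infinity
                return float("inf")
            return (self.ipd / 2.0) / math.tan(vergence / 2.0)

        def save_calibration(self, path):
            with open(path, "w") as f:
                json.dump(self.calibration, f)

        def load_calibration(self, path):
            with open(path) as f:
                self.calibration = json.load(f)

        def run(self, read_gaze):
            while True:
                left, right = read_gaze()        # hypothetical camera read-out
                self.renderer.set_focal_depth(self.focal_depth(left, right))
                time.sleep(self.period)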


But how can the little cameras see through the screens?

I'm glad you asked. The solution is as simple as simplicity itself, if simplicity had been personified, lobotomised, and christened Boris. The cameras don't actually look through the screens but through a two-way mirror. This mirror sits at a 45-degree angle, and the actual screens (above, perpendicular to the viewer's line of vision) are reflected in it. The inclusion of such a mirror in the headset (when shaped correctly) also makes the screen appear larger and further away, rather than small and right up against your face. As people find it more difficult to view screens close up, this is another reason the as-yet-unpatented NotAGenius Interactive Stereoscope headset is more comfortable on the eye.
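The post only says the mirror has to be "shaped correctly"; assuming that shape is a concave combiner, the standard mirror equation gives a feel for where the screen appears to sit and how much bigger it looks. A toy Python calculation with made-up numbers:

    def virtual_image(screen_dist_cm, focal_length_cm):
        """Concave-mirror equation: where the screen appears to be, and how big.
        1/d_o + 1/d_i = 1/f, magnification = -d_i/d_o. Numbers below are made up."""
        d_i = 1.0 / (1.0 / focal_length_cm - 1.0 / screen_dist_cm)
        return abs(d_i), -d_i / screen_dist_cm

    # A small screen 5 cm from a combiner with a 7 cm focal length appears
    # roughly 17.5 cm away and about 3.5 times larger.
    print(virtual_image(5.0, 7.0))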


But that's not all!

Oh no sir. For with this new technology it will be possible to create 3D feature films where the viewer can choose what he/she is looking at.


Whatchu talkin' 'bout, genius?

Well, I'll tell you. Just as this new technology will make a video gaming experience more real and interactive, it will make a movie-watching experience more real and, yes, interactive. Imagine a film where the main action is in the foreground. Perhaps two people are talking in a bar. If nothing relevant is happening at the back of the shot, the director will only focus on the people talking. With my new system, a viewer (having seen the film already and grown bored with the foreground actors' dialogue) can instead focus on whatever that non-speaking extra is doing at the back of the shot. It'll be like you're there: you choose what you're looking at. Obviously such films would have to be specially made with a very long depth of field, so that every depth is viewable.


So what use is it?

A writer/director of such a movie can put hidden 'easter egg' elements in their films: a passing character or vehicle in the background that turns up later in the film, or some background action that it will only occur to you to focus on the second time you see it. And if you've got one of those movies by the likes of David Lynch... you know the ones, the ones that don't really make any sense... well, you can have more 'clues' throughout the film. More depth, more clues, more weirdness for fanboys to discover, that not everyone will get, and then argue about on internet forums for decades to come. It opens up a whole new way of creating cinema: cinema that's 3D, viewed within a headset, interactive, with hidden elements. Not only that, but the interactivity blurs the lines further between video games and movies. But more on that at a later date.


* Incidentally, if anyone at Sony or any rival company should like to employ my genius brain and make this genius idea a genius reality, please get in touch. I am looking for a more practical outlet to hone my ideas, and there's plenty more ingenuity where this came from.