
RC planes, Drones and VR flight sims

Things have progressed quite a bit from when I was a kid making paper planes and learning MS Flight Sim! I still remember sitting in a Cessna at Meigs Field:

At one point, when I was about 14, I built my own glow engine trainer and enjoyed getting castor oil absolutely everywhere. My fashion sense was... appropriate for the time.

Of course, in 2014, when I found out how accessible and cheap quadcopters had become, and that video transmission gear could send live images to a set of goggles, I HAD to get in on this, so I started building with a bunch of low-cost components from Shenzhen:

And soon I was doing full 3D acrobatics:

Since then I've competed successfully in the Singapore racing scene, and accompanied Ikura FPV to Hawaii as his spotter for the international Drone Worlds competition, which featured pilots from ~80 countries:

But then along came VR...

My prescient prediction that "one day I can do this with computer graphics" was realised. I hate to say it, but I told you so, and I threw all practical considerations aside to build my own VR cockpit. It turns out the used furniture section at IKEA is the best place to go for this:

The interior has an Oculus, a full-fidelity replica of the A-10C Warthog HOTAS controls, rudder pedals and a one-handed keypad, ensuring maximum immersion. The game that runs all this is DCS World, where I fly in a 1:1 simulation of the Georgian/Caucasus region with up to 60 online players at a time, many of whom are real-world pilots.

A few notes:

  • Muscle Memory- I spent a large amount of time learning the aircraft and the controls I needed to operate it. I first used other people's joystick bindings (looking for what was popular) and then branched off into my own config. This was incredibly important to nail BEFORE I spent too much time flying, because when I decide to learn the next aircraft I'll want a similar layout for similar tasks. If you are going to learn a flight simulator, it pays dividends to learn how aircraft typically bind HOTAS controls and to mimic their style; this way you'll avoid reflexively hitting the wrong binding over and over because you had a completely different layout last time.

  • UI/UX- Building on that, the joystick is actually a replica A-10C stick, so all the buttons map 1:1 to the aircraft. If you do learn the A-10C with this stick, it's an absolutely sublime UI/UX journey as you discover what 50+ years of aeronautical research into those disciplines means for this use case.

  • The Keypad- If you need to take the headset off to find a keyboard key, it ruins the immersion and destroys your situational awareness. I bound commonly used keys to the keypad to avoid that, typically F1-F12 and a center-view binding for the headset. Likewise, I use TeamSpeak to communicate with wingmen via voice, and spent a while learning the Bullseye coordinate system.

  • Sim Sickness- Surprisingly there isn't much! Vehicle-based experiences are probably the best suited to VR if you want to avoid sim sickness, and a complex game translates well. Of course, when you eject and your body flies into space and then blacks out, you might feel a bit queasy.

  • Things are hard to see- The instruments are not always easy to read, but at least with positional tracking you can move your head nearer for a better look. For aircraft a long way off you'll need the radar/AWACS, just like in real life, and if you're in a dogfight you'll quickly learn that working in front of a computer has made you prone to neck sprains.

  • Closet- The ultimate man-cave; it makes an IKEA catalogue look inefficient.

Here's an online dogfight with a voiceover explaining what I'm doing and why, and showing you some of the instruments that need attention:

Telemetry and post-flight analysis.

I use Tacview for post-flight analysis, so I can compare what I thought I could see with what I actually didn't see, and therefore understand the decisions I made. It's an exceptional learning tool for analysing your flights, and it can even load real-world flight data. Here's some odd missile behaviour that I was intent on checking out post-flight.

And then came Eye-Tracking...

Head tracking (where your head is pointing) only reveals so much. You can infer what the decision-making process was, but without knowing exactly what the pilot is focusing on, this is only an approximation. If you can track a pilot's eyes, that changes: I hooked up a Tobii Pro HMD with its Python SDK to control the Windows mouse, and can even click cockpit buttons by looking at them. The small blue cross is where I'm looking:
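
If you're curious what the plumbing looks like, here's a rough sketch of the idea rather than my exact script: it assumes the desktop flavour of the Tobii Pro Python SDK (tobii_research) with its normalised display-area gaze points, whereas the HMD variant reports gaze slightly differently, so treat the coordinate mapping as illustrative.

import ctypes
import tobii_research as tr

# Screen size in pixels via the Win32 API (SM_CXSCREEN / SM_CYSCREEN).
SCREEN_W = ctypes.windll.user32.GetSystemMetrics(0)
SCREEN_H = ctypes.windll.user32.GetSystemMetrics(1)

def gaze_to_cursor(gaze_data):
    # Average the two eyes' normalised (0..1) gaze points and map to pixels.
    left = gaze_data['left_gaze_point_on_display_area']
    right = gaze_data['right_gaze_point_on_display_area']
    x = (left[0] + right[0]) / 2.0
    y = (left[1] + right[1]) / 2.0
    if x != x or y != y:  # NaN means the tracker lost the eyes this sample
        return
    ctypes.windll.user32.SetCursorPos(int(x * SCREEN_W), int(y * SCREEN_H))

eyetracker = tr.find_all_eyetrackers()[0]
eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, gaze_to_cursor, as_dictionary=True)
input("Gaze is driving the cursor - press Enter to stop.\n")
eyetracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_to_cursor)

The "click by looking" part can be as simple as firing a synthetic mouse click whenever the gaze point dwells within a small radius for a fraction of a second.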

Where is this going?

On several levels this is exciting. For user research, knowing what someone's eyes are doing is key to understanding how they think. Likewise with med-tech: the eyes offer the highest-bandwidth pathway to the brain and reveal higher-order cognitive function along with structural characteristics. For simulated environments, we can analyse how people respond to stimuli and make decisions in stressful situations, or simply understand which product packaging they prefer.

With eye tracking coming to the HTC Vive in 2019 and to HoloLens, and already built into Magic Leap, you can start imagining a future where your eyes don't lie: they tell us more than you think.
