When you mix water with Magnesium, sparks fly. Sparks also fly when you mix Media with Tech.
Such was the case at Demo Day where the NY Media Center Incubator companies get to showcase incredible innovations.
This post details some of what happened behind the scenes.
The NY Media Center, home to our company (ETC), runs a spectacular Demo Day & Marketplace where its Incubator companies get to showcase incredible innovations, ranging from slow-motion video analysis that reveals the techniques of guitar virtuosos to planning your next vacation in outer space (with the aid of professional astronomers teaching science). There is much to see, much to try out, and much to learn.
It began with a problem
My journey to Demo Day began while I was participating in a VR Hackathon held at NYU's ITP. I was designing an App in Unity that plays immersive 360-degree videos on the Android platform. Unity is an industry-standard game engine, with facilities to build, test, and analyze your games and applications.
Just one problem: you're developing and testing on a desktop computer, but your target platform is a separate device, a smartphone. Although you can play-test your App on the desktop, it's not the same as the phone it gets installed on.
The big problem (and it's a showstopper) is that video content rendered onto a spherical texture just won't play back on the desktop computer. It runs fine on the smartphone, but if you try playing it on the desktop, all you'll see is a blank white canvas instead of the video.
Your only recourse is to build your standalone App, take that file, install it on your phone, and test it there. If you need to make even the slightest tweak, like repositioning a visual object just a tad in your virtual environment, you have to go back to your Unity system, make the change there, and push it out to the Android device again. And if it's not quite right, you have to tweak it again. This makes design and development tedious and sllooow.
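To make the pain concrete, here is a minimal sketch of that round-trip, assembled as command lines in Python. Unity's batch-mode flags (`-batchmode`, `-projectPath`, `-executeMethod`) and `adb install -r` are the standard tools for this workflow, but the paths and the `BuildScript.BuildAndroid` method name are hypothetical placeholders, not from my actual project.

```python
# Sketch of the slow iteration loop: rebuild the APK with Unity's
# command-line batch mode, then reinstall it on the phone over USB.

def unity_build_cmd(unity_exe, project_path, build_method):
    """Command line that compiles the whole project into an Android APK."""
    return [
        unity_exe, "-batchmode", "-quit",
        "-projectPath", project_path,
        "-buildTarget", "Android",
        "-executeMethod", build_method,
    ]

def adb_install_cmd(apk_path):
    """Command line that pushes the (often 100+ MB) APK to the device."""
    return ["adb", "install", "-r", apk_path]

if __name__ == "__main__":
    import subprocess
    build = unity_build_cmd("/opt/Unity/Editor/Unity",
                            "MyVRProject", "BuildScript.BuildAndroid")
    install = adb_install_cmd("Builds/MyVRApp.apk")
    # Every tiny tweak repeats BOTH steps -- minutes per iteration:
    # subprocess.run(build, check=True)
    # subprocess.run(install, check=True)
    print(" ".join(install))
```

The point of spelling it out: a one-pixel repositioning costs a full compile plus a full reinstall, which is exactly the cycle the rest of this post sets out to eliminate.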
An epiphany ensued
Then it hit me: the Android phone already possesses interactive capabilities. So why not enable edit capabilities directly on the phone itself? That way, you can make all your editing changes right on the device.
And this idea gave rise to our product, XpressVR, which is what we pitched at Demo Day.
Originally, we were going to pitch some other cool tech we are developing related to haptics and VR, but our ongoing work with a major theme park demands that we keep it under tight wraps.
Preparing for Demo Day
When I first heard about Demo Day at the Media Center, I thought: wow, this is a great opportunity to showcase some really cool things I am doing. For the first pitch rehearsal, a group of fellow "Incubatees" convened in the screening room, a 72-seat auditorium. I was assigned a Pitch Mentor, Doug D'Arrigo, a really uber-smart guy whose marketing genius is built into his DNA. I was also given an Industry Mentor, Jess Engels. It's really great to have these powerhouses covering your back.
At the first pitch rehearsal, I presented some work I was doing on haptics, a technology that adds force feedback and makes VR far more immersive and real. As it turned out, my ongoing work with a major theme park involving haptics snowballed to the point where I was no longer free to talk about it because of its confidential nature. It seems for now that the only ones privy to my haptics work are that theme park and the NSA; the first by choice, the second not.
For the second pitch rehearsal, it was back to the drawing board.
Meet the Demo Crew
Before I describe the second rehearsal, I want to say something about my peers pitching alongside me. Wow. They are an impressive bunch.
Dawn is a world-class reporter in finance and emerging markets who is using video as a megaphone to reach top-tier financial and economic influencers worldwide.
John is creating a new kind of platform that allows the community of creators, artists, musicians, filmmakers, and writers to keep the monetization and management of their creative assets directly under their control. I suspect a few large corporations who take liberties on creative artists are not going to be thrilled with this development, but they may not have a choice.
Penelope is a class act. She combines sophistication, wit, expressiveness, and passion in using tech to build the definitive clearinghouse on the bewildering array of toxic chemicals that affect children from the prenatal stage all the way through puberty.
Dan is developing technology for social media that could aptly be described as self-organized curation, built to handle the deluge of media content. He's applying it to the fashion world, where almost every image among the tens of thousands of pictures snapped on the runway is sheer eye candy.
Science has always been appealing and fun. The unfortunate thing is that this is a minority viewpoint. Olivia, a laser engineer turned storyteller, is bringing science to the masses in a way that's inviting and fun, yet doesn't water down the real science. So how does she do it? How does she give science this new face? Her core ingredient is putting her audience at the very center of what's happening. We all know about lab rats and the mazes they traverse for a piece of cheese. Well, what if we flipped the roles so the lab rats become the scientists, placed you in a supersize maze, and had you do their bidding while they measure your memory, sense of direction, or some other metric? Beyond audience involvement, she has an army of real-world scientists running all sorts of fun activities; maybe plan your next vacation to outer space with a professional astronomer as your travel agent. They can surely give you the best tips on out-of-this-world places to visit, and on not getting sucked into a black hole.
Alex has designed an entirely new way to visualize and map scripts and stories. It runs on the iPad. I had a chance a few months earlier to view an early prototype. It totally redefines the way films, plays, TV shows, and live events can be portrayed while the planning and creation is happening. What's really impressive is that he got it right, because he really thought it through.
The Second Pitch Rehearsal
For me, because I needed to switch topics, it was pivot time. Actually, the pivot wasn't too hard. I more or less knew what I wanted to cover. The challenge was that I had a lot of real technology to create. It's one thing to talk about an idea; it's entirely another to have something really working rather than merely giving the illusion of working.
By the way, my second pitch was truly crappy. Aside from a jumbled delivery (I am sure mumbled or fumbled would have been apt descriptions too), it lacked any fancy graphics or slides. It was filled with jargon and had far too many superfluous topics. The tech I was creating hadn't yet been given an official product name. Too many things were half-baked, but the core idea of where the technology was headed was sound and solid. I knew the product I was creating would do something really significant, and my software design approach was spot on.
My idea, in its simplest form: creative tasks like editing a VR scene should not require you to be the expert. VR editing should be as easy as playing a video game; it should virtually feel like you're playing a video game.
VR has seen some great conceptual innovations, but the tools and thinking for VR content creation lag far behind. You design in one environment, but you deploy and test in another. To make a change, you go back to the first environment, make the change, and press a button to compile, which pushes possibly hundreds of megabytes through a USB cable to the smartphone that will run the VR App. Say some magic incantations, and maybe a few more, because you may be waiting minutes before the code actually launches on your phone.
This is all wrong. It is senseless. My VR device, an Android phone, executes an application program written in Unity. The program launched on my Android is already doing a lot of processing and computing, yet I'm ignoring its capability to turn that processing power and interactivity toward making editing changes on the phone itself.
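One way to picture the idea (a hypothetical sketch, not XpressVR's actual code): record each on-phone edit as a small transform override keyed by object name, so a tweak made with touch input can be saved as a few bytes of JSON and reapplied at launch, with no rebuild and no USB cable involved. The scene representation and field names below are invented for illustration.

```python
import json

def apply_overrides(scene, overrides):
    """Apply saved position overrides to a scene dict: name -> (x, y, z)."""
    for name, pos in overrides.items():
        if name in scene:
            scene[name]["position"] = list(pos)
    return scene

# Baseline scene as authored on the desktop.
scene = {"VideoSphere": {"position": [0.0, 0.0, 0.0]}}

# The user nudges the sphere with a touch gesture; instead of recompiling,
# we record the edit and persist it on the phone as JSON.
overrides = {"VideoSphere": (0.0, 1.5, 0.0)}
saved = json.dumps(overrides)

# On the next launch, the tweak is restored in milliseconds.
scene = apply_overrides(scene, json.loads(saved))
print(scene["VideoSphere"]["position"])  # [0.0, 1.5, 0.0]
```

The design choice this illustrates: the expensive compiled App ships once, while cheap, data-only edits travel separately, which is what makes the "edit on the device itself" loop fast.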