Website 2.0 Underway

It's coming... A much-needed step into the world of responsive design, after coming to appreciate, master, and now outgrow the good ol' 960 grid system.

Launch Date: 12/30/2014


Google X's Smart Contact Lens Project

Today Google[x] announced their next project: a smart contact lens that detects glucose levels from your tears. Basically, it's an extremely small wireless chip and a miniaturized glucose sensor sandwiched between two layers of contact lens material. The lens measures and relays your glucose levels around the clock, making diabetes management a significantly easier task. Soon enough, diabetics can say goodbye to finger pricking.

So it's not a contact lens version of Google Glass... yet.

Google is already testing working prototypes of the lens and has begun talks with the FDA. Project co-founders Brian Otis and Babak Parviz say their next step is to explore "integrating tiny LED lights that could light up to indicate glucose levels having crossed above or below certain thresholds." Naturally, this technology is still in its infancy, but given the exponential rate at which technology is advancing, squeezing in an array of LEDs could happen sooner than we think.

Of course, Google isn't the only one working on smart contact lenses. Researchers at UNIST and Samsung are developing their own smart lenses for glaucoma patients. So while it's not a contact lens version of Google Glass just yet, these projects give us a teaser of what's lined up in the next generation of wearable and augmented reality tech: full-fledged immersion.


Read it on Medium:


The Augmented Reality Workplace

SAP is partnering with Vuzix to create the next generation of augmented reality solutions for the enterprise. Much like Google Glass, the Vuzix M100 "Smart Glasses" are also an Android-based wearable computer. The key difference is that the M100 (besides looking clunkier) is targeted at commercial and professional users. This concept video showcases a warehouse picker navigating his environment while accessing relevant information feeds in real time.

Now imagine a world where augmented reality, smart devices, drones and analytics work together seamlessly... Coming soon to an Amazon warehouse near you ;)


Apple iPhone 5S vs Samsung Galaxy S4 vs HTC One - Which Phone Should You Buy?

The Apple iPhone 5S was finally announced last week. Not sure which smartphone to buy? Is the Galaxy S4 or the HTC One better? Get the lowdown on these flagship models and learn how to determine which phone is the right one for you. Also my first YouTube tech video!

Phone Spec Comparison Table (Website):
Getting Galaxy S4 Features on your HTC One (YouTube):

VFX Test Film - Super Hero vs Mech Warrior

Here's an extended edit of the final project for my advanced Maya class with the ITP 3D wizard, Prof. Lance Winkel. The objective was to model, texture, rig, and animate a 3D character from scratch. I had already been looking to do a CG + live-action test using Image Based Lighting (IBL), so I thought I'd spice things up by making the project a live-action short.


Lighting the CG model: I captured an HDRI probe (read: cheap reflective garden dome) on location and used Image Based Lighting (IBL) to light all the CG elements to match the live-action footage. IBL is extremely useful, especially in low-light scenarios like the ones in my footage. Sure, additional light sources and tweaking are required for each specific angle/scene, but the HDRI map creates a solid starting point and does most of the work for you.
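The core idea behind IBL is simple to sketch: treat the captured HDRI as an equirectangular environment map and, for any surface or reflection direction, look up the light arriving from that direction. Here's a minimal, renderer-agnostic Python illustration of that lookup (the function names and the toy `hdri` grid are my own for illustration, not part of Maya or any specific renderer):

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a unit direction vector to (u, v) coordinates in a
    lat-long (equirectangular) environment map, both in [0, 1].
    u comes from the azimuth, v from the polar angle off the +Y axis."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

def sample_environment(hdri, x, y, z):
    """Nearest-neighbor lookup of the radiance arriving from direction
    (x, y, z). `hdri` is a height x width grid of RGB triples, standing
    in for a real HDR image unwrapped from the probe photo."""
    u, v = direction_to_latlong_uv(x, y, z)
    h, w = len(hdri), len(hdri[0])
    row = min(h - 1, int(v * h))
    col = min(w - 1, int(u * w))
    return hdri[row][col]
```

A real IBL renderer integrates many such samples over the hemisphere around each shading point, which is why the probe alone gets the overall lighting so close before any manual tweaking.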

3D Camera Mapping: As is the case with most short-notice video shoots (done in two hours during finals week), some of the footage I went home with was less than optimal for VFX work. Most notably, the video I shot for the mech-running-down-the-tracks sequence had too much motion blur to be tracked in Boujou. To salvage the shot, I took a clean frame of the train tracks footage and recreated the set using 3D camera mapping (also called photogrammetry). After some Photoshop cleanup work to fill the holes that the perspective shift would expose, I had the flexibility to try a lot of different camera moves. This made zooming in and out on the mech running toward the camera an easy task, and let me create the exact camera move I wanted in the comfort of post-production, without sacrificing realism or quality.
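Camera mapping boils down to pinhole projection: every vertex of the rebuilt set is pushed through the original camera to find which pixel of the clean frame should "stick" to it, so the still image stays glued to the geometry as a new virtual camera moves. A minimal sketch of that projection, assuming an unrotated camera looking down -Z with square pixels (parameter names are mine; Maya and other packages handle the full camera model for you):

```python
def project_point(point, cam_pos, focal_length_mm, film_width_mm, img_w, img_h):
    """Project a 3D world point into pixel coordinates with a simple
    pinhole camera at cam_pos looking down -Z. Returns (u, v) with v
    measured from the top of the frame, as in image coordinates."""
    # World -> camera-space coordinates (translation only, no rotation)
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide onto the film plane
    film_x = focal_length_mm * x / -z
    film_y = focal_length_mm * y / -z
    # Film millimeters -> pixels
    px_per_mm = img_w / film_width_mm
    u = img_w / 2 + film_x * px_per_mm
    v = img_h / 2 - film_y * px_per_mm
    return u, v
```

Projecting each vertex this way yields the UVs for the rebuilt set; the Photoshop cleanup covers the areas the original frame never saw, which become visible once the new camera deviates from the original position.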

Learning Resources: Paul Debevec, AD and professor at the USC ICT, is a pioneer in the areas of HDRI, IBL, and photogrammetry. He also happens to have a plethora of resources about these areas online, including research papers and HDRI probe images: