04/20/19

NAB Show 2019 Breakdown and Overview

        The NAB show was on point this year, delivering another awesome array of software and equipment to sample. Early registration grants a free exhibitor’s pass and, with it, access to the entire showroom floor as well as certain panels, demos, and training sessions. Most of my time was spent in the South Hall talking to software reps and comparing notes with other artists and animators. Out of all of the feats of technology and creativity that I saw at the show, here are my top three takeaways from NAB 2019.

The Mighty Motion Graphics Community

I want to start off by giving a huge shout out to the amazing NAB motion graphics community and to Brian Behm (the brain behind the NoSys brand) for being my guide to all things mo-graph surrounding the show. I was really excited to be at the show and meet some of the gurus that inspired me early in my career.

Things kicked off on Sunday with the NAB Mograph Meetup hosted by Beerhaus. I was pleased to see Mathias, who represented Maxon at Signal in Austin, as well as Nick Campbell, the Greyscalegorilla himself. I had an amazing time chatting with other motion artists who told me that I was going to be overwhelmed by the convention. They weren’t wrong.

On Monday after attending the conference, I was invited to ‘Celebrate with Team Adobe’ at Señor Frog’s. There I met a variety of makers who all work with Adobe products every day. Among them were Kevin Ames and Dave Moser from Photofocus, and Red Giant’s Harry Frank. I left the restaurant feeling grateful to be around so many talented and dedicated people.

On Tuesday I made a point to visit the Cinema 4D booth again and get a wristband for the annual Maxon Pinball Party. After the party, I met up with Brian and several other people from the community and headed to the Cosmopolitan Hotel to find some pizza. And when I say find some pizza, I mean that literally. Every year the gang demands that newcomers lead the group to find Secret Pizza— an unmarked pizza place inside the hotel with no advertisement whatsoever. This year we were led by the creator of Action Movie Kid, Daniel Hashimoto, who actually got us to the Secret Pizza in record time.

Maxon Acquires Redshift Rendering Technologies

Maxon announced on Monday, April 8th, that they had acquired Redshift Rendering Technologies. This move is no huge surprise, seeing as Redshift is one of the most widely used render engines within the Cinema 4D community. This means that if you are a C4D user, you will likely see full integration with Redshift in the near future. That’s great news for any artist or studio looking to get the most out of their GPU rendering pipeline. But why is everyone so interested in making the switch to GPU rendering lately?

The answer has to do with the fact that graphics cards are designed differently than CPUs. A CPU can be expensive, but its handful of processing cores tends to be faster and more accurate. A GPU’s individual cores are slower, but the card consumes less power, costs less, and packs in many more cores that can process information simultaneously. Because of that, a single GPU can render as fast as five to twenty CPUs, which means an individual artist or a small studio can install just four GPUs to achieve studio-quality rendering. The drawback of GPU rendering is that it tends to produce noise that shows up in areas like global illumination and depth of field. For that reason, some studios still prefer CPU rendering because they can afford the cost of the very accurate renders it produces. With the integration of Redshift into Cinema 4D, I would be surprised if we don’t see improved de-noising, among other things that will make graphics card render engines even more appealing.
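
To put rough numbers on that claim, here is a quick back-of-the-envelope sketch in Python. The speed-up range and GPU count are just the illustrative figures from the paragraph above, not benchmarks of any particular hardware.

```python
# Back-of-the-envelope render throughput estimate. All numbers are
# illustrative assumptions from the paragraph above, not benchmarks.

gpu_vs_cpu_speedup = (5, 20)   # assume one GPU renders like 5-20 CPUs
gpus_in_workstation = 4        # a typical multi-GPU artist workstation

low = gpus_in_workstation * gpu_vs_cpu_speedup[0]
high = gpus_in_workstation * gpu_vs_cpu_speedup[1]

print(f"4 GPUs is roughly {low}-{high} CPUs' worth of render throughput")
# -> 4 GPUs is roughly 20-80 CPUs' worth of render throughput
```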

Many film productions are also looking to graphics cards for real-time rendering pipelines. This allows creatives to visualize and capture performance and camera movement inside a motion capture studio. Director Neill Blomkamp did this using Unity to create the sci-fi short series Adam. Broadcasters are also using real-time rendering to place anchors shot on a sound stage into virtual studios. In the exhibit hall, Brainstorm was showing their virtual studio system, which tracks the camera and uses Unreal Engine to render the graphics. As GPU hardware improves over time, real-time graphics will become more realistic and the use of a digital set will become even more attractive.

Brainstorm virtual studio demo at NAB 2019

Artificial Intelligence— the Future of Video Editing

Artificial intelligence and machine learning have been at the forefront of many discussions, and the video industry is not immune to the buzz. Knowing this, I had to attend AI and the Future of Video Editing and Motion Graphics, led by Nick Harauz. Nick took the time to introduce us to a number of features powered by Adobe Sensei, Adobe’s AI-powered toolset. These tools can assist in many things, such as identifying a font in a photo or automatically ducking sound based on other audio cues in your project. Out of all of the features that I saw, these are the ones that really stood out:

Content Aware Fill: Simply animate a mask to remove an object in After Effects and Adobe Sensei will fill in that area based on the content surrounding your mask (also available in Photoshop). If you’ve ever had to take the time to make clean plates, you will understand how powerful this tool is.
Color Match: This feature in Premiere will match the color from one shot in your scene to another with just a couple of clicks. Knocking out the bulk of the tedious work will give your colorist more time to make every shot color perfect.
Morph Cut: When cutting between sound bites, this effect will help smooth out the transition without any extra work. This means less time adjusting the timing of your transparency and audio ducking to achieve a similar effect.
Project Fast Mask: We were also introduced to #ProjectFastMask, which uses machine learning to isolate objects in your footage with just a few clicks. I absolutely hope to see this feature, created by Adobe research scientist Joon-Young Lee, in an upcoming release.

The conference was a great experience, and I would highly recommend NAB to anyone who needs to stay in the loop on the latest trends in the video production and broadcast industry. The next NAB Show is in New York on October 16th and 17th. ■

04/06/19

Nokia Immersive Experience Breakdown

        Nokia put in a request for an immersive experience demonstrating their 5G end-to-end test network. Above Interactive would supply 360° drone footage taken on site at the campus in Irving, Texas. This footage would need match-moving and animation provided by the team at Pixel Mover.

Our Plate Was Full.

We knew that the plate would have a camera move of at least 260 meters as the drone flew from one end of the test site to the other. During this camera move, the animated signal beam would need to be tracked in perspective to keep the two antennas connected. To track something over such a large distance, we decided to use Maya, where we felt most confident in the tools for rigging and motion graphics. The 360° plate was mapped onto a sphere in Maya with a camera set at the center of that sphere. Markers were placed in 3D world space and rendered using the Arnold VR camera provided with Maya. The test renders were compared with the footage, and with some trial and error, we determined the absolute position of the camera.

Immersive Experience in Maya
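
To make that setup concrete, here is a minimal maya.cmds sketch of the idea. The object names, sphere radius, and marker positions are placeholder assumptions for illustration, not values from the actual Nokia scene.

```python
# Minimal sketch of the 360-degree match-move setup described above.
# Names, radius, and positions are placeholder assumptions only.
import maya.cmds as cmds

# Sphere that the equirectangular 360-degree plate is mapped onto.
plate_sphere = cmds.polySphere(name='plateSphere', radius=500)[0]

# Camera parked at the center of that sphere; for rendering, the Arnold
# VR camera type is then assigned on the camera shape.
vr_cam = cmds.rename(cmds.camera()[0], 'vrCam')
cmds.xform(vr_cam, translation=(0, 0, 0))

# Markers at known world-space positions on the test site (for example
# the two antennas). Test renders of these markers are compared against
# the footage to dial in the camera position by trial and error.
marker_positions = {
    'marker_antennaA': (0.0, 12.0, 0.0),
    'marker_antennaB': (260.0, 10.0, 0.0),   # far end of the ~260 m move
}
for name, pos in marker_positions.items():
    loc = cmds.spaceLocator(name=name)[0]
    cmds.xform(loc, translation=pos, worldSpace=True)
```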

For the signal beam, the team created a simple rig to control the end points, position, and scale of the beam during the camera move. R&D time was limited because of the two-week turnaround on this project, so we would have to do the actual tracking manually. Adding keyframes for the endpoints would keep the beam “stuck” to the antennas, while the distance from camera would be handled through the scale function of our rig. Once the motion graphics and tracking were complete, we rendered everything with Arnold and composited the final shot in Adobe After Effects. The final product provides an immersive experience for developers and investors to learn more about the Nokia testing facility in Irving. ■
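
For anyone curious what that manual pass might look like in script form, here is a rough maya.cmds sketch. The control and attribute names (beamRig_ctrl, endPointA_ctrl, and so on) are hypothetical stand-ins, not the controls from our actual rig.

```python
# Rough sketch of the manual tracking pass: keying the beam rig's endpoint
# and scale controls at chosen frames. Control names and values here are
# hypothetical placeholders, not the actual production rig.
import maya.cmds as cmds

def key_beam(frame, end_a, end_b, scale):
    """Set one manual tracking key on the (assumed) beam rig controls."""
    cmds.currentTime(frame)
    cmds.xform('endPointA_ctrl', translation=end_a, worldSpace=True)
    cmds.xform('endPointB_ctrl', translation=end_b, worldSpace=True)
    cmds.setAttr('beamRig_ctrl.scale', scale, scale, scale, type='double3')
    cmds.setKeyframe(['endPointA_ctrl', 'endPointB_ctrl', 'beamRig_ctrl'])

# Example: key the beam at a few frames as the drone travels the site.
key_beam(1,  end_a=(0.0, 12.0, 0.0), end_b=(260.0, 10.0, 0.0), scale=1.0)
key_beam(25, end_a=(0.5, 12.0, 0.2), end_b=(260.0, 10.0, 0.0), scale=0.9)
```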

 

04/01/19

Motion Matters: Creating content that moves your audience.

We’ve been thinking a lot lately about what it means to make something move. Motion is what we see with our eyes, but movement is not a force on its own; it is the result of energy put in. To make something move, sometimes you just need to give it a little push to get the whole ball rolling, but sometimes you need real muscle… and sometimes you need an atomic force! At Pixel Mover, our goal is to be that energy that powers your message.

Pixel Mover creates a lot of animation, but we are starting to realize that what we really want to put into motion isn’t something that lives inside of a screen; it’s actually the people on the other side of those screens. Because if we are going to create something, we believe that we should have something to say, and we believe that if you say it right, your audience will listen and want to take action!

Pixel Mover is proud to launch this new website and announce that we are accepting new clients! We hope you enjoy this video of some of our favorite projects and creations, and we would love to hear from you, so please contact us!
