Over the past couple of years, I’ve seen an increasing number of After Effects users talk about switching to a different app for motion graphics design. But is it really feasible? What are the options – and what would it take for the After Effects user base to start jumping ship?
In this article, I’m sharing some of my personal thoughts on the topic. First and foremost, these are my own idle musings. But secondly, what I’m really doing is looking back at my early career as a Media 100 editor, remembering how quickly Media 100’s user base switched to the competition, and wondering if Adobe could be facing a comparable situation with After Effects.
Right now, and for the past 20+ years, After Effects has been synonymous with motion graphics design. For as long as I’ve used After Effects, I’ve never seen a feasible alternative for the types of work that I’ve done. Apple’s Motion created a few ripples of interest when it launched, but that never seemed to build into a flood of users leaving Adobe. But there are recent signs that this is changing.
On a basic level, Cavalry is a new application for motion graphics design, offering a potential alternative to After Effects in the near future. But on a broader level, there are signs that motion graphics design itself is changing, and that the way designers use software is evolving with it. From my own perspective, I've never seen so many After Effects users express an interest in using something else as I have over the past year.
So, after over twenty years of market dominance, what does the future hold for After Effects? It’s not a simple question to answer, but it’s worth having a look around at what’s going on.
After Effects has such a wide range of features, and is used across so many different industries, that it would be an exaggeration to suggest it has no competition at all. It depends who you ask and what type of work they do.
Without listing specific features, After Effects has two distinct sides – motion graphics design and image compositing. Users who specialize in one and not the other will use completely different tools within After Effects. Designers may never use Mocha, Keylight, or 32-bit linear projects. Compositors may never use the align & distribute panel, essential graphics, shape layers or even text layers. There are whole panels in After Effects that individual users may have never even opened, depending on what type of work they do.
When looking at compositing and visual effects, Adobe have a strong presence in the corporate industry but little to no presence in the high-end Hollywood arena. Feature films – especially blockbusters like the Marvel franchise – are generally composited using Nuke. There's also Fusion, which has a growing fanbase since Blackmagic took over, and apparently Autodesk Flame is still around as well. So when it comes to compositing and visual effects, After Effects is definitely not the only option.
Motion graphics design is a different field, and it’s here where After Effects has been relatively unchallenged for the past 20 years. The “design” in “motion graphics design” is a good indication of why. The most popular design applications have always been Photoshop and Illustrator, both Adobe products, and their tight integration with After Effects has resulted in a highly effective, self-contained ecosystem. Any app that’s aiming to compete with After Effects for motion graphics design is also competing with Photoshop and Illustrator, and that’s not so easy.
Roughly ten years ago, I was stunned (in a good way) to discover that 2D character animation was also being done in After Effects. There's a wide range of dedicated character animation software available, such as Toon Boom Harmony, but despite After Effects never being designed for character animation, several studios have adopted it for production. Custom scripts and rigging tools were being created for productions by studios such as Tiny Inventions, and even for the Angry Birds feature film. Adobe evidently recognized this developing market, and they included an early beta of a dedicated character animation tool with After Effects CC 2015. Two years later it was separated into its own product, Adobe Character Animator, which is probably best known for being used to produce the Stephen Colbert show "Our Cartoon President". So even though there's a wide range of more specialized software alternatives, After Effects is also being used for 2D character animation, and 3rd party tools for rigging continue to improve.
After Effects clearly has some competition when it comes to compositing and character animation, but motion graphics design is a different story. In the motion graphics design market, After Effects is undoubtedly the dominant application.
Any piece of motion graphics design created without After Effects is an exception. That’s not to say it doesn’t happen, just that it’s in the minority. After all, talented designers will use whatever tools they’re most comfortable with. Motion graphics design has been done in Flash, with Apple’s Motion, with Autodesk’s Flint & Flame and so on. There are artists who use open-source tools like Blender, and creative geniuses like Gmunk who write their own software. But to be completely realistic, After Effects has always dominated motion graphics design.
However – anecdotally at least – it appears that this might soon be changing.
More than just a pretty window – a lesson from the past
My first job was as a Media 100 editor, but as I primarily write about After Effects, there's a good chance that some people reading this won't know anything about Media 100. That's my first point: Media 100 disappeared pretty rapidly, and there's the (highly improbable) potential for After Effects to follow a similar path.
Around the mid-1990s Media 100, Avid and Immix established a new market with non-linear editing systems that ran on home computers. In each case, the software was tied to the hardware. The Media 100 editing software only worked with Media 100 video cards, Avid software only ran on Avid video cards and so on. The main point is that these early products were sold as complete systems, as opposed to being a software app that would run on any computer.
Because the hardware and the software were tied together and sold as a self-contained product, there wasn’t much point considering what each component was worth. Buyers were simply purchasing an operational edit suite. Purchasing a non-linear system wasn’t necessarily cheap. Media 100 firmly aimed at the lower end of the market but even so, a fully-optioned system on a powerful computer could approach $100K. I think a budget system could be pieced together for about $20K. Avid aimed at a higher end market and by the late 1990s their systems could sell for more than $250K.
This began to change with two significant new products aimed at the desktop video market. Apple released Final Cut Pro in 1999, an editing software package that didn’t require specific hardware to edit video. By version 3 users could plug a DV camera directly into an iMac and edit video without a separate tape deck – a completely revolutionary concept. As video editing transitioned from analogue to digital formats, a new range of digital video cards entered the market, initially costing anything from $5,000 to $10,000. But just a few years later an upstart company in Melbourne, Australia began selling video cards that worked with uncompressed SDI for only $999 – the same price as Final Cut Pro. The video cards didn’t come with editing software, but the price was a fraction of what competing video cards were being sold for (and a fraction of what they’d been selling for a few years earlier).
Although they may not have thought about the individual components of their edit suites before, Avid and Media 100 owners were now faced with serious competition. The move from analogue to digital video placed a spotlight on exactly what customers were paying for, and prompted many customers to re-evaluate the value of their purchases. For example – if Media 100 were charging $20,000 for a product that included a video card and editing software, but you could buy a Digital Voodoo card for $999, it suggested that the Media 100 editing software was costing you $19,000. But if Apple were selling editing software for $999, it suggested that the Media 100 hardware was costing $19,000.
But Apple were selling software for $999 and Digital VooDoo were selling video cards for $999! So what gives?
However you looked at it, the prices being charged by Avid and Media 100 suddenly seemed too high for what you got. The Media 100 hardware was not worth $19,000, and the Media 100 software was not worth $19,000. Something wasn’t adding up. If you could buy a brand new video card and editing software for $2,000 then what, exactly, were you getting from Media 100 for $20,000?
It wasn’t long before Final Cut Pro was – overall – a better editing software package than the Media 100 software, and the Digital Voodoo, AJA and Blackmagic video cards were more powerful than the Media 100 hardware. The only reason all Media 100 users didn’t jump ship overnight was that they had years of legacy projects, existing investments, and years of Media 100 experience to match. You can’t exactly put a price on muscle memory, and it’s difficult to move away from a software application you’ve used on a daily basis for many years. But once users started to jump ship, the end came very quickly.
Not all analogies are good ones
Maybe – superficially at least – we can draw an analogy between Media 100 in the early 2000s and After Effects now. Instead of talking about editing software and hardware video cards, we’re looking at the After Effects user interface and the After Effects rendering engine.
When After Effects was first conceived in 1990, the original paradigm was stacking layers of video on top of each other – comparable to animation cels in the real world, or to layers in Photoshop with still images. Even today, After Effects is commonly described as Photoshop for video, and its simple layer-based interface makes it easy and intuitive for new users.
In 2005 Adobe added 3D layers, creating the unique rendering pipeline that we still have today. After Effects users all around the world grapple with its esoteric mix of 2D and 3D layers, adjustment layers, pre-compositions with collapsible transformations, pseudo-3D cameras, lights, and everything else that makes After Effects what it is. Popular 3rd party plugins such as Element, Stardust, the Trapcode suite and many others work within their own self-contained 3D universes, eventually outputting a flat 2D bitmap to be composited with other After Effects layers.
The After Effects rendering pipeline is complex enough that I’ve never seen anyone attempt a tutorial that describes all of its intricacies (it’s on my to-do list but don’t hold your breath). In some ways, the After Effects rendering pipeline is what makes AE what it is. Hundreds of thousands of users all around the world have years, if not decades of experience with creating animations using After Effects. It’s not perfect, but it’s what we know and presumably love.
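For readers who've never thought about what that rendering order actually does: at its simplest, the 2D side of the pipeline is a stack of layers blended bottom-to-top with the standard alpha "over" operator. Here's a minimal sketch in plain JavaScript – hypothetical pixel values, and certainly not Adobe's actual implementation:

```javascript
// "Over" operator for straight-alpha RGBA pixels (all values 0..1).
// A hypothetical sketch of layer stacking, not Adobe's actual code.
function over(top, bottom) {
  const outA = top.a + bottom.a * (1 - top.a);
  if (outA === 0) return { r: 0, g: 0, b: 0, a: 0 };
  const blend = (t, b) => (t * top.a + b * bottom.a * (1 - top.a)) / outA;
  return {
    r: blend(top.r, bottom.r),
    g: blend(top.g, bottom.g),
    b: blend(top.b, bottom.b),
    a: outA,
  };
}

// Composite a stack of layers, bottom layer first.
function compositeStack(layers) {
  return layers.reduce((acc, layer) => over(layer, acc), { r: 0, g: 0, b: 0, a: 0 });
}

// An opaque red layer underneath a 50% transparent blue layer:
const result = compositeStack([
  { r: 1, g: 0, b: 0, a: 1 },    // red, fully opaque
  { r: 0, g: 0, b: 1, a: 0.5 },  // blue, half transparent
]);
// result is an even mix: r = 0.5, b = 0.5, fully opaque
```

Everything else – 3D layers, collapsed transformations, adjustment layers, plugins rendering their own worlds to a flat bitmap – is ultimately about deciding at what stage, and in what order, these flat blends happen.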
In the 14 years since Adobe added 3D layers to After Effects, true 3D applications including Cinema 4D have advanced significantly. With fast GPU rendering now easily affordable, the quirky After Effects rendering engine is starting to look a little antiquated. With every new real-time demo from Nvidia, Unity and other game engine developers, the limitations of the After Effects rendering engine look more and more imposing.
Some After Effects users in 2019 are beginning to feel the same way that Media 100 editors felt, when the first uncompressed video cards came out for less than $1,000. The big difference in the comparison is price, and that’s clearly where the analogy breaks down. Media 100 failed because ultimately they weren’t delivering value for money, while today After Effects has never been more affordable.
After Effects, despite the ever-present grumblings about the Creative Cloud subscription system, is relatively cheap and offers clear value for money. While there are many things to complain about with After Effects, the price is not one of them. Any motion designer who thinks the annual subscription cost for After Effects is too high has evidently never looked at the price tag of applications like Nuke, Cinema 4D, Maya & 3DS Max, or even niche tools like Krakatoa and Real Flow.
But even though the comparison with Media 100 doesn’t hold up from a financial perspective, it’s definitely worth considering from a creative perspective. Media 100 had two components – software and hardware. After Effects has two components – the user interface and the rendering engine. The point with Media 100 was that the combination was more valuable than either individual component. Once you began to consider each part in isolation, you began to see problems. Personally, I think After Effects is similar – it’s a self-contained combination of an interface and a rendering engine. Together, they’re a valuable and productive tool. But if we look at each of them in isolation, we start to see some cracks.
The keyframes are always smoother on the other app
If we ignore the rendering side of After Effects, we have a timeline based user interface that once again can be traced back to around 1990.
While the appearance of the user interface has varied slightly over the years, the basic concepts have not. Animation is created with keyframes and expressions, layers are stacked on top of each other in an order that (roughly) represents the rendering pipeline, compositions can be pre-composed, and so on. A quick look back at early versions of After Effects shows an interface that's still instantly recognizable. Unfortunately, while there's been an incredible amount of progress made with After Effects over the past 20 years, there's also a lot that has stayed the same.
After Effects is clearly a member of the Adobe family, and that’s possibly its greatest strength. It works fluently with Photoshop, Illustrator and Premiere. The interface is familiar and intuitive. The timeline-based paradigm makes immediate sense to motion designers and editors, and layer-based compositions are much friendlier to designers with a Photoshop background than the node-based interface of a VFX compositor like Nuke or Fusion.
After Effects also has a huge user base and generous support from 3rd party software developers, but the sheer quantity of 3rd party add-ons for After Effects obscures some of the shortcomings in the basic application itself. In fact, the abundance of 3rd party add-ons can even hinder the development of After Effects, as Adobe are less likely to add native features if they’re already available as a retail product.
The wrinkles in timelines are starting to show
Let’s consider the native tools available to After Effects users for animation.
After Effects has always had keyframes, the most basic tools for timeline-based animation. Linear, bezier, hold and roving keyframes have been with After Effects for about as long as anyone has seriously used it.
The next major additions to the animator’s toolkit were (in no particular order) an improved graph editor, parenting, and expressions. That was pretty much it for almost 15 years, until Adobe recently gave us new tools for data driven animation.
(NB: Since I first posted this, I’ve wondered if I should include Motion Sketch. I still use it sometimes, but it’s very much a raw tool that needs additional finessing. I’ll include it here as a primitive form of motion capture, and I’ll also add motion capture to the list below.)
For an animation package that is over 25 years old, the basic list of native animation tools is pretty small:
- keyframes, including a simple graph editor
- parenting
- expressions
- motion sketch
- data driven animation (JSON files)
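The data-driven item deserves a quick illustration, because the underlying idea is simple: external data drives property values over time. Here's a hedged sketch in plain JavaScript – the JSON shape is hypothetical, invented for this example rather than Adobe's actual schema:

```javascript
// Hypothetical JSON keyframe data driving a property over time.
// The shape is made up for illustration, not Adobe's actual format.
const data = JSON.parse(`{
  "opacity": [
    { "time": 0, "value": 0 },
    { "time": 1, "value": 100 },
    { "time": 2, "value": 100 }
  ]
}`);

// Linearly interpolate the property at an arbitrary time,
// clamping before the first keyframe and after the last.
function sample(keys, t) {
  if (t <= keys[0].time) return keys[0].value;
  const last = keys[keys.length - 1];
  if (t >= last.time) return last.value;
  const i = keys.findIndex(k => k.time > t);
  const a = keys[i - 1];
  const b = keys[i];
  const u = (t - a.time) / (b.time - a.time);
  return a.value + (b.value - a.value) * u;
}

sample(data.opacity, 0.5); // 50 – halfway through the fade-in
sample(data.opacity, 1.5); // 100 – holding at full opacity
```

That's essentially keyframing with the keyframes stored outside the project – useful, but still the same timeline-based paradigm underneath.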
But even with these tools available to everyone, we don’t have a simple, easy & intuitive way to combine them. For example: parenting layers is a great tool to have, but there’s not an elegant way to turn parenting on & off. Same with expressions – they’re fantastic and very powerful, but there’s no simple way to fade their influence in and out, or even turn them on and off at specific times. It’s not that you can’t do it at all, just that you can’t do it easily.
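To be clear about how small that ask is: fading an expression in and out is just a linear mix between the keyframed value and the expression's result. The sketch below (plain JavaScript with hypothetical values – not a native AE control) is the arithmetic users currently have to hand-roll, usually via a slider control, inside every expression they want to fade:

```javascript
// Mix a property's keyframed value with an expression's result.
// influence: 0 = pure keyframes, 1 = pure expression.
// A hand-rolled sketch of the blend, not a native AE feature.
function fadeExpression(keyframedValue, expressionValue, influence) {
  return keyframedValue + (expressionValue - keyframedValue) * influence;
}

fadeExpression(100, 140, 0);    // 100 – expression fully off
fadeExpression(100, 140, 1);    // 140 – expression fully on
fadeExpression(100, 140, 0.25); // 110 – a quarter of the way there
```

One line of maths, yet there's no built-in switch or influence control – you have to rebuild this blend, by hand, in every expression where you want it.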
Keyframes have been with us since the beginning, almost 25 years ago, and we’ve had plenty of time to get used to expressions and parenting. But for an animation package, the animation tools provided to the After Effects user have hardly changed over the last 15 years.
For designers who have come to After Effects from a background in still images, the toolset might not seem problematic. Thousands of people sit down in front of After Effects every day, and they all manage to get stuff done. But in the same way that the shortcomings in Media 100’s editing software were exposed by the release of Final Cut 3, the maturity and development of 3D animation packages is beginning to expose the shortcomings in After Effects. Most AE users have encountered the “boomerang” keyframe problem at some point or another. It’s been there for as long as After Effects has had bezier keyframes, yet it’s 2019 and new users are still googling to learn how to fix it.
Adobe have continued to add a long list of features to After Effects, some of them groundbreaking – it’s just that they haven’t been features that improve the basic animation process. Remember that After Effects is the tool for motion graphics design, and that animation is a crucial part of designing motion.
It’s important to acknowledge the difference between not being able to do something at all, and not being able to do something easily and intuitively.
The haves and the have-nots
I’m grateful for every update and new feature that Adobe introduce, but if we only look at animation tools then there’s been an awfully long gap between getting 3D layers in 2005, and JSON support this year.
Here’s a list of basic animation tools that After Effects doesn’t have:
- Turn parenting on and off
- Easy and intuitive method to mix and combine keyframes and expressions
- Easy and intuitive method to sequence, loop, cycle, ping-pong and repeat selected movements
- Options to automatically continue movement after the last keyframe, or before the first keyframe
- Easy procedural movements, eg oscillations, springs, bounces, wiggles, with the ability to combine, mix and sequence them.
- Interactive, non-destructive smoothing of keyframes and specific groups of keyframes (especially useful for motion tracking).
- Rigging, including both inverse and forward kinematics, WITH an easy way to toggle the rig on and off or between IK and FK.
- Physics simulations, either 2D or 3D
- Cloning and repeating, with animatable offsets
- Audio triggers
- Motion capture
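Some of the items on that list are genuinely hard, but others are tiny. A basic procedural movement such as a decaying spring is a few lines of maths – sketched here in plain JavaScript with hypothetical parameters, purely to show the scale of the ask:

```javascript
// Damped oscillation: the value springs from `from` towards `to`,
// overshooting with a decaying cosine wave. `freq` is oscillations
// per second; `decay` controls how quickly it settles.
// Hypothetical parameters – a sketch, not an AE feature.
function springTo(from, to, t, freq = 3, decay = 4) {
  if (t <= 0) return from;
  const amplitude = from - to;
  return to + amplitude * Math.cos(freq * 2 * Math.PI * t) * Math.exp(-decay * t);
}

springTo(0, 100, 0);  // 0 – starts at `from`
springTo(0, 100, 10); // ~100 – long since settled at `to`
```

Other packages ship exactly this kind of curve as a drag-and-drop preset; in After Effects it lives in expression snippets passed around in forums and tutorials.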
Admittedly, some of these things can be addressed by 3rd party tools. There are several very powerful rigging solutions available, such as Duik. Newton provides 2D physics – even Adobe’s own Character Animator provides a 2D physics workaround for After Effects. Trapcode’s Sound Keys provides audio integration, and EchoSpace provides a method for cloning and offsetting layers, even if it’s a bit clunky. Finally, the swiss-army knife that is iExpressions (made by the German company Mamoworld) attempts the unenviable task of presenting expressions in a more intuitive manner for artists and non-techies.
The point is not that After Effects cannot do these things at all, it’s that After Effects cannot do these things by default, and in an elegant and intuitive manner. As much as I love expressions (and writing scripts in my spare time), even I won’t try to suggest that it’s easy for someone trained as a graphic designer to combine expressions with keyframes.
Something as simple as wanting to keyframe the amount of wiggle over time is still too intimidating for many creative artists.
The more things change, the more they stay the same
For over 20 years, After Effects has been the primary tool for motion graphics designers, but the default tools provided by Adobe to actually design motion are looking a little thin when compared to other animation packages. For the first time ever, I’m working alongside motion artists who prefer animating inside Cinema 4D – not because it’s a true 3D application, but because it provides animation tools that After Effects does not have.
The After Effects interface and animation toolset is intuitive but neglected, and lacks many basic features that other animation packages include as standard (i.e. the ones I just listed above).
Because of the wealth of 3rd party support and the sheer size of the user base, After Effects isn’t going anywhere overnight. There are thousands of artists worldwide who have embraced the quirky rendering pipeline, and have years of experience with creative tools such as Particular & Form, Element and Plexus. The fact that so many designers are happy to embrace the 3D world-within-a-world approach that plugins like Element and Stardust provide is evidence that the core user base is happy to put up with the shortcomings in After Effects because of the overall experience. There’s no doubt that After Effects can get stuff done – and getting stuff done is key to a professional career.
In the world of motion graphics design, After Effects doesn’t have any significant competition. Currently, Cinema 4D is more of a companion tool than a competitor. But, just like Media 100 discovered, that might change quickly…
Be part of the solution, not the problem
Here’s a hypothetical daydream on how the future could change…
There are already many commercial 3D rendering engines available, including high-end renderers like VRay, Redshift and Arnold, and then there are fast gaming engines such as Unreal and Unity. There’s no need to re-write a new rendering engine from scratch, and there’s so much competition amongst the existing options that all of them can be considered state-of-the art in one way or another. They can utilize multiple GPUs, multiple CPU threads, and are already designed to work across different operating systems and host applications. These are technical problems that other people have already solved.
So let’s ignore the rendering side of things, and focus on the user interface.
Interface development is challenging but it doesn’t require a huge team of software engineers. Compared to the massive teams working on games, user interface research and development is relatively fast and inexpensive. You can even design user interfaces using After Effects itself. Consider what a designer costs, and then imagine how much someone could achieve working full-time for an entire year, just refining and improving the After Effects interface. Compared to game development and even Hollywood budgets, it’s really not much. After all, it’s not like enthusiastic After Effects users haven’t already started doing this in their spare time. In 2015, Sander van Dijk launched an astonishingly detailed website outlining ideas and suggestions for After Effects, which quickly went viral among the global AE community.
Better still, imagine what could be achieved if a small team of interface designers re-visited the very concept of animation with a blank page. A modern motion design interface should be re-thought from the ground up, and not be a series of band-aids patched over a 25 year old product. Look at the list above of animation tools which After Effects doesn’t have, and imagine how an animation program might look if all of those things were included from the very beginning – and designed to work together.
One way to start would be devising a hypothetical brief specifically designed to expose all of the shortcomings in the current After Effects interface. Include basic keyframe animation, shape layers, 3D layers with a camera, some character rigging and parenting, and then some 2D physics. Add some repeated/cloned layers, some procedural movement (perhaps a bounce or two) and tie the whole lot together with triggers from an audio track.
Then forget you ever saw After Effects and imagine a new interface that would let you put this together in an intuitive manner…
I’ll give you a hint. It’s not easy. I haven’t done it. This is not a tutorial where I reveal a new trick at the end and say ta-da, here’s one I did earlier.
If someone wants to create a newer, better alternative to After Effects then the user interface is the hard part. Once you have an interface, everything else hangs off it – potentially for decades. It doesn’t matter what the rendering engine is, what OS it runs on, what language it’s programmed in or what system APIs it uses. The interface is the primary tool that a user has to do their work – to get stuff done.
But as I said earlier, interface design is relatively cheap. You can do it with your imagination. You can do it with a pad and a pencil. You can even do it in After Effects. And if you come up with a great interface that allows users to intuitively design motion, then you might just have created a tool that could become more popular than After Effects itself.
Which problem is the problem?
While this article started as some idle musings on the state of the After Effects interface, a deeper topic to consider is the state of motion graphics design itself.
After Effects was designed over 20 years ago, and there’s an underlying philosophy behind its rendering engine: “all of the pixels, all of the time”. It was created before the internet went mainstream, when “video” was synonymous with the TV in your lounge room.
After Effects can scale to very large resolutions, and it can work in 8-, 16- and 32-bit modes. It renders consistently across all hardware and operating systems. You can open a project from over 10 years ago, re-render it today, and it will perfectly match whatever you did originally. You don’t get different results with different graphics cards or CPUs. It doesn’t look different if you render on a Mac or on Windows.
That’s not a fluke, that’s the way it was designed. And, FWIW, as impressive as real-time gaming engines can appear in demos, they don’t prioritise any of those features.
Because that’s the way After Effects works, a generation of motion graphics designers have worked with it.
But – and this is a topic that’s beyond my expertise – there are suggestions that motion graphics designers themselves are evolving their approach to design. Motion graphics are no longer limited to something being viewed on a TV. Motion graphics can be lighting installations, kinetic sculptures, interactive real-time animation on a web page, or even formations of flying drones. Motion graphics designers are no longer limiting themselves to pushing pixels around on a screen – but pushing pixels is all that After Effects does. On a technical level, motion design can be delivered as animating vector artwork, or as polygons that render in real time. Modern motion graphics may not be “rendered” in the sense that After Effects renders; they may be coded in HTML as part of a web page. On a physical level, motion design can involve water, mirrors, robots and lasers. Unlike the 1990s, when After Effects was initially developed, motion design now has far more potential outlets than a TV screen.
Perhaps the biggest problem facing After Effects in the future isn’t a competing product that looks the same but offers more features. Perhaps it will be a pioneering new application that allows designers to design motion in a revolutionary, innovative and ultimately creative way for mediums that aren’t based on pixels.
Only timelines will tell…
If you’ve found this interesting, please take the time to check out my other articles and tutorials on After Effects.
** I edited the article to reflect a more accurate age of After Effects. After Effects celebrated its 25th anniversary in 2018. My crusty old brain thought it was the 30th, but that was wrong. While this is an opinion piece, if you spot any other glaring errors just leave a comment below.