THE FUTURE OF COMPUTER GRAPHICS

2000 and beyond

(Here I speculate about the near and far future of computer graphics and describe what has happened so far.)

| History 1960 | History 1970 | History 1980 | History 1990 | 2000 and beyond |

Last updated: January 2005

'The future has the bad habit of becoming history all too soon'
(Quoting myself :)

The problem with writing about the future is partly explained in my quote. What I call the future today will be history tomorrow. And when it comes to texts on a website, this is especially true. It is easy to forget about a text for a couple of years, and suddenly you realize that half of what you speculated about has been proven wrong by history.

Nevertheless, here's what I'll do. I will divide my text into the 'near' and 'far' future. When the near future becomes history, I will make the necessary changes.


Past, present and near future: 2000-2005....


Prologue

Regardless of whether a person is religious or not, and regardless of what religion one follows, the New Year is a good way to round things off in society or in one's life in general. When we learn history, it helps to divide it into centuries and decades. This very document is divided into decades, and that's how we relate to history: the 60's, the 90's and so on. As such, the shift to a new millennium holds significant importance. It divides history!

Everything that happens after December 31st, 1999 belongs to the next decade, century and millennium. When the day comes that people talk about the 20th and even the 21st century in the past tense, they will tell you that the American Civil War was in the 19th century, that the Pentium III was released in the 20th and the Voodoo 4 in the 21st.


So what comes in the near future?

Most of you who follow the graphics industry have a pretty good idea of what to expect in the coming years. On this page I report what is coming, and when the future becomes history I update the predictions with facts. Here's the story so far.

2000
The year 2000 really was 'the year of nVidia'. In December, nVidia acquired core assets of the once mighty 3DFX. This was a good reminder to all of us of how quickly things can change in the industry. ATI are still going strong and Matrox has announced new products, but overall, nVidia has become the clear and undisputed 'standard' for home computing. There are still some other manufacturers that compete on the professional market and still give nVidia a good fight, but that too is probably only a temporary glitch for what now seems to be the 'unstoppable' nVidia.
But let's not forget how unstoppable 3DFX seemed only a few years ago...

2001
2001 saw a continuation of nVidia's dominance of the computer graphics market with an occasional competing product from ATI.
Nintendo released the GameCube in September 2001 (Japan).
The Game Boy Advance was released in the first half of 2001. The big event of 2001 was probably Microsoft's Xbox console. With an nVidia-developed graphics chip, a hard drive, a fast Intel CPU and more, it was designed to kick ass! Its main competitors are the PlayStation 2 and the Nintendo GameCube. The once so influential SEGA has given up its hardware business and will now concentrate on software. The company's new aim is to become one of the major players in the software biz.

The movie scene had its share of limit-pushing movies, including Final Fantasy: The Spirits Within, maybe the first real attempt to create realistic humans in a completely computer-generated motion picture, while Pixar's Monsters, Inc. featured some pretty convincing fur.
Jurassic Park 3 did it again, of course, with dinosaurs so real that even a graphics artist can sit down and enjoy the movie without thinking about the special effects. The movie A.I. featured extremely well-produced special effects, but they were simply evolutionary works based on the same techniques created for the landmark movie Terminator 2. (Interestingly, it was the same Dennis Muren/Stan Winston crew that worked on the FX.) The biggest movie of the year award goes to The Lord of the Rings, featuring some very ambitious scenes.

Those of you who watch the television series Star Trek have no doubt asked yourselves why all the alien races look like humans with some minor cosmetic changes, such as a different nose or some crap glued to the forehead. The answer is of course cost! Star Trek: Voyager actually features a race known as Species 8472, which is computer generated. However, the screen time of that species is sparse to say the least. Thanks to the lower prices of special effects, who knows, the latest Star Trek series, Enterprise, may feature lots more CG aliens, assuming it lives for the standard seven seasons. (It didn't. Ed. note)

2002
Q1 2002 saw the release of nVidia's next-generation GPU, the nv25 chip (GeForce 4 Ti). This is the chip that will make Xbox users understand how fast graphics technology is moving forward. A top-of-the-line PC in the first half of 2002 is already many times more powerful than the Xbox (but of course also more expensive).
ATI released the R300 chip (the R200's successor) in July: a powerful DirectX 9.0 chip that will hold the performance crown at least until nVidia releases its nv30 chip. So you can be sure that as soon as this Christmas, the consoles will be clearly inferior to a decent PC. Because of this, Sony are already releasing details about the PlayStation 3 and Microsoft are already working on the Xbox 2.
Speaking of ATI, they were responsible for leaking an early version of id Software's Doom III game. This game is the brainchild of programming legend John Carmack. The leaked version spread around the world like wildfire and people quickly realized two things. First, the game looked incredible; the atmosphere was more movie-like than in any other game in history. Second, this game was going to force a whole lot of hardware upgrades among would-be players. It was clearly going to need faster graphics chips than were available at the time.

On the movie scene, Star Wars: Episode 2 displayed a dazzling amount of incredible CGI shots. They weren't doing things that had never been done before, but they were perfecting what was seen in Episode 1... The visuals aren't perfect yet, but most of the time it's difficult to imagine if and how they could be improved. Perhaps one of the greatest advances was made in cloth simulation. Robert Bridson, Ronald Fedkiw (Stanford University) and John Anderson (Industrial Light and Magic) presented a paper on 'perfect' cloth simulation at SIGGRAPH 2002. In many scenes in SW2, the actors were actually replaced by digital stunt doubles, and all the clothing needed to be simulated perfectly to fool the eye.
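To give a feel for what cloth simulation involves, here is a minimal toy sketch of the basic idea: treat the cloth as a grid of particles connected by springs and integrate the forces over time. This is NOT the Bridson/Fedkiw/Anderson method (their paper's contribution was the robust handling of collisions, contact and friction); all the names and constants below are purely illustrative.

    # Toy mass-spring cloth simulator: particles + springs + time integration.
    import math

    W, H = 10, 10            # 10 x 10 grid of cloth particles
    REST = 0.1               # spring rest length in metres
    K, DAMP = 500.0, 0.02    # spring stiffness, velocity damping
    DT, G = 0.001, -9.81     # time step, gravity

    pos = {(i, j): [i * REST, 0.0, -j * REST] for i in range(W) for j in range(H)}
    vel = {p: [0.0, 0.0, 0.0] for p in pos}
    pinned = {(0, 0), (W - 1, 0)}                  # hang the cloth by two corners
    springs = [((i, j), (i + 1, j)) for i in range(W - 1) for j in range(H)] \
            + [((i, j), (i, j + 1)) for i in range(W) for j in range(H - 1)]

    def step():
        force = {p: [0.0, G, 0.0] for p in pos}    # start with gravity
        for a, b in springs:                       # add Hooke's-law spring forces
            d = [q - p for p, q in zip(pos[a], pos[b])]
            length = max(math.sqrt(sum(c * c for c in d)), 1e-9)
            f = K * (length - REST) / length       # pull if stretched, push if squashed
            for x in range(3):
                force[a][x] += f * d[x]
                force[b][x] -= f * d[x]
        for p in pos:                              # semi-implicit Euler update
            if p in pinned:
                continue
            for x in range(3):
                vel[p][x] = (1.0 - DAMP) * (vel[p][x] + force[p][x] * DT)
                pos[p][x] += vel[p][x] * DT

    for _ in range(2000):                          # let the cloth swing and drape
        step()
    print(pos[(W // 2, H - 1)])                    # bottom-centre particle's rest position

Film-quality cloth adds bending resistance, collision detection against the body and friction on top of this, which is exactly where the SIGGRAPH 2002 paper made its mark.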
At the end of 2002, fans of The Lord of the Rings in particular and CGI fans in general had the opportunity to watch just how far CGI has come. The Two Towers features a computer-generated main character (Gollum) who, while not 100% convincing, looks pretty damn photo-realistic anyway. The motion was captured from a live actor, and the interaction between the CG character and the physical environment was among the best we've seen so far.

2003
At the end of 2002, everyone was waiting for nVidia to launch their latest graphics chip (GeForce FX, alias nv30). While it was announced earlier, actual shipments didn't start until January 2003, and even then the chip was very rare. Pretty soon it became clear that it wasn't exactly what people were hoping for. Even nVidia realized that and immediately started work on a slightly modified successor (nv35), which they finally announced in May 2003. As always, ATI were there to match the product line quite nicely. Competition is usually good for the customer, but I must say that 2003 also showed exactly what is wrong with this situation. By year's end, the graphics card market was absolutely flooded with different models released by nVidia and ATI. For someone not very familiar with the market, it's next to impossible to make out exactly which model may be best for them.
Still, graphics chips are rather useless unless there is some good software that puts them to good use. For a long while, 2003 looked like the most exciting year in ages for gamers and movie fans alike. Ultimately there were release postponements, so the year ended in disappointment, but let's take a look at what happened.
The E3 show was the main event where all the big game titles were revealed. Doom III was shown again, but the game everyone was talking about was definitely Half-Life 2, the sequel to the immensely popular Half-Life released in 1998. Carmack himself has said that gaming has reached a 'golden point' graphics-wise, because it's possible to do pretty much anything the artist can come up with. The characters in these games look extremely lifelike compared to previous generations of games, and they certainly set a new standard for computer game developers everywhere. Another thing that impressed the HL2 audiences was the incredibly sophisticated physics simulation within the game (physics engine provided by Havok). Add some very advanced AI to that, and you soon realize that HL2 offers a new kind of gameplay compared to older generations of games. (Doom III will be a similarly realistic experience.) As I mentioned, it turned out that neither Doom III nor HL2 was released in 2003, so they are now officially 2004 releases.

Even though the postponed releases disappointed the gaming community, 2003 was a quite extraordinary movie year. As in the game biz, a lot of much-anticipated sequels were launched during 2003. X-Men 2 offered pretty much 'standard' special FX (and some lovely non-CG footage of Mystique ;-). Matrix 2 again managed to shock audiences with incredible and unique special effects that made everyone go 'how the hell did they do that??'. Terminator 3 was another blockbuster sequel, and considering that T2 was such an important landmark in movie production, it had much to live up to. At the end of the day, the effects in T3 were very nice and polished but in truth not revolutionary at all. Matrix Revolutions featured tons of special effects (in fact too many for some), but at this point we are so spoiled that we hardly raise our eyebrows, although admittedly the quality of the effects was stunning. Certainly the big 2003 finale was the release of the last LOTR movie, The Return of the King. Most if not all of the special effects were almost perfect, but the most impressive thing was possibly the seamless blend of real and CGI footage. The visualization of the attack on the White City was quite remarkable even by today's standards, and I can honestly say that I have not seen anything quite like it before. One thing is certain: all the good stuff in 2003 will spoil the audience to a degree that it's going to be pretty much impossible to impress them in the future... Ah well, there's always Star Wars Episode 3 in 2005. They have their work cut out for them, that's for damn sure.

2004
2004 was a great year for graphics in computer games. Many of the titles that were expected in 2003 actually shipped in 2004. And just as many of us expected, a couple of games in particular raised the bar for the graphical quality we expect from video games.
The first positive surprise of the year was Far Cry, pretty much the first game to utilize the latest advancements in computer graphics, such as DirectX 9.0 shaders. The second big title was the eagerly anticipated Doom 3, the sequel to the legendary and revolutionary Doom series. Although the game itself might have left one or two players disappointed, no one could deny that the graphics were nothing short of brilliant, making use of dynamic lighting, shadows and very moody surround sound. It truly was more of an interactive horror movie than just a game. Then, towards the end of the year, possibly the most anticipated game of all time finally arrived: Half-Life 2. After some six years in development, people were starting to wonder whether it could ever live up to the hype, but luckily the answer was YES! Apart from the incredibly realistic graphics, the game also added a whole new dimension of gameplay through its cleverly implemented physics engine.
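As a rough illustration of what 'dynamic lighting' means in practice: instead of baking light into textures ahead of time, the engine evaluates each light's contribution at every visible surface point, every frame. The sketch below shows only the basic Lambert diffuse term with distance falloff; engines like Doom 3's run this on the GPU and add normal maps, speculars and stencil shadows. The function and its parameters are my own names, for illustration only.

    import math

    def diffuse(surface_pos, surface_normal, light_pos, light_color):
        # Vector from the surface point to the light, and its length.
        to_light = [l - s for l, s in zip(light_pos, surface_pos)]
        dist = math.sqrt(sum(c * c for c in to_light))
        # Lambert's cosine law: surfaces facing the light are lit the most.
        n_dot_l = sum(n * t / dist for n, t in zip(surface_normal, to_light))
        intensity = max(0.0, n_dot_l) / (dist * dist)   # inverse-square falloff
        return tuple(c * intensity for c in light_color)

    # A floor point (normal pointing up) lit by a warm light 2 metres above it:
    print(diffuse((0, 0, 0), (0, 1, 0), (0, 2, 0), (1.0, 0.9, 0.8)))

Doing this for every pixel and every light, every frame, is what made these games so hungry for the latest graphics hardware.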
All in all, 2004 will be remembered by gamers as the year when computer graphics took a giant leap forward. All new games will inevitably be compared to the above-mentioned milestones. That means good news for gamers and many sleepless nights for game developers.

These new landmark titles of course demanded pretty fancy hardware to run as they were supposed to, causing many gamers (including me) to upgrade their hardware. nVidia were struggling with their FX (GF5) line, allowing ATI to gain market share. However, in 2004 nVidia made a glorious comeback with their nv40 hardware. Full Pixel Shader 3.0 support and a massively parallel architecture were the medicine that cured the FX plague. ATI of course released their own products (R420), but the gap from the previous generation of hardware was gone. As it turned out, both nVidia and ATI signed special deals with the makers of Doom 3 and Half-Life 2 respectively, making sure the games would run optimally on their hardware. At the end of the day, top-of-the-line models from both makers were more than adequate to play all games perfectly well.
As 2004 introduced a new level of graphics quality in games, it is understandable that the level will stay there for the first year or so, because other game developers will release games based on the licensed Doom 3 and HL2 engines. It takes a long time to write a new revolutionary 3D engine, and the only new engine on the horizon right now is the Unreal 3 engine, expected to arrive in 2006.
One more thing that deserves to be mentioned is the development of graphics power in handheld devices. Both nVidia and ATI now offer 3D graphics chips for mobile phones and PDAs. This is probably where graphics development will be most noticeable for the next few years. In late 2004, Sony also took a step into Nintendo-dominated territory by releasing its PSP console. It's a handheld device with a fairly high-resolution widescreen display, and it holds roughly the same graphical power as the PlayStation 2. At the time of writing, the device is yet to be released outside Japan, so time will tell if it is a success or not.

As far as movies go, there was a slight feeling of anticlimax after the amazing movie year 2003. The Terminator, Matrix and Lord of the Rings trilogies seem to be concluded. It seems that special effects have matured to a point where people barely care about them anymore; it's as if it has all been done already. The purely computer-animated movies continued to make it big at the box office, Shrek 2 and Pixar's The Incredibles being two examples. Perhaps the most noticeable thing is that the development time for a computer-animated feature film has been cut down drastically. It used to take Pixar three or more years to complete such a movie, but now they seem to release a new movie every year.

2005
Initially, it was thought that the next generation of video consoles (PlayStation 3 and the successors to the Xbox and GameCube) would be released in 2005, but now things do not look as certain. According to rumors, Xbox 2 might be released at the end of 2005; as for the other consoles, we'll have to wait and see. It's probable that the hardware itself will be ready early in 2005, but a game console needs games or it is useless, and games take longer to develop than the hardware.
Needless to say, nVidia and ATI continue to battle each other. The only question is whether gamers will need anything vastly more powerful than what was released in 2004, considering that many games will be based on the same technology as Doom 3 and Half-Life 2. It is possible that most of the innovation in graphics technology will come in the mobile arena. With the quick adoption of 3G, mobile phones might become the next big platform for the makers of graphics chips.

The biggest movie of 2005 will probably be Star Wars Episode III. It might be the very last movie in the Star Wars saga, so I'm guessing that Lucas wants this one to be absolutely spectacular. Other movies that might contain some fancy special effects are Peter Jackson's King Kong and the next Batman movie (Batman Begins).


The Future: The years ahead...


Photo-realism to the people!

'Difficult to see.. always in motion the future is..' (Yoda)

It is obvious that we are heading towards photo-realism. In the early 90's there were no true 3D games at all (Wolfenstein and Doom don't qualify as true 3D games), and 1999 saw the release of Quake 3, a game that featured all the latest tricks of the industry at the time. Some five years later, Doom III made its appearance, and yet again the distance to the previous generation is extremely noticeable, to say the least.
The path that the gaming industry has chosen will eventually lead it to the movie industry. In the future, artists will be using the same 3D models in games and in movies. As movie footage doesn't have to be rendered in real time, movies will always be able to do more complex things, but at a certain point in the future the border between real-time and pre-rendered graphics will be so blurry that only industry experts will be able to tell them apart.
The next big leap in visual quality will come with the introduction of the next generation of consoles. I'm guessing that certain types of games, such as car/racing games, will look so realistic that they will fool more than one person into thinking they are watching actual TV footage. The industry is learning more and more tricks for making things look realistic.
It's noteworthy though, that the focus might not just be on pushing more and more polygons around on the screen. According to nVidia's David Kirk, the graphics R&D teams might focus on solving age-old problems such as how to get real-time ray tracing and radiosity. The thing is, again according to David Kirk, these things aren't far away at all; come 2005/6 we might be doing radiosity rendering in real time. When that happens, there will be another huge leap towards photo-realism in computer games. (Until now, most games have used pre-calculated radiosity light maps applied as secondary textures.)
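For readers wondering what 'applied as secondary textures' means in practice: the expensive radiosity solution is computed offline and stored as a low-resolution texture of light values, and at run time each pixel's surface colour is simply multiplied by the corresponding light map value. A minimal sketch, with purely illustrative names (real engines do this per pixel on the GPU):

    def shade(diffuse_texel, lightmap_texel):
        # Modulate the surface colour by the pre-baked radiosity light value.
        # Both arguments are (r, g, b) tuples with components in 0.0 .. 1.0.
        return tuple(d * l for d, l in zip(diffuse_texel, lightmap_texel))

    # A red wall texel lit by a dim, slightly warm bounced light:
    print(shade((0.8, 0.1, 0.1), (0.4, 0.35, 0.3)))   # -> (0.32, 0.035, 0.03)

The catch, and the reason Kirk's prediction matters, is that a baked light map is frozen: move a light or a wall and the stored solution is wrong, which is exactly what real-time radiosity would fix.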

It's interesting to look back at the rapid development in consumer graphics. As late as 1997, SGI's multimillion-dollar Onyx2 InfiniteReality dual-rack system (consuming some 14,000 watts of power) was among the best the graphics biz had to offer. In 2005, a GeForce 6800 graphics card can be bought for $300, and performance-wise it completely outclasses the Onyx2 setup.

So what about the movies?

The 90's saw many milestones in cinematic special effects. This decade will be no different; the milestones will not be the same, but there will be new ones. What lies ahead can be summarized in three words: faster, better, cheaper! Long gone are the times when ILM was the only company that could deliver state-of-the-art special effects. Nowadays competition is fierce, and that results in those three words I mentioned above...
The times when people invented revolutionary new algorithms to render photo-realistic scenes are more or less behind us. Currently, photo-realism IS possible, but what still prevents us from creating, for example, perfect humans lies within the creation process, not the rendering. The software used to model, animate and texture objects is still too clumsy. It's still nearly impossible for an artist to model and texture, say, human skin so that it would pass as photo-realistic at close range. Other, less complicated objects, such as spaceships, can already be rendered photo-realistically. Be sure though, that the first decade of the new millennium will see photo-realistic humans in a movie. After seeing Star Wars Episode 2, I have high hopes for the 2005 release of Episode III. 100% realistically rendered humans may be the final milestone to achieve before computer graphics really can replace anything. (That doesn't mean it will be practical to do so, but Gollum in LOTR certainly made some real actors look mediocre.) Look forward to an amazing decade of CGI!

All the brands and products mentioned in this article are copyrighted by their respective owners.




