Thursday, December 24, 2009
Best Strategy Is Speed (Startup2Startup May 2008)
Sound quality evaluated virtually
Ford's quality process is decidedly drawing on new technologies. After robot-based evaluation of seat texture, the American carmaker has announced a process for simulating the sound quality of its vehicles. The Virtual Vehicle Sound Simulator fine-tunes a vehicle's interior sound level from the very start of the production process. The savings in both time and money are substantial, since it reduces the number of real-world tests that must be run on test tracks and in wind tunnels during prototyping. With the simulator, Ford's sound engineers will be able to run the same kinds of tests already performed for vehicle design: based on a vehicle's virtual drawing, the software realistically simulates the sounds it will produce.
76 different sound simulations
These simulations can also be run under different road conditions and speeds, and with various equipment, loads and so on. Braking and acceleration are part of the evaluated panel as well. This will make it possible to ensure that the sound produced by each part of the vehicle blends as harmoniously as possible with the others. Until now, engineers had to test the sound quality of each component one by one, under every condition. "We have created a battery of nearly eighty different sound simulations based on data gathered from proposed vehicles," explains Mark Clapper, technical manager at Ford. "Using the simulator, we are able to recognize sounds with our ears that we could never have identified otherwise from that mountain of data."
Improving customer satisfaction
"We know that sound quality inside the vehicle is a major factor in customer satisfaction," explains Mark Clapper. The system finally makes it possible to consult consumers even before the vehicle has been produced. By fitting them with headphones and asking for their opinion on the sounds of proposed vehicles, problems can be identified very early in the production process. "Sometimes a sound that looks acceptable on paper sounds very bad when you actually hear it," the technical manager adds. "Simulation lets us eliminate unpleasant sounds even before the designers put their ideas down on paper."
Source: L'Atelier
Tuesday, December 22, 2009
Tiny glitter-sized photovoltaic cells could revolutionize solar power
Scientists from Sandia National Laboratories have developed tiny, glitter-sized photovoltaic cells that are ten times thinner than conventional solar cells and could one day be used in a variety of applications – from satellites and remote-sensing, to tents and perhaps even clothing. Yep, these cells could turn the average Joe into a walking solar-battery charger.
The Sandia research team identified more than 20 benefits of scale for these tiny cells over traditional solar cells, including better performance, greater efficiency and possibly reduced costs. "Eventually units could be mass-produced and wrapped around unusual shapes for building-integrated solar, tents and maybe even clothing," said Sandia lead investigator Greg Nielson. "This would make it possible for hunters, hikers or military personnel in the field to recharge batteries for phones, cameras and other electronic devices as they walk or rest."
While solar-charged clothing is not a particularly new concept, these solar particles, made from crystalline silicon, are expected to have more applications, cost less and achieve greater efficiencies than photovoltaic modules built from conventional six-inch-square cells. In addition, the team believes that modules made from the tiny cells could have intelligent controls, inverters and storage integrated at the chip level. Such an integrated module could reduce problems such as the cumbersome design and grid-integration processes currently experienced by solar technical assistance teams.
Cheap as chips
The cost reduction is due partly to the fact that the microcells don't need much material to become highly efficient, well-controlled devices. At just 14 to 20 micrometers thick (a human hair is approximately 70 micrometers), they are ten times thinner than a conventional 6 x 6 inch solar cell, yet they are still capable of being used in large-scale power production. This could mean lower manufacturing and installation costs compared with current photovoltaic techniques.
Sandia researcher, Murat Okandan, said, “So they use 100 times less silicon to generate the same amount of electricity. Since they are much smaller and have fewer mechanical deformations for a given environment than the conventional cells, they may also be more reliable over the long term.”
Using a commercial machine called a pick-and-place, up to 130,000 pieces of glitter can be placed per hour at electrical contact points pre-established on the substrate. The cost is estimated at one-tenth of a cent per piece and it is expected each module will contain 10,000 to 50,000 cells per meter – depending on the level of optical concentration required.
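As a rough sanity check on the figures above (the placement rate and per-piece cost are from the article; the per-module arithmetic is ours):

```python
# Back-of-the-envelope check of the pick-and-place figures quoted above.
# Throughput and price come from the article; module sizes are the
# article's stated range of cells per module.
PLACEMENT_RATE = 130_000        # cells placed per hour
COST_PER_CELL = 0.001           # dollars (one-tenth of a cent per piece)

for cells_per_module in (10_000, 50_000):
    hours = cells_per_module / PLACEMENT_RATE
    cost = cells_per_module * COST_PER_CELL
    print(f"{cells_per_module:>6} cells: ~{hours * 60:.0f} min placement, "
          f"~${cost:.0f} placement cost per module")
```

Even at the upper end of the range, placing the cells for one module takes well under half an hour and tens of dollars, which is where the projected cost advantage comes from.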
Low-cost solar concentrators can be placed over each cell, increasing the number of photons converted into electrons via the photovoltaic effect. And although the cells are small, they have a high voltage output, which will reduce the costs associated with wiring.
Smaller can sometimes be better
As the cells are so small, they can be manufactured from commercial wafers of any size, and if one cell is defective it can simply be replaced rather than replacing an entire brick-sized unit. These small cells have individualized wiring, eliminating the need for thicker power lines to cope with the increased power. There is also less of an issue with shading from overhead obstructions. "The shade tolerance of our units to overhead obstructions is better than conventional PV panels," said Nielson, "because portions of our units not in shade will keep sending out electricity where a partially shaded conventional panel may turn off entirely."
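Nielson's shade-tolerance point can be illustrated with a deliberately simplified model; this is a sketch of the general principle, not Sandia's actual electrical design (bypass diodes and real current-voltage curves are ignored):

```python
def series_string_output(cell_irradiance):
    """Conventional panel: cells wired in series, so the string current
    is limited by the worst-lit cell (bypass diodes ignored here)."""
    return min(cell_irradiance) * len(cell_irradiance)

def independent_cells_output(cell_irradiance):
    """Micro-cell module: each cell wired individually, so each one
    contributes whatever its own illumination allows."""
    return sum(cell_irradiance)

# Ten cells, one fully shaded, the rest in full sun (1.0 = full output).
irradiance = [1.0] * 9 + [0.0]
print(series_string_output(irradiance))      # 0.0 -> whole string off
print(independent_cells_output(irradiance))  # 9.0 -> 90% still delivered
```

In this toy model a single shaded cell knocks out the entire series string, while the individually wired cells lose only the shaded fraction of output.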
Change is good
Although the move from manufacturing conventional silicon wafers to manufacturing microscale PV cells is a huge change, the team believes the transition would be relatively straightforward, because it would use techniques drawn from the microelectromechanical systems (MEMS), electronics and LED industries.
The cells will be formed on silicon wafers, etched and produced in a hexagonal shape with electrical contacts contained on each shape. At this stage, electricity can be harvested at a rate of 14.9 percent efficiency from the cells – this compares favorably with commercial modules which have a range of 13 to 20 percent efficiency.
Source : Gizmag, 21/12/09
Hugo Boss and augmented reality: how do you draw shoppers in?
The brand invites customers to come into the store and hold up a card they have received to a screen. A game of blackjack then starts, giving them the chance to win gift vouchers.
Publishers and record companies are not the only ones exploiting augmented-reality solutions for marketing purposes. In partnership with Total Immersion, Hugo Boss has launched Black Magic in its Sloane Square store in London. The operation lets customers win up to £250 in store vouchers if they take part in a virtual game of blackjack. All they have to do is hold up to a screen a brand card they have been given, or the promotional double-page spread inserted in the free magazine Stylist and Shortlist. On screen, a virtual deck of cards is then shuffled.
Augmented reality and black magic
A card is then drawn from the deck. Anyone holding a card identical to the one just selected receives their vouchers. The virtual environment was created in an atmosphere inspired by black magic, in keeping with the brand's visual identity. "Augmented reality takes the shopping experience to another level. It is a technology not to be overlooked for boosting store traffic and sales," notes Myles Peyton of Total Immersion.
A high response rate
Such a system aims to draw into the store people who were not necessarily planning to buy the brand's products by offering them this playful experience. According to the project leads, a considerable number of people came into the store after leafing through the magazine. "We were surprised by the readers' response rate. They seem to have appreciated the opportunity to interact with the technology," explains a member of the London agency SimonandJohn, which ran the campaign. The campaign will last three weeks. The installation was built by Crossplatform.
Source: L'Atelier
Virtual mobile operators freed from traditional carriers?
Using geolocation and cognitive radio, a Chinese team proposes that MVNOs identify network capacity that carriers are under-using and exploit it. A promising technique, but not yet applicable in France.
Mobile virtual network operators (MVNOs) usually lease part of a licensed carrier's network under long-term contracts. A system proposed by Chinese researchers* aims to make these players less dependent on any single carrier, using the principle of "cognitivity": it would give them the ability to know in real time which parts of the networks are currently unused, so they can exploit that capacity themselves, with the agreement of the networks' owners. For this, the researchers rely on geolocation or cognitive radio techniques**. The advantage would be greater flexibility in the investments committed, and therefore in the prices offered, as a function of demand. For cases where this approach falls short (the amount of network bandwidth this type of service can obtain is highly uncertain), they propose falling back on short-term leases, which cost more.
Toward multi-operator management?
To determine when one technique or the other is the most appropriate, they have developed a model that computes the relevant thresholds. An interesting system, but not yet applicable in France, Jérome Birba, executive director of NRJ Mobile, tells L'Atelier: "In France we are very far from being able to switch virtual operators' traffic from one network to another," he says. "That said, it is conceivable, and we are discussing it." According to him, these discussions concern giving MVNOs control over network traffic management: within their host carrier at first, and perhaps eventually across multiple carriers. "Technically it is possible," Jérome Birba explains. "But MVNOs would need to be identified by a network code, just like the carriers, so that their customers could be switched from one network to another."
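The threshold idea can be sketched as a toy cost model; the prices and the simple fill-then-top-up rule below are illustrative assumptions, not the researchers' actual formulation. Opportunistically sensed spare capacity is cheap but uncertain in quantity, so any shortfall is covered by the more expensive short-term lease:

```python
# Toy illustration (not the researchers' actual model): an MVNO serves
# demand first from cheap, opportunistically sensed spare capacity,
# then tops up with expensive short-term leases for whatever remains.
COGNITIVE_PRICE = 1.0    # hypothetical cost per unit of sensed capacity
SHORT_LEASE_PRICE = 3.0  # hypothetical cost per unit of short-term lease

def provisioning_cost(demand, sensed_capacity):
    """Cost of serving `demand` units of traffic given how much spare
    capacity was actually found on the carriers' networks."""
    from_cognitive = min(demand, sensed_capacity)
    from_lease = demand - from_cognitive
    return from_cognitive * COGNITIVE_PRICE + from_lease * SHORT_LEASE_PRICE

print(provisioning_cost(demand=100, sensed_capacity=100))  # 100.0: all cheap
print(provisioning_cost(demand=100, sensed_capacity=40))   # 220.0: topped up
```

Under assumptions like these, the break-even thresholds the researchers compute would tell the MVNO how much uncertainty in `sensed_capacity` it can tolerate before committing to leases up front.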
Acting as intermediaries for smaller MVNOs
Something that is already practiced in some Northern European countries. The researchers' system also proposes that MVNOs act as intermediaries between network owners and other, secondary virtual operators. Virtual operators could thus optimize their bandwidth purchases by reselling any surplus. Here too, the problem in France is that MVNOs do not really lease network spectrum; rather, they buy minutes, SMS and data wholesale. "You cannot sublease something you do not lease," the executive director explains. What remains conceivable, in his view, is that virtual operators with strong bargaining power resell their minutes and SMS to other, smaller virtual operators.
* The Chinese University of Hong Kong and the City University of Hong Kong are involved in this project.
** Cognitive radio automatically determines which frequencies are available, using an antenna that sweeps the spectrum and selects the signals deemed most relevant.
Source: L'Atelier
Monday, December 21, 2009
A space-based solar power plant by 2016?
California utilities push for solar, wind and carbon-capture projects
California regulators went out of this world today and gave the go-ahead to a power-purchase agreement involving the nation’s first solar power plant in space.
Pacific Gas & Electric Co., the state’s largest utility, will proceed with a 15-year contract with Manhattan Beach start-up Solaren Corp., after receiving approval from the California Public Utilities Commission.
The project, which is expected to go live in 2016, will use solar cells from Solaren on orbiting satellites to convert energy from the sun into radio-frequency waves. The waves will be transmitted to a receiving station near Fresno and converted back into electricity.
The project should produce 1,700 gigawatt-hours of energy each year, according to the commission. The Japanese government said this summer that it intends to pursue a similar space-based solar program.
California hopes that utilities will pull 20% of their power from renewable sources by 2010. Gov. Arnold Schwarzenegger signed a directive in September pushing for a 33%-by-2020 goal. San Francisco-based PG&E was also busy today, signing a contract to buy and operate its first wind-energy project.
Portland, Ore.-based Iberdrola Renewables Inc., the U.S. branch of Iberdrola SA in Spain, will develop and build the Manzana Wind Project for PG&E. The project, which will be spread across 7,000 acres in the Tehachapi region of eastern Kern County, will cost slightly more than $900 million, the utility said.
The facility will produce up to 246 megawatts, or 670 gigawatt-hours of electricity a year, enough to power roughly 100,000 average California homes. Manzana could go online as early as December 2011 if the project is approved by the PUC.
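A quick check of the Manzana numbers quoted above (capacity, annual output and homes served are from the article; the capacity factor is the standard derived metric, computed here for context):

```python
# Sanity check on the Manzana Wind Project figures quoted above.
capacity_mw = 246       # nameplate capacity, from the article
annual_gwh = 670        # expected annual output, from the article
homes = 100_000         # homes served, from the article

max_gwh = capacity_mw * 8_760 / 1_000       # MW * hours/year -> GWh
capacity_factor = annual_gwh / max_gwh      # fraction of nameplate realized
kwh_per_home = annual_gwh * 1e6 / homes     # GWh -> kWh, spread over homes

print(f"capacity factor ~{capacity_factor:.0%}")      # ~31%
print(f"~{kwh_per_home:,.0f} kWh per home per year")  # ~6,700
```

A roughly 31% capacity factor is in the normal range for a wind farm, so the quoted output and homes-served figures are internally consistent.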
To finance the effort, customers could see their rates increase 1.1% in 2012 compared with 2009 rates, or an average increase of 25 cents each month, the utility said.
Iberdrola has 3,500 megawatts from operating projects in the U.S., as well as four facilities under construction, said Jan Johnson, a spokeswoman with the company. Also today, the California Public Utilities Commission gave approval for Edison International to spend up to $30 million to co-fund a feasibility study of a carbon-capture and storage plant.
Rosemead-based Edison, the parent of the Southern California Edison utility, will commit as much as $17 million to the first phase of the study, which will explore the permitting, engineering and economics of the Hydrogen Energy California (HECA) project. The project could become a 250-megawatt power station in Kern County that would supply the state with low-carbon, hydrogen-produced electricity. The hydrogen would come from gasifying resources such as petroleum coke from oil refineries, potentially lowering greenhouse gas emissions.
If deemed necessary, Edison could also pour up to $13 million of funding into a second phase of research, according to regulators. So far, the U.S. Department of Energy has spent $308 million supporting the HECA project. -- Tiffany Hsu
Source: Los Angeles Times
Sunday, December 20, 2009
Avatar: a breakthrough in filmmaking technology, a great achievement in Cinema at the frontier of Art, Design and Science
Avatar pushes computer graphics to the very edge of what's possible, particularly in IMAX 3D. Cameron is no stranger to game-changing special effects: with The Abyss he had a hand in creating the first fully digital 3D water effect, in Terminator 2: Judgment Day he delivered the first CGI human character with realistic movement, and in Titanic he advanced the rendering of flowing water. Avatar may be another step, but before we take it, we're looking back at Avatar's ancestors. This is how we got here; these are the movies that made Avatar and other computer-generated special effects possible. It all started right here…
The Thief of Baghdad (1940)
The First Use of Chroma Key
The First To Use Animated 3D Graphics
Tron (1982)
Extensive Use of 3D CGI
Star Trek: The Wrath of Khan (1982)
The Genesis Effect
Who Framed Roger Rabbit (1988)
Realistic CGI/Human interaction
Jurassic Park (1993)
Photorealistic CGI Integrated With Animatronics
Toy Story (1995)
First Feature Film To Use Only Computer-Generated Imagery
The Matrix (1999)
Bullet Time
Final Fantasy: The Spirits Within (2001)
Photorealism and Live Action Principals In An All Digital Feature Film
Lord of the Rings Trilogy (2001-2003)
The Polar Express (2004)
First Feature Film With All Motion Capture Characters
Immortel (ad vitam), Casshern, Sky Captain and the World of Tomorrow, and Sin City (2004-2005)
Use of The Digital Backlot
Photorealistic Motion Capture Using Virtual Camera System, and Facial Expression Cap Cameras
We have officially arrived at the here and now. A movie revolution took place at the end of 2009 - potentially offering as big a leap in our viewing experience as the change from black-and-white television to colour.
James Cameron, the film director who pushed technical effects to the limit with the 1997 blockbuster Titanic (11 Oscars and the highest-grossing revenue of all time), and who ushered in the dawn of action films with '80s classics such as Terminator and Aliens, has unleashed the film he has been hoping to make for nearly 20 years.
Behind Avatar : Science, Technology, Art and Design
Underscoring this two-and-a-half-hour epic lie unparalleled technological, scientific and artistic achievements, including the invention of a novel 3-D film camera, the complete biological and linguistic realization of a virtual world, and flawlessly integrated art direction and conceptual renderings. Many people's post-viewing reaction will be, "How did they do that?!" ScriptPhD.com is proud to present a special Avatar preview that includes behind-the-scenes secrets and a review of the must-own companion design book The Art of Avatar. Before you go see the movie, get to know it.
Fresh off the astronomical success of Titanic (11 Oscars and the highest-grossing revenue of all time), James Cameron could have done anything. He was King of the World, remember? Any film, any project, the sky was the limit. Instead, he disappeared, only to reemerge in 2005 to propose a new big-budget blockbuster to 20th Century Fox. The studio funded a $10 million, 5-minute prototype for Avatar, but hesitated to green-light full production, citing the 153-page script (first conceived in 1995), ambitious new video technology, and a story producers feared would alienate audiences. Only when Disney expressed interest in the film did Fox give Cameron the full go-ahead. The result is a movie with a final budget of over $230 million that required four years of full-time work to complete.
Avatar is a sweeping epic that takes place on fictitious Pandora, a distant moon in the Alpha Centauri-A star system that has been colonized by humans in the year 2154. Discovery of an abundant precious ore, unobtanium, that might solve Earth's energy crisis leads corporate and military interests to infiltrate the ranks of a native population of humanoids called the Na'avi. Because of a toxic atmosphere, human "drivers" link their consciousness to genetically engineered avatar models: 50% human DNA, 50% Na'avi DNA. Jake Sully, a paraplegic ex-Marine, has been called to take his dead brother's place for scientific exploration of Pandora's ecosystem, biosphere and indigenous peoples. Inadvertently drawn into learning the Na'avi culture and ways, Jake soon falls in love with the princess Neytiri and becomes caught in a battle between his own people and the virtual world he has adopted.
Avatar, however, transcends whatever story or theme one imagines to define it. It is a pinnacle of scientific and technological innovation, an ode to its filmmaker's vast travels (earthly and underwater) and intergalactic fascinations, and a harbinger of a filmmaking style that will redefine 21st Century cinema.
“This film integrates my life’s achievements,” Cameron said in a New Yorker profile earlier this fall. “It’s the most complicated stuff anyone’s ever done.”
The Technology
•Performance Capture
Motion capture and computer-generated imagery (CGI) are not new to film. Motion capture (or green screen technology) was first introduced by Cameron for Total Recall, with the first CGI human movements added later for Terminator 2. It is, however, inherently limited by the size and proportions of the human body, in particular those of the actor portraying the character. The eyes can't be moved, for example, and makeup often inhibits actor performance. CGI is traditionally done by placing reflective markers all over an actor's face and body, which are then interpreted by computer technology to create digitized expressions for the CG character. However, the gulf between human and CG expression, referred to as the "uncanny valley", is often quite noticeable. To bridge the two and create the first truly seamless hybridized CGI, Cameron and his team developed a new "image-based facial performance capture" system, requiring the actors to wear a special headgear rig equipped with a camera. With cameras placed just inches from their faces, the actors' every muscle contraction or pupil dilation was captured and digitized, lending astounding emotional authenticity to their Na'avi avatar counterparts. "If Madonna can be bouncing around with a microphone in her face and give a great performance," producer Jon Landau said in a New York Times interview, "we thought, 'Let's replace that microphone with a video camera.' That video camera stays with the actor while we're capturing the performance, and while we don't use that image itself, we give it to the visual-effects company and they render it in a frame-by-frame, almost pore-by-pore level." The clarity and precision of the head rig allowed for a much larger capture environment than ever before: a bare stage called the "Volume", six times larger than any previous capture stage.
James Cameron supervising digital effect production for Avatar. Photograph by Art Streiber.
•Digital animation
State-of-the-art animation renderings for Avatar were done by Peter Jackson's New Zealand-based digital-effects studio Weta Digital. A team of talented artists transferred basic renderings (more on this below) into photo-real images, particularly using new breakthroughs in lighting, shading, and rendering. "I've seen people looking at Avatar shots, being convinced they are somehow looking at actors in makeup," Jackson says. The realism extended to each leaf, tree, plant and rock, all rendered on Weta's computers. Additionally, a team of artists, headed by Academy Award winner Richard Taylor, designed the props and weapons for the Na'avi and humans. All of this digital design took over a year to complete and filled more than a petabyte, one thousand terabytes, of hard-drive space!
James Cameron shoots a scene from Avatar using new 3D camera technology. Image courtesy of 20th Century Fox.
•Stereoscopic 3D Fusion Camera System
As far back as ten years ago, James Cameron had wanted to develop a 3D camera. At the time, the concerted goal was to use it to shoot a gritty Mars movie that would act as an emblem for space exploration (Cameron is on the advisory board of NASA), but stereoscopic 3D cameras were then the size of washing machines and weighed 450 pounds. The challenge issued to production partner Vince Pace was to develop a lightweight, quiet camera capable of shooting in both 2D and 3D. The result of over seven years of hard work was the groundbreaking new Fusion Camera System, the world's most advanced 3D camera. It facilitated an almost flawless merger between live-action scenes and CG scenes. Most of the live-action scenes were shot in Wellington, New Zealand on sets constructed by a massive team of 150 contractors. Live-action sets included the link rooms (where the humans transported to their Na'avi avatars), the Bio-Lab, the Ops Center military operations area, and the Armor Bay military stronghold, which housed all the weapons and transport units.
•Virtual Camera/Simul-Cam
Tying together the 3D and CG technology of visualizing the film were two new Cameron intermediary inventions: the virtual camera and the simul-cam. The virtual camera, used by Cameron in the Volume motion capture stage, wasn’t actually a camera at all. Looking like a video game controller, it simulated a camera that was fed CG images by supercomputers surrounding the Volume. This allowed amplification of each small adjustment on the virtual production stage, from camera movement to actor interaction, to gauge the overall effect on the final big-screen cut. The simul-cam fed, in integrated real-time, CG characters and environments into the live action Fusion 3D camera eyepiece, allowing Cameron to direct virtual scenes on Pandora the same way he would a live-action scene.
Dr. Paul Frommer, surrounded by Na'avi words he created for Avatar. Photo ©2009 Los Angeles Times.
The Language
Not satisfied with merely creating an otherworldly planet and its native beings from scratch, James Cameron set about equipping the Na'avi humanoid tribe with a language of their own. Audiences will be delighted by the authenticity: all communication on Pandora is shown through subtitles befitting a foreign film. Ever the mindful scientist, Cameron hired USC professor and linguist Dr. Paul Frommer to engineer the dialect from scratch, resulting in a respectable, self-sufficient vocabulary of about a thousand words bound by a consistent sound system, grammar, orthography, and syntax. In an extensive interview with Vanity Fair, Dr. Frommer says that Cameron approached him as far back as 2005, when Avatar was going by the code name Project 880. As with every other aspect of this film, Cameron's genius micromanagement provided Dr. Frommer with the basics of the sound and structure he was looking for. "I didn't start from absolute ground zero," said Frommer, "because James Cameron had come up with, in the early script, maybe 30 words. Most of them were character names, but there were a couple of names of animals. So at that point I had a sense of some of the sounds that he had in his ear and it reminded me a little bit of some Polynesian languages."
In addition to painstakingly working on the syntax for over five years, which he compiled into a pamphlet entitled Speak Na'avi, Dr. Frommer worked closely on set with Avatar actors to ensure proper pronunciation and the phonetic differences between native Na'avi speakers and their human avatar counterparts. Beyond the film, Dr. Frommer is not done developing Na'avi, in the hope that it might take off like Klingon did post-Star Trek. "I'm still working and I hope that the language will have a life of its own," the professor said in an interview with the Los Angeles Times. "For one thing, I'm hoping there will be prequels and sequels to the film, which means more language will be needed. I spent three weeks in May, too, working on the [Avatar] video game for Ubisoft, which is the name of a French company." (The Hollywood Reporter recently posted a terrific interactive preview of the video game.)
The Science
A technically adept filmmaker such as James Cameron could have been satisfied with simply allowing his 3D team to virtualize Pandora, especially since most of the world is synthesized from scratch. Instead Cameron, himself an accomplished diver and the brother of an engineer, painstakingly (some would say obsessively) set about populating the moon with flora and fauna using rigorous scientific methodology. He enlisted the help of UC Riverside botanist Jodie Holt, who became an expert on Pandora's vegetation and mentored Sigourney Weaver in portraying a botanist in the film. Each leaf, plant, creature, and weed was given an original Na'avi name, a Latin taxonomy, a biological description, population and occurrence data, and an ecology and ethnobotany. A brief interview with Dr. Holt about her role in assembling the biological vision behind Pandora's ecosystem is available online.
Incidentally, all of the information I have written about above and more is being compiled by the film’s writers, producers and directors into a 350-page tome called Pandorapedia, to be made available later this winter. Until then, for curious fans and film devotees looking to gain insight into the artistic development process of Avatar, ScriptPhD.com recommends the stunning design book The Art of Avatar, reviewed below.
Cinema, Meet Art + Design
The Art of Avatar ©2009 20th Century Fox and ABRAMS Books. All rights reserved.
While planning and technology were essential nuts and bolts of its cinematic architecture, they alone do not constitute the blueprint, or heart and soul, of a film such as Avatar, especially considering its ambitious multi-thematic plot and visual realm. Like any innovative, transformative concept or endeavor, Avatar began and ended with the simple sketch. In a gorgeous companion volume rich with 120 full-color illustrations, The Art of Avatar, written by Lisa Fitzpatrick with contributions from James Cameron and Peter Jackson, provides fully transparent visual documentation of the first phases of the development and conceptualization of Avatar. The book's 200+ works of art include compositions by renowned film artists (Rob Stromberg, Wayne Barlowe, Ryan Church, Ben Proctor and Cameron himself), descriptive script excerpts, commentary and a guide to each component's realization process. As Fitzpatrick notes in the book's copy, "it is the artist behind the technology that makes the images of this book, and ultimately the movie, so remarkable." Having seen both, ScriptPhD.com wholeheartedly agrees.
Suspension of disbelief: the cornerstone of any creative compact between a film and its audience, and a compulsory foundation for escaping into imagined worlds and embracing their characters and adventures. Notwithstanding this axiom, Avatar still represents a giant leap forward in the world of filmmaking, according to Peter Jackson, the Oscar-winning visionary behind The Lord of the Rings and, recently, District 9. "Every once in a while, we will see a movie that transcends cultural barriers, genre and taste—a film that lives on in the minds of the audience, years after the fact, a film with a story, characters and dialogue so memorable that it creates its own mythology," Jackson notes in his foreword to The Art of Avatar. A project of such magnitude can initially seem insurmountable. Executive producer Jon Landau said he felt like a NASA engineer in 1961 when President Kennedy announced we were heading for the moon; only the moon in question was Pandora.
Much of the initial design was jump-started by Cameron, himself a talented illustrator and meticulously descriptive screenwriter. His own sketches included largely preserved concepts of Na'avi clothing and signature physical appearance, including a complete design of the heroine Neytiri's painted face. The humans' interstellar vehicle, the Venture Star, came with an 11-page document on how the ship functioned, with physics and engineering details such as light-speed calculations, pod dynamics, engine thermodynamics and architectural plans. Cameron's Avatar script treatment (a plot and content synopsis) included visual primers such as "glowing phantasmagorical forest", "purple moss [that] reacts to pressure", "rings of green light", and "dreamlike, surreal beauty" that allowed artists to create accurate renderings of Pandora's biosphere (see the picture below).
Pandora's breathtaking panorama, observed by the avatar Jake and his native love interest Neytiri.
Content in The Art of Avatar is smartly divided into categories, including transport, science, gadgetry, and biology. Transport and gadgets, including the Valkyrie shuttle and the Samson utilitarian vehicle, are designed with layouts and specificities worthy of industry standards. Particularly impressive to The ScriptPhD were the details and accuracy of early renderings of the home-base biolab, clearly conceived by a man with a profound respect for science. Illustrations of labs are laid out to look like actual labs, with incredible attention to the link unit allowing the humans to transform into their avatars and the incubator tubes housing the avatars, by far the most challenging prop for Weta Digital to produce. As an example, the Armored Mobility Platform (AMP) suit used by the military to safely roam around Pandora was designed by TyRuben Ellingson to be functional and accessible (good viewing capacity, easy maneuvering, flexible joints, rearview mirrors). Pictured below is just one page of a multi-page design book for the AMP suit alone!
Engineering design of the AMP military suit.
The Na'avi princess Neytiri with the avatar Jake. Cameron describes her in the script: "She watches, only her eyes are moving. She is lithe as a cat, with a long neck, muscular shoulders, and nubile breasts. And she is devastatingly beautiful-for a girl with a tail." ©2009 20th Century Fox.
Biology illustrations are largely split categorically into flora/fauna and creatures. Pandora's many unique forests, the Home Tree and Floating Mountain key to the film's plot, the illuminated foliage, and the bioluminescent Fan Lizards and Woodsprites (original Cameron creations) each get individual renderings. Several pull-out pages trace the creative process step by step: Neytiri, for example, started as a pencil sketch, evolved into a clay figurine, then 3D artwork, and finally a screen version. Most interesting to note is the process of creating the original creatures of Avatar. The most critical and symbolic of these was the Banshee, a heroic creature that enjoys a lot of screen time in the film. It is meant to be a metaphor for the eagle and, as a transport vehicle for the Na'avi, to represent them as a flying culture. Take a look at the transition between an early biomechanical concept sketch by designer Wayne Barlowe and the final color digital drawings.
Early concept drawing of the Banshee bird, meant to employ biomechanical form and function.
The final color illustration.
In an exclusive epilogue, James Cameron confesses that he feverishly wrote Avatar in 1995 over the span of just three weeks, fueled largely by his own imagination, every piece of fantasy art ever created, and his rigorous, extensive experience under the ocean. While certain elements were specific to Cameron's preferences, namely the Thanator and the Viper wolf, which he drew himself, all others were achieved through a sometimes turbulent, always rewarding collaborative process with a talented team of artists, designers, scientists and technologists. "The goal [of the design] became to mix the familiar and the alien in a unique way," Cameron states. "To serve the metaphor and create a sense of familiarity for the audience, but to always be alien in the specifics."