Reprints of my postings to the SAN-Tech Mailing List and ...


[san-tech][01975] BlueArc press release ("Avatar")

Date: Fri, 08 Jan 2010 12:18:14 +0900
[san-tech][02132] Re: BlueArc press release ("Avatar")
"BlueArc Storage Solution Powers Visual Effects Behind "Avatar""
 Jan. 6, 2010

"12 Titan servers to store and manage over 500 terabytes of data
 feeding thousands of render nodes acting in concert to produce the
 special effects for Avatar, as well as an additional 700 terabytes
 of nearline storage."

"Weta spent close to two years on the production of "Avatar" and it
 was by far the most data-intensive project the company has ever
 worked on. Weta had 750 artists and a renderwall of more than 34,000 (...)"

Weta Digital (NZ) has been a BlueArc user since 2006:
"Weta Digital Turns to BlueArc Titan Storage System To Meet Unrelenting
 Demands of World-Class Special Effects", October 3, 2006
Weta Digital

From the blog of Shmuel Shottan, Chief Technology Officer, BlueArc:
"The most amazing scene factory in Wellington - My visit to see how
 Avatar was created", December 23, 2009
"Celebrating (almost) a quarter century of satisfying CGI’s insatiable
 appetite for speed", January 6, 2010

"Out of This World BlueArc Storage Solution Enables "Planet 51" to Take Off"
 Nov. 23, 2009
"BlueArc Storage Solution Stars in "District 9""
 Sept. 14, 2009
  Weta Digital was apparently involved.
"BlueArc Storage Solution Powers Animated Film 9"
 Aug 4, 2009

"BlueArc Titan clustered NAS helps put 'Planet 51' into orbit"
 04 Dec 2009,289142,sid5_gci1376014,00.html
"Starz Animation Beefs Up Storage Performance For Heavy Production Schedule"
 December 11, 2009
"Image Engine Consolidates Data Center, Ramps Up Storage For Performance
 Improvement", October 1, 2009
  In a mixed environment of Mac, Windows, and Linux blade servers, NFS makes things easy.

Date: Fri, 08 Jan 2010 13:38:56 +0900

"Lightstorm Entertainment Uses Isilon IQ on the Set of 'Avatar'" (Japanese press release)
"Lightstorm Entertainment Uses Isilon IQ to Power Production of "Avatar""
 December 18, 2009

 ... collaboration", 2010/01/06,2000056049,20406094,00.htm

"About a year ago, when production of James Cameron's sci-fi epic
 'Avatar' was already well advanced, the visual effects studio
 Weta Digital (...), which handled most of the project's Computer
 Generated Imagery (CGI), found itself stretched thin (...)"

"That was when Industrial Light & Magic (ILM) came on board,
 recalls John Knoll of that studio."


Since LightStorm Entertainment is in California and Weta Digital is in
New Zealand, they adopted Aspera's fasp for data transfer over that distance.

"Aspera software maximizes data transfer efficiency for the production
 of James Cameron's highly-anticipated new movie." December 17, 2009
Geoff Burdick, Lightstorm Vice President, Production Services & Technology:
 "We consistently transfer to New Zealand at 45 Mbps over a 45 Mbps link,
  a 15-30X improvement over standard FTP."
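The quoted figures can be sanity-checked with a bit of arithmetic. The 45 Mbps link speed and the 15-30X factor are from the quote above; the 1 TB payload size is an assumption chosen only for illustration:

```python
# Back-of-the-envelope check of the quoted "15-30X over FTP" figure.
# The 45 Mbps link rate comes from the quote; 1 TB is an assumed payload.

def transfer_hours(payload_tb: float, throughput_mbps: float) -> float:
    """Hours to move payload_tb terabytes at throughput_mbps megabits/s."""
    payload_megabits = payload_tb * 1e6 * 8   # TB -> megabits (decimal units)
    return payload_megabits / throughput_mbps / 3600

fasp_hours = transfer_hours(1.0, 45.0)        # fasp saturating the 45 Mbps link
ftp_hours  = transfer_hours(1.0, 45.0 / 15)   # FTP at the low end, 15X slower

print(f"fasp: {fasp_hours:.1f} h, FTP: {ftp_hours:.1f} h")
```

At the low end of the quoted range, a terabyte that fasp moves in about two days would take FTP about a month, which is why a WAN-optimized protocol mattered for a California-to-New Zealand pipeline.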

the fasp solution
Distributor in Japan: Netmarks

"The Avatar storage effect", 21st December 2009

Date: Fri, 22 Jan 2010 09:36:27 +0900
"'Avatar' post-production combines BlueArc and NetApp clustered NAS"
 21 Jan 2010,289142,sid5_gci1379274,00.html

Date: Tue, 09 Feb 2010 14:31:26 +0900
NetApp has issued a press release about the NetApp FlexCache combination mentioned below:

> I think this is a useful reference:
> "'Avatar' post-production combines BlueArc and NetApp clustered NAS"
>  21 Jan 2010

"Weta Digital and NetApp Bring Avatar to Life", February 3, 2010

 "Together, NetApp and Weta devised a system that allowed Weta to
  automatically balance the throughput requirements of the renderwall
  (the computer system used for real-time rendering) to provide more
  than 35,000 rendering cores with the fastest possible access to
  frequently used texture files. This solution was based on NetApp
  FlexCache to maintain high-speed access to the updated texture files
  to all users simultaneously. By taking advantage of NetApp FlexCache
  systems, Weta Digital reduced manual data movement and management
  by 95%, greatly reducing pressure on technical staff and speeding
  production time."
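The idea the press release describes is a read-through cache tier sitting between the renderwall and the origin filer: render nodes read hot texture files from the cache, which fetches from the origin only on a miss. A minimal sketch of that idea follows; the class and function names are hypothetical illustrations, not NetApp's FlexCache implementation:

```python
# Illustrative read-through LRU cache, sketching the caching idea behind
# a tier like NetApp FlexCache. All names here are hypothetical.

from collections import OrderedDict

class TextureCache:
    def __init__(self, origin_read, capacity: int):
        self.origin_read = origin_read      # callable: path -> bytes (origin filer)
        self.capacity = capacity            # max number of cached files
        self.cache = OrderedDict()          # LRU order: oldest entry first
        self.hits = self.misses = 0

    def read(self, path: str) -> bytes:
        if path in self.cache:
            self.cache.move_to_end(path)    # mark as most recently used
            self.hits += 1
            return self.cache[path]
        self.misses += 1
        data = self.origin_read(path)       # fetch from the origin on a miss
        self.cache[path] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used file
        return data

# A renderwall re-reads the same textures heavily, so hit rates stay high.
origin = lambda p: b"texels:" + p.encode()  # stand-in for the origin filer
cache = TextureCache(origin, capacity=2)
for p in ["grass.tex", "skin.tex", "grass.tex", "grass.tex"]:
    cache.read(p)
print(cache.hits, cache.misses)  # -> 2 2
```

Because texture reads are repetitive and mostly read-only, a cache like this serves the bulk of renderwall traffic locally, which is consistent with the 95% reduction in manual data movement the release claims.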

NetApp FlexCache

Behind the magic of Avatar
 "Weta's computing core ran on 2,176 HP ProLiant BL2x220c Blade Servers."

"Avatar, Hollywood and the Data Center", February 8th, 2010
