Stony Brook University has unveiled its latest engineering feat: the Reality Deck, a 1.5 billion pixel display. The Reality Deck, a 416-screen, super-high-resolution, four-walled surround-view virtual reality theater driven by a graphics supercomputer, is the highest-resolution immersive display ever built. Its purpose and primary design principle are to enable scientists, engineers, and physicians to tackle modern problems that require the visualization of vast amounts of data.
Reality Deck by the numbers
• 416 high-resolution displays
• 1.5 billion pixels total (the first display to break the one-billion-pixel mark)
• Five times the pixel count of the world's second-largest display
• Immersive 4-wall layout in a 33’x19’x10’ room with a tiled-display door
• 20-node visualization cluster
• 240 CPU cores – 2.3 TFLOPs performance, 1.2 TB distributed memory
• 80 GPUs – 220 TFLOPs performance, 320 GB distributed memory
• High-performance sound system with 22 speakers and four subwoofers
Visualization applications
• Large gigapixel panoramic images – e.g. 45 gigapixel photograph of Dubai, United Arab Emirates; 6 gigapixel infrared telescope view of the Milky Way
• Large architectural models – e.g. 40 million polygon model visualized at interactive frame rates
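As a sanity check on the headline figure, the 1.5 billion pixel total is consistent with 416 panels at a common 2560×1440 (QHD) resolution; the per-panel resolution here is an assumption for illustration, not a spec stated above:

```python
# Back-of-the-envelope check of the Reality Deck's headline pixel count.
# ASSUMPTION: each of the 416 panels is a 2560x1440 (QHD) display;
# the article does not state the per-panel resolution.
PANELS = 416
WIDTH, HEIGHT = 2560, 1440

total_pixels = PANELS * WIDTH * HEIGHT
print(f"{total_pixels:,} pixels")               # 1,533,542,400
print(f"~{total_pixels / 1e9:.2f} gigapixels")  # ~1.53 gigapixels
```

Under that assumption the total comes out to about 1.53 gigapixels, matching the quoted 1.5 billion figure.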
“This technology will be used for visualizing and analyzing big data, such as advanced medical imaging, protein visualization, nanotechnology, astronomical exploration, micro tomography, architectural design, reconnaissance, satellite imaging, security, defense, detecting suspicious persons in a crowd, news and blog analyses, climate and weather modeling, as well as storm surge mapping to fight flood disasters, such as Superstorm Sandy and global warming,” said Dr. Kaufman.
“The Reality Deck is the next generation virtual reality display at the vanguard of visual computing, with the ability to handle tasks involving huge amounts of data,” said Dr. Kaufman. “In the Reality Deck, data is displayed with an unprecedented amount of resolution that saturates the human eye, provides 20/20 vision, and renders traditional panning or zooming motions obsolete, as users just have to walk up to a display in order to resolve the minutiae, while walking back in order to appreciate the context that completely surrounds them.”
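Dr. Kaufman's claim that the resolution "saturates the human eye" can be checked with simple geometry: 20/20 vision resolves roughly one arcminute, so a pixel becomes indistinguishable once it subtends less than that angle. The sketch below estimates the viewing distance at which that happens, assuming hypothetical 27-inch 2560×1440 panels (a per-panel spec not given above):

```python
import math

# Angular-resolution sanity check for the "saturates the human eye" claim.
# 20/20 vision resolves features of roughly 1 arcminute.
# ASSUMPTION: 27-inch 2560x1440 panels; not stated in the article.
DIAGONAL_IN = 27.0
WIDTH_PX, HEIGHT_PX = 2560, 1440

diagonal_px = math.hypot(WIDTH_PX, HEIGHT_PX)
ppi = diagonal_px / DIAGONAL_IN        # pixels per inch
pixel_pitch_m = 0.0254 / ppi           # physical pixel size in meters

one_arcmin = math.radians(1 / 60)      # ~0.00029 rad
# Distance at which a single pixel subtends exactly one arcminute:
d = pixel_pitch_m / math.tan(one_arcmin)
print(f"pixel pitch ≈ {pixel_pitch_m * 1000:.3f} mm")
print(f"pixels blend together beyond ≈ {d:.2f} m")
```

Under these assumptions, individual pixels become unresolvable from roughly 0.8 m away, which matches the quote's point: viewers walk toward a wall to resolve minutiae and step back for context, instead of panning or zooming.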
Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog, Nextbigfuture.com, is ranked the #1 science news blog. It covers many disruptive technologies and trends, including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting-edge technologies, he is currently a co-founder of a startup and a fundraiser for high-potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.