Here's a list of things I learned while I was there. Let me know if you need more details.
Both Barco and Christie offer DLP stereo projectors. They trade bit depth for stereo so the colors are not as good.
One of the papers I saw put a net in a Cave where users could grab and pull the net to interact with and navigate through the VE.
Microsoft DirectX 8.0 supports multi-monitor hardware acceleration.
Some guys out in Japan have built a huge (6.3m by 4m by 1.5m) concave projection wall using 24 LCD projectors. They have a nice technique for seamless geometric calibration. However, their color calibration was off, so it did not look that great.
Some guys at Vrije University have developed a competitor to WireGL called Aura Broadcast, www.icwall.org/.
I saw an interesting input device at the Haptics symposium which looks at the color of the fingernail to detect finger pressure. It was developed at MIT.
Procter & Gamble gave a talk and claimed that they could not see vortical structures in their desktop environment but were able to identify them in a Cave.
York University is building a six-sided Cave. They have developed their own hybrid tracking system which uses accelerometers and computer vision.
There was a paper on walk-through screens where they created a laminar flow with fans and fog and projected the images onto the flow. www.cs.tut.fi/~ira/wave.html
If we want to do spatialized sound, there is an open-source API called OpenAL.
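As a taste of what OpenAL gives us for free, here is the inverse-distance attenuation model from the OpenAL spec (its default clamped distance model), sketched in a few lines of Python. The function name and defaults are mine for illustration, not OpenAL's API, and I have left out the max-distance clamp for brevity:

```python
def inverse_distance_gain(distance, reference_distance=1.0, rolloff=1.0):
    """Per-source gain under OpenAL's clamped inverse-distance model.

    Distances closer than the reference distance are clamped, so the
    gain never exceeds 1.0. All distances use the same arbitrary units.
    """
    d = max(distance, reference_distance)
    return reference_distance / (
        reference_distance + rolloff * (d - reference_distance)
    )

# A source at the reference distance plays at full gain; with the
# default rolloff, doubling the distance halves the gain.
print(inverse_distance_gain(1.0))  # 1.0
print(inverse_distance_gain(2.0))  # 0.5
```

The point is that OpenAL handles this kind of per-source gain, plus panning and Doppler, internally once you set source and listener positions, so we would not have to write it ourselves.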
Anthony Steed presented a paper on how to give good Cave demos by using mixed reality. This is the augmenting-the-Cave-room idea that we had talked about. He has some very interesting ideas.
Princeton's wall is now up to 18 feet by 8 feet and has 6K by 3K resolution. The general consensus regarding multi-projector walls is that geometric calibration to make the wall seamless is not the issue. The real problem is color calibration.
I met with a guy from Fakespace, and they are interested in the FingerSleeve. We are going to send them a device that connects to the Pinch Glove box, which they want to use for some market research. Intersense, Mechdyne, and 3rd Tech were interested in seeing that device developed commercially (i.e., if it was available they would buy it).
Intersense now has wireless tracking for their MiniTrax system. You have to wear a small RF unit (about the size of a cell phone). They are planning to do wireless tracking for the stylus and wand as well.
I met Kay Stanney. She is a professor at UCF and does a significant amount of research on cybersickness.
A paper was presented on the Reflex HMD, a method of augmenting an HMD to help compensate for lag. It needs additional hardware and does not work well at low frame rates, but it presents an interesting alternative to prediction.
The HIT Lab people developed an eye tracker for the virtual retinal display. This won the best paper award.
There was an interesting paper on haptic bone dissection and one on visualizing billion-atom systems, although I do not know how novel their approaches are.
I presented the pop-through button paper, and it was well received. I had about 20 people come see me afterward to try out the device. Carlo Sequin was not there.
I also met Mark Livingston and Simon Julier (tracking guys at NRL) who are interested in possible collaborations.
Joseph J LaViola Jr.
Brown University, Dept. of Computer Science, Box 1910, Providence, RI 02912 Phone: 401-863-7662 Fax: 401-863-7657 URL: www.cs.brown.edu/people/jjl