Computer-generated graphics for video games have faced quite a few challenges over the years. Smooth surfaces have gotten pretty darn good, but hair, fur, and cloth have been much harder to re-create realistically. Researchers at Carnegie Mellon University and the University of California, Berkeley pressed some computers into six months' worth of service, all in the name of creating better digital cloth.
It took 4,554 CPU hours to generate 33 gigabytes of data aimed at figuring out the many ways a piece of cloth can move. This research could end up boosting the quality of things like wizard's robes and superhero capes blowing in the wind in video games. The paper that outlines the results is titled "Near-exhaustive Precomputation of Secondary Cloth Effects."
The simulations run for the study looked at how cloth behaves around a human figure. "I believe our approach generates the most beautiful and realistic cloth of any real-time technique," said Adrien Treuille, an associate professor of computer science and robotics at Carnegie Mellon.
The brute-force approach of applying massive amounts of computing power helped the researchers capture some very delicate effects. "The clothing in our demos does not simply return to a single rest state: hoods fall off and clothes wrap around the character in diverse ways," reads the study.
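To make the precomputation idea concrete: the general approach is to simulate many cloth configurations offline, then at runtime retrieve a stored configuration that matches the character's current pose, which is cheap enough for a game. The sketch below is a toy illustration of that lookup pattern only; the array shapes, the `nearest_cloth_state` helper, and the use of plain nearest-neighbor search are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline phase (stand-in for hours of simulation): store many
# precomputed cloth states, each associated with a character pose.
poses = rng.standard_normal((1000, 8))           # pose parameters per sample
cloth_states = rng.standard_normal((1000, 300))  # flattened cloth vertex data

def nearest_cloth_state(pose, poses, cloth_states):
    """Runtime phase: return the precomputed cloth state whose
    associated pose is closest to the query pose."""
    dists = np.linalg.norm(poses - pose, axis=1)
    return cloth_states[np.argmin(dists)]

# Querying with a pose from the precomputed set returns its own state.
state = nearest_cloth_state(poses[42], poses, cloth_states)
```

A real system would interpolate or blend between stored states rather than snap to one, but the core trade is the same: enormous offline compute in exchange for a fast runtime lookup.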
If the study's results make their way into game development, it could mean an end to stiff clothing moving unnaturally around characters' bodies. It could also give virtual-world fashion a big boost, making gaming a lot more chic.