Leading New Developments in Visual Computing
by Digital Innovation Gazette
Advancements in computing technology involve more than simply increasing speed and reducing size and power consumption. Enhancements to features, capabilities and specifications go hand in hand with educating the development community and those who teach the trade.
Many universities and professors around the world are making outstanding contributions to computer science and information technology, including training fellow educators, reviewing beta curricula, creating case studies, blogging and more. Here, DIG highlights four standout projects.
University of North Carolina at Chapel Hill
Distinguished professor of computer science Dinesh Manocha, along with co-investigator Ming Lin, is developing a method of rendering sound from the physics of objects in motion. He oversees projects exploring sound synthesis (which creates sounds from the principles of physics) and sound propagation (which explores the movement of sound once it is emitted).
Numerous practical applications for the study of sound exist, from creating virtual simulators to manufacturing aircraft and automobiles.
Manocha’s team used multicore processing to solve key problems. For the sound-synthesis project, they developed what Manocha calls “a very simple algorithm in terms of computation cost that will exploit a set of features of human perception -- what we hear well and what we can’t hear well.”
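The perceptual shortcut Manocha describes can be illustrated with a toy version of modal sound synthesis: a vibrating object's sound is a sum of damped sinusoid "modes," and modes the listener cannot hear -- outside the audible frequency band, or too quiet -- can be skipped entirely. This is a minimal sketch of that general idea, not the team's actual algorithm; the function names and thresholds are illustrative assumptions.

```python
import math

# Rough limits of human hearing (assumed for illustration).
AUDIBLE_LOW_HZ, AUDIBLE_HIGH_HZ = 20.0, 20000.0

def synthesize(modes, duration, rate=44100, amp_threshold=1e-4):
    """Sum damped sinusoid modes, culling ones we can't hear.

    modes: list of (frequency_hz, amplitude, damping) tuples.
    Modes outside the audible band or below the amplitude
    threshold are skipped entirely -- the perceptual shortcut
    that keeps the computation cost low.
    """
    audible = [m for m in modes
               if AUDIBLE_LOW_HZ <= m[0] <= AUDIBLE_HIGH_HZ
               and m[1] >= amp_threshold]
    n = int(duration * rate)
    samples = [0.0] * n
    for freq, amp, damp in audible:
        w = 2.0 * math.pi * freq
        for i in range(n):
            t = i / rate
            samples[i] += amp * math.exp(-damp * t) * math.sin(w * t)
    return samples, len(audible)
```

Culling a mode removes its entire per-sample loop, so the savings scale with how much of the spectrum falls outside perception.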
The second problem was associated with sound propagation: “How does the sound wave -- which starts from the speaker, hits walls and gets reflected, deflected and refracted -- eventually reach the receiver?” asks Manocha. For this hurdle, the team used a processing system with 16 cores -- or four quad-core chips.
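Sound propagation parallelizes naturally: each ray from the speaker bounces independently, so rays can be partitioned across cores. The sketch below is a deliberately simplified stand-in (each wall bounce just absorbs a fraction of a ray's energy) using a worker pool sized to the 16-core system mentioned above; it is not the team's propagation code, and a real engine would trace geometry, not random bounce counts.

```python
import concurrent.futures
import random

def trace_ray(args):
    """Follow one sound ray through reflections, returning the
    energy that reaches the receiver. Toy model: a random number
    of wall bounces, each absorbing a fixed fraction of energy."""
    seed, max_bounces, absorption = args
    rng = random.Random(seed)  # per-ray RNG keeps rays independent
    energy = 1.0
    for _ in range(rng.randint(0, max_bounces)):
        energy *= (1.0 - absorption)
    return energy

def propagate(n_rays, workers=16, max_bounces=8, absorption=0.3):
    """Shoot n_rays and sum the received energy, splitting the
    rays across a worker pool (a stand-in for the 16 cores)."""
    jobs = [(i, max_bounces, absorption) for i in range(n_rays)]
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(trace_ray, jobs))
```

Because rays never interact, the work divides cleanly: four quad-core chips can each take a quarter of the rays with no coordination beyond the final sum.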
University of California, Berkeley
The primary area of interest for associate professor of computer science James O’Brien is computer animation, with an emphasis on generating realistic motion using physically based simulation and motion-capture techniques.
Apple Mac Pro computers -- each with two quad-core processors -- are at the heart of O’Brien’s research to render simulations of a flexible needle used in the treatment of prostate cancer. Current methods employ rigid needles and ultrasound, a difficult method that provides very little detail for the surgeon.
The flexible needle that O’Brien and his team are helping to develop has a beveled tip, allowing it to travel in a circular path and avoid vital organs and bones.
“The basic idea,” says O’Brien, “is to realistically model the process of inserting a needle into living tissue for the purposes of modeling biopsies -- a technique called brachytherapy -- where the physician wants to kill cancerous tissue without hurting healthy tissue. By rotating the base of the needle you can control what direction it curves in and steer it around parts of the body you don’t want to penetrate.”
There are also nonsurgical applications for his work. “One technique is a needle going into a human being, another is a Jedi warrior going around smashing things,” he says. “The two use very similar underlying simulation methods.”
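The steering principle O'Brien describes -- a bevel tip forces the needle onto a circular arc, and rotating the base chooses which way that arc bends -- can be sketched with the standard "unicycle" kinematic model. This is an illustrative toy, not the team's simulator; the constant-curvature assumption and function names are mine.

```python
import math

def needle_tip(arc_length, curvature, base_rotation_rad):
    """Toy bevel-tip needle kinematics: the tip follows a
    circular arc of fixed curvature, and rotating the needle
    base rotates the plane in which that arc bends.
    Returns (x, y, z) with z along the insertion axis."""
    k, s = curvature, arc_length
    # In-plane arc: advance along z while deflecting sideways.
    forward = math.sin(k * s) / k
    lateral = (1.0 - math.cos(k * s)) / k
    # Base rotation steers which direction the arc bends.
    x = lateral * math.cos(base_rotation_rad)
    y = lateral * math.sin(base_rotation_rad)
    return x, y, forward

# Rotating the base 180 degrees mirrors the curve, steering the
# tip to the opposite side -- the control O'Brien describes.
```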
St. Petersburg State Polytechnical University
Vladimir Belyaev was a technical lead at Driver-Inter Ltd., a Russian 3D-game developer affiliated with the St. Petersburg State Polytechnical University, where he worked on ways to improve the realism of simulated grass. He does most of his development on quad-core systems.
“Imagine waves across a grass field,” says Belyaev. “Then you see somebody in the grass and it divides, and you can see the trail behind the person. If you go down and see the trail close up, you’ll see each grass blade and some laying flat where they were stepped upon.”
Belyaev says handling all the variables -- whether dealing with storage, control or manipulation -- presented a challenge. “There is a lot of information you have to handle, because if you want to make this truly real, you have billions of values,” he says. “It was extremely challenging to make this all look seamless.”
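One way to appreciate the storage-and-control problem Belyaev describes is a toy per-blade state store: with millions or billions of blades, even one value per blade must be packed tightly, and every frame some of those values change (trampling) while others relax (recovery). The class below is an illustrative sketch, not Belyaev's implementation; the one-byte quantization and recovery rate are assumptions.

```python
class GrassField:
    """Toy per-blade bend storage: one byte per blade
    (0 = upright, 255 = flat), so memory stays manageable
    as blade counts climb into the millions."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.bend = bytearray(width * height)  # quantized bend state

    def trample(self, cx, cy, radius):
        """Flatten blades within radius of a footstep."""
        r2 = radius * radius
        for y in range(max(0, cy - radius), min(self.height, cy + radius + 1)):
            for x in range(max(0, cx - radius), min(self.width, cx + radius + 1)):
                if (x - cx) ** 2 + (y - cy) ** 2 <= r2:
                    self.bend[y * self.width + x] = 255

    def recover(self, rate=8):
        """Each frame, bent blades spring back a little,
        so the trail behind a person slowly fades."""
        for i, b in enumerate(self.bend):
            if b:
                self.bend[i] = max(0, b - rate)
```

Even this compact layout costs one gigabyte per billion blades, which hints at why Belyaev found storage, control and manipulation "extremely challenging" to make seamless.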
NHTV University of Applied Sciences
Jacco Bikker is a Ph.D. student and a senior lecturer in the four-year vocational International Game Architecture and Design (IGAD) program at the NHTV University of Applied Sciences in the Netherlands, where he teaches C/C++ and graphics programming.
A 10-year game industry veteran, Bikker is working on a real-time ray tracer called Arauna, which will render realistic 3D images of virtual worlds. The use of fast multicore processor technology is playing a significant role in helping to accelerate the engine.
“A sufficient frame rate for games is about 30 frames per second,” says Bikker. “For multiplayer, you want to see a higher rate. For resolution, we started with a low 400 pixels by 300 pixels because we anticipated we would have students developing on laptops.” With the latest processors, says Bikker, the engine reaches those frame rates at a resolution of 1024 by 768.
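A back-of-envelope calculation shows why faster multicore processors matter here: a real-time ray tracer must shoot at least one primary ray per pixel per frame, so the ray budget scales with resolution times frame rate. The figures below follow directly from the numbers Bikker gives; only the helper function is mine.

```python
def primary_rays_per_second(width, height, fps):
    """Minimum primary-ray budget a real-time ray tracer must
    sustain (shadow and reflection rays multiply it further)."""
    return width * height * fps

low = primary_rays_per_second(400, 300, 30)    # early laptop target
high = primary_rays_per_second(1024, 768, 30)  # on newer processors
# Moving to 1024 x 768 at the same 30 fps demands roughly
# 6.5 times as many rays per second.
```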
The Arauna ray tracer was built specifically with games and performance in mind. But whether researchers are improving cancer treatments or animation, multicore processing is helping visual computing deliver realistic, real-time simulations.
Article: Copyright © 2017 Studio One Networks. All rights reserved