http://www.swiftless.com/tutorials/glsl/1_setup.html
http://glsl.heroku.com/
http://nehe.gamedev.net/article/glsl_an_introduction/25007/
http://www.iquilezles.org/www/material/nvscene2008/nvscene2008.htm
http://www.youtube.com/watch?v=Hbw1Xx7K0xc&list=UUEbJErBsmPdnbhxrjAw56Vw&index=4&feature=plcp
http://www.rozengain.com/blog/2010/08/10/using-webgl-glsl-shaders-to-create-a-tunnel-effect/
http://http.developer.nvidia.com/GPUGems3/gpugems3_ch01.html
http://www.geisswerks.com/about_terrain.html
http://www.rpenalva.com/blog/?p=254 (raymarching)
http://www.iquilezles.org/www/articles/terrainmarching/terrainmarching.htm (terrain raymarching)
http://www.dasprinzip.com/prinzipiell/2011/03/22/ray-marching-the-distance/
http://www.cs.uic.edu/~kjoshi/GPU_Project3/GPU_Project3.html (terrain project)
http://www.youtube.com/watch?v=IdYP0PUPQpA (procedural terrain)
http://illogictree.com/blog/masters-thesis/ (procedural grass)
http://alteredqualia.com/three/examples/webgl_terrain_dynamic.html (webgl dynamic terrain)
http://www.mit.edu/~danz/marti/intro/lights/high/0.html (intro to raytracing)
http://fuzzyphoton.tripod.com/howtowrt.htm (how to raytrace)
http://glsl.heroku.com/e#1686.0 (sphere example)
http://www.opengl.org/archives/resources/faq/technical/transformations.htm (transformations)
http://www.khronos.org/files/opengl-quick-reference-card.pdf (OpenGL quick reference card)
http://www.pouet.net/topic.php?which=8177&page=1&x=32&y=5 (raymarching intro)
http://cboard.cprogramming.com/game-programming/118838-matrices-confusion-modelview-projection.html (model-view-projection)
http://db-in.com/blog/2011/04/cameras-on-opengl-es-2-x/ (camera openGL)
Wednesday, March 14, 2012
Monday, March 12, 2012
More Cities: GPU Final Project Proposal
Procedurally Generating an Infinite City
Procedural generation has always been a topic of interest for me. Last year I worked on procedurally generating cities on the CPU (in Maya), and this project feels like a good opportunity to experiment with something similar on the GPU.
Currently there is little work on generating view-dependent cities, and existing GPU-based implementations of city generation are largely limited to commercial software such as CityEngine. For my project, I plan to write an application that generates buildings on the fly based on view parameters (and a maximum view distance, beyond which the scene is obscured by fog or shadows), essentially creating an infinite city.
At the moment I am considering two approaches. The first method would procedurally generate buildings in CUDA, with the threads arranged in 2D blocks and each thread calculating the vertices of a building from its block and thread indices. These vertices are pushed into vertex buffers, which, along with textures, are shared with OpenGL for rendering. Each frame, the vertex buffers are sorted front to back. New buildings are generated as needed; when a free vertex buffer is required, the most distant building is replaced. Building heights can be randomized with a fractal algorithm such as the diamond-square algorithm, which is commonly used in procedural terrain generation.
The alternative approach is to ray-march distance fields in a GLSL fragment shader. In this case, a viewport-aligned quad is rendered as two triangles; no additional geometry is generated. Instead, for every pixel the fragment shader marches a ray into a scene described entirely by distance functions.
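The core of this approach is sphere tracing: at each step the ray advances by the scene's distance value, which is safe because nothing can be closer than that. A minimal sketch, again in Python rather than GLSL, using an axis-aligned box as the building primitive (the function names and the test scene are illustrative assumptions):

```python
import math

def box_sdf(p, half_extents):
    """Signed distance from point p to an axis-aligned box centred
    at the origin -- a natural primitive for city buildings."""
    qx = abs(p[0]) - half_extents[0]
    qy = abs(p[1]) - half_extents[1]
    qz = abs(p[2]) - half_extents[2]
    outside = math.sqrt(max(qx, 0.0) ** 2 + max(qy, 0.0) ** 2 + max(qz, 0.0) ** 2)
    inside = min(max(qx, qy, qz), 0.0)
    return outside + inside

def raymarch(origin, direction, sdf, max_steps=128, max_dist=100.0, eps=1e-4):
    """Sphere tracing: advance along the ray by the scene distance
    until we hit a surface (distance < eps) or leave the scene."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t          # hit: distance along the ray
        t += d
        if t > max_dist:
            break
    return None               # miss

# A ray fired down the z-axis at a unit box whose centre is 5 units away.
scene = lambda p: box_sdf((p[0], p[1], p[2] - 5.0), (0.5, 0.5, 0.5))
hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene)  # hit is 4.5
```

In the shader, a whole city grid can be described by folding the ray position with a modulo before evaluating the box distance, so a single distance function repeats infinitely without any extra geometry.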
The first step of my project will be to explore both methods and choose the one that produces the best results. The rest of the project will focus more on procedural generation and less on individual buildings, although I hope to add more detail as time allows.
References:
Terrain Generation and Smoothing
Experiment: Cityscape with Raymarching
Raymarching Distance Fields
Modeling with Distance Functions
Email correspondence with Patrick
Tuesday, March 6, 2012