Further enhancements

We hope that we've given you some tools to get you going with your own music visualizations. As we've suggested throughout this chapter, the options are infinite. Unfortunately, space prohibits us from having too much fun coding more and more stuff here.

  • Animations: We have applied only the simplest transformations to each of our visualizations: a fixed position, a scale, and perhaps a 90-degree rotation. Naturally, the position, rotation, and scale can be animated, that is, updated each frame in coordination with the music, or independently of the music using Time.deltaTime (see the first sketch after this list). Stuff can be virtually flying all around you!
  • Advanced textures and shaders: Our shaders and data-driven textures are the most basic: they essentially render a single-color pixel corresponding to each audio byte value. The audio data can be fed into much more complex and interesting algorithms to generate new patterns and colors, or used to morph preloaded textures (see the second sketch after this list).
  • Texture mapping: The texture materials in the project are simply mapped onto a flat plane. Hey man, this is VR! Map the textures onto a photosphere or other geometry and totally immerse your users in it.
  • Render to texture: Our trails mode looks alright for these visualizations, but it will probably become a mess for anything sufficiently complex. Instead, you could confine the effect to the surface of your textured planes. Setting up render to texture (RT) is involved and beyond the scope of this book. Essentially, you introduce another camera to your scene, direct OpenGL to render subsequent draw calls to a new surface that you've created, and use that surface as the texture buffer for the objects you want to render it onto (the third sketch after this list shows the basic surface setup). RT is a powerful concept, enabling techniques such as reflections and in-game security cameras. Furthermore, you can apply transformations to the surface to make the trails appear to fly off into the distance, a popular effect among traditional visualizers such as MilkDrop (https://en.wikipedia.org/wiki/MilkDrop).
  • Parametric geometry: Audio data can be used to drive the definition and rendering of 3D geometric models of varying complexity. Think of fractals, crystals, and 3D polyhedra. Take a look at Goldberg polyhedra (refer to http://schoengeometry.com/) and Sacred geometry (refer to http://www.geometrycode.com/sacred-geometry/) for inspiration.
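To make the animation idea concrete, here is a minimal, self-contained sketch of frame-rate-independent spin-and-pulse values driven partly by the music. The class and field names are our own, and the Transform calls in the trailing comment are hypothetical placeholders for whatever your RenderBox-style component actually exposes; only Time.deltaTime and the idea of a 0..1 audio level come from the chapter:

    /**
     * A sketch of frame-rate-independent animation values, partly music-driven.
     * The class and field names are ours; apply the results with whatever your
     * Transform API provides (see the comment below).
     */
    public class SpinAndPulse {
        public float angleDegrees = 0f;   // current rotation about the Y axis
        public float scale = 1f;          // current uniform scale

        /**
         * Advance the animation by one frame.
         * @param deltaTime  seconds since the last frame (for example, Time.deltaTime)
         * @param audioLevel a 0..1 value you derive from the current waveform or FFT bytes
         */
        public void update(float deltaTime, float audioLevel) {
            // Constant-speed spin, independent of the music.
            angleDegrees = (angleDegrees + 45f * deltaTime) % 360f;
            // Pulse the scale in time with the music.
            scale = 1.0f + 0.5f * audioLevel;
        }
    }
    // Each frame (hypothetical RenderBox-style Transform calls):
    //   anim.update(Time.deltaTime, level);
    //   plane.transform.setLocalRotation(0, anim.angleDegrees, 0);
    //   plane.transform.setLocalScale(anim.scale, anim.scale, anim.scale);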
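For the advanced textures idea, the next sketch maps each raw waveform byte to a full RGBA color rather than a single-channel value and uploads the result into an existing texture. The class name, the particular color mapping, and the assumption that the texture was created earlier as a width x 1 RGBA texture are all illustrative; the GLES20 calls themselves are standard OpenGL ES 2.0:

    import android.opengl.GLES20;
    import java.nio.ByteBuffer;

    /** Sketch: map raw waveform bytes to a colored, one-row RGBA texture. */
    public class AudioTexture {
        /** Assumes textureHandle was created earlier as a width x 1 GL_RGBA texture. */
        public static void upload(int textureHandle, byte[] waveformBytes) {
            int width = waveformBytes.length;
            ByteBuffer pixels = ByteBuffer.allocateDirect(width * 4);
            for (int i = 0; i < width; i++) {
                int v = waveformBytes[i] & 0xFF;      // unsigned sample, 0..255
                pixels.put((byte) v);                 // R follows the waveform
                pixels.put((byte) (255 - v));         // G is its inverse
                pixels.put((byte) ((v * v) / 255));   // B emphasizes the peaks
                pixels.put((byte) 0xFF);              // opaque alpha
            }
            pixels.position(0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);
            GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, 1,
                    GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
        }
    }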
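And while a full render-to-texture pipeline is beyond the scope of this book, the heart of it, creating an offscreen framebuffer whose color attachment is a texture you can later map onto a plane, looks roughly like this in OpenGL ES 2.0. The class structure and how you hook begin() into your camera or draw loop are up to you; only the GLES20 calls themselves are standard:

    import android.opengl.GLES20;

    /** Sketch: an offscreen render target whose color attachment is a texture. */
    public class RenderTarget {
        public final int framebuffer, colorTexture, depthBuffer;
        private final int width, height;

        public RenderTarget(int width, int height) {
            this.width = width;
            this.height = height;
            int[] handle = new int[1];

            // Color texture that will receive the rendered pixels.
            GLES20.glGenTextures(1, handle, 0);
            colorTexture = handle[0];
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, colorTexture);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
                    0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

            // Depth renderbuffer so 3D content sorts correctly off screen.
            GLES20.glGenRenderbuffers(1, handle, 0);
            depthBuffer = handle[0];
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, depthBuffer);
            GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);

            // The framebuffer ties the two attachments together.
            GLES20.glGenFramebuffers(1, handle, 0);
            framebuffer = handle[0];
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer);
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                    GLES20.GL_TEXTURE_2D, colorTexture, 0);
            GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT,
                    GLES20.GL_RENDERBUFFER, depthBuffer);
            if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER)
                    != GLES20.GL_FRAMEBUFFER_COMPLETE) {
                throw new RuntimeException("Render target framebuffer is not complete");
            }
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);  // back to the screen
        }

        /**
         * Bind before the draw calls you want captured; bind framebuffer 0 afterward
         * and use colorTexture as the material texture on your plane.
         */
        public void begin() {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer);
            GLES20.glViewport(0, 0, width, height);
        }
    }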

A community invite

We invite you to share your own visualizations with other readers of this book and the Cardboard community at large. One way to do this is via our GitHub repository. If you create a new visualization, submit it as a pull request to the project at https://github.com/cardbookvr/visualizevr, or create your own fork of the entire project!
