newt: thanks ;) ... constantly re-randomizing the noise would have been impossible (a map size of 512 takes about 2 or 3 seconds to generate). i pushed the data into an array so i can process it afterwards.
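in python terms, that precompute step looks roughly like this — a minimal sketch, where `make_noise_map` and the simple 1d value-noise scheme are just illustrative stand-ins for whatever noise generator you actually use:

```python
import random

def make_noise_map(size, cells=16, seed=0):
    """build a 1d value-noise array once at startup (illustrative).

    `cells` sets the coarseness: random lattice values are smoothly
    interpolated across `size` output samples.
    """
    rng = random.Random(seed)
    lattice = [rng.random() for _ in range(cells + 1)]
    out = []
    for i in range(size):
        t = i * cells / size           # position in lattice space
        k = int(t)                     # lattice cell index
        f = t - k                      # fractional part within the cell
        f = f * f * (3.0 - 2.0 * f)    # smoothstep for a softer curve
        out.append(lattice[k] * (1.0 - f) + lattice[k + 1] * f)
    return out

noise = make_noise_map(512)  # paid once at startup; later frames just index it
```

after this, every frame is a plain array lookup, which is why the per-frame cost disappears.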
zyblade: you could reshape the amplitudes when you re-write the array data. to use the start and end points, you definitely need more math. in its current state the example just uses half the layout height plus an offset.
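one way to do that "more math" — purely a sketch, the function name and the blending scheme are my own assumptions, not from the example: blend the raw noise toward a straight start-to-end ramp, with an envelope that pins the endpoints exactly.

```python
def reshape_amplitudes(data, start, end, gain=1.0):
    """remap a noise array so it hits exact start/end values (illustrative).

    the envelope is 0 at both ends and 1 in the middle, so the first and
    last samples land exactly on `start` and `end`.
    """
    n = len(data)
    out = []
    for i, v in enumerate(data):
        t = i / (n - 1)                       # 0..1 along the array
        ramp = start + (end - start) * t      # straight line start -> end
        env = 1.0 - abs(2.0 * t - 1.0)        # triangular envelope
        out.append(ramp + env * gain * (v - 0.5))
    return out
```

you can rewrite the array in place with this whenever the endpoints change.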
the cpu's processing time depends heavily on the noise map size and the number of instances being moved/scaled/rotated. once it's done (at startup), the data is available through the array. gpu-wise it shouldn't be a problem; only the post-processing (2x blur & glow) takes about 6 mb of vram.
"Is there a way to combine oppenheimers example with newts first one?"
of course! just pull the array/noise data and use it as displacement.
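as a rough illustration of what "use it as displacement" means (the `displace` helper and the y-only offset are my assumptions, not part of either example): index the precomputed noise array per point and add it to the position.

```python
def displace(points, noise, scale=1.0):
    """offset each point's y by the matching noise sample (illustrative).

    `points` is a list of (x, y) tuples; `noise` is the precomputed array,
    wrapped with modulo so any point count works.
    """
    return [(x, y + noise[i % len(noise)] * scale)
            for i, (x, y) in enumerate(points)]

# e.g. feed the points of newt's layout through oppenheimer's noise array:
# warped = displace(layout_points, noise, scale=20.0)
```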