Kel Philm wrote: I have worked on a few projects with stars in the night sky and I know what you mean about stars being too big; all the HDRIs I looked at were unusable. I think it's because they overexpose them to capture additional nebula-type information, and the stars then blow out. Stars are quite difficult to get 100% right and can look really bad if the resolution isn't high enough, as they start to disappear and reappear due to sampling when they move. The problem with spherical equirectangular images is that they compress all the resolution at the top and bottom, which is often where you don't want or need it. From memory, I made one in Blender using particles projected onto an icosphere and imported that into Fusion. I also mixed in a space HDRI where the stars were barely visible, and I think keyed or choked them out so I kept some nice nebula-type detail. It might not apply to deep space, but I also ran a noise operation over the stars so they pulse a bit; I assume real stars do this because of the atmosphere(?) the light rays pass through on their journey.
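For anyone wanting to try Kel's approach, here is a minimal sketch of the two ideas he describes: scattering star "particles" uniformly over a sphere (the same effect as emitting particles onto an icosphere) and giving each star a small noise-driven brightness pulse per frame. This is a hypothetical illustration in plain Python, not Kel's actual setup; the function names and the flicker formula are my own assumptions, and in practice you'd feed these positions/brightnesses into Blender's particle system or Fusion's pEmitter.

```python
import math
import random

def star_field(n, seed=0):
    """Return n (x, y, z) points distributed uniformly on the unit sphere."""
    rng = random.Random(seed)
    stars = []
    for _ in range(n):
        # Normalizing a 3D Gaussian sample yields a uniform direction on
        # the sphere, avoiding the pole-clustering you get from naive
        # latitude/longitude sampling (the equirectangular problem).
        x, y, z = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
        r = math.sqrt(x * x + y * y + z * z) or 1.0
        stars.append((x / r, y / r, z / r))
    return stars

def twinkle(base_brightness, frame, star_index, amount=0.15):
    """Cheap per-star pulse: a phase-offset sine wave over the frame number.

    Each star gets an arbitrary phase offset so the field doesn't pulse
    in unison; `amount` caps the flicker at +/-15% of base brightness.
    """
    phase = star_index * 12.9898  # arbitrary constant, just to decorrelate stars
    return base_brightness * (1.0 + amount * math.sin(frame * 0.35 + phase))

stars = star_field(500)
b = twinkle(1.0, frame=10, star_index=42)
```

A swapped-in noise texture (as Kel used) would replace the sine term, but the structure is the same: a per-star, per-frame brightness multiplier layered over a uniform spherical distribution.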
Thanks Kel. I think the title may be making my point hard to get across. What I'm looking to achieve is not exactly a texture, but rather a particle system that only becomes a 2D picture when rendered from the camera's point of view. For example, if I could generate that particle system in Fusion, the advantage is that I could composite it as the background with the starship render as the foreground, applying filters in Resolve. That would work if the particles can be generated in the Fusion page; if not, I could render them from Fusion Studio to an EXR sequence and bring that into Resolve.
However, if I were to generate the system in Blender, I suppose I could render the EXR sequence there and bring that into Resolve as well.
My goal is to find out which option, Fusion or Blender, is easier and produces better-looking particles.