pebble generator

tl;dr

Procedural generation of 3D pebbles on mobile devices – their shapes, surface patterns, and other characteristics (such as the color of subsurface scattering). The result is random but deterministic: the same seed produces the same pebble on any device. After a couple of seconds of generation, the resulting pebble can be rotated via the touch screen. This Unity scene is integrated into a React Native application on iOS.

About ¼ of the generation uses classic CPU-only approaches; the result is a couple of 2D textures containing distance fields. The rest is done right in the shaders, essentially by adding deterministic noise to every possible “nook and cranny”.

Effort distribution:

  • ≈½ – the directly visible result
  • ≈½ – “behind the scenes”: integration into React Native, mobile optimization, working around Unity bugs reproducible only on actual phones, mobile input, and internal tools

implementation

Approximately a quarter of the generation process is performed on the CPU using a “classic” approach, resulting in a pair of 2D textures containing distance fields. The remaining ¾ is generated directly in the shaders by adding deterministic noise wherever possible.
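
As a rough illustration of the shader-side part – not the project’s actual code, and with all names and the specific hash being my assumptions – the sketch below shows the kind of seed-driven, hash-based value noise that yields the same value for the same seed and position on any device:

    using UnityEngine;

    // Hypothetical sketch of seed-driven deterministic noise (the real noise
    // lives in Shader Graph nodes). The same seed + position always yields
    // the same value, so the pebble looks identical on every device.
    public static class PebbleNoise
    {
        // Cheap integer hash mapped to [0, 1); any well-mixed hash would do.
        static float Hash(int x, int y, int z, int seed)
        {
            uint h = (uint)x * 73856093u ^ (uint)y * 19349663u
                   ^ (uint)z * 83492791u ^ (uint)seed * 2654435761u;
            h ^= h >> 13; h *= 0x5bd1e995u; h ^= h >> 15;
            return (h & 0xFFFFFFu) / 16777216f;
        }

        // Trilinearly interpolated value noise sampled at a 3D surface point.
        public static float Value(Vector3 p, int seed)
        {
            Vector3Int i = Vector3Int.FloorToInt(p);
            Vector3 f = p - (Vector3)i;                  // fractional part
            f = new Vector3(f.x * f.x * (3 - 2 * f.x),   // smoothstep fade
                            f.y * f.y * (3 - 2 * f.y),
                            f.z * f.z * (3 - 2 * f.z));

            float Slice(int dz) // interpolate one z-slice of the hashed lattice
            {
                float a = Mathf.Lerp(Hash(i.x, i.y, i.z + dz, seed),
                                     Hash(i.x + 1, i.y, i.z + dz, seed), f.x);
                float b = Mathf.Lerp(Hash(i.x, i.y + 1, i.z + dz, seed),
                                     Hash(i.x + 1, i.y + 1, i.z + dz, seed), f.x);
                return Mathf.Lerp(a, b, f.y);
            }
            return Mathf.Lerp(Slice(0), Slice(1), f.z);
        }
    }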

The subtle surface patterns are not implemented through UV mapping; they are computed directly from the 3D coordinates of the surface points. Specifically, the low-contrast (barely perceptible) base noise uses triplanar mapping.
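
Here is a hedged sketch of what triplanar mapping does, mirrored in C# for readability (the project does this in Shader Graph; the helper names and the sample2D callback are assumptions): a 2D pattern is sampled along the three world axes and blended by the surface normal, so no UV unwrap is needed.

    using System;
    using UnityEngine;

    // Triplanar sampling sketch: project the surface point onto the three
    // axis-aligned planes, sample a 2D pattern on each, and blend by how
    // much the normal faces each axis.
    public static class Triplanar
    {
        public static float Sample(Vector3 worldPos, Vector3 normal,
                                   Func<Vector2, float> sample2D, float sharpness = 4f)
        {
            // Blend weights favour the axis the surface is facing.
            Vector3 w = new Vector3(
                Mathf.Pow(Mathf.Abs(normal.x), sharpness),
                Mathf.Pow(Mathf.Abs(normal.y), sharpness),
                Mathf.Pow(Mathf.Abs(normal.z), sharpness));
            w /= (w.x + w.y + w.z);

            float x = sample2D(new Vector2(worldPos.y, worldPos.z)); // projection along X
            float y = sample2D(new Vector2(worldPos.x, worldPos.z)); // projection along Y
            float z = sample2D(new Vector2(worldPos.x, worldPos.y)); // projection along Z
            return x * w.x + y * w.y + z * w.z;
        }
    }

In the real shader the same math runs per pixel on the interpolated world position and normal; a higher sharpness narrows the blend zones between the three projections.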

The more contrasting, clearly visible patterns – stripes and spots – are done via a 2D SDF (Signed Distance Field), modified for the needs of the project (more on this below).
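
For contrast with the modification described below, here is a small sketch of conventional boundary-based SDFs for a spot and a stripe (hypothetical helper names, not the project’s code) – the signed value is zero exactly on the shape’s edge:

    using UnityEngine;

    // Conventional 2D SDFs: signed distance to a shape's *boundary*.
    public static class BoundarySdf
    {
        // Negative inside the spot, zero on its edge, positive outside.
        public static float Circle(Vector2 p, Vector2 center, float radius)
            => Vector2.Distance(p, center) - radius;

        // Signed distance to the edges of a stripe of the given half-width.
        public static float Stripe(Vector2 p, Vector2 origin, Vector2 direction, float halfWidth)
        {
            Vector2 n = new Vector2(-direction.y, direction.x).normalized; // stripe normal
            return Mathf.Abs(Vector2.Dot(p - origin, n)) - halfWidth;
        }
    }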

To use the 2D field as if it were a 3D field – without distortion – we measure distances in the field from the centers/middles of shapes rather than from their boundaries. A more accurate name would therefore be CDF (Central Distance Field). (The limitation is that such a center/middle must exist – but, in the context of pebbles, most of their typical patterns can be generated by adding noise to shapes that do have one: circles and lines.)
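
A sketch of my reading of this CDF idea (assumed names, not the actual implementation): the field stores the distance from a shape’s center or center line, and deterministic noise perturbs the threshold that turns that distance into a spot or stripe:

    using UnityEngine;

    // "Central Distance Field" sketch: distance is measured from the
    // shape's center/axis rather than its boundary; the boundary itself
    // is decided later, after noise is added.
    public static class CentralDistanceField
    {
        // Distance from the spot's center.
        public static float Spot(Vector2 p, Vector2 center)
            => Vector2.Distance(p, center);

        // Distance from the stripe's center line.
        public static float Stripe(Vector2 p, Vector2 origin, Vector2 direction)
        {
            Vector2 n = new Vector2(-direction.y, direction.x).normalized;
            return Mathf.Abs(Vector2.Dot(p - origin, n));
        }

        // Turn a central distance into a pattern mask: noise wobbles the
        // radius, so the edge is irregular but still deterministic.
        public static float Mask(float centralDist, float radius, float noise, float noiseAmp)
            => centralDist < radius + (noise - 0.5f) * noiseAmp ? 1f : 0f;
    }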

worst pitfall

Two similar bugs in Shader Graph nodes (that is, in the library of functions for shaders) that appear only when running on the phone. For both bugs, finding the specific node producing the unexpected result meant repeatedly waiting for the slow process of building the project and launching it on the phone. To soften this time waste, we made a debug UI that allows disabling individual shaders directly on the phone, so fewer restarts are needed.
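
The actual tool isn’t shown here, but a minimal version of such an on-device panel could look like the sketch below (assumed class and field names), using IMGUI toggles to swap a suspect renderer’s shader for a known-good fallback:

    using UnityEngine;

    // Hypothetical on-device debug panel: toggle individual renderers'
    // shaders off (to a fallback) and back, so a single build can be used
    // to bisect which shader misbehaves on the phone.
    public class ShaderDebugPanel : MonoBehaviour
    {
        [SerializeField] Renderer[] suspects;   // renderers under investigation
        [SerializeField] Shader fallback;       // e.g. an unlit shader known to work

        Shader[] originals;

        void Awake()
        {
            originals = new Shader[suspects.Length];
            for (int i = 0; i < suspects.Length; i++)
                originals[i] = suspects[i].material.shader;
        }

        void OnGUI()
        {
            for (int i = 0; i < suspects.Length; i++)
            {
                bool on = suspects[i].material.shader == originals[i];
                bool want = GUI.Toggle(new Rect(20, 20 + i * 60, 400, 50), on, suspects[i].name);
                if (want != on)
                    suspects[i].material.shader = want ? originals[i] : fallback;
            }
        }
    }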

(Roughly 30 hours would have been wasted on this if it hadn’t been possible to do something else while waiting. About half of that time it wasn’t practical to work on other parts of this project in parallel, so I did household chores instead and didn’t count that time in billing.)

end result

The project was suspended in favor of more critical priorities of the main project. About the latter I know only that it’s an iOS app (built with React Native) that is somewhat like a social network – likely without literally being one. That is because this component only creates random avatars, which is completely isolated from any need to know the surrounding context – and keeping me out of the loop saved us the hassle of signing an NDA.

info

hours: 195
stack: Unity, iOS
skills: URP, C#, Shader Graph, PCG in runtime, Unity as a Library, React Native
role: solo programming
billing: hourly without escrow