The front camera of a phone can capture additional information about lighting conditions while a regular photo is taken with the rear camera, such as the overall light level and cues for color correction, which can improve photo quality. In principle, this data could even be used to estimate the position of the light source; a mirrored chrome ball would be more precise, but the front camera is always available. A neural network could refine the estimate further by correlating it with depth data from the iPhone's LiDAR scanner. The same idea applies to AR, where the front camera can supply coarse but more realistic reflections for 3D content integrated into the environment. Finally, both photos could be stored together for more detailed post-processing later.
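As a rough sketch of the light-direction idea: one very simple heuristic is to treat the intensity-weighted centroid of a front-camera frame as a cue for where the dominant light sits relative to the image centre. The function name, the centroid heuristic, and the synthetic frame below are all illustrative assumptions, not a production method; a chrome-ball capture or a trained network would be far more accurate.

```python
import numpy as np

def estimate_light_direction(frame: np.ndarray) -> np.ndarray:
    """Return a unit 2D vector pointing from the image centre toward
    the intensity-weighted centroid of a grayscale frame (H x W).

    A crude proxy for the dominant light direction seen by the
    front camera. +x is right, +y is down (image coordinates).
    """
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    total = frame.sum()
    if total == 0:
        # Completely dark frame: no usable direction cue.
        return np.zeros(2)
    cx = (xs * frame).sum() / total
    cy = (ys * frame).sum() / total
    v = np.array([cx - (w - 1) / 2, cy - (h - 1) / 2])
    n = np.linalg.norm(v)
    return v / n if n > 0 else np.zeros(2)

# Synthetic test frame: a bright patch in the top-right corner,
# standing in for a window or lamp seen by the front camera.
frame = np.zeros((100, 100))
frame[0:20, 80:100] = 1.0
d = estimate_light_direction(frame)
```

On the synthetic frame above, `d` points toward the top-right of the image, matching the bright patch. In practice this would run on a downsampled, exposure-normalized front-camera frame captured alongside the main photo.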