Why do iPhone cameras color correct?
With the fires in CA, the sky is orange, but the phone washes that out. Why not just have the default be what you see - WYSIWYG? Or have a WYSIWYG filter?
Generally speaking, what you see is not what you are looking at. We as humans do our own color correction all the time, so it’s natural that a camera would do the same. Otherwise, you end up with photos that look e.g. very green when you weren’t expecting it, because your eyes had adapted to the ambient colors.
All consumer digital cameras color correct by default because people don't want to white balance manually any more than they want to adjust the focus or aperture manually; they just want a nice photo, and if they don't get one they'll blame the camera rather than spend more time learning photography. Pro photography apps let you configure that yourself.
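To give a sense of what "color correct by default" means, auto white balance is often explained with the classic gray-world heuristic: assume the scene averages out to neutral gray and scale the channels until it does. This is a toy sketch of that idea, not Apple's actual pipeline, which is far more sophisticated:

```python
import numpy as np

def gray_world_awb(img: np.ndarray) -> np.ndarray:
    """Toy gray-world auto white balance.

    img: HxWx3 float array in [0, 1], linear RGB.
    """
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel scene average
    gains = means.mean() / means             # scale so the averages become gray
    return np.clip(img * gains, 0.0, 1.0)
```

An orange wildfire sky breaks the gray-world assumption badly: the heuristic sees "too much red/yellow on average" and pulls the whole image toward blue, which is exactly the washed-out look the question describes.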
The phone is trying to achieve WYSIWYG and failing because it’s a hard problem. It’s difficult to reduce the continuous experience of visual perception to a fixed image obtained from a digital sensor at one moment in time, but skilled photographers can outperform the default algorithm in a point-and-shoot camera app.
It’s the iPhone Camera app, not the camera itself. White balance control is available in other apps. For example, the Adobe Lightroom app provides white balance control when shooting in its “professional” mode. The app is free on the App Store.
Shoot with something white in the frame so it knows how to white balance. I’ve had a similar experience in the past during a wildfire and the shot that most accurately represented the colours I was really seeing had the fluoro lights of a gas station in the frame.
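In code terms that amounts to reference-white balancing: measure the object you know is white and scale the channels so it comes out neutral, which leaves the rest of the scene (the orange sky) with its cast intact. A toy sketch under that assumption, not what any camera app literally does:

```python
import numpy as np

def white_balance_from_patch(img: np.ndarray, patch: np.ndarray) -> np.ndarray:
    """Balance an image against a known-neutral reference.

    img:   HxWx3 linear RGB in [0, 1]
    patch: hxwx3 crop of something known to be white/neutral
    """
    ref = patch.reshape(-1, 3).mean(axis=0)  # measured color of the "white" object
    gains = ref.max() / ref                  # gains that make the patch neutral
    return np.clip(img * gains, 0.0, 1.0)
```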
Better to use an iPhone camera app that captures RAW. Then you can set your own white balance after.
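For example, once the RAW file is on a computer you can develop it twice with different white balance settings. A rough sketch using the third-party rawpy and imageio packages; the filename and multiplier values are placeholders to tweak by eye:

```python
import rawpy
import imageio.v3 as iio

with rawpy.imread("sunset.dng") as raw:
    # As-shot white balance: roughly what the camera app would have produced.
    neutralized = raw.postprocess(use_camera_wb=True)

    # Custom per-channel multipliers (R, G, B, G2) -- placeholder values;
    # adjust by eye to preserve the orange cast rather than neutralize it.
    kept_orange = raw.postprocess(use_camera_wb=False,
                                  user_wb=[1.8, 1.0, 1.2, 1.0])

iio.imwrite("as_shot.jpg", neutralized)
iio.imwrite("orange_sky.jpg", kept_orange)
```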