That means if I have Auto HDR on, the phone is taking multiple shots even before I've pressed the shutter?
>>Nah. The camera takes shots when you press the shutter, not before.
Let me get this right. The first photo it will take will be the Auto HDR one, which, as you say, is actually a few exposures that are merged.
And, after this it will take another photo, which it will save as a regular photo?
In effect, keeping Auto HDR on, and also having it save a normal photo, means the camera is taking two different photos? What effect does this have on the longevity of the sensor?
>>The camera takes the same photo, but has one copy post-processed as HDR. Tbh, your phone will die long before the sensor gets killed by taking too many shots. The sensor won't wear out from taking photos; it's dust, water, dirt etc. that kill it.
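To make that "a few exposures that are merged" idea concrete, here is a toy sketch of an HDR-style merge. It is not Apple's pipeline (which aligns frames, tone-maps, and keeps its weighting private); the per-pixel weighting below is just one plausible choice, favouring mid-tones over noisy shadows and blown highlights:

```python
import numpy as np

def merge_exposures(exposures, weights=None):
    """Merge several exposures of the same scene into one image.

    A toy stand-in for an Auto HDR merge: take a per-pixel weighted
    average of the bracketed frames, trusting mid-tone pixels more
    than near-black (noisy) or near-white (clipped) ones.
    """
    stack = np.stack([e.astype(np.float64) for e in exposures])
    if weights is None:
        # Weight peaks at mid-grey (0.5) and falls off toward 0 and 255.
        weights = 1.0 - np.abs(stack / 255.0 - 0.5) * 2.0
        weights = np.clip(weights, 1e-3, None)
    merged = (stack * weights).sum(axis=0) / weights.sum(axis=0)
    return np.clip(merged, 0, 255).astype(np.uint8)

# Three fake "brackets" of a 2x2 scene: under-, normally, over-exposed.
under  = np.array([[ 10,  20], [ 30,  40]], dtype=np.uint8)
normal = np.array([[ 80, 120], [160, 200]], dtype=np.uint8)
over   = np.array([[200, 240], [255, 255]], dtype=np.uint8)

hdr = merge_exposures([under, normal, over])
print(hdr)  # each pixel lands between its darkest and brightest bracket
```

The weighted average means every output pixel sits somewhere between the darkest and brightest bracket, which is why an HDR shot keeps shadow detail without blowing out the sky.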
Oh. That means I am effectively getting less than 12 MP?
The pictures look fine but then I am then looking at smaller prints. Hmm...
>>Not so sure about this one because I never use square. It looks like something Snapchat fanatics would love, and I am not one of them.
Do you find a difference between how the iPhone processes it, and how a "proper" app like Lightroom on your computer processes it? Does the latter give more leeway in terms of how you can process the photo?
Also, what happens if I import an HDR photo into Lightroom or some other app?
>>Your iPhone processes each photo you take using machine learning. The details are proprietary, but various tests on YouTube show Apple is doing a bit too much, and the photos don't even look like the real-world scene anymore.
Sure, post-processing in Photoshop etc. can give you more options and better control over your photos, but that's on the assumption that the photo you take is RAW, meaning uncompressed and unedited by the camera in any way. The iOS camera app compresses and processes the photo to make the final product look appealing, which doesn't leave computer programs much leeway to do anything further. At the end of the day, the result you get is only as good as the input you send in.
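A toy illustration of that "result is only as good as the input" point, in case it helps. The sensor values and bit depths below are made up; the idea is that once highlights are clipped to an 8-bit JPEG, no editor can pull the detail back, whereas the wider-range RAW values still hold a gradient:

```python
import numpy as np

# A bright sky region in linear sensor units (made-up 12-bit-ish values).
scene = np.array([3000.0, 3400.0, 3800.0, 4000.0])

# "RAW": keep the sensor values as-is.
raw = scene.copy()

# "JPEG": scale to 8 bits and clip, as an in-camera pipeline would.
# Everything above the white point collapses to 255.
jpeg = np.clip(scene / 3000.0 * 255.0, 0, 255).round()

# An editor pulls exposure down by one stop (halves the values).
raw_recovered  = raw / 2.0    # the sky gradient survives
jpeg_recovered = jpeg / 2.0   # still flat where it clipped

print(np.unique(raw_recovered).size)   # 4 distinct tones left
print(np.unique(jpeg_recovered).size)  # 1 -- the clipped sky stays flat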
Is there a reason you've stayed away from dedicated apps? (I've been reading good things about Halide, e.g.)
Perhaps because they may not allow for as fast an operation as the stock app?
>>I am just lazy, and yes, there's no system integration for third-party apps for "obvious reasons". For normal use of just showing people what I see, there is no need for a professional camera app. And I certainly don't expect an iPhone camera to even remotely compete with a high-end DSLR or similar. Professional photo taking needs professional cameras. If you use an iPhone to take amazing shots, then photo quality itself is clearly not the priority.
I see. So, for a more "natural" look it's advisable to stick to 24 fps?
How does using 4k over 1080p make a difference to all this?
Thanks!
>>Human eyes need at least 24 fps for footage to register as "video", while 30 and 60 fps provide a smoother experience. I'd say unless your iPhone is a 64 GB or 128 GB model, stick with 60 fps. The video will be super smooth.
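Since the fps and resolution choice is mostly a storage trade-off, here's some quick back-of-the-envelope math. The MB-per-minute figures are ballpark values I've assumed for HEVC recording, not exact Apple numbers; check Settings > Camera > Record Video on your own phone for the figures Apple quotes:

```python
# Approximate storage cost per recording mode (assumed HEVC ballparks).
MB_PER_MIN = {
    "1080p @ 30fps": 60,
    "1080p @ 60fps": 90,
    "4K @ 30fps": 170,
    "4K @ 60fps": 400,
}

def minutes_of_video(free_gb, mode):
    """How many minutes of a given mode fit in free_gb of storage."""
    return free_gb * 1024 / MB_PER_MIN[mode]

for mode in MB_PER_MIN:
    print(f"{mode}: ~{minutes_of_video(32, mode):.0f} min in 32 GB")
```

The point being: 4K at 60 fps eats storage several times faster than 1080p at 30 fps, which is why the smaller-capacity models push you toward the lighter modes.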
Yes, I see that in the settings.
I still don't get if the phone will take two different photos, or just one (HDR)?
By keeping the "keep normal photo" option on, am I using the sensor twice over? That presumably will shorten its life.
>>Again, your iPhone will likely die long before the camera sensor wears out from taking too many photos. Sensors don't degrade much with use, unlike SSD NAND flash. Do keep the sensor away from water, dust, dirt etc., though, because those WILL kill it really fast.