
schnitzel-pretzel

macrumors regular
Nov 28, 2023
111
144
Kentucky
On Mac, do you also only rely on the stock apps that Apple provides and then say Mac sucks based on the stock apps?

I don't get how you are angry about a phone you don't even own and are too lazy to simply use a 3rd party app. It's incredible.

You have continually, and at this point intentionally, missed crucial parts of what I've said, so let me make it as clear as it's possible for a human being to make it:

Third party apps do not allow me to take Live Photos with processing disabled, so even if I wanted to use this solution you keep proposing, it's not possible.

And even if I were willing to completely give up on the Live Photo feature, it would take literally hundreds of hours to edit my photos each year.

So your proposed solution is not only more expensive, but gives up core features, AND takes hundreds of additional hours of my time.

Hopefully this gets through to you, although at this point I'm 95% sure you're trolling, or simply utterly refuse to read.



Well, MKBHD at least has the iPhone 15 Pro and various Android phones, so he is better informed when he says it has the best camera on a smartphone.

Oh my God it's like talking to a wall. I had the phone and returned it, due to the camera, how many times do I have to say this before you understand?

"I don't understand why you care about a phone you bought and then returned due to not liking it's camera, why don't you just download another app and spend 200 hours every year correcting photos and also not getting to use Live Photos ever again" -- you
 

Zest28

macrumors 68020
Jul 11, 2022
2,242
3,102
With Live Photos, it records what happens 1.5 seconds before and after you take a picture. You can do this with a 3rd party app that records video + audio.

And you've got to understand that the AI is trained for the general iPhone user, not for a particular person. Apple is not going to train a custom AI model for every single iPhone customer. So if you find the AI so horrible, you simply have to do it yourself. And if it takes you 200 hours, then that is what it takes if you are so unhappy with the AI.

The point is, you don't own an iPhone 15 at the moment and I have caught you saying things that are simply not true. Because I did test the iPhone 11 Pro Max versus the iPhone 15 Pro Max and the camera on the iPhone 15 Pro Max is much closer to real life.

But I still don't get it. You are not suffering from the terrible camera on the iPhone 15 Pro at the moment, as you are now enjoying the superior camera on your older phone. So what is the problem here?
 

schnitzel-pretzel

macrumors regular
Nov 28, 2023
111
144
Kentucky
With Live Photos, it records what happens 1.5 seconds before and after you take a picture. You can do this with a 3rd party app that records video + audio.

Live Photos have Deep Fusion forced on them. There exists no third party application which allows you to take a Live Photo with Deep Fusion off, because you cannot disable it in the camera API. You literally cannot do this. If you take a Live Photo, you always get Deep Fusion.


And you've got to understand that the AI is trained for the general iPhone user, not for a particular person. Apple is not going to train a custom AI model for every single iPhone customer. So if you find the AI so horrible, you simply have to do it yourself.

Despite the fact that it is clear at this point you have zero clue how the camera works and will continue to insist on this flawed viewpoint even if Steve Jobs himself told you you were wrong, I will explain in case other people read the thread and want to know how it actually works.

Reduced processing does not require "another AI model". Apple can already give you a WYSIWYG photo without Deep Fusion and extra sharpening. Apple directly said this at WWDC 2021, and it can be verified by using the burst mode on your own iPhone. It gives you a shot that looks like what the viewfinder sees, without Deep Fusion applied. So, one easy solution to my problem would be literally as simple as letting me, the end user, decide to also turn Deep Fusion off for my Live Photos. One toggle switch. No "custom AI model" necessary.
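If you want to see how small the ask is at the API level, here is a minimal AVFoundation sketch of bracketed still capture -- the closest public-API analogue to burst mode. The calls are real; the claim that such frames come back without Deep Fusion is my observation above, not something the docs spell out:

```swift
import AVFoundation

// Three plain HEVC frames at the metered exposure, requested as a bracket --
// the closest public-API analogue to the stock app's burst mode.
func makeBurstLikeSettings() -> AVCapturePhotoBracketSettings {
    let bracketed = [Float](repeating: 0, count: 3).map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    return AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // 0 = no RAW, processed output only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketed
    )
}
```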

Secondly -- Apple is already allowing the user to customize the AI used by adjusting sliders. That is what their Photographic Styles feature does. They have explicitly explained that it is not a filter -- it is applied in the camera pipeline. A second solution here is to simply add another slider in the Photographic Styles processing for sharpening. This is not complicated. They are not building a separate AI model for every tick on the slider. That would require them to have built 201 x 201 ≈ 40,000 separate models per style, since each style has two sliders (Tone and Warmth) with 201 steps from -100 to 100.
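To make it concrete, here is a purely hypothetical sketch of what one more slider means to the pipeline -- none of these type or property names are Apple API:

```swift
// Hypothetical sketch -- these names are NOT Apple API. A third slider next to
// the existing Tone and Warmth controls is just one more parameter fed into the
// same pipeline; no per-position model is trained.
struct PhotographicStyleSettings {
    var tone: Int       // -100...100 (exists today)
    var warmth: Int     // -100...100 (exists today)
    var sharpness: Int  // -100...100 (the proposed addition)
}

func pipelineParameters(_ s: PhotographicStyleSettings) -> [String: Float] {
    [
        "toneBias":      Float(s.tone) / 100.0,
        "warmthBias":    Float(s.warmth) / 100.0,
        "sharpenAmount": Float(s.sharpness) / 100.0 // scales one sharpening stage
    ]
}
```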


The point is, you don't own an iPhone 15 at the moment

Because I returned it, because of the camera. By your logic someone who buys a car and sells it because it has horrible MPG cannot complain about the MPG because they "don't own it at the moment". It's cringe.



and I have caught you saying things that are simply not true. Because I did test the iPhone 11 Pro Max versus the iPhone 15 Pro Max and the camera on the iPhone 15 Pro Max is much closer to real life.

Photo processing and what appears realistic to different people is subjective. I have never once claimed that my personal opinion of the processing is the only one that's allowed to exist. The only person who appears to think that their opinion of which processing is more realistic is objectively correct, is you.


But I still don't get it. You are not suffering from the terrible camera on the iPhone 15 Pro at the moment, as you are now enjoying the superior camera on your older phone. So what is the problem here?

The problem is my iPhone X will eventually no longer be supported with any security updates and apps will not work on it, so I will be forced to upgrade, and I will continue to voice my complaints about the current camera processing pipeline until the cows come home. If you don't like it, block me, because I'm never going to stop until they fix it.
 

Zest28

macrumors 68020
Jul 11, 2022
2,242
3,102
You are saying you cannot record video + audio without the AI? So how did Apple do this for their Keynote then, which was recorded on the iPhone 15 Pro and then post-processed on the Mac? What you are saying is simply not true at all.

In the end, you are asking to make the stock app much more complex by allowing the average iPhone user to change all kinds of parameters they have no clue about. The stock app is for average consumers who don't know about these things. You are simply not the target audience of the stock app.

Apple is not going to change things for people who are outliers. There is an App Store for that, and 3rd party developers. And if no 3rd party developer fits your needs, then you have to do it yourself.

Well, the difference between the iPhone 15 Pro Max and the iPhone 11 Pro Max in terms of colour accuracy is that big; you don't need a scientific method to determine this. Just look with your own eyes at how real life looks in terms of colour, then look at the pictures from both phones. It's very hard to miss.

And since you like Steve Jobs so much: he was all about customer experience. Apple knows what they are doing with the stock app. You can be certain of that.
 

schnitzel-pretzel

macrumors regular
Nov 28, 2023
111
144
Kentucky
You are saying you cannot record video + audio without the AI?

No, I am saying you cannot create a Live Photo without Deep Fusion, since the AVFoundation capture API overrides the configuration property `photoQualityPrioritization` when Live Photo capture is enabled. The camera API limits what the app developer can do -- you cannot simply stitch together a video and a photo file to get a Live Photo; you have to configure it within AVFoundation. Recording video and audio can of course be done without automatic sharpening, since the camera supports ProRes recording with Apple Log, but that doesn't mean you can make a Live Photo out of it, since iOS controls the API that is allowed to build them. Obscura Cam 4 tried this by exposing that API to the user while enabling Live Photo capture, and it turns out that Deep Fusion is still applied.
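For anyone who wants to verify this, here is a minimal sketch of the relevant AVFoundation knobs. The properties are real; the behavior -- the `.speed` request being ignored while Live Photo capture is on -- is what I and the Obscura developers observed, not something Apple documents:

```swift
import AVFoundation

// Ask for a Live Photo with the least processing the API lets you request.
func makeLivePhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    output.isLivePhotoCaptureEnabled = output.isLivePhotoCaptureSupported
    output.maxPhotoQualityPrioritization = .speed // ceiling for per-shot requests

    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = .speed  // "minimal processing, please"
    // Attaching a companion movie URL is what makes the capture a Live Photo.
    settings.livePhotoMovieFileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("mov")
    // Observed result: the returned still has Deep Fusion applied anyway.
    return settings
}
```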

So how did Apple do this for their Keynote then, which was recorded on the iPhone 15 Pro and then post-processed on the Mac?

Apple used ProRes with Apple Log to record video and then color graded it on a Mac. That doesn't mean you can make a Live Photo out of it.
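Roughly, that recording path looks like this (iOS 17 APIs; assumes a device and format that support Apple Log):

```swift
import AVFoundation

// Record flat with Apple Log (iOS 17+, iPhone 15 Pro) and grade later on a Mac.
// Nothing in this path produces a Live Photo. A real app would also set
// AVCaptureSession.automaticallyConfiguresCaptureDeviceForWideColor = false.
func enableAppleLog(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    if let format = device.formats.first(where: {
        $0.supportedColorSpaces.contains(.appleLog)
    }) {
        device.activeFormat = format
        device.activeColorSpace = .appleLog
    }
}
```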

What you are saying is simply not true at all.

You only think that because you're both (a) not listening and (b) making up straw men in your head. My claim isn't that video recording without automatic sharpening is impossible on the iPhone. I am saying that you cannot take a Live Photo without it.


In the end, you are asking to make the stock app much more complex

At this point you are 100% trolling. There is no reasonable human being on the planet who thinks my proposed "toggle Deep Fusion off" solution would make the stock app "much more complex". There are already far more complex options within the camera configuration, including other pipeline customizations, resolution control, 11 different shooting modes, DNG editing, EV adjustments, post-exposure simulated f-stop adjustments, and more.

Well, the difference between the iPhone 15 Pro Max and the iPhone 11 Pro Max in terms of colour accuracy is that big; you don't need a scientific method to determine this. Just look with your own eyes at how real life looks in terms of colour, then look at the pictures from both phones. It's very hard to miss.
Holy ****ing **** lmao. Yes, you're right, your brain is not involved in processing colors, people cannot see colors and contrast differently, everyone sees the same scene the exact same way. Jesus, this is the most insane conversation I've ever had on the internet under any circumstances.
 

Jonnod III

macrumors member
Jan 21, 2004
91
50
Right, that is exactly my point. Which is why it's very important to give the user control over the pipeline that makes those edits, precisely because they cannot be changed after the fact. Whether it's your iPhone or your Canon, if the camera digitally sharpens the photo more than you like and spits out a JPEG, you are toast. You cannot back out that editing.
Well, my digital cameras take RAW files and I can create JPEGs in camera, but then JPEGs aren't what I want. So I have specialist RAW processing and editing software that deals with the .NEF and .ORF files. The in-camera JPEGs are fine for a quick and dirty output, but since I don't have a need for quick and dirty, I have JPEGs switched off in camera. Even if I had them switched on, I would know that they were poor versions of what the camera can take, being of lower bit depth and with compression artefacts.

With the iPhone, it is perfectly possible to have the same workflow: take RAW files and process them in a dedicated app.
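On the iPhone, requesting the RAW side of that workflow is only a few lines of AVFoundation; a rough sketch (ProRAW needs a Pro model on iOS 14.3+, and third-party apps can also use plain Bayer RAW):

```swift
import AVFoundation

// Request a RAW capture (Apple ProRAW where supported); the resulting DNG is
// then developed in a dedicated editor instead of taking the in-camera JPEG/HEIC.
func makeRAWSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    output.isAppleProRAWEnabled = output.isAppleProRAWSupported // iOS 14.3+, Pro models
    guard let rawFormat = output.availableRawPhotoPixelFormatTypes.first else {
        return nil // this device/session offers no RAW format
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```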
 

schnitzel-pretzel

macrumors regular
Nov 28, 2023
111
144
Kentucky
Well, my digital cameras take RAW files and I can create JPEGs in camera, but then JPEGs aren't what I want. So I have specialist RAW processing and editing software that deals with the .NEF and .ORF files. The in-camera JPEGs are fine for a quick and dirty output, but since I don't have a need for quick and dirty, I have JPEGs switched off in camera. Even if I had them switched on, I would know that they were poor versions of what the camera can take, being of lower bit depth and with compression artefacts.

With the iPhone, it is perfectly possible to have the same workflow: take RAW files and process them in a dedicated app.
Sigh. It is tiresome responding to the same argument I've already addressed now six times within this very thread.

The iPhone X takes natural photos I like. I don't have to ****ing take a RAW photo and process it on my computer to get a natural looking result. It's not sharpened too much.

It is not a viable solution to tell me to take RAW photos and post-process them every goddamn time I take a photo with my phone. Like I've already said in this thread multiple times now, this causes me to (a) spend hundreds of hours each year editing the thousands of photos I take, perhaps even longer than that -- and (b) lose out on the Live Photo feature.

I don't know how people can keep suggesting this. Just give me the option to get more natural JPEGs like phones from literally 7 years ago could do. Stop overprocessing things so goddamn much that the proposed solution becomes "take hours of your time each week to edit RAW Photos just to get around the processing".
 

AppleFanatic10

macrumors 68030
Nov 2, 2010
2,803
295
Hawthorne, CA
I think I had a really special 11 Pro. My battery health was 91% after 4 years of heavy use. It took some fantastic pictures. It was the perfect size for me. But it was starting to show its age: it was slow or freezing up sometimes (maybe due to iOS 17) and I was constantly low on storage (it was 64 GB).
So I bit the bullet and upgraded to the 15 Pro.

There are a few positives, definitely: the larger storage, the 120Hz screen, and the CPU/RAM increases which are noticeable. The speaker is louder (yey!). Oh, and props to USB-C, finally.

But other than that, I'm surprised to find that:
- the camera is a mess. Ok, the ultrawide is better, but the rest of them are very hit and miss, and overall worse than the 11 Pro. There is a kind of processing that happens after I take the picture that dramatically modifies the final result from the preview. I cannot turn HDR off. The skin tones especially are way off, with a gray/blue tint that I can't get rid of. The lighting overall is altered, the colours modified, the shadows are murdered. I even made side-by-side comparisons with the 11 Pro in my home under different lighting/shade conditions. It just changes the pictures so much and there's nothing I can do about it. The focusing distance for both the main camera and the zoom has increased, up to a point where I need to adjust distance sometimes, something which had never happened on the 11. This also creates some kind of weird perspective shift between the cameras, especially when trying to use the zoom lens at closer range. Even the selfie camera is worse, the skin just looks horrible now. (To account for display differences, for comparison purposes the pictures were transferred to a neutral display, my 5K iMac)
- there is a gigantic camera bulge on the back for no reason. Like, seriously, this thing takes worse pictures with all its huge sensors and lenses
- the battery life is marginally better, if at all
- the hand feel is... meh. As I said, the 11 Pro was the perfect size, and the straight edges are making this one harder to hold comfortably
- the screen itself is worse quality. Looking at them side by side at similar brightness levels, the 11 Pro is just a tiny bit richer, warmer and more vivid, while the 15 Pro is a bit washed out... like when you turn gamma correction up too much on a display.

So overall, 4 years later, I was very surprised that some of the most important aspects of the phone are kinda worse. And since photography is very important to me, I'm actually debating whether I should keep the new one or try to live one more year with the old 11 Pro.

Rant over :)
Is anyone else in the same boat, or am I crazy?
I've never had the luxury of being able to have the iPhone 11 Pro, but I did have the 12/13 Pros, and there's really no difference besides the AOD and Dynamic Island. I always did regret upgrading my iPhone 13 Pro Max to the iPhone 14 Pro Max: the 13 Pro Max's battery life was way better, and there were no overheating issues, etc.
 

Euroamerican

macrumors 6502
May 27, 2010
463
340
Boise
Wow, between this and overly bright screens burning people's eyes out, what hope is there for a poor guy or gal who wants to upgrade from a 6S or 8.....

I would like to be able to take better low light photos than what the 8 can do, but not sacrifice overall picture quality.

My wife would like better photos and her 6S is starting to get really sluggish, but do we try going "low" with refurbished older models, or cross our fingers and buy "above the iPhone 11" with the super-duper OLED screens?
 

jntdroid

macrumors 6502a
Oct 12, 2011
937
1,286
3rd Gen iPhone SE: it has the "engine" of the iPhone 13 (the same A15 chip), an LCD screen, and the body you and your wife are used to; it's much faster, with better battery life as well.

While it doesn't officially have a night mode, it will still take better low-light photos than the 6S or 8, and there are 3rd party apps that do a pretty decent job of mimicking the official night mode.
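Those apps mostly work by taking manual control of the exposure. A rough sketch of the idea -- values are illustrative only:

```swift
import AVFoundation

// Poor man's night mode: long shutter plus capped ISO under manual control.
// Real apps typically also align and average several such frames.
func enableLongExposure(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    let shutter = device.activeFormat.maxExposureDuration // longest the sensor allows
    let iso = min(device.activeFormat.maxISO, 1600)       // cap ISO to limit noise
    device.setExposureModeCustom(duration: shutter, iso: iso, completionHandler: nil)
}
```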
 