
Makosuke

macrumors 604
Original poster
Aug 15, 2001
6,666
1,250
The Cool Part of CA, USA
Totally pointless, but after being absolutely shocked I just had to come and rant: holy CRAP the iPhone 14 Pro takes good low-light video!

Coming from a 13 Pro, I'm used to incredibly good low-light performance for such a tiny camera--a 3-second handheld exposure looks amazingly good even on the 13 Pro--but I've also seen and understand the limitations with video, and kind of assumed the 14 Pro would be similar, though I hadn't actually tested it.

Until tonight, when I heard some geese flying very low through the fog and jogged outside to record a bit of them honking. I could not believe my eyes when I played it back.

The video I got is probably 3 times brighter than the scene looked to my eyes, the color is accurate and not dull at all, and while there's very little detail in most of it, the grain has a pleasant gritty look instead of muddy digital goo. Icing on the cake: a porch light across the street wasn't overexposed, and the objects around it are sharp and clear.

I'm used to being able to take stills that are more light sensitive than my eyes, but that experience with video is something else entirely.

This still doesn't do the quality of the video justice (the conversion from HDR looks to have crushed the midrange a bit when I screenshotted it), but to my naked eye the sky was a dim grey, and the tree on the left, lit only by my porch light across the street, looked like nothing more than a silhouette.

[Attached screenshot: Screen Shot 2022-10-11 at 2.12.41 AM.png]
 

Algr

macrumors 6502
Jul 27, 2022
328
365
Earth (mostly)
Nice. I wish Apple would use all their computational photography skills to make a real camcorder for event coverage. Sony, Canon, and the others have totally abandoned the market in favor of DSLRs that are okay for cinema, but too slow for weddings and parties.
 

beerseagulls

macrumors 6502a
Aug 18, 2021
791
606
Nice. I wish Apple would use all their computational photography skills to make a real camcorder for event coverage. Sony, Canon, and the others have totally abandoned the market in favor of DSLRs that are okay for cinema, but too slow for weddings and parties.

That's what I've been wondering about for the last few years. If Apple worked with Sony and incorporated its computational photography magic into cameras like my a7 IV mirrorless, the result would blow my mind.
 

Algr

macrumors 6502
Jul 27, 2022
328
365
Earth (mostly)
I'm used to video cameras with wide zoom ranges, so I've always found the idea of having to change lenses strange and risky. If you have a telephoto lens on and suddenly need wide angle, how long does it take to switch--and how do you keep the telephoto from falling off a table somewhere?

I imagine a camcorder with separate wide and telephoto lenses, depth-measuring lenses for bokeh and focus, large and small lenses for various light levels, etc., all used together to create an image beyond what any lens could do alone.
 

michaelraymac84

macrumors newbie
Jun 1, 2023
2
0
Nice. I wish Apple would use all their computational photography skills to make a real camcorder for event coverage. Sony, Canon, and the others have totally abandoned the market in favor of DSLRs that are okay for cinema, but too slow for weddings and parties.
Yeah, not going to happen, at least not with only one processor. The only way would be a dual-silicon design like NVIDIA uses with RTX cards, where one die is dedicated to a certain type of computation in real time, with its own cache/memory. Photos and videos don't process the same way on phones, or even on dedicated cameras. It doesn't take much time to process a single photo with some bracketing technique plus noise reduction (aka "computational photography") because it's just one frame, versus 24-120 frames per second for video. Even if this were possible, you can bet ARRI, RED, or Sony would have patented it already, if they haven't.
 

Algr

macrumors 6502
Jul 27, 2022
328
365
Earth (mostly)
ARRI, RED or Sony are not going to have access to the kind of computational power that Apple has with their Mx series.
The only way would be a dual-silicon design like NVIDIA uses with RTX cards, where one die is dedicated to a certain type of computation in real time, with its own cache/memory.
Dual? The A16 already has six CPU cores, five graphics cores, and a 16-core Neural Engine. Imagine what an M3 Pro could do with four sensors working together, each with its own unique lens. This would require custom silicon, but who better to do that than Apple? As for patents, those companies' products don't use more than one lens at a time, so... what patents?
 

Algr

macrumors 6502
Jul 27, 2022
328
365
Earth (mostly)
Some examples of what to do with multiple lenses:

- If you have an ambiguous shot and the autofocus doesn't know where to focus, you could have two lenses pick different focal planes. When the user indicates which subject is desired, the footage from the other lens is hidden in the timeline, but still accessible.

- I'd love to be able to cut from a telephoto shot to a wide angle, rather than having to pump the zoom all the time.

- A lens could be pointed in another direction, allowing cutaways for reaction shots, and capturing things that the videographer might miss.

- A reframe zone could allow the telephoto lens to record at full resolution while the surrounding area is still covered. This would dramatically improve post stabilization, and let the editor reframe around unexpected action after the fact.

- In low light, one sensor could run at 15 fps, while another runs at 60. Combining the videos would yield the best of both, and also improve dynamic range. (A rough sketch of that fusion follows below.)
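
Here's a back-of-the-napkin Swift sketch of that last idea. Everything in it is hypothetical: frames are reduced to flat arrays of linear-light luma values, and a real pipeline would need motion alignment and per-pixel weights rather than one fixed blend.

```swift
// Hypothetical fusion of a bright-but-motion-blurred 15 fps frame with
// a sharp-but-noisy 60 fps frame. Frames are simplified to flat arrays
// of linear-light luma; `weight` is a made-up knob, not a tuned value.
func fuseFrames(slow: [Float], fast: [Float], weight: Float = 0.5) -> [Float] {
    precondition(slow.count == fast.count, "frames must be the same size")
    return zip(slow, fast).map { s, f in
        // The slow frame contributes clean shadows; the fast frame
        // contributes motion detail. The blend trades between the two.
        weight * s + (1 - weight) * f
    }
}

// Pair each 60 fps frame with the most recent 15 fps frame (4:1 ratio).
func fuseStreams(slow: [[Float]], fast: [[Float]]) -> [[Float]] {
    fast.enumerated().map { i, frame in
        fuseFrames(slow: slow[min(i / 4, slow.count - 1)], fast: frame)
    }
}
```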
 

Makosuke

macrumors 604
Original poster
Aug 15, 2001
6,666
1,250
The Cool Part of CA, USA
Some examples of what to do with multiple lenses:

- If you have an ambiguous shot and the autofocus doesn't know where to focus, you could have two lenses pick different focal planes. When the user indicates which subject is desired, the footage from the other lens is hidden in the timeline, but still accessible.

- I'd love to be able to cut from a telephoto shot to a wide angle, rather than having to pump the zoom all the time.
Thinking purely of smartphones, the combination of those two made me picture a cool hypothetical feature that to my knowledge hasn't been done on a phone yet--if the phone captured video from two (or more) cameras at once, you could cut from wide-angle to zoomed in "in post" without having to plan in advance. I'm not sure whether the bandwidth from the camera module through the image processing pipeline to storage is currently up to that, but it's certainly not impossible in theory.
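
Edit: it turns out AVFoundation has exposed simultaneous multi-camera capture since iOS 13 via AVCaptureMultiCamSession, so the plumbing exists even if the stock Camera app doesn't surface it. A rough sketch of the wiring; error handling, the session queue, format selection, and actually writing the frames to disk (e.g. with AVAssetWriter) are all omitted:

```swift
import AVFoundation

// Sketch: capture the wide and telephoto back cameras at the same time
// with AVCaptureMultiCamSession (iOS 13+). Not production code.
func makeDualCameraSession(delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                           queue: DispatchQueue) -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    for type in [AVCaptureDevice.DeviceType.builtInWideAngleCamera,
                 .builtInTelephotoCamera] {
        guard let device = AVCaptureDevice.default(type, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device) else { continue }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(delegate, queue: queue)
        guard session.canAddInput(input), session.canAddOutput(output) else { continue }

        // Multi-cam sessions require explicit connections; a plain
        // AVCaptureSession would wire these up automatically.
        session.addInputWithNoConnections(input)
        session.addOutputWithNoConnections(output)
        guard let port = input.ports(for: .video,
                                     sourceDeviceType: type,
                                     sourceDevicePosition: .back).first else { continue }
        let connection = AVCaptureConnection(inputPorts: [port], output: output)
        if session.canAddConnection(connection) { session.addConnection(connection) }
    }
    return session
}
```

Each camera then delivers its own frame stream, so the wide and tele clips could be cut against each other in post.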
 

Algr

macrumors 6502
Jul 27, 2022
328
365
Earth (mostly)
you could cut from wide-angle to zoomed in "in post" without having to plan in advance.
Yes. I think the only thing a phone would need to do this is a second HEVC encoder. You'd also use up your storage twice as fast, so it isn't something you'd want to leave on all the time.
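
(Ballpark math, assuming iPhone 4K HEVC runs somewhere around 50 Mbps: two simultaneous streams would be roughly 100 Mbps, or about 45 GB per hour, so a 256 GB phone fills up in under six hours of dual recording.)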

Myself, I wasn't thinking of smartphones, but something with a full Mac inside in the $2500 range--something that could live-stream a multi-camera wedding. DSLRs are useless for streaming--worse than cell phones!
 

michaelraymac84

macrumors newbie
Jun 1, 2023
2
0
ARRI, RED or Sony are not going to have access to the kind of computational power that Apple has with their Mx series.

Dual? The A16 already has six CPU cores, five graphics cores, and a 16-core Neural Engine. Imagine what an M3 Pro could do with four sensors working together, each with its own unique lens. This would require custom silicon, but who better to do that than Apple? As for patents, those companies' products don't use more than one lens at a time, so... what patents?

1. You are talking about near-latency-free computation for each frame. That would require a very power-hungry dedicated processor with a very large amount of memory to retain each frame, process it, and then save it to storage. (Some rough numbers below.)

2. Sony provides the sensors for Apple and also specializes in programming for very complex hardware designs, so they are definitely more than capable of doing whatever Apple can do, if not better.
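
(Some rough numbers on point 1: a single 12-bit 4K frame is roughly 12 MB packed, so at 60 fps one sensor alone pushes on the order of 750 MB of raw data per second into that pipeline, before any noise reduction or fusion happens.)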
 

Algr

macrumors 6502
Jul 27, 2022
328
365
Earth (mostly)
1. You are talking about near-latency-free computation for each frame. That would require a very power-hungry dedicated processor with a very large amount of memory to retain each frame, process it, and then save it to storage.

2. Sony provides the sensors for Apple and also specializes in programming for very complex hardware designs, so they are definitely more than capable of doing whatever Apple can do, if not better.
1: Worst case, they would compress each output separately and do the multi-lens stuff in post. The battery in a typical camcorder is dozens of times bigger than an iPhone's, so power wouldn't be a problem. There was once a VHS camcorder with two lenses and image sensors: you could point the wide-angle one in any direction and cut between them, and it had a few gimmicky effects like chroma key.

2: Sony can do the lenses and sensors, but not the chips, software, or interface. And Sony isn't the only game in town for anything. Given that no one is making camcorders now, there shouldn't be any non-compete agreements holding Apple up.

- I define a "camcorder" as a device form factor optimized for video production as opposed to stills. So it's balanced in the hand and can be pointed steadily for minutes at a time without a tripod. It has a quality microphone built in. No limit on record time other than storage and battery life. (I'd never heard of a camera overheating before 2020.) And at least 10x optical zoom.
 