But now the “Scary Fast” event for the release of Apple’s M3 computers was filmed entirely on the iPhone, as the credits alone reveal. On YouTube you can find the “Behind the Scenes” footage (is.gd/iphone_bts), which also makes clear that the production and technical effort was the same as ever. The dollies, cranes and the drone, however, were simply carrying a new iPhone 15 Pro Max instead of a conventional camera. In addition, the predominant visual theme was “black”, matching one of the new laptop colours.
The EU is to blame
This doesn’t mean that this iPhone is a low-light wonder (it isn’t), but that you can light for high contrast and then grade towards black. The decisive factor is that recording in ProRes with Log is now available. Apple would certainly have liked to hold on to its proprietary connector for longer, but the EU has mandated that all smartphones coming onto the market here from 2024 must have USB-C. Apple has taken the plunge and already fitted the corresponding port to this year’s series.
Data transfer
In the early productions with iPhones, the crews involved must have had incredible patience: recordings had to be transferred via WiFi or the interface with the euphemistic name “Lightning” (or its 30-pin predecessor), which is very slow by today’s standards. Two short test clips of around 750 MB took over ten minutes via AirDrop; via USB-C the transfer took just a few seconds.
Initially, rumours were floating around the web that, apart from charging the battery via USB-C, Apple might restrict external devices to its own offerings or only allow its own cables at pharmacy (sorry: Apple) prices.
None of this has materialised: a standard SSD connected via a normal data cable (not just a charging cable, of course) works without any restrictions. With the iPad, we still had to complain that FCP-X cannot work directly from connected storage devices (DP 23:05). On the iPhone, by contrast, Apple does not force internal recording, as the apps already available show. The Apple event was evidently shot with the help of the free camera app from Blackmagic (BM for short); Apple even thanks the company in the credits.
Storage media
It is clear that Apple also wants to satisfy its shareholders: the new options are limited to the two top models. On the Pro you also have to pay an extra 130 euros for 256 GB of internal storage to get full functionality with the storage-hungry ProRes even without an SSD; this is the basic configuration of the Pro Max. The internal storage is really fast, at around 1.6 gigabytes per second (GB/s) for both writing and reading, measured with Diskio by Sergii Khliustin, a free app with witty comments (is.gd/diskio).


Externally, only regular USB-C is supported, no Thunderbolt. With a good SSD, however, this is completely sufficient to record ProRes 422 HQ in UHD at up to 60 fps directly, which only requires around 200-225 MB/s. Initially, the app even listed ProRes 4444, but that was probably too much for the system. Even on an Arri Alexa this codec is not always chosen, and 422 HQ is completely sufficient as a source format for the iPhone. In UHD at 60 fps it needs just under 800 GB of space per hour.
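For anyone who wants to check the arithmetic, a small Python sketch (our own illustration, not an Apple or Blackmagic figure) converts the data rate quoted above into storage per hour:

```python
# Rough storage estimate for ProRes 422 HQ in UHD at 60 fps.
# The ~213 MB/s value is the approximate data rate mentioned in the text;
# real-world rates vary slightly with image content.

def gb_per_hour(mb_per_s: float) -> float:
    """Convert a sustained data rate in MB/s into GB per hour of recording."""
    return mb_per_s * 3600 / 1000  # 3600 seconds, 1000 MB per GB

print(f"{gb_per_hour(213):.0f} GB per hour")  # ~767 GB, i.e. just under 800 GB
```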
A Samsung T5 SSD as external storage, for example, shows around 373 MB/s write throughput, depending on which app does the measuring. Even a T7, which is not recommended for BM cameras, ran for two hours on the iPhone at 25 fps. The free OWC Drive Speed app can even directly assess suitability for various codecs, resolutions and the achievable frame rate. However, it warned of frame drops that we could not detect in practice with the media in question, so this app is probably not yet fully mature.
Even a T5 is still relatively large for an iPhone, although thanks to MagSafe it can easily be attached to the back with a magnetic ring. Some will wonder whether a smaller USB-C flash drive would also do. Samsung has relatively fast models for this, currently with a maximum of 256 GB. According to the company (we did not have one to test), these should read 400 MB/s but only write 110, figures that always come with the small caveat “up to” and only apply to the 256 GB model. That should still be enough for ProRes 422 HQ at 25 fps, while ProRes 422 LT would even manage 60 fps. Unfortunately, such sticks often do not fit alongside a phone case.
We have also not tested the idea of connecting a really fast SDXC card with a small USB-C adapter. But a Sony “Tough” or comparable cards with 300 MB/s should work with a really fast adapter. A card reader can of course also be useful for backups on the iPhone when working with conventional cameras. If you want to edit the material on the PC, you should pay attention to the formatting of the media: The iPhone does not recognise NTFS, but exFAT works up to 2 TB.



Be careful here, though: there is a risk of data loss. The “classic” HFS or the new APFS, both of which the iPhone naturally supports, are safer choices, and one of these formats is therefore also advisable for transferring to the PC. Under Windows, programmes such as “HFS for Windows” or “APFS for Windows” from Paragon make these formats usable.
Power and heat
A fully charged Pro Max still showed 66 per cent charge after one hour at 60 fps, and 42 per cent after two hours of recording in ProRes 422 HQ at 25 fps in UHD, with the T5 SSD also being powered by the phone. At 25 fps the SSD accounted for the larger share of the consumption; at 60 fps it was the other way round. The iPhone 15 Pro still had 58 per cent after one hour at 60 fps; here the camera app (Blackmagic’s, in this case) needed the larger share and the SSD about two thirds of that. This value is not the same for all apps, as we will see in their individual tests. These are amazing runtimes, but the iPhone does get very warm, despite iOS 17.1.1, which has largely eliminated the initial heat problems. You should definitely remove any silicone case: we got overload warnings at 60 fps on the Pro Max after a few minutes at an ambient temperature of 27 degrees. Another tip from BM was to switch off the overlays when recording, i.e. to “wipe them away”. With that, the Pro Max ran smoothly even in slightly warmer conditions.

The iPhone 15 Pro may seem more attractive for video with its 3x lens and slightly lower price. However, its battery is smaller; in stand-alone mode with SSD, 20 per cent charge was still left after two hours. Unfortunately, the smaller housing also cools less well, and despite the above cooling measures, 60 fps was over after a few minutes. At 25 or 30 fps, the Pro also lasted without overheating and the app consumed less power in relation to the SSD. Very fast SSDs, such as a Samsung 980 Pro in an NVMe enclosure, require too much power and only run on both iPhones with an external power supply via USB-C hub. You don’t necessarily have to take a hub with you to externally power the iPhone itself, as there are already power banks for MagSafe.
Frame rate and synchronisation
With a runtime limited only by power and storage medium, another question arises: what about keeping the frame rate, and thus picture-sound synchronisation, stable over longer periods? In Apple’s film, no statement ran longer than 20 seconds without a cut, which any smartphone can manage without drifting out of sync. Unfortunately, practically all of today’s smartphones deliver not a fixed frame rate but a variable one (VFR = Variable Frame Rate). DaVinci Resolve (DR for short) simply reports a seemingly exact 60.000 under “Shot Frame Rate”, but the iPhone is no exception here either. Before recording, we switched to aeroplane mode and closed all other apps to rule out any interference.
However, a further measurement showed that this was not necessary. The free MediaInfo reports the figures somewhat more honestly, showing a frame rate between 59.940 and 60.181 fps, a deviation of far less than one per cent. A longer recording with a clapperboard showed that the variable frame rate is relatively unproblematic in DR. The sound ran consistently one frame early for over 30 minutes (at 60 fps!); only at one point did we have the impression that it tended towards two frames. Since the picture in DR occasionally moved one frame ahead of the timecode slate, which we kept continuously in shot, the editing software evidently does react to VFR. This is usually manageable, but the iPhone is probably not quite suitable for uninterrupted concert recordings lasting several hours.
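To put the measured deviation into perspective, here is a small back-of-the-envelope sketch (our own, simplified model): it assumes the editing software plays the clip back at exactly the nominal rate while the audio runs in real time, and computes how much offset a given average frame rate would accumulate.

```python
# Simplified model: accumulated picture/sound offset if a VFR clip averages
# slightly off the nominal rate and is played back at exactly the nominal rate.

def drift_frames(nominal_fps: float, average_fps: float, seconds: float) -> float:
    """Offset after `seconds`, expressed in frames at the nominal rate."""
    return seconds * (average_fps - nominal_fps)

# Staying within one frame over a 30-minute take at 60 fps requires an average
# rate within roughly 60.0006 fps - a deviation of about 0.001 per cent.
print(drift_frames(60.0, 60.0006, 30 * 60))  # ~1 frame
# The 60.181 fps reported by MediaInfo is a momentary peak; sustained over the
# whole take it would amount to hundreds of frames, which we did not observe.
print(drift_frames(60.0, 60.181, 30 * 60))   # ~326 frames (hypothetical)
```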

Cameras and lenses
A few years ago, the statement “My smartphone has three cameras and a LiDAR scanner” would probably have been met with incredulous smiles. But there are actually three lenses here, each with its own sensor, just like before. In fact, most of it is even identical to the iPhone 14, only the signal processing has been further developed. The ultra wide-angle lens with 13 mm, the regular 24 mm and the 77 mm in the iPhone Pro are identical to their predecessors.
Only the 120 mm in the Pro Max is really new: a very clever design with a multiple prism that essentially folds a relatively long focal length into the flat body. To us, such a design seems potentially more durable than the miniaturised zoom mechanisms of the competition (usually referred to as periscope lenses), as there are no moving parts apart from the sensor. It is therefore not an optical zoom at all, as is sometimes incorrectly written. Please note that in the following we use “KB full frame” (from the German Kleinbild) for the photographic full frame of 24×36 mm, because experienced film camera operators tend to mean the S-35 full frame, whose diagonal is not always the same but depends on the respective widescreen format. The focal lengths given by Apple and in the apps refer to the KB full-frame equivalent. In reality, the 1× main lens has 6.86 mm at f/1.78, the 0.5× ultra wide-angle 2.22 mm at f/2.2, the 3× 9 mm at f/2.8 and the 5× 15.66 mm, also at f/2.8.
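If you want to convert between real and equivalent focal lengths yourself, the crop factor of each camera module simply falls out of the ratio; a short sketch with the values quoted above:

```python
# Crop factor per module = KB-equivalent focal length / real focal length.
# The different factors reflect the different sensor sizes behind each lens.

lenses = {
    "0.5x ultra wide":   (2.22, 13),    # (real mm, KB-equivalent mm)
    "1x main":           (6.86, 24),
    "3x tele (Pro)":     (9.00, 77),
    "5x tele (Pro Max)": (15.66, 120),
}

for name, (real_mm, equiv_mm) in lenses.items():
    print(f"{name}: crop factor ~{equiv_mm / real_mm:.2f}")
```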

The iPhones also offer a zoom across the lenses, but it is based on continuous digital scaling. Unfortunately, this shows a disturbing jump when switching over to the telephoto due to the parallax of the separate cameras. In addition, even with sophisticated algorithms, sharpness drops slightly as soon as the crop uses fewer than the sensor’s full 48 megapixels (mpx), which is particularly noticeable on the way to 5×. One disadvantage the longer focal lengths share with conventional cameras: the 77 mm only focuses down to 60 cm, the 120 mm only down to 1.3 m.
You have to be careful here: for macro control, the iPhone should be set to “Keep Settings”. Otherwise only the small flower icon indicates that the macro range has switched to the ultra wide-angle camera, even though the original lens still appears selected; unlike the standard camera, it only has 12 mpx. This automatic switch can also be deactivated entirely. With the default setting, photos from the 24 mm lens are downscaled to 24 mpx and recorded in HEIC; the full 48 mpx are only available in RAW (DNG format). The main camera also delivers the best quality for video thanks to downsampling. And for the selfies: the front camera has a real 2.69 mm at f/1.9, so it behaves like a 15 mm, but only has just under 8 megapixels (strictly speaking photo cells, i.e. sensel).


Photography
The DP is not explicitly aimed at photographers, but in view of the different resolutions of the sensors, a few comments seem appropriate. In our test of the Sony A7SIII (DP 21:05), we already showed that megapixels aren’t everything. Like the “mere” 12 mpx of that Sony, images from the iPhone are perfectly adequate for an A4 page. For large posters, however, we would generally recommend professional cameras, even if Apple sees it differently in its own advertising.


It should be noted that the sensor has an aspect ratio of 4:3, which is even closer to square than the usual 3:2 of 35mm. The HEIC photos are very compact at around 2.8 MB, whereas DNG requires over 67 MB for the same subject. A 2,400×2,400-pixel crop from the iPhone at 120 mm (equivalent), compared with the same crop from the Sony A7 IV at approximately the same focal length, differs primarily, of course, in the real bokeh, which the iPhone does not simulate here either. The slightly different colours can of course be adjusted, as both are raw images with plenty of colour depth.
The resolution of the test chart in 48 mpx from the 24 mm of the iPhone is excellent and shows no moiré, but slight colour noise even at ISO 320.

The counterpart in standard setting as HEIC with 24 mpx shows less noise, but colour moiré and clear edge sharpening. The modern HEIC compression is very high quality in itself, but unfortunately Apple applies all the “image enhancements” to these photos that the fast A17 processor is capable of, such as smoothing and sharpening contours or local colour tone mapping. Edge sharpening in particular looks just as distracting as it does on amateur-class drones. Of course, Apple also knows that the sharpening of details in faces is usually undesirable. Accordingly, skin tones tend to be smoothed out, which often gives them the infamous “plastic look”.

The design of these functions is certainly based on thorough market analysis: the vast majority of users simply take photos with their phones and presumably find all this attractive. Professionals who do not want to give up tweaking their pictures themselves will therefore have to work with RAW Max (as it is called in the menu) and plan the necessary space for it, either conveniently and expensively in the iPhone or more cheaply on external storage. Turning your nose up at “computational photography” in general is not really justified, though: most professional cameras have been quietly correcting various lens errors for years, primarily vignetting and distortion, which allows lenses to be optimised in other respects without becoming obscenely expensive.

The iPhone goes beyond this and achieves amazing things in two areas: bokeh and low light. The ability to use the (actually rather coarse) LiDAR information to adjust the depth of field, and where it sits, after the fact was already available on a number of predecessor models. But Apple has significantly improved the algorithms and, at least in photography, the results can now only be recognised as fake with intensive pixel peeping, even with hair and other difficult edge zones.

The results in RAW (DNG) are impressively good, while artefacts can be seen in HEIC. Note, though, that we photographed the test panel at ISO 320 for the wide angle and ISO 500 for the telephoto. Vignetting and chromatic aberrations are barely recognisable, and the distortion correction of the 24 mm is even better than with the uncorrected professional lens. Moiré is also hardly noticeable in the 48 mpx from the 24 mm, and the resolution is even slightly higher, as the A7 IV only has 33 mpx. At 77 mm or 120 mm, the A7 IV has a recognisable advantage over the 12 mpx of the iPhones, even though its lens was slightly slower in each case. A 13 mm was not available to us for full frame, so no comparison was made there.

The Sony’s HEIF images look cleaner because, apart from subtle noise filtering in extremely low light, everything else can be deactivated. In low light, the iPhone does all by itself what any camera on a tripod can do in principle: it takes a series of photos that are then combined to reduce noise. Red has offered something similar for ten years (see DP 13:03). On the iPhone this also works handheld thanks to good stabilisation and reasonably steady hands, but you can get even more out of it with a tripod. Some even recommend it for astrophotography; there is a free app for this, AstroShader, which can also save raw images.

Now, we have too much light pollution near a city for a great starry sky, but you can point the camera at the lights of the city itself instead. It is impressive what such a tiny chip can do with the help of a computer: the tree in the foreground was barely distinguishable from the night sky with the naked eye. Apple is clearly making a virtue of necessity, using the A17’s computing power to compensate for the tiny sensor and calculating far better images than you would expect from a mobile phone. However, these calculations take time and cannot be used for video. So how do the 15s perform when filming? We use the Sony A7 IV (see DP 22:05) as a reference here, even if some people seriously compare the images with those from an Arri Alexa in grading (is.gd/quazi_iphone_arri).
Film: Resolution
When filming in UHD from the high-resolution 24 mm sensor, the iPhone not only scales down the 48 million photo cells (strictly, sensel), but also crops slightly, from the 4032 horizontal pixels available per photo to 3840. If Apple had simply added a few more sensel, the iPhone could even deliver the 4096 pixels of the DCI cinema format; there would be more than enough vertical pixels anyway. Other formats are possible with third-party apps, which we cover in more detail in the individual app reviews. The same applies to the ultra wide-angle and the telephoto lenses, which logically have less oversampling but are also slightly cropped.
The downsampling of the 48 mpx at least offers the possibility of noise reduction by integrating the neighbouring pixels, or a digital 2× zoom without loss of resolution. However, when zooming through to the 5× telephoto, not only does the parallax error mentioned above occur, but the resolution also collapses beyond 2×. Up to the 3× of the iPhone 15 Pro, these effects are correspondingly less noticeable, and a 28 mm or a 35 mm is still simulated quite well by both iPhones. This is comparable to Sony’s in-camera digital zoom, which works with similarly good algorithms but cannot deliver full resolution beyond around 1.7×.
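Where exactly the digital zoom starts to cost resolution can be estimated from the horizontal sensel count; a sketch assuming the 48 mpx main sensor is roughly 8064 sensel wide (our assumption) and the output is UHD:

```python
# Hedged sketch: where the digital zoom starts to cost resolution.
# Assumption: the 48 mpx main sensor is roughly 8064 sensel wide
# (8064 x 6048 = ~48.8 million); the UHD output needs 3840 pixels.

SENSOR_WIDTH = 8064   # assumed horizontal sensel count of the 48 mpx sensor
OUTPUT_WIDTH = 3840   # UHD

# Lossless digital zoom ends where the crop falls below the output width.
print(f"max lossless zoom: ~{SENSOR_WIDTH / OUTPUT_WIDTH:.1f}x")  # ~2.1x

for zoom in (1.0, 1.5, 2.0, 3.0, 5.0):
    sampled = SENSOR_WIDTH / zoom   # sensel covering the cropped field of view
    status = "full UHD resolution" if sampled >= OUTPUT_WIDTH else "upscaled"
    print(f"{zoom:.1f}x: {sampled:.0f} sensel -> {status}")
```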
The resolution of the iPhones is also reduced at the higher levels of stabilisation. “Standard” only uses optical stabilisation by mechanically shifting the sensor, while “Cinematic” and “Extreme” additionally track a reduced area of the sensor surface. Unfortunately, “Extreme” in particular also shows overshoots, or is this intended to simulate the inertia of a Steadicam? Be that as it may, a small gimbal does this better. Another disadvantage of the two higher stabilisation levels is a massive deviation in the frame rate during the first second; the iPhone is obviously “stressed” here (thanks to Kurt Friis Hansen for these measurements).
The other sensors, with 12 mpx, naturally offer less resolution reserve when filming, but are completely sufficient for UHD. For comparison: on S-35, the 5× lens of the Pro Max would correspond to about 75 mm, the 3× to about 48 mm. Because a suitable prime lens was not available for the Sony, we used a very high-quality zoom, which actually put the A7 IV at a slight disadvantage in terms of lens speed. In all shots, the iPhones’ film images of the test chart show slight moiré and some loss of resolution due to scaling; the Sony obviously has better downsampling. The differences are small at 24, 77 and 120 mm, and the moiré is practically only visible on the test chart. Other cameras only deliver better results with very good downsampling from 6K or more.
It is obvious that a lot of computation is also applied to the images in Log, as the charts show no vignetting and hardly any distortion, just as in photography; this is not to be expected from such lenses without software correction. The very high-quality Zeiss C/Y zoom on the Sony shows more optical flaws precisely because it lacks software correction. It should go without saying that a lens with a real focal length of just under 16 mm (the 5×) cannot, without software tricks, offer as differentiated a depth of field as a real 120 mm for KB full frame (see above). All lenses definitely show some breathing during focus shifts, unless Apple is simulating that as well 😉



Film: Compression
Here, too, we compare with the Sony A7 IV, which at maximum quality records only 10-bit 4:2:2 as H.265; ProRes is not available without an external recorder. Our endurance test was a water surface in the rain, which can really torture any GOP codec (such as H.264/265). In H.265 the iPhone recorded only a good 7 MB/s at 60 fps in UHD, while the same motif in ProRes 422 HQ came to 213 MB/s; the Sony A7 IV records H.265 at 50 fps with 25 MB/s. Nevertheless, the typical artefacts such as blocking are hardly visible in the correctly exposed motif from the iPhone. However, this does not speak for the quality of the encoder alone, as the finest details are already reduced before compression.
This becomes clear with a shot of the same subject that is underexposed by just one f-stop and corrected in DR. Despite 10 bit and 4:2:2, the signal in H.265 already shows signs of banding and is overall much less “full” and more coarsely structured than the version in ProRes. The same applies to noise, which is also filtered more heavily before compression and then “muddied” by the codec. The noise remains much finer and more “organic” in ProRes. It is therefore less disturbing in itself and is also more suitable for careful filtering in post-production. Consequence: If you don’t have enough storage space on location, H.265 is perfectly usable in Log, but should be exposed as correctly as possible. ProRes is considerably more tolerant, while Cinema P3 can record better H.265 at higher data rates (see p. 128).
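To make the gap between the codecs more tangible, the measured data rates can be converted into effective bits per pixel. A GOP codec naturally spreads information across frames, so the figures are not directly comparable with the intra-frame ProRes, but the order of magnitude explains why fine detail and noise survive better there. A small sketch with the values measured above:

```python
def bits_per_pixel(mb_per_s: float, width: int, height: int, fps: float) -> float:
    """Effective bits per pixel for a given data rate and video format."""
    return mb_per_s * 8e6 / (width * height * fps)

print(bits_per_pixel(7,   3840, 2160, 60))  # iPhone H.265, UHD 60 fps  -> ~0.11 bpp
print(bits_per_pixel(213, 3840, 2160, 60))  # iPhone ProRes 422 HQ      -> ~3.4 bpp
print(bits_per_pixel(25,  3840, 2160, 50))  # Sony A7 IV H.265, 50 fps  -> ~0.48 bpp
```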
Film: Motion display
We can be brief here and take the liberty of quoting the values from CineD (is.gd/cined_test), because our reference fan was not available. For UHD, our colleagues measured a sensor readout of 5.3 milliseconds at 24 mm, and even 5 ms at 77 mm and 120 mm; our subjective impression from our own tests agrees. These are excellent values, also in comparison with conventional film cameras. The jello effect of older iPhones is definitely a thing of the past, and action shots are no longer a problem.
However, the Bentley film mentioned at the beginning was still conspicuous for the juddering of fast movements at only 24 fps. A look at the individual frames shows that they lack any motion blur. Even these tiny sensors are too light-sensitive today to obey the filmic rule of thumb for sufficient motion blur in daylight (a shutter of roughly half the frame interval). After all, mobile phones primarily use the exposure time for exposure control, because the aperture is fixed. Without an ND filter, the only remaining lever is the ISO value, which the iPhones make intensive use of. But this has consequences for the usable contrast range.
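How much ND that implies can be estimated with standard exposure-value arithmetic. The following sketch uses assumed values purely for illustration: a sunny scene around EV 15 (referenced to ISO 100), the fixed f/1.78 aperture of the main camera, a 1/50 s shutter at 25 fps and a base sensitivity taken as ISO 100.

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """EV of an aperture/shutter combination (ISO 100 reference)."""
    return math.log2(f_number ** 2 / shutter_s)

def nd_stops_needed(scene_ev: float, f_number: float, shutter_s: float, iso: float) -> float:
    """Stops of light an ND filter must absorb for a correct exposure."""
    camera_ev = exposure_value(f_number, shutter_s) - math.log2(iso / 100)
    return scene_ev - camera_ev

# Assumed values: EV 15 scene, f/1.78, 1/50 s, ISO 100.
print(nd_stops_needed(15, 1.78, 1 / 50, 100))  # ~7.7 stops, i.e. roughly an 8-stop ND
```

That rough result also fits with the fact that the inexpensive ND set mentioned below goes up to 10 f-stops.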


Film: Dynamic range and colours
The HDR shots from an iPhone, viewed on a corresponding screen, are very impressive at first glance, but on closer inspection the sharpening becomes unpleasant. Added to this is the local tone mapping, which is particularly noticeable with intensely coloured light sources and soft colour gradients in the moving image. Only recording in Apple Log avoids these annoyances, which is why we limited ourselves to it for the dynamic range measurement. Apple Log is excellently tuned and does the iPhone more justice than S-Log3 does the A7 IV, because the latter is still very flat even for 10 bit. Incidentally, in the current version 18.6.4, DR understands Apple Log with its colours in Rec 2020 via automatic colour management.

The amazing night vision in photos ends for the iPhone when filming, as there is no time for the exposure tricks from photography. We also used the Sony A7 IV for comparison. It is not quite as sensitive to light as the A7S and also works with noise reduction in low light. Compared to the iPhone, however, it shows the richness of detail that a full-frame sensor can achieve when collecting light.


However, we came across an interesting phenomenon that suggests multiple exposures in the highlights. When shooting at 50 fps in Log with exposure times of 1/80 second or longer, the highlights are consistently limited to around 70 per cent. If you shorten the exposure to 1/96 or beyond, the values expand into the space above. Hardly any further information can be found about this function; only CinemaP3 refers to it as EDR (Enhanced Dynamic Range).


In the developer documentation, the term is primarily used for playback on HDR displays. Red has been offering something similar for recording under the name “HDRx” for some time, but probably does not hold a patent on it (unlike in-camera RAW compression). Evidently, two (perhaps even more) readouts of the sensor are combined as soon as the time remaining until the next frame is sufficient: 7.5 milliseconds is clearly not enough, but 10 ms is fine. This is less of an issue at 25 fps, since the usual exposure time of 1/50, i.e. 20 ms, leaves easily enough time. Here the effect already starts at 1/40; at 1/30 the remaining time is again too short, which mathematically corresponds to the behaviour at 50 fps with just under 7 ms. The resulting distribution across the log curve can easily be controlled using ND and ISO.
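The timing logic our measurements suggest can be written down in a few lines. The ~9.5 ms threshold below is our own estimate derived from the observations above (somewhere between the 7.5 ms that does not work and the roughly 10 ms that does), not a documented Apple figure.

```python
# Working hypothesis: EDR needs a minimum idle gap between the end of the
# exposure and the next frame for an additional sensor readout.

EDR_IDLE_MS = 9.5  # assumed threshold, derived from our own measurements

def idle_time_ms(fps: float, shutter_s: float) -> float:
    """Time left in each frame interval after the exposure has ended."""
    return (1 / fps - shutter_s) * 1000

for fps, shutter in [(50, 1/80), (50, 1/96), (25, 1/50), (25, 1/40), (25, 1/30)]:
    gap = idle_time_ms(fps, shutter)
    verdict = "EDR active" if gap >= EDR_IDLE_MS else "no EDR"
    print(f"{fps} fps at 1/{round(1/shutter)} s: {gap:.1f} ms idle -> {verdict}")
```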

At the other end of the scale, it is noticeable that there is practically no recognisable noise. You can simply cover the lens with manual exposure and the line drops to zero. So there is obviously a lot of filtering going on in the camera, because such a small sensor cannot be so low-noise despite all the technical progress. This becomes quite clear when shooting at high ISO values with short exposure times, as the lower mid-tones are already very noisy. Medium ISO with longer exposure times using ND looks much better.
So there is no single “correct”, native ISO value that always fits; the iPhone makes too many adjustments for that, even in Log. It all depends on where the important details are. If you want to bring out the highlights, e.g. retain structure in a cloudy sky, a higher ISO value is appropriate: EDR reduces the contrast in the highlights, but the detail is there. If the shadow areas matter most, on the other hand, you should go for a low ISO to give the sensor enough light; the highlights will then be compressed, but largely without clipping.

In our exposure series with EDR, 2 f-stops of overexposure and 3 f-stops of underexposure were still easily correctable; the colours become weaker but do not derail. They are, incidentally, excellent: especially in the critical range between red and yellow, and thus with skin tones, the iPhone hits the mark precisely. Noise is visible in dark areas, but it is fine-grained in ProRes and can be controlled in post. We worked with DR’s colour management and only corrected brightness and contrast. An extremely high-contrast motif with a cloudy sky and black molton in the shadowy foreground measured 9.5 f-stops and would still be usable in post (the blue cast on the blacks is scattered sky light).

After all these observations, I must also note that Gerald Undone and Patrick Tommaso’s categorisation of ISO 1,100-1,400 as the native value should be treated with caution; the two did not explain whether they used ND filters or varied the exposure time (is.gd/undone_iphone). DXOmark, however, also assumes 1,250 to 1,480. I tend to favour ISO 200-800 to keep noise down and use EDR with sufficiently long exposure times. In bright light, ND filters are of course the order of the day, otherwise you end up with the typical staccato look of action-cam footage.
Film: Equipment
Basically, the only indispensable accessory for good pictures is an ND filter. But a variable ND consists of two polarising filters and has the corresponding side effects, from the infamous “X” to altered colours (warmer ones in our case). Conventional ND filters are better. We were unable to observe any noticeable IR contamination with an inexpensive set, even at 10 f-stops; Apple probably has this well under control. Filters can be attached with a simple clamp mount, but this should not cover too much of the interface on the side of the screen. In addition, such a mount does not fit well with the usual transparent silicone covers, as stray light can then enter from behind. Such covers are also not conducive to cooling in warm environments.


[Image captions: anamorphic streaks with the 1.33× in blue and in “gold”; the worst of the low-cost filters; a minimalist magnetic holder; such systems are also available from others, here Neewer.]
If it’s not too warm, a thin, black cover would be useful. If you want to take things further, you will sooner or later end up on the road to the “Frankenrig”, because the accessories industry has adapted immediately. But there are alternatives that, as a protective frame without all the add-ons, still fit in your pocket. The company Beastgrip has been involved since the Bentley advert and also produced the universal cage that was used for Apple’s event. Its “Beastcage” will soon be released in dedicated versions for the iPhone 15 Pro and Pro Max and offers a very sophisticated system, including filters, a macro lens and anamorphic attachments. Another provider is Freewell with the “Sherpa”, whose name of course says it all, especially for the iPhone; it works with clever magnetic holders for filters and lenses.
Freewell offers not only filters but also anamorphic lens attachments. These are available in both 1.33× and 1.55×, and from Freewell optionally with streaks in classic blue as well as in “gold”, i.e. warm yellow. The streaks are very pronounced, and the typical side effects such as edge blurring and distortion come with them. The 1.55× in particular, however, offers much more horizontal image area, even if it is not easy to align. In ProRes Log there is no gain in resolution from the anamorphic attachments; when filming, the full area of the 4:3 sensor can only be used with H.265 in 8-bit HDR or Rec 709, which then shows all the typical “blurring” effects.
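The extra width is easy to quantify: the de-squeezed aspect ratio is simply the recorded aspect ratio multiplied by the squeeze factor. A small sketch for the two sensor areas mentioned above:

```python
# De-squeezed aspect ratio = recorded aspect ratio x anamorphic squeeze factor.

def desqueezed_ratio(width: int, height: int, squeeze: float) -> float:
    return width / height * squeeze

for w, h, label in [(16, 9, "16:9 recording (ProRes Log)"), (4, 3, "full 4:3 sensor (H.265)")]:
    for squeeze in (1.33, 1.55):
        print(f"{label} with {squeeze}x: {desqueezed_ratio(w, h, squeeze):.2f}:1")
```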
Be careful if you want to use existing filters with a larger diameter via a step-up ring: the 24 mm main camera can then vignette. In addition, the iPhone’s stabilisation does not work with anamorphic attachments, and sometimes the iPhone “sees” the adapter as an object and switches into macro mode. So is this more for wannabe Spielbergs? The only filters you really need are NDs and possibly a circular polariser, as neither of these can be replaced in post. There are excellent software solutions for everything else, such as “Scatter”.

Advantages and disadvantages
So where do we stand with such a powerful camera phone? Well, first of all, an “always with you” camera is definitely better than no camera at all. In low light, a good hybrid camera like the Sony A7 IV is still clearly superior, and you have an almost infinite choice of lenses, but all of this is also much more expensive, even compared with Apple’s prices. If you know your camera really well, you can operate it blind using the viewfinder and at the same time see the subject in isolation, without distraction. On such cameras the monitor can nowadays usually be swivelled and rotated; the iPhone’s screen, for its part, remains readable even in sunlight.
But you don’t always have all those nice lenses with you. Dust on the sensor when changing lenses is also a non-issue with mobile phones. The iPhone is even waterproof, where larger cameras can only withstand splashes. And you hardly stand out with it, whereas a “real” camera with a longer lens might be eyed suspiciously or even turned away at the entrance. In the documentary field, the iPhone 15 can achieve cinema quality and would certainly have been a dream come true for filmmakers such as Michael Winterbottom with “In This World” or Wim Wenders with “Buena Vista Social Club”: large parts of those films were shot on DV and still made it to the cinema.
Commentary
Apple has managed an amazing balancing act with the iPhone 15 Pro: 99 per cent or more of users get nice pictures without any great expertise, and professionals get a camera that, with the right app, can do far more than a conventional smartphone. The label “game changer” gets paraded through the digital village almost daily, and most of those supposed revolutions turn out to be mayflies. But the Pro models of the iPhone 15 could actually mark a paradigm shift, much like the Canon 5D Mark II 15 years ago. Just as that camera ushered in the era of hybrid photo/film cameras, the iPhone is now a working tool for professionals. And unlike a standard camera, you can even use it to make phone calls.