- cross-posted to:
- technology@lemmy.world
- apple@lemdro.id
Absolutely zero people expected that Apple would create it with just an iPhone and a flashlight. That doesn’t change the fact that an iPhone was what recorded the footage.
Yes, if you want production-quality results, you have to use production-quality equipment…which the iPhone can be considered to be, since it clearly helped create that production.
Still impressive as hell.
You can replicate most of these shots with good lighting and a handheld gimbal. It won’t be as impressive as what Apple used, but they’re presenting to millions of people and have a whole team, so they need super-smooth movement all around, whereas a gimbal is smooth but not super smooth.
They said in the video it’s equipment they would have used with a traditional camera and it’s just the camera itself that is different.
I wonder what app they were using for filming. It seems to have a lot of controls that would be amazing to have access to.
The exact same equipment is used with $30k-plus cameras, so I don’t understand what the aim is here.
On another note, why won’t Apple just create a dedicated camera with a CameraOS that can also be a phone if you really, really want it to?
Honestly, I’m impressed! I watched the embedded video in the article. I thought half the shots were CGI backdrops and such. I was joking with my partner leading up to the Vision Pro announcement about the backgrounds/sets being VR easter eggs for the future headset. Turns out they’re just elaborate set pieces. 😂
Without light there would be no photography.
Good!
Misleading claims should be exposed for the fraud that they are.
I swear this has to be intentional ragebaiting from The Verge, because the person who wrote this article should get checked out by a doc.
Umm, you would use the same equipment with a professional camera, so what’s your point?
How many years before smartphones will be able to create video of approximately this quality without all the extra equipment?
We talk about computational photography; I suspect we’re moving to a time when our phones will have enough compute power to completely virtualize the act of taking photographs or shooting videos—that is, they will capture the 3D environment in total detail, then allow any change to lighting or camera position after the fact. Like portrait mode taken to its logical conclusion. So you could create videos like Apple’s but with zero extra equipment.
I’m guessing it’s at least 10 years but less than 25.
Why won’t Apple get rid of the annoying phone part and finally release an iCamera with a useful form factor and all of that chip power behind it?
Better than the Shot on Samsung commercials, where the reflection showed it was a camera and not a phone?
It doesn’t say “Lit with iPhone.” This is the dumbest fucking thing lol.
Apple just needs to replace their lighting with 1000 iPhones strapped together with the flashlights on
I’m a hobbyist videographer and this is fucking impressive. Professional lighting and gimbals would also need to be used with pro cameras. Even if you were just shooting on a DSLR, you’d still need lighting design.
What’s actually impressive about the iPhone this time around is that they didn’t add additional lenses. That’s new. In the past you still needed to add extra glass to it.
It means that using just the raw sensor and built-in lens, it can replace a cinema camera, and that’s actually a huge deal.
Does it mean that soccer moms will start creating full-length feature films? No, you’d still need to know composition and lighting design and all that, but this is pretty damn cool.
Agreed. The two things that blew me away are:
1: No added lens. That was always a cop-out IMO
2: Almost no one guessed it was iPhone footage before the reveal at the end
In the typical Shot on iPhone videos, you could either see it clearly or they were using things like external lenses.
Apart from the hardware upgrades, I think the biggest difference in terms of quality is Apple Log. It really lets you get away from that mobile look.
Really impressive.
Very often for my YouTube channel (barely 20k subscribers), I shoot on my iPhone right alongside footage from my A7C, and quite honestly, the iPhone’s footage is better, thanks to the computational aspect fixing things an idiot like me doesn’t know how to fix.
Is it ever possible for The Verge to report on interesting updates/products/news etc without the author unnecessarily injecting their ridiculous takes into the story?