Has anyone else had this issue? It seems like Deep Fusion is being applied at all times, and it's making all my images look overly contrasted along the edges of objects in my shots. It also gives everything an almost cartoon-looking filter effect, ruining the detail. If you zoom into the photos, you'll sometimes see really bad pixelated, glitchy artifacts on text or objects. I only point this out because, coming from an XR, I never had those kinds of problems.

It doesn't even matter if you shoot in ProRAW; some of the processing is still there, even if I shoot with an app like Halide. I can only assume this is something baked into the ISP that can't be turned off. Photographic Styles are turned off as well. I really hope this will be fixed in a new iOS update, but I've been on the beta program for a while and there's been nothing yet. It's gotten to the point where I don't even want to use the cameras until the issue is fixed.

I've also heard that Deep Fusion processing is the reason some photos take a few seconds to load their full preview, but is that really why?? The Pro and Pro Max both have an extra GPU core; there really should not be loading times that long just for Deep Fusion processing.

I know it might sound like I'm hating on the 13 right now, but the truth is I actually LOVE this phone. It has the brightest display I've ever seen on pretty much anything, it's superbly fast (especially with 120Hz, which makes it feel even faster and more fluid), and the 5G speeds are the fastest I've experienced on any 5G phone. It's a really great phone. The ONLY thing, and I really mean only, that has disappointed me is the way it processes images. I know the cameras are the best of any phone for sure, and they're massive too. It's just the image processing that has really felt wrong to me. Is it just me? Or has anybody else seen this too??