I think preferring a lower-than-native resolution over DLSS as a blanket statement is a bit of a wild take, but there can definitely be problems like artifacts, especially in certain games. I'm playing RDR2 at the moment and the TAA (which is forced to High when DLSS is enabled) is poorly implemented and causes flickering, which is definitely annoying, to give one example. I also played Alan Wake 2 on an older laptop that barely ran it and I definitely noticed DLSS artifacting there, though in fairness I was demanding a lot from that machine by forcing it to run AW2 at all.
Input latency will of course take a hit, so if you're playing something really fast and twitchy you should probably stay away from DLSS. It's also less of an issue if you don't enable Frame Generation. Finally, the input lag from both DLSS and Frame Generation seems to scale with your baseline FPS: using them to try to reach 60+ FPS will usually mean noticeable input lag, while using them when you're already at ~60 FPS to get to 80-100 or whatever means the added lag is much less noticeable.
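To put rough numbers on that (just a back-of-the-envelope sketch, assuming interpolation-style frame generation has to hold back roughly one rendered frame, and ignoring the rest of the render/display pipeline):

```python
# Rough model: frame time in ms is 1000 / FPS, and holding back ~one source
# frame for interpolation adds roughly that much delay, so the penalty shrinks
# as your baseline FPS goes up. These are my own illustrative numbers.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for baseline_fps in (30, 45, 60, 90):
    held_frame_penalty = frame_time_ms(baseline_fps)  # ~one source frame of extra delay
    print(f"baseline {baseline_fps:>3} fps: frame time {frame_time_ms(baseline_fps):5.1f} ms, "
          f"rough frame-gen delay ~{held_frame_penalty:5.1f} ms")
```

So going from a ~30 FPS baseline costs you something like 33 ms of extra delay, while from ~60 FPS it's closer to 17 ms, which lines up with why it feels fine when you're already at a decent frame rate.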
The wildest thing is combining DLSS with DLDSR if you're running say a 1440p system like I am. Set your monitor to the 1.78x DLDSR resolution, run your game at 3413x1920 and enable DLSS Quality. In the end you internally render at 2275x1280 but end up with way better image quality than native, and the upscaling+downsampling ends up being a great anti-aliasing method since it sorts out a lot of the bad TAA blur.
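If you want to sanity-check those numbers, here's a tiny throwaway script (my own sketch, assuming the advertised 1.78x DLDSR factor is really a 4/3 scale per axis, i.e. (4/3)^2 ≈ 1.78 on total pixels, and that DLSS Quality renders at roughly 2/3 of the output resolution per axis):

```python
# Resolution math for 1440p + DLDSR 1.78x + DLSS Quality (assumptions noted above).
NATIVE = (2560, 1440)        # native 1440p
DLDSR_AXIS_SCALE = 4 / 3     # 1.78x total pixels ~= (4/3)^2, so each axis scales by 4/3
DLSS_QUALITY_SCALE = 2 / 3   # DLSS Quality internal render scale per axis

dldsr_res = tuple(round(n * DLDSR_AXIS_SCALE) for n in NATIVE)
render_res = tuple(round(n * DLSS_QUALITY_SCALE) for n in dldsr_res)

print(f"DLDSR output resolution:  {dldsr_res[0]}x{dldsr_res[1]}")    # 3413x1920
print(f"DLSS internal render res: {render_res[0]}x{render_res[1]}")  # 2275x1280
```

The internal render ends up slightly below native 1440p on each axis, but the DLSS upscale to 3413x1920 followed by the DLDSR downsample back to 2560x1440 is what gives you the extra image quality.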