EDIT: also, let me stress that this "averaging" code is not a filter meant to improve the appearance and make the games look better; it is the best way to make the LCD output resemble the expected TV output when the SNES outputs 512 half-pixels in place of the usual 256 pixels... still an optional hack, and marked as such, but it doesn't alter the way things should appear.
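For readers unfamiliar with the technique being argued about, here is a minimal sketch of that kind of averaging, assuming a hires scanline delivered as 512 packed RGB888 half-pixels. The function names (blend_rgb888, blend_hires_line) and the pixel format are illustrative assumptions, not the actual code under discussion:

```cpp
// Illustrative sketch only -- not the actual MAME/driver implementation.
// Assumes one hires scanline arrives as 512 packed RGB888 half-pixels and
// blends adjacent pairs down to the 256 pixels a TV would effectively resolve.
#include <cstdint>
#include <cstddef>

// Average two RGB888 values channel by channel.
static uint32_t blend_rgb888(uint32_t a, uint32_t b)
{
    uint32_t r = (((a >> 16) & 0xff) + ((b >> 16) & 0xff)) >> 1;
    uint32_t g = (((a >>  8) & 0xff) + ((b >>  8) & 0xff)) >> 1;
    uint32_t bl = ((a & 0xff) + (b & 0xff)) >> 1;
    return (r << 16) | (g << 8) | bl;
}

// Collapse one 512-wide hires scanline into 256 blended output pixels.
static void blend_hires_line(const uint32_t *src512, uint32_t *dst256)
{
    for (std::size_t x = 0; x < 256; ++x)
        dst256[x] = blend_rgb888(src512[2 * x], src512[2 * x + 1]);
}
```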
How is it not so? It sounds EXACTLY like a filter to improve the appearance of the games and make them look better on an LCD screen. If you wired the SNES up to a good enough quality TV, I guess you'd see something like the unfiltered shots. Just because it isn't increasing the resolution and adding fake pixels doesn't mean it isn't a filter; in this case it's blurring pixels into fake pixels instead. Both are image enhancement filters.
I'm just not convinced that the build-up of these things in drivers is a good thing; efforts should be concentrated on improving MAME's ability to filter the output from the drivers instead. That will probably involve convincing Aaron to make the existing filter system more flexible, and maybe allowing drivers to provide additional metadata to it, so that it can better apply effects on lines where fake pixel doubling has occurred, etc. It would be nice if things like the NTSC filter were available to anybody who wanted to use them, on any driver. It's a decent filter and a lot more useful than burn-in simulators...
Actually, doing the filtering in the drivers seems incorrect. I said the same about 'LCD flicker filters' for the Gameboy etc., and I'll say the same for this.
What motivation is there to do it properly if all the drivers hack in their own implementations?