I wanna keep it short, so I will just use an example. I have an Acer Switch 2-in-1 convertible laptop/tablet.
It has an i3-6100U, Intel HD 520, and a 2160x1440 screen.
Now, we all know we shouldn't game on Intel HD, but sometimes you just want to. Firing up CS:GO at native resolution gives 40-60 fps (unplayable, in my book). However, if I make a custom resolution of 1080x720, which is exactly half of 2160x1440, the framerate goes up to 120 on average, sometimes limited by the CPU.
And there we have the problem: while 1080x720 is exactly half of 2160x1440, it does not look sharp, because the Intel GPU upscales it with a bilinear filter. If there were an option to choose 1:4 pixel mapping, where each rendered pixel maps to an exact 2x2 block of screen pixels, 1080x720 would look so much better on a 2160x1440 screen, native-like sharp. I have personally played on a 720p screen before, and I know what 720p looks like.
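To illustrate what I mean by 1:4 pixel mapping (this is just a rough sketch of the idea, not how the driver would actually implement it): each source pixel is simply duplicated into a 2x2 block, with no interpolation between neighboring pixels, so edges stay perfectly crisp.

```python
import numpy as np

# A made-up 1080x720 frame (stored height x width), one grayscale
# value per pixel to keep the example simple.
frame = np.arange(720 * 1080, dtype=np.uint8).reshape(720, 1080)

# 1:4 pixel mapping (integer scaling): repeat every pixel twice
# along both axes, so each source pixel fills an exact 2x2 block
# on the 2160x1440 panel. No blending, so nothing gets blurry.
upscaled = np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

print(upscaled.shape)  # (1440, 2160) -- fills the native panel exactly
```

Bilinear filtering would instead compute weighted averages of neighboring source pixels, which is what smears the image at exactly-half resolution even though no averaging is needed at all.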
Now you might be wondering why one wouldn't just game on a gaming PC with a dedicated GPU, but there are obviously many situations where that's not an option, such as:
-Maybe you have a tablet with you at college or uni and want to fire up a game like CS:GO or Half-Life 2 for 15 minutes to pass the time. (Being a student myself, I find myself with lots of time between lectures, sometimes a few hours.)
-Maybe you built a new PC, bought a 1440p screen, and didn't have enough left for a dedicated graphics card, so you'll have to run on the iGPU for a month while you save up.
And so on.
It makes no sense for Intel not to do this. Something like this would be SO useful for thousands of Intel customers, especially with the ever-increasing screen resolutions on newer laptops and tablets. And yet, they haven't done it.
Why? I'm genuinely curious.
And PLEASE add this. PLEASE PLEASE PLEASE