While I was thinking through a way to implement transparency in my graphics engine, I thought up a cool and fast way to render pixels at 50% transparency:
BASIC:
col0 = col0 SHR 1
col1 = col1 SHR 1
col0 = (col0 AND &h007F7F7F)
col1 = (col1 AND &h007F7F7F)
newcol = col0 + col1
C:
col0 >>= 1;
col1 >>= 1;
col0 = col0 & 0x007F7F7F;
col1 = col1 & 0x007F7F7F;
newcol = col0 + col1;
ASM:
mov eax, [col0]
mov ebx, [col1]
shr eax, 1
shr ebx, 1
and eax, 0x007F7F7F
and ebx, 0x007F7F7F
add eax, ebx
mov [newcol], eax
All these examples do is take the average of two RGB values using bit shifts and bitwise logic. Instead of adding the two colors together and then dividing by two, each color is divided by two first (shifted right by one) and then the two are added. Before the add, the AND masks off the top bit of each byte, chopping off any bit that bled over from the neighboring channel during the shift.
0x7F7F7F is 01111111 01111111 01111111 in binary.
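If you want to drop the trick into your own code, here's a minimal standalone C sketch; the function name blend50 and the test colors are just mine for illustration, not part of any engine:

#include <stdio.h>

/* Average two 0x00RRGGBB colors with the shift-and-mask trick. */
unsigned int blend50(unsigned int col0, unsigned int col1)
{
    col0 = (col0 >> 1) & 0x007F7F7F; /* halve each channel, drop bits that bled in from the byte above */
    col1 = (col1 >> 1) & 0x007F7F7F;
    return col0 + col1;              /* each channel is at most 0x7F + 0x7F = 0xFE, so no carry crosses into the next byte */
}

int main(void)
{
    /* pure red averaged with pure blue gives a dark purple: prints 7F007F */
    printf("%06X\n", blend50(0x00FF0000, 0x000000FF));
    return 0;
}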
DOS is dead, so is 320x200
DOS may live on amongst a community of zombies who were once among the living in the 80s and early 90s, but those of us who don't crave the taste of human brains have moved on.
One thing that didn't really need to die, but has anyway with the newer graphics cards, is support for the 320x200 and 320x240 resolutions. Other resolutions smaller than 640x480 may be in danger too.
It was some years ago that I noticed, while working on a game of mine, that 320x240 looked fuzzy, blurry, and just not as sharp as it used to be. My game looked fine on older computers, but not on mine, and I've noticed the same thing on other newer machines.
My theory is that the graphics card (or its driver) renders the 320x240 frame as a texture stretched to a slightly higher resolution like 640x480, and the filtering applied to that texture is what makes everything look slightly blurry. It's just a theory, but you get the point.
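If that theory is right, a game that manages its own frame texture can at least ask for unfiltered scaling. A minimal OpenGL sketch, assuming the game already uploads its 320x240 frame as a texture (the helper name and frame_texture are my own, just for illustration):

#include <GL/gl.h>

/* Hypothetical helper: ask for nearest-neighbour sampling so the upscale
   stays blocky and sharp instead of being smeared by the filter. */
void use_sharp_scaling(GLuint frame_texture)
{
    glBindTexture(GL_TEXTURE_2D, frame_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}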
The other issue is that some computers just flat out don't support resolutions under 640x480. One example is the laptop I bought last year. It's an Acer, so that might explain it, but it still goes to show that the old resolutions are going the way of DOS.
I'm not saying that anyone should stop making games and programs that run in 320x200 or 320x240. What I would like is an option to run the game in 640x480, with the graphics scaled up onto the screen, so that I can still play it full screen and see it as sharp as it should be seen, the way it's meant to be played.
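That kind of scaling is cheap to do in software. Here's a minimal C sketch of what I mean, assuming a 320x240 32-bit frame buffer being doubled into a 640x480 one (the names are just for illustration):

#include <stdint.h>

#define SRC_W 320
#define SRC_H 240
#define DST_W (SRC_W * 2)

/* Pixel-double the 320x240 frame into 640x480 so every source pixel becomes
   a crisp 2x2 block instead of being blurred by the card's filtering. */
void scale2x(const uint32_t *src, uint32_t *dst)
{
    for (int y = 0; y < SRC_H; y++) {
        for (int x = 0; x < SRC_W; x++) {
            uint32_t c = src[y * SRC_W + x];
            uint32_t *d = &dst[(y * 2) * DST_W + (x * 2)];
            d[0] = c;
            d[1] = c;
            d[DST_W] = c;
            d[DST_W + 1] = c;
        }
    }
}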
I'm not against new releases of retro-style games, I just want to be able to play them. :)