Friday, August 28, 2020

Digest for comp.lang.c++@googlegroups.com - 25 updates in 3 topics

Frederick Gotham <cauldwell.thomas@gmail.com>: Aug 28 09:14AM -0700

I'm thinking of taking the output from the C preprocessor like this:
 
gcc -o compute.translation_unit -E -P compute.c
 
So then I'll have this translation unit, and I'll run it through my own preprocessor, which will implement the code I shared about two weeks ago for doing polymorphism in C:

https://groups.google.com/forum/m/#!topic/comp.lang.c++/5lKrGQRQHQk
 
All my operators must be discernible from C operators, and so to invoke a method on an object, I might use syntax like:
 
object_name~>Method_Name(5, "dog");
 
(Note that that is a tilde instead of a hyphen, i.e. "~>" instead of "->", so it won't be confused with C's operator for accessing the members of a dereferenced pointer.)
 
My preprocessor will turn this line into:
 
Virtcall(object_name,Method_Name)(5,"dog");
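 
For illustration only (the real definitions are in the thread linked above; every name below is a guess), Virtcall could be a macro that indexes a per-class vtable stored as the object's first member:
 
struct MyClass;
 
struct MyClass_VTable {
    /* Hypothetical signatures. Note that, to match the expansion
       above exactly, these methods receive no object pointer; a real
       scheme would presumably pass one as a hidden first argument,
       which would mean Virtcall taking the argument list as well. */
    void (*Some_Method)(int, char const *);
    void (*Destructor)(void);
};
 
struct MyClass {
    struct MyClass_VTable const *vtable; /* set by the constructor */
    int a;
    double b;
    void *p;
};
 
/* Fetch the function pointer for a virtual call. */
#define Virtcall(obj, method) ((obj)->vtable->method)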
 
Construction and destruction of objects will be handled by simply declaring a class, perhaps with the 'class' keyword. I might also have a 'method' keyword like this:
 
class MyClass {
    method void Constructor(int);
    method void Destructor(void);
    method void Some_Method(int, char const*);
 
    int a;
    double b;
    void *p;
};
 
Similar to the old C style of always writing 'struct' or 'union' before a struct or union, you will always have to write 'class' before a class, and so you would create an object like this:
 
class MyClass obj(42);
 
And you would take a pointer in a function's parameter list like this:
 
void Some_Global_Func(class MyClass *p);
 
So if you put the following snippet through my preprocessor:
 
{
    class MyClass obj(42);
 
    obj~>Some_Method(5, "dog");
}
 
then it will come out as:
 
{
    struct MyClass obj;
    MyClass_Construct(&obj, 42);
 
    Virtcall(&obj,Some_Method)(5,"dog");
 
    Virtcall(&obj,Destructor)();
}
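 
(Continuing the guessed sketch: the generated constructor would presumably install the vtable before initialising the members, so that virtual calls work from then on. Again, all names here are hypothetical.)
 
void MyClass_Some_Method(int, char const *); /* emitted from the 'method' declarations */
void MyClass_Destructor(void);
 
void MyClass_Construct(struct MyClass *self, int n)
{
    static struct MyClass_VTable const vt = {
        &MyClass_Some_Method,
        &MyClass_Destructor
    };
    self->vtable = &vt; /* install the vtable first */
    self->a = n;
}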
 
In order to call the destructor at the right time, I think I'll use 'defer', which you can see on this page if you scroll down about halfway:
 
https://mort.coffee/home/obscure-c-features/
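 
(The usual way to implement 'defer' in GNU C - and presumably what that page builds on - is the GCC/Clang cleanup attribute, which runs a chosen function on a variable when it goes out of scope. A minimal sketch, with made-up names:)
 
#include <stdio.h>
 
/* Cleanup handlers receive a pointer to the dying variable. */
static void close_file(FILE **fp)
{
    if (*fp)
        fclose(*fp);
}
 
void demo(void)
{
    FILE *f __attribute__((cleanup(close_file))) = fopen("log.txt", "r");
    /* ... use f ... close_file(&f) runs automatically at the end of
       the scope - exactly where the generated
       Virtcall(&obj,Destructor)() call belongs. */
}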
 
So then I'll take the output of my own preprocessor and feed it back into the C compiler. The whole process will be something like:
 
gcc -o compute.preprocessed -E -P compute.c
gotham_preprocessor -o compute.translation_unit compute.preprocessed
gcc -o compute.o -c compute.translation_unit
 
So then you'll have an object file, "compute.o", which you can pass on to the linker.
 
So you might ask, what's the point of all this if we already have C++?
Well, there are still loads of very small, very slow 8-bit microcontrollers, with less than a mebibyte of RAM/ROM, that don't have a C++ compiler. My new preprocessor will make it a lot easier to port C++ code with polymorphic classes to C.
Christian Gollwitzer <auriocus@gmx.de>: Aug 28 07:46PM +0200

Am 28.08.20 um 18:14 schrieb Frederick Gotham:
> I'm thinking of taking the output from the C preprocessor like this:
 
> gcc -o compute.translation_unit -E -P compute.c
 
> So then I'll have this translation unit, and I'll run it through my own preprocessor, which will implement the code I shared about two weeks ago for doing polymorphism in C:
 
So you are replicating some of the functionality of Cfront.
 
Another way would be to compile C++ to C using an LLVM backend; there is
one at https://github.com/JuliaComputing/llvm-cbe
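 
The rough shape of that route, assuming llvm-cbe's default output naming (check its README for the exact invocation), would be:
 
clang++ -S -emit-llvm -o compute.ll compute.cpp
llvm-cbe compute.ll # emits compute.cbe.c
gcc -o compute.o -c compute.cbe.c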
 
Christian
boltar@nuttyella.co.uk: Aug 28 09:09AM

On Thu, 27 Aug 2020 16:28:40 +0000 (UTC)
>a very long time, and later TV standardizing 25 (PAL) and 30 (NTSC)
>frames per second. People don't understand why these framerates were
>chosen and have this completely wrong misconception about them.
 
Perhaps you never used CRT monitors, but the only time you would notice
any flicker was out of the corner of your eye - which is ironically more
sensitive to light - when a bright image was showing and the refresh rate
was at one of its lower settings. Otherwise forget it. Ditto CRT TVs. With LCD
screens the picture never goes "off" in between frames as it did with CRTs,
as it simply displays its picture buffer until updated with the next frame,
so flicker is even less noticeable.
 
>something higher. They *most definitely* can. Pretty much *all* people
>can. Why do you think that when The Hobbit used 48 frames per second
>a huge bunch of people complained about it looking somehow "unnatural"?
 
A huge bunch? I've never heard about it.
 
>Of course there are countless double-blind tests where people are
>tested whether they can see a difference between 60 Hz and 120 Hz,
 
And no doubt countless ones where no one noticed any difference.
 
>calculate a crapload of things at a very minimum 60 times per
>second (which is about 16 milliseconds per frame), preferably
>144 times per second and beyond.
 
Someone's drunk the games fps kool aid.
David Brown <david.brown@hesbynett.no>: Aug 28 11:13AM +0200

>> organisation can't be assumed to be the way to do it either.
 
> Any data that is unique to an object should be internal to that object unless
> there's a *really* good reason for that not to happen.
 
That's a reasonable rule of thumb.
 
Performance of a game is a /really/ good reason for that not to happen.
 
Separation of logical aspects, as Ben describes, can also be a /really/
good reason. Sometimes "position" is best considered to be an aspect of
the object and ideally it should be stored in that object. In other
cases, it is an aspect of the "scene" rather than the object. I think
all Ben is saying is that programmers should beware of trying to put
everything connected with an object into members of that class, instead
of thinking more carefully about where the data fits most logically.
boltar@nuttyella.co.uk: Aug 28 09:34AM

On Fri, 28 Aug 2020 11:13:54 +0200
>all Ben is saying is that programmers should beware of trying to put
>everything connected with an object into members of that class, instead
>of thinking more carefully about where the data fits most logically.
 
That's fine, but if you take out something like position for performance
reasons then presumably something like drawing the object should also be
inlined out of the class, and then the calculation algorithms, and so on.
And in the end you end up with nothing left inside the object, and what is
essentially a procedural program.
David Brown <david.brown@hesbynett.no>: Aug 28 11:53AM +0200

>> records, when they were actually more accurate.
 
> Not so. Vinyl allows for the playback of frequencies over 20 kHz,
> 16/44.1 CD does not.
 
That is true, in a /very/ limited sense.
 
First, there is a limit to how high a frequency people can hear. The
limit varies from person to person, with age, and with gender. There
are also theoretically possible complex effects due to harmonics, even
when these are themselves above the human audio range.
 
The standard limit used is 20 kHz. Very few people can hear above that
- and they are almost all kids. (The record is something like 25 or 26
kHz.) If you limit the sample to people who are interested enough in
audio hi-fi, and have the means to buy high-end equipment, then
statistically speaking the limit is more like 16 kHz.
 
In experiments - done by qualified researchers rather than high-end
audio salesmen - no measurable difference has been detected by listeners
when harmonics higher than 20 kHz are included with sounds.
 
Thus we can conclude that for the end result, 20 kHz is a perfectly good
cut-off point - anything higher is pointless.
 
That said, there is of course a point in using higher frequency sampling
in intermediary processing - it lets you use much cleaner filters, and
reduces noise when converting sample rates.
 
A record player amplifier, however, does not need such filters - and
there is no benefit in getting more than 20 kHz from the platter.
Earlier and cheaper CD players had poor filters, and bad frequency
response between, say, 16 kHz and 22 kHz. (With 44.1 kHz sampling the
Nyquist limit is 22.05 kHz, so the reconstruction filter has to do all
its work between the top of the audio band and that limit.) Now they
all use digital filters and give flat response up to 20 kHz and a
sharp cut-off. Good quality record players could therefore give better
high frequency (but below 20 kHz) sound than early CD players.
 
 
Secondly, vinyl only supports frequencies above 20 kHz if the master
source has frequencies above that limit. In virtually all cases, they
do not - any higher frequencies left over from high sample rate digital
processing are removed before cutting the vinyl. And for older analogue
mastering, the tapes did not support higher frequencies.
 
 
Thirdly, records wear - you will only get 20 kHz frequencies for the
first 5 to 10 playbacks of a record (depending on the record player and
record type).
 
 
Fourthly, and perhaps most importantly, high frequency response is only
one small part of the quality and accuracy of the sound reproduction.
CD playback is more accurate in many aspects - in pretty much any way
you try to measure the accuracy of the copy.
 
 
None of this detracts from the fact that some people genuinely prefer
the sound of vinyl. But that is a psychological effect - they prefer
the /imperfections/ and they like the noise, distortion, and other
aspects. That is fine, of course - it's just like preferring a painting
to a photograph. And that was my point - people can prefer the lower
quality audio or film, and find the higher quality version to be
"unnatural", in contrast to the actual precision of the reproduction.
David Brown <david.brown@hesbynett.no>: Aug 28 12:05PM +0200

> screens the picture never goes "off" in between frames as it did with CRTs,
> as it simply displays its picture buffer until updated with the next frame,
> so flicker is even less noticeable.
 
CRT monitors had varying refresh rates - 50 Hz would be the absolute
minimum usable. (Note that TV uses interlacing to give 50/60 Hz
refresh rates even though the frame rate is 25/30 Hz.) 72 Hz was, IIRC,
the standard minimum for being "flicker-free".
 
You are right that with LCDs and other technologies that do not have
"refresh", you don't get flicker no matter what the frame rate. But
again, something like 20 Hz is the minimum for motion to appear somewhat
smooth, and most people are easily capable of seeing improvements up to
about 50 or 60 Hz. Serious gamers or others who regularly use high
speed systems can notice the difference of higher rates.
 
(It is not "ironic" that your peripheral vision is more sensitive to low
light levels and faster movement than the centre part of the eye - our
eyes and vision processing have evolved that way due to the significant
benefits of that arrangement.)
 
>> second (which is about 16 milliseconds per frame), preferably
>> 144 times per second and beyond.
 
> Someone's drunk the games fps kool aid.
 
Someone thinks their own opinions and experiences are the only ones that
matter, and doesn't believe anyone else.
David Brown <david.brown@hesbynett.no>: Aug 28 02:22PM +0200

> inlined out of the class and then calculation algorithms and so on. And in
> the end you end up with nothing left inside the object and what is essentially
> a procedural program.
 
As has been explained several times, in some cases - like games
programming - performance is so important that this is an acceptable
cost. Key parts of games and game engines are written in /assembly/ -
losing a bit of encapsulation is a minor cost in comparison to that.
boltar@nuttyella.co.uk: Aug 28 12:39PM

On Fri, 28 Aug 2020 11:53:50 +0200
>On 27/08/2020 20:57, daniel...@gmail.com wrote:
>First, there is a limit to how high a frequency people can hear. The
>limit varies from person to person, with age, and with gender.
 
Indeed. I remember being able to clearly hear the scan flyback noise
generated by TVs when I was a kid, which for PAL was 15.6 kHz, but that
ability vanished in my early 30s - which, given the number of metal
concerts I'd been to by then, was a testament to my ears :)
 
>to a photograph. And that was my point - people can prefer the lower
>quality audio or film, and find the higher quality version to be
>"unnatural", in contrast to the actual precision of the reproduction.
 
Other downsides of vinyl are that the dynamic range has to be compressed,
particularly at the low end, otherwise one groove would overlap another;
and also, IIRC, there is some phase information that simply can't be
reproduced, as the needle can't physically move in two different
directions at the same time.
boltar@nuttyella.co.uk: Aug 28 12:42PM

On Fri, 28 Aug 2020 12:05:13 +0200
>> Someone's drunk the games fps kool aid.
 
>Someone thinks their own opinions and experiences are the only ones that
>matter, and doesn't believe anyone else.
 
Well, one can only really go by one's own experiences. But having seen all
the nonsense from the audiophool world that is provably BS, it wouldn't
surprise me if gamers subconsciously project something similar onto fps and
monitors.
"Alf P. Steinbach" <alf.p.steinbach+usenet@gmail.com>: Aug 28 03:44PM +0200

On 27.08.2020 20:00, David Brown wrote:
> I've just read on the news that a popular illegal movie copying group
> and website has just been caught, with a Norwegian ringleader. It
> wasn't you (or your mother), was it? :-)
 
No, that was about people who tricked the movie companies into giving or
selling them movies at a very early point in the movie life cycle, then
cracked the protection and placed the movies on the public net.
 
That's like direct sabotage of the companies' business model, using
dishonest means, so I think they probably deserved getting caught.
 
However, the (especially US) entertainment industry's fight against
sharing of movies, series and music is interestingly, deeply irrational.
Nearly all the illegal copying that detracts from their sales income
happens in China, and maybe Asia in general, but they don't address that.
And they don't do anything about the issues that cause people like me to
use "private" shared copies rather than buying, such as quality
(especially subtitles), availability (e.g. I could not find Dr. Zhivago
available to pay for, with reasonable quality), having control (e.g. I
remember I had to delete some scenes from the Mr. Robot series before my
old mother could view it, and I believe that would be impossible with
streaming). In short, they don't do anything that would help increase
sales income. Instead they shell out a lot of money to greedy lawyers,
who invariably choose the most spectacular, least-work actions, such
as going after poor Indian mothers and so on, which does not help the
companies they serve at all: it destroys their reputation. Mystery.
 
- Alf (off topic mode)
Torbjorn Lindgren <tl@none.invalid>: Aug 28 02:25PM

>With LCD screens the picture never goes "off" in between frames as it
>did with CRTs, as it simply displays its picture buffer until updated
>with the next frame, so flicker is even less noticeable.
 
Well, most LCDs use PWM to control backlight intensity, and a
surprising number use a frequency low enough that it causes visible
artifacts at lower light intensities, especially when you move your
eyes quickly - the eye is much more sensitive to flicker when either
the eye or the scene is moving.
 
I'd say that somewhere around a PWM frequency of 200 Hz or lower there is
serious danger of this, while > 1000 Hz ought to be safe. CCFLs usually
run at much higher frequencies (10+ kHz) and also have afterglow, which
reduces the issue, but some LED backlights definitely run way too slowly
(and have no afterglow to soften it). The article below argues for
2000+ Hz for LED backlight PWM, and there's no real reason not to just
run the PWM at 10+ kHz.
 
Hence "flicker free backlight" as a term.
 
Also, high-end screens with LED backlights often *DO* have an entirely
dark period between frames; this reduces maximum brightness, but they
can use higher-power LEDs so there's no actual net loss (same heat
output), and due to how the eye works this increases the perceived
sharpness.
 
I've seen some manufacturers advertise that their screens do multiple
on/off periods per frame; I'm not sure how common that is.
 
IIRC VR glasses often do this, and it does seem to help there, so
there's probably a benefit for normal screens too, but it's probably
much smaller - VR glasses are pretty much the worst-case scenario.
 
This article is a bit old since it predates higher frequency screens
and most backlight strobing but covers most of the basics:
https://www.tftcentral.co.uk/articles/pulse_width_modulation.htm
 
 
>>can. Why do you think that when The Hobbit used 48 frames per second
>>a huge bunch of people complained about it looking somehow "unnatural"?
 
>A huge bunch? I've never heard about it.
 
There was definitely a BIG brouhaha about it during the launch.
"Öö Tiib" <ootiib@hot.ee>: Aug 28 07:27AM -0700

On Friday, 28 August 2020 15:23:07 UTC+3, David Brown wrote:
> programming - performance is so important that this is an acceptable
> cost. Key parts of games and game engines are written in /assembly/ -
> losing a bit of encapsulation is a minor cost in comparison to that.
 
I think Ben was talking about something else. I felt it was about
carefully considering the layout of data.
 
It is worth noting that (regardless of whether it is a game or any
other processing-heavy software) less than 5% of its code base
affects performance in any noticeable manner. Also, the performance
problems are mostly due to inefficient or non-scalable algorithms,
so low-level tweaks can benefit only a quarter of those 5%.
 
So when our code base is a million lines of code with 2000 classes,
I expect that fewer than 100 of those affect performance in a
noteworthy manner, and fewer than 25 benefit from any low-level
tweaking. The other 1900 do not affect performance at all, and even
those 75 can (and should) follow whatever programming paradigms the
project has chosen, to the letter - just a bit less naively.
Stefan Monnier <monnier@iro.umontreal.ca>: Aug 28 10:34AM -0400

> again, something like 20 Hz is the minimum for motion to appear somewhat
> smooth, and most people are easily capable of seeing improvements up to
> about 50 or 60 Hz.
 
I wonder how this limit changes with motion blur: on TV I tend to notice
it when the individual images are "perfectly" crisp (as is typically the
case in sports on TV, where it's recorded at higher rates to be able to
replay in slow motion, IIUC).
 
> Serious gamers or others who regularly use high speed systems can
> notice the difference of higher rates.
 
I know computer-graphics-generated movies go through extra trouble to
add motion blur. Do games do the same? If not, could that explain the
"need" for higher refresh rates?
 
 
Stefan
boltar@nuttyella.co.uk: Aug 28 02:59PM

On Fri, 28 Aug 2020 14:25:09 -0000 (UTC)
>artifacts at lower light intensities, especially when you move your
>eyes quickly, the eye is much more sensitive to flicker when either
>the eye or the scene is moving.
 
Yes, for apparent-brightness reasons a lot of LED systems, including
car headlights, use PWM instead of just a steady DC supply.
 
>I'd say that somewhere around a PWM frequency of 200 Hz or lower there is
 
Can't say I've ever noticed. Perhaps I just have bad eyes.
boltar@nuttyella.co.uk: Aug 28 03:02PM

On Fri, 28 Aug 2020 10:34:17 -0400
>when the individual images are "perfectly" crisp (as is typically the
>case in sports on TV where it's recorded at higher rates to be able to
>replay in slow motion IIUC).
 
A current problem with video streams is the compression algorithm not being
able to keep up with fast motion or panning, introducing a noticeable
jerkiness to the output. This only seems to happen on HD, so possibly it's
an artifact of H.264, because SD streams just used to break up into blocks.
David Brown <david.brown@hesbynett.no>: Aug 28 05:17PM +0200


> Well, one can only really go by one's own experiences. But having seen all
> the nonsense from the audiophool world that is provably BS, it wouldn't
> surprise me if gamers subconsciously project something similar onto fps
> and monitors.
 
Audiophiles and gamers are very different here.
 
For gamers, there is a certain prestige in having the biggest screen or
the fastest computer. But for the most part, they are interested in
what works - what gives them a better competitive edge and higher
scores. This is especially true at the top level. Remember, these
are people who make their living from gaming - from winning
competitions, from YouTube channels, and that sort of thing. They will
not consider buying a screen with a faster refresh rate if spending the
same money on two slower screens will let them have a marginally better
chance in a competition. Of course big numbers, and imagined benefits,
will have some influence - but the primary motivation is better results,
and the results are /measurable/.
 
In the high-end audio world, the results are not measurable in any way.
Unscrupulous suppliers will promote figures, but they have no
meaningful value. And more naïve customers will be fooled by these
figures and pseudo-technical drivel. But for more honest suppliers and
more knowledgeable customers, it is a matter of what sound the customer
likes, and what total impression they like. Some people /like/ the
feeling of a big, solid box producing their sound. That's fine - it is
all part of the experience. Having your dinner from a silver plate with
a string quartet in the corner of the room does not change the chemical
composition of the food, but it changes the experience of eating it. As
long as high-end audio suppliers and customers are honest about this, and
don't pretend the sound is quantitatively "better", there's nothing
wrong or "phoolish" about it.
 
(In the audio world, there is also a small but not insignificant section
that exists primarily for money laundering. The process goes like this.
A drug baron in, for example, Mexico, sends packets of drugs into the
USA by plane drop. The American distributor picks it up, and sells it.
He takes the supplier's part of that money and uses it to buy
ridiculously over-priced speaker cables (or whatever) from a particular
brand and a particular reseller. This brand is owned by the drug baron,
who produces the cables in Mexico for a tiny fraction of the resale
value. The result is that the drug money is laundered and flows safely
back to the drug baron. Occasionally, some over-rich numpty buys one of
the cables thinking they are so expensive that they must be great -
that's just a bonus.)
 
 
We can agree that we can only really judge from our own experiences. So
when you don't have any related experience, you should refrain from
judging. You clearly know little of either the high-end gaming world or
the high-end audio world.
David Brown <david.brown@hesbynett.no>: Aug 28 05:20PM +0200

On 28/08/2020 16:34, Stefan Monnier wrote:
 
> I know computer-graphics-generated movies go through extra trouble to
> add motion blur. Do games do the same? If not, could that explain the
> "need" for higher refresh rates?
 
Computer game graphics and movies have very different needs - one is
interactive, and the other passive. For films, you want smooth motion
and that includes motion blur - the aim is to show the feature to the
viewers with minimal effort for the viewer's visual cortex. For games,
the viewer needs precision, and therefore crisper and faster images.
This means more work for the brain in viewing and processing the
images - high-end gaming is actually quite hard work.
Stephen Fuld <sfuld@alumni.cmu.edu.invalid>: Aug 28 08:32AM -0700

On 8/28/2020 8:17 AM, David Brown wrote:
 
snip
 
> back to the drug baron. Occasionally, some over-rich numpty buys one of
> the cables thinking they are so expensive that they must be great -
> that's just a bonus.)
 
I had never heard of this. Ingenious, but since presumably the
distributor pays for the cables in cash, doesn't the cable manufacturer
have the same problem of either making large cash deposits at his bank,
or transferring the money across borders, as the distributor would have?
 
 
--
- Stephen Fuld
(e-mail address disguised to prevent spam)
Juha Nieminen <nospam@thanks.invalid>: Aug 28 04:20PM

> Perhaps you never used CRT monitors, but the only time you would notice
> any flicker
 
Who's talking about flicker? Nobody has mentioned anything about flicker.
 
The framerate reveals itself in how jittery the motion is. Next time
you are watching a movie, pay attention to when the camera e.g. pans
slowly horizontally, and notice how the movement is done in quite
noticeable little jumps, rather than being completely smooth.
 
Compare a 60 Hz movie with horizontal panning with a 30 Hz version
and you'll notice the difference.
 
>>can. Why do you think that when The Hobbit used 48 frames per second
>>a huge bunch of people complained about it looking somehow "unnatural"?
 
> A huge bunch? I've never heard about it.
 
And since you have personally never heard about it, it never happened.
Of course.
 
>>Of course there are countless double-blind tests where people are
>>tested whether they can see a difference between 60 Hz and 120 Hz,
 
> And no doubt countless ones where no one noticed any difference.
 
So, in your opinion, if there's even one test where the participants don't
notice any difference, that completely invalidates the tests where
the participants can tell with 100% accuracy which display is
showing 60 Hz and which one is showing 120 Hz?
 
>>second (which is about 16 milliseconds per frame), preferably
>>144 times per second and beyond.
 
> Someone's drunk the games fps kool aid.
 
Great counter-argument to what I said.
Juha Nieminen <nospam@thanks.invalid>: Aug 28 04:22PM

> when the individual images are "perfectly" crisp (as is typically the
> case in sports on TV where it's recorded at higher rates to be able to
> replay in slow motion IIUC).
 
Motion blur might help a bit, especially if you aren't specifically paying
attention, but you can still see jittery motion, especially when e.g. the
camera is slowly panning horizontally. If you pay close attention, you
notice how the motion is jumpy rather than completely smooth.
 
The more you start seeing it, the more you'll be unable to unsee it.
Juha Nieminen <nospam@thanks.invalid>: Aug 28 04:23PM

> able to keep up with fast motion or panning, introducing a noticeable
> jerkiness to the output. This only seems to happen on HD, so possibly it's
> an artifact of H.264, because SD streams just used to break up into blocks.
 
You are making stuff up as you go, aren't you?
 
What will you conjure up next?
Bonita Montero <Bonita.Montero@gmail.com>: Aug 28 05:41AM +0200

> Doesn't matter. Make the code smaller and it will fit in RAM and not be
> paged to disk. Or fit in the L2 cache, and not in main store. Or the L1
> cache.
 
You usually get I-cache hit rates > 95%, because the I-cache covers the
hot spots, even with large code.
boltar@nuttyella.co.uk: Aug 28 09:02AM

On Thu, 27 Aug 2020 16:15:12 +0000 (UTC)
 
>Running at 320x200 resolution, 256 colors.
 
>Let's see the same 486 run the same effects in 1920x1080 32-bit color,
>60 Hz.
 
You think a 486 couldn't do 60 Hz? Anyway, so what? The point still stands
that similar graphics could be done 25 years ago with far less powerful
machines. No one has yet disproved my point that web assembly is just another
piss-poor, cycle-sucking VM that sacrifices performance on the altar of
portability.
 
>Of course you can run almost anything if you lower the resolution
>and color depth enough, use untextured polygons, and so on.
 
His example wasn't of anything that involved polygons.
Juha Nieminen <nospam@thanks.invalid>: Aug 28 04:14PM


>>Let's see the same 486 run the same effects in 1920x1080 32-bit color,
>>60 Hz.
 
> You think a 486 couldn't do 60 Hz?
 
Not at 1920x1080 32-bit color, no.
 
Maybe at 0.1 Hz. If even that.
 
> Anyway, so what? The point still stands
> that similar graphics
 
You have a strange definition of "similar".
 
> could be done 25 years ago with far less powerful
> machines.
 
Not at the same resolution, same color depth, same framerate.
 
Sure, if you lower the resolution enough, almost anything can run
almost anything. I can create a fully path-traced scene using global
illumination, dynamic shadows and volumetric lighting on a NES...
at 1x1 resolution, 2 colors. Clearly the NES is as powerful at
graphics as modern PCs.